WO2019194079A1 - Position estimation system, moving body including the position estimation system, and computer program - Google Patents

Position estimation system, moving body including the position estimation system, and computer program

Info

Publication number
WO2019194079A1
WO2019194079A1 (PCT/JP2019/013741)
Authority
WO
WIPO (PCT)
Prior art keywords
estimated value
position estimation
map
scan data
reference map
Prior art date
Application number
PCT/JP2019/013741
Other languages
English (en)
Japanese (ja)
Inventor
慎治 鈴木
佐伯 哲夫
Original Assignee
日本電産株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電産株式会社
Priority to CN201980022370.6A (CN111971633B)
Publication of WO2019194079A1

Links

Images

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions

Definitions

  • The present disclosure relates to a position estimation system and a moving body including the position estimation system.
  • The present disclosure also relates to a computer program used for position estimation.
  • Japanese Patent Application Laid-Open No. 2008-250905 discloses a mobile robot that performs self-position estimation by matching sensor data acquired from a laser range finder against a map prepared in advance.
  • However, the environment may change after the map is created. For example, layouts are often changed temporarily in factories and distribution warehouses.
  • Free space may be temporarily reduced when a load, a device, or another moving body is placed in an area (free space) through which the moving body can pass on the map. If matching is performed using an environment map that does not reflect the actual environment, a situation may arise in which the self-position cannot be estimated.
  • Embodiments of the present disclosure provide a position estimation system and a moving body that can estimate the self-position even when a part of a map does not reflect the actual environment, as well as a computer program used for such self-position estimation.
  • In a non-limiting exemplary embodiment, the position estimation system of the present disclosure is a position estimation system for a moving body, used in connection with an external sensor that repeatedly scans the environment and outputs sensor data for each scan. The system includes at least one processor, a first memory for storing an environmental map prepared in advance, and a second memory for storing a computer program that operates the processor.
  • In accordance with the instructions of the computer program, the at least one processor generates a first estimated value of the position and orientation of the moving body based on the result of matching between the environmental map and the sensor data.
  • A moving body of the present disclosure includes the position estimation system described above, the external sensor, and a driving device for movement.
  • In a non-limiting exemplary embodiment, the computer program of the present disclosure is a computer program used in any one of the position estimation systems described above.
  • FIG. 1 is a diagram illustrating a configuration of an embodiment of a moving object according to the present disclosure.
  • FIG. 2 is a plan layout diagram schematically showing an example of an environment in which a moving body moves.
  • FIG. 3 is a diagram showing an environment map of the environment shown in FIG. 2.
  • FIG. 4A is a diagram schematically illustrating an example of scan data SD (t) acquired by the external sensor at time t.
  • FIG. 4B is a diagram schematically illustrating an example of scan data SD(t + Δt) acquired by the external sensor at time t + Δt.
  • FIG. 4C is a diagram schematically illustrating a state in which the scan data SD(t + Δt) is matched with the scan data SD(t).
  • FIG. 5 is a diagram schematically illustrating how the point group constituting the scan data rotates and translates from the initial position and approaches the point group of the environment map.
  • FIG. 6 is a diagram illustrating the position and orientation of the scan data after the rigid body transformation.
  • FIG. 7A is a diagram schematically illustrating a state in which scan data is acquired from an external sensor, a reference map is created from the scan data, and then newly acquired scan data is matched with the reference map.
  • FIG. 7B is a diagram schematically illustrating a reference map that is updated by adding newly acquired scan data to the reference map of FIG. 7A.
  • FIG. 7C is a diagram schematically illustrating a reference map that is updated by adding newly acquired scan data to the reference map of FIG. 7B.
  • FIG. 8A is a diagram schematically illustrating an example of scan data SD (t) acquired by the external sensor at time t.
  • FIG. 8B is a diagram schematically illustrating a state when matching of the scan data SD (t) with the environment map M is started.
  • FIG. 8C is a diagram schematically illustrating a state where the matching of the scan data SD (t) with the environment map M is completed.
  • FIG. 9 is a diagram schematically illustrating the history of the position and orientation of the moving body obtained in the past and the predicted values of the current position and orientation.
  • FIG. 10 is a flowchart illustrating a part of the operation of the position estimation device according to the embodiment of the present disclosure.
  • FIG. 11A is a diagram illustrating an example in which the amount of change in the first estimated value due to the first position estimation process (offline SLAM) varies.
  • FIG. 11B is a diagram illustrating an example in which the difference between the first estimated value obtained by the first position estimating process (offline SLAM) and the measured value obtained by the sensor varies.
  • FIG. 11C is a diagram illustrating an example in which the reliability of the first estimated value by the first position estimation process (offline SLAM) and the reliability of the second estimated value by the second position estimation process (online SLAM) vary.
  • FIG. 12 is a flowchart illustrating a part of the operation of the position estimation device according to the embodiment of the present disclosure.
  • FIG. 13 is a flowchart illustrating an example of second position estimation processing (online SLAM) of the position estimation device according to the embodiment of the present disclosure.
  • FIG. 14 is a diagram illustrating an overview of a control system that controls traveling of each AGV according to the present disclosure.
  • FIG. 15 is a perspective view showing an example of an environment where AGVs exist.
  • FIG. 16 is a perspective view showing the AGV and the towing cart before being connected.
  • FIG. 17 is a perspective view showing the AGV and the traction cart connected to each other.
  • FIG. 18 is an external view of an exemplary AGV according to the present embodiment.
  • FIG. 19A is a diagram illustrating a first hardware configuration example of AGV.
  • FIG. 19B is a diagram illustrating a second hardware configuration example of AGV.
  • FIG. 20 is a diagram illustrating a hardware configuration example of the operation management apparatus.
  • AGV: Automated guided vehicle
  • An “automated guided vehicle” means a trackless vehicle onto which a load is placed manually or automatically, which automatically travels to a designated place, and which is unloaded manually or automatically.
  • The term “automated guided vehicle” includes unmanned towing vehicles and unmanned forklifts.
  • The term “unmanned” means that no person is required to steer the vehicle; it does not exclude the automated guided vehicle transporting a person (for example, a person who loads and unloads luggage).
  • An “unmanned towing vehicle” is a trackless vehicle that automatically tows a cart onto which loads are placed and from which they are unloaded manually or automatically, and that automatically travels to a designated location.
  • An “unmanned forklift” is a trackless vehicle that includes a mast for raising and lowering a load-transfer fork or the like, automatically transfers the load onto the fork, automatically travels to a designated location, and performs automatic cargo handling work.
  • A “trackless vehicle” is a vehicle that includes wheels and an electric motor or engine that rotates the wheels.
  • A “moving body” is a device that moves while carrying a person or a load, and includes a driving device that generates a driving force (traction) for movement, such as wheels, a biped or multi-legged walking device, or a propeller.
  • The term “moving body” in the present disclosure includes not only automated guided vehicles in the narrow sense but also mobile robots, service robots, and drones.
  • “Automatic traveling” includes traveling based on commands from a computer-based operation management system to which the automated guided vehicle is connected by communication, and autonomous traveling by a control device included in the automated guided vehicle. Autonomous traveling includes not only traveling in which the automated guided vehicle travels to a destination along a predetermined route, but also traveling that follows a tracking target. The automated guided vehicle may also temporarily perform manual travel based on an instruction from a worker. “Automatic traveling” generally includes both “guided” and “guideless” travel, but in the present disclosure it means “guideless” travel.
  • The “guided type” is a method in which guides (for example, magnetic tape) are installed continuously or intermittently, and the automated guided vehicle is guided using the guides.
  • The “guideless type” is a method of guiding the vehicle without installing such guides.
  • An automated guided vehicle according to an embodiment of the present disclosure includes a position estimation device and can travel in a guideless manner.
  • A “position estimation device” is a device that estimates the self-position on an environmental map based on sensor data acquired by an external sensor such as a laser range finder.
  • An “external sensor” is a sensor that senses an external state of a moving body.
  • Examples of the external sensor include a laser range finder (also referred to as a range sensor), a camera (or image sensor), LIDAR (Light Detection and Ranging), a millimeter-wave radar, an ultrasonic sensor, and a magnetic sensor.
  • An “internal sensor” is a sensor that senses the internal state of the moving body.
  • Examples of the internal sensor include a rotary encoder (hereinafter sometimes simply referred to as “encoder”), an acceleration sensor, and an angular acceleration sensor (for example, a gyro sensor).
  • SLAM is an abbreviation of “Simultaneous Localization and Mapping”, and means that self-position estimation and environmental map creation are performed simultaneously.
  • In the exemplary embodiment shown in FIG. 1, the moving body 10 of the present disclosure includes an external sensor 102 that scans the environment and periodically outputs scan data.
  • A typical example of the external sensor 102 is a laser range finder (LRF).
  • The LRF periodically emits, for example, an infrared or visible laser beam toward the surroundings to scan the surrounding environment.
  • The laser beam is reflected by surfaces such as those of structures (for example, walls and pillars) and of objects placed on the floor.
  • The LRF receives the reflected light of the laser beam, calculates the distance to each reflection point, and outputs measurement result data indicating the position of each reflection point.
  • The position of each reflection point reflects the direction of arrival and the distance of the reflected light.
  • The measurement result data (scan data) may also be referred to as “environmental measurement data” or “sensor data”.
  • The environment scan by the external sensor 102 covers, for example, a range of 135 degrees to the left and right (270 degrees in total) with respect to the front of the external sensor 102. Specifically, a pulsed laser beam is emitted while its direction is changed by a predetermined step angle in the horizontal plane, and the reflected light of each laser beam is detected to measure the distance. If the step angle is 0.3 degrees, distance measurement data can be obtained for the reflection points in the directions determined by the angles corresponding to a total of 901 steps.
  • The scanning of the surrounding space performed by the external sensor 102 is substantially parallel to the floor surface and is planar (two-dimensional). However, the external sensor may instead perform a three-dimensional scan.
  • A typical example of the scan data is expressed by the position coordinates of the points constituting a point cloud acquired for each scan.
  • The position coordinates of the points are defined in a local coordinate system that moves together with the moving body 10.
  • Such a local coordinate system may be referred to as a mobile coordinate system or a sensor coordinate system.
  • In the present disclosure, the origin of the local coordinate system fixed to the moving body 10 is defined as the “position” of the moving body 10,
  • and the orientation of the local coordinate system is defined as the “posture” of the moving body 10.
  • The position and posture may be collectively referred to as the “pose”.
  • The “pose” may also be simply referred to as the “position”.
  • When the scan data is expressed in a polar coordinate system, the scan data may be constituted by a set of numerical values indicating the position of each point by its “direction” and “distance” from the origin of the local coordinate system.
  • The polar coordinate representation may be converted into an orthogonal (Cartesian) coordinate representation.
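  • As an illustrative, non-authoritative sketch of this conversion (the 270-degree field of view and 0.3-degree step are the example values given above; the function and variable names are hypothetical), each (direction, distance) pair can be converted into UV coordinates in the sensor coordinate system as follows:

```python
import math

def polar_scan_to_uv(distances_m, fov_deg=270.0, step_deg=0.3):
    """Convert one LRF scan given as a list of distances (meters) into
    (u, v) points in the sensor coordinate system, where the V axis points
    toward the front of the sensor and the U axis is 90 degrees from it."""
    points = []
    start_deg = -fov_deg / 2.0                        # first beam, e.g. -135 degrees
    for i, r in enumerate(distances_m):
        if r is None:                                 # no return for this beam
            continue
        ang = math.radians(start_deg + i * step_deg)  # angle measured from the V axis
        v = r * math.cos(ang)                         # forward component
        u = r * math.sin(ang)                         # sideways component
        points.append((u, v))
    return points

# Example: a 270-degree scan with 0.3-degree steps has 901 beams.
scan = [5.0] * 901
print(len(polar_scan_to_uv(scan)))  # 901
```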
  • The moving body 10 includes a storage device (first memory) 104 that stores an environment map, and a position estimation system 115.
  • The environment map is prepared in advance by the moving body 10 or by another map creation device and is stored in the storage device 104.
  • The position estimation system 115 is used while connected to the external sensor 102, and includes a processor 106 and a memory (second memory) 107 that stores a computer program for controlling the operation of the processor.
  • Although FIG. 1 shows one processor 106 and one memory 107, the number of processors 106 and memories 107 may each be two or more.
  • The position estimation system 115 performs matching between the scan data acquired from the external sensor 102 and the environment map read from the storage device 104, and estimates the position and orientation, that is, the pose, of the moving body 10. This matching is called pattern matching or scan matching, and can be performed according to various algorithms.
  • A typical example of a matching algorithm is the Iterative Closest Point (ICP) algorithm.
  • The position estimation system 115 can also create an environmental map by matching and connecting a plurality of scan data output from the external sensor 102.
  • The position estimation system 115 in the embodiment of the present disclosure is realized by the processor 106 and the memory 107 that stores a computer program for operating the processor 106.
  • The processor 106 performs the following operations in accordance with the instructions of the computer program.
  • First position estimation process (offline SLAM): generates a first estimated value of the position and posture of the external sensor 102 (that is, the position and posture of the moving body 10) based on the result of matching between the environmental map and the sensor data.
  • Second position estimation process (online SLAM): generates a second estimated value of the position and posture based on the result of matching between newly acquired sensor data and a reference map created from sensor data acquired during movement, as described later.
  • The processor 106 executes the first position estimation process and the second position estimation process simultaneously or in parallel.
  • The processor 106 may include a first processor part that executes the first position estimation process and a second processor part that executes the second position estimation process. Alternatively, one processor 106 may alternately perform the operations necessary for the first position estimation process and the second position estimation process.
  • When the processor 106 is outputting the first estimated value as the selected estimated value and any of the following events occurs, the processor 106 outputs the second estimated value instead of the first estimated value as the selected estimated value.
  • The reliability of the first estimated value (first reliability) and the reliability of the second estimated value (second reliability) were calculated, and the first reliability was lower than the second reliability.
  • The current first estimated value has changed from the past first estimated values (time-series data) beyond a predetermined range.
  • The calculation for determining the first estimated value could not be completed within a predetermined time (the matching error did not converge to a sufficiently small level within the predetermined time).
  • The reliability can be expressed quantitatively by, for example, the final error of the ICP matching described later (a “positional shift amount” or a “match rate”).
  • Each of the above events occurs when the environment map used in offline SLAM does not accurately reflect the current environment. For example, when the layout of a factory or distribution warehouse is temporarily changed, a load, a device, or another moving body may be placed in an area (free space) through which the moving body can pass on the environmental map. If matching is performed using the old environmental map as it is despite such an environmental change, the above events are likely to occur.
  • In this embodiment, position estimation is performed by the first position estimation process (offline SLAM), while position estimation by the second position estimation process (online SLAM) is also performed in the background. For this reason, when a switch from the first estimated value to the second estimated value becomes necessary, the switch can be made quickly.
  • The second position estimation process (online SLAM) can be executed, for example, by the following procedure.
  • Scan data is acquired from the external sensor 102, and a reference map (local map) is created from the scan data. Newly acquired scan data is matched with the reference map to estimate the position and orientation, and the reference map is updated by adding the matched scan data.
  • The reference map is reset by deleting, from the reference map that has been updated a plurality of times, the part other than the part including the latest scan data.
  • When resetting, the environment map can be updated based on the reference map that was updated multiple times before the reset (online map update).
  • In this way, even when the changed state of the environment (for example, a changed layout) continues, the first estimated value obtained by the first position estimation process (offline SLAM) can be used as the selected estimated value without the above-described events occurring.
  • The moving body 10 further includes a drive device 108, an automatic travel control device 110, and a communication circuit 112.
  • The drive device 108 is a device that generates the driving force for the moving body 10 to move.
  • Examples of the drive device 108 include wheels (drive wheels) rotated by an electric motor or engine, and a biped or multi-legged walking device operated by a motor or other actuator.
  • The wheels may be omnidirectional wheels such as Mecanum wheels.
  • The moving body 10 may also be a moving body that moves in the air or underwater, or a hovercraft.
  • In that case, the drive device 108 may include a propeller rotated by a motor.
  • The automatic travel control device 110 operates the drive device 108 to control the movement conditions (speed, acceleration, movement direction, and so on) of the moving body 10.
  • The automatic travel control device 110 may move the moving body 10 along a predetermined travel route, or may move it according to a command given from the outside.
  • The position estimation system 115 calculates an estimated value of the position and orientation of the moving body 10 while the moving body 10 is moving or stopped.
  • The automatic travel control device 110 controls the traveling of the moving body 10 with reference to this estimated value.
  • The position estimation system 115 and the automatic travel control device 110 may be collectively referred to as a travel control device 120.
  • Like the position estimation system 115, the automatic travel control device 110 may be configured by the processor 106 described above and a memory 107 storing a computer program for controlling the operation of the processor 106.
  • Such a processor 106 and memory 107 can be realized by one or more semiconductor integrated circuits.
  • The communication circuit 112 is a circuit through which the moving body 10 connects to a communication network and exchanges data and/or commands with an external management device, another moving body, or an operator's mobile terminal device.
  • FIG. 2 is a plan layout diagram schematically showing an example of an environment 200 in which the moving body 10 moves.
  • The environment 200 is part of a wider environment.
  • Thick straight lines indicate, for example, fixed walls 202 of a building.
  • FIG. 3 is a diagram showing a map (environment map M) of the environment 200 shown in FIG. 2.
  • Each dot 204 in the figure corresponds to each point of the point group constituting the environment map M.
  • The point cloud of the environment map M may be referred to as the “reference point cloud”,
  • and the scan data point cloud may be referred to as the “data point cloud” or the “source point cloud”.
  • Matching is, for example, the alignment of the scan data (data point cloud) with respect to the environment map (reference point cloud), whose position is fixed.
  • In the matching, pairs of corresponding points are selected between the reference point cloud and the data point cloud, and the position and orientation of the data point cloud are adjusted so as to minimize the distances (errors) between the points constituting each pair.
  • In FIG. 3, the dots 204 are arranged at equal intervals on a plurality of line segments for simplicity.
  • The point cloud of an actual environment map M may have a more complicated arrangement pattern.
  • The environment map M is not limited to a point cloud map, and may be a map whose constituent elements are straight lines or curves, or may be an occupancy grid map. That is, it is only necessary that the environment map M has a structure that allows matching between the scan data and the environment map M. In the case of an occupancy grid map, Monte Carlo localization can be performed.
  • Although an embodiment of the present disclosure is described by taking matching by the ICP algorithm as an example, embodiments of the present disclosure are not limited to this example.
  • The scan data acquired by the external sensor 102 of the moving body 10 has a different point cloud arrangement at each position. If the time taken to move from position PA to position PC via position PB is sufficiently long compared with the scanning period of the external sensor 102, that is, if the moving body 10 moves slowly, two scan data adjacent on the time axis are very similar. However, when the moving body 10 moves remarkably fast, two scan data adjacent on the time axis may differ greatly.
  • Map creation can be performed either online or offline. Therefore, as far as the principle of map creation is concerned, the following description applies to both offline SLAM and online SLAM. In the case of offline SLAM, however, there is no particular restriction on the time required to create a map from the scan data, and matching with a smaller error can be achieved by taking more time.
  • FIG. 4A is a diagram schematically illustrating an example of scan data SD (t) acquired by the external sensor 102 at time t.
  • The scan data SD(t) is displayed in a sensor coordinate system whose position and orientation change together with the moving body 10.
  • The scan data SD(t) is represented in a UV coordinate system in which the front direction of the external sensor 102 is the V axis and the direction rotated 90° clockwise from the V axis is the U axis.
  • The moving body 10, more precisely the external sensor 102, is located at the origin of the UV coordinate system. In the present disclosure, when the moving body 10 moves forward, it advances toward the front of the external sensor 102, that is, in the direction of the V axis.
  • The points constituting the scan data SD(t) are indicated by black circles.
  • The period at which the position estimation system 115 acquires scan data from the external sensor 102 is denoted by Δt.
  • Δt is, for example, 200 milliseconds.
  • FIG. 4B is a diagram schematically illustrating an example of scan data SD(t + Δt) acquired by the external sensor 102 at time t + Δt.
  • The points constituting the scan data SD(t + Δt) are indicated by white circles.
  • If Δt is, for example, 200 milliseconds and the moving body 10 moves at a speed of 1 meter per second, the moving body 10 moves about 20 centimeters during Δt.
  • The environment around the moving body 10 does not change greatly over a movement of about 20 centimeters. Therefore, there is a wide overlap between the environment scanned by the external sensor 102 at time t + Δt and the environment scanned at time t, and many corresponding points are included between the point cloud of the scan data SD(t) and the point cloud of the scan data SD(t + Δt).
  • FIG. 4C schematically shows a state where matching between the scan data SD(t) and the scan data SD(t + Δt) is completed.
  • Here, alignment is performed so that the scan data SD(t + Δt) is aligned with the scan data SD(t).
  • The moving body 10 at time t is located at the origin of the UV coordinate system in FIG. 4C, and the moving body 10 at time t + Δt is at a position moved from that origin.
  • By this matching, the positional relationship of one local coordinate system with respect to the other local coordinate system is obtained.
  • A local environment map can be created by connecting a plurality of scan data SD(t), SD(t + Δt), ..., SD(t + N × Δt) acquired periodically in this way.
  • Here, N is an integer of 1 or more.
  • FIG. 5 is a diagram schematically showing how the point group constituting the scan data at time t rotates and translates from the initial position and approaches the point group on the map.
  • Let $z_{t,k}$ (k = 1, 2, ..., K) be the coordinate values of the points constituting the scan data at time t, and let $m_k$ be the coordinate value of the point on the map corresponding to $z_{t,k}$.
  • The error between corresponding points in the two point clouds can then be evaluated by using, as a cost function, the sum of squared errors calculated over the K corresponding points, $\sum_{k=1}^{K} \lVert z_{t,k} - m_k \rVert^2$.
  • A rigid body transformation of rotation and translation is determined so as to reduce $\sum_{k=1}^{K} \lVert z_{t,k} - m_k \rVert^2$.
  • The rigid body transformation is defined by a transformation matrix (homogeneous transformation matrix) that includes a rotation angle and a translation vector as parameters.
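  • As a hedged illustration of the form such a homogeneous transformation can take in the two-dimensional case (a generic formulation, not necessarily the exact matrix used in the embodiment), a rotation by an angle θ followed by a translation (t_x, t_y), together with the cost function minimized over its parameters, can be written as:

$$
T(\theta, t_x, t_y) =
\begin{pmatrix}
\cos\theta & -\sin\theta & t_x \\
\sin\theta & \cos\theta & t_y \\
0 & 0 & 1
\end{pmatrix},
\qquad
(\theta^{*}, t_x^{*}, t_y^{*}) = \arg\min_{\theta,\, t_x,\, t_y} \sum_{k=1}^{K} \bigl\lVert T(\theta, t_x, t_y)\,\tilde{z}_{t,k} - \tilde{m}_k \bigr\rVert^{2},
$$

  • where $\tilde{z}_{t,k}$ and $\tilde{m}_k$ denote the scan point and its corresponding map point in homogeneous coordinates $(x, y, 1)^{\top}$.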
  • FIG. 6 is a diagram illustrating the position and orientation of the scan data after this rigid body transformation.
  • At this stage, matching between the scan data and the map is not yet complete, and a large error (positional deviation) still exists between the two point clouds.
  • Therefore, rigid body transformation is performed again.
  • When the error becomes smaller than a predetermined value, matching is completed.
  • FIG. 7A is a diagram schematically illustrating a state where matching between the latest scan data SD (b) newly acquired and the scan data SD (a) acquired last time is completed.
  • The point group of black circles represents the previously acquired scan data,
  • and the point group of white circles represents the latest scan data.
  • FIG. 7A shows the position a of the moving body 10 when the previous scan data is acquired and the position b of the moving body 10 when the latest scan data is acquired.
  • The previously acquired scan data SD(a) constitutes a “reference map RM”.
  • The reference map RM is a part of the environmental map that is being created. Matching is executed so that the position and orientation of the latest scan data SD(b) match the position and orientation of the previously acquired scan data SD(a).
  • By this matching, the position and orientation of the moving body 10 at position b on the reference map RM can be known.
  • The scan data SD(b) is then added to the reference map RM to update the reference map RM.
  • In other words, the coordinate system of the scan data SD(b) is connected to the coordinate system of the scan data SD(a).
  • This connection is represented by a matrix that defines the rotation and translation (rigid body transformation) between the two coordinate systems. Using such a transformation matrix, the coordinate value of each point of the scan data SD(b) can be converted into a coordinate value in the coordinate system of the scan data SD(a).
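  • The following is a minimal sketch of this kind of coordinate connection, assuming 2D poses (x, y, θ) and NumPy: the pose of SD(b) relative to SD(a) obtained by matching is turned into a homogeneous matrix and applied to the points of SD(b). The function names and numerical values are illustrative, not taken from the publication.

```python
import numpy as np

def pose_to_matrix(x, y, theta):
    """Homogeneous 3x3 matrix for a 2D pose (x, y, theta)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

def transform_points(T, points):
    """Apply a homogeneous transform to an (N, 2) array of points."""
    pts_h = np.hstack([points, np.ones((len(points), 1))])  # to homogeneous coords
    return (T @ pts_h.T).T[:, :2]

# Pose of SD(b)'s coordinate system expressed in SD(a)'s coordinate system,
# as obtained from matching (illustrative values).
T_a_b = pose_to_matrix(0.20, 0.05, np.deg2rad(3.0))

# Points of scan SD(b) in its own sensor frame, re-expressed in SD(a)'s frame
# so that they can be added to the reference map RM.
sd_b = np.array([[1.0, 2.0], [1.5, 2.1]])
rm_points = transform_points(T_a_b, sd_b)
```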
  • FIG. 7B shows the reference map RM obtained by adding the next acquired scan data to the reference map RM of FIG. 7A and updating it.
  • The point cloud of black circles represents the reference map RM before the update,
  • and the point cloud of white circles represents the latest scan data SD(c).
  • FIG. 7B shows the positions a, b, and c of the moving body 10 at the times when the scan data two scans before, the previous scan data, and the latest scan data were acquired, respectively.
  • The whole of the white circle point cloud and the black circle point cloud in FIG. 7B constitutes the updated reference map RM.
  • FIG. 7C shows the reference map RM updated by adding the newly acquired scan data SD(d) to the reference map RM of FIG. 7B.
  • The point cloud of black circles represents the reference map RM before the update,
  • and the point cloud of white circles represents the latest scan data SD(d).
  • The position d of the moving body 10, estimated by matching of the latest scan data SD(d), is also shown.
  • The whole of the white circle point cloud and the black circle point cloud in FIG. 7C constitutes the updated reference map RM.
  • The number of points in the reference map RM increases every time the external sensor 102 performs a scan. This increases the amount of calculation required to match the latest scan data against the reference map RM. For example, when one piece of scan data contains a maximum of about 1,000 points and 2,000 pieces of scan data are connected to create one reference map RM, the number of points in the reference map RM reaches a maximum of about 2,000,000. If the point cloud of the reference map RM is too large, the iterative calculation of finding corresponding points and matching may not be completed within the scan period Δt.
  • Therefore, the reference map is reset by deleting, from the reference map that has been updated a plurality of times, the part other than the part including the latest scan data. The reset may be performed, for example, (i) when the number of updates of the reference map reaches a predetermined number, (ii) when the data amount of the reference map reaches a predetermined amount, or (iii) when a predetermined length of time has elapsed since the previous reset. When resetting, it is also possible to update the environmental map based on the reference map that was updated several times before the reset.
  • Alternatively, the environment map itself prepared in advance for offline SLAM can be maintained without being updated.
  • The “predetermined number” in case (i) may be, for example, 100.
  • The “predetermined amount” in case (ii) may be, for example, 10,000.
  • The “predetermined length” in case (iii) may be, for example, 5 minutes.
  • Moreover, beyond a certain point the matching accuracy saturates and no longer improves at a rate commensurate with the increase in the amount of calculation required for matching.
  • When the density of the point cloud constituting the scan data and/or the reference map exceeds a predetermined density, processing may be performed to thin out some points from the point cloud so that the density of the point cloud is reduced to the predetermined density or less.
  • The “predetermined density” may be, for example, 1 point per (10 cm)².
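  • A minimal sketch of this bookkeeping is shown below, assuming the example thresholds given above (100 updates, 10,000 points, 5 minutes, 1 point per 10 cm × 10 cm cell); the class and method names are illustrative, not taken from the publication.

```python
import time

class ReferenceMap:
    """Local reference map with reset conditions and density thinning."""

    def __init__(self, max_updates=100, max_points=10_000,
                 max_age_s=300.0, cell_size_m=0.10):
        self.points = []          # list of (x, y) tuples
        self.update_count = 0
        self.created_at = time.monotonic()
        self.max_updates = max_updates
        self.max_points = max_points
        self.max_age_s = max_age_s
        self.cell_size_m = cell_size_m

    def add_scan(self, scan_points):
        self.points.extend(scan_points)
        self.update_count += 1
        self._thin_out()
        if self._needs_reset():
            # Keep only the part containing the latest scan; older parts are deleted.
            self.points = list(scan_points)
            self.update_count = 0
            self.created_at = time.monotonic()

    def _needs_reset(self):
        return (self.update_count >= self.max_updates                      # condition (i)
                or len(self.points) >= self.max_points                     # condition (ii)
                or time.monotonic() - self.created_at >= self.max_age_s)   # condition (iii)

    def _thin_out(self):
        """Keep at most one point per grid cell (about 1 point / (10 cm)^2)."""
        kept = {}
        for x, y in self.points:
            cell = (int(x // self.cell_size_m), int(y // self.cell_size_m))
            kept.setdefault(cell, (x, y))
        self.points = list(kept.values())
```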
  • FIG. 8A is a diagram schematically illustrating an example of scan data SD (t) acquired by an external sensor at time t.
  • The scan data SD(t) is displayed in a sensor coordinate system whose position and orientation change together with the moving body 10, and the points constituting the scan data SD(t) are indicated by white circles.
  • FIG. 8B is a diagram schematically illustrating a state when matching of the scan data SD (t) with the environment map M is started.
  • The processor 106 in FIG. 1 acquires the scan data SD(t) from the external sensor 102,
  • and performs matching between the scan data SD(t) and the environment map M read from the storage device 104, whereby the position and orientation of the moving body 10 on the environment map M can be estimated.
  • FIG. 8C is a diagram schematically illustrating a state where the matching of the scan data SD(t) with the environment map M is completed.
  • To give this matching a good initial value, the amount of change from the position and orientation estimated by the previous matching can be measured by odometry.
  • The moving amount and moving direction of the moving body 10 can be obtained from encoders attached to the respective drive wheels or motors. Since the method using odometry is well known, no further detailed explanation is necessary.
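  • As a hedged sketch of such an odometry update (a differential-drive model with one encoder per drive wheel is assumed; the wheel parameters and function name are illustrative), the pose can be advanced from encoder increments as follows:

```python
import math

def odometry_update(pose, d_left_m, d_right_m, wheel_base_m):
    """Advance a 2D pose (x, y, theta) from the distances travelled by the
    left and right drive wheels since the last update (differential drive)."""
    x, y, theta = pose
    d_center = (d_left_m + d_right_m) / 2.0           # forward displacement
    d_theta = (d_right_m - d_left_m) / wheel_base_m   # change in heading
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return (x, y, theta + d_theta)

# Example: both wheels advance 0.2 m during one scan period of 200 ms.
pose = odometry_update((0.0, 0.0, 0.0), 0.2, 0.2, wheel_base_m=0.5)
print(pose)  # (0.2, 0.0, 0.0)
```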
  • The second method is to predict the current position and posture based on the history of the estimated values of the position and posture of the moving body 10.
  • FIG. 9 is a diagram schematically showing the history of the positions and orientations of the moving body 10 obtained in the past by the position estimation system 115 of FIG. 1, and the predicted value of the current position and orientation.
  • The position and orientation history is stored in the memory 107 inside the position estimation system 115. Part or all of this history may instead be stored in a storage device outside the position estimation device 105, for example, the storage device 104 in FIG. 1.
  • FIG. 9 also shows the UV coordinate system, which is the local coordinate system (sensor coordinate system) of the moving body 10.
  • Scan data is expressed in this UV coordinate system.
  • The position of the moving body 10 on the environment map M is the coordinate value (xi, yi) of the origin of the UV coordinate system in the coordinate system of the environment map M.
  • The posture (orientation) of the moving body 10 is the orientation (θi) of the UV coordinate system with respect to the coordinate system of the environment map M. θi is “positive” in the counterclockwise direction.
  • The predicted value of the current position and orientation is calculated from the history of positions and orientations obtained in the past by the position estimation system.
  • Let the position and orientation of the moving body obtained by the previous matching be $(x_{i-1}, y_{i-1}, \theta_{i-1})$, and the position and orientation obtained by the matching before that be $(x_{i-2}, y_{i-2}, \theta_{i-2})$. Further, let the predicted value of the current position and orientation of the moving body be $(x_i, y_i, \theta_i)$. At this time, the following assumption is made:
  • the moving speed during the movement from the position $(x_{i-1}, y_{i-1})$ to the position $(x_i, y_i)$ is equal to the moving speed during the movement from the position $(x_{i-2}, y_{i-2})$ to the position $(x_{i-1}, y_{i-1})$.
  • When the approximation that the change in orientation is zero is applied, the matrix of the second term on the right-hand side of Equation 2 can be simplified to the identity matrix.
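  • The following is a minimal sketch of such a constant-velocity prediction under the assumption just stated (the displacement and heading change between the two previous estimates are simply repeated, which corresponds to the identity-matrix simplification mentioned above). It is not a reproduction of Equation 2 of the publication, and the function name is illustrative.

```python
def predict_pose(prev2, prev1):
    """Predict the current pose (x_i, y_i, theta_i) from the two previous
    pose estimates, assuming the displacement and heading change between
    the last two estimates are simply repeated (constant velocity)."""
    x2, y2, t2 = prev2   # (x_{i-2}, y_{i-2}, theta_{i-2})
    x1, y1, t1 = prev1   # (x_{i-1}, y_{i-1}, theta_{i-1})
    return (x1 + (x1 - x2),      # x_i
            y1 + (y1 - y2),      # y_i
            t1 + (t1 - t2))      # theta_i

# Example: the moving body advanced 0.2 m along x between the last two estimates.
print(predict_pose((0.0, 0.0, 0.0), (0.2, 0.0, 0.0)))  # (0.4, 0.0, 0.0)
```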
  • In step S10, the processor 106 of the position estimation system 115 acquires the latest scan data from the external sensor 102.
  • In step S12, the processor 106 acquires the current position and orientation values by odometry.
  • Alternatively, the current position and orientation values may be predicted as described with reference to FIG. 9.
  • In step S14, the processor 106 performs initial alignment of the latest scan data with respect to the environment map, using the current position and orientation values acquired from odometry as initial values.
  • In step S16, the processor 106 performs positional-shift correction by the ICP algorithm.
  • In step S18, the processor 106 generates a first estimated value of the position and orientation by offline SLAM.
  • In step S20, it is determined whether an event has occurred that causes the second estimated value of the position and orientation based on online SLAM to be output as the selected estimated value instead of the first estimated value based on offline SLAM. If No, the process proceeds to step S21, where the first estimated value is output as the selected estimated value; the process then returns to step S10 and the next scan data is acquired. If Yes, the process proceeds to step S22.
  • FIG. 11A is a diagram illustrating an example in which the change amount $\Delta P_t = P_t - P_{t-1}$ of the first estimated value obtained by the first position estimation process (offline SLAM) fluctuates. Here, $P_t$ is the first estimated value at the current time t, and $P_{t-1}$ is the first estimated value one time step before (for example, 200 milliseconds before); an estimation abnormality can be detected by monitoring their difference. For example, when the change amount $\Delta P_t$ exceeds the threshold value indicated by the broken line in FIG. 11A, the second estimated value obtained by the second position estimation process (online SLAM) can be selected as a more accurate estimated value instead of the first estimated value obtained by the first position estimation process (offline SLAM).
  • Alternatively, the second estimated value obtained by the second position estimation process (online SLAM) may not be selected immediately, but only when the change amount exceeds the threshold value a predetermined number of times in succession (for example, three times).
  • FIG. 11B is a diagram illustrating an example in which the difference between the first estimated value obtained by the first position estimating process (offline SLAM) and the measured value obtained by the sensor varies.
  • The measured value obtained by the sensor is, for example, a position and orientation value of the moving body measured by odometry using a rotary encoder or the like.
  • When this difference deviates from a predetermined range, the second estimated value obtained by the second position estimation process (online SLAM) can be selected as a more accurate estimated value instead of the first estimated value obtained by the first position estimation process (offline SLAM).
  • Alternatively, the second estimated value may not be selected immediately, but only when the difference deviates from the predetermined range a predetermined number of times in succession (for example, three times).
  • FIG. 11C is a diagram illustrating an example in which the reliability of the first estimated value by the first position estimation process (offline SLAM) and the reliability of the second estimated value by the second position estimation process (online SLAM) vary.
  • When the reliability of the first estimated value obtained by the first position estimation process (offline SLAM) falls below the reliability of the second estimated value obtained by the second position estimation process (online SLAM), the second estimated value can be selected as a more accurate estimated value.
  • Alternatively, the second estimated value may be selected only when the number of times that the reliability of the first estimated value is lower than the reliability of the second estimated value exceeds a predetermined number.
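  • A minimal sketch of this selection logic is given below; it combines the three triggers described above (change-amount threshold, deviation from odometry, and reliability comparison) with a consecutive-count condition. The thresholds, field names, and the requirement of three consecutive violations are illustrative assumptions, not values taken from the publication, and each pose is reduced to a single scalar summary for brevity.

```python
class EstimateSelector:
    """Select between the offline-SLAM (first) and online-SLAM (second) estimates."""

    def __init__(self, max_change=0.5, max_odometry_gap=0.3, required_hits=3):
        self.max_change = max_change              # threshold on |P_t - P_{t-1}|
        self.max_odometry_gap = max_odometry_gap  # allowed gap to the odometry value
        self.required_hits = required_hits        # consecutive violations before switching
        self.hits = 0

    def select(self, first, second, prev_first, odometry,
               first_reliability, second_reliability):
        violated = (abs(first - prev_first) > self.max_change
                    or abs(first - odometry) > self.max_odometry_gap
                    or first_reliability < second_reliability)
        self.hits = self.hits + 1 if violated else 0
        if self.hits >= self.required_hits:
            return second   # fall back to the online-SLAM estimate
        return first        # keep the offline-SLAM estimate

selector = EstimateSelector()
print(selector.select(first=1.0, second=1.02, prev_first=0.95,
                      odometry=1.01, first_reliability=0.9, second_reliability=0.8))
```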
  • In step S22, the processor 106 performs position and orientation estimation by online SLAM. Specifically, the process proceeds to step S40.
  • The online SLAM flow will be described later.
  • In step S32, the processor 106 searches for corresponding points between the two point clouds. Specifically, the processor 106 selects, for each point constituting the point cloud included in the scan data, a corresponding point on the environment map.
  • In step S34, the processor 106 performs a rigid body transformation (coordinate transformation) of rotation and translation of the scan data so as to reduce the distances between corresponding points of the scan data and the environment map. In other words, the parameters of the coordinate transformation matrix are optimized so as to reduce the total (sum of squares) of the errors between corresponding points. This optimization is performed by iterative calculation.
  • In step S36, the processor 106 determines whether or not the result of the iterative calculation has converged. Specifically, the processor 106 determines that the calculation has converged when the reduction in the sum of the (squared) errors between corresponding points falls below a predetermined value even if the parameters of the coordinate transformation matrix are changed. If the calculation has not converged, the process returns to step S32, and the processor 106 repeats the processing from the corresponding-point search. If it is determined in step S36 that the calculation has converged, the process proceeds to step S38.
  • In step S38, the processor 106 converts the coordinate values of the scan data from values in the sensor coordinate system to values in the coordinate system of the environment map, using the coordinate transformation matrix.
  • The coordinate values of the scan data obtained in this way can be used for updating the environmental map.
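  • The following is a compact sketch of one way to implement these steps (nearest-neighbor correspondence search, a closed-form SVD solution for the rotation and translation, and a convergence check on the error reduction), assuming NumPy. It illustrates the general ICP procedure, not the exact implementation of the embodiment.

```python
import numpy as np

def icp_2d(scan, map_points, max_iter=50, tol=1e-6):
    """Align 'scan' (N, 2) to 'map_points' (M, 2); returns R (2x2), t (2,), aligned scan."""
    src = scan.copy()
    R_total, t_total = np.eye(2), np.zeros(2)
    prev_err = np.inf
    for _ in range(max_iter):
        # Step S32: corresponding-point search (nearest map point for each scan point).
        d = np.linalg.norm(src[:, None, :] - map_points[None, :, :], axis=2)
        corr = map_points[np.argmin(d, axis=1)]
        # Step S34: best rigid transform for the current correspondences (SVD / Kabsch).
        mu_s, mu_m = src.mean(axis=0), corr.mean(axis=0)
        U, _, Vt = np.linalg.svd((src - mu_s).T @ (corr - mu_m))
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:            # avoid a reflection
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = mu_m - R @ mu_s
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        # Step S36: convergence check on the reduction of the squared error sum.
        err = np.sum((src - corr) ** 2)
        if prev_err - err < tol:
            break
        prev_err = err
    return R_total, t_total, src
```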
  • In step S40, the processor 106 of the position estimation system 115 acquires the latest scan data from the external sensor 102.
  • In step S42, the processor 106 acquires the current position and orientation values by odometry.
  • In step S44, the processor 106 performs initial alignment of the latest scan data with respect to the reference map, using the current position and orientation values acquired from odometry as initial values.
  • In step S46, the processor 106 performs positional-shift correction by the ICP algorithm.
  • In step S48, the processor 106 generates an estimated value (second estimated value) of the position and orientation of the moving body obtained as a result of the matching with the reference map.
  • This second estimated value is output as the selected estimated value instead of the first estimated value when a “Yes” determination is made in step S20 of FIG. 10.
  • In other words, the estimation of the position and orientation by online SLAM is executed continuously and the second estimated value is always generated; whether or not the second estimated value is adopted as the selected estimated value depends on the occurrence of the events described with reference to FIG. 11.
  • In step S50, it is determined whether or not the reference map satisfies an update (reset) condition.
  • The update conditions are, for example, (i) that the number of updates of the reference map has reached a predetermined number, (ii) that the data amount of the reference map has reached a predetermined amount, or (iii) that a predetermined length of time has elapsed since the previous reset. If No, the process returns to step S40 and the next scan data is acquired. If Yes, the process proceeds to step S52.
  • In step S52, the processor 106 deletes, from the reference map that has been updated a plurality of times, the part other than the part including the latest scan data, and resets the reference map. In this way, the number and density of the points in the point cloud constituting the reference map can be reduced.
  • In step S20 of FIG. 10, it is also determined whether there is still a need to perform online SLAM. Therefore, when online SLAM is no longer needed, the process quickly returns from online SLAM to offline SLAM.
  • The position estimation system according to the present disclosure can be applied to various moving bodies that are moved by various drive devices.
  • The position estimation system according to the present disclosure does not have to be used while mounted on a moving body including a drive device. For example, it may be placed on a handcart pushed by a user and used for map creation.
  • In the following, an automated guided vehicle is taken as an example of the moving body.
  • Hereinafter, the abbreviation “AGV” (Automatic Guided Vehicle) is used to refer to the automated guided vehicle.
  • The reference symbol “10” is also attached to the AGV, in the same way as to the moving body 10.
  • FIG. 14 shows a basic configuration example of an exemplary mobile management system 100 according to the present disclosure.
  • The mobile management system 100 includes at least one AGV 10 and an operation management device 50 that manages the operation of the AGV 10.
  • FIG. 14 also shows a terminal device 20 operated by the user 1.
  • The AGV 10 is an automated guided vehicle capable of “guideless” traveling, which does not require a guide such as magnetic tape for traveling.
  • The AGV 10 can perform self-position estimation and transmit the estimation result to the terminal device 20 and the operation management device 50.
  • The AGV 10 can travel automatically in the environment S according to commands from the operation management device 50.
  • The operation management device 50 is a computer system that tracks the position of each AGV 10 and manages the traveling of each AGV 10.
  • The operation management device 50 may be a desktop PC, a notebook PC, and/or a server computer.
  • The operation management device 50 communicates with each AGV 10 via a plurality of access points 2. For example, the operation management device 50 transmits to each AGV 10 the coordinates of the position to which that AGV 10 should go next.
  • Each AGV 10 periodically, for example every 250 milliseconds, transmits data indicating its position and orientation to the operation management device 50.
  • When the AGV 10 reaches the designated position, the operation management device 50 transmits the coordinates of the next position to be headed for.
  • The AGV 10 can also travel in the environment S according to operations by the user 1 input to the terminal device 20.
  • An example of the terminal device 20 is a tablet computer.
  • FIG. 15 shows an example of an environment S in which three AGVs 10a, 10b, and 10c exist. It is assumed that all the AGVs are traveling in the depth direction in the figure. The AGVs 10a and 10b are transporting loads placed on their top plates. The AGV 10c travels following the AGV 10b ahead of it. For convenience of explanation, reference numerals 10a, 10b, and 10c are used in FIG. 15, but these vehicles are referred to as “AGV 10” below.
  • The AGV 10 can also transport a load using a traction cart connected to itself.
  • FIG. 16 shows the AGV 10 and the traction cart 5 before being connected.
  • A caster is provided on each foot of the traction cart 5.
  • The AGV 10 is mechanically connected to the traction cart 5.
  • FIG. 17 shows the AGV 10 and the traction cart 5 connected to each other.
  • The connection method between the AGV 10 and the traction cart 5 is arbitrary.
  • In this example, a plate 6 is fixed to the top plate of the AGV 10.
  • The traction cart 5 is provided with a guide 7 having a slit.
  • The AGV 10 approaches the traction cart 5 and inserts the plate 6 into the slit of the guide 7.
  • The AGV 10 then passes an electromagnetic lock pin (not shown) through the plate 6 and the guide 7 and engages the electromagnetic lock. Thereby, the AGV 10 and the traction cart 5 are physically connected.
  • Each AGV 10 and the terminal device 20 can be connected, for example, on a one-to-one basis, and can perform communication based on the Bluetooth (registered trademark) standard.
  • Each AGV 10 and the terminal device 20 can perform communication based on Wi-Fi (registered trademark) using one or a plurality of access points 2.
  • The plurality of access points 2 are connected to each other via, for example, a switching hub 3.
  • FIG. 14 shows two access points 2a and 2b.
  • The AGV 10 is wirelessly connected to the access point 2a.
  • The terminal device 20 is wirelessly connected to the access point 2b.
  • Data transmitted by the AGV 10 is received by the access point 2a, transferred to the access point 2b via the switching hub 3, and then transmitted from the access point 2b to the terminal device 20.
  • Data transmitted by the terminal device 20 is received by the access point 2b, transferred to the access point 2a via the switching hub 3, and then transmitted from the access point 2a to the AGV 10.
  • In this way, bidirectional communication between the AGV 10 and the terminal device 20 is realized.
  • The plurality of access points 2 are also connected to the operation management device 50 via the switching hub 3, so that bidirectional communication is also realized between the operation management device 50 and each AGV 10.
  • A map of the environment S is created in advance so that the AGV 10 can travel while estimating its own position by offline SLAM.
  • The AGV 10 is equipped with a position estimation device and an LRF, and an environmental map can be created using the output of the LRF.
  • The AGV 10 transitions to a data acquisition mode by a user operation.
  • In the data acquisition mode, the AGV 10 starts acquiring sensor data (scan data) using the LRF. The subsequent processing is as described above.
  • The movement in the environment S for acquiring the sensor data can be realized by the AGV 10 traveling according to the user's operation.
  • For example, the AGV 10 wirelessly receives from the user, via the terminal device 20, travel commands instructing movement in the forward, backward, left, and right directions.
  • The AGV 10 travels forward, backward, left, and right in the environment S according to the travel commands and creates a map.
  • Alternatively, a map may be created by traveling forward, backward, left, and right in the environment S according to control signals from a steering device.
  • Sensor data may also be acquired by a person walking while pushing a measurement carriage equipped with an LRF.
  • The number of AGVs may be one.
  • The user 1 can use the terminal device 20 to select one AGV 10 from among a plurality of registered AGVs and cause it to create a map of the environment S.
  • FIG. 18 is an external view of an exemplary AGV 10 according to the present embodiment.
  • The AGV 10 includes two drive wheels 11a and 11b, four casters 11c, 11d, 11e, and 11f, a frame 12, a transport table 13, a travel control device 14, and an LRF 15.
  • The two drive wheels 11a and 11b are provided on the right side and the left side of the AGV 10, respectively.
  • The four casters 11c, 11d, 11e, and 11f are arranged at the four corners of the AGV 10.
  • The AGV 10 also has a plurality of motors connected to the two drive wheels 11a and 11b, but these motors are not shown in FIG. 18.
  • The travel control device 14 is a device that controls the operation of the AGV 10, and mainly includes an integrated circuit including a microcomputer (described later), electronic components, and a board on which they are mounted.
  • The travel control device 14 performs the data transmission/reception with the terminal device 20 and the preprocessing calculations described above.
  • The LRF 15 is an optical device that measures the distance to a reflection point by, for example, emitting an infrared laser beam 15a and detecting the reflected light of the laser beam 15a.
  • In this embodiment, the LRF 15 of the AGV 10 emits a pulsed laser beam 15a while changing its direction every 0.25 degrees over a space of 135 degrees to the left and right (270 degrees in total) with respect to the front of the AGV 10, and detects the reflected light of each laser beam 15a. Thereby, data on the distance to the reflection point can be obtained for the directions determined by the angles corresponding to a total of 1081 steps of 0.25 degrees each.
  • In this embodiment, the scanning of the surrounding space performed by the LRF 15 is substantially parallel to the floor surface and is planar (two-dimensional). However, the LRF 15 may also scan in the height direction.
  • The AGV 10 can create a map of the environment S based on the position and posture (orientation) of the AGV 10 and the scan results of the LRF 15.
  • The map can reflect the arrangement of walls, pillars, and other structures around the AGV, as well as objects placed on the floor.
  • The map data is stored in a storage device provided in the AGV 10.
  • The position and posture of the AGV 10, that is, the pose (x, y, θ), may be simply referred to as the “position” below.
  • The travel control device 14 compares the measurement results of the LRF 15 with the map data it holds, and estimates its current position.
  • The map data may also be map data created by another AGV 10.
  • FIG. 19A shows a first hardware configuration example of the AGV 10.
  • FIG. 19A also shows a specific configuration of the travel control device 14.
  • The AGV 10 includes the travel control device 14, the LRF 15, two motors 16a and 16b, a drive device 17, and the wheels 11a and 11b.
  • The travel control device 14 includes a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a position estimation device 14e.
  • The microcomputer 14a, the memory 14b, the storage device 14c, the communication circuit 14d, and the position estimation device 14e are connected by a communication bus 14f and can exchange data with one another.
  • The LRF 15 is also connected to the communication bus 14f via a communication interface (not shown), and transmits measurement data, which is its measurement result, to the microcomputer 14a, the position estimation device 14e, and/or the memory 14b.
  • The microcomputer 14a is a processor or control circuit (computer) that performs calculations for controlling the entire AGV 10, including the travel control device 14.
  • Typically, the microcomputer 14a is a semiconductor integrated circuit.
  • The microcomputer 14a transmits a PWM (Pulse Width Modulation) signal, which is a control signal, to the drive device 17 to control the drive device 17 and adjust the voltage applied to the motors.
  • One or more control circuits (for example, microcomputers) other than the microcomputer 14a may also be provided for driving the motors; for example, the motor drive device 17 may include two microcomputers that control the driving of the motors 16a and 16b, respectively.
  • The memory 14b is a volatile storage device that stores the computer program executed by the microcomputer 14a.
  • The memory 14b can also be used as a work memory when the microcomputer 14a and the position estimation device 14e perform calculations.
  • The storage device 14c is a nonvolatile semiconductor memory device.
  • However, the storage device 14c may instead be a magnetic recording medium typified by a hard disk, or an optical recording medium typified by an optical disc.
  • The storage device 14c may also include a head device for writing and/or reading data on either recording medium, and a control device for the head device.
  • The storage device 14c stores the environment map M of the traveling environment S and the data (travel route data) R of one or more travel routes.
  • The environment map M is created by the AGV 10 operating in the map creation mode and is stored in the storage device 14c.
  • The travel route data R is transmitted from the outside after the environment map M has been created.
  • In this example, the environment map M and the travel route data R are stored in the same storage device 14c, but they may be stored in different storage devices.
  • the AGV 10 receives travel route data R indicating a travel route from the tablet computer.
  • the travel route data R at this time includes marker data indicating the positions of a plurality of markers. “Marker” indicates the passing position (route point) of the traveling AGV 10.
  • the travel route data R includes at least position information of a start marker indicating a travel start position and an end marker indicating a travel end position.
  • the travel route data R may further include position information of one or more intermediate waypoint markers. When the travel route includes one or more intermediate waypoints, the travel route is defined as the route from the start marker to the end marker that passes through those waypoints in order.
  • the data of each marker may include data on the direction (angle) and traveling speed of the AGV 10 until moving to the next marker, in addition to the coordinate data of the marker.
  • the data of each marker may further include the acceleration time required to reach the traveling speed, and/or the deceleration time required to decelerate from the traveling speed and stop at the position of the next marker (an illustrative sketch of such route data follows below).
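  • As a non-limiting illustration, travel route data R of this kind could be organized as in the following Python sketch; the class names, field names, and example values are assumptions introduced only for illustration and are not taken from the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Marker:
    """One passing position (route point) of the AGV on the environment map."""
    x: float                            # marker coordinates [m]
    y: float
    theta: Optional[float] = None       # direction (angle) toward the next marker [rad]
    speed: Optional[float] = None       # traveling speed until the next marker [m/s]
    accel_time: Optional[float] = None  # time to accelerate up to `speed` [s]
    decel_time: Optional[float] = None  # time to decelerate and stop at the next marker [s]

@dataclass
class TravelRoute:
    """Travel route data R: a start marker, optional waypoints, and an end marker."""
    start: Marker
    end: Marker
    waypoints: List[Marker] = field(default_factory=list)

    def markers_in_order(self) -> List[Marker]:
        # The route passes through the intermediate waypoints in order.
        return [self.start, *self.waypoints, self.end]

# Example: a route with one intermediate waypoint.
route = TravelRoute(
    start=Marker(x=0.0, y=0.0, theta=0.0, speed=0.5, accel_time=1.0),
    end=Marker(x=10.0, y=4.0, decel_time=1.5),
    waypoints=[Marker(x=5.0, y=0.0, theta=0.0, speed=0.8)],
)
print([(m.x, m.y) for m in route.markers_in_order()])
```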
  • the operation management device 50 may control the movement of the AGV 10 instead of the terminal device 20. In that case, the operation management device 50 may instruct the AGV 10 to move to the next marker every time the AGV 10 reaches a marker. For example, the AGV 10 receives, from the operation management device 50, coordinate data of the next target position, or the distance to that target position and the angle in which to travel, as travel route data R indicating a travel route.
  • the AGV 10 can travel along the stored travel route while estimating its own position using the created map and the sensor data output by the LRF 15 acquired during travel.
  • the details of the operation at this point are as described above. According to the embodiment of the present disclosure, even when a part of the environmental map prepared in advance does not reflect the actual environment, the self-position can be continuously estimated by quickly switching to online SLAM; a minimal sketch of this selection logic is shown below.
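  • The following minimal Python sketch pictures that selection; the reliability threshold and the signatures of the two matching functions are assumptions made only for illustration, not the procedure actually implemented by the position estimation device.

```python
RELIABILITY_THRESHOLD = 0.5  # assumed value; the disclosure does not fix a threshold

def select_pose(scan, environment_map, reference_map, match_to_map, slam_update):
    """Return the pose estimate to use for travel control.

    match_to_map(scan, a_map) -> (pose, reliability): matching of scan data
    against a map, as in process (A).
    slam_update(scan, reference_map) -> pose: an online-SLAM step that extends
    the locally built reference map from the scan itself, as in process (B).
    """
    first_estimate, reliability = match_to_map(scan, environment_map)
    if reliability >= RELIABILITY_THRESHOLD:
        # The prepared environment map still reflects the surroundings well.
        return first_estimate
    # Part of the environment map no longer matches reality: fall back to the
    # second estimate obtained against the locally built reference map.
    return slam_update(scan, reference_map)
```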
  • the communication circuit 14d is a wireless communication circuit that performs wireless communication conforming to, for example, the Bluetooth (registered trademark) and/or Wi-Fi (registered trademark) standards. Both standards include wireless communication using frequencies in the 2.4 GHz band. For example, in the mode in which the AGV 10 is run to create a map, the communication circuit 14d performs wireless communication based on the Bluetooth (registered trademark) standard and communicates with the terminal device 20 one-on-one.
  • the position estimation device 14e performs map creation processing and self-position estimation processing during traveling.
  • the position estimation device 14e creates a map of the environment S based on the position and orientation of the AGV 10 and the LRF scan result.
  • the position estimation device 14e receives sensor data from the LRF 15 and reads the environmental map M stored in the storage device 14c.
  • the local map data (sensor data) created from the scan result of the LRF 15 is matched against a wider range of the environment map M to identify the self-position (x, y, θ) on the environment map M.
  • the position estimation device 14e generates “reliability” data representing the degree to which the local map data matches the environmental map M.
  • the self-position (x, y, θ) and the reliability data can each be transmitted from the AGV 10 to the terminal device 20 or the operation management device 50.
  • the terminal device 20 or the operation management device 50 can receive the self-position (x, y, θ) and reliability data and display them on a built-in or connected display device (an illustrative matching sketch is given below).
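  • The following self-contained Python sketch shows one way such matching and a “reliability” score could be computed against an occupancy-grid map; the brute-force search window, the assumed 5 cm grid resolution, and the hit-ratio score are simplifications and are not the matching algorithm of the position estimation device 14e.

```python
import numpy as np

def world_to_grid(points_xy, resolution=0.05):
    """Convert (N, 2) metric points to integer grid indices (assumed 5 cm cells)."""
    return np.floor(points_xy / resolution).astype(int)

def match_score(scan_xy, grid, pose, resolution=0.05):
    """Fraction of scan points that fall on occupied cells after applying `pose`."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    rotation = np.array([[c, -s], [s, c]])
    world = scan_xy @ rotation.T + np.array([x, y])   # sensor frame -> map frame
    idx = world_to_grid(world, resolution)
    height, width = grid.shape
    inside = (idx[:, 0] >= 0) & (idx[:, 0] < width) & (idx[:, 1] >= 0) & (idx[:, 1] < height)
    hits = grid[idx[inside, 1], idx[inside, 0]] > 0    # grid is indexed [row=y, col=x]
    return float(hits.mean()) if hits.size else 0.0

def estimate_pose(scan_xy, grid, initial_pose, resolution=0.05):
    """Brute-force search around `initial_pose`; returns (best_pose, reliability)."""
    best_pose, best_score = initial_pose, -1.0
    for dx in np.linspace(-0.2, 0.2, 5):
        for dy in np.linspace(-0.2, 0.2, 5):
            for dtheta in np.linspace(-0.1, 0.1, 5):
                candidate = (initial_pose[0] + dx, initial_pose[1] + dy, initial_pose[2] + dtheta)
                score = match_score(scan_xy, grid, candidate, resolution)
                if score > best_score:
                    best_pose, best_score = candidate, score
    return best_pose, best_score  # best_score plays the role of the "reliability" data
```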
  • the microcomputer 14a and the position estimation device 14e are shown as separate components, but this is only an example; a single chip circuit or semiconductor integrated circuit capable of performing the operations of both may be used instead.
  • FIG. 19A shows a chip circuit 14g including the microcomputer 14a and the position estimation device 14e. In the following, the example in which the microcomputer 14a and the position estimation device 14e are provided separately is described.
  • the two motors 16a and 16b are attached to the two wheels 11a and 11b, respectively, and rotate each wheel. That is, the two wheels 11a and 11b are both drive wheels.
  • in the description, the motor 16a and the motor 16b drive the right wheel and the left wheel of the AGV 10, respectively.
  • the moving body 10 may further include a rotary encoder that measures the rotational positions or rotational speeds of the wheels 11a and 11b.
  • the microcomputer 14a may estimate the position and orientation of the moving body 10 using not only the signal received from the position estimation device 14e but also a signal received from the rotary encoder (a generic odometry sketch is given below).
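  • One generic way to use such encoder signals is differential-drive odometry, sketched below in Python; the function name, wheel-base value, and midpoint approximation are illustrative assumptions, not a procedure taken from the disclosure.

```python
import math

def odometry_update(pose, d_left, d_right, wheel_base):
    """Advance (x, y, theta) by the distances traveled by the left and right wheels.

    d_left, d_right: wheel travel since the last update [m], e.g. from the
    rotary encoders; wheel_base: distance between the two drive wheels [m].
    """
    x, y, theta = pose
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / wheel_base
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta + math.pi) % (2.0 * math.pi) - math.pi  # wrap to [-pi, pi)
    return (x, y, theta)

# Example: both wheels roll 0.10 m with a 0.40 m wheel base -> straight motion.
print(odometry_update((0.0, 0.0, 0.0), 0.10, 0.10, 0.40))
```

Such an odometry estimate could, for example, serve as the initial pose handed to a map-matching step like the one sketched earlier.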
  • the drive device 17 has motor drive circuits 17a and 17b for adjusting the voltage applied to each of the two motors 16a and 16b.
  • Each of motor drive circuits 17a and 17b includes a so-called inverter circuit.
  • the motor drive circuits 17a and 17b turn the current flowing through each motor on or off according to a PWM signal transmitted from the microcomputer 14a or from the microcomputer in the motor drive circuit 17a, thereby adjusting the voltage applied to the motor (see the illustrative sketch below).
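  • The relation between a desired average motor voltage and the PWM duty can be illustrated as follows; the 24 V supply voltage, the timer resolution, and the function names are values assumed only for this sketch.

```python
def voltage_to_duty(target_voltage, supply_voltage=24.0):
    """Convert a desired average motor voltage into a PWM duty ratio (0.0-1.0)."""
    duty = target_voltage / supply_voltage
    return max(0.0, min(1.0, duty))

def duty_to_compare_value(duty, timer_top=1000):
    """Convert a duty ratio into the compare value of a hypothetical PWM timer."""
    return int(round(duty * timer_top))

# Example: ask a wheel motor for about 9.6 V from a 24 V supply (40 % duty).
duty = voltage_to_duty(9.6)
print(duty, duty_to_compare_value(duty))  # 0.4 400
```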
  • FIG. 19B shows a second hardware configuration example of the AGV 10.
  • the second hardware configuration example differs from the first hardware configuration example (FIG. 19A) in that it has a laser positioning system 14h and in that the microcomputer 14a is connected to each component in a one-to-one relationship.
  • the laser positioning system 14h includes a position estimation device 14e and an LRF 15.
  • the position estimation device 14e and the LRF 15 are connected by, for example, an Ethernet (registered trademark) cable. Each operation of the position estimation device 14e and the LRF 15 is as described above.
  • the laser positioning system 14h outputs information indicating the pose (x, y, θ) of the AGV 10 to the microcomputer 14a.
  • the microcomputer 14a has various general purpose I / O interfaces or general purpose input / output ports (not shown).
  • the microcomputer 14a is directly connected to other components in the travel control device 14 such as the communication circuit 14d and the laser positioning system 14h via the general-purpose input / output port.
  • the AGV 10 in the embodiments of the present disclosure may be provided with safety sensors, such as an obstacle detection sensor and a bumper switch (not illustrated).
  • FIG. 20 shows a hardware configuration example of the operation management device 50.
  • the operation management device 50 includes a CPU 51, a memory 52, a position database (position DB) 53, a communication circuit 54, a map database (map DB) 55, and an image processing circuit 56.
  • the CPU 51, the memory 52, the position DB 53, the communication circuit 54, the map DB 55, and the image processing circuit 56 are connected by a communication bus 57, and can exchange data with each other.
  • the CPU 51 is a signal processing circuit (computer) that controls the operation of the operation management device 50.
  • the CPU 51 is a semiconductor integrated circuit.
  • the memory 52 is a volatile storage device that stores a computer program executed by the CPU 51.
  • the memory 52 can also be used as a work memory when the CPU 51 performs calculations.
  • the position DB 53 stores position data indicating each position that can be a destination of each AGV 10.
  • the position data can be represented by coordinates virtually set in the factory by an administrator, for example.
  • the position data is determined by the administrator (for example, as in the illustrative sketch below).
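  • For example, the position data held in the position DB 53 might look like the following sketch; the destination names and coordinates are hypothetical values introduced only for illustration.

```python
# Hypothetical position DB: destination name -> (x, y) coordinates [m] virtually
# set in the factory by the administrator.
POSITION_DB = {
    "loading_dock_1": (2.0, 3.5),
    "assembly_line_A": (15.0, 7.25),
    "charging_station": (0.5, 12.0),
}

def destination_coordinates(name: str) -> tuple:
    """Look up the coordinates of a destination that an AGV can be sent to."""
    return POSITION_DB[name]

print(destination_coordinates("assembly_line_A"))  # (15.0, 7.25)
```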
  • the communication circuit 54 performs wired communication based on, for example, the Ethernet (registered trademark) standard.
  • the communication circuit 54 is connected to the access point 2 (FIG. 14) by wire, and can communicate with the AGV 10 via the access point 2.
  • the communication circuit 54 receives data to be transmitted to the AGV 10 from the CPU 51 via the bus 57.
  • the communication circuit 54 transmits the data (notification) received from the AGV 10 to the CPU 51 and/or the memory 52 via the bus 57.
  • the map DB 55 stores internal map data of a factory or the like where the AGV 10 travels. As long as the map has a one-to-one correspondence with the position of each AGV 10, the format of the data is not limited.
  • the map stored in the map DB 55 may be a map created by CAD.
  • the position DB 53 and the map DB 55 may be constructed on a nonvolatile semiconductor memory, or may be constructed on a magnetic recording medium represented by a hard disk or an optical recording medium represented by an optical disk.
  • the image processing circuit 56 is a circuit that generates video data to be displayed on the monitor 58.
  • the image processing circuit 56 operates only when the administrator operates the operation management device 50; further detailed explanation is omitted in the present embodiment.
  • the monitor 58 may be integrated with the operation management device 50. Further, the CPU 51 may perform the processing of the image processing circuit 56.
  • an AGV that travels in a two-dimensional space is taken as an example.
  • the present disclosure can also be applied to a moving object that moves in a three-dimensional space, such as a flying object (drone).
  • the 2D space can be expanded to a 3D space.
  • the comprehensive or specific aspect described above may be realized by a system, a method, an integrated circuit, a computer program, or a recording medium.
  • the present invention may be realized by any combination of a system, an apparatus, a method, an integrated circuit, a computer program, and a recording medium.
  • the mobile body of the present disclosure can be suitably used for moving and transporting goods such as luggage, parts, and finished products in factories, warehouses, construction sites, logistics, hospitals, and the like.

Abstract

The problem addressed by the present invention is to provide a self-estimation system capable of estimating its own position even when a part of a map does not reflect the actual environment. To solve this, the invention provides a position estimation system (115) for a moving body that is used while connected to an external sensor that repeatedly scans an environment and outputs sensor data for each scan. The position estimation system (115) comprises a processor (106), a first memory (104) for storing an environment map prepared in advance, and a second memory (107) for storing a computer program that operates the processor. The processor performs (A) a first position estimation process of generating a first estimated value of the position and orientation of the moving body based on the result of matching the environment map against the sensor data, (B) a second position estimation process of generating a reference map of the surrounding area from the sensor data while generating a second estimated value of the position and orientation of the moving body based on the result of matching the reference map against the sensor data, and (C) a process of selecting either the first estimated value or the second estimated value and outputting the selected value as the estimated value of the position and orientation of the moving body.
PCT/JP2019/013741 2018-04-02 2019-03-28 Système d'estimation de position, corps mobile comprenant ledit système d'estimation de position, et programme informatique WO2019194079A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201980022370.6A CN111971633B (zh) 2018-04-02 2019-03-28 位置推断系统、具有该位置推断系统的移动体以及记录介质

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018070527 2018-04-02
JP2018-070527 2018-04-02

Publications (1)

Publication Number Publication Date
WO2019194079A1 true WO2019194079A1 (fr) 2019-10-10

Family

ID=68100670

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/013741 WO2019194079A1 (fr) 2018-04-02 2019-03-28 Système d'estimation de position, corps mobile comprenant ledit système d'estimation de position, et programme informatique

Country Status (2)

Country Link
CN (1) CN111971633B (fr)
WO (1) WO2019194079A1 (fr)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009148118A1 (fr) * 2008-06-04 2009-12-10 株式会社日立製作所 Dispositif de navigation, procédé de navigation et système de navigation
JP5245139B2 (ja) * 2008-09-29 2013-07-24 鹿島建設株式会社 移動体の誘導システム及び誘導方法
CN105953798B (zh) * 2016-04-19 2018-09-18 深圳市神州云海智能科技有限公司 移动机器人的位姿确定方法和设备
CN107167148A (zh) * 2017-05-24 2017-09-15 安科机器人有限公司 同步定位与地图构建方法和设备

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010277548A (ja) * 2009-06-01 2010-12-09 Hitachi Ltd ロボット管理システム、ロボット管理端末、ロボット管理方法およびプログラム
JP2017045447A (ja) * 2015-08-28 2017-03-02 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America 地図生成方法、自己位置推定方法、ロボットシステム、およびロボット
JP2017146893A (ja) * 2016-02-19 2017-08-24 トヨタ自動車株式会社 自己位置推定方法

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210293973A1 (en) * 2020-03-20 2021-09-23 Abb Schweiz Ag Position estimation for vehicles based on virtual sensor response
CN113494912A (zh) * 2020-03-20 2021-10-12 Abb瑞士股份有限公司 基于虚拟传感器响应的交通工具的位置估计
US11953613B2 (en) * 2020-03-20 2024-04-09 Abb Schweiz Ag Position estimation for vehicles based on virtual sensor response
JP7424438B1 (ja) 2022-09-22 2024-01-30 いすゞ自動車株式会社 車両位置推定装置
CN116466382A (zh) * 2023-04-24 2023-07-21 贵州一招信息技术有限公司 一种基于gps的高精度实时定位系统

Also Published As

Publication number Publication date
CN111971633A (zh) 2020-11-20
CN111971633B (zh) 2023-10-20

Similar Documents

Publication Publication Date Title
JP6816830B2 (ja) 位置推定システム、および当該位置推定システムを備える移動体
JP6825712B2 (ja) 移動体、位置推定装置、およびコンピュータプログラム
TWI665538B (zh) 進行障礙物之迴避動作的移動體及記錄其之電腦程式的記錄媒體
US20200110410A1 (en) Device and method for processing map data used for self-position estimation, mobile body, and control system for mobile body
JP2019168942A (ja) 移動体、管理装置および移動体システム
JPWO2019026761A1 (ja) 移動体およびコンピュータプログラム
JP7111424B2 (ja) 移動体、位置推定装置、およびコンピュータプログラム
JP7136426B2 (ja) 管理装置および移動体システム
WO2019054208A1 (fr) Corps mobile et système de corps mobile
WO2019054209A1 (fr) Système et dispositif de création de carte
JP2019053391A (ja) 移動体
WO2019194079A1 (fr) Système d'estimation de position, corps mobile comprenant ledit système d'estimation de position, et programme informatique
JP2019175137A (ja) 移動体および移動体システム
JP2019175136A (ja) 移動体
JP2019179497A (ja) 移動体および移動体システム
JP2019067001A (ja) 移動体
CN112578789A (zh) 移动体
JP2020166702A (ja) 移動体システム、地図作成システム、経路作成プログラムおよび地図作成プログラム
JP2019148871A (ja) 移動体および移動体システム
JPWO2019059299A1 (ja) 運行管理装置
JP2020166701A (ja) 移動体およびコンピュータプログラム
JP2019175138A (ja) 移動体および管理装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19782084

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19782084

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP