WO2019059299A1 - Operation management device - Google Patents

Operation management device

Info

Publication number
WO2019059299A1
Authority: WO (WIPO, PCT)
Prior art keywords: data, moving body, AGV, operation management, individual difference
Application number: PCT/JP2018/034878
Other languages: English (en), Japanese (ja)
Inventor: 和典 島村
Original Assignee: 日本電産シンポ株式会社
Application filed by 日本電産シンポ株式会社
Priority to JP2019543706A (JPWO2019059299A1)
Publication of WO2019059299A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions

Definitions

  • The present disclosure relates to an operation management device.
  • Autonomous mobile robots have been developed that move autonomously through a space along a predetermined route.
  • Such a robot senses the surrounding space using an external sensor such as a laser distance sensor, matches the sensing result against a map prepared in advance, and estimates (identifies) its current position and posture.
  • By checking its current position and attitude in this way, the autonomous mobile robot can move along the route.
  • Japanese Patent Application Laid-Open No. 2011-150443 discloses a technique in which a first robot calculates movement parameters using the recognition result of the position and orientation of a second robot and uses those parameters for its own movement. It is described that this realizes a robot capable of accurate position estimation at low cost and by a simple method.
  • The present disclosure provides a technique for moving each of a plurality of moving bodies through a space as accurately as possible.
  • An exemplary first operation management apparatus manages the operation of a plurality of mobile bodies that can travel autonomously, including a first mobile body and a second mobile body. Each of the first and second mobile bodies includes a memory storing the same map data, a sensor that senses the surrounding space and outputs sensor data, a positioning device that collates the sensor data with the map data and outputs pose data indicating a position and a posture, and a communication circuit for communicating with the outside.
  • The operation management device includes a communication interface device that communicates with the first and second mobile bodies. It receives the pose data that each mobile body sensed and output at the same position and in the same posture in the space, calculates individual difference data, which is the difference between the pose data received from the first mobile body and the pose data received from the second mobile body, and stores the calculated individual difference data in a storage device.
  • An exemplary second operation management apparatus likewise manages the operation of a plurality of mobile bodies that can travel autonomously, including a first mobile body and a second mobile body. Each of the first and second mobile bodies includes a memory storing the same map data, a sensor that senses the surrounding space and outputs sensor data, a positioning device that collates the sensor data with the map data and outputs pose data indicating a position and a posture, and a communication circuit for communicating with the outside.
  • The operation management device includes a communication interface device that communicates with the first and second mobile bodies, a storage device storing individual difference data of each of the first and second mobile bodies, and an arithmetic circuit.
  • The individual difference data is data created in advance based on the pose data that the first and second mobile bodies each sensed and output at the same position and in the same posture in the space; the individual difference data of the second mobile body is the difference between the pose data received from the first mobile body and the pose data received from the second mobile body.
  • The arithmetic circuit stores, in the storage device as the coordinate values of the second position, data obtained based on the position data included in the second pose data and the individual difference data of the second moving body.
  • With these configurations, individual difference data representing the individual differences of moving bodies is calculated and stored in the storage device.
  • The individual difference data is calculated as the difference between the pose data output by the first moving body and the second moving body after sensing at the same position and in the same posture (orientation) in the space.
  • The obtained individual difference data can be used for various purposes. For example, a movement-destination command can be transmitted that takes the individual difference of each moving body into account. In addition, when the influence of a moving body's individual difference is removed from the pose data of a position acquired by that moving body, an accurate position not including the influence of the individual difference can be obtained.
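As a minimal sketch of this idea (the names and the (x, y, theta) pose representation are illustrative assumptions, not the patent's implementation), the individual difference can be taken as the component-wise difference of two pose estimates made at the same physical spot, and later subtracted out again:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Estimated pose: position (x, y) and posture angle theta (degrees)."""
    x: float
    y: float
    theta: float

def individual_difference(reference: Pose, other: Pose) -> Pose:
    """Difference between another body's pose estimate and the reference
    body's estimate, both sensed at the same position and posture."""
    return Pose(other.x - reference.x,
                other.y - reference.y,
                other.theta - reference.theta)

def remove_difference(pose: Pose, diff: Pose) -> Pose:
    """Remove the influence of the individual difference from a pose."""
    return Pose(pose.x - diff.x, pose.y - diff.y, pose.theta - diff.theta)
```

Subtracting the stored difference from a body's reported pose yields a pose free of that body's individual offset, which is the second use mentioned above.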
  • FIG. 1 is a block diagram showing an outline of processing performed in a mobile management system in an exemplary embodiment of the present disclosure.
  • FIG. 2 is a diagram showing an outline of a control system that controls traveling of each AGV according to the present disclosure.
  • FIG. 3 is a view showing an example of a moving space S in which an AGV is present.
  • FIG. 4A shows the AGV and tow truck before being connected.
  • FIG. 4B shows the connected AGV and tow truck.
  • FIG. 5 is an external view of an exemplary AGV according to the present embodiment.
  • FIG. 6A is a diagram illustrating an example of a first hardware configuration of an AGV.
  • FIG. 6B is a diagram showing an example of a second hardware configuration of an AGV.
  • FIG. 7A shows an AGV that generates a map while moving.
  • FIG. 7B is a diagram showing an AGV that generates a map while moving.
  • FIG. 7C is a diagram showing an AGV that generates a map while moving.
  • FIG. 7D is a diagram showing an AGV that generates a map while moving.
  • FIG. 7E is a diagram showing an AGV that generates a map while moving.
  • FIG. 7F is a view schematically showing a part of the completed map.
  • FIG. 8 is a diagram showing an example in which a map of one floor is configured by a plurality of partial maps.
  • FIG. 9 is a diagram showing an example of a hardware configuration of the operation management device.
  • FIG. 10 is a diagram schematically showing an example of the AGV movement route determined by the operation management device.
  • FIG. 11 is a plan layout view schematically showing an example of an environment for acquiring individual difference data.
  • FIG. 12A is a view showing a metal jig fixed to a floor surface.
  • FIG. 12B is a view showing an AGV whose position and posture are fixed by a jig.
  • FIG. 13 is a diagram showing an example of scan data acquired by the laser range finder of the AGV.
  • FIG. 14 is a diagram showing an example of pose data output by each position estimation device of AGV.
  • FIG. 15 is a view showing each pose data of the AGV in the station and individual difference data.
  • FIG. 16 is a flow chart showing a procedure of acquisition processing of individual difference data of AGV.
  • FIG. 17 is a diagram showing the individual difference data group stored in the individual difference DB.
  • FIG. 18 is a flowchart showing a procedure of correction processing of coordinate values and posture angle values using individual difference data.
  • FIG. 19 shows an AGV present in the station.
  • FIG. 20 is a diagram for explaining the outline of the processing for removing the influence of individual differences and registering station position data.
  • FIG. 21 is a flowchart showing a procedure of position data registration processing of a station using individual difference data.
  • An "unmanned transport vehicle" means a trackless vehicle on whose main body a load is placed manually or automatically, which travels automatically to a designated location, and which is unloaded manually or automatically.
  • The "unmanned transport vehicle" includes unmanned tow vehicles and unmanned forklifts.
  • "Unmanned" means that steering the vehicle requires no person; it does not exclude the unmanned transport vehicle conveying a "person" (for example, a person who unloads a package).
  • An "unmanned tow truck" is a trackless vehicle that automatically travels to a designated location while towing a cart on which luggage is loaded and unloaded manually or automatically.
  • An "unmanned forklift" is a trackless vehicle equipped with a mast for raising and lowering a load-transfer fork or the like, which automatically transfers a load onto the fork, automatically travels to a designated location, and performs automatic load handling.
  • A "trackless vehicle" is a vehicle that includes wheels and an electric motor or engine that rotates the wheels.
  • A "moving body" is a device that moves while carrying a person or a load, and includes a driving device that generates traction for movement, such as wheels, a biped or multi-legged walking device, or a propeller.
  • The term "moving body" in the present disclosure includes not only unmanned transport vehicles in the narrow sense but also mobile robots and drones.
  • "Automatic travel" includes travel based on instructions from a computer operation management system to which the automated guided vehicle is connected by communication, and autonomous travel by a control device provided in the automated guided vehicle.
  • Autonomous travel includes not only travel toward a destination along a predetermined route, but also travel following a tracking target.
  • The automated guided vehicle may also temporarily perform manual travel based on an instruction from a worker.
  • Although "automatic travel" generally includes both "guided" travel and "guideless" travel, in the present disclosure it means "guideless" travel.
  • The "guided type" is a method in which guides are installed continuously or intermittently and the automated guided vehicle is guided along them.
  • The "guideless type" is a method of guiding the vehicle without installing guides.
  • The unmanned transport vehicle in the embodiment of the present disclosure includes a self-position estimation device and can travel in a guideless manner.
  • The "self-position estimation device" is a device that estimates the vehicle's own position on an environment map based on sensor data acquired by an external sensor such as a laser range finder.
  • The "external sensor" is a sensor that senses the external state of the moving body.
  • External sensors include, for example, a laser range finder (also referred to as a range sensor), a camera (or image sensor), LIDAR (Light Detection and Ranging), a millimeter-wave radar, and a magnetic sensor.
  • The "internal sensor" is a sensor that senses the internal state of the moving body.
  • Internal sensors include, for example, a rotary encoder (hereinafter sometimes simply "encoder"), an acceleration sensor, and an angular acceleration sensor (for example, a gyro sensor).
  • In managing the operation of moving bodies such as unmanned transport vehicles (hereinafter "AGVs"), the present inventors focused attention on the individual differences that exist in each moving body.
  • An "individual difference" is a difference between estimated values of position and posture that is caused by physical factors such as assembly errors at the time of manufacturing each moving body.
  • Each AGV has a laser range finder (LRF) and holds map data of the traveling space in advance.
  • The AGV scans the surrounding space using the LRF, matches the obtained sensor data with the map data, and estimates its current position and posture (orientation). As a result, the AGV can travel along a target route.
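The matching step described above can be sketched roughly as follows. This is a deliberately naive Python illustration under assumed data structures (the map as a set of occupied grid cells, candidate poses tested exhaustively); a real implementation would use a scan-matching algorithm such as ICP or a particle filter:

```python
import math

# Toy occupancy "map": a set of occupied (x, y) grid cells
# (a wall along y = 5 and a wall along x = 0).
occupied = {(x, 5) for x in range(10)} | {(0, y) for y in range(6)}

def transform(scan, pose):
    """Transform scan points from the sensor frame into the map frame.
    pose = (x, y, theta) with theta in radians."""
    px, py, th = pose
    c, s = math.cos(th), math.sin(th)
    return [(px + c * sx - s * sy, py + s * sx + c * sy) for sx, sy in scan]

def score(scan, pose):
    """Count how many scan points land on occupied map cells."""
    return sum((round(x), round(y)) in occupied for x, y in transform(scan, pose))

def best_pose(scan, candidates):
    """Pick the candidate pose whose transformed scan best fits the map."""
    return max(candidates, key=lambda p: score(scan, p))
```

The pose whose transformed scan overlaps the map best is taken as the current position and orientation estimate.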
  • The height of the vehicle, the degree of wear of the wheels, the mounting condition of the LRF, the orientation of the LRF's lens, and so on differ slightly from AGV to AGV.
  • The accumulation of these various physical differences causes a mismatch in the position and orientation estimates, even when matching is performed against the same map using scan data acquired at the same position and in the same orientation.
  • Each estimated value of position and posture can therefore be regarded as data that includes an "individual difference" for each AGV.
  • As a result, even if the same travel route is indicated using the same map, each moving body travels not on the same coordinates but on slightly different coordinates and routes. If, instead, "individual differences" are determined for each moving body and positions or routes are indicated with those differences taken into account, a plurality of moving bodies can travel over the same physical coordinates with the highest possible accuracy.
  • FIG. 1 is a block diagram showing an overview of processing performed in a mobile management system 100 in an exemplary embodiment of the present disclosure.
  • The mobile body management system 100 includes a plurality of moving bodies, including moving bodies 1a to 1c, and an operation management apparatus 50. Three moving bodies 1a to 1c are illustrated as an example.
  • The moving bodies 1a to 1c store the same map data M in advance.
  • Each of the moving bodies 1a to 1c performs sensing with its laser range finder at the same position and in the same posture, and outputs pose data PDa to PDc as the sensing results.
  • Each of the pose data PDa to PDc includes coordinate values indicating the estimated own position and an angle value indicating the posture.
  • The operation management device 50 receives the pose data PDa to PDc and calculates individual difference data for the moving bodies 1b and 1c with the moving body 1a as the reference.
  • The individual difference data IVb is the difference between the position and posture estimated by the moving body 1b and the position and posture estimated by the moving body 1a.
  • The individual difference data IVc is the difference between the position and posture estimated by the moving body 1c and the position and posture estimated by the moving body 1a.
  • The operation management device 50 stores the calculated individual difference data IVb and IVc in a storage device (not shown). Even when the number of moving bodies is increased, the individual difference data of each additional moving body may likewise be calculated with the moving body 1a as the reference and stored in the storage device. This completes the process of acquiring individual difference data.
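The acquisition procedure described above might be sketched as follows. The class and method names are invented for illustration; by definition the reference body (1a here) gets a zero difference:

```python
class IndividualDifferenceDB:
    """Stores per-AGV individual difference data, using one AGV as reference."""

    def __init__(self, reference_id, reference_pose):
        self.reference_id = reference_id
        self.reference_pose = reference_pose  # (x, y, theta) at the station
        # The reference body's individual difference is zero by definition.
        self.diffs = {reference_id: (0.0, 0.0, 0.0)}

    def register(self, agv_id, pose):
        """Register an AGV from the pose it sensed at the same station
        position and posture as the reference AGV."""
        rx, ry, rth = self.reference_pose
        x, y, th = pose
        self.diffs[agv_id] = (x - rx, y - ry, th - rth)

    def get(self, agv_id):
        return self.diffs[agv_id]
```

New vehicles can be added at any time by placing them at the station and calling `register`, matching the remark that the scheme extends to any number of moving bodies.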
  • Each piece of individual difference data can be used for various purposes.
  • An example will now be described of removing the influence of a moving body's individual difference from the pose data of a position and registering an accurate position that does not include that influence. The example of (a) mentioned above is shown in the figure.
  • The operation management apparatus 50 transmits to the moving body 1a a command Ia specifying the coordinates of a target position and the attitude the moving body 1a should take at that target position.
  • The moving body 1a moves toward the target position in accordance with the command Ia.
  • The operation management device 50 transmits a command Ib to the moving body 1b.
  • The operation management device 50 transmits a command Ic to the moving body 1c.
  • Both commands Ib and Ic designate a position and posture that differ from those in the command Ia transmitted to the moving body 1a by the amounts corresponding to the individual difference data IVb and IVc, respectively.
  • The position and orientation designated by each of the commands Ib and Ic represent the actual position to which the moving body 1a was instructed to go, and the attitude to be taken there, as seen in the respective moving body's own estimates.
  • In this way, the moving bodies 1a to 1c can successively reach the physically same position in the same posture.
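The command adjustment described above can be sketched as adding each body's individual difference to the common target pose (illustrative only; representing a pose as an (x, y, theta) tuple is an assumption). Because a body whose estimates are offset by its individual difference reports the offset value at the true target spot, commanding it to that offset value brings it to the same physical spot as the reference body:

```python
def command_for(target, diff):
    """Offset a target pose command by one vehicle's individual difference,
    so that executing the command puts that vehicle at the same physical
    spot as the reference vehicle.

    target, diff: (x, y, theta) tuples."""
    tx, ty, tth = target
    dx, dy, dth = diff
    return (tx + dx, ty + dy, tth + dth)
```

The reference body receives the target unchanged (its difference is zero); every other body receives the target shifted by its own stored difference, as with the commands Ib and Ic above.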
  • In the following, the unmanned transport vehicle may be abbreviated as "AGV".
  • The following description applies equally, as long as no contradiction arises, to moving bodies other than AGVs, for example mobile robots, drones, or manned vehicles.
  • FIG. 2 shows an example of the basic configuration of an exemplary mobile management system 100 according to the present disclosure.
  • The mobile body management system 100 includes at least one AGV 10 and an operation management apparatus 50 that manages the operation of the AGV 10.
  • The terminal device 20 operated by the user 1 is also shown in FIG. 2.
  • The AGV 10 is an unmanned transport carriage capable of "guideless" travel, which requires no guide such as magnetic tape for traveling.
  • The AGV 10 can perform self-position estimation and transmit the estimation result to the terminal device 20 and the operation management device 50.
  • The AGV 10 can travel automatically in the moving space S in accordance with commands from the operation management device 50.
  • The AGV 10 can also operate in a "tracking mode" in which it moves following a person or another moving body.
  • The operation management device 50 is a computer system that tracks the position of each AGV 10 and manages its travel.
  • The operation management device 50 may be a desktop PC, a laptop PC, and/or a server computer.
  • The operation management apparatus 50 communicates with each AGV 10 via a plurality of access points 2. For example, the operation management device 50 transmits to each AGV 10 the coordinates of the position to which that AGV 10 should go next.
  • Each AGV 10 periodically, for example every 100 milliseconds, transmits data indicating its position and orientation to the operation management device 50.
  • The operation management device 50 then transmits the coordinates of the position to which the AGV should advance further.
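The exchange described above (periodic pose reports in, next-coordinate commands out) could be sketched, under assumed data shapes and an assumed arrival tolerance, as:

```python
import math

# Hypothetical route and tolerance for illustration only.
ROUTE = [(0.0, 0.0), (5.0, 0.0), (5.0, 5.0)]  # waypoint coordinates
ARRIVAL_TOLERANCE = 0.1                        # metres (assumed value)

def next_waypoint(reported_position, current_index):
    """Given an AGV's periodically reported (x, y) position, return the
    coordinates it should head for next and the updated route index.
    Advances to the next waypoint once the AGV is within tolerance."""
    x, y = reported_position
    tx, ty = ROUTE[current_index]
    if (math.hypot(tx - x, ty - y) < ARRIVAL_TOLERANCE
            and current_index + 1 < len(ROUTE)):
        current_index += 1
    return ROUTE[current_index], current_index
```

In this sketch, the management side calls `next_waypoint` on every report (e.g. every 100 ms) and sends the returned coordinates back to the AGV.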
  • The AGV 10 can also travel in the moving space S in accordance with operations input by the user 1 to the terminal device 20.
  • An example of the terminal device 20 is a tablet computer.
  • Typically, travel of the AGV 10 using the terminal device 20 is performed at the time of map creation, and travel of the AGV 10 using the operation management device 50 is performed after the map has been created.
  • FIG. 3 shows an example of a moving space S in which three AGVs 10a, 10b, and 10c exist. All the AGVs are assumed to travel in the depth direction in the figure. The AGVs 10a and 10b are carrying loads placed on their top plates. The AGV 10c travels following the AGV 10b in front of it.
  • Reference marks 10a, 10b, and 10c are attached to distinguish the individual AGVs.
  • Besides carrying a load placed on its top plate, the AGV 10 can also transfer a load using a tow truck connected to itself.
  • FIG. 4A shows the AGV 10 and the tow truck 5 before being connected. Each leg of the tow truck 5 is provided with a caster. The AGV 10 is mechanically connected to the tow truck 5.
  • FIG. 4B shows the connected AGV 10 and tow truck 5. When the AGV 10 travels, the tow truck 5 is pulled by it, and the AGV 10 can thus transport the load placed on the tow truck 5.
  • The method of connecting the AGV 10 and the tow truck 5 is arbitrary.
  • In one example, a plate 6 is fixed to the top plate of the AGV 10.
  • The tow truck 5 is provided with a guide 7 having a slit.
  • The AGV 10 approaches the tow truck 5 and inserts the plate 6 into the slit of the guide 7.
  • The AGV 10 then passes an electromagnetic-lock pin (not shown) through the plate 6 and the guide 7 and engages the electromagnetic lock.
  • The AGV 10 and the tow truck 5 are thereby physically connected.
  • Each AGV 10 and the terminal device 20 can be connected one-to-one, for example, to perform communication conforming to the Bluetooth (registered trademark) standard.
  • Each AGV 10 and the terminal device 20 can also communicate in conformity with Wi-Fi (registered trademark) using one or more access points 2.
  • The plurality of access points 2 are connected to one another via, for example, a switching hub 3. Two access points 2a and 2b are shown in FIG. 2.
  • The AGV 10 is wirelessly connected to the access point 2a.
  • The terminal device 20 is wirelessly connected to the access point 2b.
  • Data transmitted by the AGV 10 is received by the access point 2a, transferred to the access point 2b via the switching hub 3, and transmitted from the access point 2b to the terminal device 20.
  • Data transmitted by the terminal device 20 is received by the access point 2b, transferred to the access point 2a via the switching hub 3, and transmitted from the access point 2a to the AGV 10. Bi-directional communication between the AGV 10 and the terminal device 20 is thereby realized.
  • The plurality of access points 2 are also connected to the operation management device 50 via the switching hub 3, thereby realizing bidirectional communication between the operation management device 50 and each AGV 10 as well.
  • The AGV 10 transitions to a data acquisition mode by an operation of the user.
  • The AGV 10 starts acquiring sensor data using the laser range finder.
  • The laser range finder periodically scans the surrounding space S by emitting a laser beam of, for example, infrared or visible light into the surroundings.
  • The laser beam is reflected by, for example, a surface such as a wall, a structure such as a pillar, or an object placed on the floor.
  • The laser range finder receives the reflected light of the laser beam, calculates the distance to each reflection point, and outputs measurement data indicating the position of each reflection point.
  • The direction of arrival of the reflected light and the distance are reflected in the position of each reflection point.
  • The data of the measurement results obtained by one scan may be referred to as "measurement data" or "sensor data".
  • The position estimation device stores the sensor data in a storage device.
  • The sensor data accumulated in the storage device is transmitted to an external device.
  • The external device is, for example, a computer that has a signal processor and in which a mapping program is installed.
  • The signal processor of the external device superimposes the sensor data obtained for each scan.
  • A map of the space S can be created by the signal processor repeatedly performing this superimposing process.
  • The external device transmits the created map data to the AGV 10.
  • The AGV 10 stores the created map data in an internal storage device.
  • The external device may be the operation management device 50 or another device.
  • Alternatively, the AGV 10 may create the map instead of the external device.
  • In that case, the processing performed by the signal processor of the external device described above may be performed by a circuit such as a microcontroller unit (microcomputer) of the AGV 10.
  • The data volume of sensor data is generally large. When the map is created inside the AGV 10, the sensor data need not be transmitted to an external device, so occupation of the communication line can be avoided.
  • The movement within the moving space S for acquiring sensor data can be implemented in several ways.
  • For example, the AGV 10 wirelessly receives, from the user via the terminal device 20, travel commands instructing movement in each of the front, rear, left, and right directions.
  • The AGV 10 travels forward, backward, left, and right in the moving space S in accordance with the travel commands and creates the map.
  • The map may also be created by traveling in the moving space S forward, backward, left, and right according to control signals from a steering apparatus.
  • Alternatively, the sensor data may be acquired by a person pushing a measurement cart on which the laser range finder is mounted.
  • Although a plurality of AGVs 10 are shown in FIGS. 2 and 3, there may be only one AGV. When there are a plurality of AGVs 10, the user 1 can use the terminal device 20 to select one of the registered AGVs and have it create the map of the moving space S.
  • Once the map has been created, each AGV 10 can travel automatically while estimating its own position using the map.
  • The process of estimating the self-position will be described later.
  • FIG. 5 is an external view of an exemplary AGV 10 according to the present embodiment.
  • The AGV 10 has two drive wheels 11a and 11b, four casters 11c, 11d, 11e, and 11f, a frame 12, a transport table 13, a travel control device 14, and a laser range finder 15.
  • The two drive wheels 11a and 11b are provided on the right and left sides of the AGV 10, respectively.
  • The four casters 11c, 11d, 11e, and 11f are disposed at the four corners of the AGV 10.
  • The AGV 10 also has a plurality of motors connected to the two drive wheels 11a and 11b, but the motors are not shown in FIG. 5.
  • FIG. 5 shows one drive wheel 11a and two casters 11c and 11e located on the right side of the AGV 10, and a caster 11f located on the left rear; the left drive wheel 11b and the left front caster 11d are hidden by the frame 12 and not shown.
  • The four casters 11c, 11d, 11e, and 11f can pivot freely.
  • The drive wheels 11a and 11b are also referred to as wheels 11a and 11b, respectively.
  • The travel control device 14 is a device that controls the operation of the AGV 10, and mainly includes an integrated circuit including a microcomputer (described later), electronic components, and a substrate on which they are mounted.
  • The travel control device 14 performs transmission and reception of data with the terminal device 20 described above, and preprocessing calculations.
  • The laser range finder 15 is an optical device that measures the distance to a reflection point by emitting a laser beam 15a of, for example, infrared or visible light and detecting the reflected light of the laser beam 15a.
  • The laser range finder 15 of the AGV 10 emits a pulsed laser beam 15a while changing its direction by 0.25 degrees at a time within a range of 135 degrees to each side (270 degrees in total) with respect to the front of the AGV 10, and detects the reflected light of each laser beam 15a. This makes it possible to obtain data of the distance to the reflection point in each direction, for a total of 1081 steps at 0.25-degree intervals.
  • The scan of the surrounding space performed by the laser range finder 15 is substantially parallel to the floor surface and planar (two-dimensional). However, the laser range finder 15 may also scan in the height direction.
  • The AGV 10 can create a map of the space S based on its own position and posture (orientation) and the scan results of the laser range finder 15.
  • The map may reflect the walls around the AGV, structures such as pillars, and the placement of objects on the floor. The map data is stored in a storage device provided in the AGV 10.
  • The combination of the position and posture of a moving body is called a pose.
  • The position and orientation of a moving body in a two-dimensional plane are represented by position coordinates (x, y) in an XY orthogonal coordinate system and an angle θ with respect to the X axis.
  • The position and posture of the AGV 10, that is, its pose (x, y, θ), may hereinafter simply be referred to as its "position".
  • The position of a reflection point viewed from the emission position of the laser beam 15a can be expressed in polar coordinates determined by the angle and the distance.
  • The laser range finder 15 outputs sensor data expressed in polar coordinates.
  • The laser range finder 15 may also convert the positions expressed in polar coordinates into orthogonal coordinates before outputting them.
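Based on the scan geometry described earlier (a 270-degree fan at 0.25-degree steps, i.e. 1081 beams), the polar-to-orthogonal conversion can be sketched as follows; the function names are illustrative, not the device's API:

```python
import math

NUM_STEPS = 1081           # 270 deg / 0.25 deg + 1 beams per scan
START_DEG = -135.0         # leftmost beam relative to the front of the AGV
STEP_DEG = 0.25            # angular step between adjacent beams

def beam_angles():
    """Beam directions (degrees) of one scan, relative to the AGV's front."""
    return [START_DEG + STEP_DEG * i for i in range(NUM_STEPS)]

def to_cartesian(distances):
    """Convert one scan, given as a distance per beam (polar form),
    into Cartesian (x, y) points in the sensor frame."""
    return [(d * math.cos(math.radians(a)), d * math.sin(math.radians(a)))
            for a, d in zip(beam_angles(), distances)]
```

Checking the arithmetic: 270 / 0.25 = 1080 intervals, hence 1081 beams, with beam 540 pointing straight ahead (0 degrees).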
  • The structure and operating principle of the laser range finder are known, so a further detailed description is omitted here.
  • Examples of objects that can be detected by the laser range finder 15 are people, luggage, shelves, and walls.
  • The laser range finder 15 is an example of an external sensor for sensing the surrounding space and acquiring sensor data.
  • An image sensor and an ultrasonic sensor are other conceivable examples of such external sensors.
  • The travel control device 14 can estimate the current position of the AGV 10 by comparing the measurement results of the laser range finder 15 with the map data held by the AGV 10.
  • The map data held may be map data created by another AGV 10.
  • FIG. 6A shows a first hardware configuration example of the AGV 10.
  • FIG. 6A also shows a specific configuration of the traveling control device 14.
  • The AGV 10 includes the travel control device 14, the laser range finder 15, two motors 16a and 16b, a drive device 17, wheels 11a and 11b, and two rotary encoders 18a and 18b.
  • The travel control device 14 includes a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a position estimation device 14e.
  • The microcomputer 14a, the memory 14b, the storage device 14c, the communication circuit 14d, and the position estimation device 14e are connected by a communication bus 14f and can exchange data with one another.
  • The laser range finder 15 is also connected to the communication bus 14f via a communication interface (not shown), and transmits its measurement data to the microcomputer 14a, the position estimation device 14e, and/or the memory 14b.
  • The microcomputer 14a is a processor or control circuit (computer) that performs calculations for controlling the entire AGV 10, including the travel control device 14.
  • The microcomputer 14a is typically a semiconductor integrated circuit.
  • The microcomputer 14a transmits a PWM (pulse width modulation) signal, which is a control signal, to the drive device 17 and thereby controls the drive device 17 to adjust the voltage applied to each motor. This causes each of the motors 16a and 16b to rotate at a desired rotational speed.
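The speed adjustment described above can be illustrated with a toy duty-cycle calculation. The linear mapping and the `max_rpm` value are assumptions for illustration only; the patent states only that the PWM signal adjusts the motor voltage, and a real controller would close the loop on encoder feedback:

```python
def duty_for_speed(target_rpm, max_rpm=3000.0):
    """Map a target motor speed to a PWM duty cycle in [0.0, 1.0].
    A higher duty cycle means a higher average voltage at the motor."""
    duty = abs(target_rpm) / max_rpm
    return min(max(duty, 0.0), 1.0)  # clamp to the valid duty-cycle range
```

The sign of `target_rpm` would be handled separately (e.g. by a direction pin of the drive device), which is why only the magnitude enters the duty cycle here.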
  • One or more control circuits (for example, microcomputers) for controlling the drive of the left and right motors 16a and 16b may be provided independently of the microcomputer 14a.
  • For example, the motor drive device 17 may be provided with two microcomputers that control the drive of the motors 16a and 16b, respectively.
  • These two microcomputers may perform coordinate calculations using the encoder information output from the encoders 18a and 18b, respectively, to estimate the distance the AGV 10 has moved from a given initial position.
  • The two microcomputers may also control motor drive circuits 17a and 17b using the encoder information.
  • the memory 14 b is a volatile storage device that stores a computer program executed by the microcomputer 14 a.
  • the memory 14b can also be used as a work memory when the microcomputer 14a and the position estimation device 14e perform an operation.
  • the storage device 14 c is a non-volatile semiconductor memory device.
  • the storage device 14 c may be a magnetic recording medium represented by a hard disk, or an optical recording medium represented by an optical disk.
  • the storage device 14 c may include a head device for writing and / or reading data on any recording medium and a control device of the head device.
  • the storage device 14c stores map data M of the space S in which the vehicle travels and data (traveling route data) R of one or more traveling routes.
  • the map data M is created by the AGV 10 operating in the mapping mode and stored in the storage device 14c.
  • the travel route data R is transmitted from the outside after the map data M is created.
  • the map data M and the traveling route data R are stored in the same storage device 14c, but may be stored in different storage devices.
  • the AGV 10 receives traveling route data R indicating a traveling route from the tablet computer.
  • the travel route data R at this time includes marker data indicating the positions of a plurality of markers. “Marker” indicates the passing position (passing point) of the traveling AGV 10.
  • the travel route data R includes at least position information of a start marker indicating a travel start position and an end marker indicating a travel end position.
  • The travel route data R may further include position information of markers at one or more intermediate waypoints. When the travel route includes one or more intermediate waypoints, the route from the start marker through the waypoints in order to the end marker is defined as the travel route.
  • the data of each marker may include, in addition to the coordinate data of the marker, data of the orientation (angle) and traveling speed of the AGV 10 until moving to the next marker.
  • The data of each marker may include data of the acceleration time required to accelerate to the traveling speed and/or the deceleration time required to decelerate from the traveling speed to a stop at the position of the next marker.
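The travel route data R described above can be modeled as a list of markers. This is a minimal sketch; the field names are assumptions, since the disclosure does not fix a data format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Marker:
    x: float                            # coordinate data of the marker
    y: float
    theta: Optional[float] = None       # orientation until the next marker (deg)
    speed: Optional[float] = None       # traveling speed to the next marker (m/s)
    accel_time: Optional[float] = None  # time to reach the traveling speed (s)
    decel_time: Optional[float] = None  # time to stop at the next marker (s)

# A route: start marker, one intermediate waypoint, end marker.
route_R = [
    Marker(0.0, 0.0, theta=0.0, speed=0.5, accel_time=1.0),
    Marker(10.0, 0.0, theta=90.0, speed=0.5),
    Marker(10.0, 8.0, decel_time=1.5),
]
```

At minimum the route holds the start and end markers; the optional fields mirror the orientation, speed, and acceleration/deceleration data the marker data may carry.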
  • the operation management device 50 may control the movement of the AGV 10.
  • the operation management apparatus 50 may instruct the AGV 10 to move to the next marker each time the AGV 10 reaches the marker.
  • the AGV 10 receives, from the operation management apparatus 50, coordinate data of a target position to be headed to next, or data of a distance to the target position and data of an angle to be traveled as travel route data R indicating a travel route.
  • the AGV 10 can travel along the stored travel path while estimating its own position using the created map and the sensor data output from the laser range finder 15 acquired during travel.
  • the communication circuit 14d is, for example, a wireless communication circuit that performs wireless communication compliant with the Bluetooth (registered trademark) and / or the Wi-Fi (registered trademark) standard. Both standards include wireless communication standards using frequencies in the 2.4 GHz band. For example, in the mode in which the AGV 10 is run to create a map, the communication circuit 14d performs wireless communication conforming to the Bluetooth (registered trademark) standard, and communicates with the terminal device 20 on a one-to-one basis.
  • the position estimation device 14e performs map creation processing and estimation processing of the self position when traveling.
  • the position estimation device 14e creates a map of the moving space S based on the position and attitude of the AGV 10 and the scan result of the laser range finder.
  • The position estimation device 14e receives sensor data from the laser range finder 15 and reads out the map data M stored in the storage device 14c.
  • By matching the sensor data against the map data M, it identifies the self position (x, y, θ) on the map data M.
  • the position estimation device 14 e generates “reliability” data indicating the degree to which the local map data matches the map data M.
  • the data of the self position (x, y, ⁇ ) and the reliability can be transmitted from the AGV 10 to the terminal device 20 or the operation management device 50.
  • the terminal device 20 or the operation management device 50 can receive each data of the self position (x, y, ⁇ ) and the reliability and can display it on a built-in or connected display device.
  • Although the microcomputer 14a and the position estimation device 14e are shown here as separate components, this is only an example; they may be implemented as a single chip circuit or semiconductor integrated circuit capable of independently performing the operations of both.
  • FIG. 6A shows a chip circuit 14g including the microcomputer 14a and the position estimation device 14e.
  • Hereinafter, an example in which the microcomputer 14a and the position estimation device 14e are provided separately and independently will be described.
  • Two motors 16a and 16b are attached to two wheels 11a and 11b, respectively, to rotate each wheel. That is, the two wheels 11a and 11b are respectively drive wheels.
  • the motor 16a and the motor 16b are described as being motors for driving the right and left wheels of the AGV 10, respectively.
  • the moving body 10 further includes an encoder unit 18 that measures the rotational position or rotational speed of the wheels 11a and 11b.
  • the encoder unit 18 includes a first rotary encoder 18a and a second rotary encoder 18b.
  • the first rotary encoder 18a measures the rotation at any position of the power transmission mechanism from the motor 16a to the wheel 11a.
  • the second rotary encoder 18 b measures the rotation at any position of the power transmission mechanism from the motor 16 b to the wheel 11 b.
  • the encoder unit 18 transmits the signals acquired by the rotary encoders 18a and 18b to the microcomputer 14a.
  • the microcomputer 14 a may control the movement of the mobile unit 10 using not only the signal received from the position estimation device 14 e but also the signal received from the encoder unit 18.
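One common way a microcomputer can turn the encoder signals into a movement estimate is differential-drive dead reckoning. The following is a sketch under that assumption; the disclosure does not specify the formula used.

```python
import math

def update_pose(x, y, theta, d_left, d_right, track_width):
    """One dead-reckoning step from wheel travel distances.

    d_left / d_right: distances (m) derived from the counts of the rotary
    encoders 18a and 18b; track_width: wheel separation (m); theta in radians.
    """
    d = (d_left + d_right) / 2.0               # distance moved by the body center
    dtheta = (d_right - d_left) / track_width  # change of heading
    # integrate along the mid-arc heading for better accuracy
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta
```

Repeated at each encoder sampling period, this yields the moving distance from a given initial position, which the microcomputer 14a can fuse with the output of the position estimation device 14e.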
  • the drive device 17 has motor drive circuits 17a and 17b for adjusting the voltage applied to each of the two motors 16a and 16b.
  • Each of motor drive circuits 17a and 17b includes a so-called inverter circuit.
  • the motor drive circuits 17a and 17b turn on or off the current flowing to each motor by the PWM signal transmitted from the microcomputer 14a or the microcomputer in the motor drive circuit 17a, thereby adjusting the voltage applied to the motor.
  • FIG. 6B shows a second hardware configuration example of the AGV 10.
  • The second hardware configuration example differs from the first hardware configuration example (FIG. 6A) in that it includes a laser positioning system 14h and in that the microcomputer 14a is connected to each component on a one-to-one basis.
  • the laser positioning system 14 h includes a position estimation device 14 e and a laser range finder 15.
  • the position estimation device 14e and the laser range finder 15 are connected by, for example, an Ethernet (registered trademark) cable.
  • the operations of the position estimation device 14e and the laser range finder 15 are as described above.
  • the laser positioning system 14 h outputs information indicating the pose (x, y, ⁇ ) of the AGV 10 to the microcomputer 14 a.
  • the microcomputer 14a has various general purpose I / O interfaces or general purpose input / output ports (not shown).
  • the microcomputer 14a is directly connected to other components in the travel control device 14, such as the communication circuit 14d and the laser positioning system 14h, via the general-purpose input / output port.
  • the AGV 10 in the embodiment of the present disclosure may include a safety sensor such as a bumper switch not shown.
  • the AGV 10 may include an inertial measurement device such as a gyro sensor.
  • FIGS. 7A to 7F schematically show the AGV 10 moving while acquiring sensor data.
  • the user 1 may move the AGV 10 manually while operating the terminal device 20.
  • the unit provided with the travel control device 14 shown in FIGS. 6A and 6B, or the AGV 10 itself may be mounted on a carriage, and sensor data may be acquired by the user 1 manually pushing or holding the carriage.
  • FIG. 7A shows an AGV 10 that scans the surrounding space using a laser range finder 15. A laser beam is emitted for each predetermined step angle and scanning is performed.
  • the illustrated scan range is an example schematically shown, and is different from the total scan range of 270 degrees described above.
  • the position of the reflection point of the laser beam is schematically shown using a plurality of black points 4 represented by a symbol “ ⁇ ”.
  • the scanning of the laser beam is performed at short intervals while the position and attitude of the laser range finder 15 change. Therefore, the number of actual reflection points is much larger than the number of reflection points 4 shown.
  • the position estimation device 14e stores, for example, in the memory 14b, the position of the black point 4 obtained as the vehicle travels.
  • the map data is gradually completed as the AGV 10 continues to scan while traveling.
  • In FIGS. 7B to 7E, only the scan range is shown for simplicity.
  • the scan range is an example, and is different from the above-described example of 270 degrees in total.
  • the map may be created using the microcomputer 14a in the AGV 10 or an external computer based on the sensor data after acquiring the sensor data of the amount necessary for creating the map. Alternatively, a map may be created in real time based on sensor data acquired by the moving AGV 10.
  • FIG. 7F schematically shows a part of the completed map 40.
  • free space is partitioned by a point cloud (Point Cloud) corresponding to a collection of reflection points of the laser beam.
  • Another example of the map is an occupied grid map that distinguishes space occupied by an object from free space in grid units.
  • the position estimation device 14e stores map data (map data M) in the memory 14b or the storage device 14c.
  • the illustrated number or density of black spots is an example.
  • the map data thus obtained may be shared by multiple AGVs 10.
  • a typical example of an algorithm in which the AGV 10 estimates its own position based on map data is ICP (Iterative Closest Point) matching.
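A minimal 2D point-to-point ICP sketch (NumPy) illustrating the matching idea: alternately find nearest-neighbor correspondences and solve for the best rigid transform. Brute-force nearest neighbors and the Kabsch/SVD fit are one common formulation, not necessarily the one used in the disclosure.

```python
import numpy as np

def icp_2d(src, dst, iters=20):
    """Align point cloud src (N,2) to dst (M,2); return a 3x3 rigid transform."""
    T = np.eye(3)
    cur = src.copy()
    for _ in range(iters):
        # 1. correspondence: brute-force nearest neighbor in dst
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=2)
        nn = dst[d2.argmin(axis=1)]
        # 2. best-fit rotation/translation for these pairs (Kabsch via SVD)
        mu_c, mu_n = cur.mean(axis=0), nn.mean(axis=0)
        H = (cur - mu_c).T @ (nn - mu_n)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:  # guard against a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_n - R @ mu_c
        # 3. apply the step and accumulate it into T
        cur = cur @ R.T + t
        step = np.eye(3)
        step[:2, :2], step[:2, 2] = R, t
        T = step @ T
    return T
```

The pose (x, y, θ) can be read off the accumulated transform as (T[0, 2], T[1, 2], atan2(T[1, 0], T[0, 0])), which is how a scan can be registered against the stored map data M.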
  • the map data M may be created and recorded as data of a plurality of partial maps.
  • FIG. 8 shows an example in which the entire area of one floor of one factory is covered by a combination of four partial map data M1, M2, M3 and M4.
  • one partial map data covers an area of 50 m ⁇ 50 m.
  • a rectangular overlapping area of 5 m in width is provided at the boundary between two adjacent maps in each of the X direction and the Y direction. This overlapping area is called "map switching area".
  • When the AGV 10 traveling while referring to one partial map reaches the map switching area, it switches to traveling with reference to the adjacent partial map.
  • the number of partial maps is not limited to four, and may be appropriately set according to the area of the floor on which the AGV 10 travels, and the performance of a computer that executes map creation and self-position estimation.
  • the size of the partial map data and the width of the overlapping area are not limited to the above example, and may be set arbitrarily.
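The map switching logic can be sketched as follows. The bounds and names are illustrative, patterned after the 50 m × 50 m partial maps with a 5 m overlap in FIG. 8; keeping the current map until the pose leaves its bounds is one plausible implementation, with the overlap acting as hysteresis, and is an assumption beyond what the disclosure states.

```python
# Illustrative bounds (m): 50 m x 50 m partial maps with a 5 m overlap in X.
PARTIAL_MAPS = {
    "M1": (0.0, 0.0, 50.0, 50.0),   # xmin, ymin, xmax, ymax
    "M2": (45.0, 0.0, 95.0, 50.0),  # overlaps M1 over x in [45, 50]
}

def select_partial_map(current, x, y):
    """Stay on the current partial map while the pose is inside its bounds;
    switch only after leaving them, so the overlap prevents rapid toggling."""
    xmin, ymin, xmax, ymax = PARTIAL_MAPS[current]
    if xmin <= x <= xmax and ymin <= y <= ymax:
        return current
    for name, (x0, y0, x1, y1) in PARTIAL_MAPS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    raise ValueError("pose lies outside every partial map")
```

Inside the map switching area both maps contain the pose, so the AGV keeps its current reference; once it exits the current map's bounds it is guaranteed to already be inside the neighbor.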
  • FIG. 9 shows a hardware configuration example of the operation management device 50.
  • The operation management apparatus 50 includes a CPU 51, a memory 52, a position database (position DB) 53, a communication circuit 54, a map database (map DB) 55, an image processing circuit 56, and an individual difference database (individual difference DB) 57.
  • the CPU 51, the memory 52, the position DB 53, the communication circuit 54, the map DB 55, the image processing circuit 56, and the individual difference DB 57 are connected by a bus 58 and can exchange data with each other.
  • the CPU 51 is a signal processing circuit (computer) that controls the operation of the operation management device 50.
  • the CPU 51 is a semiconductor integrated circuit.
  • the memory 52 is a volatile storage device that stores a computer program that the CPU 51 executes.
  • the memory 52 can also be used as a work memory when the CPU 51 performs an operation.
  • the position DB 53 stores position data indicating each position that can be a destination of each AGV 10.
  • the position data includes a set of coordinate values of the position output by the reference AGV and angle values of the posture.
  • This position data may be called "operation management registration data."
  • the reference AGV corresponds to the mobile unit 1a shown in FIG.
  • the communication circuit 54 performs wired communication conforming to, for example, the Ethernet (registered trademark) standard.
  • the communication circuit 54 is connected to the access point 2 (FIG. 1) by wire, and can communicate with the AGV 10 via the access point 2.
  • the communication circuit 54 receives data to be transmitted to the AGV 10 from the CPU 51 via the bus 58.
  • the communication circuit 54 also transmits data (notification) received from the AGV 10 to the CPU 51 and / or the memory 52 via the bus 58.
  • the map DB 55 stores data of an internal map of a factory or the like on which the AGV 10 travels.
  • the map may be the same as or different from the map 40 (FIG. 7F).
  • the data format is not limited as long as the map has a one-to-one correspondence with the position of each AGV 10.
  • the map stored in the map DB 55 may be a map created by CAD.
  • the position DB 53 and the map DB 55 may be constructed on a non-volatile semiconductor memory, or may be constructed on a magnetic recording medium represented by a hard disk or an optical recording medium represented by an optical disc.
  • the image processing circuit 56 is a circuit that generates data of an image displayed on the monitor 59.
  • The image processing circuit 56 operates only when the administrator operates the operation management device 50; a detailed description is omitted in the present embodiment.
  • the monitor 59 may be integrated with the operation management device 50. Further, the CPU 51 may perform the processing of the image processing circuit 56.
  • the individual difference DB 57 is a storage device that stores individual difference data of each AGV 10. It is assumed that the AGV 10 that has performed sensing at a certain position outputs coordinate values (x, y) and an angle ⁇ that represents a posture. In the present embodiment, the individual difference data is acquired as the amount of deviation from the reference value of each value of (x, y, ⁇ ). The “reference value” is (x, y, ⁇ ) output from the reference AGV that has performed sensing at the position. Individual difference data is calculated for each AGV. In addition, when there are a plurality of maps, individual difference data can be calculated for each map and for each AGV. Details of the individual difference data will be described later.
  • the individual difference data can be acquired as a deviation amount from the reference value of each value of (x, y, z, ⁇ , ⁇ ).
  • FIG. 10 is a view schematically showing an example of the movement route of the AGV 10 determined by the operation management device 50. As shown in FIG.
  • The outline of the operation of the AGV 10 and the operation management device 50 is as follows. An example is described below in which the AGV 10 is currently at a position M1 and travels through several positions to the final destination position Mn+1 (n: a positive integer).
  • In the position DB 53, coordinate data indicating each position, such as the position M2 to be passed next after the position M1 and the position M3 to be passed next after the position M2, is recorded.
  • The CPU 51 of the operation management device 50 reads the coordinate data of the position M2 from the position DB 53, further reads the individual difference data of the AGV 10 from the individual difference DB 57, and generates a travel command directed to the position M2.
  • the communication circuit 54 transmits a traveling command to the AGV 10 via the access point 2.
  • the CPU 51 periodically receives data indicating the current position and attitude from the AGV 10 via the access point 2.
  • the operation management device 50 can track the position of each AGV 10.
  • When the CPU 51 determines that the current position of the AGV 10 matches the position M2, it similarly reads the coordinate data and the individual difference data for the position M3, generates a travel command directed to the position M3, and transmits it to the AGV 10. That is, when it determines that the AGV 10 has reached a certain position, the operation management device 50 transmits a traveling command directed to the next passing position.
  • the AGV 10 can reach the final target position Mn + 1 .
  • the passing position and the target position of the AGV 10 described above may be referred to as a “marker”.
  • FIG. 11 is a plan layout view schematically showing an example of an environment 200 for acquiring individual difference data.
  • Environment 200 is part of a broader environment.
  • a thick straight line indicates, for example, a fixed wall 202 of a building.
  • the operation management apparatus 50 acquires individual difference data of each of the AGVs 10a, 10b and 10c using the position of the station ST.
  • the “station ST” is, for example, a place where a human or a robot loads and unloads the AGV 10.
  • One or more markers indicating the passing position of the AGV 10 may be set between the plurality of stations.
  • the route from one station to another is an example of a travel route of the AGV 10.
  • the station is merely mentioned as an example.
  • Individual difference data may be acquired at a position other than the station, for example, a marker position.
  • The AGVs 10a, 10b and 10c store common map data M in advance. Each of the AGVs 10a, 10b and 10c is placed, at a different time, at the same position of the station ST and in the same posture. Each then outputs an estimated value of its position and orientation at the station ST using the laser range finder 15 and the position estimation device 14e. The operation management device 50 acquires individual difference data using these output estimated values.
  • the laser range finder 15 emits a laser beam every 0.25 degrees.
  • FIG. 12A shows a metal jig 204 fixed to the floor surface.
  • the jig 204 may be fixed to the floor at least at the time of individual difference data acquisition, and may be removed after individual difference data acquisition.
  • the jig 204 has a concave shape.
  • FIG. 12B shows the AGV 10 whose position and posture are fixed by the jig 204.
  • the jig 204 receives the front portion of the AGV 10 substantially without any gap in the concave portion.
  • the position and angle of the AGV 10 can be fixedly held.
  • the illustrated jig 204 is an example. As long as the position and posture of the AGV 10 can be fixed, the shape, structure, material and the like are arbitrary.
  • FIG. 13 shows an example of scan data acquired by the laser range finder 15 of the AGVs 10a and 10b. Black circles represent scan data of the AGV 10a, and white circles represent scan data of the AGV 10b. The illustration of scan data of the AGV 10c is omitted for the sake of simplicity.
  • the scan data of the AGVs 10a and 10b do not completely match due to differences in LRF attachment condition, LRF lens orientation, and so forth.
  • the individual differences of the AGVs 10 a and 10 b are reflected in the pose data (x, y, ⁇ ) output by the position estimation devices 14 e of the AGVs 10 a and 10 b.
  • FIG. 14 shows an example of pose data outputted by each position estimation device 14e of the AGVs 10a to 10c.
  • the pose data of the AGV 10a is (100, 100, 90).
  • the pose data of the AGVs 10b and 10c are (105, 98, 89) and (90, 110, 91), respectively.
  • the pose data of each of the AGVs 10a to 10c do not match and differ with respect to one station ST.
  • Each pose data reflects the individual differences of the respective AGVs 10a to 10c.
  • individual differences of each AGV are defined by “difference” from pose data of a certain AGV (reference AGV).
  • Here, the reference AGV is set to the AGV 10a. Under this assumption, it is more preferable that the common map data stored in advance in the AGVs 10a to 10c be created using the AGV 10a.
  • In that case, the individual difference of the reference AGV can be treated as nonexistent in relation to the map.
  • individual difference data of the AGV 10a which is the reference AGV is described as (0, 0, 0).
  • Note that when the AGV 10a is not the AGV that was used to generate the map data, an error may still be included between the coordinate values of the map and the coordinate values of the pose data of the reference AGV, even though its individual difference data is (0, 0, 0).
  • FIG. 15 shows pose data of the AGVs 10a to 10c at the station ST and individual difference data.
  • the pose data of each of the AGVs 10a to 10c is sent to the operation management device 50 and stored, for example, in the position DB 53 or the memory 52.
  • The CPU 51 of the operation management apparatus 50 calculates the individual difference data of the AGV 10b by subtracting each value of the pose data of the AGV 10a from the corresponding coordinate values and posture angle value of the pose data of the AGV 10b. Similarly, the CPU 51 calculates the individual difference data of the AGV 10c by subtracting each value of the pose data of the AGV 10a from each value of the pose data of the AGV 10c.
  • An example of a calculation formula for the individual difference data (x2, y2, θ2) of the AGV 10b is shown below.
  • the pose data of the AGVs 10a and 10b are represented as (xa, ya, ⁇ a) and (xb, yb, ⁇ b).
  • (x2, y2, θ2) = (xb, yb, θb) − (xa, ya, θa)
  • each individual difference data shown in FIG. 15 is calculated by the above-mentioned equation.
  • The individual difference data of the AGV 10c is calculated in the same way.
  • the acquired individual difference data of each AGV 10 is stored in the individual difference DB 57.
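The subtraction above, applied to the pose values given in FIG. 14 (reference AGV 10a at (100, 100, 90)), can be sketched as:

```python
def individual_difference(ref_pose, agv_pose):
    """(x2, y2, theta2) = agv_pose - ref_pose, element by element."""
    return tuple(b - a for a, b in zip(ref_pose, agv_pose))

ref = (100, 100, 90)                                # pose data of reference AGV 10a
diff_b = individual_difference(ref, (105, 98, 89))  # AGV 10b -> (5, -2, -1)
diff_c = individual_difference(ref, (90, 110, 91))  # AGV 10c -> (-10, 10, 1)
```

The resulting tuples are what would be stored, keyed by AGV identifier, in the individual difference DB 57.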
  • position data ("operation management registration data") of the station ST output by the reference AGV 10a is registered in the position DB 53.
  • FIG. 15 is shown as a table for ease of understanding; the illustrated table need not actually be generated or stored in a storage device.
  • FIG. 16 is a flowchart showing the procedure of the process of acquiring individual difference data of the AGV 10 b.
  • the subject that performs acquisition processing is the CPU 51 of the operation management device 50. It is assumed that the reference AGV is AGV 10a.
  • step S10 the CPU 51 receives, from the AGVs 10a and 10b, pose data sensed and output at the same position and the same attitude via the communication circuit 54 and the bus 58.
  • step S11 the CPU 51 calculates the difference (x2, y2, ⁇ 2) between the pose data received from the AGV 10a and the pose data received from the AGV 10b.
  • step S12 the CPU 51 stores the calculated difference (x2, y2, ⁇ 2) in the individual difference DB 57 as individual difference data of the AGV 10b.
  • the operation management device 50 can also acquire individual difference data such as the AGV 10 c existing elsewhere.
  • the individual difference DB 57 stores data (ID, serial number, etc.) for identifying each AGV in association with individual difference data.
  • FIG. 17 shows the individual difference data group 210 stored in the individual difference DB 57.
  • FIG. 18 is a flowchart showing a procedure of correction processing of coordinate values and posture angle values using individual difference data.
  • the correction process is executed by the CPU 51 of the operation management apparatus 50 when sending a command for specifying the position of the movement destination and the posture at the position to the AGV 10b.
  • step S20 the CPU 51 determines the coordinate value of the movement destination position of the AGV 10b and the angle value (x, y, ⁇ ) of the posture at the position.
  • (X, y, ⁇ ) are, for example, actual positions and postures at which the manager of the mobile management system 100 moves the AGV 10 b.
  • step S21 the CPU 51 reads individual difference data (x2, y2, ⁇ 2) of the AGV 10b from the individual difference DB 57.
  • step S22 the CPU 51 corrects the determined coordinate value and posture angle value with the individual difference data of the AGV 10b. Specifically, the CPU 51 corrects the initial values (x, y, ⁇ ) by adding the individual difference data (x2, y2, ⁇ 2) to (x, y, ⁇ ). Thereby, correction data (x + x2, y + y2, ⁇ + ⁇ 2) is generated.
  • step S23 the CPU 51 transmits, to the AGV 10b, a command whose position and orientation are designated by the correction data.
  • The correction data (x + x2, y + y2, θ + θ2) obtained by the process of step S22 corresponds to the position and attitude values that the position estimation device 14e would output if the AGV 10b actually performed sensing in the state of the position and orientation (x, y, θ).
  • By using the individual difference data of the AGV 10b, it is possible to specify the coordinate values of the position and the angle value of the posture in consideration of the individual difference of the AGV 10b.
  • When the command is transmitted to the AGV 10b in step S23, the AGV 10b actually moves to the position (x, y) and can realize the posture of the angle value θ at that position.
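Steps S20 to S23 amount to adding the stored individual difference to the commanded pose before transmission. A sketch, using an illustrative target pose (the (200, 150, 0) values are assumptions) and the AGV 10b difference (5, -2, -1) computed from the FIG. 14 pose values:

```python
def correct_command(target_pose, individual_diff):
    """Step S22: (x, y, theta) + (x2, y2, theta2) -> correction data."""
    return tuple(v + d for v, d in zip(target_pose, individual_diff))

# illustrative destination for the AGV 10b
command = correct_command((200.0, 150.0, 0.0), (5, -2, -1))
```

The resulting correction data is what step S23 transmits to the AGV 10b as the designated position and orientation.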
  • In the above-described process, the CPU 51 of the operation management apparatus 50 generates the correction data and transmits it to the AGV 10b; alternatively, the AGV 10b may generate the correction data itself.
  • the CPU 51 performs the processes of the above-described steps S20 and S21, and transmits the determined (x, y, ⁇ ) and the read individual difference data of the AGV 10b to the AGV 10b.
  • the microcomputer 14a of the AGV 10b may execute the process of step S22 to generate correction data, and may move in accordance with the correction data.
  • This reduces the processing load on the operation management device 50. For example, if all AGVs 10 are to be moved to the same position, only that one position needs to be determined. Of course, the operation management device 50 may instead determine a destination position for each AGV 10. Furthermore, since the operation management device 50 no longer performs the correction process, its implementation can be simplified.
  • the registration target is not limited to the position of the station, and for example, marker position data that can be set between the stations can also be registered.
  • FIG. 19 shows AGVs 10a and 10b present in stations ST1 and ST2.
  • the AGVs 10a and 10b perform sensing at their respective positions and output pose data that is the sensing result.
  • FIG. 20 is a diagram for explaining an outline of a process of removing the influence of individual differences and registering position data of stations ST1 and ST2.
  • the CPU 51 of the operation management apparatus 50 performs the following calculation for each station STk, and registers the obtained values (x, y, ⁇ ) k of the position and orientation as position data.
  • In the following formula, the AGV 10p represents the AGV that performed sensing at the station to be registered.
  • (X, y, ⁇ ) k Pose data of station STk (xk, yk, ⁇ k) -AGV10k individual difference data (xp, yp, ⁇ p)
  • the position data (x, y, ⁇ ) 1 of the station ST1 is obtained and registered as follows using the pose data output from the AGV 10a.
  • Position data (x, y, ⁇ ) 2 of station ST 2 is determined and registered as follows using the pose data output from the AGV 10 b.
  • the acquired position data may be registered in the position DB 53 of the operation management device 50.
  • FIG. 21 is a flowchart showing a procedure of position data registration processing of a station using individual difference data.
  • the subject that performs acquisition processing is the CPU 51 of the operation management device 50.
  • step S30 the CPU 51 acquires pose data (xk, yk, ⁇ k) of the station STk to be registered.
  • The AGV 10p outputs the pose data at the station STk.
  • step S31 the CPU 51 reads individual difference data (xp, yp, ⁇ p) of the AGV 10p from the individual difference DB 57.
  • step S32 the CPU 51 generates data (xk-xp, yk-yp, ⁇ k- ⁇ p) obtained by removing the influence of the individual difference from the pose data of the station STk.
  • step S33 the CPU 51 registers the obtained data (xk-xp, yk-yp, ⁇ k- ⁇ p) in the position DB 53 as position data of the station STk.
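Steps S30 to S33 can be sketched as follows. A dict stands in for the position DB 53, and the pose values in the usage are illustrative assumptions.

```python
position_db = {}  # stands in for the position DB 53

def register_station(station_id, pose_at_station, individual_diff):
    """Remove the sensing AGV's individual difference (steps S31-S32)
    and register the result as the station's position data (step S33)."""
    corrected = tuple(p - d for p, d in zip(pose_at_station, individual_diff))
    position_db[station_id] = corrected
    return corrected

# illustrative: pose sensed by an AGV with difference (5, -2, -1) at ST2
registered = register_station("ST2", (305, 148, 89), (5, -2, -1))
```

Because the individual difference is subtracted out, the registered position data is expressed in the frame of the reference AGV, so any AGV can later be directed to it after applying its own correction.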
  • individual difference data need not be generated from only one piece of pose data.
  • the same AGV 10 may perform sensing at the same position and posture a plurality of times, output a plurality of pose data, and generate individual difference data using an average value of the plurality of obtained pose data.
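Averaging several poses sensed at the same position before computing the difference can be sketched as below. Note that a plain arithmetic mean of θ is adequate only when the angle samples do not wrap around ±180°; this caveat is an assumption the disclosure does not address.

```python
import statistics

def averaged_pose(poses):
    """Element-wise mean of several (x, y, theta) samples
    taken at the same position and posture."""
    return tuple(statistics.fmean(c) for c in zip(*poses))

samples = [(105.0, 98.0, 89.0), (105.2, 97.8, 89.2), (104.8, 98.2, 88.8)]
mean_pose = averaged_pose(samples)
```

The averaged pose, rather than a single measurement, would then be fed into the individual difference calculation to suppress sensing noise.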
  • the operation management device 50 may update the individual difference data periodically or at any timing instructed by the administrator.
  • In the above embodiments, an AGV traveling in a two-dimensional space is taken as an example.
  • the present disclosure can also be applied to a mobile object moving in three-dimensional space, such as a flying object (drone).
  • For example, by having a drone create a three-dimensional space map while flying, the description for a two-dimensional space can be extended to a three-dimensional space.
  • the above general or specific aspects may be realized by a system, a method, an integrated circuit, a computer program, or a recording medium.
  • the present invention may be realized by any combination of a system, an apparatus, a method, an integrated circuit, a computer program, and a storage medium.
  • the operation management apparatus and mobile management system of the present disclosure can be suitably used for moving and transporting objects such as luggage, parts, finished products, etc. in factories, warehouses, construction sites, logistics, hospitals and the like.


Abstract

The invention relates to an operation management device that manages the operation of a plurality of mobile bodies capable of autonomous movement, including a first moving body and a second moving body. Each moving body includes: a memory that stores the same map data; a sensor that senses the surrounding space and outputs sensor data; a positioning device that matches the sensor data against the map data and outputs pose data indicating position and attitude; and a communication circuit. The operation management device is provided with a communication interface device, a calculation circuit, and a storage device. The calculation circuit receives, via the communication interface device, the pose data output by the first and second moving bodies as a result of sensing performed at the same position and in the same attitude in the space, and stores individual difference data, which is the difference between the pose data received from the first moving body and the pose data received from the second moving body.
PCT/JP2018/034878 2017-09-25 2018-09-20 Operation management device WO2019059299A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019543706A JPWO2019059299A1 (ja) 2017-09-25 2018-09-20 Operation management device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017183537 2017-09-25
JP2017-183537 2017-09-25

Publications (1)

Publication Number Publication Date
WO2019059299A1 (fr) 2019-03-28

Family

ID=65811409

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/034878 WO2019059299A1 (fr) Operation management device

Country Status (2)

Country Link
JP (1) JPWO2019059299A1 (fr)
WO (1) WO2019059299A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014016747A * 2012-07-06 2014-01-30 Honda Motor Co Ltd Arrangement determination method, arrangement determination device, and moving body
JP2014112059A * 2012-12-05 2014-06-19 Chugoku Electric Power Co Inc:The System for providing position information to a moving body, and position information providing method
JP2017134794A * 2016-01-29 2017-08-03 Panasonic Ip Management Co Ltd Mobile robot control system and server device for controlling mobile robots

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ISHIOKA, KOJI ET AL.: "MARSHA: Design and implementation of map acquiring system for multiple autonomous mobile robots", JOURNAL OF THE ROBOTICS SOCIETY OF JAPAN, vol. 12, no. 6, September 1994 (1994-09-01), pages 846 - 856, XP055583948 *

Also Published As

Publication number Publication date
JPWO2019059299A1 (ja) 2020-10-15

Similar Documents

Publication Publication Date Title
JP6816830B2 (ja) Position estimation system, and moving body provided with the position estimation system
TWI665538B (zh) Moving body that performs obstacle avoidance operations, and recording medium storing a computer program therefor
JP2019168942A (ja) Moving body, management device, and moving body system
JP6825712B2 (ja) Moving body, position estimation device, and computer program
JP7081881B2 (ja) Moving body and moving body system
WO2019026761A1 (fr) Moving body and computer program
JP2020057307A (ja) Device and method for processing map data for self-position estimation, and moving body and control system therefor
JP7136426B2 (ja) Management device and moving body system
WO2019054209A1 (fr) Map creation system and map creation device
US11537140B2 (en) Mobile body, location estimation device, and computer program
JP2019053391A (ja) Moving body
JP2019175137A (ja) Moving body and moving body system
WO2019194079A1 (fr) Position estimation system, moving body comprising said position estimation system, and computer program
JP2019175136A (ja) Moving body
JP2019079171A (ja) Moving body
JP2019179497A (ja) Moving body and moving body system
JP2019067001A (ja) Moving body
JP2020166702A (ja) Moving body system, map creation system, route creation program, and map creation program
WO2019059299A1 (fr) Operation management device
JP2021056764A (ja) Moving body
WO2019069921A1 (fr) Moving body
JP2019148871A (ja) Moving body and moving body system
JP2020166701A (ja) Moving body and computer program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18858659

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019543706

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18858659

Country of ref document: EP

Kind code of ref document: A1