WO2024034025A1 - Autonomous movement control device, autonomous movement system, autonomous movement method, and program - Google Patents

Autonomous movement control device, autonomous movement system, autonomous movement method, and program

Info

Publication number
WO2024034025A1
Authority
WO
WIPO (PCT)
Prior art keywords
autonomous
self
quasi-periodic
vehicle
Prior art date
Application number
PCT/JP2022/030490
Other languages
French (fr)
Japanese (ja)
Inventor
誠史 吉田
Original Assignee
Nippon Telegraph and Telephone Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Telegraph and Telephone Corporation
Priority to PCT/JP2022/030490
Publication of WO2024034025A1

Definitions

  • The present disclosure relates to a technology for controlling a mobile body capable of autonomous movement.
  • AGVs (Automatic Guided Vehicles), which travel autonomously along routes without requiring human operation, are used in indoor environments such as factories and warehouses. Driving an automated guided vehicle (vehicle) requires estimating the vehicle's own position and controlling it so that it travels along a predetermined route.
  • Control methods for driving automated guided vehicles include magnetic guidance, optical guidance, image recognition, laser guidance, and SLAM (Simultaneous Localization and Mapping).
  • The magnetic guidance method guides the vehicle along the travel route by having a magnetic sensor mounted on the vehicle detect the magnetic field emitted by magnetic bars or magnetic tape laid on the floor along the route.
  • The magnetic guidance method has the advantage of being simple and highly reliable, but has the disadvantage of being costly when routes are changed. It may also be affected by magnetic fields from piping and the like near the route.
  • The optical guidance method guides the vehicle along the travel route by having an optical sensor mounted on the vehicle detect a route formed on the floor by laying guidance tape.
  • The optical guidance method has the advantage of being less costly than the magnetic guidance method.
  • The image recognition method uses a camera mounted on the vehicle to read markers, such as QR codes (registered trademark), installed on the floor or ceiling, allowing the vehicle to determine its own location and drive automatically to its destination.
  • These three control methods allow the vehicle to travel only along a pre-fixed route, and are also called route guidance methods.
  • In the laser guidance method (reflector guidance method), a laser irradiation device mounted on the vehicle irradiates laser light onto reflective plates installed on walls, floors, or pillars, and a receiver mounted on the vehicle detects the reflected waves; the vehicle thereby estimates its own position and travels without guides.
  • SLAM guidance methods include the Visual SLAM method, which uses a camera, and the LiDAR SLAM method, which uses LiDAR (Light Detection and Ranging).
  • Visual SLAM is a method in which a camera mounted on the vehicle captures continuous changes in image data of the surrounding environment, such as walls, so that the vehicle can estimate its own position and travel without guides.
  • In the LiDAR SLAM method, a laser irradiation device mounted on the vehicle irradiates laser light onto the surrounding environment, such as walls, and a laser receiving device mounted on the vehicle measures the time until the reflected light returns; by measuring the distance and direction to the surroundings, the vehicle estimates its own position and travels without guides.
  • The laser guidance method and the SLAM methods are also called autonomous movement methods because they require no guides.
  • AGVs that use SLAM are also called AMRs (Autonomous Mobile Robots).
  • The advantage of the SLAM method is that no guides need to be installed, so the travel route can be set freely.
  • With the route guidance methods, the vehicle stops temporarily when it detects an obstacle, whereas with the SLAM method the vehicle can avoid the obstacle and continue driving by re-planning its route.
  • On the other hand, the SLAM method requires processing to eliminate the effects of changes in the surrounding environment, such as doors opening and closing, changes in how goods are loaded on shelves, and the movement of workers and other transport vehicles. Movement may also be hindered in spaces without feature points.
  • As an autonomous movement method that addresses these issues of the SLAM method, a system that extracts feature points from floor image data captured by a vehicle-mounted camera and traces them in order to drive automatically has been studied (Non-patent document 1).
  • However, methods for extracting feature points from image data of the surrounding environment include feature-based methods, which extract feature points from the image, and direct methods, which reference the entire image. In either case, when the travel route involves diverse changes in the surroundings beyond static objects, such as pedestrians, moving vehicles, and doors opening and closing, the image processing required to generate map data and perform image matching becomes complicated, the amount of processing grows, and the processing load increases.
  • The present invention was made in view of the above circumstances, and its object is to suppress the processing load while enabling autonomous movement such as autonomous driving.
  • To achieve this object, the invention according to claim 1 provides an autonomous movement control device that controls the movement of an autonomous mobile body capable of moving autonomously, the device comprising: a self-position estimating unit that estimates the self-position and traveling direction of the autonomous mobile body on quasi-periodic tiles composed of a plurality of types of tile shapes, by pattern-matching data regarding images obtained by photographing the quasi-periodic tiles against map data having a coordinate system corresponding to the quasi-periodic tiles; and a control unit that controls the movement of the autonomous mobile body based on the self-position and traveling direction estimated by the self-position estimating unit and a preset travel route of the autonomous mobile body.
  • FIG. 1 is an overall configuration diagram of an autonomous driving system according to an embodiment.
  • FIG. 2 is a configuration diagram of the autonomous vehicle.
  • FIG. 3 is an electrical hardware configuration diagram of the information processing device.
  • FIG. 4A is a plan view of the quasi-periodic tiles, and FIG. 4B shows the shape of each tile.
  • FIG. 5 is a plan view showing the travel pattern of the initialization run, and FIG. 6 is a flowchart showing the initialization-run processing.
  • FIG. 7 is a plan view showing the travel route of the main run, and FIG. 8 is a flowchart showing the main-run processing.
  • FIG. 9 is a diagram highlighting the 12-fold symmetric part of the quasi-periodic tiles.
  • As shown in FIG. 1, the autonomous driving system 1 is constructed from an autonomous driving vehicle (also referred to as a "vehicle") 2, such as an automated guided vehicle capable of autonomous driving, and quasi-periodic tiles 8 on the floor.
  • The vehicle 2 estimates its own position and traveling direction, and moves, by reading the pattern formed by the plural tile shapes of the quasi-periodic tiles 8.
  • The quasi-periodic tiles 8, shown in FIG. 4A, are tiles formed from a plurality of types of tile shapes on which a geometric pattern without periodicity is drawn, and they are installed on the floor surface.
  • As shown in FIG. 4B, the plurality of tile shapes consists of three figures: a square and two types of rhombi, all with equal side length L.
  • Unlike plane fillings with a periodic pattern made up of regular polygons, quasi-periodic tiling is a method that fills the plane with a finite set of polygons without any periodic pattern (no translation maps the pattern onto itself).
  • As one example, Penrose tiling, which fills the plane with two types of rhombi, is well known.
  • The geometric pattern of the quasi-periodic tiles 8 shown in FIGS. 4A and 4B is just an example; any geometric pattern without periodicity may be used.
  • In this embodiment, the aperiodicity of the pattern of the quasi-periodic tiles 8 is used for estimating the self-position of the vehicle 2.
  • The quasi-periodic tiles 8 are constructed on the floor surface and are imaged by an imaging unit 20 (described later) mounted on the vehicle 2. As long as the straight lines constituting the quasi-periodic tiles 8 can be optically identified by the imaging unit 20, the tile pattern may be drawn with paint or the like on the floor on which the vehicle 2 runs, or a sheet on which the tile pattern is printed may be used as a carpet.
  • The pattern of the quasi-periodic tiles 8 may also be constructed by applying tape or the like to the floor, or by laying tiles of the several shapes over the floor surface.
  • Alternatively, the pattern of the quasi-periodic tiles 8 may be shown on a display element installed in the floor, or projected onto the floor.
  • In any case, the tiles are assumed to be laid out densely enough to pin down the position of the vehicle 2.
  • That is, the quasi-periodic tiles are assumed to have the density (size) needed to estimate the self-position of the vehicle 2, given the pixel count of the imaging unit 20 mounted on the vehicle 2 and its distance from the floor surface.
  • The pattern of the quasi-periodic tiles 8 does not necessarily need to be visible; it may be constructed, for example, with invisible markers that emit light in response to light of wavelengths outside the visible range.
  • In that case, the imaging unit 20 is configured with an element that detects the light from the invisible markers.
  • FIG. 2 is a configuration diagram of the autonomous vehicle 2. As shown in FIG. 2, the vehicle 2 is composed of an autonomous driving control device 3 and a traveling device 6.
  • The autonomous driving control device 3 photographs the quasi-periodic tiles 8 to estimate the self-position and traveling direction of the vehicle 2, and controls the operation of the traveling device 6.
  • The traveling device 6 is composed of a motor, tires, a battery, a steering mechanism, and the like, and makes the vehicle 2 travel.
  • The autonomous driving control device 3 includes an information processing device 5, an imaging unit 20, a gyro sensor 26, an acceleration sensor 27, and an odometry 28.
  • The imaging unit 20 is a camera that photographs the floor surface, including the quasi-periodic tiles 8, to generate image data.
  • The imaging unit 20 is fixed to the vehicle 2 facing the floor surface and photographs the floor. The offset (relative position) between any pixel in the image data obtained by the imaging unit 20 and the reference position of the vehicle 2 is assumed to have been measured and calibrated in advance.
  • The imaging unit 20 may be a monocular camera, a stereo camera, or an omnidirectional camera equipped with a fisheye lens.
  • The imaging unit 20 may consist of a single camera, or a plurality of cameras may divide the imaging range among them.
  • The imaging unit 20 may also be a remote sensing device other than a camera, such as LiDAR.
  • The information processing device 5 is a computer such as a PC (Personal Computer). Based on the position and orientation at which the imaging unit 20 is installed on the vehicle 2 and the image-related data acquired from the imaging unit 20, it measures the relative position of the vehicle 2 with respect to feature points in the image (the sides and intersections of the quasi-periodic tiles 8), thereby estimating the self-position (absolute position) and traveling direction of the vehicle 2, and it sends control signals to the traveling device 6 to control the traveling device's operation.
  • The information processing device 5 will be described in detail later.
  • As long as the autonomous driving control device 3 of this embodiment has the imaging unit 20 and the information processing device 5, it can drive autonomously by reading the quasi-periodic tiles 8. Even if image-related data temporarily becomes unavailable, for example due to a failure of the imaging unit 20, autonomous driving can continue by dead reckoning using the sensors and the like. The sensors also make the estimation of the vehicle's position and orientation during autonomous travel robust against the local periodicity and rotational symmetry of the quasi-periodic tile pattern. However, the parameters of the gyro sensor 26, the acceleration sensor 27, and the odometry 28 must be calibrated during the initialization run (FIG. 5), described later, before any failure of the imaging unit 20 occurs.
  • Here, the gyro sensor 26 is a sensor that detects the angle, angular velocity, or angular acceleration of an object (here, the vehicle 2).
  • The acceleration sensor 27 is a sensor that measures the acceleration of an object (here, the vehicle 2).
  • The odometry 28 is a device that measures the number of wheel rotations of the vehicle 2, using a rotary encoder for example, and calculates the travel distance, speed, and turning angle of the vehicle 2 from the wheel rotation count.
  • The odometry 28 is a device for further raising the accuracy of self-position estimation and complements the gyro sensor 26 and the acceleration sensor 27, so it does not necessarily have to be provided in the autonomous driving control device 3. That is, the odometry 28 lets the vehicle 2 improve the accuracy and reliability of its self-position estimates.
  • FIG. 3 is an electrical hardware configuration diagram of the information processing device 5. As shown in FIG. 3, the information processing device 5 is a computer comprising a CPU (Central Processing Unit) 501, a ROM (Read Only Memory) 502, a RAM (Random Access Memory) 503, an SSD (Solid State Drive) 504, an external device connection I/F (Interface) 505, a network I/F 506, a media I/F 509, and a bus line 510.
  • Of these, the CPU 501 controls the operation of the entire information processing device 5.
  • The ROM 502 stores programs used to start the CPU 501, such as an IPL (Initial Program Loader).
  • The RAM 503 is used as a work area for the CPU 501.
  • The SSD 504 reads and writes various data under the control of the CPU 501. An HDD (Hard Disk Drive) may be used instead of the SSD 504.
  • The external device connection I/F 505 is an interface for connecting various external devices, such as a display, speaker, keyboard, mouse, USB (Universal Serial Bus) memory, and printer.
  • The network I/F 506 is an interface for data communication via a communication network such as the Internet and/or a LAN (Local Area Network). A wireless communication device may be used instead of the network I/F 506.
  • The media I/F 509 controls reading and writing (storing) of data on a recording medium 509m such as a flash memory. The recording media 509m also include DVDs (Digital Versatile Discs), Blu-ray Discs (registered trademark), and the like.
  • The bus line 510 consists of an address bus, a data bus, and the like for electrically connecting the components such as the CPU 501 shown in FIG. 3.
  • The information processing device 5 has a route data setting unit 31, a self-position estimating unit 33, and a control unit 35. Each of these units is a function realized by the CPU 501 executing instructions according to a program stored in the RAM 503 or the like. In addition, a route data storage unit 21, an image data storage unit 22, and a map data storage unit 23 are built in the RAM 503 or the SSD 504 of the information processing device 5.
  • Of these, the route data storage unit 21 stores the travel route data set by the route data setting unit 31.
  • The route data is given, for example, as a set of coordinate values for a start point, an end point, and waypoints on the floor.
  • The image data storage unit 22 stores data regarding each image captured continuously by the imaging unit 20 while the vehicle 2 is running.
  • The image-related data stored in the image data storage unit 22 may be the image data itself, or data with a smaller footprint (for example, polygon data extracting the tile shapes from the image data, or data recording the arrangement of the identified finite tile types).
  • The map data storage unit 23 stores map data (information) carrying coordinate data for the floor drawing that makes up the quasi-periodic tiles 8, and is referenced during processing by the self-position estimating unit 33.
  • The route data setting unit 31 accepts the setting of the travel route of the vehicle 2 from the user and stores the set travel route data in the route data storage unit 21.
  • The self-position estimating unit 33 estimates the vehicle's self-position (current position) and traveling direction (movement direction) by matching (pattern matching) the image-related data stored in the image data storage unit 22 against the map data stored in the map data storage unit 23. When an emergency arises, for example due to a failure of the imaging unit 20, the self-position estimating unit 33, as described above, obtains output signals from at least the gyro sensor 26 and the acceleration sensor 27 among the gyro sensor 26, the acceleration sensor 27, and the odometry 28, and uses them as data on the vehicle 2's traveling direction and speed during the emergency.
  • The control unit 35 generates a control signal based on the data acquired from the self-position estimating unit 33 (the estimated self-position and traveling direction of the vehicle 2) and the travel route data acquired from the route data storage unit 21, and sends it to the traveling device 6.
  • The vehicle 2 may also carry an imaging unit (the imaging unit 20 or a separate one), LiDAR, RADAR, or the like for detecting obstacles in the traveling direction of the vehicle 2.
  • When an obstacle is detected, the route data setting unit 31 sets a new route to avoid the obstacle in real time and stores the re-route data in the route data storage unit 21.
  • All or part of the functions of the information processing device 5 may be located somewhere other than on the autonomous vehicle 2, for example on a cloud platform reached via a communication network.
  • FIG. 5 is a plan view showing a travel pattern for initialization travel.
  • FIG. 6 is a flowchart showing the initialization traveling process.
  • FIG. 7 is a plan view showing the route of the main run.
  • FIG. 8 is a flowchart showing the processing of the main run.
  • The map data storage unit 23 stores map data having a coordinate system corresponding to a plan view of the floor surface bearing the pattern of the quasi-periodic tiles 8.
  • The imaging unit 20 is fixed to the vehicle 2 and continuously images the floor near the vehicle 2 while the vehicle 2 is running.
  • Unlike the conventional techniques, the self-position estimating unit 33 of the vehicle 2 does not extract feature points from the entire captured image of the floor near the vehicle 2. Instead, it uses the image-related data stored in the image data storage unit 22 to extract from the floor image data each tile shape of the quasi-periodic tiles 8, formed by straight lines and their intersections, and to identify by pattern matching which of the three tile shapes it corresponds to.
  • First, the vehicle 2 performs an initialization (preparation) run in order to identify its own position and orientation.
  • The user uses the route data setting unit 31 to set a travel pattern for the initialization run.
  • For example, a travel pattern is used in which the vehicle 2 travels straight from its initial state, detects the boundary (edge) of the quasi-periodic tiles 8, and then moves clockwise along the boundary.
  • Alternatively, a travel pattern in which the vehicle moves randomly over the quasi-periodic tiles 8 during the initialization run may be used.
  • The data of the travel pattern set for the initialization run is stored in the route data storage unit 21.
  • For the main run, the user uses the route data setting unit 31 to set a travel route on the floor on which the quasi-periodic tiles 8 are constructed, as indicated by the arrows in FIG. 7.
  • The route data set for the main run is stored in the route data storage unit 21.
  • With the user placing the vehicle 2 on the quasi-periodic tiles 8 at an arbitrary position and orientation as the initial state, the vehicle 2 performs the following initialization-run processing or operation using the initialization travel pattern data (FIG. 6). As noted above, when the vehicle 2 detects an obstacle, it may avoid the obstacle and construct a new route.
  • The imaging unit 20 stores data regarding images obtained by photographing the pattern of the quasi-periodic tiles 8 in the image data storage unit 22. The vehicle 2 can thereby record, in chronological order, how the pattern formed by the three tile types changes in the image as the vehicle travels.
  • The self-position estimating unit 33 extracts the pattern of the quasi-periodic tiles 8 from the image-related data read from the image data storage unit 22.
  • The self-position estimating unit 33 compares the extracted pattern with the map data pattern stored in the map data storage unit 23 (pattern matching) to estimate the self-position (absolute position) and traveling direction of the vehicle 2.
  • The pattern of the quasi-periodic tiles 8 has no global periodicity, but it does have local periodicity and rotational symmetry. Therefore, by driving the vehicle 2 over a sufficient range during initialization, the positional ambiguity can be resolved and the self-position of the vehicle 2 can be pinned down as an absolute position (coordinates) on the map.
  • By continuously photographing and tracking the tile pattern while driving, the vehicle 2 can continuously estimate its own position and traveling direction.
  • While continuously estimating the self-position and traveling direction by photographing and tracking the pattern of the quasi-periodic tiles 8 in this way, the self-position estimating unit 33 calibrates each parameter of the gyro sensor 26, the acceleration sensor 27, and the odometry 28. Each parameter may be held by the gyro sensor 26, the acceleration sensor 27, and the odometry 28 themselves, or stored separately in a parameter storage unit built in the RAM 503 or the like. Calibration of the gyro sensor 26 and the acceleration sensor 27 includes initializing parameters in the initial (stationary) state and optimizing parameters, such as those of an extended Kalman filter, in the moving state. Calibration of the odometry 28 includes optimizing the coefficients used to compute the speed of the vehicle 2 from the wheel rotation rate obtained by the encoder; a minimal sketch of such a coefficient fit appears after this list.
  • The self-position estimating unit 33 determines whether the initialization run is complete based on the state of the estimates of the vehicle 2's self-position and traveling direction and on the state of calibration. If the initialization run is not complete, the process returns to step S11; if it is complete, the process proceeds to step S21 below. The calibration of each parameter may also be carried out after the initialization run is completed.
  • For example, even when an intersection could, from its local angles, be the cyclically recurring point C, detecting point B following point A in the traveling direction of the vehicle 2 makes it possible to identify the intersection as point A.
  • In this way, the time required to complete the initialization run can be shortened.
  • Next, the vehicle 2 performs the following main-run processing or operation (FIG. 8) using the main-run route data (FIG. 7). As noted above, when the vehicle 2 detects an obstacle, it may avoid the obstacle and construct a new route.
  • The imaging unit 20 stores data regarding images obtained by photographing the pattern of the quasi-periodic tiles 8 in the image data storage unit 22. The vehicle 2 can thereby record, in chronological order, how the pattern formed by the three tile types changes in the image as the vehicle travels.
  • The self-position estimating unit 33 extracts a pattern from the image-related data read from the image data storage unit 22.
  • The self-position estimating unit 33 compares (collates) the extracted pattern with the map pattern stored in the map data storage unit 23 to estimate the self-position (absolute position) and traveling direction of the vehicle 2.
  • The control unit 35 compares the self-position and traveling direction of the vehicle 2 with the route data stored in the route data storage unit 21, generates a control signal, and outputs this control signal to the traveling device 6. Even during the main run, the self-position estimating unit 33 may calibrate each parameter of the gyro sensor 26, the acceleration sensor 27, and the odometry 28 from the estimated self-position and orientation at any time.
  • The self-position estimating unit 33 determines whether the end point has been reached, based on the main-run route data. If the end point has not been reached, the process returns to step S21; when the end point is reached, the main-run processing ends.
  • The initialization run does not necessarily have to be performed before every main run.
  • If the vehicle 2 is stopped at an arbitrary parking position on the quasi-periodic tiles 8 and its position and orientation data are stored, the vehicle can move to the starting point and begin the main run immediately at startup, without performing an initialization run.
  • As described above, the vehicle 2 of this embodiment extracts the combined pattern of simply shaped tiles from the image data, pattern-matches it against the stored data of the quasi-periodic tiles 8, and obtains absolute coordinates from the tile positions. It can thereby estimate its self-position with high accuracy while suppressing the processing load.
  • Because the tile shapes let the vehicle 2 trace its relative displacement, that is, its moving direction and distance, with high accuracy in whatever direction it moves, the vehicle 2 can continuously estimate its own position and traveling direction. By controlling the vehicle with reference to the route data, it can accurately follow a virtual route, and it can also create routes in real time to avoid obstacles and avoid them accurately.
  • The quasi-periodic tiles 8 locally have two-dimensional rotational symmetry.
  • For example, the part within the broken line has 12-fold symmetry, so if the area near the center of that rotational symmetry lies on the movement route of the vehicle 2, the resulting ambiguity in orientation must be taken into account. This ambiguity can be resolved by using the orientation data from the gyro sensor 26.
  • Another possible use is to construct the pattern of the quasi-periodic tiles 8 on part of a road (for example, at an intersection) so that, when the vehicle 2 passes over it, the imaging unit 20 images the tile pattern and the absolute position of the vehicle 2 is estimated.
  • The method of this embodiment, in which the vehicle 2 moves while reading the quasi-periodic tiles 8, does not require enough illumination to recognize the entire environment in which the vehicle 2 travels; local illumination sufficient to distinguish the floor pattern near the vehicle 2 is enough. Unlike the SLAM method, it is also unaffected by changes in the surrounding environment, so its processing and operation are not disturbed by changes in the positions of other unmanned vehicles or workers. It can likewise be applied to indoor or outdoor environments with large floor areas and few feature points. Furthermore, objects (such as luggage) placed on the floor on which the vehicle 2 travels can be moved at any time, and the floor layout can be changed dynamically.
  • The information processing device 5 can be realized by a computer and a program, and this program can be recorded on a (non-transitory) recording medium or provided via a communication network such as the Internet.
  • There may be one or more CPUs 501.
  • The vehicle 2 of the above embodiment is one example of an autonomous mobile body capable of autonomous movement.
  • Autonomous mobile bodies also include ships, aircraft, and underwater vehicles (for example, a mobile body that detects cracks in a swimming pool).
  • The autonomous driving control device 3 is likewise one example of an autonomous movement control device. That is, in this specification, the word "travel" can be read as "movement". The traveling device 6 may also be propelled by a jet engine, a propeller, or the like instead of tires.
  • Reference signs: 1 Autonomous driving system (an example of an autonomous movement system); 2 Autonomous vehicle (an example of an autonomous mobile body); 3 Autonomous driving control device (an example of an autonomous movement control device); 5 Information processing device; 6 Traveling device (an example of a moving device); 8 Quasi-periodic tiles; 20 Imaging unit; 21 Route data storage unit; 22 Image data storage unit; 23 Map data storage unit; 26 Gyro sensor; 27 Acceleration sensor; 28 Odometry; 31 Route data setting unit; 33 Self-position estimating unit; 35 Control unit
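As referenced above, the odometry-coefficient calibration can be pictured with a short sketch. The Python fragment below is a minimal illustration, not the patented procedure: it assumes pairs of (encoder ticks, travel distance), with each distance recovered from pattern-matching the quasi-periodic tiles over the same interval during the initialization run, and fits the meters-per-tick coefficient by least squares; all names and numbers are hypothetical.

```python
# Minimal sketch of the odometry-coefficient calibration described above.
# Assumed input (not from the patent): (encoder_ticks, tile_distance)
# pairs, where tile_distance is the travel distance recovered from
# pattern-matching the quasi-periodic tiles over the same interval.

def calibrate_odometry_scale(samples):
    """Least-squares fit of distance = k * encoder_ticks.

    samples: list of (encoder_ticks, tile_distance) tuples collected
    during the initialization run while the camera is healthy.
    Returns the scale coefficient k in meters per tick.
    """
    num = sum(t * d for t, d in samples)
    den = sum(t * t for t, _ in samples)
    if den == 0:
        raise ValueError("no wheel motion observed during calibration")
    return num / den

# Illustrative ticks vs. distances measured from tile matching.
samples = [(1000, 0.314), (2000, 0.629), (4000, 1.255)]
k = calibrate_odometry_scale(samples)
print(f"odometry scale: {k:.3e} m/tick")  # ~3.139e-04 m/tick
```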

Abstract

The present disclosure aims to reduce the processing load while enabling autonomous movement such as autonomous travel. To that end, the present disclosure provides an autonomous movement control device that controls the movement of an autonomous moving body capable of moving autonomously, the device comprising: a self-position estimation unit that estimates the self-position and traveling direction of the autonomous moving body on quasi-periodic tiles composed of a plurality of types of tile shapes, by pattern-matching data relating to images obtained by imaging the quasi-periodic tiles against map data having a coordinate system corresponding to the quasi-periodic tiles; and a control unit that controls the movement of the autonomous moving body on the basis of the self-position and traveling direction estimated by the self-position estimation unit and a movement route of the autonomous moving body that is set in advance.

Description

Autonomous movement control device, autonomous movement system, autonomous movement control method, and program
 The present disclosure relates to a technology for controlling a mobile body capable of autonomous movement.
 AGVs (Automatic Guided Vehicles), which travel autonomously along routes without requiring human operation, are used in indoor environments such as factories and warehouses. Driving an automated guided vehicle (vehicle) requires estimating the vehicle's own position and controlling it so that it travels along a predetermined route. Control methods for driving automated guided vehicles include magnetic guidance, optical guidance, image recognition, laser guidance, and SLAM (Simultaneous Localization and Mapping).
 The magnetic guidance method guides the vehicle along the travel route by having a magnetic sensor mounted on the vehicle detect the magnetic field emitted by magnetic bars or magnetic tape laid on the floor along the route. The magnetic guidance method has the advantage of being simple and highly reliable, but has the disadvantage of being costly when routes are changed. It may also be affected by magnetic fields from piping and the like near the route.
 The optical guidance method guides the vehicle along the travel route by having an optical sensor mounted on the vehicle detect a route formed on the floor by laying guidance tape. The optical guidance method has the advantage of being less costly than the magnetic guidance method.
 The image recognition method uses a camera mounted on the vehicle to read markers, such as QR codes (registered trademark), installed on the floor or ceiling, allowing the vehicle to determine its own location and drive automatically to its destination. These three control methods allow the vehicle to travel only along a pre-fixed route, and are also called route guidance methods.
 In the laser guidance method (reflector guidance method), a laser irradiation device mounted on the vehicle irradiates laser light onto reflective plates installed on walls, floors, or pillars, and a receiver mounted on the vehicle detects the reflected waves; the vehicle thereby estimates its own position and travels without guides.
 SLAM guidance methods include the Visual SLAM method, which uses a camera, and the LiDAR SLAM method, which uses LiDAR (Light Detection and Ranging). Visual SLAM is a method in which a camera mounted on the vehicle captures continuous changes in image data of the surrounding environment, such as walls, so that the vehicle can estimate its own position and travel without guides. In the LiDAR SLAM method, a laser irradiation device mounted on the vehicle irradiates laser light onto the surrounding environment, such as walls, and a laser receiving device mounted on the vehicle measures the time until the reflected light returns; by measuring the distance and direction to the surroundings, the vehicle estimates its own position and travels without guides. The laser guidance method and the SLAM methods are also called autonomous movement methods because they require no guides.
 AGVs that use SLAM are also called AMRs (Autonomous Mobile Robots). The advantage of the SLAM method is that no guides need to be installed, so the travel route can be set freely. In addition, with the route guidance methods the vehicle stops temporarily when it detects an obstacle, whereas with the SLAM method the vehicle can avoid the obstacle and continue driving by re-planning its route.
 On the other hand, the SLAM method requires processing to eliminate the effects of changes in the surrounding environment, such as doors opening and closing, changes in how goods are loaded on shelves, and the movement of workers and other transport vehicles. Movement may also be hindered in spaces without feature points.
 As an autonomous movement method that addresses these issues of the SLAM method, a system that extracts feature points from floor image data captured by a vehicle-mounted camera and traces them in order to drive automatically has been studied (Non-patent document 1).
 However, methods for extracting feature points from image data of the surrounding environment include feature-based methods, which extract feature points from the image, and direct methods, which reference the entire image. In either case, when the travel route involves diverse changes in the surroundings beyond static objects, such as pedestrians, moving vehicles, and doors opening and closing, the image processing required to generate map data and perform image matching becomes complicated, the amount of processing grows, and the processing load increases.
 The present invention was made in view of the above circumstances, and its object is to suppress the processing load while enabling autonomous movement such as autonomous driving.
 To achieve this object, the invention according to claim 1 provides an autonomous movement control device that controls the movement of an autonomous mobile body capable of moving autonomously, the device comprising: a self-position estimating unit that estimates the self-position and traveling direction of the autonomous mobile body on quasi-periodic tiles composed of a plurality of types of tile shapes, by pattern-matching data regarding images obtained by photographing the quasi-periodic tiles against map data having a coordinate system corresponding to the quasi-periodic tiles; and a control unit that controls the movement of the autonomous mobile body based on the self-position and traveling direction estimated by the self-position estimating unit and a preset travel route of the autonomous mobile body.
 As explained above, the present invention has the effect of enabling autonomous movement with a suppressed processing load.
FIG. 1 is an overall configuration diagram of an autonomous driving system according to an embodiment. FIG. 2 is a configuration diagram of the autonomous vehicle. FIG. 3 is an electrical hardware configuration diagram of the information processing device. FIG. 4A is a plan view of the quasi-periodic tiles, and FIG. 4B shows the shape of each tile. FIG. 5 is a plan view showing the travel pattern of the initialization run. FIG. 6 is a flowchart showing the initialization-run processing. FIG. 7 is a plan view showing the travel route of the main run. FIG. 8 is a flowchart showing the main-run processing. FIG. 9 is a diagram highlighting the 12-fold symmetric part of the quasi-periodic tiles.
 Embodiments of the present invention will be described below with reference to the drawings.
 [System configuration of the embodiment]
 An outline of the configuration of the autonomous driving system of the embodiment will be described using FIG. 1. FIG. 1 is an overall configuration diagram of the autonomous driving system according to the embodiment.
 As shown in FIG. 1, the autonomous driving system 1 is constructed from an autonomous driving vehicle (also referred to as a "vehicle") 2, such as an automated guided vehicle capable of autonomous driving, and quasi-periodic tiles 8 on the floor. The vehicle 2 estimates its own position and traveling direction, and moves, by reading the pattern formed by the plural tile shapes of the quasi-periodic tiles 8.
 [Explanation of the quasi-periodic tiles]
 As shown in FIG. 4A, the quasi-periodic tiles 8 are tiles formed from a plurality of types of tile shapes on which a geometric pattern without periodicity is drawn, and they are installed on the floor surface. As shown in FIG. 4B, the plurality of tile shapes consists of three figures: a square and two types of rhombi, all with equal side length L. Unlike plane fillings with a periodic pattern made up of regular polygons, quasi-periodic tiling is a method that fills the plane with a finite set of polygons without any periodic pattern (no translation maps the pattern onto itself). As one example, Penrose tiling, which fills the plane with two types of rhombi, is well known. The geometric pattern of the quasi-periodic tiles 8 shown in FIGS. 4A and 4B is just an example; any geometric pattern without periodicity may be used.
 In this embodiment, the aperiodicity of the pattern of the quasi-periodic tiles 8 is used for estimating the self-position of the vehicle 2. The quasi-periodic tiles 8 are constructed on the floor surface and are imaged by an imaging unit 20 (described later) mounted on the vehicle 2. As long as the straight lines constituting the quasi-periodic tiles 8 can be optically identified by the imaging unit 20, the tile pattern may be drawn with paint or the like on the floor on which the vehicle 2 runs, or a sheet on which the tile pattern is printed may be used as a carpet. The pattern of the quasi-periodic tiles 8 may also be constructed by applying tape or the like to the floor, or by laying tiles of the several shapes over the floor surface. Alternatively, the pattern of the quasi-periodic tiles 8 may be shown on a display element installed in the floor, or projected onto the floor. In any case, the tiles are assumed to be laid out densely enough to pin down the position of the vehicle 2. That is, the quasi-periodic tiles are assumed to have the density (size) needed to estimate the self-position of the vehicle 2, given the pixel count of the imaging unit 20 mounted on the vehicle 2 and its distance from the floor surface.
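As a rough, illustrative check of this density condition, the tile edge length can be compared against the camera's ground resolution. The sketch below is a back-of-the-envelope calculation under assumed numbers; the camera height, field of view, resolution, and required pixels per tile edge are placeholders, not values from the patent.

```python
import math

# Ground sample distance (floor meters per pixel) for a downward camera.
# All numbers below are illustrative assumptions.
camera_height_m = 0.30       # camera mounted 30 cm above the floor
horizontal_fov_deg = 60.0    # lens field of view
image_width_px = 1280        # sensor width in pixels

footprint_m = 2 * camera_height_m * math.tan(math.radians(horizontal_fov_deg / 2))
gsd = footprint_m / image_width_px  # meters of floor per pixel

# Require, say, at least 40 pixels across one tile edge so that the
# straight edges and intersections can be extracted reliably.
min_edge_px = 40
min_tile_edge_m = min_edge_px * gsd
print(f"floor footprint: {footprint_m:.3f} m, GSD: {gsd * 1000:.2f} mm/px")
print(f"tile edge length L should be at least ~{min_tile_edge_m * 100:.1f} cm")
```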
 The pattern of the quasi-periodic tiles 8 does not necessarily need to be visible; it may be constructed, for example, with invisible markers that emit light in response to light of wavelengths outside the visible range. In that case, the imaging unit 20 is configured with an element that detects the light from the invisible markers.
 [Configuration of the autonomous vehicle]
 Next, the configuration of the vehicle 2 will be described using FIG. 2. FIG. 2 is a configuration diagram of the autonomous vehicle 2.
 The vehicle 2 is composed of an autonomous driving control device 3 and a traveling device 6. The autonomous driving control device 3 photographs the quasi-periodic tiles 8 to estimate the self-position and traveling direction of the vehicle 2, and controls the operation of the traveling device 6. The traveling device 6 is composed of a motor, tires, a battery, a steering mechanism, and the like, and makes the vehicle 2 travel.
 <Autonomous driving control device>
 The autonomous driving control device 3 includes an information processing device 5, an imaging unit 20, a gyro sensor 26, an acceleration sensor 27, and an odometry 28.
 The imaging unit 20 is a camera that photographs the floor surface, including the quasi-periodic tiles 8, to generate image data. It is fixed to the vehicle 2 facing the floor surface and photographs the floor. The offset (relative position) between any pixel in the image data obtained by the imaging unit 20 and the reference position of the vehicle 2 is assumed to have been measured and calibrated in advance. The imaging unit 20 may be a monocular camera, a stereo camera, or an omnidirectional camera equipped with a fisheye lens. It may consist of a single camera, or a plurality of cameras may divide the imaging range among them. The imaging unit 20 may also be a remote sensing device other than a camera, such as LiDAR.
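The pre-measured pixel offset described here amounts to a fixed two-dimensional transform from image pixels to the vehicle's reference frame. The following is a minimal sketch, assuming a rigidly mounted downward camera over a flat floor with negligible lens distortion, so the mapping reduces to a scale, a rotation, and a translation; all constants stand in for hypothetical calibration results.

```python
import math

# Fixed pixel -> vehicle-frame transform for a rigidly mounted downward
# camera over a flat floor. The scale, mounting yaw, and offsets are
# assumed to come from the offline calibration the text describes.
SCALE_M_PER_PX = 0.00035              # ground resolution from calibration
MOUNT_YAW_RAD = math.radians(1.2)     # camera rotation relative to the body
OFFSET_X_M, OFFSET_Y_M = 0.12, -0.03  # camera center in the vehicle frame

def pixel_to_vehicle(u, v, cx=640.0, cy=512.0):
    """Map an image pixel (u, v) to floor coordinates in the vehicle frame."""
    # Center the pixel, scale to meters, then rotate and translate into
    # the vehicle body frame.
    x_cam = (u - cx) * SCALE_M_PER_PX
    y_cam = (v - cy) * SCALE_M_PER_PX
    c, s = math.cos(MOUNT_YAW_RAD), math.sin(MOUNT_YAW_RAD)
    return (c * x_cam - s * y_cam + OFFSET_X_M,
            s * x_cam + c * y_cam + OFFSET_Y_M)

# A tile intersection seen at pixel (700, 480), in vehicle coordinates:
print(pixel_to_vehicle(700, 480))
```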
 The information processing device 5 is a computer such as a PC (Personal Computer). Based on the position and orientation at which the imaging unit 20 is installed on the vehicle 2 and the image-related data acquired from the imaging unit 20, it measures the relative position of the vehicle 2 with respect to feature points in the image (the sides and intersections of the quasi-periodic tiles 8), thereby estimating the self-position (absolute position) and traveling direction of the vehicle 2, and it sends control signals to the traveling device 6 to control the traveling device's operation. The information processing device 5 will be described in detail later.
 As long as the autonomous driving control device 3 of this embodiment has the imaging unit 20 and the information processing device 5, it can drive autonomously by reading the quasi-periodic tiles 8. Even if image-related data temporarily becomes unavailable, for example due to a failure of the imaging unit 20, autonomous driving can continue by dead reckoning using the sensors and the like. The sensors also make the estimation of the vehicle's position and orientation during autonomous travel robust against the local periodicity and rotational symmetry of the quasi-periodic tile pattern. However, the parameters of the gyro sensor 26, the acceleration sensor 27, and the odometry 28 must be calibrated during the initialization run (FIG. 5), described later, before any failure of the imaging unit 20 occurs.
 Here, the gyro sensor 26 is a sensor that detects the angle, angular velocity, or angular acceleration of an object (here, the vehicle 2). The acceleration sensor 27 is a sensor that measures the acceleration of an object (here, the vehicle 2). The odometry 28 is a device that measures the number of wheel rotations of the vehicle 2, using a rotary encoder for example, and calculates the travel distance, speed, and turning angle of the vehicle 2 from the wheel rotation count. The odometry 28 is a device for further raising the accuracy of self-position estimation and complements the gyro sensor 26 and the acceleration sensor 27, so it does not necessarily have to be provided in the autonomous driving control device 3. That is, the odometry 28 lets the vehicle 2 improve the accuracy and reliability of its self-position estimates.
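As a concrete picture of what the odometry 28 computes, here is a minimal dead-reckoning sketch. It assumes a differential-drive base, which the patent does not specify, and placeholder values for the wheel radius, track width, and encoder resolution.

```python
import math

# Differential-drive dead reckoning from wheel encoder ticks: the kind of
# travel distance, speed, and turning angle computation attributed to the
# odometry 28. All geometry values are assumptions.
WHEEL_RADIUS_M = 0.05
TRACK_WIDTH_M = 0.30
TICKS_PER_REV = 2048

def integrate(pose, left_ticks, right_ticks):
    """Advance pose = (x, y, theta) by one encoder sample."""
    m_per_tick = 2 * math.pi * WHEEL_RADIUS_M / TICKS_PER_REV
    dl, dr = left_ticks * m_per_tick, right_ticks * m_per_tick
    ds = (dl + dr) / 2                   # distance moved by the body center
    dtheta = (dr - dl) / TRACK_WIDTH_M   # turning angle over the sample
    x, y, th = pose
    # Using the midpoint heading keeps the arc approximation reasonable.
    x += ds * math.cos(th + dtheta / 2)
    y += ds * math.sin(th + dtheta / 2)
    return (x, y, th + dtheta)

pose = (0.0, 0.0, 0.0)
for lt, rt in [(300, 300), (300, 330), (300, 330)]:  # straight, then curving
    pose = integrate(pose, lt, rt)
print(pose)
```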
 <Information processing device>
 Next, the information processing device 5 will be described in detail.
 (Hardware configuration of the information processing device)
 The hardware configuration of the information processing device will be described using FIG. 3. FIG. 3 is an electrical hardware configuration diagram of the information processing device 5.
 As shown in FIG. 3, the information processing device 5 is a computer comprising a CPU (Central Processing Unit) 501, a ROM (Read Only Memory) 502, a RAM (Random Access Memory) 503, an SSD (Solid State Drive) 504, an external device connection I/F (Interface) 505, a network I/F 506, a media I/F 509, and a bus line 510.
 Of these, the CPU 501 controls the operation of the entire information processing device 5. The ROM 502 stores programs used to start the CPU 501, such as an IPL (Initial Program Loader). The RAM 503 is used as a work area for the CPU 501.
 The SSD 504 reads and writes various data under the control of the CPU 501. An HDD (Hard Disk Drive) may be used instead of the SSD 504.
 The external device connection I/F 505 is an interface for connecting various external devices, such as a display, speaker, keyboard, mouse, USB (Universal Serial Bus) memory, and printer.
 The network I/F 506 is an interface for data communication via a communication network such as the Internet and/or a LAN (Local Area Network). A wireless communication device may be used instead of the network I/F 506.
 The media I/F 509 controls reading and writing (storing) of data on a recording medium 509m such as a flash memory. The recording media 509m also include DVDs (Digital Versatile Discs), Blu-ray Discs (registered trademark), and the like.
 The bus line 510 consists of an address bus, a data bus, and the like for electrically connecting the components such as the CPU 501 shown in FIG. 3.
 (Functional configuration of the information processing device)
 Next, returning to FIG. 2, the functional configuration of the information processing device 5 will be described.
 The information processing device 5 has a route data setting unit 31, a self-position estimating unit 33, and a control unit 35. Each of these units is a function realized by the CPU 501 executing instructions according to a program stored in the RAM 503 or the like. In addition, a route data storage unit 21, an image data storage unit 22, and a map data storage unit 23 are built in the RAM 503 or the SSD 504 of the information processing device 5.
 Of these, the route data storage unit 21 stores the travel route data set by the route data setting unit 31. The route data is given, for example, as a set of coordinate values for a start point, an end point, and waypoints on the floor.
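As a concrete picture of such route data, here is a minimal sketch; the container type and the coordinate values are illustrative assumptions, not a format defined by the patent.

```python
from dataclasses import dataclass, field

# A travel route as the text describes it: a set of coordinate values for
# a start point, an end point, and intermediate waypoints on the floor,
# expressed in the map coordinate system of the quasi-periodic tiles.
@dataclass
class Route:
    start: tuple                 # (x, y) in meters
    end: tuple
    waypoints: list = field(default_factory=list)

    def points(self):
        """Full ordered point sequence for the controller to follow."""
        return [self.start, *self.waypoints, self.end]

route = Route(start=(0.5, 0.5), end=(7.0, 4.0),
              waypoints=[(2.0, 0.5), (2.0, 3.5), (6.0, 3.5)])
print(route.points())
```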
 The image data storage unit 22 stores data regarding each image captured continuously by the imaging unit 20 while the vehicle 2 is running. The image-related data stored in the image data storage unit 22 may be the image data itself, or data with a smaller footprint (for example, polygon data extracting the tile shapes from the image data, or data recording the arrangement of the identified finite tile types).
The map data storage unit 23 stores map data (information) accompanied by coordinate data for the drawing data of the floor surface on which the quasi-periodic tiles 8 are constructed; this map data is referred to during processing by the self-position estimation unit 33.
The route data setting unit 31 accepts the setting of a travel route for the vehicle 2 from the user, and stores the data of the set travel route in the route data storage unit 21.
The self-position estimation unit 33 estimates the vehicle's self-position (current position) and traveling direction (direction of movement) by performing matching processing (pattern matching) between the image-related data stored in the image data storage unit 22 and the map data stored in the map data storage unit 23. If an emergency occurs due to a failure of the imaging unit 20 or the like, the self-position estimation unit 33, as described above, acquires output signals from at least the gyro sensor 26 and the acceleration sensor 27 among the gyro sensor 26, the acceleration sensor 27, and the odometry 28, and uses them as data on the traveling direction and speed of the vehicle 2 during the emergency.
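The emergency fallback described above is, in effect, planar dead reckoning from the inertial sensors. The following minimal sketch illustrates one such step; the function and signal names are assumptions for illustration, not part of this embodiment:

    import math

    def dead_reckon(x, y, heading, speed, yaw_rate, accel, dt):
        """One dead-reckoning step using only the gyro sensor 26 (yaw rate)
        and the acceleration sensor 27 (longitudinal acceleration), for use
        when image data cannot be acquired. A sketch assuming a planar model."""
        heading += yaw_rate * dt             # integrate angular velocity
        speed += accel * dt                  # integrate longitudinal acceleration
        x += speed * math.cos(heading) * dt  # advance along the current heading
        y += speed * math.sin(heading) * dt
        return x, y, heading, speed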
The control unit 35 generates a control signal for controlling the traveling device 6 based on the data acquired from the self-position estimation unit 33 (the estimates of the self-position and traveling direction of the vehicle 2) and the travel route data acquired from the route data storage unit 21, and sends the control signal to the traveling device 6. The vehicle 2 may also be equipped with an imaging unit (the imaging unit 20 or a separate imaging unit), LiDAR, RADAR, or the like for detecting obstacles in the direction of travel of the vehicle 2. When an obstacle is detected, the route data setting unit 31 sets a re-route to avoid the obstacle in real time, and stores the re-route data in the route data storage unit 21.
Note that all or part of the functions of the information processing device 5 may be located somewhere other than the autonomous vehicle 2, for example on a cloud platform, and accessed via a communication network.
[Processing or operation of the autonomous vehicle]
Next, the processing or operation of the vehicle 2 will be described with reference to FIGS. 5 to 8. FIG. 5 is a plan view showing the travel pattern of the initialization run. FIG. 6 is a flowchart showing the processing of the initialization run. FIG. 7 is a plan view showing the route of the main run. FIG. 8 is a flowchart showing the processing of the main run.
The map data storage unit 23 stores map data having a coordinate system corresponding to a plan view of the floor surface bearing the pattern of the quasi-periodic tiles 8. The imaging unit 20 is fixed to the vehicle 2 and continuously images the floor surface near the vehicle 2 while the vehicle 2 is travelling. Unlike conventional techniques, the self-position estimation unit 33 of the vehicle 2 does not extract feature points from the entire image data of the captured floor surface near the vehicle 2; instead, it uses the image-related data stored in the image data storage unit 22 to extract from the floor-surface image data each shape of the quasi-periodic tiles 8, which are composed of straight lines and intersections, and to identify by pattern matching which of the three tile shapes each one corresponds to.
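For illustration, if the three tile shapes are taken to be those of a square, triangle, and rhombus twelve-fold quasi-periodic tiling (an assumption made only for this sketch; the embodiment states merely that there are three shapes), the identification step could classify each extracted polygon by its interior angles:

    def classify_tile(interior_angles_deg):
        """Classify an extracted tile polygon into one of three assumed shapes
        by its interior angles, snapped to the nearest 30 degrees (a sketch)."""
        snapped = sorted(round(a / 30) * 30 for a in interior_angles_deg)
        if snapped == [60, 60, 60]:
            return "triangle"
        if snapped == [90, 90, 90, 90]:
            return "square"
        if snapped == [30, 30, 150, 150]:
            return "rhombus"
        return "unknown"  # reject noisy or partial detections

    print(classify_tile([89.2, 91.1, 90.4, 89.3]))  # -> "square"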
First, before the main run, the vehicle 2 performs an initialization (preparation) run in order to identify its own position and orientation. For example, the user sets a travel pattern for the initialization run via the route data setting unit 31. As the initialization travel pattern, for example, a pattern is used in which, as shown in FIG. 5, the vehicle 2 travels straight from its initial state and, after detecting a boundary (edge) of the quasi-periodic tiles 8, travels clockwise along the boundary. Alternatively, a pattern in which the vehicle moves randomly over the quasi-periodic tiles 8 may be used for the initialization run. The data of the set initialization travel pattern is stored in the route data storage unit 21.
Further, for the main run of the vehicle 2, the user sets, for example via the route data setting unit 31, a main-run route such as the one indicated by the arrows in FIG. 7 as a travel route (path) on the floor surface on which the quasi-periodic tiles 8 are constructed. The data of the set main-run route is stored in the route data storage unit 21.
After these preparations are completed, the following processing is executed.
<Processing or operation of the initialization run>
First, taking as its initial state the arbitrary position and orientation in which the user has placed it on the quasi-periodic tiles 8, the vehicle 2 performs the initialization run processing or operation shown below using the data of the initialization travel pattern (FIG. 6). As described above, when the vehicle 2 detects an obstacle, it may avoid the obstacle and construct a re-route.
S11: As the vehicle 2 travels, the imaging unit 20 stores data regarding the images obtained by photographing the pattern of the quasi-periodic tiles 8 in the image data storage unit 22. The vehicle 2 can thereby record, in time series, how the pattern formed by the three tile types in the images changes as the vehicle 2 travels.
S12: The self-position estimation unit 33 extracts the pattern of the quasi-periodic tiles 8 from the image-related data read from the image data storage unit 22.
S13: The self-position estimation unit 33 estimates the self-position (absolute position) and traveling direction of the vehicle 2 by comparing the extracted pattern with the pattern of the map data stored in the map data storage unit 23 (pattern matching). Although the pattern of the quasi-periodic tiles 8 has no global periodicity, it does exhibit local repetition and rotational symmetry. Therefore, by travelling over a sufficient range during the initialization run, the vehicle 2 can resolve the positional ambiguity and identify its self-position as an absolute position (coordinates) on the map. Moreover, once its position on the map has been identified, the vehicle 2 can continuously estimate its self-position and traveling direction while travelling by continuously photographing and tracking the tile pattern.
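Conceptually, resolving the positional ambiguity over the initialization run amounts to maintaining a set of candidate map poses and discarding those inconsistent with each new observation. A sketch under assumed interfaces (the predicate `match` is hypothetical, standing in for the pattern-matching step above):

    def refine_candidates(candidates, observation, match):
        """Keep only the candidate map poses consistent with the latest
        observation of the local tile pattern."""
        return [pose for pose in candidates if match(pose, observation)]

    def initialization_filter(candidates, observations, match):
        """Filter per frame until a single absolute pose remains
        (or the observations are exhausted). Illustrative only."""
        for obs in observations:
            candidates = refine_candidates(candidates, obs, match)
            if len(candidates) == 1:
                break
        return candidates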
S14: Each time the self-position estimation unit 33 continuously photographs and tracks the pattern of the quasi-periodic tiles 8 and updates its estimates of the self-position and traveling direction, it calibrates (adjusts) the parameters of the gyro sensor 26, the acceleration sensor 27, and the odometry 28. Each parameter may be held by the gyro sensor 26, the acceleration sensor 27, and the odometry 28 themselves, or may be stored in a separate parameter storage unit constructed in the RAM 503 or the like. Calibration of the gyro sensor 26 and the acceleration sensor 27 includes initialization of parameters in the initial (stationary) state and optimization of parameters, such as those of an extended Kalman filter, in the moving state. Calibration of the odometry 28 includes optimization of the coefficient used when calculating the speed of the vehicle 2 from the wheel rotation rate obtained by the encoder.
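As one concrete instance of the odometry calibration, the coefficient converting encoder output into travelled distance can be re-estimated against the displacement obtained from tile tracking. A sketch with hypothetical names:

    def calibrate_odometry_scale(vision_distance, wheel_ticks,
                                 ticks_per_rev, wheel_circumference):
        """Return a corrective scale factor for the odometry 28, taking the
        vision-based distance from tile tracking as the reference (a sketch;
        all parameter names are assumptions)."""
        raw_distance = (wheel_ticks / ticks_per_rev) * wheel_circumference
        return vision_distance / raw_distance

    # e.g. scale = calibrate_odometry_scale(10.00, 5300, 1024, 1.95)
    # corrected distance = raw odometry distance * scale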
S15: The self-position estimation unit 33 determines whether the initialization run is complete based on the estimation state of the self-position and traveling direction of the vehicle 2 and on the calibration state. If the initialization run is not complete, the process returns to S11. If the initialization run is complete, the process proceeds to S21 below. The calibration of each parameter may instead be performed after the initialization run has finished.
In S11 to S13, as an example of handling image-related data rather than the image data itself, there is a method of estimating the self-position and traveling direction of the vehicle 2 on the quasi-periodic tiles 8 by focusing on the intersections between tiles shown in the image data, referring to the map data (pattern matching) using as clues the number of line segments forming each intersection and the pattern of angles between those segments, and thereby identifying which intersection of the quasi-periodic tiles 8 it is. Point A in FIG. 5 is an intersection where nine line segments meet; with the angles a: 30°, b: 60°, c: 90°, d: 120°, and e: 150°, the angles between successive segments at point A read b, a, a, a, b, b, a, a, a clockwise. Point C is another intersection on the quasi-periodic tiles 8 whose angle sequence is cyclically the same, but by detecting point B following point A in the direction of travel of the vehicle 2, the intersection can be identified as point A. By starting the initialization run near such a characteristic intersection on the quasi-periodic tiles 8, the time required to complete the initialization run can be shortened.
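The comparison of angle sequences described above must be cyclic, because the line segment at which the reading starts is arbitrary. A sketch of that test, using the sequence at point A of FIG. 5:

    def cyclic_matches(observed, reference):
        """True if `observed` equals some rotation of `reference`, i.e. the
        same clockwise angle sequence read from a different starting segment."""
        if len(observed) != len(reference):
            return False
        doubled = reference + reference
        return any(doubled[i:i + len(observed)] == observed
                   for i in range(len(reference)))

    # Point A in FIG. 5: nine segments, clockwise angles b,a,a,a,b,b,a,a,a
    # with a = 30 deg and b = 60 deg (they sum to 360 around the point).
    point_a = [60, 30, 30, 30, 60, 60, 30, 30, 30]
    assert cyclic_matches([30, 30, 60, 60, 30, 30, 30, 60, 30], point_a)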
The initialization run processing is thus completed.
<Processing or operation of the main run>
Subsequently, the vehicle 2 performs the main run processing or operation shown below (FIG. 8) using the main-run route data (FIG. 7). As described above, when the vehicle 2 detects an obstacle, it may avoid the obstacle and construct a re-route.
S21: As the vehicle 2 travels, the imaging unit 20 stores data regarding the images obtained by photographing the pattern of the quasi-periodic tiles 8 in the image data storage unit 22. The vehicle 2 can thereby record, in time series, how the pattern formed by the three tile types in the images changes as the vehicle 2 travels.
S22: The self-position estimation unit 33 extracts a pattern from the image-related data read from the image data storage unit 22.
S23: The self-position estimation unit 33 estimates the self-position (absolute position) and traveling direction of the vehicle 2 by comparing (or collating) the extracted pattern with the map pattern stored in the map data storage unit 23.
S24: The control unit 35 compares the self-position and traveling direction of the vehicle 2 with the route data stored in the route data storage unit 21, generates a control signal, and outputs the control signal to the traveling device 6. Even during the main run, the self-position estimation unit 33 may at any time calibrate the parameters of the gyro sensor 26, the acceleration sensor 27, and the odometry 28 from the estimated self-position and traveling direction.
S25: The self-position estimation unit 33 determines, based on the main-run route data, whether the end point has been reached. If the end point has not been reached, the process returns to S21. If the end point has been reached, the main run processing ends.
The main run processing is thus completed.
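For illustration, steps S21 to S25 reduce to the following loop; every callable here is a hypothetical stand-in for the corresponding unit described above, not an interface defined by this embodiment:

    def main_run(camera, map_data, route, controller, estimator,
                 goal_tolerance=0.05):  # metres; illustrative threshold
        """A sketch of the S21-S25 loop under assumed interfaces."""
        while True:
            image = camera.capture()                    # S21: photograph tiles
            pattern = estimator.extract_pattern(image)  # S22: extract pattern
            pose = estimator.match(pattern, map_data)   # S23: estimate pose
            controller.steer(pose, route)               # S24: control signal
            if estimator.distance_to(pose, route[-1]) < goal_tolerance:
                break                                   # S25: end point reached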
Note that the initialization run need not necessarily be performed before the main run. If, after a main run, the vehicle 2 is parked at an arbitrary position on the quasi-periodic tiles 8 and the data of its position and orientation are stored, then at the next start-up the vehicle can move directly to the start point of the main run and begin it, without performing an initialization run.
As described above, the vehicle 2 of this embodiment extracts a combination pattern of simply shaped tiles from the image data, matches it against the stored data of the quasi-periodic tiles 8, and obtains absolute coordinates from the tile positions, and can thereby estimate its self-position with high accuracy while keeping the processing load low. In doing so, the vehicle 2 can trace its relative displacement, that is, its direction and distance of movement, with high accuracy from the tile shapes even when moving in an arbitrary direction. By continuously estimating its self-position and traveling direction and controlling itself with reference to the route data, it can travel accurately along a virtual path. It can also construct a route that avoids an obstacle in real time and avoid it accurately.
Note that the quasi-periodic tiles 8 locally have two-dimensional rotational symmetry. For example, as shown in FIG. 9, the portion of the quasi-periodic tiles 8 within the broken line has 12-fold symmetry, so when the travel route of the vehicle 2 includes a region near the center of rotational symmetry along its radial directions, the ambiguity in orientation must be taken into account. This ambiguity can be resolved by additionally using orientation data from the gyro sensor 26.
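As a sketch of this disambiguation, the orientation candidates left by the local 12-fold symmetry (spaced 30 degrees apart) can simply be compared against the gyro heading estimate; the function below is illustrative only:

    def resolve_heading(candidates_deg, gyro_heading_deg):
        """Pick the candidate heading closest to the gyro sensor 26 estimate,
        using the smallest angular difference on the circle (a sketch)."""
        def ang_diff(a, b):
            return abs((a - b + 180.0) % 360.0 - 180.0)
        return min(candidates_deg, key=lambda h: ang_diff(h, gyro_heading_deg))

    # e.g. resolve_heading([15, 45, 75, 105], gyro_heading_deg=52.0) -> 45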
[Application example of this embodiment]
When the scheme of this embodiment, in which the vehicle 2 moves while reading the quasi-periodic tiles 8, is used in an outdoor environment, it can be used for hybrid positioning in areas where GNSS (Global Navigation Satellite Systems) signals cannot be received, such as under elevated structures, or for estimating absolute position in an urban-canyon reception environment. Specifically, in areas where GNSS signal reception is difficult, when performing hybrid positioning in which relative positioning means such as the imaging unit 20, LiDAR, or RADAR are fused with GNSS positioning, one conceivable form of use is to construct the pattern of the quasi-periodic tiles 8 on part of the road (for example, at an intersection) in order to cancel the accumulated error of the relative positioning means, and, when the vehicle 2 (in this case an automobile) passes over it, to image the tile pattern of the quasi-periodic tiles 8 with the imaging unit 20 and estimate the absolute position of the vehicle 2.
[Effects of the embodiment]
As explained above, according to this embodiment, even when there are diverse changes in the surrounding environment along the travel route of the vehicle 2, such as pedestrians, moving vehicles, and the opening and closing of doors, the vehicle 2 can, by moving while reading the pattern of the quasi-periodic tiles 8 on the floor surface, estimate its self-position with high accuracy while keeping the processing load low, and travel autonomously to its destination.
Furthermore, according to this embodiment, the vehicle 2 can be operated with flexible route setting without using guides. Therefore, during autonomous travel of the vehicle 2, even after temporarily avoiding an obstacle, it can quickly return to its route from an arbitrary position.
Moreover, unlike the Visual SLAM method, the scheme of this embodiment, in which the vehicle 2 moves while reading the quasi-periodic tiles 8, does not require enough brightness to recognize the entire environment in which the vehicle 2 travels; local brightness sufficient to distinguish the floor pattern near the vehicle 2 is enough. Also, unlike SLAM methods, it is unaffected by changes in the surrounding environment, so its processing and operation are not affected by changes in the positions of other unmanned vehicles or workers. It can also be applied to indoor or outdoor environments that have few feature points and a large floor area. Furthermore, objects (such as packages) placed on the floor surface on which the vehicle 2 travels can be moved at any time, and the floor layout can be changed dynamically.
[Supplement]
The present invention is not limited to the above embodiment; various modifications and applications are possible, for example as shown below.
(1) The information processing device 5 can be realized by a computer and a program, and this program can be recorded on a (non-transitory) recording medium or provided via a communication network such as the Internet.
(2) There may be a single CPU 501 (microprocessor) or a plurality of them.
(3) The vehicle 2 of the above embodiment is one example of an autonomous mobile body capable of autonomous movement. Autonomous mobile bodies also include ships, aircraft, and underwater vehicles (for example, a mobile body that detects cracks in a swimming pool). The autonomous travel control device 3 is likewise one example of an autonomous movement control device. That is, in this specification, the word "travel" can be replaced with "movement". The traveling device 6 may also move by means of a jet engine, a propeller, or the like instead of tires.
1 Autonomous travel system (an example of an autonomous movement system)
2 Autonomous vehicle (an example of an autonomous mobile body)
3 Autonomous travel control device (an example of an autonomous movement control device)
5 Information processing device
6 Traveling device (an example of a moving device)
8 Quasi-periodic tiles
20 Imaging unit
21 Route data storage unit
22 Image data storage unit
23 Map data storage unit
26 Gyro sensor
27 Acceleration sensor
28 Odometry
31 Route data setting unit
33 Self-position estimation unit
35 Control unit

Claims (8)

1.  An autonomous movement control device that controls movement of an autonomous mobile body capable of autonomous movement, comprising:
    a self-position estimation unit that estimates a self-position and a traveling direction of the autonomous mobile body on quasi-periodic tiles by pattern matching between data regarding an image obtained by photographing the quasi-periodic tiles, which are composed of a plurality of types of tile shapes, and map data having a coordinate system corresponding to the quasi-periodic tiles; and
    a control unit that controls the movement of the autonomous mobile body based on the self-position and the traveling direction estimated by the self-position estimation unit and on a preset travel route of the autonomous mobile body.
2.  The autonomous movement control device according to claim 1, comprising:
    a gyro sensor that detects an angle, an angular velocity, or an angular acceleration of the autonomous mobile body; and an acceleration sensor that measures an acceleration of the autonomous mobile body,
    wherein the self-position estimation unit calibrates parameters of the gyro sensor and the acceleration sensor based on the estimated self-position and traveling direction.
3.  The autonomous movement control device according to claim 2, wherein, when the data regarding the image cannot be acquired, the self-position estimation unit estimates the self-position and the traveling direction of the autonomous mobile body using the gyro sensor and the acceleration sensor after the calibration, instead of the pattern matching.
4.  The autonomous movement control device according to claim 2, wherein the autonomous mobile body is an autonomous vehicle having wheels,
    the autonomous movement control device comprises an odometry that obtains a travel distance, a speed, and a rotation angle from a rotation rate of the wheels, and
    the self-position estimation unit calibrates a parameter of the odometry based on the estimated self-position and traveling direction.
5.  The autonomous movement control device according to claim 1, wherein the self-position estimation unit performs the pattern matching against the map data based on the number of line segments forming each intersection between tiles of the quasi-periodic tiles shown in the data regarding the image and the pattern of angles formed by those line segments, and estimates the self-position and the traveling direction of the autonomous mobile body on the quasi-periodic tiles by identifying which intersection of the quasi-periodic tiles it is.
6.  An autonomous movement system comprising:
    the autonomous mobile body provided with the autonomous movement control device according to any one of claims 1 to 5; and
    the quasi-periodic tiles.
7.  An autonomous movement control method executed by an autonomous movement control device that controls movement of an autonomous mobile body capable of autonomous movement, the autonomous movement control device executing:
    self-position estimation processing of estimating a self-position and a traveling direction of the autonomous mobile body on quasi-periodic tiles by pattern matching between data regarding an image obtained by photographing the quasi-periodic tiles, which are composed of a plurality of types of tile shapes, and map data having a coordinate system corresponding to the quasi-periodic tiles; and
    control processing of controlling the movement of the autonomous mobile body based on the self-position and the traveling direction estimated by the self-position estimation processing and on a preset travel route of the autonomous mobile body.
8.  A program that causes a computer to execute the method according to claim 7.
PCT/JP2022/030490 2022-08-09 2022-08-09 Autonomous movement control device, autonomous movement system, autonomous movement method, and program WO2024034025A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/030490 WO2024034025A1 (en) 2022-08-09 2022-08-09 Autonomous movement control device, autonomous movement system, autonomous movement method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/030490 WO2024034025A1 (en) 2022-08-09 2022-08-09 Autonomous movement control device, autonomous movement system, autonomous movement method, and program

Publications (1)

Publication Number Publication Date
WO2024034025A1 true WO2024034025A1 (en) 2024-02-15

Family

ID=89851266

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/030490 WO2024034025A1 (en) 2022-08-09 2022-08-09 Autonomous movement control device, autonomous movement system, autonomous movement method, and program

Country Status (1)

Country Link
WO (1) WO2024034025A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10291180A (en) * 1997-04-21 1998-11-04 Kawasaki Heavy Ind Ltd Position and attitude displacement detecting method for moving body
JP2010117847A (en) * 2008-11-12 2010-05-27 Toyota Motor Corp Moving object, moving object control system, and control method for moving object
CN112729289A (en) * 2020-12-23 2021-04-30 联通物联网有限责任公司 Positioning method, device, equipment and storage medium applied to automatic guided vehicle


Similar Documents

Publication Publication Date Title
JP6925363B2 (en) Automatic positioning system
CA2328227C (en) Method of tracking and sensing position of objects
TWI665538B (en) A vehicle performing obstacle avoidance operation and recording medium storing computer program thereof
JP5105596B2 (en) Travel route determination map creation device and travel route determination map creation method for autonomous mobile body
US20210365038A1 (en) Local sensing based autonomous navigation, and associated systems and methods
US11340093B2 (en) Submap geographic projections
JP6676814B2 (en) Object detection based on rider strength
JP2003015739A (en) External environment map, self-position identifying device and guide controller
WO2019124343A1 (en) Moving body
EP4102327A1 (en) Position recognition method and position recognition system for vehicle
RU2740229C1 (en) Method of localizing and constructing navigation maps of mobile service robot
US11537140B2 (en) Mobile body, location estimation device, and computer program
CN112041634A (en) Mobile robot positioning method, map building method and mobile robot
US20190094025A1 (en) Apparatus and method for localising a vehicle
WO2024034025A1 (en) Autonomous movement control device, autonomous movement system, autonomous movement method, and program
WO2021074660A1 (en) Object recognition method and object recognition device
CN111736599A (en) AGV navigation obstacle avoidance system, method and equipment based on multiple laser radars
WO2019124342A1 (en) Moving body
JP7396353B2 (en) Map creation system, signal processing circuit, mobile object and map creation method
Conduraru et al. Localization methods for mobile robots-a review
JP7121489B2 (en) moving body
JP2021056764A (en) Movable body
KR102651108B1 (en) Apparatus and Method for Estimating Position
WO2021039378A1 (en) Information processing device, information processing method, and program
EP4177693A1 (en) Method and device for obtaining representation models of structural elements