WO2015141445A1 - Mobile object - Google Patents

Mobile object

Info

Publication number
WO2015141445A1
WO2015141445A1 (PCT/JP2015/055870)
Authority
WO
WIPO (PCT)
Prior art keywords
data
orientation
environment
unit
uniform pattern
Prior art date
Application number
PCT/JP2015/055870
Other languages
French (fr)
Japanese (ja)
Inventor
修一 槙
高斉 松本
正木 良三
一登 白根
Original Assignee
Hitachi Industrial Equipment Systems Co., Ltd. (株式会社日立産機システム)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Industrial Equipment Systems Co., Ltd.
Priority to JP2016508645A (granted as patent JP6348971B2)
Publication of WO2015141445A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser

Definitions

  • The present invention relates to a mobile body, and more particularly to an autonomous mobile body equipped with a system for estimating the position and orientation of a mobile body carrying a device capable of detecting its own position and orientation.
  • The present invention also relates to a technique for providing a moving body that transports a transported object, and more particularly to an autonomous moving body that can move autonomously in a track pattern environment.
  • Although the present invention is described through a specific embodiment of an autonomous mobile robot, it can also be applied to robots that do not move autonomously, and further to moving bodies equipped with a camera and GPS, such as the car navigation system of an automobile. Design changes that those skilled in the art could easily make within the scope of the inventive idea are included in the technical scope of the invention.
  • This method is a technology called SLAM (Simultaneous Localization and Mapping): the robot estimates its own position and orientation within the range of the obtained sensor information and at the same time generates and updates the environment map, and it is characterized by moving autonomously in the environment based on these.
  • SLAM Simultaneous Localization and Mapping
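The alternating estimate-then-update structure of SLAM described above can be sketched in one dimension. This is our own illustrative toy, not anything from the patent: the "map" is a single wall position, the gains are arbitrary, and the robot corrects its odometry-predicted position against the map before refining the map with the corrected position.

```python
# Toy 1-D illustration of the SLAM idea: alternate between
# (a) localizing against the map built so far and (b) updating the map.
# All numbers and gains are hypothetical.

def slam_1d(odometry, ranges, wall_guess):
    """odometry: reported step sizes; ranges: measured distance to a
    wall ahead at each step; wall_guess: initial wall-position estimate."""
    pose = 0.0          # estimated robot position
    wall = wall_guess   # the "map": one landmark (the wall)
    history = []
    for step, r in zip(odometry, ranges):
        pose += step                     # predict from odometry
        # localization: the reading says the wall is at pose + r;
        # move the pose halfway toward agreement with the map
        innovation = (wall - r) - pose
        pose += 0.5 * innovation
        # mapping: refine the wall position with the corrected pose
        wall = 0.5 * wall + 0.5 * (pose + r)
        history.append((pose, wall))
    return history
```

With biased odometry (reported 1.1 m steps for true 1.0 m steps) and exact range readings, the pose and wall estimates stay close to the truth instead of drifting with the odometry.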
  • Patent Document 1 (Japanese Patent Laid-Open No. 2005-332204) discloses self-position detecting means such as GPS (Global Positioning System) and means for detecting the distance and direction to surrounding objects.
  • GPS Global Positioning System
  • A movement control device having such object detection means and a function of generating an environment map in the moving direction based on the detection data is known.
  • The problem addressed by Patent Document 1 is to control a mobile body so that it moves accurately along a target route and avoids unexpected obstacles on the route even when radio waves from GPS satellites cannot be received.
  • As a solution, a movement control device is provided that controls the movement of a mobile body that moves autonomously according to preset target route information.
  • The movement control device includes a self-position detecting unit that detects the current position and direction of the moving body, an ALR (Area Laser Radar) that detects the distance between the moving body and objects existing around it, and control means that, based on the self-position detecting unit and the ALR, controls the moving body so that it moves along the path.
  • The control means cumulatively generates an environment map around the moving body, taking the presence of objects into account as the moving body moves, and determines a course for the moving body that does not interfere with objects, based on the target route information and the environment map.
  • Patent Document 2 Japanese Patent Application Laid-Open No. 2004-276168
  • The subject of Patent Document 2 is to provide a map creation system that can create maps for mobile robots with high accuracy.
  • In the solution, new map information is first created by the mobile robot. Next, the existing map information is read, and the relative postures in the existing map are taken out one by one. It is then checked whether the same relative posture also exists in the new map information.
  • A relative posture that the new map information and the existing map information have in common is stochastically fused. If the new map information does not have the same relative posture, the relative posture is added to the new map information. Afterward, if there is a piece constituting a loop in the relative posture set of the new map information, a loop-closing process for eliminating the accumulated deviation is performed.
  • Patent Document 3 (Japanese Patent Laid-Open No. 2007-94743) shows a map data generation unit and a position estimation unit arranged in an autonomous mobile robot or in a server device.
  • The problem addressed by Patent Document 3 is to provide an autonomous mobile robot, and a system therefor, that can easily be taught position information arbitrarily selected by the user and that moves autonomously to a destination specified based on the taught position information.
  • The robot includes an information input unit that receives input of position information and destination information from the user, and a position information storage unit that stores, in a table, the position information input from the information input unit in association with the position estimated by the position estimation unit.
  • The movement path planning unit associates the input destination information with the table stored in the position information storage unit and, referring to map data that represents the obstacle information within the robot's movement space, obtains the travel path from the position estimated by the position estimation unit so that the robot moves autonomously.
  • Patent Document 4 (Japanese Patent Application Laid-Open No. 2005-242489) describes, as background art relating to a moving body that transports a transported object, that "a route with the shortest distance can be set by reducing restrictions on the route on which the autonomous mobile body travels, and collisions and traffic jams are prevented even when a plurality of autonomous mobile bodies travel simultaneously."
  • Patent Document 5 (Japanese Patent Laid-Open No. 2007-133891) and Patent Document 6 (Japanese Patent Laid-Open No. 2006-293975) disclose inventions relating to an autonomous moving body that moves along a wall surface.
  • Patent Document 5 describes a "wall-following mode" in which the robot moves along a wall surface. This is explained only as "wall following, that is, edge cleaning in the case of a cleaning robot, in which only the edge of the room is cleaned" (paragraph 0053), i.e., an operation mode of a cleaning robot aimed at cleaning only the edge of a room or the edges of objects in the room.
  • Techniques are thus disclosed such as cumulatively generating an environment map around a moving body in the moving direction, taking the presence of objects into account as the moving body moves, and determining a course for the moving body that does not cause interference with objects, based on the target route information and the environment map.
  • Such conventional techniques are feasible in a general environment having many geometric features (such as irregularities).
  • An object of the present invention is to realize a robot capable of automatic traveling even in an environment poor in such geometric features.
  • An environment in which the surroundings of the autonomous mobile body have sufficient geometric features such as irregularities, for example a factory or warehouse where much equipment such as machine tools and storage shelves is installed, so that all parameters of position and orientation are uniquely obtained when the sensor data is matched with a map, is called a "general environment".
  • By contrast, an environment poor in geometric features, such as flat walls on both sides of a passage, in which matching the sensor data with the map yields multiple matching points and multiple candidate solutions for the position/orientation parameters, is called a "uniform pattern environment", because the pattern of geometric features continues uniformly.
  • An environment in which walls continue on both sides of a route, that is, one that appears on a map as two parallel lines continuing like a track, is called a "track pattern environment"; it is a subordinate concept of the "uniform pattern environment".
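The ambiguity that defines a uniform pattern environment can be made concrete with a small experiment of our own (the grid, poses, and scan are illustrative, not from the patent): matching a scan of two parallel corridor walls against a corridor map gives exactly the same overlap score for every shift along the corridor, while the lateral offset is determined uniquely.

```python
# Corridor map: two parallel walls at y = 0 and y = 4, x = 0..20.
walls = {(x, y) for x in range(21) for y in (0, 4)}

# Scan of both walls as seen from a robot at (10, 2), in sensor frame:
scan = [(dx, -2) for dx in range(-3, 4)] + [(dx, 2) for dx in range(-3, 4)]

def score(rx, ry):
    """Number of scan points landing on an occupied map cell for a
    candidate robot position (rx, ry)."""
    return sum((rx + dx, ry + dy) in walls for dx, dy in scan)

along = [score(rx, 2) for rx in range(4, 17)]   # slide along the corridor
across = [score(10, ry) for ry in range(0, 5)]  # slide across the corridor
```

Every entry of `along` is identical (the along-corridor coordinate is unobservable here), while `across` has a single best value: exactly the situation where only some of the three position/orientation parameters are uniquely determined.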
  • A representative mobile body includes a distance sensor unit that detects the distance between the device and obstacles around it, a controller that estimates the position and orientation of the device based on the detection result of the distance sensor unit and controls traveling, and a moving mechanism that makes the device travel autonomously under the control of the controller.
  • The controller unit estimates the position and orientation of the device using a map of the travel environment and the distance data that is the detection result of the distance sensor unit, and switches the travel mode between a state where the position and orientation of the device are uniquely determined and a state where they are not.
  • Further, the controller unit performs control so that the vehicle stops traveling in a segment for which end data is not set, and travels in the wall-following travel mode in a segment for which end data is set.
  • The controller unit also performs control so that a moving body that has stopped traveling in a segment for which end data is not set resumes traveling when it returns to a state in which the position and orientation of the device are uniquely determined.
  • Patent Document 5 and Patent Document 6 described above disclose inventions relating to an autonomous moving body that moves along a wall surface.
  • However, Patent Document 5 only describes a cleaning robot moving along a wall in order to clean the edge of a room and the edges of objects in the room.
  • Patent Document 6 presents a solution to the problem that "if the wall surface is discontinuous, the distance from the wall surface cannot be obtained, so that appropriate movement along the boundary is not performed."
  • Patent Document 6 does not disclose any related technology.
  • A typical effect of the above configuration is that, in an environment with many geometric features such as irregularities, where all the parameters of position and orientation are obtained with high accuracy by matching the sensor data with the map, automatic traveling can be performed in the general travel mode using all parameters.
  • In an environment poor in geometric features, automatic traveling can be performed in the wall-following travel mode, which uses only the parameters obtained with high accuracy.
  • As a result, the mobile body can move autonomously where an environment with many geometric features due to facilities, such as a factory, and an environment poor in geometric features, such as a corridor, are mixed.
  • FIG. 2 is a block diagram showing a software configuration in the mobile unit of FIG. 1.
  • FIG. 3 is an explanatory diagram showing a current position estimation process by a distance sensor unit in the moving body of FIG. 1.
  • A flowchart illustrating an outline of the autonomous movement processing of the moving body of FIG. 1.
  • FIG. 3 is an explanatory diagram showing an example of the data structure of route data and uniform pattern environment termination data in the mobile body of FIG. 1.
  • FIG. 2 is an explanatory diagram showing position and orientation estimation processing in the mobile body of FIG. 1.
  • FIG. 7 is an enlarged view of a range C shown in FIG.
  • FIG. 2 is a flowchart showing processing of a general traveling mode and a wall-following traveling mode of the moving body in the moving body of FIG. 1.
  • FIG. 3 is an explanatory diagram showing an example of detailed processing for switching from a wall-following travel mode to a general travel mode in the mobile body of FIG. 1.
  • Needless to say, the constituent elements are not necessarily indispensable unless otherwise specified or clearly indispensable in principle.
  • Likewise, when the shape, positional relationship, and the like of the components are mentioned, substantially similar or analogous shapes and the like are included, unless otherwise specified or clearly excluded in principle. The same applies to the numerical values and ranges above. [Outline of the embodiment]
  • The mobile body includes a distance sensor unit (distance sensor unit 12) that detects the distance between the device and obstacles around it, a controller unit (controller unit 11) that estimates the position and orientation of the device based on the detection result of the distance sensor unit and controls traveling, and a moving mechanism unit (moving mechanism unit 21) that makes the device travel autonomously under the control of the controller unit.
  • The controller unit estimates the position and orientation of the device using a map of the traveling environment and the distance data that is the detection result of the distance sensor unit, and switches the travel mode between a state where the position and orientation of the device are uniquely determined (general environment) and a state where they are not (uniform pattern environment).
  • Further, the controller unit performs control so that the vehicle stops traveling in a segment for which end data is not set, and travels in the wall-following travel mode in a segment for which end data is set.
  • The controller unit also performs control so that a moving body that has stopped traveling in a segment for which end data is not set resumes traveling when it returns to a state in which the position and orientation of the device are uniquely determined.
  • Hereinafter, the embodiment will be described with reference to FIGS. 1 to 10, limited to an autonomous mobile body (specifically, applied to an autonomous mobile robot).
  • First, the functions of the mobile body assumed in this embodiment will be described for a mobile body that travels automatically between the general environment and the uniform pattern environment.
  • The configuration and main processing contents will be described first, followed by a more specific hardware/software configuration and their operations. <Functional configuration of the mobile body>
  • FIG. 1 shows a functional configuration of a mobile body (also referred to as an autonomous mobile body) 10 that can autonomously move in the track pattern environment of the present embodiment.
  • The autonomous mobile body 10 includes a controller unit 11 that controls traveling by estimating the position and orientation of the autonomous mobile body 10, a distance sensor unit 12 that detects the distance between the autonomous mobile body 10 and the outer wall surfaces of obstacles and the like around it, and a moving mechanism unit 21 for traveling autonomously.
  • The controller unit 11 includes a distance sensor control unit 13 that controls the distance sensor unit 12, and a position/orientation estimation unit 14 that receives the detection result from the distance sensor unit 12 and estimates the position and orientation of the autonomous mobile body 10.
  • the position / orientation estimation unit 14 preferably includes two estimation units.
  • these two position / posture estimation units are a normal position / posture estimation unit and an initial position / posture estimation unit.
  • A plurality of obstacles are arranged in the operation area of the autonomous mobile body; in the invention of the prior application, the outer peripheral contour shape of the obstacles when arranged on the map data is referred to as the obstacle arrangement shape, and the description below follows this usage.
  • The normal position/orientation estimation unit estimates the position and orientation according to the prior art, and is not a main subject of the prior invention.
  • the prior invention is characterized in that an initial position / posture estimation unit is provided, and the initial position / posture estimation unit includes a position / posture candidate display / setting unit.
  • the autonomous mobile body according to the embodiment of the invention of the prior application includes a route planning unit that sets a route of an area in which the vehicle travels based on the position / posture estimated by the normal position / posture estimation unit or the initial position / posture estimation unit.
  • a moving mechanism control unit that autonomously moves the autonomous moving body by driving the wheels along the route planned by the route planning unit.
  • The controller unit 11 of the autonomous mobile body 10 of the present invention further includes a uniform pattern environment end determination unit 15 that determines whether or not the current position of the mobile body 10 is at the end of the uniform pattern environment. Furthermore, it includes a motion planning unit 16 that calculates a target position and orientation on the path to be followed from the current position and orientation of the mobile body 10, a moving mechanism control unit 17 that performs control so as to reduce the deviation between the calculated target position/orientation and the current position/orientation of the mobile body 10, and a moving mechanism unit 21 having front casters and rear drive wheels.
  • The storage units for various data include a map data storage unit 18, a route data storage unit 19, and a uniform pattern environment end data storage unit 20. Although not shown here, the parts necessary for the respective units to operate in cooperation, such as a case supporting each unit and a power source and wiring, are assumed to be provided.
  • The distance sensor control unit 13 controls the distance sensor unit 12 to obtain distance data, that is, measurements of the distance and direction to the outer periphery of objects such as equipment in the surrounding environment.
  • A laser distance sensor is used as the distance sensor unit 12. It measures the distance from the sensor to the outer periphery of an object by measuring the time from when a laser is emitted until the emitted laser is reflected by an object in the environment and returns to the sensor.
  • A laser irradiation unit is provided; by measuring while rotating this unit by a fixed rotation angle at a time, the distance to the outer edge of objects within the rotation-angle range can be measured (hereinafter, "scan").
  • Since a laser distance sensor having such a function is conventionally known, the specific distance measuring method is not described further.
  • Since this scan is performed in a plane, the distance and direction from the sensor to the outer edge of objects on the plane formed by the laser (hereinafter, "scan plane") are obtained by the scan.
  • Data combining the distance between the sensor and the object's outer edge in each direction with the direction in which the laser was emitted is simply called distance data.
  • Since each piece of distance data is recorded as a pair of distance and direction, it can be converted into position data with the sensor as the reference.
  • Data obtained by converting the distance data into position data in this way is referred to herein as geometric shape data.
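As a minimal sketch of this conversion (function and variable names are ours, not the patent's), each (distance, direction) pair of the distance data becomes one sensor-relative (x, y) point of the geometric shape data:

```python
import math

def to_geometric_shape(distance_data):
    """Convert distance data, a list of (range_m, bearing_deg) pairs
    from one scan, into sensor-frame (x, y) points."""
    return [(r * math.cos(math.radians(a)), r * math.sin(math.radians(a)))
            for r, a in distance_data]

# e.g. a point 2 m straight ahead and one 2 m to the left of the sensor axis:
points = to_geometric_shape([(2.0, 0.0), (2.0, 90.0)])
```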
  • The laser distance sensor is attached to the moving body 10 so that its scan plane is parallel to the floor surface, so geometric shape data at the height of the scan plane can be obtained.
  • the geometric shape data obtained as described above is sent to the position / orientation estimation unit 14.
  • Into the position/orientation estimation unit 14, the map data storage unit 18, in which the geometric shape of the environment at the height of the scan plane is recorded as an image, is read in advance.
  • A process is performed to search for the position and orientation of the geometric shape data on the map data at which the pixels of the geometric shape data, regarded as an image, that indicate the presence of an object (hereinafter, "object existence pixels") overlap most with the pixels indicating the presence of an object in the map data storage unit 18 (hereinafter, this process is called "matching").
  • Thereby, the position and orientation of the geometric shape data in the coordinate system of the map data storage unit 18 at which the geometric shape data and the map data overlap most are obtained.
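The matching search itself can be sketched as a brute-force search over candidate poses that keeps the pose with the highest pixel overlap. This is our own illustration, not the patent's implementation (real matchers refine it, e.g. coarse-to-fine search), and the map and scan used below are hypothetical:

```python
import math

def match(scan_pts, occupied, xs, ys, thetas):
    """scan_pts: sensor-frame (x, y) points; occupied: set of occupied
    map cells; xs/ys/thetas: candidate pose values. Returns the pose
    whose transformed scan overlaps the most occupied cells."""
    best, best_score = None, -1
    for th in thetas:
        c, s = math.cos(th), math.sin(th)
        for x in xs:
            for y in ys:
                # rotate the scan by th, translate by (x, y), count hits
                score = sum(
                    (round(x + c * px - s * py),
                     round(y + s * px + c * py)) in occupied
                    for px, py in scan_pts)
                if score > best_score:
                    best, best_score = (x, y, th), score
    return best, best_score
```

A straight wall seen from a known pose is enough to check it: the search recovers the pose from which the scan was taken.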
  • the process proceeds to the process in the motion planning unit 16.
  • In the general environment, the motion planning unit 16 operates in the general travel mode, and calculates the target position/orientation on the route to be followed from the route data storage unit 19 read in advance and the current position/orientation of the mobile body 10 obtained by the position/orientation estimation unit 14.
  • The moving mechanism control unit 17 controls the moving mechanism unit 21 so as to reduce the deviation between the calculated target position/orientation and the current position/orientation of the moving body 10. That is, instructions are given to the motors and the like by determining the rotational speed of the wheels (generally the rear drive wheels) and the turning angle of the steering (generally the front casters). As a result, tracking of the moving body 10 along the preset route, and thus automatic traveling to the destination, is realized.
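The deviation-reducing control step can be sketched as a simple proportional controller; the gains, the steering limit, and the names below are illustrative assumptions of ours, not the patent's control law:

```python
import math

def control_step(cur, target, k_lin=0.5, k_ang=1.0, max_steer=0.6):
    """cur/target: (x, y, heading_rad) poses. Returns (speed, steer_angle)
    chosen to shrink the deviation between them."""
    dx, dy = target[0] - cur[0], target[1] - cur[1]
    dist = math.hypot(dx, dy)
    # heading error toward the target point, wrapped to [-pi, pi]
    err = math.atan2(dy, dx) - cur[2]
    err = math.atan2(math.sin(err), math.cos(err))
    speed = k_lin * dist                                  # slow down near target
    steer = max(-max_steer, min(max_steer, k_ang * err))  # clipped steering
    return speed, steer
```

Driving straight toward a target ahead yields zero steering; a target to the side saturates the steering at the limit, which is the qualitative behavior the deviation-reduction described above requires.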
  • When traveling in a general environment, the autonomous mobile body 10 travels autonomously along the preset route by means of the position/orientation estimation unit 14, which receives the detection results from the distance sensor unit 12 and estimates the position and orientation of the mobile body 10.
  • When the moving body 10 enters a uniform pattern environment, it moves forward in the traveling direction so as not to contact obstacles detected by the distance sensor unit 12, and enters the wall-following travel mode operation.
  • In the uniform pattern environment, the uniform pattern environment end determination unit 15 performs determination processing following the position/orientation estimation unit 14.
  • Data from the map data storage unit 18 and from the uniform pattern environment end data storage unit 20, which holds data on the end position of the uniform pattern environment, are loaded into the uniform pattern environment end determination unit 15 in advance.
  • The uniform pattern environment end data storage unit 20 records the position/orientation and shape of a matching search range in which, when the geometric shape data is matched against the map data storage unit 18, the position and orientation are uniquely determined, without multiple solution candidates appearing as they do during matching in the course of the uniform pattern environment.
  • The matching search range indicates the range over which solution candidates are searched during matching (the range of values the solution can take); the range covers the three parameters of position and orientation.
  • The uniform pattern environment end determination unit 15 matches the map data storage unit 18 against the geometric shape data, similarly to the position/orientation estimation unit 14, based on the position/orientation of the matching search range recorded in the uniform pattern environment end data storage unit 20.
  • While automatically traveling in the uniform pattern environment, the moving body 10 moves while performing matching in the position/orientation estimation unit 14 and matching in the uniform pattern environment end determination unit 15. More specifically, in a track pattern environment such as a long corridor, it travels in a straight line, keeping a predetermined distance from both walls.
  • In the uniform pattern environment, the motion planning unit 16 operates in the wall-following travel mode: from the route data storage unit 19 read in advance, the current position and orientation of the mobile body 10 partially obtained by the position/orientation estimation unit 14, and the relative position and orientation with respect to the environment, such as the distance and attitude to the surrounding wall surfaces, it calculates the posture and speed of the moving body 10 for moving along the environment such as the wall surface.
  • The moving mechanism control unit 17 controls the moving mechanism unit 21 so as to reduce the deviation between the calculated posture/speed and the current posture/speed of the moving body 10; that is, instructions are given to the motors and the like for the rotational speed of the wheels, the turning angle of the steering, and so on. Thus, tracking of the moving body 10 to the route, and hence automatic traveling to the destination, is realized.
  • The position/orientation obtained by the uniform pattern environment end determination unit 15 is then used as the initial position/orientation for matching in the next cycle of the position/orientation estimation unit 14.
  • When the position/orientation estimation unit 14, having fallen during travel in the uniform pattern environment into a state where some or all of the three position/orientation parameters cannot be estimated, returns to a state where all parameters can be estimated again, the moving body 10 switches from the wall-following travel mode for the uniform pattern environment to the general travel mode for the general environment and continues automatic traveling to the destination.
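The mode switch described above reduces to a small decision rule over which pose parameters matching determined uniquely. The names and the bookkeeping below are our own illustration of that rule, not code from the patent:

```python
GENERAL, WALL_FOLLOWING = "general", "wall_following"

def next_mode(current_mode, unique_params):
    """unique_params: dict like {'x': True, 'y': True, 'theta': True}
    saying which of the three pose parameters matching determined
    uniquely. General travel requires all three; otherwise the body
    falls back to (or stays in) wall following."""
    if all(unique_params.values()):
        return GENERAL         # every parameter recovered: (re)enter general mode
    return WALL_FOLLOWING      # some parameter ambiguous: follow the wall

# a corridor traversal: general -> corridor (x ambiguous) -> corridor end
modes, mode = [], GENERAL
for unique in [
    {"x": True, "y": True, "theta": True},   # general environment
    {"x": False, "y": True, "theta": True},  # corridor: along-axis x ambiguous
    {"x": True, "y": True, "theta": True},   # end of corridor reached
]:
    mode = next_mode(mode, unique)
    modes.append(mode)
```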
  • The above is the outline of the flow of processing performed by the mobile body 10. <Software configuration of the mobile body>
  • the hardware and software configurations of the mobile body 10 will be described, and the flow of processing of the hardware and software as a whole will be described through an example in which the mobile body automatically runs in both the general environment and the uniform pattern environment.
  • It is assumed that the moving body moves in a plane, and that the two position parameters and one posture parameter that the moving body 10 can take are estimated.
  • FIG. 2 shows the configuration of the hardware of the autonomous mobile body 10 of this embodiment and the software stored therein.
  • the moving body 10 includes a controller (corresponding to the controller unit in FIG. 1) 11, a laser distance sensor (also denoted by reference numeral 12) as a sensor of the distance sensor unit 12, a moving mechanism (corresponding to the moving mechanism unit in FIG. 1) 21, a display 25, an input device 26, and a communication line 27 for communication between these devices.
  • In FIG. 2, only the elements directly related to the flow of processing are shown; a power source and the like necessary for the operation of each element are assumed to be provided.
  • As the distance sensor of the distance sensor unit 12, a sensor of the same type as the laser distance sensor mentioned for the distance sensor unit 12 above is used here.
  • It is assumed that the angle range scanned by the laser distance sensor 12 is 180 degrees, and that the laser is emitted every 0.5 degrees within this range to measure the distance to objects.
  • The angle range, the angular step of laser emission, the maximum distance measurement range, and the like may differ.
  • FIG. 3 shows an estimation process of the current position by the distance sensor unit 12, and shows how the outer edge of an object such as equipment in a general environment is measured by the laser distance sensor 12 attached to the moving body 10.
  • FIG. 3 is a top view showing, as seen from above, a region where objects exist in the illustrated environment (hereinafter, "object existence region") 32 (the wide hatched portion) and the moving body 10 (corresponding to the moving body 10 in FIG. 2).
  • the moving body 10 in the position / orientation in the figure scans an angle range of 180 degrees by a laser distance sensor 12 (corresponding to the laser distance sensor 12 in FIG. 2).
  • As a result, the laser distance sensor 12 obtains geometric shape data 34 (the broken-line portion is the geometric shape data, the thin lines connecting the broken-line portions are auxiliary lines indicating the scanned range, and the narrow hatched portion is the scanned range).
  • In this embodiment, the laser distance sensor 12 is used as described above, but the sensor method may differ as long as the sensor can measure the geometric shape of objects.
  • For example, a stereo camera, or a depth camera that measures the distance to the object for each pixel by irradiating the object with infrared light in a plane, may be used.
  • The moving body 10 is provided with front casters and rear drive wheels, and it is assumed that the moving mechanism unit 21 can drive the rear wheels and turn the front casters.
  • the method of the moving mechanism may be different as long as the effect of moving in the environment can be obtained.
  • other moving mechanisms such as a vehicle having an endless track, a moving body having legs, a ship, an aircraft, and an airship may be used.
  • the moving body 10 is configured to automatically travel.
  • a person may board and steer the moving body, or the moving body may be controlled by remote communication without a person on board.
  • the controller 11 includes a processor 22, a memory 23, and a storage device 24.
  • the storage device 24 stores an operating system (OS) 24a, a controller initialization program 24b that reads the BIOS and starts the OS, a laser distance sensor control program 24c, a position/orientation estimation program 24d, an operation plan program 24e, a moving mechanism control program 24f, a uniform pattern environment termination determination program 24g, a uniform pattern environment termination data storage unit 20, a map data storage unit 18, and a route data storage unit 19.
  • the laser distance sensor control program 24c acquires distance data from the laser distance sensor 12.
  • the position / orientation estimation program 24d calculates the position / orientation by matching the geometric shape data with the map data stored in the map data storage unit 18.
  • the motion planning program 24e calculates a route for reaching the destination based on the route data stored in the route data storage unit 19.
  • the moving mechanism control program 24f calculates the rotational speed of the wheels and the like so that the moving body 10 moves along the route.
  • the uniform pattern environment end determination program 24g detects that the mobile object 10 has reached the vicinity of the end of the uniform pattern environment while traveling in the uniform pattern environment.
  • the uniform pattern environment termination data storage unit 20 stores uniform pattern environment termination data used when the uniform pattern environment termination determination program 24g determines the arrival of the uniform pattern environment near the end.
  • the program and data of the embodiment shown in FIG. 2 are loaded into the memory 23 and then processed by the processor 22.
  • the implementation may be different.
  • the above processing may be realized by programmable hardware such as an FPGA (Field Programmable Gate Array) or a CPLD (Complex Programmable Logic Device).
  • the program and data may be transferred from a storage medium such as a CD-ROM, or may be downloaded from another device via a network.
  • the devices constituting the moving body 10, such as the processor 22, the storage device 24, and the moving mechanism 21, communicate with each other through the wired communication line 27, but they may communicate wirelessly. As long as communication is possible, the controller 11, the display 25, and the input device 26 may be physically remote. Further, the above hardware and software may be selected according to the embodiment.
<Autonomous movement processing of moving objects>
  • FIG. 4 mainly shows, within the flow of processing in the controller 11 mounted on the moving body 10, the processing related to preparation before automatic traveling (outline of the autonomous movement processing of the moving body), while the subsequent figure shows the processing related to traveling itself (processing in the general traveling mode and the wall-following traveling mode).
  • the controller initialization program 24b reads the OS 24a and activates the programs 24c to 24g (402).
  • map data is read from the map data storage unit 18 (hereinafter simply referred to as map data 18) by the position / orientation estimation program 24d (403).
  • the map data 18 is image data, and it is assumed that the presence or absence of an object in the environment is recorded as a pixel value for each pixel.
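Map data of this kind, an image whose pixels record the presence or absence of an object, can be sketched minimally as below. This is an illustrative sketch, not the patent's implementation; the class and field names and the resolution value are assumptions.

```python
class GridMap:
    """Map data stored as an image: one cell per pixel, value 1 = object present."""

    def __init__(self, width, height, resolution=0.05):
        self.width = width
        self.height = height
        self.resolution = resolution  # metres per pixel (assumed value)
        self.cells = [[0] * width for _ in range(height)]  # all free initially

    def set_object(self, x, y):
        """Record the outer edge of an object at cell (x, y)."""
        self.cells[y][x] = 1

    def is_object(self, x, y):
        """True if (x, y) is inside the map and marked as containing an object."""
        return (0 <= x < self.width and 0 <= y < self.height
                and self.cells[y][x] == 1)
```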
  • the laser distance sensor control program 24c controls the laser distance sensor 12 to scan the environment, thereby obtaining the geometric data of the environment (404).
  • initial position / orientation estimation is performed (405).
  • the initial position/orientation estimation is a process performed using the position/orientation estimation program 24d, and refers in particular to the position/orientation estimation performed at the start of operation of the mobile object 10. Since the basic principles of position/orientation estimation and initial position/orientation estimation are the same, position/orientation estimation will be described first.
  • the position/orientation estimation program 24d has a function of calculating the position/orientation by matching the geometric shape data against the map data 18. This function will be described with reference to FIG. 6. Suppose the map data 18 is recorded as image data consisting of object existence pixels (data indicating the outer edge of an object), represented by 60 in FIG. 6, and that the geometric shape data 34 obtained by the moving body 10 scanning at its position/posture is matched against it to estimate the position/posture of the moving body 10.
  • the matching search range is X, centered on the previous estimated position 62. For the search range X, only the position search range is shown, but a posture search range is assumed to be set as well; for example, geometric shape data within an angle range of ±30 degrees centered on the arrow extending from the previous estimated position 62 may be set as the search range for the posture.
  • the overlap between the geometric shape data and the object existence pixels 60 of the map data 18 is evaluated for each position/posture that the geometric shape data can take, and the position/posture that maximizes the overlap is obtained.
  • when the geometric shape data is superimposed on the object existence pixels 60 of the map data 18 at the estimated position 63 of the moving body 10, in the posture of the arrow extending from the estimated position 63, the result is as illustrated.
  • FIG. 6 shows a state where there is a discrepancy between the object existence pixel 60 and the geometric shape data 34.
  • FIG. 7 is an enlarged view of the range indicated by C in FIG.
  • the map data 18 is represented by an image composed of object existence pixels 67 (black pixels) and pixels 69 (white pixels) indicating that no object exists, and the geometric shape data 34 is represented by the object existence pixels 68 (hatched pixels) that form it. At this time, the object existence pixel 68 forming the geometric shape data 34 overlaps the object existence pixel 67 of the map data 18 at the position of pixel 66, and in this case this pixel is regarded as a match. However, if the laser distance sensor 12 of the moving body 10 is on the right side of FIG. 7, the white pixels on the left side of the line formed by the object existence pixels 67 (black pixels) do not necessarily indicate that no object exists there; the white pixel portion on the left side may in fact be the inside of an object.
  • the number of object existence pixels in which the geometric shape data 34 matches the object existence pixels 60 of the map data 18 is obtained exhaustively, in a round-robin manner, by the pixel-unit matching method described above.
  • in the initial position/orientation estimation, the entire map data is set as the search range without using a previous estimated position/orientation, the posture search range is likewise widened to 360 degrees, and the detailed position/posture of the place where the moving body 10 was activated is found.
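The exhaustive (round-robin) pixel-unit matching described above can be sketched as follows. This is a simplified illustration under assumed units (map cells rather than metres for position, degrees for posture); the function name, search ranges, and step sizes are invented for the example, not taken from the patent.

```python
import math

def match_pose(map_pixels, scan_points, centre, xy_range=3,
               angle_range_deg=30.0, angle_step_deg=5.0):
    """Exhaustive (round-robin) matching of scan points against map pixels.

    map_pixels: set of occupied (ix, iy) cells of the map image.
    scan_points: geometric shape data in the sensor frame (cell units).
    centre: previous estimate (x, y, theta_deg), centre of the search range.
    Returns the pose maximising the number of scan points that land on
    occupied map cells, together with that count.
    """
    cx, cy, ct = centre
    n_ang = int(angle_range_deg / angle_step_deg)
    best_score, best_pose = -1, centre
    for dx in range(-xy_range, xy_range + 1):
        for dy in range(-xy_range, xy_range + 1):
            for k in range(-n_ang, n_ang + 1):
                th = math.radians(ct + k * angle_step_deg)
                c, s = math.cos(th), math.sin(th)
                # Transform each scan point into the map frame and count overlaps.
                score = sum(
                    (round(cx + dx + c * px - s * py),
                     round(cy + dy + s * px + c * py)) in map_pixels
                    for px, py in scan_points)
                if score > best_score:
                    best_score = score
                    best_pose = (cx + dx, cy + dy, ct + k * angle_step_deg)
    return best_pose, best_score
```

With an L-shaped wall (which has a corner, i.e. a geometric feature), the solution is unique; along a single straight wall, several translations would score equally, which is exactly the ambiguity the uniform pattern environment causes.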
  • the matching process described above is assumed here as the process by which the position/orientation estimation program 24d calculates the position/orientation, but other methods, for example ICP (Iterative Closest Point), may be used as long as the same effect is obtained.
  • the position / posture when the geometric shape data most closely matches the map data 18 is obtained as the initial position / posture.
  • the search takes time. For this reason, the initial position and orientation may be obtained, for example, by matching around a position/posture specified by the operator on the map data 18 displayed on the display 25, or, if the moving body 10 has a predetermined parking position, by matching around that position.
  • a confirmation screen as to whether or not to finish the destination setting is displayed on the display 25.
  • the operator selects to set the destination by using the input device 26 (406). If the mobile object 10 is not automatically driven, end is selected. In this case, the program ends immediately (407).
  • if a destination is set, the process proceeds to A, that is, to process 801.
  • uniform pattern environment termination data is read from the uniform pattern environment termination data storage unit 20 by the uniform pattern environment termination determination program 24g (803).
  • this data is used to determine that the mobile object 10, which while traveling in a uniform pattern environment is in a state where some or all of the three position/posture parameters cannot be estimated, has reached the vicinity of the end of the uniform pattern environment.
  • FIG. 5 shows the structure of uniform pattern environment termination data 20 and route data 19.
  • Uniform pattern environment termination data is recorded in pairs with path segment data.
  • the route segment data consists of information on the line segment (hereinafter referred to as the segment) running from the route start point to the route end point: the coordinates, in the coordinate system of the map data, of the segment start point and segment end point, the criterion for judging that the end point has been reached (hereinafter simply referred to as the arrival criterion), and the type of segment (hereinafter referred to as the segment type).
  • as the segment type, either a segment in a general environment or a segment in a uniform pattern environment is recorded.
  • the arrival determination criterion is a criterion for determining that the moving body 10 travels from the route start point toward the route end point and arrives at the coordinates of the route end point.
  • the route data 19 in FIG. 2 is configured by collecting the route segment data.
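The route segment data and its arrival determination can be sketched as a small data structure. This is an illustrative sketch: the class and field names are invented, and the arrival criterion is assumed here to be a simple distance tolerance around the end point, which the patent does not specify.

```python
import math
from dataclasses import dataclass

@dataclass
class RouteSegment:
    start: tuple              # segment start point, in map coordinates
    end: tuple                # segment end point, in map coordinates
    arrival_tolerance: float  # arrival criterion (assumed: radius around the end point)
    segment_type: str         # "general" or "uniform_pattern"

def reached_end(segment, pose_xy):
    """Arrival determination: true when the body is within tolerance of the end point."""
    dx = pose_xy[0] - segment.end[0]
    dy = pose_xy[1] - segment.end[1]
    return math.hypot(dx, dy) <= segment.arrival_tolerance

# Route data: a collection of route segment data (values are illustrative).
route = [
    RouteSegment((0, 0), (0, 10), 0.3, "general"),
    RouteSegment((0, 10), (20, 10), 0.3, "uniform_pattern"),
    RouteSegment((20, 10), (20, 0), 0.3, "general"),
]
```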
  • the uniform pattern environment end data recorded together with the route segment data includes a return search range position and orientation and a return search range shape.
  • the return search range position and orientation refer to the position and orientation of the search range used when matching the geometric shape data against the map data in order to return, from a state in which some of the three position/orientation parameters cannot be estimated while traveling in a uniform pattern environment, to a state in which all parameters can be estimated.
  • the return search range shape indicates the shape of this search range for matching.
  • here the shape is simply a rectangle having a length h in the horizontal direction and a length v in the height direction, but it may be changed in accordance with the operation status of the moving body 10.
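The uniform pattern environment end data, a search-range pose paired with a rectangle of horizontal length h and height-direction length v, can be sketched as below. The names are invented, and for brevity the containment test treats the rectangle as axis-aligned (the patent's rectangle is oriented by the segment direction), so this is an assumption-laden simplification.

```python
from dataclasses import dataclass

@dataclass
class UniformEndData:
    pose: tuple   # (x, y, theta_deg): return search range position and orientation
    h: float      # length of the rectangular search range in the horizontal direction
    v: float      # length in the height direction

    def contains(self, x, y):
        """Axis-aligned membership test (rotation of the rectangle omitted for brevity)."""
        px, py, _ = self.pose
        return abs(x - px) <= self.h / 2 and abs(y - py) <= self.v / 2
```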
  • the first segment to follow is selected from the route data 19 (804).
  • the laser distance sensor control program 24c controls the laser distance sensor 12 and scans the environment to obtain geometric shape data (805).
  • position and orientation estimation is performed according to the processing flow described in the processing 405 (806).
  • the type of the segment that the moving body 10 is trying to follow is determined. If the segment is a general environment segment, the process proceeds to process 816, and if the segment is a uniform pattern environment segment, the process proceeds to process 817 (808).
  • it is assumed that the moving body 10 travels from the start point 91 to the destination 94 using route data consisting of a segment 95a connecting point 91 and point 92, a segment 95b connecting point 92 and point 93, and a segment 95c connecting point 93 and point 94. The route data 19 records that the segments 95a and 95c are general environment segments and the segment 95b is a uniform pattern environment segment. It is assumed that map data 18 matching the shape of the surface portion of the object existence area 60 has already been obtained.
  • the moving body 10 when the moving body 10 is traveling the first segment 95a, for example, when the moving body 10 is at the position / posture 10a, geometric shape data obtained by scanning within the measurement range 12a of the laser distance sensor 12 is obtained. Since 34a includes geometric features such as front walls and corners in addition to the walls on both sides of the passage, in the matching between the map data 18 and the geometric shape data 34a, the position / posture solution when the two overlap most is obtained. It is uniquely determined.
  • since the segment type in the route data 19 indicates a general environment segment, the moving body 10 travels in the general travel mode 816.
  • it is determined, based on the position/posture obtained in process 806 and the route data 19, whether the segment end point of the currently selected segment (in this case, the segment 95a) has been reached (809). If the segment end point has not been reached, general travel control of the moving mechanism is performed so that the deviation between the determined position/posture and the segment end point is reduced (810).
  • the selected segment is recorded as running and the next segment is selected (811). In the case of FIG. 9, when it is determined that the moving body 10 has reached the position of the point 92, the segment 95 b is next selected.
  • since the geometric shape data 34b obtained by scanning within the measurement range 12b of the laser distance sensor 12 includes only the walls on both sides of the passage, with no geometric features such as front walls or corners, there are a plurality of positions at which the map data 18 and the geometric shape data 34b overlap most, and the solution cannot be uniquely determined. More specifically, among the three position/orientation parameters of the geometric shape data 34b, the position in the longitudinal direction of the passage (that is, the traveling direction) is not uniquely determined, and multiple solution candidates are obtained along the passage's longitudinal direction.
  • the general travel mode 816 is an operation mode premised on all three position/orientation parameters being uniquely determined, so it cannot be used here. However, two parameters, the position with respect to the wall and the posture with respect to the wall, can still be obtained as the parameters other than the position in the longitudinal direction of the passage, so the moving body travels along the wall. Wall-following travel is an operation mode in which the distance and posture relative to the wall are kept constant and the vehicle moves toward the end of the uniform pattern environment. In the wall-following traveling mode 817 in the uniform pattern environment, the end of the uniform pattern environment segment is detected during traveling (812). Here, the same processing as the position/orientation estimation processing 806 is performed.
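Wall-following travel as described, holding distance and posture to the wall constant using only the two obtainable parameters, can be sketched as a simple proportional controller. This is an illustrative sketch, not the patent's control law; the gains, target distance, and sign convention (wall assumed on the left) are assumptions.

```python
def wall_follow_cmd(dist_to_wall, heading_err, target_dist=0.5,
                    v=0.3, k_d=1.0, k_h=2.0):
    """One control step of wall-following travel.

    Uses only the two parameters obtainable in a uniform pattern environment:
    dist_to_wall (m, wall assumed on the left) and heading_err (rad, positive
    when the heading drifts away from parallel to the wall).
    Returns (forward_velocity, turn_rate). Gains are illustrative assumptions.
    """
    # Steer toward the wall when too far, away when too close,
    # and damp any heading drift so the posture stays constant.
    turn = k_d * (dist_to_wall - target_dist) - k_h * heading_err
    return v, turn
```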
  • in FIG. 9, the return search range set by the uniform pattern environment end data 20 is indicated by a broken line 96, and the direction of the return search range 96 is assumed to be set upward in the drawing. Suppose the moving body 10, starting from the position/posture 10b, has finally reached the position/posture 10c while traveling in the wall-following travel mode 817. At this time, since the uniform pattern environment segment 95b is still selected, the uniform pattern environment end determination program 24g performs matching between the geometric shape data 34c and the map data 18 corresponding to the object existence area 60, in the same manner as in position and orientation estimation. However, this matching is performed based on the return search range 96, not the normal search range used at the time of position and orientation estimation.
  • the return search range 96 is provided in a square shape and range as shown by E centering on the position of D shown in FIG. 10, and is set to the posture of the arrow of the segment F (segment 95b in FIG. 9).
  • the search range for the attitude of the uniform pattern environment in the return search range is set as an angle range indicated by G.
  • the positions/postures that the geometric shape data can take when searching for a matching solution cover the range obtained by rotating, at each vertex of the return search range E, by an angle of G/2 to the left and right of the direction indicated by the segment F, that is, the range obtained by superimposing the semicircles depicted in FIG. 10.
  • since the geometric shape data 34c (FIG. 9) obtained by scanning with the moving body 10 at the position/posture 10c lies within the range where the semicircles shown in FIG. 10 overlap, and in addition includes geometric features such as the front wall and corners, the three parameters of position and orientation are uniquely obtained.
  • when the geometric shape data 34c and the map data 18 of the end portion of the uniform pattern environment are matched and the matching is established, more specifically, when the solution is uniquely determined when the geometric shape data 34c is overlaid on the map data 18 of the end portion of the uniform pattern environment, it is determined that the moving body 10 has reached the end of the uniform pattern environment (813). At this time, only the portion of the map data 18 that can be observed from the region set by the end data is used for matching. If it is determined that the end of the uniform pattern environment has not been reached, the control of the moving mechanism for the above-mentioned wall following is continued (814).
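The end determination hinges on whether the matching solution is unique. One way to sketch this check, purely as an illustration with an invented name and an assumed score-margin heuristic, is to ask whether more than one candidate pose comes close to the best overlap score:

```python
def solution_is_unique(scores_by_pose, margin=0.95):
    """Judge whether matching yields a unique position/orientation solution.

    scores_by_pose: {pose: overlap score} from exhaustive matching. If more
    than one pose reaches at least `margin` of the best score, the solution is
    ambiguous (as along a featureless corridor), so the end of the uniform
    pattern environment has not been reached. The margin is an assumed heuristic.
    """
    if not scores_by_pose:
        return False
    best = max(scores_by_pose.values())
    near_best = [p for p, s in scores_by_pose.items() if s >= margin * best]
    return len(near_best) == 1
```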
  • the selected segment is recorded as being traveled, and the next segment is selected from the route data 19 (815).
  • the segment 95 c is next selected.
  • the end data may not be set depending on the segment.
  • the moving body 10 stops moving when the solution is not uniquely determined due to the influence of a person or an object not described in the map.
  • the moving body 10 performs position and orientation estimation even when it is stopped, and resumes operation when it is determined that there is no disturbance element such as a person and a solution is uniquely obtained.
  • as described above, the moving body includes the distance sensor unit 12 that detects the distance between the device itself and surrounding obstacles, the controller unit 11 that estimates the position and orientation of the device based on the detection result of the distance sensor unit 12 and controls traveling, and the moving mechanism unit 21 that moves the device autonomously under the control of the controller unit 11; this configuration provides the following effects.
  • the controller unit 11 estimates the position and orientation of the device using the map of the driving environment (map data) and the distance data (geometric shape data) that is the detection result of the distance sensor unit 12, and can switch the driving mode between a state in which the position and orientation are uniquely determined and a state in which they are not.
  • as a result, the mobile body 10 can autonomously move in a mixed environment containing both areas with many geometric features due to facilities, such as a factory, and areas with few geometric features, such as a corridor.
  • the controller unit 11 can perform control so as to stop traveling in a segment for which end data is not set, and to travel in the wall-following mode in a segment for which end data is set. Specifically, end data is set for a uniform pattern environment in which geometric features are scarce and not all position/orientation parameters can be obtained with high accuracy even when the geometric shape data and map data are matched; in segments without end data, the body can stop rather than travel in the wall-following mode.
  • the controller unit 11 can perform control so that traveling is resumed when the moving body 10, having stopped in a segment for which no end data is set, returns to a state in which the position and orientation of the device are uniquely determined. Specifically, in order to switch from the wall-following travel mode to the general travel mode, it is necessary to determine that an environment in which all position and orientation parameters can be obtained has been reached. For this reason, if the matching accuracy between the map data and the geometric shape data is high at the place where an environment with few geometric features switches to an environment with many geometric features, it can be determined that the switchable place has been reached, and the operation mode can be switched.
  • the controller unit 11 can perform control so as to keep the relation between the wall surface of the passage in which the device travels and the sensor of the distance sensor unit 12 constant during traveling in the wall-following traveling mode.
  • since two parameters, the position with respect to the wall and the posture with respect to the wall, can be obtained as the parameters other than the position in the longitudinal direction of the passage, it is possible to travel using these.
  • the controller unit 11 can perform the end determination by comparing the end data of the uniform pattern environment, in which the position and orientation of the device are not uniquely determined, with the geometric shape data that is the detection result of the distance sensor unit 12. In this case, the controller unit 11 can perform the end determination using, as the end data, only the shape data that can be observed from the end of the uniform pattern environment.
  • the invention made by the present inventors has been specifically described above based on the embodiment; needless to say, however, the present invention is not limited to the embodiment, and various modifications can be made without departing from the scope of the invention.
  • the above-described embodiment has been described in detail for easy understanding of the present invention, and is not necessarily limited to one having all the configurations described.
  • the present invention can be applied to a moving body that does not perform autonomous movement.
  • such a moving body is provided with a distance sensor unit that detects the distance between the device and obstacles around it, and a controller unit that determines the position and orientation of the device based on the detection result of the distance sensor unit and controls traveling (for example, a configuration similar to FIG. 1 and FIG. 2 described above).
  • the controller unit can perform the end determination by comparing the end data of the uniform pattern environment, in which the position and orientation of the device are not uniquely determined, with the distance data that is the detection result of the distance sensor unit.
  • the controller unit can perform the end determination using only the shape data that can be observed from the end of the uniform pattern environment as the end data.

Abstract

The present invention addresses the problem of achieving an autonomous mobile object in a mixed environment including both a general environment where there is a large number of geometric features (such as projections and depressions) and a uniform pattern environment where there are few geometric features. A control unit of the mobile object estimates the location and orientation of the mobile object by using map data about travelling environments and geometric shape data detected by a distance sensor unit and changes the travelling mode according to when the location and orientation of the mobile object are uniquely determined and when the location and orientation of the mobile object are not uniquely determined. Specifically, in a general environment where the presence of a large number of geometric features allows all parameters for location and orientation to be calculated with high accuracy by matching the map data and the geometric shape data with each other, automatic travelling is performed in a general travelling mode that uses all the parameters. In a uniform pattern environment where the lack of geometric features prevents all parameters for location and orientation from being calculated with high accuracy, the automatic travelling is performed in a wall following mode that uses only the parameters having high accuracy.

Description

Moving body
 The present invention relates to a mobile object, and more particularly to an autonomous mobile object equipped with a system for estimating the position and orientation, along its movement path, of a mobile object carrying a device capable of detecting its own position and orientation.
 In particular, the present invention relates to a technique for providing a moving body that transports objects, and more particularly to an autonomous moving body capable of moving autonomously in a line pattern environment.
 Although the present invention is described below using an autonomous mobile robot as a specific embodiment, it can also be applied to robots that do not move autonomously, and can further be readily applied to moving bodies such as car navigation systems of automobiles equipped with a camera and GPS; design changes that can easily be made by those skilled in the art within the scope of the inventive idea are included in the technical scope of the invention.
 Conventionally, many mobile robots have been developed that move along a predefined path such as pre-installed rails or magnetic tape. In recent years, however, robots have been developed that move autonomously along a path set in a computer, without a physically defined path. To realize autonomous movement in such a robot, a self-position/orientation estimation function that recognizes the robot's current position and orientation is indispensable; for example, a method has been proposed in which the robot detects its surroundings, estimates its own position from that data, and simultaneously generates a map. This method, a technique called SLAM (Simultaneous Localization and Mapping), has the feature that the robot determines its own position and orientation within the range of the obtained sensor information while simultaneously generating and updating an environment map, and moves autonomously through the environment on that basis.
 Regarding such autonomous mobile robots, for example, Patent Document 1 (Japanese Patent Laid-Open No. 2005-332204) discloses a movement control device provided with self-position detection means such as GPS (Global Positioning System), object detection means that detects the distance and direction to surrounding objects, and a function of generating an environment map in the direction of movement based on that detection data. The problem addressed by Patent Document 1 is to control a mobile body so that it moves accurately along a target route and avoids unexpected obstacles on the route even when radio waves from GPS satellites are not received. As a solution, a mobile body that moves autonomously according to preset target route information is provided with a movement control device that controls its movement. The movement control device includes self-position detection means that detects the current position and heading of the mobile body, an ALR (Area Laser Radar) that detects the distance between the mobile body and surrounding objects, and control means that determines the course of the mobile body based on the data from the self-position detection means and the ALR and controls the mobile body so as to move along that course. The control means cumulatively generates, as the mobile body moves, an environment map of the mobile body's surroundings that takes the presence of objects into account, and determines a course that does not interfere with objects based on the target route information and the environment map.
 The mobile robot of Patent Document 2 (Japanese Patent Application Laid-Open No. 2004-276168) creates new map information by simultaneously estimating, with a movement sensor and recognition means, map information expressed as relative postures between objects and the posture of the robot. The problem addressed by Patent Document 2 is to provide a map creation system capable of creating a highly accurate map for mobile robots. In the solution, new map information is first created by the mobile robot. Next, the existing map information is read, and the relative postures of the existing map are taken out one by one. It is then checked whether the same relative posture also exists in the new map information. If it does, the relative posture common to the new and existing map information is fused stochastically; if it does not, the relative posture is added to the new map information. After that, if some of the relative postures in the new map information form a loop, loop-resolution processing is performed to eliminate the accumulated deviation.
 Furthermore, Patent Document 3 (Japanese Patent Laid-Open No. 2007-94743) shows that a map data generation unit and a position estimation unit may be arranged in an autonomous mobile robot or in a server device. The problem addressed by Patent Document 3 is to provide an autonomous mobile robot, and a system therefor, that allows a user to easily teach arbitrarily selected position information and that moves autonomously to a destination specified on the basis of the taught position information. As a solution, the robot includes an information input unit that receives position information and destination information from the user, and a position information storage unit that stores, in a table, the position information input from the information input unit in association with the position estimated by the position estimation unit. When destination information is input to the information input unit, a movement path planning unit matches the input destination information against the table stored in the position information storage unit, refers to map data describing obstacles in the robot's movement space, determines a movement path from the position estimated by the position estimation unit, and moves the robot autonomously.
 Patent Document 4 (Japanese Patent Application Laid-Open No. 2005-242489) discloses, as background art relating to a mobile body that transports goods, a technique that "makes it possible to set the shortest-distance route by reducing the restrictions on the routes that autonomous mobile bodies travel, so that no collisions or congestion occur even when a plurality of autonomous mobile bodies travel simultaneously."
 Further, Patent Document 5 (Japanese Patent Laid-Open No. 2007-133891) and Patent Document 6 (Japanese Patent Laid-Open No. 2006-293975) disclose inventions relating to autonomous mobile bodies that move along a wall surface.
 Patent Document 5 describes a "wall-following mode" in which the robot moves along a wall surface. However, as stated in paragraph 0053, "by wall following, that is, by edge cleaning in the case of a cleaning robot, only the edges of a room or the edges of objects in the room can be cleaned"; this amounts to no more than a description of an operation mode of a cleaning robot aimed at cleaning only the edges of a room or of objects in the room.
 Patent Document 6, for its part, points out the problem that "when the wall surface is discontinuous, the distance from the wall surface cannot be obtained, so appropriate movement along the boundary is not performed" (paragraph 0003), and discloses that its object is "to provide an autonomous mobile device capable of realizing appropriate movement along a boundary even when the boundary of a specific region is formed of discretely arranged objects" (paragraph 0004).
JP 2005-332204 A
JP 2004-276168 A
JP 2007-94743 A
JP 2005-242489 A
JP 2007-133891 A
JP 2006-293975 A
 As described above, the related art discloses techniques such as cumulatively generating, as the mobile body moves, an environment map of the mobile body's surroundings that takes the presence of objects into account, and determining a course for the mobile body that does not interfere with those objects on the basis of target route information and the environment map. Such conventional techniques are feasible in a general environment with many geometric features (irregularities and the like). However, in an environment that also contains areas poor in geometric features, it becomes difficult for the robot to determine its own position from the surrounding geometric features. The object of the present invention is precisely to realize a robot capable of automatic travel even in such environments poor in geometric features.
 Therefore, in this specification, an environment in which the surroundings of the autonomous mobile body have sufficient geometric features such as irregularities, as in a factory or warehouse in which a large number of pieces of equipment such as machine tools and storage shelves are installed, so that all position and orientation parameters are uniquely obtained when the sensor data is matched against the map, is called a "general environment." An environment poor in geometric features, such as a passage with flat walls continuing on both sides, in which matching the sensor data against the map yields multiple matching locations and thus multiple candidate solutions for the position and orientation parameters, is called a "uniform pattern environment," since the pattern of geometric features continues uniformly. In this specification, an environment in which walls continue on both sides of the route, that is, an environment that, when expressed as a map, appears as two parallel lines continuing like railway tracks, is sometimes specifically described using the term "track environment"; this is a subordinate concept of "uniform pattern environment."
 The above and other objects and novel features of the present invention will become apparent from the description of this specification and the accompanying drawings.
 Of the inventions disclosed in the present application, the outline of a representative one is briefly described as follows.
 That is, a representative mobile body comprises: a distance sensor unit that detects the distance between the mobile body and obstacles around it; a controller unit that estimates the position and orientation of the mobile body based on the detection results of the distance sensor unit and controls its travel; and a movement mechanism unit that causes the mobile body to travel autonomously under the control of the controller unit. The controller unit estimates the position and orientation of the mobile body using a map of the travel environment and the distance data detected by the distance sensor unit, and switches travel modes between a state in which the position and orientation of the mobile body are uniquely determined and a state in which they are not.
 More preferably, when the position and orientation of the mobile body are not uniquely determined, the controller unit controls the mobile body to stop traveling in segments for which terminal data is not set, and to travel in the wall-following travel mode in segments for which terminal data is set.
 Still more preferably, when the mobile body has stopped traveling in a segment for which terminal data is not set, the controller unit controls it to resume traveling once it returns to a state in which its position and orientation are uniquely determined.
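The mode-switching behavior described in the three paragraphs above can be sketched as follows. This is an illustrative reading of the claims, not the disclosed implementation; the enum, function name, and boolean inputs are assumptions.

```python
from enum import Enum

class TravelMode(Enum):
    GENERAL = "general"          # all pose parameters uniquely determined
    WALL_FOLLOWING = "wall"      # pose ambiguous, terminal data is set
    STOPPED = "stopped"          # pose ambiguous, no terminal data

def select_travel_mode(pose_is_unique: bool, segment_has_terminal_data: bool) -> TravelMode:
    """Choose the travel mode from the pose-estimation state and segment data.

    pose_is_unique: True when matching the distance data against the map
    yields a single position/orientation solution (a "general environment").
    segment_has_terminal_data: True when the current route segment has
    uniform pattern environment terminal data set.
    """
    if pose_is_unique:
        # General environment: travel normally; a stopped body resumes here.
        return TravelMode.GENERAL
    if segment_has_terminal_data:
        # Uniform pattern environment with a known terminal: wall following.
        return TravelMode.WALL_FOLLOWING
    # Ambiguous pose and no terminal data: stop until uniqueness returns.
    return TravelMode.STOPPED
```

Calling the function again once the pose becomes unique models the "resume traveling" behavior of the third paragraph.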
 Although Patent Document 5 and Patent Document 6 mentioned above disclose inventions relating to autonomous mobile bodies that move along a wall surface, Patent Document 5 merely moves along the wall so that the cleaning robot cleans only the edges of a room or the edges of objects in the room.
 Patent Document 6, meanwhile, presents a solution to the problem that "when the wall surface is discontinuous, the distance from the wall surface cannot be obtained, so appropriate movement along the boundary is not performed."
 The present invention addresses, under the name "uniform pattern environment," the problem that when the environment contains areas poor in geometric features, it is difficult for the robot to determine its own position from the surrounding geometric features; on this point, Patent Document 6 discloses no related technology whatsoever.
 Of the inventions disclosed in the present application, the effects obtained by representative ones are briefly described as follows.
 That is, a representative effect of the above configuration is as follows. In an environment in which many geometric features such as irregularities exist, so that all position and orientation parameters are obtained with high certainty by matching the sensor data against the map, automatic travel can be performed in the general travel mode, which uses all the parameters. In an environment poor in geometric features, in which not all position and orientation parameters can be obtained with high certainty even when the sensor data is matched against the map, automatic travel can be performed in the wall-following travel mode, which uses only the parameters obtained with high certainty. To switch from the wall-following travel mode back to the general travel mode, it is necessary to determine that the mobile body has reached an environment in which all position and orientation parameters can be obtained. For this purpose, the sensor data is matched against the map at the location where the environment changes from one poor in geometric features to one with many geometric features; when the matching certainty is high, it is determined that a switchable location has been reached, the travel mode is switched, and travel to the destination continues. This enables autonomous movement of the mobile body in settings where environments with many geometric features, such as factories full of equipment, coexist with environments poor in geometric features, such as corridors.
FIG. 1 is a block diagram showing the functional configuration of a mobile body capable of autonomous movement in a track pattern environment according to one embodiment of the present invention.
FIG. 2 is a block diagram showing the software configuration of the mobile body of FIG. 1.
FIG. 3 is an explanatory diagram showing current-position estimation processing by the distance sensor unit of the mobile body of FIG. 1.
FIG. 4 is a flowchart showing an outline of the autonomous movement processing of the mobile body of FIG. 1.
FIG. 5 is an explanatory diagram showing an example of the data structure of the route data and the uniform pattern environment terminal data of the mobile body of FIG. 1.
FIG. 6 is an explanatory diagram showing position and orientation estimation processing in the mobile body of FIG. 1.
FIG. 7 is an enlarged view of range C shown in FIG. 6.
FIG. 8 is a flowchart showing the processing of the general travel mode and the wall-following travel mode of the mobile body of FIG. 1.
FIG. 9 is an explanatory diagram showing an example of switching between the general travel mode and the wall-following travel mode in the mobile body of FIG. 1.
FIG. 10 is an explanatory diagram showing an example of detailed processing for switching from the wall-following travel mode to the general travel mode in the mobile body of FIG. 1.
 In the following embodiments, the description will be divided into a plurality of sections or embodiments where necessary for convenience; unless otherwise specified, they are not unrelated to one another, and one is a modification, detail, supplementary explanation, or the like of part or all of another. Further, in the following embodiments, when the number of elements or the like (including counts, numerical values, amounts, ranges, and the like) is mentioned, the number is not limited to that specific number and may be greater or smaller, except where explicitly stated or where it is clearly limited to the specific number in principle.
 Furthermore, in the following embodiments, it goes without saying that the constituent elements (including element steps and the like) are not necessarily indispensable, except where explicitly stated or where they are clearly considered indispensable in principle. Similarly, when the shapes, positional relationships, and the like of constituent elements are mentioned, substantially approximate or similar shapes and the like are included, except where explicitly stated or where it is clearly considered otherwise in principle. The same applies to the above numerical values and ranges.

[Outline of the Embodiment]
 First, an outline of an embodiment of the present invention will be described. In this outline, as an example, the corresponding constituent elements and reference numerals of the embodiment are given in parentheses.
 The mobile body of the embodiment comprises: a distance sensor unit (distance sensor unit 12) that detects the distance between the mobile body and obstacles around it; a controller unit (controller unit 11) that estimates the position and orientation of the mobile body based on the detection results of the distance sensor unit and controls its travel; and a movement mechanism unit (movement mechanism unit 21) that causes the mobile body to travel autonomously under the control of the controller unit. The controller unit estimates the position and orientation of the mobile body using a map of the travel environment and the distance data detected by the distance sensor unit, and switches travel modes between a state in which the position and orientation of the mobile body are uniquely determined (general environment) and a state in which they are not (uniform pattern environment).
 More preferably, when the position and orientation of the mobile body are not uniquely determined, the controller unit controls the mobile body to stop traveling in segments for which terminal data is not set, and to travel in the wall-following travel mode in segments for which terminal data is set.
 Still more preferably, when the mobile body has stopped traveling in a segment for which terminal data is not set, the controller unit controls it to resume traveling once it returns to a state in which its position and orientation are uniquely determined.
 Hereinafter, an embodiment based on the above outline will be described in detail with reference to the drawings. In all the drawings for describing the embodiment, the same members are in principle denoted by the same reference numerals, and repeated description thereof is omitted.

[One Embodiment]
 Hereinafter, one embodiment of the present invention will be described with reference to FIGS. 1 to 10, limited to an autonomous mobile body (specifically, as applied to an autonomous mobile robot). Based on the classification of "general environment" and "uniform pattern environment" defined above, for a mobile body that travels automatically through both kinds of environment, the functional configuration and main processing of the mobile body assumed in this embodiment are described first, followed by a more specific hardware and software configuration and their operation.

<Functional Configuration of the Mobile Body>
 FIG. 1 shows the functional configuration of a mobile body 10 (also referred to as an autonomous mobile body) capable of autonomous movement in the track pattern environment of this embodiment. The autonomous mobile body 10 comprises a controller unit 11 that estimates the position and orientation of the autonomous mobile body 10 and controls its travel, a distance sensor unit 12 that detects the distance between the autonomous mobile body 10 and surrounding outer wall surfaces such as obstacles, and movement mechanism units 21, 21 for autonomous travel. The controller unit 11 includes a distance sensor control unit 13 that controls the distance sensor unit 12, and a position and orientation estimation unit 14 that receives the detection results from the distance sensor unit 12 and estimates the position and orientation of the autonomous mobile body 10.
 The position and orientation estimation unit 14 preferably includes two estimation units. This point is not the subject of the present invention; it has already been filed as a prior application by the same applicant (Japanese Patent Laid-Open No. 2013-25351). Briefly, the two units are a normal position and orientation estimation unit and an initial position and orientation estimation unit. A plurality of obstacles are arranged within the operation area of the autonomous mobile body, and in the prior invention, the outer peripheral contour shapes of those obstacles as placed on the map data are referred to as the obstacle arrangement shapes. The normal position and orientation estimation unit performs position and orientation estimation by conventional techniques and is not a main subject of the prior invention. The prior invention is characterized by the provision of the initial position and orientation estimation unit, which includes a position and orientation candidate display and setting unit. The autonomous mobile body of the prior invention's embodiment is provided with a route planning unit that sets a route through the area in which it travels based on the position and orientation estimated by the normal or initial position and orientation estimation unit, and a movement mechanism control unit that autonomously moves the autonomous mobile body by driving its wheels so as to follow the route planned by the route planning unit.
 In contrast, the controller unit 11 of the autonomous mobile body 10 of the present invention further includes a uniform pattern environment terminal determination unit 15 that determines whether the current position of the mobile body 10 is at the terminal of a uniform pattern environment. It further includes a motion planning unit 16 that calculates the target position and orientation on the route to be followed from the current position and orientation of the mobile body 10, a movement mechanism control unit 17 that performs control so as to reduce the deviation between the calculated target position and orientation and the current position and orientation of the mobile body 10, and a movement mechanism unit 21 having front casters and rear drive wheels. The storage units for the various data consist of a map data storage unit 18, a route data storage unit 19, and a uniform pattern environment terminal data storage unit 20. Although not shown here, it is assumed that everything needed for the units to cooperate and operate, such as a housing supporting each unit, a power supply, and wiring, is provided.
 In the autonomous mobile body 10 of this embodiment, the distance sensor control unit 13 controls the distance sensor unit 12 to obtain distance data consisting of measurements of the distance and direction to the outer periphery of objects such as equipment in the surrounding environment. Here, a laser distance sensor is used as the distance sensor unit 12. This laser distance sensor includes a laser emission unit that measures the distance from the sensor to the outer periphery of an object by measuring the time from when the laser is emitted until the emitted laser is reflected by an object in the environment and returns to the sensor. By taking measurements while rotating this laser emission unit in fixed angular increments, it is possible to measure the distance to object outer edges within the range of the rotation angle (hereinafter, "scan"). Since laser distance sensors with this function are conventionally known, the specific ranging method is not described further. Assuming that this scan is performed within a plane, the scan yields the distance and direction from the sensor to object outer edges on the plane swept by the laser (hereinafter, the "scan plane"). The data obtained at this time, namely the distance between the sensor and the object outer edge in each direction paired with the direction in which the laser was emitted, is here simply called distance data. Since each item of distance data is recorded as a pair of distance and direction, it can be converted into position data referenced to the sensor. Distance data converted into position data in this way is here called geometric shape data. The laser distance sensor is mounted on the mobile body 10 so that its scan plane is parallel to the floor surface, and geometric shape data at the height of the scan plane is assumed to be obtained.
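The conversion from distance data (distance and direction pairs) to sensor-referenced geometric shape data described above is an ordinary polar-to-Cartesian transform. A minimal sketch follows; the function and variable names are assumptions, not terms of the disclosure.

```python
import math

def distance_data_to_geometry(distance_data):
    """Convert (distance, direction) pairs into sensor-frame (x, y) points.

    distance_data: iterable of (r, theta) pairs, with r in meters and theta
    in radians measured in the scan plane from the sensor's forward axis.
    Returns a list of (x, y) points: the geometric shape data.
    """
    return [(r * math.cos(theta), r * math.sin(theta)) for r, theta in distance_data]

# A beam hitting an object 2 m straight ahead, and one 1 m to the left (90 degrees):
points = distance_data_to_geometry([(2.0, 0.0), (1.0, math.pi / 2)])
```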
 The geometric shape data obtained as above is sent to the position and orientation estimation unit 14. The position and orientation estimation unit 14 has read in advance the map data storage unit 18, in which the geometric shape of the environment at the height of the scan plane is recorded as an image. A process is then performed that searches for the position and orientation of the geometric shape data on the map data at which the object presence pixels of the geometric shape data, regarded as an image, best overlap the pixels of the map data indicating the presence of objects (hereinafter, "object presence pixels"); this process is hereinafter referred to as "matching (matching processing)." This yields the position and orientation of the geometric shape data, in the coordinate system of the map data storage unit 18, at which the geometric shape data and the map data overlap best. This directly corresponds to the position and orientation of the laser emission unit of the distance sensor unit 12, but since any point on the housing of the mobile body 10 may serve as the reference when expressing the position and orientation of the mobile body 10, the position and orientation of the distance sensor unit 12 obtained here are taken as the position and orientation of the mobile body 10.
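The matching process described above, which searches for the pose at which the scan's object presence pixels best overlap the map's, can be sketched as a brute-force search over a discretized pose range. The grid representation, scoring rule, and names are illustrative assumptions, not the disclosed algorithm.

```python
import math

def match_scan(occupied_map, scan_points, search_poses, resolution=0.05):
    """Return the candidate pose(s) maximizing overlap with the map.

    occupied_map: set of (ix, iy) grid cells marked as object presence pixels.
    scan_points: sensor-frame (x, y) geometric shape data in meters.
    search_poses: iterable of candidate (x, y, theta) poses (the search range).
    Returns (best_score, list_of_best_poses); more than one best pose means
    the position and orientation are not uniquely determined.
    """
    best_score, best_poses = -1, []
    for (px, py, th) in search_poses:
        c, s = math.cos(th), math.sin(th)
        # Count scan points that land on an occupied map cell at this pose.
        score = sum(
            (int(round((px + c * x - s * y) / resolution)),
             int(round((py + s * x + c * y) / resolution))) in occupied_map
            for (x, y) in scan_points)
        if score > best_score:
            best_score, best_poses = score, [(px, py, th)]
        elif score == best_score:
            best_poses.append((px, py, th))
    return best_score, best_poses
```

In a uniform pattern environment, many poses along the corridor would tie for the best score, which is exactly the ambiguity the specification describes.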
 When the autonomous mobile body 10 of this embodiment is traveling in a general environment, processing proceeds to the motion planning unit 16. The motion planning unit 16 operates in the general travel mode and calculates the target position and orientation on the route to be followed, from the route data storage unit 19 of the mobile body 10 read in advance and from the current position and orientation of the mobile body 10 obtained by the position and orientation estimation unit 14.
 The movement mechanism control unit 17 then controls the movement mechanism unit 21 so as to reduce the deviation between the calculated target position and orientation and the current position and orientation of the mobile body 10. That is, it determines the rotational speed of the wheels (generally the rear drive wheels), the steering angle (generally of the front casters), and the like, and issues commands to the motors and so forth. In this way, the mobile body 10 follows its preset route and thus travels automatically to its destination.
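The control described above, reducing the deviation between the target pose and the current pose by commanding a forward speed and a steering rate, can be sketched as a simple proportional controller. The gains and the control law are illustrative assumptions; the specification does not fix a particular law.

```python
import math

def follow_target(current, target, k_lin=0.5, k_ang=1.5, v_max=1.0):
    """Compute (forward_speed, steering_rate) driving `current` toward `target`.

    current, target: (x, y, theta) poses in the map frame, theta in radians.
    The linear speed is proportional to the remaining distance, and the
    steering rate to the heading error toward the target point.
    """
    dx, dy = target[0] - current[0], target[1] - current[1]
    distance = math.hypot(dx, dy)
    # Heading error: direction to the target point relative to the current
    # heading, wrapped into (-pi, pi].
    heading_error = math.atan2(dy, dx) - current[2]
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
    v = min(k_lin * distance, v_max)
    w = k_ang * heading_error
    return v, w

# Target 1 m ahead along the current heading: drive straight.
v, w = follow_target((0.0, 0.0, 0.0), (1.0, 0.0, 0.0))
```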
 When traveling in a general environment, the autonomous mobile body 10 of this embodiment travels autonomously along a preset route by means of the position and orientation estimation unit 14, which receives the detection results from the distance sensor unit 12 and estimates the position and orientation of the mobile body 10. When the estimation processing of the position and orientation estimation unit 14 estimates that the mobile body has entered a uniform pattern environment from the general environment, the mobile body 10 enters operation in the wall-following travel mode, in which it advances in the direction of travel while avoiding contact with obstacles detected by the distance sensor unit 12. While the mobile body 10 is traveling through a uniform pattern environment in the wall-following travel mode in this way, determination processing is performed by the uniform pattern environment terminal determination unit 15 following the position and orientation estimation unit 14. The uniform pattern environment terminal determination unit 15 has read in advance the data of the map data storage unit 18 and of the uniform pattern environment terminal data storage unit 20, which holds the data for the terminal position of the uniform pattern environment. The uniform pattern environment terminal data storage unit 20 records the position, orientation, and shape of a matching search range (the position and orientation at the terminal of the uniform pattern environment, and the shape of the search range) for which, when the geometric shape data is matched against the map data storage unit 18, the position and orientation are determined uniquely, without the multiple solution candidates that appear in matching partway along the uniform pattern environment. The matching search range is the range over which solution candidates are searched in matching (the range of values the solution can take); here it is a range over the three position and orientation parameters.
 Based on the position, orientation, and other data of the matching search range recorded in the uniform pattern environment terminal data storage unit 20, the uniform pattern environment terminal determination unit 15 performs matching between the map data storage unit 18 and the geometric shape data in the same way as the position and orientation estimation unit 14. While the mobile body 10 travels automatically through the uniform pattern environment, it moves while performing both the matching of the position and orientation estimation unit 14 and the matching of the uniform pattern environment terminal determination unit 15. More specifically, in a track pattern environment such as a long corridor, it travels in a straight line keeping a predetermined distance from both walls. At this time, the motion planning unit 16 operates in the wall-following travel mode for uniform pattern environments, and calculates the orientation and speed of the mobile body 10 for moving along the environment, such as wall surfaces, from the route data storage unit 19 read in advance, from the current position and orientation of the mobile body 10 partially obtained by the position and orientation estimation unit 14, and from the relative position and orientation with respect to the environment, such as the distance and orientation to the surrounding wall surfaces.
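The wall-following travel just described, running straight while keeping a predetermined distance from the walls on both sides, can be sketched as follows. The two-beam distance measurement and the corrective law are illustrative assumptions; in practice the perpendicular wall distances would come from the scan data.

```python
def wall_follow_command(d_left, d_right, v_nominal=0.5, k=1.0):
    """Compute (forward_speed, steering_rate) to stay centered in a corridor.

    d_left, d_right: perpendicular distances (m) to the left and right walls,
    e.g. taken from the leftmost and rightmost beams of the scan.
    Steering is proportional to the lateral offset from the corridor center:
    closer to the right wall -> steer left (positive rate), and vice versa.
    """
    lateral_offset = (d_left - d_right) / 2.0  # > 0: body is right of center
    return v_nominal, k * lateral_offset
```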
The movement mechanism control unit 17 then controls the movement mechanism unit 21 so as to reduce the deviation between the calculated attitude and velocity and the current attitude and velocity of the mobile body 10; that is, it computes the wheel rotation speeds, the steering angle, and the like, and issues commands to the motors. In this way, the mobile body 10 follows the route and ultimately travels automatically to the destination.
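The control described here, reducing the deviation between a commanded attitude/velocity and the current one by computing wheel rotation speeds, can be sketched for a differential-drive base of the kind the embodiment later describes (rear drive wheels whose speed difference produces straight travel or turning). The function names, the track width, and the gain below are illustrative assumptions, not part of the patent.

```python
import math

def unicycle_to_wheel_speeds(v, omega, track_width):
    """Convert a commanded forward velocity v [m/s] and yaw rate omega [rad/s]
    into left/right wheel linear speeds for a differential-drive base."""
    v_left = v - omega * track_width / 2.0
    v_right = v + omega * track_width / 2.0
    return v_left, v_right

def heading_controller(theta_ref, theta, v_ref, k_theta=1.5):
    """Simple P-controller (an assumed stand-in for the deviation-reducing
    control): steer toward the reference heading while holding speed."""
    # wrap the heading error into (-pi, pi] so the shorter turn is chosen
    err = math.atan2(math.sin(theta_ref - theta), math.cos(theta_ref - theta))
    return v_ref, k_theta * err  # (v, omega)
```

With omega = 0 both wheels receive the same speed and the base drives straight; a positive omega speeds up the right wheel relative to the left, turning the base left.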
When the mobile body 10, traveling through the uniform pattern environment, eventually reaches the vicinity of its end, the uniform pattern environment end determination unit 15 matches the geometric shape data obtained at that point against the map data storage unit 18; if they match, it is determined that the end of the uniform pattern environment has been reached. The position and orientation obtained by the uniform pattern environment end determination unit 15 are then used as the initial position and orientation for the next matching in the position and orientation estimation unit 14, and position and orientation estimation is performed. As a result, the position and orientation estimation unit 14, which during travel through the uniform pattern environment had fallen into a state in which some or all of the three position and orientation parameters could not be estimated, returns to a state in which all the parameters can be estimated again.
As soon as it is determined that the end of the uniform pattern environment has been reached, the mobile body 10 switches from the wall-following travel mode for uniform pattern environments to the general travel mode for general environments and continues traveling automatically toward the destination. The above is an outline of the flow of processing performed by the mobile body 10.
<Software configuration of the mobile body>
Next, the hardware and software configuration of the mobile body 10 is described, and the overall flow of processing through the hardware and software is explained through an example in which the mobile body travels automatically in both a general environment and a uniform pattern environment. In the following it is assumed that the mobile body moves in a plane, so that two parameters for the position of the mobile body 10 and one parameter for its orientation are estimated.
FIG. 2 shows the hardware of the autonomous mobile body 10 of this embodiment and the configuration of the software stored in it. The mobile body 10 comprises a controller 11 (corresponding to the controller unit in FIG. 1), a laser distance sensor serving as the sensor of the distance sensor unit 12 (also denoted by reference numeral 12), a movement mechanism 21 (corresponding to the movement mechanism unit in FIG. 1), a display 25, an input device 26, and a communication line 27 through which these devices communicate. Note that FIG. 2 shows only the elements directly involved in the flow of processing; the power supply and other components required for the operation of each element are naturally assumed to be present.
Here, the distance sensor of the distance sensor unit 12 is assumed to be of the same type as the laser distance sensor mentioned above for the distance sensor unit 12. As an example, it is assumed that the laser distance sensor 12 scans an angular range of 180 degrees, emitting a laser beam every 0.5 degrees within this range and measuring the distance to objects; however, the scan angle range, the angular step of the laser emission, the maximum measurable distance, and so on may differ.
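The scan geometry just described (a 180-degree span at 0.5-degree steps) can be converted into 2-D points in the sensor frame as sketched below. This is a generic conversion, not code from the patent; the angle convention (beam 0 at -90 degrees relative to the sensor's forward axis) and the treatment of out-of-range readings are assumptions.

```python
import math

def scan_to_points(ranges, span_deg=180.0, step_deg=0.5, max_range=30.0):
    """Convert laser range readings into (x, y) points in the sensor frame.

    Beam 0 points at -span/2 relative to the sensor's forward (+x) axis and
    successive beams advance by step_deg. Readings at or beyond max_range
    are treated as 'no return' and skipped.
    """
    points = []
    for i, r in enumerate(ranges):
        if r <= 0.0 or r >= max_range:
            continue
        angle = math.radians(-span_deg / 2.0 + i * step_deg)
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# A 180-degree scan at 0.5-degree steps yields 361 beams (both ends included).
NUM_BEAMS = int(180.0 / 0.5) + 1
```

The resulting point list is what the text calls the geometric shape data of one scan.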
FIG. 3 illustrates the estimation of the current position by the distance sensor unit 12, showing how the laser distance sensor 12 mounted on the mobile body 10 measures the outer edges of objects such as equipment in a general environment. FIG. 3 is a plan view, seen from above, of the mobile body 10 (corresponding to the mobile body 10 in FIG. 2) and a region indicating where objects exist in the illustrated environment (hereinafter, the object existence region) 32 (the wide hatched portion). Suppose, as an example, that the mobile body 10, at the position and orientation shown in the figure, scans an angular range of 180 degrees with the laser distance sensor 12 (corresponding to the laser distance sensor 12 in FIG. 2). The laser distance sensor 12 then obtains geometric shape data 34 (the broken lines are the geometric shape data, the thin lines connecting them are auxiliary lines indicating the scanned range, and the narrow hatched portion is the scan range). This embodiment assumes the use of the laser distance sensor 12 as described above, but a sensor of a different type may be used as long as it can likewise measure the geometric shape of objects: for example, a stereo camera, or a depth camera that measures the distance to an object for each pixel by projecting infrared light onto the object in a planar pattern.
The mobile body 10 is provided with front casters and rear drive wheels; the movement mechanism unit 21 consists of these rear drive wheels and front casters, and it is assumed that the mobile body can travel straight or turn by controlling the difference in the rotational angular velocities of the drive wheels. This embodiment assumes such a movement mechanism, but the mechanism may differ as long as the same effect of moving through the environment is obtained: for example, a vehicle with endless tracks, a legged mobile body, a ship, an aircraft, or an airship. Furthermore, although the mobile body 10 travels automatically in this embodiment, the present invention may also be practiced with a person on board steering the mobile body, or with a person steering it remotely via communication without boarding.
As shown in FIG. 2, the controller 11 includes a processor 22, a memory 23, and a storage device 24. The storage device 24 holds an operating system (OS) 24a, a controller initialization program 24b that reads the BIOS and boots the OS, a laser distance sensor control program 24c, a position and orientation estimation program 24d, a motion planning program 24e, a movement mechanism control program 24f, a uniform pattern environment end determination program 24g, the uniform pattern environment end data storage unit 20, the map data storage unit 18, and the route data storage unit 19.
The laser distance sensor control program 24c acquires distance data from the laser distance sensor 12. The position and orientation estimation program 24d calculates the position and orientation by matching the geometric shape data against the map data stored in the map data storage unit 18. The motion planning program 24e calculates a route for reaching the destination based on the route data stored in the route data storage unit 19. The movement mechanism control program 24f calculates wheel rotation speeds and the like so that the mobile body 10 moves along the route. The uniform pattern environment end determination program 24g detects that the mobile body 10, while traveling through a uniform pattern environment, has reached the vicinity of its end. The uniform pattern environment end data storage unit 20 stores the uniform pattern environment end data used by the uniform pattern environment end determination program 24g when determining arrival near the end of the uniform pattern environment.
Although the programs and data of the embodiment shown in FIG. 2 are assumed to be loaded into the memory 23 and processed by the processor 22, the implementation may differ as long as the same effect is obtained. For example, the above processing may be realized with programmable hardware such as an FPGA (Field Programmable Gate Array) or a CPLD (Complex Programmable Logic Device). The programs and data may be transferred from a storage medium such as a CD-ROM, or downloaded from another device via a network.
The devices constituting the mobile body 10, such as the processor 22, the storage device 24, and the movement mechanism 21, are assumed here to communicate with one another over the wired communication line 27, but the communication may be wireless; moreover, as long as communication is possible, the controller 11, the display 25, and the input device 26 may each be physically remote. The above hardware and software may also be selected as appropriate for the embodiment.
<Autonomous movement processing of the mobile body>
Next, the flow of the autonomous movement processing performed by the mobile body 10 is described following FIGS. 4 and 8, with reference to FIGS. 5 to 7 and FIGS. 9 to 10. Of the processing flow in the controller 11 mounted on the mobile body 10, FIG. 4 mainly shows the processing related to preparation before automatic travel (an outline of the autonomous movement processing of the mobile body), and FIG. 8 mainly shows the processing related to the travel itself (the processing of the general travel mode and the wall-following travel mode).
When the controller 11 is started (401), the controller initialization program 24b loads the OS 24a and starts the programs 24c to 24g (402). Next, the position and orientation estimation program 24d reads the map data from the map data storage unit 18 (hereinafter simply referred to as the map data 18) (403). The map data 18 here is image data, in which the presence or absence of an object in the environment is recorded as a pixel value for each pixel. Next, the laser distance sensor control program 24c controls the laser distance sensor 12 and scans the environment, thereby obtaining the geometric shape data of the environment (404). Next, initial position and orientation estimation is performed (405). Initial position and orientation estimation is processing performed using the position and orientation estimation program 24d, and refers specifically to the position and orientation estimation performed when the mobile body 10 starts operating. In this sense, the basic principle of position and orientation estimation and of initial position and orientation estimation is the same, so position and orientation estimation is described first.
The position and orientation estimation program 24d provides a function for calculating the position and orientation based on matching between the geometric shape data and the map data 18. This function is explained with reference to FIG. 6. Suppose that the map data 18 is recorded as image data consisting of object existence pixels (data indicating the outer edges of objects), denoted 60 in FIG. 6, and that the geometric shape data 34, obtained by the mobile body 10 of FIG. 3 scanning at the position and orientation shown in that figure, is matched against it to estimate the position and orientation of the mobile body 10. Suppose further that the position estimated by the previous position and orientation estimation of the mobile body 10 is 62 in FIG. 6, that the estimated orientation is the direction of the arrow extending from 62, and that the matching search range X is set centered on the estimated position 62.
For simplicity, only the positional part of the search range X is shown; in practice an orientation search range is also set. Here it is assumed that an angular range of plus or minus 30 degrees, centered on the arrow extending from the previous estimated position 62, is set as the orientation search range, but the search range may be wider or narrower.
Within this search range, the degree of overlap between the geometric shape data and the object existence pixels 60 of the map data 18 is evaluated for each position and orientation the geometric shape data can take, and the position and orientation at which the overlap is greatest is found. Specifically, if the geometric shape data is overlaid on the object existence pixels 60 of the map data 18 at, for example, the estimated position 63 of the mobile body 10, in the orientation of the arrow extending from the estimated position 63, the overlap is as shown by the geometric shape data 34. FIG. 6 shows a state in which there is a discrepancy between the object existence pixels 60 and the geometric shape data 34.
Consider now, for example, the range C shown in FIG. 6. FIG. 7 is an enlarged view of the range C. The map data 18 is represented as an image consisting of object existence pixels 67 (black pixels) and pixels 69 (white pixels) indicating that no object exists, and the geometric shape data 34 is represented on this image by object existence pixels 68 (hatched pixels). Here, an object existence pixel 68 of the geometric shape data 34 overlaps an object existence pixel 67 of the map data 18 at the position of pixel 66, and in this case the pixel is regarded as matched. Note, however, that if the laser distance sensor 12 of the mobile body 10 is on the right side of FIG. 7, the white pixels to the left of the line formed by the object existence pixels 67 (black pixels) do not immediately indicate that no object exists there: as in the embodiment shown in FIG. 6, the line formed by the object existence pixels 67 is the outer edge of an object, and the white pixel area to its left may correspond to the interior of that object.
With the pixel-wise matching method described above, the number of object existence pixels at which the geometric shape data 34 matches the object existence pixels 60 of the map data 18 is computed exhaustively for every candidate position and orientation.
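The exhaustive pixel-wise matching described above can be sketched on an occupancy grid as follows. The unit grid resolution, the corner-shaped test map, and the function names are illustrative assumptions, not the patent's implementation; a real implementation would further restrict the loops to the search range X and the plus/minus 30-degree orientation window.

```python
import math

def transform_points(points, pose):
    """Transform scan points from the sensor frame into the map frame."""
    x, y, th = pose
    c, s = math.cos(th), math.sin(th)
    return [(x + c * px - s * py, y + s * px + c * py) for px, py in points]

def match_score(scan, occupied, pose):
    """Count the scan points that land on an occupied map cell at this pose."""
    return sum(
        (round(mx), round(my)) in occupied
        for mx, my in transform_points(scan, pose)
    )

def brute_force_match(scan, occupied, xs, ys, thetas):
    """Evaluate every candidate (x, y, theta) and return the best pose."""
    best_pose, best_score = None, -1
    for x in xs:
        for y in ys:
            for th in thetas:
                score = match_score(scan, occupied, (x, y, th))
                if score > best_score:
                    best_pose, best_score = (x, y, th), score
    return best_pose, best_score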
 この図6においては、幾何形状データ34が位置63において、この位置63から伸びる矢印の姿勢をとるときに幾何形状データ34と地図データ18の物体存在画素60との重なり具合が最大となり、マッチングの解、つまりは移動体10の位置・姿勢が位置63で求められることとなる。以上が位置姿勢推定の処理となるが、初期位置姿勢推定では基本的にはこの位置姿勢推定と同じ処理を行うが、位置姿勢推定では前回の推定位置姿勢をもとに探索範囲を設定していたのに対し、初期位置姿勢推定では、前回の推定位置姿勢を用いず、地図データ全体を探索範囲と設定し、姿勢の探索範囲も360度としてマッチングを行い、移動体10が起動された場所での詳細な位置・姿勢を求める。なお、位置姿勢推定プログラム24dにて位置・姿勢を算出する処理としては前述のようなマッチングの処理をここでは想定しているが、同様の効果が得られるならば他の方法であってもよい。例えばICP(Iterative Closest Point)等を用いてもよい。 In FIG. 6, when the geometric shape data 34 is at the position 63 and the posture of the arrow extending from the position 63 is taken, the degree of overlap between the geometric shape data 34 and the object existence pixel 60 of the map data 18 is maximized. The solution, that is, the position / posture of the moving body 10 is obtained at the position 63. The above is the position / orientation estimation process, but the initial position / orientation estimation basically performs the same process as the position / orientation estimation, but the position / orientation estimation sets the search range based on the previous estimated position / orientation. In contrast, in the initial position / posture estimation, the entire map data is set as a search range without using the previous estimated position / posture, and the search range of the posture is also matched to 360 degrees, and the place where the moving body 10 is activated Find the detailed position / posture. Note that the above-described matching process is assumed here as the process of calculating the position / orientation by the position / orientation estimation program 24d, but other methods may be used as long as the same effect can be obtained. . For example, ICP (Iterative Closest Point) or the like may be used.
 また、前述のとおり、本実施の形態では、地図データ18全体を探索範囲としたときに、幾何形状データが地図データ18に対して最もマッチするときの位置・姿勢を初期位置姿勢として求めることを想定しているが、この場合、探索に時間を要する。このため、例えば、ディスプレイ25に表示された地図データ18上で操作者が指定した位置・姿勢周辺でマッチングを行ったり、あるいは予め決まっている移動体10の駐車場があれば、その周辺でマッチングを行うなどして初期位置姿勢を求めてもよい。 Further, as described above, in the present embodiment, when the entire map data 18 is set as the search range, the position / posture when the geometric shape data most closely matches the map data 18 is obtained as the initial position / posture. In this case, the search takes time. For this reason, for example, matching is performed around the position / posture specified by the operator on the map data 18 displayed on the display 25, or if there is a predetermined parking lot of the moving body 10, matching is performed around that. The initial position and orientation may be obtained by performing
 続いて、目的地の設定を終了するかどうかの確認画面がディスプレイ25に表示される。移動体10に自動走行を行わせる場合、操作者は目的地の設定を行うことを入力機器26により選択する(406)。もし、移動体10の自動走行を行わないのであれば終了を選択する。この場合、直ちにプログラムは終了となる(407)。今、移動体10に自動走行を行わせるため、操作者が目的地の設定を行う方を選択した場合、処理はAに進み、つまりは図8の処理801に進む。 Subsequently, a confirmation screen as to whether or not to finish the destination setting is displayed on the display 25. When the mobile object 10 is caused to perform automatic traveling, the operator selects to set the destination by using the input device 26 (406). If the mobile object 10 is not automatically driven, end is selected. In this case, the program ends immediately (407). Now, in order to cause the moving body 10 to perform automatic traveling, when the operator selects the destination setting method, the process proceeds to A, that is, the process proceeds to process 801 in FIG.
 続いて、操作者は、ディスプレイ25に表示される搬送先の候補のリスト上で目的地を確認し、入力機器26により目的地を設定する(801)。次に、初期位置姿勢推定の処理405で得られた位置・姿勢をもとに、動作計画プログラム24eにより、経路データ19が読み込まれる(802)。次に、一様パターン環境終端判定プログラム24gにより、一様パターン環境終端データ記憶部20から一様パターン環境終端データ(以降、一様パターン環境終端データ20と称呼する)が読み込まれる(803)。このデータは先に述べたように、一様パターン環境の走行中に位置・姿勢の3つのパラメータのいずれか、もしくはすべてが推定できない状態に陥っている移動体10が、位置・姿勢の3つのパラメータのすべてが推定可能な状態に復帰するために、幾何形状データと地図データとのマッチングを行う際に必要なマッチングの探索範囲の位置・姿勢、探索範囲の形状が記録されたデータである。 Subsequently, the operator confirms the destination on the list of transport destination candidates displayed on the display 25 and sets the destination using the input device 26 (801). Next, the route data 19 is read by the motion planning program 24e based on the position / orientation obtained in the initial position / orientation estimation process 405 (802). Next, uniform pattern environment termination data (hereinafter referred to as uniform pattern environment termination data 20) is read from the uniform pattern environment termination data storage unit 20 by the uniform pattern environment termination determination program 24g (803). As described above, this data indicates that the mobile object 10 that is in a state where any or all of the three parameters of position / posture cannot be estimated while traveling in a uniform pattern environment is This is data in which the position / posture of the search range for matching and the shape of the search range necessary for matching between the geometric shape data and the map data are recorded in order to return all parameters to a presumable state.
 図5に一様パターン環境終端データ20と経路データ19の構造を示す。一様パターン環境終端データは経路セグメントデータと組になって記録されている。経路セグメントデータは、経路始点から経路終点を構成する線分(以下、セグメント)の情報として、セグメント始点とセグメント終点についての地図データの座標系における座標、終点に到達したときの判定基準(以下、単に到達判定基準と称する)、及びセグメントの種類(以下、セグメント種別)が記録されている。セグメント種別とは、そのセグメントが一般環境でのセグメントなのか、或いは一様パターン環境でのセグメントなのかのいずれかが記録されている。また、到達判定基準とは、移動体10が経路始点から経路終点に向かって走行し、経路終点の座標に到着したと判定するための基準である。ここでは単純に経路終点の座標から半径rの範囲内に移動体10の位置の座標が入っていれば到着と見なすとしているが、移動体10の運用状況に応じて変更してもよい。この経路セグメントデータが集まることで図2での経路データ19が構成されている。 FIG. 5 shows the structure of uniform pattern environment termination data 20 and route data 19. Uniform pattern environment termination data is recorded in pairs with path segment data. The route segment data is the information on the line segment (hereinafter referred to as the segment) that forms the route end point from the route start point, the coordinates in the coordinate system of the map data about the segment start point and the segment end point, and the judgment criteria when the end point is reached (hereinafter referred to as Simply referred to as arrival criteria) and the type of segment (hereinafter referred to as segment type). As the segment type, either a segment in a general environment or a segment in a uniform pattern environment is recorded. The arrival determination criterion is a criterion for determining that the moving body 10 travels from the route start point toward the route end point and arrives at the coordinates of the route end point. Here, if the coordinates of the position of the moving body 10 are simply within the radius r from the coordinates of the end point of the route, it is assumed that the mobile body 10 has arrived. However, it may be changed according to the operation status of the moving body 10. The route data 19 in FIG. 2 is configured by collecting the route segment data.
 また、経路セグメントデータと共に記録される一様パターン環境終端データとしては、復帰時探索範囲位置姿勢と復帰時探索範囲形状がある。復帰時探索範囲位置姿勢とは、一様パターン環境の走行中に位置・姿勢の3つのパラメータのいずれか、もしくはすべてが推定できない状態からすべてのパラメータを推定できる状態に復帰するために幾何形状データと地図データとのマッチングを行う際のマッチングの探索範囲の位置と姿勢を指す。また、復帰時探索範囲形状は、このマッチングの探索範囲の形状を指す。ここでは単純に横方向の長さがh、高さ方向の長さがvの長方形としているが、移動体10の運用状況に応じて変更してもよい。 Also, the uniform pattern environment end data recorded together with the route segment data includes a return search range position and orientation and a return search range shape. The search range position and orientation at the time of return is the geometric shape data to return to a state where all parameters can be estimated from any of the three parameters of position and orientation while driving in a uniform pattern environment. This refers to the position and orientation of the search range for matching when matching is performed with map data. The return search range shape indicates the shape of this search range for matching. Here, the rectangle is simply a rectangle having a length in the horizontal direction h and a length in the height direction v, but may be changed in accordance with the operation status of the moving body 10.
 次に、経路データ19の中から最初に追従するセグメントが選択される(804)。次に、処理404と同様にレーザ距離センサ制御プログラム24cがレーザ距離センサ12を制御し、環境をスキャンすることで幾何形状データが得られる(805)。次に、処理405のところで述べた処理の流れに従い、位置姿勢推定が行われる(806)。次に、読み込んだ経路データ19をなすセグメントのうち、最後のセグメントの終点に到達したかどうかの判定が行われる(807)。この判定で終点に到達したと判定された場合、処理はBに進み、つまりは図4の処理406に進む。また、この判定で終点に到達したと判定されなかった場合、処理808に進む。次に、移動体10が追従しようとしているセグメントの種別の判定が行われ、セグメントが一般環境セグメントの場合は処理816に、セグメントが一様パターン環境セグメントの場合は処理817に進む(808)。 Next, the first segment to follow is selected from the route data 19 (804). Next, similarly to the process 404, the laser distance sensor control program 24c controls the laser distance sensor 12 and scans the environment to obtain geometric shape data (805). Next, position and orientation estimation is performed according to the processing flow described in the processing 405 (806). Next, it is determined whether or not the end point of the last segment among the segments constituting the read route data 19 has been reached (807). If it is determined in this determination that the end point has been reached, the process proceeds to B, that is, the process proceeds to process 406 in FIG. If it is not determined that the end point has been reached in this determination, the process proceeds to step 808. Next, the type of the segment that the moving body 10 is trying to follow is determined. If the segment is a general environment segment, the process proceeds to process 816, and if the segment is a uniform pattern environment segment, the process proceeds to process 817 (808).
 このセグメントの種別の判定にもとづき、一般環境と一様パターン環境が混在する環境を移動体10が自動走行する様子について、図9を用いて述べる。ここでは、物体存在領域60で表される環境において、移動体10が点91と点92とを結ぶセグメント95a、点92と点93とを結ぶセグメント95b、そして点93と点94とを結ぶセグメント95cからなる経路データを用いて、スタート地点91から目的地94まで走行するものとする。また、セグメント95aと95cは一般環境セグメント、セグメント95bは一様パターン環境セグメントであることが経路データ19に記録されているものとする。なお、物体存在領域60の表面部分の形状と同じ形の地図データ18が既に得られているものとする。 Based on this segment type determination, the state in which the mobile body 10 automatically travels in an environment in which a general environment and a uniform pattern environment are mixed will be described with reference to FIG. Here, in the environment represented by the object existence area 60, the moving body 10 includes a segment 95a connecting the point 91 and the point 92, a segment 95b connecting the point 92 and the point 93, and a segment connecting the point 93 and the point 94. It is assumed that the vehicle travels from the start point 91 to the destination 94 using the route data consisting of 95c. It is assumed that the path data 19 records that the segments 95a and 95c are general environment segments and the segment 95b is a uniform pattern environment segment. It is assumed that the map data 18 having the same shape as the shape of the surface portion of the object existence area 60 has already been obtained.
 この状況のもと、移動体10が最初のセグメント95aを走行中で、例えば移動体10が10aの位置・姿勢にあるとき、レーザ距離センサ12の計測範囲12a内のスキャンによって得られる幾何形状データ34aには、通路両側の壁以外に正面の壁や角などの幾何的特徴が含まれることから、地図データ18と幾何形状データ34aのマッチングにおいては互いが最も重なり合うときの位置・姿勢の解が一意に求められる。移動体10がこのセグメント95aを選択中のとき、経路データ19のセグメント種別より、一般環境セグメントであることから、一般走行モード816で走行する。このとき、処理806で得られた位置・姿勢と経路データ19をもとに、選択中のセグメント(このケースではセグメント95a)のセグメント終点に到達したかが判定される(809)。セグメント終点に到達していない場合は、求められた位置・姿勢とセグメント終点とのずれが小さくなるように移動機構の一般走行制御を行う(810)。セグメント終点に到達した場合は、選択中のセグメントを走行済みとして記録し、次のセグメントを選択する(811)。図9の場合、移動体10が点92の位置に到達したと判定された場合には、次にセグメント95bが選択される。 Under this circumstance, when the moving body 10 is traveling the first segment 95a, for example, when the moving body 10 is at the position / posture 10a, geometric shape data obtained by scanning within the measurement range 12a of the laser distance sensor 12 is obtained. Since 34a includes geometric features such as front walls and corners in addition to the walls on both sides of the passage, in the matching between the map data 18 and the geometric shape data 34a, the position / posture solution when the two overlap most is obtained. It is uniquely determined. When the mobile body 10 is selecting the segment 95a, the vehicle 10 travels in the general travel mode 816 because it is a general environment segment from the segment type of the route data 19. At this time, it is determined whether the segment end point of the currently selected segment (in this case, the segment 95a) has been reached based on the position / posture obtained in the process 806 and the path data 19 (809). If the segment end point has not been reached, general travel control of the moving mechanism is performed so that the deviation between the determined position / posture and the segment end point is reduced (810). When the end point of the segment is reached, the selected segment is recorded as running and the next segment is selected (811). In the case of FIG. 
9, when it is determined that the moving body 10 has reached the position of the point 92, the segment 95 b is next selected.
 While the mobile body 10 is traveling segment 95b, for example at position/orientation 10b, the geometric shape data 34b obtained by scanning within the measurement range 12b of the laser distance sensor 12 contains only the walls on both sides of the passage and no other geometric features such as a front wall or corners. Matching the map data 18 against the geometric shape data 34b therefore finds multiple positions at which the two overlap most closely, and the solution cannot be determined uniquely. More specifically, of the three position/orientation parameters of the geometric shape data 34b, the position along the longitudinal direction of the passage (that is, the direction of travel) cannot be determined uniquely, and multiple solution candidates are obtained along that direction. Consequently, when the mobile body 10 selects and travels a segment in a uniform pattern environment, it cannot use the general travel mode 816, which assumes that all three position/orientation parameters are uniquely determined. However, the two parameters other than the position along the passage, namely the position relative to the wall and the orientation relative to the wall, can still be obtained, so these are used to travel in the wall-following travel mode 817 for uniform pattern environments. Wall-following travel is an operation mode in which the mobile body moves toward the end of the uniform pattern environment while keeping its distance and orientation relative to the wall constant. In the wall-following travel mode 817, detection of the end of the uniform pattern environment segment is performed while traveling (812); this detection uses the same processing as the position/orientation estimation process 806.
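The wall-following travel mode 817 regulates only the two parameters that remain observable in a uniform pattern environment: the lateral distance to the wall and the orientation relative to it. A minimal sketch follows, assuming a signed lateral distance and a body angle of zero when parallel to the wall; the gains and sign conventions are illustrative, not taken from the patent.

```python
def wall_following_step(wall_dist, wall_angle, target_dist=0.5,
                        k_dist=1.0, k_angle=2.0, v=0.3):
    """One control step of a wall-following mode (hypothetical sketch).

    wall_dist: measured lateral distance to the wall (from the range scan).
    wall_angle: body orientation relative to the wall, 0 when parallel.
    Returns (v, w): a constant forward speed and an angular velocity that
    drives both errors toward zero, so distance and orientation relative
    to the wall are held constant while advancing along the passage.
    """
    w = -k_dist * (wall_dist - target_dist) - k_angle * wall_angle
    return v, w
```

With both errors at zero the commanded turn rate is zero and the body simply advances toward the end of the uniform pattern environment.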
 Suppose now that the return search range defined by the uniform pattern environment end data 20 has been set as indicated by the broken line 96, and that the direction of the return search range 96 is set to point upward in the figure. Suppose further that the mobile body 10, having been at position/orientation 10b, eventually reaches position/orientation 10c while traveling in the wall-following travel mode 817. At this point, the uniform pattern environment segment 95b is still selected, so the uniform pattern environment end determination program 24g performs matching between the geometric shape data 34c and the map data 18 corresponding to the object existence area 60, in the same manner as position/orientation estimation. In this matching, however, the return search range 96 is used instead of the search range used in normal position/orientation estimation.
 Suppose the return search range 96 is provided as a square region E centered on position D in FIG. 10, oriented along the arrow of segment F (segment 95b in FIG. 9), and that the orientation search range within the return search range is set to the angular range G. The positions and orientations that the geometric shape data can take while searching for a matching solution are then obtained by rotating, from each vertex of the return search range E, through angles of G/2 to the left and right of the direction indicated by the semicircular marker of segment F; that is, approximately the union of the semicircles drawn in FIG. 10. Matching is then performed between the object existence pixels 60 contained in this region and the geometric shape data 34 obtained by scanning the actual environment. The geometric shape data 34c (FIG. 9) obtained when the mobile body 10 scans at position/orientation 10c falls within the union of the semicircles shown in FIG. 10 and, in addition, contains geometric features such as the front wall and corners, so all three position/orientation parameters are determined uniquely.
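The candidate poses admitted by the return search range can be pictured as a grid over the square E whose orientations lie within G/2 on either side of the segment direction. The enumeration below is an illustrative sketch; the patent does not specify how the search range is discretized, and the step sizes are assumptions.

```python
import math
from itertools import product

def candidate_poses(center, half_size, seg_theta, angle_range,
                    xy_step=0.1, ang_step=math.radians(5)):
    """Enumerate candidate poses inside a return search range (sketch).

    center: (x, y) of point D; half_size: half the side of square E;
    seg_theta: direction of segment F; angle_range: angular range G.
    Each candidate is a pose (x, y, theta) with (x, y) inside E and
    theta within +/- G/2 of the segment direction.
    """
    n = int(2 * half_size / xy_step) + 1
    m = int(angle_range / ang_step) + 1
    xs = [center[0] - half_size + i * xy_step for i in range(n)]
    ys = [center[1] - half_size + i * xy_step for i in range(n)]
    thetas = [seg_theta - angle_range / 2 + j * ang_step for j in range(m)]
    return [(x, y, t) for x, y, t in product(xs, ys, thetas)]
```

Each candidate pose would then be scored by overlaying the scan on the map data restricted to the object existence pixels reachable from this region.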
 In this way, matching is performed between the map data 18 of the end portion of the uniform pattern environment and the geometric shape data 34c. If the matching succeeds, more specifically, if a unique solution is found when the geometric shape data 34c is overlaid on the map data 18 of the end portion, the mobile body 10 is determined to have reached the end of the uniform pattern environment (813). For this matching, only the portion of the map data 18 observable from the region specified by the end data is used. If it is determined that the end of the uniform pattern environment has not been reached, control of the moving mechanism for the wall-following travel described above is continued (814). If the end has been reached, the selected segment is recorded as traveled and the next segment is selected from the route data 19 (815). In the case of FIG. 9, when the mobile body 10 is determined to have reached point 93, segment 95c is selected next.
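The end determination (step 813) hinges on the matching solution being unique. Assuming scan matching returns a score per candidate pose, uniqueness can be sketched as "the best match is both good and clearly better than the runner-up"; the threshold and margin below are illustrative assumptions, not values from the patent.

```python
def detect_environment_end(match_scores, score_threshold=0.8, margin=0.2):
    """End-of-uniform-pattern-environment determination (sketch of step 813).

    match_scores: one matching score per candidate pose in the return
    search range. The end is considered reached only when the best match
    exceeds the threshold AND beats the runner-up by a clear margin,
    i.e. the position/orientation solution is unique.
    Returns (reached, best_pose_index).
    """
    ranked = sorted(range(len(match_scores)),
                    key=lambda i: match_scores[i], reverse=True)
    best = ranked[0]
    if match_scores[best] < score_threshold:
        return False, None          # no good match at all: keep wall-following
    if len(ranked) > 1 and match_scores[best] - match_scores[ranked[1]] < margin:
        return False, None          # ambiguous: several near-equal peaks
    return True, best               # unique solution: end reached
```

When `reached` is false, wall-following control continues (step 814); when true, the next segment is selected from the route data (step 815).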
 The embodiment above was described on the assumption that end data is set for the segment, but for some segments end data cannot be set. In a general-environment segment with no end data, if the solution cannot be determined uniquely because of people or objects not represented in the map, the mobile body 10 stops moving. While stopped, the mobile body 10 continues to perform position/orientation estimation, and resumes operation once disturbance elements such as people are gone and it is determined that a unique solution can be obtained.
 When the above processing determines that the mobile body 10 has traveled and arrived at its destination (807), processing returns to the destination setting confirmation process 406 as described above. This concludes the flow of processing while the mobile body 10 is in normal operation.
 <Effects of the Embodiment>
 The mobile body 10 of the embodiment described above comprises a distance sensor unit 12 that detects the distance between the device and surrounding obstacles, a controller unit 11 that estimates the device's position/orientation from the detection results of the distance sensor unit 12 and controls travel, and a moving mechanism unit 21 that causes the device to travel autonomously under the control of the controller unit 11. This configuration provides the following effects.
 (1) The controller unit 11 estimates the device's position/orientation using a map of the travel environment (map data) and the distance data (geometric shape data) detected by the distance sensor unit 12, and can switch travel modes between the state in which the device's position/orientation is uniquely determined and the state in which it is not. Specifically, in a general environment containing many geometric features such as irregularities, where matching the geometric shape data against the map data yields all position/orientation parameters with high certainty, the device can travel automatically in the general travel mode, which uses all the parameters. Conversely, in a uniform pattern environment poor in geometric features, where matching cannot yield all position/orientation parameters with high certainty, the device can travel automatically in the wall-following travel mode, which uses only the parameters obtained with high certainty. This enables autonomous movement of the mobile body 10 in surroundings that mix environments rich in geometric features, such as a factory floor with equipment, and environments poor in geometric features, such as corridors.
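The mode switching of effect (1), together with the stop behavior of effect (2), can be condensed into a small decision rule. This sketch is illustrative only; the mode names, segment-type labels, and the representation of matching results as a candidate list are assumptions.

```python
def select_travel_mode(solutions, segment_type):
    """Travel-mode selection sketched from effects (1) and (2).

    solutions: candidate (x, y, theta) poses from map matching.
    A single candidate means all three parameters are uniquely
    determined, so the general travel mode applies. Multiple candidates
    in a uniform pattern segment mean only the wall-relative parameters
    are reliable, so wall-following applies; in a general-environment
    segment with no end data, the device stops instead.
    """
    if len(solutions) == 1:
        return "general_travel_mode"
    if segment_type == "uniform_pattern":
        return "wall_following_mode"
    return "stop"  # ambiguous solution outside a uniform pattern segment
```

While stopped, estimation keeps running, and the device returns to `general_travel_mode` as soon as a call yields a single candidate again.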
 (2) When the device's position/orientation is not uniquely determined, the controller unit 11 can control the device to stop traveling in segments for which no end data is set, and to travel in the wall-following travel mode in segments for which end data is set. Specifically, in a uniform pattern environment poor in geometric features, where matching the geometric shape data against the map data cannot yield all position/orientation parameters with high certainty, travel in the wall-following travel mode can be stopped in segments for which no end data is set.
 (3) The controller unit 11 can control the mobile body 10, while stopped in a segment for which no end data is set, to resume traveling when it returns to a state in which the device's position/orientation is uniquely determined. Specifically, to switch from the wall-following travel mode back to the general travel mode, it is necessary to determine that the device has reached an environment in which all position/orientation parameters can be obtained. For this purpose, matching is performed between the geometric shape data and the map data at the location where the environment changes from one poor in geometric features to one rich in them; when the matching certainty is high, the device is determined to have reached a location where switching is possible, and the operation mode is switched.
 (4) During travel in the wall-following travel mode, the controller unit 11 can control the device so that the direction between the wall surface of the passage being traveled and the sensor of the distance sensor unit 12 is kept constant. In a uniform pattern environment, the two parameters other than the position along the longitudinal direction of the passage, namely the position relative to the wall and the orientation relative to the wall, can still be obtained, and these are used to travel in the wall-following travel mode.
 (5) The controller unit 11 can perform end determination by comparing the end data of a uniform pattern environment, in which the device's position/orientation is not uniquely determined, with the geometric shape data detected by the distance sensor unit 12. In this case, the controller unit 11 can perform the end determination using, as end data, only the shape data observable from the end of the uniform pattern environment.
 The invention made by the present inventors has been described concretely above on the basis of an embodiment, but the invention is not limited to that embodiment, and it goes without saying that various modifications are possible without departing from its gist. For example, the embodiment above was described in detail to explain the invention clearly, and the invention is not necessarily limited to one having all of the configurations described. It is also possible to add to, delete from, or replace part of the configuration of the embodiment with other configurations.
 For example, the invention is also applicable to a mobile body that does not move autonomously. In that case, the configuration comprises a distance sensor unit that detects the distance between the device and surrounding obstacles, and a controller unit that determines the device's position/orientation from the detection results of the distance sensor unit and controls travel (for example, similar to FIG. 1 and FIG. 2 described above). Even in such a configuration, the controller unit can perform end determination by comparing the end data of a uniform pattern environment, in which the device's position/orientation is not uniquely determined, with the distance data detected by the distance sensor unit. In this case as well, the controller unit can perform the end determination using, as end data, only the shape data observable from the end of the uniform pattern environment.
Description of Symbols
10 Mobile body
11 Controller unit
12 Distance sensor unit
13 Distance sensor control unit
14 Position/orientation estimation unit
15 Uniform pattern environment end determination unit
16 Operation planning unit
17 Moving mechanism control unit
18 Map data storage unit
19 Route data storage unit
20 Uniform pattern environment end data storage unit
21 Moving mechanism unit

Claims (10)

  1.  A mobile body comprising:
     a distance sensor unit that detects the distance between the device and obstacles around the device;
     a controller unit that estimates the position and orientation of the device based on the detection results of the distance sensor unit and controls travel; and
     a moving mechanism unit that causes the device to travel autonomously under the control of the controller unit,
     wherein the controller unit estimates the position and orientation of the device using a map of the travel environment and distance data detected by the distance sensor unit, and switches travel modes between a state in which the position and orientation of the device are uniquely determined and a state in which they are not.
  2.  The mobile body according to claim 1,
     wherein, when the position and orientation of the device are not uniquely determined, the controller unit stops travel in segments for which no end data is set, and controls the device to travel in a wall-following travel mode in segments for which end data is set.
  3.  The mobile body according to claim 2,
     wherein the controller unit controls the mobile body, while stopped in a segment for which no end data is set, to resume travel when it returns to a state in which the position and orientation of the device are uniquely determined.
  4.  The mobile body according to claim 2,
     wherein, during travel in the wall-following travel mode, the controller unit controls the device so that the direction between the wall surface of the passage being traveled and the sensor of the distance sensor unit is kept constant.
  5.  The mobile body according to claim 1,
     wherein the controller unit performs end determination by comparing end data of a uniform pattern environment, in which the position and orientation of the device are not uniquely determined, with distance data detected by the distance sensor unit.
  6.  The mobile body according to claim 5,
     wherein the controller unit uses, as the end data, only shape data observable from the end of the uniform pattern environment.
  7.  The mobile body according to claim 1, comprising:
     a map data storage unit that stores map data of the travel environment;
     a route data storage unit that stores route data for causing the device to travel a desired route;
     a uniform pattern environment end data storage unit that manages the start point and end point of a uniform pattern environment, which is an environment in which a specific pattern continues in the map data;
     the distance sensor unit, which detects obstacles that may impede travel within a predetermined range and the distance and direction from the device to the obstacles;
     a distance sensor control unit that converts the distance and direction from the device to the obstacles, obtained from the distance sensor unit, into geometric shape data representing the positions of surrounding obstacles, and controls the operation of the distance sensor unit;
     a position/orientation estimation unit that matches the geometric shape data against the map data to estimate the current position and direction of travel of the device on the map; and
     a uniform pattern environment end determination unit that compares the result of current position estimation by the position/orientation estimation unit with the uniform pattern environment end data stored in the uniform pattern environment end data storage unit, and determines whether the current position of the device is the end point of the uniform pattern environment,
     wherein the position/orientation estimation unit matches the map data against the geometric shape data to estimate the position and orientation of the device, and switches between a general travel mode in a state in which the position and orientation of the device are uniquely determined and a wall-following travel mode in a state in which they are not.
  8.  A mobile body comprising:
     a distance sensor unit that detects the distance between the device and obstacles around the device; and
     a controller unit that estimates the position and orientation of the device based on the detection results of the distance sensor unit and controls travel,
     wherein the controller unit performs end determination by comparing end data of a uniform pattern environment, in which the position and orientation of the device are not uniquely determined, with distance data detected by the distance sensor unit.
  9.  The mobile body according to claim 8,
     wherein the controller unit uses, as the end data, only shape data observable from the end of the uniform pattern environment.
  10.  The mobile body according to claim 8, comprising:
     a map data storage unit that stores map data of the travel environment;
     a route data storage unit that stores route data for causing the device to travel a desired route;
     a uniform pattern environment end data storage unit that manages the start point and end point of a uniform pattern environment, which is an environment in which a specific pattern continues in the map data;
     the distance sensor unit, which detects obstacles that may impede travel within a predetermined range and the distance and direction from the device to the obstacles;
     a distance sensor control unit that converts the distance and direction from the device to the obstacles, obtained from the distance sensor unit, into geometric shape data representing the positions of surrounding obstacles, and controls the operation of the distance sensor unit;
     a position/orientation estimation unit that matches the geometric shape data against the map data to estimate the current position and direction of travel of the device on the map; and
     a uniform pattern environment end determination unit that compares the result of current position estimation by the position/orientation estimation unit with the uniform pattern environment end data stored in the uniform pattern environment end data storage unit, and determines whether the current position of the device is the end point of the uniform pattern environment,
     wherein the uniform pattern environment end determination unit performs end determination of the uniform pattern environment by comparing end data of the uniform pattern environment, in which the position and orientation of the device are not uniquely determined, with geometric shape data detected by the distance sensor unit.

PCT/JP2015/055870 2014-03-19 2015-02-27 Mobile object WO2015141445A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016508645A JP6348971B2 (en) 2014-03-19 2015-02-27 Moving body

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-055843 2014-03-19
JP2014055843 2014-03-19

Publications (1)

Publication Number Publication Date
WO2015141445A1 true WO2015141445A1 (en) 2015-09-24

Family

ID=54144421

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/055870 WO2015141445A1 (en) 2014-03-19 2015-02-27 Mobile object

Country Status (2)

Country Link
JP (1) JP6348971B2 (en)
WO (1) WO2015141445A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2015141445A1 (en) * 2014-03-19 2017-05-25 株式会社日立産機システム Moving body
JP2017182175A (en) * 2016-03-28 2017-10-05 国立大学法人豊橋技術科学大学 Autonomous travel device and start position determination program
CN108437833A (en) * 2018-04-13 2018-08-24 山东时风(集团)有限责任公司 A kind of special storage from turn truck and control method
WO2018233401A1 (en) * 2017-06-20 2018-12-27 南京阿凡达机器人科技有限公司 Optoelectronic mouse sensor module-based method and system for creating indoor map
US10930162B2 (en) 2016-06-13 2021-02-23 SZ DJI Technology Co., Ltd. Unmanned aerial vehicle, delivery system, control method for unmanned aerial vehicle, and program for controlling unmanned aerial vehicle

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109839936A (en) * 2019-03-04 2019-06-04 中新智擎科技有限公司 Automatic navigation method, robot and storage medium under a kind of overall situation

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0322108A (en) * 1989-06-20 1991-01-30 Shinko Electric Co Ltd Mobile robot
JP2008059218A (en) * 2006-08-30 2008-03-13 Fujitsu Ltd Method for restoring self-position of autonomously traveling robot
JP2013020345A (en) * 2011-07-08 2013-01-31 Hitachi Industrial Equipment Systems Co Ltd Position and posture estimation system for traveling object
JP2014006835A (en) * 2012-06-27 2014-01-16 Murata Mach Ltd Autonomous traveling apparatus, autonomous traveling method, markers, and autonomous traveling system
JP2014067223A (en) * 2012-09-26 2014-04-17 Hitachi Industrial Equipment Systems Co Ltd Autonomous mobile body

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3333223B2 (en) * 1991-11-29 2002-10-15 マツダ株式会社 Roadway recognition device for mobile vehicles
JP2005211442A (en) * 2004-01-30 2005-08-11 Tottori Univ Autonomously movable wheelchair
JP5909486B2 (en) * 2011-06-29 2016-04-26 株式会社日立産機システム Self-position / posture estimation system
WO2015141445A1 (en) * 2014-03-19 2015-09-24 株式会社日立産機システム Mobile object



Also Published As

Publication number Publication date
JPWO2015141445A1 (en) 2017-05-25
JP6348971B2 (en) 2018-06-27

Similar Documents

Publication Publication Date Title
JP6074205B2 (en) Autonomous mobile
JP6348971B2 (en) Moving body
JP7355500B2 (en) Robotic system and method for operating on workpieces
US9244463B2 (en) Automated guided vehicle and method of operating an automated guided vehicle
JP5157803B2 (en) Autonomous mobile device
KR100772912B1 (en) Robot using absolute azimuth and method for mapping by the robot
US8679260B2 (en) Methods and systems for movement of an automatic cleaning device using video signal
JP5800613B2 (en) Position / posture estimation system for moving objects
WO2010038353A1 (en) Autonomous movement device
JP6825712B2 (en) Mobiles, position estimators, and computer programs
US11747825B2 (en) Autonomous map traversal with waypoint matching
JP4735476B2 (en) Autonomous mobile device
JP5805841B1 (en) Autonomous mobile body and autonomous mobile body system
JP7081881B2 (en) Mobiles and mobile systems
JPWO2018110568A1 (en) Mobile object for performing obstacle avoidance operation and computer program therefor
US20110112714A1 (en) Methods and systems for movement of robotic device using video signal
JP2008152600A (en) Moving route generation method, autonomous moving object, and autonomous moving object control system
JP4670807B2 (en) Travel route creation method, autonomous mobile body, and autonomous mobile body control system
JPWO2019059307A1 (en) Mobiles and mobile systems
JP5212939B2 (en) Autonomous mobile device
KR20200070087A (en) Autonomous Mobile Robot and Method for Driving Control the same
JP5439552B2 (en) Robot system
JP5427662B2 (en) Robot system
JP7396353B2 (en) Map creation system, signal processing circuit, mobile object and map creation method
WO2021246170A1 (en) Information processing device, information processing system and method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15764740; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2016508645; Country of ref document: JP; Kind code of ref document: A)
122 Ep: pct application non-entry in european phase (Ref document number: 15764740; Country of ref document: EP; Kind code of ref document: A1)