WO2015141445A1 - Mobile object (Objet mobile) - Google Patents

Mobile object

Info

Publication number
WO2015141445A1
WO2015141445A1 (PCT/JP2015/055870)
Authority
WO
WIPO (PCT)
Prior art keywords
data
orientation
environment
unit
uniform pattern
Prior art date
Application number
PCT/JP2015/055870
Other languages
English (en)
Japanese (ja)
Inventor
修一 槙
高斉 松本
正木 良三
一登 白根
Original Assignee
株式会社日立産機システム (Hitachi Industrial Equipment Systems Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立産機システム (Hitachi Industrial Equipment Systems Co., Ltd.)
Priority to JP2016508645A, granted as JP6348971B2
Publication of WO2015141445A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser

Definitions

  • The present invention relates to a moving body, and more particularly to an autonomous moving body equipped with a system for estimating the position/orientation of a moving body carrying a device capable of detecting its own position/orientation.
  • The present invention relates to a technique for providing a moving body that transports objects, and more particularly to an autonomous moving body that can move autonomously in a line-pattern environment.
  • Although the present invention is described through the specific embodiment of an autonomous mobile robot, it can also be applied to a robot that does not move autonomously, and further to a moving body equipped with a camera and GPS, such as the car navigation system of an automobile. Design changes that can easily be made by those skilled in the art within the scope of the inventive idea are included in the technical scope of the invention.
  • This method is a technology called SLAM (Simultaneous Localization and Mapping): the robot obtains its own position and orientation within the range of the obtained sensor information while simultaneously generating and updating the environment map, and based on this it moves autonomously within the environment.
  • SLAM Simultaneous Localization and Mapping
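The SLAM cycle described above (localize against the current map, then extend the map with the new scan) can be sketched with a deliberately simplified one-dimensional toy. The function name, cell-grid map representation, and search window are illustrative assumptions, not from the patent:

```python
def slam_step_1d(map_cells, scan_points, prev_x, search=range(-2, 3)):
    """One toy SLAM iteration on a 1-D grid (illustrative only)."""
    # Localization: try small offsets around the previous pose and keep
    # the one whose transformed scan best overlaps the current map.
    best_x, best_score = prev_x, -1
    for dx in search:
        x = prev_x + dx
        score = sum(1 for p in scan_points if (x + p) in map_cells)
        if score > best_score:
            best_x, best_score = x, score
    # Mapping: merge the scan into the map at the estimated pose.
    map_cells |= {best_x + p for p in scan_points}
    return best_x

# Walls at cells 5 and 9; a robot actually at cell 2 measures ranges 3 and 7.
world_map = {5, 9}
estimate = slam_step_1d(world_map, [3, 7], prev_x=1)
```

A real SLAM system performs the same localize-then-update cycle over 2-D scans and probabilistic maps, but the alternation of the two steps is the defining feature.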
  • Patent Document 1 (Japanese Patent Laid-Open No. 2005-332204) describes self-position detecting means such as GPS (Global Positioning System) and means that detect the distance and direction to surrounding objects.
  • GPS Global Positioning System
  • A movement control device is known that has such object detection means and a function of generating an environment map in the moving direction based on the detection data.
  • The problem addressed by Patent Document 1 is to control a mobile body so that it accurately moves along a target route and avoids unexpected obstacles on the route even when radio waves from GPS satellites cannot be received.
  • As the solution means, a movement control device is provided that controls the movement of a mobile body that moves autonomously according to preset target route information.
  • The movement control device includes self-position detecting means that detects the current position and direction of the moving body, an ALR (Area Laser Radar) that detects the distance between the moving body and objects existing around it, and control means that, based on the self-position detecting means and the ALR, controls the moving body so that it moves along the route.
  • The control means cumulatively generates an environment map around the moving body, taking the presence of objects into account as the moving body moves, and determines a course for the moving body that does not interfere with objects based on the target route information and the environment map.
  • Patent Document 2 Japanese Patent Application Laid-Open No. 2004-276168
  • The subject of Patent Document 2 is to provide a map creation system that can create maps for mobile robots with high accuracy.
  • As the solution steps, new map information is first created by the mobile robot. Next, the existing map information is read, and the relative postures of the existing map are taken out one by one. It is then checked whether the same relative posture is also present in the new map information.
  • Relative postures that the new map information and the existing map information have in common are fused stochastically. If the new map information does not contain the same relative posture, that relative posture is added to the new map information. After that, if the relative-posture set of the new map information contains pieces that form a loop, a loop-solving process for eliminating the accumulated deviation is performed.
  • Patent Document 3 Japanese Patent Laid-Open No. 2007-94743 shows that a map data generation unit and a position estimation unit are arranged in an autonomous mobile robot or a server device.
  • The problem addressed by Patent Document 3 is to provide an autonomous mobile robot, and a system therefor, that make it easy to teach position information arbitrarily selected by the user and that move autonomously to a destination specified on the basis of the taught position information.
  • It includes an information input unit that receives position information and destination information from the user, and a position information storage unit that stores, in a table, the position information input from the information input unit in association with the position estimated by the position estimation unit.
  • The movement path planning unit associates the input destination information with the table stored in the position information storage unit and, referring to the map data representing obstacle information within the robot's movement space, obtains the travel path from the position estimated by the position estimation unit so that the robot moves autonomously.
  • Patent Document 4 (Japanese Patent Application Laid-Open No. 2005-242489) describes, as background art relating to a moving body that transports objects, setting the route with the shortest distance by reducing restrictions on the routes on which autonomous mobile bodies travel, and preventing collisions and traffic jams even when a plurality of autonomous mobile bodies travel simultaneously.
  • Patent Document 5 Japanese Patent Laid-Open No. 2007-133891
  • Patent Document 6 (Japanese Patent Laid-Open No. 2006-293975) likewise discloses an invention relating to an autonomous moving body that moves along a wall surface.
  • Patent Document 5 describes a "wall-following mode" in which the robot moves along a wall surface. However, this is explained only as an operation mode of a cleaning robot aimed at cleaning the edge of a room or the edges of objects in a room: "wall-following, that is, edge cleaning in the case of a cleaning robot, only the edge of the room can be cleaned" (paragraph 0053).
  • Techniques are disclosed such as cumulatively generating an environment map around a moving object in the moving direction, taking the presence of objects into account as the moving object moves, and determining a course for the moving body that does not cause interference with objects based on the target route information and the environment map.
  • Such conventional techniques are feasible in a general environment having many geometric features (such as irregularities).
  • An object of the present invention is to realize a robot capable of automatic traveling even in an environment where such geometric characteristics are poor.
  • In surroundings with sufficient geometric features such as irregularities around the autonomous mobile body, for example a factory or warehouse where much equipment such as machine tools and storage shelves is installed, the sensor data is such that all position and orientation parameters are uniquely determined when matched with the map. Such an environment is called a "general environment".
  • By contrast, an environment with poor geometric features, such as flat walls on both sides of a passage, produces multiple matching points when the sensor data is matched with the map, so that multiple candidate solutions for the position/orientation parameters appear. Because the pattern of geometric features continues uniformly, such an environment is called a "uniform pattern environment".
  • An environment in which walls continue on both sides of a route, that is, an environment that, when expressed as a map, appears as two continuing parallel lines like a track, is called a "track environment"; this is a subordinate concept of the "uniform pattern environment".
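The non-uniqueness that defines a uniform pattern environment can be demonstrated on a toy grid map: matching a scan of two parallel walls yields many equally good pose candidates along the corridor axis. The sketch below is illustrative (orientation is held fixed for brevity; the map and scan layout are made up):

```python
def candidate_poses(map_cells, scan, xs, ys):
    """Return every pose whose transformed scan overlaps the map maximally."""
    scores = {}
    for x in xs:
        for y in ys:
            scores[(x, y)] = sum((x + dx, y + dy) in map_cells
                                 for dx, dy in scan)
    best = max(scores.values())
    return [p for p, s in scores.items() if s == best]

# Corridor: two parallel walls, uniform along x.
corridor = {(x, 0) for x in range(10)} | {(x, 3) for x in range(10)}
# Scan as seen from y = 1: wall cells one below and two above, at dx = -1..1.
scan = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 2)]
cands = candidate_poses(corridor, scan, range(1, 9), range(1, 3))
```

Every x position inside the corridor matches equally well, so the along-corridor coordinate cannot be recovered; only the lateral position (and, in 3-DOF matching, the heading) is determined.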
  • A representative mobile body includes a distance sensor unit that detects the distance between the own device and surrounding obstacles, a controller unit that estimates the position/orientation of the own device based on the detection results of the distance sensor unit and controls travel, and a moving mechanism unit that drives the own device autonomously based on the control of the controller unit.
  • The controller unit estimates the position/orientation of the own device using a map of the travel environment and the distance data detected by the distance sensor unit, and switches the travel mode between a state in which the position and orientation of the own device are uniquely determined and a state in which they are not.
  • When the position and orientation of the own device are not uniquely determined, the controller unit stops traveling in a segment for which termination data is not set, and controls the vehicle so that it travels in the wall-following travel mode in a segment for which termination data is set.
  • The controller unit controls the moving body that has stopped traveling in a segment for which the termination data is not set so that it resumes traveling when the position and orientation of the own device become uniquely determined again.
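The switching behavior summarized in the last three bullets can be sketched as a small decision function. The mode labels are hypothetical names for illustration, not terms from the patent:

```python
def next_mode(pose_unique, segment_has_end_data):
    """Illustrative travel-mode selection implementing the behavior above."""
    if pose_unique:
        return "general"            # pose uniquely determined: (re)enter general travel
    if segment_has_end_data:
        return "wall_following"     # ambiguous pose, termination data set: follow wall
    return "stopped"                # ambiguous pose, no termination data: stop and wait
```

Calling this each control cycle reproduces the described stop/resume behavior: a stopped body returns "general" as soon as `pose_unique` becomes true again.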
  • Patent Document 5 and Patent Document 6 described above disclose inventions relating to an autonomous moving body that moves along a wall surface.
  • The cleaning robot of Patent Document 5, however, merely moves along the wall in order to clean only the edges of a room and the edges of objects in the room.
  • Patent Document 6 addresses the problem that "if the wall surface is discontinuous, the distance from the wall surface cannot be obtained, so appropriate movement along the boundary is not performed", and presents a solution to it.
  • Beyond this, however, Patent Document 6 does not disclose any technology related to the present invention.
  • A typical effect of the above configuration is that, in an environment with many geometric features such as irregularities, where all position and orientation parameters are obtained with high accuracy by matching the sensor data with the map, automatic travel can be performed in the general travel mode using all parameters.
  • In an environment where not all parameters are obtained with high accuracy, automatic travel can be performed in the wall-following travel mode, which uses only the parameters obtained with high accuracy.
  • As a result, the mobile body can move autonomously in surroundings that mix an environment with many geometric features due to facilities, such as a factory, and an environment with poor geometric features, such as a corridor.
  • FIG. 2 is a block diagram showing a software configuration in the mobile unit of FIG. 1.
  • FIG. 3 is an explanatory diagram showing a current position estimation process by a distance sensor unit in the moving body of FIG. 1.
  • 2 is a flowchart illustrating an outline of autonomous moving processing of a moving object in the moving object of FIG. 1.
  • FIG. 3 is an explanatory diagram showing an example of the data structure of route data and uniform pattern environment termination data in the mobile body of FIG. 1.
  • FIG. 2 is an explanatory diagram showing position and orientation estimation processing in the mobile body of FIG. 1.
  • FIG. 7 is an enlarged view of a range C shown in FIG.
  • FIG. 2 is a flowchart showing processing of a general traveling mode and a wall-following traveling mode of the moving body in the moving body of FIG. 1.
  • FIG. 3 is an explanatory diagram showing an example of detailed processing for switching from a wall-following travel mode to a general travel mode in the mobile body of FIG. 1.
  • Needless to say, the constituent elements are not necessarily indispensable, except when explicitly specified or when they are clearly indispensable in principle.
  • When referring to the shapes, positional relationships, and the like of the constituent elements, shapes and the like that are substantially approximate or similar are included, except when explicitly specified or when it is clearly not the case in principle. The same applies to the numerical values and ranges above.

[Outline of the embodiment]
  • The mobile body includes a distance sensor unit (distance sensor unit 12) that detects the distance between the own device and surrounding obstacles, a controller unit (controller unit 11) that estimates the position/orientation of the own device based on the detection results of the distance sensor unit and controls travel, and a moving mechanism unit (moving mechanism unit 21) that drives the own device autonomously based on the control of the controller unit.
  • The controller unit estimates the position/orientation of the own device using a map of the traveling environment and the distance data detected by the distance sensor unit, and switches the travel mode between a state in which the position and orientation of the own device are uniquely determined (general environment) and a state in which they are not (uniform pattern environment).
  • When the position and orientation of the own device are not uniquely determined, the controller unit stops traveling in a segment for which termination data is not set, and controls the vehicle so that it travels in the wall-following travel mode in a segment for which termination data is set.
  • The controller unit controls the moving body that has stopped traveling in a segment for which the termination data is not set so that it resumes traveling when the position and orientation of the own device become uniquely determined again.
  • The embodiment will be described with reference to FIGS. 1 to 10, limited to an autonomous mobile body (specifically, applied to an autonomous mobile robot).
  • the function of the mobile object assumed in the present embodiment will be described for a mobile object that automatically travels between the general environment and the uniform pattern environment.
  • The configuration and main processing contents will be described first, followed by a more specific hardware/software configuration and their operation.

<Functional configuration of mobile unit>
  • FIG. 1 shows a functional configuration of a mobile body (also referred to as an autonomous mobile body) 10 that can autonomously move in the track pattern environment of the present embodiment.
  • The autonomous mobile body 10 includes a controller unit 11 that estimates the position/orientation of the autonomous mobile body 10 and controls travel, a distance sensor unit 12 that detects the distance between the autonomous mobile body 10 and outer wall surfaces such as surrounding obstacles, and moving mechanism units 21 for traveling autonomously.
  • The controller unit 11 includes a distance sensor control unit 13 that controls the distance sensor unit 12, and a position/orientation estimation unit 14 that receives the detection results from the distance sensor unit 12 and estimates the position/orientation of the autonomous mobile body 10.
  • the position / orientation estimation unit 14 preferably includes two estimation units.
  • these two position / posture estimation units are a normal position / posture estimation unit and an initial position / posture estimation unit.
  • A plurality of obstacles are arranged in the operating area of the autonomous mobile body; in the invention of the prior application, the outer peripheral contour shape formed when the obstacles are placed on the map data is referred to as the obstacle arrangement shape, and will be explained as such.
  • the normal position / orientation estimation unit estimates the position / orientation according to the prior art, and is not a main problem in the prior invention.
  • the prior invention is characterized in that an initial position / posture estimation unit is provided, and the initial position / posture estimation unit includes a position / posture candidate display / setting unit.
  • The autonomous mobile body according to the embodiment of the invention of the prior application includes a route planning unit that sets the route through the area in which the body travels based on the position/orientation estimated by the normal or initial position/orientation estimation unit, and a moving mechanism control unit that moves the autonomous mobile body autonomously by driving the wheels along the route planned by the route planning unit.
  • The controller unit 11 of the autonomous mobile body 10 of the present invention further includes a uniform pattern environment end determination unit 15 that determines whether or not the current position of the mobile body 10 is at the end of the uniform pattern environment. It further includes a motion planning unit 16 that calculates the target position/orientation on the route to be followed from the current position/orientation of the mobile body 10, a moving mechanism control unit 17 that controls the mechanism so as to reduce the deviation between the calculated target position/orientation and the current position/orientation of the mobile body 10, and a moving mechanism unit 21 having front casters and rear drive wheels.
  • The storage units for various data comprise a map data storage unit 18, a route data storage unit 19, and a uniform pattern environment termination data storage unit 20. Although not shown here, components necessary for the parts to operate in cooperation, such as a case supporting each part and a power supply and wiring, are assumed to be provided.
  • the distance sensor control unit 13 controls the distance sensor unit 12 to obtain distance data including measurement of the distance and direction to the outer periphery of the object such as equipment in the surrounding environment.
  • A laser distance sensor is used as the distance sensor unit 12. This laser distance sensor measures the distance from the sensor to the outer periphery of an object by measuring the time from when the laser is emitted until it is reflected by an object in the environment and returns to the sensor.
  • The sensor has a laser irradiation unit. By measuring while rotating this laser irradiation unit through fixed angular steps, the distance to the outer edge of objects can be measured within the range of the rotation angle (hereinafter, "scan").
  • a laser distance sensor having such a function is conventionally known, a specific distance measuring method will not be further described.
  • Since this scan is performed in a plane, the scan yields the distance and direction from the sensor to the outer edge of objects on the plane formed by the laser (hereinafter, the "scan plane").
  • the data obtained by combining the distance data between the sensor and the object outer edge in each direction and the direction data irradiated with the laser are simply referred to as distance data.
  • each piece of distance data is recorded as a set of distance and direction data, it can be converted into position data with the sensor as a reference.
  • the data obtained by converting the distance data into the position data in this way is referred to herein as geometric shape data.
  • The laser distance sensor is attached to the moving body 10 so that its scan plane is parallel to the floor surface, so that geometric shape data at the height of the scan plane can be obtained.
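The conversion from distance data (a distance and bearing per beam) to sensor-frame geometric shape data is a polar-to-Cartesian transform. A minimal sketch, with illustrative default values for the start angle, step, and range cut-off (the actual sensor parameters would come from the device):

```python
import math

def to_geometric_data(distances, start_deg=-90.0, step_deg=0.5, max_range=30.0):
    """Convert one scan of per-beam distances into sensor-frame (x, y)
    points -- the 'geometric shape data' described above."""
    points = []
    for i, r in enumerate(distances):
        if 0.0 < r <= max_range:                 # drop invalid / out-of-range beams
            a = math.radians(start_deg + i * step_deg)
            points.append((r * math.cos(a), r * math.sin(a)))
    return points

pts = to_geometric_data([2.0, 0.0, 4.0])  # beam 1 reads 0.0 (invalid) and is skipped
```

Because each beam's bearing is known from its index, only the distance array needs to be transmitted by the sensor; the positions are reconstructed as above.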
  • the geometric shape data obtained as described above is sent to the position / orientation estimation unit 14.
  • a map data storage unit 18 in which the geometric shape of the environment at the height of the scan plane is recorded as an image is read in advance.
  • A process is performed that searches for the position and orientation of the geometric shape data on the map data at which the object existence pixels of the geometric shape data, regarded as an image, most overlap the pixels indicating that an object exists in the map data storage unit 18 (hereinafter, "object existence pixels"); this process is hereinafter referred to as "matching".
  • In this way, the position and orientation of the geometric shape data in the coordinate system of the map data storage unit 18 at which the geometric shape data and the map data most overlap are obtained.
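The matching process can be sketched as an exhaustive search over candidate positions and orientations for the pose whose transformed scan points overlap the most object existence pixels. This is a deliberately coarse illustration; a practical implementation would search far more efficiently (e.g. hierarchically or with gradient methods):

```python
import math

def match(map_pixels, scan_points, xs, ys, thetas):
    """Grid search for the pose maximizing overlap with object existence pixels."""
    best, best_score = None, -1
    for th in thetas:
        c, s = math.cos(th), math.sin(th)
        rotated = [(px * c - py * s, px * s + py * c) for px, py in scan_points]
        for x in xs:
            for y in ys:
                score = sum((round(x + rx), round(y + ry)) in map_pixels
                            for rx, ry in rotated)
                if score > best_score:
                    best, best_score = (x, y, th), score
    return best, best_score

wall = {(x, 5) for x in range(10)} | {(0, y) for y in range(6)}  # L-shaped corner
scan = [(0, 3), (1, 3), (2, 3), (-2, 0), (-2, 1)]  # the corner seen from (2, 2)
pose, score = match(wall, scan, range(0, 5), range(0, 5), [0.0, math.pi / 2])
```

The corner constrains both axes and the heading, so a single best pose emerges; in a corridor, by contrast, many poses would tie for the best score.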
  • the process proceeds to the process in the motion planning unit 16.
  • In the general environment, the motion planning unit 16 operates in the general travel mode and calculates the target position/orientation on the route to be followed from the route data storage unit 19 read in advance and the current position/orientation of the moving body 10 obtained by the position/orientation estimation unit 14.
  • The moving mechanism control unit 17 controls the moving mechanism unit 21 so as to reduce the deviation between the calculated target position/orientation and the current position/orientation of the moving body 10. That is, it determines the rotational speed of the wheels (generally the rear drive wheels) and the turning angle of the steering (generally the front casters) and issues instructions to the motors and the like. As a result, tracking of the moving body 10 along the preset route, and thus automatic travel to the destination, is realized.
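The deviation-reducing control just described can be sketched as a simple proportional law that turns the pose error into a wheel-speed and steering command. The gains and sign conventions are made-up illustrative values, not from the patent:

```python
import math

def track_route(pose, target, k_v=0.5, k_w=1.0):
    """Proportional controller sketch: pose is (x, y, theta), target is (x, y).
    Returns a drive-wheel speed and a steering-angle command."""
    dx, dy = target[0] - pose[0], target[1] - pose[1]
    distance = math.hypot(dx, dy)
    heading_error = math.atan2(dy, dx) - pose[2]
    # Wrap the heading error to (-pi, pi] so the body turns the short way.
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
    speed = k_v * distance          # rear drive-wheel speed command
    steer = k_w * heading_error     # front caster turning-angle command
    return speed, steer

speed, steer = track_route((0.0, 0.0, 0.0), (2.0, 0.0))  # target straight ahead
```

Driving this loop each cycle shrinks both the distance and heading errors, which is exactly the "reduce the deviation" behavior of the moving mechanism control unit.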
  • When traveling in a general environment, the autonomous mobile body 10 travels autonomously along the preset route by means of the position/orientation estimation unit 14, which receives the detection results from the distance sensor unit 12 and estimates the position/orientation of the moving body 10.
  • The moving body 10 moves forward in the traveling direction so as not to contact obstacles detected by the distance sensor unit 12, and enters the wall-following mode of operation.
  • the uniform pattern environment termination determination unit 15 performs determination processing subsequent to the position / orientation estimation unit 14.
  • The uniform pattern environment end determination unit 15 is loaded in advance with data from the map data storage unit 18 and from the uniform pattern environment end data storage unit 20, which holds data on the end positions of the uniform pattern environment.
  • The uniform pattern environment end data storage unit 20 records the position/orientation and the shape of a matching search range within which the position and orientation are uniquely determined, without the multiple solution candidates that appear during matching in the middle of the uniform pattern environment.
  • The matching search range denotes the range (the range of values that a solution can take) within which solution candidates are searched for during matching; it covers the three parameters of position and orientation.
  • Like the position/orientation estimation unit 14, the uniform pattern environment end determination unit 15 matches the map data storage unit 18 against the geometric shape data, based on the position/orientation of the matching search range recorded in the uniform pattern environment end data storage unit 20.
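The end determination can be sketched as matching restricted to the recorded search range, followed by a test of whether the solution has become unique. The grid representation, score threshold, and data layout below are illustrative assumptions:

```python
def at_uniform_pattern_end(map_cells, scan, search_poses, min_score):
    """Match only inside the search range recorded in the termination data and
    report whether the pose is now uniquely determined (one acceptable hit)."""
    hits = []
    for x, y in search_poses:
        score = sum((x + dx, y + dy) in map_cells for dx, dy in scan)
        if score >= min_score:
            hits.append((x, y))
    return (hits[0], True) if len(hits) == 1 else (None, False)

# A corridor closed by an end wall at x = 10.
corridor = ({(x, 0) for x in range(11)} | {(x, 3) for x in range(11)}
            | {(10, y) for y in range(4)})
scan = [(0, -1), (0, 2), (2, 0), (2, 1)]          # side walls plus end wall ahead
pose, unique = at_uniform_pattern_end(corridor, scan,
                                      [(x, 1) for x in range(1, 9)], min_score=4)
```

Until the end wall enters the scan, every pose in the search range scores equally and the result stays ambiguous; the end wall breaks the tie and yields the unique pose used to seed the next matching cycle.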
  • While the moving body 10 is traveling automatically in the uniform pattern environment, it moves while performing matching both in the position/orientation estimation unit 14 and in the uniform pattern environment end determination unit 15. More specifically, in a track pattern environment such as a long corridor, it travels in a straight line at a predetermined distance from both walls.
  • In the uniform pattern environment, the motion planning unit 16 operates in the wall-following travel mode and calculates the posture and speed of the moving body 10 for moving along the environment, such as a wall surface, from the route data storage unit 19 read in advance, the current position/orientation of the moving body 10 partially obtained by the position/orientation estimation unit 14, and the relative position/orientation with respect to the environment, such as the distance and posture to the surrounding wall surface.
  • The moving mechanism control unit 17 controls the moving mechanism unit 21 so as to reduce the deviation between the calculated posture/speed and the current posture/speed of the moving body 10; that is, instructions for the rotational speed of the wheels and the turning angle of the steering are given to the motors and the like. Tracking of the moving body 10 along the route, and thus automatic travel to the destination, is thereby realized.
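Wall-following control in the uniform pattern environment uses only the reliably estimated parameters (lateral distance and heading relative to the wall), leaving the ambiguous along-corridor position unused. A minimal sketch with made-up constants and sign conventions:

```python
def follow_wall(wall_distance, wall_heading, target_distance=0.5,
                cruise_speed=0.3, k_d=1.0):
    """Wall-following command sketch: hold a fixed lateral distance to the
    wall; wall_heading is the body's misalignment with the wall direction."""
    # Steer toward the wall when too far, away when too close,
    # and cancel any heading misalignment with the wall.
    steer = k_d * (wall_distance - target_distance) - wall_heading
    return cruise_speed, steer

speed, steer = follow_wall(wall_distance=0.7, wall_heading=0.0)  # too far: steer in
```

Because neither input depends on the position along the corridor, the controller remains well-defined even while matching yields multiple candidates for that parameter.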
  • The position/orientation obtained by the uniform pattern environment end determination unit 15 is used as the initial position/orientation for matching in the next cycle of the position/orientation estimation unit 14.
  • When the position/orientation estimation unit 14, which during travel in the uniform pattern environment fell into a state where some or all of the three position/orientation parameters could not be estimated, returns to a state in which all parameters can again be estimated, the moving body 10 switches from the wall-following travel mode for the uniform pattern environment to the general travel mode for the general environment and continues automatic travel to the destination.
  • The above is the outline of the flow of processing performed by the mobile object 10.

<Software configuration of mobile unit>
  • the hardware and software configurations of the mobile body 10 will be described, and the flow of processing of the hardware and software as a whole will be described through an example in which the mobile body automatically runs in both the general environment and the uniform pattern environment.
  • the moving body moves in a plane, and two parameters for positions that the moving body 10 can take and one parameter for posture are estimated.
  • FIG. 2 shows the configuration of the hardware of the autonomous mobile body 10 of this embodiment and the software stored therein.
  • The moving body 10 includes a controller 11 (corresponding to the controller unit in FIG. 1), a laser distance sensor 12 serving as the sensor of the distance sensor unit 12, a moving mechanism 21 (corresponding to the moving mechanism unit in FIG. 1), a display 25, an input device 26, and a communication line 27 for communication between these devices.
  • FIG. 2 only elements directly related to the flow of processing are shown, and it is assumed that a power source and the like necessary for the operation of each element are provided.
  • the distance sensor of the distance sensor unit 12 a sensor of the same type as the laser distance sensor mentioned in the above-described distance sensor unit 12 is used here.
  • the angle range scanned by the laser distance sensor 12 is 180 degrees, the laser is irradiated every 0.5 degrees in this angle range, and the distance to the object is measured.
  • the angle range, the step size of the angle of laser irradiation, the maximum distance measurement range, and the like may be different.
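For the concrete parameters given above (a 180-degree range scanned with one beam every 0.5 degrees), the bearing of each beam can be generated as follows; centering the range on the forward direction is an assumption for illustration:

```python
def scan_angles(angle_range_deg=180.0, step_deg=0.5):
    """Bearing of each laser beam for the scan parameters described above."""
    n = int(angle_range_deg / step_deg) + 1      # beams at both endpoints inclusive
    return [-angle_range_deg / 2 + i * step_deg for i in range(n)]

angles = scan_angles()   # 361 beams from -90.0 to +90.0 degrees
```

As the text notes, a different sensor may use another range, step size, or maximum distance; only the defaults here would change.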
  • FIG. 3 shows an estimation process of the current position by the distance sensor unit 12, and shows how the outer edge of an object such as equipment in a general environment is measured by the laser distance sensor 12 attached to the moving body 10.
  • FIG. 3 is a top view showing a region in which an object exists in the illustrated environment (hereinafter, object existence region) 32 (the wide hatched portion) and the moving body 10 (corresponding to the moving body 10 in FIG. 2), as viewed from above.
  • the moving body 10 in the position / orientation in the figure scans an angle range of 180 degrees by a laser distance sensor 12 (corresponding to the laser distance sensor 12 in FIG. 2).
  • The laser distance sensor 12 thereby obtains the geometric shape data 34 (the broken-line portion is the geometric shape data; the thin lines connecting the broken-line portions are auxiliary lines indicating the scanned range; the narrow hatched portion is the scanned range).
  • the laser distance sensor 12 is used as described above.
  • the sensor method may be different as long as the sensor can measure the geometric shape of the object.
  • a stereo camera or a depth camera that can measure the distance to an object for each pixel by irradiating the object with infrared rays in a plane may be used.
  • The moving body 10 is provided with front casters and rear drive wheels, and the moving mechanism unit 21 is assumed to be able to turn by means of the rear drive wheels and front casters.
  • the method of the moving mechanism may be different as long as the effect of moving in the environment can be obtained.
  • other moving mechanisms such as a vehicle having an endless track, a moving body having legs, a ship, an aircraft, and an airship may be used.
  • the moving body 10 is configured to automatically travel.
  • a person may board and steer the moving body, or the moving body may be controlled by remote communication without anyone boarding.
  • the controller 11 includes a processor 22, a memory 23, and a storage device 24.
  • the storage device 24 includes an operating system (OS) 24a, a controller initialization program 24b for reading the BIOS and starting the OS, a laser distance sensor control program 24c, a position / orientation estimation program 24d, an operation plan program 24e, a moving mechanism control program 24f, the uniform pattern environment end determination program 24g, the uniform pattern environment termination data storage unit 20, the map data storage unit 18, and the route data storage unit 19.
  • the laser distance sensor control program 24c acquires distance data from the laser distance sensor 12.
  • the position / orientation estimation program 24d calculates the position / orientation by matching the geometric shape data with the map data stored in the map data storage unit 18.
  • the motion planning program 24e calculates a route for reaching the destination based on the route data stored in the route data storage unit 19.
  • the moving mechanism control program 24f calculates the rotational speed of the wheels and the like so that the moving body 10 moves along the route.
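As an illustration of this kind of computation (the differential-drive model, dimensions, and names below are assumptions, not taken from the specification), a body velocity command can be mapped to wheel speeds as follows:

```python
def wheel_speeds(v, omega, wheel_base=0.4, wheel_radius=0.1):
    """Map a body velocity command (v [m/s] forward, omega [rad/s]
    counter-clockwise) to left/right wheel angular velocities [rad/s]
    for a differential-drive base (dimensions are illustrative)."""
    v_left = v - omega * wheel_base / 2.0
    v_right = v + omega * wheel_base / 2.0
    return v_left / wheel_radius, v_right / wheel_radius

wl, wr = wheel_speeds(0.5, 0.0)  # straight ahead: both wheels equal
```

A pure rotation command (v = 0) yields wheel speeds of opposite sign, which is what lets such a base turn in place.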
  • the uniform pattern environment end determination program 24g detects that the mobile object 10 has reached the vicinity of the end of the uniform pattern environment while traveling in the uniform pattern environment.
  • the uniform pattern environment termination data storage unit 20 stores the uniform pattern environment termination data used when the uniform pattern environment end determination program 24g determines that the moving body has arrived near the end of the uniform pattern environment.
  • the program and data of the embodiment shown in FIG. 2 are loaded into the memory 23 and then processed by the processor 22.
  • the implementation may be different.
  • the above processing may be realized by programmable hardware such as an FPGA (Field Programmable Gate Array) or a CPLD (Complex Programmable Logic Device).
  • the program and data may be transferred from a storage medium such as a CD-ROM, or may be downloaded from another device via a network.
  • the devices constituting the moving body 10, such as the processor 22, the storage device 24, and the moving mechanism unit 21, communicate with each other through the wired communication line 27, but the communication may be wireless. As long as communication is possible, the controller 11, the display 25, and the input device 26 may be physically remote from each other. Further, the above hardware and software may be selected according to the embodiment.

<Autonomous movement processing of moving objects>
  • FIG. 4 mainly shows, in the flow of processing in the controller 11 mounted on the moving body 10, the processing related to preparation before automatic traveling (an outline of the autonomous movement processing of the moving body), and FIG. 8 mainly shows the processing related to traveling itself (processing in the general traveling mode and the wall-following traveling mode).
  • the controller initialization program 24b reads the OS 24a and activates the programs 24c to 24g (402).
  • map data is read from the map data storage unit 18 (hereinafter simply referred to as map data 18) by the position / orientation estimation program 24d (403).
  • the map data 18 is image data, and it is assumed that the presence or absence of an object in the environment is recorded as a pixel value for each pixel.
  • the laser distance sensor control program 24c controls the laser distance sensor 12 to scan the environment, thereby obtaining the geometric data of the environment (404).
  • initial position / orientation estimation is performed (405).
  • the initial position / orientation estimation is a process performed using the position / orientation estimation program 24d, and refers in particular to the position / orientation estimation performed when the moving body 10 starts operating. Since the basic principles of position / orientation estimation and initial position / orientation estimation are the same, position / orientation estimation in general will be described first.
  • the position / orientation estimation program 24d has a function of calculating the position / orientation by matching the geometric shape data with the map data 18. This function will be described with reference to FIG. 6. Suppose that the map data 18 is recorded as image data consisting of the object existence pixels (data indicating the outer edge of objects) represented by 60 in FIG. 6, and that the position / orientation of the moving body 10 is estimated by matching the geometric shape data 34, obtained by scanning at the current position / orientation, against this map data.
  • the matching search range is the range X centered on the previous estimated position 62. For the search range X, only the position search range is shown, but it is assumed that a posture search range is also actually set. For example, an angle range of ±30 degrees centered on the arrow extending from the previous estimated position 62 may be set as the search range for the posture.
  • the overlapping state between the geometric shape data and the object existence pixels 60 of the map data 18 is evaluated for each position / posture that the geometric shape data can take, and the position / posture giving the greatest overlap is obtained. As a result, the geometric shape data is superimposed on the object existence pixels 60 of the map data 18 at the estimated position 63 of the moving body 10, in the posture of the arrow extending from the estimated position 63.
  • FIG. 6 shows a state where there is a discrepancy between the object existence pixel 60 and the geometric shape data 34.
  • FIG. 7 is an enlarged view of the range indicated by C in FIG.
  • the map data 18 is represented by an image composed of object existence pixels 67 (black pixels) and pixels 69 (white pixels) indicating that no object exists, and the geometric shape data 34 is represented, on this image, by object existence pixels 68 (hatched pixels). At this time, the object existence pixel 68 forming the geometric shape data 34 overlaps the object existence pixel 67 of the map data 18 at the position of pixel 66, and in this case this pixel is regarded as a match. However, if the laser distance sensor 12 of the moving body 10 is on the right side of FIG. 7, the white pixels on the left side of the line formed by the object existence pixels 67 (black pixels) do not immediately indicate that no object exists there; for example, the white pixel portion on the left side may be the inside of the object.
  • the number of matching object existence pixels between the geometric shape data 34 and the object existence pixels 60 of the map data 18 is obtained in a round-robin manner by the pixel-by-pixel matching method described above. In the initial position / orientation estimation, the entire map data is set as the search range without using a previous estimated position / posture, the posture search range is likewise widened to 360 degrees, and the detailed position / posture of the place where the moving body 10 was activated is found.
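The round-robin pixel matching described above can be sketched as follows (an illustrative sketch: the map is reduced to a set of occupied grid cells and only a few candidate poses are scored; all names and the cell size are assumptions, not part of the specification):

```python
import math

def match_score(scan_pts, occupied, pose, cell=0.05):
    """Count scan points that land on occupied map cells when the scan
    is placed at pose = (x, y, theta)."""
    x, y, th = pose
    c, s = math.cos(th), math.sin(th)
    hits = 0
    for px, py in scan_pts:
        wx = x + c * px - s * py          # rotate then translate the
        wy = y + s * px + c * py          # scan point into the map frame
        if (round(wx / cell), round(wy / cell)) in occupied:
            hits += 1
    return hits

def search_pose(scan_pts, occupied, candidates):
    """Round-robin search: return the candidate pose whose scan
    overlaps the most object existence pixels."""
    return max(candidates, key=lambda p: match_score(scan_pts, occupied, p))

occupied = {(i, 0) for i in range(20)}           # a wall along the x axis
scan = [(i * 0.05, 0.0) for i in range(20)]      # a scan of that wall
candidates = [(0.0, dy, 0.0) for dy in (-0.1, 0.0, 0.1)]
best = search_pose(scan, occupied, candidates)   # (0.0, 0.0, 0.0)
```

Only the pose that places the scan exactly on the wall scores any hits here; in a real search the candidate set would cover the whole search range X in position and posture.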
  • the matching process described above is assumed here as the process by which the position / orientation estimation program 24d calculates the position / orientation, but other methods may be used as long as the same effect is obtained. For example, ICP (Iterative Closest Point) may be used. In the initial position / orientation estimation, the position / posture at which the geometric shape data most closely matches the map data 18 is obtained as the initial position / posture.
  • such a search takes time. For this reason, the initial position and orientation may instead be obtained by performing matching around a position / posture specified by the operator on the map data 18 displayed on the display 25, or, if the moving body 10 has a predetermined parking place, around that place.
  • a confirmation screen as to whether or not to finish the destination setting is displayed on the display 25.
  • to automatically drive the moving body 10, the operator selects destination setting using the input device 26 (406). If the moving body 10 is not to be driven automatically, the operator selects end, and in this case the program ends immediately (407). When destination setting is selected, the process proceeds to A, that is, to process 801 in FIG. 8.
  • the uniform pattern environment termination data 20 is read from the uniform pattern environment termination data storage unit 20 by the uniform pattern environment end determination program 24g (803).
  • this data is used so that the moving body 10, which while traveling in a uniform pattern environment is in a state where some or all of the three position / posture parameters cannot be estimated, can determine that it has reached the vicinity of the end of that environment.
  • FIG. 5 shows the structure of uniform pattern environment termination data 20 and route data 19.
  • Uniform pattern environment termination data is recorded in pairs with path segment data.
  • the route segment data consists of information on the line segment (hereinafter referred to as the segment) from the route start point to the route end point: the coordinates, in the coordinate system of the map data, of the segment start point and the segment end point; the criterion for judging that the end point has been reached (hereinafter simply referred to as the arrival criterion); and the type of the segment (hereinafter referred to as the segment type). As the segment type, either a segment in a general environment or a segment in a uniform pattern environment is recorded.
  • the arrival determination criterion is a criterion for determining that the moving body 10 travels from the route start point toward the route end point and arrives at the coordinates of the route end point.
  • the route data 19 in FIG. 2 is configured by collecting the route segment data.
  • the uniform pattern environment end data recorded together with the route segment data includes a return search range position and orientation and a return search range shape.
  • the return search range position and orientation refer to the position and orientation of the search range used when matching the geometric shape data with the map data in order to return, while traveling in a uniform pattern environment, from a state in which some or all of the three position / orientation parameters cannot be estimated to a state in which all parameters can be estimated.
  • the return search range shape indicates the shape of this search range for matching.
  • here the return search range shape is simply a rectangle with length h in the horizontal direction and length v in the height direction, but it may be changed according to the operating conditions of the moving body 10.
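The pairing of route segment data with uniform pattern environment end data shown in FIG. 5 can be sketched as plain records (all field names are assumptions for illustration; the specification does not prescribe a data layout):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class EndData:
    """Return search range recorded with a uniform pattern segment."""
    search_pose: Tuple[float, float, float]  # return search range position/orientation
    width: float                             # horizontal length h
    height: float                            # height-direction length v

@dataclass
class RouteSegment:
    start: Tuple[float, float]          # segment start point (map coordinates)
    end: Tuple[float, float]            # segment end point (map coordinates)
    arrival_radius: float               # arrival determination criterion
    kind: str                           # "general" or "uniform"
    end_data: Optional[EndData] = None  # recorded only for uniform segments

# Route data is a collection of such segment records.
route = [
    RouteSegment((0.0, 0.0), (5.0, 0.0), 0.1, "general"),
    RouteSegment((5.0, 0.0), (5.0, 8.0), 0.1, "uniform",
                 EndData((5.0, 8.0, 1.57), 2.0, 1.0)),
]
```

Making the end data optional mirrors the point made later that end data may not be set for every segment.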
  • the first segment to follow is selected from the route data 19 (804).
  • the laser distance sensor control program 24c controls the laser distance sensor 12 and scans the environment to obtain geometric shape data (805).
  • position and orientation estimation is performed according to the processing flow described in the processing 405 (806).
  • the type of the segment that the moving body 10 is trying to follow is determined. If the segment is a general environment segment, the process proceeds to process 816, and if the segment is a uniform pattern environment segment, the process proceeds to process 817 (808).
  • in FIG. 9, it is assumed that the moving body 10 travels from the start point 91 to the destination 94 using route data consisting of a segment 95a connecting point 91 and point 92, a segment 95b connecting point 92 and point 93, and a segment 95c connecting point 93 and point 94. It is assumed that the route data 19 records segments 95a and 95c as general environment segments and segment 95b as a uniform pattern environment segment, and that the map data 18, whose shape matches the surface portion of the object existence area 60, has already been obtained.
  • while the moving body 10 is traveling the first segment 95a, for example when it is at the position / posture 10a, the geometric shape data 34a obtained by scanning within the measurement range 12a of the laser distance sensor 12 includes geometric features such as the front wall and corners in addition to the walls on both sides of the passage, so in the matching between the map data 18 and the geometric shape data 34a the position / posture solution at which the two overlap most is uniquely determined.
  • at this time, the moving body 10 travels in the general travel mode (816), because the segment type in the route data 19 indicates a general environment segment.
  • in the general travel mode, it is determined, based on the position / posture obtained in process 806 and the route data 19, whether the segment end point of the currently selected segment (in this case, segment 95a) has been reached (809). If the segment end point has not been reached, general travel control of the moving mechanism is performed so that the deviation between the obtained position / posture and the segment end point is reduced (810).
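A minimal sketch of control that reduces the deviation between the current position / posture and the segment end point, as in process 810, might be a heading controller of the following form (the control law and gains are assumptions, not taken from the specification):

```python
import math

def goto_cmd(pose, goal, v=0.3, k_head=1.2):
    """Steer toward the segment end point: turn at a rate proportional
    to the heading error while moving forward (illustrative gains)."""
    x, y, th = pose
    gx, gy = goal
    err = math.atan2(gy - y, gx - x) - th
    err = math.atan2(math.sin(err), math.cos(err))  # wrap to [-pi, pi]
    return v, k_head * err

v_cmd, w_cmd = goto_cmd((0.0, 0.0, 0.0), (1.0, 0.0))  # goal dead ahead
```

With the goal directly ahead the turn rate is zero; any lateral or heading deviation produces a corrective turn that shrinks the deviation as the body advances.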
  • the selected segment is recorded as running and the next segment is selected (811). In the case of FIG. 9, when it is determined that the moving body 10 has reached the position of the point 92, the segment 95 b is next selected.
  • in contrast, when the moving body 10 travels segment 95b, the geometric shape data 34b obtained by scanning within the measurement range 12b of the laser distance sensor 12 includes only the walls on both sides of the passage, with no geometric features such as a front wall or corners; since there are then multiple positions at which the map data 18 and the geometric shape data 34b overlap most, the position / posture solution cannot be uniquely determined. More specifically, among the three position / orientation parameters of the geometric shape data 34b, the position in the longitudinal direction of the passage (that is, the traveling direction) is not uniquely determined, and multiple solution candidates are obtained along the passage longitudinal direction.
  • since the general travel mode (816) is an operation mode premised on the three position / orientation parameters being uniquely determined, it cannot be used here. However, two parameters, the position relative to the wall and the posture relative to the wall, can still be obtained as the parameters other than the position in the longitudinal direction of the passage, so the moving body travels in the wall-following travel mode (817). Wall-following travel is an operation mode in which the distance and posture relative to the wall are kept constant while the moving body moves toward the end of the uniform pattern environment. In the wall-following travel mode (817) in the uniform pattern environment, the end of the uniform pattern environment segment is detected during traveling (812). Here, the same processing as the position / orientation estimation processing 806 is performed.
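A minimal sketch of wall-following control, which keeps the distance and posture relative to the wall constant as in mode 817, is a proportional law of the following form (the gains, the left-wall sign convention, and all names are assumptions):

```python
def wall_follow_cmd(wall_dist, wall_angle, target_dist=0.5,
                    v=0.3, k_dist=1.5, k_ang=2.0):
    """Keep the distance and posture to a wall on the left constant:
    steer toward the target offset and align the heading with the wall
    (illustrative proportional gains)."""
    omega = k_dist * (target_dist - wall_dist) - k_ang * wall_angle
    return v, omega

v_cmd, w_cmd = wall_follow_cmd(0.5, 0.0)  # on track: no turning needed
```

Only the two parameters available in a featureless corridor (distance to the wall and angle to the wall) enter the law, which is exactly why this mode can run where the longitudinal position is ambiguous.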
  • in FIG. 9, the return search range defined by the uniform pattern environment end data 20 is set as indicated by the broken line 96, and the direction of the return search range 96 is assumed to be set upward in the drawing. Suppose that the moving body 10, starting from the position / posture 10b, has finally reached the position / posture 10c while traveling in the wall-following travel mode 817. At this time, since the uniform pattern environment segment 95b is still selected, the uniform pattern environment end determination program 24g performs matching between the geometric shape data 34c and the map data 18 corresponding to the object existence area 60, in the same way as in position / orientation estimation. In this matching, however, the matching is based on the return search range 96, not on the normal search range used in position / orientation estimation.
  • the return search range 96 is provided as a rectangle of the shape and extent indicated by E, centered on the position D shown in FIG. 10, and is set to the posture of the arrow of segment F (segment 95b in FIG. 9).
  • the search range for the attitude of the uniform pattern environment in the return search range is set as an angle range indicated by G.
  • the position / posture that the geometric shape data can take when searching for a matching solution is the range swept by rotating, about each vertex of the return search range E, by an angle of G/2 to the left and right of the direction indicated by segment F, that is, the range obtained by superimposing the semicircles depicted in FIG. 10.
  • in this case, the geometric shape data 34c (FIG. 9) obtained by scanning with the moving body 10 at the position / posture 10c falls within the range where the semicircles shown in FIG. 10 overlap and, in addition, contains geometric features such as the front wall and corners, so the three position / orientation parameters are uniquely obtained.
  • when the geometric shape data 34c and the end portion of the uniform pattern environment are matched and the matching is established, more specifically, when the solution is uniquely determined when the geometric shape data 34c is superimposed on the map data 18 of the end portion of the uniform pattern environment, it is determined that the moving body 10 has reached the end of the uniform pattern environment (813). At this time, as the map data 18 used for matching, only the portion of the map that can be observed from the region set by the end data is used. If it is determined that the end of the uniform pattern environment has not been reached, the control of the moving mechanism for the wall following described above is continued (814).
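The uniqueness requirement behind this end determination can be illustrated with a simple check over candidate match scores (the near-maximum criterion below is an assumption; the specification only requires that the matching solution be unique):

```python
def solution_is_unique(scores, rel_tol=0.95):
    """Treat matching as established only when exactly one candidate
    pose scores close to the maximum overlap."""
    best = max(scores)
    if best == 0:
        return False  # nothing matched at all
    near_best = sum(1 for s in scores if s >= rel_tol * best)
    return near_best == 1

# Along a featureless corridor many poses match equally well,
# so the end is not yet reached; at the corridor end one pose dominates.
in_corridor = solution_is_unique([10, 10, 10, 10])  # False
at_end = solution_is_unique([3, 4, 12, 5])          # True
```

This captures why the front wall and corners matter: they break the tie among the candidate poses along the passage's longitudinal direction.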
  • the selected segment is recorded as being traveled, and the next segment is selected from the route data 19 (815).
  • the segment 95 c is next selected.
  • the end data may not be set depending on the segment.
  • the moving body 10 stops moving when the solution is not uniquely determined due to the influence of a person or an object not described in the map.
  • the moving body 10 performs position and orientation estimation even when it is stopped, and resumes operation when it is determined that there is no disturbance element such as a person and a solution is uniquely obtained.
  • as described above, the moving body 10 includes the distance sensor unit 12 that detects the distance between the device itself and surrounding obstacles, the controller unit 11 that estimates the position and orientation of the device based on the detection result of the distance sensor unit 12 and controls traveling, and the moving mechanism unit 21 that moves the device autonomously under the control of the controller unit 11; the following effects are thereby obtained.
  • that is, the controller unit 11 estimates the position and orientation of the device using the map (map data) of the traveling environment and the distance data (geometric shape data) that is the detection result of the distance sensor unit 12, and can switch the traveling mode between a state in which the position and orientation of the device are uniquely determined and a state in which they are not uniquely determined.
  • thereby, the moving body 10 can autonomously move in an environment that mixes an environment with many geometric features due to facilities, such as a factory, and an environment with few geometric features, such as a corridor.
  • further, the controller unit 11 can perform control so as to stop traveling in a segment for which no end data is set, and to travel in the wall-following travel mode in a segment for which end data is set. Specifically, end data is set for a uniform pattern environment, in which geometric features are scarce and not all position and orientation parameters can be obtained with high accuracy even when the geometric shape data and the map data are matched; in a segment for which no such end data is set, traveling can be stopped instead of continuing in the wall-following mode.
  • further, the controller unit 11 can perform control so that traveling is resumed when the moving body 10, having stopped in a segment for which no end data is set, returns to a state in which the position and orientation of the device are uniquely determined. Specifically, in order to switch from the wall-following travel mode to the general travel mode, it must be determined that an environment in which all position and orientation parameters can be obtained has been reached. For this reason, when the matching accuracy between the map data and the geometric shape data is high at the place where the environment with few geometric features switches to the environment with many geometric features, it can be determined that a switchable place has been reached, and the operation mode can be switched.
  • further, the controller unit 11 can perform control so as to keep constant, while traveling in the wall-following travel mode, the distance and orientation between the wall surface of the passage in which the device travels and the sensor of the distance sensor unit 12. That is, since two parameters, the position relative to the wall and the posture relative to the wall, can be obtained as the parameters other than the position in the longitudinal direction of the passage, traveling is possible using these two parameters.
  • further, the controller unit 11 can perform the end determination by comparing the end data of a uniform pattern environment, in which the position and orientation of the device are not uniquely determined, with the geometric shape data that is the detection result of the distance sensor unit 12. In this case, the controller unit 11 can perform the end determination using, as the end data, only shape data that can be observed from the end of the uniform pattern environment.
  • although the invention made by the present inventors has been specifically described based on the embodiment, the present invention is not limited to the embodiment, and it goes without saying that various modifications can be made without departing from the gist of the invention. The above embodiment has been described in detail for easy understanding of the present invention, and the invention is not necessarily limited to one having all the configurations described.
  • the present invention can be applied to a moving body that does not perform autonomous movement.
  • such a moving body has a configuration (for example, similar to FIGS. 1 and 2 described above) including a distance sensor unit that detects the distance between the device and obstacles around the device, and a controller unit that determines the position and orientation of the device based on the detection result of the distance sensor unit and controls traveling.
  • in this moving body, the controller unit can perform the end determination by comparing the end data of a uniform pattern environment, in which the position and orientation of the device are not uniquely determined, with the distance data that is the detection result of the distance sensor unit.
  • the controller unit can perform the end determination using only the shape data that can be observed from the end of the uniform pattern environment as the end data.


Abstract

The present invention addresses the problem of controlling an autonomous mobile object in a mixed environment comprising both a general environment with many geometric features (such as protrusions and recesses) and a uniform-appearance environment with few geometric features. A controller unit of the mobile object estimates the location and orientation of the mobile object using map data of the traveling environment and geometric shape data detected by a distance sensor, and switches the traveling mode according to whether the location and orientation of the mobile object are uniquely determined or not. More specifically, in a general environment, where the presence of many geometric features allows all location and orientation parameters to be computed with high accuracy by matching the map data and the geometric shape data against each other, automatic travel is performed in a general travel mode that uses all the parameters. In a uniform-appearance environment, where the absence of geometric features prevents all location and orientation parameters from being computed with high accuracy, automatic travel is performed in a wall-following mode that uses only the parameters obtained with high accuracy.
PCT/JP2015/055870 2014-03-19 2015-02-27 Objet mobile WO2015141445A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016508645A JP6348971B2 (ja) 2014-03-19 2015-02-27 移動体

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-055843 2014-03-19
JP2014055843 2014-03-19

Publications (1)

Publication Number Publication Date
WO2015141445A1 true WO2015141445A1 (fr) 2015-09-24

Family

ID=54144421

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/055870 WO2015141445A1 (fr) 2014-03-19 2015-02-27 Objet mobile

Country Status (2)

Country Link
JP (1) JP6348971B2 (fr)
WO (1) WO2015141445A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2015141445A1 (ja) * 2014-03-19 2017-05-25 株式会社日立産機システム Moving body
JP2017182175A (ja) * 2016-03-28 2017-10-05 国立大学法人豊橋技術科学大学 Autonomous traveling device and start position determination program therefor
CN108437833A (zh) * 2018-04-13 2018-08-24 山东时风(集团)有限责任公司 Automatic material transfer vehicle for dedicated warehousing and control method
WO2018233401A1 (fr) * 2017-06-20 2018-12-27 南京阿凡达机器人科技有限公司 Optoelectronic mouse sensor module-based method and system for creating an indoor map
US10930162B2 (en) 2016-06-13 2021-02-23 SZ DJI Technology Co., Ltd. Unmanned aerial vehicle, delivery system, control method for unmanned aerial vehicle, and program for controlling unmanned aerial vehicle

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109839936A (zh) * 2019-03-04 2019-06-04 中新智擎科技有限公司 Automatic navigation method in a large environment, robot, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0322108A (ja) * 1989-06-20 1991-01-30 Shinko Electric Co Ltd Mobile robot
JP2008059218A (ja) * 2006-08-30 2008-03-13 Fujitsu Ltd Self-position recovery method for an autonomously traveling robot
JP2013020345A (ja) * 2011-07-08 2013-01-31 Hitachi Industrial Equipment Systems Co Ltd Position and orientation estimation system for a moving body
JP2014006835A (ja) * 2012-06-27 2014-01-16 Murata Mach Ltd Autonomous mobile device, autonomous movement method, marker, and autonomous movement system
JP2014067223A (ja) * 2012-09-26 2014-04-17 Hitachi Industrial Equipment Systems Co Ltd Autonomous mobile body

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3333223B2 (ja) * 1991-11-29 2002-10-15 マツダ株式会社 Travel path recognition device for a moving vehicle
JP2005211442A (ja) * 2004-01-30 2005-08-11 Tottori Univ Autonomous mobile wheelchair
WO2013002067A1 (fr) * 2011-06-29 2013-01-03 株式会社日立産機システム Mobile robot and position/attitude self-estimation system mounted on a mobile body
JP6348971B2 (ja) * 2014-03-19 2018-06-27 株式会社日立産機システム Moving body


Also Published As

Publication number Publication date
JP6348971B2 (ja) 2018-06-27
JPWO2015141445A1 (ja) 2017-05-25

Similar Documents

Publication Publication Date Title
JP6074205B2 (ja) Autonomous mobile body
JP6348971B2 (ja) Moving body
JP7355500B2 (ja) Robot system and method for operating on a workpiece
US9244463B2 (en) Automated guided vehicle and method of operating an automated guided vehicle
JP5157803B2 (ja) Autonomous mobile device
KR100772912B1 (ko) Robot using an absolute azimuth and map creation method using the same
JP5800613B2 (ja) Position and orientation estimation system for a moving body
WO2010038353A1 (fr) Autonomous movement device
JP6825712B2 (ja) Moving body, position estimation device, and computer program
JP4682973B2 (ja) Movement route creation method, autonomous mobile body, and autonomous mobile body control system
US11747825B2 (en) Autonomous map traversal with waypoint matching
JP4735476B2 (ja) Autonomous mobile device
JP5805841B1 (ja) Autonomous mobile body and autonomous mobile body system
US20110112714A1 (en) Methods and systems for movement of robotic device using video signal
CA3045676A1 (fr) Robotic cleaning device with operating speed variation based on environment
JP4670807B2 (ja) Movement route creation method, autonomous mobile body, and autonomous mobile body control system
JP2010061484A (ja) Moving body and method for recovering from a position estimation error state of the moving body
JP5212939B2 (ja) Autonomous mobile device
JP5439552B2 (ja) Robot system
CN108363391B Robot and control method thereof
JP5427662B2 (ja) Robot system
JP7396353B2 (ja) Map creation system, signal processing circuit, moving body, and map creation method
WO2021246170A1 (fr) Information processing device, information processing system and method, and program
JP6863049B2 (ja) Autonomous mobile robot
WO2023089886A1 (fr) Travel map creation device, autonomous robot, travel map creation method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15764740

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2016508645

Country of ref document: JP

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 15764740

Country of ref document: EP

Kind code of ref document: A1