WO2021254975A1 - Method for operating a mobile device

Method for operating a mobile device

Info

Publication number
WO2021254975A1
WO2021254975A1 (PCT/EP2021/066003)
Authority
WO
WIPO (PCT)
Prior art keywords
map
mobile device
robot
determined
path
Prior art date
Application number
PCT/EP2021/066003
Other languages
English (en)
Inventor
Christian Martin
Christof Schröter
Robert Arenknecht
Christian STERNITZKE
Johannes Trabert
Original Assignee
Metralabs Gmbh Neue Technologien Und Systeme
Priority date
Filing date
Publication date
Application filed by Metralabs Gmbh Neue Technologien Und Systeme filed Critical Metralabs Gmbh Neue Technologien Und Systeme
Priority to EP21739566.4A (published as EP4208763A1)
Publication of WO2021254975A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser

Definitions

  • the invention relates to a method for operating a mobile device which can move autonomously and comprises a drive device and at least one sensor device. Furthermore, the invention relates to a mobile device which can move autonomously.
  • Such mobile devices are, for example, robots.
  • Autonomous mobile robots are becoming increasingly important.
  • Such robots can be integrated into everyday life, such as vacuum cleaning robots, or they can be used to perform tasks in commercial operations, for example in warehouses or department stores.
  • a mobile device can also be an autonomously driving vehicle, for example.
  • the mobile device should be able to perform certain tasks, such as transporting an object, by means of navigation.
  • a goal-oriented, planned action requires knowledge about the environment of the mobile device.
  • Such knowledge is stored in so-called maps.
  • maps can be designed in different ways.
  • such a map can contain geometric information about the environment of the mobile device or the area in which the mobile device moves.
  • Such a map can be given to the mobile device, for example.
  • the mobile device can create a map by a movement of the device in the area and further data, for example data of a sensor device and/or an odometry device.
  • the object of the invention is to provide a method of operating a mobile device that can move autonomously, which provides improved navigation.
  • a method for operating a mobile device which can move autonomously and comprises a drive device and at least one sensor device, the method comprising the following steps: a) provision of a first map of an area, or creation of a first map of an area by means of a method for simultaneous position determination and map creation (SLAM) with detection of first environmental features of the area by the at least one sensor device; b) determination of at least two waypoints on the first map; c) tracing a skeleton path connecting the waypoints by the mobile device; d) creating a second map by detecting second environmental features of the area by the at least one sensor device while traversing the skeleton path, and matching a position of the mobile device along the skeleton path by comparing the second map with the first map; e) aligning the second map with respect to the first map based on the first and second environmental features; wherein steps b), d), and e) are performed by a computer in the mobile device and/or by an external server communicating with the mobile device via an interface.
  • the method may be a navigation method.
  • the mobile device which can in particular be a mobile robot, is configured in such a way that it can detect and evaluate its environment by means of SLAM (Simultaneous Localization and Mapping).
  • This computer-implemented method allows an environment map to be repeatedly created in the same coordinate system.
  • Such a navigation method further enables precise navigation in the area in which the mobile device is moving.
  • the terms mobile device and robot are used as synonyms.
  • the first map comprises the first environment features and the second map comprises the second environment features.
  • the first map and/or the second map are stored in a storage device associated with the mobile device or the external server. It would also be conceivable to have an external storage device that can be accessed by the computer of the mobile device and/or the external server.
  • a zero point is defined in the first map between steps b) and c).
  • the zero point is established at a prominent position of the area.
  • a prominent position may be a mark on the ground, an intersection or the like.
  • a coordinate system is defined in the first map, the coordinate origin of which corresponds to the defined zero point.
  • the coordinate system comprises two main axes (X, Y).
  • one main axis can be selected in such a way that it corresponds to a preferred direction of travel or a prominent extension of an environmental feature of the area. This can be, for example, a main direction of travel parallel to a wall of a room, the orientation of a corridor, etc.
  • the simultaneous position determination and map generation (SLAM) method in step a) is graph-based.
  • the method for simultaneous position determination and map generation comprises the creation of nodes and edges.
  • the nodes are created along the traversed route in the map. These nodes are connected in the map by edges.
  • the mobile device comprises an odometry unit by means of which a position and/or an orientation of the mobile device can be detected on the basis of the performed or traveled movement of the mobile device.
  • when the mobile device reaches an already known node, a comparison of a calculated position with the stored position of the known node is performed.
  • the mobile device detects the node by comparing the detected first environmental features with first environmental features stored in the first map.
  • the calculated position is determined based on data from an odometry unit.
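  • To make this loop-closure check concrete, the following minimal Python sketch compares the odometry-based pose estimate with the pose stored for an already known node; the pose format and the helper name are illustrative assumptions, not the patent's implementation.

```python
import math

def loop_closure_error(calculated_pose, stored_pose):
    """Deviation between the odometry-based pose and the stored node pose.
    Poses are (x, y, theta) tuples; a hypothetical helper for illustration."""
    dx = calculated_pose[0] - stored_pose[0]
    dy = calculated_pose[1] - stored_pose[1]
    dtheta = abs(calculated_pose[2] - stored_pose[2]) % (2 * math.pi)
    return math.hypot(dx, dy), min(dtheta, 2 * math.pi - dtheta)

# Example: revisiting a known node after some odometric drift.
trans_err, rot_err = loop_closure_error((10.3, 5.1, 0.05), (10.0, 5.0, 0.0))
```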
  • the waypoints are defined manually and/or automatically.
  • the automatic determination is performed by the computer of the mobile device and/or the external server.
  • the waypoints are determined in such a way that the waypoints are distributed substantially uniformly in the area.
  • a uniform distribution of the waypoints means, for example, an ideally equal spacing of the waypoints on the skeleton path.
  • the waypoints are determined in such a way that the waypoints are placed at positions of the area in which environmental features are detectable which are essentially invariable in temporal and/or spatial terms.
  • environment features can be stationary objects in the area such as columns, walls or pieces of furniture.
  • an approximate position of the mobile device on the first map is first determined by means of an odometry unit during the position matching in step d).
  • the matching of the position of the mobile device in step d) is then performed by comparing the first and second environment features.
  • this involves Monte Carlo localization of the mobile device, which enables detailed position determination.
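  • As a rough illustration of such a Monte Carlo localization, one predict/weight/resample cycle of a standard particle filter could look as follows in Python; the motion-noise values and the measure_likelihood callback are assumptions made for the sketch.

```python
import random

def monte_carlo_step(particles, odom_delta, measure_likelihood):
    """One cycle of standard Monte Carlo localization (sketch only).
    particles: list of (x, y, theta); odom_delta: odometry increment."""
    # Predict: apply the odometry increment with some motion noise.
    moved = [(x + odom_delta[0] + random.gauss(0, 0.02),
              y + odom_delta[1] + random.gauss(0, 0.02),
              th + odom_delta[2] + random.gauss(0, 0.01))
             for (x, y, th) in particles]
    # Weight: how well the sensed environment features match the map pose.
    weights = [measure_likelihood(pose) for pose in moved]
    total = sum(weights) or 1.0
    # Resample particles in proportion to their weights.
    return random.choices(moved, weights=[w / total for w in weights],
                          k=len(particles))
```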
  • the second map is generated by means of a simultaneous position determination and map generation method (SLAM). It is advantageous if a position determination or localization is carried out on the first map at the same time.
  • a coordinate system is defined in the second map.
  • this coordinate system also comprises two main axes and a coordinate origin.
  • the coordinate system of the second map is synchronized with the coordinate system of the first map. Accordingly, the two maps are synchronized.
  • a new map is created which has the same coordinate system as the first map.
  • new maps can be created permanently, which all show the same coordinate system. Even in case of major changes in the environment, the mobile device can determine a correct position and orientation by simply traversing the skeleton path.
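  • One common way to compute such an alignment from matched environment features is a 2D rigid (Procrustes/Kabsch) fit, sketched below with NumPy; the source does not specify the estimator, so this is an assumed stand-in.

```python
import numpy as np

def align_maps(features_new, features_ref):
    """Rigid transform (rotation R, translation t) that maps matched
    features of the new map onto the reference map: p_ref ~ R @ p_new + t."""
    A = np.asarray(features_new, dtype=float)
    B = np.asarray(features_ref, dtype=float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t
```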
  • the at least one sensor device comprises at least one LIDAR sensor unit and/or at least one optical sensor unit and/or at least one radar sensor unit and/or at least one inertial sensor unit. Accordingly, the environment can be detected by means of laser data and/or camera data and/or radar data and/or inertial sensor data.
  • other sensor units that are suitable for repeatedly detecting certain environmental features are also conceivable.
  • the mobile device moves along a further path which, at least in sections, does not correspond to the skeleton path.
  • the movement along the further path creates at least a third map of a section of the area away from the skeleton path.
  • the mobile device can perform predetermined tasks.
  • Such tasks can be, for example, the transport of objects or the like.
  • the creation of the at least one third map off the skeleton path is performed using a simultaneous position determination and map creation (SLAM) method.
  • an error probability is determined during the generation of the third map within the framework of methods for simultaneous position determination and map generation (SLAM).
  • the determined error probability is compared to an error probability threshold, and steps c) to e) are performed if the error probability threshold is exceeded.
  • the mobile device can travel the skeleton path until the error probability between its detected position and the stored waypoints of the skeleton path falls below the error probability threshold. The mobile device can then move away from the skeleton path again.
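  • The resulting behavior can be summarized by the following control-loop sketch; every robot/path method named here is a hypothetical API used only to make the sequence explicit.

```python
def operate(robot, skeleton_path, error_threshold):
    """Work off the skeleton path; return to it whenever the
    localization error probability exceeds the threshold (sketch)."""
    while robot.has_tasks():
        robot.work_off_path()   # mapping and tasks away from the skeleton path
        if robot.error_probability() > error_threshold:
            robot.drive_to(skeleton_path.nearest_waypoint(robot.pose()))
            # Follow waypoints until localization confidence is restored.
            while robot.error_probability() > error_threshold:
                robot.follow(skeleton_path.next_waypoint())
```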
  • the mobile device comprises a drive device and at least one sensor device.
  • the mobile device is a mobile robot.
  • this device can be equipped with all the features already described above in the context of the method, either individually or in combination with one another, and vice versa.
  • the task is further solved by a system comprising a mobile device and an external server.
  • the external server can communicate with the mobile device via a wireless connection.
  • This device can be equipped with all the features described above in the context of the method and/or the mobile device individually or in combination with each other and vice versa.
  • the task is further solved by a method for operating a mobile device which can move autonomously and comprises a drive device and at least one sensor device, the method comprising the following steps: a1) providing a first map of an area; b1) path planning based on the map; c1) detection of at least one obstacle by the at least one sensor device; d1) evaluation of the data of the sensor device, wherein the evaluation is carried out on the basis of at least one predetermined obstacle parameter and with regard to the possibility of avoiding the obstacle by a path change; e1) modification of at least one obstacle parameter by a predetermined value if it is not possible to bypass the obstacle by a path change; wherein steps a1), b1), d1) and e1) are performed by a computer in the mobile device and/or by an external server which communicates with the mobile device via an interface.
  • This method can be equipped with all the features already described above in the context of the first method, either individually or in combination with each other, and vice versa. It can be a navigation method.
  • the map can be created in step a1) by the first method described above with steps a) to e).
  • the mobile device can move between different obstacles, depending on the environment.
  • the at least one sensor device may be subject to noise, which implies that sometimes distances to obstacles cannot be determined accurately.
  • the mobile device may pass a bottleneck which it must pass again at a later time; due to the noise, however, passing may then not be possible because of the defined minimum distances to obstacles, since the distances to the obstacles cannot be detected exactly.
  • the computer-implemented method described above still makes it possible to pass the bottleneck at this point.
  • the evaluation of the data of the sensor device is modified accordingly.
  • obstacle parameters are modified incrementally until it is possible to pass the obstacle.
  • the at least one obstacle parameter comprises a protective field depth, which specifies a minimum distance to an obstacle.
  • the protective field depth is dependent on the speed of the mobile device.
  • the change of the at least one obstacle parameter comprises a reduction of the protective field depth.
  • this change of the at least one obstacle parameter can be performed incrementally several times.
  • the protective field depth can be reduced incrementally up to a predetermined limit value.
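  • A minimal sketch of this incremental reduction, assuming an illustrative 2.5 cm step and a 5 cm limit (the text elsewhere mentions a 10 cm to 5 cm adjustment); passage_is_clear() is a hypothetical stand-in for the actual sensor evaluation.

```python
def try_pass_with_reduced_field(depth_m=0.10, min_depth_m=0.05, step_m=0.025):
    """Reduce the protective field depth step by step until the bottleneck
    is passable or the predetermined limit value is reached (sketch)."""
    while True:
        if passage_is_clear(field_depth=depth_m):   # hypothetical check
            return depth_m                          # passable at this depth
        if depth_m <= min_depth_m:
            return None                             # limit reached, still blocked
        depth_m = max(min_depth_m, depth_m - step_m)
```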
  • the modification of the at least one obstacle parameter in step e1) comprises an at least partial deactivation of the at least one sensor device.
  • the at least one sensor device comprises one or more sensor units. Accordingly, at least a subset of the sensor units can be deactivated.
  • the evaluation of the data of the sensor device is modified according to the at least partial deactivation of the at least one sensor device.
  • the at least one sensor device comprises a sensor unit comprising a pressure-sensitive bumper and/or a close-range sensor unit.
  • the at least partial deactivation of the at least one sensor device comprises a deactivation of the sensor unit comprising a pressure-sensitive bumper and/or the close-range sensor unit.
  • the modification of the at least one obstacle parameter in step e1) comprises a reduction of the speed of the mobile device.
  • the speed of the mobile device is an obstacle parameter.
  • the method comprises using a path that was planned in step b1).
  • the method comprises the use of a path that the mobile device has followed to reach its current position.
  • the mobile device can use a path, on which it has successfully passed an obstacle or a bottleneck, in the opposite direction.
  • this path is calculated from the odometry data acquired by an odometry device.
  • the mobile device reduces the range of protective fields that specify minimum distances to obstacles and, on the same path that it has taken, passes a narrow point that it could not otherwise pass because the sensor-detected distances are not interpreted as sufficiently large.
  • the mobile device comprises a drive device and at least one sensor device.
  • the mobile device is a mobile robot.
  • the task is further solved by a system comprising a mobile device and an external server.
  • the external server can communicate with the mobile device via a wireless connection.
  • This device can be equipped with all features described above in the context of the methods and/or the mobile device individually or in combination with each other and vice versa.
  • the task is further solved by a method for operating a mobile device which can move autonomously and comprises a drive device and at least one sensor device, the method comprising the following steps: a2) Optionally, providing a first map of an area; b2) Detection of at least one unevenness of the ground of the area by means of the at least one sensor device; c2) evaluation of the sensor data generated in step b2) by the computer of the mobile device or an external server which communicates with the mobile device via an interface; d2) entry of the data on the detected at least one ground unevenness in a map.
  • the method can be equipped with all the features already described above in the context of the first two methods, either individually or in combination with each other, and vice versa.
  • the method may be a navigation method.
  • the map in step a2) can be created by the first method described above with steps a) to e).
  • This computer-implemented method makes it possible to map unevenness in the ground on which the mobile device is moving and, advantageously, to adjust the robot's route selection accordingly when navigating, taking into account the detected unevenness. If there are unevennesses in the ground on which the mobile device is moving, the movement can be adjusted after detecting them and thus prevent damage to the mobile device. Such an adjustment can be made, for example, by reducing the speed or by changing the path along which the mobile device moves. By making an entry in the map, the mobile device can also take this unevenness into account in the future.
  • the mobile device advantageously further comprises a memory which can be accessed by the computer and at least three wheels which are driven by a drive device and an odometry unit for determining the position of the mobile device.
  • the at least one sensor device comprises at least one inertial sensor unit.
  • step b2) comprises determining the orientation of the at least one inertial sensor unit.
  • the spatial position of the mobile device is detected.
  • the inertial sensor unit is located at a height above the ground that corresponds at least to the distance between two wheels of the mobile device.
  • in step c2) a height difference is determined based on an evaluation of the orientation of the at least one inertial sensor unit.
  • the evaluation of the orientation of the at least one inertial sensor unit comprises the evaluation of at least one inclination that results from the fact that at least one ground contact point of the mobile device is located below or above a reference plane.
  • at least one ground support point is represented by a wheel of the mobile device. It is further advantageous that the height difference of the ground unevenness with respect to the reference plane is determined.
  • the height difference of an unevenness of the ground is determined by triangulation.
  • Triangulation is based on a known distance between two points, for example the distance between two ground contact points of two parallel wheels of the mobile device.
  • the height difference can then be determined by applying trigonometric functions.
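  • A worked example of this triangulation, assuming one wheel rests on the obstacle and the other on the reference plane, so that the wheel base and the measured inclination span a right triangle (all values illustrative):

```python
import math

def obstacle_height(wheel_base_m, roll_rad):
    """Height of the ground unevenness from the measured inclination:
    h = wheel_base * sin(roll), with the wheel base taken between the
    two ground contact points."""
    return wheel_base_m * math.sin(roll_rad)

# Example: 0.40 m between the wheel contact points and 3.6 degrees of
# roll measured by the inertial sensor unit -> roughly a 25 mm edge.
h = obstacle_height(0.40, math.radians(3.6))
```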
  • the position of the ground contact point deviating from the reference plane is determined relative to the spatial zero point in the map. This can be done, for example, by evaluating the data of an odometry device.
  • the acceleration of the mobile device is further determined in step b2). This is preferably done by the inertial sensor unit.
  • a corresponding acceleration results from the mobile device hitting the uneven ground.
  • the acceleration data are advantageously stored and provide an indication of the effort required to overcome the unevenness.
  • in step b2) the ground or a ground unevenness is detected visually.
  • the data of the at least one optical sensor unit are advantageously evaluated texture-wise in step c2).
  • a textural evaluation comprises the recognition of specific textures, for example patterns or shapes, in the optically recorded data.
  • the detected textures are advantageously assigned position data or coordinates with respect to the coordinate system in the map used. This assignment is done, for example, by determining the position of the captured textures relative to the image section captured by the sensor, whereby the coordinates of the image section are determined from the known coordinates of the mobile device.
  • the evaluation in step c2) further comprises a comparison of the position of at least one determined unevenness with the position of at least one detected texture. Furthermore, it is advantageous that in step c2) a comparison of two determined height differences takes place if the distance between the positions of these height differences is below a threshold value.
  • an interpolation of determined height differences between two positions is performed if the distance between the positions of the height differences is below a threshold value.
  • Such an interpolation can be, for example, a connection of the two positions with a line, which is shown in a map.
  • the interpolation can be based on texture similarities.
  • ground unevenness is detected between two determined positions of ground unevenness if the distance between the two positions exceeds a threshold value.
  • ground unevenness is detected between two determined positions of ground unevenness if the difference between two determined height differences is above a threshold value.
  • a detection of ground unevenness also takes place between two determined positions if the two positions have textures whose similarity to each other exceeds a threshold value, but height differences that are below a threshold value.
  • in step d2) the data can be entered into the map provided in step a2).
  • a separate map with the data on the at least one ground unevenness can be created.
  • the data for the detected at least one ground unevenness can be transformed into a map with velocities stored for an area.
  • the computer of the mobile device and/or the external server plans a path along which the mobile device moves through the area.
  • the data of the detected ground unevenness is taken into account in this path planning.
  • the detected ground unevenness and/or variables derived therefrom are taken into account as cost functions in the path planning and/or movement planning.
  • the mobile device can be reliably controlled even if its sensor system cannot detect obstacles on the ground in advance.
  • the speed or path of the mobile device can be adjusted if the map of the mobile device shows an unevenness on the ground. The mobile device can thus avoid hitting the obstacle, or can pass it at a reduced speed without being subjected to excessive vibrations and thereby being damaged and/or tipping over.
  • the created maps with ground unevenness can advantageously include different unevenness classes, e.g. one class for edges up to a height of 5 mm, another class for edge heights of 5-10 mm, and so on.
  • the unevennesses may also be classified differently, e.g. with respect to their steepness.
  • the unevennesses can also be converted into a maximum speed map. This results in a method for adapting the speed of a mobile device, comprising planning a path of the mobile device and comparing the path with ground irregularities stored in a map.
  • the mobile device comprises a drive device and at least one sensor device.
  • the mobile device is a mobile robot.
  • the task is further solved by a system comprising a mobile device and an external server.
  • the external server can communicate with the mobile device via a wireless connection.
  • This device can be equipped with all features described above in the context of the methods and/or the mobile device individually or in combination with each other and vice versa.
  • Fig. 1 shows a system view of the robot; Fig. 2 shows an architectural view of the system.
  • mobile device and robot shall be used synonymously without limitation of generality.
  • a system is shown which comprises the mobile device 1 or the robot 10 and an external server.
  • In Fig. 1 the system view of a mobile device such as a robot 10 with a mobile base 1 and an upper part 9 is sketched, while the system architecture is shown in Fig. 2.
  • the illustration is mainly exemplary.
  • the robot 10 has a hardware level 120 and a software level 100.
  • the software is e.g. stored in a memory 2, which is connected to a processing unit 3.
  • the robot 10 has a drive unit 6, e.g. designed as differential drive 132, as well as e.g. at least one drive wheel 4 and optionally at least one support wheel 5.
  • the mobile base 1 and/or the robot 10 can be rotationally symmetric, e.g. by arranging two drive wheels 4 on a centered axle of the mobile base 1.
  • the mobile base 1 is supplied by an energy storage 8, e.g. an accumulator.
  • This energy storage 8 can be connected to a charging device at least via a charging interface 7 to recharge the energy storage.
  • a charge control 133 is implemented.
  • the robot 10 has at least one sensor unit for navigation purposes. This is a LIDAR 124 and/or a 2D and/or 3D camera 125, e.g. an RGB-D camera.
  • the LIDAR 124 is e.g. arranged in such a way that it detects the environment of the mobile base 1 or the robot 10 in the usual direction of travel.
  • the data is evaluated locally on a processing unit 3, for example, and in one aspect is transferred to the cloud 25 for evaluation purposes.
  • Additional LIDARs 124 may be installed, which are oriented backwards, for example.
  • radar sensors, ultrasonic sensors, time-of-flight (ToF) sensors or other close-range sensors are used.
  • as the camera 125, speckle cameras, stereo cameras or ToF cameras can be used, e.g. an Orbbec Astra or a Microsoft Kinect, or a 2D camera like an RGB camera or a monochrome camera. It is also possible to use more than one camera, whereby these can be oriented in different directions, for example.
  • a pressure-sensitive bumper 122 (or combined rubber-buffered safety edges with impact protection) is located in one aspect at a distance of, for example, more than 10 millimeters above the floor of mobile base 1.
  • ToF sensors and/or so-called close-range LIDAR/radar/ultrasound sensors can also be used as distance sensors.
  • These are connected with the motor control 137 and, in one aspect, with the processing unit 3, and serve for collision detection. Thus, in case of a collision, the drive unit 6 is stopped immediately.
  • the robot 10 is connected via an interface to an external system 24 that receives data collected by robot 10 and/or contains instructions for the robot 10.
  • the data can be evaluated by this external system 24 in one aspect, in another aspect it can be made available to another system via an interface for further processing.
  • in this external system 24 there is an external processing unit 22 and an external memory 23.
  • the data can be e.g. position information, maintenance information such as running time, battery charge status, operating hours, etc.
  • This data can be in the form of log files, for example.
  • operations can also be performed that could alternatively run on the robot 10 such as navigation or user interaction.
  • the instructions stored in the external system 24, i.e. especially there in the external memory 23, can be applications running on robot 10 and distributed to several robots 10 via the external system 24, e.g. to keep the instructions consistent by updates in a fleet of robots 10.
  • the data, instructions and/or applications stored in the external system 24 and external memory 23 can be accessed via a terminal 26, for example. This can be a permanently installed device or a mobile device such as a cell phone or tablet.
  • the terminal 26 can also be used to send instructions to the robot 10, such as taking up a position, operating modes, etc.
  • Fig. 2 shows an architectural view.
  • at the hardware level 120 there is an odometry unit 121, which determines the distance travelled by the robot 10 by measuring the distance covered by the driving wheels 4.
  • sensors for determining the angle of rotation are provided either on a motor (and here e.g. the axle) and/or on the drive wheels 4 or a possibly existing gear box.
  • the odometry unit 121 is configured in such a way that the distance travelled can be determined by determining the angle in conjunction with the diameter of the drive wheels 4 and any intermediate gear ratios.
  • These can be Hall sensors, encoders, stroboscopic tachometers, tachogenerators, induction sensors and/or a Wiegand sensor, etc.
  • the encoders can read out more than 1024 steps per revolution (by light barrier), the Hall sensors 24 steps.
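  • The distance computation described here reduces to encoder counts, wheel diameter and gear ratio; a short sketch with illustrative values (not taken from the patent):

```python
import math

def wheel_distance(ticks, ticks_per_rev, wheel_diameter_m, gear_ratio=1.0):
    """Distance covered by one drive wheel from encoder ticks."""
    revolutions = ticks / (ticks_per_rev * gear_ratio)
    return revolutions * math.pi * wheel_diameter_m

# Example: a 1024-step encoder and a 0.15 m wheel; 2048 ticks are two
# revolutions, i.e. about 0.94 m of travel.
d = wheel_distance(2048, ticks_per_rev=1024, wheel_diameter_m=0.15)
```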
  • Pressure-sensitive bumpers 122 and (optionally) infrared ToF sensors 123 are hardware-level mechanisms for collision detection, as is e.g. the LIDAR 124, which, however, is also used for navigation; alternatively and/or in addition, the camera 125 can be used for navigation.
  • This can be a 2D or 3D camera, in one aspect an RGB-D camera.
  • the robot 10 can have alarm lights 126 and/or a sound generator/loud speaker 134.
  • An optional display 129 and controls 127 allow the operation of the robot 10, which is also equipped with a communication interface 130 (referred to simply as WLAN in Fig. 2).
  • the communication interface 130 can be used, for example, for information exchange and/or information transfer from and/or to the external system 24, for information exchange and/or information transfer from and/or to at least one other robot 10 and/or for information exchange and/or information transfer from and/or to at least one other device such as a warning system, a gate, etc.
  • the type of information transmitted and/or exchanged is not limited. It can be navigation information, application specific information, etc.
  • a differential drive 132 is responsible for moving the drive wheels 4.
  • a charging control 133 is configured to charge the internal energy storage 8.
  • a motor control 137 links the sensor elements with the differential drive 132 and ensures that the speed and the planned travel trajectory are adjusted in case of obstacle detection.
  • An inertial sensor unit 138 is used for motion control and navigation and is e.g. integrated in the motor control.
  • the software level 100 comprises a navigation module 110, which includes a 2D/3D environment perception module 111 for 2D and/or 3D environment perception and a path planning module 112, with which the robot 10 can efficiently calculate its own path to be covered and evaluate it with regard to certain criteria in its effort.
  • the navigation module 110 is designed, among other things, to determine the position of the robot 10. This can be done e.g.
  • the navigation module 110 includes a motion planner 115, which uses among other things the results of the path planning from the path planning module 112 and calculates an optimal path for the robot 10 under consideration and/or optimization of different cost functions.
  • the cost functions are the data from the path planning, obstacle avoidance, preferred direction of travel, etc.
  • the robot 10 is equipped with a mapping module 117 for mapping its environment, as well as a charging module 118 for automatic recharging at low energy levels in the energy storage 8.
  • the latter means that the robot 10 automatically visits a charging station, e.g. when the voltage of the accumulator falls below a defined threshold value.
  • the robot 10 also has a map module 119, in which maps of its environment are stored, e.g. after being created by the mapping module 117.
  • the robot 10 creates node points 3401 along the route it travels and detects the environment (e.g. obstacles, see 3403) at these node points 3401.
  • the nodes 3401 are again connected by edges 3402.
  • if the robot 10 detects an environment which it has already recorded (i.e., if the graph created from node points 3401 and edges 3402 forms approximately a closed loop), the robot detects a possible deviation of the created nodes 3404 (these nodes 3401 and edges 3402 are shown as dashed lines) on the basis of an environment evaluation if, as illustrated, the position reached at a later point in time does not match the initially stored position with regard to the evaluated odometric data.
  • the map created in SLAM is aligned according to the local conditions and a zero point for the coordinate system of the map is defined.
  • the X and Y coordinates are defined, whereby the X direction corresponds e.g. to the main driving direction of the robot 10.
  • the robot 10 is aligned with its main direction of travel parallel to the walls of a room, the orientation of a corridor, etc.
  • the zero point itself is ideally associated with a prominent position such as a mark on the floor, an intersection, etc. Alignment and zero point setting can be done manually, e.g. during initial setup of the robot 10 in its operational environment.
  • if a map is created automatically by the robot 10 using SLAM, it may happen that this recorded map differs from maps previously recorded in the same environment, e.g. due to rotations of the robot 10, unevenness in the path that influences the measurements, etc., so that misalignment (rotation) and/or mispositioning (with regard to the zero point) may exist.
  • the skeleton path includes these defined waypoints 3405, which are connected by lines.
  • the paths can, for example, be the main paths that lead through a department store and from which customers can reach different areas. In smaller department stores, for example, this main path leads in a circle.
  • Such a skeleton path along a main path is shown in Fig. 3, where the dotted line is the skeleton path. In larger department stores, the skeleton path can cover several blocks. An example of this is shown in Fig. 4, where the main path is much wider than the robot 10 (e.g. at least twice as wide) and there are few obstacles.
  • the skeleton path should cover the basic structure of the operational environment, i.e. the main path should ideally cover substantial parts of the area to be mapped and/or driven, in broad strokes.
  • major changes can occur over time from the perspective of the robot, i.e. obstacles and objects can shift, which can mean that, in view of a retail area, there may be spatial changes such as new shelves and/or product arrangements etc.
  • a two-phase process then takes place.
  • in later operation (after the robot 10 has mapped the area or received the map from another system), the robot 10 first follows the skeleton path with its waypoints 3405 after self-localization on the map.
  • the robot 10 captures the environment of the skeleton path and compares it with at least one map created during a previous run along the skeleton path and the environment stored in it, whereby a position estimate along the skeleton path is made.
  • the robot 10 determines its approximate position on the map, e.g. by means of the odometry unit 121, and then, as part of a detailed position estimation, compares the obstacles in the vicinity of the skeleton path, which were recorded and detected on the basis of laser, camera, radar and/or inertial sensor data, with obstacles or objects along the skeleton path stored in a map, whereby e.g. a standard Monte Carlo localization is used.
  • a new map is constructed using SLAM (e.g. the Monte Carlo localization gives SLAM approximately the position of a node 4301 on the skeleton path) and at the same time the localization (determination of the own position) is performed on the old map and the skeleton path.
  • the newly created map is aligned with the previously created maps.
  • similarity values are determined between the respective created maps, which at the same time represent an error probability.
  • the error probabilities are minimized or the similarity values are maximized by turning/distorting the newly created map.
  • a new map is created by following the skeleton path, which is located in the same coordinate system as the original map. This ensures that maps can be automatically created and used repeatedly in the same coordinate system.
  • the standard Monte Carlo localization, for example, is then switched off, and the robot 10 then also moves off the skeleton path.
  • the robot 10 can also detect major changes, but since the adjustment via the skeleton path in the first phase has already been performed and matches the original map, the (correct) position and orientation of the map is maintained in the second phase.
  • the position estimate may become too uncertain. This uncertainty is determined by the robot 10 comparing the acquired environment with a previously stored environment at the created node 3401, whereby similarities between the maps are determined and the similarity represents a probability of error. If an error probability threshold is determined to be exceeded, the robot 10 then returns to the skeleton path. It then travels along this skeleton path until the error probability drops below the error probability threshold. Then it continues its journey off the skeleton path.
  • a mapping of an area is done by creating a first map using SLAM, in one aspect graph-based SLAM.
  • nodes 3401 are created, as well as edges 3402 between the nodes 3401.
  • an error probability for the position of the nodes 3401 is determined (step 3015) and a correction of the node position based on the error probability 3020 is performed.
  • the zero point of the robot 10 is determined (step 3025), e.g. based on a prominent position on the map, and in step 3030 its orientation in X-Y direction.
  • waypoints 3405 along a skeleton path are defined 3035 at positions that change little over time (e.g. with respect to objects in their surroundings) and that are also widely distributed over the operational environment of the robot 10.
  • a self-localization of the robot 10 takes place and the robot 10 travels along the defined skeleton path (step 3040), while it carries out a Monte Carlo localization 3045. In doing so, it captures its environment and creates a second map.
  • the second map is compared to the previously recorded environment of the first map based on the environment of the skeleton path and aligned based on identified features of the environment, using coordinates of identified obstacles for alignment 3050. This synchronizes the coordinate systems of the first and second map.
  • Monte Carlo localization is deactivated, for example, and the robot 10 maps the area off the skeleton path (step 3055), whereby the robot 10 can also perform tasks alongside the mapping (e.g. surveying objects, transport tasks, etc.). If the robot 10 detects that an error probability threshold value between created nodes 3401 is exceeded (step 3060), it moves back to the skeleton path (i.e., performs steps 3040 and following) and travels along the skeleton path until the error probability between its detected position and the waypoints 3405 of the skeleton path has fallen below the error probability threshold value. It then continues its journey away from the skeleton path.
  • the robot 10 thus creates a map by SLAM. Then, for example, a so-called skeleton path is created manually by marking certain points on the map. This skeleton path covers a part of the drivable area which tends to change little (e.g. main paths) and which the robot 10 recognizes via its sensors during subsequent trips in order to synchronize or merge a newly recorded map with a previously recorded map.
  • When driving through a dynamic operational environment, it may happen that the robot 10 has to pass unexpected bottlenecks, e.g. on a route specified by the path planning module 112, which are caused by temporarily existing or newly added obstacles and which make it necessary to change the path based on a path determination by the path planning module 112.
  • the robot 10 keeps a certain minimum distance to obstacles, which are detected by the sensor technology of the robot 10. This minimum distance is defined e.g. by protective fields, which can be speed-dependent. Due to noise of the sensor measuring values and/or due to disturbances it happens that obstacles are detected with a small difference to the true position. Due to these measurement deviations, it happens that the robot 10 passes narrow places, which normally should not be passed.
  • the robot 10 may drive into areas behind such a narrow passage that cannot be left again, e.g. because it is a dead end and because, when this narrow passage is approached again, it is recognized as such (showing a width to be passed that is below a pre-defined threshold value, selected e.g. for safety reasons with respect to distances to obstacles) and the passage is not passed. This means that in such cases no new path can be determined based on detected obstacles.
  • the robot 10 can also be on a circular path and have already passed the location in the past, but now find an obstacle there.
  • the obstacle avoidance is deactivated.
  • Deactivation of obstacle avoidance means, for example, that certain protective fields of a distance sensor such as a laser scanner are deactivated, i.e. if these protective fields have a depth of 10 cm, for example, and the robot stops when an obstacle appears within this depth, this protective field can be deactivated and/or the depth of the protective field can be adjusted, i.e. the depth can be reduced (e.g. from 10 to 5 cm).
  • a reduction of the depth of the protective field is accompanied by a reduced speed of the robot 10, which can be only a few millimeters per second, for example.
  • the last path travelled, e.g. stored in memory 2 of the robot 10, is defined as the new path.
  • the robot 10 e.g. falls back on the path which was determined by means of odometry unit 121. It is also possible to use combinations of these.
  • the procedure is summarized as follows in Fig. 6:
  • the robot 10 plans a path based on a map 3105. It detects an obstacle 3110 on its route, which is e.g. given by a path planning module 112.
  • the robot 10 tries to replan the path in the path planning module 112, but the planning fails due to obstacle detection 3115, i.e. new paths are determined, but they are not passable due to detected obstacles.
  • an adaptation of the obstacle sensor evaluation takes place, e.g. a reduction of the protective field depth and/or a protective field deactivation 3125 (whereby e.g. a deactivation of the sensor system of a pressure-sensitive bumper 122 or another close-range sensor takes place and/or a protective field adaptation of the LIDAR or a radar sensor). This is accompanied, for example, by a speed reduction 3130.
  • the robot 10 then tries again to pass the bottleneck 3135. This can be done by following a previously planned path or a newly planned path 3145. Alternatively and/or in addition, the robot 10 tries to retrace the previously covered path 3140 to pass the bottleneck.
  • This path has been stored e.g. in the navigation module 110 and recorded by means of the odometry unit 121.
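  • The Fig. 6 sequence can be condensed into the following sketch; the robot and planner methods are hypothetical names standing in for the modules mentioned above (path planning module 112, odometry unit 121).

```python
def pass_bottleneck(robot, planner):
    """Sketch of the Fig. 6 flow: replan, then relax the obstacle
    evaluation and retry at reduced speed (all APIs hypothetical)."""
    if planner.replan(robot.pose()):                 # replanning succeeds?
        return robot.follow(planner.path())
    robot.reduce_protective_field()                  # 3125: adapt evaluation
    robot.disable_close_range_sensing()              # e.g. bumper 122
    robot.set_speed_mm_per_s(5)                      # 3130: speed reduction
    if robot.follow(planner.path()):                 # 3135/3145: retry path
        return True
    return robot.follow(robot.traveled_path_reversed())  # 3140: retrace
```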
  • the procedure described here is stored e.g. in a self-blockade detection module 113.
  • the robot 10 thus reduces the range of protective fields that specify minimum distances to obstacles and, on the same path it came, passes a narrow point that it could not otherwise pass because the perceived distances are not interpreted as sufficiently large.
  • uneven ground can cause a robot to have to adjust its speed in order to pass it without falling over or suffering damage.
  • Obstacles on the ground with a height of only a few millimeters to a few centimeters can lead to the problems mentioned above.
  • the obstacles can represent elevations or depressions in the ground.
  • the risk of damage or overturning also depends on the speed, the bearing of the wheels, the wheel diameter and/or the suspension of the wheels, etc.
  • a LIDAR usually only detects obstacles at a defined height, often more than 5 cm above the ground. Depth cameras are often not accurate enough to detect obstacles on the ground that are lower than 5 cm. A radar sensor often also lacks the corresponding detection accuracy.
  • the obstacle is located, for example, in an area in which the robot 10 moves over a longer period of time and which it explores beforehand.
  • the robot 10 moves at a slow speed, e.g. 0.2 m/s. This speed is chosen so that the robot 10 does not fall over when it encounters an obstacle such as a door sill which is, for example, 2.5 cm high. If the robot 10 hits an obstacle, at least one wheel comes into contact with the obstacle and the robot is thus lifted at least proportionally. This changes the inclination of the robot 10 in space.
  • in one aspect, the robot 10 has an inertial sensor unit 138 that detects the unevenness.
  • This inertial sensor unit 138 can be located in or connected to the motor controller.
  • the inertial sensor unit 138 is located at a height above the ground that is at least 30% of the distance between two wheels of the robot 10, the distance being determined by the ground contact points of the wheels.
  • the position of the inertial sensor unit 138 above the ground is at least a height equal to the distance between the wheels.
  • the at least one inertial sensor unit 138 is positioned as far above the ground as possible.
  • An inertial sensor unit 138 positioned as far above the ground as possible allows more accurate detection, since this inertial sensor unit 138 is deflected more by the lever action, the lever being made up of at least two wheels, for example, a wheel that is on the obstacle and a wheel that is not on the obstacle, and the vertical axis through the inertial sensor unit 138, which is perpendicular to the line connecting the two wheel support points when the robot 10 is positioned horizontally. This greater deflection, in turn, allows the use of less expensive sensors with lower sensitivity.
  • the orientation of the inertial sensor unit 138 in space determines the height of the obstacle by way of triangulation.
  • the robot 10 has two parallel wheels located at the front of the robot 10, which move over a line running parallel to these wheels, e.g. a carpet edge with a height X, whereas the robot 10 has a support wheel at the back.
  • the speed of the robot 10 and/or the energy required by the robot 10 to get over the obstacle can be determined. For example, speed and/or energy are determined within the motor control 137. Furthermore, the acceleration that occurs when the robot 10 hits the obstacle at a defined speed can be determined. Speed and/or energy as well as the acceleration represent the difficulty of overcoming the unevenness. This information is recorded by the robot 10 and stored in memory 2. Coordinates are assigned to this information, which are determined by the odometry unit of the robot 10.
  • the robot 10 maps its environment, for example, it records these described measurements of the unevenness of the ground and stores them on the map. If, for example, unevenness is detected on two positions that have a height difference that is below a threshold value, that e.g. also has a similar energy to overcome and/or a similar acceleration when hitting and/or overcoming the unevenness, the corresponding, determined coordinates are connected. If, for example, two coordinates acquired in this way are located at a distance from each other that exceeds a threshold value, the robot 10 navigates during mapping in such a way that it moves between these two coordinates and detects whether there is also a bump in the ground. This is entered into the map as described. In this way, for example, the position of a carpet in an apartment can be determined.
  • the robot 10 can use a camera such as an RGB or monochrome camera to record the surface of the floor in the direction of travel and evaluate it.
  • the evaluation includes the determination of textures, e.g. by means of histograms of gradients in the OpenCV library.
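  • A plausible reading of this step with OpenCV's built-in HOG descriptor is sketched below; the patch size and the cosine-similarity comparison are assumptions, not parameters given in the text.

```python
import cv2
import numpy as np

def floor_texture_descriptor(gray_patch):
    """Histogram-of-gradients descriptor of a floor image patch
    (OpenCV defaults; the patch is resized to the default HOG window)."""
    patch = cv2.resize(gray_patch, (64, 128))
    hog = cv2.HOGDescriptor()
    return hog.compute(patch).ravel()

def texture_similarity(desc_a, desc_b):
    """Cosine similarity; positions whose similarity exceeds a threshold
    can be treated as the same floor texture."""
    return float(np.dot(desc_a, desc_b) /
                 (np.linalg.norm(desc_a) * np.linalg.norm(desc_b) + 1e-9))
```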
  • the textures are assigned coordinates on the map, which the robot 10 creates.
  • the coordinates are compared with the determined ground unevenness. In this way, an interpolation can also be carried out, whereby texturally similar optical structures are assigned the determined unevenness of the ground.
  • the robot 10 uses a map created in this way to adapt its route, for example depending on speed. This information is taken into account in the path planning module 112 of the robot 10; e.g. a motion planner 115 can consider it as cost functions during navigation. Specifically, the calculation can include, for example, the maximum acceleration that a robot 10 may experience when hitting an obstacle, which is e.g. defined by a threshold value. In the context of the cost functions, it is considered, for example, whether it is faster to drive a direct path and thereby overcome the unevenness of the ground at reduced speed, or to take a detour at higher speed. Alternatively and/or supplementarily, the height of the obstacles over which the robot 10 is allowed to drive, depending on the speed, is stored.
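  • The direct-path-versus-detour consideration reduces to a travel-time comparison; a sketch with assumed speeds and lengths:

```python
def choose_route(direct_len_m, detour_len_m, v_normal, v_over_bump):
    """Is the direct path at reduced speed faster than the detour at
    full speed? (Cost comparison sketched from the text.)"""
    if direct_len_m / v_over_bump <= detour_len_m / v_normal:
        return "direct"
    return "detour"

# Example: 5 m straight over a door sill at 0.2 m/s takes 25 s, while a
# 12 m obstacle-free detour at 0.8 m/s takes 15 s -> take the detour.
route = choose_route(5.0, 12.0, v_normal=0.8, v_over_bump=0.2)
```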
  • in step 3205 the orientation of an inertial sensor unit 138 is measured.
  • This sensor may be located at a height above the ground of at least 30% of the distance between two wheel support points of the system, at a height of at least that distance, or it may be installed in the upper part of the system. It may also be part of a motor control 137.
  • in step 3210 the height difference of a floor unevenness compared to a reference plane, e.g. the horizontal floor, is determined. This is done by triangulation over the support points of the system on the floor, such as the wheels, for example between at least two wheels of the system, and their inclination in space.
  • a suspension of at least one wheel can be taken into account.
  • the deflection of the spring is determined, for example by optical means.
  • the spatial position of the system is recorded, e.g. by means of an odometry unit. From the spatial zero point of the system 3302 (i.e. the odometry center), a relative localization of a detected ground unevenness 3220 can be done in such a way that from the spatial zero point of the system 3302 the distance to the at least one wheel, which is varied in height by the unevenness, is vectorially added, so that the coordinate of the ground unevenness relative to the spatial zero point of the system 3302 is determined.
  • the position of the wheels can be stored in memory 2.
  • Fig. 8 illustrates this.
  • 3301 represents the outline of the system, with the spatial zero point of the system 3302 and the wheels 3303 and 3304, where the wheel 3304 is the first to hit a floor unevenness.
  • the vector 3305 shows the addition to the coordinates which are added to the spatial zero point of system 3302 and which result e.g. from the distance between the wheel 3304 and the zero point of system 3302.
  • the position of the spatial zero point of system 3302 can be known on the one hand, on the other hand it can be determined by sensors, e.g. by an odometry unit.
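  • The vectorial addition of Fig. 8 amounts to rotating the wheel's offset from the spatial zero point 3302 by the robot's heading and adding it to the zero point's map position; a sketch with illustrative numbers:

```python
import math

def unevenness_coordinate(zero_point_xy, heading_rad, wheel_offset_xy):
    """Map coordinate of the unevenness under the lifted wheel: the
    offset of the wheel from the spatial zero point 3302, rotated by the
    robot's heading, added vectorially to the zero point's position."""
    ox, oy = wheel_offset_xy
    c, s = math.cos(heading_rad), math.sin(heading_rad)
    return (zero_point_xy[0] + c * ox - s * oy,
            zero_point_xy[1] + s * ox + c * oy)

# Example: wheel 3304 sits 0.25 m ahead of the zero point, robot heading
# 90 degrees -> the unevenness lies 0.25 m in +Y from the zero point.
coord = unevenness_coordinate((2.0, 1.0), math.pi / 2, (0.25, 0.0))
```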
  • the acceleration of the system is recorded, e.g. by the inertial sensor unit 138, when an inclination of the system is determined. In one aspect, the acceleration component that is not in the horizontal plane can be determined.
  • the ground in the environment of the system, e.g. in front of the system, can be recorded visually (step 3230).
  • the captured signals are evaluated texturally (step 3235), i.e. patterns are recognized and captured, e.g. using the software OpenCV. These patterns are assigned coordinates in step 3240.
  • the data from the aforementioned steps are stored 3245, e.g. as a map.
  • in step 3250 the heights of two uneven areas at different positions are then compared, whereby the distance between the positions falls below a threshold value.
  • the position of an unevenness is compared with the position of a visual texture 3255.
  • classification methods described in the state of the art can be used to assign recognized unevenness to recognized textures.
  • in step 3260 an interpolation of coordinates takes place, i.e. height differences between two positions are interpolated if the distance between the coordinates of the two height differences is below a threshold value. Interpolated means, for example, that two positions are connected by a line, which is displayed on a map. Alternatively and/or in addition, the connections are determined on the basis of similar textures.
  • an exploration of the spaces between several coordinates can be performed (step 3265).
  • Exploration means e.g. that the areas between the coordinates are evaluated sensorially with at least the inertial sensor unit 138. This can be the case, for example, if a) there is a minimum distance between two points with determined unevenness, b) the height differences between two points differ, c) two points have a height difference between them that exceeds a threshold value and/or d) two positions have textures whose similarity between them exceeds a threshold value, but these positions have height differences that are below a threshold value.
  • the results are stored as map 3270, e.g.
  • the determined unevenness of the ground can also be transformed into a map with speed values (such as directional or maximum speeds), e.g. by way of a classification, whereby e.g. a concordance table is used which assigns maximum speeds based on a classification of the unevenness of the ground.
  • This concordance table can be system-specific, i.e. it depends on the characteristics of the robot, such as its weight, center of gravity, ground clearance, arrangement and size of the wheels, chassis, etc.
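  • Such a concordance table can be as simple as a lookup from unevenness class to maximum speed; the class boundaries below follow the 0-5 mm / 5-10 mm example from the text, while the speed values themselves are assumed.

```python
# Illustrative concordance table: edge-height class -> maximum speed (m/s).
SPEED_BY_CLASS = {
    "edge_0_5mm":     0.8,
    "edge_5_10mm":    0.4,
    "edge_over_10mm": 0.1,
}

def max_speed_for_edge(height_mm):
    """Maximum speed permitted over an edge of the given height."""
    if height_mm <= 5:
        return SPEED_BY_CLASS["edge_0_5mm"]
    if height_mm <= 10:
        return SPEED_BY_CLASS["edge_5_10mm"]
    return SPEED_BY_CLASS["edge_over_10mm"]
```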
  • This map with speeds is in turn taken into account by a path planning module 112 and/or motion planner 115 when selecting the route and controlling the speed of the robot 10.
  • a map is, in one aspect, a coordinate system whose coordinates are assigned parameters.
  • Fig. 9 illustrates the arrangement of the relevant modules for implementing the elevation maps.
  • associated with the mapping module 117 is the ground unevenness investigation module 116, in which the ground unevenness is determined as described e.g. in the above-mentioned paragraphs.
  • the position determination module 141 evaluates e.g. the position or orientation of the robot 10 relative to a horizontal plane.
  • the height determination module 142 determines e.g. the height of at least one wheel (4, 5) relative to the horizontal plane.
  • the floor texture determination module 143 determines e.g. the ground texture based on an optical sensor like the camera 125.
  • the ground unevenness investigation module 116 can then determine the position of the ground unevenness and/or captured textures relative to the position of the robot 10.
  • the odometry unit 121 for example, in turn provides the data for the position of the robot 10, so that the map with the ground irregularities can be created based on this information from the mapping module 117.
  • the determination of the unevenness of the ground is stored e.g. in the unevenness investigation module 116 or the map module 119.
  • the robot 10 is thus configured to register inclinations via an inertial sensor unit 138 and to determine, for example via triangulation, the deviation of a height difference, which it stores in a map. Different positions with determined height differences are compared and points are interpolated in between, so that a height map of the environment is created. The height differences are taken into account during navigation, for example, so that driving speeds can be adjusted.
  • ASBE1 A computer-implemented method for controlling a robot (10), comprising:
  • path planning based on a map of the robot (10);
  • detection of at least one obstacle requiring a change of the planned path to avoid collisions;
  • ASBE7 Computer-implemented method according to ASBE1, further comprising the use of the path taken by the robot (10) to reach its current position.
  • ASBE8 Computer-implemented method according to ASBE7, where the path is derived from recorded odometric data.
  • Example 2 Mapping of floor and/or ground unevenness
  • AEU1 to AEU31 The detection of floor and/or ground unevenness is characterized here by the following embodiments AEU1 to AEU31:
  • AEU1 A computer-implemented method for creating a map that includes ground irregularities by a system comprising
  • AEU4 Computer-implemented method according to AEU1, where the height difference of a floor unevenness compared to a reference plane is determined.
  • AEU5 Computer-implemented method according to AEU1, where the height difference of a floor unevenness is determined by triangulation.
  • AEU6 Computer-implemented method according to AEU3, whereby the position of the ground contact point deviating from the reference plane is determined relative to the spatial zero point 3302 of the system.
  • AEU7 Computer-implemented method according to AEU6, where the spatial zero point 3302 of the system is marked on a map.
  • AEU8 Computer-implemented method according to AEU1, further comprising the determination of the acceleration of the system.
  • AEU9 Computer-implemented method according to AEU1, further comprising a visual survey of the ground.
  • AEU10 Computer-implemented method according to AEU9, whereby the visual signals are evaluated texturally.
  • AEU11 Computer-implemented method according to AEU10, further comprising an assignment of position data to the captured textures.
  • AEU12 Computer-implemented method according to AEU11, whereby the position of at least one detected unevenness is compared with the position of at least one detected texture.
  • AEU13 Computer-implemented method according to AEU1, further comprising a comparison of two determined height differences if the distance between the positions of these height differences is below a threshold value.
  • AEU14 Computer-implemented method according to AEU13, further comprising an interpolation of height differences between two positions if the difference between the height differences is below a threshold value.
  • AEU15 Computer-implemented method according to AEU14, where the interpolation is based on texture similarities.
  • AEU16 Computer-implemented method according to AEU1, further comprising a detection of ground unevenness between two detected positions of ground unevenness if the distance between the two positions exceeds a threshold value.
  • AEU17 Computer-implemented method according to AEU1, further comprising a detection of ground unevenness between two determined positions of ground unevenness, if the distance between two determined height differences is above a threshold value.
  • AEU18 Computer-implemented method according to AEU11, further comprising a detection of ground unevenness between two detected positions of ground unevenness, wherein two positions have textures whose similarity to each other exceeds a threshold value, but these positions have height differences which are below a threshold value.
  • AEU19 Computer-implemented method according to AEU1, further comprising the transfer of the determined ground unevenness and its positions into a map.
  • AEU20 Computer-implemented method according to AEU1, further comprising the transformation of the determined ground unevenness into a map with speeds.
  • AEU21 Computer-implemented method according to AEU19 or AEU20, further comprising the storage of the map in a map module (119).
  • AEU22 Computer-implemented method according to AEU1, further comprising the consideration of the ground unevenness in the path planning and/or motion planning of a system.
  • AEU23 Computer-implemented method according to AEU22, whereby the unevenness of the ground and/or variables derived from it are considered as cost functions in path planning and/or motion planning.
  • AEU24 Device for carrying out the method according to AEU1-AEU23.
  • AEU25 Device according to AEU24, whereby the system is a robot (10).
  • AEU26 Device according to AEU24, whereby the inertial sensor unit (138) is located at a height above the ground which corresponds at least to the distance between two wheels of the system.
  • AEU27 System comprising a memory (2), a processing unit (3), at least three wheels (4, 5), a mapping module (117) and a ground unevenness investigation module (116).
  • AEU28 System according to AEU27, further comprising a position determination module (141) for evaluation of the position of the system relative to a horizontal plane and a height determination module (142) for determination of the height of at least one wheel (4, 5) relative to a horizontal plane.
  • AEU29 System according to AEU28, further comprising a ground texture module (143) for determination of a ground texture by means of an optical sensor (e.g. 125).
  • AEU30 System according to AEU28, further comprising an odometry unit (121) for determination of the system position.
  • Robots 10 can thus be controlled appropriately even where the sensors of a robot 10 cannot detect obstacles on the ground in advance, because the obstacles are already recorded in the map.
  • The speed is adjusted if the map of the robot 10 shows an uneven ground surface, in order to prevent the robot 10 from hitting an obstacle at a speed at which the robot 10 might be subjected to excessive vibrations, be damaged and/or tip over.
  • The unevenness maps created by the robot 10 can include different classes of unevenness, e.g. one class for edges up to 5 mm in height, another class for edges 5-10 mm in height, etc. In one aspect, the unevenness can also be classified differently, e.g. with respect to steepness.
  • the unevenness can also be converted into a maximum speed map.
  • A computer-implemented method for adjusting the speed of a robot 10, comprising: planning a path of the robot 10; comparing the path with ground bumps stored in a map; adjusting the speed of the robot 10 and/or replanning the path; wherein the ground bumps include more than one ground bump class.
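  • A hedged sketch of this comparison step; the map representation, class names and speed values below are assumptions for illustration and are not taken from the description:

```python
# Hypothetical per-class speed limits; the method above only requires that
# more than one ground bump class exists.
CLASS_LIMITS = {"edge_low": 0.6, "edge_high": 0.3}
REPLAN_CLASSES = {"impassable"}
DEFAULT_SPEED = 1.2  # m/s on cells without a recorded bump

def speed_along_path(path, bump_map):
    """Compare a planned path with the ground bumps stored in a map and
    return per-cell speed commands, or None to request replanning."""
    commands = []
    for cell in path:
        cls = bump_map.get(cell)          # None: no bump recorded here
        if cls in REPLAN_CLASSES:
            return None                   # caller should replan the path
        commands.append(CLASS_LIMITS.get(cls, DEFAULT_SPEED))
    return commands

# Usage:
cmds = speed_along_path([(0, 0), (0, 1)], {(0, 1): "edge_high"})
# cmds == [1.2, 0.3]
```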
  • ALR2 Computer-implemented method according to ALR1, where the origin of the coordinate system of the map is associated with a prominent position.
  • ALR3 Computer-implemented method according to ALR1, whereby the waypoints are distributed evenly in the area to be covered.
  • ALR4 Computer-implemented method according to ALR1, further comprising travelling along the defined waypoints by the robot (10), whereby the mapping of the environment is carried out along a path connecting the waypoints and the mapping of the environment is carried out in a new coordinate system.
  • ALR5 Computer-implemented method according to ALR4, where the mapping is done by detecting obstacles using laser data, camera data, radar data and/or inertial sensor data.
  • ALR6 Computer-implemented method according to ALR4, further comprising a comparison of the arrangement of the obstacles in the vicinity of the waypoints with the arrangement of the obstacles in the stored map.
  • ALR7 Computer-implemented method according to ALR6, further comprising overlaying of the coordinate system of the map created in the vicinity of the waypoints with the coordinate system of the stored map.
  • ALR8 Computer-implemented method according to ALR7, where the overlay is done by minimizing the deviation of the obstacle positions between both maps at the waypoints; a sketch of such an alignment is given after this list.
  • ALR9 Computer-implemented method according to ALR7, further comprising mapping of the area that is not located between two waypoints.
  • ALR10 Computer-implemented method according to ALR1, where a sequence of waypoints results in a skeleton path.
  • ALR11 Computer-implemented method according to ALR10, whereby the skeleton path is traced when a new job is performed.
  • ALR12 Computer-implemented method according to ALR4, whereby the waypoints are travelled with simultaneous Monte Carlo localization; a particle filter sketch is given after this list.
  • ALR13 Computer-implemented method according to ALR9, whereby Monte Carlo localization is deactivated during mapping.
  • ALR14 Computer-implemented method according to ALR7, further comprising mapping of the area off the skeleton path.
  • ALR15 Device for performing the method according to ALR1 to ALR14.
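  • The overlay of the two coordinate systems referred to in ALR8 can be illustrated with a least-squares rigid alignment; the sketch below uses the classic Kabsch/Procrustes solution and assumes known correspondences between obstacle positions, which the description does not prescribe:

```python
import numpy as np

def align_maps(obstacles_new, obstacles_stored):
    """Overlay the coordinate system of a newly created map onto the
    stored map by minimizing the deviation of corresponding obstacle
    positions (N x 2 arrays) in the least-squares sense; returns the
    rotation matrix and translation vector of the rigid transform."""
    p = np.asarray(obstacles_new, dtype=float)
    q = np.asarray(obstacles_stored, dtype=float)
    cp, cq = p.mean(axis=0), q.mean(axis=0)
    h = (p - cp).T @ (q - cq)               # 2x2 cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))  # guard against reflections
    r = vt.T @ np.diag([1.0, d]) @ u.T      # optimal rotation
    t = cq - r @ cp                         # optimal translation
    return r, t

# Usage: map any point of the new map into the stored map's frame.
r, t = align_maps([[0, 0], [1, 0], [0, 2]], [[1, 1], [2, 1], [1, 3]])
point_in_stored_frame = r @ np.array([0.5, 0.5]) + t
```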
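  • For the Monte Carlo localization referred to in ALR12, a single predict-weight-resample cycle of a particle filter can be sketched as follows; all identifiers and noise values are invented for the example:

```python
import random

def monte_carlo_step(particles, motion, measure_prob):
    """One predict-weight-resample cycle of Monte Carlo localization while
    travelling along the waypoints. 'particles' are (x, y, theta) pose
    hypotheses, 'motion' is the odometric increment (dx, dy, dtheta) and
    'measure_prob' scores a pose against the current sensor data."""
    dx, dy, dth = motion
    # Predict: apply the motion model with some sampling noise.
    moved = [(x + dx + random.gauss(0, 0.01),
              y + dy + random.gauss(0, 0.01),
              th + dth + random.gauss(0, 0.005))
             for x, y, th in particles]
    # Weight: how well does each hypothesis explain the measurement?
    weights = [measure_prob(p) for p in moved]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample: draw a new particle set proportional to the weights.
    return random.choices(moved, weights=weights, k=len(moved))
```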

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Optics & Photonics (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention comprises a system and a method for performing autonomous navigation tasks, consisting in creating and aligning electronic maps, for example maps comprising ground irregularities, which are mapped autonomously and taken into account in the motion planning of the system. The invention also consists in orienting maps relative to a coordinate system. Furthermore, the invention consists in taking traffic jams into account in the navigation of the system, which allows the system to free itself from them.
PCT/EP2021/066003 2020-06-19 2021-06-15 Procédé d'exploitation d'un dispositif mobile WO2021254975A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP21739566.4A EP4208763A1 (fr) 2020-06-19 2021-06-15 Procédé d'exploitation d'un dispositif mobile

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102020003769 2020-06-19
DE102020003769.0 2020-06-19

Publications (1)

Publication Number Publication Date
WO2021254975A1 true WO2021254975A1 (fr) 2021-12-23

Family

ID=76845183

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/066003 WO2021254975A1 (fr) 2020-06-19 2021-06-15 Procédé d'exploitation d'un dispositif mobile

Country Status (3)

Country Link
EP (1) EP4208763A1 (fr)
DE (1) DE102021115630A1 (fr)
WO (1) WO2021254975A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024015031A1 (fr) * 2022-07-11 2024-01-18 Delivers Ai Robotik Otonom Surus Bilgi Teknolojileri A.S. Système de livraison et procédé de localisation hybride pour un robot de livraison
DE102022128862A1 (de) 2022-10-31 2024-05-02 tediro GmbH Objektlokalisierung durch einen mobilen Roboter
DE102023123138A1 (de) 2022-10-31 2024-05-02 tediro GmbH Objektlokalisierung durch einen mobilen Roboter
DE102022128864A1 (de) 2022-10-31 2024-05-02 tediro GmbH Objektlokalisierung durch einen mobilen Roboter
WO2024122264A1 (fr) * 2022-12-07 2024-06-13 住友重機械工業株式会社 Dispositif mobile

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022207370A1 (de) 2022-07-19 2024-01-25 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zum Erkennen einer fehlerhaften Karte einer Umgebung

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017201490A1 (fr) * 2016-05-19 2017-11-23 Simbe Robotics Inc. Procédé de génération automatique d'un planogramme qui attribue des produits à des structures de rayonnage à l'intérieur d'un magasin
US20200003901A1 (en) * 2018-06-28 2020-01-02 Zoox, Inc. Loading Multi-Resolution Maps for Localization

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ARDIYANTO IGI ET AL: "Visibility-based viewpoint planning for guard robot using skeletonization and geodesic motion model", 2013 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA); 6-10 MAY 2013; KARLSRUHE, GERMANY, IEEE, US, 6 May 2013 (2013-05-06), pages 660 - 666, XP032506514, ISSN: 1050-4729, ISBN: 978-1-4673-5641-1, [retrieved on 20131013], DOI: 10.1109/ICRA.2013.6630643 *
KIM MINGU ET AL: "Active object search in an unknown large-scale environment using commonsense knowledge and spatial relations", INTELLIGENT SERVICE ROBOTICS, SPRINGER BERLIN HEIDELBERG, BERLIN/HEIDELBERG, vol. 12, no. 4, 29 August 2019 (2019-08-29), pages 371 - 380, XP036911348, ISSN: 1861-2776, [retrieved on 20190829], DOI: 10.1007/S11370-019-00288-5 *
MATTHEW MCGILL ET AL: "Virtual reconstruction using an autonomous robot", INDOOR POSITIONING AND INDOOR NAVIGATION (IPIN), 2012 INTERNATIONAL CONFERENCE ON, IEEE, 13 November 2012 (2012-11-13), pages 1 - 8, XP032313157, ISBN: 978-1-4673-1955-3, DOI: 10.1109/IPIN.2012.6418851 *

Also Published As

Publication number Publication date
EP4208763A1 (fr) 2023-07-12
DE102021115630A1 (de) 2021-12-23

Similar Documents

Publication Publication Date Title
WO2021254975A1 (fr) Procédé d'exploitation d'un dispositif mobile
CN106325270B (zh) 基于感知和自主计算定位导航的智能车导航方法
US11960304B2 (en) Localization and mapping using physical features
US20190278273A1 (en) Odometry system and method for tracking traffic lights
WO2020258721A1 (fr) Procédé et système de navigation intelligente pour motocyclette de type cruiser
CN106227212B (zh) 基于栅格地图和动态校准的精度可控室内导航系统及方法
WO2018194768A1 (fr) Procédé et système de localisation et d'étalonnage de capteur simultanés
US20040073337A1 (en) Sentry robot system
CN105466438A (zh) 防撞车辆中的传感器测距和应用
RU2740229C1 (ru) Способ локализации и построения навигационных карт мобильного сервисного робота
WO2018194833A1 (fr) Localisation avec cartographie négative
CN112000103B (zh) 一种agv机器人定位、建图与导航的方法及系统
US20220363263A1 (en) Automated bump and/or depression detection in a roadway
JPWO2019187816A1 (ja) 移動体および移動体システム
EP3998451B1 (fr) Procédé de pilotage, transporteur mobile et système de pilotage
CN114714357A (zh) 一种分拣搬运方法、分拣搬运机器人及存储介质
CN110320912A (zh) 激光与视觉slam融合的agv定位导航装置及方法
Tsukiyama Global navigation system with RFID tags
CN114911223A (zh) 一种机器人导航方法、装置、机器人及存储介质
Drage et al. Lidar road edge detection by heuristic evaluation of many linear regressions
KR102517351B1 (ko) 동적 장애물의 제거가 가능한 운송 로봇 및 동적 장애물 제거 방법
Su et al. Autonomous land vehicle guidance for navigation in buildings by computer vision, radio, and photoelectric sensing techniques
Wu et al. Developing a dynamic obstacle avoidance system for autonomous mobile robots using Bayesian optimization and object tracking: Implementation and testing
Fukushima et al. Magnetic-based localization considering robot’s attitude in slopes
JP2021149420A (ja) 推定システムおよび方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21739566

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021739566

Country of ref document: EP

Effective date: 20230119