WO2024008615A1 - Procédé de détermination de position d'un objet prédéterminé - Google Patents

Procédé de détermination de position d'un objet prédéterminé

Info

Publication number
WO2024008615A1
Authority
WO
WIPO (PCT)
Prior art keywords
points
mobile device
predetermined
predetermined object
docking station
Prior art date
Application number
PCT/EP2023/068162
Other languages
German (de)
English (en)
Inventor
Michal KRAMARCZYK
Dominik KIRCHNER
Marek MALINOWSKI
Kristina Daniel
Maximilian Wenger
Andrey Rudenko
Marco Lampacrescia
Thilak Raj CHIKMAGALORE
Original Assignee
Robert Bosch Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch Gmbh filed Critical Robert Bosch Gmbh
Priority claimed from DE102023206256.9A external-priority patent/DE102023206256A1/de
Publication of WO2024008615A1 publication Critical patent/WO2024008615A1/fr

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4808 Evaluating distance, position or velocity data
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20 Control system inputs
    • G05D1/24 Arrangements for determining position or orientation
    • G05D1/244 Arrangements for determining position or orientation using passive navigation aids external to the vehicle, e.g. markers, reflectors or magnetic means
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/60 Intended control result
    • G05D1/656 Interaction with payloads or external entities
    • G05D1/661 Docking at a base station
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2105/00 Specific applications of the controlled vehicles
    • G05D2105/10 Specific applications of the controlled vehicles for cleaning, vacuuming or polishing
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2107/00 Specific environments of the controlled vehicles
    • G05D2107/40 Indoor domestic environment
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2109/00 Types of controlled vehicles
    • G05D2109/10 Land vehicles
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2111/00 Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
    • G05D2111/10 Optical signals
    • G05D2111/17 Coherent light, e.g. laser signals

Definitions

  • The present invention relates to a method for determining a position of a predetermined object, in particular a docking station, in an environment in which a mobile device, in particular a robot, is located, as well as to a system for data processing, a mobile device, an object with or for use with such a system, and a computer program for carrying out the method.
  • Mobile devices such as robots typically move in an environment, in particular an environment to be processed or a work area, such as an apartment or a garden. Such a mobile device should typically move repeatedly to predetermined positions or objects in the environment, such as to a docking station where the mobile device can be charged, for example.
  • the invention generally deals with mobile devices that move or are intended to move in an environment or, for example, in a work area, and in particular with determining a position of predetermined objects in such an environment.
  • Examples of such mobile devices include robots and/or drones and/or vehicles that move in a partially automated or (fully) automated manner (on land, water or in the air).
  • Suitable robots include, for example, household robots such as vacuum and/or mopping robots, floor or street cleaning devices or lawn mowing robots, as well as other so-called service robots, such as at least partially automated vehicles, e.g. passenger transport vehicles or goods transport vehicles (also so-called industrial trucks, e.g. in warehouses), but also aircraft such as so-called drones or watercraft.
  • Such a mobile device in particular has a control and/or regulating unit and a drive unit for moving the mobile device, so that the mobile device can be moved in the environment, for example also along a movement path or a trajectory.
  • a mobile device can have one or more sensors by means of which the environment or information in the environment can be recorded.
  • Mobile devices such as household and service robots are usually equipped with a built-in power supply (an energy storage device, in particular a battery), so that the mobile device must automatically recognize and connect to an external power source in order to charge itself.
  • an external power source is typically provided at a so-called docking station.
  • a docking station for mobile devices or robots has an electrical charging system with a series of contacts. The complementary contacts on the robot enable it to detect the contact point and receive an electrical charging current.
  • a docking station can also serve other purposes.
  • a loading and/or unloading process of goods to be transported can take place at a docking station (this can be done manually and/or automatically).
  • the battery can also be charged, for example.
  • A laser scanner, in particular a 2D laser scanner (e.g. a so-called lidar sensor), is, for example, a cost-efficient sensor solution for a mobile robot.
  • However, its limited sensing capabilities make it difficult to detect the docking station with a scan or 2D scan alone.
  • Against this background, a method is proposed for determining a position of a predetermined object, such as a docking station, in an environment, based on a set of points in the environment, the points in particular each being characteristic of a distance between the mobile device and objects in the environment.
  • a typical example of such a set of points is a so-called lidar point cloud, which results from a lidar scan.
  • This set of points is provided, for example by receiving a lidar point cloud from a lidar sensor on the mobile device.
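  • As an illustration, the following minimal sketch (Python, not part of the publication) shows how such a set of points could be assembled from a raw 2D lidar scan; the function name scan_to_points and the scan format (per-beam ranges and intensities over a known angular sweep) are assumptions made for this example only.

        import numpy as np

        def scan_to_points(ranges, intensities, angle_min, angle_increment, range_max=10.0):
            """Convert a 2D lidar scan (polar form) into Cartesian points with intensities.

            ranges, intensities: 1D arrays of equal length, one entry per beam.
            angle_min, angle_increment: angular sweep of the scanner in radians.
            Returns an (N, 3) array of [x, y, intensity]; invalid beams are removed.
            """
            ranges = np.asarray(ranges, dtype=float)
            intensities = np.asarray(intensities, dtype=float)
            angles = angle_min + angle_increment * np.arange(len(ranges))
            valid = np.isfinite(ranges) & (ranges > 0.0) & (ranges < range_max)
            x = ranges[valid] * np.cos(angles[valid])
            y = ranges[valid] * np.sin(angles[valid])
            return np.column_stack([x, y, intensities[valid]])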
  • This set of points is then analyzed with respect to a predetermined relationship between several points of the set of points, the predetermined relationship being determined by an outer contour of the predetermined object.
  • a preferred example of such a predetermined relationship is that points lie on a line of a certain length.
  • This analysis is done, in particular, by determining several groups of points from the set of points and then analyzing each of these groups with respect to the predetermined relationship. This determination of several groups can also be referred to as clustering.
  • If a group of points from the set - or one of several groups of points from the set - fulfills the predetermined relationship, for example lies on a line of a certain length, or at least lies on a line of the predetermined length within predetermined tolerances, this group of points is determined as a subset of points. This subset then indicates, in particular, which object it is.
  • the group that has the shortest distance from the mobile device can preferably be determined as the subset.
  • The smallest distance can, for example, be determined as an average over all points in the group or with respect to a middle point of the group.
  • Alternatively, that group can be determined as the subset which best fulfills the predetermined relationship, i.e. corresponds most precisely to the predetermined length.
  • Based on the subset of points, the position of the predetermined object is then determined; in particular, navigation information for the mobile device is also determined based on the position of the predetermined object.
  • In other words, an outer contour of the predetermined object - or an image of this outer contour formed by the points - is searched for in the existing set of points, for example the lidar point cloud.
  • For a predetermined object that is to be recognized, such as the docking station, a flat plate can be attached to the object or the docking station, which is arranged at the level of the lidar sensor of the mobile device (or of another sensor) and has a certain width in the detection direction of the sensor, for example parallel to a surface on which the mobile device moves.
  • This plate, as an outer contour of the object, then results in a subset of points in the lidar scan (or set of points) that lie on a line. This line then has (at least approximately) a length that corresponds to the mentioned width of the plate.
  • the predetermined relationship includes that the plurality of points lie within a two- or three-dimensional area predetermined by the outer contour of the predetermined object, in particular lie on a line of a predetermined length.
  • Besides the (straight) line, other relationships also come into consideration; for example, it can be provided that the several points should lie on a circular arc section or another curved line.
  • In this case, the length and/or the radius can be specified. It is particularly useful to form a simple geometric shape on the outer contour of the predetermined object, which can then be easily found in the set of points. It goes without saying that this contour does not necessarily have to be specially designed; existing contours can also be used.
  • Preferably, the one or at least one, preferably each, of the multiple groups of points that fulfill the predetermined relationship is further analyzed with regard to a predetermined pattern of the points of the respective group. This occurs in particular after the analysis regarding the predetermined relationship, but before determining the subset.
  • That one of the analyzed groups which corresponds to the predetermined pattern is then determined as the subset.
  • the specified pattern includes a variation of the intensity values of the points.
  • the variation of intensity values can be determined in particular by a variation of a reflectivity of a surface of the predetermined object in the area of the contour.
  • predetermined patterns may include at least two, preferably at least three, areas that alternately have intensity values above an upper threshold and below a lower threshold.
  • the threshold values relate to an intensity of the points, with the upper threshold indicating a higher intensity than the lower threshold.
  • a variation in reflectivity can be achieved on the object or its surface, for example through different colors or matt and shiny areas. However, certain coatings in certain areas are also conceivable in order to increase or reduce reflectivity.
  • a pattern of such variation in reflectivity then affects the intensity of the points, which correspondingly have a pattern of variation in intensity values. For example, if the points are detected using lidar or laser ranging, points corresponding to lidar beams that have been reflected from a surface with lower reflectivity will have a lower intensity than those that have been reflected from a surface with higher reflectivity.
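  • The following sketch (Python; an illustration, not part of the publication) shows one way such an alternating high/low intensity pattern could be checked along an ordered group of points using an upper and a lower threshold as described above; the function names and the parameter min_bands are assumptions for this example.

        def intensity_bands(intensities, lower, upper):
            """Reduce an ordered list of point intensities to a sequence of band labels.

            +1 marks a run of points above the upper threshold, -1 a run below the
            lower threshold; points between the two thresholds are ignored.
            Consecutive runs with the same label are merged, so the returned
            sequence alternates by construction.
            """
            bands = []
            for value in intensities:
                if value > upper:
                    label = 1
                elif value < lower:
                    label = -1
                else:
                    continue
                if not bands or bands[-1] != label:
                    bands.append(label)
            return bands

        def matches_alternating_pattern(intensities, lower, upper, min_bands=3):
            """True if the group shows at least min_bands alternating high/low areas."""
            return len(intensity_bands(intensities, lower, upper)) >= min_bands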
  • Instead of a lidar sensor (laser distance measurement), a time-of-flight camera, a time-of-flight depth camera or, in general, a time-of-flight sensor can also be used.
  • In this way, a specific object such as the docking station can be identified even more precisely. For example, if several groups of points meet the predetermined relationship, i.e. lie on a line of a certain length, the desired object can be identified among them. A box in the environment, for example, can produce a comparable relationship between the points as the docking station does.
  • The desired object, however, can be provided with the pattern of variation in reflectivity. In this context it should be mentioned that in practice only one of several groups will fulfill the pattern, since this is explicitly specified and, in particular, can be chosen or specified very specifically. In the event that only one group of points fulfills the predetermined relationship, the pattern can also be analyzed; in this case this serves as a verification. If, in the example above, only the box had been detected, it would otherwise be identified as the docking station, for example.
  • a new set of points that is different from the set of points is preferably provided, in particular after moving the mobile device. The procedure mentioned above can then be carried out again.
  • a position and/or orientation of the mobile device relative to the predetermined object is also determined based on the position of the predetermined object.
  • Navigation information, and in particular also movement control variables, are then determined for the mobile device in order to move the mobile device to the predetermined object. For example, a so-called docking maneuver can then be carried out to dock the mobile device to the docking station.
  • The determination of the position and/or orientation of the mobile device relative to the predetermined object is, in particular, further based on at least one property of the predetermined object and/or of the environment, the property having an effect on the points.
  • This at least one property includes, for example, a reflectivity of a surface of the predetermined object (whereby the reflectivity can be considered here in particular independently of the pattern mentioned) and/or an inclination of a surface in front of the predetermined object. This makes it even easier to approach the predetermined object or dock it to the docking station.
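  • As an illustration of how such a relative position and orientation could be derived from the detected subset of points (a sketch under assumptions, not the method prescribed by the publication): the midpoint of the fitted line segment can serve as the object position and the line normal, oriented towards the robot, as the docking approach direction. The helper name docking_pose_from_subset and the PCA-based line fit are assumptions of this example.

        import numpy as np

        def docking_pose_from_subset(points_xy, robot_xy=(0.0, 0.0)):
            """Estimate the docking-station position and approach normal from the
            subset of points found on its flat front plate.

            points_xy: (N, 2) array of the subset points in the sensor/robot frame.
            Returns (center, normal): the segment midpoint and the unit normal of
            the fitted line, oriented so that it faces the robot.
            """
            pts = np.asarray(points_xy, dtype=float)
            center = pts.mean(axis=0)
            # Principal direction of the points (largest eigenvector of the covariance).
            eigvals, eigvecs = np.linalg.eigh(np.cov(pts.T))
            direction = eigvecs[:, np.argmax(eigvals)]          # along the plate
            normal = np.array([-direction[1], direction[0]])    # perpendicular to the plate
            if np.dot(normal, np.asarray(robot_xy, dtype=float) - center) < 0:
                normal = -normal                                 # make the normal face the robot
            return center, normal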
  • a potential, e.g. estimated, position of the predetermined object is first provided. Based on the potential position of the predetermined object, the position as explained above and in particular the navigation information are then determined in order to move the mobile device to the predetermined object. If the predetermined object cannot be reached with the navigation information or has not been reached, a different potential position of the predetermined object can in particular be provided. The procedure can then be carried out again.
  • a data processing system comprises means for carrying out the method according to the invention or its method steps.
  • The system can be a computer or server, for example in a so-called cloud or cloud environment. From there - in the case of an application for a mobile device - the position of the predetermined object can be transmitted to the mobile device (e.g. via a wireless data connection). Likewise, information about the environment, such as the set of points, can be transmitted from the mobile device to the system.
  • Another example of a data processing system is a computer or a control device in such a mobile device.
  • The invention also relates to a mobile device that is set up to provide a set of points in an environment and to obtain a position of a predetermined object in the environment that has been determined from the set of points according to a method according to the invention, and in particular to use this position for navigation.
  • The mobile device preferably has a distance-measuring or lidar sensor for detecting the set of points in the environment, and more preferably a control and/or regulating unit and a drive unit for moving the mobile device.
  • the data processing system can also be included in the device.
  • The mobile device is preferably designed as an at least partially automated vehicle, in particular as a passenger transport vehicle or a goods transport vehicle, and/or as a robot, in particular as a household robot, e.g. a vacuum and/or mopping robot, a floor or street cleaning device or a lawn mowing robot, and/or as a drone.
  • the invention also relates to an object, in particular a docking station, with or for use with a data processing system according to the invention or a mobile device according to the invention.
  • the object has an outer contour, based on which a relationship between several points of the set of points can be determined.
  • the object can have the flat plate mentioned, which has a certain width. It is also useful if this outer contour or plate is at the level of the distance measuring or lidar sensor of the mobile device.
  • the object preferably also has a variation of a reflectivity of a surface in the area of the contour, for example a correspondingly predetermined pattern. It is conceivable, for example, to alternate different colors or alternate matt and shiny areas.
  • a machine-readable storage medium is provided with a computer program stored thereon as described above.
  • Suitable storage media or data carriers for providing the computer program are in particular magnetic, optical and electrical memories, such as hard drives, flash memories, EEPROMs, DVDs, etc.
  • Figure 1 shows schematically a mobile device and a predetermined object in an environment to explain the invention in a preferred embodiment.
  • Figures 2a, 2b and 2c show schematically the mobile device and the predetermined object from Figure 1 in other views.
  • Figure 3 shows schematically a method according to the invention in a preferred embodiment.
  • Figure 4 shows schematically a method according to the invention in a further preferred embodiment.
  • FIGS. 5a, 5b, 5c and 5d show schematic diagrams to explain the invention in a preferred embodiment.
  • FIGS. 6a, 6b and 6c show schematic diagrams to explain the invention in a preferred embodiment.

Embodiment(s) of the invention

  • the mobile device 100 is, for example, a vacuum cleaner robot with a control or regulating unit 102 and a drive unit 104 (with wheels) for moving the vacuum cleaner robot 100 in the environment 120, for example an apartment or a room.
  • the vacuum cleaner robot 100 has, for example, a sensor 106 designed as a 2D lidar sensor with a detection field (indicated by dashed lines).
  • The detection field is chosen to be relatively small here; in practice, the field of view can also be up to 360° (e.g. at least 180° or at least 270°).
  • the vacuum cleaner robot 100 has a system 108 for data processing, for example a control device, by means of which data can be exchanged with a higher-level system 110 for data processing, for example via an indicated radio connection.
  • The system 110 is, for example, a server; it can also stand for a so-called cloud.
  • There, navigation information can be determined, which is then transmitted to the system 108 in the vacuum cleaner robot 100 and according to which the robot is then to be operated.
  • Alternatively, navigation information is determined in the system 108 itself or is otherwise received there.
  • the system 108 can, for example, also receive control information that has been determined based on navigation information and according to which the control or regulation unit 102 can move the vacuum cleaner robot 100 via the drive unit 104.
  • a docking station 130 is shown in the environment 120 as an example of a predetermined object.
  • The docking station 130 has, for example, contacts 132 (e.g. electrical contacts) for charging an energy storage device of the vacuum cleaner robot 100, and, for example, a plate with a flat plane 134 on one side (a surface).
  • The flat plane 134 is an example of an outer geometric contour of the object or the docking station 130.
  • Further views of the vacuum cleaner robot 100 and in particular of the docking station 130 are shown in Figures 2a and 2b. While a top view can be seen in Figure 2a, it can also be seen that, on the one hand, the vacuum cleaner robot moves and, on the other hand, the docking station is stationary.
  • The flat plane 134 is at the level of the 2D lidar sensor 106, whose height above the ground 122 is designated HS. This ensures that the 2D lidar sensor can capture the flat plane 134. With a different sensor, for example a 3D lidar sensor, this does not need to be taken into account, or less attention needs to be paid to it.
  • FIG. 2b shows the docking station 130 and in particular the flat plane 134 in a front view, as can be seen, for example, from the perspective of the vacuum cleaner robot 100 or the 2D lidar sensor.
  • Two concrete dimensions of the flat plane 134 are now shown here, namely its width B and its height H. While the height H, for example, serves to provide a certain amount of leeway for detection with the 2D lidar sensor or allows positioning at the height of the 2D lidar sensor (e.g. even on uneven ground), the width B serves to identify the flat plane 134 in a lidar scan, a so-called point cloud, as will be explained below.
  • the flat plane 134 or any other surface or contour to be detected can be, for example, white, in particular matt white, for example painted white or covered with a white material in order to reduce or prevent any reflections.
  • Figure 2c shows a docking station 130', which can fundamentally correspond to the docking station 130 according to Figures 2a, 2b. In particular, the docking station 130' is also shown in the same view.
  • However, the flat plane 134' has a pattern with a variation in the reflectivity of the surface.
  • four areas 134a, 134b, 134c, 134d are provided, each of which has a length L (seen along the width B).
  • The flat plane 134' is therefore divided into the four areas 134a, 134b, 134c, 134d. These areas now alternately have different reflectivity, so that when lidar beams or comparable beams are reflected from them, a pattern with alternating intensity values above an upper threshold value and below a lower threshold value is created.
  • the areas 134a and 134c can be light and/or shiny, while the areas 134b and 134d can be dark and/or matt.
  • the 2D lidar perceives this surface with four clearly distinguishable areas or regions, which correspond to the four areas in Figure 2c. Each area in the lidar scan will include points with similar reflectance values.
  • In Figure 3, a method according to the invention is shown schematically, in a preferred embodiment, as a type of flow chart.
  • A procedure for docking is to be explained here by way of example, using a vacuum cleaner robot as a mobile device and a docking station as a predetermined object, as also shown in Figures 1 to 2c.
  • an (initial) potential position 304 (possibly also with orientation) of the docking station is first provided in a step 302.
  • Potential positions or poses of the docking station can be provided to the vacuum cleaner robot. This can, for example, be the initial position from which it started, or a position or pose specified by the user (e.g. in response to a message from the robot when it cannot detect the docking station at its previous position); it is also possible to have a list of possible positions or poses that is collected during the robot's runtime (e.g. by running the docking station detection algorithm, which can be particularly helpful when the robot is driving near walls). Another possibility is that these new positions or poses of the docking station are created by another process that can be started on demand, for example by exploring the drivable area in search of docking station signatures.
  • In a step 306, the vacuum cleaner robot moves to or near the docking station; this is done based on the potential position 304. If the vacuum cleaner robot is then in the vicinity of the docking station, i.e. at or near the potential position, navigation information 308 is determined in a step 310 in order to move the vacuum cleaner robot to the docking station. This is done on the one hand based on the potential position 304, but on the other hand also by determining the position of the docking station based on a 2D lidar scan (set of points). This will be explained in more detail below with reference to Figure 4.
  • This navigation information can also include, for example, instructions on how the vacuum cleaner robot should navigate exactly (e.g. how far it should travel and when, when it should turn, etc.).
  • Movement control variables for controlling the drive unit can also be determined. In step 310, a docking maneuver is thus carried out, i.e. the vacuum cleaner robot drives to the docking station according to the navigation information 308, or at least tries to do so. The detection of the docking station, i.e. the determination of its position, is carried out continuously or repeatedly during the docking process (the control is based on this input).
  • In step 312, it is then checked whether the docking maneuver was successful or not. If yes (Y), i.e. if the vacuum cleaner robot has successfully docked at the docking station and, for example, has established electrical contact to charge the battery, the method is ended in step 314. This can be the case, for example, if the potential position is already very close to the actual position.
  • If the docking maneuver was not successful (N), i.e. if the docking station was not reached with the navigation information, a check is made in step 316 as to whether the docking station was recognized at all. If the docking station has been recognized, i.e. if the position of the docking station (or of any docking station at all) has been determined (Y), the process switches again to step 306 in order to drive close to the docking station again, based on the potential position 304 or a new potential position derived from the position of the docking station detected in the previous step 310. For example, with the same potential position of the docking station (it was actually recognized), a docking maneuver is attempted again, for example by moving back and then approaching the docking station again with a slightly changed approach angle or the like.
  • If the docking station was not recognized in step 316, a new potential position of the docking station can be searched for or requested in step 318. If a new or different potential position of the docking station was found or provided in step 320 (as mentioned, for example, in relation to step 302) (Y), the process returns to step 306. The entire process can be repeated until it is successful or there are no potential positions or poses left (N); the process then ends with step 322.
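  • A compact sketch of this retry loop (Python-like pseudocode; the helper names robot.drive_to, robot.detect_docking_station and robot.attempt_docking, as well as the retry limit, are hypothetical and not defined in the publication):

        def docking_procedure(robot, potential_positions, max_retries=3):
            """Sketch of the loop of Figure 3 (steps 302 to 322) under the stated assumptions."""
            candidates = list(potential_positions)              # step 302: potential positions/poses
            while candidates:
                target = candidates.pop(0)
                for _ in range(max_retries):
                    robot.drive_to(target)                      # step 306: drive close to the candidate pose
                    detected = robot.detect_docking_station()   # step 310: detection from the 2D lidar scan
                    goal = detected if detected is not None else target
                    if robot.attempt_docking(goal):             # docking maneuver with navigation information 308
                        return True                             # steps 312 -> 314: success
                    if detected is None:                        # step 316: station not recognized at all
                        break                                   # request a new potential position (step 318)
                    target = detected                           # retry with the refined pose
            return False                                        # step 322: no potential positions left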
  • In Figure 4, a method according to the invention is shown schematically, in a further preferred embodiment, as a type of flow chart. This will be explained again as an example using a vacuum cleaner robot as a mobile device and a docking station as a predetermined object, as also shown in Figures 1 to 2c.
  • a set 500 of points in the environment is first provided in step 400, for example using the 2D lidar scan.
  • A set 500 of points is shown as an example as a 2D lidar scan, as is obtained from the 2D lidar sensor, for example, in an environment with the docking station in front of a wall, similar to that in Figure 1.
  • This set 500 is then analyzed in a step 410 with regard to a predetermined relationship 412 of several points of the set of points to one another.
  • This predetermined relationship is determined, for example, by the flat plane 134 of the docking station and requires that points lie on a straight line of a certain length (which corresponds to the width B of the flat plane 134).
  • In a step 414, several groups of points can be determined which, for example, have a similar relationship between several points; this can be referred to as a clustering operation.
  • In Figure 5b, the set 500 of points is shown again, but after a clustering process in which several groups 510, 512, 514, 516, 518 of points have been determined.
  • A possible algorithm by which these groups (or clusters) can be determined is, for example, as follows: a cluster criterion such as a certain distance can be specified. For each point, it is then checked step by step whether its distance from the previous point is at most this specified distance. If yes, the point belongs to the same cluster or group; if not, a new cluster or group is started (see the sketch below).
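  • The sketch below (Python; illustrative only) implements this gap-based clustering rule; the function name cluster_by_gap and the array-based point format are assumptions of this example.

        import numpy as np

        def cluster_by_gap(points_xy, max_gap):
            """Split an ordered scan into clusters of points.

            A point stays in the current cluster if its distance to the previous
            point is at most max_gap; otherwise a new cluster is started.
            points_xy is an (N, 2) array of points in scan order.
            """
            pts = np.asarray(points_xy, dtype=float)
            if len(pts) == 0:
                return []
            clusters = [[0]]
            for i in range(1, len(pts)):
                if np.linalg.norm(pts[i] - pts[i - 1]) <= max_gap:
                    clusters[-1].append(i)
                else:
                    clusters.append([i])
            return [pts[index_list] for index_list in clusters]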
  • These groups can then be analyzed in step 416, specifically with regard to the predetermined relationship 412 of several points of the set of points to one another.
  • a possible algorithm by which a group (or cluster) can be analyzed to determine whether it or its points fulfill the specified relationship is, for example, as follows.
  • a relationship criterion such as a certain length can be specified.
  • A distance in the x and y directions is then determined for each two consecutive points in the group or in the cluster (this applies, for example, to an analysis in 2D in a Cartesian coordinate system; this is only an example here). This is designated Δx_i or Δy_i in Figure 5c.
  • a standard deviation is then determined for these distances, i.e. once in the x and once in the y direction. If their sum is smaller than the relationship criterion, it can be assumed that these points lie on a line.
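  • A minimal sketch of this line test (Python; illustrative only): the function name is invented, and the additional comparison of the end-to-end length with the expected width B is an assumption about how the tolerance on the predetermined length is applied.

        import numpy as np

        def lies_on_line_of_length(cluster_xy, expected_length, spread_criterion, length_tol=0.1):
            """Test whether a cluster of scan points lies on a line of roughly the expected length.

            As described above: the standard deviations of the consecutive differences
            Δx_i and Δy_i are computed, and their sum must stay below spread_criterion.
            In addition (an assumption here), the end-to-end length of the cluster must
            match expected_length (e.g. the width B of the flat plate) within length_tol.
            """
            pts = np.asarray(cluster_xy, dtype=float)
            if len(pts) < 3:
                return False
            deltas = np.diff(pts, axis=0)                   # Δx_i, Δy_i between consecutive points
            spread = deltas[:, 0].std() + deltas[:, 1].std()
            length = np.linalg.norm(pts[-1] - pts[0])       # end-to-end length of the cluster
            return spread < spread_criterion and abs(length - expected_length) <= length_tol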
  • In this way, a group, here 512, can be determined that satisfies the predetermined relationship; this group 512 is then to be the subset of points, i.e. the selected group that is used to determine the position of the docking station.
  • the group that has the shortest distance to the vacuum cleaner robot or to the lidar sensor is determined as the subset.
  • the distance can be determined, for example, as the average distance of all points in the group to the vacuum cleaner robot or to the lidar sensor.
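  • A short sketch of this selection (Python; illustrative only, with an invented function name), using the mean distance of each group's points to the sensor:

        import numpy as np

        def nearest_group(groups_xy, sensor_xy=(0.0, 0.0)):
            """Pick the group whose points have the smallest mean distance to the sensor/robot."""
            sensor = np.asarray(sensor_xy, dtype=float)
            return min(groups_xy,
                       key=lambda g: np.linalg.norm(np.asarray(g, dtype=float) - sensor, axis=1).mean())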
  • the group 510 or 516 can also fulfill the specified relationship. It also depends on how exactly the specified relationship is fulfilled.
  • another preference criterion can also be used, based on which the several groups are ordered and/or one of the groups is determined as the subset.
  • a suitable algorithm that determines the group can be used. An algorithm can also be used to determine the smallest distance.
  • In a step 424, these multiple groups of points that meet the predetermined relationship can be analyzed among themselves with regard to a predetermined pattern of the points of the respective group.
  • The predetermined pattern includes a variation of intensity values of the points, the variation of intensity values being determined in particular by a variation of a reflectivity of a surface, for example the surface 134' as shown in Figure 2c.
  • the group 512 from Figure 5b is shown again as an example.
  • this group 512 includes, for example, 16 points, with four consecutive points each forming an area or a segment, here 512a, 512b, 512c, 512d with four points each.
  • the points of each of these segments have the same or at least approximately the same intensity value within the segment, but the intensity values differ between the segments. So there are four different intensity values.
  • Each segment has a length L, which corresponds, for example, to the length L of the four areas of the flat plane 134' according to Figure 2c. If the four areas of the flat plane 134' according to Figure 2c each have different - thus a total of four different - reflectivities, this could correspond to the group shown here in Figure 5c. However, as mentioned above in relation to Figure 2c, if there are only two different reflectivities, then in group 512, for example, the segments 512a and 512c on the one hand and the segments 512b and 512d on the other hand would each have the same intensity value.
  • In this way, the pattern results in the group. Each group of points that satisfies the predetermined relationship with respect to the contour can then be analyzed with regard to the pattern provided by the docking station or the object.
  • The group of points can be divided into four equal numbers of points - or corresponding lengths, for example if the points are unevenly distributed. In practice, it may happen that the points cannot be divided exactly accordingly; however, an approximately precise division is sufficient, especially since in practice there will be not just 16 but a significantly higher number of points per group.
  • the individual intensity values of the points can be compared with each other, i.e. a relative value is considered instead of an absolute value.
  • average intensity values can also be determined for each segment (e.g. as the arithmetic mean of the individual intensity values of the segment), which are then compared across the segments.
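  • A sketch of this segment-wise comparison (Python; illustrative only): the group's ordered intensities are split into four roughly equal segments, the mean intensity of each segment is computed, and the resulting bright/dark labels are compared with an expected pattern such as (1, 0, 1, 0); the function name and the min_contrast margin are assumptions of this example.

        import numpy as np

        def matches_segment_pattern(intensities, expected=(1, 0, 1, 0), min_contrast=0.0):
            """Check whether a group shows the expected bright/dark segment pattern.

            The ordered intensities are split into len(expected) roughly equal segments;
            each segment is labelled 1 (brighter than the overall mean) or 0 (darker),
            and the label sequence is compared with the expected pattern.
            """
            values = np.asarray(intensities, dtype=float)
            if len(values) < len(expected):
                return False
            segments = np.array_split(values, len(expected))    # roughly equal point counts
            overall = values.mean()
            labels = []
            for segment in segments:
                mean = segment.mean()
                if mean > overall + min_contrast:
                    labels.append(1)
                elif mean < overall - min_contrast:
                    labels.append(0)
                else:
                    labels.append(-1)                           # ambiguous segment -> no match
            return labels == list(expected)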
  • the position 432 of the docking station is then determined in step 430.
  • The position 432, as the position of the docking station, is then used in step 310 (see Figure 3) for the continuous provision of the navigation information 308.
  • the docking station can also be defined, for example, by a circular arc with a specific radius and/or a specific length instead of a straight line.
  • A group 612 of points is shown in Figure 6a, which is defined by a circular arc with radius r_est.
  • A group 612' of points is shown, which is defined by a circular arc with a different radius r'_est.
  • A group 612" of points with positions x_1 to x_n is shown, which is defined by a circular arc with length L.
  • The points from a cluster can be fitted to a circle, e.g. using a closed-form least squares method (e.g. according to https://lucidar.me/en/mathematics/least-squares-fitting-of-circle/ or https://www.emis.de/journals/BBMS/Bulletin/sup962/gander.pdf).
  • This determines the radius of the circle and the coordinates of the circle center x_c, and thus the radius and/or the calculated arc length L_est, which can be compared with the predetermined values.
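  • One such closed-form fit is the algebraic (Kåsa-style) least squares circle fit sketched below (Python; illustrative only). The arc-length estimate from the angle subtended by the first and last point is an additional assumption of this example.

        import numpy as np

        def fit_circle_kasa(points_xy):
            """Closed-form least squares circle fit (Kåsa method).

            Solves 2*a*x + 2*b*y + c = x^2 + y^2 in the least squares sense;
            the circle center is (a, b) and the radius is sqrt(c + a^2 + b^2).
            Returns (center, radius, arc_length_estimate).
            """
            pts = np.asarray(points_xy, dtype=float)
            x, y = pts[:, 0], pts[:, 1]
            A = np.column_stack([2 * x, 2 * y, np.ones(len(pts))])
            rhs = x ** 2 + y ** 2
            (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
            center = np.array([a, b])
            radius = float(np.sqrt(c + a ** 2 + b ** 2))
            # Estimated arc length L_est: angle between the outermost points times the radius.
            v0, v1 = pts[0] - center, pts[-1] - center
            cos_angle = np.dot(v0, v1) / (np.linalg.norm(v0) * np.linalg.norm(v1))
            angle = np.arccos(np.clip(cos_angle, -1.0, 1.0))
            return center, radius, radius * angle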
  • a pattern with variation in the intensity values can be specified and the groups of points can be analyzed accordingly.
  • Two or more different objects can also be distinguished with such predetermined patterns by specifying a different pattern for each of the desired objects; care should be taken to ensure that the different patterns in the intensity values of the points are clearly distinguishable from one another. This makes it possible, for example, to differentiate between different docking stations, e.g. a docking station for charging the robot and one for emptying a dust container or the like.
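  • As an illustration (Python; the pattern names and values are hypothetical, and the sketch reuses the matches_segment_pattern example from above), distinguishing two station types could look like this:

        # Hypothetical registry: each object type gets its own expected segment pattern.
        KNOWN_PATTERNS = {
            "charging_station": (1, 0, 1, 0),
            "emptying_station": (1, 0, 0, 1),
        }

        def identify_object(intensities):
            """Return the name of the object whose segment pattern matches, or None."""
            for name, pattern in KNOWN_PATTERNS.items():
                if matches_segment_pattern(intensities, expected=pattern):
                    return name
            return None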

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The invention relates to a method for determining the position of a predetermined object, in particular a docking station, in an environment in which a mobile device is located, the method comprising: providing a set of points (500) in the environment, the points in particular each being characteristic of a distance between the mobile device and objects in the environment; analyzing the set of points with respect to a predetermined relationship between a plurality of points of the set of points, the predetermined relationship being determined by an outer contour of the predetermined object; if one or more groups (512) of points of the set fulfill the predetermined relationship, determining this group of points as a subset of points; and, based on the subset of points, determining the position of the predetermined object.
PCT/EP2023/068162 2022-07-08 2023-07-03 Procédé de détermination de position d'un objet prédéterminé WO2024008615A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
DE102022206983.8 2022-07-08
DE102022206983 2022-07-08
DE102023206256.9A DE102023206256A1 (de) 2022-07-08 2023-07-03 Verfahren zum Bestimmen einer Position eines vorbestimmten Objekts
DE102023206256.9 2023-07-03

Publications (1)

Publication Number Publication Date
WO2024008615A1 true WO2024008615A1 (fr) 2024-01-11

Family

ID=87157916

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/068162 WO2024008615A1 (fr) 2022-07-08 2023-07-03 Procédé de détermination de position d'un objet prédéterminé

Country Status (1)

Country Link
WO (1) WO2024008615A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018101631A2 (fr) * 2016-11-30 2018-06-07 (주)유진로봇 Aspirateur robotique, appareil de commande de fonction de nettoyage équipé d'un aspirateur robotique, et appareil de détection d'obstacle basé sur un lidar à canaux multiples équipé d'un aspirateur robotique
US20210146552A1 (en) * 2019-11-20 2021-05-20 Samsung Electronics Co., Ltd. Mobile robot device and method for controlling mobile robot device
US11348269B1 (en) * 2017-07-27 2022-05-31 AI Incorporated Method and apparatus for combining data to construct a floor plan

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018101631A2 (fr) * 2016-11-30 2018-06-07 (주)유진로봇 Aspirateur robotique, appareil de commande de fonction de nettoyage équipé d'un aspirateur robotique, et appareil de détection d'obstacle basé sur un lidar à canaux multiples équipé d'un aspirateur robotique
EP4177639A1 (fr) * 2016-11-30 2023-05-10 Yujin Robot Co., Ltd. Aspirateur robotique avec détection d'obstacle par lidar multicanal et détection de glissement
US11348269B1 (en) * 2017-07-27 2022-05-31 AI Incorporated Method and apparatus for combining data to construct a floor plan
US20210146552A1 (en) * 2019-11-20 2021-05-20 Samsung Electronics Co., Ltd. Mobile robot device and method for controlling mobile robot device

Similar Documents

Publication Publication Date Title
EP3415070B1 (fr) Système pourvu d'au moins deux dispositifs de traitement du sol
EP1987371B1 (fr) Procede pour la detection d'objets avec un dispositif pivotant a capteurs
EP2479584B1 (fr) Procédé de détermination de la position d'un appareil automobile
EP2764812B1 (fr) Robot de nettoyage
DE102019114796A1 (de) Führungssystem und -verfahren für autonome fahrzeuge
EP2550227A1 (fr) Procédé pour faire fonctionner un chariot de manutention autonome
DE102012209724B4 (de) Positionsbestimmung von Radfahrzeugen mittels Laserscanner
WO2019015852A1 (fr) Procédé et système de détection d'une zone libre dans un parking
EP3545505A1 (fr) Procédé et système de détection d'un objet saillant se trouvant à l'intérieur d'un parc de stationnement
EP3559773B1 (fr) Procédé de navigation et localisation automatique d'un appareil d'usinage à déplacement autonome
DE102012112035A1 (de) Verfahren zum Betrieb eines Staubsaugerroboters und nach dem Verfahren arbeitender Staubsaugerroboter sowie System mit einem solchen Staubsaugerroboter
EP2385014A1 (fr) Chariot de manutention doté d'un dispositif destiné à l'identification d'un produit de transport chargé et procédé destiné à l'identification d'un produit de transport chargé d'un chariot de manutention
EP3482622B1 (fr) Procédé de guidage automatique d'un véhicule le long d'un système de rails virtuel
DE112013001814T5 (de) Automatische Oberleitungsführung
DE10323643B4 (de) Sensorsystem für ein autonomes Flurförderfahrzeug
WO2024008615A1 (fr) Procédé de détermination de position d'un objet prédéterminé
DE102023206256A1 (de) Verfahren zum Bestimmen einer Position eines vorbestimmten Objekts
DE112020002578T5 (de) Verfahren, System und Vorrichtung zur dynamischen Aufgabensequenzierung
DE102019202558A1 (de) Unbemanntes bodengebundenes transportfahrzeug und verfahren zum transportieren von kabinenmonumenten
EP3894349B1 (fr) Système de chargement de conteneurs et procédé de surveillance du fonctionnement dans ce dernier
DE102016001839B3 (de) Fahrerloses Transportsystem
WO2020001690A1 (fr) Procédé et système de reconnaissance d'obstacles
EP4357808A1 (fr) Procédé de détermination de marqueurs pour une position et/ou une orientation d'un dispositif mobile dans un environnement
EP4227707A1 (fr) Procédé de détermination d'une position et/ou d'une orientation d'un dispositif mobile dans un environnement
DE102021205620A1 (de) Verfahren zum Bestimmen eines Bewegungspfades auf einem Untergrund

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23738665

Country of ref document: EP

Kind code of ref document: A1