EP4128017A1 - Mobile system and method for operating a mobile system - Google Patents

Mobile system and method for operating a mobile system

Info

Publication number
EP4128017A1
Authority
EP
European Patent Office
Prior art keywords
real object
sensor
mobile system
real
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP21711846.2A
Other languages
German (de)
English (en)
Inventor
Maja Sliskovic
Jens Kesselring
Quamrul Hasan Mallik
Ruben KLEIS
Lukas BENZ
Patrick Betz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SEW Eurodrive GmbH and Co KG
Original Assignee
SEW Eurodrive GmbH and Co KG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SEW Eurodrive GmbH and Co KG filed Critical SEW Eurodrive GmbH and Co KG
Publication of EP4128017A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/20 Scenes; Scene-specific elements in augmented reality scenes
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D 1/0248 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

Definitions

  • the invention relates to a mobile system for operation, in particular in a technical installation, which comprises a first sensor for detecting a distance to a real object.
  • the invention also relates to a method for operating a mobile system according to the invention, in particular in a technical installation.
  • the technical system is in particular an industrial application, for example a production plant, an industrial hall or a logistics center.
  • the mobile system of the technical installation is, for example, an autonomously driving vehicle.
  • the mobile system is used, for example, to transport objects within the technical system. There are also other objects in the technical system.
  • the mobile system has a sensor for detecting such objects and distances to such objects.
  • the document DE 10 2019 001 253 A1 discloses a method for operating a technical installation which has at least one mobile system that can be moved on a traffic area of the technical installation.
  • the mobile system detects objects in the technical system with the help of appropriate sensors.
  • the positions of the objects detected in this way are compared with assumed positions of objects according to a map of the technical installation.
  • the document US 2011/0218670 A1 describes a method for object recognition by means of a laser scanner and a camera.
  • the document US 9,102,055 B1 describes a method and a system for recognizing an environment.
  • the document US 9,938,092 B2 describes a device for unloading containers which has a camera.
  • the document US 2019/0213438 A1 describes a mobile cleaning robot which has a camera.
  • Document DE 102018205964 A1 describes a method and a control device for navigating an autonomous vehicle using a camera.
  • the invention is based on the object of developing a mobile system and a method for operating a mobile system, in particular in a technical installation.
  • the object is achieved by a mobile system with the features specified in claim 1.
  • Advantageous refinements and developments are the subject of the subclaims.
  • the object is also achieved by a method for operating a mobile system with the features specified in claim 6.
  • Advantageous refinements and developments are the subject of the subclaims.
  • the mobile system has a second sensor for taking an image, and the mobile system has a recognition unit for recognizing a real object in an image taken by the second sensor, and the mobile system has a classification unit for classifying an object type of a real object recognized by the recognition unit, and the mobile system has a linking unit for generating a virtual object by linking an object type classified by the classification unit with a distance detected by the first sensor.
  • the generated virtual object includes information about the object type, the position and the orientation of the real object and geometric properties of the real object.
  • the interaction of the two different sensors significantly improves the classification and localization of real objects. For example, it is possible to distinguish whether a supposed object is just an image or a real object. Linking the data reduces classification errors in the object recognition.
  • the object type of the real objects in the environment of a mobile system provides information that is important for further software components in the mobile system, for example for planning a route.
  • the first sensor is designed as a laser scanner.
  • a laser scanner sends out a laser beam, detects the reflected laser beam and uses this to calculate a distance to a real object that reflects the laser beam (see the time-of-flight sketch following this description).
  • a laser scanner serves the mobile system in particular to recognize real objects such as obstacles in the technical system and to record a distance to a recognized real object.
  • a laser scanner is already available in known mobile systems, so there are no additional costs for installing the first sensor.
  • the second sensor is designed as a monocular camera.
  • a monocular camera is relatively inexpensive, robust and reliable.
  • Further sensors, for example in the form of a radar sensor, can thus be replaced as well and are not required.
  • the first sensor is designed as a 3D camera.
  • a 3D camera calculates a distance to a real object that is recognized by the 3D camera.
  • a 3D camera serves the mobile system in particular to recognize real objects such as obstacles in the technical system and to record a distance to a recognized real object.
  • the classification unit comprises a neural network.
  • the classification unit is specifically designed for the classification of real objects in an industrial environment and is specially trained for the classification of such objects.
  • Real objects in an industrial environment are, for example, forklifts, lattice boxes, people and other mobile transport systems.
  • the mobile system is designed as an autonomously driving vehicle which has a drive device, an electrical energy store for supplying the drive device and a control unit for controlling the drive device.
  • the drive device includes, for example, an electric motor, a transmission and drive wheels.
  • the mobile system is, in particular, a driverless transport system for transporting objects within the technical system.
  • the mobile system has a position sensor for detecting a position of the mobile system, in particular within the technical installation.
  • Said sensor is, for example, a GPS receiver or a SLAM system.
  • a distance to a real object is detected by means of the first sensor, and an image is recorded by means of the second sensor, and a real object is recognized in the recorded image by means of the recognition unit, and an object type of the recognized real object is classified by means of the classification unit, and a virtual object is generated by means of the linking unit by linking the classified object type with the detected distance.
  • the method according to the invention makes it possible not only to recognize the visible geometry of a real object, but also to infer the actual complete geometry of the real object via the object recognition.
  • the method according to the invention significantly improves the classification and localization of real objects through the interaction of the two different sensors. For example, it is possible to distinguish whether a supposed object is just an image or a real object. Linking the data reduces classification errors in the object recognition.
  • the object type of the real objects in the environment of a mobile system provides information that is important for further software components in the mobile system, for example for planning a route.
  • an image is recorded by means of the second sensor in a first step.
  • a real object is recognized in the recorded image by means of the recognition unit.
  • an angular range is determined within the recorded image in which the recognized real object is located.
  • an object type of the recognized real object is classified by means of the classification unit.
  • a distance to the real object located in the determined angular range is recorded by means of the first sensor.
  • the virtual object is generated by means of the linking unit. After an image of the real object has been recorded by the second sensor, the distance to this real object is detected by the first sensor, and the virtual object is generated from this information (see the linking sketch following this description).
  • visible edges of the real object located in the determined angular range are segmented in the fifth step, and the segmented visible edges are compared with geometric properties of the classified object type in the sixth step.
  • the visible geometry of a real object that is detected by a first sensor designed as a laser scanner usually consists only of edges. Edges are also present in the image recorded by the second sensor. The visible edges detected by the first sensor can be assigned to the edges present in the recorded image. The information about the object type allows conclusions to be drawn about the geometric properties of the real object (see the edge-matching sketch following this description).
  • a distance to a real object is recorded in a first step by means of the first sensor.
  • an angular range is determined in which the detected real object is located.
  • the second sensor records an image that extends at least over the determined angular range.
  • the recognition unit recognizes a real object in the determined angular range within the recorded image.
  • an object type of the recognized real object is classified by means of the classification unit.
  • the virtual object is generated by means of the linking unit. After a real object and the distance to this object have been detected by the first sensor, an image of this real object is recorded by the second sensor, and the virtual object is generated from this information.
  • visible edges of the real object located in the determined angular range are segmented in the second step, and the segmented visible edges are compared with geometric properties of the classified object type in the sixth step.
  • the visible geometry of a real object that is detected by a first sensor designed as a laser scanner usually consists only of edges. Edges are also present in the image recorded by the second sensor. The visible edges detected by the first sensor can be assigned to the edges present in the recorded image. The information about the object type allows conclusions to be drawn about the geometric properties of the real object.
  • said steps are repeated at defined time intervals until the virtual object is generated. Changes in the recorded distance to the real object are registered, and changes in the angular range in which the real object is located are registered. A movement of the virtual object is recognized from the registered changes (see the movement-estimation sketch following this description). By detecting a movement, it is possible, in particular, to distinguish between static real objects and dynamic real objects. Furthermore, the precise determination of a pose of the real object is made possible.
  • a speed of the virtual object is preferably calculated from the registered changes.
  • the speed of the virtual object corresponds to a speed of the real object.
  • a direction of movement of the virtual object is preferably calculated from the registered changes.
  • the direction of movement of the virtual object corresponds to a direction of movement of the real object.
  • a local map is generated which has at least one generated virtual object.
  • the local map forms the basis for the autonomous driving of the mobile system.
  • a calculated speed of the at least one virtual object and/or a calculated direction of movement of the at least one virtual object is preferably entered in the local map.
  • a reaction of the mobile system is triggered as a function of the distance to the real object and/or as a function of the object type of the real object (see the reaction sketch following this description). If, for example, a person is recognized at a relatively short distance, the mobile system is braked immediately. If, for example, a lattice box is detected at a relatively great distance, a steering movement is initiated to drive around the lattice box, possibly at a reduced speed.
  • Figure 1 shows a schematic representation of a mobile system in a technical installation.
  • FIG. 1 shows a schematic representation of a mobile system 10 in a technical installation.
  • the mobile system 10 is designed as an autonomously driving vehicle and has a drive device, an electrical energy store for supplying the drive device and a control unit for controlling the drive device.
  • the mobile system 10 also has a position sensor for detecting a position of the mobile system 10 within the technical installation.
  • the technical system has further mobile systems 10, not shown here, which are designed in the same way.
  • the mobile system 10 comprises a first sensor 1, which is designed as a laser scanner, and a second sensor 2, which is designed as a monocular camera.
  • the first sensor 1 is used to detect real objects 11, 12, 13 and to detect a distance to a real object 11, 12, 13.
  • the second sensor 2 is used to record images on which, in particular, real objects 11, 12, 13 are shown.
  • the mobile system 10 has a recognition unit for recognizing a real object 11, 12, 13 in an image recorded by the second sensor 2.
  • the mobile system 10 also has a classification unit for classifying an object type of a real object 11, 12, 13 recognized by the recognition unit.
  • the classification unit comprises a neural network.
  • the mobile system 10 also has a linking unit for generating a virtual object.
  • a virtual object is generated by linking an object type of a real object 11, 12, 13 classified by the classification unit with a distance to the real object 11, 12, 13 detected by the first sensor 1.
  • the recognition unit, the classification unit and the linking unit are parts of a processing unit 25.
  • a first real object 11, a second real object 12 and a third real object 13 are present in the technical installation in a visual area 20 of the mobile system 10.
  • the first real object 11 is a forklift in the present case.
  • the second real object 12 is a person.
  • the third real object 13 is a mobile transport system.
  • the first sensor 1 detects and segments a visible edge 31 of the first real object 11 and a distance to the first real object 11 in a specific angular range.
  • the position of the first real object 11 is determined from the own position of the mobile system 10 detected by means of the position sensor, the detected distance and the angular range.
  • the second sensor 2 records an image that extends beyond the specific angular range.
  • the recognition unit recognizes the first real object 11 in the recorded image.
  • the classification unit classifies an object type of the recognized first real object 11.
  • the object type includes further information, for example geometric properties, and in particular a floor plan 35.
  • the object type of the first real object 11 is a forklift truck.
  • the floor plan 35 of this type of object is a rectangle in the present case.
  • the segmented visible edge 31 of the first real object 11 is compared with geometric properties of the classified object type, in particular with edges that are present in the image recorded by the second sensor 2.
  • the visible edge 31 detected by the first sensor 1 can be assigned to an edge present in the recorded image.
  • the information about the object type allows conclusions to be drawn about geometric properties of the first real object 11, in particular about the floor plan 35.
  • the linking unit then generates a virtual object by linking the classified object type of the first real object 11, in this case a forklift truck, with the detected distance to the first real object 11.
  • the generated virtual object thus includes information about the object type, the position and the orientation of the first real object 11 and geometric properties, in particular the floor plan 35, of the first real object 11.
  • the first sensor 1 likewise detects and segments a visible edge 31 of the second real object 12 and a distance to the second real object 12 in a specific angular range. The position of the second real object 12 is determined from the own position of the mobile system 10 detected by means of the position sensor, the detected distance and the angular range.
  • the second sensor 2 records an image that extends beyond the specific angular range.
  • the recognition unit recognizes the second real object 12 in the recorded image, and the classification unit classifies an object type of the recognized second real object 12.
  • the object type of the second real object 12 is a person.
  • the floor plan 35 of this type of object is in the present case a circle or an oval.
  • the segmented visible edge 31 of the second real object 12 is compared with geometric properties of the classified object type, in particular with edges that are present in the image recorded by the second sensor 2.
  • the linking unit then generates a virtual object by linking the classified object type of the second real object 12, in the present case a person, with the detected distance to the second real object 12.
  • the generated virtual object thus includes information about the object type, the position and the orientation of the second real object 12 and geometric properties, in particular the floor plan 35, of the second real object 12.
  • the first sensor 1 also detects and segments two visible edges 31 of the third real object 13 and a distance to the third real object 13 in a specific angular range.
  • the position of the third real object 13 is determined from the own position of the mobile system 10 detected by means of the position sensor, the detected distance and the angular range.
  • the second sensor 2 records an image that extends beyond the specific angular range.
  • the recognition unit recognizes the third real object 13 in the recorded image, and the classification unit classifies an object type of the recognized third real object 13.
  • the object type of the third real object 13 is a mobile transport system.
  • the floor plan 35 of this type of object is a rectangle in the present case.
  • the segmented visible edge 31 of the third real object 13 is compared with geometric properties of the classified object type, in particular with edges that are present in the image recorded by the second sensor 2.
  • the linking unit then generates a virtual object by linking the classified object type of the third real object 13, in the present case a mobile transport system, with the detected distance to the third real object 13.
  • the generated virtual object thus includes information about the object type, the position and the orientation of the third real object 13 and geometric properties, in particular the floor plan 35, of the third real object 13.
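The time-of-flight relation behind the laser scanner described above can be stated compactly: the distance follows from half the round-trip travel time of the reflected laser beam. A minimal sketch in Python, where the function name and the example value are illustrative assumptions and not taken from the patent:

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_time_of_flight(round_trip_time_s: float) -> float:
    """Distance to the reflecting real object: the laser pulse travels there and back."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# Example: a reflection received 66.7 ns after emission corresponds to roughly 10 m.
print(distance_from_time_of_flight(66.7e-9))
```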
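As a minimal sketch of the linking sequence described above (record an image, recognize a real object, determine its angular range, classify the object type, detect the distance in that angular range, generate the virtual object), the following Python fragment shows one possible way the linking unit could combine these pieces of information. All names (VirtualObject, angular_range_of, link), the field-of-view value and the example numbers are assumptions of this illustration, not identifiers or values from the patent:

```python
import math
from dataclasses import dataclass

@dataclass
class VirtualObject:
    object_type: str       # e.g. "forklift", "person", "mobile transport system"
    position: tuple        # (x, y) in the coordinate frame of the technical installation
    orientation: float     # heading in radians (in a full system derived from edge matching)
    footprint: str         # geometric property of the object type, e.g. "rectangle"

def angular_range_of(bbox_x, image_width, horizontal_fov):
    """Map the horizontal pixel extent of a recognized object to an angular range."""
    x_min, x_max = bbox_x
    to_angle = lambda x: (x / image_width - 0.5) * horizontal_fov
    return to_angle(x_min), to_angle(x_max)

def link(object_type, footprint, distance, angle, own_pose):
    """Linking unit: combine the classified object type with the detected distance."""
    own_x, own_y, own_heading = own_pose
    bearing = own_heading + angle
    position = (own_x + distance * math.cos(bearing),
                own_y + distance * math.sin(bearing))
    return VirtualObject(object_type, position, bearing, footprint)

# Illustrative use: a "forklift" recognized between pixels 400 and 650 in a 1280 px image
# taken by a camera with a 90 degree field of view; the laser scanner reports 7.5 m
# within that angular range; the mobile system is at (2.0, 3.0) with heading 0.
lo_angle, hi_angle = angular_range_of((400, 650), image_width=1280,
                                      horizontal_fov=math.radians(90))
virtual_object = link("forklift", "rectangle", distance=7.5,
                      angle=(lo_angle + hi_angle) / 2, own_pose=(2.0, 3.0, 0.0))
print(virtual_object)
```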
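The comparison of segmented visible edges with geometric properties of the classified object type can be illustrated with a small geometric edge-matching sketch: from one visible edge and the footprint size known for the object type, a coarse pose of a rectangular footprint is inferred. The function name, the assumption that the visible edge is a side of the rectangle facing the sensor, and the example values are all assumptions of this illustration:

```python
import math

def pose_from_edge(edge_points, footprint_size):
    """Infer a coarse pose of a rectangular footprint from one segmented visible edge.

    edge_points: the two endpoints (x, y) of the edge segmented from the laser scan.
    footprint_size: (length, width) known from the classified object type.
    Returns (centre_x, centre_y, orientation_rad), assuming the visible edge is the
    long side of the rectangle facing the sensor located at the origin.
    """
    (x0, y0), (x1, y1) = edge_points
    orientation = math.atan2(y1 - y0, x1 - x0)               # direction of the visible edge
    _, width = footprint_size
    mid_x, mid_y = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    nx, ny = -math.sin(orientation), math.cos(orientation)   # normal of the edge
    if nx * mid_x + ny * mid_y < 0.0:                        # let the normal point away from the sensor
        nx, ny = -nx, -ny
    return mid_x + nx * width / 2.0, mid_y + ny * width / 2.0, orientation

# Example: visible edge of a forklift (footprint 2.5 m x 1.2 m) seen from the origin.
print(pose_from_edge(((6.0, 1.0), (8.2, 1.6)), footprint_size=(2.5, 1.2)))
```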
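A minimal movement-estimation sketch of how the changes registered between two measurements (distance and angular range, taken at a defined time interval) could yield the speed and direction of movement of the virtual object; the polar-to-Cartesian conversion and the names used are assumptions of this illustration:

```python
import math

def to_cartesian(distance, angle):
    """Convert a (distance, angle) measurement into x/y coordinates in the sensor frame."""
    return distance * math.cos(angle), distance * math.sin(angle)

def motion_from_changes(previous, current, dt):
    """Estimate speed and direction of movement from two registered measurements.

    previous / current: (distance_m, angle_rad); dt: time interval in seconds.
    A speed close to zero indicates a static real object, otherwise a dynamic one.
    """
    x0, y0 = to_cartesian(*previous)
    x1, y1 = to_cartesian(*current)
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return math.hypot(vx, vy), math.atan2(vy, vx)

# Example: the object moved from 8.0 m at 10 degrees to 7.6 m at 12 degrees within 0.5 s.
speed, direction = motion_from_changes((8.0, math.radians(10)),
                                       (7.6, math.radians(12)), dt=0.5)
print(round(speed, 2), "m/s at", round(math.degrees(direction), 1), "degrees")
```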
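Finally, a minimal reaction sketch of how the generated virtual objects could be entered in a local map and how a reaction could be triggered as a function of distance and object type. The thresholds, type names and the react function are purely illustrative assumptions, not values from the patent:

```python
import math

def react(object_type, distance_m):
    """Choose a reaction based on object type and distance (illustrative thresholds)."""
    if object_type == "person" and distance_m < 3.0:
        return "brake immediately"
    if object_type == "lattice box" and distance_m > 5.0:
        return "steer around the obstacle, reduce speed if necessary"
    return "continue on the planned route"

# Local map: generated virtual objects together with speed and direction of movement.
local_map = [
    {"type": "forklift", "position": (9.3, 3.0), "speed_m_s": 0.97, "direction_deg": 156.7},
    {"type": "person", "position": (2.1, 1.2), "speed_m_s": 0.0, "direction_deg": 0.0},
]

for entry in local_map:
    x, y = entry["position"]
    print(entry["type"], "->", react(entry["type"], math.hypot(x, y)))
```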

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Optics & Photonics (AREA)
  • Automation & Control Theory (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Image Analysis (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to a mobile system (10) for operation, in particular in a technical installation, comprising a first sensor (1) for detecting a distance to a real object (11, 12, 13), wherein the mobile system (10) has a second sensor (2) for recording an image, and the mobile system (10) has a recognition unit for recognizing a real object (11, 12, 13) in an image recorded by the second sensor (2), and the mobile system (10) has a classification unit for classifying an object type of a real object (11, 12, 13) recognized by the recognition unit, and the mobile system (10) has a linking unit for generating a virtual object by linking an object type classified by the classification unit with a distance detected by the first sensor (1). The invention also relates to a method for operating a mobile system (10) according to the invention, in particular in a technical installation, wherein a distance to a real object (11, 12, 13) is detected by means of the first sensor (1), and an image is recorded by means of the second sensor (2), and a real object (11, 12, 13) is recognized in the recorded image by means of the recognition unit, and an object type of the recognized real object (11, 12, 13) is classified by means of the classification unit, and a virtual object is generated by means of the linking unit by linking the classified object type with the detected distance.
EP21711846.2A 2020-04-01 2021-03-10 Mobile system and method for operating a mobile system Pending EP4128017A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102020002043 2020-04-01
PCT/EP2021/056116 WO2021197785A1 (fr) 2020-04-01 2021-03-10 Mobile system and method for operating a mobile system

Publications (1)

Publication Number Publication Date
EP4128017A1 (fr) 2023-02-08

Family

ID=74873755

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21711846.2A Pending EP4128017A1 (fr) Mobile system and method for operating a mobile system 2020-04-01 2021-03-10

Country Status (3)

Country Link
EP (1) EP4128017A1 (fr)
DE (1) DE102021001282A1 (fr)
WO (1) WO2021197785A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102023002751A1 2022-08-03 2024-02-08 Sew-Eurodrive Gmbh & Co Kg Method for operating a technical installation
DE102023003260A1 2022-09-13 2024-03-14 Sew-Eurodrive Gmbh & Co Kg Method for detecting an object by a mobile system

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8538577B2 (en) 2010-03-05 2013-09-17 Crown Equipment Limited Method and apparatus for sensing object load engagement, transportation and disengagement by automated vehicles
US9367770B2 (en) * 2011-08-30 2016-06-14 Digimarc Corporation Methods and arrangements for identifying objects
US9227323B1 (en) 2013-03-15 2016-01-05 Google Inc. Methods and systems for recognizing machine-readable information on three-dimensional objects
WO2016054656A1 (fr) 2014-10-03 2016-04-07 Wynright Corporation Perception-based robotic manipulation system and method for an automated truck unloader that unloads/unpacks product from trailers and containers
WO2018192683A1 (fr) 2017-04-21 2018-10-25 Sew-Eurodrive Gmbh & Co. Kg Method for detecting moving objects in an installation and/or for collision avoidance in an installation, and installation for carrying out such a method
DE102018009114A1 (de) 2017-12-21 2019-06-27 Sew-Eurodrive Gmbh & Co Kg Method for determining the position of a mobile part movable on a movement surface, and installation with a mobile part for carrying out the method
US10878294B2 (en) 2018-01-05 2020-12-29 Irobot Corporation Mobile cleaning robot artificial intelligence for situational awareness
DE102019000903A1 (de) 2018-02-21 2019-08-22 Sew-Eurodrive Gmbh & Co Kg System comprising an installation, objects and mobile parts, and method for operating a system
DE102019001253A1 (de) 2018-03-08 2019-09-12 Sew-Eurodrive Gmbh & Co Kg Method for operating an installation
DE102018205964A1 (de) 2018-04-19 2019-10-24 Zf Friedrichshafen Ag Method and control unit for navigating an autonomous industrial truck
US10491885B1 (en) * 2018-06-13 2019-11-26 Luminar Technologies, Inc. Post-processing by lidar system guided by camera information

Also Published As

Publication number Publication date
DE102021001282A1 (de) 2021-10-07
WO2021197785A1 (fr) 2021-10-07

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20221102

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR