WO2024056205A1 - Method for detecting an object using a mobile system

Method for detecting an object using a mobile system

Info

Publication number
WO2024056205A1
WO2024056205A1 (PCT/EP2023/025369)
Authority
WO
WIPO (PCT)
Prior art keywords
image
sensor
mobile system
detected
assigned
Prior art date
2022-09-13
Application number
PCT/EP2023/025369
Other languages
German (de)
English (en)
Inventor
Quamrul Hasan Mallik
Patrick Betz
Maja Sliskovic
Original Assignee
Sew-Eurodrive Gmbh & Co. Kg
Priority date
2022-09-13
Filing date
2023-08-07
Publication date
2024-03-21
Application filed by Sew-Eurodrive Gmbh & Co. Kg filed Critical Sew-Eurodrive Gmbh & Co. Kg
Publication of WO2024056205A1 publication Critical patent/WO2024056205A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

Definitions

  • the invention relates to a method for detecting an object by a mobile system, in particular in a technical system, wherein the mobile system has at least one first sensor and at least one second sensor.
  • the technical system is in particular an industrial application, for example a production plant, an industrial hall or a logistics center.
  • the mobile system is, for example, an autonomous vehicle.
  • the mobile system is used, for example, to transport objects within the technical system. There are also other objects in the technical system.
  • the mobile system has a sensor, in particular a laser scanner, for detecting such objects and distances to such objects.
  • the mobile system has a first sensor, which is designed as a laser scanner, and a second sensor, which is designed as a monocular camera. By linking the data recorded by the two sensors, objects in the technical system are recognized.
  • Objects can be detected from images recorded by a camera.
  • Image pyramids are usually created and used for object detection.
  • An image pyramid is assigned to an object and has multiple scaling levels. Each scaling level includes an object image of a different size, so each scaling level shows the object at a different distance. To find the object in an image, all scaling levels of the pyramid must be processed and the image must be compared with the object image of each scaling level. This comparison with a large number of object images is computationally intensive.
  • the invention is based on the object of developing a method for detecting an object using a mobile system.
  • a method for detecting an object by a mobile system, in particular in a technical system, is proposed.
  • the mobile system has at least one first sensor, at least one second sensor and at least one image pyramid.
  • the at least one image pyramid is assigned to an object and contains several scaling levels, each of which includes an object image.
  • by means of the first sensor, several points of an object are detected, distances to the detected points are determined, and a resulting distance to the detected points is determined.
  • by means of the second sensor, an image containing the object is recorded, and an image of the object is recognized in the recorded image.
  • At least one scaling level to which the resulting distance is assigned is selected from the at least one image pyramid.
  • the object image of the at least one selected scaling level is compared with the image recognized in the captured image. If the object image matches the image, the object in the image is detected.
  • An object to which an image pyramid is assigned is, for example, a pallet, a forklift or a mesh box.
  • the first sensor is designed as a laser scanner.
  • a laser scanner sends out a laser beam, detects a reflected laser beam and uses this to calculate a distance to a point on an object that reflects the laser beam.
  • Laser scanners are already present in known mobile systems, so no additional costs arise for installing the first sensor.
  • the second sensor is designed as a monocular camera.
  • a monocular camera is relatively inexpensive, robust and reliable.
  • the resulting distance to the detected points is determined as the arithmetic mean of the determined distances to the detected points.
  • the resulting distance therefore corresponds to a distance from the first sensor to a point in the middle of the object.
  • the resulting distance to the detected points is determined as the smallest of the determined distances to the detected points.
  • the resulting distance therefore corresponds to a distance between the first sensor and a side of the object facing the mobile system.
  • the scaling levels of the at least one image pyramid are discrete and each include an object image at exactly one resulting distance. According to an advantageous development of the invention, a range of resulting distances is assigned to each scaling level of the at least one image pyramid. This means that resulting distances that lie between two discrete scaling levels can also be processed.
  • the ranges of resulting distances which are assigned to adjacent scaling levels overlap. This allows minor errors and distortions to be compensated for when the image is captured.
  • the ranges of resulting distances which are assigned to adjacent scaling levels are separated from one another. This means that each resulting distance is assigned to exactly one scaling level. This further reduces the computing intensity for comparing the image with object images.
  • the mobile system has several image pyramids, each of which is assigned to an object. A hypothetical object is proposed from the detected points of the object and the distances to the detected points, and the image pyramid that is assigned to the hypothetical object is selected. At least one scaling level to which the resulting distance is assigned is then selected from the selected image pyramid.
  • Objects to which an image pyramid is assigned are, for example, a pallet, a forklift or a mesh box. It is therefore possible to detect different types of objects. By proposing the hypothetical object, a pre-selection is made, so a comparison with all image pyramids is not necessary. This reduces the computational intensity of comparing the image with object images.
  • the mobile system is designed as an autonomously driving vehicle which has a drive device, an electrical energy storage device for supplying the drive device and a control unit for controlling the drive device.
  • the drive device includes, for example, an electric motor, a gearbox and drive wheels.
  • the mobile system is in particular a driverless transport system for transporting objects within the technical system.
  • the mobile system has a position sensor for detecting a position of the mobile system, particularly within the technical system.
  • the position sensor is, for example, a GPS receiver or a SLAM system. Detecting the mobile system's own position, in particular within the technical system, makes it possible to create a local map, which forms the basis for the autonomous driving of the mobile system.
  • a local map is created which has at least one detected object.
  • the local map forms the basis for the autonomous driving of the mobile system.
  • Figure 1 a schematic side view of a mobile system and an object in a technical system
  • Figure 2 a schematic top view of the mobile system and the object in the technical system
  • Figure 3 an image with an image of the object
  • Figure 4 the detected points of the object
  • Figure 5 an overlay of the image with the detected points.
  • Figure 1 shows a schematic side view of a mobile system 10 and an object 11 in a technical system.
  • the object 11 in this case is a pallet.
  • the technical system has additional objects 11 not shown here, for example additional pallets, forklifts and mesh boxes.
  • the technical system also has other mobile systems 10, not shown here, which are designed in the same way.
  • the mobile system 10 is designed as an autonomously driving vehicle and has a drive device, an electrical energy storage device for supplying the drive device and a control unit for controlling the drive device.
  • the mobile system 10 also has a position sensor for detecting a position of the mobile system 10 within the technical system.
  • the mobile system 10 also includes a communication device for wireless communication over a network.
  • the mobile system 10 in the present case has two first sensors 1, which are designed as laser scanners. Each of the first sensors 1 is used to detect objects 11 and to detect distances to detected objects 11. The first sensor 1 detects several points P of the object 11 and determines the distances to the detected points P.
  • the mobile system 10 has a second sensor 2, which is designed as a monocular camera. The second sensor 2 is used to record images on which objects 11 in particular are depicted.
  • the mobile system 10 is located on a level floor in the technical system.
  • the first sensors 1, designed as laser scanners, are arranged on the mobile system 10 in such a way that the scanning planes of the first sensors 1 are aligned parallel to the floor.
  • the second sensor 2, designed as a monocular camera, is arranged on the mobile system 10 in such a way that the optical axis of the second sensor 2 is aligned parallel to the floor.
  • Figure 2 shows a schematic top view of the mobile system 10 and the object 11 of Figure 1 in the technical system.
  • the mobile system 10 has an approximately rectangular cross section.
  • the first sensors 1, designed as laser scanners, are arranged at opposite corners of the mobile system 10.
  • the second sensor 2, designed as a monocular camera, is arranged on a front side of the mobile system 10.
  • the geometric arrangement of the first sensors 1 relative to the second sensor 2 on the mobile system 10 can be detected by calibration and is therefore known.
  • the geometric arrangement in question is constant and is therefore not changed dynamically.
  • a transformation of detected points P and of the distances to the detected points P into a recorded image can therefore be carried out unambiguously.
  • the mobile system 10 also has several image pyramids. Each image pyramid is assigned to an object; such objects are, for example, a pallet, a forklift or a mesh box. Each image pyramid contains multiple scaling levels, each of which includes an object image. The object images of the scaling levels have different sizes, so each scaling level shows the object at a different distance.
  • An image is recorded by means of the second sensor 2; the recorded image contains the object 11.
  • An image A of the object 11 is recognized in the captured image.
  • Figure 3 shows an image with the image A of the object 11 shown in Figure 1.
  • the object 11 is, as already mentioned, a pallet.
  • the image A corresponds to a perspective representation of the object 11.
  • Figure 4 shows the detected points P of the object 11 shown in Figure 1. From the distances of the mobile system 10 to the detected points P, a resulting distance of the mobile system 10 to the detected points P is determined.
  • the resulting distance to the detected points P is determined, for example, as the arithmetic mean of the determined distances to the detected points P. Alternatively, the resulting distance to the detected points P is determined as the smallest of the determined distances to the detected points P.
  • a hypothetical object is proposed from the detected points P of the object 11 and the distances of the mobile system 10 to the detected points P. Based on the position and orientation of the points P, a pallet is proposed as a hypothetical object.
  • the image pyramid that is assigned to the hypothetical object is now selected.
  • a scaling level is then selected from the selected image pyramid, to which the previously determined resulting distance is assigned.
  • the object image of the selected scaling level is then compared with the image A recognized in the captured image. If the object image matches the image A, the object, in this case the pallet, is detected in the image; an illustrative code sketch of this procedure follows below.
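
The following sketch is not part of the patent description; it is a minimal Python illustration, under assumed names and data structures (ScalingLevel, ImagePyramid, resulting_distance, matches, detect, threshold) and a deliberately simple correlation-based comparison, of how the described steps could fit together: one image pyramid per object type with a distance range per scaling level, a resulting distance derived from the laser-scanner points, selection of the scaling level(s) assigned to that distance, and comparison of their object images with the image recognized in the captured picture.

```python
from dataclasses import dataclass
from typing import Dict, List, Sequence

import numpy as np


@dataclass
class ScalingLevel:
    object_image: np.ndarray  # template showing the object at one apparent size
    min_distance: float       # range of resulting distances assigned to this level
    max_distance: float


@dataclass
class ImagePyramid:
    object_name: str          # e.g. "pallet", "forklift", "mesh box"
    levels: List[ScalingLevel]

    def levels_for_distance(self, distance: float) -> List[ScalingLevel]:
        """Select the scaling level(s) whose assigned distance range contains
        `distance`; with overlapping ranges several levels may be returned,
        with disjoint ranges exactly one."""
        return [lv for lv in self.levels
                if lv.min_distance <= distance <= lv.max_distance]


def resulting_distance(point_distances: Sequence[float], mode: str = "mean") -> float:
    """Resulting distance to the points detected by the laser scanner:
    arithmetic mean (roughly the middle of the object) or minimum
    (the side of the object facing the mobile system)."""
    if mode == "mean":
        return float(np.mean(point_distances))
    return float(np.min(point_distances))


def matches(object_image: np.ndarray, recognized_image: np.ndarray,
            threshold: float = 0.8) -> bool:
    """Crude similarity test: normalized correlation of two equally sized
    grayscale patches. A production system would use a more robust template
    or feature comparison."""
    if object_image.shape != recognized_image.shape:
        return False
    a = object_image.astype(float) - object_image.mean()
    b = recognized_image.astype(float) - recognized_image.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return False
    return float(np.dot(a.ravel(), b.ravel())) / denom >= threshold


def detect(pyramids: Dict[str, ImagePyramid], hypothetical_object: str,
           point_distances: Sequence[float],
           recognized_image: np.ndarray) -> bool:
    """End-to-end step: take the pyramid of the object proposed from the
    scanner points, select the scaling level(s) assigned to the resulting
    distance, and compare their object images with the image recognized in
    the captured picture."""
    pyramid = pyramids[hypothetical_object]
    distance = resulting_distance(point_distances, mode="mean")
    return any(matches(level.object_image, recognized_image)
               for level in pyramid.levels_for_distance(distance))
```

As a usage example under the same assumptions: if the laser scanner returns distances of 2.1 m, 2.3 m and 2.5 m to the points P of a proposed pallet, the resulting distance (arithmetic mean) is 2.3 m, so only the scaling level(s) whose assigned distance range contains 2.3 m are compared with the image A; with disjoint ranges this is exactly one level, with overlapping ranges it may be several.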

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to a method for detecting an object by means of a mobile system (10), in particular in a technical system, wherein the mobile system (10) has at least one first sensor (1), at least one second sensor (2) and at least one image pyramid which is assigned to an object and contains several scaling levels, each of which includes an object image. By means of the first sensor (1), several points (P) of an object (11) are detected, distances to the detected points (P) are determined, and a resulting distance to the detected points (P) is determined; by means of the second sensor (2), an image containing the object (11) is recorded, and an image (A) of the object (11) is recognized in the recorded image; at least one scaling level to which the resulting distance is assigned is selected from the at least one image pyramid; the object image of the at least one selected scaling level is compared with the image (A) recognized in the recorded image; and the object is detected in the image if the object image matches the image (A).
PCT/EP2023/025369 2022-09-13 2023-08-07 Method for detecting an object using a mobile system WO2024056205A1

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102022003372 2022-09-13
DE102022003372.0 2022-09-13

Publications (1)

Publication Number Publication Date
WO2024056205A1 2024-03-21

Family

ID=87571777

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/025369 WO2024056205A1 2022-09-13 2023-08-07 Method for detecting an object using a mobile system

Country Status (2)

Country Link
DE (1) DE102023003260A1
WO (1) WO2024056205A1

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007072665A (ja) * 2005-09-06 2007-03-22 Fujitsu Ten Ltd 物体判別装置、物体判別方法および物体判別プログラム
US20080164985A1 (en) * 2007-01-10 2008-07-10 Omron Corporation Detection device, method and program thereof
US20130073194A1 (en) * 2011-09-15 2013-03-21 Clarion Co., Ltd. Vehicle systems, devices, and methods for recognizing external worlds
DE102021001282A1 (de) 2020-04-01 2021-10-07 Sew-Eurodrive Gmbh & Co Kg Mobiles System und Verfahren zum Betreiben eines mobilen Systems

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
X. Zhao, P. Sun, Z. Xu, H. Min, H. Yu: "LIDAR and Camera Data for Object Detection in Autonomous Vehicle Applications", IEEE Sensors Journal, vol. 20, no. 9, 1 May 2020 (2020-05-01), pages 4901-4913, XP011781784, DOI: 10.1109/JSEN.2020.2966034

Also Published As

Publication number Publication date
DE102023003260A1 (de) 2024-03-14

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23754704

Country of ref document: EP

Kind code of ref document: A1