WO2020104263A1 - Method, control device, system and computer program for determining characteristic object points of a target object, and method for the camera-assisted positioning of a motor vehicle - Google Patents
- Publication number
- WO2020104263A1 (PCT/EP2019/081116)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- target object
- vehicle
- positioning
- images
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
Definitions
- the invention relates to a method for determining characteristic object points of a target object that are suitable for the camera-assisted positioning of a motor-driven vehicle relative to the target object, to a system for determining characteristic object points of a target object, to a control device for the system, to a method for the camera-assisted positioning of a motor vehicle, and to a computer program with program code for performing the steps of one of these methods.
- the object is achieved by a method for determining characteristic object points of a target object that are suitable for the camera-assisted positioning of a motor-driven vehicle relative to the target object, with the following steps:
- the processing unit determining characteristic object points of the target object in the images of the image data set;
- the object is further achieved by a method for camera-assisted positioning of a motor-driven vehicle in relation to a target object, the characteristic object points of which were determined by means of the method according to the invention, with the following steps:
- the object is also achieved by a control device for a system for determining characteristic object points of a target object and/or for the camera-assisted positioning of a motor-driven vehicle relative to a target object, the control device being designed to carry out a method according to the invention.
- the object is achieved by a system for determining characteristic object points of a target object and/or for the camera-assisted positioning of a motor-driven vehicle relative to a target object, comprising a vehicle, at least one camera attached to the vehicle and a control device according to the invention.
- the at least one camera can be provided at the rear of the vehicle in order to enable rearward positioning.
- the object is further achieved by a computer program with program code adapted to carry out the steps of a method according to the invention when the computer program is executed on a computer or a corresponding computing unit, in particular on a computing unit of a control device according to the invention of a system according to the invention.
- the computing unit can be a processing unit and / or an evaluation unit.
- the image data contain images that the camera would also record during automatic positioning of the vehicle.
- the characteristic object points can thus be determined on the basis of the same images that will also be available to a system when the vehicle is positioned at least partially autonomously.
- the processing unit can automatically and easily determine characteristic object points that are easy to identify and can be identified in the image at all times.
- Position data of the target object are preferably created, in particular by the processing unit, the position data comprising the position of the characteristic object points on the target object, in particular the relative position of the characteristic object points to one another on the target object.
- the characteristic object points can be easily reproduced using the position data.
- the processing unit is provided with a model, in particular a 3D model, of the target object, the processing unit marking the characteristic object points in the model, as a result of which the characteristic object points are clearly marked.
- the processing unit can determine the two-dimensional coordinates of the characteristic object points in at least one of the images of the image data.
- the processing unit preferably uses the two-dimensional coordinates of the characteristic object points to determine the three-dimensional coordinates of the characteristic object points in the 3D model, as a result of which the characteristic object points are marked precisely in the model.
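The patent does not specify how the two-dimensional coordinates are lifted into the 3D model. One common way to do this, sketched below under assumed conditions (a calibrated pinhole camera with known pose and a locally planar model face; all names and values are illustrative, not taken from the patent), is to cast a ray through the detected pixel and intersect it with the model surface:

```python
import numpy as np

def backproject_to_model_plane(u, v, K, cam_pos, R, plane_point, plane_normal):
    """Cast a ray through pixel (u, v) and intersect it with a planar
    face of the target-object model, yielding a 3D model point.

    K: 3x3 camera intrinsics; R: 3x3 camera-to-world rotation;
    cam_pos: camera centre in world coordinates (all assumed known)."""
    # Ray direction in camera coordinates, then rotated into world coordinates.
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    ray_world = R @ ray_cam
    # Ray/plane intersection: cam_pos + t * ray_world must lie on the plane.
    denom = plane_normal @ ray_world
    if abs(denom) < 1e-9:
        return None  # ray parallel to the model face, no intersection
    t = plane_normal @ (plane_point - cam_pos) / denom
    return cam_pos + t * ray_world
```

A production system would intersect the ray with the full CAD mesh rather than a single plane and would have to handle multiple hits; this sketch only shows the geometric principle.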
- the camera-assisted trial positioning maneuver is carried out virtually by a simulation unit, the simulation unit being provided with a model, in particular a 3D model, of the target object; on the basis of the 3D model of the target object, the simulation unit generates, during the execution of the trial positioning maneuver, two-dimensional images that correspond to the images a camera on the vehicle would take during the maneuver. In this way, development costs can be saved, since no driving maneuvers with real vehicles and target objects have to be carried out.
- the same or a further camera-assisted trial positioning maneuver is carried out by means of a physical camera and the target object or a replica of the target object, the camera being guided during the trial positioning maneuver in such a way that the images it takes correspond to the images of a camera on the vehicle.
- the camera can generate the image data. In this way, dynamics of the trial positioning maneuver can be recorded particularly precisely.
- the camera is attached to the vehicle and the trial positioning maneuver is carried out with the vehicle, the camera in particular being calibrated with respect to its position and orientation relative to the vehicle. This makes it easy to deduce the position of the vehicle from the camera images.
- the image data can have time stamps of the images which indicate the temporal position of the respective image in the trial positioning maneuver, the time stamps being taken into account when determining the characteristic object points of the target object.
- the determination of the characteristic object points can be further improved by carrying out a plurality of camera-assisted trial positioning maneuvers, in particular where at least one of the camera-assisted trial positioning maneuvers is a successful trial positioning maneuver and / or at least one of the camera-assisted trial positioning maneuvers is an unsuccessful trial positioning maneuver.
- the processing unit can comprise a machine learning module, in particular an artificial neural network, which is set up to determine the characteristic object points of the target object on the basis of at least one of the images of the image data set.
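The patent leaves the detector itself open. As a rough, non-learned stand-in for such a module, a classic corner response can illustrate what "characteristic" means in image terms: points such as the contact points 24 produce strong gradients in two directions. The following sketch (a simplified Harris-style detector; all parameters are illustrative assumptions, not the patented method):

```python
import numpy as np

def corner_candidates(img, k=0.04, thresh=0.1, r=1):
    """Harris-style corner response as a simple stand-in for a learned
    keypoint detector: characteristic object points tend to coincide
    with high-curvature image structures."""
    Iy, Ix = np.gradient(img.astype(float))

    def box_sum(a):
        # Sum over a (2r+1)x(2r+1) window (wrap-around at the borders,
        # which is acceptable for this sketch).
        out = np.zeros_like(a)
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
        return out

    Sxx, Syy, Sxy = box_sum(Ix * Ix), box_sum(Iy * Iy), box_sum(Ix * Iy)
    # Harris response R = det(M) - k * trace(M)^2 of the windowed structure tensor.
    response = Sxx * Syy - Sxy ** 2 - k * (Sxx + Syy) ** 2
    # Keep pixel coordinates whose response exceeds a fraction of the maximum.
    ys, xs = np.where(response > thresh * response.max())
    return list(zip(xs.tolist(), ys.tolist()))
```

A trained network would replace this hand-crafted response with a learned one, but the output shape, a set of 2D point candidates per image, is the same.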
- the position and/or the orientation of the vehicle can be determined using the evaluation unit, in particular a machine learning module.
- FIG. 1 a shows a system according to the invention for determining characteristic object points of a target object schematically in a bird's eye view in a first position during a positioning maneuver
- FIG. 1b shows a schematic two-dimensional image which corresponds to the image taken by a camera on the vehicle of the system according to FIG. 1a at the time shown in FIG. 1a,
- FIG. 2a shows the system according to FIG. 1a in a second position during the positioning maneuver
- FIG. 2b shows a two-dimensional image similar to the image in FIG. 1b at the time of the situation in FIG. 2a
- FIG. 3a shows the system according to FIG. 1a in a third position during the positioning maneuver
- FIG. 3b shows a two-dimensional image similar to the images in FIGS. 1b and 2b at the time of the situation in FIG. 3a
- FIG. 4 shows a flowchart of a method according to the invention for determining characteristic object points of a target object and a subsequent method according to the invention for camera-assisted positioning of a vehicle on the target object, and
- Figure 5 schematically shows a second embodiment of a system according to the invention.
- FIG. 1a schematically shows a motor-driven vehicle 10 that is to be positioned relative to a target object 12.
- the vehicle 10 is a car or a truck and the target object 12 is a trailer or, as in the exemplary embodiment shown, a swap body that is to be driven under.
- the swap body comprises a container 26 which stands on supports 28 (FIG. 1 b).
- the vehicle 10 has a camera 14, which is preferably provided on the rear 16 of the vehicle 10.
- the camera 14 is firmly mounted and calibrated on the vehicle 10, so that the viewing angle of the camera 14 depends on the position and orientation of the vehicle 10.
- the image recorded by the camera 14 in the situation shown in FIG. 1a is shown in FIG. 1b.
- the vehicle also has a control unit 18, which includes a processing unit 20.
- the processing unit 20 can in turn have a machine learning module 21, for example an artificial neural network.
- a database 22 can be provided in the control unit 18, in which 3D models of the target object 12, in particular CAD models, are stored.
- the processing unit 20 has access to the database 22 and thus to the 3D models.
- the database 22 is present outside the control device 18, for example on a server in a network or on the Internet, the processing unit 20 being provided with at least the 3D model of the corresponding target object 12, for example via a network connection.
- the vehicle 10, the camera 14 and the control unit 18 form a system 23 with which characteristic object points 30 of the target object 12 can be determined.
- control device 18 and / or the processing unit 20 are provided outside the vehicle 10.
- the image data or the images are transmitted to the control unit by cable, data carrier or wirelessly.
- a test positioning maneuver is carried out with the vehicle 10, in which the vehicle 10 is moved toward the target object 12.
- the vehicle 10 approaches the target object 12 successively, as can be seen in FIGS. 1 a, 2a and 3a, which represent three different positions during a trial positioning maneuver.
- the target object 12 can also be a replica of a target object in order to save costs.
- in FIGS. 3a and 3b it can be clearly seen that the vehicle 10 is correctly aligned with the target object 12 and can now drive under the swap body by reversing further.
- the trial positioning maneuver shown in FIGS. 1 a to 3a is thus successful.
- FIGS. 1b, 2b and 3b show the two-dimensional images that the camera 14 of the vehicle 10 takes in the situations of FIGS. 1a, 2a and 3a (step S1).
- the images are taken by the camera 14 at regular intervals or continuously and transmitted to the control unit 18.
- while the swap body, i.e. the target object 12, takes up more and more space in the image of the camera 14, the contact points 24 at which the container 26 is connected to the supports 28 always remain in the image.
- the individual images are two-dimensional and are combined by the control unit 18 into image data, provided that this has not yet been done by the camera 14.
- the two-dimensional images are, of course, those images which were recorded while a trial positioning maneuver was being carried out.
- the camera stamps the images with a time stamp that indicates when each image was taken or the chronological order of the images.
- the time stamps can also be transmitted to the control unit 18 separately from the images.
- the image data are transferred from the control unit 18 or the camera 14 to the processing unit 20 (step S2).
- the processing unit 20 comprises a machine learning module 21, which has an artificial neural network, for example.
- the artificial neural network is set up or trained to recognize characteristic object points 30 on the target object 12.
- “Characteristic object points” 30 are parts or sections of the target object 12 that can be clearly identified from as many different viewing angles of the target object 12 as possible and that are in the field of view of the camera 14 in as many situations as possible during a positioning maneuver.
- the contact points 24 are suitable as such distinctive points and thus as characteristic object points 30.
- the processing unit 20 uses as many of the two-dimensional images of the image data as possible.
- the time stamp of each image can also be taken into account (step S3).
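One plausible way to take the whole image sequence and its time stamps into account, sketched here as an assumption (the patent does not prescribe a concrete algorithm), is to keep only those tentative points that remain visible across most of the chronologically ordered frames, which is exactly the property demanded of characteristic object points:

```python
from collections import Counter

def select_characteristic_points(frames, min_fraction=0.8):
    """frames: list of (timestamp, detected_point_ids) pairs for one
    trial positioning maneuver, where the ids label tentative object
    points matched across images. Keep only points visible in at
    least min_fraction of the frames, i.e. points that stay in the
    camera's field of view throughout the approach."""
    frames = sorted(frames)  # chronological order via the time stamps
    counts = Counter(pid for _, pids in frames for pid in set(pids))
    n = len(frames)
    return sorted(pid for pid, c in counts.items() if c >= min_fraction * n)
```

The `min_fraction` threshold and the id-based matching are illustrative design choices; a real system could additionally weight late (close-range) frames more heavily.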
- in a next step S4, the processing unit 20 determines the two-dimensional coordinates of the recognized characteristic object points 30 in the images of the image data.
- in step S5, which can also be carried out earlier, the processing unit 20 loads or receives a 3D model, e.g. a CAD model, of the target object 12 from the database 22.
- in step S6, the processing unit determines the three-dimensional coordinates of the characteristic object points 30 on the 3D model of the target object 12.
- the processing unit 20 has automatically determined the relative position of the characteristic object points 30 on the target object 12.
- the processing unit 20 can now mark the characteristic object points 30 in the 3D model of the target object 12 (step S7), whereby position data of the target object 12 are created.
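The resulting position data could, for instance, take the following form; the identifiers, file name and coordinate values below are purely illustrative assumptions and not taken from the patent:

```python
# Hypothetical position data for a swap-body target object: the 3D
# coordinates of the marked characteristic object points (the contact
# points between container and supports) in the model frame, metres.
position_data = {
    "target_object": "swap-body",     # illustrative object identifier
    "model": "swap_body.stp",         # illustrative CAD file name
    "characteristic_points": {
        "contact_point_front_left":  [0.0, 0.0, 1.2],
        "contact_point_front_right": [2.5, 0.0, 1.2],
        "contact_point_rear_left":   [0.0, 6.0, 1.2],
        "contact_point_rear_right":  [2.5, 6.0, 1.2],
    },
}
```

Because the relative positions of the points are fixed by the model, any later consumer (e.g. the evaluation unit) can recover distances between points directly from this structure.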
- the position data can now be used to perform camera-assisted positioning of the vehicle 10 on the target object 12 at least partially autonomously.
- image data from several trial positioning maneuvers can be used.
- a second trial positioning maneuver is indicated with dashed lines.
- the vehicle 10 is not correctly aligned with the target object 12 at the end, so that this second trial positioning maneuver is unsuccessful.
- unsuccessful trial positioning maneuvers also provide important information, so the image data of this trial positioning maneuver are likewise used to determine the characteristic object points 30.
- with the aid of the determined position data and characteristic object points 30, the vehicle 10 can be positioned with camera assistance relative to the target object 12, the same situations arising as in FIGS. 1 to 3.
- the camera 14 takes pictures of the target object 12 continuously or at regular intervals (step P1).
- the images are fed to an evaluation unit 32, which can comprise a machine learning module such as an artificial neural network.
- the evaluation unit 32 has access to the previously determined position data of the target object 12, i.e. in this exemplary embodiment, a 3D model of the target object 12, in which the characteristic object points 30 are marked. The evaluation unit 32 therefore knows the relative position of the characteristic object points 30.
- the evaluation unit 32 can determine the position and the orientation of the vehicle 10 relative to the target object 12 (step P3).
- the control unit 18 can then at least partially autonomously drive the vehicle 10 and thus position it relative to the target object 12 (step P4) until the vehicle 10 is correctly positioned.
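Step P3, estimating the vehicle pose from the detected 2D image points and the known 3D positions of the characteristic object points, is a classic perspective-n-point problem. A minimal sketch using the Direct Linear Transform is shown below; it recovers only the projection matrix up to scale and is an illustrative assumption, not the patented procedure (a full PnP solver, such as OpenCV's `solvePnP`, would additionally enforce a valid rotation):

```python
import numpy as np

def estimate_projection_dlt(pts3d, pts2d):
    """Direct Linear Transform: recover the 3x4 projection matrix P
    (up to scale) from >= 6 correspondences between marked 3D model
    points and their detected 2D image positions."""
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        # Two linear constraints on the entries of P per correspondence.
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # Least-squares solution: right singular vector of the smallest
    # singular value of the stacked constraint matrix.
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 4)

def project(P, pt3d):
    """Apply a 3x4 projection matrix to one 3D point (homogeneous)."""
    x = P @ np.append(pt3d, 1.0)
    return x[:2] / x[2]
```

From the recovered matrix, the position and orientation of the camera, and hence of the calibrated vehicle, relative to the target object can be decomposed.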
- the determination of the position and the orientation of the vehicle 10 with respect to the target object 12 can also take place by means of the machine learning module 21 or the artificial neural network.
- the control unit 18, the processing unit 20 and the evaluation unit 32 comprise a computer program that has program code.
- the program code is written in such a way that it causes the control unit 18, the processing unit 20 or the evaluation unit 32 to carry out the steps of the described methods when it is executed on the control unit 18, the processing unit 20 or the evaluation unit 32.
- FIG. 5 shows a second embodiment of the system 23, which can carry out a second embodiment of the method for determining the characteristic object points 30.
- the system 23 and the method essentially correspond to the system 23 and the method of the first embodiment, so that only the differences are discussed below. Identical and functionally identical parts are provided with the same reference numerals.
- no physical test positioning maneuvers are carried out with a vehicle 10 on a target object 12.
- the 3D models of the target object 12 and possibly of the vehicle 10 are provided to a simulation unit 34.
- a trial positioning maneuver is simulated by means of the simulation unit 34, in which a predetermined path of the vehicle 10 to the target object 12 is simulated.
- the simulation unit 34 generates images at various positions during the simulated trial positioning maneuver, which correspond, for example, to the images of FIGS. 1b, 2b and 3b.
- the simulation unit 34 thus generates images which correspond to the images which have been or are being recorded by a camera 14 on a vehicle 10 during the execution of a (trial) positioning maneuver.
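The rendering performed by the simulation unit 34 can be illustrated, under the assumption of a simple pinhole camera model (the patent does not fix the rendering method), by projecting the 3D model points into a virtual image at each simulated vehicle position:

```python
import numpy as np

def render_points(model_pts, K, R, t, width, height):
    """Minimal stand-in for the simulation unit 34: project the 3D
    model points of the target object into a virtual camera with
    intrinsics K and pose (R, t), keeping only points that fall
    inside the image. Repeating this along a simulated vehicle path
    yields the sequence of two-dimensional images of the trial
    positioning maneuver."""
    pixels = []
    for X in model_pts:
        xc = R @ np.asarray(X, float) + t  # world -> camera coordinates
        if xc[2] <= 0:
            continue                        # behind the virtual camera
        u, v, w = K @ xc
        u, v = u / w, v / w
        if 0 <= u < width and 0 <= v < height:
            pixels.append((u, v))
    return pixels
```

A real simulation would rasterize the full textured model rather than isolated points, but the camera geometry applied per frame is the same.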
- a motor vehicle is shown as vehicle 10.
- the process can also be applied to other vehicles, such as construction equipment or ships.
- for example, the motor-driven vehicle 10 can be designed as a ship that is to pass through a pier passage and dock at a landing stage; the “target object” 12 in this case is generally defined as a port including an entrance.
Abstract
The invention relates to a method for determining characteristic object points (30) of a target object (12), which characteristic object points are suitable for the camera-assisted positioning of a motor-driven vehicle (10) on a target object (12), the method comprising the following steps: carrying out a camera-assisted trial positioning maneuver; generating image data containing a plurality of two-dimensional images, which images correspond to the images captured by a camera (14) on the vehicle (10) during the trial positioning maneuver; feeding the image data to a processing unit (20); determining, by means of the processing unit (20), characteristic object points (30) of the target object (12) in the images of the image data set. The invention also relates to a method for the camera-assisted positioning of a motor vehicle, a control device, a system for determining characteristic object points, and a computer program.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102018219829.2A DE102018219829A1 (de) | 2018-11-20 | 2018-11-20 | Verfahren, Steuergerät, System und Computerprogramm zur Bestimmung von charakteristi-schen Objektpunkten eines Zielobjektes sowie Verfahren zum kameragestützten Positionieren eines motorbetriebenen Fahrzeugs |
DE102018219829.2 | 2018-11-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020104263A1 true WO2020104263A1 (fr) | 2020-05-28 |
Family
ID=68583375
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2019/081116 WO2020104263A1 (fr) | 2018-11-20 | 2019-11-13 | Procédé, appareil de commande, système et programme informatique permettant de déterminer des points d'objet caractéristiques d'un objet cible, ainsi que procédé de positionnement d'un véhicule à moteur de manière assistée par caméra |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE102018219829A1 (fr) |
WO (1) | WO2020104263A1 (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102021201525A1 (de) * | 2021-02-17 | 2022-08-18 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zur Ermittlung einer räumlichen Ausrichtung eines Anhängers |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180251153A1 (en) * | 2017-03-03 | 2018-09-06 | Continental Automotive Systems, Inc. | Autonomous Trailer Hitching Using Neural Network |
WO2018192984A1 (fr) * | 2017-04-19 | 2018-10-25 | Robert Bosch Gmbh | Procédés et systèmes d'alignement d'un véhicule sur une remorque |
DE102018110092A1 (de) * | 2017-04-28 | 2018-10-31 | GM Global Technology Operations LLC | System und verfahren zum ermitteln eines ausgangspunkts einer führungslinie zur befestigung eines anhängers an einer in einer ladefläche eines fahrzeugs montierten anhängerkupplung |
WO2018210990A1 (fr) * | 2017-05-18 | 2018-11-22 | Cnh Industrial Italia S.P.A. | Système et procédé de raccordement automatique entre un tracteur et un outil |
- 2018-11-20: DE application DE102018219829.2A filed; published as DE102018219829A1 (active, pending)
- 2019-11-13: PCT application PCT/EP2019/081116 filed; published as WO2020104263A1 (active, application filing)
Non-Patent Citations (1)
Title |
---|
ATOUM YOUSEF ET AL: "Monocular Video-Based Trailer Coupler Detection Using Multiplexer Convolutional Neural Network", 2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), IEEE, 22 October 2017 (2017-10-22), pages 5478 - 5486, XP033283427, DOI: 10.1109/ICCV.2017.584 * |
Also Published As
Publication number | Publication date |
---|---|
DE102018219829A1 (de) | 2020-05-20 |
Legal Events
- Code 121 — Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 19805209; Country: EP; Kind code: A1)
- Code 122 — Ep: PCT application non-entry in European phase (Ref document number: 19805209; Country: EP; Kind code: A1)