WO2017178232A1 - Method for operating a driver assistance system of a motor vehicle, computing device, driver assistance system and motor vehicle - Google Patents

Method for operating a driver assistance system of a motor vehicle, computing device, driver assistance system and motor vehicle

Info

Publication number
WO2017178232A1
WO2017178232A1 (PCT/EP2017/057405)
Authority
WO
WIPO (PCT)
Prior art keywords
motor vehicle
specific feature
pose
sensor data
detected
Prior art date
Application number
PCT/EP2017/057405
Other languages
German (de)
English (en)
Inventor
Marco Heimberger
Jean-Francois Bariant
Markus Heimberger
Original Assignee
Valeo Schalter Und Sensoren Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Valeo Schalter Und Sensoren Gmbh
Publication of WO2017178232A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/02Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
    • B60W50/0225Failure correction strategy
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0272Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/02Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
    • B60W50/0205Diagnosing or detecting failures; Failure detection models
    • B60W2050/021Means for detecting failure or malfunction
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/02Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
    • B60W50/0205Diagnosing or detecting failures; Failure detection models
    • B60W2050/0215Sensor drifts or sensor failures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/408Radar; Laser, e.g. lidar
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00Input parameters relating to data
    • B60W2556/45External transmission of data to or from the vehicle
    • B60W2556/50External transmission of data to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data

Definitions

  • The invention relates to a method for operating a driver assistance system of a motor vehicle by recognizing an object in first sensor data captured by a vehicle-side sensor device from a surrounding region of the motor vehicle, identifying an object-specific feature on the object in the first sensor data, and determining a first pose of the motor vehicle relative to the object-specific feature based on the object-specific feature.
  • The invention also relates to a computing device, a driver assistance system and a motor vehicle.
  • A method for operating a driver assistance system of the aforementioned type is known, for example, from DE 10 2010 056 217 A1.
  • There, an orientation of a motor vehicle relative to an adjacent vehicle is determined on the basis of an image of the adjacent vehicle taken by a vehicle-side camera.
  • The image is processed by a computing device, and at least one object on the adjacent vehicle is identified.
  • The orientation of the motor vehicle is determined based on an image of the identified object.
  • Such an object may be, for example, an official license plate and/or the tires of the other vehicle.
  • The information about the orientation of the neighboring vehicle may be used to assist the driver in performing parking operations.
  • This object is achieved by a method, a computing device, a driver assistance system and a motor vehicle according to the respective independent claims.
  • An object may be recognized in first sensor data captured by a vehicle-side sensor device from an environmental region of the motor vehicle, and an object-specific feature on the object may be identified in the first sensor data.
  • A first pose of the motor vehicle relative to the object-specific feature can be determined based on the object-specific feature.
  • In the method according to the invention for operating a driver assistance system, an object is recognized in first sensor data captured by the vehicle-side sensor device from a surrounding area of the motor vehicle, and an object-specific feature on the object is identified in the first sensor data.
  • A first pose of the motor vehicle relative to the object-specific feature is determined based on the first sensor data.
  • Moreover, the object is recognized again, on the basis of the object-specific feature, in second sensor data captured by the sensor device from the surrounding area, and a second pose of the motor vehicle relative to the object-specific feature is determined on the basis of the object-specific feature.
  • A pose of the motor vehicle relative to the object-specific feature resulting from a movement of the motor vehicle between the capture of the first and the second sensor data is determined by means of odometry, and an odometry error is detected if a deviation between the resulting pose and the second pose exceeds a predetermined threshold value.
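The threshold comparison described above can be sketched as follows. This is a minimal Python illustration, not the claimed implementation: it assumes poses are given as (x, y, heading) tuples in a common map frame, and the threshold values are purely illustrative.

```python
import math

def pose_deviation(pose_resulting, pose_second):
    """Positional and angular deviation between the pose resulting from
    odometry and the second pose observed via the object-specific feature.
    Poses are (x, y, heading) tuples in a common map frame."""
    dx = pose_second[0] - pose_resulting[0]
    dy = pose_second[1] - pose_resulting[1]
    # wrap the heading difference into [-pi, pi] before taking its magnitude
    dh = pose_second[2] - pose_resulting[2]
    dh = abs(math.atan2(math.sin(dh), math.cos(dh)))
    return math.hypot(dx, dy), dh

def odometry_error_detected(pose_resulting, pose_second,
                            pos_threshold=0.25, heading_threshold=0.05):
    """Flag an odometry error when either deviation exceeds its
    predetermined threshold (threshold values are illustrative)."""
    d_pos, d_head = pose_deviation(pose_resulting, pose_second)
    return d_pos > pos_threshold or d_head > heading_threshold
```

For example, a resulting pose of (10.0, 5.0, 0.10) against an observed second pose of (10.4, 5.0, 0.10) gives a 0.4 m positional deviation and would be flagged as an odometry error under these sample thresholds.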
  • A driver assistance system can thereby be implemented which is designed to determine poses relative to at least one object in the surrounding area of the motor vehicle and, during the pose determination, also to detect an odometry error.
  • the pose includes a position or a distance of the motor vehicle relative to the object and an orientation of the motor vehicle to the object.
  • The determination of the pose, that is to say the determination of a spatial relation between the motor vehicle and the object, is in particular part of a so-called SLAM method (Simultaneous Localization and Mapping).
  • The environment map is in particular a two-dimensional map in which the objects detected by the vehicle-side sensor device can be registered, so that the spatial positions of the objects relative to the motor vehicle are indicated. Conversely, the absolute position of an object can also be determined on the basis of the determined pose of the motor vehicle with respect to that object.
  • With ultrasonic sensors, the positions of the vehicles can be determined only very inaccurately, since the ultrasonic sensors measure different contour points on the other vehicle depending on the pose of the motor vehicle, i.e. the orientation or angle and the distance of the motor vehicle relative to the other vehicle. To obtain more reliable results, the data of several measuring cycles and/or sensors are therefore usually fused, which significantly increases the required computing power.
  • It is therefore advantageous to perform the pose determination of the motor vehicle based on the object-specific feature located on the object.
  • the object is first recognized based on the first sensor data.
  • As the first sensor data, a first image from at least one camera of the motor vehicle and/or first laser scanner data from at least one laser scanner of the motor vehicle are captured.
  • In particular, the first sensor data is a first image of the surrounding area of the motor vehicle captured by the camera.
  • the object is identified and the at least one object-specific feature is identified on the object.
  • the object-specific feature is in particular a feature of the object, by means of which the object can be uniquely identified and recognized.
  • The first pose of the motor vehicle relative to the object-specific feature is determined. Based on the uniquely identifiable object and on the first pose, the environment map can then be created by the motor vehicle, and the object can be entered into the map so that vehicle and object have the spatial relation to each other predetermined by the first pose.
  • second sensor data for example a second camera image and / or second laser scanner data
  • second sensor data are acquired by the sensor device at a second measurement time.
  • a second camera image of the surrounding area is detected as the second sensor data.
  • the first and the second sensor data are thus in particular no sensor data, which are recorded directly in chronological succession.
  • The first and the second sensor data can be recorded, for example, when the motor vehicle is searching for a parking space and passes the same location, and thus the same object in the surrounding area, several times. It can also be provided that the first and the second sensor data are captured on a route section which is traveled regularly by the motor vehicle, for example on the daily drive to a workplace and/or to the driver's residence.
  • The second sensor data are searched for the object, which can be unambiguously recognized based on the object-specific feature.
  • As soon as the object has been recognized by the computing device of the driver assistance system on the basis of the object-specific feature, the second pose of the motor vehicle relative to the object-specific feature is determined.
  • the pose of the motor vehicle at the second detection time is additionally determined by means of odometry.
  • The motor vehicle can estimate its location based on data from its propulsion system. Such data of the propulsion system are, for example, the number of wheel revolutions and the steering angle of the motor vehicle. By capturing these data of the propulsion system and using vehicle-specific geometry data, such as a known wheel diameter, the distance the motor vehicle has traveled from the first pose up to the second measurement time can thus be determined by odometry.
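As a sketch of this dead-reckoning step, the Python fragment below advances a pose from wheel-encoder ticks and steering angle using a simple bicycle model. Wheel diameter, tick resolution and wheelbase are assumed, illustrative vehicle-specific geometry data, not values from the patent.

```python
import math

WHEEL_DIAMETER = 0.65      # m, assumed known wheel geometry
TICKS_PER_REVOLUTION = 96  # assumed wheel-speed sensor resolution
WHEELBASE = 2.7            # m, assumed axle distance (bicycle model)

def advance_pose(pose, wheel_ticks, steering_angle):
    """Dead-reckon the next pose (x, y, heading) from propulsion-system
    data: wheel-encoder ticks and the current steering angle."""
    x, y, heading = pose
    # distance rolled by the wheel since the last update
    distance = (wheel_ticks / TICKS_PER_REVOLUTION) * math.pi * WHEEL_DIAMETER
    # heading change from the steering angle (bicycle model)
    heading += distance * math.tan(steering_angle) / WHEELBASE
    x += distance * math.cos(heading)
    y += distance * math.sin(heading)
    return x, y, heading
```

Driving straight (steering angle 0) for one wheel revolution advances the pose by π · 0.65 ≈ 2.04 m along the current heading; any error in the assumed wheel diameter accumulates over the route, which is precisely the odometry error the method is designed to detect.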
  • a stationary object is detected as the object.
  • the invention is based on the finding that the odometry adjustment can only be carried out reliably and correctly if the global position of the object has not changed.
  • the object-specific feature can be recognized in the images, for example by means of pattern recognition.
  • edges of the object-specific feature may be identified and an angle between the edges recognized.
  • The detection of the edges can be done by means of a known edge detection method, in which the recorded image is searched for color transitions, for example from black to white.
  • the orientation of the motor vehicle can then be determined, since the angle represents a measure of an orientation angle of the static object with respect to the motor vehicle.
  • In particular, the pose determination based on the object-specific feature can be provided in addition to the pose determination by odometry. A reliable and versatile driver assistance system can thereby advantageously be provided.
  • A correction value for compensating the odometry error is determined based on the second pose. It is thereby assumed that the pose determined from the second sensor data is the correct, current pose of the motor vehicle. The correction value is then determined such that the resulting pose determined by odometry corresponds to the second pose.
  • In the environment map or two-dimensional plane, the odometry data can thus be continuously corrected by means of the SLAM method, and a particularly reliable driver assistance system can thereby be provided.
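A minimal sketch of such a correction value, again assuming poses as (x, y, heading) tuples: the correction is chosen so that applying it to the odometry-determined resulting pose yields exactly the second pose observed via the feature.

```python
import math

def correction_value(pose_resulting, pose_second):
    """Correction (dx, dy, dheading) compensating the odometry error:
    applying it to the odometry-determined resulting pose yields the
    second pose observed via the object-specific feature."""
    dh = pose_second[2] - pose_resulting[2]
    dh = math.atan2(math.sin(dh), math.cos(dh))  # wrap to [-pi, pi]
    return (pose_second[0] - pose_resulting[0],
            pose_second[1] - pose_resulting[1],
            dh)

def apply_correction(pose, correction):
    """Apply the correction to a pose determined by odometry."""
    return (pose[0] + correction[0],
            pose[1] + correction[1],
            pose[2] + correction[2])
```

By construction, the corrected resulting pose coincides with the second pose; the same correction can then be applied to subsequent odometry updates until the next feature observation.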
  • The object is assigned an identification number (ID), which is stored in a vehicle-side storage device. Based on this ID, the object can be uniquely identified as soon as the object-specific feature has been recaptured.
  • the ID is stored together with information about the object-specific feature in the storage device.
  • The ID and the information are stored at least long enough that the odometry adjustment can also be carried out if, for example, the motor vehicle returns after several days to the global position at which the object with the object-specific feature is located.
  • a global position of the motor vehicle in GPS coordinates can be determined by means of GPS (Global Positioning System).
  • The motor vehicle can thus be located globally by GPS. This global position of the motor vehicle can be detected by means of the navigation system at the first measurement time. If the navigation system has detected that the motor vehicle is, at the second measurement time, at least in the vicinity of the global position which the motor vehicle held at the first measurement time, for example in the same street, the second sensor data are searched for the object on the basis of the object-specific feature. The second sensor data are thus only searched if a recognition of the object is likely.
  • A parked other vehicle is recognized as the object, and a motor vehicle license plate, in particular at least one alphanumeric character of the license plate, is identified as the object-specific feature.
  • In other words, the license plates of parked vehicles are detected.
  • These number plates can be detected continuously by means of the camera, and their positions can be mapped in the two-dimensional plane or environment map. The exact positions of the number plates are thus used to correct the continuous odometry errors using the SLAM method. This embodiment takes advantage of the fact that almost every vehicle carries such an official license plate, so that it can be used reliably for the identification of the vehicle. These license plates can be identified particularly easily, for example using pattern recognition. Since license plates usually have a standardized size, they can advantageously be used for a reliable determination of the pose.
  • Longitudinal edges of the license plate are recognized, and an angle between the longitudinal edges is determined as a measure of the orientation angle of the motor vehicle relative to the other vehicle.
  • The at least one alphanumeric character, which represents information about the vehicle, is identified, and the ID is assigned to this alphanumeric character, so that the parked vehicle can be uniquely identified.
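The edge-angle measure can be sketched in Python as follows; this is an illustration, assuming each longitudinal plate edge is given by two pixel endpoints in the camera image (the helper names are hypothetical, not from the patent).

```python
import math

def edge_direction(p_start, p_end):
    """Direction of an image edge given two pixel endpoints (u, v)."""
    return math.atan2(p_end[1] - p_start[1], p_end[0] - p_start[0])

def plate_edge_angle(top_edge, bottom_edge):
    """Angle between the two longitudinal edges of the license plate in
    the image. Under perspective projection the edges converge as the
    plate is viewed more obliquely, so this angle serves as a measure of
    the orientation angle; 0 means the plate is seen head-on."""
    diff = edge_direction(*top_edge) - edge_direction(*bottom_edge)
    return abs(math.atan2(math.sin(diff), math.cos(diff)))
```

Parallel image edges give an angle of 0 (head-on view); the more obliquely the plate is seen, the larger the convergence angle between its imaged longitudinal edges.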
  • At least one geometric dimension of a parking space adjacent to the parked vehicle is determined for the motor vehicle. In particular, transverse parking gaps can be detected on the basis of the pose of the motor vehicle with respect to the vehicle license plate. If the longitudinal edges of the license plate extend, for example, parallel to a vehicle longitudinal direction of the motor vehicle, the transverse parking space can be detected while the motor vehicle drives past the parked vehicles. Based on given geometry data, which in particular describe a length of the parked vehicle, a depth of the parking space can thus be determined as the geometric dimension of the parking space.
  • the particular pose can therefore be used advantageously in many ways by the driver assistance system.
  • a milestone may be recognized as the object and a kilometer indication on the milestone may be identified as the object-specific feature.
  • the milestones are usually attached to lane edges of lanes at regular intervals so that they can be used to determine the distance traveled and to determine location.
  • This embodiment is particularly advantageous if the motor vehicle regularly, for example daily, travels on a particular road section and thus regularly passes by the same milestone.
  • the odometry data are thus recorded over a particularly long period of time, for example, between passing the milestone on a first day and passing the milestone on the following day.
  • the odometry data are then recorded over the entire period, even if the motor vehicle has been turned off in the meantime.
  • The route covered by the motor vehicle is detected by odometry, and the odometry error is detected if a further deviation between the milestone distance and the distance traveled exceeds a further predetermined threshold. This embodiment is based on the finding that milestones are usually set up at regular intervals. From the kilometer indications on the milestones, given in the form of numbers, the distance between two milestones can be determined. In addition, the distance actually traveled between the two milestones is determined using odometry; this distance should correspond to the distance between the milestones. If the distance determined by odometry deviates from the distance between the milestones, the odometry error is likewise detected. This is particularly advantageous when the motor vehicle passes each milestone only once and thus cannot perform the odometry adjustment at the same milestone.
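A sketch of this milestone check in Python; the kilometer indications and the threshold value below are illustrative, not values from the patent.

```python
def milestone_odometry_error(odometry_distance_km,
                             kilometer_first, kilometer_second,
                             threshold_km=0.05):
    """Compare the distance dead-reckoned by odometry between passing
    two milestones with the reference distance read from their kilometer
    indications; flag an odometry error if the further deviation exceeds
    the further predetermined threshold (illustrative value)."""
    reference_km = abs(kilometer_second - kilometer_first)
    return abs(odometry_distance_km - reference_km) > threshold_km
```

Between a milestone reading "12" and the next reading "13", an odometry distance of 1.00 km passes the check, while a reading of 1.20 km would be flagged as an odometry error.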
  • A house is recognized as the object, and a house number plate, in particular at least one digit on the house number plate, is identified as the object-specific feature.
  • In this embodiment, the detection of the object, here the house, is linked to the position determination performed by the navigation system of the motor vehicle. For example, if the navigation system has detected that the motor vehicle repeatedly enters a certain street, a specific house number can be recognized on a house in this street, and the odometry adjustment can be performed on the basis of this house number.
  • the invention also relates to a computing device for a driver assistance system of a motor vehicle, which is designed to carry out a method according to the invention.
  • the computing device can be integrated, for example, in a vehicle-side control unit.
  • A driver assistance system according to the invention comprises a computing device according to the invention and a sensor device.
  • the sensor device may comprise, for example, at least one laser scanner and / or at least one camera.
  • The driver assistance system is designed in particular to perform a determination of the pose of the motor vehicle by means of a SLAM method and by means of odometry, and thus to determine an environment map of the motor vehicle. Based on the environment map and the corrected odometry data, in particular an at least semi-autonomous maneuvering of the motor vehicle can be made possible.
  • an at least semi-autonomous parking process for the motor vehicle can be carried out by the driver assistance system.
  • the invention relates to a motor vehicle with a driver assistance system according to the invention.
  • the motor vehicle is designed in particular as a passenger car.
  • Embodiments and their advantages apply correspondingly to the computing device according to the invention, the driver assistance system according to the invention and to the motor vehicle according to the invention.
  • Fig. 1 is a schematic representation of an embodiment of a motor vehicle according to the invention;
  • Fig. 2 is a schematic representation of a camera image captured by the motor vehicle.
  • Fig. 1 shows a motor vehicle 1 according to the present invention.
  • the motor vehicle 1 is designed in particular as a passenger car.
  • The motor vehicle 1 comprises a driver assistance system 2, which is designed to carry out a so-called SLAM (simultaneous localization and mapping) method.
  • The driver assistance system 2 is designed to create an environment map describing a surrounding area 3 of the motor vehicle 1 and at the same time to estimate or determine a pose of the motor vehicle 1 in the environment map.
  • For this purpose, the driver assistance system 2 has a sensor device S, which is designed to capture sensor data from the surrounding area 3.
  • the sensor device S has at least one camera 4.
  • The sensor device S may also have at least one laser scanner.
  • the cameras 4 are designed to capture the surrounding area 3 of the motor vehicle 1 in images.
  • The cameras 4 here form an all-round or surround-view camera system, wherein a first camera 4 is arranged in a front area 5 of the motor vehicle 1, a second camera 4 in a rear area 6 of the motor vehicle 1, a third camera 4 on a driver's side 7 of the motor vehicle 1, and a fourth camera 4 on a passenger side 8 of the motor vehicle 1.
  • The driver assistance system 2 here also comprises a computing device 9, which is designed to determine the pose of the motor vehicle 1, that is to say an orientation and a position of the motor vehicle, on the basis of the sensor data acquired by the sensor device S.
  • the driver assistance system 2 is designed to determine the pose of the motor vehicle 1 in the surrounding area 3 by odometry.
  • The driver assistance system 2 can record data of a propulsion system of the motor vehicle 1, for example a number of revolutions of wheels 10 of the motor vehicle 1 and/or steering angles of the motor vehicle 1. Due to vehicle parameters, for example due to changing and/or incorrectly specified wheel geometries and chassis geometries, the pose determination by means of odometry may be faulty.
  • Such first sensor data are shown in FIG. 2 in the form of a first camera image 11 of the camera 4 arranged in the front region 5 of the motor vehicle 1.
  • The motor vehicle 1 is located on a roadway 12 and captures the first camera image during a journey at a first measurement time.
  • The camera 4 may, for example, have a fisheye lens, resulting in the perspective distortion of the camera image 11 shown in FIG. 2.
  • the objects O are formed in this case as parked, other vehicles 13 and as a house 14.
  • the objects O are in particular at least temporarily stationary or immovable objects.
  • The computing device 9 is designed to recognize the objects O in the camera image 11 captured by the camera 4 and to identify at least one object-specific feature M on the objects O.
  • the respective object-specific feature can be clearly assigned to an object O, in particular.
  • Here, a license plate 15 is identified as the object-specific feature M of the parked vehicles 13, and a house number plate 16 as the object-specific feature M of the house 14.
  • The captured image 11 can be subjected by the computing device 9 to a pattern recognition with regard to corners and edges of the license plate 15 and/or the house number plate 16. The recorded image 11 can thus be searched for a rectangle or a trapezoid.
  • This method of detecting the license plate 15 and/or the house number plate 16 is, on the one hand, particularly reliable and, on the other hand, also relatively easy to implement.
  • The detection of the edges can be done by means of a known edge detection method, in which the recorded image 11 is searched for color transitions, for example from black to white. Once the edges have been detected, the corners of the license plate 15 and/or the house number plate 16 follow. After recognition of the license plate 15 and/or the house number plate 16, the detection can be made plausible by setting a length of the image of the license plate 15 and/or the house number plate 16 in relation to its width and checking whether this ratio lies within a predetermined range of values. The license plate 15 and/or the house number plate 16 can thus be distinguished from other rectangles, since both the length and the width are fixed, in particular for license plates, and thus represent a standard size.
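The plausibility step can be sketched as a simple aspect-ratio test. This is a minimal Python illustration: the standardized EU car plate of 520 mm × 110 mm (ratio ≈ 4.7) is taken as an example, and the accepted value range is an assumption, widened here to tolerate perspective distortion.

```python
def plausible_license_plate(length_px, width_px,
                            ratio_range=(3.5, 6.0)):
    """Plausibility check after rectangle detection: the length of the
    imaged plate is set in relation to its width, and the detection is
    accepted only if the ratio lies within a predetermined value range
    (range chosen for illustration; a 520 mm x 110 mm EU plate gives
    a ratio of about 4.7)."""
    if width_px <= 0:
        return False
    ratio = length_px / width_px
    return ratio_range[0] <= ratio <= ratio_range[1]
```

A detected rectangle of 236 × 50 pixels (ratio 4.72) passes the check, while a 100 × 50 pixel rectangle (ratio 2.0), for example a window, is rejected.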
  • the computing device 9 is designed to use the object-specific features M to determine a respective first pose of the motor vehicle 1 relative to the object-specific features M.
  • The first pose corresponds to the pose which the motor vehicle 1 has relative to the object-specific feature M when capturing the first camera image 11.
  • Longitudinal edges of the license plate 15 and/or the house number plate 16 can be identified in the image 11, and an angle between the longitudinal edges can be determined.
  • The orientation can then be determined, since the angle represents a measure of an orientation angle of the adjacent vehicle 13 and/or of the house 14 with respect to the motor vehicle 1. Based on the determined poses, the SLAM method can be performed, in which the objects O and the motor vehicle 1 are mapped in the, in particular two-dimensional, environment map.
  • The motor vehicle 1 can, for example, be positioned at the origin of a two-dimensional coordinate system in which the environment map is determined, with the positions of the objects O in the two-dimensional environment map resulting from the pose determination carried out on the basis of the camera image 11.
  • The computing device 9 can assign each object O a unique identification number or ID based on the object-specific feature M and store it in a vehicle-side storage device. It may be advantageous to recognize the digits, or the numbers and letters, of the license plate 15, on the basis of which a unique identification number or ID matching the parked vehicle 13 is created.
  • the information that this is the same license plate 15 of the same vehicle 13 is provided through this ID.
  • This information is especially important if only parts of the number plate were recognized, which results in scatter when the same number plate 15 is recognized again and again but at different positions. This scatter can thus be contained so that the SLAM converges faster. Furthermore, for example, only one digit can be picked out to be detected for calculating the SLAM, whereby fewer detection points are necessary and run time can be saved.
  • the computing device 9 can recognize a number on the house number plate 16 and assign it, in particular together with a street in which the house 14 is located, a specific identification number.
  • The street can be determined by a navigation system of the motor vehicle 1, by which the motor vehicle 1 is located globally. Then, at a second measurement time, a further, second camera image of the surrounding area 3 is captured by the camera 4 as second sensor data. If the motor vehicle 1 again passes by the objects O, for example because the motor vehicle 1 is searching for a parking space, the images of the objects O are also contained in the further camera image.
  • The computing device 9 can now search the further, second image for the objects O by searching it for the object-specific features M. If the computing device 9 has recognized an object-specific feature M, for example one of the license plates 15, in the further image, the object O, i.e. the associated vehicle 13, is unambiguously recognized, and a second pose of the motor vehicle 1 relative to the object-specific feature M is determined.
  • In particular, the computing device 9 searches the further camera image for the object-specific features M only if it was detected by means of the navigation system of the motor vehicle 1 that the motor vehicle 1, during the recording of the second camera image, is located approximately in the same surrounding area 3 as during the recording of the first camera image 11.
This second pose is compared with a pose which is determined by the computing device 9 by means of odometry. The pose determined by means of odometry results from the route which the motor vehicle 1 has travelled between the acquisition of the first image 11 and the acquisition of the further image. In other words, starting from the first pose, the distance covered is determined using odometry data, or data of the propulsion system of the motor vehicle, and the end point of this route is taken as the pose with which the second pose is compared. An odometry comparison is thus carried out. If there is no odometry error, that is, if the underlying recorded odometry data correspond to the actual, real movement of the motor vehicle 1, the two poses coincide. If, however, the deviation between the pose resulting from the movement and the second pose exceeds a predetermined threshold value, an odometry error is detected. Based on the size of the deviation, a correction factor can be determined by means of which the odometry error can be corrected during the position determination of the motor vehicle by means of odometry.
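The odometry comparison described above can be sketched as follows; the `check_odometry` function, its 2D pose representation, the 0.5 m threshold, and the ratio-based correction-factor model are assumptions for illustration, not the patent's concrete implementation.

```python
import math

def check_odometry(odometry_pose, feature_pose, threshold_m=0.5):
    """Compare the pose predicted by odometry with the pose measured from
    the object-specific feature.

    Both poses are (x, y) positions in metres relative to the feature.
    Returns (error_detected, correction_factor); the factor rescales the
    odometry-derived travel distance so it matches the feature-based one.
    """
    deviation = math.hypot(feature_pose[0] - odometry_pose[0],
                           feature_pose[1] - odometry_pose[1])
    error_detected = deviation > threshold_m

    odo_dist = math.hypot(*odometry_pose)
    feat_dist = math.hypot(*feature_pose)
    factor = feat_dist / odo_dist if odo_dist > 0 else 1.0
    return error_detected, factor
```

If the deviation stays below the threshold, the correction factor can simply be left at 1; otherwise it can be applied to subsequent odometry-based position estimates.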
In addition, the computing device 9 can, for example, determine the position of the motor vehicle 1 relative to the license plate 15 of one of the vehicles 13 and, based on at least one predetermined geometric dimension of the parked vehicle 13, determine a geometric dimension 17 of a parking space 18 for the motor vehicle 1. Based on a length of the vehicle 13, the computing device 9 can, in particular, determine a depth of the parking space 18 as the geometric dimension 17. Based on the environment map and the corrected odometry data, the driver assistance system 2 can then provide or perform, for example, an at least semi-autonomous parking process for the motor vehicle 1 into the parking space 18.
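A minimal sketch of deriving the parking-space depth from a predetermined dimension of the identified parked vehicle; the lookup table keyed by vehicle class and the clearance margin are illustrative assumptions, not data from the patent.

```python
# Assumed lookup of predetermined vehicle lengths in metres; in practice
# the identified vehicle (e.g. via its plate) would index such a table.
VEHICLE_LENGTHS_M = {
    "compact": 4.05,
    "saloon": 4.75,
}

def parking_space_depth(vehicle_class, margin_m=0.2):
    """Estimate the depth of the parking space as the parked vehicle's
    predetermined length plus a small clearance margin."""
    return VEHICLE_LENGTHS_M[vehicle_class] + margin_m
```

The resulting depth, together with the environment map and the corrected odometry, gives the driver assistance system the geometry it needs for a semi-autonomous parking manoeuvre.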

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Transportation (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a method for operating a driver assistance system (2) of a motor vehicle (1) by recognizing an object in first sensor data captured by a vehicle-side detection device (S) in an area (3) surrounding the motor vehicle (1), and by identifying an object-specific feature (M) on the object (O) in the first sensor data, the method comprising the following steps: determining a first position of the motor vehicle (1) relative to the object-specific feature (M) by means of the object-specific feature (M); re-recognizing the object (O) in second sensor data captured by the detection device (S) by means of the object-specific feature (M); determining, by odometry, a position of the motor vehicle (1) relative to the object-specific feature (M) resulting from a movement of the motor vehicle (1) between the capture of the first and the second sensor data; and detecting an odometry error if a deviation between the position resulting from the movement and the second position exceeds a predetermined threshold value. The invention further relates to a computing device (9), a driver assistance system (2), and a motor vehicle (1).
PCT/EP2017/057405 2016-04-15 2017-03-29 Procédé permettant de faire fonctionner un système d'aide à la conduite d'un véhicule automobile, dispositif de calcul, système d'aide à la conduite et véhicule automobile WO2017178232A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102016106978.7A DE102016106978A1 (de) 2016-04-15 2016-04-15 Verfahren zum Betreiben eines Fahrerassistenzsystems eines Kraftfahrzeugs, Recheneinrichtung, Fahrerassistenzsystem sowie Kraftfahrzeug
DE102016106978.7 2016-04-15

Publications (1)

Publication Number Publication Date
WO2017178232A1 true WO2017178232A1 (fr) 2017-10-19

Family

ID=58503573

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/057405 WO2017178232A1 (fr) 2016-04-15 2017-03-29 Procédé permettant de faire fonctionner un système d'aide à la conduite d'un véhicule automobile, dispositif de calcul, système d'aide à la conduite et véhicule automobile

Country Status (2)

Country Link
DE (1) DE102016106978A1 (fr)
WO (1) WO2017178232A1 (fr)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102018101388A1 (de) 2018-01-23 2019-07-25 Valeo Schalter Und Sensoren Gmbh Korrigieren einer Position eines Fahrzeugs mit SLAM
DE102019214345B4 (de) * 2018-12-20 2022-08-25 Volkswagen Aktiengesellschaft Verfahren zur Dejustageerkennung wenigstens eines Sensors eines Kraftfahrzeugs
DE102019120440A1 (de) * 2019-07-29 2021-02-04 Bayerische Motoren Werke Aktiengesellschaft Verfahren und System zum automatisierten Erstellen von Objektmarkierungen für die Verwendung in einem Fahrassistenzsystem eines Fahrzeugs

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040167667A1 (en) * 2002-12-17 2004-08-26 Goncalves Luis Filipe Domingues Systems and methods for filtering potentially unreliable visual data for visual simultaneous localization and mapping
US20120078510A1 (en) * 2010-09-24 2012-03-29 Honeywell International Inc. Camera and inertial measurement unit integration with navigation data feedback for feature tracking
DE102010056217A1 (de) 2010-12-24 2012-06-28 Valeo Schalter Und Sensoren Gmbh Verfahren zum Betreiben eines Fahrerassistenzsystems in einem Kraftfahrzeug, Fahrerassistenzsystem und Kraftfahrzeug
DE102011082478A1 (de) * 2011-09-12 2013-03-14 Robert Bosch Gmbh Verfahren, System sowie Vorrichtung zur Lokalisation eines Fahrzeugs relativ zu einem vordefinierten Bezugssystem
DE102011118161B3 (de) * 2011-11-10 2013-03-28 Audi Ag Verfahren zur Positionsbestimmung
DE102014111011A1 (de) * 2014-08-04 2016-02-04 Valeo Schalter Und Sensoren Gmbh Verfahren zum Bestimmen einer Parkfläche zum Abstellen eines Kraftfahrzeugs, Fahrerassistenzsystem und Kraftfahrzeug

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011077555A1 (de) * 2011-06-15 2012-12-20 Robert Bosch Gmbh Nachrüstsatz für Parkführung
DE102011054379B4 (de) * 2011-10-11 2013-11-07 Deutsches Zentrum für Luft- und Raumfahrt e.V. Verfahren und Vorrichtung zur Ermittlung einer Positionsinformation
US9251587B2 (en) * 2013-04-05 2016-02-02 Caterpillar Inc. Motion estimation utilizing range detection-enhanced visual odometry


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110555801A (zh) * 2019-07-26 2019-12-10 纵目科技(上海)股份有限公司 一种航迹推演的校正方法、终端和存储介质
WO2021156091A1 (fr) * 2020-02-04 2021-08-12 Volkswagen Aktiengesellschaft Procédé et dispositif destiné à déterminer la position d'un véhicule
CN115023679A (zh) * 2020-02-04 2022-09-06 大众汽车股份公司 用于确定交通工具位置的方法和装置

Also Published As

Publication number Publication date
DE102016106978A1 (de) 2017-10-19

Similar Documents

Publication Publication Date Title
DE102015203016B4 (de) Verfahren und Vorrichtung zur optischen Selbstlokalisation eines Kraftfahrzeugs in einem Umfeld
DE102015209467A1 (de) Verfahren zur Schätzung von Fahrstreifen
WO2017178232A1 (fr) Procédé permettant de faire fonctionner un système d'aide à la conduite d'un véhicule automobile, dispositif de calcul, système d'aide à la conduite et véhicule automobile
DE102008026397B4 (de) Radar-, Lidar- und Kamera-unterstützte Verfahren zur Schätzung der Fahrzeugdynamik
DE102018115813A1 (de) Verfahren und systeme zum lokalisieren eines fahrzeugs
DE102018114808A1 (de) Verfahren zur automatischen Querführung eines Folgefahrzeugs in einem Fahrzeug-Platoon
DE102016223422A1 (de) Verfahren zur automatischen Ermittlung extrinsischer Parameter einer Kamera eines Fahrzeugs
WO2017134228A1 (fr) Procédé d'assistance à un déplacement autonome d'un véhicule automobile, dispositif de commande et système d'aide au stationnement
DE102017011177A1 (de) Verfahren zum Ermitteln eines Knickwinkels eines Fahrzeug-Gespanns, Ermittlungseinrichtung sowie Fahrzeug-Gespann
DE102016003261A1 (de) Verfahren zur Selbstlokalisierung eines Fahrzeugs in einer Fahrzeugumgebung
DE102018115317A1 (de) Verfahren und Fahrunterstützungssystem zum Betreiben eines Fahrzeugs oder zum Unterstützen eines Fahrers des Fahrzeugs unter Verwendung von Fahrspurinformation
DE102014111012A1 (de) Verfahren zum Unterstützen eines Fahrers eines Kraftfahrzeugs beim Ausparken, Fahrerassistenzsystem und Kraftfahrzeug
DE102018221864A1 (de) Verfahren und System zum Bestimmen einer Trajektorie eines Fahrzeugs
DE102013001867A1 (de) Verfahren zur Ermittlung einer Fahrzeugorientierung und/oder einer korrigierten Fahrzeugposition eines Kraftfahrzeugs und Kraftfahrzeug
EP1724726B1 (fr) Méthode et dispositif pour la mesure d'un espace de stationnement à l'aide d'une mono-caméra
DE102016209242B4 (de) Einheiteneinstellvorrichtung und einheiteneinstellverfahren
DE102010056217A1 (de) Verfahren zum Betreiben eines Fahrerassistenzsystems in einem Kraftfahrzeug, Fahrerassistenzsystem und Kraftfahrzeug
DE102016009756A1 (de) Verfahren zum Kalibrieren eines Sensorsystems eines Kraftfahrzeugs sowie Sensorsystem
DE102018104963A1 (de) Verfahren zum Unterstützen eines Benutzers eines Kraftfahrzeugs beim Einparken auf einem Parkplatz, entsprechendes Computerprogrammprodukt, Unterstützungssystem und Kraftfahrzeug
DE102019213930B4 (de) Verfahren zur Optimierung der Umgebungserkennung bei einem Fahrzeug
DE102011111856B4 (de) Verfahren und Vorrichtung zur Detektion mindestens einer Fahrspur in einem Fahrzeugumfeld
DE102020123808A1 (de) Verfahren und System zum Erfassen von Umgebungsdaten zur Absicherung einer Umfelderkennung
DE102020111471A1 (de) Verfahren und System zur Bilderkennung für ein automatisiert fahrendes Fahrzeug
EP2172826B1 (fr) Dispositif et procédé destinés à déterminer un parcours de voie de circulation
DE102018120966A1 (de) Verfahren zum Erkennen eines Teils eines Anhängers sowie Anhängererfassungssystem für ein Zugfahrzeug

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17716146

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17716146

Country of ref document: EP

Kind code of ref document: A1