WO2019016114A1 - Method for operating a driver assistance system for a motor vehicle with renewed storage of training data, driver assistance system as well as motor vehicle - Google Patents

Method for operating a driver assistance system for a motor vehicle with renewed storage of training data, driver assistance system as well as motor vehicle

Info

Publication number
WO2019016114A1
Authority
WO
WIPO (PCT)
Prior art keywords
object features
stored
motor vehicle
trajectory
assistance system
Prior art date
Application number
PCT/EP2018/069198
Other languages
French (fr)
Inventor
Ciaran Hughes
Catherine Enright
Jonathan Horgan
Gustavo Pelaez
Olivia Donnellan
German Feijoo
Original Assignee
Connaught Electronics Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Connaught Electronics Ltd. filed Critical Connaught Electronics Ltd.
Publication of WO2019016114A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3863Structures of map data
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D15/00Steering not otherwise provided for
    • B62D15/02Steering position indicators ; Steering position determination; Steering aids
    • B62D15/027Parking aids, e.g. instruction means
    • B62D15/0285Parking performed automatically
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0112Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0125Traffic data processing
    • G08G1/0129Traffic data processing for creating historical data or processing based on historical data
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0137Measuring and analyzing of parameters relative to traffic conditions for specific applications
    • G08G1/0145Measuring and analyzing of parameters relative to traffic conditions for specific applications for active traffic flow control
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968Systems involving transmission of navigation instructions to the vehicle
    • G08G1/096805Systems involving transmission of navigation instructions to the vehicle where the transmitted instructions are used to compute a route
    • G08G1/096827Systems involving transmission of navigation instructions to the vehicle where the transmitted instructions are used to compute a route where the route is computed onboard

Definitions

  • the present invention relates to a method for operating a driver assistance system for a motor vehicle, wherein in a training phase of the driver assistance system, during which the motor vehicle is maneuvered manually by a driver along a trajectory, the trajectory is stored, and on the basis of at least one image, which is provided by means of a camera of the motor vehicle, a plurality of object features, which describe objects in an environmental region of the motor vehicle, is stored.
  • the present invention relates to a driver assistance system for a motor vehicle.
  • the present invention relates to a motor vehicle.
  • driver assistance systems are known by means of which the motor vehicle can be maneuvered along a previously recorded trajectory.
  • the driver assistance system can first be operated in a training phase.
  • the motor vehicle is manually maneuvered by a driver of the motor vehicle.
  • the trajectory which the motor vehicle travels during manual maneuvering, is recorded by the driver assistance system.
  • objects in the environment of the motor vehicle are detected and object features which describe these objects are stored.
  • In a subsequent operating phase of the driver assistance system, the motor vehicle can then be maneuvered semi-autonomously or autonomously along the recorded trajectory.
  • In the driver assistance systems known from the prior art, the motor vehicle is maneuvered between a starting position and a target position along the trajectory.
  • the motor vehicle can be maneuvered along the stored trajectory if it is detected that the motor vehicle has reached the starting position of the trajectory.
  • the previously stored object features can be used as landmarks.
  • the driver of the motor vehicle can be supported, for example, in parking maneuvers in a home zone.
  • the driver can be assisted in the daily parking of the motor vehicle into a garage of his home or in a parking lot at his workplace.
  • It may be the case that the environment in which the trajectory has been recorded changes over time. For example, the environment may vary depending on the time of the year.
  • DE 10 2014 018 192 A1 describes a method for controlling a vehicle in which a target trajectory is determined from driving data of the vehicle, which is used as the basis for the vehicle during automatic travel.
  • a trajectory determined from current vehicle travel data is compared with a target trajectory, and a deviation between the new trajectory and the target trajectory is determined.
  • a new target trajectory is set depending on the deviation.
  • a method for operating a driver assistance system for a motor vehicle in a training phase of the driver assistance system, during which the motor vehicle is preferably maneuvered manually by a driver along a trajectory, the trajectory is stored, and in particular on the basis of at least one image, which in particular is provided by means of a camera of the motor vehicle, a plurality of object features, which describe in particular objects in an environmental region of the motor vehicle, is stored.
  • the motor vehicle is maneuvered along the stored trajectory at least semi-autonomously, in particular on the basis of the stored trajectory and the stored object features.
  • a plurality of object features is detected in the operating mode.
  • the detected object features are assigned in particular to the stored object features. In particular, it is decided by means of the assignment whether renewed storage of the object features and/or the trajectory is required.
  • a method according to the invention serves to operate a driver assistance system for a motor vehicle.
  • a training phase of the driver assistance system during which the motor vehicle is manually maneuvered by a driver along a trajectory, the trajectory is recorded and, on the basis of at least one image which is provided by means of a camera of the motor vehicle, a plurality of object features which describe objects in an environmental region of the motor vehicle, are stored.
  • the motor vehicle is maneuvered along the stored trajectory at least semi-autonomously on the basis of the stored trajectory and the stored object features.
  • a plurality of object features are detected in the operating mode, the detected object features are assigned to the stored object features, and a decision is made on the basis of the assignment whether renewed storage of the object features and/or the trajectory is required.
  • the driver assistance system of the motor vehicle is first to be trained in the training phase or a learning phase.
  • the motor vehicle can then be maneuvered on the basis of the training using the driver assistance system.
  • the method can be used to drive the motor vehicle into a garage or a parking space of the driver.
  • the method can also be used to drive the motor vehicle to a parking space at the driver's workplace.
  • the driver specifies the trajectory which extends from a starting position to a target position. Along this trajectory, the motor vehicle can then be maneuvered at least semi-autonomously at a later stage in the operating phase.
  • the motor vehicle is operated exclusively manually by the driver.
  • the trajectory which the driver specifies through manual maneuvering, is recorded using the driver assistance system.
  • the set steering angle and/or the revolutions of at least one wheel of the motor vehicle can be recorded continuously.
  • the start position and the target position of the trajectory can be stored. It can also be provided that the start position, the target position and/or the trajectory are determined by means of a satellite-based position determination system.
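The odometry-based trajectory recording described above (steering angle plus wheel revolutions) can be sketched as a dead-reckoning loop. This is a minimal illustration, not the patented implementation; the bicycle-model geometry and the parameter names and values (wheel circumference, wheelbase) are assumptions chosen for the example:

```python
import math

def integrate_pose(pose, steering_angle, wheel_revs,
                   wheel_circumference=2.0, wheelbase=2.7):
    """Advance an (x, y, heading) pose by dead reckoning using a simple
    bicycle model. Parameter values are illustrative."""
    x, y, theta = pose
    distance = wheel_revs * wheel_circumference  # distance travelled this step
    if abs(steering_angle) < 1e-6:
        # Straight-line motion
        x += distance * math.cos(theta)
        y += distance * math.sin(theta)
    else:
        # Motion along a circular arc defined by the steering angle
        turn_radius = wheelbase / math.tan(steering_angle)
        dtheta = distance / turn_radius
        x += turn_radius * (math.sin(theta + dtheta) - math.sin(theta))
        y += turn_radius * (math.cos(theta) - math.cos(theta + dtheta))
        theta += dtheta
    return (x, y, theta)

def record_trajectory(samples, start=(0.0, 0.0, 0.0)):
    """Store the trajectory as a list of poses, one per odometry sample
    (steering_angle, wheel_revs)."""
    trajectory = [start]
    for steering_angle, wheel_revs in samples:
        trajectory.append(integrate_pose(trajectory[-1],
                                         steering_angle, wheel_revs))
    return trajectory
```

In practice the start and target positions would additionally be anchored, e.g. by the satellite-based position determination mentioned above.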
  • At least one object in the environmental region of the motor vehicle is detected during the training phase.
  • several objects are detected in the environmental region.
  • the environmental region of the motor vehicle can be detected with the camera of the motor vehicle or the driver assistance system.
  • a plurality of cameras can be used to detect the environmental region.
  • the object features which describe the objects in the environmental area can then be recognized in the images.
  • further environmental sensors of the driver assistance system can also be used to detect the objects.
  • object features can also be determined for an object. These object features and their position can then be stored in a digital environmental map, which describes the environmental region of the motor vehicle.
  • the trajectory and/or points of the trajectory can also be stored in this environment map.
  • In this way, an environmental map is obtained which describes the position of the object features relative to the stored trajectory.
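One possible in-memory layout for such an environmental map, holding both the trajectory points and the object features as landmarks, is sketched below. All class and field names are illustrative assumptions, not taken from the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class ObjectFeature:
    position: tuple           # (x, y) landmark position in map coordinates
    descriptor: list          # visual descriptor vector (illustrative)
    last_seen_phase: int = 0  # operating phase in which it was last matched

@dataclass
class EnvironmentalMap:
    trajectory: list = field(default_factory=list)  # (x, y, heading) poses
    features: list = field(default_factory=list)    # stored ObjectFeature landmarks

    def add_feature(self, position, descriptor):
        self.features.append(ObjectFeature(position, descriptor))

    def feature_offsets(self):
        """Landmark positions relative to the trajectory start, mirroring the
        map's 'relative to the stored trajectory' layout."""
        if not self.trajectory:
            return [f.position for f in self.features]
        x0, y0 = self.trajectory[0][0], self.trajectory[0][1]
        return [(fx - x0, fy - y0)
                for (fx, fy) in (f.position for f in self.features)]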
  • the stored object features can be used as landmarks for orientation in the subsequent operating phase of the driver assistance system.
  • the motor vehicle is maneuvered semi-autonomously by means of the driver assistance system.
  • the driver assistance system engages in the steering and the driver of the motor vehicle continues to actuate the accelerator pedal and the brake pedal.
  • the motor vehicle is maneuvered fully automatically along the trajectory.
  • the driver assistance system also takes over the intervention in a drive motor and a braking system of the motor vehicle.
  • a plurality of object features is detected in the operating phase. These detected object features are then assigned to the stored object features. This means that object features are also detected during the subsequent operating phase by means of the at least one image which is provided with the camera. These detected object features are then assigned to the object features previously stored in the training phase. Based on the assignment, the motor vehicle can then be maneuvered along the stored trajectory. In the present case, the assignment of the detected object features to the stored object features is used to verify whether renewed storage of the object features and/or renewed recording of the trajectory is required.
  • the driver assistance system can be automatically adjusted to changing environmental conditions. These changes in the environment can be caused by changes in the objects due to the season, lighting and/or weather conditions. Thus, the driver assistance system can be operated more efficiently and in a more user-friendly manner overall.
  • the object features and/or the trajectory are stored anew during a subsequent operating phase if it is decided that a renewed storage of the object features and/or the trajectory is required. If the assignment of the detected object features to the stored object features shows that a new recording of the training data is required, this can be carried out automatically in the subsequent operating phase.
  • the driver assistance system can automatically adapt to the changing environmental conditions, whereby reliable operation can be guaranteed.
  • This automatic renewed storage of the object features and/or the trajectory can thus be carried out continuously without the user or the driver perceiving it. In this way, transitions in ambient conditions can also be handled, and the driver assistance system can be operated in a more user-friendly manner.
  • the assignment of the detected object features to the stored object features is carried out by means of a visual method of simultaneous localization and mapping.
  • a so-called VSLAM algorithm (VSLAM: Visual Simultaneous Localization and Mapping) can be used for this purpose.
  • algorithms related to VSLAM may be used, for example Structure From Motion and/or Bundle Adjustment.
  • the object features can be determined during the training phase and be entered into the environment map.
  • points describing the trajectory can be entered into the environment map.
  • the algorithm is used to assign the detected object features to the stored object features determined in the training phase. Based on the assignment of the detected object features to the stored object features, the algorithm can determine the current position of the motor vehicle relative to the recorded trajectory.
  • the assignment of the object features can be carried out in a reliable manner.
  • a visual descriptor describing the respective object feature in the image is determined for the stored object features and/or the detected object features, and the assignment is determined by means of the visual descriptor.
  • One or more visual descriptors can be determined and stored for the stored object features.
  • visual descriptors can be determined for the detected object features in the operating phase.
  • the visual descriptors of the detected object features can in particular be assigned to the visual descriptors of the stored object features.
  • the respective visual descriptor is, in particular, an algorithmic representation of what the object feature "looks" like in the image.
  • the respective visual descriptor may describe a color, shape, texture or the like of the features. This makes a reliable assignment of the detected object features to the stored object features possible.
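The descriptor-based assignment of detected object features to stored object features can be illustrated as a nearest-neighbour match over descriptor vectors. This is a sketch under simplifying assumptions: real descriptors (e.g. in a VSLAM pipeline) are high-dimensional, and the distance threshold here is an arbitrary example value:

```python
import math

def descriptor_distance(d1, d2):
    """Euclidean distance between two descriptor vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(d1, d2)))

def match_features(detected, stored, max_distance=0.5):
    """Assign each detected descriptor to its nearest stored descriptor,
    keeping only matches below a distance threshold. Returns a list of
    (detected_index, stored_index) pairs. Threshold value is illustrative."""
    matches = []
    for i, d in enumerate(detected):
        best_j, best_dist = None, float("inf")
        for j, s in enumerate(stored):
            dist = descriptor_distance(d, s)
            if dist < best_dist:
                best_j, best_dist = j, dist
        if best_j is not None and best_dist <= max_distance:
            matches.append((i, best_j))
    return matches
```

Detected features whose best match exceeds the threshold remain unassigned; as described below, such unmatched features feed into the decision whether renewed storage is required.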
  • In addition, a quality measure is determined which describes the matching between the detected object features and the stored object features, and it is decided on the basis of the quality measure whether renewed storage of the object features and/or the trajectory is necessary.
  • the quality measure is determined by means of the matching of the visual descriptors of the detected object features with the visual descriptors of the stored features. This quality measure describes, in particular, how good the matching of the detected feature with the stored features is.
  • the quality measure can, for example, be compared with a predetermined limit value.
  • a number of the detected object features which are assigned to the stored object features is determined, and this number is used to decide whether the object features and/or the trajectory need to be stored anew.
  • a further criterion, which determines whether the renewed storing of the training data is required, is the number of valid matches in the assignment of the detected object features to the stored object features. When the object features are extracted from the image in the operating phase, not every stored object feature can be matched to a detected object feature. If the environmental conditions change, this number will decrease. If, for example, this number falls below a predefined threshold value, it can be decided that a renewed determination of the object features and/or the trajectory is necessary.
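The two criteria just described (a quality measure of the matching and the count of valid assignments) can be combined into a single retraining decision. The following sketch assumes both criteria are compared against fixed limits; the threshold values and function names are illustrative, not taken from the disclosure:

```python
def needs_retraining(matches, stored_count, quality_scores,
                     min_match_ratio=0.6, min_mean_quality=0.7):
    """Decide whether the object features and/or the trajectory should be
    stored anew. Both limit values are illustrative:
    - match ratio: fraction of stored features with a valid assignment
    - quality: mean matching score of the valid assignments."""
    if stored_count == 0:
        return True  # nothing stored yet: training data must be recorded
    match_ratio = len(matches) / stored_count
    mean_quality = (sum(quality_scores) / len(quality_scores)
                    if quality_scores else 0.0)
    return match_ratio < min_match_ratio or mean_quality < min_mean_quality
```

As noted below for the threshold choice, the limits would be set so that the current operating phase can still be completed reliably while the change in ambient conditions is already detectable.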
  • one of the stored object features is deleted if no object feature is assigned to this object feature in a predetermined number of operating phases and/or for a predetermined time duration in the operating phases. It can be provided in particular that information is assigned to each of the stored object features which describes the time since this object feature was last detected in an operating phase. This time can be a physical time, for example a date or time of day, or the number of operating phases carried out. If the time exceeds a predetermined limit, this stored object feature can be deleted. Here, for example, it can be assumed that this stored object feature describes an object which is no longer present, or which has been stored incorrectly.
  • a further object feature is stored if it is detected in a predetermined number of operating phases. In each operating phase, object features are recognized which cannot be assigned to any of the stored object features. If these object features are detected in several successive operating phases, it can be assumed that they originate from a new object or from a change in the environmental conditions. If these object features are detected in several operating phases or a predetermined number of operating phases, they can be stored. The selection of the newly stored object features can be determined on the basis of criteria which describe the quality of the object feature. Here, for example, the optical saliency and/or the noise in reconstruction can be taken into account. In addition, it can be verified whether the object feature can be assigned to further object features or whether it is an outlier.
  • the number of newly stored features can be limited. For example, the number of newly stored object features can correspond to the number of previously deleted object features. Thus, the memory space can be limited and the computational effort can be reduced.
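The pruning and capped addition described in the preceding points can be sketched as one map-maintenance step. The dictionary fields, the phase limit, and the admission cap equal to the number of deletions are illustrative assumptions:

```python
def update_stored_features(stored, matched_ids, candidates,
                           current_phase, max_unseen_phases=5):
    """Prune landmarks that have gone unmatched for too long, then admit new
    candidate features, capped at the number just deleted so the map size
    stays bounded. `stored` is a list of dicts with a 'last_seen_phase'
    field; field names and the phase limit are illustrative."""
    for i in matched_ids:                       # refresh matched landmarks
        stored[i]["last_seen_phase"] = current_phase
    kept = [f for f in stored
            if current_phase - f["last_seen_phase"] <= max_unseen_phases]
    n_deleted = len(stored) - len(kept)
    for cand in candidates[:n_deleted]:         # admit at most n_deleted new ones
        cand["last_seen_phase"] = current_phase
        kept.append(cand)
    return kept
```

Because admissions are bounded by deletions, the landmark count never grows, which matches the memory and computation limits mentioned above.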
  • a recording information for the respective object features is stored in the training phase and/or in the operating phase, and the assignment is performed using the respective recording information.
  • additional recording information can be determined, which is then assigned to the stored object features.
  • This recording information can describe the recording of the image by means of the camera and/or the ambient conditions.
  • recording information for the detected object features can be determined during the operating phase. When comparing the detected object features with the stored object features, this recording information can then be taken into account. For example, it can be verified whether the recording information is similar or comparable. In this way, the robustness in the assignment can be improved.
  • the recording information describes a lighting situation in the environmental region, a time of day and/or a season. It is thus possible, for example, to take account of the fact that different illumination situations prevail during the storage of the object features in the training phase and during the detection of the object features in the operating phase. If, for example, object features have been stored under bright ambient conditions, it may be the case that these are not detected under dark conditions in the operating phase. If the stored object features were determined at a different time of the year than that at which the object features are detected, differences may also occur. For example, it is possible to store object features which describe leaves on trees or plants, which are not present during an operating phase which takes place in winter. Taking the recording information into account enables a reliable assignment of the object features during the operating phase.
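A simple way to use such recording information is as a compatibility gate before feature matching: two recordings are only compared when their contexts are similar enough. The dictionary keys (`hour`, `season`) and the limits below are illustrative assumptions:

```python
def compatible_context(stored_info, detected_info,
                       max_hour_diff=3, same_season=True):
    """Check whether two recording contexts are comparable before matching
    their features. Keys and limit values are illustrative assumptions."""
    hour_diff = abs(stored_info["hour"] - detected_info["hour"])
    hour_diff = min(hour_diff, 24 - hour_diff)  # wrap around midnight
    if hour_diff > max_hour_diff:
        return False  # e.g. daylight features vs. night-time detection
    if same_season and stored_info["season"] != detected_info["season"]:
        return False  # e.g. summer foliage features vs. winter scene
    return True
```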
  • a driver assistance system for a motor vehicle is adapted for performing a method according to the invention and the advantageous embodiments thereof.
  • the driver assistance system can comprise the camera by means of which the objects can be detected in the environment of the motor vehicle.
  • the driver assistance system can have a motion sensor with which the movement of the motor vehicle can be recorded in the training phase.
  • the driver assistance system can comprise an electronic control device by means of which the trajectory can be recorded and the object features and/or the associated visual descriptors can be stored.
  • a steering, a drive motor and/or a braking system can be controlled during the at least semi-autonomous maneuvering by means of the control device.
  • the driver assistance system may have a satellite-based position determination system.
  • a motor vehicle according to the invention comprises a driver assistance system according to the invention.
  • the motor vehicle is preferably designed as a passenger car.
  • Fig. 1 a motor vehicle according to an embodiment of the invention, which comprises a driver assistance system;
  • Fig. 2 the motor vehicle, which is manually maneuvered along a trajectory while the driver assistance system is operated in a training phase;
  • Fig. 3 the motor vehicle which is maneuvered along a trajectory in an operating phase of the driver assistance system
  • Fig. 4 a digital environmental map describing the trajectory, object features stored in the training phase as well as object features detected in the operating phase.
  • Fig. 1 shows a motor vehicle 1 according to an embodiment of the present invention in a plan view.
  • the motor vehicle 1 is designed as a passenger car.
  • the motor vehicle 1 comprises a driver assistance system 2.
  • the driver assistance system 2 comprises, in turn, an electronic control unit 3.
  • the driver assistance system 2 comprises at least one camera 4.
  • the driver assistance system 2 comprises four cameras 4 which are arranged distributed on the motor vehicle 1.
  • one of the cameras 4 is arranged in a rear region 5 of the motor vehicle 1
  • one of the cameras 4 is arranged in a front region 7 of the motor vehicle 1
  • the other two cameras 4 are arranged in a respective side region 6, in particular in a region of the side mirrors of the motor vehicle 1.
  • the number and arrangement of the cameras 4 is to be understood as purely exemplary.
  • objects 10 can be detected in an environmental region 8 of the motor vehicle 1.
  • images or image sequences can be provided with the cameras 4, which describe the objects 10 in the environmental region 8. These images can then be transmitted from the respective cameras 4 to the control unit 3.
  • the control device 3 can then recognize the objects 10 in the environmental region 8.
  • the driver assistance system 2 comprises a motion sensor 9, by means of which a movement of the motor vehicle 1 can be recorded. For example, a steering angle and/or revolutions of at least one wheel of the motor vehicle 1 can be determined continuously with the movement sensor 9.
  • the movement sensor 9 is also connected to the control device 3 for data transmission.
  • the motion sensor 9 can comprise a receiver for a satellite-based position determination system. Thereby, the position of the motor vehicle 1 can be determined continuously.
  • Fig. 2 shows the motor vehicle 1 , wherein the driver assistance system 2 is operated in a training phase.
  • the motor vehicle 1 is manually maneuvered along a trajectory 11 by a driver of the motor vehicle 1.
  • This trajectory 11, which comprises a plurality of points 12, extends from a starting point 13 to a target point 14.
  • the target point 14 is located within a garage 15.
  • During the manual maneuvering, the trajectory 11 or its points 12 can be determined by means of the motion sensor 9.
  • Fig. 3 shows the motor vehicle 1 , the driver assistance system 2 being operated in an operating phase which follows the training phase in time.
  • the trajectory 11, which was determined in the preceding training phase, is shown schematically.
  • the motor vehicle 1 can be maneuvered along the trajectory 11 to the target point 14 in the garage 15. This applies in the event that there is no obstacle on the trajectory 11.
  • object features 19 are detected and assigned to the object features 18 stored in the training phase.
  • Fig. 4 shows a representation of an environmental map 20 which shows the trajectory 11, the points 12 of the trajectory 11, the stored object features 18, and the detected object features 19 which were assigned to the stored object features 18.
  • the detected object features 19 are assigned to the stored object features 18 by means of a visual algorithm for simultaneous localization and mapping. In the present case, it can be seen that the number of detected object features 19 is smaller than the number of object features 18 stored in the training phase.
  • at least one visual descriptor can be determined for the stored object features 18 and be stored in the digital environmental map 20.
  • visual descriptors can also be determined for the detected object features 19.
  • the visual descriptors of the detected object features 19 can then be assigned to the visual descriptors of the stored object features 18.
  • the motor vehicle 1 can thus be located on the basis of the known points 12 of the trajectory 11.
  • the detected object features 19 are assigned to the stored object features 18 by means of the algorithm. Based on this assignment, a quality measure can be determined which describes the quality of the assignment or the matching. This quality measure can then be used as a criterion as to whether a renewed recording of the stored object features 18 and/or the trajectory 11 is necessary. As a further criterion, the number of detected object features 19 validly assigned to the stored object features 18 can be used. When the object features 19 are extracted from the image, not all object features 19 can be assigned to the stored object features 18. If the environmental conditions change, the number of valid assignments will decrease significantly.
  • the limit value or the threshold value can be determined in such a way that the operating phase can still be performed reliably while there are already indications that the ambient conditions have changed. The renewed storage of the object features 18 and/or of the trajectory 11 is then carried out.
  • the stored object features 18 and/or the visual descriptors of the stored object features 18 are dynamically adapted during operation of the driver assistance system 2.
  • stored object features 18 which are never recognized in the operating phase can be deleted.
  • an initial training phase is required in which the driver explicitly specifies the trajectory 11 along which the motor vehicle 1 is to be maneuvered.
  • the object features 18 can also be assigned information describing the time since this object feature 18 was last found in the operating phase. If this time exceeds a threshold value, this stored object feature 18 can be deleted from the digital environmental map 20.
  • new object features 19 are detected which cannot be assigned to any of the stored object features 18.
  • this object feature 19 is assigned to a new object 10 in the environmental region 8. If this object feature 19 is recognized in a predetermined number of operating phases, this object feature 19 can also be stored or used as a stored object feature 18.
  • additional recording information is added to the object features 18.
  • This recording information may, for example, describe the time of day, the season, the lighting state or the like. This recording information can then be taken into account when the detected object features 19 are assigned to the stored object features 18. Furthermore, this recording information can also be taken into account in the deletion of stored object features 18.
  • the stored trajectory 11 and/or the stored object features 18 are to be recorded again without user input by the driver.
  • the renewed storage of these training data can then be carried out in the subsequent operating phase.
  • the training data can be updated continuously and be adapted to the gradually changing environmental conditions. It is not necessary for the driver to manually trigger the renewed recording of the training data. This overall enables a reliable operation of the driver assistance system 2.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Transportation (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

The invention relates to a method for operating a driver assistance system (2) for a motor vehicle (1), wherein in a training phase of the driver assistance system (2), during which the motor vehicle (1) is maneuvered manually by a driver along a trajectory (11), the trajectory (11) is stored, and on the basis of at least one image, which is provided by means of a camera (4) of the motor vehicle (1), a plurality of object features (18), which describe objects in an environmental region (8) of the motor vehicle (1), is stored, and in an operating phase of the driver assistance system (2) the motor vehicle (1) is maneuvered at least semi-autonomously along the stored trajectory (11) on the basis of the stored trajectory (11) and the stored object features (18), wherein a plurality of object features (19) are detected in the operating mode, the detected object features (19) are assigned to the stored object features (18), and a decision is made on the basis of the assignment whether renewed storage of the object features (18) and/or the trajectory (11) is required.

Description

Method for operating a driver assistance system for a motor vehicle with renewed storage of training data, driver assistance system as well as motor vehicle
The present invention relates to a method for operating a driver assistance system for a motor vehicle, wherein in a training phase of the driver assistance system, during which the motor vehicle is maneuvered manually by a driver along a trajectory, the trajectory is stored, and on the basis of at least one image, which is provided by means of a camera of the motor vehicle, a plurality of object features, which describe objects in an
environmental region of the motor vehicle, is stored, and in an operating phase of the driver assistance system the motor vehicle is maneuvered at least semi-autonomously along the stored trajectory on the basis of the stored trajectory and the stored object features. In addition, the present invention relates to a driver assistance system for a motor vehicle. Finally, the present invention relates to a motor vehicle.
From the prior art, driver assistance systems are known by means of which the motor vehicle can be maneuvered along a previously recorded trajectory. To record the trajectory, the driver assistance system can first be operated in a training phase. In the training phase, the motor vehicle is manually maneuvered by a driver of the motor vehicle. The trajectory, which the motor vehicle travels during manual maneuvering, is recorded by the driver assistance system. Furthermore, it is provided that during manual maneuvering, objects in the environment of the motor vehicle are detected and object features which describe these objects are stored.
In a subsequent operating phase of the driver assistance system, the motor vehicle can then be maneuvered semi-autonomously or autonomously along the recorded trajectory. In the driver assistance systems known from the prior art, the motor vehicle is
maneuvered between a starting position and a target position along the trajectory. For example, the motor vehicle can be maneuvered along the stored trajectory if it is detected that the motor vehicle has reached the starting position of the trajectory. In the case of the at least semi-autonomous maneuvering of the motor vehicle, the previously stored object features can be used as landmarks. In this way, the driver of the motor vehicle can be supported, for example, in parking maneuvers in a home zone. For example, the driver can be assisted in the daily parking of the motor vehicle in a garage at his home or in a parking lot at his workplace. It may be the case that the environment in which the trajectory has been recorded changes over time. For example, the environment may vary depending on the time of the year. If the training phase has been carried out in summer, leaves are present on the trees, which are recorded as objects. These leaves are no longer present in winter, which significantly changes the detected objects and their object features. In this case, a new training phase, i.e. a renewed recording of the trajectory and the object features, is necessary. According to the prior art, it is provided that this renewed recording of the object features and the trajectory is triggered by the driver by means of an operator input.
In this context, DE 10 2014 018 192 A1 describes a method for controlling a vehicle in which a target trajectory is determined from driving data of the vehicle, which is used as the basis for the vehicle during automatic travel. In a modified environment of the vehicle, a trajectory determined from current vehicle travel data is compared with a target trajectory, and a deviation between the new trajectory and the target trajectory is determined. A new target trajectory is set depending on the deviation.
It is an object of the present invention to provide a solution as to how a driver assistance system, in which a trajectory is recorded in a learning phase and an at least semi-autonomous maneuvering along the trajectory is performed in an operating phase, can be operated more efficiently and in a more user-friendly manner.
According to the invention, this object is achieved by a method, by a driver assistance system as well as by a motor vehicle having the features according to the respective independent claims. Advantageous embodiments of the invention are the subject matter of the dependent claims.
According to one embodiment of a method for operating a driver assistance system for a motor vehicle, in a training phase of the driver assistance system, during which the motor vehicle is preferably maneuvered manually by a driver along a trajectory, the trajectory is stored, and in particular on the basis of at least one image, which in particular is provided by means of a camera of the motor vehicle, a plurality of object features, which describe in particular objects in an environmental region of the motor vehicle, is stored. In an operating phase of the driver assistance system, the motor vehicle is maneuvered along the stored trajectory at least semi-autonomously, in particular on the basis of the stored trajectory and the stored object features. In this case, it is in particular provided that a plurality of object features is detected in the operating mode. In addition, the detected object features are assigned in particular to the stored object features. In particular, it is decided by means of the assignment whether renewed storage of the object features and/or the trajectory is required.
A method according to the invention serves to operate a driver assistance system for a motor vehicle. In a training phase of the driver assistance system, during which the motor vehicle is manually maneuvered by a driver along a trajectory, the trajectory is recorded and, on the basis of at least one image which is provided by means of a camera of the motor vehicle, a plurality of object features which describe objects in an environmental region of the motor vehicle, are stored. In an operating phase of the driver assistance system, the motor vehicle is maneuvered along the stored trajectory at least semi-autonomously on the basis of the stored trajectory and the stored object features. In this case, it is also provided that a plurality of object features are detected in the operating mode, the detected object features are assigned to the stored object features, and a decision is made on the basis of the assignment whether renewed storage of the object features and/or the trajectory is required.
By means of the method, the driver assistance system of the motor vehicle is first to be trained in the training phase or a learning phase. In the subsequent operating phase or the subsequent operating phases, the motor vehicle can then be maneuvered on the basis of the training using the driver assistance system. For example, the method can be used to drive the motor vehicle into a garage or a parking space of the driver. The method can also be used to drive the motor vehicle to a parking space at the driver's workplace. In this case, it is provided that, during the training phase, the driver specifies the trajectory which extends from a starting position to a target position. Along this trajectory, the motor vehicle can then be maneuvered at least semi-autonomously at a later stage in the operating phase. In the training phase, the motor vehicle is operated exclusively manually by the driver. In the training phase, the trajectory, which the driver specifies through manual maneuvering, is recorded using the driver assistance system. For this purpose, for example, the set steering angle and/or the revolutions of at least one wheel of the motor vehicle can be recorded continuously. In addition, the start position and the target position of the trajectory can be stored. It can also be provided that the start position, the target position and/or the trajectory are determined by means of a satellite-based position determination system.
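For illustration, the recording of the trajectory from the set steering angle and the revolutions of at least one wheel can be sketched with a simple dead-reckoning model. The function name, the bicycle-model simplification and the default parameters below are assumptions made for this sketch and are not part of the claimed method:

```python
import math

def record_trajectory(samples, wheel_circumference=2.0, wheelbase=2.7):
    """Integrate (steering_angle_rad, wheel_revolutions) samples into
    trajectory points (x, y) by bicycle-model dead reckoning.
    Illustrative sketch only; a real system would additionally fuse a
    satellite-based position determination system."""
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for steering_angle, revolutions in samples:
        distance = revolutions * wheel_circumference
        heading += distance * math.tan(steering_angle) / wheelbase
        x += distance * math.cos(heading)
        y += distance * math.sin(heading)
        points.append((x, y))
    return points
```

Driving straight (steering angle 0) for two samples of one wheel revolution each would, under these assumed parameters, yield points advancing along the x-axis.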
In addition, at least one object in the environmental region of the motor vehicle is detected during the training phase. Preferably, several objects are detected in the environmental region. During manual maneuvering, the environmental region of the motor vehicle can be detected with the camera of the motor vehicle or the driver assistance system. In particular, a plurality of cameras can be used to detect the environmental region. With the at least one camera, images describing the environmental region are provided. By means of a corresponding object detection algorithm, the object features which describe the objects in the environmental region can then be recognized in the images. In addition, further environmental sensors of the driver assistance system can also be used to detect the objects. Several object features can also be determined for an object. These object features and their position can then be stored in a digital environmental map, which describes the environmental region of the motor vehicle. The trajectory and/or points of the trajectory can also be stored in this environmental map. Thus, there is information in the environmental map which describes the position of the object features relative to the stored trajectory. Thus, the stored object features can be used as landmarks for orientation in the subsequent operating phase of the driver assistance system. In the operating phase it can be provided that the motor vehicle is maneuvered semi-autonomously by means of the driver assistance system. In this case, the driver assistance system engages in the steering and the driver of the motor vehicle continues to actuate the accelerator pedal and the brake pedal. Preferably, the motor vehicle is maneuvered fully automatically along the trajectory. In this case, the driver assistance system also takes over the intervention in a drive motor and a braking system of the motor vehicle.
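The storage of object features together with the trajectory in a digital environmental map can be sketched, for example, with two simple record types. The class and field names below are illustrative assumptions, not terms of the claimed method:

```python
from dataclasses import dataclass, field

@dataclass
class StoredFeature:
    position: tuple           # (x, y) relative to the map origin
    descriptor: tuple         # visual descriptor vector of the feature
    last_seen_phase: int = 0  # operating phase in which it was last matched

@dataclass
class EnvironmentalMap:
    trajectory: list = field(default_factory=list)  # trajectory points (x, y)
    features: list = field(default_factory=list)    # StoredFeature entries

    def add_trajectory_point(self, point):
        self.trajectory.append(point)

    def add_feature(self, position, descriptor):
        self.features.append(StoredFeature(position, descriptor))
```

Because both the trajectory points and the feature positions share one map origin, the relative position of the landmarks with respect to the stored trajectory is implicit in the map.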
According to an essential aspect of the present invention, it is provided that a plurality of object features is detected in the operating phase. These detected object features are then assigned to the stored object features. This means that object features are also detected during the subsequent operating phase or the subsequent operating phases by means of the at least one image which is provided with the camera. These detected object features are then assigned to the object features previously stored in the training phase. Based on the assignment, the motor vehicle can then be maneuvered along the stored trajectory. In the present case, the assignment of the detected object features to the stored object features is used to verify whether a renewed storage of the object features and/or a renewed recording of the trajectory is required. On the basis of the assignment of the detected object features to the stored object features, it is possible to automatically recognize whether the environmental region or the region in which the object features have been detected has changed. In other words, by means of the assignment it is verified whether it is necessary to perform a new training phase. In particular, it is determined whether the stored object features and/or the trajectory, which have been determined as training data, must be determined again. Thus, it can be recognized automatically whether a renewed determination of the training data or a renewed execution of a training phase is required, without requiring a manual input from the user or the driver. In this way, the driver assistance system can be automatically adjusted to changing environmental conditions. These changes in the environment can be caused by changes in the objects due to the season, lighting and/or weather conditions. Thus, the driver assistance system can be operated more efficiently and in a more user-friendly manner overall.
Preferably, the object features and/or the trajectory are stored anew during a subsequent operating phase if it is decided that a renewed storage of the object features and/or the trajectory is required. If the assignment of the detected object features to the stored object features shows that a new recording of the training data is required, this can be carried out automatically in the subsequent operating phase. Thus, the driver assistance system can automatically adapt to the changing environmental conditions, whereby reliable operation can be guaranteed. This automatic renewed storage of the object features and/or the trajectory can thus be carried out continuously without the user or the driver perceiving this. In this way, transitions in ambient conditions can also be considered. This applies, for example, to autumn, when trees have only shed part of their leaves. Thus, the driver assistance system can be operated in a more user-friendly manner.
In one embodiment, the assignment of the detected object features to the stored object features is carried out by means of a visual method of simultaneous localization and mapping. In other words, a so-called VSLAM algorithm (VSLAM - Visual Simultaneous Localization and Mapping) is used. In addition or as an alternative, algorithms related to VSLAM may be used, for example Structure From Motion and/or Bundle Adjustment. With the aid of the algorithm, the object features can be determined during the training phase and be entered into the environmental map. In addition, points describing the trajectory can be entered into the environmental map. In the operating phase, the algorithm is used to assign the detected object features to the stored object features determined in the training phase. Based on the assignment of the detected object features to the stored object features, the algorithm can determine the current position of the motor vehicle relative to the recorded trajectory. Thus, the assignment of the object features can be carried out in a reliable manner.
In a further embodiment, a visual descriptor describing the respective object feature in the image is determined for the stored object features and/or the detected object features, and the assignment is determined by means of the visual descriptor. One or more visual descriptors can be determined and stored for the stored object features. In the same way, visual descriptors can be determined for the detected object features in the operating phase. In the assignment of the detected object features to the stored object features, the visual descriptors of the detected object features can in particular be assigned to the visual descriptors of the stored object features. The respective visual descriptor is, in particular, an algorithmic representation of what the object feature "looks" like in the image. The respective visual descriptor may describe a color, shape, texture or the like of the features. This makes a reliable assignment of the detected object features to the stored object features possible.
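The assignment of detected object features to stored object features via their visual descriptors can be sketched, for example, as a nearest-neighbour search with a ratio test. The function name, the plain numeric descriptor tuples and the ratio threshold are assumptions of this sketch; practical systems typically use binary descriptors such as ORB with a Hamming distance:

```python
def match_descriptors(detected, stored, ratio=0.8):
    """Assign each detected descriptor to its nearest stored descriptor,
    accepting a match only if the nearest neighbour is clearly closer than
    the second nearest (Lowe-style ratio test).
    Returns (detected_index, stored_index) pairs."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    matches = []
    for i, d in enumerate(detected):
        ranked = sorted(range(len(stored)), key=lambda j: dist(d, stored[j]))
        if len(ranked) >= 2 and dist(d, stored[ranked[0]]) < ratio * dist(d, stored[ranked[1]]):
            matches.append((i, ranked[0]))
    return matches
```

Ambiguous descriptors, whose two best candidates are similarly close, are rejected rather than assigned, which supports the reliable assignment described above.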
Furthermore, it is advantageous if, on the basis of the assignment, a quality measure is determined which describes a matching between the detected object features and the stored object features, and it is decided on the basis of the quality measure whether the renewed storage of the object features and/or the trajectory is necessary. In particular, it is provided that the quality measure is determined by means of the matching of the visual descriptors of the detected object features with the visual descriptors of the stored features. This quality measure describes, in particular, how good the matching of the detected feature with the stored features is. On the basis of this quality measure, it can then be decided whether a renewed storage of the training data is necessary. For this purpose, the quality measure can, for example, be compared with a predetermined limit value.
In a further embodiment, a number of the detected object features, which are assigned to the stored object features, is determined, and the number is used to decide whether the object features and/or the trajectory need to be stored anew. A further criterion, which determines whether the renewed storing of the training data is required, is the number of valid matches in the assignment of the detected object features to the stored object features. When the object features are extracted from the image in the operating phase, not every stored object feature will have a detected object feature assigned to it. If the environmental conditions change, the number will decrease. If, for example, this number falls below a predefined threshold value, it can be decided that the renewed determination of the object features and/or the trajectory is necessary.
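The two decision criteria described above, a quality measure of the matching and the number of valid matches, can be combined into one sketch. The function name and the default thresholds are illustrative assumptions; the description only requires comparison against predetermined limit values:

```python
def needs_retraining(match_qualities, num_stored,
                     min_quality=0.5, min_match_fraction=0.4):
    """Decide whether the training data must be stored anew.
    match_qualities: one quality value in [0, 1] per valid assignment of a
    detected object feature to a stored object feature.
    num_stored: total number of stored object features."""
    if not match_qualities:
        return True  # nothing could be matched at all
    mean_quality = sum(match_qualities) / len(match_qualities)
    match_fraction = len(match_qualities) / num_stored
    # Renewed storage if either criterion falls below its limit value.
    return mean_quality < min_quality or match_fraction < min_match_fraction
```

In practice the thresholds would be tuned so that the current operating phase can still be completed reliably while the changed ambient conditions are already flagged.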
In a further embodiment, one of the stored object features is deleted if no object feature is assigned to this object feature in a predetermined number of operating phases and/or for a predetermined time duration in the operating phases. It can be provided in particular that information is assigned to each of the stored object features which describes the time since this object feature was last detected in an operating phase. This time can be a physical time, for example a date or time of day, or the number of operating phases carried out. If the time exceeds a predetermined limit, this stored object feature can be deleted. Here, for example, it can be assumed that this stored object feature describes an object which is no longer present, or which has been stored incorrectly.
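Counting operating phases since the last match, the deletion of stale stored object features can be sketched as follows. The dictionary field name `last_seen_phase` and the default limit are assumptions of this sketch:

```python
def prune_stale_features(features, current_phase, max_unseen_phases=10):
    """Drop stored object features that have not been matched for more than
    max_unseen_phases operating phases. Each feature is a dict carrying a
    'last_seen_phase' entry updated whenever a detected feature is assigned
    to it."""
    return [f for f in features
            if current_phase - f["last_seen_phase"] <= max_unseen_phases]
```

A feature last matched in phase 0 would thus be removed by phase 100, while one matched in phase 95 is kept.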
In a further embodiment, a further object feature is stored if it is detected in a predetermined number of operating phases. In each operating phase, object features may be recognized which cannot be assigned to any of the stored object features. If these object features are detected in several successive operating phases, it can be assumed that they originate from a new object or from a change in the environmental conditions. If these object features are detected in several operating phases or a predetermined number of operating phases, they can be stored. The selection of the newly stored object features can be determined on the basis of criteria which describe the quality of the object feature. Here, for example, the optical saliency and/or the noise in reconstruction can be taken into account. In addition, it can be verified whether the object feature can be assigned to further object features or whether it is an outlier. The number of newly stored features can be limited. For example, the number of newly stored object features can correspond to the number of previously deleted object features. Thus, the memory space can be limited and the computational effort can be reduced.
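The promotion of repeatedly detected, previously unassignable object features to stored object features can be sketched as follows. The field names `seen_in_phases` and `saliency`, and the use of saliency as the quality criterion, are illustrative assumptions:

```python
def promote_candidates(candidates, min_phases=3, max_new=None):
    """Promote unmatched object features to stored object features once they
    have been re-detected in at least min_phases operating phases, preferring
    the most salient ones. max_new optionally caps the number of newly stored
    features, e.g. to the number of features deleted beforehand."""
    promoted = [c for c in candidates if c["seen_in_phases"] >= min_phases]
    promoted.sort(key=lambda c: c["saliency"], reverse=True)  # best first
    if max_new is not None:
        promoted = promoted[:max_new]
    return promoted
```

Capping `max_new` to the count of previously deleted features keeps the memory footprint of the environmental map roughly constant, as described above.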
In a further embodiment, recording information for the respective object features is stored in the training phase and/or in the operating phase, and the assignment is performed using the respective recording information. In the training phase, additional recording information can be determined, which is then assigned to the stored object features. This recording information can describe the recording of the image by means of the camera and/or the ambient conditions. In the same way, recording information for the detected object features can be determined during the operating phase. When comparing the detected object features with the stored object features, this recording information can then be taken into account. For example, it can be verified whether the recording information is similar or comparable. In this way, the robustness of the assignment can be improved.
In particular, it is provided that the recording information describes a lighting situation in the environmental region, a time of day and/or a season. It is thus possible, for example, to take account of the fact that different illumination situations prevail during the storage of the object features in the training phase and during the detection of the object features in the operating phase. If, for example, object features have been stored under bright ambient conditions, it may be the case that these are not detected under dark conditions in the operating phase. If the stored object features were determined at a different time of the year than the object features are detected, differences may also occur. For example, it is possible to store object features which describe leaves on trees or plants, which are not present during an operating phase which takes place in winter. This enables a reliable assignment of the object features during the operating phase.
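A compatibility check on such recording information before attempting an assignment can be sketched as follows. The keys `hour` and `season`, as well as the default tolerance, are assumptions of this sketch:

```python
def recording_compatible(stored_info, current_info,
                         max_hour_gap=3, same_season_required=True):
    """Check whether the recording conditions of a stored object feature are
    comparable to the current conditions, so that a failed match is not
    misinterpreted as a changed environment."""
    hour_gap = abs(stored_info["hour"] - current_info["hour"])
    hour_gap = min(hour_gap, 24 - hour_gap)  # wrap around midnight
    if hour_gap > max_hour_gap:
        return False  # e.g. features stored in daylight, now driving at night
    if same_season_required and stored_info["season"] != current_info["season"]:
        return False  # e.g. leaves stored in summer are absent in winter
    return True
```

Only features whose recording conditions pass this check would then enter the quality measure and match-count statistics, which makes the retraining decision more robust.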
A driver assistance system for a motor vehicle according to the invention is adapted for performing a method according to the invention and the advantageous embodiments thereof. The driver assistance system can comprise the camera by means of which the objects can be detected in the environment of the motor vehicle. In addition, the driver assistance system can have a motion sensor with which the movement of the motor vehicle can be recorded in the training phase. In addition, the driver assistance system can comprise an electronic control device by means of which the trajectory can be recorded and the object features and/or the associated visual descriptors can be stored. In addition, a steering system, a drive motor and/or a braking system can be controlled during the at least semi-autonomous maneuvering by means of the control device. Furthermore, the driver assistance system may have a satellite-based position determination system.
A motor vehicle according to the invention comprises a driver assistance system according to the invention. The motor vehicle is preferably designed as a passenger car.
The preferred embodiments presented with respect to the method according to the invention and the advantages thereof correspondingly apply to the driver assistance system according to the invention as well as to the motor vehicle according to the invention.
Further features of the invention are apparent from the claims, the figures and the description of figures. The features and feature combinations mentioned above in the description as well as the features and feature combinations mentioned below in the description of figures and/or shown in the figures alone are usable not only in the respectively specified combination, but also in other combinations or alone without departing from the scope of the invention. Thus, implementations are also to be considered as encompassed and disclosed by the invention, which are not explicitly shown in the figures and explained, but arise from and can be generated by separated feature combinations from the explained implementations. Implementations and feature combinations are also to be considered as disclosed, which thus do not have all of the features of an originally formulated independent claim. Moreover, implementations and feature combinations are to be considered as disclosed, in particular by the implementations set out above, which extend beyond or deviate from the feature combinations set out in the relations of the claims.
Now, the invention is explained in more detail based on preferred embodiments as well as with reference to the attached drawings.
These show in:
Fig. 1 a motor vehicle according to an embodiment of the invention, which comprises a driver assistance system;
Fig. 2 the motor vehicle which is manually maneuvered along a trajectory, wherein the driver assistance system is operated in a training phase;
Fig. 3 the motor vehicle which is maneuvered along a trajectory in an operating phase of the driver assistance system; and
Fig. 4 a digital environmental map describing the trajectory, object features stored in the training phase as well as object features detected in the operating phase.
In the figures, identical or functionally identical elements are provided with the same reference characters.
Fig. 1 shows a motor vehicle 1 according to an embodiment of the present invention in a plan view. In the present case, the motor vehicle 1 is designed as a passenger car. The motor vehicle 1 comprises a driver assistance system 2. The driver assistance system 2 comprises, in turn, an electronic control unit 3. In addition, the driver assistance system 2 comprises at least one camera 4. In the present exemplary embodiment, the driver assistance system 2 comprises four cameras 4 which are arranged distributed on the motor vehicle 1. In the present case, one of the cameras 4 is arranged in a rear region 5 of the motor vehicle 1, one of the cameras 4 is arranged in a front region 7 of the motor vehicle 1, and the other two cameras 4 are arranged in a respective side region 6, in particular in a region of the side mirrors of the motor vehicle 1. The number and arrangement of the cameras 4 is to be understood as purely exemplary.
With the cameras 4, objects 10 can be detected in an environmental region 8 of the motor vehicle 1. For this purpose, images or image sequences can be provided with the cameras 4, which describe the objects 10 in the environmental region 8. These images can then be transmitted from the respective cameras 4 to the control unit 3. By means of a corresponding object identification algorithm, the control unit 3 can then recognize the objects 10 in the environmental region 8.
In addition, the driver assistance system 2 comprises a motion sensor 9, by means of which a movement of the motor vehicle 1 can be recorded. For example, a steering angle and/or revolutions of at least one wheel of the motor vehicle 1 can be determined continuously with the motion sensor 9. The motion sensor 9 is also connected to the control unit 3 for data transmission. In addition, the motion sensor 9 can comprise a receiver for a satellite-based position determination system. Thereby, the position of the motor vehicle 1 can be determined continuously.
Fig. 2 shows the motor vehicle 1, wherein the driver assistance system 2 is operated in a training phase. In the training phase, the motor vehicle 1 is manually maneuvered along a trajectory 11 by a driver of the motor vehicle 1. This trajectory 11, which comprises a plurality of points 12, extends from a starting point 13 to a target point 14. In the present case, the target point 14 is located within a garage 15. The motor vehicle 1 is maneuvered past several objects 10 during maneuvering. In the present case, there are objects 10 in the environmental region 8 in the form of plants 16, a pedestrian 17 as well as the garage 15. During the maneuvering of the motor vehicle 1 in the training phase, object features 18 of the objects 10 are recognized and stored on the basis of the images which are provided with the cameras 4. In addition, the trajectory 11 or the points 12 of the trajectory 11 are stored. The trajectory 11 or its points 12 can be determined by means of the motion sensor 9.
Fig. 3 shows the motor vehicle 1, the driver assistance system 2 being operated in an operating phase which follows the training phase in time. The trajectory 11, which was determined in the preceding training phase, is shown schematically. As soon as it is detected that the motor vehicle 1 is located at the starting point 13 of the stored trajectory 11, the motor vehicle 1 can be maneuvered along the trajectory 11 to the target point 14 in the garage 15. This applies in the event that there is no obstacle on the trajectory 11. During the operating phase, object features 19 are detected and assigned to the object features 18 stored in the training phase.
Fig. 4 shows a representation of an environmental map 20 which shows the trajectory 11, the points 12 of the trajectory 11, the stored object features 18, and the detected object features 19 which were assigned to the stored object features 18. The detected object features 19 are assigned to the stored object features 18 by means of a visual algorithm for simultaneous localization and mapping. In the present case, it can be seen that the number of detected object features 19 is smaller than the number of object features 18 stored in the training phase. In the training phase, at least one visual descriptor can be determined for the stored object features 18 and be stored in the digital environmental map 20. In the subsequent operating phase, visual descriptors can also be determined for the detected object features 19. In the operating phase, the visual descriptors of the detected object features 19 can then be assigned to the visual descriptors of the stored object features 18. The motor vehicle 1 can thus be located on the basis of the known points 12 of the trajectory 11.
Due to changing ambient conditions, differences between the detected object features 19 and the stored object features 18 can result. This may, for example, be due to a difference in the season, the time of day and/or the lighting conditions. If the ambient conditions have changed significantly, a renewed recording of the trajectory 11 and/or the stored object features 18 is necessary in order to be able to reliably operate the driver assistance system 2 in the operating phases. In the present case, it is intended to automatically recognize when a renewed recording of the stored object features 18 and/or the trajectory 11 is required.
In the operating phase, the detected object features 19 are assigned to the stored object features 18 by means of the algorithm. Based on this assignment, a quality measure can be determined which describes the quality of the assignment or the matching. This quality measure can then be used as a criterion as to whether a renewed recording of the stored object features 18 and/or the trajectory 11 is necessary. As a further criterion, the number of detected object features 19 validly assigned to the stored object features 18 can be used. When the object features 19 are extracted from the image, not all of them can be assigned to the stored object features 18. If the environmental conditions change, the number of valid assignments will decrease significantly. As soon as the number of valid matches is below a threshold value and/or the quality measure falls below a predetermined limit value, it can be decided that a renewed storage of the object features 18 and/or the trajectory 11 is required. It should be noted that the limit value or the threshold value can be determined in such a way that the operating phase can still be performed reliably, but there are indications that the ambient conditions have changed. This renewed storage of the object features 18 and/or of the trajectory 11 is then carried out automatically in the subsequent operating phase.
Furthermore, it is provided that the stored object features 18 and/or the visual descriptors of the stored object features 18 are dynamically adapted during operation of the driver assistance system 2. For example, stored object features 18, which are never recognized in the operating mode, can be deleted. In principle, an initial training phase is required in which the driver explicitly specifies the trajectory 11 along which the motor vehicle 1 is to be maneuvered. The object features 18 can also be assigned information describing the time since this object feature 18 has been found for the last time in the operating phase. If this time exceeds a threshold value, this stored object feature 18 can be deleted from the digital environmental map 20. It may also be provided that during the operating phase new object features 19 are detected which cannot be assigned to any of the stored object features 18. Here, it can be assumed that such an object feature 19 belongs to a new object 10 in the environmental region 8. If this object feature 19 is recognized in a predetermined number of operating phases, this object feature 19 can also be stored or used as a stored object feature 18.
Furthermore, it may be provided that additional recording information is added to the object features 18. This recording information may, for example, describe the time of day, the season, the lighting state or the like. This recording information can then be taken into account when the detected object features 19 are assigned to the stored object features 18. Furthermore, this recording information can also be taken into account in the deletion of stored object features 18.
Thus, it is possible to decide, without user input by the driver, whether the stored trajectory 11 and/or the stored object features 18 are to be recorded again. The renewed storage of these training data can then be carried out in the subsequent operating phase. In this way, the training data can be updated continuously and adapted to gradually changing environmental conditions. It is not necessary for the driver to manually trigger the renewed recording of the training data. Overall, this enables reliable operation of the driver assistance system 2.

Claims

1. Method for operating a driver assistance system (2) for a motor vehicle (1), wherein in a training phase of the driver assistance system (2), during which the motor vehicle (1) is maneuvered manually by a driver along a trajectory (11), the trajectory (11) is stored, and on the basis of at least one image, which is provided by means of a camera (4) of the motor vehicle (1), a plurality of object features (18), which describe objects in an environmental region (8) of the motor vehicle (1), is stored, and in an operating phase of the driver assistance system (2) the motor vehicle (1) is maneuvered at least semi-autonomously along the stored trajectory (11) on the basis of the stored trajectory (11) and the stored object features (18),
characterized in that
a plurality of object features (19) are detected in the operating mode, the detected object features (19) are assigned to the stored object features (18), and a decision is made on the basis of the assignment whether renewed storage of the object features (18) and/or the trajectory (11) is required.
2. Method according to claim 1,
characterized in that
the object features (18) and/or the trajectory (11) are stored anew during a subsequent operating phase if it is decided that a renewed storage of the object features (18) and/or the trajectory (11) is required.
3. Method according to claim 1 or 2,
characterized in that
the assignment of the detected object features (19) to the stored object features (18) is carried out by means of a visual method of simultaneous localization and mapping, structure from motion and/or bundle adjustment.
4. Method according to any one of the preceding claims,
characterized in that a visual descriptor describing the respective object features (18, 19) in the image is determined for the stored object features (18) and/or the detected object features (19), and the assignment is determined by means of the visual descriptors.
5. Method according to any one of the preceding claims,
characterized in that
on the basis of the assignment, a quality measure is determined which describes a matching between the detected object features (19) and the stored object features (18), and on the basis of the quality measure it is decided whether the renewed storage of the object features (18) and/or the trajectory (11) is necessary.
6. Method according to any one of the preceding claims,
characterized in that
a number of the detected object features (19), which are assigned to the stored object features (18), is determined, and the number is used to decide whether the object features (18) and/or the trajectory (11) need to be stored anew.
7. Method according to any one of the preceding claims,
characterized in that
one of the stored object features (18) is deleted if no detected object feature (19) is assigned to this object feature (18) in a predetermined number of operating phases and/or for a predetermined time duration in the operating phases.
8. Method according to any one of the preceding claims,
characterized in that
a further object feature is stored if it is detected in a predetermined number of operating phases.
9. Method according to any one of the preceding claims,
characterized in that
recording information for the respective object features (18, 19) is stored in the training phase and/or in the operating phase, and the assignment is performed using the respective recording information.
10. Method according to claim 9,
characterized in that the recording information describes a lighting situation in the environmental region (8), a time of day and/or a season.
11. Driver assistance system (2) of a motor vehicle (1), which is adapted for performing a method according to any one of the preceding claims.
12. Motor vehicle (1) comprising a driver assistance system (2) according to claim 11.
PCT/EP2018/069198 2017-07-17 2018-07-16 Method for operating a driver assistance system for a motor vehicle with renewed storage of training data, driver assistance system as well as motor vehicle WO2019016114A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102017115991.6A DE102017115991A1 (en) 2017-07-17 2017-07-17 Method for operating a driver assistance system for a motor vehicle with re-storage of training data, driver assistance system and motor vehicle
DE102017115991.6 2017-07-17

Publications (1)

Publication Number Publication Date
WO2019016114A1 (en)

Family

ID=62948110

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/069198 WO2019016114A1 (en) 2017-07-17 2018-07-16 Method for operating a driver assistance system for a motor vehicle with renewed storage of training data, driver assistance system as well as motor vehicle

Country Status (2)

Country Link
DE (1) DE102017115991A1 (en)
WO (1) WO2019016114A1 (en)


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019121469A1 (en) * 2019-08-08 2021-02-11 Bayerische Motoren Werke Aktiengesellschaft Method and system for labeling situation data for use in a driver assistance system of a vehicle
DE102020206168A1 (en) * 2020-05-15 2021-11-18 Robert Bosch Gesellschaft mit beschränkter Haftung Method for localizing a vehicle in relation to an environment model around a driving trajectory
DE102021107523A1 (en) 2021-03-25 2022-09-29 Valeo Schalter Und Sensoren Gmbh METHOD OF OPERATING A PARKING ASSISTANCE SYSTEM, COMPUTER PROGRAM PRODUCT, PARKING ASSISTANCE SYSTEM AND VEHICLE
DE102021117615A1 (en) 2021-07-08 2023-01-12 Valeo Schalter Und Sensoren Gmbh METHOD OF OPERATING A PARKING ASSISTANCE SYSTEM, COMPUTER PROGRAM PRODUCT, PARKING ASSISTANCE SYSTEM AND VEHICLE
DE102022113191A1 (en) 2022-05-25 2023-11-30 Bayerische Motoren Werke Aktiengesellschaft TRAINING METHOD FOR A DRIVING ASSISTANCE SYSTEM FOR AUTOMATED TRANSVERSE GUIDANCE OF A MOTOR VEHICLE
DE102022126293A1 (en) 2022-10-11 2024-04-11 Valeo Schalter Und Sensoren Gmbh Method for operating a driver assistance system for a motor vehicle, driver assistance system for a motor vehicle and motor vehicle

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014018192A1 (en) 2014-12-09 2015-06-18 Daimler Ag Method for approaching a parking space by a vehicle
DE102015011358A1 (en) * 2015-08-29 2016-03-17 Daimler Ag Method for operating a vehicle
DE102016003261A1 (en) * 2016-03-16 2016-09-29 Daimler Ag Method for self-localization of a vehicle in a vehicle environment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014223363B4 (en) * 2014-11-17 2021-04-29 Volkswagen Aktiengesellschaft Method and device for localizing a motor vehicle in a fixed reference map


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11328210B2 (en) 2017-12-29 2022-05-10 Micron Technology, Inc. Self-learning in distributed architecture for enhancing artificial neural network
GB2571152A (en) * 2018-02-15 2019-08-21 Jaguar Land Rover Ltd Vehicle Control System and Control Method
GB2571152B (en) * 2018-02-15 2020-04-22 Jaguar Land Rover Ltd Vehicle Control System and Control Method
US10846955B2 (en) 2018-03-16 2020-11-24 Micron Technology, Inc. Black box data recorder for autonomous driving vehicle
US11676431B2 (en) 2018-03-16 2023-06-13 Micron Technology, Inc. Black box data recorder for autonomous driving vehicle
US11094148B2 (en) 2018-06-18 2021-08-17 Micron Technology, Inc. Downloading system memory data in response to event detection
US11756353B2 (en) 2018-06-18 2023-09-12 Micron Technology, Inc. Downloading system memory data in response to event detection
WO2020160268A1 (en) * 2019-01-31 2020-08-06 Micron Technology, Inc. Autonomous vehicle data recorders
US11373466B2 (en) 2019-01-31 2022-06-28 Micron Technology, Inc. Data recorders of autonomous vehicles
US11410475B2 (en) 2019-01-31 2022-08-09 Micron Technology, Inc. Autonomous vehicle data recorders
US11670124B2 (en) 2019-01-31 2023-06-06 Micron Technology, Inc. Data recorders of autonomous vehicles
CN113200039A (en) * 2021-06-09 2021-08-03 广州小鹏智慧充电科技有限公司 Parking-based road generation method and device, vehicle and readable medium
CN113200039B (en) * 2021-06-09 2024-02-02 广州小鹏智慧充电科技有限公司 Road generation method and device based on parking, vehicle and readable medium

Also Published As

Publication number Publication date
DE102017115991A1 (en) 2019-01-17


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18742460

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18742460

Country of ref document: EP

Kind code of ref document: A1