EP4313733A1 - Verfahren zum betreiben eines parkassistenzsystems, computerprogrammprodukt, parkassistenzsystem sowie fahrzeug - Google Patents

Verfahren zum Betreiben eines Parkassistenzsystems, Computerprogrammprodukt, Parkassistenzsystem sowie Fahrzeug (Method for operating a parking assistance system, computer program product, parking assistance system and vehicle)

Info

Publication number
EP4313733A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
distribution
determined
similarity
trajectory
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22717798.7A
Other languages
German (de)
English (en)
French (fr)
Inventor
Usama MOHAMAD
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Valeo Schalter und Sensoren GmbH
Original Assignee
Valeo Schalter und Sensoren GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Valeo Schalter und Sensoren GmbH filed Critical Valeo Schalter und Sensoren GmbH
Publication of EP4313733A1
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B62 LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D MOTOR VEHICLES; TRAILERS
    • B62D15/00 Steering not otherwise provided for
    • B62D15/02 Steering position indicators; Steering position determination; Steering aids
    • B62D15/027 Parking aids, e.g. instruction means
    • B62D15/0285 Parking performed automatically
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models, related to ambient conditions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Y INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2300/00 Purposes or special features of road vehicle drive control systems
    • B60Y2300/06 Automatic manoeuvring for parking
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9314 Parking operations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9323 Alternative operation using light waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88 Sonar systems specially adapted for specific applications
    • G01S15/93 Sonar systems specially adapted for specific applications for anti-collision purposes
    • G01S15/931 Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2015/932 Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles for parking operations

Definitions

  • The present invention relates to a method for operating a parking assistance system, a computer program product, a parking assistance system and a vehicle.
  • Parking assistance systems are known which can learn a specific trajectory, the vehicle being driven manually in a training mode along the trajectory that is to be followed later.
  • During the training drive, environmental data is recorded and stored by the vehicle's sensors, which is intended to enable the vehicle to be localized later. This can be done, for example, using VSLAM (visual simultaneous localization and mapping), with camera images being recorded and evaluated and a current position of the vehicle being determined in this way.
  • This requires that the stored environment data is up to date, otherwise localization is not possible. Since the environment can change over time, for example because movable objects are removed, added or moved, or because structural measures are carried out in the environment, the problem arises that the environment data can become outdated. In order to continue to carry out the localization successfully, the stored environment data must be updated.
  • DE 10 2017 115991 A1 discloses a method for operating a driver assistance system for a motor vehicle in which, in a training phase of the driver assistance system, during which the motor vehicle is maneuvered manually by a driver along a trajectory, the trajectory is stored and, based on at least one image provided by a camera of the motor vehicle, a plurality of object features is stored.
  • Subsequently, the motor vehicle is maneuvered semi-autonomously along the stored trajectory using the stored trajectory and the stored object features.
  • During this, a plurality of object features are recognized and the recognized object features are assigned to the stored object features. Based on the assignment, a decision is made as to whether the object features and/or the trajectory need to be stored again.
  • An object of the present invention is to improve the operation of a parking assistance system.
  • According to a first aspect, a method for operating a parking assistance system for a vehicle is proposed.
  • The parking assistance system is set up, in a training mode, for detecting and storing a trajectory to be trained and is set up, in a follow-up mode, for following the stored trajectory with the vehicle.
  • The training mode comprises the steps:
  • A1) manually driving the vehicle along the trajectory to be trained,
  • A2) receiving at least one image of an environment of the vehicle during the manual driving,
  • A3) determining a plurality of optical features in the received image, a respective optical feature being characterized by at least one parameter, and
  • A4) storing a data set comprising the determined optical features.
  • The follow-up mode comprises the steps:
  • B1) receiving at least one current image of the environment of the vehicle while it is traveling in the follow-up mode along the trajectory,
  • B2) determining the optical features in the received current image,
  • B3) determining a first distribution of at least one of the parameters based on the stored data set and determining a second distribution of the parameter based on the determined optical features of the current image,
  • B4) determining a similarity of the first distribution with the second distribution, and
  • B5) updating the stored data set if the determined similarity is less than or equal to a predetermined update threshold value.
  • This method has the advantage that the stored data set with the optical features, which are used to localize the vehicle in the follow-up mode, is only updated if a statistical significance for a necessary update is determined. On the one hand, this avoids an update being carried out in the event of only minor changes in the environment, so that the computing power such an update would require, which would have to be provided by the parking assistance system or another computing unit of the vehicle, is not consumed. The computing power is thus available for other processes, which contributes, for example, to increased safety, reliability and/or speed of other running processes.
  • On the other hand, because it is based purely on statistics, the method provides a reliable measure for assessing whether an update of a respective data set is useful, i.e. whether it contributes significantly to improved localization of the vehicle, for example.
  • In the training mode, the vehicle is in particular moved manually by a user of the vehicle. This means that the user is in control of the vehicle at all times.
  • In embodiments, a remote control and/or self-steering and/or self-propelling systems of the vehicle can be used here, with sensor-supported decisions about a change in the direction of travel being able to be proposed and/or implemented by the vehicle.
  • The received image is in particular an image received from an on-board camera, for example a front camera. It can also be an image composed of multiple images from different cameras and/or of images captured at different times.
  • The received image can comprise an extended spectral range; for example, the image can comprise optical information in the near-infrared range and in the UV range.
  • For example, the image has spectral information in the range between 150 nm and 2500 nm.
  • Furthermore, the image can contain information in one or more sharply delimited spectral ranges, which were recorded, for example, using appropriate band-pass or line filters; such filters can optimize a contrast for determining the optical features in the respective image.
  • The received image is in particular a digital image that can be represented in the form of a two-dimensional pixel matrix, the pixel matrix being able to have a number of planes, with each plane containing, for example, the information of a specific spectral range.
  • For example, the image has three planes corresponding to three detected color channels, in particular red, green and blue (RGB).
  • The optical features that are determined in the image have, for example, certain characteristics, such as a certain contrast between adjacent pixels and/or across a plurality of pixels, or a certain shape, such as a round shape, an angular shape, an elongated shape, a wavy shape and the like.
  • Various image processing methods and/or image transformations can be used to determine the optical features, and these can be combined with one another in different sequences; a possible feature-extraction step is sketched below.
  • Furthermore, neural networks can be used, in particular to carry out an object classification of objects visible in the image.
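  • A minimal sketch of such a feature-extraction step, assuming OpenCV is available and using its ORB detector as one possible method (the patent does not prescribe a specific detector):

```python
import cv2

def extract_features(image_bgr, max_features=500):
    """Detect optical features in an image. Each keypoint carries parameters
    such as its position (x, y), an extent and a contrast-like response."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=max_features)
    keypoints, descriptors = orb.detectAndCompute(gray, None)
    if descriptors is None:  # no features found in this image
        return []
    return [
        {
            "x": kp.pt[0],            # horizontal position in the image
            "y": kp.pt[1],            # vertical position in the image
            "size": kp.size,          # extent of the feature in pixels
            "response": kp.response,  # contrast-like feature strength
        }
        for kp in keypoints
    ]
```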
  • A respective feature is characterized in particular by a number of parameters.
  • These parameters include a position of the feature in the image, the position being defined by at least two coordinates, for example an x-value and a y-value, a "color" of the feature, a shape of the feature, an extent of the feature, which can be specified, for example, by the number of pixels that the feature covers, a classification of the feature, and the like.
  • The "color" of a respective feature can be specified, for example, by specifying an intensity (brightness information) of the feature at a specific wavelength or with a specific filter. For example, the intensity is given by the value of the entry in the pixel matrix that is associated with a pixel.
  • The number of possible values that a parameter can take ranges from binary ("0" or "1") to quasi-continuous with no upper and/or lower limit. "Quasi-continuous" because the data is processed digitally here, which is why the parameter values are present in quantized form even if the corresponding parameter itself is continuous in nature.
  • Preferably, at least 50 and up to 5000 optical features are determined in a single image. It should be noted here that a larger number of optical features entails a correspondingly larger memory requirement for storing the data set. On the other hand, although the accuracy of a localization increases with an increasing number of optical features, this increase flattens out as the number grows. Preferably, between 100 and 500 optical features are determined per image and stored in the data set.
  • The data set comprises, for example, a list or table of the optical features, in which the corresponding parameter values are assigned to each feature. Not every determined optical feature necessarily includes a value for every possible parameter; such a feature can instead have a value that characterizes a parameter as "undetermined".
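  • As an illustration, such a data set could be organized as follows; the field names and the position entry are hypothetical choices for this sketch only:

```python
from typing import Optional, TypedDict

class Feature(TypedDict):
    x: float                  # horizontal position in the image
    y: float                  # vertical position in the image
    color: Optional[float]    # intensity value; None marks "undetermined"
    extent_px: Optional[int]  # number of pixels the feature covers

# One data set per position along the trajectory: the list of features
# plus the vehicle position at which the underlying image was captured.
data_set = {
    "position": (12.4, 3.1),  # hypothetical map coordinates
    "features": [
        Feature(x=104.0, y=220.5, color=0.62, extent_px=9),
        Feature(x=311.2, y=87.0, color=None, extent_px=None),  # partly undetermined
    ],
}
```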
  • Receiving the images and determining the optical features are carried out in particular in the same way in the training mode and in the follow-up mode; for example, the same image processing methods are used. It can, however, be provided that the accuracy with which one or more arithmetic operations are carried out varies, for example as a function of the available system resources. This does not rule out new and/or different image processing steps and methods being added over the course of the service life of the parking assistance system, as part of a system update or the like. After such a system update has been carried out, these are again used equally in the training mode and in the follow-up mode. This ensures that results of the same quality and/or the same type are achieved in the training mode and in the follow-up mode.
  • In the next step, the first and the second distribution of at least one of the parameters are determined.
  • The distribution determined here is in particular a probability distribution.
  • Treating the value of a parameter as a random variable, the distribution of this random variable is characteristic of a particular image.
  • The determined distribution can be one-dimensional or also multi-dimensional.
  • For example, a spatial distribution of the optical features in a two-dimensional image can be determined as a two-dimensional distribution.
  • A multidimensional distribution is not limited to parameters of the same type (such as location coordinates); rather, a multidimensional distribution can also be determined on the basis of a "location coordinate" parameter and a "color" parameter and/or other and/or additional parameters.
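  • A minimal sketch of determining such distributions as normalized histograms with NumPy, reusing the hypothetical feature layout from above; the bin counts are arbitrary choices:

```python
import numpy as np

def parameter_distribution(features, key, bins=32, value_range=None):
    """One-dimensional frequency distribution of a single parameter,
    normalized so that the bin values sum to 1."""
    values = [f[key] for f in features if f.get(key) is not None]
    if not values:
        return np.zeros(bins)
    hist, _ = np.histogram(values, bins=bins, range=value_range)
    return hist / max(hist.sum(), 1)

def xy_distribution(features, image_size, bins=(16, 16)):
    """Two-dimensional distribution of the feature positions in the image."""
    w, h = image_size
    xs = [f["x"] for f in features]
    ys = [f["y"] for f in features]
    hist, _, _ = np.histogram2d(xs, ys, bins=bins, range=[[0, w], [0, h]])
    return hist / max(hist.sum(), 1)
```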
  • The similarity of the distributions can be determined by comparing the two distributions.
  • The similarity of the distributions corresponds, for example, to the overlap or intersection of the distributions.
  • For multidimensional distributions, the similarity can be determined separately for different dimensions (parameters) of the distributions.
  • The determined similarity can be compared to an update threshold value, with an update being performed if the determined similarity is below the update threshold value.
  • Furthermore, an overall similarity can be determined on the basis of the plurality of determined similarities, as sketched below.
  • Here, the similarity values of the distributions of different parameters can be taken into account to different extents.
  • For example, the similarity of the y-position (vertical position) distributions may be weighted more strongly than the similarity of the x-position (horizontal position) distributions, or vice versa.
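  • One possible way, assumed here purely for illustration, to combine per-parameter similarities into an overall similarity is a weighted mean:

```python
def overall_similarity(similarities, weights):
    """Weighted mean of per-parameter similarity values in [0, 1]."""
    total = sum(weights[p] for p in similarities)
    return sum(similarities[p] * weights[p] for p in similarities) / total

# Example: weight the y-position distributions twice as strongly as x.
score = overall_similarity(
    {"x": 0.82, "y": 0.64, "color": 0.91},
    {"x": 1.0, "y": 2.0, "color": 1.0},
)  # -> (0.82*1 + 0.64*2 + 0.91*1) / 4 = 0.7525
```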
  • The parking assistance system is set up in particular for semi-autonomous or fully autonomous operation of the vehicle, in which it drives automatically along the trained trajectory in the follow-up mode, for example. Partially autonomous driving is understood, for example, to mean that the parking assistance system controls a steering device and/or a drive stage of the vehicle.
  • Fully autonomous driving means, for example, that the parking assistance system additionally controls a drive device and a braking device. Orientation and/or localization of the vehicle takes place here in particular on the basis of a comparison of the determined optical features with the stored data sets. From the absolute and/or relative arrangement of the determined optical features with respect to one another, a shift or relative position of the current vehicle position in relation to the respective position during the training drive can be determined, and the vehicle can be controlled accordingly so as to move onto the trained trajectory and along the trained trajectory.
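  • A deliberately simplified sketch of estimating such a shift, assuming features of the current image have already been matched to stored features (the matching step itself is omitted, and a pure translation is assumed):

```python
import numpy as np

def estimate_shift(matched_pairs):
    """Estimate the image-space shift between training drive and current
    drive as the mean offset over matched feature positions.

    matched_pairs: non-empty list of
        ((x_stored, y_stored), (x_current, y_current)) tuples.
    """
    stored = np.array([p[0] for p in matched_pairs], dtype=float)
    current = np.array([p[1] for p in matched_pairs], dtype=float)
    return tuple((current - stored).mean(axis=0))  # (dx, dy)
```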
  • The parameters of the optical features include, for example, a respective position of the respective feature in the image, a classification of the respective feature, a color of the respective feature, a geometric shape of the respective feature, a contrast value of the respective feature and the like.
  • The "color" parameter is understood to mean, for example, an intensity (brightness) at a specific wavelength, in a specific wavelength band and/or at a plurality of wavelengths. Furthermore, the "color" parameter can include a ratio of two or more intensities at different wavelengths.
  • The "contrast value" parameter can include a pure intensity contrast, but can also include a color contrast.
  • A respective optical feature is unambiguously characterized by specifying the assigned or associated parameter values. For example, the parameter values can be arranged in a parameter vector, with the position in the vector identifying the parameter.
  • The similarity of the first and the second distribution is preferably determined on the basis of the Bhattacharyya distance and/or the Kullback-Leibler distance.
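  • A minimal sketch of both measures for discrete (histogram) distributions; the small epsilon guarding against empty bins is an implementation choice, not part of the method:

```python
import numpy as np

def bhattacharyya_similarity(p, q):
    """Bhattacharyya coefficient of two discrete distributions: 1.0 for
    identical distributions, 0.0 for distributions without any overlap."""
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    return float(np.sum(np.sqrt(p * q)))

def bhattacharyya_distance(p, q, eps=1e-12):
    """Bhattacharyya distance; 0 for identical distributions."""
    return -np.log(bhattacharyya_similarity(p, q) + eps)

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) of two discrete distributions."""
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))
```

  • The Bhattacharyya coefficient used inside the distance can itself serve as a similarity value between 0 and 1, which maps naturally onto the percentage thresholds discussed further below.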
  • According to a further embodiment of the method, steps A2)-A4) are carried out for a number of positions along the trajectory to be trained, so that a corresponding data set is stored for each of the positions.
  • Steps B3)-B5) are then carried out on the basis of those stored data sets whose corresponding position is at a distance from a current position of the vehicle that is less than or equal to a predetermined distance threshold value.
  • In this case, the trained trajectory is assigned a plurality of data sets with determined optical features, which were each determined on the basis of images captured at different positions along the trajectory.
  • The current vehicle position is a useful indicator of whether it makes sense to compare two distributions: if the positions at which the images underlying the optical features were captured differ too much, it can be assumed that the distributions are dissimilar, since the images can show different sections or areas of the environment. In this case it would be disadvantageous to perform an update, which can be reliably avoided by comparing the positions.
  • This embodiment can also be referred to as a selection method for selecting the data sets or distributions to be compared.
  • The position of the vehicle can be determined here in particular using a position sensor, such as a GPS sensor.
  • "Position" in this example also includes an orientation of the vehicle, which can be determined, for example, by a magnetic field sensor relative to the earth's magnetic field and/or by an artificial horizon.
  • For example, only that stored data set is used in the follow-up mode whose corresponding position has the smallest distance from a current position of the vehicle in comparison with the other stored data sets of the trajectory.
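  • A sketch of this position-based selection, reusing the hypothetical data_set layout from above; plain Euclidean distance in two-dimensional map coordinates is an assumption:

```python
import math

def nearby_data_sets(data_sets, current_position, distance_threshold):
    """Select the stored data sets whose capture position is within the
    distance threshold of the current vehicle position."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return [ds for ds in data_sets
            if dist(ds["position"], current_position) <= distance_threshold]

def nearest_data_set(data_sets, current_position):
    """Variant: use only the single data set closest to the current position."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return min(data_sets, key=lambda ds: dist(ds["position"], current_position))
```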
  • According to a further embodiment of the method, steps A2)-A4) are carried out for a number of positions along the trajectory to be trained, so that a corresponding data set is stored for each of the positions.
  • Steps B3) and B4) are carried out for all stored data sets, and step B5) is carried out for those data sets whose first distribution has a similarity with the second distribution that is above a predetermined similarity threshold value.
  • In this case, too, the trained trajectory is assigned a plurality of data sets with determined optical features, which were each determined on the basis of images captured at different positions along the trajectory.
  • Here, it is determined on the basis of the similarity of the distributions whether the images of the environment on which the respective distributions are based show a comparable section or area of the environment or not.
  • This embodiment can be combined with the position-based selection method.
  • The predetermined similarity threshold value corresponds to a smaller similarity than the predetermined update threshold value.
  • For example, the similarity threshold value lies between 65% and 75% and the update threshold value lies between 80% and 95%. If the similarity lies between these, for example between 75% and 80%, it is determined that the corresponding data set is to be updated.
  • A value of 100% means that two compared distributions are identical, and a value of 0% means that two compared distributions have no overlap or similarity at all.
  • For example, step B5) is only carried out for that data set whose first distribution is most similar to the second distribution in comparison with all data sets of the trajectory. The resulting decision logic is sketched below.
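  • Putting the two thresholds together for a single stored data set; the concrete values are taken from the example ranges above, and the similarity argument can come from any of the measures sketched earlier:

```python
SIMILARITY_THRESHOLD = 0.70  # within the 65%-75% band mentioned above
UPDATE_THRESHOLD = 0.85      # within the 80%-95% band mentioned above

def needs_update(similarity):
    """Update only data sets that are comparable at all (above the
    similarity threshold) but no longer current enough (at or below
    the update threshold)."""
    if similarity <= SIMILARITY_THRESHOLD:
        return False  # images likely show different areas; do not update
    return similarity <= UPDATE_THRESHOLD
```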
  • According to a further embodiment of the method, a first stochastic process of the first distribution of the at least one parameter is determined based on a respective time stamp of the images received in the training mode, a second stochastic process of the second distribution of the parameter is determined based on the respective time stamp of the images received in the follow-up mode, and step B5) is additionally and/or alternatively carried out on the basis of a similarity of the first stochastic process with the second stochastic process.
  • In other words, the development over time of the distribution of a parameter along the trajectory is determined, and the decision to update is linked to this.
  • Here, time is treated as an additional parameter, so that, for example, the temporal development of the distribution of a location coordinate can be represented in the form of a two-dimensional distribution.
  • In particular, the data set is updated in step B5) on the basis of the optical features determined in step B2).
  • Updating the data set in step B5) includes replacing the data set with a current data set and/or replacing at least one optical feature contained in the stored data set and/or updating at least one parameter of an optical feature contained in the stored data set.
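  • The three update variants, sketched against the hypothetical data_set layout used throughout; index-based addressing of the features is an arbitrary choice for this sketch:

```python
def replace_data_set(stored, current_features, current_position):
    """Variant 1: replace the whole data set with a current one."""
    stored["features"] = current_features
    stored["position"] = current_position

def replace_feature(stored, index, new_feature):
    """Variant 2: replace a single optical feature in the stored data set."""
    stored["features"][index] = new_feature

def update_parameter(stored, index, key, value):
    """Variant 3: update one parameter of one stored optical feature."""
    stored["features"][index][key] = value
```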
  • According to a second aspect, a computer program product is proposed which comprises instructions which, when the program is executed by a computer, cause the latter to execute the method according to the first aspect.
  • A computer program product, such as a computer program means, can be provided or supplied, for example, as a storage medium, such as a memory card, USB stick, CD-ROM or DVD, or in the form of a downloadable file from a server in a network. In a wireless communication network, for example, this can be done by transferring a corresponding file with the computer program product or the computer program means.
  • According to a third aspect, a parking assistance system for a vehicle is proposed.
  • The parking assistance system is set up, in a training mode, for detecting and storing a trajectory to be trained and is set up, in a follow-up mode, for following the stored trajectory with the vehicle.
  • The parking assistance system includes: a receiving unit for receiving at least one image of an environment of the vehicle while it is driving in the training mode along the trajectory to be trained, a first determination unit for determining a plurality of optical features in the received image, a respective optical feature being characterized by at least one parameter, and a storage unit for storing a data set comprising the determined optical features, the receiving unit being set up to receive at least one current image of the surroundings of the vehicle while the vehicle is traveling in the follow-up mode along the trajectory, and the first determination unit being set up to determine the optical features in the received current image.
  • The parking assistance system also includes: a second determination unit for determining a first distribution of at least one of the parameters based on the stored data set and for determining a second distribution of the parameter based on the determined optical features of the current image, a comparison unit for determining a similarity of the first distribution with the second distribution, and an update unit for updating the stored data set if the determined similarity is less than or equal to a predetermined update threshold value.
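  • How these units could interact in software, as a purely structural sketch that reuses the hypothetical helpers from the earlier sketches (extract_features, nearest_data_set, parameter_distribution, bhattacharyya_similarity, needs_update, replace_data_set); it is not the patented implementation:

```python
class ParkingAssistanceSystemSketch:
    """Structural sketch of the units named above; `camera` is assumed to be
    any object with a capture() method returning an image."""

    def __init__(self, camera, stored_data_sets):
        self.camera = camera                      # feeds the receiving unit
        self.stored_data_sets = stored_data_sets  # contents of the storage unit

    def follow_step(self, current_position):
        image = self.camera.capture()       # receiving unit
        features = extract_features(image)  # first determination unit
        data_set = nearest_data_set(self.stored_data_sets, current_position)
        p1 = parameter_distribution(data_set["features"], "x")  # second determination unit
        p2 = parameter_distribution(features, "x")
        similarity = bhattacharyya_similarity(p1, p2)           # comparison unit
        if needs_update(similarity):                            # update unit
            replace_data_set(data_set, features, current_position)
```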
  • This parking assistance system has the same advantages that are described for the method according to the first aspect.
  • The embodiments and definitions presented for the method according to the first aspect apply accordingly to the parking assistance system.
  • The respective unit of the parking assistance system can be implemented in hardware and/or in software.
  • In a hardware implementation, each respective unit can be designed, for example, as a computer or as a microprocessor.
  • In a software implementation, the respective unit can be designed as a computer program product, as a function, as a routine, as an algorithm, as part of a program code or as an executable object.
  • Each of the units mentioned here can also be designed as part of a higher-level control system of the vehicle, such as a central electronic control device and/or an engine control unit (ECU).
  • According to a fourth aspect, a vehicle is proposed with at least one camera for capturing and outputting an image of an area surrounding the vehicle and with a parking assistance system according to the third aspect.
  • The vehicle is, for example, a passenger car or a truck.
  • The vehicle preferably includes a number of sensor units that are set up to detect the driving condition of the vehicle and to detect an environment of the vehicle.
  • Examples of such sensor units are image recording devices, such as a camera, a radar (radio detection and ranging) or a lidar (light detection and ranging), ultrasonic sensors, location sensors, wheel angle sensors and/or wheel speed sensors.
  • The sensor units are each set up to output a sensor signal, for example to the parking assistance system or driving assistance system, which carries out the partially autonomous or fully autonomous driving as a function of the detected sensor signals.
  • FIG. 1 shows a schematic view of an exemplary embodiment of a vehicle from a bird's eye view;
  • FIG. 2 shows a schematic view of a received image with a number of optical features contained therein;
  • FIG. 3 shows three diagrams as examples of a respective distribution of a respective parameter;
  • FIG. 4 shows a diagram containing two distributions of a parameter;
  • FIG. 5 shows a schematic block diagram of an exemplary embodiment of a parking assistance system; and
  • FIG. 6 shows a schematic flow diagram of an exemplary embodiment of a method for operating a parking assistance system.
  • FIG. 1 shows a schematic view of a vehicle 100 from a bird's eye view.
  • The vehicle 100 is, for example, a car that is arranged in an environment 200.
  • The car 100 has a parking assistance system 110, which is embodied as a control unit, for example.
  • In addition, the car 100 includes a front camera 120, which in this example is arranged at an upper edge of the windscreen.
  • The camera 120 is configured to capture an image of the environment 200.
  • The captured image is output, for example, to the parking assistance system 110, which receives it and processes it further, as described in detail below with reference to FIGS. 2-6.
  • The parking assistance system 110 is set up to drive the car 100 semi-autonomously or fully autonomously.
  • The vehicle 100 can have various other sensor devices. Examples are ultrasonic sensors, a lidar, a radar, a GPS receiver, an acceleration sensor, a receiver for receiving electromagnetically transmittable data signals, a microphone and the like.
  • FIG. 2 shows a schematic view of a received image IMG with a number of optical features F1-F8 contained therein.
  • The image IMG is an image with two spatial dimensions, i.e. it comprises a two-dimensional arrangement of pixels. In embodiments, it can also be an image with three spatial dimensions, which additionally includes depth information for each pixel.
  • Each pixel of the image IMG is uniquely determined by its respective coordinates in the image IMG, which are two spatial coordinates x, y.
  • The coordinates x, y relate to an image coordinate system (not shown), whose origin lies, for example, in one of the corners of the image IMG.
  • The optical features F1-F8 shown as an example have two location parameters that correspond to the x, y coordinates of the respective optical feature. It should be noted that the optical features F1-F8 are shown as points in the image IMG only for reasons of clarity; a respective optical feature can also have an extent.
  • In addition, the optical features F1-F8 are characterized by a third parameter p, which is a color value of the optical feature F1-F8, for example.
  • A respective optical feature is thus unambiguously characterized by specifying the three parameters x, y, p.
  • The optical feature F1 can be represented, for example, by the specification F1(x1, y1, p1), where x1, y1, p1 are the respective values of the respective parameter for the optical feature F1.
  • A respective optical feature F1-F8 can also be characterized by more than three parameters. It should further be noted that significantly more than eight optical features F1-F8 are preferably determined in an image IMG, for example between 200 and 500 optical features.
  • A respective optical feature F1-F8 is in particular a characteristic structure in the received image IMG, which can be determined, for example, in the form of a contrast.
  • FIG. 3 shows three diagrams as examples of a respective distribution P(x), P(y), P(p) of a respective parameter x, y, p.
  • The distributions each represent a probability distribution or a frequency distribution, with the vertical axis P indicating the probability or the frequency and the respective horizontal axis x, y, p indicating the value of the respective parameter x, y, p.
  • The distributions P(x) and P(y) are represented in this example as (quasi-)continuous distributions and the distribution P(p) is represented as a discrete distribution. Since the respective values that a parameter x, y, p can assume are quantized, in particular when determined by the determination unit of the parking assistance system, all distributions are in fact discrete, which is why one can also speak of quasi-continuous distributions. A data reduction can also be advantageous here, carried out, for example, in the form of "binning", with all values that lie within a specific interval being assigned a mean value (for example when reducing the bit depth for a parameter).
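  • A minimal sketch of such a binning step; reducing an 8-bit parameter to intervals of width 16 is an arbitrary example:

```python
import numpy as np

def bin_values(values, bin_width=16):
    """Assign all values within an interval of width bin_width the mean
    value of that interval, e.g. reducing an 8-bit parameter to 16 bins."""
    values = np.asarray(values, dtype=float)
    return (np.floor(values / bin_width) + 0.5) * bin_width

print(bin_values([0, 7, 15, 16, 200]))  # -> [8., 8., 8., 24., 200.]
```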
  • FIG. 4 shows a diagram containing two distributions P1(x), P2(x) of a parameter x.
  • FIG. 4 serves to illustrate what is to be understood by the similarity of two distributions P1(x), P2(x).
  • The distribution P1(x) is the first distribution of the parameter x, which was determined on the basis of the stored data set.
  • The distribution P2(x) is the second distribution of the same parameter x, which was determined on the basis of the determined optical features F1-F8 (see FIG. 2) of the received current image IMG (see FIG. 2).
  • The similarity of the two distributions P1(x), P2(x) can be illustrated as the overlap U(P1; P2) of the two distributions P1(x), P2(x).
  • For example, the number of equal parameter values x in the two distributions P1(x), P2(x) can be counted and divided by the total number of optical features F1-F8 to determine the similarity.
  • Advantageously, the similarity of two distributions P1(x), P2(x) is determined on the basis of the Bhattacharyya distance and/or the Kullback-Leibler distance. Other known similarity measures can also be used.
  • FIG. 5 shows a schematic block diagram of an exemplary embodiment of a parking assistance system 110, for example the parking assistance system of the vehicle 100 in FIG. 1.
  • The parking assistance system 110 is set up, in a training mode MOD0 (see FIG. 6), for detecting and storing a trajectory to be trained and is set up, in a follow-up mode MOD1 (see FIG. 6), for following the stored trajectory with the vehicle 100.
  • The parking assistance system 110 includes a receiving unit 111 for receiving at least one image IMG (see FIG. 2) of the surroundings 200 (see FIG. 1) of the vehicle 100 while it is driving in the training mode MOD0 along the trajectory to be trained, a first determination unit 112 for determining a plurality of optical features F1-F8 (see FIG. 2) in the received image IMG, a respective optical feature F1-F8 being characterized by at least one parameter x, y, p (see FIG. 2 or 3), and a storage unit 113 for storing a data set comprising the determined optical features F1-F8.
  • The receiving unit 111 is also set up to receive at least one current image IMG of the surroundings 200 of the vehicle 100 while the vehicle 100 is traveling in the follow-up mode MOD1 along the trajectory, and the first determination unit 112 is also set up to determine the optical features F1-F8 in the received current image IMG.
  • The parking assistance system 110 further includes a second determination unit 114 for determining a first distribution P(x), P(y), P(p), P1(x) (see FIG. 3 or 4) of at least one of the parameters x, y, p based on the stored data set and for determining a second distribution P(x), P(y), P(p), P2(x) (see FIG. 3 or 4) of the parameter based on the determined optical features of the current image, a comparison unit 115 for determining a similarity of the first distribution P(x), P(y), P(p), P1(x) with the second distribution P(x), P(y), P(p), P2(x), and an update unit 116 for updating the stored data set if the determined similarity is less than or equal to a predetermined update threshold value.
  • FIG. 6 shows a schematic flowchart of an exemplary embodiment of a method for operating a parking assistance system 110, for example the parking assistance system 110 of FIG. 1 or FIG. 5.
  • The method comprises a training mode MOD0, in which a trajectory to be trained is detected and stored, and a follow-up mode MOD1, in which the stored trajectory is followed with the vehicle.
  • The training mode MOD0 comprises in particular the steps S1-S4: in a first step S1, the vehicle 100 is driven manually along the trajectory to be trained; in a second step S2, at least one image IMG (see FIG. 2) of an environment 200 (see FIG. 1) of the vehicle 100 is received during the manual driving; in a third step S3, a plurality of optical features F1-F8 (see FIG. 2) are determined in the received image IMG, a respective optical feature F1-F8 being characterized by at least one parameter x, y, p (see FIG. 2 or 3); and in a fourth step S4, a data set comprising the determined optical features F1-F8 is stored.
  • The follow-up mode MOD1 comprises in particular the steps S5-S9: in a fifth step S5, at least one current image IMG of the surroundings 200 of the vehicle 100 is received while following the trajectory; in a sixth step S6, the optical features F1-F8 are determined in the received current image IMG; in a seventh step S7, a first distribution P(x), P(y), P(p), P1(x) (see FIG. 3 or 4) of at least one of the parameters x, y, p is determined based on the stored data set and a second distribution P(x), P(y), P(p), P2(x) (see FIG. 3 or 4) of the parameter is determined based on the determined optical features of the current image; in an eighth step S8, a similarity of the first distribution P(x), P(y), P(p), P1(x) with the second distribution P(x), P(y), P(p), P2(x) is determined; and in a ninth step S9, the stored data set is updated if the determined similarity is less than or equal to a predetermined update threshold value.
  • The training mode MOD0 is in particular carried out only once for a specific trajectory, while the follow-up mode MOD1 can be carried out as often as desired on the basis of the trained trajectory.
  • The follow-up mode MOD1 can include further steps that relate, for example, to the control of the vehicle 100 by the parking assistance system 110.

Landscapes

  • Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Image Analysis (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
EP22717798.7A 2021-03-25 2022-03-24 Verfahren zum betreiben eines parkassistenzsystems, computerprogrammprodukt, parkassistenzsystem sowie fahrzeug Pending EP4313733A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021107523.8A DE102021107523A1 (de) 2021-03-25 2021-03-25 Verfahren zum betreiben eines parkassistenzsystems, computerprogrammprodukt, parkassistenzsystem sowie fahrzeug
PCT/EP2022/057727 WO2022200482A1 (de) 2021-03-25 2022-03-24 Verfahren zum betreiben eines parkassistenzsystems, computerprogrammprodukt, parkassistenzsystem sowie fahrzeug

Publications (1)

Publication Number Publication Date
EP4313733A1 2024-02-07

Family

ID=81346210

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22717798.7A Pending EP4313733A1 (de) 2021-03-25 2022-03-24 Verfahren zum betreiben eines parkassistenzsystems, computerprogrammprodukt, parkassistenzsystem sowie fahrzeug

Country Status (7)

Country Link
US (1) US20240158010A1 (zh)
EP (1) EP4313733A1 (zh)
JP (1) JP2024512572A (zh)
KR (1) KR20230160368A (zh)
CN (1) CN117157228A (zh)
DE (1) DE102021107523A1 (zh)
WO (1) WO2022200482A1 (zh)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007051612B4 (de) 2007-10-24 2009-06-25 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Verfahren und Vorrichtung zum automatisierten Vergleichen zweier Sätze von Messwerten
DE102017115991A1 (de) 2017-07-17 2019-01-17 Connaught Electronics Ltd. Verfahren zum Betreiben eines Fahrerassistenzsystems für ein Kraftfahrzeug mit erneutem Speichern von Trainingsdaten, Fahrerassistenzsystem sowie Kraftfahrzeug
DE102017123848A1 (de) * 2017-10-13 2019-04-18 Connaught Electronics Ltd. Automatisches Parken eines Fahrzeugs auf einem Parkplatz
US10845815B2 (en) * 2018-07-27 2020-11-24 GM Global Technology Operations LLC Systems, methods and controllers for an autonomous vehicle that implement autonomous driver agents and driving policy learners for generating and improving policies based on collective driving experiences of the autonomous driver agents
US11679760B2 (en) * 2018-12-10 2023-06-20 Mobileye Vision Technologies Ltd. Navigation in vehicle crossing scenarios
WO2020180887A1 (en) 2019-03-04 2020-09-10 Iocurrents, Inc. Near real-time detection and classification of machine anomalies using machine learning and artificial intelligence

Also Published As

Publication number Publication date
KR20230160368A (ko) 2023-11-23
DE102021107523A1 (de) 2022-09-29
WO2022200482A1 (de) 2022-09-29
US20240158010A1 (en) 2024-05-16
CN117157228A (zh) 2023-12-01
JP2024512572A (ja) 2024-03-19


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230905

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR