EP3814872A1 - Method for calibrating an eye-tracking system, method for evaluating the behavior of a person by means of an eye-tracking system, and eye-tracking system and motor vehicle having such an eye-tracking system
- Publication number
- EP3814872A1 (application number EP18737837.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- path
- eye
- gaze
- determined
- person
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
Definitions
- the invention relates to the field of eye tracking, that is to say the detection and evaluation of the eyes and eye movement of an observed person by means of a camera and an evaluation unit connected thereto.
- the invention relates in particular to a method for calibrating an eye-tracking system and to the use of the eye-tracking system calibrated in this way for evaluating the behavior of a person.
- the invention also relates to a correspondingly designed eye-tracking system.
- this relates in particular to the detection of a vehicle driver or aircraft pilot by an eye-tracking system provided in a vehicle or aircraft.
- eye tracking is used in particular to determine the current activity of the driver or pilot or to assess his concentration.
- the reliable detection of features of the eye that are characteristic of the viewing direction by a camera system in a vehicle or aircraft is also subject to inaccuracies due to, for example, changing lighting conditions. It has therefore already been proposed to use multiple cameras or particularly high-resolution cameras. Such a configuration of a camera system is comparatively expensive.
- the state of the art also includes the article "SubsMatch 2.0: Scanpath comparison and classification based on subsequence frequencies" by Thomas C. Kübler, Colleen Rothe, Ulrich Schiefer and Wolfgang Rosenstiel (Behavior Research Methods (2017) 49:1048-1064). It describes how a person's eye movement can be evaluated by dividing the eye movement, or the gaze path resulting from the changes in gaze direction, into short sections. The evaluation is then carried out on the basis of these subsections: their occurrences are counted in a gaze path of a given length of time, and a classification is derived from this.
- the object of the invention is to provide improved calibration of an eye-tracking system and, based on this, an improved way of evaluating the detected eye movements, in particular for use in a vehicle or aircraft.
- a method for calibrating an eye-tracking system is proposed which fulfills the following features and which is intended in particular for use in the eye-tracking system of a motor vehicle.
- the method according to the invention provides that the eyes of a person, in particular a driver of a motor vehicle, are recorded by means of a camera system and the direction of the person's view is determined on the basis of optically recorded features. The chronological course of these viewing directions results in a viewing path.
- This gaze path is imprecise due to a variety of causes and therefore cannot be used directly by a processing unit to determine what the driver is looking at or how his concentration or readiness to take over is to be assessed.
- These causes include a variable sitting position, differences in the optically recorded features that are used to determine the direction of view, and a limited resolution of the camera used.
- the calibration aims to compensate for the individual influencing factors mentioned.
- the captured gaze path is formed by gaze directions determined successively in time by the eye-tracking system, and is preferably supplemented by the current gaze direction at a predetermined frequency.
- the viewing directions recorded at the predetermined frequency form the nodes of the recorded gaze path, which are connected by edges.
- gaze-path bundles of frequently used partial paths of the ascertained gaze path are then determined by means of clustering.
- This clustering means that similar partial paths of the gaze path, along which a change of gaze direction has taken place, are identified as belonging together, so that the large number of recorded partial paths is reduced to a smaller number of bundles of partial paths representing typical gaze-direction changes or sections thereof, for example the change from a view of the road ahead of the vehicle to the side mirror.
- the gaze-path bundles determined by clustering are in turn used according to the invention to derive correction parameters which can be employed later to normalize further detected gaze paths for data standardization, as will be explained in more detail below.
- the method according to the invention for calibrating an eye tracking system is therefore distinguished by the fact that it does not require a separate calibration phase, but rather uses the viewing directions of the person, in particular the driver of the vehicle in normal driving operation of the vehicle, for the calibration.
- the gaze directions, or the resulting gaze path, are not used directly to derive correction parameters, since they are not suitable for direct calibration. Instead, the gaze-path bundles determined by clustering are used. Clustering means that a coarse network with fewer edges and nodes is created from the fine network which forms the captured gaze path.
- This coarse network of bundled partial paths is well suited for evaluation in the course of calibration. By comparison with a coarse reference network formed from reference data, it is possible to determine to what extent deviations from the reference network exist.
- the evaluation of the viewing direction to form viewing paths for the subsequent clustering is preferably carried out by evaluating images captured by the camera system in such a way that the eyes of the captured person and their pupils are localized and a vector representing the viewing direction is derived from this.
- the optically recorded features for determining this gaze direction include the detection of a pupil center of the person, the detection of a pupil contour of the person, the detection of an eye contour of the person, the detection of an iris contour of the person, the detection of an eyelid of the person and/or the detection of reflections on the person's cornea.
- the determined vector is preferably used together with an imaginary (virtual) projection surface in order to determine the viewing direction used to generate the viewing path in the manner of two-dimensional coordinates. This determines where the gaze vector crosses the imaginary projection surface.
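The intersection of the gaze vector with the imaginary projection surface can be sketched as a simple ray-plane intersection. The following is an illustrative assumption only (the patent fixes neither a coordinate system nor a plane position): it takes an eye position and gaze direction in 3-D and returns the two coordinates (X, Y) where the ray crosses a plane z = plane_z.

```python
def gaze_to_plane(origin, direction, plane_z=1.0):
    """Intersect a 3-D gaze ray with the virtual plane z = plane_z.

    Returns the 2-D (x, y) coordinate on the plane, or None if the
    ray runs parallel to (or away from) the plane.  Illustrative
    sketch; names and plane geometry are assumptions.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None              # ray parallel to the plane
    t = (plane_z - oz) / dz
    if t <= 0:
        return None              # plane lies behind the eye
    return (ox + t * dx, oy + t * dy)
```

Sampling this intersection at the predetermined frequency yields the sequence of two-dimensional gaze directions that forms the gaze path.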
- the calibration procedure described here can also be carried out when more than two parameters or coordinates are used to describe the viewing direction of the observed person.
- the correction parameters which are determined can be, for example, a total of five parameters which describe the deviation of the coarse network formed by clustering from a reference network with regard to horizontal displacement, vertical displacement, horizontal scaling, vertical scaling and rotation.
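A minimal sketch of how such five parameters could be applied to standardize a single gaze coordinate; the order of scaling, rotation and shifting is an assumption, since the patent only names the parameters:

```python
import math

def normalize_gaze(point, dx, dy, sx, sy, theta):
    """Apply five correction parameters (horizontal/vertical shift,
    horizontal/vertical scaling, rotation in radians) to one 2-D gaze
    coordinate.  The operation order (scale, rotate, shift) is an
    illustrative assumption."""
    x, y = point
    x, y = x * sx, y * sy                      # per-axis scaling
    c, s = math.cos(theta), math.sin(theta)    # rotation about origin
    x, y = c * x - s * y, s * x + c * y
    return (x + dx, y + dy)                    # translation
```

In calibrated operation, every newly determined gaze direction would be passed through such a mapping before further evaluation.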
- the clustering is preferably carried out by means of the following process steps, these process steps in particular preferably being carried out several times.
- the above-mentioned fine network, which forms the captured gaze path, forms the input data.
- the output data of each run forms the input data for the next run.
- the gaze path is recalculated in a first sub-step, hereinafter referred to as "resampling".
- the gaze path, i.e. either the gaze path previously determined by the camera or a resulting gaze path generated in a previous clustering run, remains essentially unchanged, but the nodes are set anew, so that their spacing along the gaze path no longer depends on the speed of the change in viewing direction.
- the nodes can be arranged at an identical distance on the gaze path.
- the gaze path can also be smoothed at the same time.
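The resampling sub-step can be sketched as arc-length resampling of the polyline formed by the gaze path; function and parameter names are illustrative and not taken from the patent:

```python
def resample_path(nodes, n_out):
    """Re-set nodes at (approximately) equal arc-length spacing along a
    polyline gaze path, so node density no longer depends on the speed
    of the gaze movement.  Minimal sketch of the 'resampling' sub-step."""
    # cumulative arc length at each input node
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(nodes, nodes[1:]):
        dists.append(dists[-1] + ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5)
    total = dists[-1]
    out = []
    for i in range(n_out):
        target = total * i / (n_out - 1)
        # locate the segment containing the target arc length
        j = next(k for k in range(1, len(dists)) if dists[k] >= target)
        seg = dists[j] - dists[j - 1] or 1.0   # guard zero-length segments
        f = (target - dists[j - 1]) / seg
        (x0, y0), (x1, y1) = nodes[j - 1], nodes[j]
        out.append((x0 + f * (x1 - x0), y0 + f * (y1 - y0)))
    return out
```

Smoothing, which the text mentions as an optional companion step, could be added by averaging neighboring output nodes.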
- a density map (heat map) is generated.
- for this purpose, the nodes of the gaze path on the imaginary projection surface mentioned are preferably used.
- the density map therefore shows how many nodes are located in each preferably square sub-area of the imaginary projection surface relative to the other sub-areas.
- the density map can also be based on the edges of the viewing path.
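A node-based density map over square sub-areas can be sketched as a simple 2-D histogram; the cell size is an illustrative choice:

```python
def density_map(nodes, cell=1.0):
    """Count gaze-path nodes per square cell of the virtual projection
    surface; the resulting histogram is the 'heat map' used for
    clustering.  Cell size is an illustrative assumption."""
    counts = {}
    for x, y in nodes:
        key = (int(x // cell), int(y // cell))   # index of the square cell
        counts[key] = counts.get(key, 0) + 1
    return counts
```

An edge-based variant, as the text notes, would accumulate traversed path length per cell instead of node counts.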
- the resampled gaze path generated in the first sub-step is modified in that the nodes, or at least some of the nodes, are shifted in the direction of a local maximum of the density map, for example by 10% of the distance between the position of the node and the local maximum.
- the result is a modified gaze path which forms the output data of the run.
- This modified gaze path can then be subjected again to the three method steps mentioned above.
- clustering with resampling, creating a density map and moving nodes should be carried out between 5 and 400 times, preferably between 10 and 200 times.
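One pass of the node-shifting sub-step, moving each node 10% of the way toward the nearest local maximum of the density map, could look as follows (choosing the single nearest maximum is one option; the text also allows a weighted influence of several maxima):

```python
def shift_nodes(nodes, maxima, step=0.10):
    """One clustering pass: move every node `step` (e.g. 10 %) of the way
    toward the nearest local maximum of the density map, given as 2-D
    points.  Repeated 10-200 times, this contracts the fine gaze network
    into path bundles.  Sketch only; names are illustrative."""
    out = []
    for x, y in nodes:
        # nearest local maximum by squared distance
        mx, my = min(maxima, key=lambda m: (m[0] - x) ** 2 + (m[1] - y) ** 2)
        out.append((x + step * (mx - x), y + step * (my - y)))
    return out
```

The full clustering loop would alternate resampling, density-map creation, and this shift for the stated number of runs.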
- resampling may be omitted if the nodes were previously formed by viewing directions that were approximately the same distance apart (see above) when the viewing path was created.
- the evaluation of the determined gaze-path bundles for deriving the correction parameters is preferably carried out by a comparison with stored reference values, in particular by linear optimization.
- the reference values can in particular form a coarse reference network of edges and nodes, which is also considered to be given if the reference values can be transformed into a reference network.
- This reference network can be compared with the coarse network of partial paths of the resulting gaze path after clustering, in particular by means of the linear optimization mentioned. In this way, the correction parameters can be determined by means of which the determined gaze-path bundles can be converted into the reference values.
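As an illustration of such a linear optimization, the sketch below fits shift and scale for a single axis by least squares; rotation is omitted for brevity, and the patent does not specify the actual optimization over all five parameters:

```python
def fit_axis(measured, reference):
    """Least-squares fit of reference ≈ s * measured + d for one axis:
    a minimal stand-in for the linear optimization that compares the
    clustered path bundles with the reference network."""
    n = len(measured)
    mx = sum(measured) / n
    mr = sum(reference) / n
    var = sum((m - mx) ** 2 for m in measured)             # spread of measurements
    cov = sum((m - mx) * (r - mr)
              for m, r in zip(measured, reference))        # co-variation with reference
    s = cov / var                                          # scale
    d = mr - s * mx                                        # shift
    return s, d
```

Running this per axis on corresponding node coordinates of the bundle network and the reference network would yield four of the five correction parameters.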
- the correction parameters determined by means of the described calibration method can be used, once determined, to standardize the driver's gaze directions in calibrated operation in such a way that they can subsequently be evaluated.
- the calibration described, in particular with the clustering method described in detail, allows a comparatively inexpensive camera system to be used which detects the face of the observed person in the near-infrared spectral range (wavelength range: 780 nm to 1000 nm). Usually a single camera is sufficient to achieve sufficiently good results.
- the comparatively low demands on the camera technology used are made possible by the calibration method described, since this compensates for inaccuracies caused by the camera, among other things, by determining the correction parameters mentioned.
- When installed in a vehicle, the camera is preferably positioned in the area of the dashboard, the A-pillar or above the windshield.
- integration into glasses is also possible and can be advantageous depending on the application.
- the invention also encompasses a method for evaluating the behavior of a vehicle driver by means of an eye tracking system, within the scope of which the calibration method described is used.
- the directions of sight recorded after the calibration by means of the camera system or the gaze paths determined therefrom are standardized by means of the determined correction parameters and can then be evaluated by comparison with reference values in order to permit an assessment of the behavior of the observed person.
- a standardized gaze path is determined based on the standardized gaze directions, which shows the course of the gaze direction and its change over time.
- this gaze path can span a limited, just-elapsed period of time which is particularly meaningful for the assessment of the observed person, in particular the driver of the vehicle. A past period between 5 seconds and 300 seconds in length, in particular between 15 seconds and 60 seconds, is considered expedient.
- the gaze path standardized by the correction parameters, or a simplified standardized gaze path derived therefrom, is subdivided into partial sections. These subsections of the gaze path preferably comprise the gaze directions of a defined period, in particular a period between 0.1 second and 1.0 second. The sections preferably overlap one another instead of lying flush against one another.
- the standardized gaze path is preferably not subdivided directly into sub-sections, but rather the aforementioned simplified standardized gaze path.
- This can be generated by shifting the standardized gaze path in the direction of a structure defined by reference data, or by projecting it onto that structure, so that the shifted nodes of the standardized gaze path lie directly on the structure defined by the reference data. This ensures that the subsections also lie on this structure, which simplifies the further evaluation steps.
- the further evaluation steps include in particular a step in which occurrences of matching or similar subsections are counted.
- the values determined in this way can be used as the input variable of a trained machine learning system, for example a “Support Vector Machine”, which assigns an evaluation result to the specific count values for different sections.
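The counting of matching subsections can be sketched as a sliding-window count whose result vector would then feed the classifier; window length and stride are illustrative assumptions:

```python
from collections import Counter

def subsection_counts(path, win=5, stride=1):
    """Break a (simplified, standardized) gaze path into overlapping
    subsections of `win` samples and count how often each subsection
    occurs.  The resulting count vector could serve as input to a
    trained classifier such as an SVM.  Sketch only."""
    windows = [tuple(path[i:i + win])
               for i in range(0, len(path) - win + 1, stride)]
    return Counter(windows)
```

Because the path has been projected onto the reference structure, matching subsections coincide exactly, which makes this exact counting meaningful.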
- Such an evaluation result can have different types depending on the training of the machine learning system. In this way, a conclusion can be drawn about the activity of the observed person, which is then assessed for appropriateness against the background of the current street situation. Alternatively, however, the evaluation result can also be a value which quantifies the willingness of the observed person to take over, for example a period within which the observed person, judged from the previous line of sight, is likely to be able to take control of the vehicle again.
- Such a numerical value can then be compared with a corresponding period of time which relates to the autonomous driving functions of the vehicle and predicts how long, based on the current traffic situation, the autonomous driving functions can still guarantee safe driving without the driver's intervention. As soon as the predicted takeover time of the driver approaches or falls below the specified safe period of the autonomous driving functions, the driver can be warned, for example by an acoustic, haptic or visual warning.
- the invention also relates to the eye tracking system itself, which is equipped with a camera system and an evaluation unit.
- this evaluation unit is designed to carry out the calibration method described above.
- the evaluation unit is preferably further designed to carry out the described method for determining the behavior of a person on the basis of the calibrated eye-tracking system and to give acoustic, haptic or visual warnings if necessary.
- the invention also relates to a motor vehicle or an aircraft which is equipped with such an eye-tracking system.
- the camera of such an eye-tracking system is preferably firmly integrated into the vehicle, in particular in the area of the dashboard, the A-pillar or above the windshield, and is directed towards the driver. The same applies to an aircraft.
- alternatively, a mobile system is possible in which the camera for detecting the eyes is integrated in a head-mounted structure, in particular in the form of glasses. These glasses must then additionally be localized in space in order to allow the driver's gaze direction to be determined in the imaginary projection plane in the manner shown.
- the glasses can be localized in space by another camera, which is either also provided on the glasses or is permanently integrated in the vehicle.
- Fig. 1 shows the interior of a passenger car including the eye tracking system as well as the head of the driver and typical gaze vectors.
- FIG. 2 is a representation based on FIG. 1, which clarifies the determination of viewing directions through the intersection of the viewing vectors with an imaginary projection surface.
- Fig. 3 shows a gaze path determined for the purpose of calibration.
- Figs. 4A and 4B show the resampling of a section of the ascertained gaze path for the purpose of the clustering process.
- Fig. 5 shows the entire resampled gaze path in a partial section of the imaginary projection surface.
- Fig. 6 shows a density map that is created on the basis of the resampled gaze path.
- Figs. 7A and 7B show the clustering based on the displacement of nodes of the resampled gaze path in the direction of local maxima, and the resulting modified gaze path.
- Fig. 9 illustrates the derivation of correction parameters by comparing the determined path bundles with reference values.
- Fig. 10 shows the vehicle interior together with a gaze path determined by the previously calibrated eye-tracking system and the gaze path standardized therefrom on the basis of the correction parameters.
- Fig. 11 shows the ascertained gaze path in the imaginary projection surface and the reference structure created on the basis of the reference values of the system.
- Figs. 13A to 13D show the breakdown of the standardized gaze path, generated by projection onto the reference structure, into subsections for subsequent evaluation.
- FIG. 1 shows a cockpit of a passenger car with an eye tracking system 10, comprising a camera 12 which is attached in such a way that it can capture a vehicle driver 2 and in particular his eyes 3.
- the camera 12 is designed such that it can always or at least almost always capture the driver's eyes regardless of the specific size and head position of the vehicle driver.
- the camera 12, which is designed to capture the image in the near infrared spectral range, is connected to an electronic evaluation unit 14, which is only shown schematically and is designed to evaluate the image captured by the camera 12 and to carry out the methods described below. Even if the evaluation unit 14 is designed as a separate component in FIG. 1, it can of course also be part of an on-board computer instead.
- the evaluation unit 14 can, in particular, be connected directly or indirectly to signaling means, such as a loudspeaker 16, in order to be able to give signals to the driver 2 when necessary.
- the vehicle driver 2 looks at various things while driving, which is illustrated in FIG. 1 by a plurality of gaze vectors 20.
- this also includes various other vehicle-owned objects such as the exterior mirror, the interior mirror, the navigation device or the radio, the instruments behind the steering wheel or other parts of the vehicle such as the passenger seat.
- the eye tracking system 10 continuously records images of the vehicle driver 2 by means of the camera 12 and forwards them to the evaluation unit 14 mentioned.
- This evaluation unit 14 recognizes the current gaze vector 20, for which purpose it detects the eye position and the pupil position. Optically detectable features that can be used for this purpose are generally known.
- the gaze vector 20 that is recognized in each case is not recognized absolutely error-free due to simple camera technology and variability in the person and seating position of the driver. However, due to the calibration procedure described below, this is hardly significant.
- the evaluation unit 14 derives from the gaze vectors 20, which can be described, for example, by three coordinates and two angles, gaze directions 22, which in the exemplary embodiment of the invention are formed by the intersection of the gaze vector 20 with an imaginary projection surface 30 and are thus described by two coordinates. These two coordinates (X, Y) are used for further processing in this specific exemplary embodiment.
- This form of determining gaze directions 22 is not error-free, since a gaze direction determined in this way could result identically from different gaze vectors 20 when the head position of the vehicle driver 2 changes. However, since the head of the driver 2 is only slightly shifted up and down or to the right and left in practice, this imprecision does not play a role in practice.
- the temporal course of the viewing directions 22 leads to the viewing path 24 shown in the imaginary projection surface 30 of FIG. 2.
- This gaze path 24 forms the starting point for the calibration of the eye-tracking system.
- the eye-tracking system 10 can, for example, observe the gaze directions 22 over a defined period of a few minutes and determine the gaze path 24 therefrom.
- the viewing direction is preferably determined at a fixed frequency, for example at a frequency of 50 Hz.
- FIG. 3 shows the ascertained gaze path 24 on the imaginary projection surface 30 and a partial area 32 that is considered in more detail.
- the exemplary gaze path 24 shown here is, for better comprehension, simplified within the framework of this description compared to a real gaze path determined over a few minutes. Nevertheless, it shows that the gaze path 24 usually has recognizable sections which are used particularly frequently, but which of course do not have exactly the same course.
- a clustering step then takes place, that is, a simplification of the gaze path by combining or merging similar sections. In the present case, this is done by means of the sequence shown in the following figures.
- the nodes 25 of the gaze path, which are variably spaced apart due to the originally variable speed of the change in viewing direction, are set anew by resampling. Areas on which the driver 2 has focused his gaze for a long time without significantly changing the gaze direction 20 are thereby possibly simplified.
- Fig. 4B shows a partial section of the resampled gaze path 24′.
- a density map 40 (heat map) is generated.
- the imaginary projection surface 30 is subdivided into square partial areas.
- the density map shown in Fig. 6 represents a rough simplification, since the number of sub-areas is quite small. In practice, it is advantageous to create a density map with a higher resolution. The differentiation of only five different weightings in Fig. 6 is also a simplification compared to practice.
- Local maxima result from the density map 40.
- the nodes 25′ of the gaze path are then shifted in the direction of a local maximum; alternatives are also conceivable in which the nodes are influenced in a weighted manner by several local maxima during the shift.
- the shifted nodes 25 ′′ are shown in FIGS. 7A and 7B as open circles. This defines a modified gaze path 24 ′′, which is shown in FIG. 7B.
- the aforementioned steps of clustering, ie the resampling, the generation of the density map 40 and the displacement of the nodes 25 'in the direction of at least one local maximum, are preferably carried out multiple times, with a number of runs between 10 and 200 being considered particularly expedient.
- This resulting structure consists of comparatively few different nodes.
- the subsections of the gaze path are effectively bundled in this structure; that is, they form the gaze-path bundles shown.
- the line weight in Fig. 7 reflects how many of the respective subsections of the modified gaze path are contained in the respective gaze-path bundles.
- Fig. 8 is used to derive correction parameters from the differences between the determined gaze-path bundles and reference values, which in the present case form a reference structure 28.
- In Fig. 9, the gaze-path bundles 26 of Fig. 7 are shown in broken lines. Also shown in Fig. 9 is the reference structure 28 formed from the reference values.
- the eye-tracking system 10 can normalize the gaze directions 122 or gaze paths 124 subsequently acquired by means of these correction parameters, that is to say convert the gaze directions 122 in such a way that they are suitable for further processing.
- This subsequent operation of the eye-tracking system 10, calibrated by the determination of the correction parameters, can serve in particular to evaluate the viewing direction of the vehicle driver 2 in such a way that his concentration on the traffic situation is assessed.
- in the case of partially autonomous driving operation of the vehicle, the evaluation can be used to estimate how long the driver 2 is likely to need, starting from an acoustic signal, for example, until he can take control of the vehicle again.
- the simplest type of further processing of viewing directions recorded with the calibrated eye tracking system is to reliably determine where the driver is currently looking, for example on the traffic in front of the vehicle or in the rearview mirror, or for example on instruments of the vehicle or a multimedia system or the like.
- the subsections 27 are portions of the path 124 which usually represent between 0.1 second and 1.0 second each. They are very well suited as an input variable for a trained machine learning system.
- a trained machine learning system can be used for an evaluation whose target information is a determination of the driver's activity. In particular, however, even without determining intermediate values, its target information can be an immediate assessment of whether the vehicle driver is ready to take control of the vehicle from an autonomous driving mode, or of the time period to be expected before readiness to take over is reached.
- Fig. 10 shows the driving situation.
- the gaze vector 20 of the driver 2 is detected by the calibrated eye tracking system 10 and its intersection with the imaginary projection surface 30 is made available as a gaze direction 22 for further processing.
- the line of sight 22 is determined with a frequency of, for example, 50 Hz.
- the sequence of the gaze directions 22 forms the gaze path 24.
- the correction parameters recorded during the calibration are used to convert the recorded gaze path 24 into a standardized gaze path 124. This is shown in dotted lines in FIG. 10.
- this standardized gaze path 124 is now projected onto the reference structure 28 and thus forms a simplified standardized gaze path 124 '.
- the simplified gaze path 124′ thus modified is shown in Fig. 12.
- This gaze path 124′ is now broken down into the subsections 27, which are shown in Figs. 13A to 13D by solid highlighting.
- the subsections 27 thus represent portions of the simplified standardized gaze path 124′, whereby the subsections 27 typically, but not necessarily, have the same duration.
- Such a time period can be, for example, between 0.1 second and 1.0 second.
- a standardized gaze path 124 which represents 15 seconds can be simplified in the manner outlined above by means of the reference structure 28, so that it forms the simplified gaze path 124′, and can then be broken down into 900 overlapping subsections 27 of 0.1 second in length.
- the calibration can also be repeated or the correction parameters can be continuously updated as part of a continuous calibration, for example based on the viewing path of the previous 60 seconds.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2018/067315 WO2020001767A1 (de) | 2018-06-27 | 2018-06-27 | Verfahren zum kalibrieren eines eye-tracking-systems, verfahren zur auswertung des verhaltens einer person mittels eines eye-tracking-systems sowie eye-tracking-system und kraftfahrzeug mit einem solchen eye-tracking-system |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3814872A1 true EP3814872A1 (de) | 2021-05-05 |
Family
ID=62842077
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18737837.7A Pending EP3814872A1 (de) | 2018-06-27 | 2018-06-27 | Verfahren zum kalibrieren eines eye-tracking-systems, verfahren zur auswertung des verhaltens einer person mittels eines eye-tracking-systems sowie eye-tracking-system und kraftfahrzeug mit einem solchen eye-tracking-system |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP3814872A1 (de) |
WO (1) | WO2020001767A1 (de) |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102012105664A1 (de) * | 2012-06-28 | 2014-04-10 | Oliver Hein | Verfahren und Vorrichtung zur Kodierung von Augen- und Blickverlaufsdaten |
2018
- 2018-06-27 WO PCT/EP2018/067315 patent/WO2020001767A1/de unknown
- 2018-06-27 EP EP18737837.7A patent/EP3814872A1/de active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2020001767A1 (de) | 2020-01-02 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: UNKNOWN
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | ORIGINAL CODE: 0009012
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: REQUEST FOR EXAMINATION WAS MADE
2021-01-25 | 17P | Request for examination filed |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
| DAV | Request for validation of the european patent (deleted) |
| DAX | Request for extension of the european patent (deleted) |
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: EXAMINATION IS IN PROGRESS
2022-12-07 | 17Q | First examination report despatched |