SE539952C2 - Adaptive user interface system for a vehicle - Google Patents

Adaptive user interface system for a vehicle

Info

Publication number
SE539952C2
Authority
SE
Sweden
Prior art keywords
information
information area
display
control unit
user
Prior art date
Application number
SE1451415A
Other languages
Swedish (sv)
Other versions
SE1451415A1 (en)
Inventor
Fjellstroem Jonatan
Katzman Simon
Vaennstroem Johanna
Friberg Robert
Krupenia Stas
Ahlm Victor
Nyberg Helena
Nicola Daniele
Original Assignee
Scania Cv Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to SE1451415A priority Critical patent/SE539952C2/en
Priority to DE102015015136.3A priority patent/DE102015015136A1/en
Publication of SE1451415A1 publication Critical patent/SE1451415A1/en
Publication of SE539952C2 publication Critical patent/SE539952C2/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Arrangement of adaptations of instruments
    • B60K35/10
    • B60K35/20
    • B60K35/23
    • B60K35/29
    • B60K35/654
    • B60K35/81
    • B60K2360/149
    • B60K2360/151
    • B60K2360/195
    • B60K2360/334
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Abstract

An adaptive user interface system (2) for a vehicle (3), configured to present information at an information display (4) of said vehicle. The system comprises a control unit (6) for controlling the information presented at said information display (4), and said information display comprises a plurality of information areas (8); each information area has a predefined position at said information display and is configured for presentation of information area content. The system comprises an eye tracking arrangement (10) configured to determine at least one predefined information area parameter (12) related to a user's visual interaction (14) with information areas (8) and to generate an eye tracking signal (16), including said at least one parameter, to be applied to said control unit (6). The control unit (6) is configured to store the determined information area parameter(s) for each information area, to analyse said information area parameters over time using a set of analyse rules, to determine one or many information areas to be active in dependence of a set of adaptive presentation rules related to a present user, and to control said information display to present information area content of active information areas.

Description

Adaptive user interface system for a vehicle

Field of the invention

The present disclosure relates to an adaptive user interface system for a vehicle, and a method in connection with such a system according to the preambles of the independent claims.
Background of the invention

In modern vehicles the amount of information presented to drivers increases as the number of new help systems integrated in the vehicles grows. This may result in an increasing mental workload for the drivers. Some drivers want as much information as they can get, and some drivers only need some basic information.
Different drivers have different strategies for solving the task of driving the vehicle.
Therefore, different drivers require different information, and different amounts of information, to feel secure and comfortable.

Information is normally presented at information displays arranged in front of the driver and below the windscreen.
An alternative way of presenting information to a driver is by using so-called heads-up displays (HUD). A HUD projects information on the windshield of an automobile, allowing a driver to view the projected information without having to look down at an instrument panel. Although they were initially developed for military aviation, HUDs are now used in commercial aircraft, automobiles, computer gaming, and other applications.
A HUD is any transparent display that presents data without requiring users to look away from their usual viewpoints. The name stems from a pilot being able to view information with the head positioned "up" and looking forward, instead of angled down looking at lower instruments.
US-2007/0194902 relates to an adaptive heads-up user interface for automobiles. The interface comprises a number of display elements that may be presented in a variety of display states, which are determined based on inputs from a variety of sources, e.g. from the vehicle, or based on user interaction and biometric information.
As discussed above, the amount of information that some drivers would like to see may be large, and if presented at the windscreen it can have negative effects on the driver's driving capabilities, as it may obscure the driver's view.
One overall object of the present invention is to take into account a driver's personal way of choosing which information he/she would like to see, and thereby achieve a system and a method for personalizing the information presented to a driver.
Summary of the invention

The above-mentioned object is achieved, or at least mitigated, by the present invention according to the independent claims.
Preferred embodiments are set forth in the dependent claims.
According to the present invention an adaptive user interface system is achieved that is capable of remembering when the driver chooses to look at a certain piece of information. The system is configured to log the position the driver is looking at by using an eye tracker technique including integrated so-called gaze interaction. The system adapts the presented information in accordance with the result of an analysis of parameters related to the information areas the driver has been looking at. The adaptation relates both to which information should be presented and to when the information should be presented.

In one embodiment the information is presented to the driver at the windscreen, i.e. the information is presented at a heads-up display (HUD).
One advantage of the present invention is that only information wanted by the driver is presented.
Short description of the appended drawings

Figure 1 is a schematic illustration of a vehicle comprising the adaptive user interface system according to the present invention.
Figure 2 is a block diagram schematically illustrating an adaptive user interface system according to the present invention.
Figure 3 is a flow diagram illustrating the method according to the present invention.
Detailed description of preferred embodiments of the invention

The present invention will now be disclosed in detail with reference to the appended figures.
The present invention relates to an adaptive user interface system 2 applicable for use in e.g. a vehicle 3, which is shown by the schematic illustration in figure 1. In the figure a display 4, a control unit 6 and an eye tracking arrangement 10 are schematically illustrated.
Thus, with reference to figure 2, the present invention relates to an adaptive user interface system 2 for a vehicle 3 (see figure 1), e.g. a bus, a truck, or an automobile, but also for aircraft and boats.
The system is configured to present information at an information display 4 of the vehicle. The system comprises a control unit 6 for controlling, via control signals 7, the information presented at the information display 4. In turn, the control unit 6 receives information to be presented from various systems of the vehicle.
The information display 4 comprises a plurality of information areas 8, where each information area has a predefined position at the information display and is configured for presentation of information area content. Thus, the control unit 6 is provided with data regarding the information areas, their respective positions and also which information is presented at each information area, herein denoted the information area content. An information area may have different sizes and shapes depending on the information intended to be presented. The information area content may be the speed of the vehicle, a temperature indication, traffic warnings, information related to the surroundings, information on music played at the radio, etc.
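By way of illustration only, such an information area could be modelled as a simple record holding its predefined position, size and content. All class, field and function names below are assumptions made for this sketch, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class InformationArea:
    area_id: int
    x: int           # predefined position at the display (pixel units assumed)
    y: int
    width: int       # size and shape may vary with the intended content
    height: int
    content: str     # e.g. "vehicle speed", "traffic warning"

    def contains(self, gaze_x: int, gaze_y: int) -> bool:
        """True if a gaze point falls inside this area."""
        return (self.x <= gaze_x < self.x + self.width
                and self.y <= gaze_y < self.y + self.height)
```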
The system comprises an eye tracking arrangement 10 configured to determine at least one predefined information area parameter 12 related to a user's visual interaction 14 - illustrated by a schematic illustration of a user's eye 15 - with information areas 8, and to generate an eye tracking signal 16, including the at least one parameter, to be applied to the control unit 6. The control unit 6 is configured to control the function of the eye tracking arrangement 10 by control signals 17. Different aspects of the eye tracking technique will be discussed in detail below under a separate heading, where specific issues in relation to using the technique in vehicles when implementing the present invention will also be discussed. The user's visual interaction 14 includes both determining the position of an information area that the user is looking at and, in addition, a possibility of activating a function controlled by an information area content in an information area that the user is looking at, i.e. gaze interaction.
The at least one predefined information area parameter includes e.g. a parameter stating the time duration of a user fixation of an information area.
The control unit 6 is configured to store the determined information area parameter(s) for each information area, and to analyse the information area parameters over time using a set of analyse rules.
The control unit is further configured to determine one or many information areas to be active in dependence of a set of adaptive presentation rules related to a present user, and to control the information display, by control signals 7, to present information area content of active information areas.
According to the present invention the set of analyse rules comprises a rule to perform a frequency analysis of user interaction with information areas. The frequency analysis may comprise determining the number of times a fixation is made at a specific information area during a predetermined time period. A fixation is defined as viewing a specific information area for a time period with a duration of at least in the range of 200-350 ms.
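A minimal sketch of such a frequency analysis, assuming the eye tracking arrangement delivers fixation events as (area id, timestamp, duration) tuples; the 200 ms lower bound follows the fixation definition above, while the event format and names are invented for illustration:

```python
from collections import Counter

FIXATION_MIN_DURATION_S = 0.200  # lower end of the 200-350 ms range above

def count_fixations(events, window_s, now):
    """Count qualifying fixations per information area within the last window_s seconds.

    events: iterable of (area_id, timestamp_s, duration_s) tuples (assumed format).
    """
    counts = Counter()
    for area_id, timestamp, duration in events:
        if duration >= FIXATION_MIN_DURATION_S and now - timestamp <= window_s:
            counts[area_id] += 1
    return counts
```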
The analysis is preferably performed continuously and may include interaction information from a predetermined time period, e.g. some hours, or during an entire driving session.
An information area may be regarded as active if the number of fixations of that information area is above a predetermined threshold during the predetermined time period. Thus, a fixed number of fixations may be required in order to qualify as an active information area. As an alternative, the predetermined threshold may be set such that the information areas being active are e.g. the "top ten" areas having the most fixations.
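Both alternatives, the fixed threshold and the "top ten" selection, are easy to express on top of the counts from the previous sketch; the parameter values below are again assumptions:

```python
def active_by_threshold(counts, threshold=5):
    """Areas whose fixation count exceeds a fixed threshold (threshold value assumed)."""
    return {area for area, n in counts.items() if n > threshold}

def active_by_top_n(counts, n=10):
    """The n most-fixated areas, e.g. the "top ten" (counts is a collections.Counter)."""
    return {area for area, _ in counts.most_common(n)}
```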
Thus, the presentation rules include a rule to determine an information area as active in dependence of the result of said analysing step.
According to one embodiment the set of presentation rules includes at least one rule including vehicle related parameters, e.g. the speed of the vehicle. This presentation rule may e.g. state that the number of active information areas may be higher when driving at a lower speed, for the reason that at lower speed the driver may have time to take more information into account than when driving at higher speed, when only a few information areas should be shown.
As an alternative, when driving at higher speed, e.g. on a motorway, there is less distraction compared to driving in more complex environments, e.g. in towns where the driver's full attention is required, and therefore more information areas may be active. In that case the speed, in combination with e.g. driving in an essentially straight direction, may be taken into account.
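One possible, purely illustrative encoding of such a speed-dependent presentation rule is shown below. The speed bands and area counts are invented, and the opposite mapping described in the alternative above could be expressed the same way:

```python
def max_active_areas(speed_kmh: float) -> int:
    """Upper bound on simultaneously active information areas, by speed band.

    Bands and counts are assumptions for illustration only.
    """
    if speed_kmh < 30:   # e.g. town traffic or a red-light stop
        return 10
    if speed_kmh < 80:
        return 6
    return 3             # high speed: show only a few areas
```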
Also, more advanced functionality may be provided. For example, the system should be smart enough to remember that the driver, e.g. ten minutes prior to each stop, would like to check where to eat. That information will then always emerge before it is time to stop.
Thus, the present invention provides for relating and/or adapting the presented information to the situation, i.e. relating the piece of information to the situation and also adapting how the information is presented, e.g. specifically highlighting the information if the degree of importance is high. E.g. at a red-light stop (the speed is low/zero) it would be possible to present a certain type of information.
The control unit is preferably configured to store a user's set of adaptive presentation rules when a user session is terminated, and to use that set of presentation rules the next time the same user uses the system. Thereby the presentation of information is personalized. The driver is normally identified by his/her driver card.
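A sketch of this personalization step, assuming the driver card provides an identifier and that the rule set can be serialized; the JSON file and key scheme are assumptions for illustration:

```python
import json

def save_rules(driver_card_id: str, rules: dict, path: str = "presentation_rules.json") -> None:
    """Persist a user's presentation rules when the session terminates."""
    try:
        with open(path) as f:
            store = json.load(f)
    except FileNotFoundError:
        store = {}
    store[driver_card_id] = rules
    with open(path, "w") as f:
        json.dump(store, f, indent=2)

def load_rules(driver_card_id: str, path: str = "presentation_rules.json") -> dict:
    """Reload the stored rules the next time the same driver card is detected."""
    try:
        with open(path) as f:
            return json.load(f).get(driver_card_id, {})
    except FileNotFoundError:
        return {}
```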
According to one particular embodiment the information display is a so-called heads-up display (HUD) at a windscreen of the vehicle. The HUD technology will be further discussed below under a separate heading.
As an alternative, the information display is an LCD, or another type of display, arranged in front of the driver below the windscreen.
The present invention also relates to a method in an adaptive user interface system for a vehicle. The adaptive interface system has been described in detail above and reference is herein made to that description. Thus, the interface system is configured to present information at an information display of the vehicle. The system comprises a control unit for controlling the information presented at the information display, and the information display comprises a plurality of information areas, where each information area has a predefined position at the information display and is configured for presentation of information area content.
With reference to the flow diagram shown in figure 3, the method will now be described in more detail.
The method comprises determining, by an eye tracking arrangement, at least one predefined information area parameter related to a user's visual interaction with information areas, generating an eye tracking signal including the parameter(s), and storing, in the control unit, the determined information area parameter(s) for each information area.

The control unit then performs the step of analysing the information area parameters over time using a set of analyse rules, e.g. by performing a frequency analysis of user interaction with information areas, by a frequency analysis rule in the set of analyse rules.
Furthermore, the method comprises determining one or many information areas to be active in dependence of a set of adaptive presentation rules related to a present user, and presenting, at the information display, information area content of active information areas. More in detail, the presentation rules provide for determining an information area as active in dependence of the result of the analysing step.
The set of presentation rules may e.g. include one or many rules including vehicle related parameters, e.g. the speed of the vehicle.
The frequency analysis rule comprises determining the number of times a fixation is made at a specific information area during a predetermined time period. An information area is regarded as active if the number of fixations of that information area is above a predetermined threshold during a predetermined time period. The predefined information area parameters include a parameter stating the time duration of a user fixation of an information area. These aspects are further discussed above in the description of the system.
According to one embodiment the method comprises storing a user's set of adaptive presentation rules when a user session is terminated, and using that set of presentation rules the next time the same user uses the system.
As an output from the eye tracking arrangement, an attention level of the driver may be available. This attention level may include a fatigue level of the driver, and in that case it might be necessary to "wake" the driver up by e.g. playing music louder or by a visual indication at the windscreen (HUD).
The attention level may also include information on the stress level of the driver, e.g. if the traffic situation is intensive it may be necessary to present less information, or a certain type of information, to assist the driver in the best manner.

The present invention also relates to a computer program P (see figure 2) that comprises a computer program code to cause the control unit 10, or a computer connected to the control unit 10, to perform the method described above.

In addition, a computer program product is provided comprising a computer program code stored on a computer-readable medium to perform the method described above, when the computer program code is executed by the control unit 10 or by a computer connected to the control unit 10.
Heads-up display (HUD)

A typical HUD comprises three primary components: a projector unit, a combiner, and a video generation computer - herein included in the control unit 6.
The projection unit in a typical HUD is an optical collimator setup: a convex lens or concave mirror with a cathode ray tube, light emitting diode, or liquid crystal display at its focus. This setup produces an image where the light is parallel, i.e. perceived to be at infinity.
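As background (standard optics, not stated in the patent), the "image at infinity" follows from the thin-lens equation: placing the display at the focal plane drives the image distance to infinity, so the emerging rays are parallel:

```latex
\frac{1}{v} - \frac{1}{u} = \frac{1}{f}, \qquad u = -f
\;\Rightarrow\; \frac{1}{v} = \frac{1}{f} - \frac{1}{f} = 0
\;\Rightarrow\; v \to \infty
```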
The combiner is typically an angled flat piece of glass (a beam splitter) located directly in front of the viewer, which redirects the projected image from the projector in such a way that the viewer sees the field of view and the projected infinity image at the same time. Combiners may have special coatings that reflect the monochromatic light projected onto them from the projector unit while allowing all other wavelengths of light to pass through. In some optical layouts combiners may also have a curved surface to refocus the image from the projector.
The computer provides the interface between the HUD (i.e. the projection unit) and the systems/data to be displayed, and generates the imagery and symbology to be displayed by the projection unit.
Newer micro-display imaging technologies are being introduced, including liquid crystal display (LCD), liquid crystal on silicon (LCoS), digital micro-mirrors (DMD), and organic light-emitting diode (OLED).
Eye-tracker technique

An eye tracker system incorporates near-infrared micro-projectors, optical sensors and image processing. Micro-projectors create reflection patterns on the eyes. Image sensors register the image of the user, the user's eyes, and the projection patterns, in real time. Image processing is used to find features of the user, the eyes and the projection patterns.
Mathematical models are used to exactly calculate the eyes' position and the gaze point.
Eye tracking is the process of measuring either the point of gaze (where one is looking) or the motion of an eye relative to the head. An eye tracker is a device for measuring eye positions and eye movement. Eye trackers are used in research on the visual system, in psychology, in cognitive linguistics and in product design. There are a number of methods for measuring eye movement. The most popular variant uses video images from which the eye position is extracted. Other methods use search coils or are based on the electro-oculogram.
The second broad category uses some non-contact, optical method for measuring eye motion. Light, typically infrared, is reflected from the eye and sensed by a video camera or some other specially designed optical sensor. The information is then analysed to extract eye rotation from changes in reflections. Video-based eye trackers typically use the corneal reflection (the first Purkinje image) and the centre of the pupil as features to track over time. A more sensitive type of eye tracker, the dual-Purkinje eye tracker, uses reflections from the front of the cornea (first Purkinje image) and the back of the lens (fourth Purkinje image) as features to track. A still more sensitive method of tracking is to image features from inside the eye, such as the retinal blood vessels, and follow these features as the eye rotates. Optical methods, particularly those based on video recording, are widely used for gaze tracking and are favoured for being non-invasive and inexpensive.

The most widely used current designs are video-based eye trackers. A camera focuses on one or both eyes and records their movement as the viewer looks at some kind of stimulus. Most modern eye trackers use the centre of the pupil and infrared / near-infrared non-collimated light to create corneal reflections (CR). The vector between the pupil centre and the corneal reflections can be used to compute the point of regard on a surface, or the gaze direction. A simple calibration procedure of the individual is usually needed before using the eye tracker.
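To make the pupil-centre / corneal-reflection idea concrete, here is a heavily reduced sketch. Real systems fit a 3D eye model; the linear gain/offset mapping and all names below are assumptions for illustration:

```python
def gaze_point(pupil_centre, glint, calib):
    """Map the pupil-glint vector to display coordinates.

    pupil_centre, glint: (x, y) image coordinates of the pupil centre and the
    corneal reflection. calib: per-user gains/offsets, e.g.
    {"ax": ..., "bx": ..., "ay": ..., "by": ...}, fitted during the simple
    calibration procedure mentioned above.
    """
    vx = pupil_centre[0] - glint[0]
    vy = pupil_centre[1] - glint[1]
    return calib["ax"] * vx + calib["bx"], calib["ay"] * vy + calib["by"]
```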
Two general types of eye tracking techniques are used: bright-pupil and dark-pupil. Their difference is based on the location of the illumination source with respect to the optics. If the illumination is coaxial with the optical path, then the eye acts as a retroreflector as the light reflects off the retina, creating a bright pupil effect similar to red eye. If the illumination source is offset from the optical path, then the pupil appears dark because the retroreflection from the retina is directed away from the camera.
Bright-pupil tracking creates greater iris/pupil contrast, allowing more robust eye tracking with all iris pigmentation, and greatly reduces interference caused by eyelashes and other obscuring features. It also allows tracking in lighting conditions ranging from total darkness to very bright. But bright-pupil techniques are not effective for tracking outdoors, as extraneous IR sources interfere with monitoring.
Eye-tracking setups vary greatly; some are head-mounted, some require the head to be stable (for example, with a chin rest), and some function remotely and automatically track the head during motion. Most use a sampling rate of at least 30 Hz. Although 50/60 Hz is most common, today many video-based eye trackers run at 240, 350 or even 1000/1250 Hz, which is needed in order to capture the details of the very rapid eye movement during reading or during studies of neurology.

Eye movement is typically divided into fixations and saccades - when the eye gaze pauses in a certain position, and when it moves to another position, respectively. The resulting series of fixations and saccades is called a scanpath. Most information from the eye is made available during a fixation, but not during a saccade. The central one or two degrees of the visual angle (the fovea) provide the bulk of visual information; the input from larger eccentricities (the periphery) is less informative. Hence, the locations of fixations along a scanpath show what information loci on the stimulus were processed during an eye tracking session. On average, fixations last for around 200 ms during the reading of linguistic text, and 350 ms during the viewing of a scene. Preparing a saccade towards a new goal takes around 200 ms.
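As an illustration of how fixations can be separated from saccades in a raw gaze stream, below is a sketch in the spirit of a dispersion-threshold (I-DT style) algorithm, using the roughly 200 ms minimum fixation duration mentioned above; the dispersion threshold and data format are assumptions:

```python
def detect_fixations(samples, max_dispersion=30.0, min_duration=0.200):
    """samples: list of (t, x, y) gaze samples sorted by time.

    Returns fixations as (t_start, t_end, centre_x, centre_y) tuples;
    samples outside any fixation window are treated as saccade samples.
    """
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        # Grow the window while the gaze points stay within the dispersion limit.
        while j + 1 < len(samples):
            xs = [s[1] for s in samples[i:j + 2]]
            ys = [s[2] for s in samples[i:j + 2]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        t_start, t_end = samples[i][0], samples[j][0]
        if j > i and t_end - t_start >= min_duration:
            xs = [s[1] for s in samples[i:j + 1]]
            ys = [s[2] for s in samples[i:j + 1]]
            fixations.append((t_start, t_end, sum(xs) / len(xs), sum(ys) / len(ys)))
        i = j + 1
    return fixations
```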
Scanpaths are useful for analyzing cognitive intent, interest, and salience. Other biological factors (some as simple as gender) may affect the scanpath as well. Eye tracking in human-computer interaction (HCI) typically investigates the scanpath for usability purposes, or as a method of input in gaze-contingent displays, also known as gaze-based interfaces.
Below is a list of different aspects of eye tracking to be taken into account when implementing the adaptive user interface system according to the present invention, and more specifically to achieve a system that immediately and securely connects, or matches, a user's interaction with a specific position at the display, i.e. a specific information area, and the information presented at that area, the information area content, and then analyses the stored data to adapt the information to be presented.
- Gaze direction and gaze point - used in interaction with computers and other interfaces, and in behavioural research/human response testing to better understand what attracts people's attention.
- Eye-presence detection - the eye-tracking system must first find the eyes, so it is the most fundamental part of eye tracking.

- Eye position - the ability to calculate the position of the eyes in real time makes the eye tracking system accurate and precise while allowing the user to move freely.
- User identification - the eye tracking system can be used as a multimodal biometrics sensor, such as for logging on to a computer or for car-driver identification. It can combine face identification with physiological eye features and eye movement patterns.
- Eyelid closure - is used to monitor the user's sleepiness, for instance in advanced driver assistance or operator safety solutions.
- Eye movement and patterns - are studied to understand human behaviour and to assess and diagnose injuries or diseases.
- Pupil size and pupil dilation - pupil dilation is an indicator of excitement. In combination with eye movement patterns and facial expressions, it can be used to derive emotional reactions, for instance in creating innovative user experiences. Pupil dilation can also serve as a marker of impairment, such as concussion, or drug or alcohol influence.
The present invention is not limited to the above-described preferred embodiments. Various alternatives, modifications and equivalents may be used. Therefore, the above embodiments should not be taken as limiting the scope of the invention, which is defined by the appended claims.

Claims (16)

Claims
1. An adaptive user interface system (2) for a vehicle (3), configured to present information at an information display (4) of said vehicle, the system comprises a control unit (6) for controlling the information presented at said information display (4), and that said information display comprises a plurality of information areas (8), each information area has a predefined position at said information display and is configured for presentation of information area content, c h a r a c t e r i z e d in that said system comprises an eye tracking arrangement (10) configured to determine at least one predefined information area parameter (12) related to a user's visual interaction (14) with information areas (8) and to generate an eye tracking signal (16) to be applied to said control unit (6) including said at least one parameter, and that said control unit (6) is configured to store said determined information area parameter(s) for each information area, and to analyse said information area parameters over time using a set of analyse rules, the control unit is further configured to determine one or many information areas to be active in dependence of a set of adaptive presentation rules related to a present user, and to control said information display to present information area content of active information areas, wherein said set of analyse rules comprises a rule to perform a frequency analysis of user interaction with information areas, wherein said frequency analysis comprises determining a number of times a fixation is made at a specific information area during a predetermined time period, and wherein said presentation rules include a rule to determine an information area as active in dependence of the result of said analysing step.
2. The adaptive user interface system according to claim 1, wherein an information area is active if a number of fixations of that information area are above a predetermined threshold during a predetermined time period.
3. The adaptive user interface system according to any of claims 1-2, wherein said predefined information area parameters include a parameter stating the time duration of a user fixation of an information area.
4. The adaptive user interface system according to any of claims 1-3, wherein said set of presentation rules include a rule including vehicle related parameters.
5. The adaptive user interface system according to any of claims 1-4, wherein said information display is a heads-up display (HUD) at a windscreen of the vehicle.
6. The adaptive user interface system according to any of claims 1-4, wherein said information display is an LCD, or another type of display.
7. The adaptive user interface system according to any of claims 1-6, wherein said control unit is configured to store a user's set of adaptive presentation rules when a user session is terminated, and to use that set of presentation rules the next time the same user uses the system.
8. A method in an adaptive user interface system for a vehicle, configured to present information at an information display of said vehicle, the system comprises a control unit for controlling the information presented at said information display, and that said information display comprises a plurality of information areas, each information area has a predefined position at said information display and is configured for presentation of information area content, c h a r a c t e r i z e d in that said method comprises
- determining, by an eye tracking arrangement, at least one predefined information area parameter related to a user's visual interaction with information areas,
- generating an eye tracking signal including said parameter(s),
- storing, in said control unit, said determined information area parameter(s) for each information area,
- analysing said information area parameters over time using a set of analyse rules,
- determining one or many information areas to be active in dependence of a set of adaptive presentation rules related to a present user,
- presenting, at said information display, information area content of active information areas,
- performing frequency analysis of user interaction with information areas, by a frequency analysis rule in said set of analyse rules,
- determining a number of times a fixation is made at a specific information area during a predetermined time period by said frequency analysis rule,
- by said presentation rules, determining an information area as active in dependence of the result of said analysing step.
9. The method according to claim 8, wherein an information area is active if a number of fixations of that information area are above a predetermined threshold during a predetermined time period.
10. The method according to any of claims 8-9, wherein said predefined information area parameters include a parameter stating the time duration of a user fixation of an information area.
11. The method according to any of claims 8-10, wherein said set of presentation rules include a rule including vehicle related parameters.
12. The method according to any of claims 8-11, wherein said information display is a heads-up display (HUD) at a windscreen of the vehicle.
13. The method according to any of claims 8-12, comprising storing a user's set of adaptive presentation rules when a user session is terminated, and using that set of presentation rules the next time the same user uses the system.
14. A computer program P, wherein said computer program P comprises a computer program code to cause a control unit (10), or a computer connected to the control unit (10), to perform the method according to any of claims 8-13.
15. A computer program product comprising a computer program code stored on a computer-readable medium to perform the method according to any of the claims 8-13, when the computer program code is executed by a control unit (10) or by a computer connected to the control unit (10).
16. A vehicle (3) comprising an adaptive user interface system according to any of claims 1-7.
SE1451415A 2014-11-24 2014-11-24 Adaptive user interface system for a vehicle SE539952C2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE1451415A SE539952C2 (en) 2014-11-24 2014-11-24 Adaptive user interface system for a vehicle
DE102015015136.3A DE102015015136A1 (en) 2014-11-24 2015-11-23 Adaptive user interface system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE1451415A SE539952C2 (en) 2014-11-24 2014-11-24 Adaptive user interface system for a vehicle

Publications (2)

Publication Number Publication Date
SE1451415A1 SE1451415A1 (en) 2016-05-25
SE539952C2 true SE539952C2 (en) 2018-02-06

Family

ID=55913932

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1451415A SE539952C2 (en) 2014-11-24 2014-11-24 Adaptive user interface system for a vehicle

Country Status (2)

Country Link
DE (1) DE102015015136A1 (en)
SE (1) SE539952C2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112578903A (en) * 2019-09-30 2021-03-30 Tobii AB Eye tracking method, eye tracker, and computer program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7764247B2 (en) 2006-02-17 2010-07-27 Microsoft Corporation Adaptive heads-up user interface for automobiles

Also Published As

Publication number Publication date
SE1451415A1 (en) 2016-05-25
DE102015015136A1 (en) 2016-05-25

Similar Documents

Publication Publication Date Title
Holmqvist et al. RETRACTED ARTICLE: Eye tracking: empirical foundations for a minimal reporting guideline
US11786105B2 (en) Content presentation in head worn computing
TWI741512B (en) Method, device and electronic equipment for monitoring driver's attention
US11474348B2 (en) Method and device for eye tracking using event camera data
US20150213634A1 (en) Method and system of modifying text content presentation settings as determined by user states based on user eye metric data
US10448826B2 (en) Visual function testing device and visual function testing system
Itier et al. Effects of task demands on the early neural processing of fearful and happy facial expressions
US20170289519A1 (en) Display with eye-discomfort reduction
CN110155349A (en) Peripheral vision in man-machine interface
US20160085302A1 (en) Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects
US20100191156A1 (en) Human state estimating device and method
Rigas et al. Study of an extensive set of eye movement features: Extraction methods and statistical analysis
Sharma et al. Eye gaze techniques for human computer interaction: A research survey
KR101638095B1 (en) Method for providing user interface through head mount display by using gaze recognition and bio-signal, and device, and computer-readable recording media using the same
TW202020625A (en) The method of identifying fixations real-time from the raw eye- tracking data and a real-time identifying fixations system applying this method
SE539952C2 (en) Adaptive user interface system for a vehicle
CN111753628B (en) Training eye tracking model
US20220183546A1 (en) Automated vision tests and associated systems and methods
Thomson Eye tracking and its clinical application in optometry
US20230259203A1 (en) Eye-gaze based biofeedback
JP7165910B2 (en) Impression evaluation system and impression evaluation method
WO2023134637A1 (en) Vehicle-mounted eye movement interaction system and method
DO HYONG The Study of Visual Attention: From Raw Data to an Interdisciplinary Environment
Greene Transportation
Wang et al. Investigating pupil dilation in decision research