GB2536882A - Head up display adjustment - Google Patents

Head up display adjustment

Info

Publication number
GB2536882A
GB2536882A (Application GB1505092.5A)
Authority
GB
United Kingdom
Prior art keywords
hud
display
head
position information
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1505092.5A
Other versions
GB201505092D0 (en)
GB2536882B (en)
Inventor
Ashley Jon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Denso Corp
Original Assignee
Denso Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Denso Corp filed Critical Denso Corp
Priority to GB1505092.5A
Publication of GB201505092D0
Publication of GB2536882A
Application granted
Publication of GB2536882B
Legal status: Active
Anticipated expiration

Classifications

    • G02B27/0093 Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • B60K35/00 Instruments specially adapted for vehicles; arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements, i.e. from vehicle to user, using visual output, e.g. blinking lights or matrix displays
    • B60K35/22 Display screens
    • B60K35/23 Head-up displays [HUD]
    • B60K35/53 Movable instruments, e.g. slidable
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G06F3/012 Head tracking input arrangements
    • G06F3/013 Eye tracking input arrangements
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/04845 Graphical user interface [GUI] interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • B60K2360/149 Instrument input by detecting viewing direction not otherwise provided for
    • G02B2027/0118 Head-up displays comprising devices for improving the contrast of the display / brilliance control visibility
    • G02B2027/012 Head-up displays comprising devices for attenuating parasitic image effects
    • G02B2027/0138 Head-up displays comprising image capture systems, e.g. camera
    • G02B2027/014 Head-up displays comprising information/image processing systems
    • G02B2027/0187 Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G06F2203/04804 Transparency, e.g. transparent or translucent windows

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A controller (6, fig 7) and method (fig 14) for adjusting a display 10 on a Head Up Display "HUD" (7, fig 7), wherein the controller obtains head position information 31 of a user 30 from a detector unit 5 and, based on that information, estimates whether content displayed on the HUD is causing an obstruction (S1402, fig 14). If an obstruction is estimated, the controller generates an adjustment command to adjust the display and reduce the level of obstruction (S1403, fig 14). The obstruction may be reduced by moving an image on the display to other locations 10a, 10b and 10c, or by changing the saturation, hue, luminosity, contrast or degree of animation of a portion of, or all of, the display. The controller may also, either as a dependent feature or as an independent second aspect of the application, estimate whether content displayed on the HUD is distorted due to the viewing angle or perspective of the user, and apply reverse-distortion processing based on that information.

Description

Head Up Display Adjustment
Field
The present disclosure relates to a method and an apparatus for adjusting a display on a Head Up Display and to a vehicle comprising a Head Up Display and such an apparatus. The present disclosure relates in particular to, but is not limited to, the adjustment of the display of a Head Up Display associated with a vehicle such as a car based on a positioning of a driver's head.
Background
In a vehicle or elsewhere, a Head Up Display or "HUD" can be used to display information to a user, generally on a transparent screen within the field of vision of the user. In some cases, the information can also be displayed such that it is perceived as being at infinity, so that the user does not have to accommodate or focus on a near object and can see the information on the display while looking at a far object, such as the road ahead. This can be achieved using, for example, an optical collimator.
HUDs can be used to display a variety of information and, in a vehicle, can display vehicle speed, safety information, Satnav directions or other non-vehicle-related information such as weather information, notifications, etc. US 2004/0178894 A1 discusses a head-up display system and method for the location-correct display of an object situated outside a vehicle with regard to the position of the driver, which aims at correcting the position of an object on the HUD with respect to the position of the driver.
US 2013/194110 A1 discusses an automotive augmented reality head-up display apparatus and method for displaying augmented reality information, wherein the location of the display unit can be changed based on the visual field of the driver. However, adjusting the location of the unit can be cumbersome and can increase the cost and complexity of the unit.
However, these arrangements still suffer from some limitations, as will be clear from the discussion below.
Summary
The invention is defined in the appended claims.
According to a first aspect of the present disclosure, there is provided a controller for adjusting a display on a Head Up Display "HUD". The controller is configured to: obtain, from a detector unit, head position information for a user of a HUD; estimate whether content displayed on the HUD is causing obstruction based on the obtained head position information; and generate, if it is estimated that content displayed on the HUD is causing obstruction, adjustment commands to adjust the display on the HUD to reduce the level of obstruction caused by the content displayed on the HUD.
The head position information may comprise head direction information and the controller may then be configured to estimate whether content displayed on the HUD is causing obstruction based on head direction information only.
The controller may be further configured to: estimate whether content displayed on the HUD is likely to appear distorted based on the obtained head position information; and apply reverse-distortion processing to the content to be displayed on the HUD wherein the reverse-distortion processing is based on the head position information. For example, the head position information may comprise head coordinates information and wherein the controller may then be configured to estimate whether content displayed on the HUD is likely to appear distorted based on head coordinates position information only.
The controller may be further configured to generate adjustment commands to adjust the display on the HUD by generating adjustment commands for adjusting one or more of: the saturation of a portion or all of the display; the hue of a portion or all of the display; the luminosity of a portion or all of the display; the contrast of a portion or all of the display; the degree of animation for some or all of the content to be displayed; the location of some or all of the content within the HUD display; and the location of the HUD display.
The controller may be further configured to generate adjustment commands to adjust the display on the HUD based on the type of content to be displayed. For example, the controller may be configured to generate adjustment commands for adjusting differently those portions of the display which are for displaying content of different types.
According to a second aspect of the present disclosure, there is provided a Head Up Display "HUD", the HUD comprising: a light emitting device for displaying content on the HUD; a detector unit configured to output head position information for a user of a HUD; and a controller as discussed above in respect of the first aspect, wherein the controller is configured to receive the head position information output by the detector unit; and control the light emitting device based on the generated adjustment commands.
According to a third aspect of the present disclosure, there is provided a vehicle comprising a Head Up Display "HUD" according to the second aspect.
According to a fourth aspect of the present disclosure, there is provided a method of adjusting a display on a Head Up Display "HUD", the method comprising: obtaining head position information for a user of a HUD; estimating whether content displayed on the HUD is causing obstruction based on the obtained head position information; and generating, if it is estimated that content displayed on the HUD is causing obstruction, adjustment commands to adjust the display on the HUD to reduce the level of obstruction caused by the content displayed on the HUD.
The method may further comprise adjusting the display of the HUD based on the generated adjustment commands.
In some examples, the head position information comprises head direction information and the method may then further comprise estimating whether content displayed on the HUD is causing obstruction based on head direction information only.
The method may comprise: estimating whether content displayed on the HUD is likely to appear distorted based on the obtained head position information; applying reverse-distortion processing to the content to be displayed on the HUD wherein the reverse-distortion processing is based on the head position information. For example, the head position information may comprise head coordinates information and the method may then comprise estimating whether content displayed on the HUD is likely to appear distorted based on head coordinates position information only.
The method may comprise generating adjustment commands to adjust the display on the HUD by generating adjustment commands for adjusting one or more of the saturation of a portion or all of the display; the hue of a portion or all of the display; the luminosity of a portion or all of the display; the contrast of a portion or all of the display; the degree of animation for some or all of the content to be displayed; the location of some or all of the content within the HUD display; and the location of the HUD display.
The method may comprise generating adjustment commands to adjust the display on the HUD based on the type of content to be displayed. For example, the method may comprise generating adjustment commands for adjusting differently those portions of the display which are for displaying content of different types.
According to a fifth aspect of the present disclosure, there is provided a controller for adjusting a display on a Head Up Display "HUD", the controller being configured to: obtain, from a detector unit, head position information for a user of a HUD; estimate whether content displayed on the HUD is likely to appear distorted based on the obtained head position information; and apply reverse-distortion processing to the content to be displayed on the HUD wherein the reverse-distortion processing is based on the head position information.
The head position information may comprise head coordinates information and the controller may then be configured to estimate whether content displayed on the HUD is likely to appear distorted based on head coordinates position information only.
The controller may be further configured to: estimate whether content displayed on the HUD is causing obstruction based on the obtained head position information; and generate, if it is estimated that content displayed on the HUD is causing obstruction, adjustment commands to adjust the display on the HUD to reduce the level of obstruction caused by the content displayed on the HUD. For example, the head position information may comprise head direction information and the controller may then be configured to estimate whether content displayed on the HUD is causing obstruction based on head direction information only.
According to a sixth aspect of the present disclosure, there may be provided a method of adjusting a display on a Head Up Display "HUD", the method comprising: obtaining, from a detector unit, head position information for a user of a HUD; estimating whether content displayed on the HUD is likely to appear distorted based on the obtained head position information; and applying reverse-distortion processing to the content to be displayed on the HUD wherein the reverse-distortion processing is based on the head position information.
Other aspects will also become apparent upon review of the present disclosure, in particular upon review of the Brief description of drawings, Description of examples and Claims sections.
Brief description of drawings
Examples of the present disclosure will be described in the following by way of example only and with reference to the accompanying figures, in which:
Figure 1 illustrates an example head up display in a vehicle;
Figure 2 illustrates another example head up display in a vehicle;
Figure 3 illustrates an example of a system for adjusting a head up display;
Figure 4 illustrates an example coordinates and direction system;
Figure 5 illustrates example head directions in the coordinates and direction system of Figure 4;
Figure 6 illustrates another example of head directions in the coordinates and direction system of Figure 4;
Figure 7 illustrates an example head up display system;
Figure 8 illustrates an example head up display configuration for a driver in a vehicle;
Figures 9-12 illustrate examples of head up display adjustments;
Figure 13 illustrates an example map for HUD adjustments;
Figure 14 illustrates an example method of adjusting a HUD;
Figure 15 illustrates another example of head up display adjustments;
Figure 16 illustrates an example of distortion on a head up display using a flat surface; and
Figure 17 illustrates another example of distortion on a head up display using a curved surface.
Description of examples
Figure 1 illustrates an example head up display ("HUD") in a vehicle. In this example, the HUD is based on the projection of an image 10 on a windscreen 1 using a projector 2. This representation is simplified and for illustrative purposes only as, in some examples, the projector 2 may not project directly onto the windscreen; the emitted light may instead go through a lens and/or mirror arrangement. This HUD uses the windscreen of the car as the screen for the HUD. This arrangement thus makes use of equipment already available in the car rather than requiring the addition of another element which could present a safety hazard. Additionally, it also limits the amount of space required for having a HUD installed in the vehicle as no additional screen is then required. On the other hand, with such an arrangement it may in some cases be difficult to configure the HUD so that the image appears to be at infinity. For example, the angle of the windscreen and the car configuration may not enable a configuration where an image can easily be projected from projector 2 such that it appears to be at infinity or to be a far object. Also, the configuration will be windscreen-specific and is likely to vary from car model to car model, such that complex configuration may be required before it is usable by a driver.
Figure 2 illustrates another example head up display in a vehicle. In this example, the HUD comprises a screen 3 mounted on brackets 4 on which an image 10 or content 10 is displayed. The screen 3 may be fully or partially retractable, may be movable (e.g. up/down, sideways and/or rotatable) or may be generally fixed. Conventionally, the screen 3 will be at an angle relative to the vertical so that it can reflect light sent from a projector, conventionally placed above or below the screen 3 (i.e. in a substantially vertical direction), towards the driver in a horizontal direction. As the skilled person will understand, other configurations may be provided where the projector is to the side of the screen 3, or in any other suitable direction, so that the driver can see the content displayed on the screen, whether it is set up so that it appears to be at infinity or not. This arrangement can more easily be fitted to any type of car, vehicle or environment as the HUD configuration will not depend on the shape of an element of the car, vehicle or environment. For example, with a screen oriented as screen 3 of Figure 2, if the projector or light emitting device is provided underneath the screen, the HUD may be put into a single unit which can be readily fitted without needing to redesign the internal configuration of the dashboard and windscreen area of the vehicle.
The present disclosure is intended to cover any suitable type of HUD and is not limited to HUDs which include one or more standalone screens and/or reuse an existing screen. Also, in the present disclosure, the expression "head up display" or "HUD" will generally refer to the entire HUD system, which may include for example one or more of: a projector or any other type of light emitting device, an arrangement of one or more lenses or mirrors, and/or a screen (e.g. a standalone screen or windscreen). The HUD system may also include a controller, for example for processing the image to be projected by the light emitting device.
Figure 3 illustrates an example of a system for adjusting a head up display. In this example, the image 10 displayed on a windscreen 1 may be adjusted based on a position of the driver. For example, a measuring device 5 can detect a position of the driver so that the position of the displayed image 10 on the windscreen can be adjusted accordingly. Figure 3 schematically illustrates three different positions 10a, 10b and 10c for image 10 on the HUD's screen which can be adjusted based on the position of the head 31 of the driver 30 as measured using the camera 5. To move the position of the image 10, the HUD may adjust a direction of the projector, may adjust a position of the content within the displayable frame of the projector, or may use any combination of both techniques. The same types of adjustment can be made on a screen of a HUD (rather than a windscreen), for example by moving the position of the screen, the direction of the projector and/or the position of the content within the displayable frame and within the screen.
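For illustration, moving content within the displayable frame (one of the techniques above) can be sketched as follows; this is a minimal sketch assuming the frame is a simple 2D image buffer, and the function name is hypothetical:

```python
import numpy as np

def place_content(frame_shape, content, top_left):
    """Return a blank frame with the content placed at top_left (row, col).

    Re-rendering the frame with a new top_left moves the displayed image,
    e.g. between positions 10a, 10b and 10c, without moving any hardware.
    """
    frame = np.zeros(frame_shape, dtype=content.dtype)
    r, c = top_left
    h, w = content.shape[:2]
    frame[r:r + h, c:c + w] = content  # assumes the content fits in the frame
    return frame
```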
Figure 4 illustrates an example coordinates and direction system. This coordinate system will be used for describing the position of the driver's head or face throughout the present disclosure in the interest of consistency. However, as the skilled person will understand, any other coordinate system may be used and the teachings of the present disclosure would apply equally to such other systems. For example, another system may not be orthogonal, may include a polar coordinate system, etc. With the system of Figure 4, the head or face 31 of the driver 30 will be positioned in space using two sets of information: first, the position of the head in the X, Y, Z directions and, second, the orientation or direction of the head in the yaw, pitch, roll directions. For X, Y and Z, the centre of the coordinate system may be a point fixed relative to the vehicle or environment. Direction Y is upward, i.e. in a substantially vertical direction or from the bottom to the top of the car. Directions X and Z are in a substantially horizontal plane, where direction Z is frontward, i.e. in the direction of travel of the vehicle or facing in the same direction as the driver or user. Finally, direction X is sideward, i.e. from one side of the vehicle to the other side with respect to the direction of travel or, in use, from one side of the user to the other side of the user. Rotations around the directions X, Y and Z are called "pitch", "yaw" and "roll", respectively, and are used for determining an orientation or direction of the face or head of the user. For example, a face detection system may be configured to detect a specific point on a user's face, such as the base or tip of the nose or a point between the eyes, in the X, Y, Z coordinate system and may additionally try to detect which direction the face of the user is facing (which is called the orientation or direction of the face).
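For illustration only, such a six-degree-of-freedom head position could be represented in software as follows; a minimal sketch in which the HeadPose name, units and field layout are assumptions rather than anything prescribed by the disclosure:

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    # Coordinates of a reference point on the head (e.g. between the eyes),
    # relative to a point fixed in the vehicle or environment.
    x: float      # sideward, from one side of the user to the other
    y: float      # upward, substantially vertical
    z: float      # frontward, in the direction of travel
    # Orientation of the head, in degrees.
    pitch: float  # rotation about X (looking up or down)
    yaw: float    # rotation about Y (turning left or right)
    roll: float   # rotation about Z (tilting towards a shoulder)
```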
Figure 5 illustrates example face directions in the coordinates and direction system of Figure 4.
As the user looks up or down (for example when nodding), the pitch of the face will vary. This can be estimated by comparing, for example, whether the position of the eyes, nose and/or mouth has moved up or down relative to the detected edges of the face compared to a normal or expected position. Likewise, the yaw indicates whether the user is turning his head to one side or the other (for example when shaking his head to say "no") and can be estimated by comparing whether the eyes, nose and/or mouth have moved to the side relative to the edge of the face compared to a normal or expected position of the eyes, nose and/or mouth. Finally, the roll indicates whether the user tilts his head to one side or the other by moving his head towards one shoulder or the other. Using one or more cameras and an X, Y, Z and pitch, yaw, roll coordinate system, the position and direction of a user's head or face can be identified and used as an input for adjusting the display of a HUD.
Figure 6 illustrates another example of head directions identified based on the coordinates and direction system of Figure 4. In this example, it may for example be determined that a yaw in the range [-10°;+10°] is considered "normal", while a yaw in the range [+30°;+45°] is considered "yaw right". Likewise, a yaw in [-45°;-30°] may be considered "yaw left". Also, a yaw between "normal" and "yaw right"/"yaw left" may be considered a "slight yaw right"/"slight yaw left", while a yaw beyond "yaw right"/"yaw left" may be considered a "large yaw right"/"large yaw left". In other examples, the ranges may be different and/or a HUD system may use the measured or estimated yaw angles for the user's face directly in its adjustment process, rather than a category that the yaw angle falls into.
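Such a categorisation could be coded along the following lines, using the example ranges above; the function name and the treatment of boundary values are assumptions:

```python
def categorise_yaw(yaw_deg: float) -> str:
    """Map a yaw angle in degrees to the example categories of Figure 6."""
    magnitude = abs(yaw_deg)
    side = "right" if yaw_deg >= 0 else "left"
    if magnitude <= 10:
        return "normal"
    if magnitude < 30:
        return f"slight yaw {side}"
    if magnitude <= 45:
        return f"yaw {side}"
    return f"large yaw {side}"
```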
While using an estimated head position to adapt a display on a HUD to the user's needs has clear benefits, it can present some difficulty. For example, on a large and/or curved HUD screen, distortion can occur when the position of a graphic object (or content) is moved to a different position, which may reduce the quality of the display and, possibly, make the display difficult to read. Also, systems such as those discussed in US 2004/0178894 and US 2013/194110 mentioned above can be used for calibrating a HUD position before driving, but may not be particularly well suited to adjusting a HUD display while driving, for example, as will be apparent from the discussion below.
Figure 7 illustrates an example head up display system in accordance with the present disclosure. The HUD 7 comprises a detector 5 for detecting the position of a head, wherein the position may include one or both of a set of coordinates for the head or face, and a direction for the head or face. The detector 5 may for example comprise one or more cameras. The HUD 7 also comprises a controller 6, a light emitting device 2, such as a projector, which can display content onto a screen 1,3. The screen may sometimes be considered as part of the HUD 7, for example in the case of a dedicated screen 3 which is operated by the controller, while in other cases it may be considered as external to the HUD 7, for example when the screen is a windscreen 1. In other words, the screen can optionally be part of the HUD or not part of the HUD, which has been illustrated with a dashed line in Figure 7. The controller 6 is configured to receive position information from the detector 5 and to control the light emitting device 2 and, optionally, the screen 1,3. The controller 6 is configured to use the position information to determine whether the display on the HUD 7 should be adapted and, if it is determined that the display should be adapted, the controller 6 is configured to control the light emitting device and, optionally, the screen 1,3, to adjust the display. For example, the controller may change the content to be displayed, the position of the content to be displayed, and/or any other display parameter for the content such as contrast, luminosity, saturation, etc. For example, the controller 6 may be configured to adjust the display of the HUD with a view to compensating for a potential distortion and/or for compensating for a potential obstruction or distraction caused by the HUD, for example by estimating whether a head position is indicative of an obstruction caused by content displayed on the HUD.
Figure 8 illustrates an example head up display configuration for a driver in a vehicle. The HUD uses a screen 3 placed in front of the windscreen 1 (relative to the user's head 31), onto which content is displayed. The content displayed on the screen 3 is conventionally formed of one or more graphical objects displayed to the user. Figures 9-12 illustrate examples of head up display adjustments in the example system of Figure 8, where (A) illustrates the display prior to adjustment and (B) illustrates the display after adjustment. It may for example be beneficial to carry out these adjustments when the direction of the head (or face) changes, which can be treated as an indication that the HUD may be obstructing the view of the driver and thereby causing a distraction.
In the example of Figure 9, based on the position of the head, the display of the HUD is adjusted so that the content, or the graphic objects, are hidden from view when it is detected that they obstruct the field of view of the user so as to create a distraction. In other examples, the content displayed or to be displayed on the HUD may be toned down, as illustrated in the example of Figure 10. For example, the luminosity and/or intensity of the light emitted by the light emitting device may be reduced. This may also be achieved by reducing the saturation of the display so as to reduce potential distractions caused by bright colours in the field of vision of the user. In the example of Figure 11, the content displayed on the HUD may be moved to the left while, in the example of Figure 12, the content is moved to the right. This can be achieved by moving the content within the image or frame to be projected by the light emitting device, by moving the content relative to the screen (e.g. by moving a projector to the right or left), by moving the screen itself, by any other suitable means or by any combination of the above.
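One way the "toning down" of Figure 10 could be realised in software is sketched below, assuming the content is available as RGB pixel values in the 0..1 range; the scale factors are arbitrary illustrative values:

```python
import colorsys

def tone_down(rgb, saturation_scale=0.5, luminosity_scale=0.7):
    """Reduce the saturation and luminosity of a single RGB pixel (0..1)."""
    h, l, s = colorsys.rgb_to_hls(*rgb)
    return colorsys.hls_to_rgb(h, l * luminosity_scale, s * saturation_scale)

# Example: a bright red graphic object becomes a dimmer, duller red.
print(tone_down((1.0, 0.1, 0.1)))
```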
The examples of Figures 9-12 can be implemented using, for example, the measured or estimated direction of the head, so that the controller is operable to adjust the HUD based on head direction information obtained with, or derived from, the position information obtained from the detector of the HUD. For example, in some cases the content may be moved in the same direction as the head or in a direction opposite to the direction of the head. Also, when reference is made to moving in the same direction or in the opposite direction, these teachings can apply to a direction component in 1D (for example along the X, Y or Z direction of any coordinate system), in 2D (for example in any (X,Y), (X,Z) or (Y,Z) plane of any coordinate system) or in 3D, if possible and appropriate. In this example case where the HUD is adjusted based on head direction information, the head direction information is used to estimate whether the HUD causes distraction and/or obstruction to the user. For example, if the yaw is identified as "yaw right", this may indicate that the HUD is distracting the user and the content displayed on the HUD may be hidden, e.g. temporarily not displayed, so as to reduce the level of distraction and obstruction. In other examples, a combination of different direction angles (e.g. yaw, pitch, roll) may be used to estimate whether the HUD obstructs the view of the user. For example, a HUD system may be implemented such that, for a yaw change only, an obstruction may be detected if the yaw is at least 30° in one or the other direction (left or right) while, for a yaw and roll change, an obstruction may be detected if the yaw is at least 15° and the roll at least 10°. Such a system may use a map which defines the action to take, if any, based on the various information (e.g. head direction information) for the user. Figure 13 illustrates an example map for HUD adjustments, where this map takes into account yaw and roll only, each between 0° and 90°. As the skilled person will understand, this is for illustrative purposes and a map in accordance with the present disclosure may include more than two parameters and cover a wider range (e.g. each of yaw, pitch and roll from -90° to +90°). In this example, five zones are defined wherein, in zone (1), the changes in direction compared to a "normal" position (e.g. facing towards the road in a vehicle) are not considered significant enough to warrant an adjustment of the display. On the other hand, in zones (2) and (3), the system may decide to adjust the display by moving the content to the left or toning the content down, respectively. For example, a strong yaw may be an indication that moving the content may be preferable, whereas an increase in mostly roll may indicate that toning the content down may be better suited to the type of distraction caused by the HUD. As illustrated with the example zone (4), more than one action may be taken and, in this zone, it may be decided that the content should be moved to the left and toned down so as to better address the situation identified. Finally, zone (5) may indicate that the yaw and roll are so significant that the distraction or obstruction is substantial, such that it is then preferable to hide the content, i.e. to temporarily stop displaying the content.
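A minimal sketch of such a zone map follows; the exact boundaries below are hypothetical, since the disclosure gives only example thresholds:

```python
def adjustments_for(yaw_deg: float, roll_deg: float) -> list:
    """Return the adjustment action(s) for a head direction, per Figure 13."""
    yaw, roll = abs(yaw_deg), abs(roll_deg)
    if yaw >= 60 and roll >= 60:
        return ["hide"]                 # zone (5): hide the content
    actions = []
    if yaw >= 30:
        actions.append("move_left")     # zone (2): strong yaw
    if roll >= 30:
        actions.append("tone_down")     # zone (3): mostly roll
    # Both appended together corresponds to zone (4).
    if not actions and yaw >= 15 and roll >= 10:
        actions.append("tone_down")     # combined yaw/roll obstruction
    return actions                      # zone (1): empty list, no change
```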
Although Figure 13 illustrates a small number of zones with clear limits, the map may be more gradual and may involve gradual transitions from one type of adjustment (or combination thereof) to another one. Also, while for example the "move left" zone (2) has been represented as a single zone, the actual adjustment may vary within the zone, for example to vary the amount of left displacement based on the yaw angle.
In other words, in some examples a HUD system may use a map or mapping arrangement to derive the adjustment action(s) to take based on the head position, where the HUD identifies head position parameters (e.g. coordinates and direction parameters) and uses a map that identifies which one or more adjustments to make for each possible combination of head position parameters. In other words, by using a map, the controller may take the relevant position information as an input and, by plotting the current position for this input onto the map, directly obtain the adjustment or adjustments to make. Such an arrangement has the benefit of requiring relatively little computing to identify the adjustment(s) to carry out, but can be less flexible than other options as all outcomes are pre-determined in advance based on a set of parameters. In other examples, the HUD system may use, in place of or in addition to a map arrangement, an arrangement where the adjustments are derived from algorithms and rules. In some examples, such algorithms and rules may be changed or updated, e.g. remotely or by the controller itself (e.g. by machine learning). For example, a score may be given to all possible scenarios and/or adjustment types and, based on these scores, the actual adjustment(s) to be carried out may be determined. Arrangements based on rules and algorithms can require more computing resources to determine which adjustment to make but can also provide for a more flexible system which can implement more fine-tuning. As the skilled person will understand, using a map or algorithms and/or rules may sometimes lead to the same results and may simply represent two possible implementations of the same principle, and any other suitable implementation may be used when deriving the adjustment from the position information.
Figure 14 illustrates an example method of adjusting a HUD. The method starts at step S1401 where the position of the head relative to the HUD is measured. This may for example involve one or more cameras detecting the position of a head using face detection mechanisms. The position information for the head may comprise one or more X, Y, Z coordinates, a pitch, a yaw and a roll. Then, at step S1402, it is estimated whether the head position is indicative of the HUD causing an obstruction. In some implementations, this may be carried out using head direction information only. If a possible obstruction is identified, the display of the HUD is adjusted to reduce the level of obstruction at S1403. The method may then end or return to step S1401, if appropriate.
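The flow of Figure 14 could be sketched as below; the detector and display interfaces (measure_head_pose, apply) are assumed names used only to make the steps concrete:

```python
def adjust_hud_once(detector, display, estimate_obstruction):
    """One pass through steps S1401-S1403 of Figure 14."""
    pose = detector.measure_head_pose()     # S1401: measure head position
    actions = estimate_obstruction(pose)    # S1402: obstruction estimate,
                                            # possibly from direction only
    for action in actions:                  # S1403: adjust the display to
        display.apply(action)               # reduce the level of obstruction
```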
Also, the adjustments to the HUD may depend on the content (e.g. on the type of content) displayed or to be displayed on the HUD. For example, content may be classified depending on its importance and this classification information may be used for determining whether to, and how to, adjust the HUD. Figure 15 illustrates another example of head up display adjustments. On the left side, prior to any adjustment, the display 10 of the HUD is illustrated and includes two types of content. The first type of content is the speed limit 101 while the second type of content is weather information 102 (current and/or future). The two areas marked with a dashed line illustrate two possible central areas for the user's field of vision. For the sake of conciseness, the central area of the field of vision will hereinafter be referred to as the field of vision or field. Field 32A corresponds to the "normal" field of vision for a user, i.e. to the expected field of vision for the user when the user is in a normal or first position (e.g. sitting in a driver's seat and facing the road). Depending on the configuration and/or use of the HUD, this "normal" field of vision may be expected to be in a centred position or in any other suitable position. When the position of the head, and in particular the direction of the head, changes, this would generally result in a change of field of vision. In the example of Figure 15, field 32B indicates a potential new field of vision resulting from a change of position of the head. In the event that it is detected that the position of the head is indicative of an obstruction, the HUD system may decide to reduce the level of obstruction, for example by hiding the display of the weather information 102 but keeping the speed limit information 101 shown, as illustrated on the right side of Figure 15. This selection of which adjustment to apply to which content can for example be based on classification information which may indicate that the weather information is secondary while the speed limit information is critical for the safety of the vehicle. Two or more classification levels may be provided, which can enable a wider range of differentiation in how the display is adjusted based on the content. For example, a first type of content may be moved to the right, while another type of content may be toned down, while yet another type of content may have its saturation reduced. In brief, the HUD may be adjusted so that different types of content displayed or to be displayed on the HUD can be adjusted separately and/or differently.
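A sketch of such content-type-dependent adjustment follows, assuming a simple two-level classification; the item names and levels are illustrative only:

```python
CRITICAL, SECONDARY = 0, 1

def visible_items(items, obstruction_detected):
    """Hide secondary content when an obstruction is estimated, but keep
    critical content (e.g. the speed limit 101) on the display."""
    if not obstruction_detected:
        return items
    return [(name, level) for name, level in items if level == CRITICAL]

content = [("speed_limit", CRITICAL), ("weather", SECONDARY)]
print(visible_items(content, obstruction_detected=True))  # keeps speed_limit only
```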
The display of the HUD can thus be adjusted when it is estimated that the HUD is causing a distraction to the user by obstructing his view. The head position can also be used to adjust the display of the HUD to align the position of the display with the position of the head. For example, this display alignment mechanism may be applied based on the X, Y, Z coordinates for the head so as to align the HUD position with that of the user's head. Alternatively or additionally, the display may be adjusted to compensate for distortions.
If the position of the display 10 is to be moved, distortion can result which would of course reduce the readability of the HUD. For example and as illustrated in Figure 16, distortion can occur on a head up display using a flat surface. In this example the display 10 of the HUD is made of an image 11 projected by a projector 2 which can be seen on screen 3. As can be seen from comparing (A) and (B) of Figure 16, the display 10 will differ depending on the angle between the projector and the screen 3. Therefore, if the position of a display 10 (or of content shown on display 10) changes, the appearance of the content and/or of the entire display will change and be distorted.
Likewise, Figure 17 illustrates another example of distortion on a head up display using a curved surface. In this example, it can be seen that even though the location of the display does not move between (A) and (B), the location of a graphical object moves and its appearance on the display 10 will thus be distorted as it moves.
Accordingly, a HUD system may be configured to correct such a distortion such that, as the display or displayed content changes position, distortions that may occur as a result of such position changes may be compensated for.
Additionally, distortion of the display or of the content of the display can also occur when the display or content, respectively, is not moved relative to the HUD but is in a different position with respect to the user. For example, and using the same principle as illustrated in Figure 16, if the user moves his head from a first position to a second position, from the user's perspective the appearance of the display is likely to change. This change of appearance includes, for example, a type of distortion referred to as perspective distortion. Therefore, the position of the head, and in particular the coordinates for the head, may be used to apply reverse-distortion processing to the image to be displayed on the HUD so that the image appears to have the same appearance from the user's perspective. This can involve, for example, estimating the position of the user's head or face, such as the coordinates, determining a level of distortion of the display for the user based on the estimated position and, based on the determined level of distortion, applying anti-distortion processing to an image to be displayed by the HUD so as to reduce the level of distortion perceived by the user. For example, the relative position of the head or face and of the display of the HUD can be estimated and an image to be displayed may be distorted so that it does not appear distorted when viewed from the user's perspective, by using reverse-perspective distortion image processing means. In some examples, to simplify the complexity of the processing to be carried out, the distortion correction may be carried out so as to reduce the level of distortion in one direction only. For example, the reverse-distortion processing may be calculated taking into account head movement in the X direction only, that is, side movement only. Other examples include any other possible combination of X and/or Y and/or Z position changes.
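Such reverse-perspective processing could, for example, pre-warp the image with a homography, as sketched below using OpenCV; how the apparent corner positions are derived from the head coordinates is an assumption here and is not specified by the disclosure:

```python
import cv2
import numpy as np

def pre_distort(image, apparent_corners):
    """Pre-warp the image so it appears undistorted from the user's viewpoint.

    apparent_corners: 4x2 array of where the image corners would appear
    (top-left, top-right, bottom-right, bottom-left) from the estimated
    head position, in the display's pixel coordinates.
    """
    h, w = image.shape[:2]
    intended = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    # Map the apparent (distorted) corner positions back to the intended
    # ones, so that the physical perspective distortion cancels out.
    m = cv2.getPerspectiveTransform(np.float32(apparent_corners), intended)
    return cv2.warpPerspective(image, m, (w, h))
```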
Accordingly, there has been provided an arrangement where the display of a HUD can be adjusted based on the position of the head of a user with a view to improving the quality of display while reducing the likelihood of the HUD causing obstruction or distraction to the user. Accordingly, using a HUD controller in accordance with the present disclosure, the operation of HUDs can be adjusted with a view to providing a clearer and less obstructive display.
The subject matter of all combinations of independent and dependent claims, both singly and multiply dependent, is expressly contemplated but is not necessarily described in detail for the sake of brevity. Additionally, any feature discussed in a claim in a first category is intended to be explicitly disclosed for any other type of claims having corresponding features. For example, features discussed in a dependent apparatus claim are also considered as relevant and disclosed in respect of a corresponding method. The present disclosure has been described in an illustrative manner, and it is to be understood that the terminology used herein is intended to be descriptive rather than limitative. Many modifications and variations of the present disclosure are possible in light of the above teachings, and the disclosure may be practiced otherwise than as specifically described in respect of example implementations. In particular the present disclosure is intended to include any possible combination of one or more features discussed in any part of the description, drawings and/or claims with one or more features discussed in any other part of the description, drawings and/or claims, provided that the combination can be done technically.
Some example modifications, generalisations and/or combinations are discussed below although the discussion of these variations does not constitute an exhaustive list of all possible modifications, generalisations or combinations but merely an explicit discussion of some of these variations.
In the present disclosure, reference has been made to a user who is the user of the HUD. While the HUD does not include a user or require a user to operate, in use it is likely to be used by a user to whom content will be displayed. For example, in the case of a HUD installed in a vehicle or an aircraft, the user is likely to be a driver or passenger on the one hand, or a pilot or co-pilot on the other hand. The HUD of the present disclosure provides technical features which can be activated or used when a user is using the HUD.
As used herein, the term "position" when referring to the position of a head or face is intended to be interpreted broadly in the sense that it refers to the position taking into account some or all of the six degrees of freedom for the head or face. It may thus include any of: a coordinate position in a 1D, 2D or 3D coordinate system (orthogonal or not, if appropriate), and a direction or directional position based on a rotation angle about one or more axes of a 1D, 2D or 3D coordinate system (orthogonal or not, if appropriate). The head position is preferably measured with respect to the HUD, or measured so that the relative position of the head with respect to the HUD can be derived from the position measurements.
The detector of the HUD which enables the estimating of a position of a head or face may include one or more cameras which can be combined with head or face detection means. Using one camera may provide for simpler and less costly apparatus while using two or more cameras may provide for a more complex but more accurate head position estimation.
In the present disclosure, reference is sometimes made to horizontal or vertical directions. As the skilled person will understand, these terms should be understood as substantially vertical and horizontal, and the direction may be defined relative to the earth or relative to an object used with the HUD (e.g. a vehicle). For example, vertical could be defined as normal to the surface of the earth or as the direction going through the centre of the earth. In the case of a HUD used in a vehicle, the vehicle will generally have front/back and left/right sides which may be defined with respect to a "neutral" direction of travel of the vehicle (e.g. the vehicle travelling straight ahead without turning and without going up/down). Thus, in the case of a HUD used in a vehicle, a vertical direction may for example be defined as normal to a plane defined by the front/back and left/right directions. Likewise, such a plane may define the corresponding horizontal direction. Of course, for a HUD that is not associated with a vehicle (e.g. that is later to be installed in a vehicle), the HUD may be defined with respect to either absolute directions (e.g. vertical/horizontal) using any suitable means or with respect to first, second and third directions (e.g. corresponding to X, Y and Z) suitable for determining a position in a 3D space, and these directions may not correspond to conventional vertical, horizontal, front-back or left-right directions. It is also noteworthy that a HUD may also optionally be configurable so as to adjust one or more of such directions.
The present teachings may be used with any suitable HUD and are applicable to, but not limited to, a HUD used in a vehicle. As used in the present disclosure, the term "vehicle" refers to any means of transportation or mobile machine and includes a car, a bus, a lorry or any other type of road vehicle, a boat, an aircraft, etc. A vehicle is generally expected to be controlled (e.g. driven, piloted, ridden, navigated, etc.) by a person or user (e.g. driver or pilot).
Additionally, while the present disclosure generally refers to a position, location or direction of a head, the term "head" used in this context explicitly encompasses a position, location or direction of a face. For example the HUD may include or rely on external face detection means which can provide the controller of the HUD with face position information. As the skilled person will understand, any head or face detection means may be provided in hardware, software or any suitable combination thereof. The coordinates position of the head may be that of a point or area of the head or face, for example a point identified as midway between the eyes, the centre of the mouth or the tip of the nose.
It is also noteworthy that a HUD in accordance with the present disclosure is not limited to the examples discussed above and includes any suitable type of HUD. For example, the light emitting device of the HUD may include a projector (e.g. projecting onto a windscreen or a HUD screen), a cathode ray tube (CRT) for displaying an image on a phosphor screen, an organic light-emitting diode (OLED) display, a liquid crystal display (LCD) arrangement, or any other suitable type of light emitting device which enables the display of content onto a transparent or at least partially transparent display.
In some cases, the head position detector(s) may include a processing unit and be configured to output position information, for example to the controller of the HUD. In other cases, the head position detector(s) may output measurement information relating to the position of the head while the controller of the HUD may be configured to estimate the position of the head (e.g. coordinates and directions) based on the output of the detector(s).
Adjusting the display of the HUD may include carrying out one or more adjustments to the entire display and/or carrying out one or more adjustments to a portion of the display, and also potentially carrying out different adjustments to different portions of the display, whether the different portions are separate or overlapping. The identification of portions to adjust may for example be based on the content to be displayed on the HUD and on the location of such content on the display. The adjustments carried out to the display include at least: adjusting the saturation; adjusting the hue; adjusting the luminosity; adjusting the contrast; stopping or starting animations for some or all of the content; changing the location of the content within the HUD display; and changing the location of the HUD display.
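The following sketch illustrates how such adjustment commands could be represented, targeting either the entire display or a portion of it; the data layout and the value ranges are assumptions made for illustration only.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class AdjustmentCommand:
    region: Optional[Tuple[int, int, int, int]] = None  # (x, y, w, h); None = entire display
    saturation: float = 1.0              # multiplier, 1.0 = unchanged
    hue_shift_deg: float = 0.0           # rotation on the colour wheel
    luminosity: float = 1.0              # multiplier
    contrast: float = 1.0                # multiplier
    animate: bool = True                 # False stops animations in the region
    offset_px: Tuple[int, int] = (0, 0)  # relocation of content within the display

# Example: dim and de-animate only a 200x80 pixel portion of the display,
# leaving the remainder of the display untouched.
dim_portion = AdjustmentCommand(region=(40, 10, 200, 80),
                                luminosity=0.4, animate=False)
```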
Also, the HUD arrangement of the present disclosure may take any other suitable parameter or input into consideration when determining whether to adjust the display and, if so, how to adjust the display. For example, the time and/or date may be considered, as may the length of use of the HUD by the user, any user preferences (e.g. entered manually, detected or learnt by the HUD), etc. Time aspects may also be considered when determining when to carry out an adjustment. For example, if a user briefly turns his head to the side, this may be an indication that the user is checking his side mirror and may thus not be indicative of the HUD obstructing the user's field of view. On the other hand, if the user is moving his head quickly (and possibly by a smaller amount) and for a longer period, this may be indicative of HUD content obstructing the field of view. The arrangement of the present disclosure may therefore make use of timers and/or time thresholds when estimating whether the HUD is causing obstruction or distraction. Such time parameters may be used in combination with angle parameters. For example, for a change in only or mostly the angle in a first direction (e.g. yaw), a first timer may be used for the estimation, while for changes involving angles in the first and a second direction, or in the second direction only, a second timer may be used, etc. In other words, the estimation of whether the content displayed on the HUD is causing distraction can be based on one or more time parameters used in combination with head position information (e.g. the head's direction).
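The timer-and-threshold logic described above may be sketched as follows; the yaw-only focus and the numeric thresholds are assumed example parameters, and a fuller implementation could, as described, use different timers for different combinations of directions.

```python
import time
from typing import Optional

YAW_THRESHOLD_DEG = 15.0   # assumed: head noticeably turned off-centre
OBSTRUCTION_TIME_S = 2.0   # assumed: sustained beyond this -> possible obstruction

class ObstructionEstimator:
    """Distinguishes a brief glance (e.g. a mirror check) from a sustained
    head deviation that may indicate the HUD content is obstructing."""

    def __init__(self) -> None:
        self._off_centre_since: Optional[float] = None

    def update(self, yaw_deg: float, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        if abs(yaw_deg) < YAW_THRESHOLD_DEG:
            self._off_centre_since = None   # head back near centre: reset timer
            return False
        if self._off_centre_since is None:
            self._off_centre_since = now    # start timing the deviation
        return (now - self._off_centre_since) >= OBSTRUCTION_TIME_S
```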
Also, when determining whether the HUD is distracting the user, the head's coordinates and/or directions may be used directly (e.g. based on the values for one or more of the directions) or indirectly, for example by estimating the user's new field of vision based on the head position and comparing the old and new fields of vision with respect to the display on the HUD.
Also, in one arrangement, only the head direction information may be used for determining whether to adjust the display based on an estimated obstruction or distraction caused by the HUD, and/or only the head coordinates information may be used for determining whether to compensate for a distortion of the display on the HUD. When determining whether the HUD is causing obstruction or distraction, the controller may estimate whether the current head position measured by the detector unit is indicative of an obstruction caused by content displayed on the HUD. In one example, this can be estimated using the head or face direction only, without taking the head or face coordinate position into account. Such arrangements enable useful adjustments of the display which help reduce distraction and/or distortion while simplifying processing at the HUD, such that the HUD can be made smaller (which is of particular importance when fitting it in a vehicle which, by nature, offers limited space) and can use less energy to operate. Likewise, the distortion correction may be based on the head or face coordinate position only, not taking into account the direction of the head or face of the user. It is noteworthy that whenever reference is made to using (a) the coordinate position only or (b) the direction, or directional position, only, this can also be presented as (a) using the coordinate position and/or ignoring the direction of the head, or (b) using the direction and/or ignoring the coordinate position of the head, respectively.
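A minimal sketch of this split, in which the obstruction estimate uses the head direction only and the distortion compensation uses the head coordinates only, might look as follows; the thresholds and the linear pre-warp model are stand-ins, not the correction method of the present disclosure.

```python
def is_obstructing(yaw_deg: float, pitch_deg: float) -> bool:
    """Obstruction estimate from direction only; coordinates are ignored."""
    return abs(yaw_deg) > 15.0 or abs(pitch_deg) > 10.0  # assumed thresholds

def reverse_distortion_params(x: float, y: float, z: float) -> dict:
    """Reverse-distortion parameters from coordinates only; direction ignored.

    Returns parameters for pre-warping the image before display; the linear
    model below is a placeholder used purely for illustration.
    """
    return {
        "shear_x": -0.05 * x,
        "shear_y": -0.05 * y,
        "scale": 1.0 / max(z, 0.1),
    }
```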
When a method is discussed in the present disclosure with steps carried out in a specific order, it is within the scope of the present disclosure that method steps may be carried out in any other order and/or in parallel, provided that the change of order is technically achievable.
For ease of representation and discussion, some of the elements of the present disclosure have been described as separate logical elements. This representation is however not intended to limit the physical implementation of these elements, and two or more elements may for example be implemented in the form of a single hardware and/or software element which is configured to provide the functionalities of the two or more elements. Likewise, a single element may be implemented using two or more hardware and/or software elements together providing the same functionalities as the single element (e.g. a controller may comprise a CPU, a plurality of CPUs or a plurality of CPU cores which are operable to carry out instructions, for example from a computer program). Generally, any suitable physical implementation corresponding to the logical elements discussed herein and providing the functions or features discussed herein is intended to be fully within the scope of the present disclosure.
Also, the representations of apparatuses provided herein are not intended to be exhaustive or limiting, and these apparatuses may include fewer or additional elements. The term "or" is intended to explicitly disclose both an exclusive and a non-exclusive choice, and the expression "and/or" is sometimes used to emphasise that both exclusive and non-exclusive options are considered.
As used in the present disclosure, the singular forms "a", "an" and "the" are intended to include plural references unless expressly and unequivocally limited to the singular form. In turn, the expression "one or more" is intended to encompass "one" and "a plurality of".
As used herein, the terms "include" and "comprise" are intended to be non-exhaustive inclusion, such that one or more items which are included or comprised in a list do not necessarily limit the list to these items only to the exclusion of other items. For example, only these items or these and additional items may be included in the list.
As used herein, the terms "based on [something]" are intended to mean "based at least on [something]" and while it expressly disclose using this "something" only, it is not intended to provide an exhaustive list and additional aspects or parameters may also be taken into account.

Claims (25)

1. A controller for adjusting a display on a Head Up Display "HUD", the controller being configured to: obtain, from a detector unit, head position information for a user of a HUD; estimate whether content displayed on the HUD is causing obstruction based on the obtained head position information; and generate, if it is estimated that content displayed on the HUD is causing obstruction, adjustment commands to adjust the display on the HUD to reduce the level of obstruction caused by the content displayed on the HUD.
2. A controller according to claim 1, wherein the head position information comprises head direction information and wherein the controller is configured to estimate whether content displayed on the HUD is causing obstruction based on head direction information only.
3. A controller according to any preceding claim, wherein the controller is further configured to: estimate whether content displayed on the HUD is likely to appear distorted based on the obtained head position information; and apply reverse-distortion processing to the content to be displayed on the HUD, wherein the reverse-distortion processing is based on the head position information.
4. A controller according to claim 3, wherein the head position information comprises head coordinates information and wherein the controller is configured to estimate whether content displayed on the HUD is likely to appear distorted based on head coordinates position information only.
5. A controller according to any preceding claim, wherein the controller is further configured to generate adjustment commands to adjust the display on the HUD by generating adjustment commands for adjusting one or more of: the saturation of a portion or all of the display; the hue of a portion or all of the display; the luminosity of a portion or all of the display; the contrast of a portion or all of the display; the degree of animation for some or all of the content to be displayed; the location of some or all of the content within the HUD display; and the location of the HUD display.
6. A controller according to any preceding claim, wherein the controller is further configured to generate adjustment commands to adjust the display on the HUD based on the type of content to be displayed.
7. A controller according to claim 6, wherein the controller is configured to generate adjustment commands for differently adjusting portions of the display which are for displaying content of different types.
8. A Head Up Display "HUD", the HUD comprising: a light emitting device for displaying content on the HUD; a detector unit configured to output head position information for a user of a HUD; and a controller according to any of claims 1 to 7, wherein the controller is configured to: receive the head position information output by the detector unit; and control the light emitting device based on the generated adjustment commands.
9. A vehicle comprising a Head Up Display "HUD" according to claim 8.
10. A method of adjusting a display on a Head Up Display "HUD", the method comprising: obtaining head position information for a user of a HUD; estimating whether content displayed on the HUD is causing obstruction based on the obtained head position information; and generating, if it is estimated that content displayed on the HUD is causing obstruction, adjustment commands to adjust the display on the HUD to reduce the level of obstruction caused by the content displayed on the HUD.
11. A method according to claim 10, further comprising adjusting the display of the HUD based on the generated adjustment commands.
12. A method according to claim 10 or 11, wherein the head position information comprises head direction information and wherein the method comprises estimating whether content displayed on the HUD is causing obstruction based on head direction information only.
13. A method according to any of claims 10 to 12, wherein the method comprises: estimating whether content displayed on the HUD is likely to appear distorted based on the obtained head position information; and applying reverse-distortion processing to the content to be displayed on the HUD, wherein the reverse-distortion processing is based on the head position information.
14. A method according to claim 13, wherein the head position information comprises head coordinates information and wherein the method comprises estimating whether content displayed on the HUD is likely to appear distorted based on head coordinates position information only.
15. A method according to any of claims 10 to 14, wherein the method comprises generating adjustment commands to adjust the display on the HUD by generating adjustment commands for adjusting one or more of: the saturation of a portion or all of the display; the hue of a portion or all of the display; the luminosity of a portion or all of the display; the contrast of a portion or all of the display; the degree of animation for some or all of the content to be displayed; the location of some or all of the content within the HUD display; and the location of the HUD display.
16. A method according to any of claims 10 to 15, wherein the method comprises generating adjustment commands to adjust the display on the HUD based on the type of content to be displayed.
17. A method according to claim 16, wherein the method comprises generating adjustment commands for differently adjusting portions of the display which are for displaying content of different types.
18. A controller for adjusting a display on a Head Up Display "HUD", the controller being configured to: obtain, from a detector unit, head position information for a user of a HUD; estimate whether content displayed on the HUD is likely to appear distorted based on the obtained head position information; and apply reverse-distortion processing to the content to be displayed on the HUD wherein the reverse-distortion processing is based on the head position information.
19. A controller according to claim 18, wherein the head position information comprises head coordinates information and wherein the controller is configured to estimate whether content displayed on the HUD is likely to appear distorted based on head coordinates position information only.
20. A controller according to claim 18 or 19, wherein the controller is further configured to: estimate whether content displayed on the HUD is causing obstruction based on the obtained head position information; and generate, if it is estimated that content displayed on the HUD is causing obstruction, adjustment commands to adjust the display on the HUD to reduce the level of obstruction caused by the content displayed on the HUD.
21. A controller according to claim 20, wherein the head position information comprises head direction information and wherein the controller is configured to estimate whether content displayed on the HUD is causing obstruction based on head direction information only.
22. A method of adjusting a display on a Head Up Display "HUD", the method comprising: obtaining, from a detector unit, head position information for a user of a HUD; estimating whether content displayed on the HUD is likely to appear distorted based on the obtained head position information; and applying reverse-distortion processing to the content to be displayed on the HUD wherein the reverse-distortion processing is based on the head position information.
23. A controller for adjusting a display on a Head Up Display substantially as hereinbefore described with reference to the accompanying drawings.
24. A Head Up Display substantially as hereinbefore described with reference to the accompanying drawings.
25. A method of adjusting a display on a Head Up Display substantially as hereinbefore described with reference to the accompanying drawings.
GB1505092.5A 2015-03-26 2015-03-26 Head up display adjustment Active GB2536882B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1505092.5A GB2536882B (en) 2015-03-26 2015-03-26 Head up display adjustment


Publications (3)

Publication Number Publication Date
GB201505092D0 GB201505092D0 (en) 2015-05-06
GB2536882A true GB2536882A (en) 2016-10-05
GB2536882B GB2536882B (en) 2021-12-22

Family

ID=53052420

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1505092.5A Active GB2536882B (en) 2015-03-26 2015-03-26 Head up display adjustment

Country Status (1)

Country Link
GB (1) GB2536882B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011155878A1 (en) * 2010-06-10 2011-12-15 Volvo Lastavagnar Ab A vehicle based display system and a method for operating the same
US20120050143A1 (en) * 2010-08-25 2012-03-01 Border John N Head-mounted display with environmental state detection
US20130076787A1 (en) * 2011-09-22 2013-03-28 GM Global Technology Operations LLC Dynamic information presentation on full windshield head-up display
US20130293452A1 (en) * 2012-05-02 2013-11-07 Flextronics Ap, Llc Configurable heads-up dash display

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016225082A1 (en) * 2016-12-15 2018-06-21 Bayerische Motoren Werke Aktiengesellschaft Method and device for operating data glasses
WO2019068537A1 (en) * 2017-10-06 2019-04-11 Bayerische Motoren Werke Aktiengesellschaft Display control system for a vehicle
IT201900018836A1 (en) * 2019-10-15 2021-04-15 Manitou Italia Srl Improved information presentation system.
EP3808588A1 (en) * 2019-10-15 2021-04-21 Manitou Italia S.r.l. Improved system for displaying information
RU2783717C2 (en) * 2019-10-15 2022-11-16 МАНИТОУ ИТАЛИА С.р.л. Improved information display system
EP4235261A3 (en) * 2019-10-15 2023-10-11 Manitou Italia S.r.l. Improved system for displaying information
