WO2018146048A1 - Apparatus and method for controlling a vehicle display - Google Patents

Apparatus and method for controlling a vehicle display

Info

Publication number
WO2018146048A1
WO2018146048A1 (PCT/EP2018/052812; EP2018052812W)
Authority
WO
WIPO (PCT)
Prior art keywords
display
head
eye
image
positional data
Application number
PCT/EP2018/052812
Other languages
French (fr)
Inventor
Sebastian Paszkowicz
Robert Hardy
Eduardo DIAS
George Alexander
Original Assignee
Jaguar Land Rover Limited
Application filed by Jaguar Land Rover Limited filed Critical Jaguar Land Rover Limited
Publication of WO2018146048A1 publication Critical patent/WO2018146048A1/en


Classifications

    • B60K35/00 Instruments specially adapted for vehicles; arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/23 Head-up displays [HUD]
    • B60K35/654 Instruments specially adapted for specific vehicle types or users, the user being the driver
    • B60K35/81 Arrangements for controlling instruments for controlling displays
    • B60K2360/149 Instrument input by detecting viewing direction not otherwise provided for
    • B60K2360/334 Projection means
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0123 Head-up displays comprising devices increasing the field of view
    • G02B2027/0138 Head-up displays comprising image capture systems, e.g. a camera
    • G02B2027/014 Head-up displays comprising information/image processing systems
    • G02B2027/0154 Head-up displays characterised by mechanical features with movable elements
    • G02B2027/0159 Head-up displays with mechanical means other than scanning means for positioning the whole image
    • G02B2027/0181 Adaptation to the pilot/driver
    • G02B2027/0187 Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head or eye
    • G06F3/013 Eye tracking input arrangements
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/038 Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry

Definitions

  • The present disclosure relates to an apparatus and method for controlling a vehicle display. In particular, but not exclusively, it relates to controlling the position of an eye-box of a head-up display in vehicles, such as road vehicles.
  • Aspects of the invention relate to an apparatus, a system, a vehicle, a method, a computer program and a non-transitory computer-readable medium.
  • Head-up displays are provided in some road vehicles, in which an image presented on a display is reflected in the windshield so that the image appears to the driver as a virtual object on the outside of the windshield.
  • A space within the vehicle from which the whole of the image presented by the head-up display may be viewed is referred to as the eye-box of the head-up display. If the user's eyes are not positioned within the eye-box, their view of the presented image may be restricted or non-existent.
  • Some vehicles have mechanisms to allow a user to manually adjust the position of the eye-box, so that people of varying heights may clearly view the presented image.
  • A problem with existing head-up displays is that if the driver moves their head so that their eyes are no longer within the eye-box, the image will appear cropped or may not be visible at all. It is an aim of the present invention to address disadvantages associated with the prior art.
  • An apparatus for controlling the position of an eye-box of a head-up display comprises a control means configured to: obtain positional data representative of a current position of an eye of a user; cause movement of a moveable element of the head-up display to adjust the position of the eye-box responsive to the position of the eye of the user; determine a transformation in dependence on the position of the eye-box; and apply the transformation to an image to be displayed by the head-up display.
  • This provides the advantage that the user is able to view the entire image presented by the head-up display even when they move their head with respect to the head-up display. Furthermore, this may be achieved without making the optical components of the head-up display impractically large.
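The control loop described above (obtain the eye position, move the optical element, derive and apply a transformation) can be sketched as follows. Everything here is illustrative: the function names, the linear eye-to-angle mapping and all calibration constants are assumptions for the sketch, not taken from the patent.

```python
from dataclasses import dataclass


@dataclass
class EyePosition:
    """2-D position of the user's eye in the camera image (pixels)."""
    x: float
    y: float


def mirror_angles_for_eye(eye: EyePosition) -> tuple[float, float]:
    """Map an eye position to mirror angles about the two axes.

    The gains and offsets are made-up calibration constants; a real
    system would determine them in a calibration procedure."""
    pitch = 10.0 + 0.01 * (eye.y - 240)  # degrees about the lateral axis
    yaw = 0.02 * (eye.x - 320)           # degrees about the second axis
    return pitch, yaw


def eyebox_position(pitch: float, yaw: float) -> tuple[float, float]:
    """Approximate eye-box centre (mm, in the cabin) implied by the
    mirror angles, again with illustrative constants."""
    return 50.0 * yaw, 100.0 + 25.0 * (pitch - 10.0)


def update_hud(eye: EyePosition) -> tuple[float, float]:
    """One iteration of the control loop: move the mirror for the
    current eye position and report the resulting eye-box centre,
    which a later step would use to select the image transformation."""
    pitch, yaw = mirror_angles_for_eye(eye)
    return eyebox_position(pitch, yaw)
```

A transformation would then be chosen for the returned eye-box position and applied to the image before display, as described in the following paragraphs.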
  • The transformation corrects a distortion caused by reflecting the image off a curved windshield of a vehicle.
  • The control means is configured to: obtain positional data representative of the current position of the eye of a user; and provide a positioning signal in dependence on the positional data for causing movement of the moveable element of the head-up display to adjust the position of the eye-box responsive to the position of the eye of the user.
  • The apparatus is configured to receive, from an imaging means, an image signal from which the positional data is obtainable.
  • An imaging means, such as one or more cameras, enables information relating to the position of the user's eyes to be obtained without requiring effort from the user.
  • The positional data is indicative of a current two-dimensional position of the eye of the user. This provides the advantage that the eye-box may be positioned in two dimensions, such as vertically and horizontally.
  • The control means is configured to: compare the obtained positional data with stored positional data to obtain a position value representative of a position of the moveable element of the head-up display associated with the obtained positional data; and provide a positioning signal in dependence on the position value.
  • The control means is configured to look up the positional data in a look-up table to obtain a position value representing a position of the moveable element of the head-up display, and the positioning signal is dependent upon the position value.
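A look-up from stored positional data to a moveable-element position value can be sketched as a nearest-neighbour search over a calibration table. The table entries below are hypothetical (camera-pixel eye positions mapped to, say, stepper-motor step counts); a real table would come from calibration.

```python
# Hypothetical calibration table: (eye_x, eye_y) in camera pixels
# -> stored mirror position value (e.g. a stepper-motor step count).
POSITION_LUT = {
    (320, 200): 120,
    (320, 240): 150,
    (320, 280): 180,
    (280, 240): 140,
    (360, 240): 160,
}


def lookup_position_value(eye_x: float, eye_y: float) -> int:
    """Return the stored mirror position whose calibration eye position
    is closest to the observed one (nearest-neighbour comparison of the
    obtained positional data with the stored positional data)."""
    key = min(
        POSITION_LUT,
        key=lambda k: (k[0] - eye_x) ** 2 + (k[1] - eye_y) ** 2,
    )
    return POSITION_LUT[key]
```

The positioning signal would then be generated in dependence on the returned position value. Interpolating between neighbouring entries, rather than snapping to the nearest one, would give smoother eye-box movement.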
  • The control means is configured to determine the transformation in dependence on the positional data. This provides the advantage that the varying distortions applied to the image by optical components of the head-up display at the various positions of the user's eyes may be compensated for.
  • The control means is configured to compare the obtained positional data and/or the position of the moveable element with stored positional data and/or stored positions of the moveable element to obtain the transformation for applying to image data forming the image to be displayed by the head-up display. This provides the advantage that the transformations to be applied may be pre-determined and stored, for example in a calibration procedure.
  • The control means is configured to look up the positional data and/or the position of the moveable element in a look-up table to obtain the transformation for applying to the image to be displayed by the head-up display.
  • The applied transformation is an approximation of an inverse of the image distortion that is caused by optical elements of the head-up display for the current position of the eye and/or position of the moveable element of the head-up display.
  • The optical elements may include a windscreen or windshield of a vehicle, onto which the image is displayed by the head-up display. This provides the advantage that the displayed image appears undistorted to the user regardless of their position within the vehicle.
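The idea of pre-warping the image with an approximate inverse of the optical distortion can be illustrated with a toy radial model. Both the model and the coefficient `k` are assumptions for illustration only; the patent's actual optics (a curved windshield with varying radii of curvature) would require a measured, position-dependent model.

```python
def distortion(x: float, y: float, k: float = 1e-6) -> tuple[float, float]:
    """Toy model of the distortion the optics impose on a displayed
    point: a simple radial stretch with made-up coefficient k."""
    r2 = x * x + y * y
    return x * (1 + k * r2), y * (1 + k * r2)


def approx_inverse(x: float, y: float, k: float = 1e-6) -> tuple[float, float]:
    """First-order approximate inverse of the distortion: pre-warping a
    point with this before display roughly cancels the optics, so the
    user sees the point near its intended location."""
    r2 = x * x + y * y
    return x / (1 + k * r2), y / (1 + k * r2)
```

Applying `approx_inverse` to every source pixel coordinate before rendering, then letting the optics apply `distortion`, returns each point close to where it started, which is exactly the "approximation of an inverse transformation" described above.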
  • The control means is configured to determine the positional data by analyzing image data to identify a representation of at least one eye of the user.
  • The control means comprises at least one electronic processor and at least one electronic memory device coupled to the electronic processor and having instructions stored therein.
  • A system comprises the apparatus of any one of the previous paragraphs and a head-up display comprising a moveable element, wherein the head-up display is arranged to: receive positional data from the control means; and adjust the position of the moveable element in dependence on the positional data.
  • The moveable element comprises a mirror. This provides the advantage that the eye-box position is adjusted by moving a component that may already be configured to be manually adjustable in existing systems.
  • The moveable element is arranged to direct light onto a windshield of a vehicle.
  • A vehicle comprises the system of any one of the above paragraphs and an imaging means that is positioned on or within the vehicle and which is configured to capture an image containing a representation of at least one eye of a user of the vehicle.
  • The vehicle is an automotive vehicle.
  • A method of controlling the position of an eye-box of a head-up display comprises: obtaining positional data representative of a current position of an eye of a user; causing movement of a moveable element of the head-up display to adjust the position of the eye-box responsive to the position of the eye of the user; determining a transformation in dependence on the position of the eye-box; and applying the transformation to an image to be displayed by the head-up display.
  • This provides the advantage that the user is able to view the entire image presented by the head-up display even when they move their head with respect to the head-up display. Furthermore, this may be achieved without making the optical components of the head-up display impractically large.
  • The transformation corrects a distortion caused by reflecting the image off a curved windshield of a vehicle.
  • The method comprises: obtaining positional data representative of the current position of the eye of a user; and providing a positioning signal in dependence on the positional data for causing movement of a moveable element of a head-up display to adjust the position of the eye-box responsive to the position of the eye of the user.
  • The obtaining of positional data comprises obtaining the positional data from a signal received from an imaging means. This provides the advantage that the imaging means, such as one or more cameras, enables information relating to the position of the user's eyes to be obtained without requiring effort from the user.
  • The positional data is indicative of a current two-dimensional position of the eye of the user. This provides the advantage that the eye-box may be positioned in two dimensions, such as vertically and horizontally.
  • The method comprises determining the transformation in dependence on the positional data.
  • The method comprises: comparing the obtained positional data with stored positional data to obtain a position value representative of a position of the moveable element of the head-up display associated with the obtained positional data; and providing the positioning signal in dependence on the position value.
  • The method comprises looking up the positional data in a look-up table to obtain a position value representing a position of the moveable element of the head-up display, and the positioning signal is dependent upon the position value.
  • The method comprises determining the transformation in dependence on the positional data. This provides the advantage that the varying distortions applied to the image by optical components of the head-up display at the various positions of the user's eyes may be compensated for. In some embodiments the method comprises comparing the obtained positional data and/or the position of the moveable element with stored positional data and/or stored positions of the moveable element to obtain the transformation for applying to image data forming the image to be displayed by the head-up display. This provides the advantage that the transformations to be applied may be pre-determined and stored, for example in a calibration procedure.
  • The method comprises looking up the positional data and/or the position of the moveable element in a look-up table to obtain the transformation for applying to the image to be displayed by the head-up display.
  • The transformation is an approximation to an inverse of the image distortion caused by optical elements of the head-up display for the current position of the eye of the user and/or position of the moveable element of the head-up display.
  • The optical elements may include a windscreen or windshield of the vehicle, onto which the image is displayed by the head-up display. This provides the advantage that the displayed image appears undistorted to the user.
  • The method comprises determining the positional data by analyzing image data to identify a representation of at least one eye of a user.
  • The moveable element comprises a mirror. This provides the advantage that the eye-box position is adjusted by moving a component that may already be configured to be manually adjustable in existing systems.
  • The method comprises generating the image at an image display device of the head-up display and reflecting the image at the moveable element and a windshield of a vehicle.
  • The method comprises: at an imaging means positioned within a vehicle, capturing an image containing a representation of at least one eye of a user of the vehicle; and determining the positional data by analyzing image data of the image to identify a region containing a representation of at least one eye of the user.
  • A computer program, when executed by a processor, causes the processor to perform the method of any one of the previous paragraphs.
  • A non-transitory computer-readable storage medium has instructions stored therein which, when executed on a processor, cause the processor to perform the method of any one of the previous paragraphs.
  • An apparatus for controlling the position of an eye-box of a head-up display comprises an electronic processor having an electrical input for receiving one or more signals, and an electronic memory device electrically coupled to the electronic processor and having instructions stored therein, wherein the processor is configured to access the memory device and execute the instructions stored therein such that it becomes configured to: obtain positional data representative of a current position of an eye of a user; and provide a positioning signal in dependence on the positional data for causing movement of a moveable element of a head-up display to adjust the position of the eye-box responsive to the position of the eye of the user.
  • An apparatus for controlling the position of an eye-box of a head-up display comprises a control means configured to: obtain positional data representative of a current position of an eye of a user; and cause movement of a moveable element of the head-up display to adjust the position of the eye-box responsive to the position of the eye of the user.
  • A method of controlling the position of an eye-box of a head-up display comprises: obtaining positional data representative of a current position of an eye of a user; and causing movement of a moveable element of the head-up display to adjust the position of the eye-box responsive to the position of the eye of the user.
  • Fig. 1 shows a schematic side view of a vehicle comprising a head-up display and an apparatus comprising a control means in accordance with an embodiment of the invention;
  • Fig. 2 shows a further schematic side view of the vehicle and user shown in Fig. 1;
  • Fig. 3 shows a schematic plan view of the vehicle shown in Fig. 1;
  • Fig. 4 shows a further schematic plan view of the vehicle shown in Fig. 1;
  • Fig. 5 shows an image captured by the imaging means and illustrates an example of how the positional data indicative of a position of the eyes of the user is determined in accordance with an embodiment of the invention;
  • Fig. 6 shows an example of a calibration image for use in an embodiment of the invention;
  • Fig. 7 shows an example of a detected image that has been captured by an imaging device forming part of an embodiment of the invention;
  • Fig. 8 shows a diagram illustrating functional blocks of a system comprising an imaging means, a control means and a head-up display in accordance with an embodiment of the invention;
  • Fig. 9 shows a schematic diagram of an apparatus comprising a control means in accordance with an embodiment of the invention;
  • Fig. 10 shows a flowchart illustrating a method of controlling the position of an eye-box of a head-up display in accordance with an embodiment of the invention;
  • Fig. 11 shows a flowchart of a method in accordance with an embodiment of the invention;
  • Fig. 12 shows a flowchart of a method in accordance with an embodiment of the invention;
  • Fig. 13 shows a flowchart illustrating a method of transforming images for display on a head-up display in accordance with an embodiment of the invention;
  • Fig. 14 shows a flowchart of a method in accordance with an embodiment of the invention;
  • Fig. 15 shows a diagram illustrating functional blocks of a system in accordance with an embodiment of the invention;
  • Fig. 16 shows a diagram illustrating functional blocks of a system in accordance with an embodiment of the invention.
  • The Figures illustrate an apparatus 101 for controlling the position of an eye-box 102 of a head-up display 103, the apparatus 101 comprising a control means 114 configured to: obtain positional data representative of a current position of at least one eye 104 of a user 105; and provide a positioning signal in dependence on the positional data for causing movement of a moveable element 108 of the head-up display 103 to adjust the position of the eye-box 102 responsive to the position of the at least one eye 104 of the user 105, thus maintaining the eye-box in the user's field of view regardless of head movement. This greatly improves the legibility of the information presented by the head-up display for the user.
  • The Figures also illustrate an apparatus 101 for transforming images for display on a head-up display 103, the apparatus 101 comprising a control means 114 configured to: obtain positional data representative of a current position of at least one eye 104 of a user 105; determine a transformation in dependence on the positional data; and output a transformation signal for applying the transformation to image data representative of an image, to generate transformed image data representative of a transformed image to be displayed on the head-up display 103.
  • A vehicle 106 including a system 120 comprising a head-up display 103 is shown in a schematic side view in Fig. 1.
  • The head-up display 103 comprises a display device 107, which may comprise a light emitting diode (LED) display, a liquid crystal display (LCD), an organic light emitting diode (OLED) display or another type of illuminated display, as is known in the art.
  • The head-up display 103 also comprises a moveable optical element 108 for directing the light emitted by the display device 107 onto a windshield 109, where it is reflected towards the eyes 104 of a user 105 of the vehicle 106 when seated in a front seat 110 of the vehicle 106.
  • The user 105 is the driver of the vehicle 106, and the moveable optical element 108 is arranged to direct the light emitted by the display device 107 onto the windshield 109, where it is reflected towards the eyes 104 of the user 105 of the vehicle 106 when seated in the user's seat 110. Consequently, the image displayed by the display device 107 is presented to the user 105 as a virtual object 113 that appears to be located on the outside of the windshield 109.
  • The moveable optical element 108 comprises a mirror which reflects the light from the display device 107 towards the windshield 109.
  • The mirror is a part of a mirror galvanometer, or the mirror is mounted on a motorized gimbal, to enable it to be reoriented.
  • The moveable optical element 108 is the only optical element on the path of the light from the display device 107 to the windshield 109, but it will be appreciated that other embodiments may have more than one optical element along this light path.
  • The head-up display 103 also comprises an actuation means 111 which is configured to enable adjustment of the orientation of the moveable optical element 108, so that the direction of the light leaving the moveable optical element 108 may be adjusted.
  • This enables the position of the eye-box 102 of the head-up display 103 to be adjusted so that the eyes 104 of the user 105 are positioned within the eye-box 102, providing the user 105 with a clear view of the image displayed by the head-up display 103.
  • The moveable optical element 108 is adjustable about a lateral axis (in a direction into the paper as viewed in Fig. 1) and about a second axis 112 substantially perpendicular to the lateral axis.
  • The actuation means 111 may comprise electric motors, or electric stepper motors, arranged to adjust the orientation of the moveable optical element 108 in dependence on signals received by the actuation means 111.
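Where the actuation means comprises stepper motors, the positioning signal ultimately reduces to a signed step count per axis. A minimal sketch, assuming a hypothetical angular resolution of 0.1 degrees per step (not a figure from the patent):

```python
STEP_DEG = 0.1  # assumed angular resolution of one motor step, in degrees


def steps_for_adjustment(current_deg: float, target_deg: float) -> int:
    """Signed number of stepper-motor steps needed to rotate the
    moveable optical element from its current orientation to the
    target orientation about one axis."""
    return round((target_deg - current_deg) / STEP_DEG)
```

One such conversion would run per axis (lateral axis and second axis 112), and the resulting step counts form the signal supplied to the actuation means.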
  • The head-up display 103 operates in dependence on signals provided by the control means 114, which provides the signals in dependence on signals it receives from an imaging means 115.
  • The imaging means 115 comprises one or more cameras located within the vehicle 106 which are configured to capture images of the face of a user 105 in order to obtain information defining a position of at least one of the user's eyes.
  • Image data representing images captured by the imaging means 115 are analyzed to identify a representation of at least one eye 104 of the user 105.
  • Positional data, which may comprise two-dimensional coordinates of the representation of the at least one eye 104 within the image, are thus determined.
  • This analysis may be performed by the control means 114 in dependence on receiving image data from the one or more cameras providing the imaging means 115, or the analysis may be performed by one or more processors located within the imaging means 115, with the positional data received by the control means 114 from a processor of the imaging means 115. In this latter case, the one or more processors of the imaging means 115 may be regarded as a part of an apparatus 101 comprising the control means 114.
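The step of analyzing image data to obtain two-dimensional eye coordinates can be sketched crudely as locating the centroid of dark pixels in a greyscale image. This is a deliberately simplistic stand-in: real eye-tracking pipelines use dedicated detectors (e.g. pupil or corneal-reflection tracking), and the threshold here is an arbitrary assumption.

```python
def eye_centroid(
    image: list[list[int]], threshold: int = 60
) -> tuple[float, float]:
    """Crude stand-in for an eye detector: return the centroid (x, y)
    of pixels darker than `threshold` in a greyscale image, used as the
    positional data for the at least one eye."""
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value < threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        raise ValueError("no candidate eye pixels found")
    return sum(xs) / len(xs), sum(ys) / len(ys)
```

The returned coordinates would then be compared with the stored positional data (for example via the look-up table described earlier) to derive the positioning signal.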
  • the control means 1 14 is configured to provide one or more signals to the head-up display 103 in dependence on the positional data obtained from the image data. In the present example, it provides two different output signals in dependence on positional data indicating a position of one or both eyes 104 of the user 105. The first of the two signals is supplied to the actuation means 1 1 1 to cause the actuation means 1 1 1 to rotate the moveable optical element 108, so that the position of the eye-box 102 of the head-up display 103 is adjusted relative to the position of the at least one eye 104 of the user 105.
  • a potential problem with this movement of the eye-box 102 is that a different region of the windshield 109 will be used to reflect the light towards the user 105, and the windshield 109 is curved, typically with radii of curvature of the windshield 109 differing from one point to another. Also, angles at which the light is reflected off the windshield 109 will be altered when the eye-box is repositioned. Consequently, the image displayed by the display device 107 may be distorted in varying ways before it reaches the user's eyes 104, depending upon the positioning of the eye-box 102.
  • the user 105 is shown in Fig. 1 seated in an upright position and the orientation of the moveable optical element 108 has been adjusted by the actuation means 111, so that the user's eyes 104 are positioned within the eye-box 102.
  • Light from the moveable optical element 108 is reflected off a first region 116 of the windshield 109 at first angles 117.
  • the vehicle 106 and user 105 of Fig. 1 are shown in Fig. 2, but with the user 105 in a more reclined position and therefore with a lower eye level within the vehicle 106 relative to the imaging means 115.
  • the orientation of the moveable optical element 108 has been adjusted by the actuation means 111, so that the user's eyes 104 are once again positioned within the eye-box 102.
  • Light from the moveable optical element 108 is reflected off a second region 201, lower down the windshield 109, and at second angles 202 to the windshield 109, which are larger than the first angles 117.
  • Figs. 3 and 4 additionally illustrate how the lateral position assumed by the user 105 also affects the region of the windshield 109 that is used to reflect the light from the display device 107 and the angles at which the light is reflected off the windshield 109.
  • a schematic plan view of the vehicle 106 and the head 301 of the user 105 is shown in Fig. 3 with the user's head 301 positioned towards the middle of the vehicle 106, and a similar schematic plan view is shown in Fig. 4 with the user's head 301 positioned nearer to the user's side door 302.
  • the moveable optical element 108 has been tilted about the axis 112 to direct light onto the windshield 109 and reflect it from a region 303 (shown hatched) towards the more central position taken up by the eyes 104 of the user 105.
  • the moveable optical element 108 has been tilted about the axis 112 to direct light onto the windshield 109 and reflect it from another region 304 (shown hatched) towards the position adjacent to the door 302, taken up by the eyes 104 of the user 105.
  • vehicle windshields are typically formed as glass or other transparent members having compound curves, with the screen being curved from top to bottom as well as from side to side.
  • varying regions of the windshield 109 are used to reflect light towards the user's eyes 104, and the light is reflected at varying angles from the windshield 109, in dependence on the positioning of the user's eyes 104 and the positioning of the moveable optical element 108. Consequently, the image displayed by the display device 107 is distorted in correspondingly various ways by the reflection in the windshield 109, depending on the position of the user's eyes 104 and the position of the moveable optical element 108.
  • the second of the two signals comprises a transformation signal for applying a transformation to image data that represents an image that is to be displayed on the display device 107 of the head-up display 103.
  • the resulting transformation to the image data causes a distortion of the image displayed by the display device 107 in an opposite sense to the distortion created by the optical components, including the windshield 109, i.e. the image to be presented by the display device 107 is transformed by a transformation that is the inverse of the transformation produced by the optical distortion. Consequently, the image observed by the user 105 appears to the user to be free of distortion.
  • the transformation signal provided by the control means 114 comprises a transformation that is determined by the control means 114 in dependence on the positional data obtained from the analysis of the image captured by the imaging means 115.
  • Fig. 5 illustrates an image captured by the imaging means 115.
  • the image is analysed to detect a number of features of the user's eyes 104 that define a border geometry (represented by rectangle 501) surrounding the user's eyes 104.
  • the analysis may identify the highest and lowest points of a user's eyes 104, the leftmost point of the left eye and the rightmost point of the right eye and define the border geometry as an upright rectangle having an edge passing through each of these points.
  • Positional data which may comprise 2-dimensional co-ordinates of the border geometry 501 within a field of view 502 are determined.
  • the 2-dimensional co-ordinates may be the coordinates of the centre of the rectangle.
  • This process of analysing the image to determine the positional data is typically performed by one or more processors within the imaging means 115.
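The border-geometry analysis described above (Fig. 5) can be sketched in a few lines of Python. The function name, the landmark-point inputs and the return shape are illustrative assumptions for this sketch, not part of the disclosed apparatus:

```python
def border_geometry(left_eye_pts, right_eye_pts):
    """Compute an upright bounding rectangle around both detected eyes.

    Each argument is a list of (x, y) pixel coordinates of detected eye
    features within the camera's field of view. Returns the rectangle
    (left, top, right, bottom) and its centre, the latter serving as the
    2-dimensional positional data.
    """
    pts = left_eye_pts + right_eye_pts
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    left, right = min(xs), max(xs)
    top, bottom = min(ys), max(ys)
    # Positional data: 2-D coordinates of the rectangle's centre.
    centre = ((left + right) / 2.0, (top + bottom) / 2.0)
    return (left, top, right, bottom), centre
```

For example, eyes whose detected extremal points span x = 100..260 and y = 195..202 yield a centre of (180.0, 198.5), which would then be compared against the calibrated eye-box positions.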
  • the positional data defines a 1-dimensional position and the automated repositioning of the eye-box 102 is only in 1 dimension, either along a vertical or horizontal axis with respect to the vehicle 106.
  • the position of the eye-box 102 of the head-up display 103 is adjusted by the control means 114 when the current positions of the eyes 104 of the user 105 are not aligned with the current position of the eye-box 102.
  • the control means 114 is arranged to determine 2-dimensional co-ordinates of a position (illustrated by an "X" 504) of one or both eyes 104 of the user 105.
  • a system (120 in Fig. 1) comprising the imaging means 115, the head-up display 103 and the control means 114 is calibrated for a finite number of different positions 503 of the eye-box 102. For example, central points 503 of the different positions of the eye-box 102 are illustrated in Fig. 5.
  • the control means 114 may be arranged to maintain the current position of the eye-box 102.
  • the control means 114 may be arranged to adjust the position of the eye-box 102 to the calibrated position nearest to the current position 504 of the eyes 104.
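The two behaviours just described, maintaining the current eye-box position while the eyes remain close to it and otherwise snapping to the nearest calibrated position, can be sketched as follows. The function signature and the distance threshold are assumptions made for illustration:

```python
import math


def select_eyebox_position(eye_pos, calibrated_centres, current_idx, threshold):
    """Pick the calibrated eye-box centre nearest the current eye position.

    Returns the index of the calibrated position to move to, or the current
    index if the eyes are still within `threshold` of it, so that the
    eye-box is not moved needlessly.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # Maintain the current position if the eyes remain close to its centre.
    if dist(eye_pos, calibrated_centres[current_idx]) <= threshold:
        return current_idx
    # Otherwise move to the calibrated position nearest the eyes.
    return min(range(len(calibrated_centres)),
               key=lambda i: dist(eye_pos, calibrated_centres[i]))
```

A small hysteresis threshold of this kind prevents the actuation means from chattering between two calibrated positions when the user's head sits near the boundary between them.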
  • the system (120 in Fig. 1) is calibrated to compensate for the optical distortion caused by the optical elements of the head-up display 103 for each of the finite number of calibrated positions with central points 503 of the eye-box 102. That is, for each calibrated position, the image displayed by the display device 107 may be distorted in a different way by the reflection in the windshield 109, depending on the position of the user's eyes 104 and the position of the moveable optical element 108. Thus, for each calibrated position, the control means 114 determines a corresponding transformation signal for applying a transformation to image data that represents an image to be displayed on the display device 107 of the head-up display 103.
  • a calibration image may be displayed on the display device 107 of the head-up display 103 and an imaging device such as a camera (not shown) is located at each of the central points 503 in turn, while the head-up display 103 is arranged to position the centre of the eye-box 102 at that point 503.
  • the imaging device is arranged to detect the image projected by the head-up display 103, and the detected image will typically be distorted.
  • An example of a calibration image 601 is shown in Fig. 6.
  • the calibration image 601 comprises a pattern defining a regular array of accurately positioned features 602.
  • the features 602 are circles arranged in a square array.
  • An example of a detected image 701 that has been captured by the imaging device during calibration is shown in Fig. 7.
  • a grid 702 of squares is also shown which illustrates the distortion produced by the optical elements of the head-up display 103.
  • the grid 702 has been chosen such that, in a non-distorted image, the centres of the circles 602 would coincide with the vertices 703 of the squares of the grid 702.
  • most of the centres of the circles 602 of the detected image 701 are separated from the corresponding vertex.
  • the centre of a first circle 602A is separated from a corresponding vertex 703A by a displacement vector 704A and the centre of a second circle 602B is separated from a corresponding vertex 703B by a displacement vector 704B.
  • the calibration process may therefore determine a displacement vector, such as vectors 704A and 704B, for each of the circles 602.
  • These displacement vectors represent a transformation caused by the optical components of the head-up display 103 to the original displayed image 601. Therefore an approximation to the transformation caused by the optical components of the head-up display 103 may be determined.
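The displacement-vector measurement of Fig. 7 reduces to a simple element-wise subtraction: for each calibration circle, the vector from the ideal grid vertex to the detected circle centre approximates the local distortion. The function below is a minimal sketch under the assumption that detected centres and grid vertices are supplied in matching order:

```python
def displacement_vectors(detected_centres, grid_vertices):
    """Per-feature distortion of the head-up display optics.

    `detected_centres` are the (x, y) centres of the circles in the
    captured calibration image; `grid_vertices` are the (x, y) vertices
    where those centres would lie in an undistorted image. Each returned
    vector corresponds to vectors 704A/704B in Fig. 7.
    """
    return [(cx - vx, cy - vy)
            for (cx, cy), (vx, vy) in zip(detected_centres, grid_vertices)]
```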
  • Determination of the image transformation, e.g. a set of head-up display-specific distortions
  • the image transformation caused by the optical system of the head-up display 103 may be determined by software, such as SPEOS or ZEMAX, which models the propagation of rays through the optical system. This software is used to simulate the level of distortion in the optical system from each one of a set of positions that the eyes of the user may assume during use.
  • an inverse transformation may be determined and stored for each of the calibrated positions (points 503 in Fig. 5) of the system (120 in Fig. 1).
  • a corresponding inverse transformation may be applied to the image to be displayed by the display device 107 of the head-up display 103, and because the transformation applied to the image to be displayed by the display device 107 approximates to the inverse of the transformation applied by the optical components of the head-up display 103, the image observed by the user 105 appears to be free of distortion.
  • the inverse transform, which is applied to the image that is to be displayed by the display device 107, is selected in dependence on a nearest-neighbour algorithm using the current eye position of the user 105.
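A minimal sketch of applying the inverse transform: shifting each displayed feature by the negated calibration displacement pre-distorts the image so that the optics' own distortion cancels it. Pre-distorting a list of feature positions rather than warping a full raster is a simplification made for this illustration:

```python
def predistort(points, displacements):
    """Apply the inverse of the measured distortion to display points.

    `points` are (x, y) positions of features in the image to be drawn;
    `displacements` are the distortion vectors measured at those points
    during calibration. Subtracting the displacement means that, after
    the optics add it back, the features land where intended.
    """
    return [(px - dx, py - dy)
            for (px, py), (dx, dy) in zip(points, displacements)]
```

In a full implementation the displacement field for the calibrated position nearest the current eye position would be selected first, then applied to every pixel (or drawing primitive) of the image to be displayed.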
  • a diagram illustrating functional blocks of an embodiment of the system 120 comprising the imaging means 115, the control means 114 and the head-up display 103 is shown in Fig. 8.
  • a picture generator 801 provides image data for display by the display device 107 of the head-up display 103.
  • the image data generated by the picture generator 801 is generated in dependence on one or more signals comprising information received from one or more other systems 803.
  • the one or more signals may be indicative of the current road speed received from another system, such as an antilock braking system (ABS) or speedometer, or indicative of a selected gear and received from a transmission control module (TCM).
  • the picture generator 801 generates image data representing a graphical image that illustrates the information in a format determined by graphical data stored in a memory device 802.
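The picture generator's role can be sketched as follows. The function name, the signal names and the element-dictionary format are purely illustrative assumptions; the patent does not prescribe a data format for the generated image data:

```python
def generate_picture_data(road_speed_kph=None, selected_gear=None):
    """Assemble a simple description of the graphic to be rendered,
    from signals such as road speed (e.g. received from the ABS or
    speedometer) and selected gear (e.g. received from the TCM).
    """
    elements = []
    if road_speed_kph is not None:
        elements.append({"type": "text", "value": f"{road_speed_kph:.0f} km/h"})
    if selected_gear is not None:
        elements.append({"type": "text", "value": f"Gear {selected_gear}"})
    return elements
```

In the system of Fig. 8 the output of this stage would then pass through the picture transformation means before reaching the display device; in the alternative system of Fig. 15 it would be displayed untransformed.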
  • An image analysing means 805 receives images captured by the imaging means 115 and analyses each of the images to generate the positional data representative of a current position of at least one eye 104 of a user 105.
  • a positioning determination means 806 receives the positional data and provides a positioning signal in dependence on the positional data for causing movement of the moveable element 108 of the head-up display 103 to adjust the position of the eye-box 102 of the head-up display 103 relative to the position of the eyes 104 of the user 105.
  • the positioning determination means 806 may compare the positional data with data defining the central points (503 in Fig. 5) of the calibrated positions to determine if the current position of the eyes 104 is within a threshold distance of the centre of the eye-box 102.
  • the positioning determination means 806 may provide an output signal to the actuation means 111 to cause it to move the moveable element 108 of the head-up display 103 to position the centre of the eye-box 102 at the central point (503 in Fig. 5) of the calibrated position nearest to the position of the eyes 104 of the user 105.
  • the positional data generated by the image analysing means 805 is also provided to a transformation determination means 807 configured to determine a transformation in dependence on the positional data and to output a transformation signal for applying the transformation to image data.
  • the transformation may be determined by retrieving transformation data stored in a memory device 808 in dependence on the received positional data.
  • the transformation data may have been produced in a calibration process as described above with regard to Figs. 6 and 7 and stored in a look-up table in the memory device 808.
  • the transformation determination means 807 may be configured to retrieve transformation data, corresponding to the positional data, from the look-up table.
  • a picture transformation means 809 is configured to receive the transformation signal from the transformation determination means 807 and image data from the picture generator 801 and apply the transformation to the image data to generate transformed image data representative of a transformed image to be displayed on the head-up display 103.
  • the picture transformation means 809 provides a signal to the display device 107 to cause it to display a transformed image.
  • the positioning determination means 806 may be configured to provide an output signal dependent on the positional data to continuously move the centre of the eye-box 102 to the position of the eyes 104.
  • the transformation determination means 807 may be configured to determine which one of the calibrated positions has a central point (503 in Fig. 5) nearest to the current eye position and output a transformation signal comprising transformation data corresponding to that calibrated position.
  • Apparatus 101 comprising the control means 114 is shown schematically in Fig. 9.
  • the control means 114 comprises one or more electronic processors 902 and one or more electronic memory devices 903.
  • a computer program 904 comprising instructions is stored in the memory device 903 and the one or more electronic processors 902 are configured to execute the instructions to implement at least the positioning determination means 806 and/or the transformation determination means 807 described above and shown in Fig. 8, and/or to perform any one of the methods described below with reference to Figs. 10 to 14.
  • the processors may be located within a single module or may be distributed over several different modules.
  • the image analysing means (805 of Fig. 8) may be performed by a processor 902 of the control means 114 that is located within a camera 115 configured to capture images of the eyes 104 of the user 105, while the positioning determination means 806 and/or the transformation determination means 807 shown in Fig. 8 may be located within a unit that includes the display device 107 of the head-up display 103.
  • one or more processors 902 of the control means 1 14 may be located within a unit that includes the display device 107 of the head-up display 103, and the one or more processors 902 may also be configured to perform the picture generation performed by the picture generator 801 and the processes performed by the picture transformation means 809 and the transformation determination means 807.
  • the apparatus 101 also comprises input/output means 905 for receiving and transmitting communications to other electronic devices.
  • the input/output means 905 may comprise one or more transceivers for communicating with other devices over data buses, such as a controller area network bus (CAN bus) of the vehicle 106.
  • the computer program 904 may be transferred to the memory device 903 via a non-transitory computer readable medium, such as a CD-ROM 906 or a portable memory device, or via a network, such as a wireless network.
  • A flowchart illustrating a method 1000 of controlling the position of an eye-box of a head-up display, performable by the control means 114, is shown in Fig. 10.
  • the method 1000 comprises, at block 1001 , obtaining positional data representing a current position of one or more eyes of a user.
  • This process may comprise receiving positional data from a processor that is configured to perform an analysis of an image captured by an imaging means.
  • the process at block 1001 may comprise the processes illustrated in the flowchart of Fig. 11.
  • the method 1000 may comprise, at block 1101 of process 1001, receiving from an imaging means an image signal from which positional data is obtainable, and, at block 1102, analysing image data contained within the image signal to identify a representation of at least one eye of the user.
  • the process 1001 comprises obtaining the positional data representative of a current position of an eye of a user from the received image signal.
  • the method 1000 also comprises, at block 1002, causing movement of a moveable element of a head-up display in dependence on the positional data to adjust the position of the eye-box of the head-up display relative to the current position of the one or more eyes of the user.
  • the method 1000 is typically performed repeatedly, each time using the most recently received positional data obtained from the most recently captured image. Thus, the method 1000 repeatedly provides positioning signals, or continuously provides a positioning signal, to adjust the position of the eye-box of the head-up display.
  • the process at block 1002 may comprise the processes illustrated in the flowchart of Fig. 12.
  • At block 1201, the obtained positional data is compared with stored positional data to obtain a position value representative of a position of the moveable element of the head-up display associated with the obtained positional data.
  • the process at block 1201 may comprise looking up the positional data in a stored look-up table to obtain the position value.
  • the process 1002 also comprises, at block 1202, providing the positioning signal in dependence on the position value.
  • a positioning signal may be provided to a head-up display to cause the position of the eye-box of the head-up display to be moved in dependence on the positional data.
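One iteration of method 1000, as decomposed into blocks 1101-1102 and 1201-1202 above, can be sketched as a single control step. All four callables are illustrative stand-ins for the imaging means, the image analysis, the stored look-up table and the actuation output; none of these names come from the patent:

```python
def control_eyebox_step(capture_image, analyse, position_lut,
                        send_positioning_signal):
    """One iteration of the eye-box positioning method.

    Obtains positional data from the latest captured image, looks up the
    corresponding position value for the moveable element, and emits a
    positioning signal based on that value.
    """
    image = capture_image()                          # block 1101
    positional_data = analyse(image)                 # block 1102
    position_value = position_lut[positional_data]   # block 1201
    send_positioning_signal(position_value)          # block 1202
    return position_value
```

Run in a loop, this step realises the repeated (or continuous) repositioning behaviour described above, always acting on the most recently captured image.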
  • a flowchart illustrating a method 1300 of transforming images for display on a head-up display, performable by the control means 114, is shown in Fig. 13.
  • At block 1301, positional data representative of a current position of at least one eye of a user is obtained.
  • the process at block 1301 of the method 1300 may be the same as the process performed at block 1001 of the method 1000, as described above.
  • At block 1302, the method 1300 determines a transformation in dependence on the positional data obtained at block 1301. This process may be as described above with reference to Figs. 6 and 7.
  • At block 1303, the method 1300 outputs a transformation signal for applying the transformation to image data representative of an image, in order to generate transformed image data representative of a transformed image to be displayed on a head-up display.
  • the method 1300 is typically performed repeatedly, each time using the most recently received positional data obtained from the most recently captured image. Thus, the method 1300 repeatedly provides an output signal that causes the head-up display to transform the image in dependence on the most recently determined positions of the eyes of the user.
  • the process at block 1302 of the method 1300 may comprise looking up the positional data and/or the position of the moveable element in a look-up table to obtain the transformation to be applied to the image data at block 1303.
  • A diagram illustrating functional components of an alternative system 120A is shown in Fig. 15. Components common to both the system 120 and the system 120A have been provided with the same reference signs.
  • the system 120A has an image analysing means 805 which receives images captured by the imaging means 115 and analyses each of the images to generate the positional data representative of a current position of at least one eye 104 of a user 105.
  • a positioning determination means 806 receives the positional data and provides a positioning signal in dependence on the positional data for causing movement of the moveable element 108 of the head-up display 103 to adjust the position of the eye-box 102 of the head-up display 103 relative to the position of the eyes 104 of the user 105.
  • the system 120A also has a picture generator 801 that provides image data for display by the display device 107 of the head-up display 103.
  • the image data generated by the picture generator 801 may be generated in dependence on one or more signals comprising information received from one or more other systems 803.
  • the picture generator 801 generates image data representing a graphical image that illustrates the information in a format determined by graphical data stored in a memory device 802.
  • the image data generated by the picture generator 801 is provided to the display device 107 of the head-up display 103 without being transformed beforehand.
  • the image observed by the user 105 may at times appear to be distorted depending on the position of the eyes of the user.
  • system 120A, like system 120, ensures that the user 105 is able to view the image provided by the head-up display 103, by adjusting the position of the eye-box 102 in dependence on the position of the user's eyes 104.
  • A diagram illustrating functional components of another alternative system 120B is shown in Fig. 16. Components common to both the system 120 and the system 120B have been provided with the same reference signs.
  • the system 120B has a picture generator 801 that provides image data for display by the display device 107 of the head-up display 103.
  • the image data generated by the picture generator 801 may be generated in dependence on one or more signals comprising information received from one or more other systems 803.
  • the picture generator 801 generates image data representing a graphical image that illustrates the information in a format determined by graphical data stored in a memory device 802.
  • the system 120B also includes an image analysing means 805 which receives images captured by the imaging means 115 and analyses each of the images to generate the positional data representative of a current position of at least one eye 104 of a user 105.
  • the positional data generated by the image analysing means 805 is provided to a transformation determination means 807 configured to determine a transformation in dependence on the positional data and to output a transformation signal for applying the transformation to image data.
  • the transformation may be determined by retrieving transformation data stored in a memory device 808 in dependence on the received positional data.
  • the transformation data may be previously produced and stored in a calibration process similar to that described above with reference to Figs. 5, 6 and 7. However, in this instance, the camera used for the calibration process is moved between calibration positions (similar to points 503 in Fig. 5), while the eye-box 102 of the head-up display 103 remains stationary, and the camera is caused to capture images (similar to image 701 in Fig. 7) of the calibration image (601 in Fig. 6). The transformation is then determined from the displacements (similar to vectors 704A and 704B in Fig. 7) measured in those captured images. This process may be performed for a number of different static positions of the head-up display 103.
  • a picture transformation means 809 is configured to receive the transformation signal from the transformation determination means 807 and image data from the picture generator 801 and apply the transformation to the image data to generate transformed image data representative of a transformed image to be displayed on the head-up display 103.
  • the picture transformation means 809 provides a signal to the display device 107 to cause it to display a transformed image.
  • the system 120B does not include a positioning determining means 806 for controlling the position of the moveable optical element 108 of the head-up display 103.
  • the picture transformation means 809 is still considered to be advantageous, particularly in a system having a head-up display with a relatively large eye-box 102, in which the user 105 can move their eye-position by substantial distances within the vehicle and still see the whole of the displayed image.
  • the apparent distortion produced by the optical components (and particularly the windshield 109) of the head-up display 103 is likely to vary depending upon the position of the eyes 104 of the user 105, even though the eye-box 102 remains stationary.
  • the system 120B is able to provide the user 105 with a substantially undistorted view of the image.
  • the controller(s) or control means described herein can each comprise a control unit or computational device having one or more electronic processors.
  • a vehicle and/or a system thereof may comprise a single control unit or electronic controller or alternatively different functions of the controller(s) may be embodied in, or hosted in, different control units or controllers.
  • a set of instructions could be provided which, when executed, cause said controller(s) or control unit(s) to implement the control techniques described herein (including the described method(s)).
  • the set of instructions may be embedded in one or more electronic processors, or alternatively, the set of instructions could be provided as software to be executed by one or more electronic processor(s).
  • a first controller may be implemented in software run on one or more electronic processors, and one or more other controllers may also be implemented in software run on one or more electronic processors, optionally the same one or more processors as the first controller. It will be appreciated, however, that other arrangements are also useful, and therefore, the present disclosure is not intended to be limited to any particular arrangement.
  • the set of instructions described above may be embedded in a computer-readable storage medium (e.g., a non-transitory storage medium) that may comprise any mechanism for storing information in a form readable by a machine or electronic processors/computational device, including, without limitation: a magnetic storage medium (e.g., floppy diskette); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; or electrical or other types of medium for storing such information/instructions.
  • the blocks illustrated in the Figs. 10 to 14 may represent steps in a method and/or sections of code in the computer program 904.
  • the illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks, and the order and arrangement of the blocks may be varied. Furthermore, it may be possible for some steps to be omitted.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An apparatus (101), a system, a vehicle (106), a method (1000), a computer program (904) or a non-transitory computer readable medium (906) for controlling the position of an eye-box (102) of a head-up display (103) are disclosed. The apparatus (101) comprises a control means (114) configured to: obtain positional data representative of a current position of an eye (104) of a user (105); and cause movement of a moveable element (108) of a head-up display (103) to adjust the position of the eye-box (102) of the head-up display (103) responsive to the position of the eye (104) of the user (105).

Description

APPARATUS AND METHOD FOR CONTROLLING A VEHICLE DISPLAY
TECHNICAL FIELD The present disclosure relates to an apparatus and method for controlling a vehicle display. In particular, but not exclusively, it relates to controlling the position of an eye-box of a head-up display in vehicles, such as road vehicles.
Aspects of the invention relate to an apparatus, a system, a vehicle, a method, a computer program and a non-transitory computer readable medium.
BACKGROUND
Head-up displays are provided in some road vehicles, in which an image presented on a display is reflected in the windshield so that the image appears to the driver as a virtual object on the outside of the windshield. A space within the vehicle from which the whole of the image presented by the head-up display may be viewed is referred to as the eye-box of the head-up display. If the user's eyes are not positioned within the eye-box, their view of the presented image may be restricted or non-existent. Some vehicles have mechanisms to allow a user to manually adjust the position of the eye-box, so that people of varying heights may clearly view the presented image. A problem with existing head-up displays is that if the driver moves their head so that their eyes are no longer within the eye-box, the image will appear to the driver to be cropped or may not be visible to the user. It is an aim of the present invention to address disadvantages associated with the prior art.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide an apparatus, a system, a vehicle, a method, a computer program and a non-transitory computer readable medium as claimed in the appended claims.
According to an aspect of the invention there is provided an apparatus for controlling the position of an eye-box of a head-up display, the apparatus comprising a control means configured to: obtain positional data representative of a current position of an eye of a user; cause movement of a moveable element of a head-up display to adjust the position of the eye-box of the head-up display responsive to the position of the eye of the user; determine a transformation in dependence on the position of the eye-box; and apply the transformation to an image to be displayed by the head-up display.
This provides the advantage that the user is able to view the entire image presented by the head-up display even when they move their head with respect to the head-up display. Furthermore, this may be achieved without making the optical components of the head-up display impractically large.
In some embodiments the transformation is to correct a distortion due to reflecting the image off a curved windshield of a vehicle. In some embodiments the control means is configured to: obtain positional data representative of the current position of the eye of a user; and provide a positioning signal in dependence on the positional data for causing movement of the moveable element of the head-up display to adjust the position of the eye-box of the head-up display responsive to the position of the eye of the user.
In some embodiments the apparatus is configured to receive from an imaging means an image signal from which the positional data is obtainable. This provides the advantage that the imaging means, such as one or more cameras, enable information relating to the position of the user's eyes to be obtained without requiring effort from the user.
In some embodiments the positional data is indicative of a current two-dimensional position of the eye of the user. This provides the advantage that the eye-box may be positioned in two dimensions, such as vertically and horizontally. In some embodiments the control means is configured to: compare the obtained positional data with stored positional data to obtain a position value representative of a position of the moveable element of the head-up display associated with the obtained positional data; and provide a positioning signal in dependence on the position value. In some embodiments the control means is configured to look up the positional data in a look-up table to obtain a position value representing a position of the moveable element of the head-up display, and the positioning signal is dependent upon the position value. In some embodiments the control means is configured to determine the transformation in dependence on the positional data. This provides the advantage that the varying distortions applied to the image by optical components of the head-up display at the various positions of the user's eyes may be compensated for. In some embodiments the control means is configured to compare the obtained positional data and/or the position of the moveable element with stored positional data and/or stored positions of the moveable element to obtain the transformation for applying to image data forming the image to be displayed by the head-up display. This provides the advantage that the transformations to be applied may be pre-determined and stored, for example in a calibration procedure.
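By way of illustration only, the look-up of a position value from stored positional data described above might be sketched as follows. The function name, the table contents and the units (camera-pixel eye positions mapped to actuator step counts) are assumptions made for the sketch, not details disclosed in this application.

```python
# Illustrative sketch of a look-up table mapping a measured eye position
# to a stored position value for the moveable element. All names and
# values are hypothetical.

def nearest_position_value(eye_xy, calibration_table):
    """Return the stored moveable-element position whose calibrated eye
    position is closest (nearest neighbour) to the observed eye position."""
    ex, ey = eye_xy
    best_key = min(
        calibration_table,
        key=lambda k: (k[0] - ex) ** 2 + (k[1] - ey) ** 2,
    )
    return calibration_table[best_key]

# Example table: calibrated eye positions (pixels) -> actuator step counts.
TABLE = {
    (320, 200): (1500, 800),
    (320, 260): (1500, 900),
    (400, 200): (1650, 800),
}
```

A dictionary keyed on calibrated positions keeps the sketch simple; a production system would more likely interpolate between calibration points.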
In some embodiments the control means is configured to look up the positional data and/or the position of the moveable element in a look up table to obtain the transformation for applying to the image to be displayed by the head-up display.
In some embodiments the applied transformation is an approximation of an inverse transformation of an image distortion that is caused by optical elements of the head-up display for the current position of the eye and/or position of the moveable element of the head-up display. Such optical elements may include a windscreen or windshield of a vehicle, onto which the image is displayed by the head-up display. This provides the advantage that the displayed image appears to be undistorted to the user regardless of their position within the vehicle.
In some embodiments the control means is configured to determine the positional data by analyzing image data to identify a representation of at least one eye of the user.
In some embodiments the control means comprises at least one electronic processor and at least one electronic memory device coupled to the electronic processor and having instructions stored therein. According to another aspect of the invention there is provided a system comprising the apparatus of any one of the previous paragraphs and a head-up display comprising a moveable element, wherein the head-up display is arranged to: receive positional data from the control means; and adjust the position of the moveable element in dependence on the positional data.
In some embodiments the moveable element comprises a mirror. This provides the advantage that the eye-box position is adjusted by moving a component that may already be configured to be manually adjustable in existing systems.
In some embodiments the moveable element is arranged to direct light onto a windshield of a vehicle. According to another aspect of the invention there is provided a vehicle comprising the system of any one of the above paragraphs and an imaging means that is positioned on or within the vehicle and which is configured to capture an image containing a representation of at least one eye of a user of the vehicle. In some embodiments the vehicle is an automotive vehicle.
According to a further aspect of the invention there is provided a method of controlling the position of an eye-box of a head-up display, the method comprising: obtaining positional data representative of a current position of an eye of a user; causing movement of a moveable element of a head-up display to adjust the position of the eye-box of the head-up display responsive to the position of the eye of the user; determining a transformation in dependence on the position of the eye-box; and applying the transformation to an image to be displayed by the head-up display. This provides the advantage that the user is able to view the entire image presented by the head-up display even when they move their head with respect to the head-up display. Furthermore this may be achieved without making the optical components of the head-up display impractically large. In some embodiments the transformation is to correct a distortion due to reflecting the image off a curved windshield of a vehicle.
In some embodiments the method comprises: obtaining positional data representative of the current position of the eye of a user; and providing a positioning signal in dependence on the positional data for causing movement of a moveable element of a head-up display to adjust the position of the eye-box of the head-up display responsive to the position of the eye of the user. In some embodiments the obtaining positional data comprises obtaining the positional data from a signal received from an imaging means. This provides the advantage that the imaging means, such as one or more cameras, enables information relating to the position of the user's eyes to be obtained without requiring effort from the user. In some embodiments the positional data is indicative of a current two-dimensional position of the eye of the user. This provides the advantage that the eye-box may be positioned in two dimensions, such as vertically and horizontally.
In some embodiments the method comprises: determining the transformation in dependence on the positional data.
In some embodiments the method comprises: comparing the obtained positional data with stored positional data to obtain a position value representative of a position of the moveable element of the head-up display associated with the obtained positional data; and providing the positioning signal in dependence on the position value.
In some embodiments the method comprises looking up the positional data in a look-up table to obtain a position value representing a position of the moveable element of the head-up display, and the positioning signal is dependent upon the position value.
In some embodiments the method comprises determining the transformation in dependence on the positional data. This provides the advantage that the varying distortions applied to the image by optical components of the head-up display at the various positions of the user's eyes may be compensated for. In some embodiments the method comprises comparing the obtained positional data and/or the position of the moveable element with stored positional data and/or stored positions of the moveable element to obtain the transformation for applying to image data forming the image to be displayed by the head-up display. This provides the advantage that the transformations to be applied may be pre-determined and stored, for example in a calibration procedure.
In some embodiments the method comprises looking up the positional data and/or the position of the moveable element in a look up table to obtain the transformation for applying to the image to be displayed by the head-up display.
In some embodiments the transformation is an approximation to an inverse transformation of an image distortion caused by optical elements of the head-up display for the current position of the eye of the user and/or position of the moveable element of the head-up display. Such optical elements may include a windscreen or windshield of the vehicle, onto which the image is displayed by the head-up display. This provides the advantage that the displayed image appears to be undistorted to the user. In some embodiments the method comprises determining the positional data by analyzing image data to identify a representation of at least one eye of a user.
In some embodiments the moveable element comprises a mirror. This provides the advantage that the eye-box position is adjusted by moving a component that may already be configured to be manually adjustable in existing systems.
In some embodiments the method comprises generating the image at an image display device of the head-up display and reflecting the image at the moveable element and a windshield of a vehicle.
In some embodiments the method comprises: at an imaging means positioned within a vehicle, capturing an image containing a representation of at least one eye of a user of the vehicle; and determining the positional data by analyzing image data of the image to identify a region containing a representation of at least one eye of the user. According to a further aspect of the invention there is provided a computer program which, when executed by a processor, causes the processor to perform the method of any one of the previous paragraphs.
According to yet another aspect of the invention there is provided a non-transitory computer-readable storage medium having instructions stored therein which, when executed on a processor, cause the processor to perform the method of any one of the previous paragraphs.
According to some, but not necessarily all examples there is provided an apparatus for controlling the position of an eye-box of a head-up display, the apparatus comprising an electronic processor having an electrical input for receiving one or more signals; and an electronic memory device electrically coupled to the electronic processor and having instructions stored therein, wherein the processor is configured to access the memory device and execute the instructions stored therein such that it becomes configured to: obtain positional data representative of a current position of an eye of a user; and provide a positioning signal in dependence on the positional data for causing movement of a moveable element of a head-up display to adjust the position of the eye-box of the head-up display responsive to the position of the eye of the user.
According to some, but not necessarily all examples there is provided an apparatus for controlling the position of an eye-box of a head-up display, the apparatus comprising a control means configured to: obtain positional data representative of a current position of an eye of a user; and cause movement of a moveable element of a head-up display to adjust the position of the eye-box of the head-up display responsive to the position of the eye of the user.
According to some, but not necessarily all examples there is provided a method of controlling the position of an eye-box of a head-up display, the method comprising: obtaining positional data representative of a current position of an eye of a user; and causing movement of a moveable element of a head-up display to adjust the position of the eye-box of the head-up display responsive to the position of the eye of the user. Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Fig. 1 shows a schematic side view of a vehicle comprising a head-up display and an apparatus comprising a control means in accordance with an embodiment of the invention;
Fig. 2 shows a further schematic side view of the vehicle and user shown in Fig. 1;
Fig. 3 shows a schematic plan view of the vehicle shown in Fig. 1 ;
Fig. 4 shows a further schematic plan view of the vehicle shown in Fig. 1 ;
Fig. 5 shows an image captured by the imaging means and illustrates an example of how the positional data indicative of a position of the eyes of the user are determined in accordance with an embodiment of the invention;
Fig. 6 shows an example of a calibration image for use in an embodiment of the invention;
Fig. 7 shows an example of a detected image that has been captured by an imaging device forming part of an embodiment of the invention;
Fig. 8 shows a diagram illustrating functional blocks of a system comprising an imaging means, a control means and a head-up display in accordance with an embodiment of the invention;
Fig. 9 shows a schematic diagram of an apparatus comprising a control means in accordance with an embodiment of the invention;
Fig. 10 shows a flowchart illustrating a method of controlling the position of an eye-box of a head-up display in accordance with an embodiment of the invention;
Fig. 11 shows a flowchart of a method in accordance with an embodiment of the invention;
Fig. 12 shows a flowchart of a method in accordance with an embodiment of the invention;
Fig. 13 shows a flowchart illustrating a method of transforming images for display on a head- up display in accordance with an embodiment of the invention;
Fig. 14 shows a flowchart of a method in accordance with an embodiment of the invention;
Fig. 15 shows a diagram illustrating functional blocks of a system in accordance with an embodiment of the invention; and
Fig. 16 shows a diagram illustrating functional blocks of a system in accordance with an embodiment of the invention.
DETAILED DESCRIPTION
The Figures illustrate an apparatus 101 for controlling the position of an eye-box 102 of a head-up display 103, the apparatus 101 comprising a control means 114 configured to: obtain positional data representative of a current position of at least one eye 104 of a user 105; and provide a positioning signal in dependence on the positional data for causing movement of a moveable element 108 of a head-up display 103 to adjust the position of the eye-box 102 of the head-up display 103 responsive to the position of the at least one eye 104 of the user 105, thus maintaining the eye-box 102 in the user's field of view regardless of head movement. This greatly improves the legibility of the information presented by the head-up display 103 for the user 105.
The Figures also illustrate an apparatus 101 for transforming images for display on a head-up display 103, the apparatus 101 comprising a control means 114 configured to: obtain positional data representative of a current position of at least one eye 104 of a user 105; determine a transformation in dependence on the positional data; and output a transformation signal for applying the transformation to image data representative of an image to generate transformed image data representative of a transformed image to be displayed on the head-up display 103.
A vehicle 106 including a system 120 comprising a head-up display 103 is shown in a schematic side view in Fig. 1. The head-up display 103 comprises a display device 107, which may comprise a light emitting diode (LED) display, a liquid crystal display (LCD), an organic light emitting diode (OLED) display or another type of illuminated display, as is known in the art. The head-up display 103 also comprises a moveable optical element 108 for directing the light emitted by the display device 107 onto a windshield 109 where it is reflected towards the eyes 104 of a user 105 of the vehicle 106 when seated in a front seat 110 of the vehicle 106. In the present embodiment, the user 105 is the driver 105 of the vehicle 106 and the moveable optical element 108 is arranged to direct the light emitted by the display device 107 onto the windshield 109 where it is reflected towards the eyes 104 of the user 105 of the vehicle 106 when seated in the user's seat 110. Consequently, the image displayed by the display device 107 is presented to the user 105 as a virtual object 113 that appears to be located on the outside of the windshield 109.
In the present embodiment, the moveable optical element 108 comprises a mirror which reflects the light from the display device 107 towards the windshield 109. Typically, the mirror is a part of a mirror galvanometer, or the mirror is mounted on a motorized gimbal, to enable it to be reoriented. In the present embodiment, the moveable optical element 108 is the only optical element on the path of the light from the display device 107 to the windshield 109, but it will be appreciated that other embodiments may have more than one optical element along this light path.
The head-up display 103 also comprises an actuation means 111 which is configured to enable adjustment of the orientation of the moveable optical element 108 so that the direction of the light leaving the moveable optical element 108 may be adjusted. This enables the position of the eye-box 102 of the head-up display 103 to be adjusted so that the eyes 104 of the user 105 are positioned within the eye-box 102 to provide the user 105 with a clear view of the image displayed by the head-up display 103. In the present embodiment, the moveable optical element 108 is adjustable about a lateral axis (in a direction into the paper as viewed in Fig. 1) and adjustable about a second axis 112 substantially perpendicular to the lateral axis. The actuation means 111 may comprise electric motors, or electric stepper motors, arranged to adjust the orientation of the moveable optical element 108 in dependence on signals received by the actuation means 111.
The head-up display 103 operates in dependence on signals provided by the control means 114, which provides the signals in dependence on signals it receives from an imaging means 115. The imaging means 115 comprises one or more cameras located within the vehicle 106 which are configured to capture images of the face of a user 105 in order to obtain information defining a position of at least one of the user's eyes. Image data representing images captured by the imaging means 115 are analyzed to identify a representation of at least one eye 104 of the user 105. Positional data, which may comprise 2-dimensional co-ordinates of the representation of the at least one eye 104 within the image, are thus determined. This analysis may be performed by the control means 114 in dependence on receiving image data from the one or more cameras providing the imaging means 115, or the analysis may be performed by one or more processors located within the imaging means 115, in which case the positional data is received by the control means 114 from a processor of the imaging means 115. In this latter case, the one or more processors of the imaging means 115 may be regarded as a part of an apparatus 101 comprising the control means 114.
The control means 114 is configured to provide one or more signals to the head-up display 103 in dependence on the positional data obtained from the image data. In the present example, it provides two different output signals in dependence on positional data indicating a position of one or both eyes 104 of the user 105. The first of the two signals is supplied to the actuation means 111 to cause the actuation means 111 to rotate the moveable optical element 108, so that the position of the eye-box 102 of the head-up display 103 is adjusted relative to the position of the at least one eye 104 of the user 105.
A potential problem with this movement of the eye-box 102 is that a different region of the windshield 109 will be used to reflect the light towards the user 105, and the windshield 109 is curved, typically with radii of curvature of the windshield 109 differing from one point to another. Also, the angles at which the light is reflected off the windshield 109 will be altered when the eye-box 102 is repositioned. Consequently, the image displayed by the display device 107 may be distorted in varying ways before it reaches the user's eyes 104, depending upon the positioning of the eye-box 102.
For example, the user 105 is shown in Fig. 1 seated in an upright position and the orientation of the moveable optical element 108 has been adjusted by the actuation means 111, so that the user's eyes 104 are positioned within the eye-box 102. Light from the moveable optical element 108 is reflected off a first region 116 of the windshield 109 at first angles 117.
The vehicle 106 and user 105 of Fig. 1 are shown in Fig. 2, but with the user 105 in a more reclined position and therefore with a lower eye level within the vehicle 106 relative to the imaging means 115. The orientation of the moveable optical element 108 has been adjusted by the actuation means 111, so that the user's eyes 104 are once again positioned within the eye-box 102. Light from the moveable optical element 108 is reflected off a second region 201, lower down the windshield 109, and at second angles 202 to the windshield 109, which are larger than the first angles 117. Figs. 3 and 4 additionally illustrate how the lateral position assumed by the user 105 also affects the region of the windshield 109 that is used to reflect the light from the display device 107 and the angles at which the light is reflected off the windshield 109. A schematic plan view of the vehicle 106 and the head 301 of the user 105 is shown in Fig. 3 with the user's head 301 positioned towards the middle of the vehicle 106, and a similar schematic plan view is shown in Fig. 4 with the user's head 301 positioned nearer to the user's side door 302.
In the example of Fig. 3 the moveable optical element 108 has been tilted about the axis 112 to direct light onto the windshield 109 and reflect it from a region 303 (shown hatched) towards the more central position taken up by the eyes 104 of the user 105. In the example of Fig. 4 the moveable optical element 108 has been tilted about the axis 112 to direct light onto the windshield 109 and reflect it from another region 304 (shown hatched) towards the position adjacent to the door 302, taken up by the eyes 104 of the user 105. It will be appreciated that vehicle windshields are typically formed as glass or other transparent members having compound curves, with the screen being curved from top to bottom as well as from side to side. Where the radius of curvature of the windshield varies across the vehicle, an image reflected from a display device onto that windshield towards the user will be affected accordingly, and this can be particularly noticeable to a user if the point at which that image is reflected is varied around the windshield in use.
Thus, varying regions of the windshield 109 are used to reflect light towards the user's eyes 104, and the light is reflected at varying angles from the windshield 109, in dependence on the positioning of the user's eyes 104 and the positioning of the moveable optical element 108. Consequently, the image displayed by the display device 107 is distorted in correspondingly various ways by the reflection in the windshield 109, depending on the position of the user's eyes 104 and the position of the moveable optical element 108.
In view of the varying optical distortion produced by the varying region of reflection on the windshield 109, the second of the two signals comprises a transformation signal for applying a transformation to image data that represents an image that is to be displayed on the display device 107 of the head-up display 103. The resulting transformation to the image data causes a distortion of the image displayed by the display device 107 in an opposite sense to the distortion created by the optical components, including the windshield 109, i.e. the image to be presented by the display device 107 is transformed by a transformation that is the inverse of the transformation produced by the optical distortion. Consequently, the image observed by the user 105 appears to the user to be free of distortion.
The transformation signal provided by the control means 114 comprises a transformation that is determined by the control means 114 in dependence on the positional data obtained from the analysis of the image captured by the imaging means 115.
An example of how the positional data indicative of a position of the eyes 104 of the user 105 are determined is illustrated in Fig. 5, which illustrates an image captured by the imaging means 115. Firstly, the image is analyzed to detect a number of features of the user's eyes 104 that define a border geometry (represented by rectangle 501) surrounding the user's eyes 104. For example, the analysis may identify the highest and lowest points of the user's eyes 104, the leftmost point of the left eye and the rightmost point of the right eye, and define the border geometry as an upright rectangle having an edge passing through each of these points. Positional data, which may comprise 2-dimensional co-ordinates of the border geometry 501 within a field of view 502, are determined. For example, where the border geometry 501 comprises a rectangle, the 2-dimensional co-ordinates may be the co-ordinates of the centre of the rectangle. This process of analyzing the image to determine the positional data is typically performed by one or more processors within the imaging means 115.
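The derivation of the border geometry 501 and its centre co-ordinates described above can be sketched as follows. The helper names and the example feature points are illustrative assumptions; the application does not specify a particular implementation.

```python
# Illustrative sketch: build the upright bounding rectangle (the border
# geometry) around detected eye feature points, then take its centre as
# the 2-dimensional positional data.

def border_geometry(points):
    """Upright bounding rectangle (left, top, right, bottom) enclosing
    the detected eye feature points, in image pixel co-ordinates."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def centre(rect):
    """Centre point of a (left, top, right, bottom) rectangle."""
    left, top, right, bottom = rect
    return ((left + right) / 2.0, (top + bottom) / 2.0)
```

In practice the feature points would come from an eye-detection stage; here they are assumed to be already available as pixel co-ordinates.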
In alternative embodiments, the positional data defines a 1-dimensional position and the automated repositioning of the eye-box 102 is in one dimension only, either along a vertical or a horizontal axis with respect to the vehicle 106.
The position of the eye-box 102 of the head-up display 103 is adjusted by the control means 114 when the current positions of the eyes 104 of the user 105 are not aligned with the current position of the eye-box 102. In the example of Fig. 5, the control means 114 is arranged to determine 2-dimensional co-ordinates of a position (illustrated by an "X" 504) of one or both eyes 104 of the user 105. A system (120 in Fig. 1) comprising the imaging means 115, the head-up display 103 and the control means 114 is calibrated for a finite number of different positions 503 of the eye-box 102. For example, central points 503 of the different positions of the eye-box 102 are illustrated in Fig. 5. When the current position 504 of the eyes 104 is within a threshold distance of the current central point 503A of the eye-box 102, the control means 114 may be arranged to maintain the current position of the eye-box 102. When the current position 504 of the eyes 104 is not within the threshold distance of the current central point 503A of the eye-box 102, the control means 114 may be arranged to adjust the position of the eye-box 102 to the calibrated position nearest to the current position 504 of the eyes 104.
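The threshold test and nearest-calibrated-position selection described in this paragraph might be sketched as follows. This is an illustrative sketch only: the function name, threshold value and co-ordinate units are assumptions, and `math.dist` requires Python 3.8 or later.

```python
import math

# Illustrative sketch of the eye-box update rule: hold position while the
# eyes remain within a threshold distance of the current eye-box centre,
# otherwise snap to the nearest calibrated centre (points 503 in Fig. 5).

def update_eye_box(eye_pos, current_centre, calibrated_centres, threshold):
    """Return the eye-box centre to use for the given eye position."""
    if math.dist(eye_pos, current_centre) <= threshold:
        return current_centre  # eyes still aligned; no movement needed
    return min(calibrated_centres, key=lambda c: math.dist(eye_pos, c))
```

Snapping to a finite set of calibrated centres (rather than tracking continuously) matches the calibration scheme described above, since an inverse transformation is stored per calibrated position.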
In the present embodiment, the system (120 in Fig. 1) is calibrated to compensate for the optical distortion caused by the optical elements of the head-up display 103 for each of the finite number of calibrated positions with central points 503 of the eye-box 102. That is, for each calibrated position, the image displayed by the display device 107 may be distorted in a different way by the reflection in the windshield 109, depending on the position of the user's eyes 104 and the position of the moveable optical element 108. Thus, for each calibrated position, the control means 114 determines a corresponding transformation signal for applying a transformation to image data that represents an image to be displayed on the display device 107 of the head-up display 103.
To calibrate the system 120 in this way, a calibration image may be displayed on the display device 107 of the head-up display 103 and an imaging device such as a camera (not shown) is located at each of the central points 503 in turn, while the head-up display 103 is arranged to position the centre of the eye-box 102 at that point 503. The imaging device is arranged to detect the image projected by the head-up display 103, and the detected image will typically be distorted. An example of a calibration image 601 is shown in Fig. 6. The calibration image 601 comprises a pattern defining a regular array of accurately positioned features 602. In the present example the features 602 are circles 602 arranged in a square array.
An example of a detected image 701 that has been captured by the imaging device during calibration is shown in Fig. 7. A grid 702 of squares is also shown which illustrates the distortion produced by the optical elements of the head-up display 103. The grid 702 has been chosen such that, in a non-distorted image, the centres of the circles 602 would coincide with the vertices 703 of the squares of the grid 702. However, due to the distortion produced by the optical components of the head-up display 103, most of the centres of the circles 602 of the detected image 701 are separated from the corresponding vertex. For example, the centre of a first circle 602A is separated from a corresponding vertex 703A by a displacement vector 704A and the centre of a second circle 602B is separated from a corresponding vertex 703B by a displacement vector 704B. The calibration process may therefore determine a displacement vector, such as vectors 704A and 704B, for each of the circles 602. These displacement vectors represent a transformation caused by the optical components of the head-up display 103 to the original displayed image 601. Therefore, an approximation to the transformation caused by the optical components of the head-up display 103 may be determined. Determination of the image transformation (e.g. a set of head-up display specific distortions) may be achieved by capturing the image (e.g. on a camera) at the eye-box corners and edges. Sampling between the centre of the eye-box out towards the corners/edges may be used to understand the relationship of the distortion. The positions of the circles in the image may be detected through a circle detection algorithm and used to compute the transform. The inverse transform can then be computed through a common mathematics process from each sample.
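The displacement-vector sampling and a first-order inverse approximation might be sketched as follows. Note the hedge: treating the negated displacement as the inverse transform is a simplifying assumption made for illustration; it is one crude instance of, not a statement of, the "common mathematics process" referred to in the application.

```python
# Illustrative sketch of the calibration step of Fig. 7: measure how far
# each detected calibration circle (602) landed from its ideal grid
# vertex (703), then invert the distortion to first order by shifting
# points back by the negated displacement.

def displacement_vectors(detected_centres, grid_vertices):
    """Displacement (704) of each detected circle centre from its ideal
    grid vertex; these vectors sample the optical distortion."""
    return [(dx - gx, dy - gy)
            for (dx, dy), (gx, gy) in zip(detected_centres, grid_vertices)]

def inverse_warp(points, vectors):
    """First-order inverse: shift each image point by the negated
    displacement sampled at that point."""
    return [(px - vx, py - vy)
            for (px, py), (vx, vy) in zip(points, vectors)]
```

A fuller implementation would interpolate the sampled vectors across the whole image (e.g. a dense remap) rather than correcting only the sample points themselves.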
As an alternative to projecting a calibration image and capturing the resulting image, the image transformation caused by the optical system of the head-up display 103 may be determined by software, such as SPEOS or ZEMAX, which models the propagation of rays through the optical system. This software is used to simulate the level of distortion in the optical system from each one of a set of positions that the eyes of the user may assume during use.
In each of these ways an inverse transformation may be determined and stored for each of the calibrated positions (points 503 in Fig. 5) of the system (120 in Fig. 1).
Thus, during use, when the head-up display 103 is arranged to position its eye-box 102 at, or adjacent to, any one of the calibrated positions having a central point 503, a corresponding inverse transformation may be applied to the image to be displayed by the display device 107 of the head-up display 103, and because the transformation applied to the image to be displayed by the display device 107 approximates to the inverse of the transformation applied by the optical components of the head-up display 103, the image observed by the user 105 appears to be free of distortion. The inverse transform, which is applied to the image that is to be displayed by the display device 107, is selected in dependence on a nearest neighbour algorithm using the current eye position of the user 105.
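The nearest-neighbour selection of a stored inverse transform mentioned above might be sketched as follows; the function name and data layout are illustrative assumptions.

```python
# Illustrative sketch: given the tracked eye position, pick the calibrated
# eye-box centre nearest to it and return the pre-computed inverse
# transform stored for that position.

def select_inverse_transform(eye_pos, calibrated):
    """`calibrated` maps eye-box centre co-ordinates to an inverse
    transform callable; select by nearest neighbour to `eye_pos`."""
    nearest = min(
        calibrated,
        key=lambda c: (c[0] - eye_pos[0]) ** 2 + (c[1] - eye_pos[1]) ** 2,
    )
    return calibrated[nearest]
```

Storing the transforms as callables keyed by calibrated centre keeps the per-frame work to one nearest-neighbour search plus one image warp.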
A diagram illustrating functional blocks of an embodiment of the system 120 comprising the imaging means 115, the control means 114 and the head-up display 103 is shown in Fig. 8. A picture generator 801 provides image data for display by the display device 107 of the head-up display 103. In the present example, the image data generated by the picture generator 801 is generated in dependence on one or more signals comprising information received from one or more other systems 803. For example, the one or more signals may be indicative of the current road speed received from another system, such as an antilock braking system (ABS) or speedometer, or indicative of a selected gear and received from a transmission control module (TCM). The picture generator 801 generates image data representing a graphical image that illustrates the information in a format determined by graphical data stored in a memory device 802.
An image analysing means 805 receives images captured by the imaging means 115 and analyses each of the images to generate the positional data representative of a current position of at least one eye 104 of a user 105.
In the present example, a positioning determination means 806 receives the positional data and provides a positioning signal in dependence on the positional data for causing movement of the moveable element 108 of the head-up display 103 to adjust the position of the eye-box 102 of the head-up display 103 relative to the position of the eyes 104 of the user 105. The positioning determination means 806 may compare the positional data with data defining the central points (503 in Fig. 5) of the calibrated positions to determine if the current position of the eyes 104 is within a threshold distance of the centre of the eye-box 102. If it is not, then the positioning determination means 806 may provide an output signal to the actuation means 111 to cause it to move the moveable element 108 of the head-up display 103 to position the centre of the eye-box 102 at the central point (503 in Fig. 5) of the calibrated position nearest to the position of the eyes 104 of the user 105.
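The threshold test and repositioning decision performed by the positioning determination means 806 may be sketched as follows; the data layout and the default threshold value are illustrative assumptions, not taken from the source:

```python
import numpy as np

def positioning_signal(eye_position, eyebox_centre, calibrated_centres,
                       threshold=5.0):
    """Return None when the eyes are within the threshold distance of the
    eye-box centre (no movement needed); otherwise return the index of the
    calibrated central point (503) nearest to the eyes, to which the
    actuation means (111) should move the eye-box."""
    p = np.asarray(eye_position, dtype=float)
    if np.linalg.norm(p - np.asarray(eyebox_centre, dtype=float)) <= threshold:
        return None  # eyes already near the eye-box centre
    centres = np.asarray(calibrated_centres, dtype=float)
    return int(np.argmin(np.linalg.norm(centres - p, axis=1)))
```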
In the present embodiment, the positional data generated by the image analysing means 805 is also provided to a transformation determination means 807 configured to determine a transformation in dependence on the positional data and to output a transformation signal for applying the transformation to image data. The transformation may be determined by retrieving transformation data stored in a memory device 808 in dependence on the received positional data. For example, the transformation data may have been produced in a calibration process as described above with regard to Figs. 6 and 7 and stored in a look-up table in the memory device 808. Thus, the transformation determination means 807 may be configured to retrieve transformation data, corresponding to the positional data, from the look-up table. A picture transformation means 809 is configured to receive the transformation signal from the transformation determination means 807 and image data from the picture generator 801 and apply the transformation to the image data to generate transformed image data representative of a transformed image to be displayed on the head-up display 103. Thus, the picture transformation means 809 provides a signal to the display device 107 to cause it to display a transformed image.
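Applying a retrieved transformation to the image data may, for example, take the form of a backward warp through precomputed coordinate maps. The sketch below is an assumption about how such a warp could be realised; it uses nearest-neighbour sampling to stay dependency-free, whereas a production implementation would typically interpolate:

```python
import numpy as np

def apply_warp(image, map_y, map_x):
    """Backward-map an image through a stored warp: output pixel (r, c) is
    sampled from the input at (map_y[r, c], map_x[r, c]). Coordinates are
    rounded to the nearest pixel and clipped to the image bounds."""
    h, w = image.shape[:2]
    ys = np.clip(np.rint(map_y), 0, h - 1).astype(int)
    xs = np.clip(np.rint(map_x), 0, w - 1).astype(int)
    return image[ys, xs]
```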
In an alternative embodiment, the positioning determination means 806 may be configured to provide an output signal dependent on the positional data to continuously keep moving the centre of the eye-box 102 to the position of the eyes 104. In this case, the transformation determination means 807 may be configured to determine which one of the calibrated positions has a central point (503 in Fig. 5) nearest to the current eye position and output a transformation signal comprising transformation data corresponding to that calibrated position.
Apparatus 101 comprising the control means 114 is shown schematically in Fig. 9. The control means 114 comprises one or more electronic processors 902 and one or more electronic memory devices 903. A computer program 904 comprising instructions is stored in the memory device 903 and the one or more electronic processors 902 are configured to execute the instructions and perform at least the positioning determination means 806 and/or the transformation determination means 807 described above and shown in Fig. 8 and/or any one of the methods described below with reference to Figs. 10 to 14.
In embodiments in which the control means 114 comprises several processors, the processors may be located within a single module or may be distributed over several different modules. For example, the image analysing means (805 of Fig. 8) may be performed by a processor 902 of the control means 114 that is located within a camera 115 configured to capture images of the eyes 104 of the user 105, while the positioning determination means 806 and/or the transformation determination means 807 shown in Fig. 8 may be located within a unit that includes the display device 107 of the head-up display 103. For example, one or more processors 902 of the control means 114 may be located within a unit that includes the display device 107 of the head-up display 103, and the one or more processors 902 may also be configured to perform the picture generation performed by the picture generator 801 and the processes performed by the picture transformation means 809 and the transformation determination means 807.
In the illustrated embodiment, the apparatus 101 also comprises input/output means 905 for receiving and transmitting communications to other electronic devices. The input/output means 905 may comprise one or more transceivers for communicating with other devices over data buses, such as a controller area network bus (CAN bus) of the vehicle 106.
The computer program 904 may be transferred to the memory device 903 via a non-transitory computer readable medium, such as a CD-ROM 906 or a portable memory device, or via a network, such as a wireless network.
A flowchart illustrating a method 1000 of controlling the position of an eye-box of a head-up display, performable by the control means 114, is shown in Fig. 10. The method 1000 comprises, at block 1001, obtaining positional data representing a current position of one or more eyes of a user. This process may comprise receiving positional data from a processor that is configured to perform an analysis of an image captured by an imaging means.
Alternatively, the process at block 1001 may comprise the processes illustrated in the flowchart of Fig. 11. Thus the method 1000 may comprise, at block 1101 of process 1001, receiving from an imaging means an image signal from which positional data is obtainable, and, at block 1102, analysing image data contained within the image signal to identify a representation of at least one eye of the user. At block 1103, the process 1001 comprises obtaining the positional data representative of a current position of an eye of a user from the received image signal.
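As a highly simplified stand-in for the image analysis at blocks 1102 and 1103, an eye position could be estimated as the centroid of dark, pupil-like pixels in a greyscale frame. Real systems would use a trained eye or face detector, so the following is illustrative only:

```python
import numpy as np

def eye_position_from_image(grey_image, threshold=50):
    """Estimate a (row, col) eye position as the centroid of dark,
    pupil-like pixels in a greyscale frame; returns None when no pixel
    falls below the threshold (no eye candidate in this frame)."""
    dark = np.asarray(grey_image) < threshold
    if not dark.any():
        return None
    rows, cols = np.nonzero(dark)
    return float(rows.mean()), float(cols.mean())
```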
Returning to Fig. 10, the method 1000 also comprises, at block 1002, causing movement of a moveable element of a head-up display in dependence on the positional data to adjust the position of the eye-box of the head-up display relative to the current position of the one or more eyes of the user.
The method 1000 is typically performed repeatedly, each time using the most recently received positional data obtained from the most recently captured image. Thus, the method 1000 repeatedly provides positioning signals, or continuously provides a positioning signal, to adjust the position of the eye-box of the head-up display.
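This repeated execution can be sketched as a simple control loop; the injected callables (capture_frame, analyse, position) are hypothetical placeholders for the imaging means, the image analysing means and the positioning determination means:

```python
def run_eyebox_control(capture_frame, analyse, position, steps):
    """Run the repeated method of Fig. 10: capture the most recent frame,
    derive positional data from it, and emit a positioning signal whenever
    an eye position was found. The three callables stand in for the imaging
    means, image analysing means and positioning determination means."""
    signals = []
    for _ in range(steps):
        frame = capture_frame()      # latest captured image
        eye_pos = analyse(frame)     # positional data, or None if no eye seen
        if eye_pos is not None:
            signals.append(position(eye_pos))  # adjust the eye-box
    return signals
```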
The process at block 1002 may comprise the processes illustrated in the flowchart of Fig. 12. At block 1201, the obtained positional data is compared with stored positional data to obtain a position value representative of a position of the moveable element of the head-up display associated with the obtained positional data. The process at block 1201 may comprise looking up the positional data in a stored look-up table to obtain the position value. The process 1002 also comprises, at block 1202, providing the positioning signal in dependence on the position value. Thus, a positioning signal may be provided to a head-up display to cause the position of the eye-box of the head-up display to be moved in dependence on the positional data. A flowchart illustrating a method 1300 of transforming images for display on a head-up display, performable by the control means 114, is shown in Fig. 13.
At block 1301 of the method 1300, positional data representative of a current position of at least one eye of a user is obtained. The process at block 1301 of the method 1300 may be the same as the process performed at block 1001 of the method 1000, as described above. At block 1302, the method 1300 determines a transformation in dependence on the positional data obtained at block 1301. This process may be as described above with reference to Figs. 6 and 7. At block 1303, the method 1300 outputs a transformation signal for applying the transformation to image data representative of an image, in order to generate transformed image data representative of a transformed image to be displayed on a head-up display.
The method 1300 is typically performed repeatedly, each time using the most recently received positional data obtained from the most recently captured image. Thus, the method 1300 repeatedly provides an output signal that causes the head-up display to transform the image in dependence on the most recently determined positions of the eyes of the user.
Examples of the process at block 1302 of the method 1300 are illustrated in the process block 1401 of Fig. 14. Thus, the process at 1302 may comprise looking up the positional data and/or the position of the moveable element in a look-up table to obtain the transformation to be applied to the image data at block 1303.
A diagram illustrating functional components of an alternative system 120A is shown in Fig. 15. Components common to both the system 120 and system 120A have been provided with the same reference signs. Like the system 120, the system 120A has an image analysing means 805 which receives images captured by the imaging means 115 and analyses each of the images to generate the positional data representative of a current position of at least one eye 104 of a user 105.
A positioning determination means 806 receives the positional data and provides a positioning signal in dependence on the positional data for causing movement of the moveable element 108 of the head-up display 103 to adjust the position of the eye-box 102 of the head-up display 103 relative to the position of the eyes 104 of the user 105.
The system 120A also has a picture generator 801 that provides image data for display by the display device 107 of the head-up display 103. The image data generated by the picture generator 801 may be generated in dependence on one or more signals comprising information received from one or more other systems 803. The picture generator 801 generates image data representing a graphical image that illustrates the information in a format determined by graphical data stored in a memory device 802.
Unlike in the system 120, the image data generated by the picture generator 801 is provided to the display device 107 of the head-up display 103 without being transformed beforehand. Thus, the image observed by the user 105 may at times appear to be distorted depending on the position of the eyes of the user.
However, the system 120A, like system 120, ensures that the user 105 is able to view the image provided by the head-up display 103, by adjusting the position of the eye-box 102 in dependence on the position of the user's eyes 104.
A diagram illustrating functional components of another alternative system 120B is shown in Fig. 16. Components common to both the system 120 and system 120B have been provided with the same reference signs. Like the system 120, the system 120B has a picture generator 801 that provides image data for display by the display device 107 of the head-up display 103. The image data generated by the picture generator 801 may be generated in dependence on one or more signals comprising information received from one or more other systems 803. The picture generator 801 generates image data representing a graphical image that illustrates the information in a format determined by graphical data stored in a memory device 802.
The system 120B also includes an image analysing means 805 which receives images captured by the imaging means 115 and analyses each of the images to generate the positional data representative of a current position of at least one eye 104 of a user 105.
The positional data generated by the image analysing means 805 is provided to a transformation determination means 807 configured to determine a transformation in dependence on the positional data and to output a transformation signal for applying the transformation to image data. The transformation may be determined by retrieving transformation data stored in a memory device 808 in dependence on the received positional data.
The transformation data may be previously produced and stored in a calibration process similar to that described above with reference to Figs. 5, 6 and 7. However, in this instance, the camera used for the calibration process is moved between calibration positions (similar to points 503 in Fig. 5), while the eye-box 102 of the head-up display 103 remains stationary, and the camera is caused to capture images (similar to image 701 in Fig. 7) of the calibration image (601 in Fig. 6). The transformation is then determined from the displacements (similar to vectors 704A and 704B in Fig. 7) measured in those captured images. This process may be performed for a number of different static positions of the head-up display 103.
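The alternative calibration procedure, in which the camera is stepped between positions while the eye-box remains stationary, can be sketched as a driver loop; the rig-control callables (move_camera, capture, compute_transform) are hypothetical stand-ins for equipment-specific operations:

```python
def build_calibration_table(camera_positions, move_camera, capture,
                            compute_transform):
    """Step a camera through the calibration positions while the eye-box
    stays fixed, capture an image of the calibration pattern (like 601)
    at each position, and store the measured transform keyed by position."""
    table = {}
    for pos in camera_positions:
        move_camera(pos)                     # move rig to the next position
        table[pos] = compute_transform(capture())
    return table
```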
Returning to Fig. 16, a picture transformation means 809 is configured to receive the transformation signal from the transformation determination means 807 and image data from the picture generator 801 and apply the transformation to the image data to generate transformed image data representative of a transformed image to be displayed on the head-up display 103. Thus, the picture transformation means 809 provides a signal to the display device 107 to cause it to display a transformed image. Unlike the system 120, the system 120B does not include a positioning determining means 806 for controlling the position of the moveable optical element 108 of the head-up display 103. However, the picture transformation means 809 is still considered to be advantageous, particularly in a system having a head-up display with a relatively large eye-box 102, in which the user 105 can move their eye-position by substantial distances within the vehicle and still see the whole of the displayed image. As the user 105 moves their head (up and down and/or left and right) the apparent distortion produced by the optical components (and particularly the windshield 109) of the head-up display 103 is likely to vary depending upon the position of the eyes 104 of the user 105, even though the eye-box 102 remains stationary. However, by transforming the image data, using an approximation to the inverse of the transformation produced by the optical components of the head-up display 103, the system 120B is able to provide the user 105 with a substantially undistorted view of the image. For purposes of this disclosure, it is to be understood that the controller(s) or control means described herein can each comprise a control unit or computational device having one or more electronic processors.
A vehicle and/or a system thereof may comprise a single control unit or electronic controller or alternatively different functions of the controller(s) may be embodied in, or hosted in, different control units or controllers. A set of instructions could be provided which, when executed, cause said controller(s) or control unit(s) to implement the control techniques described herein (including the described method(s)). The set of instructions may be embedded in one or more electronic processors, or alternatively, the set of instructions could be provided as software to be executed by one or more electronic processor(s). For example, a first controller may be implemented in software run on one or more electronic processors, and one or more other controllers may also be implemented in software run on one or more electronic processors, optionally the same one or more processors as the first controller. It will be appreciated, however, that other arrangements are also useful, and therefore, the present disclosure is not intended to be limited to any particular arrangement. In any event, the set of instructions described above may be embedded in a computer-readable storage medium (e.g., a non-transitory storage medium) that may comprise any mechanism for storing information in a form readable by a machine or electronic processors/computational device, including, without limitation: a magnetic storage medium (e.g., floppy diskette); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; or electrical or other types of medium for storing such information/instructions.
The blocks illustrated in the Figs. 10 to 14 may represent steps in a method and/or sections of code in the computer program 904. The illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks and the order and arrangement of the blocks may be varied. Furthermore, it may be possible for some steps to be omitted. Although embodiments of the present invention have been described in the preceding paragraphs with reference to various examples, it should be appreciated that modifications to the examples given can be made without departing from the scope of the invention as claimed. Features described in the preceding description may be used in combinations other than the combinations explicitly described.
Although functions have been described with reference to certain features, those functions may be performable by other features whether described or not.
Although features have been described with reference to certain embodiments, those features may also be present in other embodiments whether described or not.
Whilst endeavouring in the foregoing specification to draw attention to those features of the invention believed to be of particular importance it should be understood that the Applicant claims protection in respect of any patentable feature or combination of features hereinbefore referred to and/or shown in the drawings whether or not particular emphasis has been placed thereon.

Claims

1. Apparatus for controlling the position of an eye-box of a head-up display, the apparatus comprising a control means configured to:
obtain positional data representative of a current position of an eye of a user;
cause movement of a moveable element of a head-up display to adjust the position of the eye-box of the head-up display responsive to the position of the eye of the user;
determine a transformation in dependence on the position of the eye-box; and apply the transformation to an image to be displayed by the head-up display.
2. Apparatus according to claim 1, wherein the transformation is to correct a distortion due to reflecting the image off a curved windshield of a vehicle.
3. Apparatus according to claim 1 or 2, wherein the apparatus is configured to receive from an imaging means an image signal from which the positional data is obtainable.
4. Apparatus according to any preceding claim, wherein the positional data is indicative of a current two-dimensional position of the eye of the user.
5. Apparatus according to any preceding claim, wherein the control means is configured to: compare the obtained positional data with stored positional data to obtain a position value representative of a position of the moveable element of the head-up display associated with the obtained positional data; and provide a positioning signal in dependence on the position value.
6. Apparatus according to any preceding claim, wherein the control means is configured to determine the transformation in dependence on the positional data.
7. Apparatus according to any preceding claim, wherein the control means is configured to compare the obtained positional data and/or the position of the moveable element with stored positional data and/or stored positions of the moveable element to obtain the transformation for applying to image data forming the image to be displayed by the head-up display.
8. Apparatus according to claim 6 or claim 7, wherein the applied transformation is an approximation of an inverse transformation of an image distortion that is caused by optical elements of the head-up display for the current position of the eye of the user and/or position of the moveable element of the head-up display.
9. Apparatus according to any preceding claim, wherein the control means is configured to determine the positional data by analyzing image data to identify a representation of at least one eye of the user.
10. Apparatus according to any preceding claim, wherein the control means comprises an electronic processor and an electronic memory device coupled to the electronic processor and having instructions stored therein.
11. A system comprising the apparatus of any preceding claim and a head-up display comprising a moveable element, wherein the head-up display is arranged to: receive the positional data from the control means; and adjust the position of the moveable element in dependence on the positional data.
12. A system according to claim 11, wherein the moveable element comprises a mirror.
13. A system according to claim 11 or 12, wherein the moveable element is arranged to direct light onto a windshield of a vehicle.
14. A vehicle comprising the system according to any one of claims 11 to 13 and an imaging means that is positioned on or within the vehicle and which is configured to capture an image containing a representation of at least one eye of a user of the vehicle.
15. A method of controlling the position of an eye-box of a head-up display, the method comprising:
obtaining positional data representative of a current position of an eye of a user; causing movement of a moveable element of a head-up display to adjust the position of the eye-box of the head-up display responsive to the position of the eye of the user;
determining a transformation in dependence on the position of the eye-box; and applying the transformation to an image to be displayed by the head-up display.
16. A method according to claim 15, wherein the transformation is to correct a distortion due to reflecting the image off a curved windshield of a vehicle.
17. A method according to claim 15 or 16, wherein the obtaining positional data comprises obtaining the positional data from a signal received from an imaging means.
18. A method according to any one of claims 15 to 17, wherein the positional data is indicative of a current two-dimensional position of the eye of the user.
19. A method according to any one of claims 15 to 18, wherein the method comprises: comparing the obtained positional data with stored positional data to obtain a position value representative of a position of the moveable element of the head-up display associated with the obtained positional data; and
providing a positioning signal in dependence on the position value.
20. A method according to any one of claims 15 to 19, wherein the method comprises: determining the transformation in dependence on the positional data.
21. A method according to any one of claims 15 to 20, wherein the method comprises comparing the obtained positional data and/or the position of the moveable element with stored positional data and/or stored positions of the moveable element to obtain the transformation for applying to image data forming the image to be displayed by the head-up display.
22. A method according to claim 20 or claim 21, wherein the transformation is an approximation to an inverse transformation of an image distortion caused by optical elements of the head-up display for the current position of the eye and/or position of the moveable element of the head-up display.
23. A method according to any one of claims 15 to 22, wherein the method comprises determining the positional data by analyzing image data to identify a representation of at least one eye of the user.
24. A method according to any one of claims 15 to 23, wherein the moveable element comprises a mirror.
25. A method according to any one of claims 15 to 24, wherein the method comprises generating the image at an image display device of the head-up display and reflecting the image at the moveable element and a windshield of a vehicle.
26. A method according to any one of claims 15 to 25, wherein the method comprises: at an imaging means positioned within a vehicle, capturing an image containing a representation of at least one eye of a user of the vehicle; and determining the positional data by analyzing image data of the image to identify a representation of at least one eye of a user.
27. A computer program which when executed by a processor causes the processor to perform the method of any one of claims 15 to 26.
28. A non-transitory computer-readable storage medium having instructions stored therein which when executed on a processor cause the processor to perform the method of any one of claims 15 to 26.
29. An apparatus, a system, a vehicle, a method, a computer program or a non-transitory computer readable medium as described herein with reference to the accompanying figures.
PCT/EP2018/052812 2017-02-13 2018-02-05 Apparatus and method for controlling a vehicle display WO2018146048A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1702306.0A GB2559605A (en) 2017-02-13 2017-02-13 Apparatus and method for controlling a vehicle display
GB1702306.0 2017-02-13

Publications (1)

Publication Number Publication Date
WO2018146048A1 true WO2018146048A1 (en) 2018-08-16


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3696594A1 (en) * 2019-01-07 2020-08-19 Yazaki Corporation Head-up display device
CN113552905A (en) * 2021-06-22 2021-10-26 歌尔光学科技有限公司 Position adjusting method and system for vehicle-mounted HUD
US20220197120A1 (en) * 2017-12-20 2022-06-23 Micron Technology, Inc. Control of Display Device for Autonomous Vehicle
US11391956B2 (en) 2019-12-30 2022-07-19 Samsung Electronics Co., Ltd. Method and apparatus for providing augmented reality (AR) object to user
CN115225875A (en) * 2022-06-17 2022-10-21 苏州蓝博控制技术有限公司 Auxiliary display device of excavator and display method thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090278765A1 (en) * 2008-05-09 2009-11-12 Gm Global Technology Operations, Inc. Image adjustment and processing for a head up display of a vehicle
US20110267700A1 (en) * 2008-11-05 2011-11-03 Johnson Control Technology Company Vehicle display system or projection display for a motor vehicle, and calibration method
DE102015109027A1 (en) * 2015-06-08 2016-12-08 Denso Corporation Head-up display with situation-based adaptation of the presentation of virtual image content
US20160357015A1 (en) * 2014-03-19 2016-12-08 Yazaki Corporation Vehicle display device
EP3128357A2 (en) * 2015-08-05 2017-02-08 LG Electronics Inc. Display device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3727078B2 (en) * 1994-12-02 2005-12-14 富士通株式会社 Display device
JP6221942B2 (en) * 2014-05-26 2017-11-01 株式会社デンソー Head-up display device
US10088683B2 (en) * 2014-10-24 2018-10-02 Tapuyihai (Shanghai) Intelligent Technology Co., Ltd. Head worn displaying device employing mobile phone
US10247941B2 (en) * 2015-01-19 2019-04-02 Magna Electronics Inc. Vehicle vision system with light field monitor



Also Published As

Publication number Publication date
GB2559605A (en) 2018-08-15
GB2559605A8 (en) 2018-10-03
GB201702306D0 (en) 2017-03-29

