WO2020089332A1 - Augmented reality head-up display and vehicle - Google Patents

Augmented reality head-up display and vehicle

Info

Publication number
WO2020089332A1
WO2020089332A1 PCT/EP2019/079727 EP2019079727W
Authority
WO
WIPO (PCT)
Prior art keywords
imagery
vehicle
concave mirror
sections
augmented reality
Prior art date
Application number
PCT/EP2019/079727
Other languages
French (fr)
Inventor
Yijun Zhao
Reinhold Langbein
Thomas Agung Nugraha
Artem Rudi
Benjamin Samson
Original Assignee
Motherson Innovations Company Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Motherson Innovations Company Ltd. filed Critical Motherson Innovations Company Ltd.
Publication of WO2020089332A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • B60K35/23
    • B60K35/28
    • B60K35/81
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • B60K2360/177
    • B60K2360/23
    • B60K2360/334
    • B60K2360/66

Definitions

  • the present disclosure relates generally to head-up displays for vehicles and, more particularly, to an augmented reality head-up display (AR-HUD) for a vehicle as well as a vehicle with such a display.
  • Some vehicles may include a head-up display (HUD) for assisting a driver of the vehicle.
  • a head-up display in a vehicle is capable of reflecting imagery (e.g. depicting various vehicle-related information such as a speedometer, tachometer, vehicle navigation, etc.) up through an opening in a dashboard and onto a windshield of the vehicle.
  • a driver of the vehicle may observe various vehicle-related information while keeping his/her head and eyes up so as to be more focused on the exterior view ahead while driving.
  • Such augmented reality head-up displays differ from more conventional head-up displays in that the virtual information reflected onto the windshield appears to be part of the driving experience itself.
  • one or more virtual symbols or graphics e.g. directional guidance arrows, etc. may be reflected onto the windshield and placed accurately into the exterior view so as to indicate or mark the route that the driver should follow on the road ahead.
  • one or more virtual symbols or graphics may be reflected onto the windshield and placed accurately into the exterior view so as to indicate or mark a specific object or vehicle ahead on the road, or a distance to the specific object or vehicle ahead on the road, which is detected by the ADAS.
  • one or more virtual symbols or graphics may be reflected onto the windshield and placed accurately into the exterior view so as to indicate or mark specific boundaries of the lanes ahead on the road which are detected by the ADAS.
  • an augmented reality head-up display in a vehicle may increase driver-assistance capabilities, such as generally described above, and may further enhance the comfort, safety and overall driving experience, certain challenges and limitations exist when attempting to implement an AR-HUD into a vehicle.
  • an AR-HUD in a vehicle may reflect substantially more information (e.g. virtual information) onto a windshield over a substantially larger area (e.g. overlaying the road ahead as seen by a driver looking through the windshield)
  • certain components of the AR-HUD and the resulting layout of such components often occupy substantially more packaging space within a dashboard disposed within an interior of the vehicle.
  • the available or ideal packaging space within a dashboard disposed within the interior is rather limited.
  • an AR-HUD which occupies more than the available or ideal packaging space within a dashboard disposed within an interior of a vehicle may not be able to function as desired or may not even be feasible for use in the vehicle.
  • the need is met by an augmented reality head-up display (AR-HUD) for a vehicle of claim 1, with preferred embodiments thereof being described in claims 2 to 12. Further, the need is met by a vehicle with such an augmented reality head-up display (AR- HUD) in line with claim 13.
  • Thus, an augmented reality head-up display for a vehicle is provided, the augmented reality head-up display comprising: a concave mirror including n individually-divided concave sections, with n being a natural number above 1; an optical component, which is configured to diffuse and/or reflect imagery projected thereon, is spaced apart from the concave mirror and is configured to be moved, continuously or successively, into a plurality of or at least n different positions, with n positions being preferred, relative to the concave mirror; and a picture generating unit spaced apart from the concave mirror and the optical component; wherein the concave mirror, the optical component and the picture generating unit are each configured to be mounted within a dashboard disposed within an interior of the vehicle; wherein the picture generating unit is further configured to receive n individual sections of imagery in a sequential order, the individual sections of imagery taken from a divided imagery set such that a total quantity of the individual sections of imagery taken from the divided imagery set is equal to a total quantity of the individually-divided concave sections of the concave mirror.
  • the n individually-divided concave sections of the concave mirror may be separated by a plurality of gaps formed on the concave mirror and/or may be mounted to a substrate, preferably via a UV sensitive glue.
  • Each one of the individually-divided concave mirror sections may be produced individually and/or mounted to the substrate individually, and/or each one of the individually-divided concave mirror sections may be configured to be moved, in particular with a 5-axis robot and/or in directions x, y, z around a pitch axis and around a roll axis, relative to the substrate for calibration during manufacture of the concave mirror, with preferably the substrate being mounted on an optical jig.
  • At least one of the concave mirror or the substrate and the picture generating unit may be configured to be moved, in particular rotated relative to the windshield and/or around an axis in Y-direction of the vehicle, for adjustment to a driver of the vehicle, in particular to the driver body height, with preferably the rotating being manually controlled and/or via an optical sensor that monitors the eye position of the driver.
  • the optical component may comprise a diffuser, a scan mirror, preferably being flat, a rotating disc carrying n optical prisms or n mirror segments, or a rotating mirror, in particular providing n mirror segments, and may be configured to be moved, in particular oscillated or rotated, preferably with n stops per cycle or sequence and/or relative to a diffusor provided with n diffusor sections.
  • the optical component may be configured to be moved via an actuator or motor or a galvanometer scanner, with preferably the galvanometer scanner including a high speed rotary actuator, and/or the actuator or motor or the galvanometer scanner may be operatively connected to the optical component and may be configured to move the optical component into the different positions.
  • Each of the n different positions into which the optical component is configured to be moved may correspond to a respective one of the n individually-divided concave sections of the concave mirror and/or to a different angle at which the optical component is configured to be oriented.
  • the individual sections of imagery, which the picture generating unit is configured to receive and project, may comprise individual sections of video imagery.
  • the picture generating unit may be a projector, in particular a high-speed projector, a laser projector or a digital light processing (DLP) projector, with preferably the projector being configured to project extended graphics array (XGA) resolution or wide extended graphics array (WXGA) resolution, and/or the projector may include a digital micromirror device (DMD).
  • the respective individual sections of imagery reflected onto the windshield of the vehicle may appear as unified, undivided imagery as viewed by a driver of the vehicle
  • the respective individual sections of imagery reflected onto the windshield of the vehicle may comprise respective individual sections of video imagery
  • the respective individual sections of video imagery reflected onto the windshield of the vehicle may each be displayed at a video frame rate of n times 60 frames per second (FPS) or greater
  • the respective individual sections of imagery reflected onto the windshield of the vehicle may include virtual information configured to be viewed by a driver of the vehicle to assist the driver and/or at least one marker for adjustment.
  • the virtual information may be generated from, or determined by, an advanced driver-assistance system (ADAS) of the vehicle and/or a navigation system of the vehicle, and/or the virtual information may be reflected onto the windshield of the vehicle such that the virtual information is capable of indicating or marking a boundary or object detected ahead of the vehicle, or a distance to the boundary or object detected ahead of the vehicle.
  • An optical sensor device in particular comprising a photo diode array and/or mounted at an upper edge of the concave mirror, preferably in the middle of a middle mirror section, may be configured to detect a calibration signal added to the imagery by the picture generating unit (PGU) and to send a signal characteristic for a deflection, preferably to the electronic control unit (ECU) of the vehicle, for controlling the movement and/or stop of the optical component and/or for controlling the actuator or motor or the galvanometer scanner.
  • one aspect of this disclosure is directed to an augmented reality head- up display (AR-HUD) for a vehicle.
  • the augmented reality head-up display (AR- HUD) may include a concave mirror including individually-divided concave sections.
  • the augmented reality head-up display may further include an optical component spaced apart from the concave mirror.
  • the optical component may be configured to be moved into a plurality of different positions.
  • the augmented reality head-up display (AR-HUD) may further include a picture generating unit (PGU) spaced apart from the concave mirror and the optical component.
  • the concave mirror, the optical component and the picture generating unit (PGU) may each be configured to be mounted within a dashboard disposed within an interior of a vehicle.
  • the picture generating unit (PGU) may be further configured to receive individual sections of imagery in a sequential order.
  • the individual sections of imagery may be taken from a divided imagery set such that a total quantity of the individual sections of imagery taken from the divided imagery set is equal to a total quantity of the individually-divided concave sections of the concave mirror.
  • the picture generating unit (PGU) may be further configured to project the received individual sections of imagery onto the optical component, in a sequential order synchronized with the plurality of different positions into which the optical component may be moved, such that the optical component may move into the plurality of different positions to diffuse and reflect the respective individual sections of imagery projected thereon onto the respective individually- divided concave sections of the concave mirror, thereby further reflecting the respective individual sections of imagery generally upwardly and away from the concave mirror, through an opening in the dashboard, and onto a windshield of the vehicle.
  • the respective individual sections of imagery reflected onto the windshield of the vehicle may appear as unified, undivided imagery as viewed by a driver of the vehicle. Furthermore, the respective individual sections of imagery reflected onto the windshield of the vehicle may collectively represent virtual information configured to be viewed by the driver of the vehicle to assist the driver.
  • Another aspect of the present disclosure is directed to a vehicle with a dashboard, a windshield, an electronic control unit (ECU) and an augmented reality head-up display according to this disclosure.
  • the electronic control unit may calculate on the basis of vehicle data, which are preferably received via a CAN bus connection, predictive corrections to an image in order to maintain the alignment of AR information with the outside scenery, with the calculation preferably using Artificial Intelligence.
  • FIG. 1 is a perspective sectional view of a windshield and dashboard disposed within an interior of a vehicle, and further illustrating an overall layout of a first exemplary augmented reality head-up display (AR-HUD) according to the present disclosure, with a rotating diffusor;
  • FIG. 2 is a side view of a portion of the augmented reality head-up display (AR-HUD) shown in FIG. 1;
  • FIG. 3 is a schematic diagram illustrating various main components of the augmented reality head-up display (AR- HUD) shown in FIG. 1;
  • FIGS. 4A and 4B are schematic diagrams illustrating respective side and top views of certain features of the various main components shown in FIG. 3;
  • FIGS. 5A and 5B are schematic diagrams illustrating respective dimensional sizes and relationships of certain features of the various main components shown in FIG. 3;
  • FIG. 6A is a schematic diagram illustrating image processing of various imagery which is delivered to at least a picture generating unit (PGU) of the augmented reality head-up display (AR-HUD) shown in FIG. 1;
  • FIG. 6B is a further schematic diagram illustrating the image processing of various imagery
  • FIGS. 7A to 7C are schematic diagrams illustrating a second exemplary augmented reality head-up display (AR-HUD) according to the present disclosure, each with a different scan mirror position;
  • FIG. 8 is a schematic diagram illustrating a third exemplary augmented reality head-up display (AR-HUD) according to the present disclosure, with a rotating disc;
  • FIGS. 9A to 9C are schematic diagrams illustrating a fourth exemplary augmented reality head-up display (AR-HUD) according to the present disclosure, each with a different rotating mirror position;
  • FIG. 10 is a schematic diagram illustrating a fifth exemplary augmented reality head-up display (AR-HUD) according to the present disclosure, with another rotating mirror;
  • FIG. 11 is a schematic diagram illustrating a calibration of individually-divided concave mirror sections
  • FIG. 12 is a schematic diagram illustrating a driver adaption of individually-divided concave mirror sections; and FIG. 13 is a schematic diagram illustrating a calibration of movement of an optical component.
  • Referring to the figures, an exemplary augmented reality head-up display (AR-HUD) 10 for a vehicle according to the present disclosure is shown and described.
  • FIGS. 1-6A provide several views collectively illustrating a first exemplary augmented reality head-up display (AR-HUD) 10 for a vehicle in line with the present invention.
  • the augmented reality head-up display (AR-HUD) 10 may include a concave mirror 12, as shown in FIGS. 1-5A.
  • the concave mirror 12 may include distinct, individually-divided concave sections 14, 16, 18, each including a reflective mirror surface.
  • the individually-divided concave sections 14, 16, 18 of the concave mirror 12 may be separated by a plurality of small gaps 20, 22 formed on the concave mirror 12.
  • the concave mirror 12 includes three distinct, individually-divided concave sections 14, 16, 18, however, it is to be understood that the concave mirror 12 may include any suitable quantity of distinct, individually-divided concave sections.
  • the augmented reality head-up display (AR-HUD) 10 may further include an optical component 24, comprising a diffuser 24a, which may be in the form of a galvanometer mirror or other similar optical component which may include a material capable of diffusing and reflecting imagery projected thereon.
  • the diffuser 24a is spaced apart from the concave mirror 12.
  • the diffuser 24a may be configured to be moved into a plurality of different positions.
  • the diffuser 24a may be part of a galvanometer scanner 26, as shown in FIG. 3.
  • the galvanometer scanner 26 may include an electrically-driven high-speed rotary actuator operatively connected to the diffuser 24a, such as by way of a rotatable shaft 28, and may be configured to rotatably move the diffuser 24a into the plurality of different positions at a high speed.
  • each of the plurality of different positions into which the diffuser 24a may be moved may correspond to a different angle at which the diffuser 24a is configured to be oriented.
  • each of the plurality of different positions into which the diffuser 24a may be moved may correspond to a respective one of the individually-divided concave sections 14, 16, 18 of the concave mirror 12, as will be further described herein.
  • the augmented reality head-up display (AR-HUD) 10 may further include a picture generating unit (PGU) 30 spaced apart from the concave mirror 12 and the diffuser 24a. As shown in FIG. 1, at least the concave mirror 12, the diffuser 24a and the picture generating unit (PGU) 30 may each be configured to be mounted within a dashboard 32 disposed within an interior 36 of a vehicle.
  • the picture generating unit (PGU) 30 may be a projector configured to project imagery by way of a light path.
  • the projector may be a high-speed projector.
  • the projector may be a laser projector, such as a high speed laser projector.
  • the projector may be a digital light processing (DLP) projector which may include a digital micromirror device (DMD).
  • the projector may be configured to project extended graphics array (XGA) resolution (e.g. 1024 x 768), wide extended graphics array (WXGA) resolution (e.g. 1280 x 800), high-definition (HD) resolution (e.g. 1920 x 1080) or any other suitable resolution as may be understood by those skilled in the art.
  • the diffuser 24a (e.g. by way of the galvanometer scanner 26) and the picture generating unit (PGU) 30 may be in operative communication with a PC/embedded system 38 in the vehicle.
  • the PC/embedded system 38 in the vehicle may receive or otherwise process unified, undivided imagery 40, such as unified, undivided video imagery having a first video frame rate.
  • the first video frame rate may be, for example, 60 frames per second (FPS).
  • the unified, undivided imagery 40 may be imagery (e.g. virtual information) which is intended to ultimately be reflected onto a windshield 34 of the vehicle to assist a driver, as will be further described herein.
  • the unified, undivided imagery 40 may undergo image processing 41 (e.g. by software) on the PC/embedded system 38 so as to be divided into a divided imagery set 42.
  • the divided imagery set 42 may include individual sections 44, 46, 48 of imagery.
  • the individual sections 44, 46, 48 of imagery taken from the divided imagery set 42 may collectively represent the unified, undivided imagery 40, which is intended to ultimately be reflected onto the windshield 34 of the vehicle to assist the driver.
  • the individual sections 44, 46, 48 of imagery taken from the divided imagery set 42 may each be individual sections of video imagery stored in a plurality of respective buffers A1, B2, C3, each still having the first video frame rate, which may be 60 frames per second (FPS) as previously described herein.
  • Each of the individual sections 44, 46, 48 of imagery taken from the divided imagery set 42 is configured to be sent from the plurality of respective buffers A1, B2, C3 to the picture generating unit (PGU) 30.
  • each of the individual sections 44, 46, 48 of imagery taken from the divided imagery set 42 may undergo sequential superposition, such as by hardware (e.g. a field-programmable gate array (FPGA)).
  • the individual sections 44, 46, 48 of imagery taken from the divided imagery set 42 which may each be individual sections of video imagery as previously described herein, may subsequently have a multiplied second video frame rate which may be, for example, 180 frames per second (FPS).
  • the picture generating unit (PGU) 30 may be in operative communication with the PC/embedded system 38 in the vehicle so as to be further configured to receive the individual sections 44, 46, 48 of imagery, taken from the divided imagery set 42, in a sequential order from the plurality of respective buffers A1, B2, C3.
  • the individual sections 44, 46, 48 of imagery taken from the divided imagery set 42 may be configured such that a total quantity of the individual sections 44, 46, 48 of imagery taken from the divided imagery set 42 is equal to a total quantity of the individually-divided concave sections 14, 16, 18 of the concave mirror 12.
  • the total quantity of the individual sections 44, 46, 48 of imagery taken from the divided imagery set 42 is three, however, the total quantity may vary as the total quantity of the individually-divided concave sections 14, 16, 18 of the concave mirror 12 varies.
  • the picture generating unit (PGU) 30 may be further configured to project the received individual sections 44, 46, 48 of imagery onto the diffuser 24a (e.g. by way of a projected light path), in a sequential order synchronized with the plurality of different positions into which the diffuser 24a is configured to be moved (as shown in FIGS. 4B and 5B), such that the diffuser 24a moves into the plurality of different positions to diffuse and reflect the respective individual sections 44, 46, 48 of imagery projected thereon onto the respective individually-divided concave sections 14, 16, 18 of the concave mirror 12.
  • the respective individually-divided concave sections 14, 16, 18 of the concave mirror 12 may further reflect the respective individual sections 44, 46, 48 of imagery generally upwardly and away from the concave mirror 12, through an opening in the dashboard 32, and onto the windshield 34 of the vehicle (as shown in FIG. 1).
  • the individual sections 44, 46, 48 of imagery, which the picture generating unit (PGU) 30 is configured to receive and project may be individual sections of video imagery.
  • the respective individual sections 44, 46, 48 of imagery reflected onto the windshield 34 of the vehicle may be respective individual sections of the video imagery.
  • the respective individual sections of the video imagery reflected onto the windshield 34 of the vehicle may each be displayed at a video frame rate of 3 times 60 and, thus, 180 frames per second (FPS) or greater.
  • the respective individual sections 44, 46, 48 of imagery reflected onto the windshield 34 of the vehicle may advantageously appear as unified, undivided imagery, while avoiding flickering, as viewed by the driver of the vehicle.
  • the respective individual sections 44, 46, 48 of imagery reflected onto the windshield 34 of the vehicle may include virtual information (i.e. augmentations) configured to be viewed by the driver of the vehicle to assist the driver.
  • the virtual information may be generated from, or may be determined by, an advanced driver-assistance system (ADAS) of the vehicle and/or a navigation system of the vehicle.
  • the virtual information may be reflected onto the windshield 34 of the vehicle such that the virtual information is capable of indicating or marking a boundary or object detected ahead of the vehicle, or a distance to the boundary or object detected ahead of the vehicle.
  • the virtual information reflected onto the windshield 34 of the vehicle may appear to be part of the driving experience itself, thus further enhancing the comfort, safety and overall driving experience for the driver of the vehicle.
  • the augmented reality head-up display (AR-HUD) 10 advantageously occupies substantially less packaging space within the dashboard 32 disposed within the interior 36 of the vehicle.
  • the augmented reality head- up display (AR-HUD) 10 may be capable of occupying approximately 6 liters of total packaging space (i.e. volume), as compared to approximately 18 liters of total packaging space which other known augmented reality head-up displays (AR-HUDs) may occupy within a vehicle dashboard.
  • the augmented reality head-up display (AR-HUD) 10 is advantageously capable of projecting and reflecting high quality and/or high resolution imagery (e.g. various virtual information) onto the windshield 34 of the vehicle to assist the driver.
  • FIG. 6B is a further schematic diagram illustrating the image processing of various imagery 40 from a camera system of the vehicle (not shown), in particular a camera system of an advanced driver-assistance system (ADAS), providing a standard video input of 60 Hz, corresponding to 60 frames per second (FPS), to the embedded system 38, which may be comprised in the electronic control unit (ECU) of the vehicle.
  • the embedded system 38, or rather the electronic control unit (ECU) of the vehicle, splits the imagery into n individual sections, with n being a natural number of at least 2, and sends the series of n split imagery sections to the picture generating unit (PGU) 30, in particular in the form of a projector operating at n times 60 Hz.
  • the embedded system 38 or rather the electronic control unit (ECU) of the vehicle sends control signals for the movement of the optical component 24, e.g. to the galvanometer scanner 26 of the diffusor 24a.
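To make the timing of this data flow concrete, the short sketch below derives the projector rate and the per-section time slot from the stated 60 Hz input and the split into n sections. It is an illustrative calculation only; reading the time slot as the settle-time budget for the optical component is an assumption, not a figure taken from the disclosure.

```python
def timing_budget(n: int = 3, input_hz: float = 60.0) -> dict:
    """Derive the projector rate and the per-section time slot from the 60 Hz input."""
    projector_hz = n * input_hz        # e.g. 3 x 60 Hz = 180 Hz for three mirror sections
    slot_ms = 1000.0 / projector_hz    # time available to show one imagery section
    return {
        "projector_hz": projector_hz,  # rate at which the PGU 30 must operate
        "slot_ms": slot_ms,            # ~5.6 ms per section for n = 3; the optical
                                       # component must move and settle within this slot
    }

print(timing_budget())   # {'projector_hz': 180.0, 'slot_ms': 5.55...}
```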
  • FIGS. 7A-7C provide several views collectively illustrating a second exemplary augmented reality head-up display (AR-HUD) 10 for a vehicle in line with the present invention.
  • the augmented reality head-up display (AR-HUD) 10 includes a concave mirror 12, with three distinct, individually-divided concave sections 14, 16, 18.
  • the augmented reality head-up display (AR-HUD) 10 includes an optical component 24, comprising a scan mirror 24b acting as a galvanometer scanner and cooperating with a diffusor 50 having three distinct, individually- divided diffusor sections 52, 54, 56.
  • the scan mirror 24b is substantially flat and oscillates to reflect imagery 40 from the picture generating unit (PGU) 30 onto the diffusor 50 such that, as described in particular with respect to FIG. 6B, each one of the three individual sections 44, 46, 48 of imagery is directed through a respective one of the diffusor sections 52, 54, 56 onto a respective one of the mirror sections 14, 16, 18.
  • in a left position, the scan mirror 24b has a positive rotation angle to direct imagery through the left diffusor section 52 onto the left mirror section 14.
  • in a middle position with a rotation angle of 0°, the scan mirror 24b directs imagery through the middle diffusor section 54 onto the middle mirror section 16 for reflection onto the windshield (not shown in FIG. 7A).
  • in a right position, the scan mirror 24b has a negative rotation angle to direct imagery through the right diffusor section 56 onto the right mirror section 18.
  • the scan mirror 24b of the second embodiment may be a three stop oscillating mirror, which might oscillate from stop 1 to stop 2, from stop 2 to stop 3, and from stop 3 to stop 1 to start a new cycle.
  • the number of stops depends on the selected number n as described above and can vary.
  • the cycle of passing the stops and the rotation angle of the scan mirror 24b at each stop can be adapted to certain embodiments, as can the refraction angle 58 of the respective diffusor sections 52, 54 and 56.
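A minimal sketch of the three-stop cycle described above, pairing each stop of the scan mirror 24b with one imagery section, one diffusor section and one mirror section. The concrete rotation angles for the left and right stops are placeholders, since the disclosure leaves their values open.

```python
# stop -> (assumed rotation angle in degrees, diffusor section, mirror section)
SCAN_MIRROR_STOPS = {
    1: (+10.0, "diffusor section 52", "mirror section 14"),  # left stop, angle assumed
    2: (0.0,   "diffusor section 54", "mirror section 16"),  # middle stop (0 degrees)
    3: (-10.0, "diffusor section 56", "mirror section 18"),  # right stop, angle assumed
}

def scan_cycle(sections):
    """Yield one projection step per stop: the angle to drive and the section to project."""
    for (stop, (angle, diffusor, mirror)), section in zip(SCAN_MIRROR_STOPS.items(), sections):
        yield {"stop": stop, "angle_deg": angle, "via": diffusor, "onto": mirror, "section": section}

# one full cycle over the three imagery sections 44, 46, 48 (then the cycle restarts at stop 1)
for step in scan_cycle(["imagery section 44", "imagery section 46", "imagery section 48"]):
    print(step)
```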
  • FIG. 8 illustrates a third exemplary augmented reality head-up display (AR-HUD) 10 for a vehicle in line with the present invention, with an optical component 24 comprising a rotating disc 24c carrying three prisms 72, 74, 76 and cooperating with a diffusor 50 having three distinct, individually-divided diffusor sections 52, 54, 56.
  • the optical prisms 72, 74, 76 are designed to deflect light beams from the picture generating unit (PGU) 30 to left, middle and right positions and, thus, to the three diffusor sections 52, 54, 56, in a sequence defined by the rotation of the rotating disc 24c to reach the three mirror sections 14, 16, 18, respectively, in order to provide the head-up display on the windshield 34.
  • the disc 24c stops every 120°.
  • the number of imagery sections 44, 46, 48, prisms 72, 74, 76, stops of the rotating disc 24c, diffusor sections 52, 54, 56 and mirror sections 14, 16, 18 can be varied, but these must be matched to each other.
  • FIGS. 9A-9C provide several views collectively illustrating a fourth exemplary augmented reality head-up display (AR-HUD) 10 for a vehicle in line with the present invention, with an optical component 24 comprising an alternative rotating disc 24c carrying three mirror sections 62, 64, 66 and cooperating with a diffusor 50 having three distinct, individually-divided diffusor sections 52, 54, 56, functioning in a manner analogous to the third embodiment.
  • the deflection of the light beams from the picture generating unit (PGU) 30 to the left, middle and right positions is shown in FIG. 9A, 9B and 9C, respectively.
  • the three mirror sections 62, 64, 66 may move in an axial direction in coordination with their angular orientation. This also allows the rotation of the disc 24c to be continuous in other embodiments, i.e. the disc 24c does not need any stops in this case.
  • FIG. 10 illustrates a fifth exemplary augmented reality head-up display (AR-HUD) 10 for a vehicle in line with the present invention, with an optical component 24 comprising a rotating mirror 24d carrying three mirror segments 82, 84, 86 and cooperating with a diffusor 50 having three distinct, individually-divided diffusor sections 52, 54, 56.
  • the fifth embodiment uses a rotation movement via a motor 25 with only two stops per rotation cycle. The first stop takes place when a mirror front side with a first mirror segment 82 is pointing to the middle mirror section 16 for projecting the centre imagery section 46. Then the mirror 24d turns by 180° and the remaining two imagery sections 44, 48 are projected onto the two mirror sections 14, 18 via the two mirror segments 84, 86 on the rear side of the mirror 24d.
  • FIG. 11 is a schematic diagram illustrating a calibration of the individually-divided concave mirror sections 14, 16, 18 mounted on a substrate 13 to provide the concave mirror 12, and the corresponding method may be applied in a camera imager chip calibration.
  • Each one of the individually-divided concave mirror sections 14, 16, 18 may be produced individually, which saves production costs as smaller parts are to be manufactured.
  • the substrate 13 may be mounted on an optical jig (not shown) for calibration where the deviation from any desired position can be measured.
  • UV-sensitive glue may be applied on the substrate 13, and one of the mirror sections 14, 16, 18, see section 18 in FIG. 11, is applied on the glued part of the substrate 13.
  • As said section 18 can be moved, e.g. with a 5-axis robot (not shown), in directions x, y, z and around the pitch and roll axes, it can be adjusted until the alignment measured in the jig is correct, such that UV light can be applied to harden the UV-sensitive glue. This process can be repeated for the remaining sections 16, 14 until all individually-divided concave mirror sections 14, 16, 18 are aligned.
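The alignment procedure of FIG. 11 amounts to a simple closed loop: measure the deviation of the glued but not yet cured mirror section on the optical jig, let the 5-axis robot compensate, and cure the UV glue once the residual error is inside tolerance. The sketch below only illustrates that loop; the jig, robot and lamp interfaces, the tolerance value and the iteration limit are assumptions, as the disclosure describes the procedure only qualitatively.

```python
from typing import Protocol

class OpticalJig(Protocol):
    def measure_deviation(self, section_id: int) -> dict: ...   # {"x": .., "y": .., "z": .., "pitch": .., "roll": ..}

class FiveAxisRobot(Protocol):
    def move_relative(self, section_id: int, correction: dict) -> None: ...

class UvLamp(Protocol):
    def cure(self, section_id: int) -> None: ...

TOLERANCE = 0.01   # assumed alignment tolerance (arbitrary units)

def align_mirror_section(section_id: int, jig: OpticalJig,
                         robot: FiveAxisRobot, uv_lamp: UvLamp,
                         max_iterations: int = 50) -> bool:
    """Iteratively align one concave mirror section on the jig, then harden the UV glue."""
    for _ in range(max_iterations):
        deviation = jig.measure_deviation(section_id)
        if all(abs(v) <= TOLERANCE for v in deviation.values()):
            uv_lamp.cure(section_id)          # fix the section in its aligned pose
            return True
        # move in x, y, z and around the pitch and roll axes to cancel the measured deviation
        robot.move_relative(section_id, {k: -v for k, v in deviation.items()})
    return False   # alignment did not converge within the iteration budget

# repeated for sections 18, 16, 14 until all three are aligned
```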
  • FIG. 12 is a schematic diagram illustrating a driver adaption of individually-divided concave mirror sections 14, 16, 18 by tilting the substrate 13 and/or the picture generating unit (PGU) 30.
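The driver adaption of FIG. 12 can be pictured as a simple mapping from the measured (or manually entered) eye height of the driver to a tilt of the substrate 13 and/or the picture generating unit (PGU) 30 around the vehicle Y-axis. The linear mapping and both constants below are assumptions for illustration only; the disclosure does not specify them.

```python
REFERENCE_EYE_HEIGHT_MM = 1250.0   # assumed nominal eye height of the design driver
DEG_PER_MM = 0.01                  # assumed sensitivity of the tilt adjustment

def adaptation_tilt_deg(measured_eye_height_mm: float) -> float:
    """Tilt angle around the Y-axis that re-centres the virtual image for the driver."""
    return (measured_eye_height_mm - REFERENCE_EYE_HEIGHT_MM) * DEG_PER_MM

# taller driver -> positive tilt; the value would drive the substrate/PGU actuator
print(adaptation_tilt_deg(1310.0))   # 0.6 degrees
```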
  • FIG. 13 is a schematic diagram illustrating a calibration of the movement of an optical component 24. Said calibration is explained with respect to the fifth embodiment shown in FIG. 10, with a photo diode array 96 additionally installed at an upper edge of the middle mirror section 16.
  • the photo diode array 96 is connected to the electronic control unit (ECU) which, as explained with respect to FIG. 6B, controls both the picture generating unit (PGU) 30 and the motor 25 rotating the mirror 24d.
  • the picture generating unit (PGU) 30 adds a special signal to the imagery 40 to be projected, e.g. in form of a distinctive flashing in a distinctive colour (shown as dashed line in FIG. 13) and resulting in a marker 102 within the displayed content 100.
  • if the stop position of the mirror 24d is correctly calibrated, said signal is detected in the middle of the photo diode array 96; if the optical component is deflected, the marker is detected by another diode of the array.
  • the electronic control unit receives the signal from said other diode to calculate the deflection as well as modified control signals for the motor 25.
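One way to picture the evaluation of the calibration marker 102: the index of the diode that detects the flashing marker is compared with the centre of the photo diode array 96, and the offset is converted into a modified stop command for the motor 25. The scaling constant and the function interfaces are assumptions; the disclosure only states that the ECU derives the deflection and modified control signals from the detected signal.

```python
def estimate_deflection(diode_hits: list[bool]) -> int:
    """Return the offset (in diodes) between the detected marker and the array centre."""
    centre = len(diode_hits) // 2
    detected = [i for i, hit in enumerate(diode_hits) if hit]
    if not detected:
        raise ValueError("calibration marker 102 not detected on photo diode array 96")
    return detected[0] - centre          # 0 means the stop position is correct

def corrected_stop_angle(current_stop_deg: float, offset_diodes: int,
                         deg_per_diode: float = 0.05) -> float:
    """Turn the measured offset into a modified stop command for motor 25
    (deg_per_diode is an assumed calibration constant)."""
    return current_stop_deg - offset_diodes * deg_per_diode

# example: marker seen two diodes left of centre on a 9-diode array
hits = [False, False, True, False, False, False, False, False, False]
print(corrected_stop_angle(180.0, estimate_deflection(hits)))   # 180.0 + 0.1 -> 180.1
```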
  • the ECU may additionally receive vehicle data (e.g. via a CAN bus connection) and calculate predictive corrections to the image in order to maintain the alignment of AR information with the outside scenery. It may be beneficial to use Artificial Intelligence methods to calculate these corrections, because a multitude of data can be taken into account for this improvement. While one or more exemplary embodiments are described above, it is not intended that these embodiments describe all possible forms of the disclosure. Rather, the words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the disclosure. Additionally, the features of various implementing embodiments may be combined to form further embodiments of the disclosure.

Abstract

The invention refers to an augmented reality head-up display (10) for a vehicle, the augmented reality head-up display comprising: a concave mirror (12) including n individually-divided concave sections (14, 16, 18), with n being a natural number above 1; an optical component (24, 24a, 24b, 24c, 24d), which is configured to diffuse and/or reflect imagery (40) projected thereon, is spaced apart from the concave mirror and is configured to be moved into a plurality of different positions relative to the concave mirror, with n positions being preferred; and a picture generating unit (30) spaced apart from the concave mirror and the optical component; wherein the concave mirror, the optical component and the picture generating unit are each configured to be mounted within a dashboard (32) disposed within an interior (36) of the vehicle; wherein the picture generating unit is further configured to receive at least n individual sections (44, 46, 48) of imagery (40) in a sequential order, the individual sections of imagery taken from a divided imagery set (42) such that a total quantity of the individual sections of imagery taken from the divided imagery set is equal to a total quantity of the individually-divided concave sections of the concave mirror; and wherein the picture generating unit is further configured to project the received individual sections of imagery onto the optical component, in a sequential order synchronized with the at least n different positions into which the optical component is configured to be moved relative to the concave mirror, such that the optical component moves into the different positions relative to the concave mirror to diffuse and/or reflect the n respective individual sections of imagery projected thereon onto the n respective individually-divided concave sections of the concave mirror, thereby further reflecting the respective individual sections of imagery generally upwardly and away from the concave mirror, through an opening in the dashboard, and onto a windshield of the vehicle.

Description

AUGMENTED REALITY HEAD-UP DISPLAY AND VEHICLE
TECHNICAL FIELD
The present disclosure relates generally to head-up displays for vehicles and, more particularly, to an augmented reality head-up display (AR-HUD) for a vehicle as well as a vehicle with such a display.
BACKGROUND
Some vehicles, such as passenger cars, vans and trucks, may include a head-up display (HUD) for assisting a driver of the vehicle. Typically, a head-up display in a vehicle is capable of reflecting imagery (e.g. depicting various vehicle-related information such as a speedometer, tachometer, vehicle navigation, etc.) up through an opening in a dashboard and onto a windshield of the vehicle. As such, a driver of the vehicle may observe various vehicle-related information while keeping his/her head and eyes up so as to be more focused on the exterior view ahead while driving.
More advanced generations of head-up displays for vehicles continue to be developed to further enhance the comfort, safety and overall driving experience for drivers. One example is an augmented reality head-up display (AR-HUD) which supplements the exterior view ahead of the vehicle with imagery in the form of virtual information (i.e. augmentations) to further assist a driver. Such augmented reality head-up displays differ from more conventional head-up displays in that the virtual information reflected onto the windshield appears to be part of the driving experience itself. For example, when using an AR-HUD in conjunction with a navigation system while driving, one or more virtual symbols or graphics (e.g. directional guidance arrows, etc.) may be reflected onto the windshield and placed accurately into the exterior view so as to indicate or mark the route that the driver should follow on the road ahead. As another example, when using an AR-HUD in conjunction with an enabled collision avoidance system or adaptive cruise control (ACC) of an advanced driver-assistance system (ADAS), one or more virtual symbols or graphics may be reflected onto the windshield and placed accurately into the exterior view so as to indicate or mark a specific object or vehicle ahead on the road, or a distance to the specific object or vehicle ahead on the road, which is detected by the ADAS. As yet another example, when using an AR-HUD in conjunction with a lane departure warning system of an ADAS, one or more virtual symbols or graphics may be reflected onto the windshield and placed accurately into the exterior view so as to indicate or mark specific boundaries of the lanes ahead on the road which are detected by the ADAS.
While the use of an augmented reality head-up display (AR-HUD) in a vehicle may increase driver-assistance capabilities, such as generally described above, and may further enhance the comfort, safety and overall driving experience, certain challenges and limitations exist when attempting to implement an AR-HUD into a vehicle. As compared to more conventional head-up displays (HUDs) in vehicles, since an AR-HUD in a vehicle may reflect substantially more information (e.g. virtual information) onto a windshield over a substantially larger area (e.g. overlaying the road ahead as seen by a driver looking through the windshield), certain components of the AR-HUD and the resulting layout of such components often occupy substantially more packaging space within a dashboard disposed within an interior of the vehicle. However, in most vehicles, the available or ideal packaging space within a dashboard disposed within the interior is rather limited.
As such, an AR-HUD which occupies more than the available or ideal packaging space within a dashboard disposed within an interior of a vehicle may not be able to function as desired or may not even be feasible for use in the vehicle.
With the aforementioned challenges and limitations in mind, there is a continuing unaddressed need for an augmented reality head-up display (AR-HUD) for a vehicle which occupies substantially less packaging space within a dashboard disposed within an interior of the vehicle, and which is capable of reflecting high quality and/or high resolution imagery (e.g. various virtual information) onto a windshield of the vehicle to assist a driver.
SUMMARY
The above-identified need is met with the present disclosure. In particular, the need is met by an augmented reality head-up display (AR-HUD) for a vehicle of claim 1, with preferred embodiments thereof being described in claims 2 to 12. Further, the need is met by a vehicle with such an augmented reality head-up display (AR- HUD) in line with claim 13.
Thus, an augmented reality head-up display for a vehicle is provided, the augmented reality head-up display comprising: a concave mirror including n individually-divided concave sections, with n being a natural number above 1; an optical component, which is configured to diffuse and/or reflect imagery projected thereon, is spaced apart from the concave mirror and is configured to be moved, continuously or successively, into a plurality of or at least n different positions, with n positions being preferred, relative to the concave mirror; and a picture generating unit spaced apart from the concave mirror and the optical component; wherein the concave mirror, the optical component and the picture generating unit are each configured to be mounted within a dashboard disposed within an interior of the vehicle; wherein the picture generating unit is further configured to receive n individual sections of imagery in a sequential order, the individual sections of imagery taken from a divided imagery set such that a total quantity of the individual sections of imagery taken from the divided imagery set is equal to a total quantity of the individually-divided concave sections of the concave mirror; and wherein the picture generating unit is further configured to project the received individual sections of imagery onto the optical component, in a sequential order synchronized with the at least n different positions into which the optical component is configured to be moved relative to the concave mirror, such that the optical component moves into the different positions relative to the concave mirror to diffuse and/or reflect the n respective individual sections of imagery projected thereon onto the n respective individually-divided concave sections of the concave mirror, thereby further reflecting the respective individual sections of imagery, in particular generally upwardly, away from the concave mirror, through an opening in the dashboard, and onto a windshield of the vehicle.
The n individually-divided concave sections of the concave mirror may be separated by a plurality of gaps formed on the concave mirror and/or may be mounted to a substrate, preferably via a UV sensitive glue.
Each one of the individually-divided concave mirror sections may be produced individually and/or mounted to the substrate individually, and/or each one of the individually-divided concave mirror sections may be configured to be moved, in particular with a 5-axis robot and/or in directions x, y, z around a pitch axis and around a roll axis, relative to the substrate for calibration during manufacture of the concave mirror, with preferably the substrate being mounted on an optical jig.
At least one of the concave mirror or the substrate and the picture generating unit (PGU) may be configured to be moved, in particular rotated relative to the windshield and/or around an axis in Y-direction of the vehicle, for adjustment to a driver of the vehicle, in particular to the driver body height, with preferably the rotating being manually controlled and/or via an optical sensor that monitors the eye position of the driver.
The optical component may comprise a diffuser, a scan mirror, preferably being flat, a rotating disc carrying n optical prisms or n mirror segments, or a rotating mirror, in particular providing n mirror segments, and may be configured to be moved, in particular oscillated or rotated, preferably with n stops per cycle or sequence and/or relative to a diffusor provided with n diffusor sections.
The optical component may be configured to be moved via an actuator or motor or a galvanometer scanner, with preferably the galvanometer scanner including a high speed rotary actuator, and/or the actuator or motor or the galvanometer scanner may be operatively connected to the optical component and may be configured to move the optical component into the different positions. Each of the n different positions into which the optical component is configured to be moved, may correspond to a respective one of the n individually-divided concave sections of the concave mirror and/or to a different angle at which the optical component is configured to be oriented. The individual sections of imagery, which the picture generating unit is configured to receive and project, may comprise individual sections of video imagery.
The picture generating unit may be a projector, in particular a high-speed projector, a laser projector or a digital light processing (DLP) projector, with preferably the projector being configured to project extended graphics array (XGA) resolution or wide extended graphics array (WXGA) resolution, and/or the projector may include a digital micromirror device (DMD).
The respective individual sections of imagery reflected onto the windshield of the vehicle may appear as unified, undivided imagery as viewed by a driver of the vehicle, the respective individual sections of imagery reflected onto the windshield of the vehicle may comprise respective individual sections of video imagery, the respective individual sections of video imagery reflected onto the windshield of the vehicle may each be displayed at a video frame rate of n times 60 frames per second (FPS) or greater, and/or the respective individual sections of imagery reflected onto the windshield of the vehicle may include virtual information configured to be viewed by a driver of the vehicle to assist the driver and/or at least one marker for adjustment.
The virtual information may be generated from, or determined by, an advanced driver-assistance system (ADAS) of the vehicle and/or a navigation system of the vehicle, and/or the virtual information may be reflected onto the windshield of the vehicle such that the virtual information is capable of indicating or marking a boundary or object detected ahead of the vehicle, or a distance to the boundary or object detected ahead of the vehicle.
An optical sensor device, in particular comprising a photo diode array and/or mounted at an upper edge of the concave mirror, preferably in the middle of a middle mirror section, may be configured to detect a calibration signal added to the imagery by the picture generating unit (PGU) and to send a signal characteristic for a deflection, preferably to the electronic control unit (ECU) of the vehicle, for controlling the movement and/or stop of the optical component and/or for controlling the actuator or motor or the galvanometer scanner. Accordingly, one aspect of this disclosure is directed to an augmented reality head- up display (AR-HUD) for a vehicle. The augmented reality head-up display (AR- HUD) may include a concave mirror including individually-divided concave sections. The augmented reality head-up display (AR-HUD) may further include an optical component spaced apart from the concave mirror. The optical component may be configured to be moved into a plurality of different positions. The augmented reality head-up display (AR-HUD) may further include a picture generating unit (PGU) spaced apart from the concave mirror and the optical component. The concave mirror, the optical component and the picture generating unit (PGU) may each be configured to be mounted within a dashboard disposed within an interior of a vehicle. The picture generating unit (PGU) may be further configured to receive individual sections of imagery in a sequential order. The individual sections of imagery may be taken from a divided imagery set such that a total quantity of the individual sections of imagery taken from the divided imagery set is equal to a total quantity of the individually-divided concave sections of the concave mirror. The picture generating unit (PGU) may be further configured to project the received individual sections of imagery onto the optical component, in a sequential order synchronized with the plurality of different positions into which the optical component may be moved, such that the optical component may move into the plurality of different positions to diffuse and reflect the respective individual sections of imagery projected thereon onto the respective individually- divided concave sections of the concave mirror, thereby further reflecting the respective individual sections of imagery generally upwardly and away from the concave mirror, through an opening in the dashboard, and onto a windshield of the vehicle. The respective individual sections of imagery reflected onto the windshield of the vehicle may appear as unified, undivided imagery as viewed by a driver of the vehicle. Furthermore, the respective individual sections of imagery reflected onto the windshield of the vehicle may collectively represent virtual information configured to be viewed by the driver of the vehicle to assist the driver.
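The synchronization described in this aspect can be summarized as a small control loop: move the optical component to position i, then let the picture generating unit project section i, and repeat in sequential order so that each section lands on its own concave mirror section. The sketch below illustrates that loop only; the OpticalComponent and PictureGeneratingUnit interfaces are hypothetical placeholders, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Protocol, Sequence

class OpticalComponent(Protocol):
    def move_to(self, position: int) -> None: ...   # e.g. a galvanometer angle index

class PictureGeneratingUnit(Protocol):
    def project(self, section) -> None: ...         # project one imagery section

@dataclass
class ArHudController:
    optic: OpticalComponent
    pgu: PictureGeneratingUnit

    def show_frame(self, sections: Sequence) -> None:
        """Project the n individual sections in a sequential order that is synchronized
        with the n positions of the optical component, so that each section is diffused
        and reflected onto its own individually-divided concave mirror section."""
        for position, section in enumerate(sections):
            self.optic.move_to(position)   # one position per concave mirror section
            self.pgu.project(section)      # section is then reflected toward the windshield
```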
Another aspect of the present disclosure is directed to a vehicle with a dashboard, a windshield, an electronic control unit (ECU) and an augmented reality head-up display according to this disclosure.
The electronic control unit (ECU) may calculate on the basis of vehicle data, which are preferably received via a CAN bus connection, predictive corrections to an image in order to maintain the alignment of AR information with the outside scenery, with the calculation preferably using Artificial Intelligence.
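The predictive correction can be pictured as a small dead-reckoning step: from vehicle data received via the CAN bus (for example speed and yaw rate), the ECU estimates how the outside scenery will have shifted by the time the next frame is displayed and moves the AR overlay accordingly. The latency value, the pixels-per-radian scaling and the simple small-angle model below are assumptions for illustration; the disclosure only states that vehicle data, optionally evaluated with Artificial Intelligence methods, are used for the prediction.

```python
from dataclasses import dataclass

@dataclass
class VehicleState:          # decoded from CAN bus messages (message IDs are vehicle specific)
    speed_mps: float         # longitudinal speed (unused in this simple model)
    yaw_rate_radps: float    # rotation rate around the vertical axis

def predict_overlay_shift(state: VehicleState,
                          latency_s: float = 0.05,        # assumed end-to-end display latency
                          px_per_rad: float = 1500.0) -> tuple[float, float]:
    """Predict how many pixels the AR overlay must be shifted so that it stays
    aligned with the outside scenery after `latency_s` seconds."""
    yaw_change = state.yaw_rate_radps * latency_s          # heading change until display
    horizontal_px = yaw_change * px_per_rad                # small-angle approximation
    # a learned (AI) model could replace this closed-form estimate and also use
    # speed, pitch, road curvature, suspension travel, etc.
    vertical_px = 0.0
    return horizontal_px, vertical_px

print(predict_overlay_shift(VehicleState(speed_mps=20.0, yaw_rate_radps=0.1)))   # (7.5, 0.0)
```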
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the present disclosure are pointed out with particularity in the appended claims. However, other features of the one or more embodiments will become more apparent and will be best understood by referring to the following detailed description in conjunction with the accompanying drawings, wherein:
FIG. 1 is a perspective sectional view of a windshield and dashboard disposed within an interior of a vehicle, and further illustrating an overall layout of a first exemplary augmented reality head-up display (AR-HUD) according to the present disclosure, with a rotating diffusor;
FIG. 2 is a side view of a portion of the augmented reality head-up display (AR-HUD) shown in FIG. 1;
FIG. 3 is a schematic diagram illustrating various main components of the augmented reality head-up display (AR- HUD) shown in FIG. 1;
FIGS. 4A and 4B are schematic diagrams illustrating respective side and top views of certain features of the various main components shown in FIG. 3; FIGS. 5A and 5B are schematic diagrams illustrating respective dimensional sizes and relationships of certain features of the various main components shown in FIG. 3;
FIG. 6A is a schematic diagram illustrating image processing of various imagery which is delivered to at least a picture generating unit (PGU) of the augmented reality head-up display (AR-HUD) shown in FIG. 1;
FIG. 6B is a further schematic diagram illustrating the image processing of various imagery; FIGS. 7A to 7C are schematic diagrams illustrating a second exemplary augmented reality head-up display (AR-HUD) according to the present disclosure, each with a different scan mirror position;
FIG. 8 is a schematic diagram illustrating a third exemplary augmented reality head-up display (AR-HUD) according to the present disclosure, with a rotating disc;
FIGS. 9A to 9C are schematic diagrams illustrating a fourth exemplary augmented reality head-up display (AR-HUD) according to the present disclosure, each with a different rotating mirror position;
FIG. 10 is a schematic diagram illustrating a fifth exemplary augmented reality head-up display (AR-HUD) according to the present disclosure, with another rotating mirror;
FIG. 11 is a schematic diagram illustrating a calibration of individually-divided concave mirror sections;
FIG. 12 is a schematic diagram illustrating a driver adaption of individually-divided concave mirror sections; and FIG. 13 is a schematic diagram illustrating a calibration of movement of an optical component.
DETAILED DESCRIPTION
As required, detailed embodiments of the present disclosure are disclosed herein, however, it is to be understood that the disclosed embodiments are merely exemplary of the disclosure that may be embodied in various and alternative forms. The figures are not necessarily to scale; some features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present disclosure. Additionally, various terms and/or phrases describing or indicating a position or directional reference such as “top”, “bottom”, “front”, “rear”, “forward”, “rearward”, “end”, etc. may relate to one or more particular components as seen generally from a user’s vantage point during use or operation, and such terms and/or phrases are not to be interpreted as limiting, but merely as a representative basis for describing the disclosure to one skilled in the art.
Referring to the figures, an exemplary augmented reality head-up display (AR- HUD) 10 for a vehicle according to the present disclosure is shown and described.
FIGS. 1-6A provide several views collectively illustrating a first exemplary augmented reality head-up display (AR-HUD) 10 for a vehicle in line with the present invention. The augmented reality head-up display (AR-HUD) 10 may include a concave mirror 12, as shown in FIGS. 1-5A. The concave mirror 12 may include distinct, individually-divided concave sections 14, 16, 18, each including a reflective mirror surface. The individually-divided concave sections 14, 16, 18 of the concave mirror 12 may be separated by a plurality of small gaps 20, 22 formed on the concave mirror 12. In this example, the concave mirror 12 includes three distinct, individually-divided concave sections 14, 16, 18, however, it is to be understood that the concave mirror 12 may include any suitable quantity of distinct, individually-divided concave sections. As shown throughout FIGS. 1-6A, the augmented reality head-up display (AR-HUD) 10 may further include an optical component 24, comprising a diffuser 24a, which may be in the form of a galvanometer mirror or other similar optical component which may include a material capable of diffusing and reflecting imagery projected thereon. The diffuser 24a is spaced apart from the concave mirror 12.
12. As shown in at least FIGS. 3, 4B and 5B, the diffuser 24a may be configured to be moved into a plurality of different positions. As a non-limiting example, the diffuser 24a may be part of a galvanometer scanner 26, as shown in FIG. 3. The galvanometer scanner 26 may include an electrically-driven high-speed rotary actuator operatively connected to the diffuser 24a, such as by way of a rotatable shaft 28, and may be configured to rotatably move the diffuser 24a into the plurality of different positions at a high speed. As shown in FIGS. 3 and 4B, each of the plurality of different positions into which the diffuser 24a may be moved may correspond to a different angle at which the diffuser 24a is configured to be oriented. Similarly, as shown in FIGS. 3 and 4B, each of the plurality of different positions into which the diffuser 24a may be moved may correspond to a respective one of the individually-divided concave sections 14, 16, 18 of the concave mirror 12, as will be further described herein.
The augmented reality head-up display (AR-HUD) 10 may further include a picture generating unit (PGU) 30 spaced apart from the concave mirror 12 and the diffuser 24a. As shown in FIG. 1, at least the concave mirror 12, the diffuser 24a and the picture generating unit (PGU) 30 may each be configured to be mounted within a dashboard 32 disposed within an interior 36 of a vehicle. The picture generating unit (PGU) 30 may be a projector configured to project imagery by way of a light path. As a non-limiting example, the projector may be a high-speed projector. As a further non-limiting example, the projector may be a laser projector, such as a high-speed laser projector. As a further non-limiting example, the projector may be a digital light processing (DLP) projector which may include a digital micromirror device (DMD). The projector may be configured to project extended graphics array (XGA) resolution (e.g. 1024 x 768), wide extended graphics array (WXGA) resolution (e.g. 1280 x 800), high-definition (HD) resolution (e.g. 1920 x 1080) or any other suitable resolution as may be understood by those skilled in the art.
As shown in FIG. 6A, the diffuser 24a (e.g. by way of the galvanometer scanner 26) and the picture generating unit (PGU) 30 may be in operative communication with a PC/embedded system 38 in the vehicle. The PC/embedded system 38 in the vehicle may receive or otherwise process unified, undivided imagery 40, such as unified, undivided video imagery having a first video frame rate. The first video frame rate may be, for example, 60 frames per second (FPS). The unified, undivided imagery 40 may be imagery (e.g. virtual information) which is intended to ultimately be reflected onto a windshield 34 of the vehicle to assist a driver, as will be further described herein.

The unified, undivided imagery 40 may undergo image processing 41 (e.g. by software) on the PC/embedded system 38 so as to be divided into a divided imagery set 42. The divided imagery set 42 may include individual sections 44, 46, 48 of imagery. The individual sections 44, 46, 48 of imagery taken from the divided imagery set 42 may collectively represent the unified, undivided imagery 40, which is intended to ultimately be reflected onto the windshield 34 of the vehicle to assist the driver. The individual sections 44, 46, 48 of imagery taken from the divided imagery set 42 may each be individual sections of video imagery stored in a plurality of respective buffers A1, B2, C3, each still having the first video frame rate, which may be 60 frames per second (FPS) as previously described herein. Each of the individual sections 44, 46, 48 of imagery taken from the divided imagery set 42 is configured to be sent from the plurality of respective buffers A1, B2, C3 to the picture generating unit (PGU) 30. Before being sent to the picture generating unit (PGU) 30, each of the individual sections 44, 46, 48 of imagery taken from the divided imagery set 42 may undergo sequential superposition, such as by hardware (e.g. a field-programmable gate array (FPGA)). As a result of undergoing sequential superposition, the individual sections 44, 46, 48 of imagery taken from the divided imagery set 42, which may each be individual sections of video imagery as previously described herein, may subsequently have a multiplied second video frame rate which may be, for example, 180 frames per second (FPS).

As further shown in FIG. 6A, the picture generating unit (PGU) 30 may be in operative communication with the PC/embedded system 38 in the vehicle so as to be further configured to receive the individual sections 44, 46, 48 of imagery, taken from the divided imagery set 42, in a sequential order from the plurality of respective buffers A1, B2, C3. The individual sections 44, 46, 48 of imagery taken from the divided imagery set 42 may be configured such that a total quantity of the individual sections 44, 46, 48 of imagery taken from the divided imagery set 42 is equal to a total quantity of the individually-divided concave sections 14, 16, 18 of the concave mirror 12. As such, in this example, the total quantity of the individual sections 44, 46, 48 of imagery taken from the divided imagery set 42 is three; however, the total quantity may vary as the total quantity of the individually-divided concave sections 14, 16, 18 of the concave mirror 12 varies.
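The division of the unified, undivided imagery 40 into buffered sections and their sequential superposition can be pictured with a short code sketch. The following Python snippet is purely illustrative and not part of the disclosed hardware or software implementation; the function names (split_frame, sequential_superposition) are invented for this sketch, and the assumption that the imagery is split into equal vertical strips is likewise only an example.

import numpy as np

N_SECTIONS = 3          # matches the three concave mirror sections 14, 16, 18
INPUT_FPS = 60          # first video frame rate of the unified imagery 40
OUTPUT_FPS = N_SECTIONS * INPUT_FPS   # multiplied second video frame rate (180 FPS)

def split_frame(frame, n=N_SECTIONS):
    # Divide one unified, undivided frame into n individual sections.
    # Here the frame is cut into equal vertical strips; the actual partition
    # geometry would depend on the mirror layout.
    return list(np.array_split(frame, n, axis=1))

def sequential_superposition(frames_60fps):
    # Buffer the sections (buffers A1, B2, C3) and interleave them into a
    # single stream at three times the input rate: section 44 of frame k,
    # then section 46, then section 48, and so on.
    buffers = [[] for _ in range(N_SECTIONS)]
    for frame in frames_60fps:
        for i, section in enumerate(split_frame(frame)):
            buffers[i].append(section)
    stream = []
    for k in range(len(frames_60fps)):
        for i in range(N_SECTIONS):
            stream.append((i, buffers[i][k]))   # (mirror section index, image section)
    return stream

# Example: ten 768 x 1024 RGB frames at 60 FPS become thirty projector frames at 180 FPS.
frames = [np.zeros((768, 1024, 3), dtype=np.uint8) for _ in range(10)]
projector_stream = sequential_superposition(frames)
assert len(projector_stream) == len(frames) * N_SECTIONS

In this sketch the software buffers play the role of the buffers A1, B2, C3 described above, and the interleaving loop stands in for the hardware-based sequential superposition.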
As further shown in FIG. 6A, the picture generating unit (PGU) 30 may be further configured to project the received individual sections 44, 46, 48 of imagery onto the diffuser 24a (e.g. by way of a projected light path), in a sequential order synchronized with the plurality of different positions into which the diffuser 24a is configured to be moved (as shown in FIGS. 4B and 5B), such that the diffuser 24a moves into the plurality of different positions to diffuse and reflect the respective individual sections 44, 46, 48 of imagery projected thereon onto the respective individually-divided concave sections 14, 16, 18 of the concave mirror 12. As such, the respective individually-divided concave sections 14, 16, 18 of the concave mirror 12 may further reflect the respective individual sections 44, 46, 48 of imagery generally upwardly and away from the concave mirror 12, through an opening in the dashboard 32, and onto the windshield 34 of the vehicle (as shown in FIG. 1).
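A minimal control-loop sketch of this synchronization is given below, again purely for illustration: the GalvoStub and ProjectorStub classes, the angle values and the timing constant are placeholders invented for this example and are not taken from the disclosure.

import time

SECTION_ANGLES_DEG = {0: +15.0, 1: 0.0, 2: -15.0}   # hypothetical diffuser angles, one per mirror section
SECTION_PERIOD_S = 1.0 / 180.0                      # one section per 180 FPS time slot

class GalvoStub:
    # Stand-in for the galvanometer scanner 26 (illustrative only).
    def move_to(self, angle_deg):
        pass    # real hardware would rotate the diffuser 24a here

class ProjectorStub:
    # Stand-in for the picture generating unit (PGU) 30 (illustrative only).
    def project(self, image_section):
        pass    # real hardware would flash the projected frame here

def run_synchronized(stream, galvo, pgu):
    # Step the diffuser to the position belonging to each image section and
    # then project that section, keeping projector and diffuser in lock-step.
    for section_index, image_section in stream:
        t0 = time.perf_counter()
        galvo.move_to(SECTION_ANGLES_DEG[section_index])
        pgu.project(image_section)
        remaining = SECTION_PERIOD_S - (time.perf_counter() - t0)
        if remaining > 0:
            time.sleep(remaining)   # pad the slot so each section occupies one 180 FPS slot

run_synchronized([(0, None), (1, None), (2, None)], GalvoStub(), ProjectorStub())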
As previously described herein, the individual sections 44, 46, 48 of imagery, which the picture generating unit (PGU) 30 is configured to receive and project, may be individual sections of video imagery. As such, the respective individual sections 44, 46, 48 of imagery reflected onto the windshield 34 of the vehicle may be respective individual sections of the video imagery. The respective individual sections of the video imagery reflected onto the windshield 34 of the vehicle may each be displayed at a video frame rate of 3 times 60 and, thus, 180 frames per second (FPS) or greater. With such a video frame rate, the respective individual sections 44, 46, 48 of imagery reflected onto the windshield 34 of the vehicle may advantageously appear as unified, undivided imagery, while avoiding flickering, as viewed by the driver of the vehicle.
The respective individual sections 44, 46, 48 of imagery reflected onto the windshield 34 of the vehicle may include virtual information (i.e. augmentations) configured to be viewed by the driver of the vehicle to assist the driver. As non-limiting examples, the virtual information may be generated from, or may be determined by, an advanced driver-assistance system (ADAS) of the vehicle and/or a navigation system of the vehicle. The virtual information may be reflected onto the windshield 34 of the vehicle such that the virtual information is capable of indicating or marking a boundary or object detected ahead of the vehicle, or a distance to the boundary or object detected ahead of the vehicle. In other words, the virtual information reflected onto the windshield 34 of the vehicle may appear to be part of the driving experience itself, thus further enhancing the comfort, safety and overall driving experience for the driver of the vehicle.
As a result of at least the relatively close positioning and overall compact layout of at least the concave mirror 12, the diffuser 24a and the picture generating unit (PGU) 30 (as shown in at least FIG. 5A), the augmented reality head-up display (AR-HUD) 10 according to the present disclosure advantageously occupies substantially less packaging space within the dashboard 32 disposed within the interior 36 of the vehicle. As a non-limiting example, the augmented reality head-up display (AR-HUD) 10 according to the present disclosure may be capable of occupying approximately 6 liters of total packaging space (i.e. volume), as compared to the approximately 18 liters of total packaging space which other known augmented reality head-up displays (AR-HUDs) may occupy within a vehicle dashboard. Furthermore, as a result of the picture generating unit (PGU) 30 and the diffuser 24a being configured and synchronized to operate at relatively high speeds, the augmented reality head-up display (AR-HUD) 10 is advantageously capable of projecting and reflecting high-quality and/or high-resolution imagery (e.g. various virtual information) onto the windshield 34 of the vehicle to assist the driver.
FIG. 6B is a further schematic diagram illustrating the image processing of various imagery 40 from a camera system of the vehicle (not shown), in particular a camera system of an advanced driver-assistance system (ADAS), providing a standard video input of 60 Hz, corresponding to 60 frames per second (FPS), to the embedded system 38, which may be comprised by the electronic control unit (ECU) of the vehicle. If the embedded system 38, or rather the image processing 41, splits the imagery 40 into three individual sections 44, 46, 48, the video frame rate is increased to three times 60 Hz to provide the projection rate.
While a splitting into three has been described with respect to the exemplary augmented reality head-up display (AR-HUD) 10, both for the imagery 40 and for the concave mirror 12, it is to be understood that another splitting number n, with n being a natural number of at least 2, is also covered by the present invention. As frame rates of up to 1,000 frames per second (FPS) are possible today, n could be increased to, e.g., 16.
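As a rough illustration of how the splitting number n is bounded by the attainable projection rate, the following sketch (with an invented helper name) computes the largest n for which each section can still be refreshed at 60 FPS:

def max_splitting_number(projector_max_fps, per_section_fps=60):
    # Largest n for which each of the n image sections can still be refreshed
    # at the full per-section frame rate (here 60 FPS).
    return projector_max_fps // per_section_fps

print(max_splitting_number(180))    # -> 3, the configuration described above
print(max_splitting_number(1000))   # -> 16, the enhanced splitting mentioned in the text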
As further shown in FIG. 6B, the embedded system 38, or rather the electronic control unit (ECU) of the vehicle, sends the series of n split imagery sections to the picture generating unit (PGU) 30, in particular in the form of a projector operating at n times 60 Hz. In parallel, or rather synchronized therewith, the embedded system 38, or rather the electronic control unit (ECU) of the vehicle, sends control signals for the movement of the optical component 24, e.g. to the galvanometer scanner 26 of the diffuser 24a.
FIGS. 7A-7C provide several views collectively illustrating a second exemplary augmented reality head-up display (AR-HUD) 10 for a vehicle in line with the present invention. The augmented reality head-up display (AR-HUD) 10 includes a concave mirror 12, with three distinct, individually-divided concave sections 14, 16, 18, and a picture generating unit (PGU) 30 spaced apart from the concave mirror 12 and configured to be mounted within a dashboard disposed within a vehicle, in line with the first embodiment. In contrast to the first embodiment, however, as shown throughout FIGS. 7A-7C, the augmented reality head-up display (AR-HUD) 10 includes an optical component 24 comprising a scan mirror 24b acting as a galvanometer scanner and cooperating with a diffusor 50 having three distinct, individually-divided diffusor sections 52, 54, 56. The scan mirror 24b is substantially flat and oscillates to reflect imagery 40 from the picture generating unit (PGU) 30 onto the diffusor 50 such that, as described in particular with respect to FIG. 6B, each one of the three individual sections 44, 46, 48 of the imagery 40 passes through one of the three distinct, individually-divided diffusor sections 52, 54, 56 to reach a respective one of the three distinct, individually-divided concave sections 14, 16, 18 of the concave mirror 12. This is illustrated in FIGS. 7A-7C with three stops of the oscillation as follows:
(1) The scan mirror 24b is in a left position, with a rotation angle of +15°, to direct imagery through the left diffusor section 52 onto the left mirror section 14 for reflection on the windshield (not shown), as in FIG. 7B.
(2) The scan mirror 24b is in a middle position, with a rotation angle of 0°, to direct imagery through the middle diffusor section 54 onto the middle mirror section 16 for reflection on the windshield (not shown), as in FIG. 7A.
(3) The scan mirror 24b is in a right position, with a rotation angle of -15°, to direct imagery through the right diffusor section 56 onto the right mirror section 18 for reflection on the windshield (not shown), as in FIG. 7C.
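The three stops just listed can be expressed as a repeating sequence of angle commands; the sketch below is illustrative only and simply reuses the +15°, 0° and -15° stop angles named above.

from itertools import cycle

# Stops of the oscillating scan mirror 24b: left (+15 deg), middle (0 deg), right (-15 deg),
# addressing diffusor sections 52, 54, 56 and mirror sections 14, 16, 18 in turn.
SCAN_MIRROR_STOPS_DEG = (+15.0, 0.0, -15.0)

def stop_sequence(n_cycles):
    # Yield (stop index, angle) pairs for the requested number of full cycles,
    # i.e. stop 1 -> stop 2 -> stop 3 -> stop 1 -> ...
    stops = cycle(enumerate(SCAN_MIRROR_STOPS_DEG, start=1))
    for _ in range(n_cycles * len(SCAN_MIRROR_STOPS_DEG)):
        yield next(stops)

for stop_index, angle in stop_sequence(2):
    print(f"stop {stop_index}: {angle:+.1f} deg")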
Thus, the scan mirror 24b of the second embodiment may be a three-stop oscillating mirror, which might oscillate from stop 1 to stop 2, from stop 2 to stop 3, and from stop 3 back to stop 1 to start a new cycle. The number of stops depends on the selected splitting number n as described above and can vary. Also, the cycle in which the stops are passed, the rotation angle of the scan mirror 24b at each stop, and the refraction angle 58 of the respective diffusor sections 52, 54 and 56 can be adapted to certain embodiments.

FIG. 8 illustrates a third exemplary augmented reality head-up display (AR-HUD) 10 for a vehicle in line with the present invention, with an optical component 24 comprising a rotating disc 24c carrying three prisms 72, 74, 76 and cooperating with a diffusor 50 having three distinct, individually-divided diffusor sections 52, 54, 56. The optical prisms 72, 74, 76 are designed to deflect light beams from the picture generating unit (PGU) 30 to left, middle and right positions and, thus, to the three diffusor sections 52, 54, 56, in a sequence defined by the rotation of the rotating disc 24c, in order to reach the three mirror sections 14, 16, 18, respectively, and thereby provide the head-up display on the windshield 34. For that purpose, the disc 24c stops every 120°.
Again, the number n of imagery sections 44, 46, 48, prisms 72, 74, 76, stops of the rotating disc 24c, diffusor sections 52, 54, 56 and mirror sections 14, 16, 18 can be varied, but must be matched to each other.
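This matching requirement, together with the evenly spaced stops (360°/n, i.e. every 120° for n = 3), can be summarized in a short sketch; the helper names are invented for this illustration.

def disc_stop_angles(n):
    # Evenly spaced stop positions of the rotating disc 24c for n prisms,
    # e.g. n = 3 gives stops every 120 degrees.
    if n < 2:
        raise ValueError("the splitting number n must be at least 2")
    return [k * 360.0 / n for k in range(n)]

def check_matched(n_imagery, n_prisms, n_diffusor, n_mirror):
    # All component counts must equal the splitting number n.
    if len({n_imagery, n_prisms, n_diffusor, n_mirror}) != 1:
        raise ValueError("imagery sections, prisms, diffusor and mirror sections must match")

check_matched(3, 3, 3, 3)
print(disc_stop_angles(3))   # -> [0.0, 120.0, 240.0]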
FIGS. 9A-9C provide several views collectively illustrating a fourth exemplary augmented reality head-up display (AR-HUD) 10 for a vehicle in line with the present invention, with an optical component 24 comprising an alternative rotating disc 24c carrying three mirror segments 62, 64, 66 and cooperating with a diffusor 50 having three distinct, individually-divided diffusor sections 52, 54, 56, functioning in a manner analogous to the third embodiment. The three stops of the rotating disc 24c for reflecting light beams from the picture generating unit (PGU) 30 to the left, middle and right positions are shown in FIGS. 9A, 9B and 9C, respectively.
For a stable image on the windshield 34, the three mirror segments 62, 64, 66 may move in an axial direction in coordination with their angular orientation. This also allows the rotation of the disc 24c to be continuous in other embodiments, i.e. the disc 24c does not need any stop in this case.
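One conceivable way to picture this coordination, given only as a sketch under the assumption of a calibrated lookup table (the disclosure does not prescribe how the coordination is implemented, and the offset values below are placeholders), is to map the disc's angular position to an axial offset of the active mirror segment:

import bisect

# Hypothetical calibration table: disc angle (deg) -> axial offset (mm) of the
# active mirror segment 62/64/66 that keeps the projected image stable.
AXIAL_CALIBRATION = [(0.0, 0.00), (60.0, 0.35), (120.0, 0.00),
                     (180.0, 0.35), (240.0, 0.00), (300.0, 0.35), (360.0, 0.00)]

def axial_offset(angle_deg):
    # Linearly interpolate the calibrated axial offset for a disc angle, which
    # would allow the disc 24c to rotate continuously without discrete stops.
    angle = angle_deg % 360.0
    angles = [a for a, _ in AXIAL_CALIBRATION]
    i = bisect.bisect_right(angles, angle)
    a0, z0 = AXIAL_CALIBRATION[i - 1]
    a1, z1 = AXIAL_CALIBRATION[min(i, len(AXIAL_CALIBRATION) - 1)]
    return z0 if a1 == a0 else z0 + (z1 - z0) * (angle - a0) / (a1 - a0)

print(axial_offset(90.0))   # placeholder value interpolated between the 60 deg and 120 deg entries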
FIG. 10 illustrates a fifth exemplary augmented reality head-up display (AR-HUD)
10 for a vehicle in line with the present invention, with an optical component 24 comprising a rotating mirror 24d carrying three mirror segments 82, 84, 86 and cooperating with a diffusor 50 having three distinct, individually-divided diffusor sections 52, 54, 56. The fifth embodiment uses a rotational movement via a motor 25 with only two stops per rotation cycle. The first stop takes place when the mirror front side with a first mirror segment 82 is pointing to the middle mirror section 16 for projecting the centre imagery section 46. Then the mirror 24d turns by 180° and the remaining two imagery sections 44, 48 are projected onto the two mirror sections 14, 18 via the two mirror segments 84, 86 on the rear side of the mirror 24d.
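The two-stop cycle of the fifth embodiment can be summarized as a simple mapping; the table below is illustrative only.

# Two stops per rotation cycle of the mirror 24d driven by the motor 25 (illustrative only).
TWO_STOP_CYCLE = [
    # (motor angle in deg, mirror segments in use, imagery sections projected)
    (0.0,   ("segment 82",),              ("section 46 (centre)",)),
    (180.0, ("segment 84", "segment 86"), ("section 44 (left)", "section 48 (right)")),
]

for angle, segments, sections in TWO_STOP_CYCLE:
    print(f"stop at {angle:5.1f} deg: {segments} project {sections}")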
FIG. 11 is a schematic diagram illustrating a calibration of the individually-divided concave mirror sections 14, 16, 18 mounted on a substrate 13 to provide the concave mirror 12, and the corresponding method may be applied in a camera imager chip calibration.
Each one of the individually-divided concave mirror sections 14, 16, 18 may be produced individually, which saves production costs as smaller parts are to be manufactured. The substrate 13 may be mounted on an optical jig (not shown) for calibration, where the deviation from any desired position can be measured. UV-sensitive glue may be applied on the substrate 13, and one of the mirror sections 14, 16, 18, see section 18 in FIG. 11, is applied on the glued part of the substrate 13. As said section 18 can be moved, e.g. with a 5-axis robot (not shown), in the directions x, y and z as well as about the pitch axis and the roll axis, it can be adjusted until the alignment measured in the jig is correct, such that UV light can be applied to harden the UV-sensitive glue. This process can be repeated for the remaining sections 16, 14, until all individually-divided concave mirror sections 14, 16, 18 are aligned.
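The iterative place-measure-cure procedure can be pictured as a closed loop; the robot and jig interfaces in the following sketch are stand-ins invented for this illustration, and the tolerance value is a placeholder.

from dataclasses import dataclass

@dataclass
class Pose:
    # 5-axis pose of a mirror section: translations in x, y, z plus pitch and roll.
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    pitch: float = 0.0
    roll: float = 0.0

TOLERANCE = 0.01   # placeholder alignment tolerance reported by the optical jig

def align_section(measure_deviation, move_robot, cure_uv, max_iterations=50):
    # measure_deviation() -> Pose : deviation reported by the optical jig (stand-in)
    # move_robot(correction)      : 5-axis robot correction move (stand-in)
    # cure_uv()                   : apply UV light to harden the glue (stand-in)
    for _ in range(max_iterations):
        d = measure_deviation()
        if all(abs(v) <= TOLERANCE for v in (d.x, d.y, d.z, d.pitch, d.roll)):
            cure_uv()
            return True
        move_robot(Pose(-d.x, -d.y, -d.z, -d.pitch, -d.roll))   # move against the measured deviation
    return False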
FIG. 12 is a schematic diagram illustrating a driver adaptation of the individually-divided concave mirror sections 14, 16, 18 by tilting the substrate 13 and/or the picture generating unit (PGU) 30.
To adjust for the driver body height, the substrate 13 carrying the individually-divided concave mirror sections 14, 16, 18 of the concave mirror 12 can be rotated around an axis in the Y-direction of the vehicle (not shown). The rotation can be controlled manually with a pair of electrical switches 92 (up/down) or via an optical sensor 94 that monitors the eye position of the driver. In both cases, the rotation moves the position of the HUD eye-box in the vertical direction (up or down).

FIG. 13 is a schematic diagram illustrating a calibration of the movement of an optical component 24. Said calibration is explained with respect to the fifth embodiment shown in FIG. 10 and the additional installation of a photo diode array 96 at an upper edge of the middle mirror section 16. The photo diode array 96 is connected to the electronic control unit (ECU) which, as explained with respect to FIG. 6B, controls both the picture generating unit (PGU) 30 and the motor 25 rotating the mirror 24d. The picture generating unit (PGU) 30 adds a special signal to the imagery 40 to be projected, e.g. in the form of a distinctive flashing in a distinctive colour (shown as a dashed line in FIG. 13), resulting in a marker 102 within the displayed content 100. For a calibrated optical component 24, with its mirror 24d and motor 25, said signal is detected in the middle of the photo diode array 96. In case of a misalignment or deflection, another diode, not arranged in the middle of the photo diode array 96, sees the signal, and the electronic control unit (ECU) receives the signal from said other diode to calculate the deflection as well as modified control signals for the motor 25. The ECU may additionally receive vehicle data (e.g. via a CAN bus connection) and calculate predictive corrections to the image in order to maintain the alignment of the AR information with the outside scenery. It may be beneficial to use Artificial Intelligence methods to calculate these corrections, because a multitude of data can be taken into account for this improvement.

While one or more exemplary embodiments are described above, it is not intended that these embodiments describe all possible forms of the disclosure. Rather, the words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the disclosure. Additionally, the features of various implementing embodiments may be combined to form further embodiments of the disclosure.
With regard to the processes, systems, methods, heuristics, etc., described herein, it should be understood that, although the steps of such processes, etc., have been described as occurring according to a certain ordered sequence, such processes could be practiced with the described steps performed in an order other than the order described herein. It should be further understood that certain steps could be performed simultaneously, that other steps could be added, or that certain steps described herein could be omitted. In other words, the descriptions of processes described above are provided for the purpose of illustrating certain embodiments, and should in no way be construed so as to limit the claims.

As used in this specification and claims, the terms "for example" ("e.g."), "for instance", "such as", and "like", and the verbs "comprising", "having", "including", and their other verb forms, when used in conjunction with a listing of one or more carriers or other items, are each to be construed as open-ended, meaning that the listing is not to be considered as excluding other, additional carriers or items. Other terms are to be construed using their broadest reasonable meaning unless they are used in a context that requires a different interpretation.
REFERENCE SIGNS
10 augmented reality head-up display (AR-HUD)
12 concave mirror
13 substrate
14, 16, 18 individually-divided concave sections
20, 22 small gaps
24 optical component
24a diffusor
24b scan mirror
24c rotating disc
24d rotating mirror
25 motor
26 galvanometer scanner
28 rotatable shaft
30 picture generating unit (PGU)
32 dashboard
34 windshield
36 interior of vehicle
38 PC/embedded system
40 unified, undivided imagery
41 image processing
42 divided imagery set
44, 46, 48 individual sections
50 diffusor
52, 54, 56 diffusor sections
58 refraction angle of diffusor
62, 64, 66 rotating mirror segments
72, 74, 76 prisms
82, 84, 86 rotating mirror segments
92 electrical control switches
94 optical sensor
96 photo diode array
100 content
102 marker

Claims

1. An augmented reality head-up display (10) for a vehicle, the augmented reality head-up display (10) comprising:
a concave mirror (12) including n individually-divided concave sections (14, 16, 18), with n being a natural number above 1;
an optical component (24, 24a, 24b, 24c, 24d), which is configured to diffuse and/or reflect imagery (40) projected thereon, is spaced apart from the concave mirror (12) and is configured to be moved into a plurality of or at least n different positions, with n positions being preferred, relative to the concave mirror (12); and a picture generating unit (30) spaced apart from the concave mirror (12) and the optical component (24, 24a, 24b, 24c, 24d);
wherein the concave mirror (12), the optical component (24, 24a, 24b, 24c, 24d) and the picture generating unit (30) are each configured to be mounted within a dashboard (32) disposed within an interior (36) of the vehicle;
wherein the picture generating unit (30) is further configured to receive n individual sections (44, 46, 48) of imagery (40) in a sequential order, the individual sections (44, 46, 48) of imagery (40) taken from a divided imagery set (42) such that a total quantity of the individual sections (44, 46, 48) of imagery (40) taken from the divided imagery set (42) is equal to a total quantity of the individually-divided concave sections (14, 16, 18) of the concave mirror (12); and
wherein the picture generating unit (30) is further configured to project the received individual sections (44, 46, 48) of imagery (40) onto the optical component (24, 24a, 24b, 24c, 24d), in a sequential order synchronized with the at least n different positions into which the optical component (24, 24a, 24b, 24c, 24d) is configured to be moved relative to the concave mirror (12), such that the optical component (24, 24a, 24b, 24c, 24d) moves into the different positions relative to the concave mirror (12) to diffuse and/or reflect the n respective individual sections (44, 46, 48) of imagery (40) projected thereon onto the n respective individually-divided concave sections (14, 16, 18) of the concave mirror (12), thereby further reflecting the respective individual sections (44, 46, 48) of imagery (40), in particular generally upwardly, away from the concave mirror (12), through an opening in the dashboard (32), and onto a windshield (34) of the vehicle.
2. The augmented reality head-up display (10) according to claim 1, wherein the n individually-divided concave sections (14, 16, 18) of the concave mirror (12) are separated by a plurality of gaps (20, 22) formed on the concave mirror (12) and/or are mounted to a substrate (13), preferably via a UV sensitive glue.
3. The augmented reality head-up display (10) according to claim 2, wherein
• each one of the individually-divided concave mirror sections (14, 16, 18) is produced individually and/or mounted to the substrate (13) individually, and/or
• each one of the individually-divided concave mirror sections (14, 16, 18) is configured to be moved, in particular with a 5-axis robot and/or in directions x, y, z around a pitch axis and around a roll axis, relative to the substrate (13) for calibration during manufacture of the concave mirror (12), with preferably the substrate (13) being mounted on an optical jig.
4. The augmented reality head-up display (10) according to any one of the preceding claims, wherein
at least one of the concave mirror (12) or the substrate (13) and the picture generating unit (PGU) (30) is configured to be moved, in particular rotated relative to the windshield (34) and/or around an axis in Y-direction of the vehicle, for adjustment to a driver of the vehicle, in particular to the driver body height, with preferably the rotating being manually controlled and/or via an optical sensor (94) that monitors the eye position of the driver.
5. The augmented reality head-up display (10) according to any one of the preceding claims, wherein
the optical component (24, 24a, 24b, 24c, 24d) comprises
• a diffuser (24a),
• a scan mirror (24b), preferably being flat,
• a rotating disc (24c) carrying n optical prisms (72, 74, 76) or n mirror segments (62, 64, 66), or
• a rotating mirror (24d), in particular providing n mirror segments (82, 84,
86),
and is configured to be moved, in particular oscillated or rotated, preferably with n stops per cycle or sequence and/or relative to a diffusor (50) provided with n diffusor sections (52, 54, 56).
6. The augmented reality head-up display (AR-HUD) for a vehicle according to claim 5, wherein
the optical component (24, 24a, 24b, 24c, 24d) is configured to be moved via an actuator or motor (25) or a galvanometer scanner (26), with preferably
the galvanometer scanner (26) including a high-speed rotary actuator, and/or the actuator or motor (25) or the galvanometer scanner (26) is operatively connected to the optical component (24, 24a, 24b, 24c, 24d) and is configured to move the optical component (24, 24a, 24b, 24c, 24d) into the different positions.
7. The augmented reality head-up display (10) according to any one of the preceding claims, wherein
each of the n different positions into which the optical component (24, 24a, 24b, 24c) is configured to be moved, corresponds
to a respective one of the n individually-divided concave sections (14, 16, 18) of the concave mirror (12) and/or to a different angle at which the optical component (24, 24a, 24b, 24c) is configured to be oriented.
8. The augmented reality head-up display (10) according to any one of the preceding claims, wherein
the individual sections (44, 46, 48) of imagery (40), which the picture generating unit (30) is configured to receive and project, comprise individual sections of video imagery.
9. The augmented reality head-up display (10) according to any one of the preceding claims, wherein
the picture generating unit (30) is a projector, in particular a high-speed projector, a laser projector or a digital light processing (DLP) projector, with preferably
• the projector being configured to project extended graphics array (XGA) resolution or wide extended graphics array (WXGA) resolution, and/or
• the projector including a digital micromirror device (DMD).
10. The augmented reality head-up display (10) according to any one of the preceding claims, wherein
• the respective individual sections (44, 46, 48) of imagery (40) reflected onto the windshield (34) of the vehicle appear as unified, undivided content (100) providing imagery as viewed by a driver of the vehicle,
• the respective individual sections (44, 46, 48) of imagery (40) reflected onto the windshield (34) of the vehicle comprise respective individual sections of video imagery,
• the respective individual sections (44, 46, 48) of video imagery (40) reflected onto the windshield (34) of the vehicle are each displayed at a video frame rate of n times 60 frames per second (FPS) or greater, and/or
• the respective individual sections (44, 46, 48) of imagery (40) reflected onto the windshield (34) of the vehicle include virtual information configured to be viewed by a driver of the vehicle to assist the driver and/or at least one marker (102) for adjustment.
11. The augmented reality head-up display (10) according to claim 10, wherein • the virtual information is generated from, or determined by, an advanced driver-assistance system (ADAS) of the vehicle and/or a navigation system of the vehicle, and/or
• the virtual information is reflected onto the windshield (34) of the vehicle such that the virtual information is capable of indicating or marking a boundary or object detected ahead of the vehicle, or a distance to the boundary or object detected ahead of the vehicle.
12. The augmented reality head-up display (10) according to claim 10 or 11, wherein
an optical sensor device, in particular comprising a photo diode array (96) and/or mounted at an upper edge of the concave mirror (12), preferably in the middle of a middle mirror section (16), is configured to detect a calibration signal added to the imagery (40) by the picture generating unit (PGU) (30) and to send a signal characteristic for a deflection, preferably to the electronic control unit (ECU) of the vehicle, for controlling the movement and/or stop of the optical component (24, 24a, 24b, 24c, 24d) and/or for controlling the actuator or motor (25) or the galvanometer scanner (26).
13. A vehicle with a dashboard (32), a windshield (34), an electronic control unit
(ECU) and an augmented reality head-up display (10) according to any one of the preceding claims.
14. The vehicle according to claim 13, wherein
the electronic control unit (ECU) calculates on the basis of vehicle data, which are preferably received via a CAN bus connection, predictive corrections to an image in order to maintain the alignment of AR information with the outside scenery, with the calculation preferably using Artificial Intelligence.