GB2613003A - System and method - Google Patents

System and method

Info

Publication number
GB2613003A
GB2613003A (from application GB2116706.9A / GB202116706A)
Authority
GB
United Kingdom
Prior art keywords
graphic
feature
display unit
vehicle
display
Prior art date
Legal status
Pending
Application number
GB2116706.9A
Other versions
GB202116706D0 (en)
Inventor
Skorokhod Aleksey
Marchuk Artemy
Bondar Oleg
Shtok Maxim
Current Assignee
Wayray AG
Original Assignee
Wayray AG
Priority date
Filing date
Publication date
Application filed by Wayray AG
Priority to GB2116706.9A
Publication of GB202116706D0
Priority to PCT/EP2022/082431 (WO2023089105A1)
Publication of GB2613003A
Legal status: Pending


Classifications

    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements using visual output, e.g. blinking lights or matrix displays
    • B60K35/211 Output arrangements using visual output producing three-dimensional [3D] effects, e.g. stereoscopic images
    • B60K35/23 Head-up displays [HUD]
    • B60K35/28 Output arrangements characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • B60K2360/168 Type of output information: target or limit values
    • B60K2360/177 Type of output information: augmented reality
    • B60K2360/178 Type of output information: warnings
    • B60K2360/179 Type of output information: distances to obstacles or vehicles
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0138 Head-up displays comprising image capture systems, e.g. camera
    • G02B2027/014 Head-up displays comprising information/image processing systems
    • G02B2027/0141 Head-up displays characterised by the informative content of the display
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The display of information to a user of a vehicle using augmented reality (AR) includes a processor. The processor interacts with a display unit for displaying information using AR in a field of view associated with the system S10010. The processor also interacts with a detector for detecting a feature in an environment external to the vehicle S10010. The processor processes an AR graphic for display at the display unit, the AR graphic being representative of the time-to-collision with the feature S10020, and causes the display unit to display the AR graphic S10030. The processor may cause the display unit to display the AR graphic to augment the feature. The AR graphic may be processed to be proportional to the time-to-collision with the feature. An AR graphic of a first kind may be processed if the feature is in the field of view and the time-to-collision with the feature is greater than a first threshold time-to-collision. An AR graphic of a second/third kind may be processed if the time-to-collision with the feature is also greater/less than a second threshold time-to-collision. The AR graphic of the second/third kind may be a three-dimensional/two-dimensional AR graphic and representative of a moderate/high risk factor.

Description

SYSTEM AND METHOD
FIELD
The present invention relates to a system for displaying information to a user of a vehicle using augmented reality (AR), a head-up display (HUD) system, a method of displaying information using AR and a computer, computer program and non-transient computer-readable storage medium.
BACKGROUND
Known head-up display (HUD) systems used in vehicles such as cars and airplanes provide a vehicle user with computer-generated virtual graphics that augment the user's visual sensory perception of real-world features, or objects, viewable to the user during vehicle operation. The HUD systems may be configured to generate virtual graphics and project the virtual graphics as images or text through a windshield or other display screen so that the user can view the information while holding their head up and without taking their attention away from the real-world features viewable during vehicle operation. In a known system, the location of a feature such as another vehicle is determined, and an AR graphic is displayed to the user to alert the user to the presence of the other vehicle. The AR graphic is in the form of a line which is displayed to the user on the HUD in the proximity of the other vehicle. This simple augmentation might be useful in terms of safety, awareness, or general context.
However, there are a number of problems with this approach. Firstly, the AR graphic displayed to the user can be difficult to distinguish from the real-world view and may somewhat "blend in" to the background. Secondly, the user may be inclined to disregard the AR graphic if the user considers that the AR graphic is not indicating the feature with high accuracy or if the AR graphic is not providing the user with useful information about the feature. This ultimately causes the user to distrust the HUD system. In such a case, the AR graphics displayed to the user are an unwanted distraction to the user of the vehicle. This is a problem for general use of, and comfortable engagement with, the system, but it can also present a significant safety issue, in terms of distracting the driver, or causing the driver to concentrate too much on the AR feature in an attempt to compensate for any deficiencies in or around the provision of that feature.
Many commercially available vehicles now incorporate collision avoidance systems, which can incorporate automatic braking systems. Collision avoidance systems aim to prevent or reduce the severity of a collision.
However, in known collision avoidance systems, the warnings provided to the user of the vehicle are inadequate. For example, in known systems, the user may only be alerted when a collision is imminent and when the automatic braking system is being operated to, for example, reduce the speed of the vehicle. This does not allow the user to decide whether preventative action is necessary, or if preventative action should be taken by the user, prior to the automatic braking system being engaged. Moreover, the warning provided to the user of the vehicle is often displayed on the dashboard, or a simplistic projection toward the lower portion of the windscreen, therefore requiring the user of the vehicle to take their attention away from the road.
It is an object of the present invention to provide an improved system and/or method thereof and/or address one or more of the problems discussed above, or discussed elsewhere, or to at least provide an alternative system and/or method.
SUMMARY
The summary statements which follow relate to a number of aspects. The aspects are only aspects of the invention where the system and method is as defined in the claims that follow. The reader will appreciate that features of the aspects which do not fall within the scope of the invention may nevertheless be incorporated in aspects of the invention which do fall within the scope of the invention. For this reason, the aspects which are absent these features are retained in order to provide useful background information to the reader.
A first aspect provides a system for displaying information to a user of a vehicle using augmented reality (AR), the system comprising a processor configured to: interact with a display unit for displaying information using AR in a field of view associated with the system and a detector for detecting a feature in an environment external to the vehicle; process an AR graphic for display at the display unit, the AR graphic being three-dimensionally matched to the feature detected by the detector; and cause the display unit to display the AR graphic.
In one example, the processor is configured to: cause the display unit to display the AR graphic to augment the feature.
In one example, the processor is configured to: process the AR graphic such that a change in a three-dimensional property of the feature is represented by the three-dimensionally matched AR graphic.
In one example, the processor is configured to: process the AR graphic for display at the display unit, the AR graphic being three-dimensionally matched to the proximity of the feature.
In one example, the processor is configured to: process the AR graphic for display at the display unit, the AR graphic being scaled to the feature in the field of view.
In one example, the processor is configured to: process the AR graphic for display at the display unit, the AR graphic being oriented to the feature in the field of view.
In one example, the processor is configured to: process the AR graphic for display at the display unit, the AR graphic having a first graphic portion which is provided at a constant apparent depth and a second graphic portion which is provided at an increasing apparent depth.
In one example, the second graphic portion indicates the orientation of the feature.
In one example, the processor is configured to: process the AR graphic for display at the display unit by updating the scale or orientation of the AR graphic in real-time based on the proximity or orientation of the feature.
In one example, the processor is configured to: process an AR graphic of a first type for display at the display unit if the feature is in the field of view.
In one example, the processor is configured to: process an AR graphic of a second type for display at the display unit if the feature is outside the field of view.
In one example, the processor is configured to: process an AR graphic for display at the display unit, wherein the AR graphic is used to enhance the visibility and/or user awareness of the feature in the field of view.
In one example, the processor is configured to: process an AR graphic for display at the display unit, the AR graphic being representative of a time-to-collision with the feature.
A second aspect provides a head-up display (HUD) system comprising a system according to the first aspect.
A third aspect provides a vehicle comprising the system or HUD system according to the first aspect or second aspect.
A fourth aspect provides a method of displaying information using augmented reality (AR) to a user of a vehicle, the method comprising: interacting with a display unit for displaying information using AR in a field of view and a detector for detecting a feature in an environment external to the vehicle; processing an AR graphic for display at the display unit, the AR graphic being three-dimensionally matched to the feature detected by the detector; and causing the display unit to display the AR graphic.
A fifth aspect provides a computer comprising a processor and a memory configured to perform a method according to the fourth aspect.
A sixth aspect provides a computer program comprising instructions which, when executed by a computer comprising a processor and a memory, cause the computer to perform a method according to the fourth aspect.
A seventh aspect provides a non-transient computer-readable storage medium comprising instructions which, when executed by a computer comprising a processor and a memory, cause the computer to perform a method according to the fourth aspect.
An eighth aspect provides a system for displaying information to a user of a vehicle using augmented reality (AR), the system comprising a processor configured to: interact with a display unit for displaying information using AR in a field of view associated with the system and a detector for detecting a feature in an environment external to the vehicle; process an AR graphic for display at the display unit, the AR graphic being representative of the time-to-collision with the feature; and cause the display unit to display the AR graphic.
In one example, the processor is configured to: cause the display unit to display the AR graphic to augment the feature.
In one example, the processor is configured to: process the AR graphic for display at the display unit, the AR graphic being proportional to the time-to-collision with the feature.
In one example, the processor is configured to: process an AR graphic of a first kind for display at the display unit if the feature is in the field of view and the time-to-collision with the feature is greater than a first threshold time-to-collision.
In one example, the AR graphic of the first kind is a three-dimensional AR graphic.
In one example, the AR graphic of the first kind is representative of a low risk factor.
In one example, the processor is configured to: process an AR graphic of a second kind for display at the display unit if the feature is in the field of view, the time-to-collision with the feature is less than a first threshold time-to-collision and the time-to-collision with the feature is greater than a second threshold time-to-collision.
In one example, the AR graphic of the second kind is a three-dimensional AR graphic. In one example, the AR graphic of the second kind is representative of a moderate risk factor.
In one example, the processor is configured to: process an AR graphic of a third kind for display at the display unit if the feature is in the field of view and the time-to-collision with the feature is less than a second threshold time-to-collision.
In one example, the AR graphic of the third kind is a two-dimensional AR graphic.
In one example, the AR graphic of the third kind is representative of a high risk factor.
In one example, the processor is configured to: process an AR graphic of a fourth kind for display at the display unit if the feature is outside the field of view.
In one example, the AR graphic of the fourth kind is a two-dimensional AR graphic.
In one example, the processor is configured to: calculate the time-to-collision based on the relative kinematics of the feature and the vehicle.
In one example, the processor is configured to: process an AR graphic for display at the display unit, the AR graphic being three-dimensionally matched to the feature detected by the detector. A ninth aspect provides a head-up display (HUD) system comprising a system according to the eighth aspect.
A tenth aspect provides a vehicle comprising the system or HUD system according to the eighth aspect or ninth aspect.
An eleventh aspect provides a method of displaying information using augmented reality (AR) to a user of a vehicle, the method comprising: interacting with a display unit for displaying information using AR in a field of view and a detector for detecting a feature in an environment external to the vehicle; processing an AR graphic for display at the display unit, the AR graphic being representative of the time-to-collision with the feature; and causing the display unit to display the AR graphic.
A twelfth aspect provides a computer comprising a processor and a memory configured to perform a method according to the eleventh aspect.
A thirteenth aspect provides a computer program comprising instructions which, when executed by a computer comprising a processor and a memory, cause the computer to perform a method according to the eleventh aspect.
A fourteenth aspect provides a non-transient computer-readable storage medium comprising instructions which, when executed by a computer comprising a processor and a memory, cause the computer to perform a method according to the eleventh aspect.
It will of course be appreciated that features described in relation to one aspect may be incorporated into other aspects. For example, the method of any aspect may incorporate any of the features described with reference to the apparatus of any aspect and vice versa.
Other preferred and advantageous features of the invention will be apparent from the following description.
BRIEF DESCRIPTION OF THE FIGURES
Embodiments of the invention will now be described by way of example only with reference to the figures, in which:
Figure 1.1 shows a schematic drawing of a system for displaying information to a user of a vehicle using augmented reality (AR);
Figure 1.2 shows the fields of view of the user and of the display unit;
Figure 2 shows a schematic drawing of a head-up display (HUD) system;
Figure 3 shows a first view wherein an AR graphic is displayed;
Figure 4 shows a second view wherein an AR graphic is displayed;
Figure 5 shows a third view wherein an AR graphic is displayed;
Figure 6 shows a fourth view wherein an AR graphic is displayed;
Figure 7 shows a fifth view wherein an AR graphic is displayed;
Figure 8 shows a vehicle comprising a system or HUD system;
Figure 9 shows general methodology principles; and
Figure 10 shows general methodology principles.
DETAILED DESCRIPTION
The description which follows describes a number of embodiments. The embodiments are only embodiments of the invention where the system and method is as defined in the claims that follow. The reader will appreciate that features of the embodiments which do not fall within the scope of the invention may nevertheless be incorporated in embodiments of the invention which do fall within the scope of the invention. For this reason, the description of the embodiments which are absent these features is retained in order to provide useful background information to the reader.
Referring to Figure 1.1, there is shown a system 10 for displaying information to a user of a vehicle using augmented reality (AR). The system 10 comprises a processor 100. The processor 100 is configured to interact with a display unit 200 and a detector 300. The processor 100 being configured to interact with the display unit 200 and the detector 300 may mean that the processor 100 is configured to receive information from and send information to the display unit 200 and the detector 300.
Here, "a user of a vehicle" is typically a driver, or operator, of the vehicle. However, the user may be one or more other users of the vehicle, such as a passenger. In an example, the vehicle may be operable in a fully autonomous -8 -mode, including a mode without any direct control from a user of the vehicle (e.g., a driver). In this case, the driver may indeed be a vehicle passenger. In a fully autonomous mode, the system 10 (and/or HUD system 50 as described below) may augment the field of view by displaying AR graphics on the display unit 200 to encourage the user to maintain their attention on the road ahead, despite not controlling the vehicle themselves. This may be referred to as "gam ification" of the display. Even more so, if not driving or controlling the vehicle, games may be played whilst using the vehicle. The gaming may involve the use of the AR graphics.
For the avoidance of doubt, the system 10 may be able to be retrofitted to a vehicle comprising a display unit 200 and a detector 300. In this case, the system 10 itself need not comprise a display unit 200 and detector 300. However, in an alternative construction, which does not involve retrofit, the system 10 may comprise a display unit 200 and a detector 300. This could be a physical retrofit, or a software update or upgrade.
The display unit 200 is for displaying information using AR in a field of view (FoV). The display unit 200 may comprise, or be in the form of, a holographic display unit. A holographic display is a type of display that utilizes light diffraction to create a virtual three-dimensional image. The holographic display unit may comprise a combiner. The combiner may be the windshield (i.e., the whole or a part of the windshield). That is, the present system 10 is an optical see-through system. Optical see-through systems allow the user to view the real-world "directly". These AR systems add virtual content, in the form of AR graphics, by adding additional light on top of the light coming in from the real-world.
Referring to Figure 1.2, a plan view of an exemplary FoV of the user and the FoV associated with the system 10 is shown. The FoV of the user (herein referred to as the "user FoV") is indicated at 20, and the FoV associated with the system 10 is indicated at 30.
In this example, the FoV associated with the system 10 is a FoV of the display unit 200. As shown, the FoV of the user is greater than the FoV of the display unit 200. Typically, the FoV of a user ranges from 120 to 200 degrees. The display unit 200 has a vertical FoV of approximately 4 degrees and a horizontal FoV in the range of 15 to 20 degrees. From this figure, it will be understood that the display unit 200 displays information using AR in a FoV of the user.
In the description herein, the "FoV of the display unit" means a FoV of an augmentable area. This is the part of the human visual field where the display unit 200 can show virtual content, to augment the view of the real world. That is, the FoV of the display unit 200 is an area which is observable by the user and in which the display unit 200 is able to augment the user's view of the real world. In this sense, it may be possible to refer to the FoV of the display unit as "the field of view of a user using the system 10" or the "field of view of the user using/viewing the display unit 200". In a HUD system, for example HUD system 50, this may be referred to as the "field of view of the user using the HUD system 50".
In the description herein, the FoV of the display unit 200 will be referred to simply as the "FoV". It will be understood that the display unit 200 is located in an area which is observable by the user. In a specific example, this means that the display unit 200 is located at, or in front of (i.e. toward the user), the windshield through which the user observes the road ahead when using the vehicle. In other examples, the display unit 200 may be provided at a side or rear of the vehicle, and thus the FoV may be to the side or rear of the vehicle.
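On the indicative figures above (a display unit FoV of roughly 4 degrees vertically and 15 to 20 degrees horizontally, within a much wider user FoV), deciding whether a detected feature lies inside the augmentable area reduces to an angular test. The following Python sketch is purely illustrative; the function name, the default FoV values and the angle convention are assumptions, not part of the application.

```python
def in_display_fov(bearing_deg: float, elevation_deg: float,
                   horizontal_fov_deg: float = 18.0,
                   vertical_fov_deg: float = 4.0) -> bool:
    """Return True if a feature at the given angles (relative to the centre of
    the augmentable area) falls inside the FoV of the display unit 200."""
    return (abs(bearing_deg) <= horizontal_fov_deg / 2.0
            and abs(elevation_deg) <= vertical_fov_deg / 2.0)

# A vehicle slightly to the left and near the horizon can be augmented directly;
# one far off to the side cannot, and would be handled by the "outside FoV" graphic.
print(in_display_fov(bearing_deg=-3.0, elevation_deg=0.5))   # True
print(in_display_fov(bearing_deg=40.0, elevation_deg=0.0))   # False
```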
The detector 300 is for detecting a feature in an environment external to the vehicle.
The detector 300 may be in the form of an optical detector, for example a camera. Additionally, or alternatively, the detector 300 may be in the form of a radio-detection-and-ranging (Radar) system or a light-detection-and-ranging (Lidar) system.
A feature may include another vehicle, such as a road-going vehicle or off-road vehicle. A feature may also include road-markings or general road furniture. A feature could be a human or animal. The sorts of features detected or tracked may vary for different uses or modes or types of system, vehicle or scenario.
Here, "an environment external to the vehicle" may include detecting features in-front of the vehicle, to the sides of the vehicle and also behind the vehicle. That is, the detector 300 may be operable to detect features outside the field of view.
The processor 100 is configured to process an AR graphic for display at the display unit. An AR graphic is a graphic that is to be displayed using augmented reality (AR). The processor 100 is configured to process the AR graphic in a number of different ways, as will be described in further detail below.
The processor 100 is configured to cause the display unit 200 to display the AR graphic.
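The interaction just described (detect a feature, process an AR graphic for it, cause the display unit to show it) can be summarised as a simple processing cycle. The sketch below is a minimal illustration under assumed interfaces; the names Feature, detect_features, process_graphic and show do not appear in the application.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    """A feature detected in the environment external to the vehicle (assumed fields)."""
    distance_m: float          # range to the feature
    bearing_deg: float         # horizontal angle relative to the vehicle axis
    width_m: float             # estimated physical width
    closing_speed_mps: float   # positive when the gap to the feature is shrinking

def run_display_cycle(detector, processor, display_unit) -> None:
    """One cycle of the detect -> process -> display interaction described above."""
    for feature in detector.detect_features():        # detector 300
        graphic = processor.process_graphic(feature)  # processor 100
        if graphic is not None:
            display_unit.show(graphic)                # display unit 200
```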
Whilst the processor 100 causing the display unit 200 to display the AR graphic will likely be implemented in a practical implementation, it may be viewed as optional in terms of certain embodiments. For example, a core differentiating feature is the processing a three-dimensionally matched graphic, or processing an AR graphic representative of the time-to-collision. In terms of key features that differentiate the invention, causing the display unit 200 to display the AR graphic may be optional.
Referring to Figure 2, there is shown a head-up display (HUD) system 50. The HUD system 50 comprises a picture generation unit 500, a combiner unit 600 and a corrector unit 700.
A combiner and corrector may be useful in, for example, a holographic system, or other system where a user is presented with a surface that is both looked or seen through, yet provided with graphical features by the system. The combiner might be, for example an at least partially (e.g. semi) transparent surface used in the system, to overlay an image presented by a projector (which may be part of or separate to the picture generation unit 500) on top of the user's physical world. The combiner is at least partially transparent and lets the user see through it, while simultaneously reflecting or otherwise directing light (e.g. AR features) to the user. The corrector unit may be used to correct aberrations, filter, and/or to improve light utilization efficiencies, or generally correct an optical property of some kind. In some examples, one or more optical devices or lenses may be used, including filters.
The system 10 described herein may be incorporated in the HUD system 50. The system 10 and/or HUD system 50 may be incorporated, or form part of, an advanced driver-assistance system (ADAS).
Although the present disclosure relates to systems 10 and HUD systems 50 for installation in vehicles, the present disclosure could also be applied in the context of head-mounted displays (HMDs). For example, the system 10 could be incorporated in a pair of smart-glasses or other wearable device.
Figures 3 to 7 each show a forward view through the vehicle windshield wherein one or more AR graphics are displayed in a FoV (that is, as above, a field of view of the display unit 200). Operation and functionality of the system 10 will be described with reference to the figures. In the figures, user FoVs and FoVs of the display unit 200 are illustrated as rectangular FoVs. It will be appreciated that in practice, such FoVs are not rectangular.
Figure 3 shows a first user FoV 3000 wherein a first AR graphic 3010 is displayed in a FoV 3200 (that is, the FoV of the display unit 200).
Figure 4 shows a second user FoV 4000 wherein a second AR graphic 4010 is displayed in a FoV 4200 (that is, the FoV of the display unit 200).
Figure 5 shows a third user FoV 5000 wherein a third AR graphic 5010 is displayed in a FoV 5200 (that is, the FoV of the display unit 200).
Figure 6 shows a fourth user FoV 6000 wherein a fourth AR graphic 6010 is displayed in a FoV 6200 (that is, the FoV of the display unit 200).
Figure 7 shows a fifth user FoV 7000 wherein a fifth AR graphic 7010 is displayed in a FoV 7200 (that is, the FoV of the display unit 200).
The processor 100 is configured to cause the display unit 200 to display the AR graphic 3010:7010 (increments of 1000) to augment the feature. That is, as shown in Figures 3 to 7, the AR graphic 3010:7010 provides information to the user about the feature, for example information relating to a three-dimensional property of the feature (e.g. relative to the system/vehicle) such as proximity, position, orientation, and also provides information to the user about (estimated or calculated) time-to-collision with the feature. In other words, a three-dimensional property of the feature (e.g. relative to the system/vehicle) is taken into account in the processing and provision of the AR graphic.
By augmenting the feature, the system provides the user with useful information about the feature. This can improve user experience, encourage safe operation of the vehicle, and alert the vehicle user to the presence of the feature.
In each of Figures 3 to 7, the feature is another vehicle, indicated generally at 1000. The other vehicle will herein be referred to as a "detected vehicle 1000", as it is a feature in an environment external to the vehicle that is detected by the detector 300. The skilled person will nevertheless appreciate that the present teaching may be applied to other features, including road-markings or general road furniture, as stated above.
The processor 100 is configured to process the AR graphic in a number of different ways, as will be described in further detail below. Two such ways include three-dimensional matching and representing time-to-collision.
In a first example, the processor 100 is configured to process the AR graphic for display at the display unit 200, the AR graphic being three-dimensionally matched to the feature detected by the detector 300.
Here, "a three-dimensionally matched" AR graphic means a graphic that is matched to the position and/or orientation (including direction-of- travel) of the feature in three-dimensions or is representative of a three-dimensional property of the feature, for example its shape. In this way, the AR graphic displayed on the display unit 200 is able to be scaled to the detected feature. In other words, a three-dimensional property of the feature (e.g. relative to the system/vehicle) is taken into account in the processing and provision of the AR graphic. This is in contrast to a simplistic approach, where a feature is detected, and an AR graphic is simply provided adjacent to that feature, irrespective of a three-dimensional relationship.
In the examples described herein, and those illustrated in Figures 3 to 5, the three-dimensional matching is by virtue of the AR graphic having a first graphic portion 3012:5012 (increments of 1000) which is provided at a constant apparent depth (e.g. a horizontal line) and a second graphic portion 3014:5014 (increments of 1000) which is provided at an increasing apparent depth (e.g. two lines appearing to extend away from the user and indicating a dimension, such as the width, of the feature). The second graphic portion also indicates the orientation of the feature. That is, where the feature is turned to the side, the second graphic portion indicates this by following (e.g. extending along) the side of the vehicle in the FoV.
In this way, the AR graphic displayed to the user is better representative of the real-life three-dimensional nature of the feature. A three-dimensional AR graphic is easily identifiable by the user. Moreover, a three-dimensional AR graphic provides a greater level of information to the user, as a three-dimensional AR graphic can indicate the position or orientation of the feature, and not merely the location of the feature. A three-dimensional AR graphic also indicates to the user that the system 10 is properly tracking the feature. This increases confidence in the system 10, which may be particularly useful in a self-driving vehicle implementation. A three-dimensionally matched AR graphic indicates the position or orientation of features with high accuracy. Ultimately, as a result, the user's experience of the system 10 is thereby improved. Distraction and concentration levels are reduced, which improves safety.
In providing a three-dimensionally matched AR graphic, the processor 100 may receive information from the detector 300, process the information from the detector 300 and process an AR graphic for display on the display unit 200. The processor 100 may select an AR graphic from a library of AR graphics, the selected graphic being three-dimensionally matched to the feature as perceived by the user of the vehicle. However, this approach might require a large number of such graphics, one for each of a large number of possible three-dimensional matchings.
Alternatively, the processor 100 may adjust an AR graphic displayed on the display unit 200 to ensure that the AR graphic remains three-dimensionally matched to the feature as perceived by the user of the vehicle. For example, this might be undertaken using a mathematical or graphical transform procedure. That is, the AR graphic scales with the detected feature. This might require fewer graphics to be stored in comparison with the library approach.
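One way to realise the transform approach described above is to keep a single template graphic and scale and orient it from the detected feature's range, width and yaw, rather than storing a large library of pre-matched graphics. The sketch below uses a simple angular projection; the function name, the parameterisation and the crude yaw skew are assumptions made for illustration only.

```python
import math

def match_staple_graphic(distance_m: float, width_m: float,
                         depth_m: float, yaw_deg: float) -> dict:
    """Scale and orient a 'staple'-shaped template to a detected feature.

    The first graphic portion is the horizontal line at constant apparent
    depth; the second graphic portion comprises the two lines appearing to
    extend away from the user along the sides of the feature."""
    # Apparent angular width shrinks roughly in inverse proportion to distance.
    near_width_deg = 2.0 * math.degrees(math.atan2(width_m / 2.0, distance_m))
    far_width_deg = 2.0 * math.degrees(
        math.atan2(width_m / 2.0, distance_m + depth_m))
    # Crude skew so the second portion follows the side of a turned vehicle.
    skew_deg = yaw_deg * (depth_m / max(distance_m, 1.0))
    return {
        "first_portion_width_deg": near_width_deg,
        "second_portion_far_width_deg": far_width_deg,
        "second_portion_skew_deg": skew_deg,
    }

# As the feature gets closer the graphic scales up, as in Figures 3 to 5.
print(match_staple_graphic(distance_m=30.0, width_m=1.8, depth_m=4.5, yaw_deg=0.0))
print(match_staple_graphic(distance_m=10.0, width_m=1.8, depth_m=4.5, yaw_deg=0.0))
```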
Of course, the skilled person will understand how an AR graphic may be displayed on a display unit 200, and the above provides illustrative examples of how a three-dimensionally matched AR graphic may be displayed on a display unit 200.
The processor 100 is configured to process the AR graphic such that a change in a three-dimensional property of the feature (e.g. in isolation or relative to the vehicle) is represented by the three-dimensionally matched AR graphic. This can be understood from Figures 3 to 5.
-14 -Referring to Figure 3, with the detected vehicle 1000 in a first position, the first AR graphic 3010 is three-dimensionally matched to the detected vehicle 1000 in the first position.
Referring to Figure 4, with the detected vehicle 1000 in a second position, the second AR graphic 4010 is three-dimensionally matched to the detected vehicle 1000 in the second position. In this case, the second position is also a second, different, proximity. As shown, in the second position the detected vehicle 1000 is closer to the user/vehicle than in the first position. The size of the vehicle in the FoV 4200 is therefore greater. The second AR graphic 4010 represents this change in three-dimensional property of the feature by becoming larger, noticeably increasing the size of the second graphic portion 4014. That is, the AR graphic scales to the detected feature.
Referring to Figure 5, with the detected vehicle 1000 in a third position, the third AR graphic 5010 is three-dimensionally matched to the detected vehicle 1000 in the third position. As shown, in the third position the detected vehicle 1000 is closer to the user/vehicle than in the first and second positions. The size of the vehicle in the FoV 5200 is therefore greater. The third AR graphic 5010 represents this change in three-dimensional property of the feature by becoming larger again, noticeably increasing the size of the second graphic portion 5014. That is, the AR graphic scales to the detected feature.
In this way, a change in a three-dimensional property of the detected vehicle 1000 is represented by the three-dimensionally matched AR graphics. As above, such functionality means that the AR graphic displayed to the user is better representative of the real-life three-dimensional nature of the feature.
Moreover, a three-dimensionally matched AR graphic provides a greater level of information to the user, as a matched AR graphic can indicate the position or orientation of the feature, and not merely the location of the feature. A three-dimensionally matched AR graphic indicates the position or orientation of features with high accuracy. Ultimately, as a result, the user's experience of the system 10 is thereby improved. The user's safety is also improved.
Referring back to Figures 3 to 5, as will be apparent from the description provided above of processing the AR graphic such that a change in a three-dimensional property of the feature is represented by the three-dimensionally matched AR graphic, the processor 100 is configured to process the AR graphic for display at the display unit 200, the AR graphic being three-dimensionally matched to the proximity of the detected vehicle 1000. As shown in Figures 3 to 5, as the proximity of the detected vehicle 1000 changes, the AR graphic is matched accordingly. The proximity might be determined or inferred. For example, the proximity could be measured by a sensor, and any matching undertaken based on that determination. Or, the proximity might be inferred, by a change in detected size of the detected vehicle (or other object) in the FoV. That is, if the vehicle (or other object) is detected or tracked, and its apparent size changes, it can be inferred that (typically) the detected vehicle (or other object) has changed its proximity.
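The inference mentioned above (a change in apparent size implies a change in proximity) follows the usual pinhole relation: for a feature of roughly constant physical width, distance is inversely proportional to its apparent width. A minimal sketch, in which the assumed physical width and focal length are illustrative values only:

```python
def infer_distance_m(apparent_width_px: float,
                     assumed_real_width_m: float = 1.8,
                     focal_length_px: float = 1200.0) -> float:
    """Infer the proximity of a tracked feature from its apparent width in the
    detector image, assuming its physical width is approximately known."""
    return focal_length_px * assumed_real_width_m / apparent_width_px

# If the tracked vehicle's apparent width grows from 72 px to 216 px, the
# inferred distance drops from ~30 m to ~10 m, i.e. it has changed its proximity.
print(infer_distance_m(72.0))    # 30.0
print(infer_distance_m(216.0))   # 10.0
```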
In this way, the AR graphic is representative of the proximity of the detected vehicle 1000. User safety and experience is thereby improved.
Additionally, the processor 100 is configured to process the AR graphic for display at the display unit 200, the AR graphic being scaled to the detected vehicle 1000 in the FoV. As shown in Figures 3 to 5, the AR graphic is scaled to the apparent size of the detected vehicle 1000 in the FoV. For example, a dimension of the AR graphic (e.g., width) may be proportional to, or match, the same dimension of the feature in the FoV.
In this way, the AR graphic accurately represents the real-life position of the detected vehicle 1000. User safety and experience is thereby improved, as the AR graphic effectively augments the FoV and does not provide a distraction to the user.
Additionally, the processor 100 is configured to process the AR graphic for display at the display unit 200, the AR graphic being oriented to the detected vehicle 1000 in the FoV. As shown in Figures 3 to 5, the AR graphic is oriented to represent the detected vehicle 1000 travelling away from, and directly in front of, the vehicle. Whilst not shown, where the detected vehicle 1000 is oriented at an angle to the vehicle (for example where the detected vehicle 1000 is changing lanes ahead of the vehicle) the AR graphic is oriented accordingly.
In this way, the AR graphic accurately represents the real-life orientation of the detected vehicle 1000. User safety and experience is thereby improved, as the AR graphic effectively augments the FoV and does not provide a distraction to the user.
The processor 100 is configured to update the scale or orientation of the AR graphic in real-time based on the proximity or orientation of the detected vehicle 1000. This is shown in Figures 3 to 5 above. In this way, changes in the proximity or orientation are accurately indicated by the AR graphic. The AR graphic accurately represents the real-life proximity or orientation of the detected vehicle 1000. User safety and experience is thereby improved, as the AR graphic effectively augments the FoV and does not provide a distraction to the user.
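A per-frame update of the displayed graphic's scale and orientation might then look like the loop below. The exponential smoothing is an assumption added only so that the sketch behaves sensibly on noisy detections; it is not part of the described system.

```python
def update_graphic_state(state: dict, measured_scale: float,
                         measured_yaw_deg: float, alpha: float = 0.5) -> dict:
    """Update the AR graphic's scale and orientation each frame from the latest
    measured proximity/orientation of the tracked feature (alpha = 1.0 means
    the newest measurement fully overrides the previous frame)."""
    return {
        "scale": (1 - alpha) * state["scale"] + alpha * measured_scale,
        "yaw_deg": (1 - alpha) * state["yaw_deg"] + alpha * measured_yaw_deg,
    }

# Each detector frame, feed in the newest measurements; the graphic then tracks
# the detected vehicle as its proximity or orientation changes in real time.
state = {"scale": 1.0, "yaw_deg": 0.0}
for scale, yaw in [(1.2, 0.0), (1.5, 3.0), (1.9, 6.0)]:
    state = update_graphic_state(state, scale, yaw)
print(state)
```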
Referring back to Figures 3 to 5, the processor 100 is configured to process an AR graphic 3010, 4010, 5010 of a first type for display at the display unit 200 if the feature is in the FoV. The first type of AR graphic may be a "staple"-shaped graphic, such as 3010, 4010 and 5010 shown in Figures 3 to 5, having a first graphic portion 3012, 4012, 5012 of constant apparent depth and second graphic portion 3014, 4014, 5014 of increasing apparent depth.
By displaying an AR graphic of a first type if the detected feature is in the FoV, user interaction with the system 10 is improved as the user learns that the first type of AR graphic is displayed to provide information about features in the FoV.
Referring to Figure 7, the processor 100 is configured to process an AR graphic 7010 of a second type for display at the display unit 200 if the feature is outside the FoV 7200. The second type of AR graphic may be an "arrowhead"-shaped graphic, such as 7010 shown in Figure 7. The arrowhead-shaped graphic indicates the presence of features outside the FoV 7200. Here, "outside the field of view of the user" may mean not completely within the FoV 7200, for example as shown in Figure 7. The arrowhead-shaped AR graphic 7010 is three-dimensionally matched to the detected vehicle 1000 by scaling or orienting the AR graphic 7010 relative to the position or orientation of the detected vehicle 1000.
By displaying an AR graphic of a second type if the detected feature is outside (or not completely within) the FoV 7200, user interaction with the system 10 is improved as the user learns that the second type of AR graphic is displayed to provide information about features outside (or not completely within) the FoV 7200. That is, it may not be possible to display an AR graphic of a first type, but an AR graphic of a second type may still be displayable so as to provide useful information to the user. User safety and experience is thereby improved, as the AR graphic effectively augments the FoV 7200 and alerts the user to features outside the FoV 7200.
In this way, whilst the detected vehicle 1000 may be too close to the vehicle (and thus not completely within the FoV 7200) to display the first type of AR graphic at the display unit 200, the second type of AR graphic 7010 can still indicate the presence of the detected vehicle 1000 to the user. This may avoid the need to display an incorrect or inappropriately proportioned first type of AR graphic to the user. For example, displaying a first type of AR graphic in this scenario may take up too much space in the FoV 7200, or may overlap other features displayed on the display unit 200, for example speed indicators.
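The placement of the second type of graphic can be sketched as clamping the feature's bearing to the edge of the augmentable area and pointing the arrowhead toward the feature. The function below is an assumed illustration of that idea, not a description of the application's actual rendering.

```python
def place_arrowhead(feature_bearing_deg: float,
                    horizontal_fov_deg: float = 18.0) -> tuple:
    """Anchor an arrowhead graphic at the FoV edge nearest a feature that is
    not completely within the FoV, pointing toward that feature."""
    half_fov = horizontal_fov_deg / 2.0
    # Clamp the anchor to the edge of the augmentable area.
    anchor_deg = max(-half_fov, min(half_fov, feature_bearing_deg))
    direction = "left" if feature_bearing_deg < 0 else "right"
    return anchor_deg, direction

# A vehicle at -30 degrees (off to the left) yields an arrowhead pinned to the
# left-hand edge of the FoV, pointing left toward the detected vehicle.
print(place_arrowhead(-30.0))  # (-9.0, 'left')
```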
As understood from Figures 3 to 7, the AR graphic 3010:7010 is used to enhance the visibility and/or user awareness of the feature in the FoV. As shown in the figures, by enhancing the visibility, the detected feature stands out from the background (i.e. the road features) and so user and vehicle safety is improved. Enhancing user awareness of the feature leads to improvements in user and vehicle safety, and ultimately improves user interaction with the system 10.
In a second example, the processor 100 is configured to process the AR graphic for display at the display unit 200, the AR graphic being representative of a time-to-collision with the feature.
Here, the "time-to-collision" means the predicted or calculated time before the vehicle comes into contact with the feature detected by the detector 3000. It will be understood that this may apply to three-dimensional features, for example a detected vehicle 1000. In one example, the time-to-collision may be the time to reach a specific point or boundary, for example a road lane marking or a road junction, or even a navigational point.
Advantageously, with the AR graphic being representative of the time-to-collision with the feature, the system 10 provides advance warning of a possible collision to the vehicle user. This allows the user to decide whether preventative or corrective action is necessary and should be taken by the user, which may allow the user to make such a decision prior to an automatic braking system being engaged. Moreover, by providing such an indication as an AR graphic (which may be displayed on a HUD system, such as a HUD system incorporated in or adjacent to a windshield) the user need not take their attention away from the road. User experience is thereby improved, and user and vehicle safety is thereby increased.
The processor 100 is configured to calculate the time-to-collision based on the relative kinematics of the feature and the vehicle. That is, the time-to-collision may be calculated based on relative position, orientation, speed, acceleration, motion and/or other kinematic factors. In this way, the time-to-collision can be calculated with high accuracy, and thus accurate information relating to the environment can be provided to the user of the vehicle. Additionally, the time-to-collision may be calculated based on road conditions and/or environmental conditions. For example, a loose road surface and wet weather will increase stopping distance, and thus may need to be factored into the time-to-collision calculation.
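On the kinematic basis described above, a constant-relative-acceleration estimate of the time-to-collision can be computed from the current gap, closing speed and closing acceleration. The sketch below is illustrative only; in particular, the handling of road and weather conditions is reduced to a single assumed multiplier, which is not specified in the application.

```python
import math

def time_to_collision(gap_m: float, closing_speed_mps: float,
                      closing_accel_mps2: float = 0.0,
                      condition_factor: float = 1.0) -> float:
    """Estimate time-to-collision from the relative kinematics of the feature
    and the vehicle.

    closing_speed_mps / closing_accel_mps2 are positive when the gap shrinks;
    condition_factor > 1 crudely shortens the reported TTC for e.g. a loose or
    wet road surface (an assumption, not from the application)."""
    if closing_speed_mps <= 0 and closing_accel_mps2 <= 0:
        return math.inf  # the gap is not closing: no collision predicted
    if abs(closing_accel_mps2) < 1e-6:
        if closing_speed_mps <= 0:
            return math.inf  # negligible closing acceleration and not closing
        ttc = gap_m / closing_speed_mps
    else:
        # Smallest positive root of 0.5*a*t^2 + v*t - gap = 0.
        disc = closing_speed_mps ** 2 + 2.0 * closing_accel_mps2 * gap_m
        if disc < 0:
            return math.inf  # the gap never fully closes under these kinematics
        ttc = (-closing_speed_mps + math.sqrt(disc)) / closing_accel_mps2
        if ttc < 0:
            return math.inf
    return ttc / condition_factor

# Following a vehicle 25 m ahead while closing at 5 m/s gives a 5 s TTC.
print(time_to_collision(gap_m=25.0, closing_speed_mps=5.0))  # 5.0
```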
The processor 100 is configured to process the AR graphic for display at the display unit 200, the AR graphic being proportional to the time-to-collision with the feature. That is, the AR graphic itself is representative of the time-to-collision with the feature. In this way, the user need only refer to the AR graphic to gauge the time-to-collision with the feature. Advantageously, this allows the user to decide whether preventative or corrective action is necessary and should be taken so as to avoid collision, which may allow the user to make such a decision prior to an automatic braking system being engaged. The user need not take their attention away from the road where such an AR graphic is provided on a HUD system. User experience is thereby improved, and user and vehicle safety is thereby increased.
Referring to Figures 3 to 7, AR graphics representative of the time-to-collision with a feature (the detected vehicle 1000) are shown. In the description which follows, AR graphics of a first, second, third and fourth kind are described. It will be understood that an AR graphic of a "kind" may be the same as an AR graphic of a "type" as described above.
Referring to Figure 3, the time-to-collision with the detected vehicle 1000 is determined to be greater than a first threshold time-to-collision t1. As can be seen from Figure 3, the detected vehicle 1000 is in the FoV 3200. The processor 100 is configured to process an AR graphic 3010 of a first kind for display at the display unit 200.
In this way, the user learns that the first kind of AR graphic 3010 represents a time-to-collision with the detected vehicle 1000 that is greater than a first threshold time-to-collision t1. Thus, the user may understand that the time-to-collision with the detected vehicle 1000 is such that the user has adequate time to decide whether preventative or corrective action could be necessary, if, for example, the detected vehicle 1000 were to brake or otherwise change position. More simply, the user understands that the vehicle is maintaining a safe relative position, orientation, speed, motion and/or acceleration to the detected vehicle 1000.
The first kind of AR graphic 3010 is shown as a single "staple"-shaped AR graphic. That is, the first kind of AR graphic 3010 is a three-dimensional AR graphic. In this way, the AR graphic 3010 can represent a three-dimensional property of the detected vehicle 1000, which is advantageous as set out above. In other words, the different examples can be used in advantageous or synergistic combination.
The first kind of AR graphic 3010 is representative of a low risk factor. In this way, on seeing the first kind of AR graphic the user can understand that the detected vehicle 1000 poses a low level of risk to the vehicle. In a preferred example the first kind of AR graphic 3010 may be provided in a colour which indicates a low risk factor, for example a green colour. In this way, the user can quickly and easily interpret and understand the first kind of AR graphic 3010 to represent a low risk factor.
Referring to Figure 4, the time-to-collision with the detected vehicle 1000 is determined to be less than the first threshold time-to-collision t1 but greater than a second threshold time-to-collision t2. As can be seen from Figure 4, the detected vehicle 1000 is in the FoV 4200. The processor 100 is configured to process an AR graphic 4010 of a second kind for display at the display unit 200. In this way, the user learns that the second kind of AR graphic 4010 represents a time-to-collision with the detected vehicle 1000 that is less than a first threshold time-to-collision t1 but greater than a second threshold time-to-collision t2. Thus, the user may understand that the time-to-collision with the detected vehicle 1000 is such that the user has a moderate amount of time to decide whether preventative or corrective action could be necessary, if, for example, the detected vehicle 1000 were to brake or otherwise change position.
More simply, the user understands that the vehicle is maintaining a moderately safe relative position, orientation, speed, motion and/or acceleration to the detected vehicle 1000. The user thereby understands that increased attention is required to be provided to the road ahead, or alternatively or additionally, the user may wish to take corrective action to reduce the level of risk to a lower level. Of course, in doing so, the AR graphic would switch from the second kind (e.g., 4010) to the first kind (e.g., 3010).
The second kind of AR graphic 4010 is shown as a double "staple"-shaped AR graphic. That is, the second kind of AR graphic 4010 is a three-dimensional AR graphic. In this way, the AR graphic 4010 can represent a three-dimensional property of the detected vehicle 1000, which is advantageous as set out above.
The second kind of AR graphic 4010 is representative of a moderate risk factor. In this way, on seeing the second kind of AR graphic 4010 the user can understand that the detected vehicle 1000 poses a moderate level of risk to the vehicle. In a preferred example the second kind of AR graphic 4010 may be provided in a colour which indicates a moderate risk factor, for example a yellow or amber colour. In this way, the user can quickly and easily interpret and understand the second kind of AR graphic 4010 to represent a moderate risk factor. In another preferred example, the second kind of AR graphic 4010 may be a single "staple"-shaped AR graphic displayed in a yellow or amber colour, to represent a moderate risk factor.
Referring to Figure 5, the time-to-collision with the detected vehicle 1000 is determined to be less than the first threshold time-to-collision t1 and also less than the second threshold time-to-collision t2. As can be seen from Figure 5, the detected vehicle 1000 is in the FoV 5200. The processor 100 is configured to process an AR graphic 5010 of a third kind for display at the display unit 200. In this way, the user learns that the third kind of AR graphic 5010 represents a time-to-collision with the detected vehicle 1000 that is less than a first threshold time-to-collision t1 and less than a second threshold time-to-collision t2. Thus, the user may understand that the time-to-collision with the detected vehicle 1000 is such that the user has a short amount of time to decide whether preventative or corrective action could be necessary, if, for example, the detected vehicle 1000 were to brake or otherwise change position. More simply, the user understands that the vehicle is maintaining a potentially dangerous relative position, orientation, speed, motion and/or acceleration to the detected vehicle 1000. The user thereby understands that high attention is required to be provided to the road ahead, or alternatively or additionally, the user may wish to take corrective action to reduce the level of risk to a lower level. Of course, in doing so, the AR graphic would switch from the third kind (e.g., 5010) to the second kind (e.g., 4010) or to the first kind (e.g., 3010).
The third kind of AR graphic 5010 is shown as a double "cross-staple"-shaped AR graphic. That is, the third kind of AR graphic 5010 is a three-dimensional AR graphic. In this way, the AR graphic 5010 can represent a three-dimensional property of the detected vehicle 1000, which is advantageous as set out above.
The third kind of AR graphic 5010 is representative of a high risk factor. In this way, on seeing the third kind of AR graphic 5010 the user can understand that the detected vehicle 1000 poses a high level of risk to the vehicle. In a preferred example the third kind of AR graphic 5010 may be provided in a colour which indicates a high risk factor, for example a red colour. In this way, the user can quickly and easily interpret and understand the third kind of AR graphic 5010 to represent a high risk factor.
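Purely as an illustration of the threshold logic described above for the first, second and third kinds of AR graphic, the selection could be sketched as follows. The function and enumeration names, the mapping of kinds to shapes and colours, and the numeric threshold defaults are assumptions introduced here for the sketch, not values taken from the application.

```python
from enum import Enum


class GraphicKind(Enum):
    FIRST = "staple_green"            # low risk factor
    SECOND = "double_staple_amber"    # moderate risk factor
    THIRD = "cross_staple_red"        # high risk factor


def select_graphic_kind(ttc_s: float, t1_s: float = 4.0, t2_s: float = 2.0) -> GraphicKind:
    """Map a time-to-collision (seconds) onto one of the three in-FoV graphic kinds.

    t1_s and t2_s are the first and second threshold times-to-collision (t1 > t2);
    the defaults here are illustrative only.
    """
    if ttc_s > t1_s:
        return GraphicKind.FIRST      # greater than the first threshold
    if ttc_s > t2_s:
        return GraphicKind.SECOND     # between the two thresholds
    return GraphicKind.THIRD          # below the second threshold
```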
Alternatively, the third kind of AR graphic may be such as that shown in Figure 6 and indicated at 6010. In this example, the third kind of AR graphic 6010 is a triangular-shaped AR graphic 6010. Again, the third kind of AR graphic 6010 is representative of a high risk factor. In this way, on seeing the third kind of AR graphic 6010 the user can understand that the detected vehicle 1000 poses a high level of risk to the vehicle. In a preferred example the third kind of AR graphic 6010 may be provided in a colour which indicates a high risk factor, for example a red colour. In this way, the user can quickly and easily interpret and understand the third kind of AR graphic 6010 to represent a high risk factor. In this instance, the triangular-shaped third kind of AR graphic 6010 may be a two-dimensional AR graphic.
Additionally, whilst the detected vehicle 1000 may be too close to the vehicle to display a staple-shaped AR graphic at the display unit 200, the triangular-shaped AR graphic 6010 can still indicate the presence of the detected vehicle 1000 to the user. This may avoid the need to display an incorrect or inappropriately proportioned AR graphic to the user. For example, displaying a staple-shaped AR graphic in the FoV 6200 in this scenario may take up too much space in the FoV, or may overlap other features displayed on the display unit 200, for example speed indicators.
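A minimal sketch of that fallback decision might look as follows, assuming hypothetical measures of the projected graphic size and of overlap with other display elements; the names and the size fraction are illustrative only.

```python
def choose_high_risk_graphic(projected_width_px: float,
                             fov_width_px: float,
                             overlaps_other_elements: bool,
                             max_fov_fraction: float = 0.6) -> str:
    """Fall back from the 3D 'cross-staple' to the 2D triangle when the staple
    would dominate the FoV or collide with other display elements, such as
    speed indicators. All names and the 0.6 fraction are assumptions."""
    too_large = projected_width_px > max_fov_fraction * fov_width_px
    if too_large or overlaps_other_elements:
        return "triangle_2d_red"
    return "cross_staple_3d_red"
```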
Referring to Figure 7, the time-to-collision with the detected vehicle 1000 is determined to be less than the first threshold time-to-collision t1 and also less than the second threshold time-to-collision t2. As can be seen from Figure 7, the detected vehicle 1000 is outside the FoV 7200. The processor 100 is configured to process an AR graphic 7010 of a fourth kind for display at the display unit 200.
In this way, the user learns that the fourth kind of AR graphic 7010 represents a detected vehicle outside the FoV 7200. The user is therefore alerted to a potential risk outside of the FoV 7200. In a preferred example, the fourth kind of AR graphic 7010 is an arrowhead-shaped graphic, pointing toward the detected feature.
Where the time-to-collision with the detected vehicle 1000 is greater than a first threshold time-to-collision t1, the fourth kind of AR graphic 7010 may have a first shape or colour, for example a green colour. Where the time-to-collision with the detected vehicle 1000 is less than a first threshold time-to-collision t1 but greater than a second threshold time-to-collision t2, the fourth kind of AR graphic 7010 may have a second shape or colour, for example a yellow or amber colour. Where the time-to-collision with the detected vehicle 1000 is less than a first threshold time-to-collision t1 and less than a second threshold time-to-collision t2, the fourth kind of AR graphic may have a third shape or colour, for example a red colour.
In this way, the level of risk posed by the detected feature is readily understandable by the user. User attention is thereby drawn to the feature, despite it being outside of the FoV 7200. User experience and vehicle safety are thereby improved.
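The out-of-FoV behaviour could, for example, be sketched as below; the field names, the bearing convention and the colour bands are assumptions for illustration and reuse the same time-to-collision thresholds as the in-FoV graphics.

```python
import math


def out_of_fov_arrow(feature_bearing_rad: float,
                     ttc_s: float,
                     t1_s: float = 4.0,
                     t2_s: float = 2.0) -> dict:
    """Build a minimal description of the fourth-kind arrowhead graphic.

    The arrow points toward the detected feature (bearing relative to the
    vehicle's heading) and its colour encodes the same time-to-collision bands
    as the in-FoV graphics. Names and defaults are illustrative assumptions.
    """
    if ttc_s > t1_s:
        colour = "green"
    elif ttc_s > t2_s:
        colour = "amber"
    else:
        colour = "red"
    return {
        "kind": "arrowhead",
        "rotation_deg": math.degrees(feature_bearing_rad),
        "colour": colour,
    }
```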
It will be understood that the first and second examples described above, relating to three-dimensional matching and time-to-collision respectively, can be implemented by the same system 10, 50. Features of each may be combined or interchanged as necessary or as desired.
Referring to Figure 8, a vehicle 8000 is shown. The vehicle 8000 comprises a system 10 or HUD system 50.
Referring to Figure 9, a method of displaying information using augmented reality (AR) to a user of a vehicle is shown. Step S9010 comprises interacting with a display unit for displaying information using AR in a field of view associated with the system and a detector for detecting a feature in an environment external to the vehicle. Step S9020 comprises processing an AR graphic for display at the display unit, the AR graphic being three-dimensionally matched to the feature detected by the detector. Step S9030 comprises causing the display unit to display the AR graphic.
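The application does not tie the three-dimensional matching of step S9020 to any particular projection model. Purely as an illustration, a conventional pinhole projection from a feature position in the vehicle frame to coordinates on the virtual image could be sketched as follows; the calibration inputs and function name are assumptions.

```python
import numpy as np


def project_to_display(point_vehicle: np.ndarray,
                       extrinsics: np.ndarray,
                       intrinsics: np.ndarray):
    """Project a 3D point (vehicle frame, metres) onto the virtual image plane.

    extrinsics is a 4x4 vehicle-to-display transform and intrinsics a 3x3
    pinhole matrix for the virtual image; both are assumed calibration inputs.
    Returns (u, v) pixel coordinates, or None if the point lies behind the
    image plane.
    """
    p = extrinsics @ np.append(point_vehicle, 1.0)   # into the display frame
    if p[2] <= 0:
        return None
    uvw = intrinsics @ p[:3]
    return float(uvw[0] / uvw[2]), float(uvw[1] / uvw[2])
```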
Whilst step S9030 will likely be carried out in a practical implementation, it may be viewed as optional in certain embodiments. For example, a core differentiating feature is the processing step; in terms of the key features that differentiate the invention, the actual display step may be optional.
Referring to Figure 10, a method of displaying information using augmented reality (AR) to a user of a vehicle is shown. Step S10010 comprises interacting with a display unit for displaying information using AR in a field of view associated with the system and a detector for detecting a feature in an environment external to the vehicle. Step S10020 comprises processing an AR graphic for display at the display unit, the AR graphic being representative of the time-to-collision with the feature. Step S10030 comprises causing the display unit to display the AR graphic.
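The application does not fix how the time-to-collision of step S10020 is calculated beyond its being based on the relative kinematics of the feature and the vehicle. A minimal constant-velocity sketch, with assumed parameter names, is given below; a richer model could also account for relative acceleration.

```python
def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Constant-velocity time-to-collision estimate.

    gap_m is the longitudinal distance to the detected feature and
    closing_speed_mps the rate at which that gap is shrinking (positive when
    closing). Returns infinity when the gap is not closing, so no warning
    threshold is crossed in that case. Names and units are assumptions.
    """
    if closing_speed_mps <= 0.0:
        return float("inf")
    return gap_m / closing_speed_mps
```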
Whilst step S10030 will likely be carried out in a practical implementation, it may be viewed as optional in certain embodiments. For example, a core differentiating feature is the processing step; in terms of the key features that differentiate the invention, the actual display step may be optional.
Although a preferred embodiment has been shown and described, it will be appreciated by those skilled in the art that various changes and modifications might be made without departing from the scope of the invention, as defined in the appended claims and as described above.
At least some of the example embodiments described herein may be constructed, partially or wholly, using dedicated special-purpose hardware. Terms such as 'component', 'module' or 'unit' used herein may include, but are not limited to, a hardware device, such as circuitry in the form of discrete or integrated components, a Field Programmable Gate Array (FPGA) or Application Specific Integrated Circuit (ASIC), which performs certain tasks or provides the associated functionality. In some embodiments, the described elements may be configured to reside on a tangible, persistent, addressable storage medium and may be configured to execute on one or more processors.
These functional elements may in some embodiments include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. Although the example embodiments have been described with reference to the components, modules and units discussed herein, such functional elements may be combined into fewer elements or separated into additional elements. Various combinations of optional features have been described herein, and it will be appreciated that described features may be combined in any suitable combination. In particular, the features of any one example embodiment may be combined with features of any other embodiment, as appropriate, except where such combinations are mutually exclusive. Throughout this specification, the term "comprising" or "comprises" means including the component(s) specified but not to the exclusion of the presence of others.
Attention is directed to all papers and documents which are filed concurrently with or previous to this specification in connection with this application and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference.
All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
The invention is not restricted to the details of the foregoing embodiment(s). The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.

Claims (6)

1. A system for displaying information to a user of a vehicle using augmented reality (AR), the system comprising a processor configured to: interact with a display unit for displaying information using AR in a field of view associated with the system and a detector for detecting a feature in an environment external to the vehicle; process an AR graphic for display at the display unit, the AR graphic being representative of the time-to-collision with the feature; and cause the display unit to display the AR graphic.

2. The system according to claim 1 wherein the processor is configured to: cause the display unit to display the AR graphic to augment the feature.

3. The system according to claim 1 or claim 2 wherein the processor is configured to: process the AR graphic for display at the display unit, the AR graphic being proportional to the time-to-collision with the feature.

4. The system according to any one of the preceding claims wherein the processor is configured to: process an AR graphic of a first kind for display at the display unit if the feature is in the field of view and the time-to-collision with the feature is greater than a first threshold time-to-collision.

5. The system according to claim 4 wherein the AR graphic of the first kind is a three-dimensional AR graphic, optionally wherein the AR graphic of the first kind is representative of a low risk factor.

6. The system according to any one of the preceding claims wherein the processor is configured to: process an AR graphic of a second kind for display at the display unit if the feature is in the field of view, the time-to-collision with the feature is less than a first threshold time-to-collision and the time-to-collision with the feature is greater than a second threshold time-to-collision.

7. The system according to claim 6 wherein the AR graphic of the second kind is a three-dimensional AR graphic, optionally wherein the AR graphic of the second kind is representative of a moderate risk factor.

8. The system according to any one of the preceding claims wherein the processor is configured to: process an AR graphic of a third kind for display at the display unit if the feature is in the field of view and the time-to-collision with the feature is less than a second threshold time-to-collision, optionally wherein the AR graphic of the third kind is a two-dimensional AR graphic, optionally wherein the AR graphic of the third kind is representative of a high risk factor.

9. The system according to any one of the preceding claims wherein the processor is configured to: process an AR graphic of a fourth kind for display at the display unit if the feature is outside the field of view, optionally wherein the AR graphic of the fourth kind is a two-dimensional AR graphic.

10. The system according to any one of the preceding claims wherein the processor is configured to: calculate the time-to-collision based on the relative kinematics of the feature and the vehicle.

11. The system according to any one of the preceding claims wherein the processor is configured to: process an AR graphic for display at the display unit, the AR graphic being three-dimensionally matched to the feature detected by the detector.

12. A head-up display (HUD) system comprising the system according to any one of the preceding claims.

13. A vehicle comprising the system or HUD system according to any one of the preceding claims.

14. A method of displaying information using augmented reality (AR) to a user of a vehicle, the method comprising: interacting with a display unit for displaying information using AR in a field of view and a detector for detecting a feature in an environment external to the vehicle; processing an AR graphic for display at the display unit, the AR graphic being representative of the time-to-collision with the feature; and causing the display unit to display the AR graphic.

15. A computer comprising a processor and a memory configured to perform a method according to claim 14, a computer program comprising instructions which, when executed by a computer comprising a processor and a memory, cause the computer to perform a method according to claim 14, or a non-transient computer-readable storage medium comprising instructions which, when executed by a computer comprising a processor and a memory, cause the computer to perform a method according to claim 14.
GB2116706.9A 2021-11-19 2021-11-19 System and method Pending GB2613003A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2116706.9A GB2613003A (en) 2021-11-19 2021-11-19 System and method
PCT/EP2022/082431 WO2023089105A1 (en) 2021-11-19 2022-11-18 System and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2116706.9A GB2613003A (en) 2021-11-19 2021-11-19 System and method

Publications (2)

Publication Number Publication Date
GB202116706D0 GB202116706D0 (en) 2022-01-05
GB2613003A true GB2613003A (en) 2023-05-24

Family

ID=79163898

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2116706.9A Pending GB2613003A (en) 2021-11-19 2021-11-19 System and method

Country Status (2)

Country Link
GB (1) GB2613003A (en)
WO (1) WO2023089105A1 (en)


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108025644B (en) * 2015-09-18 2019-05-07 Nissan Motor Co., Ltd. Display apparatus and vehicle display methods
JP7077616B2 (en) * 2017-12-28 2022-05-31 トヨタ自動車株式会社 Display control device and display control method
JP7124529B2 (en) * 2018-08-01 2022-08-24 トヨタ自動車株式会社 vehicle controller
EP3924767A1 (en) * 2019-02-12 2021-12-22 CY Vision Inc. Holographic head-up display device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090189753A1 (en) * 2008-01-25 2009-07-30 Denso Corporation Automotive display device showing virtual image spot encircling front obstacle
US20140019005A1 (en) * 2012-07-10 2014-01-16 Samsung Electronics Co., Ltd. Transparent display apparatus for displaying information of danger element, and method thereof
US20180297520A1 (en) * 2017-04-12 2018-10-18 Toyota Jidosha Kabushiki Kaisha Warning device
DE102019122177A1 (en) * 2018-09-05 2020-03-05 Denso Corporation Forward collision avoidance indicator
CN210139859U (en) * 2019-06-28 2020-03-13 WM Smart Mobility Technology (Shanghai) Co., Ltd. Automobile collision early warning system, navigation and automobile
EP3922501A1 (en) * 2020-06-11 2021-12-15 Volkswagen Ag Control of a display of an augmented reality head-up display device for a method of transport

Also Published As

Publication number Publication date
GB202116706D0 (en) 2022-01-05
WO2023089105A1 (en) 2023-05-25

Similar Documents

Publication Publication Date Title
US11034297B2 (en) Head-up display and program
US9690104B2 (en) Augmented reality HUD display method and device for vehicle
US11181737B2 (en) Head-up display device for displaying display items having movement attribute or fixed attribute, display control method, and control program
CN107848415B (en) Display control device, display device, and display control method
US10269161B2 (en) Vehicular display device and vehicular display method
WO2018066695A1 (en) In-vehicle display control device
US20170084176A1 (en) Vehicle warning device
US20170269684A1 (en) Vehicle display device
WO2015163205A1 (en) Vehicle display system
US9463743B2 (en) Vehicle information display device and vehicle information display method
GB2550472B (en) Adaptive display for low visibility
US10488658B2 (en) Dynamic information system capable of providing reference information according to driving scenarios in real time
KR20190052374A (en) Device and method to visualize content
US20150375679A1 (en) Apparatus and method for displaying vehicle information
CN109788243B (en) System unreliability in identifying and visually presenting display enhanced image content
JP6277933B2 (en) Display control device, display system
CN111095078A (en) Method, device and computer-readable storage medium with instructions for controlling the display of an augmented reality head-up display device for a motor vehicle
US20200298703A1 (en) Method, device and computer-readable storage medium with instructions for controlling a display of an augmented-reality head-up display device for a transportation vehicle
KR20220032448A (en) Method and apparatus of correcting crosstalk
CN110891841A (en) Method and device for ascertaining the probability of an object being in the field of view of a vehicle driver
GB2613003A (en) System and method
GB2613004A (en) System and method
US9283891B1 (en) Alert systems and methods using a transparent display
US20220074753A1 (en) Method for Representing a Virtual Element
US20220072958A1 (en) Display apparatus