GB2616648A - System and method - Google Patents

System and method

Info

Publication number
GB2616648A
GB2616648A (Application GB2203665.1A, GB202203665A)
Authority
GB
United Kingdom
Prior art keywords
graphic
vehicle
component
user
multimedia data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2203665.1A
Other versions
GB202203665D0 (en)
Inventor
Bondar Oleg
Romashkin Sergii
Shtok Maxim
Mamonov Kirill
Skorokhod Aleksey
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wayray AG
Original Assignee
Wayray AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wayray AG filed Critical Wayray AG
Priority to GB2203665.1A
Publication of GB202203665D0
Priority to PCT/EP2023/056765 (WO2023175080A2)
Publication of GB2616648A
Legal status: Pending


Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G02B2027/0141 Head-up displays characterised by optical features characterised by the informative content of the display
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/23 Head-up displays [HUD]
    • B60K35/28 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • B60K35/80 Arrangements for controlling instruments
    • B60K35/81 Arrangements for controlling instruments for controlling displays
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16 Type of output information
    • B60K2360/164 Infotainment
    • B60K2360/177 Augmented reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)
  • Navigation (AREA)

Abstract

A system for displaying information to a user of a vehicle using augmented reality, AR, has a processor 100 which processes vehicle operation data from a vehicle operation data source 300 and multimedia data from a multimedia data source 400 to produce an AR graphic for display at a display unit 200. The AR graphic may have a first component which includes an applied spatial and temporal variation based on the multimedia data, for example one or more of volume, timing or frequency information in the multimedia data. The applied spatial and temporal variation may be provided in a first dimension, such as a vertical dimension. The AR graphic may also have a second component which may include one or more graphic portions. A property of the graphic portion(s) may be determined based on the vehicle operation data and the multimedia data. The property may be a dimension based on the vehicle operation data, and/or a duration of display of the graphic portion(s) based on the multimedia data, for example one or more of lyric timing, grammatic and/or pronunciation features, or current audio-track timing within the multimedia data. The second component may be a textual feature. A head-up display, HUD, or audio-visual, AV, system may include the system, and a vehicle, such as an autonomous vehicle, may include the system, HUD system or AV system.

Description

SYSTEM AND METHOD
Field
The present disclosure relates to a system for displaying information to a user of a vehicle using augmented reality (AR), a head-up display (HUD) system, a method of displaying information using AR and a computer, computer program and non-transient computer-readable storage medium.
Background to the invention
Known head-up display (HUD) systems used in vehicles such as cars and airplanes provide a vehicle user with computer-generated virtual graphics that augment the user's visual sensory perception of real-world features, or objects, viewable to the user during vehicle operation. The HUD systems are configured to generate virtual graphics and project the virtual graphics as images or text through or onto a windshield or other display screen so that the user can view the information while holding their head up and without taking their attention away from the real-world features viewable during vehicle operation.
It is desired to display such computer-generated virtual graphics in a manner which is consistent and well-matched with the user's visual sensory perception of real-world features, or objects, viewable to the user during vehicle operation. If the graphics displayed to the user are not well-matched with the user's visual sensory perception, the user may be disorientated or have difficulty in understanding or interpreting the vehicle dynamics.
This is particularly important in autonomous vehicle operation. In autonomous vehicle operation, the user has minimal involvement in vehicle operation. The user can understand the vehicle operation by instrumentation which indicates to the user the dynamics of the vehicle. However, if such instrumentation is not well-matched to the vehicle dynamics, the user may have difficulty in understanding or interpreting the autonomous vehicle dynamics. This may lead to reduced confidence in the autonomous vehicle and ultimately a worse user experience.
A further problem is how to display content for a user to interact with during vehicle operation. The display of content, which may be multimedia content, can enable the user to understand the vehicle dynamics and thus be used as instrumentation. However, if such content is not consistent and well-matched with the user's visual sensory perception of real-world features, or objects, viewable to the user during vehicle operation, then the user may be disorientated or may have difficulty in understanding or interpreting the vehicle dynamics. Furthermore, this may lead to a reduced level of engagement with the content displayed to the user.
Summary of the invention
It is one aim of the present invention, amongst others, to provide an improved system and/or method thereof and/or address one or more of the problems discussed above, or discussed elsewhere, or to at least provide an alternative system and/or method.
The summary statements which follow relate to a number of aspects. The aspects are aspects of the invention only where the system and method are as defined in the claims that follow.
The reader will appreciate that features of the aspects which do not fall within the scope of the invention may nevertheless be incorporated in aspects of the invention which do fall within the scope of the invention. For this reason, any aspects which lack these features are retained in order to provide useful background information to the reader.
A first aspect provides a system for displaying information to a user of a vehicle using augmented reality, AR, the system comprising a processor configured to: interact with a display unit for displaying information using AR in a field of view associated with the system, a vehicle operation data source for providing vehicle operation data to the processor, and a multimedia data source for providing multimedia data to the processor; process an AR graphic for display at the display unit, the AR graphic being based on the vehicle operation data and the multimedia data; and cause the display unit to display the AR graphic.
In one example, the processor is configured to: process the AR graphic for display at the display unit, the AR graphic having a first component, wherein the first component comprises an applied spatial and temporal variation based on the multimedia data.
In one example, the applied spatial and temporal variation is based on one or more of: volume information in the multimedia data; timing information in the multimedia data; frequency information in the multimedia data.
In one example, the applied spatial and temporal variation is provided in a first dimension. In one example, the first dimension is a vertical dimension.
In one example, the processor is configured to: process the AR graphic for display at the display unit, the AR graphic having a second component.
In one example, the second component of the AR graphic comprises one or more graphic portions.
In one example, a property of the one or more graphic portions is determined based on vehicle operation data and the multimedia data. In one example, the property is a dimension determined based on the vehicle operation data; and/or the property is a duration of display of the one or more graphic portions which is determined based on the multimedia data. In one example, the duration of display is determined based on one or more of: lyric timing data within the multimedia data; grammatic and/or pronunciation features of a lyric within the multimedia data; current audio-track timing within the multimedia data.
In one example, the second component is a textual feature.
In one example, the second component is located based on: a distance where the user should begin to interact with the second component, or a time when the user should begin to interact with the second component; a distance where the user should have already interacted with the second component, or a time when the user should have already interacted with the second component; and/or a start location of an animation to remove the second component or a graphic portion thereof and/or a stop location of an animation to remove the second component or a graphic portion thereof and/or a start time of an animation to remove the second component or a graphic portion thereof and/or a stop time of an animation to remove the second component or a graphic portion thereof.
In one example, the processor is configured to: process the AR graphic for display at an apparent fixed location relative to a feature in the field of view when the vehicle operation data indicates that the vehicle is above a threshold vehicle operation parameter.
In one example, the processor is configured to: process the AR graphic for display at an apparent time-varying location relative to a feature in the field of view when the vehicle operation data indicates that the vehicle is below a threshold vehicle operation parameter.
A further aspect provides a system for displaying information to a user of a vehicle using augmented reality, AR, the system comprising a processor configured to: interact with a display unit for displaying information using AR in a field of view associated with the system, and a multimedia data source for providing multimedia data to the processor; process an AR graphic for display at the display unit, the AR graphic being based on the multimedia data; and cause the display unit to display the AR graphic.
A further aspect provides a system for displaying information to a user of a vehicle using augmented reality, AR, the system comprising a processor configured to: interact with a display unit for displaying information using AR in a field of view associated with the system, a first data source for providing first data to the processor, and a second data source for providing multimedia data to the processor; process an AR graphic for display at the display unit, the AR graphic being based on the first data and the multimedia data; and cause the display unit to display the AR graphic.
In one example, the first data is vehicle operation data, user data, a user profile, a vehicle profile, environment data, field of view feature data, or other data.
A further aspect provides a head-up display, HUD, system comprising a system according to any system aspect.
A further aspect provides an audio/visual, AV, system comprising the system or HUD system according to any system or HUD system aspect, the AV system configured to provide an AV output related to the multimedia data provided by the multimedia data source.
A further aspect provides a vehicle comprising the system, HUD system or AV system according to any system, HUD system or AV system aspect.
In one example, the vehicle is an autonomous vehicle.
A further aspect provides a method of displaying information using augmented reality, AR, to a user of a vehicle, the method comprising: interacting with a display unit for displaying information using AR in a field of view, a vehicle operation data source for providing vehicle operation data to the processor, and a multimedia data source for providing multimedia data to the processor; processing an AR graphic for display at the display unit, the AR graphic being based on the vehicle operation data and the multimedia data; and causing the display unit to display the AR graphic.
A further aspect provides a computer comprising a processor and a memory configured to perform a method according to the aspect, a computer program comprising instructions which, when executed by a computer comprising a processor and a memory, cause the computer to perform a method according to a method aspect, or a non-transient computer-readable storage medium comprising instructions which, when executed by a computer comprising a processor and a memory, cause the computer to perform a method according to a method aspect.
It will of course be appreciated that features described in relation to one aspect of the present invention may be incorporated into other aspects of the present invention. For example, the method of any aspect of the invention may incorporate any of the features described with reference to the apparatus of any aspect of the invention and vice versa.
Other preferred and advantageous features of the invention will be apparent from the following description.
Brief description of the figures
For a better understanding of the invention, and to show how embodiments of the same may be carried into effect, reference will now be made, by way of example only, to the accompanying diagrammatic drawings in which: Figure 1 shows a schematic drawing of a system for displaying information to a user of a vehicle using augmented reality (AR); Figure 2 shows fields of view of the user and the display unit; Figure 3 shows a schematic drawing of a head-up display (HUD) system; Figure 4 shows a first view wherein an AR graphic is displayed; Figure 5 shows a second view wherein an AR graphic is displayed; Figure 6 shows a third view wherein an AR graphic is displayed; Figure 7 shows a fourth view wherein an AR graphic is displayed; Figure 8 shows a fifth view wherein an AR graphic is displayed; Figure 9 shows a sixth view wherein an AR graphic is displayed; Figure 10 shows a seventh view wherein an AR graphic is displayed; Figure 11 shows an eighth view wherein an AR graphic is displayed; Figure 12 shows a ninth view wherein an AR graphic is displayed; Figure 13 shows a vehicle comprising a system or HUD system; and Figure 14 shows general methodology principles.
Detailed description of the invention
According to the present invention there is provided a system, a head-up display system, a method, computer program and non-transient computer-readable storage medium as set forth in the appended claims. Other features of the invention will be apparent from the dependent claims, and the description which follows.
The description which follows describes a number of embodiments. The embodiments are embodiments of the invention only where the system and method are as defined in the claims that follow. The reader will appreciate that features of the embodiments which do not fall within the scope of the invention may nevertheless be incorporated in embodiments of the invention which do fall within the scope of the invention. For this reason, the descriptions of embodiments which lack these features are retained in order to provide useful background information to the reader.
Referring to Figure 1, there is shown a system 10 for displaying information to a user of a vehicle using augmented reality (AR). The system 10 comprises a processor 100. The processor 100 is configured to interact with a display unit 200. The processor 100 is configured to interact with a vehicle operation data source 300. The processor 100 is configured to interact with a multimedia data source 400. The processor 100 being configured to interact with the display unit 200, vehicle operation data source 300 and multimedia data source 400 may mean that the processor 100 is configured to receive information from and send information to the display unit 200, the vehicle operation data source 300 and multimedia data source 400. In particular, the vehicle operation data source 300 and multimedia data source 400 are each configured to provide data to the processor 100. The processor 100 is configured to receive data from each of the vehicle operation data source 300 and multimedia data source 400, accordingly.
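By way of illustration only, the interaction between the processor 100, the display unit 200 and the data sources 300, 400 described above can be sketched in code. The following Python sketch is not part of the application; all class and method names (for example read() and show()) are hypothetical stand-ins for whatever interfaces a concrete implementation provides.

```python
# Illustrative sketch only: names and interfaces are hypothetical, not from the application.
from dataclasses import dataclass


@dataclass
class ARGraphic:
    first_component: dict   # e.g. equaliser-like variation parameters
    second_component: dict  # e.g. textual lyric portions and their placement


class Processor:
    def __init__(self, display_unit, vehicle_data_source, multimedia_source):
        self.display_unit = display_unit
        self.vehicle_data_source = vehicle_data_source
        self.multimedia_source = multimedia_source

    def update(self):
        # Receive data from both sources (system 10, Figure 1).
        vehicle_data = self.vehicle_data_source.read()
        multimedia_data = self.multimedia_source.read()
        # Process an AR graphic based on both data sets ...
        graphic = self.process_graphic(vehicle_data, multimedia_data)
        # ... and cause the display unit to display it.
        self.display_unit.show(graphic)

    def process_graphic(self, vehicle_data, multimedia_data) -> ARGraphic:
        # Placeholder: a real implementation would build the first and
        # second components described in the detailed description below.
        return ARGraphic(first_component={}, second_component={})
```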
As will be described in further detail below, the system 10 enables displayed AR graphics to be well-matched with the user's visual sensory perception of real-world features, or objects, viewable to the user during vehicle operation, because the AR graphics are based on vehicle operation data and multimedia data. Where implemented in an autonomous vehicle, the AR graphic being matched to the environment external to the vehicle enables the AR graphic to be used as a form of instrumentation to the user. Whether consciously or subconsciously, the user can use the AR graphic to interpret that the vehicle is, for example, maintaining a constant speed, accelerating, braking and/or changing direction. This means that the user is able to receive and process technical information, but without the need to interact with a dedicated display or interface, for example a speedometer or accelerometer. This means that there is synergy for the user. There is also synergy for the system, in that fewer dedicated technical displays or interfaces might be needed.
Furthermore, where the AR graphic incorporates content, such as multimedia content, the content is well-matched with the user's perception of real-world features, improving readability, usability, and user engagement. This is facilitated by processing the AR graphic based on vehicle operation data, which ensures that, for example, changes of speed or position of the vehicle can be accommodated for in the display of the AR graphic. Ensuring user engagement with the autonomous vehicle is important from a safety and redundancy aspect.
Additionally, by basing the AR graphic on vehicle operation data and multimedia data, multimedia content displayed to the user can be located, sized and provided for an appropriate time period dependent upon the vehicle operation. This ensures that the user can easily read or otherwise interact with the multimedia content.
Here, "a user of a vehicle" is typically a driver, or operator, of the vehicle. However, the user may be one or more other users of the vehicle, such as a passenger. In an example, the vehicle may be operable in a fully autonomous mode, including a mode without any direct control from a user of the vehicle (e.g., a driver). In this case, the driver may indeed be a vehicle passenger. In a fully autonomous mode, the system 10 (and/or HUD system 50 as described below) may augment the field of view by displaying AR graphics on the display unit 200 to encourage the user to maintain their attention on the road ahead, despite not controlling the vehicle themselves. This may be referred to as "gamification" of the display. Even more so, if not driving or controlling the vehicle, games may be played whilst using the vehicle. The gaming may involve the use of the AR graphics.
For the avoidance of doubt, the system 10 may be able to be retrofitted to a vehicle comprising a display unit 200, a vehicle operation data source 300, and a multimedia data source 400; such a retrofit could be a physical retrofit, or a software update or upgrade. In this case, the system 10 itself need not comprise the display unit 200, vehicle operation data source 300, and multimedia data source 400. However, in an alternative construction, which does not involve retrofit, the system 10 may comprise a display unit 200, a vehicle operation data source 300, and a multimedia data source 400.
The display unit 200 is for displaying information using AR in a field of view (FoV) associated with the system 10. The display unit 200 may comprise, or be in the form of, a holographic display unit. A holographic display is a type of display that utilizes light diffraction to create a virtual three-dimensional image. The holographic display unit may comprise a combiner. The combiner may be the windshield (i.e., the whole or a part of the windshield). That is, the present system 10 is an optical see-through system. Optical see-through systems allow the user to view the real-world "directly". These AR systems add virtual content, in the form of AR graphics, by adding additional light on top of the light coming in from the real-world.
Referring to Figure 2, a plan view of an exemplary FoV of the user and FoV associated with the system 10 is shown. The FoV of the user (herein referred to as the "user FoV") is indicated at 20, and the FoV associated with the system 10 is indicated at 30.
In this example, the FoV associated with the system 10 is a FoV of the display unit 200. As shown, the FoV of the user is greater than the FoV of the display unit 200. Typically, the FoV of a user ranges from 120 to 200 degrees. The FoV of the display unit has a vertical FoV of approximately 4 degrees, and a horizontal FoV in the range of 15 to 20 degrees. From this figure, it will be understood that the display unit 200 displays information using AR in a FoV of the user.
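As a rough illustrative calculation only (not taken from the application), these angular values can be converted into the linear extent of the augmentable area at a chosen viewing distance:

```python
import math


def augmentable_extent(fov_degrees: float, distance_m: float) -> float:
    """Linear extent subtended by an angular field of view at a given distance."""
    return 2.0 * distance_m * math.tan(math.radians(fov_degrees) / 2.0)


# Example: at 20 m ahead of the vehicle, a 4 degree vertical FoV and a
# 15-20 degree horizontal FoV correspond roughly to:
print(round(augmentable_extent(4, 20), 2))    # ~1.4 m of vertical extent
print(round(augmentable_extent(15, 20), 2))   # ~5.27 m of horizontal extent
print(round(augmentable_extent(20, 20), 2))   # ~7.05 m of horizontal extent
```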
In the description herein, the "FoV of the display unit" means a FoV of an augmentable area.
This is the part of a human visual field where the display unit 200 can show virtual content, to augment the view of the real-world. That is, the FoV of the display unit 200 is an area which is observable by the user and which the display unit 200 is able to augment the user's view of the real-world. In this sense, it may be possible to refer to the FoV of the display unit as "the field of view of a user using the system 10" or the "field of view of the user using/viewing the display unit 200". In a HUD system, for example HUD system 50, this may be referred to as the "field of view of the user using the HUD system 50".
In the description which follows, the FoV of the display unit 200 will be referred to simply as the "FoV".
It will be understood that the display unit 200 is located in an area which is observable by the user. In a specific example, this means that the display unit 200 is located at, or in front of (i.e., toward the user), the windshield through which the user observes the road ahead when using the vehicle. In other examples, the display unit 200 may be provided at a side or rear of the vehicle, and thus the FoV may be to the side or rear of the vehicle.
In the figures, user FoVs and FoVs of the display unit 200 are illustrated as rectangular FoVs. It will be appreciated that in practice, such FoVs are not rectangular.
As introduced above, the vehicle operation data source 300 is for providing data to the processor 100. Similarly, the multimedia data source 400 is for providing data to the processor 100. The processor 100 is configured to display information to the user, at the display unit 200. The information provided to the user is based on the data provided by the vehicle operation data source 300 and the multimedia data source 400. The information provided to the user may be based on, dependent upon, or representative of, the data provided by the vehicle operation data source 300 and multimedia data source 400.
Each of the vehicle operation data source 300 and multimedia data source 400 may be a system, a component, or may be a store of data. For example, the vehicle operation data source 300 may be a speedometer and/or accelerometer. The vehicle operation data may be provided directly to the processor 100 by a vehicle system or component. That is, the vehicle operation data source 300 may be a system or component of the vehicle configured to provide vehicle operation data (e.g. speed, acceleration and/or direction-of-travel data). Alternatively, the vehicle operation data source 300 may be a global positioning system (GPS), mapping system/service or other navigation system. Alternatively, the data source 300 may be a component such as a detector or a memory. A detector may be a camera, a RADAR and/or a LIDAR. Alternatively, the data source 300 may be a store of data such as a table of data. The data source 300 may comprise one or more systems, components, or data stores.
The vehicle operation data provided to the processor 100 may be speed data, acceleration data, direction-of-travel data, GPS data, navigation data, route data and/or feature data. Navigation data may be according to the Navigation Data Standard (NDS).
GPS data may comprise a set of geocoordinates received from a mapping service. Navigation data and route data comprise data points received from a navigation or routing system. Navigation data and route data may comprise a set of data points relating to an intended path of the vehicle.
The vehicle operation data may relate to dynamics of the vehicle, for example a current and/or future velocity of the vehicle, and/or a current and/or future acceleration of the vehicle; direction-of-travel data; a position in a lane or relative to a road marking or feature; a road plane feature; a vehicle position; a visible distance border; a shifted route position; and/or a position correction.
Feature data may relate to a feature detected in an environment external to the vehicle. A feature may include another vehicle, such as a road-going vehicle or off-road vehicle. A feature may also include road-markings or general road furniture. A feature could be a human or animal. A feature may include road layout, or road lane layout. Feature data may relate to all of these features. The features to which the feature data relates may vary for different uses or modes or types of system, vehicle or scenario.
Here, "an environment external to the vehicle" may include features in-front of the vehicle, to the sides of the vehicle and also behind the vehicle. That is, the feature data may relate to features outside the FoV.
In an embodiment, the vehicle operation data provided by the vehicle operation data source 300 relates to the motion of the vehicle, position of the vehicle relative to lanes and/or environmental features.
In an embodiment, the multimedia data provided by the multimedia data source 400 relates to lyric timing data, grammatical features in lyrics or text, and/or audio track timing. As stated above, the multimedia data source 400 may be a system, a component, or may be a store of data. The multimedia data source 400 may be a multimedia system, a component such as a memory or data provider in a multimedia system, or a store of multimedia data.
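Purely for illustration, the data items listed above might be grouped as follows. The field names are assumptions introduced here for readability and are not defined by the application.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple


@dataclass
class VehicleOperationData:
    # Items the vehicle operation data source 300 may provide (names are illustrative).
    speed_mps: Optional[float] = None
    acceleration_mps2: Optional[float] = None
    heading_deg: Optional[float] = None
    lane_offset_m: Optional[float] = None          # position relative to lane markings
    route_points: List[Tuple[float, float]] = field(default_factory=list)  # GPS/route data
    detected_features: List[str] = field(default_factory=list)             # e.g. vehicles, signs


@dataclass
class MultimediaData:
    # Items the multimedia data source 400 may provide (names are illustrative).
    track_position_s: float = 0.0                  # current audio-track timing
    volume_level: float = 0.0                      # volume/intensity information
    bpm: Optional[float] = None                    # timing information
    lyric_lines: List[Tuple[float, float, str]] = field(default_factory=list)
    # (start_s, end_s, text) triples giving lyric timing data
```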
The processor 100 is configured to process an AR graphic for display at the display unit 200.
An AR graphic is a graphic that is to be displayed using augmented reality (AR). The processor 100 is configured to process the AR graphic in a number of different ways, as will be described in further detail below.
The processor 100 is configured to cause the display unit 200 to display the AR graphic.
Whilst the processor 100 causing the display unit 200 to display the AR graphic will likely be implemented in practice, it may be viewed as optional in certain embodiments. For example, a core differentiating feature is the actual processing of an AR graphic based on the vehicle operation data and the multimedia data. In terms of key features that differentiate the invention, causing the display unit 200 to display the AR graphic may be optional.
Of course, the skilled person will understand how an AR graphic may be displayed on a display unit 200, and the above provides illustrative examples of how an AR graphic may be displayed on a display unit 200.
Referring to Figure 3, there is shown a head-up display (HUD) system 50. The HUD system 50 comprises a picture generation unit 500, a combiner unit 600 and a corrector unit 700.
A combiner and corrector may be useful in, for example, a holographic system, or other system where a user is presented with a surface that is looked through, yet provided with graphical features by the system. The combiner might be, for example an at least partially (e.g., semi) transparent surface used in the system, to overlay an image presented by a projector (which may be part of or separate to the picture generation unit 500) on top of the user's physical world. The combiner is at least partially transparent and lets the user see through it, while simultaneously reflecting or otherwise directing light (e.g., AR features) to the user. The corrector unit may be used to correct aberrations, filter, and/or to improve light utilization efficiencies, or generally correct an optical property of some kind. In some examples, one or more optical devices or lenses may be used, including filters.
The system 10 described herein may be incorporated in the HUD system 50. The system 10 and/or HUD system 50 may be incorporated, or form part of, an advanced driver-assistance system (ADAS).
Although the present disclosure relates to systems 10 and HUD systems 50 for installation in vehicles, the present disclosure could also be applied in the context of head-mounted displays (HMDs). For example, the system 10 could be incorporated in a pair of smart-glasses or other wearable device.
Furthermore, the system 10 or HUD system 50 may be incorporated in an audio/visual (AV) system. That is, an AV system may comprise a system 10 or HUD system 50 according to an embodiment described herein. The AV system may be in connection with, or comprise, the multimedia data source, according to an embodiment described herein. The AV system is configured to provide an AV output related to the multimedia data provided by the multimedia data source. In an advantageous embodiment, the AV output includes an audio track, such as a music track. AR graphics output to the user may include a textual feature relating to the words or lyrics of the audio track. The audio track may be provided simultaneously to the user by the AV system. For this purpose, the AV system comprises speakers and, optionally, amplification apparatus. In some embodiments, an audio system may be connected to the system 10 or HUD system 50, with the system 10 or HUD system 50 providing visual output and the audio system providing audio output. In this way, the AV system may be, or form part of, a karaoke system incorporating AR.
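A minimal sketch of how lyric display might be synchronised with the AV output, assuming lyric timing data of the form sketched earlier (start time, end time, text); the function name and data format are illustrative, not from the application:

```python
def current_lyric(lyric_lines, track_position_s):
    """Return the lyric line whose timing window contains the current playback position.

    lyric_lines: list of (start_s, end_s, text) triples (illustrative format).
    """
    for start_s, end_s, text in lyric_lines:
        if start_s <= track_position_s < end_s:
            return text
    return None


lyrics = [(12.0, 15.5, "First line of the verse"),
          (15.5, 19.0, "Second line of the verse")]
print(current_lyric(lyrics, 16.2))  # -> "Second line of the verse"
```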
Figures 4 to 12 each show a forward view through the vehicle windshield wherein an AR graphic is displayed in a FoV. Operation and functionality of the system 10 will be described with reference to the figures. In each figure, an AR graphic is shown wherein the AR graphic is based on vehicle operation data from the vehicle operation data source 300 and multimedia data from the multimedia data source 400.
In one embodiment, the processor 100 is configured to generate the AR graphic. That is, the processor 100 itself generates the AR graphic. In this way, no additional components for generation of the AR graphic may be necessary.
In another embodiment, the processor 100 is configured to receive the AR graphic. That is, the processor 100 itself need not generate the AR graphic. This may reduce the computational load on the processor 100, whilst allowing dedicated componentry to generate the AR graphic.
In general, applicable to all figures, a system 10 is described in which multimedia data is output to a user of the system 10. In the example illustrated, the multimedia data is an audio track (e.g., a music track). As the audio track plays, words or lyrics are displayed to the user in the form of an AR graphic comprising textual information. The words or lyrics are placed above the road surface, and are provided in graphic portions of suitable lengths and displayed for such a duration that the user may read the words or lyrics as the audio track is played. An equaliser-like AR graphic component is also displayed to the user, illustrating properties such as a beat, a track timing, or frequency, of the audio track. Such a system engages the vehicle user, and also serves as a form of instrumentation to allow the user to determine vehicle dynamics. For example, when the vehicle is stationary, or below a threshold vehicle operation parameter, the AR graphic may have a time-varying location relative to a feature in the field of view. Alternatively, when the vehicle is moving, or above a threshold vehicle operation parameter, the AR graphic may have a fixed location relative to a feature in the field of view. In this way, engagement with the user of the vehicle is facilitated, whilst the user may use the AR graphic as a form of instrumentation to understand the dynamics of the vehicle. Further details of operation of the system 10 and functionality of the processor 100 are described below.
It is intended that the user can interact with the AR graphic displayed on the display unit 200. Interacting with the AR graphic may include, for example, reading the AR graphic, or a textual feature included in the AR graphic. The user may read or sing a lyric after interacting with the AR graphic. That is, in an embodiment, the system 10 may be, or form part of, a karaoke system incorporating AR.
In another exemplary embodiment, the user can interact with the AR graphic displayed on the display unit 200 to cause the AR graphic to change its appearance. For example, the user may interact with the AR graphic by selecting the AR graphic. This may be achieved by use of a remote device, vehicle component, or by monitoring movement of a user's body part, such as a hand or arm. Such functionality may be incorporated in an AR game, thereby encouraging the user to maintain their attention on the road ahead, despite not controlling the vehicle themselves. This may allow the user to interact with the environment by controlling an AR game environment. As above, this may be referred to as "gamification" of the display. Even more so, if not driving or controlling the vehicle, games may be played whilst using the vehicle.
Referring to Figures 4 to 7, first, second, third and fourth views are shown in which an AR graphic 4110 is displayed.
A user FoV is indicated at 4000. A FoV (that is, the FoV of the display unit 200) is indicated at 4100.
The processor 100 is configured to process the AR graphic 4110 for display at the display unit 200 based on the vehicle operation data and the multimedia data from the data sources 300, 400.
In this way, the AR graphic 4110 is well-matched with the user's visual sensory perception of real-world features, or objects, viewable to the user during vehicle operation, because the AR graphic 4110 is based on vehicle operation data and multimedia data. Where implemented in an autonomous vehicle, the AR graphic 4110 being matched to the environment external to the vehicle enables the AR graphic 4110 to be used as a form of instrumentation to the user. Whether consciously or subconsciously, the user can use the AR graphic 4110 to interpret that the vehicle is, for example, maintaining a constant speed, accelerating, braking and/or changing direction. This means that the user is able to receive and process technical information, but without the need to interact with a dedicated display or interface, for example a speedometer or accelerometer. This means that there is synergy for the user. There is also synergy for the system, in that fewer dedicated technical displays or interfaces might be needed.
Furthermore, where the AR graphic 4110 incorporates content, such as multimedia content, the content is well-matched with the user's perception of real-world features, improving readability, usability, and user engagement. This is facilitated by processing the AR graphic 4110 based on vehicle operation data, which ensures that, for example, changes of speed or position of the vehicle can be accommodated for in the display of the AR graphic 4110. Ensuring user engagement with the autonomous vehicle is important from a safety and redundancy aspect.
Additionally, by basing the AR graphic on vehicle operation data and multimedia data, multimedia content displayed to the user can be located, sized and provided for an appropriate time period dependent upon the vehicle operation. This ensures that the user can easily read or otherwise interact with the multimedia content.
The processor 100 is configured to process the AR graphic 4110 for display at the display unit 200, the AR graphic 4110 having a first component 4112. Additionally, the processor 100 is configured to process the AR graphic for display at the display unit, the AR graphic 4110 having a second component 4114. Figures 4 to 7 each show an AR graphic comprising a first component and second component. Whilst Figures 8 to 12 each show an AR graphic comprising only a second component, it will be readily understood that the AR graphics of said figures may further comprise a first component as described herein.
The first component 4112 comprises an applied spatial and temporal variation. The applied spatial and temporal variation is based on the multimedia data from the multimedia data source 400.
In one embodiment, the processor 100 is configured to generate the AR graphic comprising the applied spatial and temporal variation. That is, the processor 100 itself generates the AR graphic. In this way, no additional components for generation of AR graphic may be necessary.
In another embodiment, the processor 100 is configured to receive the AR graphic comprising the applied spatial and temporal variation. That is, the processor 100 itself need not generate the AR graphic. This may reduce computational load on the processor 100, whilst allowing dedicated componentry to generate the AR graphic.
The spatial and temporal variation may be referred to as, or be, or take the form of, a "lightning bolt appearance", "lightning bolt variation", or "lightning bolt manner". The first component 4112 may be located in the FoV 4100 based on, for example, a fit to an intended travel path of the vehicle. One example of such a fit is a "route spline". A route spline may be based on waypoints, or lane markings or boundaries.
The applied spatial and temporal variation of the first component 4112 is based on one or more of: volume information (e.g. sound level or intensity) in the multimedia data; timing information in the multimedia data; frequency information in the multimedia data. In some cases, these may all be referred to generally as "multimedia dynamics" or "sound dynamics".
In a specific embodiment, the multimedia data source 400 provides data about a music track to the processor 100. The processor 100 processes the AR graphic based on information in the multimedia data. In particular, the multimedia data may contain volume information. The applied spatial and temporal variation of the first component 4112 is provided in a first dimension. The first dimension may be a horizontal dimension (i.e., a horizontal amplitude change) or a vertical dimension (i.e., a vertical amplitude change), as viewed by the user. Preferably, the first dimension is a vertical amplitude change.
In an example, an increase in volume of the music track (as indicated by the multimedia data) results in an increase in vertical amplitude of the applied spatial and temporal variation of the first component 4112. Accordingly, a reduction in volume of the music track (as indicated by the multimedia data) results in a decrease in vertical amplitude of the applied spatial and temporal variation of the first component 4112.
In a further example, the multimedia data may contain timing information. Again, the applied spatial and temporal variation of the first component 4112 is provided in a first dimension, as described above.
In an example, the applied spatial and temporal variation of the first component 4112 may be based on timing information, for example beats-per-minute (BPM) of the music track. An increase in BPM of the music track (as indicated by the multimedia data) results in an increase in frequency of maximum vertical amplitude of the applied spatial and temporal variation of the first component 4112. In an example, the frequency of maximum vertical amplitude of the first component 4112 corresponds with the BPM of the music track.
In an example, the applied spatial and temporal variation of the first component 4112 may be based on frequency information. An increase in frequency, or pitch, of the music track (as indicated by the multimedia data) results in an increase in maximum vertical amplitude of the applied spatial and temporal variation of the first component 4112.
In general, the first component 4112 may be considered to act as an "equaliser" which provides a visual indication of the multimedia data, or in a specific example, the music track.
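A minimal sketch, under assumed data and parameter names, of how such an equaliser-like amplitude could be derived from volume, beat timing (BPM) and frequency information; none of these formulas are specified by the application:

```python
import math


def equaliser_amplitude(volume_level: float, bpm: float, pitch_hz: float,
                        track_position_s: float, max_height_m: float = 1.0) -> float:
    """Illustrative vertical amplitude for the equaliser-like first component.

    Louder audio and higher pitch raise the amplitude; the amplitude peaks
    in time with the beat (beats-per-minute), as described above.
    """
    beat_phase = math.cos(2.0 * math.pi * (bpm / 60.0) * track_position_s)  # peaks on the beat
    pitch_factor = min(pitch_hz / 1000.0, 1.0)         # crude normalisation of frequency content
    level = volume_level * (0.5 + 0.5 * pitch_factor)  # volume scaled by pitch
    return max_height_m * level * (0.5 + 0.5 * beat_phase)


# Example: a loud, high-pitched passage sampled exactly on a beat.
print(round(equaliser_amplitude(volume_level=0.8, bpm=120, pitch_hz=800,
                                track_position_s=2.0), 3))  # -> 0.72
```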
As the vehicle travels forward in-motion, the user perceives that they are travelling toward the first component 4112 of the AR graphic 4110. At the same time, the first component 4112 has spatial and temporal variation, as described above. In this way, the user can interpret that the vehicle is in-motion, and thus the first component 4112 may be used as a form of instrumentation. Correspondingly, when the vehicle is stationary, the first component 4112 of the AR graphic 4110 may be provided at a fixed location relative to features external to the vehicle. The first component 4112 may vary only, for example, in a vertical dimension. The user can therefore interpret that the vehicle is stationary by reference to the first component 4112.
The processor 100 is configured to process the AR graphic for display at the display unit, the AR graphic 4110 having a second component 4114. The second component 4114 comprises one or more graphic portions 4114a, 4114b, 4114c. In Figures 4 and 5, the AR graphic 4110 can be seen to comprise two graphic portions 4114a, 4114b. In Figures 6 and 7, the AR graphic 4110 can be seen to comprise two graphic portions 4114b, 4114c.
Providing graphic portions, such as graphic portions 4114a, 4114b, 4114c, allows the multimedia data to be separated into portions of a suitable length for user interaction. Furthermore, graphic portions can be separated so as to provide instrumentation to the user by illustrating vehicle operation, which enables the user to interpret vehicle dynamics by reference to the graphic portions. The graphic portions may be spaced apart in a direction of vehicle travel. That is, the graphic portions may be spaced apart in a direction extending away from the vehicle.
As shown in the figures, the second component 4114 is a textual feature. For example, the second component 4114 may be a graphic displaying words or lyrics of a music track, video, audiobook or other multimedia.
The processor 100 is configured to process the AR graphic 4110, including the first component 4112 and second component 4114, to locate them in the field of view based on various parameters, including:
* current vehicle motion, for example speed, in vehicle operation data;
* timing information in multimedia data;
* grammatic and pronunciation features in multimedia data;
* current timing in the multimedia data, for example audio-track timing;
* lane detection data in vehicle operation data;
* road plane detection data in vehicle operation data;
* vehicle position in vehicle operation data;
* visible distance borders in vehicle operation data; and
* shifted route position and position correction in vehicle operation data.
Specifically, a property of the graphic portions 4114a, 4114b, 4114c is determined based on the vehicle operation data and the multimedia data. In this way, the user may understand vehicle operation, consciously or subconsciously, from the property of the graphic portions 4114a, 4114b, 4114c. In one embodiment, the property is a dimension determined based on the vehicle operation data. The dimension may be a length, width or height of the graphic portions 4114a, 4114b, 4114c. For example, when the vehicle velocity is high, the width of the graphic portions 4114a, 4114b, 4114c may be greater than the width when the vehicle velocity is comparatively lower. Additionally, in said embodiment, the property could also be a duration (i.e., a length of time) for displaying the graphic portions 4114a, 4114b, 4114c. The duration is based on the multimedia data. The duration is determined based on one or more of: lyric timing data; grammatic and pronunciation features of a lyric; current audio-track timing.
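As an illustrative sketch only (the scaling constants and function names are assumptions, not taken from the application), the width of a graphic portion and its display duration might be derived as follows:

```python
def portion_width_m(speed_mps: float, base_width_m: float = 3.0,
                    scale_s: float = 0.2) -> float:
    """Illustrative width of a lyric graphic portion: wider at higher vehicle speed
    so that the text remains legible as the vehicle approaches it."""
    return base_width_m + scale_s * speed_mps


def portion_duration_s(lyric_start_s: float, lyric_end_s: float,
                       track_position_s: float, lead_s: float = 1.0) -> float:
    """Illustrative display duration: show the portion slightly before its lyric
    starts and keep it until the lyric ends."""
    appear_at = max(lyric_start_s - lead_s, track_position_s)
    return max(lyric_end_s - appear_at, 0.0)


# Example: at 30 m/s (motorway speed) a portion is drawn ~9 m wide and the
# line sung between t=15.5 s and t=19.0 s is shown for ~4.5 s.
print(portion_width_m(30.0))                 # -> 9.0
print(portion_duration_s(15.5, 19.0, 10.0))  # -> 4.5
```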
The second component 4114 can be located based on a number of factors. In one embodiment, the second component 4114 is located based on a distance where the user should begin to interact with the second component 4114, or a time when the user should begin to interact with the second component 4114. That is, the second component 4114 may be displayed at a suitable apparent distance for the user to interact with (e.g., read) the second component 4114, and at a suitable time for the user to interact with the second component 4114. This distance and/or time may be based on the multimedia data. In this way, the system 10 ensures that the user can interact with and interpret the second component 4114. The user can interact with the second component 4114 by reading the second component 4114, or by singing in response to reading the second component 4114 in a system 10 that is a karaoke system incorporating AR.
In another embodiment, the second component 4114 is located based on a distance where the user should have already interacted with the second component 4114, or a time when the user should have already interacted with the second component 4114. That is, the second component 4114 may be located at an apparent location where it is to be removed from the display. It is expected that the user would have interacted with (e.g., read) the second component 4114 and so it is appropriate to remove the second component 4114 from the display to avoid distracting the user with unnecessary AR graphic portions. Similarly, in this way, the system 10 ensures that the user can interact with and interpret the second component 4114.
In another embodiment, the second component 4114 is located based on a start location of an animation to remove the second component 4114 or a graphic portion 4114a, 4114b, 4114c thereof and/or a stop location of an animation to remove the second component 4114 or a graphic portion 4114a, 4114b, 4114c thereof and/or a start time of an animation to remove the second component 4114 or a graphic portion 4114a, 4114b, 4114c thereof and/or a stop time of an animation to remove the second component 4114 or a graphic portion 4114a, 4114b, 4114c thereof. In this way, the second component 4114 or graphic portions 4114a, 4114b, 4114c thereof can be removed from the display by a disappearance animation. This may include a "fade-out" animation. Removing the AR graphic in this way can increase user engagement and intelligibility of the system.
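A minimal sketch, with assumed lead times, of how the placement and removal distances discussed above could be derived from vehicle speed; the numbers and names are illustrative only:

```python
def placement_distances(speed_mps: float, read_lead_s: float = 2.0,
                        remove_after_s: float = 0.5):
    """Illustrative placement of the second component along the route.

    Returns (begin_interaction_m, remove_at_m): the apparent distance ahead
    of the vehicle where the user should begin to read the text, and the
    distance at which a disappearance animation could start once the user
    is expected to have read it.
    """
    begin_interaction_m = speed_mps * read_lead_s
    remove_at_m = speed_mps * remove_after_s
    return begin_interaction_m, remove_at_m


# Example: at 20 m/s the text could appear ~40 m ahead and start fading
# once it is ~10 m ahead of the vehicle.
print(placement_distances(20.0))  # -> (40.0, 10.0)
```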
The processor 100 is configured to change a property or characteristic of the second component 4114 of the AR graphic 4110, or a portion thereof, at a distance or time at which it is intended for the user to engage with the AR graphic 4110. That is, for example, the second component 4114 may be highlighted in a different colour, or enlarged in size, when it is intended for the user to read the textual feature of the second component 4114.
The processor 100 is configured to process the AR graphic for display at a fixed location relative to a feature in the field of view when the vehicle operation data indicates that the vehicle is above a threshold vehicle operation parameter. This is particularly noticeable by comparing Figures 4 and 5, and Figures 6 and 7. In these figures, the vehicle is maintaining a substantially constant velocity above a threshold velocity. In particular, the vehicle is travelling on a motorway/freeway. The second component 4114 is displayed at an apparent fixed location relative to features in the user field of view. Being displayed at an apparent fixed location relative to features in the user field of view means that the second component 4114 appears to track said features, or follow said features, despite motion of the vehicle.
By processing the AR graphic in said manner when the vehicle is above a threshold vehicle operation parameter, the user can determine that the vehicle is above a threshold vehicle operation parameter simply by reference to the AR graphic. For example, the user may determine that the vehicle is above a threshold velocity by acknowledging that the AR graphic is displayed at a fixed location relative to a feature in the field of view despite the vehicle motion. This is particularly advantageous in an autonomous vehicle setting, where the user does not control vehicle operation.
As can be seen in Figure 4, the graphic portion 4114a is displayed at a location relative to the environment, and in Figure 5, the graphic portion 4114a remains at that (fixed) location relative to the environment whilst the vehicle has travelled toward the apparent location of the graphic portion 4114a.
Similarly, as can be seen in Figure 6, the graphic portion 4114b is displayed at a location relative to the environment, and in Figure 7, the graphic portion 4114b remains at that (fixed) location relative to the environment whilst the vehicle has travelled toward the apparent location of the graphic portion 4114b.
Referring to Figures 8 to 12, fifth, sixth, seventh, eighth and ninth views are shown in which an AR graphic 8110 is displayed.
A user FoV is indicated at 8000. A FoV (that is, the FoV of the display unit 200) is indicated at 8100.
The processor 100 is configured to process the AR graphic 8110 for display at the display unit 200 based on the vehicle operation data and the multimedia data from the data sources 300, 400.
The description of processing the AR graphic 4110 provided in relation to Figures 4 to 7 is also relevant and equally applicable in describing the AR graphic 8110 displayed in Figures 8 to 12. In particular, the processor is configured to process the AR graphic 8110 for display at the display unit 200 based on the vehicle operation data and the multimedia data from the data sources 300, 400. As mentioned above, whilst Figures 8 to 12 each show an AR graphic 8110 comprising only a second component, it will be readily understood that the AR graphics of said figures may further comprise a first component as described herein.
The processor 100 is configured to process the AR graphic for display at the display unit, the AR graphic 8110 having a second component 8114. The second component 8114 comprises one or more graphic portions 8114a, 8114b, 8114c.
As illustrated in Figures 8 to 12, the processor 100 is configured to: process the AR graphic 8110 for display at an apparent time-varying location relative to a feature in the field of view when the vehicle operation data indicates that the vehicle is below a threshold vehicle operation parameter.
In Figure 8, the vehicle is decelerating, but is travelling at a velocity above a threshold velocity.
In particular, the vehicle is travelling in a residential area and is approaching a stop sign. The graphic portions 8114a, 8114b, 8114c are displayed at an apparent fixed location relative to features in the user field of view, in a similar manner to that described above.
In Figure 9, the vehicle has reached the stop sign and has decelerated below the threshold velocity. In this case, the vehicle is stationary at the stop sign. The graphic portions 8114a, 8114b, 8114c are displayed at an apparent time-varying location relative to features in the user field of view. That is, the graphic portions 8114a, 8114b, 8114c are displayed at a first apparent location and are then moved to a second apparent location toward the user, based on the multimedia data, for example timing information indicating when the user is intended to interact with that particular graphic portion 8114a, 8114b, 8114c. Thus, the AR graphic is displayed at an apparent time-varying location relative to features in the field of view.
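A minimal sketch of how timing information in the multimedia data might drive this motion toward the user is given below. It is illustrative only: the linear motion, the default distances and all parameter names are assumptions, not the method of the specification.

```python
# Purely illustrative sketch: interpolating the apparent distance of a graphic
# portion toward the user from timing information in the multimedia data (e.g.
# when the user is intended to sing that lyric). The linear motion, the default
# distances and all names are hypothetical assumptions.

def apparent_distance_m(now_s: float, appear_time_s: float, interact_time_s: float,
                        far_m: float = 30.0, near_m: float = 5.0) -> float:
    """Apparent distance of a graphic portion at time now_s (seconds)."""
    if interact_time_s <= appear_time_s or now_s <= appear_time_s:
        return far_m  # not yet moving toward the user
    if now_s >= interact_time_s:
        return near_m  # final apparent location before the disappearance animation
    progress = (now_s - appear_time_s) / (interact_time_s - appear_time_s)
    return far_m + progress * (near_m - far_m)
```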
By processing the AR graphic in said manner when the vehicle is below a threshold vehicle operation parameter, the user can determine that the vehicle is below a threshold vehicle operation parameter simply by reference to the AR graphic. For example, the user may determine that the vehicle is below a threshold velocity by observing that the AR graphic is displayed at an apparent time-varying location relative to a feature in the field of view. This is particularly advantageous in an autonomous vehicle setting, where the user does not control vehicle operation.
In this way, when the processor 100 determines that a vehicle operation parameter, for example velocity, is below a threshold, the graphic portions of the second component 8114 can still be displayed to the user in a manner in which they are visible, thus allowing the user to interact with the AR graphic. Interacting with the AR graphic may, for example, include reading the AR graphic and, in particular, the textual feature of the graphic portions. The user can read the AR graphic and sing in response, and in this way karaoke functionality can be provided. The user of the system 10 can interpret the time-varying location of the AR graphic to mean that a vehicle operation parameter is below a threshold level. That is, for example, by reference only to the AR graphic 8110 the user can understand that the vehicle is slowing to a stop, or is stationary.
In Figure 10, the vehicle remains at the stop sign. The graphic portions are moved toward the user to a final location which is consistent between Figures 9 to 11. That is, the graphic portions are each moved to the final location before being removed from the field of view by a disappearance animation.
In Figure 11, the vehicle remains at the stop sign. Again, the graphic portions are moved toward the user to an apparent final location. By comparing Figures 10 and 11, it can be seen how graphic portion 8114f moves from an apparent distance greater than that of graphic portion 8114e (in Figure 10) to an apparent distance closer to the vehicle (in Figure 11).
In Figure 12, the vehicle begins to accelerate from the stop sign. As this is determined by the processor from the vehicle operation data, the graphic portions 8114h, 8114i of the second component 8114 are again displayed at an apparent fixed location relative to features in the user field of view.
In summary, a system 10 for displaying information to a user of a vehicle using augmented reality (AR) is described. The system 10 comprises a processor 100 configured to: interact with a display unit 200 for displaying information using AR in a field of view associated with the system 10, a vehicle operation data source 300 for providing vehicle operation data to the processor, and a multimedia data source 400 for providing multimedia data to the processor; process an AR graphic for display at the display unit 200, the AR graphic being based on the vehicle operation data and the multimedia data; and cause the display unit 200 to display the AR graphic. In this way, content with which the user is intended to interact can be displayed to the user dependent on the vehicle operation data and the multimedia data, improving user engagement. The user can engage with the system 10 by reading and singing textual features of the AR graphic, and as such a karaoke system incorporating AR is provided. Furthermore, the user may, consciously or subconsciously, use the AR graphic as a form of technical instrumentation to understand vehicle dynamics, such as speed, acceleration, or positional changes. This is particularly advantageous in an autonomous vehicle implementation, where the user does not control vehicle dynamics and where providing such AR graphics to the user can give a visual feedback or indication of vehicle dynamics.
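The flow summarised above can be sketched, purely for illustration, as a simple processing loop. This is not the claimed implementation: the class name ARKaraokeSystem, the read() and display() interfaces of the data sources and display unit, and the assumed dictionary shapes are all hypothetical.

```python
# Purely illustrative sketch of the summarised flow: read both data sources,
# build an AR graphic from both inputs, and cause the display unit to display it.
# The class, the method names and the data-source interfaces are hypothetical.

class ARKaraokeSystem:
    def __init__(self, display_unit, vehicle_data_source, multimedia_data_source):
        self.display_unit = display_unit
        self.vehicle_data_source = vehicle_data_source
        self.multimedia_data_source = multimedia_data_source

    def update(self, now_s: float) -> None:
        vehicle_data = self.vehicle_data_source.read()        # e.g. current speed
        multimedia_data = self.multimedia_data_source.read()  # e.g. lyrics and timing
        ar_graphic = self.process_ar_graphic(now_s, vehicle_data, multimedia_data)
        self.display_unit.display(ar_graphic)

    def process_ar_graphic(self, now_s, vehicle_data, multimedia_data):
        # Placement mode from the vehicle operation data; textual graphic portions
        # and their timing from the multimedia data (assumed dictionary shapes).
        above_threshold = vehicle_data["speed_mps"] >= 20.0
        visible = [p for p in multimedia_data["lyrics"]
                   if p["appear_time_s"] <= now_s < p["remove_time_s"]]
        return {"mode": "fixed" if above_threshold else "time_varying",
                "portions": visible}
```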
Whilst the description of the invention provided herein describes advantageous embodiments of systems comprising a vehicle operation data source 300 for providing vehicle operation data to the processor 100, and processing an AR graphic based on the vehicle operation data, in an exemplary embodiment of the system 10 related to said embodiments the vehicle operation data source 300 may not be present and the AR graphic may not be based on vehicle operation data. Instead, in one example, the processor 100 is configured to interact with a multimedia data source 400 (that is, absent a vehicle operation data source 300), and to process an AR graphic for display at the display unit, the AR graphic being based on the multimedia data. In this way, AR graphic output that is not linked to vehicle operation data can be provided, giving an AR multimedia output system instead. In another example, the processor 100 is configured to interact with a data source for providing any operation data, or parameter, to the processor 100, and the AR graphic may be based on said operation data or parameter. It will be appreciated that the data source may be a vehicle operation data source, which is advantageous for the reasons provided above. Alternatively, the data source may be any other data source, for example one providing information relating to a user, a user profile, a vehicle profile, an environment, a feature in a field of view, or other information.
Referring to Figure 13, a vehicle 13000 is shown. The vehicle 13000 comprises a system 10 or HUD system 50, for example as described above.
Referring to Figure 14, a method of displaying information using augmented reality, AR, to a user of a vehicle is shown. Step S14100 comprises interacting with a display unit for displaying information using AR in a field of view, a vehicle operation data source for providing vehicle operation data to the processor, and a multimedia data source for providing multimedia data to the processor. Step S14200 comprises processing an AR graphic for display at the display unit, the graphic being based on the vehicle operation data and the multimedia data. Step S14300 comprises causing the display unit to display the AR graphic.
Whilst step S14300 will likely be performed in a practical implementation, it may be viewed as optional in certain embodiments. For example, a core differentiating feature is the processing step; in terms of the key features that differentiate the invention, the actual display step may be optional.
Definitions

Although a preferred embodiment has been shown and described, it will be appreciated by those skilled in the art that various changes and modifications might be made without departing from the scope of the invention, as defined in the appended claims and as described above.
All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.
Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
The invention is not restricted to the details of the foregoing embodiment(s). The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.

Claims (15)

1. A system for displaying information to a user of a vehicle using augmented reality, AR, the system comprising a processor configured to: interact with a display unit for displaying information using AR in a field of view associated with the system, a vehicle operation data source for providing vehicle operation data to the processor, and a multimedia data source for providing multimedia data to the processor; process an AR graphic for display at the display unit, the AR graphic being based on the vehicle operation data and the multimedia data; and cause the display unit to display the AR graphic.
  2. The system according to claim 1, wherein the processor is configured to: process the AR graphic for display at the display unit, the AR graphic having a first component, wherein the first component comprises an applied spatial and temporal variation based on the multimedia data.
3. The system according to claim 2, wherein the applied spatial and temporal variation is based on one or more of: volume information in the multimedia data; timing information in the multimedia data; frequency information in the multimedia data.
  4. The system according to claim 2 or claim 3, wherein the applied spatial and temporal variation is provided in a first dimension, optionally wherein the first dimension is a vertical dimension.
  5. The system according to any preceding claim, wherein the processor is configured to: process the AR graphic for display at the display unit, the AR graphic having a second component.
6. The system according to claim 5, wherein the second component of the AR graphic comprises one or more graphic portions.
7. The system according to claim 6, wherein a property of the one or more graphic portions is determined based on vehicle operation data and the multimedia data, optionally wherein: the property is a dimension determined based on the vehicle operation data; and/or the property is a duration of display of the one or more graphic portions which is determined based on the multimedia data, optionally wherein the duration of display is determined based on one or more of: lyric timing data within the multimedia data; grammatic and/or pronunciation features of a lyric within the multimedia data; current audio-track timing within the multimedia data.
8. The system according to any one of claims 5 to 7, wherein the second component is a textual feature.
9. The system according to any one of claims 5 to 8, wherein the second component is located based on: a distance where the user should begin to interact with the second component, or a time when the user should begin to interact with the second component; a distance where the user should have already interacted with the second component, or a time when the user should have already interacted with the second component; and/or a start location of an animation to remove the second component or a graphic portion thereof, and/or a stop location of an animation to remove the second component or a graphic portion thereof, and/or a start time of an animation to remove the second component or a graphic portion thereof, and/or a stop time of an animation to remove the second component or a graphic portion thereof.
10. The system according to any one of the preceding claims, wherein the processor is configured to: process the AR graphic for display at an apparent fixed location relative to a feature in the field of view when the vehicle operation data indicates that the vehicle is above a threshold vehicle operation parameter.
11. The system according to any one of the preceding claims, wherein the processor is configured to: process the AR graphic for display at an apparent time-varying location relative to a feature in the field of view when the vehicle operation data indicates that the vehicle is below a threshold vehicle operation parameter.
12. A head-up display, HUD, system comprising the system according to any one of the preceding claims.
13. An audio/visual, AV, system comprising the system or HUD system according to any one of the preceding claims, the AV system configured to provide an AV output related to the multimedia data provided by the multimedia data source.
14. A vehicle comprising the system, HUD system or AV system according to any one of the preceding claims, optionally wherein the vehicle is an autonomous vehicle.
15. A method of displaying information using augmented reality, AR, to a user of a vehicle, the method comprising: interacting with a display unit for displaying information using AR in a field of view, a vehicle operation data source for providing vehicle operation data to the processor, and a multimedia data source for providing multimedia data to the processor; processing an AR graphic for display at the display unit, the AR graphic being based on the vehicle operation data and the multimedia data; and causing the display unit to display the AR graphic.
GB2203665.1A 2022-03-16 2022-03-16 System and method Pending GB2616648A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2203665.1A GB2616648A (en) 2022-03-16 2022-03-16 System and method
PCT/EP2023/056765 WO2023175080A2 (en) 2022-03-16 2023-03-16 System and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2203665.1A GB2616648A (en) 2022-03-16 2022-03-16 System and method

Publications (2)

Publication Number Publication Date
GB202203665D0 GB202203665D0 (en) 2022-04-27
GB2616648A true GB2616648A (en) 2023-09-20

Family

ID=81254980

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2203665.1A Pending GB2616648A (en) 2022-03-16 2022-03-16 System and method

Country Status (2)

Country Link
GB (1) GB2616648A (en)
WO (1) WO2023175080A2 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3527417A1 (en) * 2018-02-15 2019-08-21 Toyota Jidosha Kabushiki Kaisha Sound output and text display device for a vehicle
WO2019232005A1 (en) * 2018-05-30 2019-12-05 Dakiana Research Llc Method and device for presenting an audio and synthesized reality experience
CN112185415A (en) * 2020-09-10 2021-01-05 珠海格力电器股份有限公司 Sound visualization method and device, storage medium and MR mixed reality equipment
US20210133448A1 (en) * 2019-10-30 2021-05-06 Lg Electronics Inc. Xr device and method for controlling the same

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5839956B2 (en) * 2011-11-21 2016-01-06 株式会社エクシング Karaoke system, karaoke lyrics display method, karaoke terminal device, and computer program
DE102016218602A1 (en) * 2016-09-27 2018-03-29 Volkswagen Aktiengesellschaft A method for changing the perception of the outside world of a vehicle
US10257582B2 (en) * 2017-03-17 2019-04-09 Sony Corporation Display control system and method to generate a virtual environment in a vehicle

Also Published As

Publication number Publication date
GB202203665D0 (en) 2022-04-27
WO2023175080A3 (en) 2023-10-26
WO2023175080A2 (en) 2023-09-21
