GB2640887A - Head-up display for use in a motor vehicle - Google Patents
Head-up display for use in a motor vehicle
- Publication number
- GB2640887A (application GB2406412.3A / GB202406412A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- hud
- vehicle
- display feature
- frame
- frames
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/29—Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0118—Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0141—Head-up displays characterised by optical features characterised by the informative content of the display
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0407—Resolution change, inclusive of the use of different resolutions for different screen areas
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Instrument Panels (AREA)
Abstract
A head-up display (HUD) system 30 comprising a HUD controller 32 configured to control a HUD projector 34 to generate a HUD image 36 on a screen of a vehicle 10. The HUD controller is configured to: determine a first HUD display feature (56, Fig. 9) based on received vehicle data; generate first and second HUD frames (100, 102; Fig. 7), wherein the display feature in each frame has a respective associated colour space value; generate a HUD frame sequence (104; Fig. 7) comprising a plurality of interleaved first and second HUD frames; and output the HUD frame sequence to the HUD projector to be projected as the HUD image at the screen of the vehicle. At least some of the first and second HUD frames may be arranged in an alternating order. The HUD controller may determine the presence of a low contrast scenario from the HUD image and an external view of the vehicle captured by a camera, wherein the camera field of view may partially overlap with the HUD image. The HUD controller may determine a colour space value of the external view and compare it with the colour space value of the at least one display feature in the first HUD frame.
Description
HEAD-UP DISPLAY FOR USE IN A MOTOR VEHICLE
TECHNICAL FIELD
The present invention relates generally to a system for providing a head-up display to a driver of a motor vehicle. More specifically, the invention relates to a process for improving the clarity of the head-up display when projected onto a screen in a vehicle.
BACKGROUND
In recent years, there has been a growing demand for advanced technologies to enhance the driving experience, particularly in the automotive sector. One area of significant interest and innovation is in the field of head-up display (HUD) systems designed to provide drivers with crucial information directly within their field of view while minimising distraction from the road.
Traditional dashboard displays require drivers to divert their attention away from the road, potentially increasing the risk of accidents. HUD systems address this concern by projecting relevant information onto a windscreen, or a dedicated screen within the driver's line of sight, allowing for continuous monitoring of vital data without requiring the driver to look away from the road.
Early automotive HUD systems typically displayed basic information such as vehicle speed and navigation directions. However, advancements in display technology, optics, and computing power have enabled the integration of more sophisticated features into modern HUD systems. These may include real-time traffic data, collision warnings, adaptive cruise control status, lane departure warnings, and even augmented reality overlays that enhance situational awareness.
The primary objective of incorporating a HUD system into automotive vehicles is to enhance driver safety and convenience. By presenting essential information directly in the driver's line of sight, HUD systems help reduce the cognitive load associated with monitoring various dashboard gauges and screens, thus allowing drivers to maintain focus on the road ahead.
Despite the significant progress made in HUD technology, there remain opportunities for further innovation and improvement, particularly in terms of display clarity, customisation options, and integration with other vehicle systems.
It is against this background to which the present invention is set.
SUMMARY OF THE INVENTION
In accordance with one aspect, the examples of the invention provide a head-up display (HUD) system for a vehicle, comprising a HUD projector and a HUD controller, wherein the HUD controller is configured to control the HUD projector to generate a HUD image onto a screen of the vehicle, wherein the HUD image has at least one display feature; and a camera associated with the vehicle, coupled to the HUD controller and configured to capture an external view relating to a forward-looking scene of the vehicle.
The HUD controller is configured to detect a low contrast scenario by comparing the external view captured by the camera with the HUD image, and to adjust a colour of the at least one display feature in the HUD image to enhance the colour contrast perceived by a user between the at least one display feature in the HUD image and the external view captured by the camera.
Beneficially, the examples of the invention provide for an enhanced head-up display system which adapts to the external view of the vehicle, as captured by the camera or equivalent imaging device that carries out a comparable function, by enhancing the colour contrast of the at least one display feature within the HUD image. The display feature may be shown on its own in the HUD image, or it may be part of many display features in the HUD image. The examples of the invention may be applied to any of the display features displayed in the HUD image.
In some examples, the HUD controller is configured to perform the following procedure in detecting a low contrast scenario: generate a HUD frame comprising the at least one display feature, wherein the at least one display feature has an associated colour space value; evaluate the generated HUD frame to determine the colour space value of the at least one display feature; evaluate the external view captured by the camera to determine a colour space value of the external view; and determine if a difference between the colour space value of the at least one display feature and the colour space value of the external view exceeds a predetermined threshold.
Moreover, in adjusting the colour of the at least one display feature, the HUD controller is operable to: adjust the colour space value of the at least one display feature in one or more subsequent HUD frames to increase the difference between the colour space values if the determined difference between colour space values is less than the predetermined threshold.
The adjustment to subsequent HUD frames, in some examples, can be configured to take place only in non-sequential HUD frames. For example, the colour change to the display feature may take place in alternate HUD frames that are sent to the HUD projector.
The colour space of the external view captured by the camera or equivalent imaging device may be determined by averaging the values of different colour channels of pixels in the external view. The process may involve averaging e.g. the RGB components of each pixel in a pixel array of the external view. Alternatively, the averaging may be applied to pixels within a selected field of view of the external view (or external image) captured by the camera, for example by focussing on the array of pixels that is determined to overlap the driver's perception of where the HUD image exists in the external view ahead of the vehicle. As a further alternative, a spread of pixels may be selected to reduce the processing load. An even spread of selected pixels throughout the external view (or a sub field of view that overlaps with the position of the HUD image) can provide a good representation of the average colour content (i.e. the different RGB colour values) of an image without imposing too high a processing burden.
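By way of illustration only, the averaging described above might be sketched as follows in Python; the function name, the 25-pixel sample figure and the representation of the external view as an H x W x 3 array are assumptions for illustration rather than details taken from the patent.

```python
import numpy as np

def average_colour(view: np.ndarray, n_samples: int = 25) -> np.ndarray:
    """Estimate the average RGB colour of the external view (an H x W x 3
    pixel array) from an even spread of sampled pixels."""
    h, w, _ = view.shape
    side = int(np.ceil(np.sqrt(n_samples)))        # e.g. a 5 x 5 sampling grid
    ys = np.linspace(0, h - 1, side, dtype=int)    # evenly spaced rows
    xs = np.linspace(0, w - 1, side, dtype=int)    # evenly spaced columns
    samples = view[np.ix_(ys, xs)].reshape(-1, 3)  # gather the sampled pixels
    return samples.mean(axis=0)                    # mean R, G and B values
```

Sampling on an even grid rather than over every pixel keeps the processing load roughly constant regardless of the camera resolution.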
As discussed, in some examples the pixel array of the external view may correspond to a localised region of the external view which overlaps the HUD image. This may provide a more accurate determination of similar colours appearing in the localised image region compared to the colour of the display feature in the HUD image. The pixel array may substantially coincide with the region of the external view which corresponds to the display feature in the HUD image. Expressed another way, a portion of the external view may be evaluated that overlaps at least partially with the HUD image on the screen of the vehicle.
In another aspect, the examples of the invention provide a head-up display (HUD) system for a vehicle, comprising a HUD projector and a HUD controller, wherein the HUD controller is configured to control the HUD projector to generate a HUD image on a screen of the vehicle. The HUD controller is configured to: determine a first HUD display feature based on received vehicle data; generate a first HUD frame for projection by the HUD projector, the first HUD frame comprising the first HUD display feature, wherein a first colour space value is attributed to the first HUD display feature; generate a second HUD frame for projection by the HUD projector, wherein the second HUD frame comprises the first HUD display feature, and wherein a second colour space value is attributed to the first HUD display feature in the second HUD frame; generate a HUD frame sequence for outputting to the HUD projector, wherein the HUD frame sequence comprises a plurality of the first HUD frames interleaved with a plurality of the second HUD frames; and output the HUD frame sequence to the HUD projector, whereupon the HUD projector projects the HUD frame sequence at the screen.
The vehicle data may be any suitable data with respect to the vehicle that may be useful to the driver to display on the HUD, for example vehicle speed data, navigational data, speed limit data, vehicle safety data, power system operational data, and so on.
Usefully, this example of the invention provides an increased contrast of the selected display feature compared with the view behind the HUD image as seen by the driver. In effect, the colours of the display feature are 'mixed' by virtue of the first and second HUD display frames, which means that the driver's visual perception of the display feature is enhanced through a variety of visual conditions of the external image behind the HUD image from the perspective of the driver.
The interleaving between the first HUD frames and the second HUD frames can take place in a strictly alternate order, for example a 1:1 ratio, but this is not essential.
In this example, the interleaving process of the first and second HUD frames can take place continuously or, in other examples, the interleaving process can be triggered by certain triggering conditions. One triggering condition may be when the system detects that a low contrast scenario exists between the colour of the display feature in the HUD image and the external view of the vehicle, as has been discussed above.
Features of the first aspect of the invention may be combined with the second aspect of the invention as applicable.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the present invention reference will now be made, by way of example, to the accompanying diagrammatic drawings in which:

Figure 1 is a schematic view of a vehicle 10 showing a plurality of common vehicle systems communicatively coupled to a HUD system;

Figure 2 is an example of a view which may be seen out of the front of the vehicle 10 of Figure 1 from the perspective of a user when operating the vehicle 10, and shows an example of a HUD image projected onto a screen of the vehicle by the HUD system;

Figure 3 is a view like that in Figure 2 but, for comparison, shows a light-coloured surface covering at the side of the road that reduces the user perception of the HUD image;

Figure 4 is a schematic view of components of a HUD controller of the HUD system;

Figure 5 is a flow chart in accordance with an example of the invention;

Figure 6 is a chromaticity plot which is referred to in the discussion regarding the exemplary method of Figure 5, wherein the chromaticity plot is illustrative of the gamut of colours of a suitable colour space, e.g. the sRGB colour space;

Figure 7 is a diagrammatic view of an aspect of the operation of the method described in Figure 5;

Figure 8 is a flow chart in accordance with another example of the invention; and

Figures 9 and 10 are comparative views of a HUD image projected onto a screen of the vehicle illustrating the contrast enhancement of the HUD image that is achieved.
DETAILED DESCRIPTION
Embodiments of the present invention relate to methods for controlling a head-up display (HUD) suitable for use in a vehicle 10.
With reference to Figure 1, a vehicle 10 includes a plurality of vehicle systems that are arranged on a communications network 12 that is known as a CAN bus. As would be understood by a skilled person, the term 'CAN' refers to a Controller Area Network and is a standard technology for intra-vehicle communications. CAN bus technology is conventional and would be well understood by a skilled person so no further discussion will be provided.
The vehicle systems may comprise an engine control system 14, a navigation system 16, a driver assistance system 18, a camera system 20 and a communications system 22. These vehicle systems are simply a set of examples of suitable systems that may be communicatively coupled to the CAN bus 12. As would be understood, each of the vehicle systems mentioned here would be provided with a respective CAN controller (not shown) that acts as a node on the CAN bus and transmits data to, and receives data from, the CAN bus 12 by way of a dedicated CAN transceiver (not shown).
It will be appreciated that the above discussion provides a brief overview of possible vehicular subsystems, and that a vehicle may include many more systems such as a transmission system, an airbag system, anti-lock braking system, power steering system and so on. However, a full discussion of vehicle subsystems and how they integrate with the CAN bus 12 is outside the scope of this application.
The camera system 20 is coupled to a camera 24 that is mounted at the front of the vehicle 10. The camera 24 may be any conventional camera suitable for providing image data of the view ahead of the vehicle. In Figure 1, the camera 24 is indicated as having a field of view 26 ahead of the vehicle 10.
The vehicle 10 also includes a HUD system 30. In overview, the HUD system 30 includes: a HUD controller 32 and a HUD projector 34.
The HUD projector 34 is configured and oriented to display a HUD image 36 onto a screen 38 of the vehicle 10.
The screen 38 may be a windscreen of the vehicle 10 or may be a separate screen placed inboard of the windscreen. Such a separate screen may be known as a 'combiner' in some conventional HUD systems. As would be understood by a skilled person, a combiner is a semi-transparent surface typical in HUD and AR (augmented reality) systems which function to overlay a projected image on top of the physical world. Such combiners are transparent and permit a user to see through to the real-world scene behind the combiner, whilst reflecting dynamic digital information to the user.
In principle the HUD projector 34 may be configured to project the HUD image 36 on any screen of the vehicle. It is also envisaged that projection technology may advance to be able to project a holographic display to a vehicle user. Accordingly, a HUD image 36 in accordance with the examples of the invention discussed here should be considered to encompass such variants.
The HUD controller 32 can receive data from any or all of the vehicle systems for the purposes of displaying suitable data to the driver through the HUD projector 34.
Figure 2 shows an example of this in a simplified format in order to illustrate the principle of the invention. Figure 2 illustrates a view through the windscreen 38 of the vehicle 10, where a road 40 extends ahead of the vehicle 10 into the distance, and an environment 42 surrounds the road (e.g. a surrounding landscape and horizon). The road 40 has a left edge and a right edge and a central dividing line.
The HUD projector 34 is shown projecting the HUD image 36 onto the screen 38. As has been mentioned, here the HUD image 36 is shown being projected directly onto the screen 38, although other means could achieve the same effect.
Here, a dashed box is used to indicate the outer boundary of the region of the screen 38 on which the HUD image 36 is displayed. This region may therefore be referred to as the projection portion 44 of the screen, since it is this portion of the screen 38 on which the HUD image 36 is displayed.
The HUD image 36 may comprise information relating to the data received by the HUD controller 32 from any of the vehicle systems. In Figure 2, the information displayed is the speed of the vehicle 10 displayed through a textual representation. As would be understood by a skilled person, the vehicle speed data may be based on data received by the HUD controller 32 via the CAN bus 12 from the engine control system 14, although this is not essential.
For the purposes of this discussion, the vehicle information that is displayed by the HUD image 36 has been simplified for ease of understanding to include only vehicle speed data.
However, it should be noted that in a practical system, such a HUD image 36 may comprise many more items of vehicle data, such as navigation directions, speed limit data, cellular phone data such as caller information and so on, to name just a few examples. The data displayed on the HUD image 36 may be selected by the user during the setup of the HUD system 30 or may be a system default. The displayed data may herein be referred to as a display feature 46 of the HUD image 36. Notably, the HUD image 36 may comprise more than one such display feature, although a single display feature is shown here to benefit the simplicity of the discussion.
From observing Figure 2, it will be noted that the display feature 46 in the HUD image 36 has a particular colour associated with it. In this case, the display feature 46 has the colour white. White is often chosen for vehicle speed data displayed on HUD images because it strikes a good contrast with the view ahead of the vehicle so it can be discerned readily by the driver.
However, in some circumstances, the view ahead of the vehicle may change such that the contrast between the colour of the display feature 46 and the colour 'behind' that display feature 46 is reduced to an extent that it is more difficult for the driver to discern clearly the information portrayed by the display feature 46.
An example of this is shown in Figure 3. Here, the same view ahead of the vehicle 10 can be observed, although there is a region to the right hand side of the road 40 that has a much lighter colour. This might be due to a build-up of snow on the side of the road, or a similar surface covering that is light in colour. A comparable situation could be found where the view ahead of the vehicle encompasses some light areas of sky where the HUD image 36 overlaps with that light area. Or the vehicle could be following a large slab-sided vehicle such as a truck which is very light in colour. The effect is the same in that the contrast between the display feature 46 in the HUD image 36 and the view of the driver behind the HUD image 36 is greatly reduced, which means that the display feature 46 is more difficult to discern by the driver. This is seen clearly in Figure 3, since the speed display feature 46 is shown in white and appears only faintly against the background view, which also has a very light colour. Therefore, it will be appreciated by comparing Figure 3 to Figure 2 that the textual representation of the display feature 46 has become less clear.
The examples discussed here provide a solution to this problem by improving the clarity of information contained in a HUD image 36. In specific examples, the clarity of the display feature 46 in the HUD image 36 may be improved through increasing the contrast between the display feature 46 and the colour or predominant colour of the view ahead of the vehicle 10 in the driver's line of sight by adapting the colour of the display feature 46.
Two alternative examples will now be discussed, both of which are operable to improve the clarity of the display feature 46 of the HUD image 36 to the user. It should be appreciated that the discussion refers to a single display feature 46 but in reality there may be many display features in the HUD image 36, some or all of which may be adjusted to enhance their clarity/contrast in accordance with the examples described herein.
The following examples may be implemented by the HUD controller 32. In this discussion the HUD controller 32 is shown as a separate module, although this is principally for ease of illustration. The functionality provided by the HUD controller 32 may be provided by a separate control module or as part of a control module with responsibility for carrying out other functionality in relation to the vehicle systems.
Figure 4 provides a suitable architecture for the HUD controller 32 in a schematic format and includes an input module 50, an output module 52, a processor module 54, and a memory module 56. That is, Figure 4 shows an example architecture for the HUD controller 32, including four functional elements, units or modules. Each of these units or modules may be provided, at least in part, by suitable software running on any suitable computing substrate using conventional or custom processors and memory. Some or all of the units or modules may use a common computing substrate (for example, they may run on the same server) or separate substrates, or different combinations of the modules may be distributed between multiple computing devices. The example architecture of the HUD controller 32 is not intended to be limiting on the scope of the invention though and, in other examples, it shall be appreciated that the architecture may take other suitable forms.
The input module 50 is electrically connected to the CAN bus 12 and so is configured to receive data in a digital format for the purposes of processing by the HUD controller 32.
Therefore, the input module 50 is configured to receive signals that are indicative of the speed of the vehicle, for the purposes of displaying as the display feature 46 in the HUD image 36. Further signals indicative of other vehicle data for the purposes of display via the HUD image 36 may also be received by the input module 50.
Similarly, the output module 52 is electrically connected to the CAN bus 12 and so is configured to transmit data in a suitable digital format to other vehicle systems.
The processor module 54 is configured to: (i) process the signals received at the input module 50 and (ii) determine control data and signals for controlling the HUD projector 34 in accordance with one or more schemes, rules, or algorithms which will be discussed below. Operating data may suitably be stored in the memory module 56.
Reference will now be made to Figures 5 to 7, which illustrate a first example of the invention depicting a methodology implemented by the HUD controller 32 and a visual result of the methodology.
With reference to Figure 5, a method 400 in accordance with an example of the invention includes steps for increasing the contrast between the display feature 46 in the HUD image 36 and the background view behind the HUD image 36 in the line of sight of the driver.
At step 401, the method 400 comprises receiving, at the HUD controller 32, data from one or more of the plurality of common vehicle systems 14-22.
In the example illustrated in Figures 1 to 3, the data received is indicative of a speed of a vehicle 10, although it will be appreciated that the data may be indicative of any common characteristic associated with a vehicle 10 or its surrounding environment.
At step 402, the HUD controller 32 determines a suitable display feature 46 in dependence of the vehicle data received at step 401. This step may involve the HUD controller 32 determining the appropriate graphical representation for the display feature 46 for output by the HUD projector. Determining the appropriate graphical representation may include selecting the pixel pattern to output and the colour associated with each pixel in the pixel pattern. The colour for each pixel may be selected based on an appropriate colour model, for example the RGB colour model.
As the skilled person would appreciate, an appropriate RGB colour model may be implemented in different ways, depending on the capabilities of the system used. One example is a 24-bit implementation, with 8 bits, or 256 discrete levels of colour per channel. Any colour space based on such a 24-bit RGB model is thus limited to a range of 256x256x256 = 16.7 million colours. The same principle applies for any colour space based on the same colour model, but implemented at different bit depths. A suitable colour space for the display feature 46 would comprise the sRGB colour space. Other colour models are also applicable such as the HSV colour model.
An RGB colour space may also be represented by a chromaticity plot which represents the colour gamut of the colour space. Such a chromaticity plot is shown in Figure 6. As would be understood by a skilled person, a chromaticity plot is a representation of colour in x and y coordinates, independent of luminance. Therefore, any colour that a display feature 46 may be assigned in accordance with e.g. the sRGB colour space may be represented by x,y coordinates in a suitable chromaticity plot.
In the context of the display feature 46 being vehicle speed data assigned a white colour, this would be represented by a chromaticity value of x=0.3 and y=0.3, as is indicated by the point 'X' in Figure 6. Colour values for each of a plurality of display features 46 may be stored in the memory module 56 of the HUD controller 32, for example in a suitable data structure such as a look-up table.
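For readers unfamiliar with chromaticity coordinates, the hedged sketch below shows one conventional way an sRGB colour may be converted to CIE 1931 x,y values so it can be located on a plot such as Figure 6; the helper name is hypothetical, and the conversion is the standard sRGB-to-XYZ transform rather than a procedure specified in the patent.

```python
import numpy as np

SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],   # linear sRGB -> CIE XYZ
                        [0.2126, 0.7152, 0.0722],   # (D65 reference white)
                        [0.0193, 0.1192, 0.9505]])

def srgb_to_xy(rgb: tuple[int, int, int]) -> tuple[float, float]:
    c = np.asarray(rgb) / 255.0                     # normalise 8-bit channels
    # undo the sRGB transfer curve to obtain linear light
    lin = np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
    X, Y, Z = SRGB_TO_XYZ @ lin
    total = X + Y + Z
    return (X / total, Y / total)                   # chromaticity, luminance-free

print(srgb_to_xy((255, 255, 255)))                  # approx. (0.313, 0.329)
```

White RGB(255,255,255) lands at roughly x=0.31, y=0.33, consistent with the point 'X' discussed above.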
At step 403, the method 400 generates a first HUD frame 100 comprising said display feature 46 at a suitably selected colour value, which in the illustrated example is vehicle speed data displayed in a white colour. The first HUD frame 100 is shown in Figure 7 on the left-hand side. As can be seen, the display feature 46 is shown in white. This step is conventional in the sense that a conventional HUD system would generate a display feature based on vehicle data and a required colour assignment and output that generated HUD frame to the HUD projector.
However, in this example of the invention, the method further includes, at step 404, the generation of a second HUD frame 102. The second HUD frame 102 is shown in Figure 7 on the right-hand side of the drawing.
The second HUD frame 102 is generated based on the first HUD frame 100 comprising the display feature 46. In this sense, the same display feature 46 is included in the second HUD frame 102 in the same position as it is displayed in the first HUD frame 100, but is shown in Figure 7 as 46'.
In the second HUD frame 102, the display feature 46 is assigned a colour which is different from the colour of the display feature 46 in the first HUD frame 100, which is clearly apparent from Figure 7.
It should be noted that in assigning a colour to the display feature 46 of the second HUD frame 102, the colour may be selected so as to maximise, or at least enhance, a contrast with the colour assigned to the display feature 46 in the first HUD frame 100. In the illustrated example in which the display feature 46 in the first HUD frame 100 is assigned white as its colour, the display feature 46' in the second HUD frame 102 may be assigned a suitably contrasting colour such as dark blue. This is illustrated in Figure 6 as point 'B'.
The colour selection B for the display feature 46' of the second HUD frame 102 can be stored in the suitable look-up table as has been mentioned above.
Since the colour of the display feature 46 in the second HUD frame 102 is selected to have a high contrast with the background scene behind the HUD image 36, the second HUD frame 102 can be considered to be a 'high-contrast' HUD frame 102, whereas the first HUD frame 100 can be considered to be a 'standard-contrast' HUD frame 100.
It should be noted at this point that steps 403 and 404 are described as generating a HUD frame comprising a single display feature 46, but this is simplified for ease of explanation and understanding. It is envisaged, therefore, that steps 403 and 404 would comprise generation of multiple instances of different display features based on different vehicle data, and that such features may include navigation information from a satellite navigation system, telephone call symbology, speed limit data given the prevailing road conditions, and so on. The same principles described here apply to each display feature that may be generated, however.
Once both the first HUD frame 100 and second HUD frame 102 have been generated by the HUD controller 32 (steps 403 and 404), the HUD controller 32 then generates a suitable sequence (see 104 in Figure 7) of HUD frames 100, 102 that may be projected by the HUD projector 34 as the HUD image 36, as indicated in step 405. When generating the HUD frame sequence 104 with the first and second HUD frames 100, 102, the HUD controller 32 may interleave the first and second HUD frames 100, 102 as seen in Figure 7. Once the HUD frame sequence 104 is generated, the HUD controller 32 will output the HUD frame sequence 104 to the HUD projector 34, as indicated in step 406, for display on the screen 38.
It should be noted that the HUD frame sequence 104 that is output to the HUD projector 34 may run at a rate of about 20-30 fps (frames per second). The frame rate is lower than is typical for video due to the generally lower resolution required for HUD systems.
However, such a frame rate still provides the clarity of perception required by the user. Display features within the HUD image 36 may not change from one frame to the next. For example, many display features presented in the HUD image 36 may be relatively static, such as speed limit information, directional information and other symbology. The method 400 may therefore be implemented to run at a suitable rate to ensure that display features are output to the HUD projector 34 at a suitable frequency.
In the illustrated example, when generating the sequence of HUD frames for display by the HUD projector 34, the ratio of first HUD frames 100 to second HUD frames 102 in the sequence may be 1:1. In other words, the HUD frame sequence 104 comprises alternating instances of the first HUD frame 100 and the second HUD frame 102. The sequence of HUD frames 100, 102 may continue for any number of frames, and it should be noted that the sequence of Figure 7 is merely exemplary. For example, the sequence 104 may comprise thirty HUD frames 100, 102 to make up one second's duration (or alternatively, 0.2 seconds, or 0.5 seconds) of the HUD image 36, and then the method may output a different HUD frame sequence 104 comprising one or more display features that have changed. For example, vehicle speed will change, navigational directions may change, and speed limit data may change.
In the illustrated example, the ratio of first HUD frame 100 to second HUD frame 102 in the HUD frame sequence 104 is 1:1. However, this is not essential and the ratio may be different, for example, 1:2, or 2:1, or 1:3, or 3:1, or any other suitable ratio.
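A minimal sketch of how such a frame sequence might be assembled at a configurable ratio follows; the function name and the string frame placeholders are illustrative assumptions, and the patent does not prescribe any particular implementation.

```python
def build_frame_sequence(first_frame, second_frame,
                         ratio=(1, 1), total_frames=30):
    """Interleave `first_frame` and `second_frame` in the given ratio,
    e.g. 30 frames for one second of HUD image at 30 fps."""
    pattern = [first_frame] * ratio[0] + [second_frame] * ratio[1]
    sequence = []
    while len(sequence) < total_frames:
        sequence.extend(pattern)       # repeat the ratio pattern
    return sequence[:total_frames]     # trim to the requested length

# 1:1 ratio -> strictly alternating standard- and high-contrast frames
print(build_frame_sequence("std", "hc", ratio=(1, 1), total_frames=6))
# ['std', 'hc', 'std', 'hc', 'std', 'hc']
```

Passing, say, ratio=(2, 1) would weight the sequence towards the standard-contrast frame while still mixing in the high-contrast frame.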
A benefit of this example is that the perception of the display feature 46 to the driver is improved in both light and dark background conditions. For example, in dark background conditions, the driver will visually perceive the white colour of the display feature in the first or 'standard-contrast' HUD frames 100 and fail to recognise the darker colour display feature 46' of the second or 'high-contrast' HUD frames 102. In comparison, in light background conditions, for example in the case of snow on the road as discussed above, the driver will visually perceive the darker colour display features 46' in the second HUD frames 102 and fail to recognise the white colour display features 46 of the 'standard-contrast' HUD frames 100.
Therefore, in this example the driver's visual perception of the display feature 46,46' in the HUD image 36 is improved in a wider range of background colour situations.
In this example it is envisaged that the HUD frame interleaving approach may be constant such that the perception of the display feature 46 in the HUD image 36 is enhanced continually. The driver will not notice any visual disturbance in the display feature 46 despite the switching in colour due to the high frame rate of the display of the HUD frame sequence 104 in the HUD image 36. However, the colour contrast will be improved.
In the above examples, although the method 400 has been described in relation to the display feature 46 being a textual representation (i.e. alphanumeric characters), it should be understood that the display features may instead be other patterns of pixels, such as symbols, icons and so on.
It should also be noted that the display feature 46 may be a background feature. That is, in the HUD image 36 discussed above the display feature 46 is an alphanumeric string that has been assigned the colour white, whereas the colour surrounding the display feature in the HUD image 36 does not have a colour associated with it, i.e. there is no projected colour. Instead, the display feature 46 may consist of a region or 'patch' of pixels bounding the textual characters. The display feature 46 therefore becomes an array of pixels, e.g. a rectangular array of pixels within which is embedded a set of inactive pixels so that the textual characters are shown in an 'inverse video' manner.
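As an illustration of the 'inverse video' idea, the sketch below builds a small white patch in which the glyph pixels are left inactive; the 3x5 glyph mask is a made-up stand-in for real character rendering and is not taken from the patent.

```python
import numpy as np

glyph = np.array([[1, 1, 1],
                  [1, 0, 1],
                  [1, 0, 1],
                  [1, 0, 1],
                  [1, 1, 1]], dtype=bool)    # a crude '0' character mask

patch = np.ones(glyph.shape + (3,)) * 255    # active white patch (RGB)
patch[glyph] = 0                             # glyph pixels left inactive
```

The character then reads as a dark cut-out within a bright surrounding patch, rather than as bright strokes on an unlit background.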
Having described one example of how the HUD image 36 may be controlled to improve the visual perception of it to the driver of the vehicle, the discussion will now move on to a second example of achieving the same or similar benefits. This example will be described with reference to Figures 8 to 10.
With reference to Figure 8, a method 500 includes steps for increasing the contrast between a display feature 46 in a HUD image 36 and the background scene behind the HUD image 36.
In general, the method 500 is operable to use the camera system 20 of the vehicle 10 to capture a view relating to a forward-looking scene of the vehicle 10 and to identify when a low contrast scenario exists by comparing the external view with the HUD image 36. In the event a low contrast scenario exists, then the method is operable to adjust a colour of the display feature 46 in the HUD image 36 to enhance the colour contrast between the display feature and the external view captured by the camera system 20.
It will be noted that the first three steps (501-503) of method 500 are the same as the method 400 of Figure 5.
As such, at step 501, the method 500 comprises receiving, at the HUD controller 32, data from one or more of the plurality of common vehicle systems 14-22. In this example of the invention, in the same way as the previously illustrated example, the data received may be indicative of a speed of a vehicle 10, although it will be appreciated that the data may be indicative of any common characteristic associated with a vehicle 10 or its surrounding environment.
At step 502, the HUD controller 32 determines a suitable display feature 46 in dependence of the vehicle data received at step 501. This step may involve the HUD controller 32 determining the appropriate graphical representation for the display feature 46 for output by the HUD projector 34. Determining the appropriate graphical representation may include selecting the pixel pattern to output and the colour associated with each pixel in the pixel pattern. The colour for each pixel may be selected based on an appropriate colour model, for example the RGB colour model. The discussion included above about a suitable colour space assignment to the display feature 46 applies also to this example of the invention. In the specific example discussed above, the display feature 46 is an alphanumeric string representing the current vehicle speed and is assigned the colour white.
At step 503, the method 500 generates a first HUD frame comprising said display feature 56 at a suitably selected colour value, which in the illustrated example is vehicle speed data displayed in a white colour. To assist in this discussion of the method 500, the first HUD frame is shown in Figure 9 and labelled as '200'. As can be seen, the display feature 56 is shown in white, and the background of the first HUD frame 200 has been provided with a grey background in Figure 9 merely to help the white characters be seen. The grey background would not be projected in the HUD image 36. It will also be observed that the display feature 56 is projected by the HUD projector 34 within the HUD image 36.
This step is conventional in the sense that a conventional HUD system would generate a display feature 56 based on vehicle data and a required colour assignment and output that generated HUD frame 200 to the HUD projector 34 for generation of the HUD image 36.
The method 500 is then operable to evaluate the scene ahead of the vehicle in order to determine whether a sufficient contrast exists between the background scene of the HUD image 36 and the colour of the display feature 56 to ensure that the display feature 56 can be perceived well by the driver of the vehicle. This is indicated at step 504.
The step of determining whether a low contrast scenario exists between the HUD image 36 and the background scene ahead of the vehicle may be achieved in various ways.
In one example, the HUD controller 32 may evaluate data generated by the camera system 20. As has been discussed, the camera system 20 is configured to capture an image of the forward-looking view of the vehicle 10. This is depicted by the wide-angle main field of view (FOV) 60 of the camera 24. Within the main FOV 60, there is encompassed a sub-FOV 62 which coincides with the position of the HUD image 36 on the screen 38, namely, the position on the screen 38 onto which the HUD image 36 is projected, referred to above as the projection portion 44. The sub-FOV 62 will correspond with a specific region of the image captured by the camera system 20. It will be appreciated that the sub-FOV 62 is a localised array of pixels within the larger array of pixels that corresponds to the FOV 60.
In evaluating the view ahead of the vehicle 10, the HUD controller 32 is configured to receive data from the camera system 20 over the CAN bus 12. As the skilled person would understand, the camera data may comprise colour space values for each of an array of pixels captured by the camera 24. Thus, suitably the camera 24 is a charge-coupled device (CCD) that outputs a data array having data points corresponding to each pixel in the CCD. Each data point may therefore contain a value for each of the R, G and B components according to the sRGB colour space.
In determining the contrast between the background scene and the HUD image 36, the HUD controller 32 may average each of the R, G, and B components (also referred to as channels) of the camera data to arrive at an overall value of the average colour of the background scene relating to the sub-FOV 62.
In one example, a low contrast scenario may be determined by the HUD controller 32 averaging the colour content of pixels contained within the sub-FOV 62 and comparing the calculated average colour content to the selected colour of the HUD display feature 56.
As an example of this, consider the display feature 56 having the colour white. White can be expressed as RGB(255,255,255) or, as expressed on a chromaticity plot, x=0.31, y=0.33 (based on the CIE 1931 chromaticity diagram).
A suitable colour contrast threshold can be established, under which the difference between the colour of the display feature 56 and the average colour of the sub-FOV 62 will be considered to provide an unacceptable visual contrast to the driver. So, the threshold may be considered to be a difference greater than x=0.1 and a difference greater than y=0.1.
The colour contrast threshold may be generated during development of the system and stored within the memory module 56 for reference in the method 500 at any time. The skilled person would understand that comparable thresholds may be set by reference to RGB colour components and that such thresholds may be determined offline based on the required sensitivity of the system.
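A hedged sketch of the threshold test might look as follows, taking the 0.1 figures above as example values; the function name and the exact reading of the threshold (contrast is treated as acceptable only when the difference exceeds the threshold in both x and y) are assumptions for illustration.

```python
def is_low_contrast(feature_xy, background_xy, threshold=0.1):
    """True when the chromaticity difference between the display feature
    and the averaged background colour does not clear the threshold in
    both x and y, i.e. an unacceptable visual contrast to the driver."""
    dx = abs(feature_xy[0] - background_xy[0])
    dy = abs(feature_xy[1] - background_xy[1])
    return not (dx > threshold and dy > threshold)

# A white speed readout (x=0.31, y=0.33) over a snow-covered verge whose
# averaged colour also sits near the white point:
print(is_low_contrast((0.31, 0.33), (0.33, 0.34)))   # True -> switch colours
```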
As an alternative to evaluating the colour of the pixels within the sub-FOV 62 that overlaps with the projection area of the HUD image 36, it is envisaged that the colour of the pixels over a wider area may be evaluated, for example the maximum FOV 60. It should be noted that the sub-FOV 62 may not coincide precisely with the projection portion 44 of the HUD image 36 and may instead be a portion of the projection portion 44. As a further alternative, it is envisaged that the average colour of an array of pixels that borders the display feature 56 may be evaluated, i.e. a small patch of pixels that surrounds the display feature 56, thereby overlapping the display feature 56 at least in part.
It will be appreciated that depending on the type of camera/imaging system that is used, the number of pixels needing to be processed may be between a few hundred and many thousands. This may impose a significant processing burden. To mitigate the processing requirement, a spread of pixels may be selected from the FOV 60 or the sub-FOV 62, as applicable. Such an approach may still yield an accurate average colour value, but reduce the processing load significantly. It is expected that the number of selected pixels may be between 10 and 50. The pixels should preferably be spread evenly throughout the FOV 60 or the sub-FOV 62, as applicable, but this is not essential.
Notably, different regions of the FOV 60 may be used to 'predict' upcoming colour clashes with the HUD image. In this respect, pixels at a higher level in the FOV may be indicative of view features that may 'travel down' the image as the vehicle moves forward. By sampling the pixels at higher regions of the FOV 60, the system may be able to make a prediction of possible low contrast scenarios that may occur several seconds in advance.
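As a sketch of this look-ahead idea, the pixels of the upper band of the view could be gathered for early evaluation; the 25% band fraction and the helper name are assumptions, not values from the patent.

```python
import numpy as np

def lookahead_pixels(view: np.ndarray, fraction: float = 0.25) -> np.ndarray:
    """Return pixels from the top band of the view (an H x W x 3 array)
    as an N x 3 array for early colour evaluation."""
    rows = max(1, int(view.shape[0] * fraction))
    return view[:rows].reshape(-1, 3)    # upper rows 'travel down' over time
```

The returned pixels could then be fed to the same averaging and threshold tests sketched above, a few seconds before the corresponding scenery reaches the HUD region.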
The processing may also take into account the standard deviation (SD) of the pixel colour values. For example, a lower standard deviation indicates that the colour 'spread' around the calculated mean value is small, so it is more likely that one colour dominates the image and reduces the contrast between the external view and the display feature in the HUD image.
As a further alternative, rather than the average RGB content of the pixels, the SD of the pixels may be determined from the 'target' colour of the display feature. For example, if it is determined that the SD of the selected pixels (e.g. either a spread of selected pixels in the external FOV 60 or the sub-FOV 62) is relatively low, then it can be deduced that the average colour of those pixels is quite close to the colour of the display feature and so a 'high contrast' HUD frame should be enabled. It is envisaged that a 'low' SD could be considered to be below 3, for example, at which point the system would identify that a low contrast scenario exists.
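The deviation-from-target variant might be sketched as follows, assuming 8-bit colour channels so that the example cut-off of 3 is meaningful on that scale; the function name and the sample pixel values are illustrative only.

```python
import numpy as np

def sd_from_target(pixels: np.ndarray, target_rgb) -> float:
    """Root-mean-square deviation of sampled pixel colours (an N x 3
    array) from the target colour of the display feature."""
    return float(np.sqrt(np.mean((pixels - np.asarray(target_rgb)) ** 2)))

# Pixels clustered tightly around white -> small deviation -> the scene is
# close to the feature colour, so a low contrast scenario is flagged.
pixels = np.array([[254, 254, 253], [253, 255, 254], [254, 253, 255]])
print(sd_from_target(pixels, (255, 255, 255)) < 3.0)   # True
```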
If a low contrast scenario is determined not to exist, then the HUD controller 32 is operable to output the generated first HUD frame 200 to the HUD projector to display as the HUD image 36 at a predetermined display frequency, as discussed above, as noted at step 505.
The method then returns to step 501.
If it is determined that a low contrast scenario does exist, then the HUD controller 32 is operable to generate a further HUD frame with an alternate colour scheme for the display feature 56, as indicated at step 506. The HUD controller 32 is then operable to output the enhanced contrast HUD frame 202 to the HUD projector 34, as indicated at step 507.
This process is also illustrated in Figure 10. As is shown, the enhanced contrast HUD frame is illustrated as 202. The enhanced contrast HUD frame 202 has a display feature 56' which is the same alphanumeric string as the first display feature 56 but shown in a much darker colour. It will be appreciated therefore that the colour of the display feature in the original HUD image has been adjusted to provide a higher contrast with the background image.
In order to generate the enhanced contrast HUD frame 202, the HUD controller 32 may be operable to refer to a suitable data structure (e.g. a look-up table) stored in the internal memory 56. The data structure may comprise a list of display features together with their primary assigned colour and corresponding 'high contrast' colour allocations to be used when a low contrast situation has been identified. For example, the illustrated display feature 56 may have a primary assigned colour of white (approx. x=0.3 and y=0.3 on the chromaticity plot for the sRGB colour space) whereas its high contrast colour allocation may be dark blue (approx. x=0.2, y=0.1 on the chromaticity plot), which thereby increases the difference between the colour space value of the display feature 56 of the first HUD frame and the colour space value of the display feature 56' of the second HUD frame 202.
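A minimal sketch of such a data structure and the colour switch is given below; the feature names and the second table entry are hypothetical, while the chromaticity pairs follow the white and dark blue examples above.

```python
# feature -> (primary colour (x, y), high-contrast alternate (x, y))
HIGH_CONTRAST_LUT = {
    "vehicle_speed": ((0.30, 0.30), (0.20, 0.10)),   # white -> dark blue
    "speed_limit":   ((0.30, 0.30), (0.20, 0.10)),   # assumed entry
}

def select_colour(feature: str, low_contrast: bool):
    """Return the alternate colour when a low contrast scenario exists,
    otherwise the feature's primary assigned colour."""
    primary, alternate = HIGH_CONTRAST_LUT[feature]
    return alternate if low_contrast else primary

print(select_colour("vehicle_speed", low_contrast=True))   # (0.2, 0.1)
```

Because the allocations are fixed offline, the controller need only perform a dictionary lookup per frame rather than computing a contrasting colour on the fly.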
The effect of this will be appreciated from Figure 10, since the display feature 56' is much more apparent in the HUD image 36 than is the case in Figure 9, thereby providing a more easily discernible HUD image 36 for the driver of the vehicle.
Following the output of the generated second HUD frame 202 at step 507, the method 500 will continue to output the second HUD frame 202 until such a time as the colour difference between the original colour of the display feature 56 in the HUD image 36 and the averaged pixel colour content of the sub-FOV 62 once again exceeds the predetermined threshold discussed above. This is shown in Figure 8 as the loop back to decision step 504. In this loop, the second HUD frame 202 may be updated with received data from the vehicle systems 14-22 which, in effect, provides a refresh of the display feature 56' in the HUD image 36, as indicated at step 508.
In the above discussion, variants have been described where the specific examples discussed may depart from the examples of the invention as defined by the claims. The skilled person may however be able to conceive other adaptations that are not discussed here.
For example, in the method 500 described above with reference to Figures 8 to 10, it will be apparent that one of the effects of the method 500 is that the HUD controller 32 detects when a low contrast scenario exists between the view ahead of the vehicle, as captured by the camera system 20, and the display feature 56 in the HUD image 36, and is operable to 'switch' the display feature 56 to an enhanced contrast display feature 56' which has a different colour. It is envisaged that the enhanced contrast display feature 56' will remain displayed in the HUD image 36 until such time that the low contrast scenario with the original display feature 56 in the HUD image 36 has passed, at which point the HUD controller 32 will 'switch back' to the original colour of the display feature 56 in the HUD image 36. It will be appreciated that within this methodology, the HUD controller 32 may be configured to implement an approach by which different HUD frames (the different HUD frames having display features with respective contrasting colours) are interleaved with one another in a similar manner to the methodology described with reference to Figures 5 to 7.
It is further envisaged that some user configurability of the system could be implemented to ensure that the colour contrasts may be selected appropriately. For example, some users with colour blindness may perceive some colours better than others. The system could be enhanced such that a colour blindness test may be conducted on a display screen (e.g. the dashboard display) of the vehicle to ascertain which colours or colour combinations are best perceived by the user. The results of such a colour blindness test may then be used to set the selected 'high contrast' colours for the display features in the data structure of the system.
Claims (15)
- CLAIMS
- 1. A head-up display (HUD) system for a vehicle, comprising: a HUD projector and a HUD controller, wherein the HUD controller is configured to control the HUD projector to generate a HUD image on a screen of the vehicle, wherein the HUD controller is configured to: determine a first HUD display feature based on received vehicle data, generate a first HUD frame for projection by the HUD projector, the first HUD frame comprising the first HUD display feature, and wherein a first colour space value is attributed to the first HUD display feature, generate a second HUD frame for projection by the HUD projector, wherein the second HUD frame comprises the first HUD display feature, and wherein a second colour space value is attributed to the first HUD display feature in the second HUD frame; generate a HUD frame sequence for outputting to the HUD projector, wherein the HUD frame sequence comprises a plurality of the first HUD frames and a plurality of the second HUD frames, wherein the plurality of the first HUD frames are interleaved with the plurality of the second HUD frames, and output the HUD frame sequence to the HUD projector, wherein the HUD projector projects the HUD frame sequence at the screen.
- 2. The HUD system of Claim 1, wherein, in interleaving the plurality of first HUD frames and the plurality of second HUD frames in the generated HUD frame sequence, at least some of the first HUD frames and at least some of the second HUD frames are arranged in an alternating order in the HUD frame sequence.
- 3. The HUD system of Claims 1 or 2, wherein the HUD controller is configured to: determine the presence of a low contrast scenario from an external view of the vehicle, and the HUD image, prior to the step of generating the HUD frame sequence comprising the plurality of the first HUD frames and the plurality of the second HUD frames.
- 4. The HUD system of Claim 3, wherein the external view of the vehicle is captured by a camera device.
- 5. The system of Claim 4, wherein the camera is configured to capture the external view such that a camera field of view overlaps at least partially with the HUD image as perceived by a driver of the vehicle.
- 6. The HUD system of Claims 3 to 5, wherein, in determining the presence of a low contrast scenario, the HUD controller is configured to: evaluate the external view to determine a colour space value of the external view; and determine if a difference between the colour space value of the at least one display feature in the first HUD frame and the colour space value of the external view exceeds a predetermined threshold.
- 7. The HUD system of any one of the preceding claims, wherein, in generating the second HUD frame, the second colour space value attributed to the first display feature is determined by reference to a data structure stored in a memory module comprising alternate colour space values relating to at least the first display feature.
- 8. The HUD system of any one of the preceding claims, wherein the first display feature is selected from the group of the following: a representation of vehicle speed data, a representation of speed limit data, a representation of vehicle navigation data, a representation of vehicle communications data, a representation of vehicle audio data.
- 9. A method of controlling a HUD system for a vehicle, the HUD system comprising a HUD projector and a HUD controller, wherein the HUD controller is configured to control the HUD projector to generate a HUD image on a screen of the vehicle, wherein the HUD image has at least one display feature, wherein the method comprises, at the HUD controller: determining a first HUD display feature based on received vehicle data, generating a first HUD frame for projection by the HUD projector, the first HUD frame comprising the first HUD display feature, and attributing a first colour space value to the first HUD display feature, generating a second HUD frame for projection by the HUD projector, wherein the second HUD frame comprises the first HUD display feature, and attributing a second colour space value to the first HUD display feature in the second HUD frame; generating a HUD frame sequence for outputting to the HUD projector as the HUD image, wherein the HUD frame sequence comprises a plurality of the first HUD frames and a plurality of the second HUD frames, wherein the plurality of the first HUD frames are interleaved with the plurality of the second HUD frames, outputting the HUD frame sequence as the HUD image to the screen of the vehicle.
- 10. The method of Claim 9, wherein, in interleaving the plurality of first HUD frames and the plurality of second HUD frames in the generated HUD frame sequence, at least some of the first HUD frames and at least some of the second HUD frames are arranged in an alternating order in the HUD frame sequence.
- 11. The method of Claim 9 or Claim 10, further comprising: determining the presence of a low contrast scenario from an external view of the vehicle, and the HUD image, prior to the step of generating the HUD frame sequence comprising the plurality of the first HUD frames and the plurality of the second HUD frames.
- 12. The method of Claim 11, wherein, in determining the presence of a low contrast scenario, the method comprises: evaluating the external view to determine a colour space value of at least a portion of the external view; and determining if a difference between the colour space value of the at least one display feature in the first HUD frame and the colour space value of the portion of the external view exceeds a predetermined threshold.
- 13. The method of Claim 12, wherein the portion of the external view that is evaluated overlaps at least partially with the HUD image on the screen of the vehicle.
- 14. The method of any one of Claims 9 to 13, wherein, in generating the second HUD frame, the second colour space value attributed to the first display feature is determined by reference to a data structure stored in a memory module comprising alternate colour space values relating to at least the first display feature.
- 15. The method of any one of Claims 9 to 14, wherein the first display feature is selected from the group of the following: a representation of vehicle speed data, a representation of speed limit data, a representation of vehicle navigation data, a representation of vehicle communications data, a representation of vehicle audio data.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB2406412.3A GB2640887A (en) | 2024-05-08 | 2024-05-08 | Head-up display for use in a motor vehicle |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| GB202406412D0 GB202406412D0 (en) | 2024-06-19 |
| GB2640887A (en) | 2025-11-12 |
Family
ID=91465648
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| GB2406412.3A Pending GB2640887A (en) | 2024-05-08 | 2024-05-08 | Head-up display for use in a motor vehicle |
Country Status (1)
| Country | Link |
|---|---|
| GB (1) | GB2640887A (en) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20160307346A1 (en) * | 2015-04-17 | 2016-10-20 | Freescale Semiconductor, Inc. | Display controller, heads-up image display system and method thereof |
| US20190107886A1 (en) * | 2014-12-10 | 2019-04-11 | Kenichiroh Saisho | Information provision device and information provision method |
| JP2022100119A (en) * | 2020-12-23 | 2022-07-05 | 日本精機株式会社 | Display control device, head-up display device, and image display control method |
| EP4039519A1 (en) * | 2019-09-30 | 2022-08-10 | Koito Manufacturing Co., Ltd. | Vehicular display system and vehicle |
| US11699368B1 (en) * | 2022-08-23 | 2023-07-11 | GM Global Technology Operations LLC | Head-up display for accommodating color vision deficiencies |
Also Published As
| Publication number | Publication date |
|---|---|
| GB202406412D0 (en) | 2024-06-19 |