WO2017162812A1 - Adaptive display for low visibility - Google Patents

Adaptive display for low visibility

Info

Publication number
WO2017162812A1
Authority
WO
WIPO (PCT)
Prior art keywords
visibility
information
display
detector
display device
Prior art date
Application number
PCT/EP2017/056965
Other languages
French (fr)
Inventor
Sebastian Paszkowicz
Robert Hardy
George Alexander
Original Assignee
Jaguar Land Rover Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Limited filed Critical Jaguar Land Rover Limited
Publication of WO2017162812A1 publication Critical patent/WO2017162812A1/en

Classifications

    • H04N5/272 Means for inserting a foreground image in a background image, i.e. inlay, outlay
    • G09G3/002 Projection systems to project the image of a two-dimensional display, such as an array of light emitting or modulating elements or a CRT
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/23 Head-up displays [HUD]
    • B60K35/28 Output arrangements characterised by the type or purpose of the output information, e.g. video entertainment or vehicle dynamics information
    • B60K35/81 Arrangements for controlling instruments for controlling displays
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • G06T19/006 Mixed reality
    • H04N23/71 Circuitry for evaluating the brightness variation
    • H04N5/57 Control of contrast or brightness
    • B60K2360/177 Augmented reality
    • B60K2360/179 Distances to obstacles or vehicles
    • B60K2360/21 Optical features of instruments using cameras
    • B60K2360/334 Projection means
    • B60K2360/349 Adjustment of brightness
    • B60R2300/107 Camera system using stereoscopic cameras
    • B60R2300/205 Display using a head-up display
    • B60R2300/302 Image processing combining image information with GPS information or vehicle data, e.g. vehicle speed, gyro, steering angle data
    • B60R2300/8053 Viewing arrangement for bad weather conditions or night vision
    • G09G2320/06 Adjustment of display parameters
    • G09G2360/144 Detecting ambient light within display terminals
    • G09G2380/10 Automotive applications

Definitions

  • the present disclosure relates to an adaptive display for low visibility, and particularly, but not exclusively, to an adaptive display for low visibility for a land vehicle. Aspects of the invention relate to an adaptive display, an adaptive display low visibility system and to a method of providing information to a driver through an adaptive display.
  • low visibility conditions can also be the consequence of light intensity, as both not enough and too much light can be detrimental to the vision of a driver.
  • Head-Up Displays (HUDs) are used to present a range of information to a driver by displaying it in their typical line of sight. Information may be projected onto a windscreen, or any clear transparent medium placed within the driver's field of view. This allows a driver to receive and process information about the surrounding environment without diverting or shifting the driver's focus and attention from their usual line of sight.
  • the information displayed by HUDs may also have the undesired effect of being distracting to the driver by providing inessential data. This is especially apparent in more difficult driving conditions, which particularly require the driver's complete concentration.
  • the present invention has been devised to mitigate or overcome at least some of the above-mentioned problems.
  • a method of controlling a display device to display an object image of an object, the object being in a low visibility area, comprising: receiving visibility information relating to the low visibility area from a visibility detector; determining a visibility level from the received visibility information; and, in the event that the visibility level is below a predetermined threshold: determining display settings dependent on the determined visibility level; receiving object information from an object detector; generating an object image based on the received object information; determining a control signal to control the display device to display the object image in dependence on the determined display settings; and outputting the control signal to the display device.
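The claimed method can be sketched as a simple control step. This is an illustrative assumption only: the function names, the settings policy and the threshold value below are invented for the sketch and are not taken from the patent.

```python
# Illustrative sketch of the claimed control method. All names, the
# settings policy and the threshold value are assumptions, not from
# the patent.

VISIBILITY_THRESHOLD = 0.5  # assumed normalised threshold in [0, 1]

def determine_display_settings(visibility_level):
    """Map a visibility level in [0, 1] to display settings.

    Assumed policy: the lower the visibility, the higher the
    brightness and contrast of the displayed object image."""
    severity = 1.0 - visibility_level
    return {"brightness": severity, "contrast": severity}

def control_display(visibility_level, object_info):
    """Return a control signal for the display device, or None when
    the visibility level is at or above the predetermined threshold."""
    if visibility_level >= VISIBILITY_THRESHOLD:
        return None  # adequate visibility: display nothing
    settings = determine_display_settings(visibility_level)
    object_image = {"type": object_info["type"],
                    "position": object_info["position"]}
    return {"settings": settings, "image": object_image}
```

The key behaviour matching the claim is that no control signal at all is produced when the visibility level is not below the threshold.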
  • the aforementioned method provides a display which adapts according to the determined visibility conditions in an area, the area being the field of view of the visibility detector.
  • the display device is located within the vehicle, such that a driver is presented with the adaptive display in their immediate field of view.
  • the display adapts according to the visibility conditions in the field of view of the visibility detector, which is taken to be substantially the same as the field of view of the vehicle, or driver.
  • road information is displayed to the driver when the visibility conditions are determined to be below the predetermined threshold, in a manner dependent on the visibility conditions. However, if the visibility conditions are not below the predetermined threshold, no information is displayed to the driver.
  • Road information comprises, but is not limited to, one or more surrounding objects such as vehicles, pedestrians and highway furniture, where highway furniture includes, but is not limited to, road signs, barriers and speed bumps.
  • road information may also include road network related information.
  • Road network related information comprises information associated with one or more of road networks, speed limits and road works.
  • This method provides the advantage that the driver is only presented with road information when it is necessary, namely, in low visibility conditions.
  • This method is devised to be less distracting to the driver than previous driver information displays, and to aid road awareness.
  • the display adapts the display settings according to the determined visibility conditions, optimising the visibility of the display whilst ensuring the display is not distracting to the driver.
  • the above describes a method of controlling a display device such that it automatically adapts to the visibility conditions without any user interaction being required. In low visibility, this method has the advantage that information is presented to the user without any attention being redirected towards the operation of the display device.
  • this is advantageous as the driver can give their full attention to driving in the low visibility conditions, whilst at the same time being displayed useful information which may increase their awareness of the surroundings and consequently their safety.
  • the display settings comprise any one or more of: brightness levels; contrast levels; colours; and opaqueness levels.
  • the display device comprises a head-up display.
  • the method comprises outputting a first control signal and subsequently outputting a second control signal, the first and second control signals being associated with different display settings wherein the display device is arranged to fade between the different display settings.
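Fading between the display settings of the first and second control signals could, for example, be realised by linear interpolation over a number of frames. The function name and the interpolation scheme below are assumptions for illustration; the patent does not specify how the fade is produced.

```python
# Illustrative sketch: fade between two sets of display settings by
# linear interpolation over a number of frames. The scheme is an
# assumption; the patent does not prescribe one.

def fade_settings(old, new, steps):
    """Yield display-settings dicts interpolated from old to new.

    old/new: dicts with the same numeric keys (e.g. brightness,
    contrast). steps: number of frames in the fade, including the
    final frame, which equals the new settings."""
    for i in range(1, steps + 1):
        t = i / steps  # interpolation fraction in (0, 1]
        yield {key: old[key] + (new[key] - old[key]) * t for key in old}
```

Each yielded dict would be sent to the display device in turn, so the transition between the first and second control signals appears gradual rather than abrupt.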
  • the visibility detector comprises a camera.
  • the visibility detector comprises a photometer arranged to measure light intensity.
  • the visibility detector comprises a photometer arranged to measure scattering of light.
  • the object detector comprises a camera.
  • the camera comprises a stereoscopic camera.
  • the object detector comprises a radar system.
  • generating an object image comprises determining an object type from the object information and receiving a representation of the object type from an object database.
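The type-to-representation lookup described above can be sketched as a dictionary query. The database contents and the fallback representation below are invented placeholders, not taken from the patent.

```python
# Illustrative sketch: look up a stored representation for a detected
# object type. The database contents and fallback value are invented
# placeholders.

OBJECT_DATABASE = {
    "pedestrian": "pedestrian_icon",
    "vehicle": "vehicle_icon",
    "road_sign": "road_sign_icon",
}

def generate_object_image(object_info):
    """Determine the object type from the object information and
    return the stored representation for that type, together with
    the detected object position."""
    representation = OBJECT_DATABASE.get(object_info["type"],
                                         "generic_icon")
    return {"representation": representation,
            "position": object_info["position"]}
```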
  • the method also comprises receiving GPS information from a GPS device, where the GPS information comprises location information regarding the current position of the GPS device and may also comprise road network related information.
  • the method comprises determining, from the GPS information, object types or object information to be displayed.
  • receiving object information comprises receiving a video image and generating an object image comprises generating a processed video image in dependence on the determined visibility level.
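One way to make the processed video image depend on the determined visibility level is a contrast stretch whose strength grows as visibility falls. This is a sketch under assumed conventions (8-bit grayscale pixels, visibility level normalised to [0, 1]); the patent does not specify the processing.

```python
# Illustrative sketch: contrast-stretch a grayscale frame, with the
# strength of the stretch depending on the visibility level. The
# policy and value ranges are assumptions, not from the patent.

def enhance_frame(pixels, visibility_level):
    """Stretch the contrast of a list of 8-bit grayscale pixels.

    visibility_level in [0, 1]: lower visibility produces a stronger
    stretch towards the full 0-255 range."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return list(pixels)  # flat frame: nothing to stretch
    strength = 1.0 - visibility_level  # 1.0 = full stretch, 0.0 = none
    out = []
    for p in pixels:
        stretched = (p - lo) * 255.0 / (hi - lo)
        # blend original and fully stretched value by the strength
        out.append(round(p + (stretched - p) * strength))
    return out
```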
  • the method comprises periodically determining the control signal to control the display device.
  • a computer storage medium comprising computer-readable instructions for a computer to carry out the aforementioned method.
  • a non-transitory computer-readable storage medium storing executable computer program instructions to implement the aforementioned method.
  • a system for displaying an object image of an object, comprising: an input arranged to: receive visibility information relating to a low visibility area from a visibility detector; and receive object information from an object detector; a processor arranged to: determine a visibility level from the received visibility information; determine if the visibility level is below a predetermined threshold; determine display settings dependent on the determined visibility level; generate an object image based on the received object information; and determine a control signal to control a display device to display the object image in dependence on the determined display settings; and an output arranged to output the control signal to the display device.
  • the visibility detector may provide a visibility level in the form of a contrast value.
  • the contrast value may be determined from an array of intensity values corresponding to an imaged scene.
  • the imaged scene may comprise the low visibility area.
  • the contrast value may be a measure of the spread of intensity values in the array of intensity values.
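The contrast value as a measure of spread can be sketched with the standard deviation of the intensity array. This is one possible measure chosen for illustration; the patent does not prescribe a specific one.

```python
# Illustrative sketch: derive a contrast value from an array of pixel
# intensities, using the population standard deviation as one possible
# measure of spread (an assumption; the patent names no measure).
from statistics import pstdev

def contrast_value(intensities):
    """Return the spread of intensity values for an imaged scene."""
    return pstdev(intensities)
```

A uniform scene then yields a contrast value of zero, while a scene spanning the full intensity range yields a large one.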
  • the input comprises an object detection module arranged to receive a video image from the object detector and to generate an object image, the generation comprising producing a processed video image in dependence on the determined visibility level.
  • the system comprises an object database arranged to: store image representations of object types; receive object types from the object detection module; and output the image representations to the object detection module.
  • the system comprises a GPS module arranged to: receive road network related information and the current position of the GPS device; determine object types and information associated with the received information; retrieve object images of the determined object types and information; and output the image representations.
  • Figure 1(a) is a schematic top view of a driving system with low visibility in the field of view of a vehicle, the vehicle comprising a low visibility system according to an embodiment of the present invention;
  • Figure 1(b) is a schematic top view of a driving system with low visibility in the field of view of a vehicle, the vehicle comprising a low visibility system including a global positioning system (GPS) device, according to another embodiment of the present invention;
  • Figure 1(c) is a schematic top view of a driving system with very low visibility in the field of view of a vehicle, the vehicle comprising a low visibility system according to an embodiment of the present invention;
  • Figure 1(d) is a schematic top view of a driving system with extremely low visibility in the field of view of a vehicle, the vehicle comprising a low visibility system according to an embodiment of the present invention;
  • Figure 2(a) is a schematic block diagram of the low visibility system of Figure 1(a);
  • Figure 2(b) is a schematic block diagram of the low visibility system of Figure 1(b);
  • Figure 3 is a flowchart of a process carried out by the low visibility system of Figure 2(a);
  • Figure 4(a) is a schematic view illustrating the point of view of a driver through a vehicle windscreen, with a display according to an embodiment of the present invention, illustrating low visibility;
  • Figure 4(b) is a schematic view illustrating the point of view of a driver through a vehicle windscreen, with a display according to an embodiment of the present invention, illustrating very low visibility;
  • Figure 4(c) is a schematic view illustrating the point of view of a driver through a vehicle windscreen, with a display according to an embodiment of the present invention, illustrating extremely low visibility;
  • Figure 5 is a schematic top view of a driving system showing a specific scenario, illustrating how the driving system of Figure 1(b) could be employed;
  • Figure 6 is a schematic view illustrating the point of view of a driver through a vehicle windscreen, corresponding to the specific scenario shown in Figure 5.
  • Figure 1(a) shows a driving environment 100 comprising a vehicle 102, with a low visibility area 104, and an object 106.
  • the object 106 is located within the low visibility area 104.
  • the vehicle 102 comprises a visibility detector 108, an object detector 110, a low visibility system 112a, 112b and a display device 114.
  • the visibility detector 108, object detector 110 and display device 114 are each operatively connected to the low visibility system 112a, 112b. In other embodiments, the visibility detector 108, object detector 110 and display device 114 are wirelessly connected to the low visibility system 112a, 112b.
  • the display device 114 is arranged to adapt its brightness and/or contrast levels and/or colours and/or opaqueness, which together comprise the display settings, in dependence on the detected visibility conditions. Moreover, in an embodiment, the display device 114 is arranged to fade between display settings as they adapt to the detected visibility conditions. In an embodiment, the display device 114 is a Head-Up Display (HUD).
  • the visibility detector 108 is a sensor which has a field of view, depicted by the lines 116, which is substantially the same as the field of view of the object detector 110 and of the driver.
  • the low visibility area 104 and object 106 are ahead of the vehicle 102 and in the field of view 116 of the visibility detector 108 and the object detector 110.
  • the visibility detector 108 is a camera, arranged to capture images of the external environment of the vehicle 102 at between 24 and 240 frames per second.
  • the camera is a digital camera and arranged to output electronic data.
  • the visibility detector 108 is a photometer, arranged to measure the light intensity of the external environment. In yet another embodiment of the invention, the visibility detector 108 is a photometer, arranged to measure the scattering of light in the external environment. In an aspect of the embodiment, the photometer is digital and arranged to output electronic data.
  • the object detector 110 may comprise a radar system.
  • the radar system includes an emitter and a receiver.
  • the emitter emits radio waves which are projected to scan all directions within the vehicle's field of view 116.
  • the receiver detects any reflections of the emitted waves, and a filter is used to discriminate between those reflections and waves due to noise.
  • the object detector 110 may comprise a stereoscopic camera.
  • the stereoscopic camera includes two lenses in order to provide a degree of depth perception for obtaining three-dimensional representations of any objects 106 detected in the field of view 116.
  • the stereoscopic camera is digital and arranged to output electronic data.
  • the object detector 110 may comprise a camera arranged to capture images of the external environment of the vehicle 102 at between 24 and 240 frames per second. In yet another embodiment, the object detector 110 may comprise a video camera arranged to capture video images.
  • Data from the visibility detector 108 and the object detector 110 is processed by the low visibility system 112a, 112b, which outputs control signals in order to display object images on the display device 114.
  • Figure 1(b) shows another embodiment of the invention, illustrating a driving environment 101 comprising a vehicle 102, with a low visibility area 104 in the field of view 116, and an object 106.
  • the object 106 is located within the low visibility area 104.
  • the vehicle 102 comprises a visibility detector 108, an object detector 110, a global positioning system (GPS) device 105, a low visibility system 112c and a display device 114.
  • the display device 114 is a head-up display (HUD).
  • the features of the driving environment 101 are substantially the same as those previously described in the driving environment 100 of Figure 1(a), excluding the GPS device 105.
  • the GPS device 105 is operatively connected to the low visibility system 112c, and is arranged to determine its current position, which will be substantially the same as the current position of the vehicle 102.
  • the GPS device 105 is also arranged to include, or remotely access, a database of maps, which include road networks and other information, such as road works and speed limits.
  • the GPS device 105 is arranged to search the database of maps with respect to its determined position and consequently to output data relevant to the immediate and/or approaching external environment of the vehicle to the low visibility system 112c.
  • Figure 1(c) shows the same driving environment as given in Figure 1(a), with a very low visibility area 104 in the field of view 116.
  • Figure 1(d) shows the same driving environment as given in Figure 1(a), with an extremely low visibility area 104 in the field of view 116.
  • Figure 2(a) shows an embodiment of the low visibility system 112a in greater detail.
  • the low visibility system 112a comprises a controller 200, a visibility detection module 202, an object detection module 204, and an output module 208.
  • the visibility detection module 202, object detection module 204 and output module 208 are each operatively connected to the controller 200.
  • the visibility detection module 202 is arranged to receive input from the visibility detector 108, where the input is visibility information relating to the low visibility area 104, and further to process the visibility information.
  • the visibility detection module 202 is also arranged to analyse the visibility information in order to calculate the visibility level of the low visibility area 104.
  • the visibility level is compared to a predetermined threshold, and if the visibility level is below the predetermined threshold, the visibility detection module 202 is arranged to determine display settings dependent on the determined visibility level, and to initiate further processes of the system in reaction to the visibility level.
  • the display settings are sent to the output module 208 to be output, as part of the control signal, to the display device 114.
  • images are received from a camera on the vehicle 102 and are processed by the visibility detection module 202.
  • the visibility level is determined to be extremely low, far below the predetermined threshold.
  • the visibility detection module 202 will therefore output high contrast and high brightness display settings.
  • the visibility detector may provide intensity values for each pixel in an array of pixels corresponding to an imaged scene, for example a scene ahead of the vehicle. Therefore, an array of intensity values can be determined for an imaged scene. From the array of intensity values, different visibility parameters may be determined. The data forming the array of intensity values may be represented as a histogram. From analysis of the array of intensity values, a mean luminance value can be obtained. A low mean luminance may indicate low light level conditions. A high mean luminance value may indicate high light level conditions.
  • the spread of intensity values can be used to indicate contrast levels in the array of pixels.
  • a small spread of intensity values may indicate low contrast conditions.
  • a large spread of intensity values may indicate high contrast conditions.
  • Conditions indicative of night-time, or other low light level conditions such as may be experienced when a vehicle enters a tunnel or other space sheltered from external lighting, may be indicated by a low mean luminance and a small spread of intensity values.
  • Conditions indicative of the occurrence of one or more of mist, fog, snow, drizzle, rain, smoke, dust, or other visual obscurant may be indicated by a high mean luminance and a small spread of intensity values. However, if the visual obscurant is encountered during low light level conditions, such as at night-time, then a low mean luminance and a small spread of intensity values may be observed.
  • the driver of the vehicle may operate lighting, such as forward lighting, of the vehicle, in order to illuminate the scene around, and in particular in front of, the vehicle.
  • lighting such as forward lighting
  • the mean luminance level may be increased, whilst the spread of intensity values is also increased.
  • the operation of vehicle lighting to illuminate the scene may increase the mean luminance level but not provide a corresponding increase in the spread of intensity values.
  • the spread of intensity values may decrease when the vehicle lighting is operated in conditions of fog, or other visual obscurant. It may therefore be observed that the reduction in contrast, that is, the reduction in the spread of intensity values in the histogram of the array of intensity values, independent of the mean luminance value, is what increases the difficulty of the driving conditions for the vehicle driver. Operating vehicle lighting may be detrimental to the observed contrast.
  • the present invention provides for enhancing object detection when the detected spread of intensity values in the histogram of the array of intensity values is low.
  • the enhancement of object detection may be dependent on the spread of intensity values in the histogram of the array of intensity values.
  • the visibility level is a measure of the spread of intensity values in a histogram of an array of intensity values from a visibility detector.
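The histogram analysis described in the preceding paragraphs might be sketched as follows. The function name, the thresholds, and the use of the standard deviation as the measure of spread are illustrative assumptions, not taken from the specification:

```python
import statistics

def analyse_intensities(intensities, luminance_threshold=80, spread_threshold=30):
    """Derive simple visibility parameters from an array of pixel intensities (0-255).

    Returns the mean luminance, the spread of intensity values (here the
    population standard deviation, as a proxy for contrast), and a coarse
    condition label. Thresholds are illustrative assumptions.
    """
    mean_luminance = statistics.mean(intensities)
    spread = statistics.pstdev(intensities)  # small spread -> low contrast

    if spread < spread_threshold and mean_luminance < luminance_threshold:
        # low mean luminance + small spread: night-time, tunnel, etc.
        condition = "low light (e.g. night-time or tunnel)"
    elif spread < spread_threshold:
        # high mean luminance + small spread: mist, fog, snow, smoke, etc.
        condition = "visual obscurant (e.g. fog, mist, snow)"
    else:
        condition = "normal visibility"
    return mean_luminance, spread, condition
```

A bright but flat frame (all intensities near 205) would be classified as a visual obscurant, while a dark flat frame would be classified as low light.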
  • the object detection module 204 is arranged to receive object information from the object detector 110.
  • the object detection module 204 is arranged to perform object recognition analysis on the object information received from the object detector 110 to determine if there is an object present in the low visibility area 104. If it is determined that there is an object in the low visibility area 104, the object detection module 204 is arranged to generate an object image based on the received object information.
  • the object detection module 204 is arranged to send the generated object image to the output module 208 to be outputted, as part of the control signal, to the display device 114.
  • the object detection module 204 performs object recognition analysis based on the detected size and shape of the object 106.
  • the objects 106 detectable by the object detection module 204 include, but are not limited to, vehicles, pedestrians, and highway furniture, such as but not limited to, road signs, barriers and speed bumps.
  • the object detection module 204 is also arranged to process the images in a way that is dependent on the visibility level, such that the images clearly show any objects within the image.
  • the object detection module 204 is also arranged to process video images in a way that is dependent on the visibility level, such that the video images clearly show any objects within the video stream.
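The visibility-dependent image processing described in the two preceding paragraphs might, purely as an illustrative sketch, take the form of a per-frame contrast stretch whose strength grows as the visibility level falls. The blending formula and the 0-255 intensity range are assumptions, not part of the specification:

```python
def enhance_frame(frame, visibility_level):
    """Stretch the contrast of a low-contrast frame.

    frame: list of pixel intensities in the range 0-255.
    visibility_level: 0.0 (no visibility) to 1.0 (full visibility); the lower
    the level, the more aggressively the intensity range is stretched so that
    objects stand out clearly in the processed video image.
    """
    lo, hi = min(frame), max(frame)
    if hi == lo:
        return list(frame)  # flat frame: nothing to stretch

    # Blend between the original frame and a full-range stretch,
    # with the blend strength growing as visibility drops.
    strength = 1.0 - visibility_level

    def stretch(p):
        stretched = (p - lo) * 255.0 / (hi - lo)
        return round(p + (stretched - p) * strength)

    return [stretch(p) for p in frame]
```

At `visibility_level = 1.0` the frame passes through unchanged; at `0.0` the narrow intensity band is spread across the full 0-255 range.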
  • the object detection module 204 is arranged to continually output the processed video stream to the output module 208 to be continually outputted to the display device 114, until such a time that the detected visibility level is above the predetermined threshold or no objects are detected within the low visibility area 104.
  • the embodiment of the low visibility system 112b of Figure 2(b) also comprises an object database 206.
  • the object database 206 comprises image representations of object types.
  • image representations may be symbols of vehicles, pedestrians, and highway furniture.
  • the object detection module 204 is arranged to retrieve an image representation from the object database 206 of the determined object type.
  • the object database 206 is remote from the low visibility system 112b, and can be accessed wirelessly.
  • Figure 2(c) shows an embodiment of the low visibility system 112c according to the embodiment of the invention shown in Figure 1(c).
  • the low visibility system 112c comprises a controller 200, a visibility detection module 202, an object detection module 204, an object database 206, a GPS module 205 and an output module 208.
  • the visibility detection module 202, object detection module 204, object database 206, GPS module 205, and output module 208 are each operatively connected to the controller 200.
  • the embodiment of the low visibility system 112c of Figure 2(c) includes a GPS module 205 that is arranged to receive and process data relevant to the current position of the GPS device 105.
  • the GPS module 205 may receive information about the position of the GPS device with respect to the road. As another example, the GPS module 205 may receive information about an approaching change in speed limit. The GPS module 205 processes the received data, and determines any object types to be displayed. The GPS module 205 is arranged to retrieve image representations of the object types from the object database 206. The GPS module is also arranged to send the image representations to the output module 208 to be outputted, as part of the control signal, to the display device 114.
  • Each of the low visibility systems shown in Figures 2(a), 2(b) and 2(c) comprises an input, a processor as illustrated by box 210, and an output. It is noted that the processor may, depending on configuration, comprise a single processing module or a plurality of modules. Additionally, the input may comprise a single input or a number of inputs depending on the configuration of the system. Furthermore, in another embodiment of the invention, the output may comprise a number of outputs.
  • Figure 3 shows a process 300 according to an embodiment of the present invention carried out by the low visibility system 112a given in Figure 2(a).
  • Visibility information is received from the visibility detector 108 in Step 302.
  • the visibility information is analysed in Step 304, to determine the level of visibility of the field of view of the visibility detector 108.
  • the visibility level is then compared to a predetermined threshold in Step 306, to confirm whether the field of view comprises the low visibility area 104 and further whether the visibility is low enough that a display of an object image is required. If the visibility level is not lower than the predetermined threshold, the process 300 begins again at Step 302. However, if, following the analysis of Step 304, the visibility level is lower than the predetermined threshold, then display settings are determined, dependent on the visibility level, as shown by Step 308.
  • object information is received from the object detector 110 in Step 310.
  • the object information from the object detector 110 is analysed by the object detection module 204, such that any objects in the low visibility area 104 are detected. If there are no objects in the low visibility area 104, then the process returns to Step 310. However, if there is an object or objects detected in the low visibility area 104, then the object detection module 204 generates an object image based on the received object information, as shown in Step 314. Then, as shown by Step 316, the display settings and object image, which comprise a control signal, are outputted to the display device 114.
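As a non-limiting sketch, one iteration of process 300 (Steps 302 to 316) can be expressed in code. The toy spread-based visibility analysis, the dictionary of display settings and the string-valued object images are all illustrative assumptions:

```python
def low_visibility_cycle(visibility_info, object_info, threshold=0.5):
    """One pass of process 300, with toy analysis functions.

    visibility_info: list of pixel intensities (Step 302 input).
    object_info: list of detected object descriptors (Step 310 input).
    Returns a control signal (display settings plus object images), or None
    when no display is required.
    """
    # Step 304: toy visibility level -- normalised spread of intensities
    spread = max(visibility_info) - min(visibility_info)
    visibility_level = spread / 255.0

    # Step 306: compare with the predetermined threshold
    if visibility_level >= threshold:
        return None  # visibility acceptable: no display required

    # Step 308: display settings depend on the visibility level --
    # the lower the visibility, the higher the brightness and contrast
    display_settings = {
        "brightness": 1.0 - visibility_level,
        "contrast": 1.0 - visibility_level,
    }

    # Steps 310-314: detect objects and generate object images
    if not object_info:
        return None  # no objects in the low visibility area
    object_images = [f"image:{obj}" for obj in object_info]

    # Step 316: the control signal comprises the settings and the images
    return {"settings": display_settings, "objects": object_images}
```

A low-contrast frame with a detected car yields a control signal with high brightness; a high-contrast frame, or a frame with no detected objects, yields no display.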
  • the low visibility system 112b of Figure 2(b) is configured to carry out a substantially similar process to process 300; however, Step 314, generating the object image, comprises the object detection module 204 determining an object type and retrieving an image representation of the object type from the object database 206.
  • the low visibility system 112c of Figure 2(c) is configured to carry out a substantially similar process to process 300; however, Step 314, generating the object image, comprises the object detection module 204 determining an object type and retrieving an image representation of the object type from the object database 206, and further the GPS module 205 determining any object types to be displayed and retrieving image representations from the object database 206.
  • Each of the low visibility systems 112a, 112b and 112c is arranged to repeat the process iteratively, after a predetermined time period, such that the system adapts the control signal in dependence on any detected changes in the visibility level or objects in the low visibility area 104.
  • Figures 4(a), 4(b) and 4(c) illustrate the adaptive nature of the system.
  • Figures 4(a) to (c) show the view of a driver 400 through a vehicle windscreen 402 with the adaptive display 404 in use.
  • In Figures 4(a), 4(b), and 4(c), a car has been detected in front of the vehicle, such that a representation 406 is shown on the adaptive display 404.
  • Figure 4(a) shows low visibility conditions 408.
  • Figure 4(b) shows very low visibility conditions 410.
  • Figure 4(c) shows extremely low visibility conditions 412. Comparing the representation 406 in each of the Figures, it is illustrated that as visibility conditions worsen, the brightness and contrast of the representation 406 increase, dependent on the detected visibility conditions 408, 410, 412.
  • the display settings are faded between iterations of the process, or when no display is required, such that the display appears to the driver to transition smoothly in response to changing visibility levels, and such that the transitions between the display settings dependent on visibility levels are not obvious to the driver. Therefore, it should appear to the driver as a continuously adapting display. Furthermore, fading between different display settings is arranged such that it is of minimum possible distraction to the driver.
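Purely as an illustrative sketch, the fading between successive display settings could be realised by linear interpolation over a short transition period. The dictionary keys and the linear law are assumptions, not taken from the specification:

```python
def fade_settings(old, new, progress):
    """Linearly interpolate between two display-settings dicts.

    progress runs from 0.0 (old settings) to 1.0 (new settings), so that
    successive control signals appear to the driver to transition smoothly
    rather than switching abruptly.
    """
    progress = min(max(progress, 0.0), 1.0)  # clamp to [0, 1]
    return {key: old[key] + (new[key] - old[key]) * progress for key in old}
```

For example, halfway through a transition from `{"brightness": 0.4}` to `{"brightness": 0.8}`, the displayed brightness would be 0.6.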
  • the low visibility system 112a, 112b, 112c can precisely detect the level of visibility, so that it can characterise many different levels of visibility, such that the adaptive display settings are very sensitive to the level of visibility.
  • the display has a limited number of display levels, such that the display only varies after a substantive change in the visibility level.
  • Figures 5 and 6 show a specific scenario. This scenario is non-limiting and is provided here for illustrative purposes.
  • Figure 5 shows a vehicle 102 in two positions. Initially, the vehicle 102 is in position 500. After a period of time, the vehicle 102 travels to the second position 502. In the second position 502, there are low visibility conditions 504 within the field of view of the vehicle 102.
  • the vehicle 102 comprises a visibility detector 108, an object detector 110, a global positioning system (GPS) device 105, a low visibility system 112c and a display device 114, where the display device 114 is a HUD.
  • the visibility detector 108 is a digital camera and the object detector 110 is a radar system with a digital output.
  • the camera 108 sends images to the visibility detection module 202.
  • the visibility detection module 202 processes the images and analyses the visibility information to determine contrast levels. In this position, the contrast levels are normal and therefore are within the predetermined threshold. Therefore, no other processes are required.
  • the camera 108 sends images to the visibility detection module 202.
  • the visibility detection module 202 processes the images and analyses the data to determine visibility levels. In the second position 502, very low visibility levels are detected, and the visibility level is determined to be below the predetermined threshold.
  • the visibility detection module 202 analyses the visibility level and determines the corresponding display settings. This initiates the radar system 110 and the GPS device 105, which collect information about the vehicle surroundings.
  • the object detection module 204 receives and processes object information from the radar system 110, and determines that there is an object 106 in the field of view 116, and further that the object type is a car.
  • the object detection module 204 then retrieves an image representation of a car from the object database 206.
  • the object detection module also determines the distance to the car 106 at the point the image was taken, in order to determine what size the object representation should be shown at.
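Determining the displayed size of the representation from the measured distance might, under an assumed inverse-distance scaling law (the function name, parameters and scaling law are illustrative, not taken from the specification), look like:

```python
def representation_size(base_size_px, reference_distance_m, measured_distance_m):
    """Scale an object representation inversely with distance.

    base_size_px is the size of the representation at the reference distance;
    a car twice as far away is drawn half as large, so the displayed size
    reflects the distance to the object at the point the image was taken.
    """
    if measured_distance_m <= 0:
        raise ValueError("distance must be positive")
    return base_size_px * reference_distance_m / measured_distance_m
```

For instance, a representation drawn at 100 px for a car 10 m ahead would be drawn at 50 px for a car 20 m ahead.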
  • the GPS module 205 receives and processes GPS information from the GPS device 105 and determines that the vehicle is in the left-hand lane and additionally that the speed limit will change in the next 100m from 70mph (about 110 km/h) to 60mph (about 95 km/h).
  • the GPS module 205 therefore retrieves the image representations for a left-hand lane and for a 60mph (or 95 km/h) sign from the object database 206.
  • the visibility detection module 202 outputs the display settings to the output module 208, the object detection module 204 outputs the car representation to the output module 208 and the GPS module 205 outputs the left-hand lane and 60mph (or 95 km/h) sign representations to the output module 208, all of which then comprise the control signal which is sent to the display device 114.
  • the display device 114, which is a head-up display, receives the control signal from the output module 208 and displays the object representations at the display settings corresponding to the determined visibility level.
  • Figure 6 provides an illustration of the view of a driver 600 through a vehicle windscreen 402 in the scenario described in Figure 5, when the vehicle is in the second position 502.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Combustion & Propulsion (AREA)
  • Chemical & Material Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Signal Processing (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention provides a system and method of controlling a display device to display an object image of an object, the object being in a low visibility area. The method comprises receiving visibility information from a visibility detector, relating to the low visibility area; determining a visibility level from the received visibility information, and in the event that the visibility level is below a predetermined threshold; determining display settings dependent on the determined visibility level; receiving object information from an object detector; generating an object image based on the received object information; determining a control signal to control the display device to display an object image of the object in dependence on the determined display settings; and outputting the control signal to a display device.

Description

ADAPTIVE DISPLAY FOR LOW VISIBILITY
TECHNICAL FIELD
The present disclosure relates to an adaptive display for low visibility, and particularly, but not exclusively, to an adaptive display for low visibility for a land vehicle. Aspects of the invention relate to an adaptive display, an adaptive display low visibility system and to a method of providing information to a driver through an adaptive display.
BACKGROUND
Driving is often an essential part of a modern lifestyle, and frequently people are required to drive despite low visibility conditions. However, low visibility conditions can have a considerable impact on road safety. It is of little surprise that as a driver's visibility of the surroundings decreases, the risk of an accident occurring increases.
There are many factors which can cause low visibility conditions for a driver. One significant factor is weather such as fog, snow and rain, which are well known to result in dangerous driving conditions. As well as simply obscuring an object or hazard, such weather conditions can produce low contrast visibility between objects and their surroundings, a consequence of which is that the objects are perceived as being further away from the observer than they are in reality. Moreover, low contrast can result in a lack of clear external markers for a driver, and in this case it is easy for the driver to underestimate their speed and over-estimate distances. These factors can contribute to higher accident rates.
Furthermore, low visibility conditions can also be the consequence of light intensity, as both not enough and too much light can be detrimental to the vision of a driver.
Therefore, it is of growing importance, especially as vehicle technology develops, to provide drivers with systems which can aid safer driving when there are adverse effects on visibility. In response to low visibility conditions, as well as other road safety considerations, vehicles are now often equipped with perception sensors, such as radar, lidar and CCD cameras, which can continuously monitor driving conditions. Driver assistance systems can analyse both inside and outside conditions in order to provide the driver with warnings, images or information, or initiate active systems such as automatically turning on lights. Although there are many automatic systems which have been developed in order to aid drivers, one system which can significantly aid driving is the Head-Up Display (HUD). HUDs are used to present a range of information to a driver by displaying it in their typical line of sight. Information may be projected onto a windscreen, or any clear transparent medium placed within the driver's field of view. This allows a driver to receive and process information about the surrounding environment without diverting or shifting the driver's focus and attention from their usual line of sight.
However, the information displayed by HUDs may also have the undesired effect of being distracting to the driver by providing inessential data. This is especially apparent in more difficult driving conditions, which particularly require the driver's complete concentration.
The present invention has been devised to mitigate or overcome at least some of the above-mentioned problems.
SUMMARY OF THE INVENTION
According to an aspect of the present invention there is provided a method of controlling a display device to display an object image of an object, the object being in a low visibility area, the method comprising: receiving visibility information relating to the low visibility area, from a visibility detector; determining a visibility level from the received visibility information; and in the event that the visibility level is below a predetermined threshold; determining the display settings dependent on the determined visibility level; receiving object information from an object detector; generating an object image based on the received object information; determining a control signal to control the display device to display an object image of the object in dependence on the determined display settings; and outputting the control signal to the display device. The aforementioned method provides a display which adapts according to the determined visibility conditions in an area, the area being the field of view of the visibility detector. In an embodiment where the display device is located within the vehicle, such that a driver is presented with the adaptive display in their immediate field of view, the display adapts according to the visibility conditions in the field of view of the visibility detector, which is taken to be substantially the same as the field of view of the vehicle, or driver. In this embodiment, the driver is displayed road information when the visibility conditions are determined to be below the predetermined threshold, in a manner dependent on the visibility conditions. However, in the case that the visibility conditions are not below the predetermined threshold, the driver is not displayed any information.
Road information comprises one or more of, but is not limited to, surrounding objects such as vehicles, pedestrians and highway furniture, where highway furniture may relate to but is not limited to, road signs, barriers and speed bumps. In some embodiments road information may also include road network related information. Road network related information comprises information associated with one or more of road networks, speed limits and road works.
This method provides the advantage that the driver is only presented with road information when it is necessary, namely, in low visibility conditions. This method is devised to be less distracting to the driver than previous driver information displays, and to aid road awareness. Moreover, the display adapts the display settings according to determined visibility conditions, optimising the visibility of the display whilst ensuring the display is not distracting to the driver. Furthermore, the above describes a method of controlling a display device such that it automatically adapts to the visibility conditions without any user interaction required. In the presence of low visibility this method holds the advantage that information is presented to the user without any attention being redirected towards the operation of the display device. In an embodiment where the user is driving a vehicle it can be seen that this is advantageous as the driver can give their full attention to driving in the low visibility conditions, whilst at the same time being displayed useful information which may increase their awareness of the surroundings and consequently their safety.
Optionally, the display settings comprise any one or more of: brightness levels; contrast levels; colours; and opaqueness levels. Optionally the display device comprises a head-up display. Optionally, the method comprises outputting a first control signal and subsequently outputting a second control signal, the first and second control signals being associated with different display settings wherein the display device is arranged to fade between the different display settings.
Optionally the visibility detector comprises a camera.
Optionally the visibility detector comprises a photometer arranged to measure light intensity.
Optionally the visibility detector comprises a photometer arranged to measure scattering of light. Optionally the object detector comprises a camera. Furthermore, optionally the camera comprises a stereoscopic camera.
Optionally the object detector comprises a radar system. In certain embodiments generating an object image comprises determining an object type from the object information and receiving a representation of the object type from an object database.
Optionally the method also comprises receiving GPS information from a GPS device, where the GPS information comprises location information regarding the current position of the GPS device and may also comprise road network related information.
Optionally, the method comprises determining object types or object information, from the GPS information, to be displayed.
Optionally receiving object information comprises receiving a video image and generating an object image comprises generating a processed video image in dependence on the determined visibility level.
Optionally the method comprises periodically determining the control signal to control the display device. According to an aspect of the invention there is provided a computer storage medium comprising computer-readable instructions for a computer to carry out the aforementioned method.
According to another aspect of the invention there is provided a non-transitory computer-readable storage medium storing executable computer program instructions to implement the aforementioned method. According to a further aspect of the present invention there is provided a system for displaying an object image of an object, the object being in a low visibility area, the system comprising: an input arranged to: receive visibility information relating to the low visibility area from a visibility detector; and receive object information from an object detector; a processor arranged to: determine a visibility level from the received visibility information; determine if the visibility level is below a predetermined threshold; determine display settings dependent on the determined visibility level; generate an object image based on the received object information; and determine a control signal to control the display device to display an object image of the object in dependence on the determined display settings; and an output arranged to output the control signal to the display device.
The visibility detector may provide a visibility level in the form of a contrast value. The contrast value may be determined from an array of intensity values corresponding to an imaged scene. The imaged scene may comprise the low visibility area. The contrast value may be a measure of the spread of intensity values in the array of intensity values.
Optionally the input comprises an object detection module arranged to receive a video image from the object detector and to generate an object image, comprising generating a processed video image in dependence on the determined visibility level.
Optionally the system comprises an object database arranged to: store image representations of object types; and receive object types from the object detection module; and output the image representations to the object detection module. Optionally, the system comprises a GPS module arranged to: receive road network related information and the current position of the GPS device; and determine object types and information associated with the received information; and retrieve object images of the determined object types and information; and output the image representations. An aspect of the invention provides for a vehicle comprising the system of the above further aspect of the present invention.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1(a) is a schematic top view of a driving system with low visibility in the field of view of a vehicle, the vehicle comprising a low visibility system according to an embodiment of the present invention;
Figure 1(b) is a schematic top view of a driving system with low visibility in the field of view of a vehicle, the vehicle comprising a low visibility system including a global positioning system (GPS) device, according to another embodiment of the present invention;
Figure 1(c) is a schematic top view of a driving system with very low visibility in the field of view of a vehicle, the vehicle comprising a low visibility system according to an embodiment of the present invention, illustrating very low visibility;
Figure 1(d) is a schematic top view of a driving system with extremely low visibility in the field of view of a vehicle, the vehicle comprising a low visibility system according to an embodiment of the present invention, illustrating extremely low visibility;
Figure 2(a) is a schematic block diagram of the low visibility system of Figure 1(a);
Figure 2(b) is a schematic block diagram of the low visibility system of Figure 1(b);
Figure 2(c) is a schematic block diagram of the low visibility system of Figure 1(c);
Figure 3 is a flowchart of a process carried out by the low visibility system of Figure 2(a);
Figure 4(a) is a schematic view illustrating the point of view of a driver through a vehicle windscreen, with a display according to an embodiment of the present invention, illustrating low visibility;
Figure 4(b) is a schematic view illustrating the point of view of a driver through a vehicle windscreen, with a display according to an embodiment of the present invention, illustrating very low visibility;
Figure 4(c) is a schematic view illustrating the point of view of a driver through a vehicle windscreen, with a display according to an embodiment of the present invention, illustrating extremely low visibility;
Figure 5 is a schematic top view of a driving system showing a specific scenario, illustrating how the driving system of Figure 1(b) could be employed; and
Figure 6 is a schematic view illustrating the point of view of a driver through a vehicle windscreen corresponding to the specific scenario shown in Figure 5.
DETAILED DESCRIPTION
Figure 1(a) shows a driving environment 100 comprising a vehicle 102, with a low visibility area 104, and an object 106. The object 106 is located within the low visibility area 104. The vehicle 102 comprises a visibility detector 108, an object detector 110, a low visibility system 112a, 112b and a display device 114.
The visibility detector 108, object detector 110 and display device 114 are each operatively connected to the low visibility system 112a, 112b. In other embodiments the visibility detector 108, object detector 110 and display device 114 are wirelessly connected to the low visibility system 112a, 112b.
The display device 114 is arranged to adapt its brightness and/or contrast levels and/or colours and/or opaqueness, which comprise the display settings, in dependence on detected visibility conditions. Moreover, in an embodiment, the display device 114 is arranged to fade between display settings as the display settings adapt to the detected visibility conditions. In an embodiment, the display device 114 is a Head-Up Display (HUD).
The visibility detector 108 is a sensor which has a field of view depicted by the lines 116, which is substantially the same as the field of view of the object detector 110 and of the driver. The low visibility area 104 and object 106 are ahead of the vehicle 102 and in the field of view 116 of the visibility detector 108 and the object detector 110.
In an embodiment of the invention, the visibility detector 108 is a camera, arranged to capture images of the external environment of the vehicle 102 at between 24 to 240 frames per second. In an aspect of the embodiment, the camera is a digital camera and arranged to output electronic data.
In another embodiment the visibility detector 108 is photometer, arranged to measure the light intensity of the external environment. In yet another embodiment of the invention, the visibility detector 108 is a photometer, arranged to measure the scattering of light in the external environment. In an aspect of the embodiment, the photometer is digital and arranged to output electronic data. In an embodiment of the invention, the object detector 1 10 may comprise a radar system. The radar system includes an emitter and a receiver. The emitter emits radio waves which are projected to scan all directions within the vehicle's field of view 1 16. The receiver detects any reflections from the emitted waves and a filter is used to discriminate between those emitted waves and waves due to noise.
In another embodiment, the object detector 110 may comprise a stereoscopic camera. The stereoscopic camera includes two lenses in order to provide a degree of depth perception for obtaining three-dimensional representations of any objects 106 detected in the field of view 116. In an aspect of the embodiment, the stereoscopic camera is digital and arranged to output electronic data.
In another embodiment, the object detector 110 may comprise a camera arranged to capture images of the external environment of the vehicle 102 at between 24 and 240 frames per second. In yet another embodiment, the object detector 110 may comprise a video camera arranged to capture video images.
Data from the visibility detector 108 and the object detector 110 is processed by the low visibility system 112a, 112b, which outputs control signals in order to display object images on the display device 114.
Figure 1(b) shows another embodiment of the invention, illustrating a driving environment 101 comprising a vehicle 102, with a low visibility area 104 in the field of view 116, and an object 106. The object 106 is located within the low visibility area 104. The vehicle 102 comprises a visibility detector 108, an object detector 110, a global positioning system (GPS) device 105, a low visibility system 112c and a display device 114. In an embodiment of the invention, the display device 114 is a head-up display (HUD).
The features of the driving environment 101 are substantially the same as those previously described in the driving environment 100 of Figure 1(a), excluding the GPS device 105. The GPS device 105 is operatively connected to a low visibility system 112c, and is arranged to determine its current position, which will be substantially the same as the current position of the vehicle 102. Furthermore, the GPS device 105 is also arranged to include, or remotely access, a database of maps, which include road networks and other information, such as road works and speed limits. The GPS device 105 is arranged to search the database of maps with respect to its determined position and consequently to output data relevant to the immediate and/or approaching external environment of the vehicle to the low visibility system 112c.
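The map-database search described above can be pictured as filtering stored features by distance from the determined position. The following Python sketch is purely illustrative; the coordinates, feature names, and search radius are invented for the example and do not appear in the specification:

```python
def nearby_features(position, features, radius_m=200.0):
    """Return map features (speed limits, road works, ...) lying
    within radius_m of the vehicle's determined position.
    Positions are simplified to (x, y) offsets in metres."""
    px, py = position
    return [f for f in features
            if ((f["x"] - px) ** 2 + (f["y"] - py) ** 2) ** 0.5 <= radius_m]

# Hypothetical map features near the vehicle.
features = [
    {"type": "speed_limit", "value": 60, "x": 120.0, "y": 10.0},
    {"type": "road_works", "x": 900.0, "y": 0.0},
]

# Only the speed-limit feature is within 200 m of the origin.
print(nearby_features((0.0, 0.0), features))
```

A production system would query road-network geometry rather than point distances, but the principle of position-keyed retrieval is the same.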
Figure 1(c) shows the same driving environment as given in Figure 1(a), with a very low visibility area 104 in the field of view 116. Figure 1(d) shows the same driving environment as given in Figure 1(a), with an extremely low visibility area 104 in the field of view. These illustrate that the driving system is arranged to work in differing levels of visibility.
Figure 2(a) shows an embodiment of the low visibility system 112a in greater detail. The low visibility system 112a comprises a controller 200, a visibility detection module 202, an object detection module 204, and an output module 208. The visibility detection module 202, object detection module 204 and output module 208 are each operatively connected to the controller 200.
The visibility detection module 202 is arranged to receive input from the visibility detector 108, where the input is visibility information relating to the low visibility area 104, and further to process the visibility information. The visibility detection module 202 is also arranged to analyse visibility information in order to calculate the visibility level of the low visibility area 104. The visibility level will be compared to a predetermined threshold, and if the visibility level is below the predetermined threshold, the visibility detection module 202 is arranged to determine display settings dependent on the determined visibility level, and initiate further processes of the system in reaction to the visibility level. The display settings are sent to the output module 208 to be outputted, as part of the control signal, to the display device 114. In an example of an embodiment of the invention, images are received by a camera on the vehicle 102 and are processed by the visibility detection module. The visibility level is determined to be extremely low, far below the predetermined threshold. The visibility detection module will therefore output high contrast and high brightness display settings.
The visibility detector may provide intensity values for each pixel in an array of pixels corresponding to an imaged scene, for example a scene ahead of the vehicle. Therefore, an array of intensity values can be determined for an imaged scene. From the array of intensity values, different visibility parameters may be determined. The data forming the array of intensity values may be represented as a histogram. From analysis of the array of intensity values, a mean luminance value can be obtained. A low mean luminance may indicate low light level conditions. A high mean luminance value may indicate high light level conditions.
From the data forming the array of intensity values, and/or from the histogram, the spread of intensity values can be used to indicate contrast levels in the array of pixels. A small spread of intensity values may indicate low contrast conditions. A large spread of intensity values may indicate high contrast conditions.
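The two statistics described above can be sketched in a few lines of Python. This is a hypothetical illustration only; intensity values are taken as 8-bit greyscale, and the example frame is invented:

```python
from statistics import mean, stdev

def luminance_stats(intensities):
    """Mean luminance and spread (standard deviation) of an array of
    8-bit intensity values (0-255) from an imaged scene."""
    return mean(intensities), stdev(intensities)

# A washed-out scene: high mean luminance, small spread (low contrast).
washed_out = [200, 205, 210, 202, 208, 206, 204, 207]
mu, spread = luminance_stats(washed_out)
print(mu > 180 and spread < 10)  # True
```

Standard deviation is used here as one convenient measure of the spread of the histogram; other dispersion measures (interquartile range, min-max range) would serve the same purpose.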
Conditions indicative of night-time, or other low light level conditions, such as may be experienced when a vehicle enters a tunnel or other space sheltered from external lighting, may be indicated by a low mean luminance and a small spread of intensity values.
Conditions indicative of the occurrence of one or more of mist, fog, snow, drizzle, rain, smoke, dust, or other visual obscurant, may be indicated by a high mean luminance and a small spread of intensity values. However, if the visual obscurant is encountered during low light level conditions, such as at night-time, then a low mean luminance and a small spread of intensity values may be observed.
In night-time conditions, without the presence of a visual obscurant, the driver of the vehicle may operate lighting, such as forward lighting, of the vehicle, in order to illuminate the scene around, and in particular in front of, the vehicle. By doing so, the mean luminance level may be increased, whilst the spread of intensity values is also increased.
When the vehicle is operated in conditions of fog, or other visual obscurant, then the operation of vehicle lighting to illuminate the scene may increase the mean luminance level but not provide a corresponding increase in the spread of intensity values. In some conditions, the spread of intensity values may decrease when the vehicle lighting is operated in conditions of fog, or other visual obscurant. Therefore it may be observed that the reduction in contrast, that is, the reduction in the spread of intensity values in the histogram of the array of intensity values, independent of the mean luminance value, is what increases the difficulty of the driving conditions for the vehicle driver. Operating vehicle lighting may be detrimental to the observed contrast. The present invention provides for enhancing object detection when the detected spread of intensity values in the histogram of the array of intensity values is low. This is identified herein as a low visibility level, and may be independent of the mean luminance level. The enhancement of object detection may be dependent on the spread of intensity values in the histogram of the array of intensity values. The visibility level is a measure of the spread of intensity values in a histogram of an array of intensity values from a visibility detector.
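The way the two histogram statistics combine to distinguish the conditions discussed above can be sketched as follows. The threshold values are illustrative assumptions, not values from the specification:

```python
def classify_conditions(mean_luminance, spread,
                        spread_threshold=30.0, luminance_threshold=100.0):
    """Combine mean luminance and intensity spread as described above.
    The visibility level is the spread of intensity values, taken
    independently of mean luminance; both thresholds are illustrative."""
    if spread >= spread_threshold:
        # Large spread of intensity values: adequate contrast.
        return "normal visibility"
    if mean_luminance >= luminance_threshold:
        # Bright but low-contrast scene: washed out by an obscurant.
        return "visual obscurant (fog, mist, snow, ...)"
    # Dark and low-contrast: night-time/tunnel, or obscurant in low light.
    return "low light or obscurant in low light"

print(classify_conditions(205.0, 8.0))  # visual obscurant (fog, mist, snow, ...)
print(classify_conditions(20.0, 8.0))   # low light or obscurant in low light
```

Note that, consistent with the paragraph above, it is the small spread alone that triggers low-visibility handling; the mean luminance only refines the diagnosis.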
The object detection module 204 is arranged to receive object information from the object detector 110. The object detection module 204 is arranged to perform object recognition analysis on the object information received from the object detector 110 to determine if there is an object present in the low visibility area 104. If it is determined that there is an object in the low visibility area 104, the object detection module 204 is arranged to generate an object image based on the received object information. The object detection module 204 is arranged to send the generated object image to the output module 208 to be outputted, as part of the control signal, to the display device 114.
The object detection module 204 performs object recognition analysis based on the detected size and shape of the object 106. The objects 106 detectable by the object detection module 204 include, but are not limited to, vehicles, pedestrians, and highway furniture, such as, but not limited to, road signs, barriers and speed bumps.
In an embodiment of the invention, wherein the object detector 110 comprises a camera, the object detection module 204 is also arranged to process the images in a way that is dependent on the visibility level, such that the images clearly show any objects within the image.
In another embodiment of the invention, wherein the object detector 110 comprises a video camera, the object detection module 204 is also arranged to process video images in a way that is dependent on the visibility level, such that the video images clearly show any objects within the video stream. In this embodiment, the object detection module 204 is arranged to continually output the processed video stream to the output module 208 to be continually outputted to the display device 114, until such a time that the detected visibility level is above the predetermined threshold or no objects are detected within the low visibility area 104.
Except where otherwise described, the low visibility system 112b of Figure 2(b) is as described for the low visibility system 112a of Figure 2(a), but also comprises an object database 206. The object database 206 comprises image representations of object types. For example, image representations may be symbols of vehicles, pedestrians, and highway furniture. In this embodiment, the object detection module 204 is arranged to retrieve an image representation from the object database 206 of the determined object type.
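The retrieval step can be pictured as a simple keyed lookup. The database contents below are invented for illustration; a production object database would hold actual image data rather than symbol file names:

```python
# Hypothetical in-memory object database mapping object types to
# image representations (symbol file names, for illustration only).
OBJECT_DATABASE = {
    "vehicle": "symbol_vehicle.png",
    "pedestrian": "symbol_pedestrian.png",
    "road_sign": "symbol_road_sign.png",
    "barrier": "symbol_barrier.png",
    "speed_bump": "symbol_speed_bump.png",
}

def retrieve_representation(object_type):
    """Return the stored image representation for a determined object
    type, or None if the type is not in the database."""
    return OBJECT_DATABASE.get(object_type)

print(retrieve_representation("pedestrian"))  # symbol_pedestrian.png
```

The same lookup serves both the object detection module 204 and, in the embodiment of Figure 2(c), the GPS module 205.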
In another embodiment of the invention, the object database 206 is remote from the low visibility system 112b, and can be accessed wirelessly.
Figure 2(c) shows an embodiment of the low visibility system 112c according to the embodiment of the invention shown in Figure 1(b). The low visibility system 112c comprises a controller 200, a visibility detection module 202, an object detection module 204, an object database 206, a GPS module 205 and an output module 208. The visibility detection module 202, object detection module 204, object database 206, GPS module 205, and output module 208 are each operatively connected to the controller 200. Except where otherwise described, the low visibility system 112c of Figure 2(c) is as described for the low visibility system 112b of Figure 2(b), but includes a GPS module 205 that is arranged to receive and process data relevant to the current position of the GPS device 105. For example, the GPS module 205 may receive information about the position of the GPS device with respect to the road. As another example, the GPS module 205 may receive information about an approaching change in speed limit. The GPS module 205 processes the received data, and determines any object types to be displayed. The GPS module 205 is arranged to retrieve image representations of the object types from the object database 206. The GPS module is also arranged to send the image representations to the output module 208 to be outputted, as part of the control signal, to the display device 114. Each of the low visibility systems shown in Figures 2(a), 2(b) and 2(c) comprises an input, a processor as illustrated by box 210, and an output. It is noted that the processor may, depending on configuration, comprise a single processing module or a plurality of modules. Additionally, the input may comprise a single input or a number of inputs depending on the configuration of the system. Furthermore, in another embodiment of the invention, the output may comprise a number of outputs.
Figure 3 shows a process 300 according to an embodiment of the present invention carried out by the low visibility system 112a given in Figure 2(a). Visibility information is received from the visibility detector 108 in Step 302. The visibility information is analysed in Step 304, to determine the level of visibility of the field of view of the visibility detector 108. The visibility level is then compared to a predetermined threshold in Step 306, to confirm whether the field of view comprises a low visibility area 104 and further whether the visibility is low enough such that a display of an object image is required. If the visibility level is not lower than the predetermined threshold, the process 300 begins again at Step 302. However, if, following the analysis of Step 304, the visibility level is lower than the predetermined threshold, then display settings are determined, dependent on the visibility level, shown by Step 308. Then, object information is received from the object detector 110 in Step 310. The object information from the object detector 110 is analysed by the object detection module 204, such that any objects in the low visibility area 104 are detected. If there are no objects in the low visibility area 104, then the process returns to Step 310. However, if there is an object or objects detected in the low visibility area 104, then the object detection module 204 generates an object image based on the received object information, shown in Step 314. Then, shown by Step 316, the display settings and object image, which comprise a control signal, are outputted to the display device 114.
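The steps of process 300 can be sketched as a single pass of a control loop. The analysis and settings functions below are simplified stand-ins for the modules described above, and the threshold and scales are invented values:

```python
def analyse_visibility(intensities):
    """Step 304: visibility level taken as the spread (standard
    deviation) of the intensity values from the visibility detector."""
    mu = sum(intensities) / len(intensities)
    return (sum((v - mu) ** 2 for v in intensities) / len(intensities)) ** 0.5

def determine_display_settings(level, threshold):
    """Step 308: lower visibility gives higher brightness and contrast
    (expressed here on an illustrative 0..1 scale)."""
    severity = max(0.0, min(1.0, 1.0 - level / threshold))
    return {"brightness": severity, "contrast": severity}

def process_300(visibility_info, object_info, threshold=30.0):
    """One pass of process 300 (Steps 302 to 316). Returns the control
    signal for the display device, or None if no display is required."""
    level = analyse_visibility(visibility_info)               # Steps 302-304
    if level >= threshold:                                    # Step 306
        return None                                           # restart at Step 302
    settings = determine_display_settings(level, threshold)   # Step 308
    if not object_info:                                       # Step 310: no objects
        return None                                           # return to Step 310
    object_image = {"objects": list(object_info)}             # Step 314 (placeholder)
    return {"settings": settings, "image": object_image}      # Step 316

# A washed-out scene with a detected car yields a control signal.
print(process_300([200, 205, 210, 202], ["car"]) is not None)  # True
```

In the systems described, this pass would be repeated periodically, with the video-camera embodiment additionally streaming processed frames while the low-visibility condition persists.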
The low visibility system 112b of Figure 2(b) is configured to carry out a substantially similar process to process 300; however, Step 314, generating the object image, comprises the object detection module 204 determining an object type and retrieving an image representation of the object type from the object database 206.
The low visibility system 112c of Figure 2(c) is configured to carry out a substantially similar process to process 300; however, Step 314, generating the object image, comprises the object detection module 204 determining an object type and retrieving an image representation of the object type from the object database 206, and further the GPS module 205 determining any object types to be displayed and retrieving image representations from the object database 206.
Each of the low visibility systems 112a, 112b and 112c is arranged to repeat the process iteratively, after a predetermined time period, such that the system adapts the control signal dependent on any detected changes in the visibility level or objects in the low visibility area 104.
Figures 4(a), 4(b) and 4(c) illustrate the adaptive nature of the system. Figures 4(a) to (c) show the view of a driver 400 through a vehicle windscreen 402 with the adaptive display 404 in use. In Figures 4(a), 4(b), and 4(c) a car has been detected in front of the vehicle, such that a representation 406 is shown on the adaptive display 404.
Figure 4(a) shows low visibility conditions 408. Figure 4(b) shows very low visibility conditions 410. Figure 4(c) shows extremely low visibility conditions 412. Comparing the representation 406 in each of the Figures, it is illustrated that as visibility conditions worsen, the brightness and contrast of the representation 406 increase, dependent on the detected visibility conditions 408, 410, 412.
In an embodiment of the invention, the display settings are faded between iterative implementations of the process, or when no display is required, such that the display appears to the driver to transition smoothly in response to changing visibility levels, and the transitioning between the display settings dependent on visibility levels is not obvious to the driver. Therefore, it should appear to the driver as a continuously adapting display. Furthermore, fading between different display settings is arranged such that it is of minimum possible distraction to the driver.
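One simple way to realise such a fade is linear interpolation between the old and new display settings over a short transition. This is a hypothetical sketch; the specification does not prescribe a transition profile or step count:

```python
def fade_settings(old, new, steps=10):
    """Yield intermediate display settings that linearly interpolate
    from `old` to `new`, so the change is gradual to the driver."""
    for i in range(1, steps + 1):
        t = i / steps  # fraction of the transition completed
        yield {k: old[k] + t * (new[k] - old[k]) for k in old}

old = {"brightness": 0.4, "contrast": 0.5}
new = {"brightness": 0.9, "contrast": 0.8}
frames = list(fade_settings(old, new, steps=5))
print(len(frames))  # 5
print({k: round(v, 2) for k, v in frames[-1].items()})
```

A smoother ease-in/ease-out curve could be substituted for `t` if a linear ramp proved perceptible.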
In one embodiment of the invention, the low visibility system 112a, 112b, 112c can precisely detect the level of visibility so that it can characterise many different levels of visibility, such that the adaptive display settings are very sensitive to the level of visibility.
In another embodiment of the invention, the display has a limited number of display levels, such that the display only varies after a substantive change in the visibility level. A specific scenario is shown in Figures 5 and 6 to illustrate how the system could be employed. This scenario is non-limiting and is provided here for illustrative purposes. Figure 5 shows a vehicle 102 in two positions. Initially, the vehicle 102 is in position 500. After a period of time, the vehicle 102 travels to the second position 502. In the second position 502, there are low visibility conditions 504 within the field of view of the vehicle 102. The vehicle 102 comprises a visibility detector 108, an object detector 110, a global positioning system (GPS) device 105, a low visibility system 112c and a display device 114, where the display device 114 is a HUD. In this embodiment of the invention, the visibility detector 108 is a digital camera and the object detector 110 is a radar system with a digital output.
In the first position 500, the camera 108 sends images to the visibility detection module 202. The visibility detection module 202 processes the images and analyses the visibility information to determine contrast levels. In this position, the contrast levels are normal and therefore are within the predetermined threshold. Therefore, no other processes are required.
When the car has travelled to the second position 502, the camera 108 sends images to the visibility detection module 202. The visibility detection module 202 processes the images and analyses the data to determine visibility levels. In the second position 502, very low visibility levels are detected, and the visibility level is determined to be below the predetermined threshold.
The visibility detection module 202 analyses the visibility level and determines the corresponding display settings. This initiates the radar system 110 and the GPS device 105, which collect information about the vehicle surroundings. The object detection module 204 receives and processes object information from the radar system 110, and determines that there is an object 106 in the field of view 116, and further that the object type is a car. The object detection module 204 then retrieves an image representation of a car from the object database 206. The object detection module also determines the distance to the car 106 at the point the image was taken, in order to determine what size the object representation should be shown at. The GPS module 205 receives and processes GPS information from the GPS device 105 and determines that the vehicle is in the left-hand lane and additionally that the speed limit will change in the next 100 m from 70 mph (about 110 km/h) to 60 mph (about 95 km/h). The GPS module 205 therefore retrieves the image representations for a left-hand lane and for a 60 mph (or 95 km/h) sign from the object database 206. The visibility detection module 202 outputs the display settings to the output module 208, the object detection module 204 outputs the car representation to the output module 208, and the GPS module 205 outputs the left-hand lane and 60 mph (or 95 km/h) sign representations to the output module 208, all of which then comprise the control signal which is sent to the display device 114. The display device 114, which is a head-up display, receives the control signal from the output module 208 and displays the object representations at the display settings corresponding to the determined visibility level.
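The distance-dependent sizing of the representation can be pictured as a clamped inverse-distance scale factor. The reference distance below is an invented value; the specification does not define the scaling law:

```python
def representation_scale(distance_m, reference_distance_m=10.0):
    """Scale factor for drawing an object representation: objects at
    or nearer than the reference distance are drawn full size, and
    more distant objects shrink in inverse proportion to distance."""
    return min(1.0, reference_distance_m / max(distance_m, 1e-6))

# A car detected 40 m ahead is drawn at a quarter of full size.
print(representation_scale(40.0))  # 0.25
```

Inverse-distance scaling mimics perspective, so the displayed car appears at roughly the apparent size of the real, obscured one.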
Figure 6 provides an illustration of the view of a driver 600 through a vehicle windscreen 402 in the scenario described in Figure 5, when the vehicle is in the second position 502.
Many modifications may be made to the above examples without departing from the scope of the present invention as defined in the accompanying claims.

Claims

1. A method of controlling a display device to display an object image of an object, the object being in a low visibility area, the method comprising:
receiving visibility information relating to the low visibility area, from a visibility detector;
determining a visibility level from the received visibility information; and in the event that the visibility level is below a predetermined threshold;
determining display settings dependent on the determined visibility level;
receiving object information from an object detector;
generating an object image based on the received object information;
determining a first control signal to control the display device to display the object image of the object in dependence on the determined display settings; and
outputting the first control signal to the display device.
2. A method according to claim 1, wherein the display settings comprise any one or more of: brightness levels; contrast levels; colours; and opaqueness levels.
3. A method according to any of the preceding claims, wherein the display device comprises a head-up display.
4. A method according to claim 3, comprising outputting a second control signal, the first and second control signals being associated with different display settings wherein the display device is arranged to fade between the different display settings.
5. A method according to any of the preceding claims, wherein the visibility detector comprises a camera.
6. A method according to any of claims 1 to 4, wherein the visibility detector comprises a photometer arranged to measure light intensity.
7. A method according to any of claims 1 to 5, wherein the visibility detector comprises a photometer arranged to measure scattering of light.
8. A method according to any of the preceding claims, wherein the object detector comprises a camera.
9. A method according to claim 8, wherein the camera is a stereoscopic camera.
10. A method according to any of claims 1 to 7, wherein the object detector comprises a radar system.
11. A method according to any of the preceding claims, wherein generating an object image comprises determining an object type from the object information and receiving a representation of the object type from an object database.
12. A method according to any of the preceding claims, wherein the method comprises receiving GPS information from a GPS device, and determining an object type or object information from the GPS information to be displayed.
13. A method according to claims 1 to 8, wherein receiving object information comprises receiving a video image and generating an object image comprises generating a processed video image in dependence on the determined visibility level.
14. A method according to any of the preceding claims, comprising periodically determining the control signal to control the display device.
15. A non-transitory computer readable storage medium comprising computer readable instructions for a computer processor to carry out the method of any preceding claim.
16. A system for displaying an object image of an object, the object being in a low visibility area, the system comprising:
an input arranged to:
receive visibility information relating to the low visibility area from a visibility detector; and
receive object information from an object detector;
a processor arranged to:
determine a visibility level from the received visibility information; determine if the visibility level is below a predetermined threshold;
generate an object image based on the received object information; and determine a control signal to control the display device to display an object image of the object in dependence on the determined display settings;
an output arranged to output the control signal to the display device.
17. A system according to claim 16, wherein the input comprises an object detection module arranged to:
receive object information from an object detector; and
determine object types and information associated with the received information.
18. A system according to claim 16, wherein the system comprises a GPS module arranged to:
receive GPS information from a GPS device; and
determine object types and information associated with the received information.
19. A system according to claim 17 or claim 18, wherein the system comprises an object database arranged to:
store image representations of objects; and
receive object types; and
output the image representations to the output module.
20. A system according to claim 16, wherein the object information received is a video image, and wherein the processor is arranged to generate a processed video image in dependence on the determined visibility level.
21. A vehicle comprising the system of any of claims 16 to 20.
22. A vehicle substantially as described herein with reference to any one of accompanying Figures 1 a to 6.
PCT/EP2017/056965 2016-03-23 2017-03-23 Adaptive display for low visibility WO2017162812A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB1604936.3A GB201604936D0 (en) 2016-03-23 2016-03-23 Adaptive display for low visibility
GB1604936.3 2016-03-23

Publications (1)

Publication Number Publication Date
WO2017162812A1 true WO2017162812A1 (en) 2017-09-28

Family

ID=55968767

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/056965 WO2017162812A1 (en) 2016-03-23 2017-03-23 Adaptive display for low visibility

Country Status (2)

Country Link
GB (2) GB201604936D0 (en)
WO (1) WO2017162812A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107767682A (en) * 2017-12-18 2018-03-06 河南应用技术职业学院 A kind of guideboard automatic identifying method
CN111619343A (en) * 2019-02-28 2020-09-04 北京新能源汽车股份有限公司 Mode control method, system and equipment of head-up display and automobile

Families Citing this family (8)

Publication number Priority date Publication date Assignee Title
JP7027278B2 (en) * 2018-08-07 2022-03-01 本田技研工業株式会社 Display devices, display control methods, and programs
US20190141310A1 (en) * 2018-12-28 2019-05-09 Intel Corporation Real-time, three-dimensional vehicle display
DE102019202585A1 (en) * 2019-02-26 2020-08-27 Volkswagen Aktiengesellschaft Method for operating a driver information system in an ego vehicle and driver information system
DE102019202578A1 (en) 2019-02-26 2020-08-27 Volkswagen Aktiengesellschaft Method for operating a driver information system in an ego vehicle and driver information system
DE102019202587A1 (en) 2019-02-26 2020-08-27 Volkswagen Aktiengesellschaft Method for operating a driver information system in an ego vehicle and driver information system
DE102019202592A1 (en) 2019-02-26 2020-08-27 Volkswagen Aktiengesellschaft Method for operating a driver information system in an ego vehicle and driver information system
DE102019202581B4 (en) 2019-02-26 2021-09-02 Volkswagen Aktiengesellschaft Method for operating a driver information system in an ego vehicle and driver information system
DE102019202588A1 (en) 2019-02-26 2020-08-27 Volkswagen Aktiengesellschaft Method for operating a driver information system in an ego vehicle and driver information system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015037117A1 (en) * 2013-09-13 2015-03-19 日立マクセル株式会社 Information display system, and information display device

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
JP2002240659A (en) * 2001-02-14 2002-08-28 Nissan Motor Co Ltd Device for judging peripheral condition of vehicle
JP4351132B2 (en) * 2004-09-17 2009-10-28 本田技研工業株式会社 In-vehicle night vision system
JP2007158820A (en) * 2005-12-06 2007-06-21 Fujitsu Ten Ltd Photographing control device
US8098171B1 (en) * 2010-12-28 2012-01-17 GM Global Technology Operations LLC Traffic visibility in poor viewing conditions on full windshield head-up display
US9681062B2 (en) * 2011-09-26 2017-06-13 Magna Electronics Inc. Vehicle camera image quality improvement in poor visibility conditions by contrast amplification
WO2014027131A1 (en) * 2012-08-14 2014-02-20 Nokia Corporation Low light vision assistance
US9800794B2 (en) * 2013-06-03 2017-10-24 Magna Electronics Inc. Vehicle vision system with enhanced low light capabilities

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
WO2015037117A1 (en) * 2013-09-13 2015-03-19 日立マクセル株式会社 Information display system, and information display device
US20160082840A1 (en) * 2013-09-13 2016-03-24 Hitachi Maxell, Ltd. Information display system and information display device

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN107767682A (en) * 2017-12-18 2018-03-06 河南应用技术职业学院 A kind of guideboard automatic identifying method
CN111619343A (en) * 2019-02-28 2020-09-04 北京新能源汽车股份有限公司 Mode control method, system and equipment of head-up display and automobile
CN111619343B (en) * 2019-02-28 2022-02-25 北京新能源汽车股份有限公司 Mode control method, system and equipment of head-up display and automobile

Also Published As

Publication number Publication date
GB2550472A (en) 2017-11-22
GB201604936D0 (en) 2016-05-04
GB2550472B (en) 2019-10-16
GB201704594D0 (en) 2017-05-10


Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17713007

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17713007

Country of ref document: EP

Kind code of ref document: A1