GB2550472B - Adaptive display for low visibility - Google Patents

Publication number
GB2550472B
Authority
GB
United Kingdom
Prior art keywords
visibility
information
display
detector
image
Prior art date
Legal status
Active
Application number
GB1704594.9A
Other versions
GB201704594D0 (en)
GB2550472A (en)
Inventor
Paszkowicz Sebastian
Hardy Robert
Alexander George
Current Assignee
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Publication of GB201704594D0
Publication of GB2550472A
Application granted
Publication of GB2550472B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272 Means for inserting a foreground image in a background image, i.e. inlay, outlay
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/002 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to project the image of a two-dimensional display, such as an array of light emitting or modulating elements or a CRT
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Arrangement of adaptations of instruments
    • B60K35/23
    • B60K35/28
    • B60K35/81
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/71 Circuitry for evaluating the brightness variation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/44 Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/57 Control of contrast or brightness
    • B60K2360/177
    • B60K2360/179
    • B60K2360/21
    • B60K2360/334
    • B60K2360/349
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/107 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using stereoscopic cameras
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/20 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
    • B60R2300/205 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used using a head-up display
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/302 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with GPS information or vehicle data, e.g. vehicle speed, gyro, steering angle data
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/8053 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for bad weather conditions or night vision
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00 Aspects of the architecture of display systems
    • G09G2360/14 Detecting light within display terminals, e.g. using a single or a plurality of photosensors
    • G09G2360/144 Detecting light within display terminals, e.g. using a single or a plurality of photosensors, the light being ambient light
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00 Specific applications
    • G09G2380/10 Automotive applications

Description

ADAPTIVE DISPLAY FOR LOW VISIBILITY
TECHNICAL FIELD
The present disclosure relates to an adaptive display for low visibility, and particularly, but not exclusively, to an adaptive display for low visibility for a land vehicle. Aspects of the invention relate to an adaptive display, an adaptive display low visibility system and to a method of providing information to a driver through an adaptive display.
BACKGROUND
Driving is often an essential part of a modern lifestyle, and frequently people are required to drive despite low visibility conditions. However, low visibility conditions can have a considerable impact on road safety. It is of little surprise that as a driver’s visibility of the surroundings decreases, the risk of an accident occurring increases.
There are many factors which can cause low visibility conditions for a driver. One significant factor is weather such as fog, snow and rain, which are well known to result in dangerous driving conditions. As well as simply obscuring an object or hazard, such weather conditions can produce low contrast visibility between objects and their surroundings, a consequence of which is that the objects are perceived as being further away from the observer than they are in reality. Moreover, low contrast can result in a lack of clear external markers for a driver, and in this case it is easy for the driver to underestimate their speed and overestimate distances. These factors can contribute to higher accident rates.
Furthermore, low visibility conditions can also be the consequence of light intensity, as both not enough and too much light can be detrimental to the vision of a driver.
Therefore, it is of growing importance, especially as vehicle technology develops, to provide drivers with systems which can aid safer driving when there are adverse effects on visibility. In response to low visibility conditions, as well as other road safety considerations, vehicles are now often equipped with perception sensors, such as radar, lidar and CCD cameras, which can continuously monitor driving conditions. Driver assistance systems can analyse both inside and outside conditions in order to provide the driver with warnings, images or information, or initiate active systems such as automatically turning on lights.
Although many automatic systems have been developed in order to aid drivers, one system which can significantly aid driving is the Head-Up Display (HUD). HUDs are used to present a range of information to a driver by displaying it in their typical line of sight. Information may be projected onto a windscreen, or any clear transparent medium placed within the driver’s field of view. This allows a driver to receive and process information about the surrounding environment without diverting or shifting the driver’s focus and attention from their usual line of sight.
However, the information displayed by HUDs may also have the undesired effect of distracting the driver by providing inessential data. This is especially apparent in more difficult driving conditions, which particularly require the driver’s complete concentration.
The present invention has been devised to mitigate or overcome at least some of the above-mentioned problems.
SUMMARY OF THE INVENTION
According to an aspect of the present invention there is provided a method of controlling a display device to display an object image of an object, the object being in a low visibility area, the method comprising: receiving visibility information relating to the low visibility area from a visibility detector, wherein the visibility information comprises intensity values for each pixel in an array of pixels; determining a spread of intensity values; determining a visibility level based on the spread of intensity values; and, in the event that the visibility level is below a predetermined threshold: determining the display settings in dependence on the determined visibility level; receiving object information from an object detector; generating an object image based on the received object information; determining a control signal to control the display device to display an object image of the object in dependence on the determined display settings; and outputting the control signal to the display device.
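The claimed steps can be sketched in outline as follows. This is an illustrative interpretation only, since the patent does not prescribe an implementation: the `VISIBILITY_THRESHOLD` value, the use of the standard deviation as the measure of spread, and the mapping from visibility level to display settings are all assumptions made for the sake of example.

```python
from statistics import pstdev

VISIBILITY_THRESHOLD = 0.2  # hypothetical threshold on the normalised spread

def visibility_level(pixel_intensities):
    """Visibility level as the spread of 0-255 intensity values,
    normalised to the 0-1 range."""
    return pstdev(pixel_intensities) / 255.0

def determine_display_settings(pixel_intensities):
    """Return display settings when visibility is low, else None."""
    level = visibility_level(pixel_intensities)
    if level >= VISIBILITY_THRESHOLD:
        return None  # visibility adequate: nothing is displayed
    # Lower visibility level -> brighter, higher-contrast settings.
    scale = 1.0 - level / VISIBILITY_THRESHOLD
    return {"brightness": 0.5 + 0.5 * scale, "contrast": 0.5 + 0.5 * scale}
```

In the claimed method, a non-None result would then be combined with the generated object image into a control signal for the display device.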
The aforementioned method provides a display which adapts according to the determined visibility conditions in an area, the area being the field of view of the visibility detector. In an embodiment where the display device is located within the vehicle, such that a driver is presented with the adaptive display in their immediate field of view, the display adapts according to the visibility conditions in the field of view of the visibility detector, which is taken to be substantially the same as the field of view of the vehicle, or driver. In this embodiment, the driver is displayed road information when the visibility conditions are determined to be below the predetermined threshold, in a manner dependent on the visibility conditions. However, in the case that the visibility conditions are not below the predetermined threshold, the driver is not displayed any information.
Road information comprises one or more of, but is not limited to, surrounding objects such as vehicles, pedestrians and highway furniture, where highway furniture may relate to, but is not limited to, road signs, barriers and speed bumps. In some embodiments road information may also include road network related information. Road network related information comprises information associated with one or more of road networks, speed limits and road works.
This method provides the advantage that the driver is only presented with road information when it is necessary, namely, in low visibility conditions. This method is devised to be less distracting to the driver than previous driver information displays, and to aid road awareness. Moreover, the display adapts the display settings according to determined visibility conditions, optimising the visuality of the display whilst ensuring the display is not distracting to the driver.
Furthermore, the above describes a method of controlling a display device such that it automatically adapts to the visibility conditions without any user interaction required. In the presence of low visibility this method holds the advantage that information is presented to the user without any attention being redirected towards the operation of the display device. In an embodiment where the user is driving a vehicle it can be seen that this is advantageous as the driver can give their full attention to driving in the low visibility conditions, whilst at the same time being displayed useful information which may increase their awareness of the surroundings and consequently their safety.
Optionally, the visibility level being below a predetermined threshold may be indicative of the occurrence of one or more of mist, fog, snow, drizzle, rain, smoke, dust or other visual obscurant.
Optionally, the display settings comprise any one or more of: brightness levels; contrast levels; colours; and opaqueness levels.
Optionally the display device comprises a head-up display. Optionally, the method comprises outputting a first control signal and subsequently outputting a second control signal, the first and second control signals being associated with different display settings, wherein the display device is arranged to fade between the different display settings.
Optionally the visibility detector comprises a camera.
Optionally the visibility detector comprises a photometer arranged to measure light intensity.
Optionally the visibility detector comprises a photometer arranged to measure scattering of light.
Optionally the object detector comprises a camera. Furthermore, optionally the camera comprises a stereoscopic camera.
Optionally the object detector comprises a radar system.
In certain embodiments generating an object image comprises determining an object type from the object information and receiving a representation of the object type from an object database.
Optionally the method also comprises receiving GPS information from a GPS device, where the GPS information comprises location information regarding the current position of the GPS device and may also comprise road network related information.
Optionally, the method comprises determining object types or object information, from the GPS information, to be displayed.
Optionally receiving object information comprises receiving a video image and generating an object image comprises generating a processed video image in dependence on the determined visibility level.
Optionally the method comprises periodically determining the control signal to control the display device.
According to an aspect of the invention there is provided a computer storage medium comprising computer-readable instructions for a computer to carry out the aforementioned method.
According to another aspect of the invention there is provided a non-transitory computer-readable storage medium storing executable computer program instructions to implement the aforementioned method.
According to a further aspect of the present invention there is provided a system for displaying an object image of an object, the object being in a low visibility area, the system comprising: an input arranged to: receive visibility information relating to the low visibility area from a visibility detector, wherein the visibility information comprises intensity values for each pixel in an array of pixels; and receive object information from an object detector; a processor arranged to: determine a spread of intensity values; determine a visibility level based on the spread of intensity values; determine if the visibility level is below a predetermined threshold; and, in the event that the visibility level is below a predetermined threshold: determine display settings dependent on the determined visibility level; generate an object image based on the received object information; and determine a control signal to control the display device to display an object image of the object in dependence on the determined display settings; and an output arranged to output the control signal to the display device.
The visibility detector may provide a visibility level in the form of a contrast value. The contrast value may be determined from an array of intensity values corresponding to an imaged scene. The imaged scene may comprise the low visibility area. The contrast value may be a measure of the spread of intensity values in the array of intensity values.
Optionally the input comprises an object detection module arranged to receive a video image from the object detector and to generate an object image, comprising generating a processed video image in dependence on the determined visibility level.
Optionally the system comprises an object database arranged to: store image representations of object types; receive object types from the object detection module; and output the image representations to the object detection module.
Optionally, the system comprises a GPS module arranged to: receive road network related information and the current position of the GPS device; determine object types and information associated with the received information; retrieve object images of the determined object types and information; and output the image representations.
An aspect of the invention provides a vehicle comprising the system of the above further aspect of the present invention.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1(a) is a schematic top view of a driving system with low visibility in the field of view of a vehicle, the vehicle comprising a low visibility system according to an embodiment of the present invention;
Figure 1(b) is a schematic top view of a driving system with low visibility in the field of view of a vehicle, the vehicle comprising a low visibility system including a global positioning system (GPS) device, according to another embodiment of the present invention;
Figure 1(c) is a schematic top view of a driving system with very low visibility in the field of view of a vehicle, the vehicle comprising a low visibility system according to an embodiment of the present invention, illustrating very low visibility;
Figure 1(d) is a schematic top view of a driving system with extremely low visibility in the field of view of a vehicle, the vehicle comprising a low visibility system according to an embodiment of the present invention, illustrating extremely low visibility;
Figure 2(a) is a schematic block diagram of the low visibility system of Figure 1(a);
Figure 2(b) is a schematic block diagram of the low visibility system of Figure 1(b);
Figure 3 is a flowchart of a process carried out by the low visibility system of Figure 2(a);
Figure 4(a) is a schematic view illustrating the point of view of a driver through a vehicle windscreen, with a display according to an embodiment of the present invention, illustrating low visibility;
Figure 4(b) is a schematic view illustrating the point of view of a driver through a vehicle windscreen, with a display according to an embodiment of the present invention, illustrating very low visibility;
Figure 4(c) is a schematic view illustrating the point of view of a driver through a vehicle windscreen, with a display according to an embodiment of the present invention, illustrating extremely low visibility;
Figure 5 is a schematic top view of a driver system showing a specific scenario, illustrating how the driving system of Figure 1(b) could be employed; and
Figure 6 is a schematic view illustrating the point of view of a driver through a vehicle windscreen corresponding to the specific scenario shown in Figure 5.
DETAILED DESCRIPTION
Figure 1(a) shows a driving environment 100 comprising a vehicle 102, with a low visibility area 104, and an object 106. The object 106 is located within the low visibility area 104. The vehicle 102 comprises a visibility detector 108, an object detector 110, a low visibility system 112a, 112b and a display device 114.
The visibility detector 108, object detector 110 and display device 114 are each operatively connected to the low visibility system 112a, 112b. In other embodiments the visibility detector 108, object detector 110 and display device 114 are wirelessly connected to the low visibility system 112a, 112b.
The display device 114 is arranged to adapt its brightness and/or contrast levels and/or colours and/or opaqueness, which together comprise the display settings, in dependence on the detected visibility conditions. Moreover, in an embodiment, the display device 114 is arranged to fade between display settings as the display settings adapt to the detected visibility conditions. In an embodiment, the display device 114 is a Head-Up Display (HUD).
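One way of visualising the fade between display settings is a linear cross-fade between two setting dictionaries, as in the sketch below. The function name, step count and linear interpolation are all assumptions; the patent does not prescribe a fade curve.

```python
def fade_settings(old, new, steps=10):
    """Yield intermediate display-setting dicts between `old` and `new`,
    moving linearly from just after `old` to exactly `new`."""
    for i in range(1, steps + 1):
        t = i / steps  # interpolation fraction in (0, 1]
        yield {key: old[key] + (new[key] - old[key]) * t for key in old}
```

Feeding each yielded dict to the display in turn would produce a smooth transition rather than an abrupt change of settings.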
The visibility detector 108 is a sensor which has a field of view depicted by the lines 116, which is substantially the same as the field of view of the object detector 110 and of the driver. The low visibility area 104 and object 106 are ahead of the vehicle 102 and in the field of view 116 of the visibility detector 108 and the object detector 110.
In an embodiment of the invention, the visibility detector 108 is a camera, arranged to capture images of the external environment of the vehicle 102 at between 24 and 240 frames per second. In an aspect of the embodiment, the camera is a digital camera and arranged to output electronic data.
In another embodiment the visibility detector 108 is a photometer, arranged to measure the light intensity of the external environment. In yet another embodiment of the invention, the visibility detector 108 is a photometer, arranged to measure the scattering of light in the external environment. In an aspect of the embodiment, the photometer is digital and arranged to output electronic data.
In an embodiment of the invention, the object detector 110 may comprise a radar system. The radar system includes an emitter and a receiver. The emitter emits radio waves which are projected to scan all directions within the vehicle’s field of view 116. The receiver detects any reflections from the emitted waves and a filter is used to discriminate between those emitted waves and waves due to noise.
In another embodiment, the object detector 110 may comprise a stereoscopic camera. The stereoscopic camera includes two lenses in order to provide a degree of depth perception for obtaining three-dimensional representations of any objects 106 detected in the field of view 116. In an aspect of the embodiment, the stereoscopic camera is digital and arranged to output electronic data.
In another embodiment, the object detector 110 may comprise a camera arranged to capture images of the external environment of the vehicle 102 at between 24 and 240 frames per second. In yet another embodiment, the object detector 110 may comprise a video camera arranged to capture video images.
Data from the visibility detector 108 and the object detector 110 is processed by the low visibility system 112a, 112b, which outputs control signals in order to display object images on the display device 114.
Figure 1(b) shows another embodiment of the invention, illustrating a driving environment 101 comprising a vehicle 102, with a low visibility area 104 in the field of view 116, and an object 106. The object 106 is located within the low visibility area 104. The vehicle 102 comprises a visibility detector 108, an object detector 110, a global positioning system (GPS) device 105, a low visibility system 112c and a display device 114. In an embodiment of the invention, the display device 114 is a head-up display (HUD).
The features of the driving environment 101 are substantially the same as those previously described in the driving environment 100 of Figure 1(a), excluding the GPS device 105. The GPS device 105 is operatively connected to a low visibility system 112c, and is arranged to determine its current position, which will be substantially the same as the current position of the vehicle 102. Furthermore, the GPS device 105 is also arranged to include, or remotely access, a database of maps, which include road networks and other information, such as road works and speed limits. The GPS device 105 is arranged to search the database of maps with respect to its determined position and consequently to output data relevant to the immediate and/or approaching external environment of the vehicle to the low visibility system 112c.
Figure 1(c) shows the same driving environment as given in Figure 1(a), with a very low visibility area 104 in the field of view 116. Figure 1(d) shows the same driving environment as given in Figure 1(a), with an extremely low visibility area 104 in the field of view. These illustrate that the driving system is arranged to work in differing levels of visibility.
Figure 2(a) shows an embodiment of the low visibility system 112a in greater detail. The low visibility system 112a comprises a controller 200, a visibility detection module 202, an object detection module 204, and an output module 208. The visibility detection module 202, object detection module 204 and output module 208 are each operatively connected to the controller 200.
The visibility detection module 202 is arranged to receive input from the visibility detector 108, where the input is visibility information relating to the low visibility area 104, and further to process the visibility information. The visibility detection module 202 is also arranged to analyse the visibility information in order to calculate the visibility level of the low visibility area 104. The visibility level will be compared to a predetermined threshold, and if the visibility level is below the predetermined threshold, the visibility detection module 202 is arranged to determine display settings dependent on the determined visibility level, and to initiate further processes of the system in reaction to the visibility level. The display settings are sent to the output module 208 to be outputted, as part of the control signal, to the display device 114. In an example of an embodiment of the invention, images are received from a camera on the vehicle 102 and are processed by the visibility detection module. The visibility level is determined to be extremely low, far below the predetermined threshold. The visibility detection module will therefore output high contrast and high brightness display settings.
The visibility detector may provide intensity values for each pixel in an array of pixels corresponding to an imaged scene, for example a scene ahead of the vehicle. Therefore, an array of intensity values can be determined for an imaged scene. From the array of intensity values, different visibility parameters may be determined.
The data forming the array of intensity values may be represented as a histogram. From analysis of the array of intensity values, a mean luminance value can be obtained. A low mean luminance value may indicate low light level conditions. A high mean luminance value may indicate high light level conditions.
From the data forming the array of intensity values, and/or from the histogram, the spread of intensity values can be used to indicate contrast levels in the array of pixels. A small spread of intensity values may indicate low contrast conditions. A large spread of intensity values may indicate high contrast conditions.
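The mean luminance, spread and histogram described above can be computed directly from the pixel array. The sketch below uses the population standard deviation as the spread measure, which is one plausible choice among several (an inter-percentile range would be another); the function name is illustrative only.

```python
from statistics import mean, pstdev

def luminance_stats(pixels):
    """Return (mean luminance, spread, 256-bin histogram) for an
    iterable of 0-255 intensity values from the visibility detector."""
    pixels = list(pixels)
    histogram = [0] * 256
    for p in pixels:
        histogram[p] += 1
    return mean(pixels), pstdev(pixels), histogram
```

A uniform grey frame, as might be produced by dense fog filling the scene, would give a high mean luminance but a spread of zero.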
Conditions indicative of night-time, or other low light level conditions, such as may be experienced when a vehicle enters a tunnel or other space sheltered from external lighting, may be indicated by a low mean luminance and a small spread of intensity values.
Conditions indicative of the occurrence of one or more of mist, fog, snow, drizzle, rain, smoke, dust, or other visual obscurant, may be indicated by a high mean luminance and a small spread of intensity values. However, if the visual obscurant is encountered during low light level conditions, such as at night-time, then a low mean luminance and a small spread of intensity values may be observed.
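The combinations of mean luminance and spread described in the two paragraphs above can be summarised as a small decision rule. The numeric thresholds below are invented purely for illustration; the patent itself relies only on the spread for the visibility level.

```python
LOW_MEAN = 60      # hypothetical boundary on a 0-255 luminance scale
SMALL_SPREAD = 20  # hypothetical boundary on the intensity spread

def classify_conditions(mean_luminance, spread):
    """Map (mean luminance, spread) to the condition classes described above."""
    if spread >= SMALL_SPREAD:
        return "adequate contrast"
    if mean_luminance < LOW_MEAN:
        # Low mean + small spread: night-time/tunnel, or an obscurant at night.
        return "night-time or obscurant at night"
    # High mean + small spread: mist, fog, snow, drizzle, rain, smoke, dust.
    return "visual obscurant"
```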
In night-time conditions, without the presence of a visual obscurant, the driver of thevehicle may operate lighting, such as forward lighting, of the vehicle, in order toilluminate the scene around, and in particular in front of, the vehicle. By doing so, themean luminance level may be increased, whilst the spread of intensity values is alsoincreased.
When the vehicle is operated in conditions of fog, or other visual obscurant, then the operation of vehicle lighting to illuminate the scene may increase the mean luminance level but not provide a corresponding increase in the spread of intensity values. In some conditions, the spread of intensity values may decrease when the vehicle lighting is operated in conditions of fog, or other visual obscurant.
Therefore it may be observed that it is the reduction in contrast, that is, the reduction in the spread of intensity values in the histogram of the array of intensity values, independent of the mean luminance value, that gives rise to the increased difficulty in the driving conditions for the vehicle driver. Operating vehicle lighting may be detrimental to the observed contrast.
The present invention provides for enhancing object detection when the detected spread of intensity values in the histogram of the array of intensity values is low. This is identified herein as a low visibility level, and may be independent of the mean luminance level. The enhancement of object detection may be dependent on the spread of intensity values in the histogram of the array of intensity values. The visibility level is a measure of the spread of intensity values in a histogram of an array of intensity values from a visibility detector.
The object detection module 204 is arranged to receive object information from the object detector 110. The object detection module 204 is arranged to perform object recognition analysis on the object information received from the object detector 110 to determine if there is an object present in the low visibility area 104. If it is determined that there is an object in the low visibility area 104, the object detection module 204 is arranged to generate an object image based on the received object information. The object detection module 204 is arranged to send the generated object image to the output module 208 to be outputted, as part of the control signal, to the display device 114.
The object detection module 204 performs object recognition analysis based on the detected size and shape of the object 106. The objects 106 detectable by the object detection module 204 include, but are not limited to, vehicles, pedestrians, and highway furniture, such as, but not limited to, road signs, barriers and speed bumps.
In an embodiment of the invention, wherein the object detector 110 comprises a camera, the object detection module 204 is also arranged to process the images in a way that is dependent on the visibility level, such that the images clearly show any objects within the image.
In another embodiment of the invention, wherein the object detector 110 comprises a video camera, the object detection module 204 is also arranged to process video images in a way that is dependent on the visibility level, such that the video images clearly show any objects within the video stream. In this embodiment, the object detection module 204 is arranged to continually output the processed video stream to the output module 208 to be continually outputted to the display device 114, until such a time that the detected visibility level is above the predetermined threshold or no objects are detected within the low visibility area 104.
In addition to the features described for the low visibility system 112a of Figure 2(a), the embodiment of the low visibility system 112b of Figure 2(b) also comprises an object database 206. The object database 206 comprises image representations of object types. For example, image representations may be symbols of vehicles, pedestrians, and highway furniture. In this embodiment, the object detection module 204 is arranged to retrieve an image representation of the determined object type from the object database 206.
In another embodiment of the invention, the object database 206 is remote from the low visibility system 112b, and can be accessed wirelessly.
Figure 2(c) shows an embodiment of the low visibility system 112c according to the embodiment of the invention shown in Figure 1(c). The low visibility system 112c comprises a controller 200, a visibility detection module 202, an object detection module 204, an object database 206, a GPS module 205 and an output module 208. The visibility detection module 202, object detection module 204, object database 206, GPS module 205, and output module 208 are each operatively connected to the controller 200.
In addition to the features described for the low visibility system 112b of Figure 2(b), the embodiment of the low visibility system 112c of Figure 2(c) includes a GPS module 205 that is arranged to receive and process data relevant to the current position of the GPS device 105. For example, the GPS module 205 may receive information about the position of the GPS device with respect to the road. As another example, the GPS module 205 may receive information about an approaching change in speed limit. The GPS module 205 processes the received data, and determines any object types to be displayed. The GPS module 205 is arranged to retrieve image representations of the object types from the object database 206. The GPS module is also arranged to send the image representations to the output module 208 to be outputted, as part of the control signal, to the display device 114.
Each of the low visibility systems shown in Figures 2(a), 2(b) and 2(c) comprises an input, a processor as illustrated by box 210, and an output. It is noted that the processor may, depending on configuration, comprise a single processing module or a plurality of modules. Additionally, the input may comprise a single input or a number of inputs depending on the configuration of the system. Furthermore, in another embodiment of the invention, the output may comprise a number of outputs.
Figure 3 shows a process 300 according to an embodiment of the present invention carried out by the low visibility system 112a given in Figure 2(a). Visibility information is received from the visibility detector 108 in Step 302. The visibility information is analysed in Step 304, to determine the level of visibility of the field of view of the visibility detector 108. The visibility level is then compared to a predetermined threshold in Step 306, to confirm whether the field of view comprises the low visibility area 104 and further whether the visibility is low enough such that a display of an object image is required. If the visibility level is not lower than the predetermined threshold, the process 300 begins again at Step 302. However, if, following the analysis of Step 304, the visibility level is lower than the predetermined threshold, then display settings are determined, dependent on the visibility level, shown by Step 308. Then, object information is received from the object detector 110 in Step 310. The object information from the object detector 110 is analysed by the object detection module 204, such that any objects in the low visibility area 104 are detected. If there are no objects in the low visibility area 104, then the process returns to Step 310. However, if there is an object or objects detected in the low visibility area 104, then the object detection module 204 generates an object image based on the received object information, shown in Step 314. Then, shown by Step 316, the display settings and object image, which comprise a control signal, are outputted to the display device 114.
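One pass of process 300 can be sketched as a single function. Everything below is an illustrative toy model: the spread measure, the settings mapping, and the dictionary-based object information are assumptions introduced for the sketch, not interfaces defined by the patent:

```python
def run_process_300(intensities, object_info, threshold):
    """One pass of process 300 (Figure 3). Returns the control
    signal as a (display settings, object image) pair, or None
    when no display is required."""
    level = spread(intensities)                     # Steps 302-304
    if level >= threshold:                          # Step 306
        return None                                 # visibility acceptable
    settings = {"brightness": 255 - int(level),     # Step 308: toy
                "contrast": 255 - int(level)}       # settings mapping
    if not object_info:                             # Steps 310-312
        return None                                 # nothing to show
    image = {"type": object_info["type"],           # Step 314
             "size": object_info["size"]}
    return settings, image                          # Step 316

def spread(values):
    """Toy spread measure: the range of intensity values."""
    return max(values) - min(values)

signal = run_process_300([100, 104, 102, 101],
                         {"type": "car", "size": 12}, threshold=40)
assert signal is not None  # low spread + detected object => display
```

In the real system the loop repeats after a predetermined period, so a pass that returns None simply means no overlay is produced on that iteration.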
The low visibility system 112b of Figure 2(b) is configured to carry out a substantially similar process to process 300, except that Step 314, generating the object image, comprises the object detection module 204 determining an object type and retrieving an image representation of the object type from the object database 206.
The low visibility system 112c of Figure 2(c) is configured to carry out a substantially similar process to process 300, except that Step 314, generating the object image, comprises the object detection module 204 determining an object type and retrieving an image representation of the object type from the object database 206, and further the GPS module 205 determining any object types to be displayed and retrieving image representations from the object database 206.
Each of the low visibility systems 112a, 112b and 112c is arranged to repeat the process iteratively, after a predetermined time period, such that the system adapts the control signal dependent on any detected changes in the visibility level or objects in the low visibility area 104.
Figures 4(a), 4(b) and 4(c) illustrate the adaptive nature of the system. Figures 4(a) to (c) show the view of a driver 400 through a vehicle windscreen 402 with the adaptive display 404 in use. In Figures 4(a), 4(b), and 4(c) a car has been detected in front of the vehicle, such that a representation 406 is shown on the adaptive display 404. Figure 4(a) shows low visibility conditions 408. Figure 4(b) shows very low visibility conditions 410. Figure 4(c) shows extremely low visibility conditions 412. Comparing the representation 406 in each of the Figures, it is illustrated that as visibility conditions worsen, the brightness and contrast of the representation 406 increase, dependent on the detected visibility conditions 408, 410, 412.
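The behaviour shown in Figures 4(a) to (c) amounts to a monotonic mapping from visibility level to display settings. The linear mapping and the 0-255 scales below are illustrative assumptions, not values from the patent:

```python
def display_settings_for(visibility_level, threshold=40):
    """Map a visibility level (spread of intensity values, lower
    meaning worse visibility) to brightness/contrast settings that
    increase as conditions worsen. Linear mapping is a sketch."""
    if visibility_level >= threshold:
        return None  # visibility acceptable: no overlay required
    # Worst case (spread 0) -> full strength 255;
    # just under the threshold -> near 0.
    strength = round(255 * (threshold - visibility_level) / threshold)
    return {"brightness": strength, "contrast": strength}

low = display_settings_for(30)      # low visibility, Figure 4(a)
extreme = display_settings_for(2)   # extremely low, Figure 4(c)
assert extreme["brightness"] > low["brightness"]
```

A production system might instead use a lookup table with a limited number of display levels, matching the embodiment described below in which the display only varies after a substantive change in the visibility level.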
In an embodiment of the invention, the display settings are faded between iterations of the process, or when no display is required, such that the display appears to the driver to transition smoothly in response to changing visibility levels, and the transitioning between the display settings dependent on visibility levels is not obvious to the driver. Therefore, it should appear to the driver as a continuously adapting display. Furthermore, fading between different display settings is arranged such that it causes the minimum possible distraction to the driver.
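Such fading can be sketched as linear interpolation between the previous and the new display settings. The step count and the settings-dictionary shape are assumptions made for illustration:

```python
def fade(old, new, steps=10):
    """Yield intermediate display settings that linearly
    interpolate from `old` to `new`, so the overlay transitions
    smoothly rather than jumping between settings."""
    for i in range(1, steps + 1):
        t = i / steps
        yield {k: round(old[k] + t * (new[k] - old[k])) for k in old}

frames = list(fade({"brightness": 60, "contrast": 60},
                   {"brightness": 240, "contrast": 240}))
assert frames[-1] == {"brightness": 240, "contrast": 240}
```

Fading towards all-zero settings gives the "no display required" case, letting the overlay vanish gradually instead of switching off abruptly.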
In one embodiment of the invention, the low visibility system 112a, 112b, 112c can detect the level of visibility precisely enough to characterise many different levels of visibility, such that the adaptive display settings are very sensitive to the level of visibility.
In another embodiment of the invention, the display has a limited number of display levels, such that the display only varies after a substantive change in the visibility level.

A specific scenario is shown in Figures 5 and 6 to illustrate how the system could be employed. This scenario is non-limiting and is provided here for illustrative purposes.
Figure 5 shows a vehicle 102 in two positions. Initially, the vehicle 102 is in position 500. After a period of time, the vehicle 102 travels to the second position 502. In the second position 502, there are low visibility conditions 504 within the field of view of the vehicle 102. The vehicle 102 comprises a visibility detector 108, an object detector 110, a global positioning system (GPS) device 105, a low visibility system 112c and a display device 114, where the display device 114 is a HUD. In this embodiment of the invention, the visibility detector 108 is a digital camera and the object detector 110 is a radar system with a digital output.
In the first position 500, the camera 108 sends images to the visibility detection module 202. The visibility detection module 202 processes the images and analyses the visibility information to determine contrast levels. In this position, the contrast levels are normal and therefore are within the predetermined threshold. Therefore, no other processes are required.
When the car has travelled to the second position 502, the camera 108 sends images to the visibility detection module 202. The visibility detection module 202 processes the images and analyses the data to determine visibility levels. In the second position 502, very low visibility levels are detected, and the visibility level is determined to be below the predetermined threshold.
The visibility detection module 202 analyses the visibility level and determines the corresponding display settings. This initiates the radar system 110 and the GPS device 105, which collect information about the vehicle surroundings. The object detection module 204 receives and processes object information from the radar system 110, and determines that there is an object 106 in the field of view 116, and further that the object type is a car. The object detection module 204 then retrieves an image representation of a car from the object database 206. The object detection module also determines the distance to the car 106 at the point the image was taken, in order to determine what size the object representation should be shown at. The GPS module 205 receives and processes GPS information from the GPS device 105 and determines that the vehicle is in the left-hand lane and additionally that the speed limit will change in the next 100 m from 70 mph (about 110 km/h) to 60 mph (about 95 km/h). The GPS module 205 therefore retrieves the image representations for a left-hand lane and for a 60 mph (or 95 km/h) sign from the object database 206. The visibility detection module 202 outputs the display settings to the output module 208, the object detection module 204 outputs the car representation to the output module 208, and the GPS module 205 outputs the left-hand lane and 60 mph (or 95 km/h) sign representations to the output module 208, all of which then comprise the control signal which is sent to the display device 114. The display device 114, which is a head-up display, receives the control signal from the output module 208 and displays the object representations at the display settings corresponding to the determined visibility level.
Figure 6 provides an illustration of the view of a driver 600 through a vehicle windscreen 402 in the scenario described in Figure 5, when the vehicle is in the second position 502.
Many modifications may be made to the above examples without departing from the scope of the present invention as defined in the accompanying claims.

Claims (22)

1. A method of controlling a display device to display an object image of an object, the object being in a low visibility area, the method comprising:
receiving visibility information relating to the low visibility area, from a visibility detector, wherein the visibility information comprises intensity values for each pixel in an array of pixels;
determining a spread of intensity values;
determining a visibility level based on the spread of intensity values; and
in the event that the visibility level is below a predetermined threshold:
determining display settings dependent on the determined visibility level;
receiving object information from an object detector;
generating an object image based on the received object information;
determining a first control signal to control the display device to display the object image of the object in dependence on the determined display settings; and
outputting the first control signal to the display device.
2. A method according to claim 1, wherein the visibility level being below a predetermined threshold is indicative of the occurrence of one or more of mist, fog, snow, drizzle, rain, smoke, dust or other visual obscurant.

3. A method according to claim 1 or claim 2, wherein the display settings comprise any one or more of: brightness levels; contrast levels; colours; and opaqueness levels.

4. A method according to any of the preceding claims, wherein the display device comprises a head-up display.

5. A method according to claim 4, comprising outputting a second control signal, the first and second control signals being associated with different display settings, wherein the display device is arranged to fade between the different display settings.

6. A method according to any of the preceding claims, wherein the visibility detector comprises a camera.

7. A method according to any of claims 1 to 5, wherein the visibility detector comprises a photometer arranged to measure light intensity.

8. A method according to any of claims 1 to 5, wherein the visibility detector comprises a photometer arranged to measure scattering of light.

9. A method according to any of the preceding claims, wherein the object detector comprises a camera.
10. A method according to claim 9, wherein the camera is a stereoscopic camera.
11. A method according to any of claims 1 to 8, wherein the object detector comprises a radar system.

12. A method according to any of the preceding claims, wherein generating an object image comprises determining an object type from the object information and receiving a representation of the object type from an object database.

13. A method according to any of the preceding claims, wherein the method comprises receiving GPS information from a GPS device, and determining an object type or object information from the GPS information to be displayed.

14. A method according to claims 1 to 9, wherein receiving object information comprises receiving a video image and generating an object image comprises generating a processed video image in dependence on the determined visibility level.

15. A method according to any of the preceding claims, comprising periodically determining the control signal to control the display device.

16. A non-transitory computer readable storage medium comprising computer readable instructions for a computer processor to carry out the method of any preceding claim.
17. A system for displaying an object image of an object, the object being in a low visibility area, the system comprising:
an input arranged to:
receive visibility information relating to the low visibility area from a visibility detector, wherein the visibility information comprises intensity values for each pixel in an array of pixels; and
receive object information from an object detector;
a processor arranged to:
determine a spread of intensity values;
determine a visibility level based on the spread of intensity values;
determine if the visibility level is below a predetermined threshold; and
in the event that the visibility level is below a predetermined threshold:
determine display settings dependent on the determined visibility level;
generate an object image based on the received object information; and
determine a control signal to control the display device to display an object image of the object in dependence on the determined display settings; and
an output arranged to output the control signal to the display device.
18. A system according to claim 17, wherein the input comprises an object detection module arranged to:
receive object information from an object detector; and
determine object types and information associated with the received information.

19. A system according to claim 17, wherein the system comprises a GPS module arranged to:
receive GPS information from a GPS device; and
determine object types and information associated with the received information.

20. A system according to claim 18 or claim 19, wherein the system comprises an object database arranged to:
store image representations of objects;
receive object types; and
output the image representations to the output module.

21. A system according to claim 17, wherein the object information received is a video image, and wherein the processor is arranged to generate a processed video image in dependence on the determined visibility level.
22. A vehicle comprising the system of any of the claims 17 to 21.
GB1704594.9A 2016-03-23 2017-03-23 Adaptive display for low visibility Active GB2550472B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB1604936.3A GB201604936D0 (en) 2016-03-23 2016-03-23 Adaptive display for low visibility

Publications (3)

Publication Number Publication Date
GB201704594D0 GB201704594D0 (en) 2017-05-10
GB2550472A GB2550472A (en) 2017-11-22
GB2550472B true GB2550472B (en) 2019-10-16

Family

ID=55968767

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB1604936.3A Ceased GB201604936D0 (en) 2016-03-23 2016-03-23 Adaptive display for low visibility
GB1704594.9A Active GB2550472B (en) 2016-03-23 2017-03-23 Adaptive display for low visibility

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB1604936.3A Ceased GB201604936D0 (en) 2016-03-23 2016-03-23 Adaptive display for low visibility

Country Status (2)

Country Link
GB (2) GB201604936D0 (en)
WO (1) WO2017162812A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107767682A (en) * 2017-12-18 2018-03-06 河南应用技术职业学院 A kind of guideboard automatic identifying method
JP7027278B2 (en) * 2018-08-07 2022-03-01 本田技研工業株式会社 Display devices, display control methods, and programs
US20190141310A1 (en) * 2018-12-28 2019-05-09 Intel Corporation Real-time, three-dimensional vehicle display
DE102019202585A1 (en) * 2019-02-26 2020-08-27 Volkswagen Aktiengesellschaft Method for operating a driver information system in an ego vehicle and driver information system
DE102019202581B4 (en) 2019-02-26 2021-09-02 Volkswagen Aktiengesellschaft Method for operating a driver information system in an ego vehicle and driver information system
DE102019202588A1 (en) 2019-02-26 2020-08-27 Volkswagen Aktiengesellschaft Method for operating a driver information system in an ego vehicle and driver information system
CN111619343B (en) * 2019-02-28 2022-02-25 北京新能源汽车股份有限公司 Mode control method, system and equipment of head-up display and automobile

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1233390A2 (en) * 2001-02-14 2002-08-21 Nissan Motor Co., Ltd. Vehicle obstacle warning system and method
US20060186347A1 (en) * 2004-09-17 2006-08-24 Honda Motor Co., Ltd. Vehicle night vision system
JP2007158820A (en) * 2005-12-06 2007-06-21 Fujitsu Ten Ltd Photographing control device
US8098171B1 (en) * 2010-12-28 2012-01-17 GM Global Technology Operations LLC Traffic visibility in poor viewing conditions on full windshield head-up display
WO2013048994A1 (en) * 2011-09-26 2013-04-04 Magna Electronics, Inc. Vehicle camera image quality improvement in poor visibility conditions by contrast amplification
US20140354811A1 (en) * 2013-06-03 2014-12-04 Magna Electronics Inc. Vehicle vision system with enhanced low light capabilities
US20150208004A1 (en) * 2012-08-14 2015-07-23 Nokia Corporation Low light vision assistance

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6346614B2 (en) * 2013-09-13 2018-06-20 マクセル株式会社 Information display system

Also Published As

Publication number Publication date
GB201704594D0 (en) 2017-05-10
WO2017162812A1 (en) 2017-09-28
GB2550472A (en) 2017-11-22
GB201604936D0 (en) 2016-05-04
