EP3671715A1 - Method and system for adjusting luminance profiles in head-mounted displays - Google Patents

Method and system for adjusting luminance profiles in head-mounted displays

Info

Publication number
EP3671715A1
EP3671715A1 (application number EP18275185.9A)
Authority
EP
European Patent Office
Prior art keywords
user
image
luminance
lighting
regard
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP18275185.9A
Other languages
German (de)
English (en)
Inventor
The designation of the inventor has not yet been filed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BAE Systems PLC
Original Assignee
BAE Systems PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BAE Systems PLC
Priority to EP18275185.9A
Priority to US17/429,704
Priority to PCT/GB2019/053597
Priority to CA3121740A
Priority to AU2019411520A
Priority to EP19828296.4A
Priority to GB1918682.4A
Publication of EP3671715A1
Legal status: Pending

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G 3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G 3/34 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G 3/3406 Control of illumination source
    • G09G 3/342 Control of illumination source using several illumination sources separately controlled corresponding to different display panel areas, e.g. along one dimension such as lines
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2320/00 Control of display operating conditions
    • G09G 2320/06 Adjustment of display parameters
    • G09G 2320/0626 Adjustment of display parameters for control of overall brightness
    • G09G 2320/0633 Adjustment of display parameters for control of overall brightness by amplitude modulation of the brightness of the illumination source
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2320/00 Control of display operating conditions
    • G09G 2320/06 Adjustment of display parameters
    • G09G 2320/0673 Adjustment of display parameters for control of gamma adjustment, e.g. selecting another gamma curve
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2320/00 Control of display operating conditions
    • G09G 2320/06 Adjustment of display parameters
    • G09G 2320/0686 Adjustment of display parameters with two or more screen areas displaying information with different brightness or colours
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2360/00 Aspects of the architecture of display systems
    • G09G 2360/14 Detecting light within display terminals, e.g. using a single or a plurality of photosensors
    • G09G 2360/144 Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light

Definitions

  • the present invention relates to improvements in or relating to luminance profile control in head-mounted display systems, particularly but not exclusively to those for use in aviation, ground vehicles or at a workstation.
  • HMDs: head-mounted displays
  • HWDs: head-worn displays
  • HMDs can aid users, for example vehicle operators, by displaying images to the user that contain useful symbology and other information.
  • the displayed information is intended to improve the decision-making and/or operation of the vehicle by the HMD user, and to reduce the workload of the user by providing relevant information to the user in an accessible manner.
  • symbology may include positional information displayed to the user at a predetermined position on the HMD, or may include a projection of a location of the user's destination overlaid onto the external environment.
  • the HMD may display images that provide labels to the user corresponding to instruments in the vehicle interior.
  • HMDs may be used at any time of day or in simulated environments. HMDs may also be used in different or changing environments.
  • conventional HMDs incorporate a photodetector, photodiode or other sensor configured to measure the instantaneous luminance level of the entire surrounding environment or an average luminance level of a specific, fixed region.
  • the HMDs are configured to alter the luminance level or intensity of the images displayed to the user according to the luminance level measured by the photodetector.
  • this assumes that the lighting condition of the environment is uniformly distributed.
  • images displayed to the user via their HMD at a general luminance level set according to the measurements of a photodetector may not be optimised for a particular lighting condition within the environment, and so may impair visibility for the user.
  • one object of the present invention is to overcome the problems of existing HMDs.
  • a method for adjusting a luminance profile of an image displayed on a display based on lighting conditions comprising: determining a field of regard of the user; determining one or more lighting conditions in the field of regard; adjusting the luminance profile for the image to accommodate the lighting conditions of the field of regard, wherein, as a result of the adjusted luminance profile, the image is reconciled with the field of regard; and transmitting the luminance profile for use in displaying the image to the user by the head-mounted display.
  • the luminance profile may be variable.
  • the luminance profile may be capable of including a plurality of different luminance levels.
  • the method may comprise, in response to determining that more than one lighting condition exists in the field of regard, determining a value of a lighting parameter for each distinct lighting condition.
  • the luminance profile may comprise at least one luminance level corresponding to each distinct lighting condition.
  • Adjusting the luminance profile for the image may comprise sacrificing grey shades to artificially adjust luminance levels corresponding to lighting conditions other than the lighting condition having the highest determined value for the lighting parameter.
  • Adjusting the luminance profile for the image may comprise adjusting the luminance level corresponding to the lighting condition having the highest determined value for the lighting parameter.
  • Adjusting the luminance profile may comprise adjusting luminance levels corresponding to individual light sources of an image generation unit.
  • An image may be reconciled if the variation between the luminance profile of the image and the lighting conditions is reduced.
  • the variation may be determined based on individual luminance levels.
  • the reduction may be quantified using a percentage threshold.
  • Adjusting the luminance profile for the image may comprise generating an image having an adjusted luminance profile.
  • Transmitting the luminance profile may comprise transmitting the image to the head-mounted display.
  • the lighting data source may comprise one selected from: a luminance sensor; a data store; an artificial lighting control system; or a data link.
  • the lighting data source may be a luminance sensor, and the luminance sensor may be a photodetector, light sensor, and/or a camera.
  • the method may comprise determining a spatial configuration of the user's head.
  • the field of regard may be determined based at least in part on the spatial configuration.
  • Determining the spatial configuration of the user's head may comprise determining the position and/or orientation of the user's head.
  • Determining the spatial configuration of the user's head may comprise determining the position and orientation of the user's head relative to a predetermined coordinate system.
  • Determining the spatial configuration of the user's head may comprise receiving data from a head-tracking system.
  • the method may comprise inferring, from the spatial configuration of the user's head, a visual field of the user.
  • the method may comprise comparing the visual field against a model environment.
  • a system for adjusting a luminance profile of an image displayed on a head-mounted display being worn by a user based on lighting conditions comprising a head-tracking system and a processor, the system being configured to carry out the method described above.
  • the system may comprise a head-mounted display configured to receive the transmitted luminance profile and display an adjusted image having the adjusted luminance profile.
  • the head-mounted display may comprise an optical combiner.
  • the system may comprise an image generation unit.
  • the optical combiner may present a generated image for viewing by a user against an outside scene.
  • the image generation unit may include one or more light sources.
  • the image generation unit may comprise an addressable illuminator.
  • the present invention relates to improvements in or relating to adjusting a luminance profile of an image displayed on a head-mounted display (HMD) being worn by a user based on lighting conditions.
  • HMD: head-mounted display
  • the present invention relates to a method that ensures that images displayed on an HMD worn by a user are reconciled with the lighting conditions of the surroundings against which they are viewed. Reconciliation of images and lighting conditions can be considered to be achieved if the luminance profile of the images is suitably altered so that the image content is discriminable to the user against the background lighting conditions and is harmonised with that background. As a result, the images are visible and the user's vision is not impaired by the light and environment conditions in which the user is located, for example in situations where the brightness of the conditions might otherwise impair the user's vision.
  • the user's field of regard is determined, and that field of regard is used to identify information relating to lighting conditions within that field of regard.
  • the field of regard indicates where the user is looking in relation to their surroundings, i.e. relative to local and/or global references.
  • a luminance profile for displaying images on the HMD can be generated such that the images are visible and are not perceived in a detrimental manner.
  • the images are subsequently transmitted for display to the user. This is particularly useful in situations where there is a high contrast ratio between two different regions of the background environment, such as the contrast between interior of an aircraft and the exterior environment at any time of the day.
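As an illustrative sketch only, the four method steps (determine the field of regard, determine its lighting conditions, adjust the luminance profile, transmit it) might be composed as follows. Every name, value, and function body here is a placeholder assumption, including the simple multiplicative brightness rule; none of it is the patented implementation:

```python
# Illustrative sketch of the four method steps; all names and the simple
# multiplicative brightness rule are assumptions, not the patented method.
def adjust_and_transmit(head_pose, lighting_lookup, target_contrast=1.2):
    field_of_regard = head_pose["gaze_target"]                 # step 1: field of regard (stubbed)
    background = lighting_lookup[field_of_regard]              # step 2: lighting condition, cd/m^2
    profile = {field_of_regard: background * target_contrast}  # step 3: adjust luminance profile
    return profile                                             # step 4: transmitted to the HMD

lighting = {"exterior": 25.0, "interior": 5.0}
adjust_and_transmit({"gaze_target": "exterior"}, lighting)  # -> {"exterior": 30.0}
```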
  • Figure 1 shows an exemplary representation of an HMD shown generally at 100.
  • the HMD 100 is capable of augmenting the user's environment with images capable of assisting the user.
  • the HMD 100 displays images to the user at a luminance profile that is adjusted relative to the lighting conditions of the environment against which the images are viewed.
  • the HMD is wearable by means of an appropriate support 102.
  • the support includes one or more optical elements 104 which can be viewed by one or both eyes of the user.
  • the HMD further includes a control system (not shown).
  • the HMD can be of any appropriate type including goggles, glasses, a helmet or helmet visor.
  • the device is portable or adapted to be portable by means of the support.
  • the support may include a support adapted to support the optical elements in front of the eye.
  • the support may include: frames; side arms and supports for goggles and glasses; a helmet or visor; a headband; a neck or shoulder worn support; a gaming headset; or any other support that could be worn to hold the optical elements in the desired position.
  • the control system is variable depending on the use of the HMD.
  • the control unit may be in situ or remote from the HMD.
  • the control device may include a communications module for communicating with the optical elements and with other modules either on the HMD or remote therefrom.
  • the communications may be wireless and/or wired.
  • the control module may include different modules for carrying out different functions. These functions are not limited in any way but may include imaging, tracking, scene generation, processing, storage, power supply, audio etc.
  • the one or more optical elements 104 may be any appropriate type, such as, for example, an eyepiece or waveguide. Although not shown in detail, the optical elements 104 include a substantially transparent display medium. The user is able to view the environment through the optical elements 104, as well as any image relayed to the eye of the user in use via the HMD.
  • the HMD is able to generate images in an image generation unit (not shown in Figure 1 ) and display images to the user in any suitable way, including projecting images to the user via the optical elements.
  • Images may be displayed statically on the HMD, so that the images are displayed to the user regardless of where they are looking, or statically relative to the user's environment, so that movements of the HMD and user's head are matched by a corresponding movement of the image.
  • Displaying images statically on the HMD is useful for symbology and particular operational information within the context of a vehicle environment. For example, symbology can comprise altitude, directional, or speed information that the user should be aware of at all times. Examples of images that may be displayed statically relative to the user's environment are landmark information of the external environment or labels for particular instruments within the interior environment.
  • FIG. 2 shows a block diagram of a system 200 according to an embodiment of the invention.
  • the system 200 includes the HMD 100.
  • the system 200 adjusts a luminance profile for displaying an image to the user on the HMD 100 according to the lighting conditions of the user's surroundings and particularly a field of regard lying within a visual field of the user.
  • the system 200 includes a central processing module 202 that gathers and analyses data from a plurality of sources and generates outputs for transmission to the connected HMD 100, such as that of Figure 1 .
  • the terms "transmission", "transmit" and "transmitting" are intended to encompass transmission to any part of the HMD.
  • the transmission may be from a control system of the HMD to the optical system or from an external system to the HMD more generally.
  • the connection between the HMD and processing module may be wired or wireless using an appropriate protocol.
  • the processing module may double as the control system of the HMD.
  • the HMD may have an individual control system.
  • the processing module 202 is in communication with devices, systems, or modules from which it is able to gather data in order to generate an output for transmission to the HMD. Shown in Figure 2 are a head-tracking system 204, a sensor array 206, a simulation module 210 (if the user is using a simulator), an interior environment control module 208 (or artificial lighting control system), a data store 212, and a data link 214 for connecting to an external data store (not shown). For clarity, other systems or modules from which the processing module 202 gathers data are not shown in Figure 2 . However, it will be appreciated that the content display to a user via the HMD may be generated by the processing module using numerous other sensing systems such as navigational modules, data stores, or information received via a communications network, as required.
  • the head-tracking system determines a spatial configuration of the HMD user's head.
  • the term spatial configuration is intended to encompass at least one of a position of the head and an orientation of the head, and may be a combination of the two components.
  • the position and/or orientation of the head may be determined relative to a predetermined coordinate system, such as a 3D model of the environment.
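As an illustration only (the patent does not prescribe any data format), a spatial configuration relative to a cockpit-fixed coordinate frame might be represented as a position plus an orientation:

```python
# Illustrative data format only; field names and units are assumptions.
from dataclasses import dataclass

@dataclass
class SpatialConfiguration:
    """Head position (metres) and orientation (radians) in a cockpit-fixed frame."""
    x: float
    y: float
    z: float
    yaw: float    # rotation about the vertical axis
    pitch: float  # rotation about the lateral axis

pose = SpatialConfiguration(x=0.0, y=0.0, z=1.1, yaw=0.3, pitch=-0.1)
```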
  • the head-tracking system 204 may be any head-tracking system suitable for tracking position and/or orientation of the user's head, directly or indirectly.
  • the head-tracking system may be an inertial measurement unit, or may utilise cameras and/or sensors to track fiducials, i.e. a camera mounted on the HMD to track fiducials within the interior, or a camera mounted within the interior to track fiducials on the HMD.
  • the head-tracking system may comprise optical sensors, acoustic sensors, electromagnetic sensors, accelerometers, or other means mounted in the HMD and/or the interior environment.
  • the head-tracking system may also incorporate a processing system to determine the spatial configuration from sensor data. As specific implementations of head-tracking systems are not the subject of this application and would be known to the skilled person, they will not be discussed in further detail here.
  • the head-tracking system 204 utilises vehicle orientation and/or position data from vehicle sensors to determine the position of the HMD in relation to a global reference.
  • the system includes data sources, such as the sensor array 206, the interior environment control system 208, the simulation module 210, or the data store 212.
  • the sensor array 206 comprises a light, or luminance, sensor.
  • the light sensor is configured to measure at least one parameter to determine lighting conditions.
  • the light sensor detects luminance level.
  • the light sensor is a photodetector or an auto-brilliance sensor or a camera.
  • the light sensor may be internally or externally mounted to the aircraft.
  • the sensor array may be partly or wholly shared with the head-tracking system, or may be a separate set of sensors.
  • the interior environment control system 208 provides further input to the processing module.
  • the interior control system may illuminate instruments within the interior environment.
  • the input received by the processing module from the interior control system is indicative of how the instruments are illuminated, e.g. to what intensity, luminance level, colour, and other parameters, and enable the processing module to determine lighting conditions for the interior environment, for use in adjusting the luminance profile of images viewed on the HMD by the user against the interior environment.
  • the processing module 202 connects with a simulation module 210.
  • the simulation module 210 provides information relating to the operation of the simulator for use in determining lighting conditions. For example, the relevant lighting information and luminance level or profile of projected simulator imagery may be provided by the simulation module 210.
  • Figure 3 shows a flow chart illustrating a method 300 that governs the operation of the system of Figure 2 .
  • it is assumed that the HMD is being worn by the user and is operational to display images to the user.
  • the spatial configuration of the user's head is determined 302. This is typically performed by the head-tracking system. As discussed above, position and/or rotation are considered to determine a spatial configuration.
  • the head-tracking system shares spatial configuration data with the processing system.
  • a visual field of regard of the user corresponding to the spatial configuration data is determined or inferred 304.
  • the direction of the user's gaze is determined based on the spatial configuration data, and an estimation of what is visible to the user is applied to the spatial configuration data. This may be performed by the head-tracking system and/or by the processing module.
  • the visual field of regard may be determined in a number of ways.
  • the spatial configuration of the user's head is mapped to a 3D model of the interior of the aircraft in which the user is located.
  • a visual cone is generated, inferring the visual field of regard of the user relative to the user's head, and this can be used within the 3D model.
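A minimal sketch of such a visual cone test, under assumed parameters (a 30-degree half-angle, and a yaw/pitch-only orientation); none of these names or values come from the patent:

```python
import math

def gaze_vector(yaw, pitch):
    """Unit gaze direction from head yaw/pitch (radians); assumed convention."""
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))

def in_visual_cone(gaze, point, apex=(0.0, 0.0, 0.0), half_angle=math.radians(30)):
    """True if `point` lies inside a cone of `half_angle` around the gaze vector."""
    v = tuple(p - a for p, a in zip(point, apex))
    norm = math.sqrt(sum(c * c for c in v))
    if norm == 0:
        return True  # the apex itself is trivially inside
    cos_angle = sum(g * c for g, c in zip(gaze, v)) / norm
    return cos_angle >= math.cos(half_angle)

g = gaze_vector(0.0, 0.0)           # looking straight ahead along +x
in_visual_cone(g, (2.0, 0.2, 0.0))  # slightly off-axis point -> True
in_visual_cone(g, (0.0, 2.0, 0.0))  # point 90 degrees off-axis -> False
```

Points of the 3D model that pass this test fall within the inferred field of regard.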
  • the visual field of regard may be variable.
  • the visual field of regard may vary depending on different factors, such as individual users, time of day, or direction of view.
  • the determined visual field of regard is used by the system to determine 306 a focal region of the user that falls within the visual field.
  • the system determines what the user is currently looking at in their surroundings, based on the position of their head.
  • the focal region may be determined based on a comparison of the visual field with a model environment, using the 3D model or as part of the predetermined coordinate system used for spatial configuration determination for example. By identifying where the visual field and the model environment interact or intersect, the focal region can be determined. In particular, by comparing the model and visual field, points of intersection can be identified, and the focal region can be determined based on these points.
  • the focal region may be considered to be a 2D representation of a portion of the 3D model environment.
  • the lighting conditions present within the focal region are determined.
  • the lighting conditions, typically the instantaneous luminance levels or profile present across the focal region, are determined based on data gathered from one or more data sources of the system 200.
  • these data sources comprise a sensor array 206, an interior environment control system 208, a simulation module 210 and a data store 212.
  • the system may compare the focal region with the model environment. Regions of the model environment may have associated indicators that can be used to determine the relevant data source from which to gather lighting condition data. By comparing the model and focal region, the relevant data sources can be selected for the focal region, so that data can be retrieved from the sources as required.
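The region indicators described above could be sketched as a simple lookup from model regions to data sources; all region names and source labels here are hypothetical:

```python
# Hypothetical mapping from model-environment regions to the lighting data
# source that quantifies each region; labels are illustrative only.
REGION_DATA_SOURCES = {
    "window": "external_luminance_sensor",    # sensor array 206
    "instrument_panel": "interior_control",   # interior environment control 208
    "overhead_panel": "interior_control",
    "simulator_screen": "simulation_module",  # simulation module 210
}

def sources_for_focal_region(regions):
    """Select the set of data sources needed to cover the focal region."""
    return {REGION_DATA_SOURCES[r] for r in regions if r in REGION_DATA_SOURCES}

sources_for_focal_region(["window", "instrument_panel"])
# -> {"external_luminance_sensor", "interior_control"}
```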
  • a luminance profile for one or more images on the HMD is adjusted 308 by the processing module 202.
  • the luminance profile may be variable and may be capable of including one or more luminance levels corresponding to one or more portions of the focal region.
  • the luminance profile is adjusted to accommodate the lighting conditions.
  • the luminance profile is adjusted to cause the image to be reconciled or harmonised with the environment against which it is viewed.
  • An image is considered to be reconciled if the variation or difference between the lighting of the image and the lighting conditions of the focal region is reduced, preferably to within a percentage threshold value.
  • reconciliation can be measured by comparison of luminance levels, and calculation of a contrast ratio.
  • a typical contrast ratio of 1.2:1 or greater is desirable to allow the display to be visible to the user.
  • the contrast ratio calculation can be adapted to take into account the transmission of and losses or reductions caused by optical elements of the HMD.
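One common see-through display model (an assumption on our part; the patent does not give a formula) treats the perceived contrast ratio as (tau * L_bg + L_display) / (tau * L_bg), where tau is the combiner transmission accounting for the losses mentioned above. The display luminance needed to reach a target ratio is then:

```python
# Assumed see-through model: CR = (tau * L_bg + L_disp) / (tau * L_bg),
# where tau is the combiner transmission (the loss term noted in the text).
def display_luminance_for_contrast(l_background, contrast_ratio=1.2, tau=0.8):
    """Display luminance (cd/m^2) needed to reach `contrast_ratio` against the background."""
    return tau * l_background * (contrast_ratio - 1.0)

display_luminance_for_contrast(100.0)  # 0.8 * 100 * 0.2, approximately 16.0
```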
  • the luminance profile is adjusted from a previous or nominal level based on the retrieved data that relates to the focal region.
  • the focal region has only one lighting condition.
  • the luminance profile comprises a single luminance level, and a single data source is required to retrieve data to adjust the image accordingly. Adjusting a single luminance level for an image or images on a HMD is effectively implemented by modulating a single light source or display source to present the images to a user via the optical elements of the HMD.
  • the focal region covers more than one area of the user environment and so more than one lighting condition may be present, the distinct lighting conditions being quantified by the same or different data sources. Therefore, the luminance profile will comprise more than one luminance level, and these luminance levels may differ across the image. The luminance levels will be determined to correspond at least to each distinct lighting condition, the lighting conditions being quantified using a value for a lighting parameter.
  • Luminance profiles having more than one luminance level may be achieved using an addressable region illuminator, addressable region display source, multiple light sources or multiple display sources within the image generation portion of the HMD, to present the images to a user via the optical elements of the HMD.
  • a multi-level luminance profile may be achieved using a single light source or display source as a projector and artificially adjusting the luminance levels for different regions by sacrificing available grey shades in the image.
  • a single global luminance level is set for the luminance profile that corresponds to a luminance level for the lighting condition that is the 'brightest', i.e. it has the highest value of the parameter used to quantify lighting conditions.
  • Grey shades are sacrificed to artificially adjust luminance levels for lighting conditions other than the brightest.
  • Sacrificing grey shades is achieved by altering the addressable grey shade range.
  • an addressable grey shade range may have a range of 0 to 255. In a grayscale image, 0 corresponds to black, and 255 corresponds to white.
  • the intermediate values are incremental grey shades.
  • the addressable grey shade range can be altered to have a different maximum value that corresponds to white in a black and white image. For example, the grey shade values in the ordinary range could be multiplied by 0.5 and rounded, to create a maximum grey shade value of 128, the range being between 0 and 128, where 0 corresponds to black and 128 to white in a grayscale image. New incremental shades between 0 and 128 are used. In setting a new maximum, the resultant image would be dimmer with fewer addressable grey shades.
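The grey shade sacrifice just described can be sketched as a simple rescaling of pixel values; the function name and the 8-bit range are taken from the worked example above, everything else is illustrative:

```python
def sacrifice_grey_shades(pixels, scale=0.5, full_range=255):
    """Dim a region by scaling its addressable grey shade range.

    With scale=0.5 the new white point becomes round(255 * 0.5) = 128, as in
    the example above: the region is displayed dimmer, at the cost of having
    fewer addressable grey shades between black and the new white.
    """
    new_max = round(full_range * scale)
    return [min(round(p * scale), new_max) for p in pixels]

sacrifice_grey_shades([0, 64, 128, 255])  # -> [0, 32, 64, 128]
```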
  • grey shades may be sacrificed in combination with an illuminator or projector comprising multiple light sources, or an addressable region illuminator to achieve the adjusted luminance profiles.
  • the adjusted luminance profile is transmitted 310, or otherwise output, from the processing module to the HMD 100.
  • the provision of the adjusted luminance profile to the HMD results in the HMD displaying images to the user that are reconciled with the lighting conditions against which they are being viewed.
  • the adjusted luminance profile may be combined with images to be displayed by the processing module, and the images with an adjusted luminance profile may be transmitted to the HMD.
  • the adjusted luminance profile is communicated to the HMD as a parameter setting intended to replace the previously set luminance profile.
  • the focal region may be determined based on the spatial configuration alone, with a plurality of spatial configurations being used for comparison, and a region output that corresponds with the spatial configuration.
  • the method 300 of Figure 3 is applied to determine the spatial configuration of a user's head and to determine whether the user's visual field corresponds to the user looking substantially towards the external environment, substantially towards the interior environment, or a combination of the two.
  • windows may therefore be used to designate that the region being viewed is the external environment. If required, a more precise determination may be made by dividing the interior and/or external environments into regions having their own specific luminance levels.
  • Figure 4 illustrates a user's head within a model environment.
  • the user's head 402, the interior environment 404, the external environment 406, the HMD 100, and the visual field 408 are shown.
  • the interior environment comprises an overhead portion 410 above the user's head 402 and a panel 412 in front of and below the user's head 402.
  • the user is shown viewing the external environment 406, as the visual field 408 intersects a window 414.
  • the lighting condition for the external environment is therefore pertinent.
  • if the user's visual field 408 were to be directed in the direction of arrows 418 and 416, i.e. generally towards the overhead portion 410 or the panel 412, the user would be considered to be viewing the interior 404, and the system 200 and method 300 would alter the luminance profile of the displayed images accordingly.
  • the operation of the system and the method varies according to the scenario in which they are employed.
  • the lighting conditions of the exterior environment differ from those of the interior environment as the external environment has a higher luminance level. That is to say, the external environment is perceived by the user as being brighter than the internal environment.
  • the external environment may have a luminance five times greater than that of the interior which may be dimly lit in comparison. Therefore, the luminance of the HMD will need to be altered dynamically such that the contrast ratio of the images presented to the user via the HMD is maintained as the user looks between different portions of their environment.
  • an adjusted luminance level will be generated and transmitted to the HMD to reduce or eradicate the glow, or other issues, by creating a more visually reconciled image where the luminance level is reduced compared to its initial level and the variation between background and image is not as great.
  • Images displayed at a first luminance level for visibility and minimised glow or artefacts against the interior environment would be adjusted according to an adjusted luminance level as part of a luminance profile by the system and method of the present invention if the user looked towards the exterior environment, increasing the luminance level of the images to a second luminance level that exceeds the first luminance level.
  • the interior is dimly lit by illuminated instruments, controlled by the interior environment control system, while the external environment may be less bright than the interior. Therefore, using a conventional system, displaying images to be visible against the illuminated interior would result in an unwanted background glow or artefacts when viewing images against the external environment. This may hamper the user's perception of the external environment.
  • the interior may be lit at a luminance approximately five times greater than that of the less bright exterior environment.
  • a conventional system may display images at a luminance level chosen to ensure visibility against the interior luminance level, which would cause a glow against the dimmer external environment.
  • the luminance profile is adjusted according to the user's visual field, reducing the possibility that glow or low visibility of images impacts the user's perception of either their surroundings or the images on the HMD.
  • a screen 420 may also be included, particularly for a simulated mode of operation. This is also shown in Figure 4 .
  • the luminance levels may vary due to the presence of lighting illuminating the simulation screen to enable viewers to watch the user operating the simulator. It may also be the case that portions of the interior of the environment are simulated, while other parts are real and illuminated as in a real vehicle or scenario. Therefore, the external environment may have different lighting conditions than the realistic interior. Accordingly, the system of the invention operates to determine whether the user is looking at the real interior or the simulated environment at a given moment, in order to determine the required luminance level at which to display the images.
  • the method 500 of Figure 5 includes setting a nominal luminance profile for the display images at step 502.
  • the spatial configuration, i.e. position and orientation, of the user's head is determined in step 504.
  • at step 506, the position of the user's head, and their inferred visual field, are compared with a 3D model.
  • at step 508, the direction the user is facing is determined, i.e. the focal region is identified.
  • the focal region lies wholly over the external environment.
  • the lighting conditions of the external environment are gained by accessing light sensor data and/or manual settings.
  • Input data from the sensor array (i.e. the auto-brilliance sensor) and manual settings are accessed at step 512.
  • the manual settings may be based on the mode of operation, and may be accessed from a data store connected to the system.
  • a manual setting may be set via a user interface associated with the HMD.
  • the manual setting may comprise an override luminance value.
  • the nominal luminance profile is adjusted 514 to form an adjusted luminance profile, here comprising a luminance level because the user is facing the external environment only.
  • the adjusted profile is transmitted to the HMD at step 516.
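Steps 502 to 516, for the case where the focal region lies wholly over the external environment, might be outlined as follows. This is a sketch under stated assumptions: the function name, the way the manual override is applied, and the 1.2 contrast gain are illustrative choices, not details given in the description.

```python
from typing import Optional

def adjust_for_exterior(nominal_level: float,
                        sensor_luminance: float,
                        override_level: Optional[float] = None,
                        gain: float = 1.2) -> float:
    """Derive the adjusted luminance level when the user faces the
    external environment only (steps 510-514 of the described method)."""
    # Step 512: manual settings (e.g. an override luminance value) win.
    if override_level is not None:
        return override_level
    # Scale with the sensed exterior background so the displayed images
    # keep a visible contrast ratio, never dropping below the nominal level.
    return max(nominal_level, sensor_luminance * gain)
```

The resulting level would then form the adjusted luminance profile transmitted to the HMD (step 516).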
  • an adjusted luminance profile is generated 522 accordingly using data accessed 520 from the interior environment control system, such as the luminance level of illuminated instrumentation, and/or manual settings.
  • the focal region is determined to be a combination of the internal and external environments.
  • the lighting conditions within are determined to be both those of the internal and external environments, and so at step 528, both luminance sensor data for the external environment and interior environment control system data for the interior environment are accessed, along with any manual settings.
  • An adjusted luminance level for each individual portion of the region is determined at step 530.
  • the adjusted profile is transmitted to the HMD 534 for display to the user.
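The per-portion adjustment of step 530 could look like the sketch below; the portion labels and the 1.2 gain are illustrative assumptions rather than values from the description.

```python
def per_portion_levels(portion_backgrounds, gain=1.2):
    """Derive an adjusted luminance level for each individual portion of
    a mixed focal region (step 530), so that symbology overlaying the
    window is driven by luminance sensor data while symbology overlaying
    the panel is driven by the interior lighting level."""
    return {portion: luminance * gain
            for portion, luminance in portion_backgrounds.items()}

# Exterior portion seen through the window at 100 cd/m^2, interior panel
# portion at 20 cd/m^2 (example values):
profile = per_portion_levels({"exterior": 100.0, "interior": 20.0})
```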
  • An additional step may be included that determines whether it is day or night time, or whether a simulator is being used.
  • the luminance levels may be set according to pre-stored information, rather than using sensor data.
  • the system and method may equally be applied to any situation involving aircraft, vehicles or workstations and the relevant operators or users wearing a HMD, and may also be applied to differentiate between different parts of the environment.
  • the method may operate differently according to whether the user is looking at land or sky from their vehicle, with the horizon being used by the system to judge what is being viewed.
  • An example method may be to fuse tracking data obtained in relation to the HMD and position or location tracking data obtained in relation to the vehicle or wider system.
  • the position of the horizon may be determined using an image processing system based on data obtained from an image sensor. Alternatively, or additionally, the position of the horizon may be initially known and a tracking system may utilise a model to recalculate the position of the horizon relative to the HMD.
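The horizon judgement described above might, under assumed sign conventions, reduce to a simple fusion of head-tracker and vehicle-attitude angles. Everything here, including the function name and the convention that positive angles point upwards, is an assumption for illustration.

```python
def viewing_sky(head_elevation_deg: float,
                vehicle_pitch_deg: float,
                horizon_elevation_deg: float = 0.0) -> bool:
    """Fuse head-tracker elevation with vehicle attitude (the tracking-data
    fusion described above) to judge whether the user's line of regard
    falls above the horizon, i.e. on sky rather than land."""
    line_of_regard = head_elevation_deg + vehicle_pitch_deg
    return line_of_regard > horizon_elevation_deg

viewing_sky(10.0, -3.0)  # looking 7 degrees above the horizon: sky
viewing_sky(-5.0, 2.0)   # looking 3 degrees below the horizon: land
```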

EP18275185.9A 2018-12-19 2018-12-19 Method and system for adjusting luminance profiles in head-mounted displays Pending EP3671715A1 (fr)

Priority Applications (7)

Application Number Priority Date Filing Date Title
EP18275185.9A EP3671715A1 (fr) 2018-12-19 2018-12-19 Method and system for adjusting luminance profiles in head-mounted displays
US17/429,704 US11500210B2 (en) 2018-12-19 2019-12-18 Method and system for adjusting luminance profiles in head-mounted displays
PCT/GB2019/053597 WO2020128459A1 (fr) 2018-12-19 2019-12-18 Method and system for adjusting luminance profiles in head-mounted displays
CA3121740A CA3121740A1 (fr) 2018-12-19 2019-12-18 Method and system for adjusting luminance profiles in head-mounted displays
AU2019411520A AU2019411520A1 (en) 2018-12-19 2019-12-18 Method and system for adjusting luminance profiles in head-mounted displays
EP19828296.4A EP3899922A1 (fr) 2018-12-19 2019-12-18 Method and system for adjusting luminance profiles in head-mounted displays
GB1918682.4A GB2581573B (en) 2018-12-19 2019-12-18 Method and system for adjusting luminance profiles in head-mounted displays

Publications (1)

Publication Number Publication Date
EP3671715A1 true EP3671715A1 (fr) 2020-06-24

Family

ID=64746462

Country Status (1)

Country Link
EP (1) EP3671715A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120132806A1 (en) * 2010-11-30 2012-05-31 Stmicroelectronics (Research & Development) Limited Sensor array microchip
US20130222354A1 (en) * 2010-09-17 2013-08-29 Nokia Corporation Adjustment of Display Brightness
EP2750125A2 (fr) * 2012-12-27 2014-07-02 LG Display Co., Ltd. Gamma voltage generation unit and display device using the same
US20160314762A1 (en) * 2015-04-21 2016-10-27 Samsung Electronics Co., Ltd. Display apparatus and method for controlling the same
US20180188803A1 (en) * 2016-12-31 2018-07-05 Intel Corporation Context aware selective backlighting techniques

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JANG WONCHEOL ET AL: "Human field of regard, field of view, and attention bias", COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE, ELSEVIER, AMSTERDAM, NL, vol. 135, 19 July 2016 (2016-07-19), pages 115 - 123, XP029709591, ISSN: 0169-2607, DOI: 10.1016/J.CMPB.2016.07.026 *

Similar Documents

Publication Publication Date Title
JP7036773B2 (ja) Virtual and augmented reality systems and methods
US9069163B2 (en) Head-up display with brightness control
CA2970894C (fr) Enhancing the visual perception of displayed colour symbology
US9869886B2 (en) Adaptive spectacles for motor vehicle drivers or passengers
US8681073B1 (en) System for and method of controlling contrast or color contrast in see-through displays
CN103930818B (zh) Optical display system and method with virtual image contrast control
CA2781064C (fr) Image magnification on a head-mounted display
TW201626046A (zh) Head-mounted display device, control method for head-mounted display device, and computer program
US11500210B2 (en) Method and system for adjusting luminance profiles in head-mounted displays
US11567568B2 (en) Display apparatuses and methods incorporating foveated rendering
JP2018090170A (ja) Head-up display system
CN113330506A (zh) Apparatus, system and method for local dimming in brightness-controlled environments
CN112384883A (zh) Wearable device and control method therefor
Keller et al. Perception in HMDs: what is it in head-mounted displays (HMDs) that really make them all so terrible?
US11493766B2 (en) Method and system for controlling transparency of a displaying device
JP2009265352A (ja) Display device
EP3671715A1 (fr) Method and system for adjusting luminance profiles in head-mounted displays
CN105579889A (zh) Data display glasses comprising an anti-glare screen
CN114365077B (zh) Viewer-synchronized illumination sensing
US11948483B2 (en) Image generation apparatus and image generation method
KR102235903B1 (ko) Image optimisation method for a head-mounted display device using two illuminance sensors
US11762205B1 (en) Method for creating uniform contrast on a headworn display against high dynamic range scene
US20240037698A1 (en) Head-frame symbology isolation for head worn display (hwd)
KR20190001983A (ko) High-visibility microdisplay device and head-mounted display including the same
GB2603483A (en) Holographic imaging system

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME