WO2014135901A1 - High dynamic range imaging systems - Google Patents

High dynamic range imaging systems

Info

Publication number
WO2014135901A1
WO2014135901A1 (PCT/GB2014/050696)
Authority
WO
WIPO (PCT)
Prior art keywords
data processing
tone mapping
display
data
processing module
Prior art date
Application number
PCT/GB2014/050696
Other languages
French (fr)
Inventor
Alan Gordon CHALMERS
Kurt Debattista
Maximino Esteves Correia BESSA
Original Assignee
The University Of Warwick
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The University Of Warwick filed Critical The University Of Warwick
Priority to GB1515749.8A priority Critical patent/GB2526478B/en
Publication of WO2014135901A1 publication Critical patent/WO2014135901A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/90 Dynamic range modification of images or parts thereof
    • G06T 5/92 Dynamic range modification of images or parts thereof based on global image properties
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20004 Adaptive image processing
    • G06T 2207/20012 Locally adaptive
    • G06T 2207/20172 Image enhancement details
    • G06T 2207/20208 High dynamic range [HDR] image processing

Definitions

  • This invention relates to high dynamic range imaging systems.
  • A human being can distinguish objects both in starlight at night and in bright sunlight, even though on a moonless night objects receive only approximately 1/1,000,000,000 of the illumination they would on a bright sunny day; this corresponds to a dynamic range of 90 dB.
  • However, the eye needs time to adjust to different light levels.
  • The dynamic range of the human eye without adjustment of the pupil is only approximately 30 dB.
  • By contrast, the differences between very dark and very light spots in a picture taken by a modern camera system can easily be larger than those the human eye can distinguish without adaptation; with a modern camera system it is thus possible to discern fine detail in very dark spots even while very bright spots are also present in the picture.
  • The dynamic range of a modern camera system may also easily surpass that of a conventional display.
  • Tone mapping has been developed as a way for adapting images recorded with a high dynamic range camera, or generated by a computer, to match the limitations of a display. Tone mapping is very helpful in the production of realistic images and several operators have been proposed. Two main tone mapping operator classes exist, namely tone reproduction curves (TRCs) and tone reproduction operators (TROs). Both will be referred to as tone mapping functions in the present application.
  • TRC: tone reproduction curve
  • TRO: tone reproduction operator
  • TRC algorithms are efficient because the operation is applied to pixels independently and can therefore be performed in parallel using a simple look-up table.
  • However, TRCs fail to capture important information on local contrast that could be represented in the spatial context of neighbouring image pixels, which is of great importance to the human visual system.
  • One object of the systems disclosed herein is to improve the adaptability of tone mapping in accordance with ambient light.
  • HDR: high dynamic range
  • the content of the scene depicted can change dynamically, as can the ambient lighting conditions, particularly when viewing the content on a mobile device, whilst on the move.
  • TMO: tone mapping operator
  • a method for tone mapping a high dynamic range image to be presented on a display wherein there is associated with the display an imaging device which captures an image indicative of ambient light levels experienced at a number of positions over the display, and a data processing module processes signals from the imaging device and generates data which is used in adapting a tone mapping function applied to the high dynamic range image so as to account for different ambient light conditions.
  • Preferably, adapting the tone mapping function applied to the high dynamic range image is done dynamically.
  • the use of an imaging device, as opposed to a simple light sensor, provides greater versatility in adapting the tone mapping function.
  • the system could take into account different lighting conditions at different positions on the display.
  • the tone mapping function can be adapted so that different parts of the image are treated differently, depending on whether they are in bright light or in shade.
  • the tone mapping function applied to the high dynamic range image is adapted so as to account for different ambient light conditions at different positions on the display.
  • the imaging device could be used and an average value of luminance values then calculated, which is used to adapt the overall tone mapping function, as opposed to adapting it on a position dependent basis.
  • the captured image will not be an image of the display itself, but an image captured by an imaging device looking away from the display, typically from a position adjacent the display.
  • the plurality of images can be combined to create a single image covering the entire display, and methods for combining images are well known in the art.
  • One or more imaging devices can capture an image or images indicative of ambient light levels experienced over the display, whereas one or more imaging devices can capture an image or images which can be used to detect other environmental conditions which affect the tone mapping required.
  • a camera on the front of the device directed at the user which can, for example, be used to take a self-portrait and this can be used to detect the ambient light levels experienced over the display.
  • a camera on the back of the device which is used to take photographs of other subject matter. This can be used, for example, to detect the amount of light on the other side of the device.
  • It can be used, for example, with an orientation detecting module, such as the accelerometer often found in mobile devices, to detect the orientation of the device, for example whether the user is looking down onto the device (with, typically, a dark surface beneath it) or holding it up (with, typically, the sky or artificial lighting behind it).
  • A method for tone mapping a high dynamic range image to be presented on a portable device, the device having a front and a back. On the front of the device there is a display on which the image is to be presented, together with a first camera directed away from the screen which captures first image data indicating ambient lighting conditions; on the back of the device there is a second camera directed away from the back which captures second image data indicating ambient lighting conditions. A data processing module processes signals from the first and second cameras and generates data which is used in adapting a tone mapping function applied to the high dynamic range image, so as to account for different ambient light conditions as detected by both cameras.
  • the tone mapping is adapted dynamically, so that the imaging device captures images continually, and as the lighting conditions change the tone mapping function changes.
  • The imaging device may capture frames at intervals sufficiently closely spaced to be usable as video.
  • the frames may be captured at more widely spaced intervals, which may be regular or irregular.
  • the quality of the image that is captured need not be of high enough resolution for use as a photograph or a video, but can be of significantly lower resolution.
  • a lower resolution image will require less processing time and speed for the tone mapping function to be adapted.
  • the captured image could be in greyscale only, to further reduce the processing required.
  • the captured image could be in colour so that the tone mapping function can be adapted for, e.g., coloured ambient light.
  • the number of colours captured may be limited, and substantially below that necessary for a photograph or a video.
  • the imaging device could be in the form of a camera.
  • mobile telephones, tablet computing devices and portable computers include a camera on the display side, intended to be used to capture an image of the user.
  • desktop computer displays are provided with a camera facing a user or can be provided with such a camera (frequently referred to as a "web cam").
  • a separate camera may be provided, if the facility exists to provide an interface, or the display device may be modified to provide an image capture function.
  • Whereas a TV display currently provided with a simple light sensor may detect only whether room lights are on or off, a more sophisticated imaging device could be provided.
  • the image to be displayed and subjected to tone mapping could be a still image, or a video. Where there is a video, or a series of still images, the tone mapping functions can be adapted to account for different conditions as there are changes from scene to scene, or for example if there are different lighting conditions as a camera pans across a view. Adapting the tone mapping function also encompasses selecting different types of tone mapping function for particular circumstances.
  • the tone mapping function will vary according to the nature of the image, the capabilities of the image processing equipment that will handle the image, the capabilities of the display, the lighting conditions where the image is displayed and any other variables which affect the way the displayed image is perceived by a viewer.
  • processing of data from the imaging device and adapting the tone mapping function are carried out within a device providing the display.
  • data from the imaging device, or locally processed data from the imaging device can be fed to a server which supplies images to the display and can carry out the tone mapping.
  • The tone mapping is adapted individually for each target display, in accordance with the data from the imaging device associated with that display.
  • the tone mapping may be adapted in accordance with the wishes of a person responsible for the appearance of the image, such as a director responsible for the appearance of scenes of a movie or other video.
  • a person can define a particular theme or mood to be applied, and the tone mapping will reflect that theme or mood.
  • the director may decide that a particular scene should have a blue tint.
  • the data being fed to the device which drives the display may include some data which reflects the director's preferences.
  • the director can look at a scene of a video on a monitor display, and alter colour correction so that a scene appears in a particular way that the director wishes. Having done that, the video can be colour corrected and stored in this corrected form.
  • The image data includes preference data which enables a tone mapping function, which adapts the image data for display on a given display, to take into account the preferences of a person responsible for producing the video data.
  • A director can view a scene and choose a particular mood, for example choosing a particular frame which has an appearance which best fits the mood the director wishes to create.
  • Data can be generated which would produce that appearance on the "standard" display device under the "standard" lighting conditions, and this can accompany the video data, signifying the preferences for a scene.
  • The colour appearance of the chosen frame can be determined by comparison with a predetermined "standard" colour chart; the display characteristics are known from the display specifications and its current settings (such as the current contrast setting), while the current environmental conditions can be determined by an imaging device as described above.
  • Tone mapping will be used in a conventional manner, or as described earlier to account for different ambient lighting conditions over the display, and also to reflect the preferences for a particular scene. Preferences could be set for any number of scenes, and in this context a scene includes any series of frames of any length. A director could choose preferences for an entire film. In any event, the preferences of the director will be reproduced whatever other changes tone mapping may cause to take place.
  • This aspect of the disclosure can be used in its own right; thus, viewed from another aspect, there is provided a method of tone mapping a stream of images forming a video, wherein at least a series of those images is provided with preference data, and that preference data is used to adapt the appearance of the images in the series during tone mapping.
  • the preference data may be encoded together with the video.
  • The preference data could also be sent in the form of metadata rather than as example video frames. This metadata, describing the conditions when the preference was created, would be much smaller (and therefore much easier to transmit) than a sequence of video frames.
  • Example video frames could also be sent, but these would have to be interlaced within the HDR video stream (although they would themselves be tone mapped images and thus smaller than an HDR video frame).
  • Embodiments of the invention provide a tone mapping framework that can dynamically take into account both the scene being considered and the current environmental conditions to ensure a preferred user viewing experience.
  • The present invention proposes a tone mapping framework that is dynamic, providing the tone mapping operator and parameters for a preferred viewing experience based on the current ambient environmental conditions, such as natural or artificial lighting, and the current frame being watched.
  • the preferred framework is reactive and changes the choice of tone mapping operator as the ambient lighting conditions change and the scene being watched changes.
  • Figure 1 is a diagrammatic view of a display device in a domestic environment, during daylight;
  • Figure 2 is a diagrammatic view of the display device in the same domestic environment, at night;
  • Figure 3 is a diagrammatic view of the display device, showing how a shadow is formed on the screen;
  • Figure 4 shows a map for the screen, illustrating the area corresponding to the shadow;
  • Figure 5 is a block diagram showing a process in accordance with the invention;
  • Figure 6 is a block diagram of a routine for adapting the process if there is a change in ambient luminance;
  • Figure 7 is a schematic representation of a system which introduces the ability to take into account preferences;
  • Figure 8 is a front view of a portable device which can be used in accordance with the invention.
  • Figure 9 is a rear view of the portable device of Figure 8.
  • Referring to Figure 1, there is shown a display device 1 for a domestic or office environment, such as a television or a monitor for a computer, having a screen 2.
  • the display device is also provided with an imaging device 3, which in this case is a camera, which captures ambient light impinging on the screen, although more than one imaging device could be provided.
  • There is also an artificial light 6, which is turned off. In this environment the screen 2 is illuminated evenly.
  • Figure 2 illustrates the environment of Figure 1, at night time. Outside it is dark, as indicated at 7, and less light passes through the window 4. This provides little ambient illumination across the screen 2. In this case, the artificial light 6 is illuminated and this illuminates the screen, with the maximum illumination being in the region marked 8.
  • the process in accordance with the invention aims to provide a similar viewing experience of HDR material on the screen 2, in both the environments of Figures 1 and 2.
  • Figure 3 shows an object 9 placed in front of the display device 1, which is illuminated as indicated at 10, so that a shadow is cast on the screen.
  • The imaging device 3 captures an image of the ambient light impinging on the screen and can detect that the light source 10 is partially blocked by the object 9.
  • Figure 4 shows a map 12 of the screen, divided into regions by a grid 13. These could be as small as individual pixels, but in this case the grid defines groups of pixels. The groups could be larger, as desired.
  • the region indicated at 14 corresponds to the shadow cast on the screen and tone mapping in this region can be adapted to take account of this.
  • Figure 5 illustrates a process in accordance with the invention.
  • the image provided by the camera 3 is analysed and at 102 an ambient luminance map is created.
  • the difference in ambient luminance at different parts of the screen is used at 103 to create tone map modifiers.
  • a high dynamic range video frame is retrieved and the tone map is modified at 105.
  • the modified tone map is applied to the video frame at 106 and the tone mapped frame is displayed on the screen 2.
  • the tone map modifiers remain the same for all frames in a sequence.
  • Figure 6 illustrates a routine for altering the tone map modifiers if there is a change in ambient luminance. This takes the place of step 109 in Figure 5.
  • The camera 3 is used to detect whether there is a change in ambient luminance, and at 202 a check is made to see whether this is above a minimum threshold. If it is not, the process continues as before. If there has been a significant change in ambient luminance, an ambient luminance map is created at 203 and tone map modifiers are created at 204.
  • the check for a change in ambient luminance could be made before or after a next video frame in a sequence is retrieved, and that the tone map could be modified before a frame is retrieved and then applied once it has been retrieved.
  • a check for a change in ambient luminance could be made for each frame in a sequence, or for groups of frames, or at predetermined intervals.
  • Tone map modifiers could be stored in a look up table, with coordinates for pixels or regions of pixels where a tone map is to be modified, and the modification that is to be made (such as a reduction or increase in luminance above any change to be applied by the tone mapping). Tone map modification may consist of a change to a different type of tone mapping, depending on whether, for example, the display is in bright daylight or in shade.
  • FIG. 7 illustrates a further refinement of the system, where user preferences are taken into account.
  • a stream of high dynamic range (HDR) images is input, for tone mapping prior to display.
  • HDR images can be static individual HDR images, or a sequence of frames of HDR video footage.
  • Any temporal information related to the sequence of HDR images is extracted at 302. This can either be done dynamically as the HDR images are received, or may be from a predetermined header associated with the current sequence of HDR images.
  • the images are then submitted to a tone map operator framework 303.
  • TMOs are typically classified as global or local. Global TMOs map the pixel values in the HDR image to the display pixel intensities based on global image characteristics. Local TMOs take into account the values of pixels neighbouring the pixels being considered.
  • TMOs that take into account how the content changes over time may be classified additionally as temporal, for example.
  • the parameters chosen for each TMO can significantly affect how the HDR image is displayed.
  • the preferred TMO and parameters are selected by the framework and used at 304 to tone map the input HDR image.
  • the choice of preferred TMO and parameters may change for every input HDR image, or remain the same for a sequence of HDR images.
  • the choice of TMO and parameters may be influenced based on the temporal information, so as to avoid potential flickering if changes in ambient illumination (for example) occur too rapidly.
  • the tone mapped images are output at 305 and shown on the display at 306.
  • the framework 300 includes one or more environment sensors 307.
  • the current environmental conditions can be determined in a number of ways, including, but not limited to: specialist environment sensing technology, for example an ambient light sensor; images taken from cameras associated with the display being viewed, for example existing cameras on mobile devices; user centric means, for example by manual determination of the ambient light conditions; and other sensors on a device, such as accelerometers which determine orientation.
  • the sensor would include one or more imaging devices to capture an image indicative of ambient light levels experienced at a number of positions over the display.
  • the framework involves knowledge of the dynamic range capabilities of the display.
  • the dynamic range of an individual display may be defined as the difference in luminance value between the brightest and darkest pixel that can be displayed.
  • The framework also provides for user preferences: the user, who may be either the content provider or the viewer, may wish the HDR content to be watched in a preferred manner, for example reflecting the mood of a certain film or an artistic presentation style. These preferences may be encoded in data accompanying the HDR images, or may be selected by the viewer.
  • The framework 303 is dynamic and reactive, and determines the preferred TMO and parameters to tone map the input HDR image based on: the content of the HDR image; any user preferences; any creative intent; the characteristics of the intended display; and the environmental conditions.
  • Figure 8 illustrates the front of a portable data processing device 15 such as a mobile phone or a tablet, with a touch screen 16, a forward facing camera 17 and a control button 18.
  • Figure 9 shows the rear of the device 15, with a rear face 19, and a rear facing camera 20.
  • an accelerometer unit 21 which is included in the device and detects orientation.
  • the front camera 17, rear camera 20 and optionally accelerometer unit 21 are used as described earlier to provide information to the TMO framework 303.
  • the device 15 includes a data processing module 22 which is used to set up the TMO framework.
  • a method for tone mapping a high dynamic range image to be presented on a display device 15.
  • The display device is provided with a camera (17) which captures an image indicating ambient light levels experienced at a number of positions over the display (16) of the device.
  • a data processing module (22) processes signals from the camera and generates data which is used in adapting a tone mapping function applied to the high dynamic range image so as to account for different ambient light conditions.
  • the data processing module may take into account different lighting conditions at different positions on the display when generating the data which is used in adapting the tone mapping function.
  • The data processing module may also use signals from a camera (20) on the other side of the device and/or from an orientation detection module (21).
  • The data processing module may also use preference data supplied with a series of video frames when adapting the tone mapping function.
  • The device may be a mobile phone, a tablet computing device or a portable computer.
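The per-frame process described above for Figures 5 and 6 can be sketched in code: build an ambient luminance map from the camera frame, derive per-region tone map modifiers, and rebuild them only when the ambient luminance changes by more than a threshold. This is an illustrative Python sketch, not the patent's implementation; the function names, the shadow boost of 1.2 and the 10% change threshold are all assumptions.

```python
import numpy as np

def ambient_map(camera_frame, grid=(4, 4)):
    # Average the captured greyscale frame into a coarse grid of regions,
    # corresponding to the map 12 divided by grid 13 in Figure 4.
    h, w = camera_frame.shape
    gh, gw = h // grid[0], w // grid[1]
    return camera_frame.reshape(grid[0], gh, grid[1], gw).mean(axis=(1, 3))

def modifiers_from(amb):
    # Regions darker than the frame mean are treated as shaded and get a
    # luminance boost; the 1.2 factor is a hypothetical choice.
    return np.where(amb < amb.mean(), 1.2, 1.0)

def changed_significantly(prev_amb, amb, threshold=0.1):
    # Rebuild modifiers only above a minimum change (step 202 in Figure 6);
    # the 10%-of-full-scale threshold is an assumption.
    return abs(amb.mean() - prev_amb.mean()) > threshold * 255
```

In use, the modifiers would be applied on top of the chosen tone mapping function for each retrieved HDR frame, and `changed_significantly` would gate steps 203 and 204.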

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Telephone Function (AREA)

Abstract

A method for tone mapping a high dynamic range image to be presented on a display device (15). The display device is provided with a camera (17) which captures an image indicating ambient light levels experienced at a number of positions over the display (16) of the device. A data processing module (22) processes signals from the camera and generates data which is used in adapting a tone mapping function applied to the high dynamic range image so as to account for different ambient light conditions. The data processing module may take into account different lighting conditions at different positions on the display when generating the data which is used in adapting the tone mapping function. The data processing module may also use signals from a camera (20) on the other side of the device and/or from an orientation detection module (21). The data processing module may also use preference data supplied for a series of video frames when adapting the tone mapping function. The device may be a mobile phone, a tablet computing device or a portable computer.

Description

High Dynamic Range Imaging Systems
This invention relates to high dynamic range imaging systems. A human being can distinguish objects both in starlight at night and in bright sunlight, even though on a moonless night objects receive only approximately 1/1,000,000,000 of the illumination they would on a bright sunny day; this corresponds to a dynamic range of 90 dB. However, the eye needs time to adjust to different light levels. Thus, the dynamic range of the human eye without adjustment of the pupil is only approximately 30 dB. By contrast, the differences between very dark and very light spots in a picture taken by a modern camera system can easily be larger than those the human eye can distinguish without adaptation; with a modern camera system it is thus possible to discern fine detail in very dark spots even while very bright spots are also present in the picture. The dynamic range of a modern camera system may also easily surpass the dynamic range of a conventional display.
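The decibel figures quoted above follow from the usual power-ratio convention, DR_dB = 10 · log10(L_max / L_min). A quick check in Python (the variable names are illustrative):

```python
import math

# Dynamic range in decibels from a luminance (power) ratio:
#   DR_dB = 10 * log10(L_max / L_min)
scene_ratio = 1_000_000_000               # bright sun vs. moonless night, from the text
scene_db = 10 * math.log10(scene_ratio)   # ~90 dB, matching the figure quoted
eye_ratio = 10 ** (30 / 10)               # 30 dB without pupil adjustment,
                                          # i.e. roughly a 1000:1 contrast ratio
```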
Tone mapping has been developed as a way for adapting images recorded with a high dynamic range camera, or generated by a computer, to match the limitations of a display. Tone mapping is very helpful in the production of realistic images and several operators have been proposed. Two main tone mapping operator classes exist, namely tone reproduction curves (TRCs) and tone reproduction operators (TROs). Both will be referred to as tone mapping functions in the present application.
TRC algorithms are efficient because the operation is applied to pixels independently and thus can be performed in parallel using a simple look-up table. In addition, models also exist that are able to capture some important aspects, such as visual adaptation. However, TRCs fail to capture the important information on local contrast that could be represented in the spatial context of neighbouring image pixels, which is of great importance to the human visual system.
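As a rough illustration of why TRCs parallelise so easily, the following Python sketch applies a global tone curve through a look-up table, with every pixel handled independently. The gamma-style curve, table size and 8-bit output are illustrative assumptions, not the operators discussed in this document.

```python
import numpy as np

def build_trc_lut(gamma=2.2, levels=1024):
    # Precompute a tone reproduction curve as a look-up table mapping
    # normalised HDR luminance [0, 1] to 8-bit display codes.
    x = np.linspace(0.0, 1.0, levels)
    return np.clip(255.0 * x ** (1.0 / gamma), 0, 255).astype(np.uint8)

def tone_map_global(hdr, lut):
    # Normalise the frame, quantise into LUT indices, and look up every
    # pixel independently - the per-pixel independence is what makes
    # TRCs trivially parallel.
    norm = hdr / hdr.max()
    idx = np.clip((norm * (len(lut) - 1)).astype(int), 0, len(lut) - 1)
    return lut[idx]

hdr_frame = np.random.rand(4, 4).astype(np.float32) * 1e4  # synthetic HDR data
ldr_frame = tone_map_global(hdr_frame, build_trc_lut())
```

A local (TRO-style) operator would instead consult each pixel's neighbourhood, which is why it captures local contrast but costs more and can introduce halos.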
The algorithms based on TROs when compared with TRCs are able to capture the important information on local contrast. Unfortunately, they typically introduce artefacts in some parts of the image, such as dark halos, and are computationally more demanding.
It should be noted that the viewing conditions of an observer looking at a display may be completely different and can affect the appearance of the image. In Display Adaptive Tone Mapping by Rafał Mantiuk, Scott Daly and Louis Kerofsky, ACM Transactions on Graphics (Proc. of SIGGRAPH'08), 27(3), article no. 68, 2008, there is proposed the concept of tone mapping closely coupled with a display device, which renders images optimized for a particular display and under the existing viewing conditions (ambient light). It is stated that a mobile phone should change its rendering algorithm when the backlight of its transflective display is switched off to save power, and that a TV display should adjust its display algorithm when a light in the room is lit. It is noted that simple dimming due to ambient illumination is already performed in some TV displays.
One object of the systems disclosed herein is to improve the adaptability of tone mapping in accordance with ambient light. When viewing high dynamic range (HDR) images on a display, the content of the scene depicted can change dynamically, as can the ambient lighting conditions, particularly when viewing the content on a mobile device, whilst on the move. There is thus a need for a tone mapping operator (TMO) framework that can dynamically take into account both the scene being considered and the current environmental conditions to ensure a preferred user viewing experience. Viewed from one aspect there is disclosed a method for tone mapping a high dynamic range image to be presented on a display, wherein there is associated with the display an imaging device which captures an image indicative of ambient light levels experienced at a number of positions over the display, and a data processing module processes signals from the imaging device and generates data which is used in adapting a tone mapping function applied to the high dynamic range image so as to account for different ambient light conditions.
Preferably, adapting a tone mapping function applied to the high dynamic range image so as to account for different ambient light conditions is done dynamically. The use of an imaging device, as opposed to a simple light sensor, provides greater versatility in adapting the tone mapping function. For example, the system could take into account different lighting conditions at different positions on the display. Thus, for example, if the captured image indicates that one portion of the display on a portable device such as a mobile telephone or a tablet computer will be in bright sunlight whilst another portion of the display will be in shade (for example from a body part of a user holding the device), the tone mapping function can be adapted so that different parts of the image are treated differently, depending on whether they are in bright light or in shade. Thus, in preferred embodiments of the invention, the tone mapping function applied to the high dynamic range image is adapted so as to account for different ambient light conditions at different positions on the display.
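A minimal sketch of such position-dependent adaptation, assuming a per-region ambient map and a simple brightness gain for sunlit regions; the grid, threshold and gain values are hypothetical choices, not taken from this document:

```python
import numpy as np

def regional_gains(ambient_map, bright_thresh=0.7, gain_bright=1.3):
    # ambient_map: per-region ambient luminance, normalised to [0, 1].
    # Regions reported as sunlit get a brightness boost so screen content
    # there stays legible; shaded regions are left alone.
    gains = np.ones_like(ambient_map)
    gains[ambient_map > bright_thresh] = gain_bright
    return gains

def apply_regional(ldr, gains):
    # Upsample the per-region gain grid to pixel resolution and apply it
    # on top of the already tone-mapped (LDR) frame.
    reps = (ldr.shape[0] // gains.shape[0], ldr.shape[1] // gains.shape[1])
    full = np.kron(gains, np.ones(reps))
    return np.clip(ldr * full, 0, 255)

ambient = np.array([[0.2, 0.9], [0.2, 0.2]])  # top-right region in direct sun
frame = np.full((4, 4), 100.0)                # flat tone-mapped test frame
out = apply_regional(frame, regional_gains(ambient))
```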
Alternatively, an average of the luminance values captured by the imaging device could be calculated and used to adapt the overall tone mapping function, as opposed to adapting it on a position-dependent basis. There will be a plurality of possible average luminance values, between a simple light-off value and a simple light-on value. This provides for more detailed adaptation of the tone mapping than the simple on/off arrangement disclosed in Display Adaptive Tone Mapping by Mantiuk et al.
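The average-luminance variant might look like the following sketch, where a single mean value drawn from the captured frame selects a global display gamma; the mapping from ambient mean to gamma endpoints is an illustrative assumption:

```python
import numpy as np

def ambient_mean_luminance(camera_frame):
    # camera_frame: greyscale capture with values in [0, 255]; one scalar
    # summarises the whole scene, unlike the per-region approach.
    return float(camera_frame.mean()) / 255.0

def gamma_for_ambient(mean_lum, dark=2.4, bright=1.8):
    # Brighter surroundings -> lower gamma (lifts on-screen shadows).
    # The endpoint values 2.4 and 1.8 are hypothetical settings;
    # interpolate linearly between them.
    return dark + (bright - dark) * mean_lum

bright_room = np.full((8, 8), 200, dtype=np.uint8)
g = gamma_for_ambient(ambient_mean_luminance(bright_room))
```

Because the ambient mean varies continuously, the adaptation takes many intermediate values rather than toggling between a light-on and a light-off setting.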
The captured image will not be an image of the display itself, but an image captured by an imaging device looking away from the display, typically from a position adjacent the display.
In some embodiments of the invention, there may be a plurality of imaging devices, and a corresponding plurality of images are captured at any given time. These may, for example, cover different areas of the display, although the areas captured by the devices may overlap. The plurality of images can be combined to create a single image covering the entire display, and methods for combining images are well known in the art.
In another embodiment, there may be a plurality of imaging devices which are directed in opposite directions. One or more imaging devices can capture an image or images indicative of ambient light levels experienced over the display, whereas one or more imaging devices can capture an image or images which can be used to detect other environmental conditions which affect the tone mapping required. For example, on a mobile telephone or a mobile tablet device there may be a camera on the front of the device directed at the user which can, for example, be used to take a self-portrait, and this can be used to detect the ambient light levels experienced over the display. However, there is frequently provided a camera on the back of the device which is used to take photographs of other subject matter. This can be used, for example, to detect the amount of light on the other side of the device. It can be used, for example, together with an orientation detecting module, such as an accelerometer often found in mobile devices, to detect the orientation of the device, for example whether the user is looking down onto the device (with, typically, a dark surface beneath the device) or holding it up (with typically the sky or artificial lighting behind the device). In any event, with a portable device - typically hand held - the area around the device will be in the field of view of the user, and lighting conditions on the other side, beyond the device, can affect how the user's eyes respond.
According to another aspect of the invention, there is provided a method for tone mapping a high dynamic range image to be presented on a portable device, the device having a front and a back; on the front of the device there is a display on which the image is to be presented, and also a first camera directed away from the screen which captures first image data indicating ambient lighting conditions; on the back of the device there is a second camera directed away from the back which captures second image data indicating ambient lighting conditions; and a data processing module processes signals from the first and second cameras and generates data which is used in adapting a tone mapping function applied to the high dynamic range image so as to account for different ambient light conditions as detected by both the first and second cameras.
In some preferred embodiments the tone mapping is adapted dynamically, so that the imaging device captures images continually, and as the lighting conditions change the tone mapping function changes. By "continually" it is not implied that the imaging device captures frames at intervals sufficiently closely spaced to be usable as a video. The frames may be captured at more widely spaced intervals, which may be regular or irregular.
The image that is captured need not be of high enough resolution for use as a photograph or a video, but can be of significantly lower resolution. A lower resolution image reduces the processing time and power required to adapt the tone mapping function. The captured image could be in greyscale only, to further reduce the processing required. However, the captured image could be in colour so that the tone mapping function can be adapted for, e.g., coloured ambient light. The number of colours captured may be limited, and substantially below that necessary for a photograph or a video.
However, the imaging device could be in the form of a camera. As noted above, mobile telephones, tablet computing devices and portable computers include a camera on the display side, intended to be used to capture an image of the user. In some cases, desktop computer displays are provided with a camera facing the user, or can be provided with such a camera (frequently referred to as a "web cam"). For other display devices, a separate camera may be provided, if the facility exists to provide an interface, or the display device may be modified to provide an image capture function. For a TV display which is currently provided with a simple light sensor that merely detects whether room lights are on or off, a more sophisticated imaging device could be provided.
The image to be displayed and subjected to tone mapping could be a still image, or a video. Where there is a video, or a series of still images, the tone mapping functions can be adapted to account for different conditions as there are changes from scene to scene, or for example if there are different lighting conditions as a camera pans across a view. Adapting the tone mapping function also encompasses selecting different types of tone mapping function for particular circumstances.
In embodiments of this aspect of the disclosure, the tone mapping function will vary according to the nature of the image, the capabilities of the image processing equipment that will handle the image, the capabilities of the display, the lighting conditions where the image is displayed and any other variables which affect the way the displayed image is perceived by a viewer.
In some embodiments of the invention, processing of data from the imaging device and adapting the tone mapping function are carried out within a device providing the display. However, in some cases data from the imaging device, or locally processed data from the imaging device, can be fed to a server which supplies images to the display and can carry out the tone mapping. When a server is used, the tone mapping is adapted individually for the target display in accordance with the data from the imaging device associated with that display.
In addition there is also a need for a tone mapping operator (TMO) framework that can dynamically take into account both the scene being considered and the current environmental conditions to ensure that the creative intent of the person who produced the content is delivered to the user whatever the current environmental conditions.
In accordance with another feature of this disclosure, the tone mapping may be adapted in accordance with the wishes of a person responsible for the appearance of the image, such as a director responsible for the appearance of scenes of a movie or other video. Such a person can define a particular theme or mood to be applied, and the tone mapping will reflect that theme or mood. For example, the director may decide that a particular scene should have a blue tint. The data being fed to the device which drives the display may include some data which reflects the director's preferences. By way of example, the director can look at a scene of a video on a monitor display, and alter colour correction so that a scene appears in a particular way that the director wishes. Having done that, the video can be colour corrected and stored in this corrected form. However, the director will have made decisions based on the appearance of the scene on the particular monitor display under certain ambient lighting conditions, and there can be no guarantee that there will be the same effect when the scene is viewed on another display in different lighting conditions. According to this aspect of the disclosure, the image data includes preference data which enables a tone mapping function which adapts the image data for display on a given display, to take into account the preferences of a person responsible for producing the video data. By way of example, there could be a "standard" display device and "standard" lighting conditions, so that other display devices and the conditions in which they are being viewed can be compared to this. A director can view a scene and choose a particular mood - for example choosing a particular frame which has an appearance which best fits the mood the director wishes to create.
Data can be generated which would produce that appearance on the "standard" display device under the "standard" lighting conditions, and this can accompany the video data as signifying preferences for a scene. For example, the colour appearance of the chosen frame can be determined by comparison with a predetermined "standard" colour chart, the display characteristics are known from the display specifications and its current settings (such as the current contrast setting), while the current environmental conditions can be determined by an imaging device as described above.
At a destination device for the video data, tone mapping will be used in a conventional manner, or as described earlier to account for different ambient lighting conditions over the display, and also to reflect the preferences for a particular scene. Preferences could be set for any number of scenes, and in this context a scene includes any series of frames of any length. A director could choose preferences for an entire film. In any event, the preferences of the director will be reproduced whatever other changes tone mapping may cause to take place.
This aspect of the disclosure can be used in its own right, and thus viewed from another aspect there is provided a method of tone mapping of a stream of images forming a video, wherein at least a series of those images is provided with preference data, and that preference data is used to adapt the appearance of the images in the series during tone mapping. The preference data may be encoded together with the video. The preference data could also be sent in the form of metadata rather than as example video frames. This metadata - describing the conditions when the preference was created - would be far smaller (and therefore much easier to transmit) than a sequence of video frames. Example video frames could also be sent, but these would have to be interleaved within the HDR video stream (although they would themselves be tone mapped images and thus smaller than an HDR video frame).
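Such preference metadata, describing the conditions under which the director's preference was created, might be structured as follows. All field names and values here are purely illustrative assumptions, not part of the disclosure:

```python
# Hypothetical per-scene preference record accompanying an HDR video stream.
# A record like this is far smaller than embedding example video frames.
preference_metadata = {
    "scene_start_frame": 1200,             # first frame the preference applies to
    "scene_end_frame": 1499,               # last frame of the scene
    "reference_display": "standard-2014",  # the agreed "standard" display device
    "reference_ambient_lux": 200,          # "standard" lighting conditions
    "white_point": [0.95, 1.00, 1.09],     # bluish tint chosen by the director
    "tone_curve": "filmic",                # preferred tone mapping style
}

def preference_for_frame(frame_index, records):
    """Return the preference record covering a given frame, if any."""
    for rec in records:
        if rec["scene_start_frame"] <= frame_index <= rec["scene_end_frame"]:
            return rec
    return None
```

The destination device would apply its ambient-adaptive tone mapping and then bias the result towards the stored white point and tone curve, so the director's intent survives whatever local adaptation takes place.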
Embodiments of the invention provide a tone mapping framework that can dynamically take into account both the scene being considered and the current environmental conditions to ensure a preferred user viewing experience.
The present invention proposes a tone mapping framework that is dynamic, providing the tone mapping operator and parameters for a preferred viewing experience based on the current ambient environmental conditions, such as natural or artificial lighting conditions, and on the current frame being watched. The preferred framework is reactive and changes the choice of tone mapping operator as the ambient lighting conditions and the scene being watched change. Embodiments of the invention will now be described by way of example and with reference to the accompanying drawings, in which:
Figure 1 is a diagrammatic view of a display device in a domestic environment, during daylight;
Figure 2 is a diagrammatic view of the display device in the same domestic environment, at night;
Figure 3 is a diagrammatic view of the display device, showing how a shadow is formed on the screen;
Figure 4 shows a map for the screen, illustrating the area corresponding to the shadow;
Figure 5 is a block diagram showing a process in accordance with the invention;
Figure 6 is a block diagram of a routine for adapting the process if there is a change in ambient luminance;
Figure 7 is a schematic representation of a system which introduces the ability to take into account preferences;
Figure 8 is a front view of a portable device which can be used in accordance with the invention; and
Figure 9 is a rear view of the portable device of Figure 8.
In Figure 1 there is shown a display device 1 for a domestic or office environment, such as a television or a monitor for a computer, having a screen 2. The display device is also provided with an imaging device 3, in this case a camera, which captures the ambient light impinging on the screen, although more than one imaging device could be provided. There is a window 4, through which the environment is lit by daylight, as indicated by 5. Within the environment there is also an artificial light 6, which is turned off. In this environment the screen 2 is illuminated evenly.
Figure 2 illustrates the environment of Figure 1 at night time. Outside it is dark, as indicated at 7, and less light passes through the window 4. This provides little ambient illumination across the screen 2. In this case, the artificial light 6 is turned on and illuminates the screen, with the maximum illumination being in the region marked 8.
The process in accordance with the invention, as described in more detail below, aims to provide a similar viewing experience of HDR material on the screen 2, in both the environments of Figures 1 and 2.
Figure 3 shows an object 9 placed in front of the display device 1, which is illuminated as indicated at 10, so that a shadow is cast on the screen. The imaging device 3 captures an image of the ambient light impinging on the screen and can detect that the light source 10 is partially blocked by the object 9. Figure 4 shows a map 12 of the screen, divided into regions by a grid 13. These regions could be as small as individual pixels, but in this case the grid defines groups of pixels. These groups could be larger, as desired. The region indicated at 14 corresponds to the shadow cast on the screen, and tone mapping in this region can be adapted to take account of this.
Figure 5 illustrates a process in accordance with the invention. At step 101, the image provided by the camera 3 is analysed and at 102 an ambient luminance map is created. The difference in ambient luminance at different parts of the screen is used at 103 to create tone map modifiers.
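Steps 102 and 103 might be sketched as follows; this is an illustrative assumption of one way to build the luminance map and modifiers, with the grid size and modifier formula chosen purely for demonstration:

```python
import numpy as np

def ambient_luminance_map(camera_gray, grid_rows, grid_cols):
    """Step 102: average a greyscale camera image over a coarse grid of
    screen regions, like the grid 13 of Figure 4."""
    h, w = camera_gray.shape
    rh, cw = h // grid_rows, w // grid_cols
    trimmed = camera_gray[:rh * grid_rows, :cw * grid_cols]
    blocks = trimmed.reshape(grid_rows, rh, grid_cols, cw)
    return blocks.mean(axis=(1, 3))

def tone_map_modifiers(lum_map, strength=0.5):
    """Step 103: regions brighter than the screen average get a luminance
    boost; shaded regions (such as region 14 of Figure 4) get less."""
    return 1.0 + strength * (lum_map - lum_map.mean())
```

With a uniformly lit screen every modifier is 1.0 and the base tone mapping is unchanged; a shadowed region produces a modifier below 1.0 for that grid cell only.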
At 104 a high dynamic range video frame is retrieved and the tone map is modified at 105. The modified tone map is applied to the video frame at 106 and the tone mapped frame is displayed on the screen 2.
At 108 a check is made to see if the frame is the last in the sequence, and if it is the process ends. If it is not the last frame, the next frame is retrieved at 109 and the process is repeated from step 105.
In this embodiment, the tone map modifiers remain the same for all frames in a sequence. Figure 6 illustrates a routine for altering the tone map modifiers if there is a change in ambient luminance. This takes the place of step 109 in Figure 5. At 201, before retrieving the next frame, the camera 3 is used to detect whether there is a change in ambient luminance, and at 202 a check is made to see whether this is above a minimum threshold. If it is not, the process continues as before. If there has been a significant change in ambient luminance, an ambient luminance map is created at 203 and tone map modifiers are created at 204. The process then continues as before, with the next frame being retrieved at 205, the tone map being modified at 206 and applied at 207, and the image being displayed at 208. There will then be a check for the frame being the last, and a loop back if it is not.
It will be appreciated that the check for a change in ambient luminance could be made before or after a next video frame in a sequence is retrieved, and that the tone map could be modified before a frame is retrieved and then applied once it has been retrieved. A check for a change in ambient luminance could be made for each frame in a sequence, or for groups of frames, or at predetermined intervals.
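The loop of Figures 5 and 6 can be sketched as below. The ambient reading is reduced to a single scalar for brevity, and the function names and threshold value are illustrative assumptions rather than elements of the disclosure:

```python
def tone_map_sequence(frames, read_ambient, make_modifiers,
                      apply_tone_map, display, threshold=0.05):
    """Process a frame sequence, rebuilding tone map modifiers only when
    the ambient luminance changes by more than a minimum threshold."""
    last_ambient = read_ambient()
    modifiers = make_modifiers(last_ambient)
    for frame in frames:                                # retrieve next frame
        ambient = read_ambient()                        # step 201: re-sample
        if abs(ambient - last_ambient) > threshold:     # step 202: threshold
            modifiers = make_modifiers(ambient)         # steps 203/204: rebuild
            last_ambient = ambient
        display(apply_tone_map(frame, modifiers))       # modify, apply, show
```

The check runs once per frame here, but as the text notes it could equally run per group of frames or at predetermined intervals.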
Tone map modifiers could be stored in a look up table, with coordinates for pixels or regions of pixels where a tone map is to be modified, and the modification that is to be made (such as a reduction or increase in luminance above any change to be applied by the tone mapping). Tone map modification may consist of a change to a different type of tone mapping, depending on whether, for example, the display is in bright daylight or in shade.
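Such a look-up table might be structured as follows; the keys, field names and values are illustrative assumptions only:

```python
# Hypothetical look-up table: region coordinates -> modification to apply
# on top of the base tone mapping.
modifier_lut = {
    (3, 4): {"luminance_scale": 1.25},         # region in bright daylight
    (5, 1): {"luminance_scale": 0.80},         # shaded region such as 14
    (0, 0): {"operator": "local_shadow_tmo"},  # switch to a different TMO type
}

def lookup_modifier(region, lut):
    """Return the stored modification for a pixel region, or a neutral
    default (no change to the base tone mapping) if none is stored."""
    return lut.get(region, {"luminance_scale": 1.0})
```

Regions absent from the table receive the neutral default, so only regions whose ambient lighting actually differs need entries.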
Figure 7 illustrates a further refinement of the system, where user preferences are taken into account. At 301, a stream of high dynamic range (HDR) images is input, for tone mapping prior to display. These HDR images can be static individual HDR images, or a sequence of frames of HDR video footage. Any temporal information related to the sequence of HDR images is extracted at 302. This can either be done dynamically as the HDR images are received, or may be from a predetermined header associated with the current sequence of HDR images. The images are then submitted to a tone map operator framework 303. TMOs are typically classified as global or local. Global TMOs map the pixel values in the HDR image to the display pixel intensities based on global image characteristics, regardless of each pixel's spatial location. Local TMOs take into account the values of the pixels neighbouring the pixel being considered.
Furthermore, TMOs that take into account how the content changes over time may be classified additionally as temporal, for example. The parameters chosen for each TMO can significantly affect how the HDR image is displayed.
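The global/local distinction can be shown with two toy operators. The first is the simple form of the well-known Reinhard global operator; the second is a minimal local variant invented here for illustration, not an operator from the disclosure:

```python
import numpy as np

def global_reinhard(hdr_lum):
    """A global TMO: every pixel is mapped by the same curve L/(1+L),
    regardless of its spatial location (simple Reinhard form)."""
    return hdr_lum / (1.0 + hdr_lum)

def local_mean_tmo(hdr_lum, k=3):
    """A toy local TMO: each pixel is compressed relative to the mean of
    its k x k neighbourhood, so the curve varies with spatial position."""
    h, w = hdr_lum.shape
    padded = np.pad(hdr_lum, k // 2, mode="edge")
    local = np.zeros_like(hdr_lum)
    for dy in range(k):
        for dx in range(k):
            local += padded[dy:dy + h, dx:dx + w]
    local /= k * k
    return hdr_lum / (1.0 + local)
```

On a uniform image the two agree; around a bright/dark boundary the local operator preserves more contrast because its denominator follows the neighbourhood rather than the pixel itself.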
The preferred TMO and parameters are selected by the framework and used at 304 to tone map the input HDR image. The choice of preferred TMO and parameters may change for every input HDR image, or remain the same for a sequence of HDR images. In addition, the choice of TMO and parameters may be influenced based on the temporal information, so as to avoid potential flickering if changes in ambient illumination (for example) occur too rapidly. The tone mapped images are output at 305 and shown on the display at 306.
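One common way to avoid the flickering mentioned above is to smooth TMO parameters over time rather than switching them abruptly; the exponential smoothing below is an illustrative assumption, not a technique specified by the disclosure:

```python
def smooth_parameter(new_value, previous, alpha=0.1):
    """Exponentially smooth a TMO parameter across frames, so a sudden
    change in ambient illumination is absorbed over many frames instead
    of causing a visible jump. Smaller alpha means slower adaptation."""
    return previous + alpha * (new_value - previous)
```

Repeated application converges on the new target, so a brief ambient spike (a passing shadow, say) barely perturbs the displayed image, while a sustained change is eventually fully adopted.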
The framework 300 includes one or more environment sensors 307. The current environmental conditions can be determined in a number of ways, including, but not limited to: specialist environment sensing technology, for example an ambient light sensor; images taken from cameras associated with the display being viewed, for example existing cameras on mobile devices; user centric means, for example by manual determination of the ambient light conditions; and other sensors on a device, such as accelerometers which determine orientation. In accordance with the invention, the sensor would include one or more imaging devices to capture an image indicative of ambient light levels experienced at a number of positions over the display.
At 308, the framework involves knowledge of the dynamic range capabilities of the display. The dynamic range of an individual display may be defined as the difference in luminance value between the brightest and darkest pixel that can be displayed.
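The definition above can be made concrete; expressing the range as a contrast ratio and in photographic stops is a common convention, adopted here as an illustrative assumption:

```python
import math

def display_dynamic_range(l_max_nits, l_min_nits):
    """Dynamic range between the brightest and darkest displayable pixel,
    returned both as a contrast ratio and in stops (log base 2)."""
    ratio = l_max_nits / l_min_nits
    return ratio, math.log2(ratio)
```

A display spanning 1000 nits down to 1 nit has a 1000:1 contrast ratio, just under 10 stops.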
At 309, the framework also provides for user preferences. The user, who may be either the content provider or the viewer, may wish the HDR content to be watched in a preferred manner, reflecting for example the mood of a certain film, or an artistic presentation style. These preferences may be encoded in data accompanying the HDR images, or may be selected by the viewer.
The framework 303 is dynamic and reactive, and determines the preferred TMO and parameters to tone map the input HDR image based on: the content of the HDR image; any user preferences; any creative intent; the characteristics of the intended display; and the environmental conditions.
Figure 8 illustrates the front of a portable data processing device 15 such as a mobile phone or a tablet, with a touch screen 16, a forward facing camera 17 and a control button 18. Figure 9 shows the rear of the device 15, with a rear face 19, and a rear facing camera 20. Also illustrated by dotted line is an accelerometer unit 21 which is included in the device and detects orientation. The front camera 17, rear camera 20 and optionally accelerometer unit 21 are used as described earlier to provide information to the TMO framework 303. The device 15 includes a data processing module 22 which is used to set up the TMO framework.
Thus, at least in preferred embodiments of the invention there is provided a method for tone mapping a high dynamic range image to be presented on a display device (15). The display device is provided with a camera (17) which captures an image indicating ambient light levels experienced at a number of positions over the display (16) of the device. A data processing module (22) processes signals from the camera and generates data which is used in adapting a tone mapping function applied to the high dynamic range image so as to account for different ambient light conditions. The data processing module may take into account different lighting conditions at different positions on the display when generating the data which is used in adapting the tone mapping function. The data processing module may also use signals from a camera (20) on the other side of the device and/or from an orientation detection module (21). The data processing module may also use preference data supplied with a series of video frames when adapting the tone mapping function. The device may be a mobile phone, a tablet computing device or a portable computer.

Claims

1. A method for tone mapping a high dynamic range image to be presented on a display, wherein there is associated with the display an imaging device which captures an image indicative of ambient light levels experienced at a number of positions over the display, and a data processing module processes signals from the imaging device and generates data which is used in adapting a tone mapping function applied to the high dynamic range image so as to account for different ambient light conditions.
2. A method as claimed in claim 1, wherein the data processing module takes into account different lighting conditions at different positions on the display when generating the data which is used in adapting the tone mapping function.
3. A method as claimed in claim 1 or 2, wherein the imaging device captures images continually and the tone mapping is adapted dynamically, so that as the ambient lighting conditions change the tone mapping function changes.
4. A method as claimed in claim 1, 2 or 3, wherein the imaging device is a camera which is directed forwardly away from the display.
5. A method as claimed in claim 4, wherein the data processing module also processes signals from a second camera which is on the other side of the device from the display and is directed away from that other side of the device, and the data processing module also uses the signals from the second camera when generating the data which is used in adapting the tone mapping function.
6. A method as claimed in claim 4 or 5, wherein the device is a portable data processing device in the form of a mobile telephone, a tablet computing device or a portable computer.
7. A method as claimed in claim 6, wherein the data processing module also processes signals from an orientation detecting module of the portable data processing device and the data processing module also uses the signals from the orientation detecting module when generating the data which is used in adapting the tone mapping function.
8. A method as claimed in any preceding claim, which is used to tone map a series of image frames of a video which is displayed on the screen of the device.
9. A method as claimed in claim 8, wherein at least a series of the frames is provided with preference data which indicates how the series of frames is to appear, and this preference data is used by the data processing module when generating the data which is used in adapting the tone mapping function.
10. A data processing device which includes a display screen, a camera which is directed forwardly away from the display, and a data processing module which is configured to tone map a high dynamic range image to be presented on the display, wherein the camera captures an image indicative of ambient light levels
experienced at a number of positions over the display, and the data processing module processes signals from the camera and generates data which is used in adapting a tone mapping function applied to the high dynamic range image so as to account for different ambient light conditions.
11. A data processing device as claimed in claim 10, wherein the data processing module takes into account different lighting conditions at different positions on the display when generating the data which is used in adapting the tone mapping function.
12. A data processing device as claimed in claim 10 or 11, wherein the camera captures images continually and the tone mapping is adapted dynamically, so that as the ambient lighting conditions change the tone mapping function changes.
13. A data processing device as claimed in claim 10, 11 or 12, in the form of a mobile telephone, a tablet computing device or a portable computer.
14. A data processing device as claimed in claim 13, wherein a second camera is on the other side of the device from the display and is directed away from that other side of the device, and the data processing module also uses the signals from the second camera when generating the data which is used in adapting the tone mapping function.
15. A data processing device as claimed in claim 13 or 14, wherein an orientation detecting module is provided in the portable data processing device and the data processing module also uses the signals from the orientation detecting module when generating the data which is used in adapting the tone mapping function.
16. A data processing device as claimed in any of claims 10 to 15, wherein the data processing module is adapted to use preference data when generating the data which is used in adapting the tone mapping function, the preference data being provided for a series of image frames of a video and indicating how the series of frames is to appear.
PCT/GB2014/050696 2013-03-08 2014-03-10 High dynamic range imaging systems WO2014135901A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1515749.8A GB2526478B (en) 2013-03-08 2014-03-10 High dynamic range imaging systems

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB1304221.3A GB201304221D0 (en) 2013-03-08 2013-03-08 High Dynamic Range Imaging Systems
GB1304221.3 2013-03-08

Publications (1)

Publication Number Publication Date
WO2014135901A1 true WO2014135901A1 (en) 2014-09-12

Family

ID=48189607

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2014/050696 WO2014135901A1 (en) 2013-03-08 2014-03-10 High dynamic range imaging systems

Country Status (2)

Country Link
GB (2) GB201304221D0 (en)
WO (1) WO2014135901A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011097060A2 (en) * 2010-02-04 2011-08-11 Microsoft Corporation High dynamic range image generation and rendering
WO2012030622A1 (en) * 2010-08-31 2012-03-08 Dolby Laboratories Licensing Corporation Ambient black level
US20120229526A1 (en) * 2011-03-11 2012-09-13 Calgary Scientific Inc. Method and system for remotely calibrating display of image data


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
MANTIUK, RAFAL; DALY, SCOTT; KEROFSKY, LOUIS: "Display Adaptive Tone Mapping", ACM Transactions on Graphics (Proc. of SIGGRAPH '08), vol. 27, no. 3, 2008, DOI: 10.1145/1399504.1360667 *


Also Published As

Publication number Publication date
GB2526478A (en) 2015-11-25
GB201515749D0 (en) 2015-10-21
GB201304221D0 (en) 2013-04-24
GB2526478B (en) 2019-06-12

