WO2010131167A1 - Display apparatus and method thereof - Google Patents

Display apparatus and method thereof

Info

Publication number
WO2010131167A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
characteristic
scaling
region
display
Prior art date
Application number
PCT/IB2010/052001
Other languages
English (en)
Inventor
Leo J. Velthoven
Guido T. G. Volleberg
Mark M. J. W. Mertens
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Priority to US13/320,052 priority Critical patent/US20120050334A1/en
Priority to JP2012510407A priority patent/JP2012527005A/ja
Priority to CN201080020806.7A priority patent/CN102428492B/zh
Priority to EP10726232A priority patent/EP2430611A1/fr
Publication of WO2010131167A1 publication Critical patent/WO2010131167A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4092 Image resolution transcoding, e.g. by using client-server architectures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/40 Analysis of texture
    • G06T7/41 Analysis of texture based on statistical description of texture
    • G06T7/42 Analysis of texture based on statistical description of texture using transform domain methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/40 Analysis of texture
    • G06T7/41 Analysis of texture based on statistical description of texture
    • G06T7/44 Analysis of texture based on statistical description of texture using image operators, e.g. filters, edge density metrics or local histograms
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person
    • G06T2207/30201 Face

Definitions

  • the invention relates to a display system and in particular, but not exclusively, to display systems for presenting one or more images in large size display windows.
  • a given display may be used for Standard Definition (SD) television, for High Definition (HD) television, for Quad Definition (QD) television, for very high resolution computer displays (e.g. gaming), or for low quality video sequences such as web based video content at low data rates or mobile phone generated video.
  • an HD television can upscale an SD video signal to the HD resolution of the display and present this upscaled version.
  • the presented image will in many scenarios be perceived to be of relatively low quality either due to the original source quality or due to quality degradations associated with the upscaling process.
  • the source content will often be the limiting factor for the perceived image quality.
  • a standard definition picture displayed on a 60 inch television will typically be perceived to be of relatively low quality.
  • an improved display system would be advantageous and in particular a system allowing increased flexibility, an improved perceived quality, an improved user experience, facilitated operation, facilitated implementation and/or improved performance would be advantageous.
  • the invention seeks to preferably mitigate, alleviate or eliminate one or more of the above mentioned disadvantages singly or in any combination.
  • a display apparatus for presenting an image
  • the display apparatus comprising: means for receiving an image to be displayed; analyzing means for performing a local image profile analysis on at least a first region of the image to determine a characteristic representing spatial variation of pixel values; scaling means for scaling at least a second region of the image in response to the pixel value spatial variation characteristic; and presentation means for presenting the scaled image.
  • the invention may allow an improved presentation of one or more images.
  • a geometric characteristic, such as a display window size, for an image may be adjusted to match the spatial variation characteristics of the image being presented.
  • the invention may in many embodiments allow an improved adaptation to a large number of different types of image signals and signal sources.
  • the approach may allow an adaptation of the display window size to the sharpness or quality of the image.
  • the invention may allow an optimization based not merely on a pixel resolution of the image but rather allows this optimization to depend on the actual quality of the image itself. For example, different images with the same resolution may be scaled differently depending on the actual sharpness of the image within that resolution.
  • the scaling may be different for images that are originally generated at the resolution and images that are upscaled to result in the specific resolution.
  • the invention may for example allow a displayed size of the image to be dependent on the (current or original) visual quality of the image.
  • the invention may in some embodiments allow the scaling (e.g. the size, or scaling algorithm kind) of the image to be adjusted to provide the desired perceived quality.
  • the image may be an image of a sequence of images, such as an image of a video signal.
  • the pixel values may for example be luminance and/or colour values.
  • the image may be a prescaled image.
  • the display apparatus may comprise a prescaler which scales an input image to a given resolution.
  • the prescaler may be adapted to scale different input images with different resolutions to the same predetermined/fixed resolution.
  • the analysis may accordingly be performed on an image with a known predetermined/fixed resolution, however with a particular visual quality.
  • the prescaled image may then be scaled in the scaling means to provide the desired characteristic (e.g. size) suitable for the specific pixel value spatial variation characteristic.
  • the predetermined/fixed resolution may specifically correspond to a maximum resolution of a display on which the image is presented.
  • the analysis means may comprise a prescaler that prescales the image to a predetermined/fixed resolution.
  • the analysis may accordingly be performed on an image with a known predetermined/fixed resolution.
  • the scaling may be performed on the image prior to the prescaling, i.e. to the image at the original resolution.
  • the adaptation of the scaling may accordingly take into account the difference between the resolution of the prescaled and the original image.
  • the analyzing means is arranged to perform a spatial frequency analysis of local spatial frequencies on at least the first region and to generate the pixel value spatial variation characteristic to comprise a spatial frequency characteristic.
  • This may provide particularly advantageous performance.
  • it may provide a low complexity yet accurate indication of a sharpness (and/or signal complexity and/or visual quality/correctness/beauty/detail) of the underlying image and may provide a particularly suitable parameter on which to base the optimization of the scaling of the image.
  • the spatial frequency analysis may be performed directly on the image or on a modified version of the image.
  • the spatial frequency analysis may be performed on an upscaled/prescaled version of the image.
  • the spatial frequency characteristics may be indicative of a spatial frequency of the image that will be presented if no scaling is applied.
  • the analyzing means is arranged to perform a sharpness analysis on at least the first region and to generate the pixel value spatial variation characteristic to comprise a sharpness characteristic.
  • This may provide a particularly advantageous performance and image presentation.
  • it may in many scenarios allow a size of the displayed image to be optimized for the specific characteristics of the image.
  • the analyzing means is arranged to perform a texture analysis on at least the first region and to generate the pixel value spatial variation characteristic to comprise a texture characteristic. This may provide a particularly advantageous image presentation and/or may facilitate analysis.
  • the analyzing means is arranged to perform a pixel value distribution analysis on at least the first region and to generate the pixel value spatial variation characteristic to comprise a pixel value distribution characteristic.
  • the distribution may for example be represented by a histogram.
  • the distribution may first be determined in one or more image segments.
  • the individual distributions may then be combined into a combined distribution with the pixel value distribution characteristic being determined as a characteristic of the combined distribution.
  • a characteristic may be determined for each segment distribution and the characteristics may then be combined.
  • the pixel value variance for each segment may be calculated and used to determine an average pixel value variance.
  • the analyzing means is arranged to perform a coding artifact analysis on at least the first region and to generate the pixel value spatial variation characteristic in response to a coding artifact characteristic.
  • the decoding artifact characteristic may be used to compensate another characteristic, such as the spatial frequency characteristic, and/or may be used directly to adapt the scaling.
  • the pixel value spatial variation characteristic may comprise the coding artifact characteristic.
  • the analyzing means is arranged to perform a motion analysis on at least the first region and to generate the pixel value spatial variation characteristic to comprise a motion characteristic.
  • the motion characteristic may specifically be a motion characteristic of one or more image objects.
  • the display apparatus further comprises means for performing a content analysis on at least the first region, and wherein the scaling means is arranged to scale the second region in response to a content characteristic.
  • the content analysis may specifically determine a content category for the image and the scaling may be modified in accordance with a predetermined bias for that content category.
  • the scaling comprises cropping.
  • the cropping may specifically be performed when the image variation characteristic meets a criterion.
  • the selection of which part to crop or retain may depend on the local spatial characteristics of the image.
  • the scaling is further dependent on a characteristic of a display window available for displaying the scaled image.
  • the characteristic of the display window available for displaying the scaled image may specifically be a size of the display window. This approach may be particularly advantageous in embodiments where the potential maximum display window may vary. For example, it may be highly advantageous in embodiments wherein the display is projected on different surfaces (wall, door, windscreen etc) that cannot be predicted in advance.
  • the scaling comprises resolution scaling to a resolution corresponding to a desired window size; this resolution scaling may typically be non-linear.
  • the analyzing means is arranged to determine an image object spatial profile characteristic for at least one image object (i.e. look at how the pixel values vary across the object, which object may result from segmentation, or object detection, etc.), and the scaling means is arranged to perform the scaling in response to the object spatial profile characteristic. This may provide advantageous image presentation and/or performance in many embodiments.
  • the image is an image of a video sequence of images and the analyzing means is arranged to perform a temporal analysis on the video sequence to generate a temporal pixel value variation characteristic, and wherein the scaling means is arranged to scale the second region in response to the temporal pixel value variation characteristic (e.g. static scenes may be shown larger than chaotic scenes with complex motion).
  • the presentation means is arranged to present the image in a sub-window of a display window, and the apparatus further comprises means for presenting another image in another sub-window of the display window.
  • This may provide advantageous image presentation and/or performance in many embodiments.
  • it may provide an improved user experience and may allow a flexible and dynamic presentation of various information to a user.
  • a method of presenting an image comprising: receiving an image to be displayed; performing a local image profile analysis on at least a first region of the image to determine a pixel value spatial variation characteristic; scaling at least a second region of the image in response to the pixel value spatial variation characteristic; and presenting the scaled image.
  • the optimal scaling will typically be done close to the actual display; however, it could also be done on a remote server, etc.
  • the image processing device may store the signal in a memory (e.g. IC, optical disk, ...) for later use, or (possibly after special encoding taking into account abnormal sizes or aspect ratios (or dividing it into different signals for several display means), such as with auxiliary signals for supplementing a main cutout) send the signal over a network, etc.
  • FIG. 1 is an illustration of an example of a display system in accordance with some embodiments of the invention.
  • FIG. 2 is an illustration of an example of a displaying of images in accordance with some embodiments of the invention.
  • FIG. 3 is an illustration of an example of a display system in accordance with some embodiments of the invention.
  • FIG. 4 is an illustration of an example of a displaying of images in accordance with some embodiments of the invention.
  • FIG. 5 shows a couple of exemplary scenarios of the general principle of scaling to produce optimal visual quality.
  • FIG. 1 illustrates an example of a display apparatus for presenting one or more images.
  • the display apparatus comprises an image receiver 101 which receives one or more images to be displayed.
  • the image receiver 101 receives a video signal comprising a sequence of images (e.g. with each image corresponding to a frame of the video signal).
  • the image receiver 101 may receive the image from any suitable source which may be an internal or external source.
  • the video/image signal may be locally stored or generated in the display apparatus or may be received from an external source.
  • the display system may receive a large variety of different signals including SD, HD and QD video signals, low resolution and quality (very low data rate) video sequences, high resolution pictures etc.
  • the image receiver 101 is coupled to an image analyzer 103 and a scaling processor 105.
  • the image analyzer 103 is further coupled to the scaling processor 105.
  • the image analyzer 103 is arranged to perform a local image profile analysis on at least a first region (this may be e.g. 20 NxM pixel blocks uniformly or content-based non-uniformly scattered over the image; the resulting characteristic may typically be an average, e.g. sharpness, but it may also be a more complex multidimensional description) of the image in order to determine a pixel value spatial variation characteristic.
  • the pixel value spatial variation characteristic is then fed to the scaling processor 105 which proceeds to scale at least one region of the image in response to the pixel value spatial variation characteristic.
  • the region which is scaled may be the same as the region on which the pixel value spatial variation characteristic is determined but need not be so.
  • the region being analyzed and/or the region being scaled may correspond to the entire image area of the image being processed.
  • the scaling processor 105 is coupled to a presentation controller 107 which is arranged to present the scaled image received from the scaling processor 105.
  • the presentation controller 107 is coupled to a display device 109 (which may be external to the display apparatus) and generates a suitable drive signal for the display device.
  • the display device 109 may be a very large display panel, such as a 60" or larger LCD or plasma display panel.
  • the presentation controller 107 may then generate a suitable display signal for the display panel.
  • the display device 109 may be a projector which is capable of projecting an image on an external surface, such as a wall or a reflective screen or (a part of) an object, and the presentation controller 107 may generate a suitable drive signal for the projector.
  • the display device 109 is arranged to display a large size picture.
  • the display device may be a display with a diagonal of several meters or more or may be a projector intended to project an image with a diagonal of several meters or more.
  • the display apparatus is arranged to receive content from a variety of different sources and with varying characteristics.
  • content with varying resolutions and varying image quality may be received.
  • the display apparatus may receive originally generated QD or HD content.
  • HD content generated by upscaling, SD content, low quality video sequences (e.g. from mobile phones), high resolution photos, text based image content with different font sizes and text amounts, etc.
  • the resolution of the received content may vary greatly but the characteristics of the image content may also vary even for the same resolution.
  • different signals may be received with a resolution of, say, 1440x720 pixels but with very different content characteristics.
  • one signal at this HD resolution may be received from an HD signal source that has directly generated the content at this resolution.
  • if the display device 109 is a panel display with a suitable size and resolution (e.g. a native resolution of 1440x720), this image will be presented with a high quality and will appear very clear and crisp.
  • the signal may be generated by upscaling an SD signal. Although such upscaling may provide acceptable results in many scenarios, the upscaling process is not ideal and may further introduce noise and artifacts.
  • the presented image will be of reduced quality and will appear as such to the viewer even though it is upscaled to the same resolution.
  • the 1440x720 signal may originate from a very low resolution original signal (e.g. a video captured by a mobile phone with a resolution of, say, 352 x 288).
  • the image quality of the upscaled 1440x720 signal will be relatively low and will present a clearly suboptimal image to the viewer. The impact of such quality variations is particularly problematic at large display sizes.
  • presenting a low source-quality image on a 60" television or on a several square meter projected display area will result in the image being perceived to be of low quality and will typically appear to not only be noisy but also to be unsharp and blurred. This will be the case despite the image being upscaled to a higher resolution.
  • upscaling may introduce noise and artifacts that are perceptible in the presented image.
  • upscaling techniques create samples (pixels) at intermediate locations between the existing samples. The created samples have a certain correlation to the original pixels and are therefore not providing the same sharpness information as a source signal at the higher resolution.
  • the generated additional pixels tend to become more correlated thereby further reducing the sharpness perception.
  • the noise in the original signal is now extended to a plurality of pixels and is therefore displayed as larger noise entities (due to the upscaling and the display on a larger screen size), and therefore becomes more visible.
  • whereas the presented image for input material of sufficient quality (e.g. an original HD source signal) will typically be perceived to be of high quality, the presented images for source material of lower quality will be perceived to be of substantially lower quality and may indeed be perceived to be of unacceptable quality (even if high quality upscaling has been applied).
  • the inventors have realized that the perceptual quality is highest when the frequency content of the displayed image is close to (or even higher than) the maximum resolvable resolution of the human eye. Indeed, the inventors have realized that rather than merely presenting an image with characteristics dependent on the display system or the resolution of the image, it can be advantageous to scale at least some of the image depending on local pixel value spatial variations in the image. Indeed, such an approach may be used to directly control the spatial frequencies and thus sharpness of the presented images.
  • the image analyzer 103 may perform a local image profile analysis on e.g. the whole image to characterize the pixel value variations.
  • the local pixel value variations may be very substantial (sharp) at some places (e.g. corresponding to sharp edges). This sharpness may be reflected by a high local variability in various neighborhoods (e.g. within an area of say N pixel diameter both very light and very dark pixels can be found at least some places in the picture).
  • the spatial frequency content of the pixel values may cover most of the available spectrum (for the spatial sample frequency (corresponding to the resolution)).
  • in contrast, signals of the same resolution but with lower image quality (e.g. resulting from upscaling) will exhibit less pronounced local pixel value variations, with spatial frequency content that does not exploit the full available spectrum.
  • the image analyzer 103 performs an image analysis to determine a pixel value spatial variation characteristic that may specifically include one or more of the above mentioned values.
  • the analysis may take into account such factors as viewing geometry (how far is the viewer from the display) and viewer preferences (some viewers like the blurry look of pictures, whereas others are critical), in which latter case the scaler will typically have access to a memory storing the viewer preferences, or a real-time user input means.
  • the pixel value spatial variation characteristic is then fed to the scaling processor 105 which proceeds to scale the received image in dependence on the pixel value spatial variation characteristic.
  • the scaling may for example be a simple adjustment of the size of the image when being presented.
  • the scaling processor 105 may simply scale the signal by adjusting the size at which it will be displayed. This may e.g. be achieved by controlling the optics of the projector such that the image is enlarged or reduced. This may effectively correspond to changing the pixel size of the presented image.
  • resolution scaling may be applied. For example, if the projector has a resolution of 1920x1080 pixels and is arranged to display a full resolution image in a corresponding display window of, say, 3m by 1.7m, the scaling processor 105 may modify the resolution of the image to correspond to a smaller size depending on the pixel value spatial variation characteristic.
  • the scaling processor 105 may use the full available resolution, i.e. the 1920x1080 pixels. However, for a low quality signal (e.g. the pixel value spatial variation characteristic indicating a low variation), the scaling processor 105 may only use a sub-window of the full available resolution. For example, for a low quality signal, the scaling processor 105 may generate an image with a resolution of 480x270 and position it in the center of the full 1920x1080 image. Accordingly the projector will only present the image in a window size of 75 cm by 42 cm. However, at this lower display size the image will be perceived to be of relatively higher quality.
  • noise and artifacts are scaled to be less perceptible thereby compensating for the likelihood of these being more prevalent.
  • the resolution of the presented image will be much closer to (or even above) the maximum resolution that the human eye can resolve and accordingly the image will be perceived to be substantially sharper.
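  • As a rough illustration of this kind of sub-window presentation (following the 480x270-inside-1920x1080 example above), the sketch below, in Python with NumPy, downscales an image and centres it in an otherwise black full-resolution frame; the function name and the nearest-neighbour resampling are simplifying assumptions, not the patent's scaler.

```python
import numpy as np

def scale_into_subwindow(image, full_shape=(1080, 1920), factor=0.25):
    """Downscale 'image' by 'factor' and centre it in a black full-resolution frame.

    Hypothetical sketch of the sub-window presentation described above (e.g. a
    480x270 window inside a 1920x1080 frame for factor=0.25); nearest-neighbour
    resampling is used only for brevity, not as the intended scaler.
    """
    full_h, full_w = full_shape
    sub_h, sub_w = int(full_h * factor), int(full_w * factor)

    # Nearest-neighbour resample of the input to the sub-window resolution.
    src_h, src_w = image.shape[:2]
    rows = np.arange(sub_h) * src_h // sub_h
    cols = np.arange(sub_w) * src_w // sub_w
    sub = image[rows][:, cols]

    # Place the sub-window in the centre of an otherwise black output frame.
    out = np.zeros((full_h, full_w) + image.shape[2:], dtype=image.dtype)
    top, left = (full_h - sub_h) // 2, (full_w - sub_w) // 2
    out[top:top + sub_h, left:left + sub_w] = sub
    return out
```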
  • the scaling processor 105 performs resolution scaling to a resolution that corresponds to a desired window size.
  • the output signal from the scaling processor 105 has a fixed resolution corresponding to the full display window (e.g. full projected area or full resolution of large size display).
  • the scaling processor 105 then proceeds to perform scaling of an input image (e.g. of a video signal) to result in an image having a resolution that will give the desired display window size when presented by the display device.
  • This resolution scaling may include an upscaling from a lower resolution (e.g. an SD resolution) to a higher resolution that corresponds to a specific display window.
  • This upscaling may include the use of advanced upscaling techniques as will be known to the skilled person.
  • the resolution scaling may also include a down scaling, for example if the desired window size corresponds to an image resolution that is lower than the resolution of the received image.
  • the received image signal may already be at a resolution corresponding to the resolution of the display device and accordingly the scaling of the scaling processor 105 may either be no change or be a down-scaling.
  • the scaling process may not simply correspond to a direct rescaling of the input image but may apply very advanced localized and flexible scaling approaches. Also, the scaling may not be applied to the whole image but may be applied to one or more regions of the image. Indeed, the scaling processor 105 may adjust the size of the presented image by cropping the image in dependence on the pixel value spatial variation characteristic.
  • the scaling process may include an image analysis identifying foreground image objects and a background.
  • the background may then be upscaled to the desired window size while the image object is maintained at the same resolution and inserted in the upscaled background.
  • scaling may include any process or algorithm that affects the size of the image when displayed by the display device 109.
  • the scaling processor 105 may not modify the image in any way and may always provide the unmodified image signal to the presentation controller 107.
  • the scaling processor 105 may generate a control signal which is fed to the optics of the projector. The control signal may then control the optics to provide a displayed image of the appropriate size.
  • the display system of FIG. 1 automatically adapts the display window in which the image (or part of the image) is displayed in dependence on the pixel value spatial variation characteristic.
  • the pixel value spatial variation characteristic may be indicative of the quality or sharpness of the image content, this allows an automatic adaptation of the presented image to provide an improved user experience and an improved perception of quality.
  • the image receiver 101 may comprise a prescaler which may perform a prescaling of the input signal.
  • the input image may be prescaled by a fixed factor (e.g. it may be upscaled by a predetermined upscale process, such as a 4 times linear upscaler). This may e.g. be appropriate when the input signal is an SD signal (or lower) and the display is a QD display.
  • the image receiver 101 may apply a prescaling which immediately upscales the image to a resolution closer to that which corresponds to a full display window size.
  • the image analyzer may then process the prescaled image in order to determine the pixel value spatial variation characteristic, e.g. by evaluating the spatial frequency components.
  • the prescaled image may be used directly. However, if the high frequency content is too low (indicating that the image will appear unsharp), the scaling processor 105 further scales the prescaled signal. In particular, the scaling processor 105 may typically downscale the prescaled image to a lower resolution.
  • the scaling processor 105 may be used as a prescaler. E.g. it may first perform a predetermined scaling to generate a first scaled image. This image may then be analyzed by the image analyzer 103 to generate a pixel value spatial variation characteristic. Based on the pixel value spatial variation characteristic, a second scaling may be applied to the image (either to the original image or to the prescaled signal). The resulting scaled signal may then be analyzed. Such a process may e.g. be iterated until a stop criterion is met (e.g. that the pixel value spatial variation characteristic meets a criterion indicating that the perceived image quality will be acceptable).
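  • A minimal sketch of such an iterated prescale-analyse-rescale loop is given below (Python; the analysis and rescale callables and the quality threshold are placeholders for whichever characteristic and scaler a given embodiment uses):

```python
def iterative_scaling(image, analyse, rescale, quality_threshold=0.6, max_iterations=5):
    """Iterate analyse/rescale steps until a stop criterion is met.

    'analyse' returns a pixel value spatial variation characteristic in [0, 1];
    'rescale' returns a version of the image scaled towards a smaller display
    window. Both callables and the threshold are illustrative assumptions.
    """
    scaled = image
    for _ in range(max_iterations):
        characteristic = analyse(scaled)
        if characteristic >= quality_threshold:   # perceived quality deemed acceptable
            break
        scaled = rescale(scaled)                  # e.g. downscale towards a smaller window
    return scaled
```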
  • the prescaler may apply an adaptable prescaling to the input image such that this is scaled to a predetermined or fixed resolution. For example, all input images may first be upscaled to the full resolution of the display device 109. The resulting image is then analyzed by the image analyzer 103. If this results in an acceptable pixel value spatial variation characteristic (according to any suitable criterion), the prescaled image may be fed to the presentation controller 107. Otherwise, the scaling processor 105 may perform a downscaling of the prescaled signal to result in a reduced size display window.
  • the scaling processor 105 may in such cases operate on the original image rather than the prescaled image, i.e. an upscaling with a scale factor less than the prescaling may be applied.
  • Such an example may correspond to the prescaler being part of the image analyzer 103.
  • the image analyzer 103 may specifically be arranged to perform a spatial frequency analysis of local spatial frequencies on at least one region of the image. Based on the frequency analysis, the pixel value spatial variation characteristic is then generated to comprise (or consist of) a spatial frequency characteristic.
  • the frequency characteristic may specifically be an indication of the spatial frequencies that will result from the image being displayed within the full window available to the display device.
  • the scaling may be adapted such that a given frequency characteristic for the displayed image is maintained relatively constant for varying frequency characteristics of the input signal.
  • the scaling processor 105 may scale the image such that e.g. a maximum frequency of the output signal being fed to the presentation controller is substantially constant.
  • the image analyzer 103 analyses the input image to determine a frequency characteristic.
  • the frequency characteristic may specifically represent a frequency distribution characteristic and/or a frequency content characteristic of the image received from the image receiver 101.
  • the frequency characteristics may represent spatial frequency content relative to the spatial sampling frequency (e.g. the resolution).
  • the image analyzer 103 may convert the received image to the frequency domain, for example using a two dimensional Fast Fourier Transform (FFT). It may then evaluate the resulting frequency distribution to generate a suitable frequency characteristic.
  • the frequency characteristics may be generated to indicate a highest spatial frequency or e.g. the frequency corresponding to the 95th percentile. This frequency will be given as a fraction of the sample frequency and may e.g. be scaled to a value between 0 and 1.
  • a value of 1 may be used to indicate that the highest frequency (or the 95th percentile frequency) is the same as half the spatial sample frequency and a value of 0 may be used to represent that the image only contains a zero frequency (i.e. it is just a single colour/shade).
  • a high value is thus indicative of a high degree of high frequency components, which is indicative of a high degree of sharpness and especially of sharp transients being present in the image, whereas a low value is indicative of a lack of high frequency components and thus of a low degree of sharpness (as the transitions do not exploit the full resolution available in the input image).
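  • A minimal sketch of such a frequency analysis is given below (Python with NumPy; the function name, the use of a radial 2D FFT spectrum and the 95th-percentile choice are illustrative assumptions, not the patent's prescribed method). It returns a value between 0 and 1, where 1 corresponds to half the spatial sample frequency.

```python
import numpy as np

def frequency_characteristic(gray, percentile=95.0):
    """Normalised radial frequency below which 'percentile' percent of the
    non-DC spectral energy lies; 1.0 corresponds to half the spatial sample
    frequency (Nyquist), 0.0 to a single-colour image. Illustrative sketch.
    """
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray))) ** 2

    h, w = gray.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))                        # cycles/pixel, about -0.5..0.5
    fx = np.fft.fftshift(np.fft.fftfreq(w))
    radius = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2) / 0.5    # 1.0 at Nyquist

    order = np.argsort(radius, axis=None)                          # sort bins by radial frequency
    radii_sorted = radius.ravel()[order]
    energy = spectrum.ravel()[order]
    energy[0] = 0.0                                                # ignore the DC component
    cumulative = np.cumsum(energy)
    if cumulative[-1] == 0.0:
        return 0.0                                                 # flat image: zero frequency only

    idx = np.searchsorted(cumulative, percentile / 100.0 * cumulative[-1])
    return float(min(radii_sorted[idx], 1.0))
```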
  • the image analyzer 103 may detect specific image areas or regions in which the analysis is performed.
  • the image analyzer 103 may for example include an edge detector and specifically apply the frequency analysis to one or more regions around such detected edges.
  • encoded digital signals typically provide frequency domain representations of individual blocks (such as 8x8 pixel blocks).
  • an encoded image signal may directly provide a frequency domain representation of each 8x8 block and the image analyzer 103 may extract these frequency domain values. It may then proceed to generate a histogram over the frequency values for the individual blocks resulting in a frequency distribution for the image represented by 64 frequency bins with a value for each bin representing the relative strength of frequencies within the frequency interval covered by the bin.
  • the scaling processor 105 then proceeds to scale the image such that it will provide a displayed image that has a suitable frequency content.
  • the relation between the size of the image being output from the scaling processor 105 and the displayed image may be known. For example, for a 3 m by 1.7 m display panel with a 1920x1080 resolution, each pixel will have a size of approximately 0.15 cm by 0.15 cm.
  • a nominal viewing distance may be assumed, allowing a desired frequency of the output signal of the scaling processor 105 to be determined that corresponds to the maximum frequency the eye can resolve.
  • a desired, say, 95% or maximum frequency relative to the sample rate of the signal may be known. If the frequency characteristic indicates that a full scale signal will be sufficiently close to this frequency, the output image will be generated to use the full available display window.
  • if the frequency characteristic indicates that the image only has a frequency content reaching half the desired frequency, this is compensated by scaling the signal.
  • for example, the horizontal and vertical dimensions of the displayed image may be reduced by a factor of two, resulting in the displayed image having the same relative sharpness but being much smaller.
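  • As a rough illustration of this compensation, the sketch below (Python; the function name and the desired-characteristic value are hypothetical) derives a linear scale factor for both display window dimensions from the measured frequency characteristic, so that a characteristic at half the desired value halves each dimension as in the example above.

```python
def display_window_scale(freq_characteristic, desired_characteristic=0.9):
    """Linear scale factor for each display window dimension.

    Illustrative sketch only: a frequency characteristic at half the desired
    value halves both the width and the height of the display window, while a
    characteristic at or above the desired value uses the full window.
    """
    return min(freq_characteristic / desired_characteristic, 1.0)


# Example: freq_characteristic = 0.45 with desired 0.9 gives a scale of 0.5,
# so a 3 m by 1.7 m display window shrinks to roughly 1.5 m by 0.85 m.
```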
  • the scaling processor 105 may be arranged to scale the image such that the scaled image has a pixel value spatial variation characteristic, and specifically a frequency characteristic, that meets a criterion.
  • different images will be scaled such that the scaled image meets a perceived quality criterion.
  • the quality may be kept relatively constant with the size of the displayed image being modified.
  • a complex mathematical relationship between size and variation characteristic may be used and/or the scaling processor 105 may implement a look-up table that correlates the size and the variation characteristic.
  • Such functions or look-up tables may furthermore be configurable allowing a user to adapt the operation to the specific display scenario. E.g. the function or values may be adjusted to reflect a distance between a projector and the display surface and/or between the display surface and the viewer.
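  • Where a look-up table is used, a minimal sketch could simply interpolate between configurable anchor points; all values below are illustrative assumptions, not taken from the patent.

```python
import numpy as np

# Hypothetical, user-configurable table relating the pixel value spatial
# variation characteristic (0..1) to a relative display window size (0..1).
CHARACTERISTIC_POINTS = [0.0, 0.25, 0.5, 0.75, 1.0]
WINDOW_SIZE_POINTS = [0.2, 0.35, 0.6, 0.85, 1.0]

def window_size_from_lut(characteristic):
    """Interpolate the relative window size for a given variation characteristic."""
    return float(np.interp(characteristic, CHARACTERISTIC_POINTS, WINDOW_SIZE_POINTS))
```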
  • the image analyzer 103 is arranged to perform a sharpness analysis and to generate the pixel value spatial variation characteristic to comprise a sharpness characteristic.
  • it will be appreciated that the frequency analysis described above is an example of a sharpness evaluation and that the frequency characteristic is indeed indicative of a perceived sharpness. Indeed, higher frequencies are indicative of faster transitions and thus of increased sharpness.
  • the image analyzer 103 may detect edges and specifically evaluate characteristics of such edges. For example, the rate of transition may be determined and averaged for all identified edges.
  • the selection processor 205 may select different scalings. Thus, low gradients (corresponding to blurred/soft edges) may result in a small display window and high gradients may result in larger window sizes.
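  • A simple sketch of such a gradient-based sharpness measure is shown below (Python with NumPy; the percentile-based edge selection is an illustrative assumption rather than the patent's edge detector). Blurred, soft edges yield a low value and crisp edges a high one, which could then be mapped to a window size as described above.

```python
import numpy as np

def edge_sharpness(gray, edge_fraction=0.05):
    """Average gradient magnitude over the steepest 'edge_fraction' of pixels.

    Illustrative sharpness characteristic: low gradients (blurred/soft edges)
    give small values, high gradients give large values.
    """
    gy, gx = np.gradient(gray.astype(float))        # per-axis intensity gradients
    magnitude = np.hypot(gx, gy)
    threshold = np.quantile(magnitude, 1.0 - edge_fraction)
    edges = magnitude[magnitude >= threshold]       # keep only the strongest transitions
    return float(edges.mean()) if edges.size else 0.0
```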
  • Fig. 5 shows a couple more examples of the generic approach of looking at sharpness or frequency content. One would then typically look at signal complexity.
  • Fig. 5a shows an easy-to-scale profile, namely a sharp inter-object edge. Linear scaling typically blurs this edge until it becomes unacceptable; non-linear scaling can mitigate this.
  • Fig. 5b is an example of a profile which may at certain instances be very scalable too, namely a near constant pixel value profile. This may occur e.g. as the sky in between trees. In that scenario the sky actually is undetailed, and will be so in different scales.
  • Fig. 5c is a microprofile of a texture comprising hairs (each hair having a variable illumination profile), whereas Fig. 5d is a basic texture element of a net texture, which is composed of small sharp edges 504 and a near constant interior 503.
  • the hairs may not scale well with another used scaler, due to both having blurred edges and an incorrect interior (typically lacking a realistic pixel variation profile, usually micro-detail). If however the hairs look good enough at the original size, another scaler can be used (like the method of Damkat EP08156628) which retains the same information complexity after scaling. Information complexity typically has to do with different scales, such as object boundaries and the pixel variations inside (compare to the blurry interiors of bad compression).
  • Texture analysis may look at the size of the grains and then the complexity of the interiors.
  • Fig. 5f shows another example of generically looking at the frequencies rather than looking exactly at the values of the different FFT bins. It indicates that an object region may be too cartoony because there are long runs of pixel values 502 which do not have enough pixel variation.
  • some scalings may be determined to be of good enough quality whereas others will not, meaning the picture has to be displayed smaller.
  • the display device 109 may be a projector which can present a display on a wall of a room, such as living room. Such an example is illustrated in FIG. 2 where the projector may display the image to cover an entire wall, i.e. the maximum display window corresponds to a full wall.
  • the projector may specifically be a high resolution projector, having e.g. a resolution of 1920x1080 pixels.
  • the display apparatus of FIG. 1 may be used to display different image signals on the wall.
  • the input signal may originate from a conventional television signal that provides a standard SD video signal.
  • the SD video signal may be upscaled to an HD resolution (e.g. by the display apparatus or prior to being provided to the display apparatus), but this will typically result in a signal of reasonable but not extremely high quality. Therefore, the analysis by the image analyzer 103 will indicate e.g. a frequency content which does not fully exploit the available spatial frequency range. Accordingly, the scaling processor 105 will scale the signal such that an image of a reasonable size is provided (ref. FIG. 2a). This size is specifically optimized to provide a desired trade-off between perceived quality and image size.
  • the display apparatus may be used to present a high resolution still image.
  • the image may e.g. be provided directly in the resolution of 1920x1080 pixels. This image will typically fully utilize the available resolution and may have frequency content that represents the full available frequency range. Accordingly, the scaling processor 105 will scale the image such that it is displayed in the full window, i.e. such that it covers the entire wall (ref. FIG. 2b).
  • the display system may be used to watch a movie originating as an HD movie at a resolution of 1440x720. Again, this content may be upscaled to the native resolution of the display system of 1920x1080 pixels. However, due to the improved quality of the original signal, the sharpness of the upscaled image is likely to be significantly better than for the SD signal but not as high as for the high resolution photo. This will be reflected in the frequency characteristic generated by the image analyzer 103, and accordingly the display system can scale the signal to present the content in a display window which is in between the size used for the television signal and the size used for the high resolution photo (ref. FIG. 2c).
  • the image receiver 101 may receive three different signals that are all at the same resolution of 1920x1080 pixels. However, as they originate from very different sources, the actual image quality or sharpness varies significantly within this fixed resolution framework.
  • the display system can detect this automatically and can adjust the size of the presented image to provide the optimized trade-off between the perceived quality and the image size. Furthermore, this optimization may be performed automatically.
  • the display apparatus may be implemented as part of a portable device that can be fed a variety of signals and which can be used in different applications. For example, a user may enter a distance to the display surface on which the image is projected and a distance of a viewer. The portable projector may then be arranged to automatically calculate and apply the scaling for different image qualities.
  • the display system may be used as part of a consumer device or appliance to present additional information or instructions.
  • a washing machine 301 may comprise a projector 303 that can project an image 305 on a wall 307 behind and above the washing machine 301.
  • the system may for example be used to present a context sensitive instruction manual or decision tree to the user.
  • the washing machine 301 may have a range of images that can be presented to assist the user. These images may predominantly comprise text based information but may also comprise images or pictograms.
  • the images may all be stored at the same resolution but may vary substantially, i.e. some images may contain a small amount of large font text, others may contain simple images, yet others may contain large amounts of small font text etc.
  • the display apparatus may evaluate each image that is displayed to determine the sharpness or frequency content and may scale the image accordingly. For example, a page/image containing a large amount of text may be presented as a relatively large picture 305 on the wall 307. However, a page/image containing a small amount of text may be presented as a relatively small picture 305 on the wall 307.
  • the system may automatically adapt the image size to match the content of the specific image/page. For example, the size of the image may automatically be adapted to provide a substantially equal font size but with varying image sizes. It should be noted that this may be achieved without changing the resolution of the projected image, e.g. by the scaling processor 105 directly controlling the optics of the projector.
  • the scaling processor 105 may in some embodiments or scenarios not merely resize the image but may alternatively or additionally select one or more regions of the image.
  • the scaling processor 105 may be arranged to crop the image in certain scenarios.
  • the frequency analysis may indicate that the image should be presented in a display window which is larger than the display window which is available to the display apparatus.
  • the scaling processor 105 may proceed to resize the image to the desired size (e.g. one that corresponds to an appropriate frequency content of the displayed image) and to crop the resulting image to the size of the available display window.
  • some pages/images may have small amounts of large font text and this may be scaled to be presented in a window which is smaller than the full display window.
  • Other pages/images may have larger amounts of smaller font text that are still possible to be viewed when presented in the maximum display window.
  • other pages may comprise large amounts of text written with a small font.
  • these pages have a very high concentration of high frequencies indicating that there is a lot of detail in the picture.
  • the scaling processor 105 may evaluate that even if the image is presented in the maximum display window, it will not be sufficient to provide a sufficiently legible text to the user. Accordingly, the scaling processor 105 may not only re-size but may also crop the image.
  • the image may be scaled by a cropping which is dependent on the frequency characteristic.
  • the scaling may comprise cropping when the pixel value spatial variation characteristic meets a criterion. It will also be apparent from this description that the scaling can be dependent on a characteristic of the display window which is available for displaying the scaled image. Thus, in the specific example, the image is cropped if it cannot be displayed appropriately within the maximum size of the potential display window.
  • the projector 303 of the washing machine 301 may project the image on a free surface of the washing machine itself.
  • a flat white surface of the washing machine 301 may be used to display the image.
  • the available surface may be strictly limited by the various visual features of the washing machine 301, and therefore only a relatively small surface area may be available, resulting in frequent cropping.
  • the display system may be used in a vehicle such as a car.
  • the display device 109 may comprise a projector which projects the image on the windscreen of the car.
  • part of the windscreen may be kept clear for the driver whereas the image is projected on the passenger side of the windscreen.
  • the size of the image is adjusted depending on the frequency/sharpness characteristic such that the image is no bigger than required or desired in view of the information content.
  • the display system can also take into account how much of the windshield should be kept free for allowing the outside to be viewed, i.e. to ensure that the driver can see sufficient amounts of the traffic outside.
  • a calculation of the image size may be performed as a function of the sharpness/frequency parameter as well as a second parameter indicating a potential display window.
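  • A small sketch of such a calculation is given below (Python; names and the aspect-ratio handling are assumptions). The relative size suggested by the sharpness/frequency parameter may exceed 1.0 when the analysis calls for a window larger than the one actually available, in which case cropping is flagged, as described for the text pages above.

```python
def plan_display_window(desired_scale, image_aspect, max_width, max_height):
    """Return (width, height, crop_needed) for the displayed image.

    desired_scale: relative size suggested by the sharpness/frequency analysis,
                   where 1.0 means "use the full available window width" and
                   values above 1.0 mean the image ideally needs more space.
    image_aspect:  width / height of the image to be shown.
    Illustrative only; a real system would also decide where to crop.
    """
    desired_w = desired_scale * max_width
    desired_h = desired_w / image_aspect

    crop_needed = desired_w > max_width or desired_h > max_height
    return min(desired_w, max_width), min(desired_h, max_height), crop_needed
```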
  • an edge sharpness analysis or a frequency analysis is merely an example of the local image profile analysis that can be performed to determine the pixel value spatial variation characteristic. Indeed, any analysis which provides an indication of how much or how fast the pixel values vary spatially between pixels that are relatively close together can be used.
  • the pixel value spatial variation characteristic does not merely indicate the spread of pixel values in the image as a whole but rather how they vary on a local scale. For example, an image wherein half of the image is very dark and the other half is very bright may have a large pixel value variation.
  • however, the local pixel variation may be relatively low as there may be virtually no variations within each of the two halves (i.e. the variation occurs only on the edge). In contrast, an image with a lot of detail (e.g. a "busy" picture) will typically have a high local pixel value variation as the pixel values vary substantially even within local areas. It will be appreciated that for an edge detection sharpness approach and a spatial frequency approach the local characteristic of the pixel variations is automatically taken into account.
  • the pixel variation analysis may be localized by analyzing pixel variations within image sections or regions. For example, the image may be separated into a number of segments or regions and the pixel variation within each segment or region may be evaluated. The pixel value spatial variation characteristic may then be generated by analyzing or combining the results for the different segments.
  • the pixel value spatial variation characteristic is determined in response to a pixel value distribution characteristic that reflects how pixel values are distributed.
  • the pixel value may be a luminance value which can take on the values from 0 to 255.
  • the image may be divided into a number of segments and for each segment a histogram is generated for the luminance value.
  • the histogram is then used to generate a variance for the pixel value in each segment.
  • the variances for the different segments may then be combined e.g. by determining an average variance for the segments. This average variance can then be used as the pixel value spatial variation characteristic.
  • an image with a very bright half and a very dark half will result in a relatively low average variance as most segments will fall either within the bright half or the dark half and therefore have relatively low variance.
  • in contrast, for a detailed image most segments will have a relatively high variance, thereby resulting in a high average variance for the picture as a whole.
  • this average variance is indicative of the image containing a relatively high amount of local pixel variations thereby being indicative of a detailed image that can advantageously be presented in a relatively large display window.
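  • A minimal sketch of this segment-variance approach is shown below (Python with NumPy; the block size is an assumption). A half-bright/half-dark image scores low because most blocks are internally uniform, whereas a detailed "busy" image scores high.

```python
import numpy as np

def local_variance_characteristic(gray, block=32):
    """Average per-segment luminance variance, used as a local pixel value
    spatial variation characteristic. Block size is an illustrative choice.
    """
    h, w = gray.shape
    variances = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            variances.append(gray[y:y + block, x:x + block].astype(float).var())
    return float(np.mean(variances)) if variances else 0.0
```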
  • the image analyzer 103 may be arranged to perform a texture analysis on one or more regions.
  • the pixel value spatial variation characteristic may then be determined in response to a texture characteristic.
  • the surface area of an image area may have a texture with substantial detail (e.g. texture grain profiles) and relatively sharp pixel value variations. This is indicative of a high degree of high frequency content and thus a sharp image that can be presented with a relatively large size.
  • texture analysis may for example be performed as a local/patch based variance computation with binning into classes.
  • textured image object surfaces may be identified. Each surface may be divided into a number of segments and for each segment the pixel values may be divided into bins of a histogram. The variance of the segment may then be determined. An average variance for all the textured surfaces may then be calculated and may be used to determine the texture characteristics to indicate a level of texture. For example, a low variance may indicate a flat surface/image (virtually no texture), a mid range variance may indicate that there is some limited texture, and a high variance may indicate that there is a high degree of texture.
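  • Building on the same patch-variance idea, a sketch of the binning into texture classes could look as follows (Python with NumPy; the block size and both thresholds are illustrative assumptions):

```python
import numpy as np

def texture_level(gray, block=16, low=50.0, high=500.0):
    """Classify the amount of texture from patch variances, binned into three
    classes as sketched above: 0 = flat, 1 = some texture, 2 = highly textured.
    """
    h, w = gray.shape
    patch_vars = [
        gray[y:y + block, x:x + block].astype(float).var()
        for y in range(0, h - block + 1, block)
        for x in range(0, w - block + 1, block)
    ]
    mean_var = np.mean(patch_vars) if patch_vars else 0.0
    if mean_var < low:
        return 0
    return 1 if mean_var < high else 2
```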
  • the image analyzer 103 may be arranged to perform a motion analysis of the picture.
  • the pixel value spatial variation characteristic may then be generated in response to this motion characteristic.
  • the image being processed is a frame of an encoded video signal
  • motion data may be directly available and can simply be derived by extracting the relevant motion data from the encoded bitstream.
  • a temporal analysis of a plurality of frames may be performed to detect image objects and characterize their movement within the image. It will be appreciated that many such techniques will be known to the skilled person.
  • the size of the display window used for presenting the image may then be adjusted in dependence on the motion characteristic. For example, in some embodiments a high degree of motion may be indicative of a high degree of individual image objects moving fast within the image. This may be more suitable for a presentation in a relatively large display window in order to allow the user to discern and follow the individual image objects.
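  • When no encoded motion vectors are available, a crude temporal stand-in could be a frame-difference measure such as the sketch below (Python with NumPy; the threshold is an assumption); the resulting fraction of changed pixels can then bias the display window size as described above.

```python
import numpy as np

def motion_characteristic(frame_a, frame_b, threshold=12.0):
    """Fraction of pixels whose luminance changes noticeably between two
    consecutive frames; an illustrative stand-in for the motion analysis above.
    """
    diff = np.abs(frame_b.astype(float) - frame_a.astype(float))
    return float((diff > threshold).mean())
```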
  • the image analyzer 103 is arranged to perform a decoding artifact analysis and to generate the pixel value spatial variation characteristic in response to the decoding artifact characteristic.
  • the image signal (whether a static or moving image) will be digitally encoded.
  • a digital photo may be encoded in accordance with a JPEG encoding algorithm and a video signal may be encoded in accordance with an MPEG encoding algorithm.
  • Such encoding typically includes a substantial data compression and may result in a noticeable difference between the decoded signal and the original signal. Such differences may be considered coding artifacts and will be present in the decoded image. For example, for relatively high compression ratios, media signals may often experience a certain "blockiness" resulting from the encoding of the images being based on encoding of individual blocks.
  • the image analyzer 103 may specifically be arranged to evaluate such coding artifacts. For example, the image analyzer 103 may divide the image into 8 x 8 pixel blocks corresponding to the coding blocks. It may then proceed to determine characteristics of pixel variations within the blocks compared to pixel variations between the blocks. If this analysis indicates that the pixel variations between blocks are relatively high compared to the pixel variations within the blocks, this is likely to be due to a relatively high degree of coding block artifacts (or "blockiness").
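  • A minimal sketch of such a block-boundary analysis is shown below (Python with NumPy; the ratio-of-differences metric is an illustrative stand-in for whatever blockiness measure an embodiment uses):

```python
import numpy as np

def blockiness(gray, block=8):
    """Ratio of average absolute pixel differences across 8x8 block boundaries
    to those inside blocks; values well above 1 suggest visible coding block
    artifacts ("blockiness"). Illustrative sketch only.
    """
    g = gray.astype(float)
    col_diff = np.abs(np.diff(g, axis=1))              # horizontal neighbour differences
    row_diff = np.abs(np.diff(g, axis=0))              # vertical neighbour differences

    at_col_boundary = (np.arange(col_diff.shape[1]) % block) == block - 1
    at_row_boundary = (np.arange(row_diff.shape[0]) % block) == block - 1

    boundary = np.concatenate([col_diff[:, at_col_boundary].ravel(),
                               row_diff[at_row_boundary, :].ravel()])
    interior = np.concatenate([col_diff[:, ~at_col_boundary].ravel(),
                               row_diff[~at_row_boundary, :].ravel()])
    return float(boundary.mean() / (interior.mean() + 1e-9))
```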
  • the evaluation of coding artifacts may be used in different ways. In some scenarios it may be used directly as the pixel value spatial variation characteristic, i.e. it may directly be used to adjust the scaling of the image. This may allow the display window to be sized such that it matches the degree of encoding artifacts, i.e. such that these do not degrade the perceived quality too much. For example, if there is a high degree of coding artifacts, a smaller display window size may be used than if there is not a high degree of coding artifacts. However, alternatively or additionally, the coding artifact may be used to modify the analysis of other parameters.
  • the image may first be compensated to reduce or remove coding artifacts.
  • the spatial frequency analysis may be limited to be performed within encoding blocks and to not extend between encoding blocks. This may allow the frequency analysis to represent the detail and sharpness of the underlying image without being unduly affected by the coding artifacts. Indeed in many scenarios coding artifacts may actually introduce additional high frequency spatial components which may be misconstrued as indications of increased sharpness. Therefore, the coding artifacts mitigation may prevent that a lower quality image is interpreted as being sharper than the corresponding image without coding artifacts.
  • the scaling may be dependent on the frequency characteristic determined after the coding artifact mitigation.
  • the scaling may also be dependent on the coding artifact analysis in itself. For example, a full display size may only be used if the frequency characteristic is indicative of a sufficient degree of high frequency components and if the coding artifact characteristic is indicative of an amount of coding artifacts being below a threshold. If any of these two criteria is not met, a lower display window size is used.
  • the image analyzer 103 may be arranged to perform a content analysis on the image and the scaling means may be arranged to perform the scaling (at least partly) in response to the content characteristic.
  • the content analysis may specifically be arranged to determine a content category for the presented image and the scaling of the image may be dependent on the content category.
  • the content analysis may for example include a scene detection that is able to characterize the specific scene displayed by the image. For example, it may detect whether the image is a landscape or oceanscape and if so it may modify the scaling to provide an image that is larger than it would otherwise be.
  • the display system may be arranged to detect that the presented material corresponds to a football match and may in response be arranged to bias the scaling according to a preference for football matches (for example the size of the display window may be increased).
  • the content analysis is used to enhance the scaling based on the pixel value spatial variation characteristic.
  • the approach may allow a further adaptation of the display window to the user preferences. E.g. for a given level of sharpness and detail, the user may prefer different sizes of the presented image depending on whether it shows a landscape or a football match (a sketch of such a category-dependent bias is given after this list).
  • the image may be part of a video sequence of images, for example a sequence of video frames may be analyzed.
  • the image analyzer 103 may also be arranged to perform a temporal analysis to determine a temporal pixel value variation characteristic. For example, as previously described, a motion characteristic for image objects may be determined.
  • temporal noise may be estimated based on an image analysis.
  • a temporal analysis may provide further information that can be used.
  • the temporal analysis may allow a differentiation between texture of an image object and noise in the image object.
  • temporal noise may be estimated and used in a similar way as described for the coding artifacts. E.g. it may be used to compensate the pixel value spatial variation characteristic (in order to compensate for noise being interpreted as local high frequency variations) and may be used directly to determine the scaling (e.g. reducing the display window size when there is significant temporal noise in order to make this less perceptible). A sketch of such a noise estimate and compensation is given after this list.
  • the image analyzer 103 may specifically perform the analysis based on one or more image objects in the image. For example, a segmentation of the image may be performed to detect one or more image objects. An image object spatial profile characteristic may then be generated for the specific image object. Specifically, one or more of the analyses described above may be applied directly to the extracted image object. The scaling may then be performed to provide the desired property of the specific image object. For example, the image size may be selected such that the image object is presented with a desired degree of sharpness, or with a size not above a threshold (above which the object may e.g. look unrealistic, intimidating, etc.).
  • any available display spaces that are not used by the main image may be used to present another image.
  • the overall display area may be divided into a plurality of display windows which allow simultaneous presentation of several windows.
  • an example of such a scenario is shown in FIG. 4.
  • a main window 401 is placed centrally in the display area 403.
  • the display area may specifically correspond to (part of) a wall, a top surface of a washing machine, or any outer surface of another (part of an) object, on which a projector projects a picture comprising the plurality of display windows.
  • one window 405 may be used as a video chat window
  • another window 407 may be used to provide a weather satellite image
  • another window 409 may be used to provide a static weather forecast
  • another window 411 may be used to provide information of upcoming events
  • another window 413 may be a graphical image used to indicate the current time.
  • the display area may provide a "wall of information" that simultaneously presents different information to the user.
  • the previously described scaling may be applied to the image of the central display window 401 (the one selected as the main display window).
  • the main display window 401 may accordingly adjust its size depending on the specific sharpness indication of the image.
  • at times the main display window 401 may be relatively small and at other times it may be relatively large.
  • one or more characteristics of the other windows 405-413 may further be dependent on the characteristic of the main window 401.
  • the auxiliary windows 405-413 may be positioned and sized such that they are displayed outside of the main display window 401. For example, when the main display window 401 is relatively large, the auxiliary display windows 405-413 are moved further towards the sides of the display area 403.
  • the size of one or more of the auxiliary windows 405-413 may also be adjusted depending on the scaling that is applied to the main window 401 (so that they fall less outside the displayable area 403). For example, when the size of the main window 401 is reduced, the size of the auxiliary windows 405-413 is increased correspondingly.
  • the described adaptive scaling may alternatively or additionally be applied to one or more of the auxiliary windows 405-413 (in that case the basic scaling algorithm is partly constrained, resulting in intermediate optimal scaling).
  • the main window 401 may have a fixed size whereas one or more of the auxiliary windows 405-413 is scaled according to a sharpness indication for that window 405-413.
  • the size of the auxiliary window 405 may depend on the relative sharpness of the presented image of the person, or may be chosen such that the face is still rendered realistically enough.
  • the size of the auxiliary window 405 may be set to the highest value that allows it to be presented within the display area 403 and outside the main window 401. However, if the relative sharpness is low, the size of the auxiliary window 405 is reduced correspondingly, thereby providing an improved perceived image quality. This may furthermore allow more or larger auxiliary windows 407-413 to be presented simultaneously.
  • the auxiliary window 405 may in some examples be used to present images of people.
  • the scaling of the window 405 may then be optimized to provide a display that provides an optimal sharpness for the person presented. Indeed, making the display too large will result in the recognition of the person being made more difficult, as block artifacts and/or blurriness make the image less easy to perceive by the user.
  • each individual window is optimized to be presented with the size that best matches the determined sharpness for that window.
  • the scaling of one window may further take into account the scaling/sharpness characteristic of other windows.
  • the optimal size may be determined for each individual window and consequently these sizes may be further modified to ensure that all windows can be fitted within the available display area 403. This may typically be done e.g. by allocating size budgets to the windows and defining an asymmetrically weighted cost function penalizing deviation from the optimal sizes (a sketch of such a cost function is given after this list).
  • the positions of the auxiliary display windows 405-413 may be modified such that they extend beyond the available display area 403. E.g., in the example of FIG. 4, the auxiliary display windows 405-409 may be moved outwards such that only part of the windows 405-409 is visible. If the characteristics of the main window 401 change, resulting in the main window 401 reducing in size, the auxiliary display windows 405-409 may automatically slide back into the viewable display area 403.
  • the example of FIG. 4 may specifically provide an "elastic cloud" presentation of different information in different windows.
  • it may provide a cloud of co-windows moving beyond the boundary of the displayable area depending on the optimization of the main display window.
  • the scaling of the individual windows may further depend on whether the window is selected as the main window or not. For example, for each window, one desired quality measure is stored for the scenario where this is selected as the main window and another is stored for the scenario where the window is selected as an auxiliary window. E.g. when a window is the main display window, it is controlled to provide an optimized perceived quality. However, when it is selected as an auxiliary window, it is only required that a minimum quality level is achieved, while otherwise allowing a free adjustment of the window to allow it to optimally fit with the other presented windows. It will also be appreciated that in the example where the auxiliary display windows 405-413 may automatically be moved partially out of the available display area 403, a selection of part of the image may be applied.
  • an adaptable cropping or image object extraction may be applied to provide an optimal image section within the visible display window 403.
  • if further displays are available, the scaling apparatus can take this into account by employing them in an appropriate way.
  • a main picture that after scaling falls outside the achievable geometry of a first display may be partially represented on a second display (e.g. Ambilight projects the outer picture regions (possibly processed by image processing such as e.g. further non-linear stretching, or graphically adding extra objects/patterns, e.g. trees, to give the projection a more immersive character) as an environment on an adjacent wall, or on an adjacent second LCD display).
  • some of the windows, e.g. those of which less than 50% would be visible on the main display area, may be offloaded to a second display (e.g. the weather picture could be switched to a display on the universal remote control, and the clock picture could be sent to a photoframe display on the cupboard).
  • an embodiment of the invention may be physically, functionally and logically implemented in any suitable way. Indeed the functionality may be implemented in a single unit, in a plurality of units or as part of other functional units. As such, the invention may be implemented in a single unit or may be physically and functionally distributed between different units and processors.
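
For the motion-dependent window sizing referred to above, a minimal sketch is given below. It assumes 8-bit grayscale frames held in NumPy arrays and uses a deliberately coarse motion measure (mean absolute frame difference); the measure, the bias range of 1.0-1.3 and the function names are illustrative assumptions rather than anything specified by the application.

```python
import numpy as np

def motion_measure(prev: np.ndarray, cur: np.ndarray) -> float:
    """Very coarse motion characteristic: mean absolute difference between
    two consecutive 8-bit luminance frames, normalised to [0, 1]."""
    diff = np.abs(cur.astype(np.float64) - prev.astype(np.float64))
    return float(diff.mean() / 255.0)

def size_bias_from_motion(motion: float,
                          min_bias: float = 1.0,
                          max_bias: float = 1.3) -> float:
    """Bias factor for the display-window size: more motion favours a
    somewhat larger window so moving objects remain easy to follow."""
    motion = max(0.0, min(1.0, motion))
    return min_bias + (max_bias - min_bias) * motion
```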
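The block-artifact evaluation described above, which compares pixel variation across 8 x 8 coding-block boundaries with the variation inside the blocks, could be sketched as follows. This is only a rough illustration, assuming a grayscale NumPy image and a block grid aligned with the image origin; the particular ratio used as the blockiness measure is an assumption, not the application's own definition.

```python
import numpy as np

def blockiness(img: np.ndarray, block: int = 8) -> float:
    """Ratio of pixel variation across coding-block boundaries to the
    variation inside blocks; values well above 1 suggest visible blocks."""
    img = img.astype(np.float64)
    h, w = img.shape

    # Horizontal neighbour differences; index c is the step from column c
    # to column c+1, which crosses a block edge when c % block == block-1.
    dx = np.abs(np.diff(img, axis=1))
    cols = np.arange(w - 1)
    boundary_x = dx[:, cols % block == block - 1]
    interior_x = dx[:, cols % block != block - 1]

    # Same split for vertical neighbour differences.
    dy = np.abs(np.diff(img, axis=0))
    rows = np.arange(h - 1)
    boundary_y = dy[rows % block == block - 1, :]
    interior_y = dy[rows % block != block - 1, :]

    boundary = np.concatenate([boundary_x.ravel(), boundary_y.ravel()])
    interior = np.concatenate([interior_x.ravel(), interior_y.ravel()])
    return float(boundary.mean() / (interior.mean() + 1e-9))
```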
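The block-confined spatial frequency analysis can similarly be sketched by computing a per-block spectrum, so that discontinuities at coding-block boundaries never enter the measure. The 0.25 cut-off frequency and the use of an FFT rather than a DCT are assumptions made for the sake of the example.

```python
import numpy as np

def block_confined_sharpness(img: np.ndarray, block: int = 8,
                             cutoff: float = 0.25) -> float:
    """Fraction of spectral energy above a cut-off frequency, computed per
    coding block so that block-boundary discontinuities never contribute."""
    img = img.astype(np.float64)
    h = img.shape[0] - img.shape[0] % block     # drop incomplete edge blocks
    w = img.shape[1] - img.shape[1] % block

    freqs = np.fft.fftfreq(block)               # normalised frequencies per axis
    fy, fx = np.meshgrid(freqs, freqs, indexing="ij")
    high = np.hypot(fy, fx) > cutoff            # "high frequency" bins

    high_energy = total_energy = 0.0
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = img[y:y + block, x:x + block]
            spectrum = np.abs(np.fft.fft2(tile - tile.mean())) ** 2
            total_energy += spectrum.sum()
            high_energy += spectrum[high].sum()
    return high_energy / (total_energy + 1e-9)
```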
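One possible way of combining a sharpness characteristic with the coding-artifact characteristic into a display-window scale factor is sketched below. The linear mapping, the threshold of 1.5 and the minimum scale of 0.25 are purely illustrative assumptions; the description only requires that the full size is used when both criteria are satisfied.

```python
def display_window_scale(sharpness: float,
                         blockiness: float,
                         artifact_threshold: float = 1.5,
                         min_scale: float = 0.25) -> float:
    """Map a sharpness measure in [0, 1] and a blockiness measure to a
    display-window scale factor in [min_scale, 1.0]."""
    sharpness = max(0.0, min(1.0, sharpness))
    # Sharper images can be presented larger before they look soft.
    scale = min_scale + (1.0 - min_scale) * sharpness
    # Full size only while coding artifacts stay below the threshold;
    # otherwise back the window size off proportionally.
    if blockiness > artifact_threshold:
        scale *= artifact_threshold / blockiness
    return max(min_scale, min(1.0, scale))
```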
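The content-category dependent bias could, as a simple illustration, be realized as a lookup of a per-category preference factor that modifies the scale factor above. The categories and factors below are hypothetical and not taken from the description.

```python
# Illustrative per-category size preferences (hypothetical values).
CATEGORY_SIZE_BIAS = {
    "landscape": 1.2,
    "football": 1.3,
    "video_chat": 0.9,
}

def apply_content_bias(scale: float, category: str) -> float:
    """Bias a scale factor according to the detected content category,
    clamping the result to the displayable range."""
    return min(1.0, scale * CATEGORY_SIZE_BIAS.get(category, 1.0))
```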
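The temporal noise estimate and its use for compensating the sharpness measure might be sketched as follows, assuming a short list of grayscale frames. The static-pixel threshold and the compensation weight are assumptions chosen only to make the example concrete.

```python
import numpy as np

def temporal_noise(frames, motion_thresh: float = 10.0) -> float:
    """Estimate temporal noise as the median absolute frame-to-frame
    difference over pixels that show little motion."""
    diffs = []
    for prev, cur in zip(frames[:-1], frames[1:]):
        d = np.abs(cur.astype(np.float64) - prev.astype(np.float64))
        static = d < motion_thresh          # crude "little motion" mask
        if static.any():
            diffs.append(d[static])
    return float(np.median(np.concatenate(diffs))) if diffs else 0.0

def compensated_sharpness(raw_sharpness: float, noise: float,
                          noise_weight: float = 0.02) -> float:
    """Reduce the sharpness estimate so that temporal noise is not
    mistaken for genuine high-frequency detail."""
    return max(0.0, raw_sharpness - noise_weight * noise)
```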
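Finally, the allocation of size budgets with an asymmetrically weighted cost function could be sketched as a small constrained optimization. The quadratic penalty, the weights (enlarging four times as costly as shrinking) and the use of scipy.optimize.minimize are all assumptions; the description only states that deviations from the optimal sizes are penalized asymmetrically.

```python
import numpy as np
from scipy.optimize import minimize

def layout_cost(sizes, optimal, shrink_w: float = 1.0, grow_w: float = 4.0):
    """Asymmetrically weighted cost penalising deviation from the optimal
    per-window sizes (enlarging is costlier than shrinking)."""
    d = np.asarray(sizes) - np.asarray(optimal)
    return float(np.sum(np.where(d < 0.0, shrink_w * d ** 2, grow_w * d ** 2)))

def fit_windows(optimal, area_budget: float):
    """Find window sizes that fit within the available display area while
    staying as close as possible (in the asymmetric sense) to the optima."""
    optimal = np.asarray(optimal, dtype=float)
    constraints = ({"type": "ineq",
                    "fun": lambda s: area_budget - np.sum(s)},)
    bounds = [(0.0, None)] * len(optimal)
    result = minimize(layout_cost, x0=optimal, args=(optimal,),
                      bounds=bounds, constraints=constraints)
    return result.x.tolist()
```

In this toy formulation each window is described by a single scalar size; a real layout would add placement and non-overlap constraints, but the asymmetric penalty already captures the idea of deviating from the per-window budgets as cheaply as possible.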

Abstract

The invention relates to a display apparatus for presenting an image, the apparatus comprising an image receiver (101) for receiving an image to be displayed. An image analyzer (103) performs a local image profile analysis on at least a first region of the image to determine a pixel value spatial variation characteristic. The image analyzer (103) is coupled to a scaling processor (105) which scales at least a second region of the image in response to the pixel value spatial variation characteristic. The scaling processor (105) is coupled to a presentation controller (107) which presents the scaled image. The scaling may specifically be adjusted in dependence on a sharpness or spatial frequency characteristic of the image. The invention may allow an improved adaptation of the presentation of one or more images to the specific characteristics of the image(s).
PCT/IB2010/052001 2009-05-13 2010-05-06 Appareil d'affichage et son procédé WO2010131167A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US13/320,052 US20120050334A1 (en) 2009-05-13 2010-05-06 Display apparatus and a method therefor
JP2012510407A JP2012527005A (ja) 2009-05-13 2010-05-06 表示装置およびそのための方法
CN201080020806.7A CN102428492B (zh) 2009-05-13 2010-05-06 显示装置及其方法
EP10726232A EP2430611A1 (fr) 2009-05-13 2010-05-06 Appareil d'affichage et son procédé

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP09160124 2009-05-13
EP09160124.5 2009-05-13

Publications (1)

Publication Number Publication Date
WO2010131167A1 true WO2010131167A1 (fr) 2010-11-18

Family

ID=42562643

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2010/052001 WO2010131167A1 (fr) 2009-05-13 2010-05-06 Appareil d'affichage et son procédé

Country Status (5)

Country Link
US (1) US20120050334A1 (fr)
EP (1) EP2430611A1 (fr)
JP (1) JP2012527005A (fr)
CN (1) CN102428492B (fr)
WO (1) WO2010131167A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010044151A1 (de) * 2010-11-19 2012-05-24 BSH Bosch und Siemens Hausgeräte GmbH Hausgerät mit Bildprojektor
CN102761709A (zh) * 2011-04-25 2012-10-31 张影 一种广播影像控制方法
US8660351B2 (en) 2011-10-24 2014-02-25 Hewlett-Packard Development Company, L.P. Auto-cropping images using saliency maps
WO2015036648A1 (fr) * 2013-09-11 2015-03-19 Nokia Technologies Oy Appareil de traitement d'images et procédés associés
US10878532B2 (en) 2014-09-02 2020-12-29 Samsung Electronics Co., Ltd. Display device, system and controlling method therefor

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2007052395A1 (ja) * 2005-10-31 2009-04-30 シャープ株式会社 視聴環境制御装置、視聴環境制御システム、視聴環境制御方法、データ送信装置及びデータ送信方法
JP5683367B2 (ja) * 2011-04-20 2015-03-11 キヤノン株式会社 画像処理装置、画像処理装置の制御方法、およびプログラム
KR101913336B1 (ko) * 2011-10-06 2018-10-31 삼성전자주식회사 이동 장치 및 그 제어 방법
GB2499635B (en) * 2012-02-23 2014-05-14 Canon Kk Image processing for projection on a projection screen
US20130222228A1 (en) * 2012-02-29 2013-08-29 David Ryan Walker Automatic projector behaviour changes based on projection distance
JP2015079078A (ja) * 2013-10-16 2015-04-23 セイコーエプソン株式会社 表示制御装置及び方法、半導体集積回路装置、並びに、表示装置
KR102194635B1 (ko) * 2014-01-29 2020-12-23 삼성전자주식회사 디스플레이 컨트롤러 및 이를 포함하는 디스플레이 시스템
KR102214028B1 (ko) 2014-09-22 2021-02-09 삼성전자주식회사 가변구조형 스케일러를 포함하는 애플리케이션 프로세서와 이를 포함하는 장치들
US9483982B1 (en) 2015-05-05 2016-11-01 Dreamscreen Llc Apparatus and method for television backlignting
KR102512521B1 (ko) * 2015-10-12 2023-03-21 삼성전자주식회사 텍스쳐 처리 방법 및 장치
CN105916022A (zh) * 2015-12-28 2016-08-31 乐视致新电子科技(天津)有限公司 一种基于虚拟现实技术的视频图像处理方法及装置
US9691131B1 (en) * 2016-08-31 2017-06-27 Knapsack, LLC System and method for image resizing
JP6433537B2 (ja) * 2016-12-22 2018-12-05 カルソニックカンセイ株式会社 画像表示制御装置
CN112313705B (zh) * 2018-07-01 2024-06-25 谷歌有限责任公司 用于处理视频流的方法、系统以及介质
CN109725977B (zh) * 2019-01-02 2022-06-28 京东方科技集团股份有限公司 一种基于Android系统的多应用显示方法及终端设备
US20200304752A1 (en) * 2019-03-20 2020-09-24 GM Global Technology Operations LLC Method and apparatus for enhanced video display
CN112308212A (zh) * 2020-11-02 2021-02-02 佛山科学技术学院 一种基于神经网络的安防图像高清恢复方法及系统
US11669361B1 (en) * 2021-04-01 2023-06-06 Ai-Blockchain, Inc. System, method and program product for optimizing computer processing power in cloud computing systems
CN113099182B (zh) * 2021-04-08 2022-11-22 西安应用光学研究所 基于机载并行处理架构的多视窗实时缩放方法

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07193766A (ja) * 1993-12-27 1995-07-28 Toshiba Corp 画像情報処理装置
US20020141501A1 (en) * 1998-11-20 2002-10-03 Philips Electronics North America Corporation System for performing resolution upscaling on frames of digital video
JP3752991B2 (ja) * 2000-10-19 2006-03-08 株式会社日立製作所 映像表示装置
JP4011949B2 (ja) * 2002-04-01 2007-11-21 キヤノン株式会社 マルチ画面合成装置及びデジタルテレビ受信装置
US20050219362A1 (en) * 2004-03-30 2005-10-06 Cernium, Inc. Quality analysis in imaging
FI20045201A (fi) * 2004-05-31 2005-12-01 Nokia Corp Menetelmä ja järjestelmä kuvien katsomiseksi ja parantamiseksi
JP2006234869A (ja) * 2005-02-22 2006-09-07 Fuji Xerox Co Ltd 画質調整方法、画質調整装置、および出力制御装置、並びにプログラム
JP2008164666A (ja) * 2006-12-27 2008-07-17 Sharp Corp 映像信号処理装置、および該映像信号処理装置を備える映像表示装置
JP2009015025A (ja) * 2007-07-05 2009-01-22 Hitachi Ltd 画像信号処理装置および画像信号処理方法
US8331666B2 (en) * 2008-03-03 2012-12-11 Csr Technology Inc. Automatic red eye artifact reduction for images
US8379152B2 (en) * 2008-03-31 2013-02-19 Sharp Laboratories Of America, Inc. Systems and methods for increasing the temporal resolution of video data

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1026630A2 (fr) * 1999-02-08 2000-08-09 Sharp Kabushiki Kaisha Réduction de la fréquence d'échantillonnage utilisant une élimination d'éléments d'image
US20020191861A1 (en) * 2000-12-22 2002-12-19 Cheatle Stephen Philip Automated cropping of electronic images
US20070025637A1 (en) * 2005-08-01 2007-02-01 Vidya Setlur Retargeting images for small displays
US20070217707A1 (en) * 2006-03-16 2007-09-20 Lin Peng W Test method for image sharpness

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
BARTEN P G J: "Evaluation of subjective image quality with the square-root integral method", JOURNAL OF THE OPTICAL SOCIETY OF AMERICA A, OPTICS AND IMAGE SCIENCE, OPTICAL SOCIETY OF AMERICA, vol. 7, no. 10, 1 October 1990 (1990-10-01), pages 2024 - 2031, XP007914638, ISSN: 0740-3232 *
CHEN L-Q ET AL: "A VISUAL ATTENTION MODEL FOR ADAPTING IMAGES ON SMALL DISPLAYS", MULTIMEDIA SYSTEMS, ACM, NEW YORK, NY, US, DOI: 10.1007/S00530-003-0105-4, vol. 9, no. 4, 1 October 2003 (2003-10-01), pages 353 - 364, XP001196335, ISSN: 0942-4962 *

Also Published As

Publication number Publication date
EP2430611A1 (fr) 2012-03-21
JP2012527005A (ja) 2012-11-01
CN102428492B (zh) 2014-01-01
US20120050334A1 (en) 2012-03-01
CN102428492A (zh) 2012-04-25

Similar Documents

Publication Publication Date Title
US20120050334A1 (en) Display apparatus and a method therefor
US9672437B2 (en) Legibility enhancement for a logo, text or other region of interest in video
AU2017204855B2 (en) Logo presence detector based on blending characteristics
CN105745914B (zh) 用于逆色调映射的方法和系统
US20030053692A1 (en) Method of and apparatus for segmenting a pixellated image
US8947450B2 (en) Method and system for viewing and enhancing images
EP2347402B1 (fr) Systèmes et procédés pour imager des objets
CN109345490B (zh) 一种移动播放端实时视频画质增强方法及系统
US20140320534A1 (en) Image processing apparatus, and image processing method
EP3298578B1 (fr) Procédé et appareil pour déterminer une carte de profondeur pour une image
US10531040B2 (en) Information processing device and information processing method to improve image quality on a large screen
US20180144446A1 (en) Image processing apparatus and method
EP2530642A1 (fr) Procédé de rognage d'un contenu 3D
TWI567707B (zh) 影像調整方法及其顯示器
EP2590417A1 (fr) Appareil d'affichage d'images stéréoscopiques
KR20050105399A (ko) 디스플레이장치 및 그 제어방법
Lambers et al. Interactive dynamic range reduction for SAR images
US20040213479A1 (en) Parametric means for reducing aliasing artifacts
AU2003204340A1 (en) Assessment of Video Picture Degradation Arising from Poor Cinematography
Huo et al. Display ambient light adaptive tone mapping algorithm

Legal Events

Date Code Title Description
WWE   Wipo information: entry into national phase   Ref document number: 201080020806.7; Country of ref document: CN
121   Ep: the epo has been informed by wipo that ep was designated in this application   Ref document number: 10726232; Country of ref document: EP; Kind code of ref document: A1
WWE   Wipo information: entry into national phase   Ref document number: 2010726232; Country of ref document: EP
WWE   Wipo information: entry into national phase   Ref document number: 2012510407; Country of ref document: JP
WWE   Wipo information: entry into national phase   Ref document number: 13320052; Country of ref document: US
NENP  Non-entry into the national phase   Ref country code: DE
WWE   Wipo information: entry into national phase   Ref document number: 9062/CHENP/2011; Country of ref document: IN