WO2021224195A1 - Audio/visual display control - Google Patents


Info

Publication number
WO2021224195A1
WO2021224195A1 · PCT/EP2021/061599
Authority
WO
WIPO (PCT)
Prior art keywords
frames
content
audio
properties
adjustment
Application number
PCT/EP2021/061599
Other languages
French (fr)
Inventor
Thomas Morin
Philippe Schmouker
Sylvain Lelievre
Original Assignee
Interdigital Ce Patent Holdings
Application filed by Interdigital Ce Patent Holdings
Publication of WO2021224195A1 (English)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/147 Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1662 Details related to the integrated keyboard
    • G06F1/1671 Special purpose buttons or auxiliary keyboards, e.g. retractable mini keypads, keypads or buttons that remain accessible at closed laptop
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0489 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
    • G06F3/04897 Special input arrangements or commands for improving display capability
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/10 Intensity circuits
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/4221 Dedicated function buttons, e.g. for the control of an EPG, subtitles, aspect ratio, picture-in-picture or teletext
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433 Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4333 Processing operations in response to a pause request
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44008 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/4508 Management of client data or end-user data
    • H04N21/4532 Management of client data or end-user data involving end-user characteristics, e.g. viewer profile, preferences
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/485 End-user interface for client configuration
    • H04N21/4852 End-user interface for client configuration for modifying audio parameters, e.g. switching between mono and stereo
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/485 End-user interface for client configuration
    • H04N21/4854 End-user interface for client configuration for modifying image parameters, e.g. image brightness, contrast
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/0606 Manual adjustment
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/0613 The adjustment depending on the type of the information to be displayed
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/0626 Adjustment of display parameters for control of overall brightness
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00 Aspects of interface with display user


Abstract

An apparatus and a method are provided for receiving content including a plurality of frames, each frame having at least one of visual properties and audio properties; receiving a trigger to adjust at least one of display and audio parameters of at least one frame; selecting, for adjustment of said parameters, one or more of the plurality of frames based on the corresponding visual and/or audio properties; adjusting said parameters on the selected frame(s) and applying said adjustment to other frames of the video content.

Description

AUDIO/VISUAL DISPLAY CONTROL
TECHNICAL FIELD
[0001] The present disclosure generally relates to consumer electronics and, more specifically, to techniques for adjusting display settings using appropriate user feedback.
BACKGROUND
[0002] Modern audio/visual equipment uses different displays that are often configured to default factory settings. All display devices are configured with a fixed or default display setting, and some of them allow a display profile to be selected on first boot-up. On a television or a monitor, the user generally must choose between a showroom profile and a default home display profile. Even with the home profile, to render one or more images, many manufacturers use backlighting and black levels in a manner that provides an ultra-high level of contrast. Unfortunately, these levels do not allow the content to be provided in its optimal state, because rendered image quality strongly depends on the ambient light in the display device's environment.
[0003] In most situations, when the device is purchased and installed in any location outside the showroom, the default setting is that of the factory settings. These settings are not optimized for homes. Even in situations when the device manufacturer provides more sophisticated menu settings, each home is different and has different illumination and other audio-visual needs that are particular to its unique environment. Consequently, even when a more sophisticated menu is presented to the user that provides other display options (such as "display mode" options), the content is still not displayed optimally.
[0004] Some users try to vary the factory settings during the first installation of the device. Nonetheless, few manufacturers provide instructions to help the user set up the device to at least achieve a better display of the content. [0005] In recent years, this problem has become more acute because recent advancements in audio-visual technology have enabled a more immersive and realistic experience for users. Consequently, it is now even more important to provide displays that can be customized to user experiences. It is therefore desirable to introduce techniques that allow a user to adjust and modify the display of content in a manner that is optimal to the environment that houses the device.
SUMMARY
[0006] Additional features and advantages are realized through similar techniques, and other embodiments and aspects are described in detail herein and are considered a part of the claimed invention. For a better understanding of the invention with its advantages and features, refer to the description and to the drawings.
[0007] An apparatus and a method are provided for a multimedia content including a plurality of frames, each frame having at least one of visual properties and audio properties. A trigger to adjust at least one of display and audio parameters of at least one frame is then received, and one or more of the plurality of frames is selected based on the corresponding visual and/or audio properties. The parameters of the selected frames can be adjusted, and this adjustment can be applied to other frames of the content.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] The teachings of the present disclosure can be readily understood by considering the following detailed description in conjunction with the accompanying drawings, in which: [0009] FIG. 1 is a block diagram of a home theater system in accordance with one embodiment;
[0010] FIG. 2 is a block diagram of a home theater system in accordance with one embodiment; [0011] FIG. 3 is an illustration of display settings for a video game;
[0012] FIG. 4 is an illustration of a dark scene displayed on a device;
[0013] FIG. 5 is an illustration of a display adjustment process for a user;
[0014] FIGS. 6A and 6B are illustrations of adjustments made to luminosity to reduce darkness and glare, respectively, in accordance with one embodiment;
[0015] FIG. 7 is a block diagram of a processing system in accordance with the present embodiment;
[0016] FIG. 8 is an illustration of a user interface according to one embodiment;
[0017] FIG. 9 is an illustration of a device GUI when the user requests a display setting according to one embodiment; and
[0018] FIG. 10 is a flow chart illustration of one embodiment.
[0019] It should be understood that the drawings are for purposes of illustrating the concepts of the invention and are not necessarily the only possible configuration for illustrating the invention. To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
[0020] Referring now to FIG. 1, an exemplary home theater arrangement 100 is shown. A variety of devices are connected to home theater receiver 108, including for example a game console 102, a cable box 104, and a DVD player 106. The receiver 108, in turn, is connected to a television 110. In this arrangement 100, a smart device, such as a mobile device like a smart phone, a laptop computer, a tablet or other such device, referenced generally as device 112, can also be connected to the receiver 108. The devices can be controlled by a single user interface (such as a remote control) or have multiple ones. However, in one embodiment, one user interface can be used as a master manager of the others and can be used generally like a universal controller. The smart processing device 112, the game console 102, the cable box 104, the DVD player 106, the receiver 108, and the television 110 are referred to collectively herein as the "connected devices," as they are all connected to one another, directly or indirectly, through a connection medium and protocol such as, e.g., HDMI. [0021] It should be understood that the present embodiments are not limited to a home theater arrangement but may be employed in any system having multiple audio/visual devices. For example, the same principles apply to a purely audio system without any visual display at all, and any of a variety of content providing devices may be used instead of those listed. The present embodiments are selected solely for the sake of explanation and should not be construed as being in any way limiting.
[0022] It should also be understood that, although it is specifically contemplated that all of the devices are connected to one another via HDMI connections and communicate using the HDMI protocol, any appropriate interconnection system and communication protocol may be used instead. For example, the devices may be connected to one another via USB, Ethernet, or any other appropriate wired or wireless communication medium and protocol.
[0023] Referring now to FIG. 2, an exemplary home theater arrangement 200 is shown. In this arrangement 200, a user interface 119 is shown that can either be connected directly to the television 110 or be connected to and/or controlling any of the other devices shown.
[0024] The image settings depend on the display technology, but nearly all displays provide a menu to set the luminosity (sometimes called brightness) and contrast levels. The luminosity defines the thresholds for darkness, and the contrast defines the thresholds for brightness. Many device manufacturers do not include fine-tuning of image settings in the first-time installation. Furthermore, it is difficult to set darkness and brightness settings easily without appropriate feedback.
[0025] The difficulty of configuring the display settings with the most appropriate "in-content" feedback is illustrated in Figure 4. As shown, a way of adjusting observed brightness and darkness is not provided with many existing systems. As shown in Figure 5, the user is watching a movie when it suddenly becomes hard to distinguish part of the content being displayed. For example, there may be different ambient lights present, the video or content may reflect an artistic choice that is best suited for a big screen, or the user may never have adjusted the display settings accordingly. In many traditional systems and devices such as television sets and monitors, the full-screen menu generally disappears when pressing OK on the selected display property. As will be discussed later in detail, one problem is that full-screen menu navigation takes time and the video content generally continues playing, so the appropriate reference image to tune the settings is gone.
[0026] One way to adjust the content is to go back to the display settings. This may not be an easy task. In many devices a menu is provided, but it may not be accessible immediately. In addition, even when menu options are available, this requires the user to press the "menu" option (i.e., key) on the device or use a user interface such as a remote control. Most menu options trigger the "Main" menu option that is often provided on full screen. The entire process often results in the user missing one or more scenes. Depending on the process, the missed screen time could be even longer if there are several steps involved, such as the user needing to select the "display" section, and sometimes even the "advanced settings" option.
[0027] Whether this is the case with GUIs or user interfaces and audio-video devices, the user usually selects the desired property to change. Most often this leads to a switch to a full-screen menu. In some embodiments, the full-screen GUI then turns into an overlaid one, and the brightness or darkness of the original scene is lost, so the appropriate adjustments can no longer be made. This means that, because the dark scene of the example is gone, the user needs to wait for another one or has to try a "blind" update. The user then needs to use the user interface (BACK button, OK key on the remote control, etc.) to go back to the previous scene, which may not be displaying anymore. It should be noted that, at this step, the user often presses a "BACK" command/button several times (or may have a shortcut with the MENU key) to close the menu and continue watching the movie. Subsequently, when another dark scene happens, if the "blind" update was too much or not enough, the user must start over. It should be noted that if the user has high expectations for image quality, the picture settings may have to be set for each video source - meaning TV, HDMI 1, HDMI 2, etc. Picture levels are generally different from the TV itself to an external device - a Blu-ray player, for instance.
[0028] Figures 6A and 6B are illustrations of images in embodiments of the invention where luminosity is adjusted to reduce darkness (Figure 6A) and to reduce glare (Figure 6B). In one embodiment, user feedback can be used directly during display of a scene. An aim is to provide the best image for feedback when a user wishes to adjust the display settings. Frames from the video content can be used to provide one or more frames for feedback during adjustment. With video on demand (VoD) content, or live content provided with time-shifted video, frames of the video content are available for selection to provide feedback to adjust the display settings. In one embodiment, a Time-Shifting feature, such as provided when streaming or broadcasting content using audio-visual devices, can be utilized. To aid understanding, the example of a Smart TV can be used. Smart TVs provide the Time-Shifting feature, and most OTT/IPTV providers also provide Time-Shifting or Network-Time-Shifting features. This means that the device provides features that enable it to go back in time in a live broadcast video content. Video frames of that content are available for adjustment of the display settings. For other types of content - media files, VoD - the full content is accessible at any time, such as any image data or sequence of images, for adjustment of display settings. In one embodiment, a sequence of images can be analyzed on demand (when the user requests the display settings) or in real time during watching (as with broadcast and as described in this document). The second option can be beneficial when a user calls up the display settings. In most of these devices, when the user switches to a broadcast channel, the system silently starts to record it to provide Time-Shifting functionality when desired by the user. In this manner, the brightness/darkness setting takes advantage of these available Time-Shifting features. In addition, the GUI provides a way to select an image or a scene from the recently played video as feedback for changes/adjustments. This can also be optimized to automatically detect the most relevant scenes in what has just been watched.
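To illustrate the real-time analysis mentioned above, the following Python sketch records the average pixel luminosity of frames sampled during playback into a bounded, recent-values-only table, in the spirit of the Time-Shifting buffer described here and of the luminosity table introduced later. The class name, the sampling period and the synthetic frames are illustrative assumptions and not part of the disclosed implementation.

```python
# Illustrative sketch only: a bounded FIFO "luminosity table" filled while content plays.
# All names and the synthetic frames below are assumptions, not taken from the patent.
from collections import deque
import numpy as np

class LuminosityTable:
    """Stores (timestamp_s, average_luminosity) pairs for recently played frames."""
    def __init__(self, max_entries=600):
        self.entries = deque(maxlen=max_entries)  # FIFO: old samples fall out automatically

    def record(self, timestamp_s, luma_frame):
        # luma_frame: 2-D array of luma samples in [0, 255]; the average is rescaled to [0, 1]
        avg = float(np.mean(luma_frame)) / 255.0
        self.entries.append((timestamp_s, avg))

if __name__ == "__main__":
    table = LuminosityTable(max_entries=10)
    rng = np.random.default_rng(0)
    for t in range(20):                       # assume one sampled frame per second of playback
        frame = rng.integers(0, 256, size=(9, 16))
        table.record(float(t), frame)
    print(list(table.entries)[:3])            # only the 10 most recent samples are kept
```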
[0029] In Figure 8, a user interface such as a remote control is provided. The remote control 80 in this example includes a control 81 for controlling darkness/brightness. In one embodiment, the user or the system can select a feedback image or scene in content that has been watched in order to adjust the display settings. In one embodiment, readability features as well as audibility can be changed. In other words, visual comfort is as important as audio comfort, and one or both can be adjusted. For example, volume buttons can be used to provide shortcuts that avoid menu navigation in a similar manner and can be used the same way as the display settings. Audio feedback and image feedback can be used together or separately when modifying parameters.
[0030] In this way, display settings are directly called using a dedicated key of the TV remote control 80, namely control 81. The user calls the settings using the dedicated key 81, and the currently watched video content is then automatically paused so that the user does not miss a scene. In different embodiments, the pause mechanism can be dependent on the particular device architecture. Therefore, any of several similar features can be used in alternate embodiments, such as time-shifting or network time-shifting. In such instances, content like video will resume when done. After editing display or audio settings, a user can catch up to real time using the existing Time-Shifting GUI.
[0031] Figure 9 provides an example of a graphical user interface 91 displayed on a TV screen 90 (TV GUI) when the user requests the display settings in accordance with one or more embodiments.
[0032] In one embodiment, an apparatus and a method are provided for receiving a video content and adjusting properties of said content. In one embodiment, the video content has multiple consecutive frames, with the frames having different luminosity properties. A user request is then received for modifying visibility or audio characteristics of at least one frame by altering properties pertaining to luminosity or audio characteristics. The content is then paused at a first location, wherein the first location includes the frame to be modified. An average image pixel luminosity is then determined using the paused content at the first location. Using the average image pixel luminosity, the luminosity of the frame at the paused first location is then adjusted, the content is unpaused, and the rest of the content to be displayed is also then adjusted according to a determination of luminosity adjustments to be made based on the previous adjustment. With the content paused at the first location, a menu is displayed to select the most appropriate frames from previously watched video content, in order to get the most appropriate feedback to manually adjust an image/audio characteristic. Appropriate frames are automatically detected by the system, which computes average characteristic levels.
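The following minimal Python sketch illustrates the flow of this paragraph: measure the average luminosity of the paused frame, apply a user-chosen luminosity adjustment while previewing that frame, and then reuse the same adjustment for the remaining frames. The simple gain model, the function names and the synthetic frames are assumptions made for illustration; the disclosure does not specify how the adjustment is computed.

```python
# Minimal sketch of the pause / adjust / apply-to-rest flow; the gain model is assumed.
import numpy as np

def average_luminosity(luma):
    return float(np.mean(luma)) / 255.0

def apply_gain(luma, gain):
    return np.clip(luma * gain, 0, 255)

def adjust_content(frames, paused_index, user_gain):
    """frames: list of 2-D luma arrays; user_gain is chosen while previewing frames[paused_index]."""
    reference = average_luminosity(frames[paused_index])          # measured on the paused frame
    adjusted = [apply_gain(f, user_gain) for f in frames]         # same adjustment for the rest
    return reference, adjusted

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    frames = [rng.integers(0, 64, size=(9, 16)).astype(float) for _ in range(5)]  # dark frames
    ref, brighter = adjust_content(frames, paused_index=2, user_gain=1.5)
    print(round(ref, 3), round(average_luminosity(brighter[2]), 3))
```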
[0033] In one embodiment the method can include one or more of the following features, with reference to Figure 9:
A) The selection of a display setting 1 to adjust display parameters (using, for example, remote control buttons such as the up/down control buttons Dpad-Up and Dpad-Down).
B) Selection of the image/scene to provide feedback 2 for image settings adjustment.
C) Cursor position 3 may be used to represent the time when the video was paused (if paused; otherwise the progress bar representation is dynamic - the cursor position and the bar length are updated in real time).
If the video is a live stream, the user can return to the live stream by putting the cursor on the displayed representative symbol "live" 4 to resume real-time viewing.
The cursor 3 can be moved using remote control keys (for example, arrow keys, or Dpad-Left and Dpad-Right). For example, moving left goes back in time, and moving right goes forward in time. In one example as provided, a dark scene is detected, represented here by a black segment 6, and a bright scene is detected, represented in this example by a white segment 7.
When selecting a dark/bright segment 6/7: 1) in some embodiments, the video of the segment can be played or looped; 2) the image/scene used for feedback is displayed in the background. Adjustments are applied as will be described in what follows.
The user uses the remote control to close the display settings menu and resume the video content (if paused), for example by pressing BACK on the remote control.
[0034] The following is an example of a luminosity table of the video content showing the average pixel luminosity of frames of the video content at different times:
[Table 1: average pixel luminosity of sampled frames at different times (reproduced as an image in the original publication)]
[0035] It should be noted that the table above is only one example of obtaining visibility parameters for the selection of a feedback image for adjustment of display characteristics. In this embodiment, the more recently recorded values may be more relevant than older ones. In such a case, a more recent frame would be selected for adjustment of parameters. Alternatively, there could also be a table having -D instead of +D to illustrate average pixel luminosity at different times.
[0036] In one embodiment, the table implements a first in, first out (FIFO) or circular buffer mechanism so that it efficiently stores only recent values. In one embodiment, the given implementation can be based on the average pixel luminosity of images captured at a defined time frequency. It can also be based on frame frequency or on a rule (cf. the optimizations section). In this example (as shown in the table), the average pixel luminosity is a value between 0 and 1, but it can be on any other scale. [0037] In different alternate embodiments, several possible implementations can be used to identify the dark and bright scenes; a sketch of one such threshold-based pass is given after the lists below. A few of these examples will be discussed here, with the understanding that others are also possible as known by those skilled in the art: [0038] The identification of dark or bright scenes could be based on fixed factory-set values such as luminosity thresholds or scene durations, for example
- Average Luminosity Threshold for Dark Scene Detection (e.g., less than 0.2)
- Average Luminosity Threshold for Bright Scene Detection (e.g., greater than 0.7)
- Minimum Dark/Bright Scene Duration (e.g., 2 seconds or 60 frames)
[0039] Detection of bright or dark scenes could be based on values for factory-set time durations, for example:
- the darkest period lasting N seconds
- the brightest period lasting N' seconds (N' may be the same as N).
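A sketch of such a threshold-based pass over the luminosity table is given below; it uses the example thresholds above (less than 0.2 for dark, greater than 0.7 for bright) and a 2-second minimum duration. The data layout (a list of time/luminosity pairs) and all names are assumptions for illustration only.

```python
# Illustrative pass over a luminosity table of (time_s, avg_luminosity in [0, 1]) samples
# that marks dark and bright segments using example thresholds and a minimum duration.
DARK_MAX, BRIGHT_MIN, MIN_DURATION_S = 0.2, 0.7, 2.0

def find_segments(samples):
    """samples: list of (time_s, avg_luminosity), oldest first. Returns (label, start_s, end_s)."""
    segments, current = [], None                      # current = [label, start_time, end_time]
    for t, lum in samples:
        label = "dark" if lum < DARK_MAX else "bright" if lum > BRIGHT_MIN else None
        if current and label == current[0]:
            current[2] = t                            # extend the running segment
        else:
            if current and current[2] - current[1] >= MIN_DURATION_S:
                segments.append(tuple(current))       # keep segments long enough to be useful
            current = [label, t, t] if label else None
    if current and current[2] - current[1] >= MIN_DURATION_S:
        segments.append(tuple(current))
    return segments                                   # most recent segments are at the end

if __name__ == "__main__":
    table = [(t * 0.5, 0.05 if 4 <= t <= 10 else 0.85 if 14 <= t <= 20 else 0.5) for t in range(24)]
    print(find_segments(table))                       # one dark and one bright segment
```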
[0040] In other embodiments, fully black, white or grayish images could be skipped based on data from the luminosity table. In one or more embodiments, images containing too few contours could be skipped based on data from a luminosity table such as the one shown above and discussed earlier.
[0041] Other image-science tools may also be used to balance measured average luminosity and obtain a more accurate readability factor, such as standard deviation and contour detection.
[0042] Depending on the video encoding technology, measuring average luminosity at a predefined frequency can also be used. In some embodiments, using some or all "Intra-coded" frames can provide better image quality. In addition, an intra-coded frame that contains the fewest detectable contours within the selected scene could be displayed for adjustment. In one embodiment, optimization values can be obtained as discussed above based on contours. In one embodiment, scene cuts can be achieved as a "fade-in" or "fade-out". These scenes may not be considered relevant dark scenes. To ignore them, in one embodiment, a 'temporal-contour detection' algorithm can be used to skip the frames that are irrelevant.
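The following sketch is a simple stand-in for the standard-deviation and contour checks mentioned in the two preceding paragraphs: it rejects flat (fully black, white or grayish) frames and frames with too few contours as feedback candidates, using a basic gradient measure. The thresholds and names are assumptions; the disclosure does not prescribe a particular contour detector.

```python
# Simple stand-in for the readability checks: pixel spread (standard deviation) plus a
# rough edge-density measure. Thresholds and function names are assumptions for this sketch.
import numpy as np

def readability_score(luma):
    """Combine pixel spread and edge density into a rough 'readability' factor."""
    spread = float(np.std(luma)) / 255.0                        # 0 for flat (black/white/gray) frames
    gx, gy = np.gradient(luma.astype(float))
    edges = float(np.mean(np.hypot(gx, gy) > 16.0))             # fraction of 'contour' pixels
    return spread, edges

def is_usable_feedback_frame(luma, min_spread=0.02, min_edges=0.01):
    spread, edges = readability_score(luma)
    return spread >= min_spread and edges >= min_edges

if __name__ == "__main__":
    flat = np.full((9, 16), 12.0)                                # e.g. a fade-to-black frame
    rng = np.random.default_rng(2)
    textured = rng.integers(0, 80, size=(9, 16)).astype(float)   # dark but detailed frame
    print(is_usable_feedback_frame(flat), is_usable_feedback_frame(textured))
```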
[0043] Cuts in the video stream can be detected to propose a "real" full scene, and a device or system can also be made aware of, or learn (through machine learning), user preferences and automatically adapt the display settings. An example with learning of preferred darkness settings is outlined below, followed by a short code sketch.
* A user adjusted the dark settings for a video content where the average darkest scene luminosity is L1.
* The system then learns L1, the related settings and (if possible) ambient light properties captured by a sensor, such as a detected user...
[0044] * Later the user watches a video where average luminosity of the darkest scene is L2 with L2 < (L1 + factory threshold value).
* From learning L1 the device considers the video content to be too dark and adapts the dark settings automatically.
- Other learning-mechanism factors that may be used to dynamically balance the display settings include one or more of the following: season, weather, time of the day, ambient light, and type of content (cartoon, movie, TV show, game).
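A minimal sketch of the darkness-preference rule from the example above follows: the system remembers the darkest-scene luminosity L1 and the settings chosen for it, and reuses those settings whenever a new content's darkest scene L2 is below L1 plus a factory threshold. The threshold value, class layout and names are illustrative assumptions.

```python
# Sketch of the learned darkness preference; threshold and data layout are assumed values.
FACTORY_THRESHOLD = 0.05

class DarknessPreference:
    def __init__(self):
        self.learned = None                          # (L1, settings, ambient_light)

    def learn(self, l1, settings, ambient_light=None):
        self.learned = (l1, settings, ambient_light)

    def settings_for(self, l2, default_settings):
        if self.learned is None:
            return default_settings
        l1, settings, _ = self.learned
        # Content considered "too dark" for this user: reuse the learned correction.
        return settings if l2 < l1 + FACTORY_THRESHOLD else default_settings

if __name__ == "__main__":
    pref = DarknessPreference()
    pref.learn(l1=0.12, settings={"luminosity": +6})
    print(pref.settings_for(l2=0.10, default_settings={"luminosity": 0}))   # learned settings
    print(pref.settings_for(l2=0.30, default_settings={"luminosity": 0}))   # defaults
```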
[0045] It should also be noted that while some of the embodiments have been described mainly as addressing refinement of darkness and brightness levels, in alternate embodiments other features can also be changed in the same manner, such as other visibility characteristics like gamma and tint. Likewise, the solution discussed for displays can also be applied to other similar issues in alternate embodiments. For instance, if the user wants to set parameters of an audio compression, selecting scenes with high and low audio levels may be applied to adjust audio parameters.
[0046] Figure 10 is an illustration of one embodiment. This embodiment is provided by way of example, with the understanding that other alternative embodiments are possible that provide similar functions as discussed. In step 1010, display of video content begins. This may be an initiation of video streaming, or in response to a channel change, for example. In step 1020, time shifting and recording of the content can be provided for video data being live streamed. In some embodiments, for video data available at any time, such as video on demand, time shifting or content recording may not be applied. In step 1030, the device obtains average luminosity data for a plurality of frames. This may be obtained for every N frames or after regular time durations. In some embodiments, the device can, by itself or through electronic access to other devices, provide storage of the image pixel average luminosity for every N frames, according to a rule or at some distinct time interval (such as fixed millisecond intervals). In step 1040, the user requests the display settings, and at step 1050 the video is paused. For embodiments where adjustment is made automatically, for example based on user preferences or previous user adjustments, step 1050 may be eliminated. In step 1060, the device identifies the time segments of dark and bright scenes using, for example, an obtained luminosity table similar to that of Table 1. The device then provides a feedback selection bar with the dark and bright segments in step 1070. The feedback selection bar can be used to select an appropriate image to adjust the display settings to the user's desires. For example, the user may select a frame indicated as dark, adjust the display settings using a user interface such as a remote control, and view the effect of the adjustment of the display settings on the selected image. The adjusted display settings can then be applied for display of other frames/scenes of the video content.
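As a rough illustration of the feedback selection bar of step 1070 and Figure 9, the sketch below maps detected dark and bright segments onto a text progress bar covering the time-shift window, together with the paused-position cursor. The rendering, widths and names are assumptions made purely for illustration and are not part of the disclosed GUI.

```python
# Rough sketch of a feedback selection bar: '#' marks dark segments, 'o' bright segments,
# '|' the paused-position cursor, '-' ordinary content. Layout and names are assumed.
def build_selection_bar(window_start_s, window_end_s, segments, cursor_s, width=40):
    """segments: (label, start_s, end_s) tuples within the time-shift window."""
    span = window_end_s - window_start_s
    bar = ["-"] * width
    for label, start, end in segments:
        a = int((start - window_start_s) / span * (width - 1))
        b = int((end - window_start_s) / span * (width - 1))
        mark = "#" if label == "dark" else "o"
        for i in range(a, b + 1):
            bar[i] = mark
    bar[int((cursor_s - window_start_s) / span * (width - 1))] = "|"
    return "".join(bar)

if __name__ == "__main__":
    segs = [("dark", 120, 150), ("bright", 300, 330)]
    print(build_selection_bar(0, 600, segs, cursor_s=400))
```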
[0047] Referring now to FIG. 7, an exemplary processing system 700 is shown which may represent the smart startup module 112. The processing system 700 includes at least one processor (CPU) 704 operatively coupled to other components via a system bus 702. A cache 706, a Read Only Memory (ROM) 708, a Random Access Memory (RAM) 710, an input/output (I/O) adapter 720, a sound adapter 730, a network adapter 740, a user interface adapter 750, and a display adapter 760, are operatively coupled to the system bus 702.
[0048] A first storage device 722 and a second storage device 724 are operatively coupled to system bus 702 by the I/O adapter 720. The storage devices 722 and 724 can be any of a disk storage device (e.g., a magnetic or optical disk storage device), a solid-state magnetic device, and so forth. The storage devices 722 and 724 can be the same type of storage device or different types of storage devices.
[0049] A speaker 732 is operatively coupled to system bus 702 by the sound adapter 730. A transceiver 742 is operatively coupled to system bus 702 by network adapter 740. A display device 762 is operatively coupled to system bus 702 by display adapter 760.
[0050] A first user input device 752, a second user input device 754, and a third user input device 756 are operatively coupled to system bus 702 by user interface adapter 750. The user input devices 752, 754, and 756 can be any of a keyboard, a mouse, a keypad, a touchpad, an image capture device, a motion sensing device, a microphone, a device incorporating the functionality of at least two of the preceding devices, and so forth. Of course, other types of input devices can also be used, while maintaining the spirit of the present principles. The user input devices 752, 754, and 756 can be the same type of user input device or different types of user input devices. The user input devices 752, 754, and 756 are used to input and output information to and from system 700.
[0051] Of course, the processing system 700 may also include other elements (not shown), as readily contemplated by one of skill in the art, as well as omit certain elements. For example, various other input devices and/or output devices can be included in processing system 700, depending upon the particular implementation of the same, as readily understood by one of ordinary skill in the art. For example, various types of wireless and/or wired input and/or output devices can be used. Moreover, additional processors, controllers, memories, and so forth, in various configurations can also be utilized as readily appreciated by one of ordinary skill in the art. These and other variations of the processing system 700 are readily contemplated by one of ordinary skill in the art given the teachings of the present principles provided herein.
[0052] A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made. For example, elements of different implementations may be combined, supplemented, modified, or removed to produce other implementations. Additionally, one of ordinary skill will understand that other structures and processes may be substituted for those disclosed and the resulting implementations will perform at least substantially the same function(s), in at least substantially the same way(s), to achieve at least substantially the same result(s) as the implementations disclosed. Accordingly, these and other implementations are contemplated by this disclosure and are within the scope of this disclosure.

Claims

1. A method comprising: receiving a content including a plurality of frames, each frame having at least one of visual properties and audio properties; receiving a trigger to adjust at least one of display and audio parameters of at least one frame; selecting, for adjustment of said parameters, one or more of the plurality of frames based on the corresponding visual and/or audio properties; adjusting said parameters on the selected frame(s) and applying said adjustment to other frames of the content.
2. An apparatus comprising: one or more processors configured for receiving a content including a plurality of frames, each frame having at least one of visual properties and audio properties; receiving a trigger to adjust at least one of display and audio parameters of at least one frame; selecting, for adjustment of said parameters, one or more of the plurality of frames based on the corresponding visual and/or audio properties; adjusting said parameters on the selected frame(s) and applying said adjustment to other frames of the content.
3. The method of claim 1 further comprising, or the apparatus of claim 2 further configured for, pausing said content for adjustment of said properties and unpausing said content after adjustment of said properties, and wherein said content is a video content.
4. The method of claim 1 or 3 or the apparatus of claim 2 or 3 wherein said adjustment is stored.
5. The method of any of claims 1 or 3-4 further comprising analyzing user preferences made previously and storing them for application to content for viewing by said user.
6. The method of any of claims 1 or 3-5 or the apparatus of any of claims 2-4 wherein said one or more frames are selected based on the average pixel luminosity of the corresponding frame(s).
7. The method or apparatus of claim 6 wherein said display parameter is adjusted based on average pixel luminosity of said selected frames.
8. The method or apparatus of any of claims 6-7 wherein said image pixel average luminosity is stored, for example, every N time periods.
9. The method or apparatus of any of claims 6-8 wherein said image pixel average luminosity is stored according to a rule.
10. The method of any of claims 1 or 3-9 or the apparatus of any of claims 2-9, wherein frames of dark and bright scenes are identified based on pixel luminosity data.
11. The method of any of claims 1 or 3-10 or the apparatus of any of claims 2-10, wherein said trigger is a user input.
12. The method of any of claims 1 or 3-10 or the apparatus of any of claims 2-10, wherein said trigger is automatic based on user preferences or previous adjustments.
13. The method of any of claims 1 or 3-12 or apparatus of any of claims 2-12, wherein said content is broadcast and time shifting is used to access the plurality of frames.
14. The method of any of claims 1 or 3-13 or the apparatus of any of claims 2-13, wherein said visual properties comprise at least one of tint and gamma characteristics.
15. The method of any of claims 1 or 3-14 or the apparatus of any of claims 2-14, wherein said audio properties comprise audio compression characteristics.
16. A non-transitory computer-readable medium storing computer-executable instructions executable to perform the method of any of claims 1 and 3-15.
PCT/EP2021/061599 2020-05-05 2021-05-03 Audio/visual display control WO2021224195A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP20305444 2020-05-05
EP20305444.0 2020-05-05

Publications (1)

Publication Number Publication Date
WO2021224195A1 true WO2021224195A1 (en) 2021-11-11

Family

ID=70804609

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/061599 WO2021224195A1 (en) 2020-05-05 2021-05-03 Audio/visual display control

Country Status (1)

Country Link
WO (1) WO2021224195A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6008836A (en) * 1996-06-03 1999-12-28 Webtv Networks, Inc. Method and apparatus for adjusting television display control using a browser
US20110234654A1 (en) * 2008-10-02 2011-09-29 Sung-Jin Park Picture quality control method and image display using same
US20150058877A1 (en) * 2013-08-21 2015-02-26 Harman International Industries, Incorporated Content-based audio/video adjustment
US20170280200A1 (en) * 2016-03-24 2017-09-28 Echostar Technologies L.L.C. Direct capture and sharing of screenshots from video programming
US20190306211A1 (en) * 2018-04-02 2019-10-03 OVNIO Streaming Services, Inc. Seamless social multimedia
US10521188B1 (en) * 2012-12-31 2019-12-31 Apple Inc. Multi-user TV user interface

Similar Documents

Publication Publication Date Title
US10109228B2 (en) Method and apparatus for HDR on-demand attenuation control
CN102630383B (en) Display device, control method for said display device
US10447961B2 (en) Luminance management for high dynamic range displays
US8436803B2 (en) Image display device and image display method
JP5688555B2 (en) Video display mode control
JP4796209B1 (en) Display device, control device, television receiver, display device control method, program, and recording medium
US8487940B2 (en) Display device, television receiver, display device control method, programme, and recording medium
US8836865B2 (en) Method and system for applying content-based picture quality profiles
US8044995B2 (en) Image processor and method for adjusting image quality
KR20080110079A (en) Method for setting configuration according to external av device or broadcasting program and display apparatus thereof
KR20170005416A (en) Method and apparatus for adjusting display settings of a display according to ambient lighting
CN113259765A (en) Method for automatically adjusting display parameters of display screen of television device and television device
US20170264937A1 (en) Method and apparatus for generating environment setting information of display device
JPWO2011037147A1 (en) Display device, program, and computer-readable storage medium storing program
WO2021224195A1 (en) Audio/visual display control
JP2011166315A (en) Display device, method of controlling the same, program, and recording medium
CN1306800C (en) Method and apparatus for controlling a video signal processing apparatus
JP5506239B2 (en) Video processing apparatus, control method, and program
US20210211757A1 (en) Systems and methods for adapting playback device for content display
KR101660730B1 (en) Method for displaying of image and system for displaying of image thereof
KR20110073853A (en) Image display device and method for setting up a picturequality
KR101067775B1 (en) Apparatus and method for control woofer of TV
KR100731357B1 (en) Method of controlling picturequality and display processing apparatus thereof
KR20100072681A (en) Apparatus and method for image displaying in image display device
KR20050123250A (en) Method for controlling video and audio based on genre data

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21722478

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21722478

Country of ref document: EP

Kind code of ref document: A1