WO2016092059A1 - Method and graphic processor for managing colors of a user interface

Method and graphic processor for managing colors of a user interface

Info

Publication number
WO2016092059A1
WO2016092059A1 (PCT/EP2015/079369)
Authority
WO
WIPO (PCT)
Prior art keywords
user interface
multimedia
frame
color
colors
Prior art date
Application number
PCT/EP2015/079369
Other languages
French (fr)
Inventor
Ratnam VADHRI VENKATA
Gaurav SAHI
Chawandi PRABHU
Antoine Burckard
Original Assignee
Nagravision S.A.
Priority date
Filing date
Publication date
Application filed by Nagravision S.A.
Priority to BR112017011272A (publication BR112017011272A2)
Priority to MX2017007462A (publication MX2017007462A)
Priority to AU2015359323A (publication AU2015359323B2)
Priority to CN201580067673.1A (publication CN107004285A)
Priority to CA2968472A (publication CA2968472A1)
Priority to JP2017531214A (publication JP2018510396A)
Priority to US15/535,337 (publication US10964069B2)
Priority to SG11201704261SA (publication SG11201704261SA)
Priority to KR1020177015997A (publication KR20170093848A)
Priority to EP15808573.8A (publication EP3230955A1)
Publication of WO2016092059A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001 Texturing; Colouring; Generation of texture or colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00 Arrangements for software engineering
    • G06F8/30 Creation or generation of source code
    • G06F8/38 Creation or generation of source code for implementing user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/24 Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/28 Indexing scheme for image data processing or generation, in general involving image processing hardware
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 Aspects of display data processing
    • G09G2340/14 Solving problems related to the presentation of information to be displayed
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00 Aspects of the architecture of display systems
    • G09G2360/16 Calculation or use of calculated indices related to luminance levels in display data

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)
  • Digital Computer Display Output (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An automated method and a graphic processor for managing colors for a user interface to be displayed over at least one multimedia frame provided by an electronic device. The user interface comprises at least one graphical item having at least one predetermined color. The method comprises steps of: analyzing, by a graphic processor of the electronic device, the digital multimedia content comprising the at least one multimedia frame, extracting, from said digital multimedia content, data blocks defining an array of dominant colors in at least a part of the at least one multimedia frame, filtering the array of dominant colors of the part of at least one multimedia frame according to at least one predefined criterion managing color selection, obtaining a resulting array of dominant colors, generating, by a user interface generator coupled to the graphic processor, at least one graphical item by applying at least one dominant color from the resulting array of dominant colors to said at least one graphical item, said at least one dominant color being selected to be visible in the part of the at least one multimedia frame whereon the at least one graphical item is displayed.

Description

Method and graphic processor for managing colors of a user interface
Field of the invention
The present invention relates to an automated method and a graphic processor for managing and updating colors of a user interface comprising graphical elements, text, and images to be displayed over a background formed by a still image, a moving image, or video content.
Technical background
A user-friendly graphical user interface provides attractive graphical effects having a pleasant esthetic appearance as well as easy and intuitive usage. A user interface generally displays items comprising text strings, graphical icons, graphical objects, gradients and images. A common way to display a list of items is to configure a first graphics layer of the graphical user interface as a static background and a second graphics layer to display the items over the background in a scrolling manner. The background may have a predetermined area, which is distinguished from the surrounding area by a special effect, such as a highlight, a gradient or a frame. The item which is displayed over the predetermined area is treated as an item of interest. Such a display does not alter the way in which the items are displayed on the second graphics layer. Another way to display the list of items is to configure a first graphics layer of the graphical user interface to display the items and a second graphics layer to display a symbol or a frame in a scrolling manner over the first layer. The item over which the symbol is displayed is treated as the item of interest. Such a display does not alter the way in which the items are displayed on the first graphics layer. With the development of technology in the field of electronic devices such as computers, mobile equipment, television sets associated with set-top boxes, etc., various types of user interfaces have been developed to facilitate users' experiences in using the devices. Today, many electronic devices are implemented with a touch screen to provide a graphical user interface (UI) replacing keyboards. The UI thus includes various types of menus and images, such as windows, scroll bars, icons, control buttons, etc.
In video centric devices and video display environments, the amount of content is considerably increased and the types of content are more diverse than in an analog broadcast environment of the related art. User interfaces (UIs) are mostly configured two-dimensionally in an On Screen Display (OSD) manner. However, this two-dimensional configuration is limiting: since information is displayed in a planar fashion, a user may find a UI difficult to recognize and use. For example, if a UI containing a large amount of information, or a UI including a main menu and a sub-menu displayed in a planar fashion, is provided, the menu may be superimposed on a background image or on frames of video content displayed on the main screen of a television set, in such a way that all or part of the UI may be visible, hidden or unreadable. Therefore, there is a need for a method and a system able to manage and update the colors used in the user interface in a smart way that is automated and easier for the user.
Regarding video display environments, document US8872969 discloses a method of dynamic relative adjustment of a color parameter of at least a portion of a video frame and/or a color parameter of at least a portion of a subtitle associated therewith before being displayed. The method comprises steps of storing data related to a video frame separately from data related to a subtitle of the video frame in a memory of a data processing device, and comparing, through a processor communicatively coupled to the memory, a color parameter of the data related to the video frame to a color parameter of the data related to the subtitle. The method also includes dynamically adjusting a color parameter of at least a portion of the data related to the subtitle and/or a color parameter of at least a portion of the data related to the video frame based on the comparison. Further, the method includes overlaying the data related to the subtitle on the data related to the video frame following the dynamic adjustment prior to rendering thereof on a display unit.
Summary of the invention
An embodiment of the disclosure proposes an automated method for managing colors for a user interface to be displayed over at least one multimedia frame provided by an electronic device according to claim 1. A further object of the disclosure relates to a graphic processor configured to automatically manage colors in a user interface displayed over at least one multimedia frame provided by an electronic device according to claim 7.
A multimedia frame is defined in the context of the disclosure as any still or moving image visible to the human eye provided by a multimedia content. The method and the graphic processor of the present invention concern dynamic user interfaces displayed on video centric consumer electronic devices. A user interface includes a set of graphical items having various colors, shapes, sizes, and locations on a display screen. Users are able to select objects or functions from a single graphical item or a set of graphical items according to the hardware and software configuration of the electronic device. For a video centric device, one aim is to show video content on a display at any time without modifying the size of the multimedia frame or cropping part of it. To achieve this aim, using translucent user interfaces may be a preferred option. At the same time, this translucence may cause the colors of the user interface to conflict with the colors of the background multimedia frame and provide a distracting user experience.
It has to be noted that the terms multimedia content and multimedia frame also cover the case where a real scene produces, by means of a camera, a multimedia frame over which user interface graphical items may be displayed. For example, scenes seen through smart glasses or windscreens may be used as background for user interface items in the form of text and/or graphics to be exploited by a user.
Brief description of the drawings
The invention will be better understood thanks to the following detailed description, which refers to the attached drawings given as non-limitative examples. Figure 1 shows a layered structure of a video multimedia frame comprising a background image layer on which a user interface including a graphic layer and a text layer is superimposed.
Figure 2 shows a diagram of the organization of user interface items colors on a video image.
Figure 3 shows video data blocks of a background image from which dominant color arrays are extracted after analysis and used for user interface items generation.
Figure 4 shows a block diagram of the graphic processor with peripherals configured to manage colors in a user interface displayed over an image provided by an electronic device.
Figure 5 shows a flow chart of an embodiment of the method according to the invention using criteria managing color selection in arrays of dominant colors of the user interface items and the dominant colors of an image.
Detailed description of the invention
In general, images displayed on a screen of a video centric device are rendered in the form of several superimposed layers. Each layer contains information provided by different sources in the video device. Figure 1 illustrates an example of a layered multimedia frame provided by a multimedia content source, where the bottommost layer or background layer BL occupies the entire surface of the screen while further layers of a user interface UI may be placed at predefined positions on the screen without necessarily covering the entire surface of the screen. The user interface UI may comprise a graphic layer GL including graphical items such as drawings, animated elements, logos, image boxes, buttons, etc. and a text layer TL including text, alphanumeric characters, ideograms, symbols, frames of various shapes, etc. The color components of the user interface layers are set by applications rendering these layers and their graphical items on the display. The applications generate, for example, graphical items in the form of stacked windows disposed on layers in a predefined order. The colors of the background image in the first layer are preset while the colors of the user interface windows are set by the applications. In a conventional user interface, the colors of the windows are defined in a static way so that, in case of overlapping, some windows or items thereof may be hidden or displayed with insufficient contrast over the colors of an underlying window.
The diagram of figure 2 illustrates an example of a display having background colors AV on which user interface graphics are displayed in the form of windows W1, W2 and W3, each having a particular set of colors. The display order AV, W1, W2, W3 corresponds to the layer stack where the background image AV is placed on the bottom layer and the window W3 on the top layer.
Digital multimedia content processed, for example, by a graphic processor of an interactive multimedia content rendering device or by a multimedia decoder includes digital video data blocks in a compressed form defining the image composition. One known technique for video compression is referred to as the Moving Picture Experts Group (MPEG) compression algorithm. In this algorithm, each frame of a motion picture video is described either independently or as a change from a previously displayed frame. Thus a video scene might be described by a single independent frame which shows the entire scene as it initially appears, followed by a long series of change frames which describe the changes in the scene as actors move, for example. Using such a technique, the amount of video data carried over a transmission channel is considerably reduced by eliminating redundant transmission of constant elements of the scene. The MPEG algorithm is capable of describing an image by either a single independent video frame, called an I-frame, or by a combination of an initial I-frame and one or more succeeding change frames, comprising P-frames describing a change to a previously displayed image and B-frames describing differences between a current frame and both the preceding and following frames to specify its content. Typically, the P-frame data is written into a frame buffer of the graphic processor whose contents are already being displayed, resulting in a modification to the displayed image.
The color composition information of the background image, which may be still or moving, is mainly contained in the I-frame data blocks, which are analyzed by the graphic processor according to the method of the present invention. This analysis results in the extraction of data blocks defining an array or set of dominant colors of the background image Kn = (C1, C2, C3, ..., Cn), where C1 is the least dominant color and Cn the most dominant color, as shown in figure 3.
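As an illustration of the dominant color extraction step, the sketch below quantizes a decoded RGB frame into coarse color bins and counts how often each quantized color occurs. It is a minimal sketch, not the patented analysis itself: the function name, the bucket size and the use of NumPy are assumptions introduced for the example.

```python
import numpy as np

def dominant_color_array(frame_rgb: np.ndarray, bucket: int = 32, n_colors: int = 8):
    """Return a list of (R, G, B) tuples ordered from least to most dominant.

    frame_rgb: H x W x 3 uint8 array, e.g. a decoded I-frame.
    bucket:    quantization step; 32 groups the 0-255 range into 8 levels per channel.
    """
    # Quantize each channel so that visually similar pixels fall into the same bin.
    quantized = (frame_rgb // bucket) * bucket + bucket // 2
    pixels = quantized.reshape(-1, 3)

    # Count occurrences of each quantized color.
    colors, counts = np.unique(pixels, axis=0, return_counts=True)

    # Keep the n_colors most frequent ones, ordered least to most dominant
    # to mirror the array Kn = (C1, ..., Cn) of the description.
    order = np.argsort(counts)[-n_colors:]
    return [tuple(int(c) for c in colors[i]) for i in order]

# Example: a synthetic frame that is mostly blue with a red stripe.
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[..., 2] = 200            # blue background
frame[40:60, :, 0] = 220       # red stripe
frame[40:60, :, 2] = 0
print(dominant_color_array(frame))
```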
According to further embodiments, the multimedia content received and processed by the graphic processor may comprise video data blocks encoded with algorithms other than MPEG, such as Google VP8, VP9, RealVideo, Microsoft ASF, etc. Images provided by cameras associated with glasses or windscreens are generally not compressed, so that each frame may be analyzed to obtain dominant color arrays. Analog video content may be converted into digital video data blocks that are processed to extract dominant color information. A dominant color is defined by its higher intensity or strength in relation to other colors in a spectrum. Dominant color strength values may be assigned using a mathematical intensity distribution curve formula.
The graphic processor receives one or more graphical items provided by a user interface generator driven by a specific application to be displayed over a part or the entire background multimedia frame provided by the multimedia content.
The dominant colors of the user interface graphical items may be modified as a function of the array of dominant colors of at least a part of the background image Kn = (C1, C2, C3, ..., Cn), so as to remain visible in relation to the colors of the concerned part of the background image. One or more colors of the array may thus be changed dynamically, i.e. each time the dominant colors of the background image part change.
The array of dominant colors of the background image may be filtered according to one or more color selection criteria such as the quality of the background image, user preferences, the genre of the multimedia content, the available dominant colors in the array of the I-frame, etc. The quality of a background image may be defined by its resolution, such as the number of pixels per inch, as well as by its compression rate, sharpness, motion regularity, etc.
In particular, the user interface generator may provide default colors for some graphical items, and other graphical items for which the color may be replaced dynamically as a function of the dominant color array of the background multimedia frame. According to an option, the color change may be carried out by a color fader configured to change color within a predefined time period in order to prevent sudden color switching. The color changes are thus softened by introducing a progressive transition through less dominant colors. In case of rapidly changing dominant colors of background multimedia frames provided by a multimedia content source, the user interface item colors may be filtered as a function of the multimedia content genre. An action movie such as a thriller may cause the graphic processor to replace default colors of user interface items by vibrant colors, i.e. the most dominant colors of the array. User interface item colors over a movie related to a story for children, for example, may be changed into soft, light colors.
User preferences may also be used for filtering background multimedia frame colors; for example, the color set may be limited to particular colors selected interactively by the user on the fly when graphical items of the user interface appear, or preferred colors may be based on pre-stored settings.
A limit on the frequency of color changes within a time interval may also be applied to user interface items, depending on the colors of the background layers.
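A minimal sketch of such a change-frequency limit follows; the helper class, its name and the use of a monotonic clock are assumptions not described in the patent. A new item color is accepted only if a minimum interval has elapsed since the last accepted change.

```python
import time

class ColorChangeLimiter:
    """Accept a new item color only if enough time has passed since the last change."""

    def __init__(self, min_interval_s: float = 0.5):
        self.min_interval_s = min_interval_s
        self._last_change = 0.0
        self._current = None

    def update(self, proposed_color):
        now = time.monotonic()
        if self._current is None or now - self._last_change >= self.min_interval_s:
            self._current = proposed_color
            self._last_change = now
        return self._current  # unchanged if the proposal arrived too soon

# Usage: feed the limiter the color suggested for every analyzed background frame.
limiter = ColorChangeLimiter(min_interval_s=1.0)
for suggested in [(255, 255, 0), (0, 255, 255), (255, 0, 255)]:
    print(limiter.update(suggested))
```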
Color changes thus occur dynamically after analysis of the background multimedia frame, providing a set of colors from which graphical item colors are selected in order to present a visible user interface having a pleasant esthetic appearance. The graphical item colors are thus adapted to the background multimedia frame, so that a color may change when the background multimedia frame changes as well as when the user interface layers move across the background multimedia frame, which may display parts with different dominant colors. For example, a yellow graphical item such as a line passing over a blue background part will change into cyan when it passes over a red background part. According to a further example, a graphical item such as a rectangle may have a different color for each side depending on the dominant color of the background multimedia frame part over which the concerned side is displayed. A subtitle on a background video frame may have different colors on each character depending on the parts of the background video frame over which the subtitle characters are displayed. A character displayed over a white and a black background part will appear as black on the white background part and white on the black background part. Under these conditions, the subtitle always remains visible whatever the color of the background video frame.
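One way to pick a visible item color for a given background part is sketched below, under assumptions not stated in the patent: a plain Euclidean distance in RGB stands in for the visibility measure, and the helper names are hypothetical. The candidate from the filtered dominant color array that is farthest from the local background dominant color is chosen.

```python
import math

def color_distance(c1, c2):
    """Euclidean distance between two (R, G, B) colors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

def pick_visible_color(background_dominant, candidate_colors):
    """Choose the candidate color most distinct from the local background dominant color."""
    return max(candidate_colors, key=lambda c: color_distance(c, background_dominant))

# Per-character subtitle coloring: each character gets a color chosen against
# the dominant color of the background part it covers.
candidates = [(0, 0, 0), (255, 255, 255), (0, 255, 255)]
background_parts = [(250, 250, 250), (10, 10, 10)]   # white part, black part
print([pick_visible_color(bg, candidates) for bg in background_parts])
# -> black over the white part, white over the black part
```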
According to a further embodiment, a graphical item such as a line, for example, may have an "average" color adapted to be visible on all of the different parts traversed by the line on the display screen. In this example, the color array resulting from the analysis and filtering enables selecting an appropriate visible color for modifying the color of a user interface graphical item according to the dominant colors of the background multimedia frame. No color change occurs when the user interface item already has a color adapted to the background dominant colors and when no additional filtering criteria have been previously applied to the background multimedia frame dominant color array.
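A sketch of this "average color" variant, reusing the hypothetical pick_visible_color helper from the previous sketch: the dominant colors of all background parts traversed by the item are averaged, and a single item color is chosen against that average. The averaging scheme is an assumption for illustration only.

```python
def pick_color_for_traversed_parts(part_dominants, candidate_colors):
    """Choose one item color that stays visible against the average of all traversed parts."""
    n = len(part_dominants)
    average = tuple(sum(c[i] for c in part_dominants) / n for i in range(3))
    return pick_visible_color(average, candidate_colors)

# A line crossing a blue region and a red region: pick one color against their average.
print(pick_color_for_traversed_parts([(0, 0, 200), (200, 0, 0)],
                                     [(255, 255, 0), (0, 255, 255), (255, 255, 255)]))
```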
An exemplary graphic processor GP according to the invention is shown by figure 4. The graphic processor GP driven by video processing software may preferably be implemented in a video centric electronic device. The video centric device may be for example a video gateway device such as a set top box, a media player, a game player, a television set or the like in a user's home.
The graphic processor GP may be coupled to peripherals such as an interactive multimedia content rendering device IMRD providing, for example, I-frame, B-frame and P-frame video data blocks, a display driver DD coupled to a television set display screen DS, and a user interface generator UIG.
The interactive multimedia content rendering device IMRD forwards multimedia content data from the at least one multimedia frame to a content analyzer CA configured to analyze the digital multimedia content data comprising I-frames, for example. Dominant color arrays corresponding to at least a part of the at least one multimedia frame displayed on the screen of the television set DS are then extracted from the I-frames and forwarded to a filter F coupled to an output of the content analyzer CA. The filter F is configured to filter the extracted array of dominant colors of the part of at least one multimedia frame according to at least one predefined criterion managing color selection, and to obtain a resulting array of dominant colors. The graphic processor GP further comprises a dynamic user interface data processor UIP coupled to an output of the filter F and to an output of the user interface generator UIG, which generates user interface graphical items. The dynamic user interface data processor UIP assembles the graphical items to form a user interface UI overlaying the at least one multimedia frame.
The dominant colors of the graphical items are selected based on the resulting array of dominant colors obtained at the output of the filter F, in such a way as to be visible in the part of the at least one multimedia frame whereon the graphical items are displayed.
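The block structure of figure 4 can be read as a small processing pipeline. The sketch below wires together a content analyzer, a filter and a dynamic UI data processor; it is an illustration rather than the patented design, the class name, the callable criteria and the item representation are invented for the example, and the dominant_color_array and pick_visible_color helpers from the earlier sketches are reused.

```python
class GraphicProcessorSketch:
    """Illustrative CA -> F -> UIP pipeline in the spirit of figure 4."""

    def __init__(self, filter_criteria):
        # Each criterion is a callable that takes and returns a list of colors.
        self.filter_criteria = filter_criteria

    def analyze(self, frame_rgb):
        # Content analyzer CA: extract the dominant color array of the frame (or a part of it).
        return dominant_color_array(frame_rgb)

    def filter(self, dominant_colors):
        # Filter F: apply the predefined color selection criteria one after the other.
        for criterion in self.filter_criteria:
            dominant_colors = criterion(dominant_colors)
        return dominant_colors

    def assemble_ui(self, items, frame_rgb):
        # Dynamic UI data processor UIP: recolor each item against the background
        # region it covers, using the filtered dominant color array as candidates.
        candidates = self.filter(self.analyze(frame_rgb))
        return [
            {"name": item["name"],
             "color": pick_visible_color(item["background_dominant"], candidates)}
            for item in items
        ]
```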
The user interface generator UIG may be driven by applications running in the video centric device allowing interaction with a user. The user interface UI comprising the assembled graphical items is preferably displayed over the multimedia content thanks to the display driver DD. The graphical items of the user interface UI therefore have dominant colors which may be modified as a function of the colors in a part of or in the entire displayed multimedia frame. The user interface generator UIG selects one or more colors from the dominant color array for user interface graphical items according to at least one of the above-mentioned filtering criteria managing graphical item color selection in the dominant color array provided by the content analyzer CA.
The user interface UI having adapted colors is forwarded by the dynamic user interface data processor UIP to the display driver DD to be displayed on the display screen DS over the multimedia frame.
According to an option, the graphic processor GP further comprises a color fader CF inserted between an output of the dynamic user interface data processor UIP and an input of the display driver DD. This color fader CF is configured to change the color of graphical items composing the user interface within a predefined time period by introducing a progressive transition through less dominant colors.
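A minimal sketch of such a fader, assuming linear interpolation in RGB over a fixed number of steps; the patent only requires a progressive transition within a predefined time period, so the interpolation scheme and the function name are assumptions.

```python
def fade_color(start, end, steps=10):
    """Yield intermediate (R, G, B) colors from start to end over a fixed number of steps."""
    for i in range(1, steps + 1):
        t = i / steps
        yield tuple(round(s + (e - s) * t) for s, e in zip(start, end))

# Progressive transition from yellow to cyan instead of an abrupt switch;
# one intermediate color would be pushed to the display driver per refresh tick.
for color in fade_color((255, 255, 0), (0, 255, 255), steps=5):
    print(color)
```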
The flow chart of figure 5 illustrates an embodiment of the method according to the invention using criteria managing user interface item color selection based on multimedia content quality and multimedia content genre. User preferences are also taken into account for the user interface graphical item color selection. After decoding and analysis of the multimedia content provided by the interactive multimedia content rendering device, the dominant color arrays related to I-frames are extracted and used as a reference for user interface graphical item color changes. An extracted color array K may be filtered by user settings, for example pre-stored user preferences, and by parameters related to the multimedia content quality to obtain a filtered color array K1, which may be further filtered according to the multimedia content genre to obtain a color array K2.
In the example of figure 5, color selection is performed by filters applied to the extracted dominant color arrays according to predefined criteria: content quality, content genre, content image dominant colors, etc. For example, in the case of content related to sports, the filter eliminates the least dominant colors and keeps the most dominant colors, which are used to replace the user interface graphical item colors.
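The K to K1 to K2 chain of figure 5 can be pictured as successive list filters. The sketch below is an assumption-laden illustration: the user-preference filter keeps only allowed colors, the genre filter keeps the most dominant entries for fast-paced genres, and the thresholds and genre labels are invented for the example.

```python
def filter_by_user_preferences(colors, preferred):
    """K -> K1: keep only colors the user has allowed, if any preference matches."""
    kept = [c for c in colors if c in preferred]
    return kept or colors  # fall back to the full array if nothing matches

def filter_by_genre(colors, genre, keep_most_dominant=3):
    """K1 -> K2: for fast-paced genres keep only the most dominant (last) entries."""
    if genre in ("action", "sports"):
        return colors[-keep_most_dominant:]
    return colors

# K is ordered least to most dominant, as produced by dominant_color_array().
K = [(30, 30, 30), (200, 50, 50), (240, 240, 240), (20, 80, 200)]
K1 = filter_by_user_preferences(K, preferred=[(200, 50, 50), (20, 80, 200), (240, 240, 240)])
K2 = filter_by_genre(K1, genre="sports")
print(K2)  # colors offered to the UI generator for item recoloring
```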
In the example, the color set K2 is then used to change, if necessary, the colors of the graphical items in the generated application user interface. The color change of user interface graphical items may be performed through the color fader CF coupled to the display driver DD to prevent sudden color switching which may disturb the user experience. If no color change is required, the generated application user interface is directly displayed on the display screen over the images of the video content. In case of background visual content provided by real scenes, for example through smart glasses or windscreens, no I-frames are produced, contrary to the case where a video content is received by a multimedia content rendering device connected to a video data source such as a video streaming server, an optical disc or a hard disc, etc.
However, a camera pointed at a real scene may provide the background multimedia frame, which can be analyzed to extract information on dominant color arrays from its color composition. The colors of the user interface graphical items appearing over the real scene may thus change continuously as a function of the dominant colors of the background multimedia frames, which themselves change continuously.

Claims

1. An automated method for managing colors for a user interface (UI) to be displayed over at least one multimedia frame provided by an electronic device comprising a graphic processor (GP) configured to process digital multimedia content, the user interface (UI) comprising at least one graphical item having at least one predetermined color, the method comprising the steps of:
- analyzing, by the graphic processor (GP), the digital multimedia content comprising the at least one multimedia frame,
- extracting, from said digital multimedia content, data blocks defining an array of dominant colors in at least a part of the at least one multimedia frame,
- filtering the array of dominant colors of the part of at least one multimedia frame according to at least one predefined criterion managing color selection,
- obtaining a resulting array of dominant colors,
- generating, by a user interface generator (UIG) coupled to the graphic processor (GP), at least one graphical item by applying at least one dominant color from the resulting array of dominant colors to said at least one graphical item, said at least one dominant color being selected to be visible in the part of the at least one multimedia frame whereon the at least one graphical item is displayed.
2. The method according to claim 1, characterized in that it comprises a further step of assembling, by the graphic processor (GP), a set of graphical items to form a user interface (UI) overlaying the at least one multimedia frame, the dominant color of the graphical items being selected to be visible in the parts of the at least one multimedia frame covered by the graphical items of the user interface (UI).
3. The method according to claim 1, characterized in that the graphic processor (GP) analyzes multimedia content provided by a video transport stream comprising I-frame data blocks and extracts, from the I-frame data blocks, information on dominant colors in the part of the at least one multimedia frame.
4. The method according to claim 1, characterized in that the multimedia content is provided by a camera implemented in smart glasses or a windscreen, information on dominant colors in the part of at least one multimedia frame being extracted from color composition.
5. The method according to claim 1, characterized in that the array of dominant colors of the at least one multimedia frame is filtered according to user preferences introduced in an interactive manner when graphical items of the user interface are displayed or on the basis of pre-stored settings.
6. The method according to any one of claims 1 to 4, characterized in that the array of dominant colors of the at least one multimedia frame is filtered according to quality or genre of a multimedia content represented by the at least one multimedia frame.
7. A graphic processor (GP) configured to automatically manage colors for a user interface (UI) to be displayed over at least one multimedia frame provided by an electronic device configured to process digital multimedia content, the user interface (UI) comprising at least one graphical item having at least one predetermined color, the graphic processor comprising: a content analyzer (CA), configured to analyze the digital multimedia content comprising the at least one multimedia frame, and to extract, from said digital multimedia content, data blocks defining an array of dominant colors in at least a part of the at least one multimedia frame, a filter (F) coupled to an output of the content analyzer (CA), said filter (F) being configured to filter the array of dominant colors of the part of at least one multimedia frame according to at least one predefined criterion managing color selection, and to obtain a resulting array of dominant colors, a dynamic user interface data processor (UIP) coupled to an output of the filter (F), said dynamic user interface data processor (UIP) being configured to assemble a set of graphical items received from a user interface generator (UIG), said assembled graphical items forming a user interface (UI) overlaying the at least one multimedia frame, the dominant color of the graphical items being selected based on the resulting array of dominant colors to be visible in the part of the at least one multimedia frame whereon the graphical items are displayed.
8. The graphic processor according to claim 7, characterized in that the content analyzer (CA) is configured to analyze multimedia content provided by a video transport stream comprising I-frame data blocks, and to extract, from the I-frame data blocks, information on dominant colors in the part of the at least one multimedia frame.
9. The graphic processor according to claim 7, characterized in that the content analyzer (CA) is configured to analyze multimedia content provided by a camera implemented in smart glasses or a windscreen, and to extract, from color composition, information on dominant colors in the part of the at least one multimedia frame.
10. The graphic processor according to any one of claims 7 to 9, characterized in that the filter (F) is configured to filter the dominant colors array of the at least one multimedia frame according to user preferences introduced on the fly in an interactive manner when graphical items of the user interface are displayed or on the basis of pre-stored settings.
11. The graphic processor according to any one of claims 7 to 10, characterized in that the filter (F) is configured to filter the dominant colors array of the at least one multimedia frame according to quality or genre of a multimedia content represented by the at least one multimedia frame.
12. The graphic processor according to any one of claims 7 to 11, characterized in that it further comprises a color fader (CF) coupled to an output of the dynamic user interface data processor (UIP), the color fader (CF) being configured to change color of graphical items within a predefined time period by introducing a progressive transition through less dominant colors.
PCT/EP2015/079369 2014-12-12 2015-12-11 Method and graphic processor for managing colors of a user interface WO2016092059A1 (en)

Priority Applications (10)

Application Number Priority Date Filing Date Title
BR112017011272A BR112017011272A2 (en) 2014-12-12 2015-12-11 method and graphics processor for managing the colors of a user interface
MX2017007462A MX2017007462A (en) 2014-12-12 2015-12-11 Method and graphic processor for managing colors of a user interface.
AU2015359323A AU2015359323B2 (en) 2014-12-12 2015-12-11 Method and graphic processor for managing colors of a user interface
CN201580067673.1A CN107004285A (en) 2014-12-12 2015-12-11 Method and graphics processor for managing colors of a user interface
CA2968472A CA2968472A1 (en) 2014-12-12 2015-12-11 Method and graphic processor for managing colors of a user interface
JP2017531214A JP2018510396A (en) 2014-12-12 2015-12-11 Method and graphic processor for managing user interface colors
US15/535,337 US10964069B2 (en) 2014-12-12 2015-12-11 Method and graphic processor for managing colors of a user interface
SG11201704261SA SG11201704261SA (en) 2014-12-12 2015-12-11 Method and graphic processor for managing colors of a user interface
KR1020177015997A KR20170093848A (en) 2014-12-12 2015-12-11 Method and graphic processor for managing colors of a user interface
EP15808573.8A EP3230955A1 (en) 2014-12-12 2015-12-11 Method and graphic processor for managing colors of a user interface

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP14197598 2014-12-12
EP14197598.7 2014-12-12

Publications (1)

Publication Number Publication Date
WO2016092059A1 true WO2016092059A1 (en) 2016-06-16

Family

ID=52101137

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2015/079369 WO2016092059A1 (en) 2014-12-12 2015-12-11 Method and graphic processor for managing colors of a user interface

Country Status (11)

Country Link
US (1) US10964069B2 (en)
EP (1) EP3230955A1 (en)
JP (1) JP2018510396A (en)
KR (1) KR20170093848A (en)
CN (1) CN107004285A (en)
AU (1) AU2015359323B2 (en)
BR (1) BR112017011272A2 (en)
CA (1) CA2968472A1 (en)
MX (1) MX2017007462A (en)
SG (1) SG11201704261SA (en)
WO (1) WO2016092059A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3258465A1 (en) * 2016-06-17 2017-12-20 Ningbo Geely Automobile Research & Development Co., Ltd. A method for automatic adaptation of a user interface
CN108737878A (en) * 2017-04-18 2018-11-02 Google LLC Method and system for modifying user interface color in conjunction with video presentation
US10460479B2 (en) 2016-02-10 2019-10-29 Google Llc Dynamic color determination for user interface components of a video player
JP2021028829A (en) * 2017-09-09 2021-02-25 アップル インコーポレイテッドApple Inc. Device and method for displaying affordance on background, and graphical user interface

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10152804B2 (en) * 2015-02-13 2018-12-11 Smugmug, Inc. System and method for dynamic color scheme application
US10534973B2 (en) * 2017-04-18 2020-01-14 Google Llc Methods, systems, and media for color palette extraction for video content items
CN107122199A (en) * 2017-07-04 2017-09-01 BOE Technology Group Co., Ltd. Display control method and device for a display field
US11295497B2 (en) * 2019-11-25 2022-04-05 International Business Machines Corporation Dynamic subtitle enhancement
CN111796896A (en) * 2020-06-29 2020-10-20 BOE Technology Group Co., Ltd. Theme switching method for an application page and related device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0856829A2 (en) * 1997-01-31 1998-08-05 Hitachi, Ltd. Image displaying system and information processing apparatus
US6317128B1 (en) * 1996-04-18 2001-11-13 Silicon Graphics, Inc. Graphical user interface with anti-interference outlines for enhanced variably-transparent applications
US20130104061A1 (en) * 2003-05-16 2013-04-25 Pure Depth Limited Display control system
US8872969B1 (en) * 2013-09-03 2014-10-28 Nvidia Corporation Dynamic relative adjustment of a color parameter of at least a portion of a video frame/image and/or a color parameter of at least a portion of a subtitle associated therewith prior to rendering thereof on a display unit

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6486894B1 (en) * 1999-11-18 2002-11-26 International Business Machines Corporation Contrasting graphical user interface pointer
US6813313B2 (en) * 2000-07-06 2004-11-02 Mitsubishi Electric Research Laboratories, Inc. Method and system for high-level structure analysis and event detection in domain specific videos
KR20020050264A (en) * 2000-09-08 2002-06-26 요트.게.아. 롤페즈 Reproducing apparatus providing a colored slider bar
JP4870665B2 (en) * 2004-06-30 2012-02-08 Koninklijke Philips Electronics N.V. Dominant color extraction using perceptual rules to generate ambient light derived from video content
JP2009500941A (en) * 2005-07-08 2009-01-08 LG Electronics Inc. Method for modeling video signal coding information to compress/decompress information
US8532800B2 (en) * 2007-05-24 2013-09-10 Mavs Lab. Inc. Uniform program indexing method with simple and robust audio feature enhancing methods
JP2010193030A (en) * 2009-02-17 2010-09-02 Seiko Epson Corp Device, system, method and program for processing image
EP2230839A1 (en) * 2009-03-17 2010-09-22 Koninklijke Philips Electronics N.V. Presentation of video content
US8675019B1 (en) * 2009-12-03 2014-03-18 Innoventions, Inc. View navigation guidance system for hand held devices with display
US20110304636A1 (en) * 2010-06-14 2011-12-15 Acer Incorporated Wallpaper image generation method and portable electric device thereof
US20110319160A1 (en) * 2010-06-25 2011-12-29 Idevcor Media, Inc. Systems and Methods for Creating and Delivering Skill-Enhancing Computer Applications
US8847973B2 (en) * 2010-12-15 2014-09-30 Microsoft Corporation Automatic adjustment of computer interface colors using image processing
US8897552B2 (en) * 2012-08-01 2014-11-25 Microsoft Corporation Setting an operating-system color using a photograph
US9397844B2 (en) * 2012-09-11 2016-07-19 Apple Inc. Automated graphical user-interface layout
WO2015038338A1 (en) * 2013-09-16 2015-03-19 Thomson Licensing Browsing videos by searching multiple user comments and overlaying those into the content
EP3015952B1 (en) * 2014-10-30 2019-10-23 4tiitoo GmbH Method and system for detecting objects of interest

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6317128B1 (en) * 1996-04-18 2001-11-13 Silicon Graphics, Inc. Graphical user interface with anti-interference outlines for enhanced variably-transparent applications
EP0856829A2 (en) * 1997-01-31 1998-08-05 Hitachi, Ltd. Image displaying system and information processing apparatus
US20130104061A1 (en) * 2003-05-16 2013-04-25 Pure Depth Limited Display control system
US8872969B1 (en) * 2013-09-03 2014-10-28 Nvidia Corporation Dynamic relative adjustment of a color parameter of at least a portion of a video frame/image and/or a color parameter of at least a portion of a subtitle associated therewith prior to rendering thereof on a display unit

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WEI HONG ET AL: "Smart compositing: A real-time content-adaptive blending method for remote visual collaboration", 2012 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2012) : KYOTO, JAPAN, 25 - 30 MARCH 2012 ; [PROCEEDINGS], IEEE, PISCATAWAY, NJ, 25 March 2012 (2012-03-25), pages 2317 - 2320, XP032227616, ISBN: 978-1-4673-0045-2, DOI: 10.1109/ICASSP.2012.6288378 *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10460479B2 (en) 2016-02-10 2019-10-29 Google Llc Dynamic color determination for user interface components of a video player
EP3258465A1 (en) * 2016-06-17 2017-12-20 Ningbo Geely Automobile Research & Development Co., Ltd. A method for automatic adaptation of a user interface
WO2017216386A1 (en) * 2016-06-17 2017-12-21 Ningbo Geely Automobile Research & Development Co., Ltd. A method for automatic adaptation of a user interface
CN108737878A (en) * 2017-04-18 2018-11-02 Google LLC Method and system for modifying user interface color in conjunction with video presentation
US10957280B2 (en) 2017-04-18 2021-03-23 Google Llc Methods, systems, and media for modifying user interface colors in connection with the presentation of a video
CN108737878B (en) * 2017-04-18 2021-07-20 谷歌有限责任公司 Method and system for modifying user interface color in conjunction with video presentation
CN113660514A (en) * 2017-04-18 2021-11-16 谷歌有限责任公司 Method and system for modifying user interface color in conjunction with video presentation
US11551638B2 (en) 2017-04-18 2023-01-10 Google Llc Methods, systems, and media for modifying user interface colors in connection with the presentation of a video
CN113660514B (en) * 2017-04-18 2023-12-22 谷歌有限责任公司 Method and system for modifying user interface color in connection with video presentation
US12002436B2 (en) 2017-04-18 2024-06-04 Google Llc Methods, systems, and media for modifying user interface colors in connection with the presentation of a video
JP2021028829A (en) * 2017-09-09 2021-02-25 アップル インコーポレイテッドApple Inc. Device and method for displaying affordance on background, and graphical user interface

Also Published As

Publication number Publication date
KR20170093848A (en) 2017-08-16
CN107004285A (en) 2017-08-01
BR112017011272A2 (en) 2017-12-26
AU2015359323B2 (en) 2018-10-18
AU2015359323A1 (en) 2017-06-15
EP3230955A1 (en) 2017-10-18
US10964069B2 (en) 2021-03-30
SG11201704261SA (en) 2017-06-29
US20170365072A1 (en) 2017-12-21
CA2968472A1 (en) 2016-06-16
JP2018510396A (en) 2018-04-12
MX2017007462A (en) 2017-10-02

Similar Documents

Publication Publication Date Title
AU2015359323B2 (en) Method and graphic processor for managing colors of a user interface
KR101318459B1 (en) Method of viewing audiovisual documents on a receiver, and receiver for viewing such documents
US11551638B2 (en) Methods, systems, and media for modifying user interface colors in connection with the presentation of a video
JP4614391B2 (en) Image display method and image display apparatus
US8613012B2 (en) Method and device for displaying a message on a screen of a television
JP2016532386A (en) Method for displaying video and apparatus for displaying video
CN111405339A (en) Split screen display method, electronic equipment and storage medium
US9032472B2 (en) Apparatus and method for adjusting the cognitive complexity of an audiovisual content to a viewer attention level
KR20130104215A (en) Method for adaptive and partial replacement of moving picture, and method of generating program moving picture including embedded advertisement image employing the same
CN104205795B (en) color grading preview method and device
US20080260290A1 (en) Changing the Aspect Ratio of Images to be Displayed on a Screen
JP2009100270A (en) Video-editing method and television broadcast receiver
EP3596700A1 (en) Methods, systems, and media for color palette extraction for video content items
US7710435B2 (en) Method and apparatus for generating visual effects
US20070133950A1 (en) Reproduction apparatus, reproduction method, recording method, image display apparatus and recording medium
CN107743710B (en) Display device and control method thereof
CN111787397A (en) Method for rendering multiple paths of videos on same canvas based on D3D
JP2011061670A (en) Display apparatus, method and program for displaying summary content
KR20240011779A (en) Display of sign language videos through adjustable user interface (UI) elements
KR101645247B1 (en) Method for displaying broadcast channel
CN107027061A (en) Television system and multi-medium play method
JP2010154257A (en) Information processing apparatus, display control method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15808573

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2968472

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 11201704261S

Country of ref document: SG

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112017011272

Country of ref document: BR

WWE Wipo information: entry into national phase

Ref document number: MX/A/2017/007462

Country of ref document: MX

ENP Entry into the national phase

Ref document number: 2017531214

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20177015997

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 15535337

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2015359323

Country of ref document: AU

Date of ref document: 20151211

Kind code of ref document: A

REEP Request for entry into the european phase

Ref document number: 2015808573

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 112017011272

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20170529