WO2020165331A1 - Determining light effects based on a light script and/or media content and light rendering properties of a display device - Google Patents

Determining light effects based on a light script and/or media content and light rendering properties of a display device

Info

Publication number: WO2020165331A1
Authority: WO (WIPO, PCT)
Prior art keywords: light, display device, media content, rendering properties, effects
Application number: PCT/EP2020/053739
Other languages: French (fr)
Inventors: Dzmitry Viktorovich Aliakseyeu, Bartel Marinus Van De Sluis, Jérôme Eduard Maes
Original assignee: Signify Holding B.V.
Application filed by Signify Holding B.V.
Publication of WO2020165331A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N 21/44008: involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H: ELECTRICITY
    • H05: ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B: ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B 47/00: Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B 47/10: Controlling the light source
    • H05B 47/155: Coordinated control of two or more light sources
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/436: Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N 21/4363: Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N 21/43632: involving a wired protocol, e.g. IEEE 1394
    • H04N 21/43635: HDMI

Definitions

  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language and a conventional procedural programming language.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • These computer program instructions may be provided to a processor, in particular a microprocessor or a central processing unit (CPU), of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer, other programmable data processing apparatus, or other devices create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • Fig. 1 is a block diagram of an embodiment of the system;
  • Fig. 2 depicts embodiments of the display device and lighting devices of Fig. 1;
  • Fig. 3 is a flow diagram of a first embodiment of the method;
  • Fig. 4 is a flow diagram of a second embodiment of the method; and
  • Fig. 5 is a block diagram of an exemplary data processing system for performing the method of the invention.
  • Fig. 1 shows an embodiment of the system for determining one or more light effects and controlling one or more light elements to render the one or more light effects: computer 1.
  • the one or more light effects are rendered while a display device 27, e.g. a TV, displays media content and controls one or more further light elements based on the media content.
  • the light elements are lighting devices 13-14 and the further light elements are further lighting devices 28 and 29, e.g. LED strips.
  • Computer 1 is connected to a wireless LAN access point 23.
  • a bridge 11 is also connected to the wireless LAN access point 23, e.g. via Ethernet.
  • Lighting devices 13-14 communicate wirelessly with the bridge 11, e.g. using the Zigbee protocol, and can be controlled via the bridge 11, e.g. by the computer 1.
  • the bridge 11 may be a Philips Hue bridge and the lighting devices 13-14 may be Philips Hue lights, for example. In an alternative embodiment, lighting devices are controlled without a bridge.
  • the computer 1, the bridge 11 and the lighting devices 13-14 are part of lighting system 21.
  • the computer 1 may be a desktop, laptop, tablet or mobile phone, for example.
  • the display device 27 is also connected to the wireless LAN access point 23.
  • the wireless LAN access point 23 is connected to the Internet (backbone) 24.
  • An Internet server 25 is also connected to the Internet (backbone) 24.
  • the Internet server 25 may store light scripts, for example.
  • the computer 1 may run the Philips Hue Sync app, for example.
  • the computer 1 comprises a processor 5, a receiver 3, a transmitter 4, a memory 7, and a display 9.
  • the processor 5 is configured to use the receiver 3 and/or an input interface to a user input device to receive at least one input signal comprising an identification of the display device 27.
  • the at least one input signal may be received from the user input device or from the display device 27, for example.
  • the user input device may be a mouse, keyboard or touchscreen, for example.
  • the display 9 may be such a touchscreen.
  • the processor 5 is also configured to determine one or more light rendering properties associated with the display device 27 based on the at least one input signal.
  • the one or more light rendering properties represent how the display device 27 generates light effects based on audio and/or video and/or how the light effects appear when rendered on the further lighting devices 28 and 29.
  • the processor 5 is further configured to determine the one or more light effects based on the one or more light rendering properties and further based on the media content and/or a light script associated with the media content and use the transmitter 4 to control the lighting devices 13-14 to render the one or more light effects.
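By way of illustration only, controlling one of the lighting devices 13-14 through the bridge 11 could look like the sketch below, which assumes the bridge exposes the classic Philips Hue REST API (v1); the bridge address and application key are placeholders:

```python
import requests

BRIDGE_ADDRESS = "192.168.1.2"    # placeholder address of the bridge 11
APP_KEY = "<application-key>"     # placeholder authorized Hue application key

def set_light_state(light_id: int, xy: tuple[float, float], brightness: int) -> None:
    """Send a CIE xy color and a brightness value (1-254) to one light."""
    url = f"http://{BRIDGE_ADDRESS}/api/{APP_KEY}/lights/{light_id}/state"
    requests.put(url, json={"on": True, "xy": list(xy), "bri": brightness}, timeout=2)
```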
  • the processor 5 is configured to determine the one or more light effects by adjusting the light script or by adjusting a mapping that specifies how light effects are determined based on media content.
  • the further light elements are light elements integrated in the display device 27 (e.g. in an Ambilight TV). In an alternative embodiment, the further light elements are separately positioned yet controlled by the display device 27. One or more of the further light elements may be part of the display or backlighting of the display device 27.
  • the computer 1 does not have any direct control over the lighting capabilities of the display device 27 but is able to determine light rendering properties of the display device 27 and adjust its own light effects accordingly. By adjusting the light effects that would normally be rendered and by adjusting these light effects based on the light rendering properties, a single media content-based light experience is created that involves separately controlled sets of light elements, i.e. a first set comprising lighting devices 13 and 14 and a second set comprising further lighting devices 28 and 29.
  • the computer 1 uses the transmitter 4 to transmit the media content to the display device 27, e.g. using Wi-Fi Miracast or wired HDMI output.
  • the at least one input signal further comprises at least one user preference related to the one or more light rendering properties.
  • the user preference is configured by a user in the display device 27.
  • a single display device displays media content and controls one or more further light elements based on the media content.
  • multiple display devices display media content and control one or more further light elements based on the media content.
  • the multiple display devices may be gaming monitors, for example. These gaming monitors may have auxiliary light elements, for example.
  • A difference between normal TV viewing and gaming is that gamers often have two or three monitors placed next to each other. In one case, a game is rendered on all two or three monitors; in another case, the game runs on one monitor while the remaining monitors might still be switched on, showing other, often game-related, content.
  • In the embodiment of the computer 1 shown in Fig. 1, the computer 1 comprises one processor 5. In an alternative embodiment, the computer 1 comprises multiple processors.
  • the processor 5 of the computer 1 may be a general-purpose processor, e.g. from Intel, AMD, or Qualcomm or ARM-based, or an application-specific processor.
  • the processor 5 of the computer 1 may run a Windows, a macOS, an Android or iOS operating system for example.
  • the memory 7 may comprise one or more memory units.
  • the memory 7 may comprise solid-state memory, for example.
  • the memory 7 may be used to store an operating system, applications and application data, for example.
  • The receiver 3 and the transmitter 4 may use one or more wireless communication technologies, e.g. Wi-Fi, to communicate with the wireless LAN access point 23, for example.
  • the invention may be implemented using a computer program running on one or more processors.
  • the computer 1 may be a video module.
  • The video module may be a dedicated HDMI module that can be put between the display device 27 and a device providing the HDMI input in order to analyze the HDMI input (e.g. when the HDMI input is not protected with HDCP), for example.
  • the system comprises a single device. In an alternative embodiment, the system comprises multiple devices.
  • Fig. 2 depicts embodiments of the display device 27 and lighting devices 13-14 of Fig. 1.
  • the display device 27 is an LCD TV and the one or more further light elements are incorporated into the TV.
  • the one or more further light elements comprise multiple auxiliary light elements configured to illuminate an area behind and next to the TV.
  • the one or more further light elements further comprise multiple further light elements which form a backlight of the TV, e.g. edge-lit or direct-lit. For example, a higher brightness may be used for the light effects when the LCD TV is an HDR TV with a high light output.
  • the one or more further light elements comprise at least one further light element that is a pixel of a display incorporated into the display device 27.
  • the display device 27 controls further light elements at both sides of the display device 27 based on the audio and/or video rendered by the display device 27. At the moment depicted in Fig. 2, these further light elements render further light effects 41 and 42.
  • the lighting devices 13-14 are separately controlled by the computer 1 of Fig. 1 based on a light script associated with the media content being rendered by the display device 27 or based on an analysis of the media content being rendered by the display device 27. This analysis may be performed by the computer 1 and/or by another device.
  • the lighting devices 13-14 render light effects 43 and 44, respectively. If the light effects 43 and 44 do not match the further light effects 41 and 42, this will affect the viewer’s entertainment experience, especially since the light effects 41 and 43 overlap and the light effects 42 and 44 overlap.
  • a first embodiment of the method of determining one or more light effects and controlling one or more light elements to render the one or more light effects is shown in Fig. 3.
  • the one or more light elements are controlled while a display device displays media content and controls one or more further light elements based on the media content.
  • a step 101 comprises receiving at least one input signal comprising an identification of the display device.
  • a step 103 comprises determining one or more light rendering properties associated with the display device based on the at least one input signal.
  • the one or more light rendering properties represent how the display device generates light effects based on audio and/or video and/or how the light effects appear when rendered on the one or more further light elements.
  • the one or more light rendering properties may represent how colors are extracted from the video. For example, an average color, e.g. trimean, of a certain region of the frames of the media content may be determined and one or more of the light elements may be controlled to render a light effect with this extracted color. For light elements to the left of the display device, average colors may be extracted from a region on the left side of the frames of the media content. For light elements to the right of the display device, average colors may be extracted from a region on the right side of the frames of the media content. Alternatively, a color may be extracted, for example, from a dominant feature in the frame, e.g. a building.
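For illustration, a minimal sketch of such region-based color extraction follows; the region width and the use of a plain mean rather than e.g. a trimean are assumptions:

```python
import numpy as np

def extract_side_colors(frame: np.ndarray, region_fraction: float = 0.2):
    """frame: RGB array of shape (height, width, 3).
    Returns (left_color, right_color): mean RGB values of the left and right
    border regions, for light elements on either side of the display device."""
    region = max(1, int(frame.shape[1] * region_fraction))
    left_color = frame[:, :region].mean(axis=(0, 1))
    right_color = frame[:, -region:].mean(axis=(0, 1))
    return left_color, right_color
```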
  • the one or more light rendering properties may represent a dynamicity of light effects generated by the display device.
  • the one or more light rendering properties may represent a dynamicity of a chromaticity of light effects generated by the display device and/or a dynamicity of a brightness of light effects generated by the display device.
  • the display device may use the same or a different dynamicity for the chromaticity and the brightness of the light effects (e.g. slower brightness transitions than chromaticity transitions). If the color of a light effect is only determined based on the current video frame, the light effects will be very dynamic. To decrease the dynamicity, an average of colors extracted from multiple frames can be determined. The larger the number of frames whose colors are averaged and/or the higher the weight of colors extracted from previous frames, the lower the dynamicity of the light effects.
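One assumed way to implement this multi-frame averaging is an exponentially weighted moving average, as sketched below; a higher smoothing factor gives previous frames more weight and thus lowers the dynamicity:

```python
from typing import Optional

import numpy as np

def smooth_color(current: np.ndarray, running: Optional[np.ndarray],
                 smoothing: float = 0.9) -> np.ndarray:
    """Blend the color extracted from the current frame with the running
    average over previous frames; smoothing in [0, 1), where 0 gives fully
    dynamic effects and values close to 1 give very calm effects."""
    if running is None:
        return current
    return smoothing * running + (1.0 - smoothing) * current
```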
  • the one or more light rendering properties may represent whether the display device generates light effects based on the audio.
  • the brightness of the light effects may be determined based on the intensity of the audio. This works well for music, for example.
  • the display device may generate the light effects only based on the audio or based on both the audio and the video, for example.
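Purely as an illustration (the text does not prescribe a particular mapping), brightness could be derived from the RMS intensity of a block of audio samples:

```python
import numpy as np

def brightness_from_audio(samples: np.ndarray, full_scale: float = 1.0) -> float:
    """Map the RMS intensity of a block of floating-point audio samples
    to a light-effect brightness in [0, 1]."""
    rms = float(np.sqrt(np.mean(np.square(samples))))
    return min(rms / full_scale, 1.0)
```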
  • the one or more light rendering properties may represent a relationship between colors of the video and colors rendered by the one or more further light elements.
  • a High Dynamic Range (HDR) TV may use a much higher light output when rendering bright pixels than a non-HDR TV (e.g. by increasing the light output of the backlight in an LCD TV or by increasing the light output of the pixels themselves in an OLED TV).
  • Black/blackish colors in the video typically appear much blacker when rendered by an OLED TV than when rendered on an LCD TV.
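For illustration, one assumed way to use such a relationship is to scale the brightness of the determined light effects by the identified display device's relative peak light output; the reference value below is an arbitrary assumption:

```python
REFERENCE_PEAK_NITS = 350.0  # assumed reference peak luminance of a non-HDR TV

def scale_for_display(effect_brightness: float, display_peak_nits: float) -> float:
    """effect_brightness in [0, 1]; scale it by the display device's peak
    light output relative to the reference, clamped to [0, 1]."""
    return min(effect_brightness * display_peak_nits / REFERENCE_PEAK_NITS, 1.0)
```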
  • a step 105 comprises determining the one or more light effects based on the one or more light rendering properties and further based on the media content and/or a light script associated with the media content.
  • a step 107 comprises controlling the one or more light elements to render the one or more light effects.
  • a second embodiment of the method of determining one or more light effects and controlling one or more light elements, e.g. Hue lights, to render the one or more light effects is shown in Fig. 4.
  • In a step 121, it is checked whether an identifiable display device is detected, e.g. using UPnP or Bluetooth technology. If an identifiable display device is detected, a step 123 is performed. If no identifiable display device is detected, a step 127 is performed. Step 123 comprises transmitting a request to the display device for an identification and, if applicable, a user preference. Step 125 comprises receiving an input signal comprising the identification of the display device, and the user preference if applicable.
  • the identification may comprise the brand of the display device, e.g. “Philips Ambilight TV”, or may specify a model number, for example.
  • Step 127 comprises asking a user to provide an identification of the display device rendering further light effects, e.g. by asking the user to select from a list of display device brands and/or models using a touchscreen.
  • a step 129 comprises receiving an input signal comprising the identification of the display device.
  • Step 131 comprises determining which settings are configurable on the identified display device and asking the user to specify the values of these settings.
  • Step 133 comprises receiving an input signal comprising the configured values for the corresponding settings reflecting the user preferences.
  • Step 103 of Fig. 3 is performed after steps 125 and 133.
  • Step 103 comprises determining one or more light rendering properties associated with the display device based on the input signal(s).
  • a step 135 comprises checking whether the user has chosen to render a light script associated with the media content or to use audio and/or video analysis to determine the light effects. If the user has chosen the former, a step 139 is performed. If the user has chosen the latter, a step 137 is performed. Step 139 comprises parsing the light script. Step 137 comprises analyzing the audio and/or video. One or more of the light rendering properties may be used in step 137, e.g. to determine which color extraction method to use and whether audio should be analyzed or not.
  • Step 105 comprises determining the one or more light effects based on the one or more light rendering properties and further based on the analysis of the media content or the light script.
  • One or more light rendering properties may be used in step 105 to determine the light effects, e.g. the dynamicity of the light effects, from the audio and/or video analysis results of step 137.
  • the one or more light rendering properties may be used to adjust the light effect parameters specified in the light script and parsed in step 139.
  • The analysis of the media content and the determination of the light effects from the results of this analysis or from the light script may be adjusted by changing a preset. Instead of allowing the user to change this preset manually, this preset is now changed automatically. For example, if an Ambilight TV in dynamic mode is detected, the HueSync app could first search for the preset “Ambilight-Dynamic” in its memory and activate this preset if found. Such a preset would describe how the light effects generated by the HueSync app should be determined. Moreover, the preset might also contain a recommendation, and this recommendation may be output to the user.
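A minimal sketch of such an automatic preset lookup follows; apart from “Ambilight-Dynamic”, the preset names and all fields are invented for illustration:

```python
# Only "Ambilight-Dynamic" is named in the text; the second entry and the
# preset fields are illustrative assumptions.
PRESETS = {
    "Ambilight-Dynamic": {"smoothing": 0.3, "analyze_audio": False},
    "Ambilight-Relaxed": {"smoothing": 0.8, "analyze_audio": False},
}

def find_preset(brand: str, mode: str):
    """Return the stored preset matching the detected display device brand
    and mode, or None if no matching preset is found."""
    return PRESETS.get(f"{brand}-{mode}")
```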
  • The analysis of step 137 and/or the determination of step 105 result in an adjustment of the light effects that would normally be rendered, e.g. the light effects specified in the light script. These adjustments may include:
  • Color extraction adjustment: the HueSync app could use an algorithm that is closer to or similar to the algorithm used by the Ambilight TV.
  • Brightness adjustment: the display device's light elements will increase the overall brightness of the light effects. If a specific level of brightness is requested by the user in the HueSync app, the app could take into account the additional brightness created by the display device (a minimal sketch of this compensation follows after this list). This does not only apply to auxiliary light elements: the adjustment may also be based on the specified maximum lumen output of the display device, preferably combined with retrieved brightness settings (user preferences).
  • Dynamicity adjustment: the HueSync app could automatically adjust the dynamicity of the light effects that it determines to match the dynamicity of the further light effects generated by the display device.
  • the determination of the light effects may also depend on the position of the Hue lights around the display device. For example, in the case of an Ambilight TV, if Hue lights are not located next to the Ambilight TV, adjustment of the light effects could be omitted. However, if the Hue lights are positioned such that light effects from the Hue lights are expected to mix with the light effects from the Ambilight TV, then adjustment is beneficial.
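As a minimal sketch of the brightness adjustment mentioned above, under the assumption of a simple linear compensation model with invented parameter names:

```python
def compensated_brightness(requested: float, display_contribution: float) -> float:
    """requested: brightness level set by the user in the app, in [0, 1];
    display_contribution: estimated extra brightness added by the display
    device's own light elements, in [0, 1]. Returns the brightness to use
    for the separately controlled lights so that the combined level
    approximates the requested one."""
    return max(requested - display_contribution, 0.0)
```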
  • Step 107 comprises controlling the one or more light elements, e.g. the Hue lights, to render the one or more light effects.
  • Step 137 or step 139 is repeated after step 107, depending on whether step 137 or step 139 was performed before step 105.
  • Fig. 5 depicts a block diagram illustrating an exemplary data processing system that may perform the method as described with reference to Figs. 3 and 4.
  • the data processing system 500 may include at least one processor 502 coupled to memory elements 504 through a system bus 506.
  • the data processing system may store program code within memory elements 504.
  • the processor 502 may execute the program code accessed from the memory elements 504 via a system bus 506.
  • the data processing system may be implemented as a computer that is suitable for storing and/or executing program code. It should be appreciated, however, that the data processing system 500 may be implemented in the form of any system including a processor and a memory that can perform the functions described within this specification.
  • the memory elements 504 may include one or more physical memory devices such as, for example, local memory 508 and one or more bulk storage devices 510.
  • the local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code.
  • a bulk storage device may be implemented as a hard drive or other persistent data storage device.
  • the processing system 500 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the number of times program code must be retrieved from the bulk storage device 510 during execution.
  • the processing system 500 may also be able to use memory elements of another processing system, e.g. if the processing system 500 is part of a cloud-computing platform.
  • I/O devices depicted as an input device 512 and an output device 514 optionally can be coupled to the data processing system.
  • input devices may include, but are not limited to, a keyboard, a pointing device such as a mouse, a microphone (e.g. for voice and/or speech recognition), or the like.
  • output devices may include, but are not limited to, a monitor or a display, speakers, or the like.
  • Input and/or output devices may be coupled to the data processing system either directly or through intervening I/O controllers.
  • the input and the output devices may be implemented as a combined input/output device (illustrated in Fig. 5 with a dashed line surrounding the input device 512 and the output device 514).
  • a combined device is a touch sensitive display, also sometimes referred to as a “touch screen display” or simply “touch screen”.
  • input to the device may be provided by a movement of a physical object, such as e.g. a stylus or a finger of a user, on or near the touch screen display.
  • a network adapter 516 may also be coupled to the data processing system to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks.
  • the network adapter may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to the data processing system 500, and a data transmitter for transmitting data from the data processing system 500 to said systems, devices and/or networks.
  • Modems, cable modems, and Ethernet cards are examples of different types of network adapter that may be used with the data processing system 500.
  • the memory elements 504 may store an application 518.
  • the application 518 may be stored in the local memory 508, the one or more bulk storage devices 510, or separate from the local memory and the bulk storage devices. It should be appreciated that the data processing system 500 may further execute an operating system (not shown in Fig. 5) that can facilitate execution of the application 518.
  • The application 518, being implemented in the form of executable program code, can be executed by the data processing system 500, e.g., by the processor 502. Responsive to executing the application, the data processing system 500 may be configured to perform one or more operations or method steps described herein.
  • Various embodiments of the invention may be implemented as a program product for use with a computer system, where the program(s) of the program product define functions of the embodiments (including the methods described herein).
  • the program(s) can be contained on a variety of non-transitory computer-readable storage media, where, as used herein, the expression “non-transitory computer readable storage media” comprises all computer-readable media, with the sole exception being a transitory, propagating signal.
  • the program(s) can be contained on a variety of transitory computer-readable storage media.
  • Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored.
  • the computer program may be run on the processor 502 described herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A system controls one or more light elements (13,14) to render one or more light effects (43,44) while a display device (27) displays media content and controls one or more further light elements based on the media content. The system is configured to receive at least one input signal comprising an identification of the display device and determine one or more light rendering properties associated with the display device based on the input signal. The light rendering properties represent how the display device generates light effects (41,42) based on audio and/or video and/or how the light effects appear when rendered on the further light elements. The system is further configured to determine the light effects based on the light rendering properties and further based on the media content and/or a light script associated with the media content and control the light elements to render the light effects.

Description

Determining light effects based on a light script and/or media content and light rendering properties of a display device
FIELD OF THE INVENTION
The invention relates to a system for determining one or more light effects and controlling one or more light elements to render said one or more light effects while a display device displays media content and controls one or more further light elements based on said media content.
The invention further relates to a method of determining one or more light effects and controlling one or more light elements to render said one or more light effects while a display device displays media content and controls one or more further light elements based on said media content.
The invention also relates to a computer program product enabling a computer system, e.g. a PC or mobile phone, to perform such a method.
BACKGROUND OF THE INVENTION
One of the main benefits of having a dynamic lighting system such as the Philips HueSync app and Philips Ambilight TVs is to extend the content that is displayed on a screen. By allowing the lighting system to dynamically display colors in the room that are extracted from the content, an enhanced experience can be offered.
WO2017/072537 A1 discloses that a video signal may be captured and processed by a backlighting processor unit (BPU) in real time and fed to the TV in its original form. Processing may be performed to capture pixel areas along the border of an image and to obtain an average (e.g. a median, mean or mode) color of each of these areas. These colors are then displayed at a respective LED module attached to the back of the TV screen. All the LEDs on the back of the display device work together to illuminate e.g. a wall behind the TV, creating an image which extends beyond the screen borders and provides a light gradient between the bright image on the display and the dark ambience of the room. This gradient helps to relieve eye strain and makes the picture appear visually larger. If a single LED module is used, the capturing area is a single area and equals the whole displayed image.
US 2018/0235039 A1 discloses a lighting system that comprises one or more lamps for illuminating an environment. A set of configuration data is retrieved from a memory, the set of configuration data defining an initial lighting scene for rendering by a group of lamps. It is detected when at least one of the lamps in the lighting system, e.g. a lamp that renders light effects based on TV-related content, is emitting illumination that would disrupt the initial lighting scene, and at least one characteristic of the disruptive illumination is determined. The set of configuration data is modified based on the determined characteristic to account for the disruptive illumination, e.g. by increasing the brightness of the group of lamps to limit cross interference. One or more lamps of the group are controlled, via the control interface, to render a modified lighting scene defined by the modified set.
Although the system of US 2018/0235039 A1 works well if different people use the different sets of light elements for different activities, e.g. one person is reading and another person is watching TV, the system is not suited to create a single light experience that involves multiple separately controlled sets of light elements. Instead, although one light experience is adjusted based on the other light experience, two different light experiences are created, each one involving a different set of light elements.
SUMMARY OF THE INVENTION
It is a first object of the invention to provide a system for determining one or more light effects, which is able to create a single media content-based light experience that involves separately controlled sets of light elements.
It is a second object of the invention to provide a method of determining one or more light effects, which can be used to create a single media content-based light experience that involves separately controlled sets of light elements.
In a first aspect of the invention, a system for determining one or more light effects and controlling one or more light elements to render said one or more light effects while a display device displays media content and controls one or more further light elements based on said media content comprises at least one input interface, at least one output interface, and at least one processor configured to use said at least one input interface to receive at least one input signal comprising an identification of said display device, determine one or more light rendering properties associated with said display device based on said at least one input signal, said one or more light rendering properties representing how said display device generates light effects based on audio and/or video and/or how said light effects appear when rendered on said one or more further light elements, determine said one or more light effects based on said one or more light rendering properties and further based on said media content and/or a light script associated with said media content, and use said at least one output interface to control said one or more light elements to render said one or more light effects.
The inventors have realized that when a first set of light elements is used by a display device to generate and render light effects based on audio and/or video and another device generates and renders light effects using a second set of light elements, the different light effects may interfere with each other. In order to create a single media content-based light experience that involves separately controlled sets of light elements, the system determines light effects that match light effects generated and rendered by a display device, e.g. a Philips Ambilight TV. Using the Philips HueSync app together with an Ambilight TV without using the invention would typically have unexpected or even negative impact, because the colors rendered by the Ambilight TV would typically not match with the colors rendered by the Hue system.
This problem may occur when Hue lights are controlled based on a light script associated with media content which is being rendered on the Ambilight TV, but also when Hue lights are controlled based on an independent analysis of the media content. When the Hue lights are controlled based on a light script, they are normally controlled less dynamically than the lights of the Ambilight TV. The script may, for instance, comprise subtle light changes, whereas the Ambilight control is based on changes in the image on screen, which may occur more dynamically (faster).
Since the light effect rendering of the Ambilight TV, i.e. the display device, cannot be adjusted, the system needs to determine light effects that match the light effects created by the Ambilight TV. By receiving and using an identification of the display device, the system is able to determine the light rendering properties associated with the display device. For example, a display device of a first manufacturer of display devices typically generates and/or renders light effects in a different manner than a display device of a second manufacturer of display devices. Furthermore, a first display device model may generate and/or render light effects in a different manner than a second display device model of the same manufacturer, e.g. the first model may use LEDs and the second model may use projectors to render light effects.
The system uses the determined light rendering properties to determine light effects that match the light effects generated and rendered by the display device. These light effects rendered by the display device may not only be rendered on one or more auxiliary light elements configured to illuminate an area behind and/or next to the display device, as is the case in Ambilight TVs, but also on one or more light elements which form a backlight of the display device and/or on one or more light elements which are pixels of a display incorporated into the display device. This is beneficial, because the overall lighting experience is typically influenced by the brightness settings and type of display used. For instance, in the case of a dark image or video scene, an OLED display emits no light, whereas an LCD display does emit light in such a case.
The one or more further light elements may be incorporated into said display device. Alternatively, the one or more further light elements may be incorporated into one or more accessories of the display device which are controlled by the display device, e.g. if the display device renders light effects on one or more auxiliary light elements configured to illuminate an area behind and/or next to the display device.
Said at least one input signal may further comprise at least one user preference related to said one or more light rendering properties, said user preference being configured by a user in said display device. If a user is able to configure the manner in which the display device generates and/or renders light effects, it is important that the system knows these user preferences in order to determine light effects matching the light effects generated and rendered by the display device. For example, in Ambilight TVs, a user is able to select one of a plurality of modes and the dynamicity of the light effects depends on the selected mode.
Said at least one input signal may be received from a user input device or from said display device. For example, the user may be able to select his brand of TV and/or model of TV from a list and/or the TV may broadcast its identifier or transmit its identifier on request.
Said one or more light rendering properties may represent how colors are extracted from the video, may represent a dynamicity of light effects generated by said display device, may represent a dynamicity of a chromaticity of light effects generated by said display device and/or a dynamicity of a brightness of light effects generated by said display device, may represent whether said display device generates light effects based on the audio, and/or may represent a relationship between colors of the video and colors rendered by said one or more further light elements, for example. The dynamicity of light effects may be represented by a speed of transitions between light settings, e.g. a speed of transitions in brightness and/or chromaticity. The dynamicity of light effects may be represented by a speed of transitions that exceed a certain difference between the most different brightness and/or chromaticity points in the transition. For example, if the highest and lowest brightness are not very different, even a fast transition between them might not be perceived as dynamic. However, if the highest and lowest brightness are very different, then the effect would look more dynamic.
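For illustration, such a dynamicity measure could be computed as in the following sketch; the minimum-span threshold and the linear span-per-second measure are assumptions:

```python
def transition_dynamicity(start: float, end: float, duration_s: float,
                          min_span: float = 0.2) -> float:
    """start/end: brightness (or a chromaticity coordinate) in [0, 1].
    Transitions whose span stays below min_span are not counted as dynamic;
    otherwise dynamicity grows with the span covered per second."""
    span = abs(end - start)
    if span < min_span or duration_s <= 0:
        return 0.0
    return span / duration_s
```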
In a second aspect of the invention, a method of determining one or more light effects and controlling one or more light elements to render said one or more light effects while a display device displays media content and controls one or more further light elements based on said media content comprises receiving at least one input signal comprising an identification of said display device, determining one or more light rendering properties associated with said display device based on said at least one input signal, said one or more light rendering properties representing how said display device generates light effects based on audio and/or video and/or how said light effects appear when rendered on said one or more further light elements, determining said one or more light effects based on said one or more light rendering properties and further based on said media content and/or a light script associated with said media content, and controlling said one or more light elements to render said one or more light effects. Said method may be performed by software running on a programmable device. This software may be provided as a computer program product.
Moreover, a computer program for carrying out the methods described herein, as well as a non-transitory computer readable storage-medium storing the computer program are provided. A computer program may, for example, be downloaded by or uploaded to an existing device or be stored upon manufacturing of these systems.
A non-transitory computer-readable storage medium stores a software code portion, the software code portion, when executed or processed by a computer, being configured to perform executable operations for determining one or more light effects and controlling one or more light elements to render said one or more light effects while a display device displays media content and controls one or more further light elements based on said media content.
The executable operations comprise receiving at least one input signal comprising an identification of said display device, determining one or more light rendering properties associated with said display device based on said at least one input signal, said one or more light rendering properties representing how said display device generates light effects based on audio and/or video and/or how said light effects appear when rendered on said one or more further light elements, determining said one or more light effects based on said one or more light rendering properties and further based on said media content and/or a light script associated with said media content, and controlling said one or more light elements to render said one or more light effects. As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a device, a method or a computer program product, e.g. an app. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit", "module" or "system." Functions described in this disclosure may be implemented as an algorithm executed by a processor/microprocessor of a computer. Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied, e.g., stored, thereon.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a computer readable storage medium may include, but are not limited to, the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of the present invention, a computer readable storage medium may be any tangible medium that can contain, or store, a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing. Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java(TM), Smalltalk, C++ or the like, conventional procedural programming languages, such as the "C" programming language or similar programming languages, and functional programming languages such as Scala, Haskell or the like. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the present invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor, in particular a microprocessor or a central processing unit (CPU), of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer, other programmable data processing apparatus, or other devices create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of devices, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other aspects of the invention are apparent from and will be further elucidated, by way of example, with reference to the drawings, in which:
Fig. 1 is a block diagram of an embodiment of the system;
Fig. 2 depicts embodiments of the display device and lighting devices of Fig. 1;
Fig. 3 is a flow diagram of a first embodiment of the method;
Fig. 4 is a flow diagram of a second embodiment of the method; and
Fig. 5 is a block diagram of an exemplary data processing system for performing the method of the invention.
Corresponding elements in the drawings are denoted by the same reference numeral.
DETAILED DESCRIPTION OF THE EMBODIMENTS
Fig. 1 shows an embodiment of the system for determining one or more light effects and controlling one or more light elements to render the one or more light effects: computer 1. The one or more light effects are rendered while a display device 27, e.g. a TV, displays media content and controls one or more further light elements based on the media content. In the example of Fig. 1, the light elements are lighting devices 13-14 and the further light elements are further lighting devices 28 and 29, e.g. LED strips.
Computer 1 is connected to a wireless LAN access point 23. A bridge 11 is also connected to the wireless LAN access point 23, e.g. via Ethernet. Lighting devices 13-14 communicate wirelessly with the bridge 11, e.g. using the Zigbee protocol, and can be controlled via the bridge 11, e.g. by the computer 1. The bridge 11 may be a Philips Hue bridge and the lighting devices 13-14 may be Philips Hue lights, for example. In an alternative embodiment, lighting devices are controlled without a bridge.
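As a minimal sketch of how a computer like computer 1 might control one of the lighting devices 13-14 via the bridge 11, assuming the classic Hue REST API; the bridge address, application key and function name are placeholders:

import requests

BRIDGE_IP = "192.168.1.2"      # placeholder address of the bridge 11
APP_KEY = "<application-key>"  # placeholder key obtained from the bridge

def set_light_state(light_id, xy, brightness):
    """Set the color (CIE xy) and brightness (0-254) of one lighting
    device via the bridge's v1-style REST endpoint."""
    url = f"http://{BRIDGE_IP}/api/{APP_KEY}/lights/{light_id}/state"
    body = {"on": True, "xy": list(xy), "bri": int(brightness)}
    return requests.put(url, json=body, timeout=2)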
The computer 1, the bridge 11 and the lighting devices 13-14 are part of lighting system 21. The computer 1 may be a desktop, laptop, tablet or mobile phone, for example. The display device 27 is also connected to the wireless LAN access point 23. The wireless LAN access point 23 is connected to the Internet (backbone) 24. An Internet server 25 is also connected to the Internet (backbone) 24. The Internet server 25 may store light scripts, for example. The computer 1 may run the Philips Hue Sync app, for example.
The computer 1 comprises a processor 5, a receiver 3, a transmitter 4, a memory 7, and a display 9. The processor 5 is configured to use the receiver 3 and/or an input interface to a user input device to receive at least one input signal comprising an identification of the display device 27. The at least one input signal may be received from the user input device or from the display device 27, for example. The user input device may be a mouse, keyboard or touchscreen, for example. The display 9 may be such a touchscreen.
The processor 5 is also configured to determine one or more light rendering properties associated with the display device 27 based on the at least one input signal. The one or more light rendering properties represent how the display device 27 generates light effects based on audio and/or video and/or how the light effects appear when rendered on the further lighting devices 28 and 29.
The processor 5 is further configured to determine the one or more light effects based on the one or more light rendering properties and further based on the media content and/or a light script associated with the media content and use the transmitter 4 to control the lighting devices 13-14 to render the one or more light effects. In particular, the processor 5 is configured to determine the one or more light effects by adjusting the light script or by adjusting a mapping that specifies how light effects are determined based on media content.
In the embodiment of Fig. 1, the further light elements are light elements integrated in the display device 27 (e.g. in an Ambilight TV). In an alternative embodiment, the further light elements are separately positioned yet controlled by the display device 27. One or more of the further light elements may be part of the display or backlighting of the display device 27. The computer 1 does not have any direct control over the lighting capabilities of the display device 27 but is able to determine light rendering properties of the display device 27 and adjust its own light effects accordingly. By adjusting the light effects that would normally be rendered and by adjusting these light effects based on the light rendering properties, a single media content-based light experience is created that involves separately controlled sets of light elements, i.e. a first set comprising lighting devices 13 and 14 and a second set comprising further lighting devices 28 and 29.
In the embodiment of Fig. 1, the computer 1 uses the transmitter 4 to transmit the media content to the display device 27, e.g. using Wi-Fi Miracast or wired HDMI output. This has the advantage that no special measures need to be implemented to determine which part of the media content is currently being rendered by the display device 27 to synchronize the one or more light effects with the corresponding further light effects rendered by the display device 27.
In the embodiment of Fig. 1, the at least one input signal further comprises at least one user preference related to the one or more light rendering properties. The user preference is configured by a user in the display device 27.
In the embodiment of Fig. 1, a single display device displays media content and controls one or more further light elements based on the media content. In an alternative embodiment, multiple display devices display media content and control one or more further light elements based on the media content. The multiple display devices may be gaming monitors, for example. These gaming monitors may have auxiliary light elements, for example. A difference between normal TV viewing and gaming is that gamers often have two or three monitors placed next to each other. In one case, a game is rendered on all two or three monitors, and in another case, the game runs on one monitor while the remaining monitors might still be switched on with some other, often game-related, content.
In the embodiment of the computer 1 shown in Fig. 1, the computer 1 comprises one processor 5. In an alternative embodiment, the computer 1 comprises multiple processors. The processor 5 of the computer 1 may be a general-purpose processor, e.g. from Intel, AMD or Qualcomm or ARM-based, or an application-specific processor. The processor 5 of the computer 1 may run a Windows, macOS, Android or iOS operating system, for example. The memory 7 may comprise one or more memory units. The memory 7 may comprise solid-state memory, for example. The memory 7 may be used to store an operating system, applications and application data, for example.
The receiver 3 and the transmitter 4 may use one or more wireless communication technologies such as Wi-Fi (IEEE 802.11) to communicate with the wireless LAN access point 23, for example. In an alternative embodiment, multiple receivers and/or multiple transmitters are used instead of a single receiver and a single transmitter. In the embodiment shown in Fig. 1, a separate receiver and a separate transmitter are used. In an alternative embodiment, the receiver 3 and the transmitter 4 are combined into a transceiver. The display 9 may comprise an LCD or OLED panel, for example. The computer 1 may comprise other components typical for a computer such as a power connector. The invention may be implemented using a computer program running on one or more processors.
Instead of a desktop, laptop, tablet or mobile phone, the computer 1 may be a video module. The video module may be a dedicated HDMI module that can be put between the display device 27 and a device providing the HDMI input in order to analyze the HDMI input (e.g. when the HDMI input is not protected with HDCP), for example. In the embodiment of Fig. 1, the system comprises a single device. In an alternative embodiment, the system comprises multiple devices.
Fig. 2 depicts embodiments of the display device 27 and lighting devices 13- 14 of Fig. 1. In the embodiment of the display device 27 of Fig. 2, the display device 27 is an LCD TV and the one or more further light elements are incorporated into the TV. The one or more further light elements comprise multiple auxiliary light elements configured to illuminate an area behind and next to the TV. The one or more further light elements further comprise multiple further light elements which form a backlight of the TV, e.g. edge-lit or direct-lit. For example, a higher brightness may be used for the light effects when the LCD TV is an HDR TV with a high light output. In an alternative embodiment, e.g. when the display device 27 is an OLED TV, the one or more further light elements comprise at least one further light element that is a pixel of a display incorporated into the display device 27.
In the embodiment of Fig. 2, the display device 27 controls further light elements at both sides of the display device 27 based on the audio and/or video rendered by the display device 27. At the moment depicted in Fig. 2, these further light elements render further light effects 41 and 42. The lighting devices 13-14 are separately controlled by the computer 1 of Fig. 1 based on a light script associated with the media content being rendered by the display device 27 or based on an analysis of the media content being rendered by the display device 27. This analysis may be performed by the computer 1 and/or by another device.
At the moment depicted in Fig. 2, the lighting devices 13-14 render light effects 43 and 44, respectively. If the light effects 43 and 44 do not match the further light effects 41 and 42, this will affect the viewer’s entertainment experience, especially since the light effects 41 and 43 overlap and the light effects 42 and 44 overlap.
A first embodiment of the method of determining one or more light effects and controlling one or more light elements to render the one or more light effects is shown in Fig. 3. The one or more light elements are controlled while a display device displays media content and controls one or more further light elements based on the media content.
A step 101 comprises receiving at least one input signal comprising an identification of the display device. A step 103 comprises determining one or more light rendering properties associated with the display device based on the at least one input signal. The one or more light rendering properties represent how the display device generates light effects based on audio and/or video and/or how the light effects appear when rendered on the one or more further light elements.
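One straightforward way to implement step 103 is a lookup table keyed by the identification received in step 101. In the following Python sketch, the model name and the property fields are invented purely for illustration:

# Hypothetical table of light rendering properties; the key and the
# fields are invented examples, not actual product data.
LIGHT_RENDERING_PROPERTIES = {
    "Philips Ambilight TV / example-model": {
        "color_extraction": "edge_average",
        "dynamicity": "high",
        "uses_audio": False,
        "peak_luminance_nits": 750,
    },
}

def determine_light_rendering_properties(identification):
    """Step 103: map a display device identification to its light
    rendering properties, with a conservative fallback for unknown devices."""
    default = {"color_extraction": "unknown", "dynamicity": "medium",
               "uses_audio": False, "peak_luminance_nits": 300}
    return LIGHT_RENDERING_PROPERTIES.get(identification, default)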
The one or more light rendering properties may represent how colors are extracted from the video. For example, an average color, e.g. trimean, of a certain region of the frames of the media content may be determined and one or more of the light elements may be controlled to render a light effect with this extracted color. For light elements to the left of the display device, average colors may be extracted from a region on the left side of the frames of the media content. For light elements to the right of the display device, average colors may be extracted from a region on the right side of the frames of the media content. Alternatively, a color may be extracted, for example, from a dominant feature in the frame, e.g. a building.
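A sketch of such region-based extraction, here using a plain mean over a vertical strip of the frame (a trimean or dominant-feature extraction would follow the same pattern); the region fraction is an assumed value:

import numpy as np

def extract_edge_color(frame, side, region_frac=0.15):
    """Average RGB color of a vertical strip on the left or right of a
    frame (an H x W x 3 array), as might drive a light element placed on
    that side of the display device."""
    strip = int(frame.shape[1] * region_frac)
    region = frame[:, :strip] if side == "left" else frame[:, -strip:]
    return region.reshape(-1, 3).mean(axis=0)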
Alternatively or additionally, the one or more light rendering properties may represent a dynamicity of light effects generated by the display device. For example, the one or more light rendering properties may represent a dynamicity of a chromaticity of light effects generated by the display device and/or a dynamicity of a brightness of light effects generated by the display device. The display device may use the same or a different dynamicity for the chromaticity and the brightness of the light effects (e.g. slower brightness transitions than chromaticity transitions). If the color of a light effect is only determined based on the current video frame, the light effects will be very dynamic. To decrease the dynamicity, an average of colors extracted from multiple frames can be determined. The larger the number of frames whose colors are averaged and/or the higher the weight of colors extracted from previous frames, the lower the dynamicity of the light effects.
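The multi-frame averaging described above can, for instance, be realized as an exponential moving average, in which the weight given to previous frames directly sets the dynamicity; the weight value below is an assumption:

def smooth_colors(extracted_colors, weight_previous=0.9):
    """Reduce dynamicity by exponentially averaging per-frame colors:
    the higher weight_previous, the more previous frames dominate and
    the calmer the resulting light effects."""
    smoothed, current = [], None
    for color in extracted_colors:  # color: one (r, g, b) tuple per frame
        if current is None:
            current = color
        else:
            current = tuple(weight_previous * c + (1 - weight_previous) * n
                            for c, n in zip(current, color))
        smoothed.append(current)
    return smoothed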
Alternatively or additionally, the one or more light rendering properties may represent whether the display device generates light effects based on the audio. For example, the brightness of the light effects may be determined based on the intensity of the audio. This works well for music, for example. The display device may generate the light effects only based on the audio or based on both the audio and the video, for example.
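For example, audio-driven brightness might be derived from the RMS intensity of a block of samples, as in this sketch; the mapping is an invented example:

import numpy as np

def brightness_from_audio(samples, max_bri=254):
    """Map the intensity (RMS) of a block of audio samples in [-1, 1] to a
    brightness value, so louder passages yield brighter light effects."""
    rms = float(np.sqrt(np.mean(np.square(samples))))
    return int(min(rms, 1.0) * max_bri)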
Alternatively or additionally, the one or more light rendering properties may represent a relationship between colors of the video and colors rendered by the one or more further light elements. As a first example, a High Dynamic Range (HDR) TV may use a much higher light output when rendering bright pixels than a non-HDR TV (e.g. by increasing the light output of the backlight in an LCD TV or by increasing the light output of the pixels themselves in an OLED TV). As a second example, black/blackish colors in the video typically appear much blacker when rendered by an OLED TV than when rendered on an LCD TV.
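Such a relationship could, for example, be approximated with a simple display model; the gamma-curve model below is a deliberate simplification and all parameter values are assumptions:

def expected_panel_luminance(pixel_level, peak_nits, gamma=2.2):
    """Rough estimate of the luminance (in nits) a display produces for a
    normalized pixel level in [0, 1]. An HDR TV with a high peak_nits
    renders bright pixels far brighter than a non-HDR TV, which the
    determined light effects can take into account."""
    return peak_nits * (pixel_level ** gamma)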
A step 105 comprises determining the one or more light effects based on the one or more light rendering properties and further based on the media content and/or a light script associated with the media content. A step 107 comprises controlling the one or more light elements to render the one or more light effects.
A second embodiment of the method of determining one or more light effects and controlling one or more light elements, e.g. Hue lights, to render the one or more light effects is shown in Fig. 4. In a step 121, it is checked whether an identifiable display device is detected, e.g. using UPnP or Bluetooth technology. If an identifiable display device is detected, a step 123 is performed. If no identifiable display device is detected, a step 127 is performed. Step 123 comprises transmitting a request to the display device to receive an identification and a user preference if applicable. Step 125 comprises receiving an input signal comprising the identification of the display device, and the user preference if applicable. The identification may comprise the brand of the display device, e.g. “Philips Ambilight TV”, or may specify a model number, for example.
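The UPnP-based detection of step 121 could start with a standard SSDP search, as sketched below; parsing the responses to recognize a display device with light rendering capabilities is omitted:

import socket

def discover_upnp_devices(timeout=2.0):
    """Send an SSDP M-SEARCH (the discovery step of UPnP) and collect the
    raw responses from devices on the local network."""
    msg = ("M-SEARCH * HTTP/1.1\r\n"
           "HOST: 239.255.255.250:1900\r\n"
           'MAN: "ssdp:discover"\r\n'
           "MX: 2\r\n"
           "ST: ssdp:all\r\n\r\n").encode()
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(msg, ("239.255.255.250", 1900))
    responses = []
    try:
        while True:
            data, addr = sock.recvfrom(65507)
            responses.append((addr[0], data.decode(errors="replace")))
    except socket.timeout:
        pass
    finally:
        sock.close()
    return responses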
For example, the Philips HueSync app could itself check if there are any connected display devices that have light rendering capabilities. Since most display devices are connected these days and often have an open API, it should be possible for the HueSync app to get a sense of what is connected and what mode it is currently in (e.g. the Ambilight feature might be off, so no adaptation would be required). Step 127 comprises asking a user to provide an identification of the display device rendering further light effects, e.g. by asking the user to select from a list of display device brands and/or models using a touchscreen. A step 129 comprises receiving an input signal comprising the identification of the display device. Step 131 comprises determining which settings are configurable on the identified display device and asking the user to specify the values of these settings. Step 133 comprises receiving an input signal comprising the configured values for the corresponding settings, reflecting the user preferences.
Step 103 of Fig. 3 is performed after steps 125 and 133. Step 103 comprises determining one or more light rendering properties associated with the display device based on the input signal(s).
Next, a step 135 comprises checking whether the user has chosen to render a light script associated with the media content or to use audio and/or video analysis to determine the light effects. If the user has chosen the former, a step 139 is performed. If the user has chosen the latter, a step 137 is performed. Step 139 comprises parsing the light script. Step 137 comprises analyzing the audio and/or video. One or more of the light rendering properties may be used in step 137, e.g. to determine which color extraction method to use and whether audio should be analyzed or not.
Next, steps 105 and 107 of Fig. 3 are performed. Step 105 comprises determining the one or more light effects based on the one or more light rendering properties and further based on the analysis of the media content or the light script. One or more light rendering properties may be used in step 105 to determine the light effects, e.g. the dynamicity of the light effects, from the audio and/or video analysis results of step 137. Alternatively, the one or more light rendering properties may be used to adjust the light effect parameters specified in the light script and parsed in step 139.
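For the light-script path, the adjustment of step 105 could look like the following sketch, in which a script entry is a dictionary with invented 'bri' and 'transition_s' fields:

def adjust_scripted_effect(effect, properties):
    """Adapt one parsed light script entry to the display device's light
    rendering properties, e.g. by matching the transition speed to the
    dynamicity of the further light effects."""
    adjusted = dict(effect)
    if properties["dynamicity"] == "high":
        adjusted["transition_s"] = effect["transition_s"] * 0.5  # livelier
    elif properties["dynamicity"] == "low":
        adjusted["transition_s"] = effect["transition_s"] * 2.0  # calmer
    return adjusted

For the analysis path, the same properties could instead select the color extraction method and smoothing weight used in step 137.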
The analysis of the media content and the determination of the light effects from the results of this analysis or from the light script may be adjusted by changing a preset. Instead of allowing the user to change this preset manually, this preset is now changed automatically. For example, if an Ambilight TV in dynamic mode is detected, the HueSync app could first search for the preset “Ambilight-Dynamic” in its memory and activate this preset if found. Thus, such a preset would describe how the light effects generated by the HueSync app should be determined. Moreover, the preset might also contain a recommendation for the user to configure the display device in a certain manner if an optimal experience is difficult to achieve by the HueSync app with the current user preference(s). This recommendation may be output to the user.
The analysis of step 137 and/or the determination of step 105 result in an adjustment of the light effects that would normally be rendered, e.g. the light effects specified in the light script. These adjustments may include:
• Change in color palette. Ambilight TVs often use an automatic way of generating light effects that depends on the color palette on screen. The way colors are extracted might differ from the way used in the HueSync app. To avoid a color mismatch for lights next to the screen, the HueSync app could use an algorithm that is closer or similar to the algorithm used by the Ambilight TV.
• Brightness adjustment. The display device’s light elements will increase the overall brightness of the light effects. If a specific level of brightness is requested by the user in the HueSync app, the app could take into account the additional brightness created by the display device (see the sketch after this list). This does not only apply to auxiliary light elements: the adjustment may also be based on the specified maximum lumen output of the display device, preferably combined with retrieved brightness settings (user preferences).
• Dynamics (speed of chromaticity and brightness change). If there is a mismatch between the dynamicity setting in the HueSync app and the dynamicity of the further light effects generated by the display device, the HueSync app could automatically adjust the dynamicity of the light effects that it determines to match the dynamicity of the further light effects generated by the display device.
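The brightness adjustment referenced in the list above might, in its simplest form, subtract the display device’s estimated contribution; the following sketch and its parameter names are illustrative assumptions:

def compensate_brightness(requested_bri, display_contribution_bri, max_bri=254):
    """Lower the brightness sent to the separately controlled light
    elements by the brightness already contributed by the display
    device's own light elements, so the combined level approximates the
    level requested by the user."""
    return max(0, min(max_bri, requested_bri - display_contribution_bri))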
The determination of the light effects may also depend on the position of the Hue lights around the display device. For example, in the case of an Ambilight TV, if Hue lights are not located next to the Ambilight TV, adjustment of the light effects could be omitted. However, if the Hue lights are positioned such that light effects from the Hue lights are expected to mix with the light effects from the Ambilight TV, then adjustment is beneficial.
Step 107 comprises controlling the one or more light elements, e.g. the Hue lights, to render the one or more light effects. Step 137 or step 139 is repeated after step 107, depending on whether step 137 or step 139 was performed before step 105.
Fig. 5 depicts a block diagram illustrating an exemplary data processing system that may perform the method as described with reference to Figs. 3 and 4. As shown in Fig. 5, the data processing system 500 may include at least one processor 502 coupled to memory elements 504 through a system bus 506. As such, the data processing system may store program code within the memory elements 504. Further, the processor 502 may execute the program code accessed from the memory elements 504 via the system bus 506. In one aspect, the data processing system may be implemented as a computer that is suitable for storing and/or executing program code. It should be appreciated, however, that the data processing system 500 may be implemented in the form of any system including a processor and a memory that can perform the functions described within this specification.
The memory elements 504 may include one or more physical memory devices such as, for example, local memory 508 and one or more bulk storage devices 510. The local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code. A bulk storage device may be implemented as a hard drive or other persistent data storage device. The processing system 500 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the number of times program code must be retrieved from the bulk storage device 510 during execution. The processing system 500 may also be able to use memory elements of another processing system, e.g. if the processing system 500 is part of a cloud-computing platform.
Input/output (I/O) devices depicted as an input device 512 and an output device 514 optionally can be coupled to the data processing system. Examples of input devices may include, but are not limited to, a keyboard, a pointing device such as a mouse, a microphone (e.g. for voice and/or speech recognition), or the like. Examples of output devices may include, but are not limited to, a monitor or a display, speakers, or the like.
Input and/or output devices may be coupled to the data processing system either directly or through intervening I/O controllers.
In an embodiment, the input and the output devices may be implemented as a combined input/output device (illustrated in Fig. 5 with a dashed line surrounding the input device 512 and the output device 514). An example of such a combined device is a touch sensitive display, also sometimes referred to as a “touch screen display” or simply “touch screen”. In such an embodiment, input to the device may be provided by a movement of a physical object, such as e.g. a stylus or a finger of a user, on or near the touch screen display.
A network adapter 516 may also be coupled to the data processing system to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks. The network adapter may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to the data processing system 500, and a data transmitter for transmitting data from the data processing system 500 to said systems, devices and/or networks. Modems, cable modems, and Ethernet cards are examples of different types of network adapter that may be used with the data processing system 500.
As pictured in Fig. 5, the memory elements 504 may store an application 518. In various embodiments, the application 518 may be stored in the local memory 508, the one or more bulk storage devices 510, or separate from the local memory and the bulk storage devices. It should be appreciated that the data processing system 500 may further execute an operating system (not shown in Fig. 5) that can facilitate execution of the application 518.
The application 518, being implemented in the form of executable program code, can be executed by the data processing system 500, e.g., by the processor 502. Responsive to executing the application, the data processing system 500 may be configured to perform one or more operations or method steps described herein.
Various embodiments of the invention may be implemented as a program product for use with a computer system, where the program(s) of the program product define functions of the embodiments (including the methods described herein). In one embodiment, the program(s) can be contained on a variety of non-transitory computer-readable storage media, where, as used herein, the expression “non-transitory computer readable storage media” comprises all computer-readable media, with the sole exception being a transitory, propagating signal. In another embodiment, the program(s) can be contained on a variety of transitory computer-readable storage media. Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored. The computer program may be run on the processor 502 described herein.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of embodiments of the present invention has been presented for purposes of illustration, but is not intended to be exhaustive or limited to the implementations in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the present invention. The embodiments were chosen and described in order to best explain the principles and some practical applications of the present invention, and to enable others of ordinary skill in the art to understand the present invention for various embodiments with various modifications as are suited to the particular use contemplated.

CLAIMS:
1. A system (1) for determining one or more light effects (43,44) and controlling one or more light elements (13,14) to render said one or more light effects (43,44) while a display device (27) displays media content and controls one or more further light elements based on said media content, said system (1) comprising:
at least one input interface (3);
at least one output interface (4); and
at least one processor (5) configured to:
- use said at least one input interface (3) to receive at least one input signal comprising an identification of said display device (27),
- determine one or more light rendering properties associated with said display device (27) based on said input signal, said one or more light rendering properties representing how said display device (27) generates light effects based on audio and/or video and/or how said light effects appear when rendered on said one or more further light elements,
- determine said one or more light effects (43,44) based on said one or more light rendering properties and further based on said media content and/or a light script associated with said media content, and
- use said at least one output interface (4) to control said one or more light elements (13,14) to render said one or more light effects (43,44).
2. A system (1) as claimed in claim 1, wherein said at least one input signal further comprises at least one user preference related to said one or more light rendering properties, said user preference being configured by a user in said display device (27).
3. A system (1) as claimed in claim 1 or 2, wherein said at least one input signal is received from a user input device (9) or from said display device (27).
4. A system (1) as claimed in any one of the preceding claims, wherein said one or more light rendering properties represent how colors are extracted from the video.
5. A system (1) as claimed in any one of the preceding claims, wherein said one or more light rendering properties represent a dynamicity of light effects generated by said display device (27).
6. A system (1) as claimed in claim 5, wherein said one or more light rendering properties represent a dynamicity of a chromaticity of light effects generated by said display device (27) and/or a dynamicity of a brightness of light effects generated by said display device (27).
7. A system (1) as claimed in any one of the preceding claims, wherein said one or more light rendering properties represent whether said display device (27) generates light effects based on the audio.
8. A system (1) as claimed in any one of the preceding claims, wherein said one or more light rendering properties represent a relationship between colors of the video and colors rendered by said one or more further light elements.
9. A system (1) as claimed in any one of the preceding claims, wherein said one or more further light elements are incorporated into said display device (27).
10. A system (1) as claimed in claim 9, wherein said one or more further light elements comprise at least one auxiliary light element configured to illuminate an area behind and/or next to said display device (27).
11. A system (1) as claimed in claim 9 or 10, wherein said one or more further light elements comprise at least one further light element which forms a backlight of said display device (27).
12. A system (1) as claimed in claim 9 or 10, wherein said one or more further light elements comprise at least one further light element which is a pixel of a display incorporated into said display device (27).
13. A method of determining one or more light effects and controlling one or more light elements to render said one or more light effects while a display device displays media content and controls one or more further light elements based on said media content, said method comprising:
- receiving (101, 125, 129) at least one input signal comprising an identification of said display device,
- determining (103) one or more light rendering properties associated with said display device based on said at least one input signal, said one or more light rendering properties representing how said display device generates light effects based on audio and/or video and/or how said light effects appear when rendered on said one or more further light elements;
- determining (105) said one or more light effects based on said one or more light rendering properties and further based on said media content and/or a light script associated with said media content; and
- controlling (107) said one or more light elements to render said one or more light effects.
14. A computer program or suite of computer programs comprising at least one software code portion or a computer program product storing at least one software code portion, the software code portion, when run on a computer system, being configured for enabling the method of claim 13 to be performed.