EP4581908A1 - Rendering entertainment light effects based on preferences of the nearest user - Google Patents

Rendering entertainment light effects based on preferences of the nearest user

Info

Publication number
EP4581908A1
Authority
EP
European Patent Office
Prior art keywords
user
light effects
nearest
media content
mobile device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP23755424.1A
Other languages
English (en)
French (fr)
Other versions
EP4581908B1 (de)
Inventor
Niek Marcellus Cornelis Martinus JANSSEN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Signify Holding BV
Original Assignee
Signify Holding BV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Signify Holding BV filed Critical Signify Holding BV
Publication of EP4581908A1 publication Critical patent/EP4581908A1/de
Application granted granted Critical
Publication of EP4581908B1 publication Critical patent/EP4581908B1/de
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/165Controlling the light source following a pre-assigned programmed sequence; Logic control [LC]
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/155Coordinated control of two or more light sources

Definitions

  • the invention relates to a system for controlling one or more lighting devices to render light effects determined based on media content to accompany a rendering of said media content.
  • the invention also relates to a computer program product enabling a computer system to perform such a method.
  • systems may be used which are able to control lighting devices to change color depending on video and/or audio content.
  • An example of such a system is the Hue system in which the audio or video can be passed through a Hue (HDMI) Sync box or generated from a desktop PC (Hue Sync PC desktop app).
  • the audio and/or video is analyzed, and the colors rendered by a group of lighting devices change depending on the analyzed content. In other words, the lighting devices are controlled to render entertainment light effects.
  • An HDMI module like the Hue Sync box sits in between HDMI devices and the TV, receiving the signal earlier than the TV and thereby enabling a seamless effect that matches what is being shown on the TV screen.
  • the Hue Sync box controls the group of lighting devices to render entertainment light effects based on the relative positions of the lighting devices in the users’ room, which a user has indicated during the creation of a so-called entertainment area. During this process, the user can select several lighting devices and drag and drop them in a 3D visualization of a room.
  • users can change brightness, which basically dims down the lighting devices across the entire entertainment area, i.e. reduces their light output levels.
  • users can change intensity mode: a selection of four options ranging from “subtle” to “extreme” which correspond to different levels of dynamicity of the light effects.
  • With the “subtle” setting, effects only happen occasionally on drastic changes within the content (e.g., from blue to orange), whereas with the “extreme” setting, the lighting devices react to any (minor) change within the content.
  • a system for controlling one or more lighting devices to render light effects determined based on media content to accompany a rendering of said media content comprises at least one receiver, at least one transmitter, and at least one processor configured to detect whether one or more users are in an environment, if multiple users are detected to be in said environment, determine which user of said multiple users is nearest to said system based on short-range wireless signals received, via said at least one receiver, from mobile devices of said multiple users, retrieve, from a memory, a user profile associated with said user nearest to said system, said user profile specifying user preferences, determine said light effects based on said media content and said user preferences, and control, via said at least one transmitter, said one or more lighting devices to render said light effects.
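The claimed control flow can be sketched as a short, illustrative Python function. The step names (detect_users, nearest_user, load_profile, determine_effects, render_effects) are hypothetical stand-ins for the receiver-, memory-, and transmitter-backed operations described above, not an implementation of the actual system:

```python
def run_entertainment_session(detect_users, nearest_user, load_profile,
                              determine_effects, render_effects):
    """One pass of the claimed control flow, with each step injected
    as a callable so the sketch stays device-agnostic."""
    users = detect_users()                      # who is in the environment?
    if not users:
        return None                             # nobody detected: do nothing
    # With a single user there is nothing to disambiguate; otherwise pick
    # the user nearest to the system via short-range wireless signals.
    user = users[0] if len(users) == 1 else nearest_user(users)
    preferences = load_profile(user)            # retrieve the user profile
    effects = determine_effects(preferences)    # media content + preferences
    render_effects(effects)                     # control the lighting devices
    return user
```

Each callable can be swapped for any of the detection and selection variants discussed below without changing the overall flow.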
  • the HDMI module may determine, e.g. using Wi-Fi and device usage, whether and how many people of the household are in an environment (e.g. at home).
  • multi-user geofencing may be used to detect who is in the environment (e.g. at home), e.g. by detecting who has arrived home over time and/or who has left home over time.
  • the HDMI module may perform the next step of determining which user is more likely to have turned the HDMI module on or to have caused the HDMI module to turn on.
  • a short-range wireless technology like Bluetooth may then be used to scan for smartphones in the area to identify people near the HDMI module when it turns on. The person near the HDMI module may then be identified and this person’s profile, including their preferred settings, may then be loaded.
  • Said at least one processor may be configured to receive, via said at least one receiver, absolute locations of a plurality of mobile devices, compare said absolute locations of said plurality of mobile devices with an absolute location of said environment, and for each mobile device of said plurality of mobile devices which has an absolute location in a predefined spatial area around said absolute location of said environment, determine that a user of said mobile device is in said environment.
  • existing technologies implementing Geofencing may be used, e.g. the Hue Home and Away feature. This feature works even if users are in said environment (e.g. at home) but not connected to the (home) network, as long as the absolute locations of their mobile devices can be obtained. If the Hue Home and Away feature is used, users need to have access to the Hue cloud.
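The geofencing comparison above reduces to a distance test between each mobile device's absolute location and the environment's absolute location. A minimal sketch, assuming WGS84 latitude/longitude coordinates and a hypothetical radius for the predefined spatial area:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 coordinates."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def users_in_environment(home, device_locations, radius_m=100.0):
    """Return the users whose mobile device lies inside the geofence.

    home: (lat, lon) of the environment; device_locations maps each
    user to the (lat, lon) of their mobile device."""
    return [user for user, (lat, lon) in device_locations.items()
            if haversine_m(home[0], home[1], lat, lon) <= radius_m]
```

The 100 m radius is an assumption for illustration; a deployed geofencing feature would use whatever spatial area the service defines.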
  • Said at least one processor may be configured to identify a (home) network to which said system is connected, detect whether one or more of a plurality of mobile devices are connected to said (home) network, and for each mobile device of said plurality of mobile devices which is connected to said (home) network, determine that a user of said mobile device is in said environment. Technologies are known that detect whether a first device and a second device are connected to the same network and only allow the second device to be controlled by the first device if both devices are connected to the same network. These same technologies may be used for determining which user profile to select. These technologies work even if users have disabled location services on their mobile phones, e.g. for privacy reasons.
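The same-network variant can be illustrated as a simple lookup. The device-to-network and device-to-owner mappings here are hypothetical data structures, since the text does not specify how this information is obtained:

```python
def users_on_home_network(network_id, device_networks, device_owners):
    """Users whose mobile device is connected to the same network as the system.

    device_networks maps a device identifier to the network it is on;
    device_owners maps a device identifier to its (single) user."""
    return sorted({device_owners[dev]
                   for dev, net in device_networks.items()
                   if net == network_id and dev in device_owners})
```

Devices without a registered owner are simply skipped, mirroring the later observation that a new profile may be created when an unknown device identifier is seen.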
  • Said at least one processor may be configured to determine a signal strength of each of said short-range wireless signals received from said mobile devices of said multiple users, select a mobile device from said mobile devices of said multiple users, said signal received from said selected mobile device having a highest signal strength of said signal strengths, and determine that a user of said selected mobile device is nearest to said system. This is beneficial, as signal strength reduces with distance. When the user is in another room, walls or ceilings may additionally reduce signal strength, but it is less likely that a user who is watching and/or listening to the content (and perceiving the entertainment light effects) is in another room than the system.
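Selecting the mobile device with the highest signal strength reduces to taking a maximum over received signal strengths. A minimal sketch, assuming RSSI is reported per user in (negative) dBm, as Bluetooth receivers commonly do:

```python
def nearest_user_by_rssi(rssi_dbm):
    """Pick the user whose device has the strongest short-range signal.

    RSSI values are negative dBm; the value closest to zero is the
    strongest signal, which the text takes as a proxy for proximity."""
    if not rssi_dbm:
        raise ValueError("no short-range signals received")
    return max(rssi_dbm, key=rssi_dbm.get)
```

Ties and smoothing over multiple scans are left out; a real system would likely average several readings before deciding.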
  • Said at least one processor may be configured to receive, via said at least one receiver, a control signal from a further mobile device of a further user of said multiple users, said control signal comprising a command to start rendering said light effects, select said further user instead of said user nearest to said system, retrieve a further user profile associated with said further user, said further user profile specifying further user preferences, determine said light effects based on said media content and said further user preferences, and control, via said at least one transmitter, said one or more lighting devices to render said light effects.
  • Said at least one processor may be configured to transmit, via said at least one transmitter, a notification to a mobile device of said user nearest to said system, said notification notifying said user that said user’s user profile is now active.
  • This notification may be provided to the user via an app, for example. This notification allows a user to take action if the wrong user profile has been automatically selected.
  • Said at least one processor may be configured to receive, via said at least one receiver, a second control signal from said mobile device of said user nearest to said system, said second control signal identifying another user, select said other user instead of said user nearest to said system, retrieve another user profile associated with said other user, said other user profile specifying other user preferences, determine said light effects based on said media content and said other user preferences, and control, via said at least one transmitter, said one or more lighting devices to render said light effects.
  • the notification may be used to confirm whether the person that is recognized is indeed the user for whom the entertainment light effects are intended. If not, the person that is recognized is able to select a different profile manually. Alternatively, the user for whom the light effects are intended may be able to select a different profile manually.
  • machine learning may be used to identify situations in which the user profile of the user nearest to the system should not be selected. For example, if a different user profile is often selected manually instead of the automatically selected user profile of the nearest user when a certain user is nearest to the system, e.g. under certain circumstances, the system may learn not to select the user profile of this user automatically, e.g. under these certain circumstances. For instance, the system may learn that the user profile of the nearest user should not be selected if a voice assistant is used to control the system and the nearest user is a certain user (who does not use voice assistants).
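For illustration only, the learning described here could be approximated by counting manual overrides per nearest user and context; the threshold and the context labels below are assumptions, and a real system might use a trained classifier instead:

```python
from collections import Counter

class OverrideLearner:
    """Counting stand-in for the learning described above: if the
    auto-selected nearest-user profile is manually overridden often
    enough in a given context, stop auto-selecting it in that context."""

    def __init__(self, threshold=3):
        self.threshold = threshold
        self._overrides = Counter()

    def record_override(self, nearest_user, context):
        """Called whenever a different profile is selected manually."""
        self._overrides[(nearest_user, context)] += 1

    def should_autoselect(self, nearest_user, context):
        """Keep auto-selecting until the override count hits the threshold."""
        return self._overrides[(nearest_user, context)] < self.threshold
```

For instance, repeated overrides in a hypothetical "voice-assistant" context would suppress auto-selection for that user only in that context.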
  • Said user profile may specify user preferences per content type and said at least one processor may be configured to obtain a content type of said media content, select user preferences associated with said content type from said user profile, and determine said light effects based on said media content and said user preferences associated with said content type. For example, a user may prefer different settings, e.g. a higher level of dynamicity, for music than for video or games.
  • the content type may be determined based on metadata associated with the content, for example.
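Per-content-type preference lookup can be sketched as a dictionary access with a fallback category. The profile layout and the "video" fallback are assumptions for illustration, not the patent's specified data model:

```python
def preferences_for(profile, content_type, fallback="video"):
    """Select the per-content-type preferences from a user profile,
    falling back to another category when none are stored."""
    per_type = profile.get("preferences", {})
    return per_type.get(content_type, per_type.get(fallback, {}))
```

So a user who prefers a higher level of dynamicity for music than for video would store separate entries per category, and an unknown type (e.g. gaming) would fall back to the video settings.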
  • a computer program for carrying out the methods described herein, as well as a non-transitory computer readable storage-medium storing the computer program are provided.
  • a computer program may, for example, be downloaded by or uploaded to an existing device or be stored upon manufacturing of these systems.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java(TM), Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • Fig. 4 is a flow diagram of a first embodiment of the method
  • Fig. 6 is a flow diagram of a third embodiment of the method.
  • Fig. 7 is a flow diagram of a fourth embodiment of the method.
  • Fig. 9 is a flow diagram of a sixth embodiment of the method.
  • Fig. 10 is a block diagram of an exemplary data processing system for performing the method of the invention.
  • the HDMI module 1 may be able to communicate directly with the bridge 19, e.g. using Zigbee technology, and/or may be able to communicate with the bridge 19 via the Internet/cloud, e.g. via Internet server 29.
  • the HDMI module 1 may be able to control the lighting devices 11-13 without a bridge, e.g. directly via Wi-Fi, Bluetooth or Zigbee or via the Internet/cloud.
  • the wireless LAN access point 21 is connected to the Internet 25.
  • a media server 27 is also connected to the Internet 25.
  • Media server 27 may be a server of a video-on- demand service such as Netflix, Amazon Prime Video, Hulu, Disney+, or Apple TV+, for example.
  • the HDMI module 1 is connected to a display device 23 and local media receivers 31 and 32 via HDMI.
  • the local media receivers 31 and 32 may comprise one or more streaming or content generation devices, e.g., an Apple TV, Microsoft Xbox and/or Sony PlayStation, and/or one or more cable or satellite TV receivers.
  • the mobile device 34 and/or the mobile device 35 may be used for controlling and configuring the HDMI module 1.
  • the user may be able to define an entertainment area by using an app.
  • the user has included lighting devices 11 and 12 in the entertainment area.
  • This entertainment area may be defined in the user preferences or may be user-independent. Only the lighting devices included in the entertainment area are controlled to render entertainment light effects.
  • a user profile with user preferences may be created manually and/or automatically. For example, after the user has defined the entertainment area, the user may be brought to the main user interface of the HDMI module 1 in the app. There the user may be given the option to create a profile. Within that profile, the user may be able to select their preferred settings per content category, e.g. music, video, and/or gaming. Options within each category may vary from brightness, minimum background lighting level and intensity modes, to preferred HDMI channel and entertainment area.
  • the HDMI module 1 or app does this automatically based on the settings that the user selects within the user’s entertainment session. The selected settings are then stored in the automatically created profile.
  • the processor 5 may be configured to receive, via the receiver 3, absolute locations of a plurality of mobile devices, e.g. from the Internet server 29, compare the absolute locations of the plurality of mobile devices with an absolute location of the environment (e.g. the home), and for each mobile device of the plurality of mobile devices which has an absolute location in a predefined spatial area around the absolute location of the environment, determine that a user of the mobile device is in said environment.
  • the HDMI module 1 is connected directly to the wireless LAN access point 21.
  • the HDMI module 1 may be connected indirectly to the (home) network via the bridge 19.
  • the processor 5 is configured to determine a signal strength of each of the short-range wireless signals received from the mobile devices of the multiple users, select a mobile device whose received signal has a highest signal strength of the signal strengths, and determine that a user of the selected mobile device is nearest to the HDMI module 1. In an alternative embodiment, it is determined in a different manner which user of the multiple users is nearest to the HDMI module 1 based on short-range wireless signals received from mobile devices of the multiple users.
  • the selected mobile device or each mobile device may transmit information identifying the user of the mobile device.
  • the processor 5 may be configured to retrieve the user profile associated with this user identifier.
  • the selected mobile device or each mobile device may transmit a device identifier.
  • the processor 5 may be configured to retrieve the user profile associated with this device identifier. This will work in many practical situations, as a mobile device often only has one user.
  • Existing device identifiers, e.g. Bluetooth device identifiers, may be used, for example.
  • a new user profile may be created automatically, e.g. with system-default preferences or the last used preferences, and then linked to this new device identifier.
  • a first embodiment of the method of controlling one or more lighting devices to render light effects determined based on media content to accompany a rendering of the media content is shown in Fig. 4.
  • the method may be performed by the HDMI module 1 of Fig. 1 or the lighting system 41 of Fig. 3, for example.
  • the user profile retrieved in step 105 or step 111 specifies user preferences. These user preferences may indicate, for example, one or more of: a level of dynamicity of the light effects, a light output level of the light effects, a minimum light output level, the one or more lighting devices to be used for rendering the light effects, and a source which provides the media content.
  • a step 107 comprises determining the light effects based on the media content and the user preferences retrieved in step 105 or step 111. For example, in step 107, an average color may be extracted from one or more video frames at each video frame of the content. The number of frames from which the average color is extracted may depend on the user preferences. The light output level of the light effects may (also) depend on the user preferences, for example.
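Average-color extraction over a window of frames can be sketched in pure Python; a real implementation would operate on decoded video buffers (e.g. with NumPy), and the frame representation below, a sequence of (r, g, b) tuples, is an assumption for illustration:

```python
def average_color(frames):
    """Average RGB over a window of frames, where each frame is a
    sequence of (r, g, b) pixel tuples with 0-255 components."""
    totals = [0, 0, 0]
    count = 0
    for frame in frames:
        for r, g, b in frame:
            totals[0] += r
            totals[1] += g
            totals[2] += b
            count += 1
    if count == 0:
        return (0, 0, 0)          # no pixels: fall back to black
    return tuple(t // count for t in totals)
```

As the text notes, the number of frames fed into such a window, and the brightness applied to the result, would follow from the active user preferences.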
  • a step 109 comprises controlling the one or more lighting devices to render the light effects determined in step 107.
  • A second embodiment of the method of controlling one or more lighting devices to render light effects determined based on media content to accompany a rendering of the media content is shown in Fig. 5.
  • the method may be performed by the HDMI module 1 of Fig. 1 or the lighting system 41 of Fig. 3, for example.
  • the second embodiment of Fig. 5 is an extension of the first embodiment of Fig. 4.
  • a step 161 comprises obtaining a content type of the media content.
  • Step 101 comprises detecting whether one or more users are in an environment.
  • a step 102 comprises checking, based on the results of step 101, whether a single user or multiple users are detected to be in said environment. If it is determined in step 102 that multiple users are detected to be in said environment, then a step 103 is performed next. If it is determined in step 102 that a single user is detected to be in said environment, then a step 111 is performed next.
  • Step 103 comprises determining which user of the multiple users is nearest to the system based on short-range wireless signals received from mobile devices of the multiple users. In the embodiment of Fig. 5, step 103 is implemented by steps 163, 165, and 167.
  • a step 173 comprises transmitting a notification to the mobile device of the user nearest to the system, as determined in step 103, or to the mobile device of the single user detected in step 101. This notification notifies the user that the user’s user profile is now active.
  • Step 101 comprises detecting whether one or more users are in said environment.
  • a step 102 comprises checking, based on the results of step 101, whether a single user or multiple users are detected to be in said environment. If it is determined in step 102 that multiple users are detected to be in said environment, then a step 103 is performed next. If it is determined in step 102 that a single user is detected to be in said environment, then step 111 is performed next.
  • A fourth embodiment of the method of controlling one or more lighting devices to render light effects determined based on media content to accompany a rendering of the media content is shown in Fig. 7.
  • the method may be performed by the HDMI module 1 of Fig. 1 or the lighting system 41 of Fig. 3, for example.
  • the fourth embodiment of Fig. 7 is an extension of the first embodiment of Fig. 4.
  • step 173 is performed after step 105 or step 111 of Fig. 4 has been performed.
  • Step 173 comprises transmitting a notification to the mobile device of the user nearest to the system, as determined in step 103, or to the mobile device of the single user detected in step 101. This notification notifies the user that the user’s user profile is now active.
  • Step 107 is performed after step 225 has been performed. If no control signal identifying another user is received, then step 107 is performed directly after step 173. Step 107 comprises determining the light effects based on the media content and the user preferences retrieved in step 105, step 111, or step 225.
  • Step 101 comprises detecting whether one or more users are in an environment.
  • Step 101 is implemented by steps 241, 243, and 245.
  • Step 241 comprises receiving absolute locations of a plurality of mobile devices.
  • Step 243 comprises comparing the absolute locations of the plurality of mobile devices, as received in step 241, with an absolute location of the home.
  • Step 245 comprises determining, based on the comparison of step 243, that a user of a mobile device is in said environment when the mobile device has an absolute location in a predefined spatial area around the absolute location of the environment.
  • A sixth embodiment of the method of controlling one or more lighting devices to render light effects determined based on media content to accompany a rendering of the media content is shown in Fig. 9.
  • the method may be performed by the HDMI module 1 of Fig. 1 or the lighting system 41 of Fig. 3, for example.
  • Figs. 4 to 9 differ from each other in multiple aspects, i.e., multiple steps have been added or replaced. In variations on these embodiments, only a subset of these steps is added or replaced and/or one or more steps is omitted. For example, steps 163-167 may be omitted from the embodiment of Fig. 5 and/or added to one or more of the embodiments of Fig. 4 and Figs. 6 to 9. Moreover, one or more of the embodiments of Figs. 4 to 9 may be combined. For example, the embodiments of Figs. 4 to 7 may be combined with the embodiment of Fig. 8 or Fig. 9.
  • the memory elements 304 may include one or more physical memory devices such as, for example, local memory 308 and one or more bulk storage devices 310.
  • the local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code.
  • a bulk storage device may be implemented as a hard drive or other persistent data storage device.
  • the processing system 300 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the quantity of times program code must be retrieved from the bulk storage device 310 during execution.
  • the processing system 300 may also be able to use memory elements of another processing system, e.g. if the processing system 300 is part of a cloud-computing platform.
  • the input and the output devices may be implemented as a combined input/output device (illustrated in Fig. 10 with a dashed line surrounding the input device 312 and the output device 314).
  • An example of a combined device is a touch-sensitive display, also sometimes referred to as a “touch screen display” or simply a “touch screen”.
  • input to the device may be provided by a movement of a physical object, such as e.g. a stylus or a finger of a user, on or near the touch screen display.
  • the memory elements 304 may store an application 318.
  • the application 318 may be stored in the local memory 308, the one or more bulk storage devices 310, or separate from the local memory and the bulk storage devices.
  • the data processing system 300 may further execute an operating system (not shown in Fig. 10) that can facilitate execution of the application 318.
  • the application 318 being implemented in the form of executable program code, can be executed by the data processing system 300, e.g., by the processor 302. Responsive to executing the application, the data processing system 300 may be configured to perform one or more operations or method steps described herein.
  • Various embodiments of the invention may be implemented as a program product for use with a computer system, where the program(s) of the program product define functions of the embodiments (including the methods described herein).
  • the program(s) can be contained on a variety of non-transitory computer-readable storage media, where, as used herein, the expression “non-transitory computer readable storage media” comprises all computer-readable media, with the sole exception being a transitory, propagating signal.
  • the program(s) can be contained on a variety of transitory computer-readable storage media.
  • Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored.
  • the computer program may be run on the processor 302 described herein.
  • the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.

Landscapes

  • Circuit Arrangement For Electric Light Sources In General (AREA)
EP23755424.1A 2022-08-29 2023-08-18 Rendering entertainment light effects based on preferences of the nearest user Active EP4581908B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP22192560 2022-08-29
PCT/EP2023/072771 WO2024046781A1 (en) 2022-08-29 2023-08-18 Rendering entertainment light effects based on preferences of the nearest user

Publications (2)

Publication Number Publication Date
EP4581908A1 true EP4581908A1 (de) 2025-07-09
EP4581908B1 EP4581908B1 (de) 2025-12-17

Family

ID=83151969

Family Applications (1)

Application Number Title Priority Date Filing Date
EP23755424.1A Active EP4581908B1 (de) 2022-08-29 2023-08-18 Rendering entertainment light effects based on preferences of the nearest user

Country Status (3)

Country Link
EP (1) EP4581908B1 (de)
CN (1) CN119790713A (de)
WO (1) WO2024046781A1 (de)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160037225A1 (en) * 2008-08-28 2016-02-04 Lg Electronics Inc. Video display apparatus and method of setting user viewing conditions

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022013131A1 (en) * 2020-07-13 2022-01-20 Signify Holding B.V. Selecting lighting devices for rendering entertainment lighting based on relative distance information
WO2022144171A1 (en) * 2021-01-04 2022-07-07 Signify Holding B.V. Adjusting light effects based on adjustments made by users of other systems

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160037225A1 (en) * 2008-08-28 2016-02-04 Lg Electronics Inc. Video display apparatus and method of setting user viewing conditions

Also Published As

Publication number Publication date
WO2024046781A1 (en) 2024-03-07
CN119790713A (zh) 2025-04-08
EP4581908B1 (de) 2025-12-17

Similar Documents

Publication Publication Date Title
US10681479B2 (en) Methods, devices and systems for bluetooth audio transmission
US11126389B2 (en) Controlling visual indicators in an audio responsive electronic device, and capturing and providing audio using an API, by native and non-native computing devices and services
US11259390B2 (en) Rendering a dynamic light scene based on one or more light settings
CN105846865B (zh) 用于蓝牙音频传输的方法、设备和系统
US10306366B2 (en) Audio system, audio device, and audio signal playback method
US12185443B2 (en) Selecting lighting devices for rendering entertainment lighting based on relative distance information
US20250103277A1 (en) Transmitting messages to a display device based on detected audio output
US20200187147A1 (en) Conditionally providing location-based functions
EP4274387A1 (de) Auswahl von unterhaltungsbeleuchtungsvorrichtungen auf basis der dynamik von videoinhalt
CN118140482A (zh) 电子装置及其操作方法
EP4581908B1 (de) Wiedergabe von unterhaltungslichteffekten basierend auf präferenzen des nächsten benutzers
WO2021160552A1 (en) Associating another control action with a physical control if an entertainment mode is active
US12284740B2 (en) Requesting a lighting device to control other lighting devices to render light effects from a light script
US20250193988A1 (en) Controlling lighting devices as a group when a light scene or mode is activated in another spatial area
WO2020165331A1 (en) Determining light effects based on a light script and/or media content and light rendering properties of a display device
US20240306280A1 (en) Selecting a more suitable input modality in relation to a user command for light control
US20230269853A1 (en) Allocating control of a lighting device in an entertainment mode
WO2022058282A1 (en) Determining different light effects for screensaver content
US8943247B1 (en) Media sink device input identification
US20250338379A1 (en) Selecting lighting devices based on an indicated light effect and distances between available lighting devices
US20210399915A1 (en) Selecting a destination for a sensor signal in dependence on an active light setting

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

17P Request for examination filed

Effective date: 20250331

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
INTG Intention to grant announced

Effective date: 20250708

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

P01 Opt-out of the competence of the unified patent court (upc) registered

Free format text: CASE NUMBER: UPC_APP_0010188_4581908/2025

Effective date: 20251017

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: CH

Ref legal event code: F10

Free format text: ST27 STATUS EVENT CODE: U-0-0-F10-F00 (AS PROVIDED BY THE NATIONAL OFFICE)

Effective date: 20251217

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602023009903

Country of ref document: DE