WO2018224390A1 - Mapping a light effect to light sources using a mapping function - Google Patents


Info

Publication number: WO2018224390A1
Authority: WIPO (PCT)
Application number: PCT/EP2018/064361
Other languages: French (fr)
Inventors: Antonie Leonardus Johannes Kamp, Aloys Hubbers, Bartel Marinus van de Sluis
Original Assignee: Philips Lighting Holding B.V.
Application filed by Philips Lighting Holding B.V.
Publication of WO2018224390A1

Classifications

    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B45/00 Circuit arrangements for operating light-emitting diodes [LED]
    • H05B45/20 Controlling the colour of the light
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/105 Controlling the light source in response to determined parameters
    • H05B47/155 Coordinated control of two or more light sources
    • H05B47/175 Controlling the light source by remote control
    • H05B47/19 Controlling the light source by remote control via wireless transmission

Definitions

  • the invention relates to an electronic device for mapping a light effect to one or more light sources.
  • the invention further relates to a method of mapping a light effect to one or more light sources.
  • the invention also relates to a computer program product enabling a computer system to perform such a method.
  • if media-playback devices (e.g. displays and loudspeakers) become connected to and fully interoperable with lighting devices, immersive experiences can be created by generating light effects in accordance with the effects and events in the rendered media.
  • many typical set-ups exist for audio and video devices (e.g. home cinema 5.1 or 2.1 speakers, or "soundbars", typically centered around a TV screen and a typical user position); however, people's connected lighting infrastructure and setup is even more diverse and essentially unique to each user, which makes the creation of lighting atmospheres, scenes and scripts more difficult.
  • US 2010/0090617 discloses a method of composing a lighting atmosphere from an abstract description, wherein the lighting atmosphere is generated by several lighting devices, by automatically rendering the desired lighting atmosphere from the abstract description.
  • the abstract description describes the type of light with certain lighting parameters desired at certain semantic locations at certain semantic times.
  • the invention has the main advantage that it allows light scenes and lighting atmospheres to be created at a high level of abstraction, without requiring the definition of a lighting atmosphere or scene by setting the intensity, colors, etc. for single lighting units or devices.
  • the electronic device comprises at least one processor configured to obtain information identifying available light sources, said information comprising for each of said available light sources a position of said available light source relative to a reference position, receive an instruction to render a light effect, said instruction comprising one or more light effect parameters, a mapping function identifier and one or more spatial indications, said one or more spatial indications being specified relative to said reference position, map said light effect to one or more of said available light sources based on said one or more light effect parameters, said mapping function identifier, said one or more spatial indications and said positions of said available light sources, an output of said mapping identifying one or more selected ones of said available light sources and comprising one or more light states for each of said one or more selected light sources, and transmit one or more control commands for controlling said one or more selected light sources, said one or more control commands commanding said one or more selected light sources to assume said one or more light states.
  • the instruction to render the light effect, by using a mapping function, requires neither individual lights nor a specific location to be specified.
  • light scenes and atmospheres may be created more easily, e.g. by first determining the light effects and only then determining how to map the light effects to light sources.
  • although a specific location or a specific setup of light sources does not need to be specified, an author of a light scene or atmosphere may use multiple mapping functions to prevent this from making the light scene or atmosphere less attractive.
  • a certain light setup might not include light sources in the back of the room.
  • said one or more light effect parameters are independent from a mapping function identified by said mapping function identifier.
  • Said one or more light effect parameters may comprise at least one of: a static color value, a static intensity value, a static opacity value, a varying color value, a varying intensity value, a varying opacity value, a static size, a varying size and a trajectory, for example.
  • the light effect parameter relates to at least a color and/or intensity of light.
  • Said one or more spatial indications may indicate one or more positions and/or areas. A range of positions is an example of a spatial indication indicating an area; see the sketch below.
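As an illustration only, such an instruction might be represented by a data structure along the following lines. This is a minimal sketch: the names LightEffectInstruction and SpatialIndication, and all fields, are hypothetical and do not come from the application itself.

```java
// Minimal sketch of an instruction to render a light effect. All names are
// hypothetical illustrations of the claim elements, not an actual API.
import java.util.List;

final class SpatialIndication {
    final double x, y;     // position relative to the reference position
    final Double radius;   // non-null when the indication describes an area
    SpatialIndication(double x, double y, Double radius) {
        this.x = x; this.y = y; this.radius = radius;
    }
}

final class LightEffectInstruction {
    final List<double[]> effectParameters;        // e.g. static or varying RGB values
    final String mappingFunctionId;               // e.g. "AreaEffect", "LightSourceEffect"
    final List<SpatialIndication> spatialIndications;
    LightEffectInstruction(List<double[]> effectParameters, String mappingFunctionId,
                           List<SpatialIndication> spatialIndications) {
        this.effectParameters = effectParameters;
        this.mappingFunctionId = mappingFunctionId;
        this.spatialIndications = spatialIndications;
    }
}
```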
  • the mapping function determines which light sources are to be controlled to render light output based on the received instruction to render a light effect.
  • Said reference position may be a typical or current position of a user or a position of a media-rendering object, for example.
  • the current position of the mobile device of the user may be assumed to represent the current position of the user or a depth camera may be used to determine the current position of the user, for example.
  • the media-rendering object may be a media-rendering device (e.g. a Television) or a media-rendering projection surface (e.g. a wall or wall portion), for example.
  • the electronic device may be, for example, a mobile device or an (other) media-rendering device, e.g. a Television. In this case, the electronic device may obtain the information identifying the available light sources from a bridge which collects and stores this information, for example. Alternatively, the electronic device may be a bridge, for example.
  • Said at least one processor may be configured to map said light effect to said one or more of said available light sources by selecting one or more of said available light sources based on said mapping function identifier, said one or more spatial indications and said positions of said available light sources, and determining said one or more light states for each of said one or more selected light sources based on said one or more light effect parameters.
  • the one or more light states may simply comprise a constant or dynamic color and/or intensity.
  • the one or more light states may additionally be determined by the mapping function.
  • the color and/or intensity of a light source may be set to be weaker (e.g. less intense or whiter) if the light source is further from the spatial indication, for example.
  • the mapping function identifies one or more selected ones of said available light sources and one or more light states for said one or more selected light sources.
  • the selection of light sources is further based on the one or more light effect parameters.
  • the light effect may specify a small explosion or a large explosion which involves multiple light sources.
  • Said one or more spatial indications may indicate a position and said at least one processor may be configured to determine for each of said one or more selected light sources a contribution of said one or more light effect parameters to said one or more light states of said selected light source based on a distance between said position of said selected light source and said indicated position.
  • This mapping function allows the color and/or intensity of a light source to be set to be weaker (e.g. less intense or whiter) if the lights source is further from the spatial indication and to be set to be stronger if the light source is closer to the spatial indication.
  • Said one or more spatial indications may indicate at least a first position and a second position and said one or more light effect parameters may comprise at least a first light effect parameter and a second light effect parameter, said first light effect parameter being associated with said first position and said second light effect parameter being associated with said second position.
  • Said at least one processor may be configured to determine a first subset of one or more of said one or more selected light sources and a second subset of one or more of said one or more selected light sources, said one or more light sources of said first subset being closer to said first position than said one or more light sources of said second subset and said one or more light sources of said second subset being closer to said second position than said one or more light sources of said first subset, and determine said one or more light states for said one or more light sources of said first subset based on said first light effect parameter and said one or more light states for said one or more light sources of said second subset based on said second light effect parameter.
  • This mapping function allows all light effects to be reproduced even if no light sources are close to the spatial indication specified in an instruction to render a light effect.
  • One or more further subsets may be determined in the same manner as defined above in relation to the first and second subsets.
  • one or more further subsets may be determined in a different manner. For example, if more light effect parameters have been defined than there are light sources, then light effects associated with the positions farthest from the light sources will not be rendered. Each light effect is rendered by the light source closest to the position associated with the light effect.
  • said at least one processor may be configured to determine said one or more light states for said single available light source based on said first light effect parameter if said single light source is closer to said first position than to said second position, and determine said one or more light states for said single available light source based on said second light effect parameter if said single light source is closer to said second position than to said first position.
  • Said at least one processor may be configured to determine a sequence of said one or more selected light sources based on said one or more spatial indications and said positions of said one or more selected light sources, and determine said one or more light states for said one or more selected light sources based on said one or more light effect parameters, wherein said one or more control commands command said one or more selected light sources to assume said determined one or more light states in the order specified by said sequence, a next light source in said sequence assuming said one or more determined light states determined for said next light source a specified period after a previous light source in said sequence assumes said one or more determined light states determined for said previous light source.
  • the spatial indication for this mapping function preferably specifies at least one point in space (coordinate system) and a direction (e.g. left to right). This mapping function allows light sources to be activated in sequence, e.g. in a certain direction.
  • Said one or more spatial indications may indicate an area and said at least one processor is configured to select only available lights with a position in said indicated area.
  • This mapping function may be used to limit light effects to a certain area of a room, e.g. the back side, the left side or the back-left side.
  • Said at least one processor may be configured to map said light effect to one or more of said available light sources based on said one or more light effect parameters, said mapping function identifier, said one or more spatial indications, said positions of said available light sources and information identifying rendering capabilities of said available light sources.
  • a better mapping is achieved. For example, when it is important that each channel of a certain color light effect is rendered by at least one light source, the at least one processor should ensure that each channel of the light effect is mapped to at least one color light source.
  • the method comprises obtaining information identifying available light sources, said information comprising for each of said available light sources a position of said available light source relative to a reference position, receiving an instruction to render a light effect, said instruction comprising one or more light effect parameters, a mapping function identifier and one or more spatial indications, said one or more spatial indications being specified relative to said reference position, mapping said light effect to one or more of said available light sources based on said one or more light effect parameters, said mapping function identifier, said one or more spatial indications and said positions of said available light sources, an output of said mapping identifying one or more selected ones of said available light sources and comprising one or more light states for each of said one or more selected light sources, and transmitting one or more control commands for controlling said one or more selected light sources, said one or more control commands commanding said one or more selected light sources to assume said one or more light states.
  • the method may be implemented in hardware and/or software.
  • Mapping said light effect to said one or more of said available light sources may comprise selecting one or more of said available light sources based on said mapping function identifier, said one or more spatial indications and said positions of said available light sources and determining said one or more light states for each of said one or more selected light sources based on said one or more light effect parameters.
  • a computer program for carrying out the methods described herein, as well as a non-transitory computer readable storage-medium storing the computer program are provided.
  • a computer program may, for example, be downloaded by or uploaded to an existing device or be stored upon manufacturing of these systems.
  • a non-transitory computer-readable storage medium stores at least one software code portion, the software code portion, when executed or processed by a computer, being configured to perform executable operations comprising: obtaining information identifying available light sources, said information comprising for each of said available light sources a position of said available light source relative to a reference position, receiving an instruction to render a light effect, said instruction comprising one or more light effect parameters, a mapping function identifier and one or more spatial indications, said one or more spatial indications being specified relative to said reference position, mapping said light effect to one or more of said available light sources based on said one or more light effect parameters, said mapping function identifier, said one or more spatial indications and said positions of said available light sources, an output of said mapping identifying one or more selected ones of said available light sources and comprising one or more light states for each of said one or more selected light sources, and transmitting one or more control commands for controlling said one or more selected light sources, said one or more control commands commanding said one or more selected light sources to assume said one or more light states.
  • aspects of the present invention may be embodied as a device, a method or a computer program product.
  • aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, microcode, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit", "module" or "system".
  • Functions described in this disclosure may be implemented as an algorithm executed by a processor/microprocessor of a computer.
  • aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied, e.g., stored, thereon.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a computer readable storage medium may include, but are not limited to, the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store, a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • These computer program instructions may be provided to a processor, in particular a microprocessor or a central processing unit (CPU), of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer, other programmable data processing apparatus, or other devices create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • Fig. 1 shows a system which comprises a first embodiment of the electronic device of the invention;
  • Fig. 2 is a block diagram of the first embodiment of the electronic device of Fig. 1;
  • Fig. 3 shows a system which comprises a second embodiment of the electronic device of the invention;
  • Fig. 4 is a block diagram of the second embodiment of the electronic device of Fig. 3;
  • Fig. 5 depicts an example of a coordinate system used in an embodiment of the invention
  • Fig. 6 depicts an example of a user interface for indicating positions of light sources relative to a user position
  • Figs. 7 and 8 illustrate a first mapping function which may be implemented in an embodiment of the invention
  • Fig. 9 illustrates a second mapping function which may be implemented in an embodiment of the invention
  • Fig. 10 illustrates a third mapping function which may be implemented in an embodiment of the invention.
  • Figs. 11 to 13 illustrate a fourth mapping function which may be implemented in an embodiment of the invention
  • Fig. 14 is a flow diagram of an embodiment of the method of the invention.
  • Fig. 15 is a block diagram of an exemplary data processing system for performing the method of the invention.
  • Fig. 1 shows a first embodiment of a system, e.g. a Philips Hue system, which comprises a first embodiment of the electronic device of the invention: a bridge 1.
  • the bridge 1 is located in a room 27.
  • a television 21, three light sources (e.g. lamps) 11, 12 and 13 and a person 23 holding a mobile device 25 are also located in the room 27.
  • the bridge 1 controls light sources 11 to 13, e.g. via ZigBee or a protocol based on ZigBee.
  • the bridge 1 can be controlled by other devices, e.g. mobile device 25, via an Application Programming Interface (API).
  • the mobile device 25 may be configured to communicate with the bridge 1 directly or via a wireless LAN access point (not shown), for example.
  • the mobile device 25 may be configured to communicate with the bridge 1 via Wi-Fi (IEEE 802.11), Bluetooth and/or ZigBee, for example.
  • the mobile device 25 runs an application which transmits instructions to a server running on bridge 1.
  • the application has been developed using the above-mentioned API and uses this API to communicate with the bridge 1.
  • the bridge 1 may be connected to the Internet, e.g. via a wireless LAN access point.
  • the bridge 1 comprises a transceiver 3 and a processor 5.
  • the processor 5 is configured to obtain information identifying available light sources. The information comprises for each of the available light sources a position of the available light source relative to a reference position.
  • the processor 5 is further configured to receive an instruction to render a light effect.
  • the instruction comprises one or more light effect parameters, a mapping function identifier and one or more spatial indications. The one or more spatial indications are specified relative to the reference position.
  • the processor 5 is also configured to map the light effect to one or more of the available light sources based on the one or more light effect parameters, the mapping function identifier, the one or more spatial indications and the positions of the available light sources.
  • An output of the mapping identifies one or more selected ones of the available light sources and comprises one or more light states for each of the one or more selected light sources.
  • the processor 5 is also configured to use the transceiver 3 to transmit one or more control commands for controlling the one or more selected light sources.
  • the one or more control commands command the one or more selected light sources to assume the one or more light states.
  • the processor 5 is configured to map the light effect to the one or more of the available light sources by selecting one or more of the available light sources based on the mapping function identifier, the one or more spatial indications and the positions of the available light sources and determining the one or more light states for each of the one or more selected light sources based on the one or more light effect parameters.
  • the bridge 1 comprises one processor 5.
  • in an alternative embodiment, the bridge 1 comprises multiple processors.
  • the processor 5 of the bridge 1 may be a general-purpose processor, e.g. from ARM or Qualcomm, or an application-specific processor.
  • the processor 5 of the bridge 1 may run a Linux operating system for example.
  • a receiver and a transmitter have been combined into a transceiver 3.
  • one or more separate receiver components and one or more separate transmitter components are used.
  • multiple transceivers are used instead of a single transceiver.
  • the transceiver 3 may use one or more wireless communication technologies to transmit and receive data, e.g. LTE, Wi-Fi, ZigBee and/or Bluetooth.
  • the bridge 1 further comprises storage means 7, e.g. for storing information identifying lights 11 to 13.
  • the storage means 7 may comprise one or more memory units.
  • the storage means 7 may comprise solid state memory, for example.
  • Fig. 3 shows a second embodiment of a system, e.g. a Philips Hue system, which comprises a second embodiment of the electronic device of the invention: a mobile device 31.
  • the mobile device 31 communicates with the lights 11 to 13 via a data forwarding device 29, e.g. a wireless LAN access point.
  • An embodiment of the mobile device 31 is shown in more detail in Fig. 4.
  • the mobile device 31 comprises a transceiver 33 and a processor 35.
  • the mobile device 31 further comprises a touchscreen 39 which may be used to start or stop the afore-mentioned light control application, for example.
  • the data forwarding device 29 may comprise a wireless LAN access point and/or a network (e.g. Ethernet) switch, for example.
  • the processor 35 is configured to obtain information identifying available light sources.
  • the information comprises for each of the available light sources a position of the available light source relative to a reference position.
  • the information is collected directly from light sources 11, 12 and 13.
  • a bridge is present and the information is obtained from the bridge by the mobile device 31.
  • the processor 35 is further configured to receive an instruction to render a light effect.
  • the instruction comprises one or more light effect parameters, a mapping function identifier and one or more spatial indications.
  • the one or more spatial indications are specified relative to the reference position.
  • the processor 35 is also configured to map the light effect to one or more of the available light sources based on the one or more light effect parameters, the mapping function identifier, the one or more spatial indications and the positions of the available light sources.
  • An output of the mapping identifies one or more selected ones of the available light sources and comprises one or more light states for each of the one or more selected light sources.
  • the processor 35 is also configured to use the transceiver 33 to transmit one or more control commands for controlling the one or more selected light sources.
  • the one or more control commands command the one or more selected light sources to assume the one or more light states.
  • the mobile device 31 comprises one processor 35. In an alternative embodiment, the mobile device 31 comprises multiple processors.
  • the processor 35 of the mobile device 31 may be a general-purpose processor, e.g. from ARM or Qualcomm, or an application-specific processor.
  • the processor 35 of the mobile device 31 may run an iOS, Windows or Android operating system for example.
  • the invention may be implemented using a computer program running on one or more processors.
  • a receiver and a transmitter have been combined into a transceiver 33.
  • one or more separate receiver components and one or more separate transmitter components are used.
  • multiple transceivers are used instead of a single transceiver.
  • the transceiver 33 may use one or more wireless communication technologies to transmit and receive data, e.g. LTE, Wi-Fi, ZigBee and/or Bluetooth.
  • the mobile device 31 further comprises storage means 37, e.g. for storing applications (also referred to as "apps") and application data.
  • the storage means 37 may comprise one or more memory units.
  • the storage means 37 may comprise solid state memory, for example.
  • the touchscreen 39 may comprise an LCD or OLED display panel, for example.
  • Spatial indications and positions of light sources are typically specified using a coordinate system.
  • the API may use a single coordinate system or multiple coordinate systems. In the latter case, the developer may be able to specify the coordinate system that he wants to use.
  • An example of such a coordinate system is shown in Fig. 5.
  • the reference position is a typical position of a user, i.e. position 43 (0,0.5).
  • Position 43 may represent the position of a couch, for example.
  • the coordinate system of Fig. 5 defines another fixed position: position 41 (0, -0.5) of a Television (or alternatively, of a projection surface, for example).
  • the front-left corner has coordinate (-1,1)
  • the front-right corner has coordinate (1,1)
  • the back-left corner has coordinate (-1,-1)
  • the back-right corner has coordinate (1,-1)
  • the center has coordinate (0,0).
  • x < 0 is left
  • x > 0 is right
  • y < 0 is back
  • y > 0 is front.
  • in an alternative embodiment, the position 41 of the Television, i.e. of the media-rendering device, is the reference position.
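As an illustration, the coordinate conventions of Fig. 5 might be captured in code as follows. This is a sketch only; the class RoomCoordinates and its helper are hypothetical, not part of the application.

```java
// Sketch of the coordinate system of Fig. 5: x runs from -1 (left) to 1 (right),
// y from -1 (back) to 1 (front), with the user at (0, 0.5) and the Television
// at (0, -0.5). The class and helper are illustrative only.
final class RoomCoordinates {
    static final double[] USER_POSITION = {0.0, 0.5};   // position 43
    static final double[] TV_POSITION   = {0.0, -0.5};  // position 41

    // classify a position into a quadrant name, e.g. "front-left"
    static String describe(double x, double y) {
        String depth = (y > 0) ? "front" : (y < 0) ? "back" : "center";
        String side  = (x < 0) ? "left" : (x > 0) ? "right" : "center";
        return depth + "-" + side;
    }

    public static void main(String[] args) {
        System.out.println(describe(-1, 1));   // front-left
        System.out.println(describe(1, -1));   // back-right
    }
}
```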
  • the positions of the light sources are specified manually by a user, e.g. via the user interface displayed on display 39 of the mobile device 31, as shown in Fig. 6.
  • the same user interface may be displayed on a display of the mobile device 25 of Fig. 1.
  • the position of the light sources is specified relative to the position of the user, which is represented by an icon 53.
  • Icon 53 may represent the current position of the user or a typical position of the user.
  • the specified positions of the light sources may be converted to positions relative to a typical user position if such a typical user position is known.
  • the user may be able to move around icon 53 on the screen displayed on display 39.
  • An icon 51 represents a Television.
  • This icon 51 may have a fixed position relative to icon 53, e.g. in accordance with the coordinate system of Fig. 5, or the user may be able to move around icon 51 on the screen displayed on display 39. In the latter case, the position of the Television may also be automatically detected.
  • Light sources 11, 12 and 13 of Fig. 1 are represented by icons 61, 62 and 63, respectively. The user is able to move around icons 61 to 63 on the screen displayed on display 39.
  • a user is able to specify 2D positions on a floorplan.
  • the user may indicate an approximate height in the room for each light source (or simply indicate low, medium or high in the room).
  • Certain more advanced effects and embodiments would require knowing the 3D position of each light source in the area, or actually the 3D position in space where the light source is creating an effect (e.g. location and dimensions of each light source's effect surface). This could be supported if users can easily position lamp representations in a 3D representation of the room.
  • the user may also indicate an approximate height and orientation of a luminaire, or indicate the current shape and length of a flexible, cut-to-measure light strip.
  • the light sources 11 to 13 represented by icons 61 to 63 in Fig. 6 are controlled as a single entity.
  • the positions of individual pixels or groups of pixels preferably need to be known as well. These positions may be detected automatically.
  • embedded sensors may be able to detect the orientation and height of the lighting device.
  • future embedded sensors may even be able to detect the shape of a pixelated light strip.
  • an LED strip may be able to detect where it is bent, stretched or twisted.
  • a user specifies the positions of the light sources 11 to 13 of Fig. 1.
  • automatic lighting device position detection may be performed using known methods of RF-based position detection (e.g. based on RSSI). Determining the position of the user and/or of the media-rendering device may be done in a similar way, by using RF-based positioning to detect the approximate location of the user's mobile device and/or of the media-rendering device.
  • the one or more spatial indications may indicate one or more positions and/or areas.
  • the one or more light effect parameters may comprise at least one of: a static color value, a static intensity value, a static opacity value, a varying color value, a varying intensity value, a varying opacity value, a static size, a varying size and a trajectory, for example.
  • in the following examples, mapping functions are defined. In these examples, it is assumed that all light sources have color-rendering capabilities.
  • a room may also have connected lamps installed which can only render white light (e.g. the Philips Hue Ambience lamp).
  • the capabilities of the light sources may also be taken into account when selecting light sources for rendering the light effects.
  • a first mapping function is called AreaEffect. This mapping function is illustrated in Figs. 7 and 8.
  • the one or more spatial indications indicate an area, e.g. one or more of areas 71 to 74.
  • the processor 5 or 35 is configured to select only available lights with a position in the indicated area.
  • AreaEffect will play on all light sources in a given area (or multiple areas).
  • An important property of an AreaEffect is that if a specific setup has no light in a certain area, the effect won't be visible. This may be desired when it is better not to show an effect at all than to show it in the wrong location.
  • each light device (comprising a single light source) covers one area.
  • for larger lighting devices (e.g. a long lightstrip, or a large ceiling lighting array), different segments of light sources of the large lighting device could be assigned to different areas.
  • An example usage of an AreaEffect could be indicating in a game from which direction a character is being hit.
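As an illustration of the AreaEffect selection rule described above, the following is a minimal sketch; AreaEffectMapper and the axis-aligned rectangular area representation are assumptions, since the application does not prescribe an area shape.

```java
// Sketch of the AreaEffect selection rule: only light sources whose position
// lies inside the indicated area are selected. If the area contains no light
// sources, nothing is selected and the effect is simply not visible.
import java.util.ArrayList;
import java.util.List;

final class AreaEffectMapper {
    // the area is assumed to be an axis-aligned rectangle in Fig. 5 coordinates
    static List<double[]> selectInArea(List<double[]> lightPositions,
                                       double xMin, double xMax,
                                       double yMin, double yMax) {
        List<double[]> selected = new ArrayList<>();
        for (double[] p : lightPositions) {
            if (p[0] >= xMin && p[0] <= xMax && p[1] >= yMin && p[1] <= yMax) {
                selected.add(p);   // this light source falls inside the area
            }
        }
        return selected;           // may be empty: the effect is then not rendered
    }
}
```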
  • a second mapping function is called MultiChannelEffect.
  • This mapping function is illustrated in Fig. 9.
  • the one or more spatial indications indicate at least a first position, e.g. position 81, and a second position, e.g. position 83
  • the one or more light effect parameters comprise at least a first light effect parameter and a second light effect parameter.
  • the first light effect parameter is associated with the first position, e.g. position 81
  • the second light effect parameter is associated with the second position, e.g. position 83.
  • the processor 5 or 35 is configured to determine a first subset of one or more of the one or more selected light sources, e.g. light sources 11 and 13, and a second subset of one or more of the one or more selected light sources, e.g. light source 12.
  • the one or more light sources of the first subset, e.g. light sources 11 and 13 are closer to the first position, e.g. position 81, than the one or more light sources of the second subset, e.g. light source 12, and the one or more light sources of the second subset, e.g. light source 12, are closer to the second position, e.g. position 83, than the one or more light sources of the first subset, e.g. light source 11 and 13.
  • the processor 5 or 35 is further configured to determine the one or more light states for the one or more light sources of the first subset, e.g. light sources 11 and 13, based on the first light effect parameter and the one or more light states for the one or more light sources of the second subset, e.g. light source 12, based on the second light effect parameter.
  • MultiChannelEffect will try to distribute the light channels (comparable to e.g. audio channels) evenly over the available light sources, prioritizing light sources closest to the channel location.
  • the mapping is significantly different from that of an AreaEffect: even though a channel has a certain location, the MultiChannelEffect tries to make as many channels visible as possible, even if a channel does not have light sources in close proximity to the channel location.
  • the MultiChannelEffect maps channels to light sources, making sure all channels are mapped to light sources closest to their position.
  • the MultiChannelEffect may be used if the creator finds rendering of all channels more important than rendering the channel at a specific location.
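A minimal sketch of the MultiChannelEffect partition described above follows; MultiChannelMapper is a hypothetical name, and the simple nearest-channel assignment is an assumption (a full implementation would additionally rebalance so that every channel keeps at least one light source where possible, per the description above).

```java
// Sketch of the MultiChannelEffect partition: each selected light source is
// assigned to the channel whose indicated position is nearest, splitting the
// lights into per-channel subsets.
import java.util.ArrayList;
import java.util.List;

final class MultiChannelMapper {
    static List<List<double[]>> partition(List<double[]> lights,
                                          List<double[]> channelPositions) {
        List<List<double[]>> subsets = new ArrayList<>();
        for (int c = 0; c < channelPositions.size(); c++) subsets.add(new ArrayList<>());
        for (double[] light : lights) {
            int best = 0;
            double bestDist = Double.MAX_VALUE;
            for (int c = 0; c < channelPositions.size(); c++) {
                double dx = light[0] - channelPositions.get(c)[0];
                double dy = light[1] - channelPositions.get(c)[1];
                double d = Math.hypot(dx, dy);
                if (d < bestDist) { bestDist = d; best = c; }
            }
            subsets.get(best).add(light);  // light joins its nearest channel's subset
        }
        return subsets;
    }
}
```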
  • a third mapping function is called LightSourceEffect. This mapping function is illustrated in Fig. 10.
  • the one or more spatial indications indicate a position, e.g. position 87
  • the processor 5 or 35 is configured to determine for each of the one or more selected light sources, e.g. light sources 11 to 13, a contribution of the one or more light effect parameters to the one or more light states of the selected light source based on a distance between the position of the selected light source, e.g. of light sources 11 to 13, and the indicated position, e.g. position 87.
  • LightSourceEffect maps a virtual light source to actual lights such that lights close to the light source are more strongly influenced than lights further away from the light source.
  • the mapping is significantly different from that of the previous two mapping functions: in both AreaEffect and MultiChannelEffect a light source either belongs to an area or channel or not, whereas the relation is more gradual for the LightSourceEffect: the closer to the virtual source, the more strongly a light is influenced by the source.
  • the LightSourceEffect may be used if the creator wants all lights to contribute depending on their distance to the location of the effect.
  • the color, location and/or radius may be dynamic instead of constant.
  • An example of a dynamic effect that may be created with the LightSourceEffect mapping function is an explosion which increases and decreases radius.
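A minimal sketch of the LightSourceEffect weighting described above follows; the linear falloff within a radius is an assumption, as the application does not prescribe a specific falloff curve.

```java
// Sketch of the LightSourceEffect weighting: the closer a light source is to
// the virtual source, the more strongly it is influenced.
final class LightSourceEffectMapper {
    // returns a contribution weight in [0,1] for one light source
    static double contribution(double[] lightPos, double[] sourcePos, double radius) {
        double distance = Math.hypot(lightPos[0] - sourcePos[0],
                                     lightPos[1] - sourcePos[1]);
        if (distance >= radius) return 0.0;   // outside the effect radius
        return 1.0 - distance / radius;       // stronger when closer
    }

    public static void main(String[] args) {
        // a light at (0.2, 0.2), a virtual source at the origin, radius 1
        System.out.println(contribution(new double[]{0.2, 0.2},
                                        new double[]{0.0, 0.0}, 1.0));
    }
}
```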
  • a fourth mapping function is called LightIteratorEffect. This mapping function is illustrated in Figs. 11 to 13.
  • the processor 5 or 35 is configured to determine a sequence of the one or more selected light sources, e.g. light sources 11 to 13, based on the one or more spatial indications and the positions of the one or more selected light sources and determine the one or more light states for the one or more selected light sources based on the one or more light effect parameters.
  • the one or more control commands command the one or more selected light sources, e.g. light sources 11 to 13, to assume the determined one or more light states in the order specified by the sequence.
  • a next light source in the sequence assumes the one or more determined light states determined for the next light source a specified period, e.g. a second, after a previous light source in the sequence assumes the one or more determined light states determined for the previous light source.
  • LightIteratorEffect will iterate over individual lights with a certain offset, order and mode.
  • the order indicates in which order the light sources will be controlled (e.g. 'from front to back' or 'counter clockwise').
  • the mode indicates what happens after every light source is iterated over: stop ('single'), repeat ('cycle') or revert ('bounce').
  • the offset indicates how much later the light effect starts on the next light source than on the previous light source.
  • the LightIteratorEffect is clearly different from the previous effect types, because it iterates over individual light sources. This means that the total duration or the speed of an iteration over all light sources depends on the number of light sources involved in the iteration, e.g. present in the setup. However, all effects share one important property: a single effect definition works on every possible light setup. They just differ in the way they do the mapping.
  • an effect creator may want to create a sequence of light sources lighting up red one by one for one second, in a left to right order.
  • he may create a LightIteratorEffect and specify a color animation according to which the color needs to be red for one second, a left to right ordering, a repeat mode (CYCLE) and a step duration of one second.
  • This LightIteratorEffect example might be specified with code along the lines of the sketch below.
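The original code listing is not preserved in this text beyond its first words, so the following is a hypothetical reconstruction in Java; the enums Order, Mode and TweenType, the record Tween, and the LightIteratorEffect constructor and setter are illustrative stand-ins, not the application's actual API.

```java
// Hypothetical reconstruction of the truncated example: each light turns red
// for one second, in a left to right order, repeating (CYCLE), with a
// one-second step between consecutive lights. The RGB interpretation of the
// color animation is an assumption. (Records require Java 16+.)
enum Order { LEFT_RIGHT, FRONT_BACK, CLOCKWISE, COUNTER_CLOCKWISE }
enum Mode { SINGLE, CYCLE, BOUNCE }
enum TweenType { LINEAR, QUADRATIC }

record Tween(double start, double end, int timeMs, TweenType type) {}

final class LightIteratorEffect {
    LightIteratorEffect(Order order, Mode mode, int offsetMs) { /* stub */ }
    void setColorAnimation(Tween red, Tween green, Tween blue) { /* stub */ }
}

final class LightIteratorEffectExample {
    public static void main(String[] args) {
        LightIteratorEffect effect =
                new LightIteratorEffect(Order.LEFT_RIGHT, Mode.CYCLE, 1000 /* ms */);
        effect.setColorAnimation(new Tween(1, 1, 1000, TweenType.LINEAR),  // red at 1 for 1 s
                                 new Tween(0, 0, 1000, TweenType.LINEAR),  // green off
                                 new Tween(0, 0, 1000, TweenType.LINEAR)); // blue off
    }
}
```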
  • The Tween function that is used in the above example uses the following syntax: Tween(<startvalue>, <endvalue>, <timeMs>, <tweentype>).
  • In Tween(1, 1, 1000, LINEAR), it is a tween from 1 to 1 in 1000 ms with a linear transition, i.e. the color is stable at value 1 for 1 second.
  • the LightIteratorEffect iterates over the light sources, each light source being on for 1 second in this example.
  • Other Tween functions may be used, for example, instead of the Tween function described above.
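As an illustration, a LINEAR Tween might be evaluated as follows; the interpolation rule is an assumption consistent with the Tween(1, 1, 1000, LINEAR) example above.

```java
// Minimal sketch of evaluating a Tween(start, end, timeMs, LINEAR): the value
// interpolates linearly from start to end over timeMs milliseconds.
final class TweenEvaluator {
    static double linear(double start, double end, int timeMs, int tMs) {
        if (tMs >= timeMs) return end;
        return start + (end - start) * ((double) tMs / timeMs);
    }

    public static void main(String[] args) {
        // Tween(1, 1, 1000, LINEAR): the value is stable at 1 for the full second
        System.out.println(linear(1, 1, 1000, 500));   // 1.0
        // Tween(0, 1, 2000, LINEAR): halfway through, the value is 0.5
        System.out.println(linear(0, 1, 2000, 1000));  // 0.5
    }
}
```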
  • Segments of pixelated lighting devices may be considered as individual light sources. For example, if a user wants to create a media-based light effect while only having a Pixelated Philips Hue Lightstrip in the room, the LightIteratorEffect may play the effect on 3 segments of the Pixelated Lightstrip.
  • An example usage of a LightIteratorEffect is to make a so-called "chaser".
  • An effect creator may limit a mapping function to a certain area.
  • the LightIteratorEffect only involves the light sources in the front area 91, i.e. light sources 11, 12 and 13, and does not involve the light sources in the back area 93, i.e. light sources 14 and 15.
  • the four mapping functions have been described operating in isolation.
  • effects preferably have a layer and a specified opacity. Based on the layer of each effect and their opacity, the effects may be blended (using standard alpha blending) to result in the final color for each light.
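A minimal sketch of the layer/opacity blending described above, assuming standard "over" alpha compositing applied from the bottom layer upwards; EffectBlender is a hypothetical name.

```java
// Sketch of blending layered effects: each effect color is composited over the
// running result, weighted by its opacity, to yield the final color per light.
final class EffectBlender {
    // blend one effect color (with opacity alpha in [0,1]) over the current color
    static double[] over(double[] current, double[] effectRgb, double alpha) {
        return new double[] {
            alpha * effectRgb[0] + (1 - alpha) * current[0],
            alpha * effectRgb[1] + (1 - alpha) * current[1],
            alpha * effectRgb[2] + (1 - alpha) * current[2],
        };
    }

    public static void main(String[] args) {
        double[] base = {1, 0, 0};                                  // lower layer: red
        double[] result = over(base, new double[]{0, 0, 1}, 0.5);   // blue at 50% opacity
        System.out.printf("%.2f %.2f %.2f%n", result[0], result[1], result[2]);
    }
}
```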
  • in the description of the mapping functions, mainly 'constant' animations were used for ease of explanation.
  • any property of an effect might be animated by any kind of tween or curve.
  • the opacity of an AreaEffect may be set to change from 100% to 50% linearly over 500ms
  • the color of a channel may be set to change from red to blue in 2 seconds
  • the radius of a LightSourceEffect may be set to change from 0.5 to 1 quadratically over 5 seconds and then linearly back to 0.5 in 1 second
  • the rendering capabilities of the light source may also be taken into account. For example, light sources which can render colors may be selected for a colored light effect, and for a rich pattern or color variation such as a rainbow effect, a pixelated, linear lighting device may be most suited. Besides colors and patterns, the rendering capabilities in terms of minimum or maximum light output may also be taken into account. For example, if a low-brightness "night scene" is required, it is useful to know which light sources have the deepest dim level capabilities, whereas in the case of a "lightning strike", light sources may be selected which can generate a high light output, in addition to other requirements such as preferably being located high up in the room (e.g. at the ceiling) while having a linear (lightning-like) shape.
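As an illustration of taking rendering capabilities into account, the following sketch filters and ranks light sources; LightCapabilities and its fields are hypothetical, since the application does not define a concrete capability model.

```java
// Sketch of capability-aware selection: color effects prefer color-capable
// lights, and a deep-dimming "night scene" prefers lights with the lowest
// achievable dim level.
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

final class LightCapabilities {
    final String id;
    final boolean colorCapable;
    final double minDimLevel;   // fraction of max output; lower = deeper dimming
    LightCapabilities(String id, boolean colorCapable, double minDimLevel) {
        this.id = id; this.colorCapable = colorCapable; this.minDimLevel = minDimLevel;
    }
}

final class CapabilityFilter {
    static List<LightCapabilities> forColorEffect(List<LightCapabilities> lights) {
        return lights.stream().filter(l -> l.colorCapable).collect(Collectors.toList());
    }

    static List<LightCapabilities> forNightScene(List<LightCapabilities> lights) {
        return lights.stream()
                     .sorted(Comparator.comparingDouble(l -> l.minDimLevel))
                     .collect(Collectors.toList());   // deepest dimmers first
    }
}
```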
  • the mapping functions described above are example functions which can be used to generate a variety of light effects.
  • a light effect is generally mapped to multiple light sources using the mapping function. However, the light effect itself may also involve multiple light sources.
  • the light effect may specify a spatial distribution, for example.
  • a simple spatial distribution may include an effect around a (relative) location with a specific effect size, color and brightness. More complex spatial distributions may include brightness and color gradients and 2D or 3D light effect shape.
  • a high-quality rendering of a more sophisticated spatial light distribution normally requires a more precise input of the relative positions of the individual light sources.
  • a 3D model of a room may be created in which exact positions of individual light sources are specified in relation to a current or typical user position, for example.
  • Spatial distributions may also be used for dynamic light effects. Besides dynamically varying the brightness and color, a light pattern and shape may be defined which follows a specific 2D or 3D trajectory with a certain speed over time. For instance, an explosion light effect may suddenly appear and increase in size, or a fireball may follow a trajectory where it moves from one side of the room to the other.
  • Light effects may be related to media events.
  • the connected lighting system may receive input from media devices (e.g. Television 21 of Fig. 1).
  • the media device generates a media effect which has a position relative to the user, whereas the light sources generate a media-related light effect which has a position relative to the user, and these positions are generally in the same area.
  • Media devices can range from media devices processing the media for playback (e.g. a game console or audio system) to media-rendering devices (e.g. a smart TV or connected loudspeaker), or devices which both process and render the media (e.g. a smart TV, tablet or smartphone).
  • the media events can range from a simple playback start event of a specific content item, to semantic events which are derived from the media or derived from a script that has been defined for the specific content item such as a movie or music track.
  • the script can be a linear sequence of media events whereas in the case of games or interactive content, the media events are non-linear or less predictable and largely dependent on the inputs of the user(s).
  • Media can cover a variety of content types, ranging from music and movies to games and interactive augmented reality content.
  • a light source may be part of a pixelated lighting device. Based on the determined state (e.g. color and/or intensity value) per light source, control signals are generated to control the light sources accordingly.
  • an aggregated control signal may be determined for the pixelated lighting device. For instance, a pixelated light strip could make use of a serial data signal which feeds control values to all individual light nodes.
  • a step 201 comprises obtaining information identifying available light sources.
  • the information comprises for each of the available light sources a position of the available light source relative to a reference position.
  • a step 203 comprises receiving an instruction to render a light effect.
  • the instruction comprises one or more light effect parameters, a mapping function identifier and one or more spatial indications.
  • the one or more spatial indications are specified relative to the reference position.
  • a step 205 comprises mapping the light effect to one or more of the available light sources based on the one or more light effect parameters, the mapping function identifier, the one or more spatial indications and the positions of the available light sources.
  • An output of the mapping identifies one or more selected ones of the available light sources and comprises one or more light states for each of the one or more selected light sources.
  • a step 207 comprises transmitting one or more control commands for controlling the one or more selected light sources. The one or more control commands command the one or more selected light sources to assume the one or more light states.
  • step 205 comprises sub steps 211 and 213.
  • Step 211 comprises selecting one or more of the available light sources based on the mapping function identifier, the one or more spatial indications and the positions of the available light sources.
  • Step 213 comprises determining the one or more light states for each of the one or more selected light sources based on the one or more light effect parameters.
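The steps of Fig. 14 might be combined into a single pipeline as in the following sketch; all type names are hypothetical stand-ins, and the decomposition of step 205 into steps 211 and 213 follows the description above.

```java
// Sketch of the method of Fig. 14 (steps 201-213) as one pipeline: obtain the
// available lights, receive the instruction, select lights via the mapping
// function, determine their light states, and transmit control commands.
import java.util.List;

interface LightMappingPipeline {
    List<Light> obtainAvailableLights();                              // step 201
    Instruction receiveInstruction();                                 // step 203
    List<Light> selectLights(Instruction i, List<Light> available);   // step 211
    List<LightState> determineStates(Instruction i, List<Light> sel); // step 213
    void transmitCommands(List<Light> sel, List<LightState> states);  // step 207

    default void run() {                                              // step 205 = 211 + 213
        List<Light> available = obtainAvailableLights();
        Instruction instruction = receiveInstruction();
        List<Light> selected = selectLights(instruction, available);
        List<LightState> states = determineStates(instruction, selected);
        transmitCommands(selected, states);
    }
}

class Light {}
class Instruction {}
class LightState {}
```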
  • Fig. 15 depicts a block diagram illustrating an exemplary data processing system that may perform the method as described with reference to Fig. 14.
  • the data processing system 300 may include at least one processor 302 coupled to memory elements 304 through a system bus 306. As such, the data processing system may store program code within the memory elements 304. Further, the processor 302 may execute the program code accessed from the memory elements 304 via the system bus 306. In one aspect, the data processing system may be implemented as a computer that is suitable for storing and/or executing program code. It should be appreciated, however, that the data processing system 300 may be implemented in the form of any system including a processor and a memory that is capable of performing the functions described within this specification.
  • the memory elements 304 may include one or more physical memory devices such as, for example, local memory 308 and one or more bulk storage devices 310.
  • the local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code.
  • a bulk storage device may be implemented as a hard drive or other persistent data storage device.
  • the processing system 300 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the number of times program code must be retrieved from the bulk storage device 310 during execution.
  • Input/output (I/O) devices, depicted as an input device 312 and an output device 314, may optionally be coupled to the data processing system.
  • Examples of input devices may include, but are not limited to, a keyboard, a pointing device such as a mouse, or the like.
  • Examples of output devices may include, but are not limited to, a monitor or a display, speakers, or the like. Input and/or output devices may be coupled to the data processing system either directly or through intervening I/O controllers.
  • the input and the output devices may be implemented as a combined input/output device (illustrated in Fig. 15 with a dashed line surrounding the input device 312 and the output device 314).
  • an example of such a combined device is a touch sensitive display, also sometimes referred to as a "touch screen display" or simply a "touch screen".
  • input to the device may be provided by a movement of a physical object, such as e.g. a stylus or a finger of a user, on or near the touch screen display.
  • a network adapter 316 may also be coupled to the data processing system to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks.
  • the network adapter may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to the data processing system 300, and a data transmitter for transmitting data from the data processing system 300 to said systems, devices and/or networks.
  • Modems, cable modems, and Ethernet cards are examples of different types of network adapter that may be used with the data processing system 300.
  • the memory elements 304 may store an application 318.
  • the application 318 may be stored in the local memory 308, the one or more bulk storage devices 310, or separate from the local memory and the bulk storage devices.
  • the data processing system 300 may further execute an operating system (not shown in Fig. 15) that can facilitate execution of the application 318.
  • the application 318 being implemented in the form of executable program code, can be executed by the data processing system 300, e.g., by the processor 302.
  • the data processing system 300 may be configured to perform one or more operations or method steps described herein.
  • Various embodiments of the invention may be implemented as a program product for use with a computer system, where the program(s) of the program product define functions of the embodiments (including the methods described herein).
  • the program(s) can be contained on a variety of non-transitory computer-readable storage media, where, as used herein, the expression "non-transitory computer readable storage media" comprises all computer-readable media, with the sole exception being a transitory, propagating signal.
  • the program(s) can be contained on a variety of transitory computer-readable storage media.
  • Illustrative computer-readable storage media include, but are not limited to: (i) non- writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non- volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored.
  • the computer program may be run on the processor 302 described herein.

Abstract

An electronic device is configured to obtain information identifying available light sources (11-13), including their positions. The electronic device is further configured to receive an instruction to render a light effect. The instruction comprises one or more light effect parameters, a mapping function identifier and one or more spatial indications (87). The electronic device is further configured to map the light effect to one or more of the available light sources based on the one or more light effect parameters, the mapping function identifier, the one or more spatial indications and the positions of the available light sources. An output of the mapping identifies one or more selected ones (11-13) of the available light sources and comprises one or more light states for each of the selected light sources. The electronic device is further configured to transmit control commands for controlling the one or more selected light sources.

Description

Mapping a light effect to light sources using a mapping function
FIELD OF THE INVENTION
The invention relates to an electronic device for mapping a light effect to one or more light sources.
The invention further relates to a method of mapping a light effect to one or more light sources.
The invention also relates to a computer program product enabling a computer system to perform such a method.
BACKGROUND OF THE INVENTION
More and more devices are becoming connected. If media-playback devices (e.g. displays and loudspeakers) become connected to and fully interoperable with lighting devices, immersive experiences can be created by generating light effects in accordance with the effects and events in the rendered media. Many typical set-ups exist for audio and video devices (e.g. home cinema 5.1 or 2.1 speakers, or "soundbars", typically centered around a TV screen and a typical user position). People's connected lighting infrastructures and setups, however, are even more diverse and practically unique to each user, which makes the creation of lighting atmospheres, scenes and scripts more difficult.
US 2010/0090617 discloses a method of composing a lighting atmosphere from an abstract description, wherein the lighting atmosphere is generated by several lighting devices, by automatically rendering the desired lighting atmosphere from the abstract description. The abstract description describes the type of light with certain lighting parameters desired at certain semantic locations at certain semantic times. The invention has the main advantage that it allows light scenes and lighting atmospheres to be created at a high level of abstraction, without requiring the definition of a lighting atmosphere or scene by setting the intensity, colors, etc. for single lighting units or devices.
A drawback of the method of US 2010/0090617 is that the created light scenes and light atmospheres are specific to certain light locations and need to be adapted for other light locations.
SUMMARY OF THE INVENTION
It is a first object of the invention to provide an electronic device, which allows light effects to be specified independent of light location.
It is a second object of the invention to provide a method, which allows light effects to be specified independent of light location.
In a first aspect of the invention, the electronic device comprises at least one processor configured to obtain information identifying available light sources, said information comprising for each of said available light sources a position of said available light source relative to a reference position, receive an instruction to render a light effect, said instruction comprising one or more light effect parameters, a mapping function identifier and one or more spatial indications, said one or more spatial indications being specified relative to said reference position, map said light effect to one or more of said available light sources based on said one or more light effect parameters, said mapping function identifier, said one or more spatial indications and said positions of said available light sources, an output of said mapping identifying one or more selected ones of said available light sources and comprising one or more light states for each of said one or more selected light sources, and transmit one or more control commands for controlling said one or more selected light sources, said one or more control commands commanding said one or more selected light sources to assume said one or more light states.
By using a mapping function, the instruction to render the light effect not only does not require individual light sources to be specified, it also does not require a specific location to be specified. By allowing light effects and mapping functions to be specified independently, light scenes and atmospheres may be created more easily, e.g. by first determining the light effects and only then determining how to map the light effects to light sources. Although a specific location or a specific setup of light sources does not need to be specified, an author of a light scene or atmosphere may, by using multiple mapping functions, prevent this from making the light scene or atmosphere less attractive. For example, a certain light setup might not include light sources in the back of the room. For certain light effects, it is preferable to let other light sources render the effects intended for light sources in the back of the room. For certain other light effects, it is preferable not to let other light sources render these effects. This may be realized using multiple mapping functions.
Preferably, said one or more light effect parameters are independent of a mapping function identified by said mapping function identifier. Said one or more light effect parameters may comprise at least one of: a static color value, a static intensity value, a static opacity value, a varying color value, a varying intensity value, a varying opacity value, a static size, a varying size and a trajectory, for example. Preferably, the light effect parameter relates to at least a color and/or intensity of light. The light effect parameter may simply specify "color = yellow, intensity = 50%", for example, or specify a predefined effect such as 'explosion'. Said one or more spatial indications may indicate one or more positions and/or areas. The spatial indication may comprise just single values, e.g. x=1; y=0.5, or a range, e.g. x=0.5-1; y=0.5-1. This range is an example of the spatial indication indicating an area. A spatial indication may comprise a radius to indicate an area, e.g. 0.2 around position x=1; y=0.5.
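Purely for illustration (the claims do not prescribe any concrete data format, and all field names below are hypothetical), such an instruction to render a light effect could be represented as follows in Python:

# Hypothetical representation of an instruction to render a light effect.
# All field names are illustrative; the patent does not prescribe a format.
instruction = {
    "mapping_function": "AreaEffect",            # mapping function identifier
    "spatial_indications": [
        {"x": 1.0, "y": 0.5, "radius": 0.2},     # area: radius 0.2 around x=1; y=0.5
    ],
    "light_effect_parameters": {
        "color": (1.0, 1.0, 0.0),                # static color value (yellow, RGB)
        "intensity": 0.5,                        # static intensity value (50%)
    },
}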
The mapping function determines which light sources are to be controlled to render light output based on the received instruction to render a light effect. Said reference position may be a typical or current position of a user or a position of a media-rendering object, for example. The current position of the mobile device of the user may be assumed to represent the current position of the user or a depth camera may be used to determine the current position of the user, for example. The media-rendering object may be a media- rendering device (e.g. a Television) or a media-rendering projection surface (e.g. a wall or wall portion), for example. The electronic device may be, for example, a mobile device or an (other) media-rendering device, e.g. a Television. In this case, the electronic device may obtain the information identifying the available light sources from a bridge which collects and stores this information, for example. Alternatively, the electronic device may be a bridge, for example.
Said at least one processor may be configured to map said light effect to said one or more of said available light sources by selecting one or more of said available light sources based on said mapping function identifier, said one or more spatial indications and said positions of said available light sources, and determining said one or more light states for each of said one or more selected light sources based on said one or more light effect parameters. As a simple example, the one or more light states may simply comprise a constant or dynamic color and/or intensity. Alternatively, the one or more light states may additionally be determined by the mapping function. For example, the color and/or intensity of a light source may be set to be weaker (e.g. less intense or whiter) if the light source is further from the spatial indication and set to be stronger if the light source is closer to the spatial indication. In this case, the mapping function identifies one or more selected ones of said available light sources and one or more light states for said one or more selected light sources. Optionally the selection of light sources is further based on the one or more light effect parameters. For example, the light effect may specify a small explosion or a large explosion which involves multiple light sources.
Said one or more spatial indications may indicate a position and said at least one processor may be configured to determine for each of said one or more selected light sources a contribution of said one or more light effect parameters to said one or more light states of said selected light source based on a distance between said position of said selected light source and said indicated position. This mapping function allows the color and/or intensity of a light source to be set to be weaker (e.g. less intense or whiter) if the light source is further from the spatial indication and to be set to be stronger if the light source is closer to the spatial indication.
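As one concrete possibility (an assumption for illustration; the patent does not fix a particular weighting function), the contribution weight of a selected light source could be computed as w = max(0, 1 - d / r), where d is the distance between the position of the selected light source and the indicated position and r is a radius of the effect, so that w = 1 at the indicated position itself and w = 0 for light sources at or beyond the radius.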
Said one or more spatial indications may indicate at least a first position and a second position and said one or more light effect parameters may comprise at least a first light effect parameter and a second light effect parameter, said first light effect parameter being associated with said first position and said second light effect parameter being associated with said second position. Said at least one processor may be configured to determine a first subset of one or more of said one or more selected light sources and a second subset of one or more of said one or more selected light sources, said one or more light sources of said first subset being closer to said first position than said one or more light sources of said second subset and said one or more light sources of said second subset being closer to said second position than said one or more light sources of said first subset, and determine said one or more light states for said one or more light sources of said first subset based on said first light effect parameter and said one or more light states for said one or more light sources of said second subset based on said second light effect parameter. This mapping function allows all light effects to be reproduced even if no light sources are close to the spatial indication specified in an instruction to render a light effect. One or more further subsets (e.g. third and fourth subsets) may be determined in the same manner as defined above in relation to the first and second subsets. Alternatively or additionally, one or more further subsets may be determined in a different manner. For example, if more light effect parameters have been defined than there are light sources, then light effects associated with the positions farthest from the light sources will not be rendered. Each light effect is rendered by the light source closest to the position associated with the light effect.
If said information identifying available light sources only identifies a single available light source, said at least one processor may be configured to determine said one or more light states for said single available light source based on said first light effect parameter if said single light source is closer to said first position than to said second position, and determine said one or more light states for said single available light source based on said second light effect parameter if said single light source is closer to said second position than to said first position.
Said at least one processor may be configured to determine a sequence of said one or more selected light sources based on said one or more spatial indications and said positions of said one or more selected light sources, and determine said one or more light states for said one or more selected light sources based on said one or more light effect parameters, wherein said one or more control commands command said one or more selected light sources to assume said determined one or more light states in the order specified by said sequence, a next light source in said sequence assuming said one or more determined light states determined for said next light source a specified period after a previous light source in said sequence assumes said one or more determined light states determined for said previous light source. The spatial indication for this mapping function preferably specifies at least one point in space (coordinate system) and a direction (e.g. left to right). This mapping function allows light sources to be activated in sequence, e.g. in a certain direction.
Said one or more spatial indications may indicate an area and said at least one processor may be configured to select only available lights with a position in said indicated area. This mapping function may be used to limit light effects to a certain area of a room, e.g. the back side, the left side or the back-left side.
Said at least one processor may be configured to map said light effect to one or more of said available light sources based on said one or more light effect parameters, said mapping function identifier, said one or more spatial indications, said positions of said available light sources and information identifying rendering capabilities of said available light sources. By additionally taking the capabilities (e.g. dim levels or white vs. color) of the available light sources into account, a better mapping is achieved. For example, when it is important that each channel of a certain color light effect is rendered by at least one light source, the at least one processor should ensure that each channel of the light effect is mapped to at least one color light source.
In a second aspect of the invention, the method comprises obtaining information identifying available light sources, said information comprising for each of said available light sources a position of said available light source relative to a reference position, receiving an instruction to render a light effect, said instruction comprising one or more light effect parameters, a mapping function identifier and one or more spatial indications, said one or more spatial indications being specified relative to said reference position, mapping said light effect to one or more of said available light sources based on said one or more light effect parameters, said mapping function identifier, said one or more spatial indications and said positions of said available light sources, an output of said mapping identifying one or more selected ones of said available light sources and comprising one or more light states for each of said one or more selected light sources, and transmitting one or more control commands for controlling said one or more selected light sources, said one or more control commands commanding said one or more selected light sources to assume said one or more light states. The method may be implemented in hardware and/or software.
Mapping said light effect to said one or more of said available light sources may comprise selecting one or more of said available light sources based on said mapping function identifier, said one or more spatial indications and said positions of said available light sources and determining said one or more light states for each of said one or more selected light sources based on said one or more light effect parameters.
Moreover, a computer program for carrying out the methods described herein, as well as a non-transitory computer-readable storage medium storing the computer program are provided. A computer program may, for example, be downloaded by or uploaded to an existing device or be stored upon manufacturing of these systems.
A non-transitory computer-readable storage medium stores at least one software code portion, the software code portion, when executed or processed by a computer, being configured to perform executable operations comprising: obtaining information identifying available light sources, said information comprising for each of said available light sources a position of said available light source relative to a reference position, receiving an instruction to render a light effect, said instruction comprising one or more light effect parameters, a mapping function identifier and one or more spatial indications, said one or more spatial indications being specified relative to said reference position, mapping said light effect to one or more of said available light sources based on said one or more light effect parameters, said mapping function identifier, said one or more spatial indications and said positions of said available light sources, an output of said mapping identifying one or more selected ones of said available light sources and comprising one or more light states for each of said one or more selected light sources, and transmitting one or more control commands for controlling said one or more selected light sources, said one or more control commands commanding said one or more selected light sources to assume said one or more light states.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a device, a method or a computer program product.
Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, microcode, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit", "module" or "system." Functions described in this disclosure may be implemented as an algorithm executed by a processor/microprocessor of a computer. Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied, e.g., stored, thereon.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a computer readable storage medium may include, but are not limited to, the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory
(ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of the present invention, a computer readable storage medium may be any tangible medium that can contain, or store, a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing. Computer program code for carrying out operations for aspects of the present invention may be written in any
combination of one or more programming languages, including an object oriented programming language such as Java(TM), Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the present invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor, in particular a microprocessor or a central processing unit (CPU), of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer, other programmable data processing apparatus, or other devices create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of devices, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other aspects of the invention are apparent from and will be further elucidated, by way of example, with reference to the drawings, in which:
Fig. 1 shows a system which comprises a first embodiment of the electronic device of the invention;
Fig. 2 is a block diagram of the first embodiment of the electronic device of Fig. 1;
Fig. 3 shows a system which comprises a second embodiment of the electronic device of the invention;
Fig. 4 is a block diagram of the second embodiment of the electronic device of Fig. 3;
Fig. 5 depicts an example of a coordinate system used in an embodiment of the invention;
Fig. 6 depicts an example of a user interface for indicating positions of light sources relative to a user position;
Figs. 7 and 8 illustrate a first mapping function which may be implemented in an embodiment of the invention;
Fig. 9 illustrates a second mapping function which may be implemented in an embodiment of the invention;
Fig. 10 illustrates a third mapping function which may be implemented in an embodiment of the invention;
Figs. 11 to 13 illustrate a fourth mapping function which may be implemented in an embodiment of the invention;
Fig. 14 is a flow diagram of an embodiment of the method of the invention; and
Fig. 15 is a block diagram of an exemplary data processing system for performing the method of the invention.
Corresponding elements in the drawings are denoted by the same reference numeral.
DETAILED DESCRIPTION OF THE EMBODIMENTS
Fig. 1 shows a first embodiment of a system, e.g. a Philips Hue system, which comprises a first embodiment of the electronic device of the invention: a bridge 1. The bridge 1 is located in a room 27. A television 21, three light sources (e.g. lamps) 11, 12 and 13 and a person 23 holding a mobile device 25 are also located in the room 27. The bridge 1 controls light sources 11 to 13, e.g. via ZigBee or a protocol based on ZigBee. The bridge 1 can be controlled by other devices, e.g. mobile device 25, via an Application Programming Interface (API). The mobile device 25 may be configured to communicate with the bridge 1 directly or via a wireless LAN access point (not shown), for example. The mobile device 25 may be configured to communicate with the bridge 1 via Wi-Fi (IEEE 802.11), Bluetooth and/or ZigBee, for example. The mobile device 25 runs an application which transmits instructions to a server running on bridge 1. The application has been developed using the above-mentioned API and uses this API to communicate with the bridge 1. The bridge 1 may be connected to the Internet, e.g. via a wireless LAN access point.
An embodiment of the bridge 1 is shown in more detail in Fig. 2. In this embodiment, the bridge 1 comprises a transceiver 3 and a processor 5. The processor 5 is configured to obtain information identifying available light sources. The information comprises for each of the available light sources a position of the available light source relative to a reference position. The processor 5 is further configured to receive an instruction to render a light effect. The instruction comprises one or more light effect parameters, a mapping function identifier and one or more spatial indications. The one or more spatial indications are specified relative to the reference position.
The processor 5 is also configured to map the light effect to one or more of the available light sources based on the one or more light effect parameters, the mapping function identifier, the one or more spatial indications and the positions of the available light sources. An output of the mapping identifies one or more selected ones of the available light sources and comprises one or more light states for each of the one or more selected light sources. The processor 5 is also configured to use the transceiver 3 to transmit one or more control commands for controlling the one or more selected light sources. The one or more control commands command the one or more selected light sources to assume the one or more light states.
In this embodiment, the processor 5 is configured to map the light effect to the one or more of the available light sources by selecting one or more of the available light sources based on the mapping function identifier, the one or more spatial indications and the positions of the available light sources and determining the one or more light states for each of the one or more selected light sources based on the one or more light effect parameters.
In the embodiment of the bridge 1 shown in Fig. 2, the bridge 1 comprises one processor 5. In an alternative embodiment, the bridge 1 comprises multiple processors. The processor 5 of the bridge 1 may be a general-purpose processor, e.g. from ARM or Qualcomm, or an application-specific processor. The processor 5 of the bridge 1 may run a Linux operating system, for example. In the embodiment shown in Fig. 2, a receiver and a transmitter have been combined into a transceiver 3. In an alternative embodiment, one or more separate receiver components and one or more separate transmitter components are used. In an alternative embodiment, multiple transceivers are used instead of a single transceiver. The transceiver 3 may use one or more wireless communication technologies to transmit and receive data, e.g. LTE, Wi-Fi, ZigBee and/or Bluetooth. In the embodiment shown in Fig. 2, the bridge 1 further comprises storage means 7, e.g. for storing information identifying lights 11 to 13. The storage means 7 may comprise one or more memory units. The storage means 7 may comprise solid state memory, for example.
Fig. 3 shows a second embodiment of a system, e.g. a Philips Hue system, which comprises a second embodiment of the electronic device of the invention: a mobile device 31. No bridge is used in this example. The mobile device 31 communicates with the lights 11 to 13 via a data forwarding device 29, e.g. a wireless LAN access point. An embodiment of the mobile device 31 is shown in more detail in Fig. 4. In this embodiment, the mobile device 31 comprises a transceiver 33 and a processor 35. The mobile device 31 further comprises a touchscreen 39 which may be used to start or stop the aforementioned light control application, for example. The data forwarding device 29 may comprise a wireless LAN access point and/or a network (e.g. Ethernet) switch, for example.
The processor 35 is configured to obtain information identifying available light sources. The information comprises for each of the available light sources a position of the available light source relative to a reference position. In this embodiment, the information is collected directly from light sources 11, 12 and 13. In an alternative embodiment, a bridge is present and the information is obtained from the bridge by the mobile device 31. The processor 35 is further configured to receive an instruction to render a light effect. The instruction comprises one or more light effect parameters, a mapping function identifier and one or more spatial indications. The one or more spatial indications are specified relative to the reference position.
The processor 35 is also configured to map the light effect to one or more of the available light sources based on the one or more light effect parameters, the mapping function identifier, the one or more spatial indications and the positions of the available light sources. An output of the mapping identifies one or more selected ones of the available light sources and comprises one or more light states for each of the one or more selected light sources. The processor 35 is also configured to use the transceiver 33 to transmit one or more control commands for controlling the one or more selected light sources. The one or more control commands command the one or more selected light sources to assume the one or more light states.
In the embodiment of the mobile device 31 shown in Fig. 4, the mobile device 31 comprises one processor 35. In an alternative embodiment, the mobile device 31 comprises multiple processors. The processor 35 of the mobile device 31 may be a general-purpose processor, e.g. from ARM or Qualcomm, or an application-specific processor. The processor 35 of the mobile device 31 may run an iOS, Windows or Android operating system, for example. The invention may be implemented using a computer program running on one or more processors.
In the embodiment shown in Fig. 4, a receiver and a transmitter have been combined into a transceiver 33. In an alternative embodiment, one or more separate receiver components and one or more separate transmitter components are used. In an alternative embodiment, multiple transceivers are used instead of a single transceiver. The transceiver 33 may use one or more wireless communication technologies to transmit and receive data, e.g. LTE, Wi-Fi, ZigBee and/or Bluetooth. In the embodiment shown in Fig. 4, the mobile device 31 further comprises storage means 37, e.g. for storing applications (also referred to as "apps") and application data. The storage means 37 may comprise one or more memory units. The storage means 37 may comprise solid state memory, for example. The touchscreen 39 may comprise an LCD or OLED display panel, for example.
Spatial indications and positions of light sources are typically specified using a coordinate system. The API may use a single coordinate system or multiple coordinate systems. In case of the latter, the developer may be able to specify the coordinate system that he wants to use. An example of such a coordinate system is shown in Fig. 5. In this coordinate system, the reference position is a typical position of a user, i.e. position 43 (0,0.5). Position 43 may represent the position of a couch, for example. The coordinate system of Fig. 5 defines another fixed position: position 41 (0,-0.5) of a Television (or alternatively, of a projection surface, for example). The front-left corner has coordinate (-1,1), the front-right corner has coordinate (1,1), the back-left corner has coordinate (-1,-1), the back-right corner has coordinate (1,-1) and the center has coordinate (0,0). Thus, x < 0 is left, x > 0 is right, y < 0 is back, and y > 0 is front. In an alternative embodiment, the position 41 of the Television, i.e. the media-rendering device, is the reference position.
In an embodiment of the system, the positions of the light sources are specified manually by a user, e.g. via the user interface displayed on display 39 of the mobile device 31, as shown in Fig. 6. The same user interface may be displayed on a display of the mobile device 25 of Fig. 1. The position of the light sources is specified relative to the position of the user, which is represented by an icon 53. Icon 53 may represent the current position of the user or a typical position of the user. In the former case, the specified positions of the light sources may be converted to positions relative to a typical user position if such a typical user position is known. In the latter case, the user may be able to move around icon 53 on the screen displayed on display 39. An icon 51 represents a Television. This icon 51 may have a fixed position relative to icon 53, e.g. in accordance with the coordinate system of Fig. 5, or the user may be able to move around icon 51 on the screen displayed on display 39. In the latter case, the position of the Television may also be automatically detected. Light sources 11, 12 and 13 of Fig. 1 are represented by icons 61, 62 and 63, respectively. The user is able to move around icons 61 to 63 on the screen displayed on display 39.
With the user interface shown in Fig. 6, a user is able to specify 2D positions on a floorplan. In a more advanced embodiment, the user may indicate an approximate height in the room for each light source (or simply indicate low, medium or high in the room). Certain more advanced effects and embodiments would require knowing the 3D position of each light source in the area, or actually the 3D position in space where the light source is creating an effect (e.g. the location and dimensions of each light source's effect surface). This could be supported if users can easily position lamp representations in a 3D representation of the room. In such a user interface the user may also indicate an approximate height and orientation of a luminaire, or indicate the current shape and length of a flexible, cut-to-measure light strip.
The light sources 11 to 13 represented by icons 61 to 63 in Fig. 6 are controlled as a single entity. In case pixelated lighting devices are used of which different pixels can be controlled differently, the positions of individual pixels or groups of pixels preferably need to be known as well. These positions may be detected automatically. For example, embedded sensors may be able to detect the orientation and height of the lighting device. In the case of flexible pixelated light strips, future embedded sensors may even be able to detect the shape of the pixelated light strip. For example, a LED strip may be able to detect where it is bent, stretched or twisted.
In the example shown in Fig. 6, a user specifies the positions of the light sources 11 to 13 of Fig. 1. In an alternative embodiment, automatic lighting device position detection may be performed using known methods of RF-based position detection (e.g. RSSI). Determining the position of the user and/or of the media-rendering device may be done in a similar way, by using RF-based positioning to detect the approximate location of the user's mobile device and/or of the media-rendering device.
Many different mapping functions may be made available in the API. In general, the one or more spatial indications may indicate one or more positions and/or areas. The one or more light effect parameters may comprise at least one of: a static color value, a static intensity value, a static opacity value, a varying color value, a varying intensity value, a varying opacity value, a static size, a varying size and a trajectory, for example.
In the next paragraphs, four examples of mapping functions are defined. In these examples, it is assumed that all light sources have color-rendering capabilities. However, in alternative examples, a room may also have connected lamps installed which can only render white light (e.g. the Philips Hue Ambience lamp). Although not discussed in the four examples, the capabilities of the light sources may also be taken into account when selecting light sources for rendering the light effects.
A first mapping function is called AreaEffect. This mapping function is illustrated in Figs. 7 and 8. In this mapping function, the one or more spatial indications indicate an area, e.g. one or more of areas 71 to 74. The processor 5 or 35 is configured to select only available lights with a position in the indicated area.
AreaEffect will play on all light sources in a given area (or multiple areas). An important property of an AreaEffect is that if a specific setup has no light in a certain area, the effect won't be visible. This may be desired when it is better not to show an effect at all than to show it in the wrong location.
In the example shown in Fig. 7, two AreaEffects are specified. One is a constant red color on the front half and one is a constant blue color on the back half. In this example, all three light sources 11 to 13 are located in the front half, so all lights will be colored red and none will be colored blue. This AreaEffect example might be specified with the following code (the color being specified as color values for red, green and blue, respectively):
AreaEffect frontRed
frontRed.addArea(FRONTHALF)
frontRed.setColorAnimation(Constant(1),Constant(0),Constant(0))
AreaEffect backBlue
backBlue.addArea(BACKHALF)
backBlue.setColorAnimation(Constant(0),Constant(0),Constant(1))
If there had been lights located in the back half of room 27 of Fig. 7, they would have been colored blue. This is shown in Fig. 8, where lights 14 and 15 are located in the back half of room 27 and are colored blue.
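A minimal Python sketch of this area-based mapping, assuming the coordinate system of Fig. 5 (y > 0 is front, y < 0 is back) and hypothetical light positions; it reproduces the outcome of Fig. 7, where no light source lies in the back half and the blue effect is therefore not rendered:

# Minimal sketch of AreaEffect selection; coordinate system of Fig. 5
# (y > 0 is front, y < 0 is back). Light positions are assumptions.
FRONTHALF = lambda x, y: y > 0
BACKHALF = lambda x, y: y < 0

def area_effect(lights, area, color):
    # Select only available light sources with a position in the area; if no
    # light source lies in the area, the effect is simply not rendered.
    return {light_id: color for light_id, (x, y) in lights.items() if area(x, y)}

lights = {11: (-0.3, 0.8), 12: (0.0, 0.6), 13: (0.3, 0.8)}   # all in the front half
print(area_effect(lights, FRONTHALF, (1, 0, 0)))   # all three light sources render red
print(area_effect(lights, BACKHALF, (0, 0, 1)))    # {} - the blue effect is not rendered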
By choosing the AreaEffect, the effect creator can make the explicit choice that if there are no lights in the designated area, the effect is not rendered. For the example above, the creator might find it a bad experience if the blue effect were rendered in another area than he expected. In the example of Figs. 7 and 8, each lighting device (comprising a single light source) covers one area. However, larger lighting devices (e.g. a long lightstrip or a large ceiling lighting array) may cover multiple areas. In such a case, different segments of light sources of the large lighting device could be assigned to different areas. An example usage of an AreaEffect could be indicating in a game from which direction a character is being hit. When e.g. a character is hit from the left, it is important to only play this on the left and not on the right, as this would be confusing.

A second mapping function is called MultiChannelEffect. This mapping function is illustrated in Fig. 9. In this mapping function, the one or more spatial indications indicate at least a first position, e.g. position 81, and a second position, e.g. position 83, and the one or more light effect parameters comprise at least a first light effect parameter and a second light effect parameter. The first light effect parameter is associated with the first position, e.g. position 81, and the second light effect parameter is associated with the second position, e.g. position 83. The processor 5 or 35 is configured to determine a first subset of one or more of the one or more selected light sources, e.g. light sources 11 and 13, and a second subset of one or more of the one or more selected light sources, e.g. light source 12. The one or more light sources of the first subset, e.g. light sources 11 and 13, are closer to the first position, e.g. position 81, than the one or more light sources of the second subset, e.g. light source 12, and the one or more light sources of the second subset, e.g. light source 12, are closer to the second position, e.g. position 83, than the one or more light sources of the first subset, e.g. light sources 11 and 13.
The processor 5 or 35 is further configured to determine the one or more light states for the one or more light sources of the first subset, e.g. light sources 11 and 13, based on the first light effect parameter and the one or more light states for the one or more light sources of the second subset, e.g. light source 12, based on the second light effect parameter. Thus, MultiChannelEffect will try to distribute the light channels (comparable to e.g. audio channels) evenly over the available light sources, prioritizing light sources closest to the channel location. The mapping is significantly different from that of an AreaEffect: even though a channel has a certain location, the MultiChannelEffect tries to make as many channels visible as possible, even if a channel does not have light sources in close proximity to the channel location.
For example, suppose an effect creator created two light channels for an audio script, one channel in the 'Front' and another in the 'Back'. In that case, he creates two Channels, one for the front (position 81; Location(0,1)) and one for the back (position 83; Location(0,-1)), and adds them to the multichannel effect. This MultiChannelEffect example might be specified with the following code:
Channel frontRed
frontRed.setLocation(Location(0,1))
frontRed.setColorAnimation(Constant(1),Constant(0),Constant(0))
Channel backBlue
backBlue.setLocation(Location(0,-1))
backBlue.setColorAnimation(Constant(0),Constant(0),Constant(1))
MultiChannelEffect effect
effect.addChannel(frontRed)
effect.addChannel(backBlue)
As indicated in the code specified above, there is a channel with a constant red color in the front and a channel with a constant blue color in the back. Even though all light sources, i.e. light sources 11 to 13, in this setup are located closer to the red channel, one light source, i.e. light source 12, is still mapped to the blue channel, as shown in Fig. 9. Thus, the MultiChannelEffect maps channels to light sources, making sure all channels are mapped to the light sources closest to their positions. The MultiChannelEffect may be used if the creator finds rendering of all channels more important than rendering each channel at a specific location.
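A minimal Python sketch of this channel mapping, assuming hypothetical light positions in the Fig. 5 coordinate system; the greedy strategy is an assumption, chosen so that every channel first receives its nearest still-unassigned light source (so all channels are rendered if enough light sources exist), after which the remaining light sources join their nearest channel:

import math

def multi_channel_map(lights, channels):
    # lights: {id: (x, y)}; channels: {name: (x, y)}. Greedy assignment is an
    # assumption; the patent only requires that all channels be mapped to the
    # light sources closest to their positions.
    assignment = {name: [] for name in channels}
    unassigned = dict(lights)
    for name, ch_pos in channels.items():          # give each channel one light first
        if not unassigned:
            break
        best = min(unassigned, key=lambda i: math.dist(unassigned[i], ch_pos))
        assignment[name].append(best)
        del unassigned[best]
    for light_id, pos in unassigned.items():       # distribute the rest by proximity
        nearest = min(channels, key=lambda n: math.dist(pos, channels[n]))
        assignment[nearest].append(light_id)
    return assignment

lights = {11: (-0.3, 0.8), 12: (0.0, 0.6), 13: (0.3, 0.8)}
print(multi_channel_map(lights, {"frontRed": (0, 1), "backBlue": (0, -1)}))
# {'frontRed': [11, 13], 'backBlue': [12]} - light source 12 renders the back channel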
A third mapping function is called LightSourceEffect. This mapping function is illustrated in Fig. 10. In this mapping function, the one or more spatial indications indicate a position, e.g. position 87, and the processor 5 or 35 is configured to determine for each of the one or more selected light sources, e.g. light sources 11 to 13, a contribution of the one or more light effect parameters to the one or more light states of the selected light source based on a distance between the position of the selected light source, e.g. of light sources 11 to 13, and the indicated position, e.g. position 87.
Thus, LightSourceEffect maps a virtual light source to actual lights such that lights close to the virtual light source are more strongly influenced than lights further away from it. The mapping is significantly different from that of the previous two mapping functions: in both AreaEffect and MultiChannelEffect a light source either belongs to an area or channel or it does not, whereas the relation is more gradual for the LightSourceEffect: the closer a light is to the virtual source, the more strongly it is influenced by the source.
In the example depicted in Fig. 10, there is a LightSourceEffect with a constant blue color, a constant location in the middle of the room, at position 87, and a constant radius of 1.5. Lights in the actual setup which are located close to the middle have a strong blue color, whereas lights further away from the middle have a less strong blue color. The user has three lights at the front: light sources 11 to 13. The LightSourceEffect maps the color contribution depending on the distance to position 87 (Location(0,0)). This LightSourceEffect example might be specified with the following code:
LightSourceEffect effect
effect.setColorAnimation(Constant(0),Constant(0),Constant(1))
effect.setLocationAnimation(Constant(0),Constant(0))
effect.setRadiusAnimation(Constant(1.5))
The LightSourceEffect may be used if the creator wants all lights to contribute depending on their distance to the location of the effect. The color, location and/or radius may be dynamic instead of constant. An example of a dynamic effect that may be created with the LightSourceEffect mapping function is an explosion whose radius increases and decreases.
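A minimal Python sketch of this distance-dependent mapping, assuming the linear falloff w = max(0, 1 - d/r) (the patent leaves the exact falloff function open) and hypothetical light positions:

import math

def light_source_effect(lights, effect_pos, radius, color):
    # Each light source renders the effect color scaled by a weight that
    # decreases with its distance to the virtual light source.
    states = {}
    for light_id, pos in lights.items():
        weight = max(0.0, 1.0 - math.dist(pos, effect_pos) / radius)
        states[light_id] = tuple(round(weight * c, 2) for c in color)
    return states

lights = {11: (-0.3, 0.8), 12: (0.0, 0.6), 13: (0.3, 0.8)}
print(light_source_effect(lights, (0, 0), 1.5, (0, 0, 1)))
# light source 12, closest to the middle, gets the strongest blue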
A fourth mapping function is called LightIteratorEffect. This mapping function is illustrated in Figs. 11 to 13. In this mapping function, the processor 5 or 35 is configured to determine a sequence of the one or more selected light sources, e.g. light sources 11 to 13, based on the one or more spatial indications and the positions of the one or more selected light sources and determine the one or more light states for the one or more selected light sources based on the one or more light effect parameters. The one or more control commands command the one or more selected light sources, e.g. light sources 11 to 13, to assume the determined one or more light states in the order specified by the sequence. A next light source in the sequence assumes the one or more determined light states determined for the next light source a specified period, e.g. a second, after a previous light source in the sequence assumes the one or more determined light states determined for the previous light source.
Preferably, LightIteratorEffect will iterate over individual lights with a certain offset, order and mode. The order indicates in which order the light sources will be controlled (e.g. 'from front to back' or 'counter clockwise'). The mode indicates what happens after every light source is iterated over: stop ('single'), repeat ('cycle') or revert ('bounce'). The offset indicates how much later the light effect starts on the next light source than on the previous light source. The LightIteratorEffect is clearly different from the previous effect types, because it iterates over individual light sources. This means that the total duration or the speed of an iteration over all light sources depends on the number of light sources involved in the iteration, e.g. present in the setup. However, all effects share one important property: a single effect definition works on every possible light setup. They just differ in the way they do the mapping.
As an example of the LightIteratorEffect, an effect creator may want to create a sequence of light sources lighting up red one by one for one second, in a left-to-right order. In this case, he may create a LightIteratorEffect and specify a color animation according to which the color needs to be red for one second, a left-to-right ordering, a repeat mode (CYCLE) and a step duration of one second. This LightIteratorEffect example might be specified with the following code:
LightIteratorEffect effect
effect.setColorAnimation(Tween(1,1,1000,LINEAR),Constant(0),Constant(0))
effect.setOrder(LEFTRIGHT)
effect.setMode(CYCLE)
effect.setOffset(1000)
The Tween function that is used in the above example uses the following syntax: Tween(<startvalue>,<endvalue>,<timeMs>,<tweentype>). Tween(1,1,1000,LINEAR) is a tween from 1 to 1 in 1000 ms with a linear transition, i.e. the color is stable on value 1 for one second. Thus, the LightIteratorEffect iterates over each light source being on for one second in this example. Fig. 11 depicts light source 11 being on (colored red) and light sources 12 and 13 being off when the LightIteratorEffect starts (t=0). Fig. 12 depicts light source 12 being on (colored red) and light sources 11 and 13 being off when one second has elapsed (t=1 second). Other Tween functions may be used instead of the Tween function described above, for example.
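For illustration, a minimal Python sketch of such a Tween primitive; only the LINEAR type is sketched, and the set of supported tween types is an assumption:

def tween(start_value, end_value, time_ms, t_ms):
    # Linear interpolation from start_value to end_value over time_ms
    # milliseconds, evaluated at elapsed time t_ms.
    if t_ms >= time_ms:
        return end_value
    return start_value + (end_value - start_value) * (t_ms / time_ms)

print(tween(1, 1, 1000, 500))   # 1.0 - stable on value 1, as in the example above
print(tween(0, 1, 2000, 500))   # 0.25 - a quarter of the way through a fade-in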
Segments of pixelated lighting devices may be considered as individual light sources. For example, if a user wants to create a media-based light effect while only having a Pixelated Philips Hue Lightstrip in the room, the LightIteratorEffect may play the effect on 3 segments of the Pixelated Lightstrip. An example usage of a LightIteratorEffect is to make a so-called "chaser".
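A minimal Python sketch of the iteration itself, assuming a LEFTRIGHT order derived from the x coordinates and hypothetical light positions; it computes at which offset each light source starts the effect:

def iterator_schedule(lights, offset_ms):
    # LEFTRIGHT order: sort the light sources by their x coordinate; each
    # light source starts offset_ms after the previous one in the sequence.
    order = sorted(lights, key=lambda light_id: lights[light_id][0])
    return [(light_id, step * offset_ms) for step, light_id in enumerate(order)]

lights = {11: (-0.3, 0.8), 12: (0.0, 0.6), 13: (0.3, 0.8)}
print(iterator_schedule(lights, 1000))   # [(11, 0), (12, 1000), (13, 2000)]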
An effect creator may limit a mapping function to a certain area. In the example shown in Fig. 13, the LightIteratorEffect only involves the light sources in the front area 91, i.e. light sources 11, 12 and 13, and does not involve the light sources in the back area 93, i.e. light sources 14 and 15. The four mapping functions have been described as operating in isolation.
However, multiple light effects may occur simultaneously and multiple effects can map to the same lights. Therefore, effects preferably have a layer and a specified opacity. Based on the layer and opacity of each effect, the effects may be blended (using standard alpha blending) to produce the final color for each light.
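A minimal Python sketch of this blending step, assuming RGB colors in the 0..1 range and standard "over" alpha compositing from the bottom layer to the top; the layer ordering and the compositing operator follow the description above, the rest is illustrative:

def blend(layers):
    # layers: list of ((r, g, b), opacity) tuples, ordered bottom to top.
    final = (0.0, 0.0, 0.0)
    for color, alpha in layers:
        final = tuple(alpha * c + (1.0 - alpha) * f for c, f in zip(color, final))
    return final

# A fully opaque red AreaEffect below a half-transparent blue LightSourceEffect:
print(blend([((1, 0, 0), 1.0), ((0, 0, 1), 0.5)]))   # (0.5, 0.0, 0.5)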
In the example of the four mapping functions, mainly 'constant' animations were used for ease of explanation. In general, any property of an effect might be animated by any kind of tween or curve. For example:
The opacity of an AreaEffect may be set to change from 100% to 50% linearly over 500ms.
The color of a channel may be set to change from red to blue in 2 seconds.
The radius of a LightSourceEffect may be set to change from 0.5 to 1 quadratically over 5 seconds and then linearly back to 0.5 in 1 second.
For determining which light source should contribute to which part of the light effect, the rendering capabilities of the light source may also be taken into account. For example, light sources which can render colors may be selected for a colored light effect, and for a rich pattern or color variation such as a rainbow effect, a pixelated, linear lighting device may be most suited. Next to colors and patterns, the rendering capabilities in terms of minimum or maximum light output may also be taken into account. For example, if a low-brightness "night scene" is required, it is useful to know which light sources have the deepest dim-level capabilities, whereas in the case of a "lightning strike", light sources may be selected which can generate a high light output, next to other requirements such as preferably being located high up in the room (e.g. at the ceiling) while having a linear (lightning-like) shape.
The above four mapping functions are example functions which can be used to generate a variety of light effects. A light effect is generally mapped to multiple light sources using the mapping function. However, the light effect itself may also involve multiple light sources. The light effect may specify a spatial distribution, for example. A simple spatial distribution may include an effect around a (relative) location with a specific effect size, color and brightness. More complex spatial distributions may include brightness and color gradients and 2D or 3D light effect shape. However, a high-quality rendering of a more sophisticated spatial light distribution normally requires a more precise input of the relative positions of the individual light sources. To address this, a 3D model of a room may be created in which exact positions of individual light sources are specified in relation to a current or typical user position, for example.
Spatial distributions may also be used for dynamic light effects. Besides dynamically varying the brightness and color, a light pattern and shape may be defined which follows a specific 2D or 3D trajectory with a certain speed over time. For instance, an explosion light effect may suddenly appear and increase in size, or a fireball may follow a trajectory where it moves from one side of the room to the other.
Light effects may be related to media events. In this case, the connected lighting system may receive input from media devices (e.g. Television 21 of Fig. 1),
indicating current media events. The media device generates a media effect which has a position relative to the user, whereas the light sources generate a media-related light effect which has a position relative to the user, and these positions are generally in the same area. Media devices can range from media devices processing the media for playback (e.g. a game console or audio system) to media-rendering devices such as a smart TV or connected loudspeaker, or devices which both process and render the media (e.g. a smart TV, tablet or smartphone). The media events can range from a simple playback start event of a specific content item to semantic events which are derived from the media or from a script that has been defined for the specific content item, such as a movie or music track. In the case of music and movies, the script can be a linear sequence of media events, whereas in the case of games or interactive content, the media events are non-linear or less predictable and largely dependent on the inputs of the user(s). Media can cover a variety of content types, ranging from music and movies to games and interactive augmented reality content.
As previously described, a light source may be part of a pixelated lighting device. Based on the determined state (e.g. color and/or intensity value) per light source, control signals are generated to control the light sources accordingly. In the case of pixelated lighting devices, an aggregated control signal may be determined for the pixelated lighting device. For instance, a pixelated light strip could make use of a serial data signal which feeds control values to all individual light nodes.
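As a hedged illustration of such an aggregated signal, the following Python sketch packs the determined per-node colors into one serial byte stream; the sequential 8-bit RGB layout is an assumption and not taken from the patent:

def serialize_strip(node_colors):
    # node_colors: per-node (r, g, b) values in the 0..1 range; each node
    # receives three bytes, fed to the strip one node after another.
    data = bytearray()
    for r, g, b in node_colors:
        data.extend(int(round(255 * c)) for c in (r, g, b))
    return bytes(data)

print(serialize_strip([(1, 0, 0), (0, 0, 1)]).hex())   # 'ff00000000ff'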
An embodiment of the method of the invention is shown in Fig. 14. A step 201 comprises obtaining information identifying available light sources. The information comprises for each of the available light sources a position of the available light source relative to a reference position. A step 203 comprises receiving an instruction to render a light effect. The instruction comprises one or more light effect parameters, a mapping function identifier and one or more spatial indications. The one or more spatial indications are specified relative to the reference position.
A step 205 comprises mapping the light effect to one or more of the available light sources based on the one or more light effect parameters, the mapping function identifier, the one or more spatial indications and the positions of the available light sources. An output of the mapping identifies one or more selected ones of the available light sources and comprises one or more light states for each of the one or more selected light sources. A step 207 comprises transmitting one or more control commands for controlling the one or more selected light sources. The one or more control commands command the one or more selected light sources to assume the one or more light states.
In the embodiment shown in Fig. 14, step 205 comprises sub-steps 211 and 213. Step 211 comprises selecting one or more of the available light sources based on the mapping function identifier, the one or more spatial indications and the positions of the available light sources. Step 213 comprises determining the one or more light states for each of the one or more selected light sources based on the one or more light effect parameters.
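A minimal Python sketch of step 205 with its sub-steps, reusing the hypothetical instruction representation sketched earlier; the mapping-function registry and the selector function are illustrative assumptions:

def map_light_effect(lights, instruction, mapping_functions):
    # Sub-step 211: select light sources using the identified mapping
    # function, the spatial indications and the light source positions.
    select = mapping_functions[instruction["mapping_function"]]
    selected = select(lights, instruction["spatial_indications"])
    # Sub-step 213: determine the light state(s) per selected light source
    # from the light effect parameters (here: a single static color).
    color = instruction["light_effect_parameters"]["color"]
    return {light_id: color for light_id in selected}

lights = {11: (-0.3, 0.8), 12: (0.0, 0.6), 13: (0.3, 0.8)}
front_half = lambda ls, _: [i for i, (x, y) in ls.items() if y > 0]
print(map_light_effect(lights,
                       {"mapping_function": "AreaEffect",
                        "spatial_indications": ["FRONTHALF"],
                        "light_effect_parameters": {"color": (1, 0, 0)}},
                       {"AreaEffect": front_half}))   # all three lights red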
Fig. 15 depicts a block diagram illustrating an exemplary data processing system that may perform the method as described with reference to Fig. 14.
As shown in Fig. 15, the data processing system 300 may include at least one processor 302 coupled to memory elements 304 through a system bus 306. As such, the data processing system may store program code within the memory elements 304. Further, the processor 302 may execute the program code accessed from the memory elements 304 via the system bus 306. In one aspect, the data processing system may be implemented as a computer that is suitable for storing and/or executing program code. It should be appreciated, however, that the data processing system 300 may be implemented in the form of any system including a processor and a memory that is capable of performing the functions described within this specification.
The memory elements 304 may include one or more physical memory devices such as, for example, local memory 308 and one or more bulk storage devices 310. The local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code. A bulk storage device may be implemented as a hard drive or other persistent data storage device. The data processing system 300 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the number of times program code must be retrieved from the bulk storage device 310 during execution. Input/output (I/O) devices, depicted as an input device 312 and an output device 314, can optionally be coupled to the data processing system. Examples of input devices may include, but are not limited to, a keyboard, a pointing device such as a mouse, or the like. Examples of output devices may include, but are not limited to, a monitor or a display, speakers, or the like. Input and/or output devices may be coupled to the data processing system either directly or through intervening I/O controllers.
In an embodiment, the input and the output devices may be implemented as a combined input/output device (illustrated in Fig. 15 with a dashed line surrounding the input device 312 and the output device 314). An example of such a combined device is a touch sensitive display, also sometimes referred to as a "touch screen display" or simply "touch screen". In such an embodiment, input to the device may be provided by a movement of a physical object, such as e.g. a stylus or a finger of a user, on or near the touch screen display.
A network adapter 316 may also be coupled to the data processing system to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks. The network adapter may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to the data processing system 300, and a data transmitter for transmitting data from the data processing system 300 to said systems, devices and/or networks. Modems, cable modems, and Ethernet cards are examples of different types of network adapter that may be used with the data processing system 300.
As pictured in Fig. 15, the memory elements 304 may store an application 318. In various embodiments, the application 318 may be stored in the local memory 308, the one or more bulk storage devices 310, or separate from the local memory and the bulk storage devices. It should be appreciated that the data processing system 300 may further execute an operating system (not shown in Fig. 15) that can facilitate execution of the application 318. The application 318, being implemented in the form of executable program code, can be executed by the data processing system 300, e.g., by the processor 302.
Responsive to executing the application, the data processing system 300 may be configured to perform one or more operations or method steps described herein.
Various embodiments of the invention may be implemented as a program product for use with a computer system, where the program(s) of the program product define functions of the embodiments (including the methods described herein). In one embodiment, the program(s) can be contained on a variety of non-transitory computer-readable storage media, where, as used herein, the expression "non-transitory computer readable storage media" comprises all computer-readable media, with the sole exception being a transitory, propagating signal. In another embodiment, the program(s) can be contained on a variety of transitory computer-readable storage media. Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored. The computer program may be run on the processor 302 described herein.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of embodiments of the present invention has been presented for purposes of illustration, but is not intended to be exhaustive or limited to the implementations in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the present invention. The embodiments were chosen and described in order to best explain the principles and some practical applications of the present invention, and to enable others of ordinary skill in the art to understand the present invention for various embodiments with various modifications as are suited to the particular use contemplated.

Claims

CLAIMS:
1. An electronic device (1,31) comprising at least one processor (5,35) configured to:
obtain information identifying available light sources (11-15), said information comprising for each of said available light sources (11-15) a position of said available light source relative to a reference position (43,53),
receive an instruction to render a light effect, said instruction comprising one or more light effect parameters, a mapping function identifier indicating which one of multiple mapping functions to apply and one or more spatial indications, said one or more spatial indications being specified relative to said reference position (43,53),
- map said light effect to one or more of said available light sources (11-15) based on said instruction, an output of said mapping identifying one or more selected ones (11-13) of said available light sources (11-15) and comprising one or more light states for each of said one or more selected light sources (11-13), and
transmit one or more control commands for controlling said one or more selected light sources (11-13), said one or more control commands commanding said one or more selected light sources (11-13) to assume said one or more light states.
2. An electronic device (1,31) as claimed in claim 1, wherein said at least one processor (5,35) is configured to map said light effect to said one or more of said available light sources (11-15) by:
selecting one or more of said available light sources (11-15) based on said mapping function identifier, said one or more spatial indications and said positions of said available light sources (11-15), and
determining said one or more light states for each of said one or more selected light sources (11-13) based on said one or more light effect parameters.
3. An electronic device (1,31) as claimed in claim 1 or 2, wherein said one or more spatial indications indicate a position and said at least one processor (5,35) is configured to:
- determine for each of said one or more selected light sources (11-13) a contribution of said one or more light effect parameters to said one or more light states of said selected light source based on a distance between said position of said selected light source and said indicated position.
4. An electronic device (1,31) as claimed in claim 1 or 2, wherein said one or more spatial indications indicate at least a first position (81) and a second position (83) and said one or more light effect parameters comprise at least a first light effect parameter and a second light effect parameter, said first light effect parameter being associated with said first position (81) and said second light effect parameter being associated with said second position (83), and said at least one processor (5,35) is configured to:
determine a first subset (11,13) of one or more of said one or more selected light sources (11-13) and a second subset (12) of one or more said one or more selected light sources (11-13), said one or more light sources of said first subset (11,13) being closer to said first position (81) than said one or more light sources of said second subset (12) and said one or more light sources of said second subset (12) being closer to said second position (83) than said one or more light sources of said first subset (11,13), and
determine said one or more light states for said one or more light sources of said first subset (11,13) based on said first light effect parameter and said one or more light states for said one or more light sources of said second subset (12) based on said second light effect parameter.
5. An electronic device (1,31) as claimed in claim 1 or 2, wherein said at least one processor (5,35) is configured to:
- determine a sequence, indicative of an order in which light sources assume a light state, of said one or more selected light sources (11-13) based on said one or more spatial indications and said positions of said one or more selected light sources (11-13), and
- determine said one or more light states for said one or more selected light sources (11-13) based on said one or more light effect parameters,
wherein said one or more control commands command said one or more selected light sources (11-13) to assume said determined one or more light states in the order specified by said sequence, a next light source in said sequence assuming said one or more determined light states determined for said next light source a specified period after a previous light source in said sequence assumes said one or more determined light states determined for said previous light source.
6. An electronic device (1,31) as claimed in claim 1 or 2, wherein said one or more spatial indications indicate an area (71-74, 91-93) and said at least one processor (5,35) is configured to select only available light sources with a position in said indicated area (71-74, 91-93).
7. An electronic device (1,31) as claimed in claim 1 or 2, wherein said reference position (43,53) is a typical position of a user (43,53), a current position of a user or a position of a media-rendering object.
8. An electronic device (1,31) as claimed in claim 1 or 2, wherein said one or more spatial indications indicate one or more positions (81,83,87) and/or areas (71-74, 91-93).
9. An electronic device (1,31) as claimed in claim 1 or 2, wherein said one or more light effect parameters comprise at least one of: a static color value, a static intensity value, a static opacity value, a varying color value, a varying intensity value, a varying opacity value, a static size, a varying size and a trajectory.
10. An electronic device (1,31) as claimed in claim 1 or 2, wherein said at least one processor (5,35) is configured to map said light effect to one or more of said available light sources (11-15) based on said one or more light effect parameters, said mapping function identifier, said one or more spatial indications, said positions of said available light sources (11-15) and information identifying rendering capabilities of said available light sources (11-15).
11. An electronic device (1,31) as claimed in claim 1 or 2, wherein said information identifying available light sources identifies a single available light source, said one or more spatial indications indicate at least a first position (81) and a second position (83) and said one or more light effect parameters comprise at least a first light effect parameter and a second light effect parameter, said first light effect parameter being associated with said first position (81) and said second light effect parameter being associated with said second position (83), and said at least one processor (5,35) is configured to:
determine said one or more light states for said single available light source based on said first light effect parameter if said single light source is closer to said first position (81) than to said second position (83), and
determine said one or more light states for said single available light source based on said second light effect parameter if said single light source is closer to said second position (83) than to said first position (81).
12. A method of mapping a light effect to one or more light sources, comprising:
obtaining (201) information identifying available light sources, said information comprising for each of said available light sources a position of said available light source relative to a reference position;
receiving (203) an instruction to render a light effect, said instruction comprising one or more light effect parameters, a mapping function identifier indicating which one of multiple mapping functions to apply and one or more spatial indications, said one or more spatial indications being specified relative to said reference position;
mapping (205) said light effect to one or more of said available light sources based on said one or more light effect parameters, said mapping function identifier, said one or more spatial indications and said positions of said available light sources, an output of said mapping identifying one or more selected ones of said available light sources and comprising one or more light states for each of said one or more selected light sources; and
transmitting (207) one or more control commands for controlling said one or more selected light sources, said one or more control commands commanding said one or more selected light sources to assume said one or more light states.
13. A method as claimed in claim 12, wherein mapping (205) said light effect to said one or more of said available light sources comprises:
selecting (211) one or more of said available light sources based on said mapping function identifier, said one or more spatial indications and said positions of said available light sources; and
determining (213) said one or more light states for each of said one or more selected light sources based on said one or more light effect parameters.
14. A computer program or suite of computer programs comprising at least one software code portion or a computer program product storing at least one software code portion, the software code portion, when run on a computer system, being configured for enabling the method of any one of claims 12 to 13 to be performed.
PCT/EP2018/064361 2017-06-08 2018-05-31 Mapping a light effect to light sources using a mapping function WO2018224390A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP17174940 2017-06-08
EP17174940.1 2017-06-08

Publications (1)

Publication Number Publication Date
WO2018224390A1 (en) 2018-12-13

Family

ID=59257938

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/064361 WO2018224390A1 (en) 2017-06-08 2018-05-31 Mapping a light effect to light sources using a mapping function

Country Status (1)

Country Link
WO (1) WO2018224390A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100090617A1 (en) 2006-09-29 2010-04-15 Koninklijke Philips Electronics N V Method and device for composing a lighting atmosphere from an abstract description and lighting atmosphere composition system
WO2009087537A2 (en) * 2007-12-31 2009-07-16 Koninklijke Philips Electronics, N.V. Methods and apparatus for facilitating design, selection and/or customization of lighting effects or lighting shows
WO2010004480A1 (en) * 2008-07-11 2010-01-14 Koninklijke Philips Electronics N. V. Method and computer implemented apparatus for lighting experience translation
WO2013088394A2 (en) * 2011-12-14 2013-06-20 Koninklijke Philips Electronics N.V. Methods and apparatus for controlling lighting
US20140106735A1 (en) * 2012-10-12 2014-04-17 Crestron Electronics, Inc. User Identification and Location Determination in Control Applications
WO2017029368A1 (en) * 2015-08-20 2017-02-23 Philips Lighting Holding B.V. Spatial light effects based on lamp location

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IT202000006025A1 (en) * 2020-03-20 2021-09-20 Ledworks Srl Method and system for the generation of lighting effects
WO2021186420A1 (en) * 2020-03-20 2021-09-23 Ledworks Srl Method and system for generating light effects
US20230247744A1 (en) * 2020-03-20 2023-08-03 Ledworks Srl Method and system for generating light effects
WO2021209306A1 (en) 2020-04-14 2021-10-21 Signify Holding B.V. Controlling a lighting device associated with a light segment of an array
WO2021249938A1 (en) 2020-06-09 2021-12-16 Signify Holding B.V. A control system and method of configuring a light source array
WO2021254901A1 (en) 2020-06-19 2021-12-23 Signify Holding B.V. Controlling a pixelated lighting device based on a relative location of a further light source
WO2023222426A1 (en) 2022-05-17 2023-11-23 Signify Holding B.V. Obtaining locations of light sources of a light string wrapped around an object

Similar Documents

Publication Publication Date Title
WO2018224390A1 (en) Mapping a light effect to light sources using a mapping function
US8970786B2 (en) Ambient light effects based on video via home automation
EP3375253B1 (en) Image based lighting control
JP6434197B1 (en) Controller and method for controlling light source
US11259390B2 (en) Rendering a dynamic light scene based on one or more light settings
US11665802B2 (en) Lighting system
CN110583100A (en) Group of devices formed by analyzing device control information
WO2021160552A1 (en) Associating another control action with a physical control if an entertainment mode is active
CN114245906A (en) Selecting image analysis regions based on a comparison of levels of dynamics
US20230033157A1 (en) Displaying a light control ui on a device upon detecting interaction with a light control device
US20230360352A1 (en) Determining an image analysis region for entertainment lighting based on a distance metric
CN110945970B (en) Attention dependent distraction storing preferences for light states of light sources
WO2022058282A1 (en) Determining different light effects for screensaver content
CN111869330B (en) Rendering dynamic light scenes based on one or more light settings
WO2023169993A1 (en) Controlling lighting devices as a group when a light scene or mode is activated in another spatial area
US20240096300A1 (en) Determining light effects in dependence on whether an overlay is likely displayed on top of video content
CN116724667A (en) Requesting a lighting device to control other lighting devices to render light effects from a light script
WO2024022846A1 (en) Selecting lighting devices based on an indicated light effect and distances between available lighting devices
CN113678169A (en) Determining lighting design preferences in augmented and/or virtual reality environments
CN114332335A (en) Lamplight processing method, device, equipment and medium of three-dimensional enhanced model
CN117898025A (en) Rendering polychromatic light effects on pixelated lighting devices based on surface color

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18726507

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18726507

Country of ref document: EP

Kind code of ref document: A1