WO2022194773A1 - Generating light settings for a lighting unit based on video content - Google Patents

Generating light settings for a lighting unit based on video content

Info

Publication number
WO2022194773A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
lighting
images
lighting device
light effect
Prior art date
Application number
PCT/EP2022/056531
Other languages
English (en)
Inventor
Antonie Leonardus Johannes KAMP
Original Assignee
Signify Holding B.V.
Priority date
Filing date
Publication date
Application filed by Signify Holding B.V. filed Critical Signify Holding B.V.
Publication of WO2022194773A1

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • H05B47/115Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B47/125Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B45/00Circuit arrangements for operating light-emitting diodes [LED]
    • H05B45/20Controlling the colour of the light

Definitions

  • the invention relates to a method of generating light settings for a lighting unit of a lighting system based on video content.
  • the invention further relates to a computer program product for executing the method.
  • the invention further relates to a control system for generating light settings for a lighting unit of a lighting system based on video content.
  • Current lighting systems enable control of lighting units based on video content (e.g. movies, tv-shows, games, etc.).
  • the video content may be analyzed and light settings may be extracted from the video content.
  • colors are extracted from areas of the video, and the light settings are determined based on these colors.
  • the areas may, for example, be areas at the edge of the video content (i.e. at the edge of the display) and lighting units surrounding the display are controlled accordingly.
  • Another technique for controlling the light based on video content is to analyze the video content for salient features (e.g. explosions) and to determine the light setting based on these salient features.
  • video content typically contains scenes comprising lighting devices, or the effect of lighting devices.
  • the video content may, for example, be a scene of a street where the streetlights are switched on, a scene of a room with a (flickering) fluorescent tube, etc.
  • the inventor has also realized that, in order to make the experience more immersive, it is beneficial to use the light effects of such lighting devices to control the lights. It is therefore an object of the present invention to make the video experience more immersive.
  • the object is achieved by a method of generating light settings for a lighting unit of a lighting system based on video content, the method comprising: analyzing one or more images of the video content, recognizing, in the one or more analyzed images, a light effect that originates from a lighting device, determining, based on the light effect, a light setting for the lighting unit, controlling the lighting unit according to the light setting.
  • the one or more images of the video content are analyzed to determine if there are any (virtual) light effects that originate from a (virtual) lighting device (a lamp) in the video content. If such a light effect is recognized, a light setting is determined based on that light effect, and the lighting unit is controlled based on that light setting. The light setting may be determined such that it resembles the light effect. So instead of (only) extracting colors from predetermined areas in the one or more images, light effects that originate from lighting devices in the video are used to control the lighting unit(s) of the lighting system. This is beneficial, because the illumination in the room wherein the display is located corresponds to the illumination in the video. As a result, the video experience becomes more immersive.
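Purely as an illustration of the analyze/recognize/determine/control steps described above, the pipeline can be sketched in Python. The brightness-threshold heuristic, the function names, and the toy two-by-two "frame" are all assumptions for the sketch, not the claimed algorithm:

```python
def recognize_light_effect(image, threshold=200):
    """Return pixels bright enough to plausibly belong to a light effect.

    This simple threshold stands in for the (unspecified) recognition step.
    """
    return [(x, y, rgb)
            for y, row in enumerate(image)
            for x, rgb in enumerate(row)
            if max(rgb) >= threshold]

def determine_light_setting(effect_pixels):
    """Derive a light setting (average RGB) from the recognized effect."""
    n = len(effect_pixels)
    return tuple(sum(p[2][c] for p in effect_pixels) // n for c in range(3))

def control_lighting_unit(setting):
    """Stand-in for sending a lighting control command to the lighting unit."""
    return {"command": "set_color", "rgb": setting}

# A 2x2 'frame': one warm bright pixel (a lamp) among dark background pixels.
frame = [[(250, 220, 120), (10, 10, 10)],
         [(12, 11, 9),     (14, 13, 10)]]
effect = recognize_light_effect(frame)
setting = determine_light_setting(effect)
command = control_lighting_unit(setting)
```

In this toy frame only the lamp pixel passes the threshold, so the resulting light setting simply resembles that pixel's warm color.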
  • the lighting device may be located in the one or more images, or the lighting device may be located “outside” the one or more images. In the latter case, the light effect itself may indicate that it originates from a lighting device, which may be determined by analyzing the one or more images.
  • the step of recognizing the light effect that originates from the lighting device may comprise: recognizing, in the one or more analyzed images, a first light effect that originates from a first lighting device, and recognizing, in the one or more analyzed images, a second light effect that originates from a second lighting device.
  • the method may further comprise: determining a first likelihood that the first light effect originates from the first lighting device, determining a second likelihood that the second light effect originates from the second lighting device, and selecting the light effect based on the first and second likelihoods.
  • two light effects originating from different lighting devices in the video content may be recognized, and the likelihood (probability) that each recognized light effect does originate from its respective lighting device is determined.
  • one of the light effects is selected (as the light effect for which the light setting is determined). It may occur that multiple light effects are detected in the one or more images, and by assigning likelihoods to these light effects indicating that they (actually) originate from lighting devices may help in selecting which light effect to use to control the lighting unit. A light effect with a higher likelihood may be selected over a light effect with a lower likelihood.
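The likelihood-based selection described above amounts to picking the recognized effect with the highest probability of actually originating from a lighting device. A minimal sketch, with invented names and likelihood values:

```python
# Each recognized effect carries a likelihood (assumed values for illustration)
# that it really originates from a lighting device.
effects = [
    {"name": "first light effect",  "likelihood": 0.9},  # source fully visible
    {"name": "second light effect", "likelihood": 0.4},  # source only partly visible
]

# The light effect with the higher likelihood is selected over the other.
selected = max(effects, key=lambda e: e["likelihood"])
```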
  • the lighting system may comprise a plurality of lighting units, and the step of recognizing the light effect that originates from the lighting device may comprise: recognizing, in the one or more analyzed images, a plurality of light effects that originate from respective lighting devices.
  • the method may further comprise: determining respective likelihoods that the respective light effects originate from the respective lighting devices, and, if the number of recognized light effects is higher than the number of lighting units comprised in the lighting system, selecting a subset of light effects based on their respective likelihoods, determining respective light settings for the subset of light effects and controlling the plurality of lighting units according to the respective light settings. It may occur that the number of lighting units (to be controlled based on light effects originating from lighting devices in the video content) in the lighting system is lower than the number of light effects recognized in the one or more images of the video content. A subset of light effects may therefore be selected based on the likelihoods (probabilities) that these are light effects originating from lighting devices. Light effects with a higher likelihood may be selected over light effects with a lower likelihood.
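The subset selection above can be sketched as a sort-and-truncate over the likelihoods; the data and the scheme are assumptions for illustration, not the claimed implementation:

```python
# Three effects were recognized (assumed likelihoods), but only two lighting
# units are available, so the top-likelihood subset is kept.
recognized = [("effect A", 0.9), ("effect B", 0.6), ("effect C", 0.2)]
num_lighting_units = 2

if len(recognized) > num_lighting_units:
    # More effects than units: keep the most likely ones.
    subset = sorted(recognized, key=lambda e: e[1], reverse=True)[:num_lighting_units]
else:
    subset = recognized
```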
  • the light setting may be determined by: analyzing an area of the one or more images, the area comprising the light effect, extracting one or more colors from the area, and determining the light setting based on the one or more colors.
  • the method may comprise: extracting one or more second colors from a second area in the one or more images, the second area not comprising a light effect originating from a lighting device, determining, based on the extracted one or more second colors, a second light setting, associating a first priority value with the light setting and a second priority value with the second light setting, wherein the first priority value is higher than the second priority value, and selecting the light setting for the lighting unit based on the higher first priority value.
  • the method may comprise: determining that a second area does not comprise a light effect originating from a lighting device.
  • areas that do not comprise a light effect originating from a lighting device may be analyzed for extraction of colors (and therewith light settings) but may be assigned a lower priority value compared to area(s) which comprise light effect(s) originating from lighting device(s).
  • a light setting with a higher priority may be selected over a light setting with a lower priority, and the lighting unit may therefore be controlled according to the light setting with the higher priority (i.e. the light setting based on the light effect originating from the lighting device).
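The priority mechanism can be illustrated as follows; the numeric priority values and RGB colors are made up for the example:

```python
# A setting derived from a recognized light effect outranks one derived from
# an arbitrary image area (priority values are assumed for illustration).
first_setting  = {"rgb": (250, 220, 120), "priority": 2}  # from the light effect
second_setting = {"rgb": (90, 140, 210),  "priority": 1}  # e.g. a sky-colored area

# The light setting with the higher priority is selected for the lighting unit.
chosen = max([first_setting, second_setting], key=lambda s: s["priority"])
```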
  • the light setting may be determined by identifying a type of the lighting device from which the light effect originates based on the one or more analyzed images, and determining the light setting based on the identified type of the lighting device.
  • by identifying the type of the lighting device (e.g. based on the light effect), the light setting can be determined such that it better resembles the light effect in the video content. This is beneficial, because the illumination in the room corresponds better to the illumination in the video. Consequently, the video experience becomes more immersive.
  • the type of the lighting device may be indicative of a color temperature of light emitted by that type of lighting device, and the light setting may be determined based on the color temperature.
  • Different types of lighting devices may have light sources with typical color temperatures. For instance, the color temperature of a fluorescent tube is different from the color temperature of a filament lamp. Hence, it is beneficial to determine the type of lighting device from which the light effect originates.
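As a sketch of such a type-to-color-temperature mapping: the 4100 K and 2700 K values match the fluorescent-tube and filament-lamp examples given in the description, but the lookup table itself and the fallback value are assumptions:

```python
# Typical color temperatures per lighting device type (values per the
# description's examples; the table itself is an illustrative assumption).
COLOR_TEMPERATURE_K = {
    "fluorescent tube": 4100,
    "filament lamp": 2700,
}

def light_setting_for_type(device_type, default_k=3000):
    """Pick a color temperature based on the identified lighting device type."""
    return {"cct_k": COLOR_TEMPERATURE_K.get(device_type, default_k)}
```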
  • the lighting system may comprise a plurality of lighting units of different types, and wherein the method may comprise: selecting the lighting unit from the plurality of lighting units based on the identified type of the lighting device and based on the types of the lighting units.
  • the lighting units may be of different types and have different light/color rendering properties, and the lighting unit may be selected such that its light/color rendering properties match the light effect provided by the identified type of the lighting device.
  • the lighting system may comprise a plurality of lighting units.
  • the method may further comprise: determining a virtual location of the lighting device relative to a display, obtaining locations of the plurality of lighting units relative to the display, determining which of the locations of the plurality of lighting units corresponds to the virtual location of the lighting device to select the lighting unit from the plurality of lighting units based thereon.
  • the lighting unit may be selected such that its location corresponds to the location of the lighting device in the video content.
  • the lighting device may be located in the one or more images, or the lighting device may be located “outside” the one or more images, and by analyzing the light effect of the lighting device in the one or more images, the location of the lighting device may be determined.
  • the method may therefore further comprise: determining, based on the analyzed one or more images, if the lighting device is located in the one or more images or not in the one or more images, and if the lighting device is located in the one or more images, determining the virtual location based on the location of the lighting device in the one or more images, if the lighting device is not located in the one or more images, estimating the virtual location by analyzing the light effect.
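A minimal sketch of matching the virtual location to the closest lighting unit; the display-relative coordinate system, the unit names, and the nearest-neighbour rule are assumptions for illustration:

```python
import math

def select_lighting_unit(virtual_location, unit_locations):
    """Return the unit whose location (relative to the display) is closest
    to the virtual location of the lighting device in the video content."""
    return min(unit_locations,
               key=lambda u: math.dist(virtual_location, unit_locations[u]))

# Assumed unit positions, left and right of the display (display at origin).
units = {"left lamp": (-1.0, 0.0), "right lamp": (1.0, 0.0)}

# A streetlight recognized on the right-hand side of the image:
chosen_unit = select_lighting_unit((0.8, 0.2), units)
```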
  • the object is achieved by a computer program product comprising computer program code to perform any of the above-mentioned methods when the computer program product is run on a processing unit of a computing device.
  • the object is achieved by a control system for generating light settings for a lighting unit of a lighting system based on video content, the control system comprising: an input configured to obtain the video content, a communication unit configured to communicate with the lighting unit, and one or more processors configured to: analyze one or more images of the video content, recognize, in the one or more analyzed images, a light effect that originates from a lighting device, determine, based on the light effect, a light setting for the lighting unit, and control the lighting unit according to the light setting.
  • Fig. 1 shows schematically an example of a system comprising a control system for generating light settings for a lighting unit of a lighting system based on video content;
  • Fig. 2 shows schematically an example of recognizing a light effect that originates from a lighting device outside an image;
  • Fig. 3 shows schematically an example of recognizing multiple light effects that originate from different lighting devices in an image;
  • Fig. 4 shows schematically an example of recognizing a light effect that originates from a lighting device in an image;
  • Fig. 5 shows schematically an example of a system comprising a plurality of lighting units;
  • Fig. 6 shows schematically a method of generating light settings for a lighting unit of a lighting system based on video content.
  • Fig. 1 shows an example of a lighting system 100 comprising a control system 102, a display 120 and a lighting unit 110.
  • the control system 102 is configured to generate light settings for the lighting unit 110 based on video content 122, which may be rendered on the display 120.
  • the control system comprises an input 104 configured to obtain the video content 122, a communication unit 108 configured to communicate with the lighting unit 110 and one or more processors 106 configured to: analyze one or more images of the video content 122, recognize, in the one or more analyzed images, a light effect 126 that originates from a lighting device 124, determine, based on the light effect 126, a light setting for the lighting unit 110, and control the lighting unit 110 according to the light setting.
  • the lighting unit 110 may comprise one or more (LED) light sources.
  • the lighting unit is configured to illuminate a space.
  • the lighting unit 110 may be a light bulb, a light strip, a TLED, light tiles, etc.
  • the lighting unit may be an individually controllable light source of a luminaire (e.g. an LED strip).
  • the lighting unit 110 may comprise a control unit, such as a microcontroller (not shown), for controlling the light output generated by the one or more light sources based on received lighting control commands (which may be based on the light setting, which may be received from the control system 102 via the communication unit 108).
  • the lighting unit 110 may comprise a receiver configured to (wirelessly) receive the lighting control commands, for instance via Bluetooth, Zigbee, Wi-Fi, Thread, etc.
  • a lighting control command (defining the light setting) may comprise lighting control instructions for controlling the light output, such as the color, intensity, saturation, beam size, beam shape, etc. of the one or more light sources.
  • the display 120 is configured to render the video content 122.
  • the display 120 may be any type of display, for example a tv, a mobile phone display, a projector, a pc/laptop display, etc. configured to render the video content 122.
  • the control system 102 may be any device configured to generate light setting for the lighting unit 110 based on the video content 122.
  • the control system 102 may be comprised in a lighting control device such as a smartphone, a bridge, a hub, etc., be comprised in the display 120, be located on a remote (cloud) server, be comprised in a device connected to the display (e.g. via an audio/video interface such as HDMI), etc.
  • the control system 102 may be distributed across a plurality of devices.
  • the control system 102 may, for example, comprise a first processor and a second processor.
  • the first processor may be comprised in a first device (e.g. a remote (cloud) server, a video analysis device, etc.), and be configured to analyze the one or more images of the video content and recognize, in the one or more analyzed images, the light effect that originates from a lighting device.
  • the first processor may then communicate information indicative of the recognized light effect to a second device comprising the second processor (e.g. a lighting control device such as a bridge or a smartphone), which may determine, based on the light effect, a light setting for the lighting unit, and control the lighting unit 110 according to the light setting.
  • the first processor (which may be located at a remote (cloud) server) may be configured to analyze the one or more images of the video content and recognize, in the one or more analyzed images, the light effect that originates from a lighting device and determine, based on the light effect, a light setting for the lighting unit.
  • the second processor (which may be comprised in a lighting control device such as a bridge or a smartphone) may then receive the light setting from the first processor and control the lighting unit 110 according to the light setting.
  • the control system 102 is configured to control the lighting unit 110 based on one or more images of the video content 122 by analyzing the one or more images of the video content, and by determining light settings for the lighting unit 110 based on the analyzed images.
  • the one or more processors 106 may, for example, be configured to analyze an area of the one or more images, extract one or more colors from the area and determine the light setting based on the one or more colors. Such image analysis techniques for extracting light settings from video content are known in the art and will therefore not be discussed in detail.
  • the control system 102 comprises an input 104 to obtain the video content 122.
  • the input 104 may for example be an input of the one or more processors 106 or a receiver configured to obtain the video content 122.
  • the control system 102 may be comprised in the display 120, and the input 104 may therefore have access to the video content 122 that is being/to be rendered on the display 120.
  • the control system 102 may be comprised in an external device connected to the display via an audio/video interface (e.g. via an (HDMI) cable), and the input 104 may be configured to obtain the video content 122 via the audio/video interface.
  • the input may be a receiver configured to wirelessly receive the video content.
  • the control system 102 may further comprise an output configured to communicate the video content 122 to the display 120, or the control system 102 may be configured to obtain the video content 122 substantially simultaneously with the display 120. It should be understood that these are mere examples of how the one or more processors 106 may gain access to the video content, and that the skilled person is able to design alternatives without departing from the scope of the appended claims.
  • the control system 102 further comprises a communication unit 108 configured to communicate with the lighting unit 110.
  • the one or more processors 106 may control the lighting unit 110 by communicating control commands to the lighting unit 110.
  • the control commands are indicative of the light setting.
  • the control commands may be received by the lighting unit 110 directly from the communication unit 108 or indirectly (e.g. via a bridge or a hub), via one or more wireless networks (e.g. Wi-Fi, Zigbee, Bluetooth, Thread, Z-Wave, the internet, etc.).
  • the one or more processors 106 are configured to analyze one or more images of the video content 122 to recognize a light effect 126 in the one or more images, which light effect 126 originates from a lighting device 124.
  • the lighting device 124 is an artificial light source providing illumination to the scene in the video content.
  • the one or more processors 106 are configured to analyze the image (e.g. by applying image/feature/object recognition algorithms to the image) to recognize the presence of a lighting device 124 and/or its light effect 126.
  • the one or more processors 106 may be configured to apply deep learning algorithms to the video content to recognize features/objects in the video content.
  • the deep learning algorithm may have been trained with training data comprising a set of images of lighting devices and/or their respective light effects.
  • the lighting device 124 is a lamp post that generates the light effect 126.
  • the lighting device 124 is comprised in the one or more images.
  • the lighting device 224 is not comprised in the one or more images, but the lighting device 224 is located “outside” the one or more images. Hence, only the light effect 226, 228 of the lighting device 224 is visible in the one or more images.
  • the one or more processors 106 may be configured to analyze the one or more images and determine, based on the light effect 226, 228 recognized in the one or more images, that the light effect originates from a lighting device.
  • the one or more processors 106 may, for example, recognize that the light effect 226 at the top of the pole and/or the light effect 228 on the ground originates from a lighting device (e.g. based on a color of the light effect, based on a contrast between the light effect and its surroundings, etc.).
  • the one or more processors 106 may use further image analysis to recognize objects/features in the one or more images, and the presence of the objects/features in the one or more images may indicate that the light effect 226, 228 originates from a lighting device.
  • the one or more processors 106 may, for example, recognize the pole in the image and thereby conclude that the light effect 226 at the top of the pole and/or the light effect 228 on the ground originates from a lighting device.
  • the one or more processors 106 are further configured to determine, based on the light effect 126, the light setting for the lighting unit 110.
  • the one or more processors 106 may, for example, be configured to analyze an area of the one or more images, the area comprising the light effect and extract one or more colors from the area to determine the light setting based on the one or more colors.
  • the one or more processors 106 may, for example, take an average color of the colors of pixels in the area. Referring to the example in Fig. 1, the one or more processors 106 may analyze the light emitting area of the lighting device 124 and extract for example a warm yellow color from that area. Referring to the example in Fig. 2, the one or more processors 106 may analyze a first area at the top of the pole and/or a second area on the ground and extract for example a white color from the first/second area.
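The averaging step described above can be sketched as follows; the area coordinates and the warm-white pixel values are invented for the example:

```python
def average_color(image, x0, y0, x1, y1):
    """Average the RGB values of the pixels inside the given area
    (x0/y0 inclusive, x1/y1 exclusive)."""
    pixels = [image[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) // n for c in range(3))

# A 2x2 area containing a warm-white light effect (assumed pixel values):
image = [[(255, 240, 200), (245, 236, 196)],
         [(251, 238, 198), (249, 238, 198)]]
setting = average_color(image, 0, 0, 2, 2)
```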
  • the one or more processors 106 may be configured to determine the light setting based on the type of lighting device 124.
  • the one or more processors 106 may be configured to identify a type of the lighting device 124 from which the light effect 126 originates based on the one or more analyzed images, and determine the light setting based on the identified type of the lighting device.
  • Different types of lighting devices may have light sources with typical color temperatures. For instance, the color temperature of a fluorescent tube is different from the color temperature of a filament lamp.
  • if the one or more processors 106 determine that the lighting device 124 is a fluorescent tube, a light setting with a higher color temperature (e.g. 4100 K) may be selected, whereas if the lighting device 124 were a filament lamp, a light setting with a lower color temperature (e.g. 2700 K) may be selected.
  • the lighting device may be identified as a stage lighting fixture providing a dynamic light effect (e.g. based on the analysis of a plurality of images), and the light setting may be determined such that it has a high level of dynamics.
  • the type of lighting device may be indicative of its beam shape (e.g. a spotlight typically has a narrower beam compared to, for example, a wall washer), and the light setting may define a beam setting that corresponds to the beam of light of the identified lighting device.
  • the lighting unit 110 may comprise means to control the beam shape/size/direction based on such a light setting.
  • the one or more processors 106 may be further configured to select the lighting unit from a plurality of lighting units of the lighting system based on the identified type of the lighting device and based on the types of the lighting units.
  • the lighting units may be of different types and have different light/color rendering properties, and the lighting unit may be selected such that its light/color/beam properties match the light effect provided by the identified type of the lighting device.
  • the one or more processors 106 are further configured to control the lighting unit 110 according to the light setting.
  • the one or more processors 106 may communicate a lighting control command via the communication unit 108 to the lighting unit 110.
  • the one or more processors 106 may be further configured to recognize, in the one or more analyzed images, a first light effect that originates from a first lighting device and a second light effect that originates from a second lighting device. This has been illustrated in Fig. 3, wherein the one or more processors 106 may recognize a first light effect 326 that originates from a first lighting device 324 and a second light effect 336 that originates from a second lighting device 334. The one or more processors 106 may be further configured to determine a first likelihood that the first light effect 326 originates from the first lighting device 324 and determine a second likelihood that the second light effect 336 originates from the second lighting device 334. The first and second likelihoods may be determined based on features recognized in the one or more images.
  • the one or more processors 106 may be configured to use deep learning algorithms to recognize the light effect. Such deep learning algorithms typically have a likelihood associated with whether it is a light effect or not (e.g. a brighter area in the image).
  • the one or more processors 106 may, for example, recognize the lamp post of lighting device 324 and recognize only a part of the lamp post of lighting device 334, and thereby determine a higher likelihood that the first light effect 326 originates from the first lighting device 324 than that the second light effect 336 originates from the second lighting device 334.
  • the one or more processors 106 may recognize a first light effect 426 that originates from a first lighting device 424 and a second light effect 436 that may originate from a second lighting device (not shown). In this example, the one or more processors 106 may, for example, recognize the lamp post of lighting device 424 and recognize that, due to the size of the light effect 436, it is unlikely that this light effect originates from a lighting device, and thereby determine a higher likelihood that the first light effect 426 originates from the first lighting device 424 than that the second light effect 436 originates from a second lighting device. The one or more processors 106 may be further configured to select the light effect (for the lighting unit 110) based on the first and second likelihoods. Referring to the example of Fig. 3, the one or more processors 106 may select the first light effect 326 due to the higher likelihood, and determine the light setting for the lighting unit 110 based on that light effect 326. Referring to the example of Fig. 4, the one or more processors 106 may select the first light effect 426 due to the higher likelihood, and determine the light setting for the lighting unit 110 based on that light effect 426.
  • the lighting system may comprise a plurality of lighting units.
  • the one or more processors 106 may be configured to recognize a plurality of light effects that originate from respective lighting devices and determine respective likelihoods that the respective light effects originate from the respective lighting devices (as described above).
  • the one or more processors 106 may be further configured to select a subset of light effects based on their respective likelihoods if the number of recognized light effects is higher than the number of lighting units comprised in the lighting system, determine respective light settings for the subset of light effects, and control the plurality of lighting units according to the respective light settings.
  • the lighting system may, for example, comprise two lighting units (not shown).
  • the one or more processors 106 may recognize three light effects 326, 336, 346 in the video content of Fig. 3.
  • the one or more processors 106 may, for example, determine a high likelihood that the first light effect 326 originates from the first lighting device 324, determine a medium likelihood that the second light effect 336 originates from the second lighting device 334 and determine a lower likelihood that the third light effect 346 originates from the third lighting device 344. Based on these likelihoods, the one or more processors 106 may select a subset of light effects comprising the first and second light effects, determine a first light setting based on the first light effect 326, determine a second light setting based on the second light effect 336, and control the first lighting unit according to the first light setting and the second lighting unit according to the second light setting.
  • the one or more processors 106 may be configured to map a light effect with a higher likelihood to a higher number of lighting units compared to a light effect with a lower likelihood.
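One possible scheme for mapping a more likely light effect to more lighting units is proportional allocation; the scheme, the function, and the example numbers are assumptions for illustration:

```python
def assign_units(effects, num_units):
    """effects: list of (name, likelihood); returns {name: unit_count}.

    Units are distributed roughly in proportion to the likelihoods, so a
    light effect with a higher likelihood drives more lighting units.
    """
    total = sum(l for _, l in effects)
    counts = {name: int(num_units * l / total) for name, l in effects}
    # Give any units left over by the integer rounding to the most likely effect.
    remaining = num_units - sum(counts.values())
    best = max(effects, key=lambda e: e[1])[0]
    counts[best] += remaining
    return counts

assignment = assign_units([("effect A", 0.75), ("effect B", 0.25)], 4)
```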
  • the one or more processors 106 may be configured to determine the light setting by analyzing an area of the one or more images, the area comprising the light effect, extracting one or more colors from the area, and determining the light setting based on the one or more extracted colors. For example, referring to Fig. 4, the one or more processors 106 may extract one or more colors from an area (indicated with the dotted line) defining the light effect 426, 428, and determine the light setting based thereon (for example an average color of the colors of the pixels in the area).
  • the one or more processors 106 may be further configured to extract one or more second colors from a second area in the one or more images, the second area not comprising a light effect originating from a lighting device and determine, based on the extracted one or more second colors, a second light setting.
  • the one or more processors 106 may associate a first priority value with the (first) light setting and a second priority value with the second light setting.
  • the first priority value may be higher than the second priority value, because the (first) light setting is based on a light effect originating from a lighting device and the second light setting is not.
  • the one or more processors 106 may then select the (first) light setting for the lighting unit 110 based on the higher first priority value.
  • referring again to Fig. 4, the one or more processors 106 may extract one or more colors from a second area 446 (indicated with the dotted line), and determine the second light setting based thereon, for example an average color of the pixels in the area 446 (e.g. a blue color of the sky).
  • the one or more processors 106 may associate a first priority value with the (first) light setting (which is based on the area defining the light effect 426) and a second priority value with the second light setting (which is based on the second area 446).
  • the first priority value may be higher than the second priority value, because the (first) light setting is based on a light effect originating from a lighting device 424 and the second light setting is not. Based on these priority levels, the one or more processors 106 may select the first light setting for the lighting unit 110.
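The priority-based selection above can be sketched as follows; the priority values and setting contents are assumed example values, not part of the disclosure.

```python
# Illustrative sketch: give a setting derived from a recognized
# lighting-device effect a higher priority than one derived from an
# arbitrary image area, and pick the highest-priority setting.

def select_setting(candidates):
    """candidates: list of (priority, setting); the highest priority wins."""
    return max(candidates, key=lambda c: c[0])[1]

first_setting = {"color": (230, 140, 0)}    # from the light effect area
second_setting = {"color": (80, 130, 220)}  # from the sky area 446
candidates = [(2, first_setting), (1, second_setting)]
chosen = select_setting(candidates)
print(chosen)  # {'color': (230, 140, 0)}
```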
  • the lighting system may comprise a plurality of lighting units.
  • the one or more processors 106 may be further configured to determine a virtual location of the lighting device 126 relative to the display 120, obtain locations of the plurality of lighting units relative to the display 120 and determine which of the locations of the plurality of lighting units corresponds to the virtual location of the lighting device, and select the lighting unit from the plurality of lighting units based thereon.
  • the locations of the plurality of lighting units may be obtained from an (indoor) positioning system, for instance an RF-based positioning system, a coded light positioning system, a camera-based positioning system, etc.
  • the locations of the plurality of lighting units may be defined by a user via a user interface, wherein the user may provide information about the locations of the lighting units, for instance by positioning virtual counterparts of the lighting units on a map of a space wherein the lighting units are located.
  • Techniques for determining locations of devices and users in an area are known in the art and will therefore not be discussed in detail.
  • Fig. 5 shows an example of a space comprising a first lighting unit 510 and a second lighting unit 512.
  • the one or more processors 106 may determine the location of the lighting device 524 in the video content and therewith relative to the display 520, and select a lighting unit based on the locations of the lighting units 510, 512 relative to the display 520.
  • the one or more processors 106 may select the first lighting unit 510, because its location relative to the display 520 is closer to the location of the lighting device 524 relative to the display 520 compared to the second lighting unit 512.
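The location-matching step above can be sketched as follows; the display-relative coordinates are assumed example values, not part of the disclosure.

```python
# Illustrative sketch: choose the lighting unit whose position relative
# to the display is closest to the lighting device's virtual location.
import math

def closest_unit(units, virtual_location):
    return min(units, key=lambda u: math.dist(u["pos"], virtual_location))

units = [
    {"id": 510, "pos": (-1.0, 0.5)},  # to the left of the display
    {"id": 512, "pos": (1.2, 0.5)},   # to the right of the display
]
# Lighting device 524 appears on the left side of the screen.
virtual_location = (-0.8, 0.4)
print(closest_unit(units, virtual_location)["id"])  # 510
```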
  • in the example of Fig. 5, the lighting device 524 is located in the one or more images. It may, however, occur that the lighting device from which the light effect originates is not located in the one or more images (see for instance the example of Fig. 2).
  • the one or more processors 106 may be further configured to determine, based on the analyzed one or more images, if the lighting device is located in the one or more images or not in the one or more images.
  • if the lighting device is located in the one or more images (as shown in Fig. 5), the one or more processors 106 may determine the virtual location based on the location of the lighting device in the one or more images. If the one or more processors 106 determine that the lighting device is not located in the one or more images (as shown in the example of Fig. 2), the one or more processors 106 may estimate the virtual location by analyzing the light effect. The one or more processors 106 may, for example, use image analysis to recognize objects/features in the one or more images.
  • the one or more processors 106 may, for example, recognize the pole in the image and thereby conclude that the light effect 226 on the top of the pole and/or the light effect 228 on the ground originates from a lighting device.
  • the one or more processors may estimate the beam shape of the light effect (indicated with the diagonal dotted lines), and determine that the lighting device is located at a virtual location where the diagonal lines converge.
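The beam-convergence estimate above can be sketched as a line intersection; the edge coordinates below are assumed example values, not part of the disclosure.

```python
# Illustrative sketch: estimate the luminaire's virtual location as the
# intersection of the two straight beam edges (the diagonal dotted lines).

def line_intersection(p1, p2, p3, p4):
    """Intersection of line p1-p2 with line p3-p4 (assumed non-parallel)."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

# Left and right edges of a light cone on the ground, converging upward
# toward the (unseen) luminaire at the top of the pole.
left_edge = ((0.0, 0.0), (4.0, 8.0))
right_edge = ((10.0, 0.0), (6.0, 8.0))
print(line_intersection(*left_edge, *right_edge))  # (5.0, 10.0)
```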
  • Fig. 6 shows a method 600 of generating light settings for a lighting unit of a lighting system based on video content.
  • the method 600 comprises: analyzing 602 one or more images of the video content, recognizing 604, in the one or more analyzed images, a light effect that originates from a lighting device, determining 606, based on the light effect, a light setting for the lighting unit, and controlling 608 the lighting unit according to the light setting.
  • the method 600 may be executed by computer program code of a computer program product when the computer program product is run on a processing unit of a computing device, such as the one or more processors 106 of the control system 102.
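The four steps of method 600 can be sketched end to end as follows. The helper functions are hypothetical stand-ins for real image analysis and lighting control; they are not part of the disclosure.

```python
# Minimal sketch of the analyze / recognize / determine / control pipeline.

def recognize_light_effect(frame):
    """Placeholder: treat the first sufficiently bright pixel as the effect."""
    bright = [p for row in frame for p in row if sum(p) > 600]
    return bright[0] if bright else None

def determine_light_setting(effect):
    return {"color": effect}

def run_method_600(video_frames, control):
    settings = []
    for frame in video_frames:                     # analyze the images
        effect = recognize_light_effect(frame)     # recognize a light effect
        if effect is not None:
            setting = determine_light_setting(effect)  # determine a setting
            control(setting)                       # control the lighting unit
            settings.append(setting)
    return settings

frames = [[[(250, 250, 250)], [(10, 10, 10)]]]  # one frame, two pixels
out = run_method_600(frames, control=lambda setting: None)
print(out)  # [{'color': (250, 250, 250)}]
```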
  • any reference signs placed between parentheses shall not be construed as limiting the claim.
  • Use of the verb “comprise” and its conjugations does not exclude the presence of elements or steps other than those stated in a claim.
  • the article “a” or “an” preceding an element does not exclude the presence of a plurality of such elements.
  • the invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer or processing unit. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
  • aspects of the invention may be implemented in a computer program product, which may be a collection of computer program instructions stored on a computer readable storage device which may be executed by a computer.
  • the instructions of the present invention may be in any interpretable or executable code mechanism, including but not limited to scripts, interpretable programs, dynamic link libraries (DLLs) or Java classes.
  • the instructions can be provided as complete executable programs, partial executable programs, as modifications to existing programs (e.g. updates) or extensions for existing programs (e.g. plugins).
  • parts of the processing of the present invention may be distributed over multiple computers or processors or even the ‘cloud’.
  • Storage media suitable for storing computer program instructions include all forms of nonvolatile memory, including but not limited to EPROM, EEPROM and flash memory devices, magnetic disks such as the internal and external hard disk drives, removable disks and CD-ROM disks.
  • the computer program product may be distributed on such a storage medium, or may be offered for download through HTTP, FTP, email or through a server connected to a network such as the Internet.

Abstract

A method of generating light settings for a lighting unit of a lighting system based on video content is disclosed. The method comprises: analyzing one or more images of the video content, recognizing, in the one or more analyzed images, a light effect that originates from a lighting device, determining, based on the light effect, a light setting for the lighting unit, and controlling the lighting unit according to the light setting.
PCT/EP2022/056531 2021-03-19 2022-03-14 Generating light settings for a lighting unit based on video content WO2022194773A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163163233P 2021-03-19 2021-03-19
US63/163,233 2021-03-19
EP21165429 2021-03-29
EP21165429.8 2021-03-29

Publications (1)

Publication Number Publication Date
WO2022194773A1 true WO2022194773A1 (fr) 2022-09-22

Family

ID=81074214

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/056531 WO2022194773A1 (fr) 2022-03-14 2021-03-19 Generating light settings for a lighting unit based on video content

Country Status (1)

Country Link
WO (1) WO2022194773A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018073052A1 (fr) * 2016-10-18 2018-04-26 Philips Lighting Holding B.V. Lighting control
US20200245435A1 (en) * 2017-10-16 2020-07-30 Signify Holding B.V. A method and controller for controlling a plurality of lighting devices
WO2020169382A1 (fr) * 2019-02-18 2020-08-27 Signify Holding B.V. Controller for controlling light sources and a method thereof
WO2020249502A1 (fr) * 2019-06-14 2020-12-17 Signify Holding B.V. Method for controlling a plurality of lighting units of a lighting system

Similar Documents

Publication Publication Date Title
  • JP5676264B2 (ja) Lighting management system with automatic identification of lighting effects usable in a home entertainment system
US10120267B2 (en) System and method for re-configuring a lighting arrangement
  • JP6421279B1 (ja) Generation of lighting scenes
US11234312B2 (en) Method and controller for controlling a plurality of lighting devices
US20120287334A1 (en) Method of Controlling a Video-Lighting System
US20150355829A1 (en) Enabling a user to control coded light sources
US20170141847A1 (en) High-dynamic-range coded light detection
KR101317240B1 (ko) 무대용 스마트 조명 자동 조절 장치
US11612042B2 (en) Method and a controller for configuring a replacement lighting device in a lighting system
  • WO2022194773A1 (fr) Generating light settings for a lighting unit based on video content
US20210232301A1 (en) A method and a lighting control device for controlling a plurality of lighting devices
US20200374998A1 (en) A lighting control system for controlling a plurality of light sources based on a source image and a method thereof
US20230319963A1 (en) A controller for mapping a light scene onto a plurality of lighting units and a method thereof
US20220151039A1 (en) A controller for controlling light sources and a method thereof
US20200126264A1 (en) A system for rendering virtual objects and a method thereof
US20220151046A1 (en) Enhancing a user's recognition of a light scene
US20220312558A1 (en) A controller for controlling a plurality of lighting units of a lighting system and a method thereof
US20230045111A1 (en) A controller for generating light settings for a plurality of lighting units and a method thereof
  • WO2023232558A1 (fr) Controller for controlling a plurality of lighting units in a space and a method thereof
  • WO2023274700A1 (fr) Controller for controlling a plurality of lighting devices based on media content and a method thereof
  • JP7126507B2 (ja) Controller and method for indicating the presence of a virtual object via a lighting device
  • WO2024099856A1 (fr) Device for controlling a plurality of light sources whose color and/or brightness can be individually adjusted
  • WO2023202981A1 (fr) Control of a reorientable lighting device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22714201

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22714201

Country of ref document: EP

Kind code of ref document: A1