US20140265878A1 - Coded light detector - Google Patents

Coded light detector

Info

Publication number
US20140265878A1
Authority
US
United States
Prior art keywords
light
region
remote control
control unit
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US14/351,153
Other versions
US9232610B2
Inventor
Tommaso Gritti
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Signify Holding BV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Priority to US14/351,153
Assigned to KONINKLIJKE PHILIPS ELECTRONICS N.V. (assignment of assignors interest; assignor: GRITTI, TOMMASO)
Publication of US20140265878A1
Application granted
Publication of US9232610B2
Assigned to PHILIPS LIGHTING HOLDING B.V. (assignment of assignors interest; assignor: KONINKLIJKE PHILIPS N.V.)
Assigned to SIGNIFY HOLDING B.V. (change of name; previously PHILIPS LIGHTING HOLDING B.V.)
Status: Active

Classifications

    • H05B37/02
    • H: ELECTRICITY
    • H05: ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B: ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00: Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10: Controlling the light source
    • H05B47/105: Controlling the light source in response to determined parameters
    • H05B47/115: Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B47/125: Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/20: Analysis of motion
    • G06T7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G06T7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74: Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04B: TRANSMISSION
    • H04B10/00: Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/11: Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
    • H04B10/114: Indoor or close-range type systems
    • H04B10/116: Visible light communication
    • H: ELECTRICITY
    • H05: ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B: ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00: Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10: Controlling the light source
    • H05B47/165: Controlling the light source following a pre-assigned programmed sequence; Logic control [LC]
    • H: ELECTRICITY
    • H05: ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B: ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00: Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10: Controlling the light source
    • H05B47/175: Controlling the light source by remote control
    • H05B47/19: Controlling the light source by remote control via wireless transmission
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00: Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40: Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • the present invention relates to the field of coded light and in particular to a remote control unit and a method for controlling a set of light sources emitting coded light.
  • a first example involves wall-mounted control.
  • a set of wall-mounted controls are installed, each of them controlling an individual or group of light sources or luminaires, possibly with optimized controls for each type of control within the set.
  • a second example involves having a separate remote control unit for each individual light source or luminaire. This may be regarded, by means of the remote control unit, as a more or less straightforward extension of the above disclosed wall switch control.
  • a third example involves iterative selection of the individual light sources or luminaires.
  • a user is provided with a simple remote control unit capable of controlling all light sources or luminaires which the remote control unit has been commissioned with.
  • the remote control unit is able to control a single luminaire at a time, but it may also allow a user to browse through all the luminaires, for example by manipulation of a user interface provided on the remote control unit (e.g. by using “previous” or “next” buttons).
  • a digital version of such a concept has also been developed, which adopts a touch screen device as a remote control unit, so that, once a light source or luminaire is selected, a light control optimized for such a light source or luminaire is displayed to the user (e.g. color temperature for a light source or luminaire with tunable white; or a color wheel for an RGB light) by means of the touch screen on the remote control unit.
  • a fourth example involves the concept of point and control; this approach exploits the principle of coded light and a remote control unit capable of detecting the code of the light source or luminaire toward which the remote control unit is pointed and thereby to identify the light source or luminaire emitting the coded light.
  • a remote control unit typically comprises one or more photodiodes for detecting the coded light emitted by the light source or luminaire.
  • coded light has been proposed to enable advanced control of light sources. Coded light is based on the embedding of data, inter alia invisible identifiers, in the light output of the light sources. Coded light may thus be defined as the embedding of data and identifiers in the light output of a visible light source, for instance by applying CDMA modulation techniques, wherein the embedded data and/or identifier preferably do not influence the primary lighting function of the light source.
  • Coded light may be used in communications applications wherein one or more light sources in a coded lighting system are configured to emit coded light and thereby communicate information to a receiver.
  • the point and control approach shows the advantage of using coded light as a means to be able to select a luminaire by simply pointing towards it.
  • this approach employs a photodiode in order to detect the Coded Light message of each luminaire. It has been proposed to detect and decode coded light by means of a standard camera.
  • WO 2009/010926 relates to a method for processing light in a structure where the light sources emit light carrying individual codes.
  • a camera is arranged in a camera position of the structure and registers images of spots of the light.
  • WO 2009/010926 is based on an insight that by using a camera for registering images of the light emitted from the light sources after installation thereof, and recognizing the individual codes in the registered images, it is possible to obtain a fast and at least substantially automatic determination of light source properties.
  • the camera may comprise an image detector comprising a matrix of detector elements each generating one pixel of the registered image.
  • the camera registers images of illuminated areas at a frequency that corresponds to, or is adapted to, the modulation frequency of CDMA modulation. Thereby it is possible for the camera to generate images that capture different CDMA codes of the different illuminated areas.
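  • as an illustration of this prior-art principle, the following sketch (Python with NumPy; all names are hypothetical, and one code chip per registered image with synchronized, orthogonal codes is assumed) correlates the frame-integrated brightness of an image sequence with each candidate CDMA code to decide which codes are present:

```python
import numpy as np

def detect_cdma_codes(frames, codes, threshold=0.3):
    """frames: (T, H, W) array, one registered image per code chip.
    codes: dict mapping light-source id -> +/-1 chip sequence of length T.
    Returns the ids whose normalized correlation with the per-frame
    brightness signal exceeds the threshold."""
    signal = frames.reshape(frames.shape[0], -1).mean(axis=1)
    signal = signal - signal.mean()  # remove the un-modulated illumination part
    detected = []
    for source_id, code in codes.items():
        c = np.asarray(code, dtype=float)
        corr = np.dot(signal, c) / (np.linalg.norm(signal) * np.linalg.norm(c) + 1e-12)
        if abs(corr) > threshold:
            detected.append(source_id)
    return detected
```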
  • the inventors of the enclosed embodiments have identified a number of disadvantages with the above noted first, second, third and fourth examples. For example, it may be time consuming because the user needs to either browse through all light sources or luminaires (as in the third example), or point the remote control unit toward each individual light source or luminaire (as in the fourth example).
  • the above disclosed arrangements of light sources or luminaires and remote control units are not scalable.
  • browsing through each light (as in the third example), or carrying along an individual remote control unit for each light (as in the second example) can be a tedious and error prone process.
  • a remote control unit for controlling a set of light sources, comprising an image sensor arranged to capture at least one image and to detect coded light in the at least one image; a processing unit arranged to determine a region and/or object in an image captured by the image sensor; associate, by virtue of the detected coded light, the determined region and/or object with a set of light sources emitting the detected coded light, each light source having one or more light settings, wherein the region and/or object in the image is illuminated at least by the set of light sources; receive an input signal relating to updated light settings of the set of light sources; and a transmitter arranged to transmit a control signal corresponding to the updated light settings to the set of light sources.
  • Such a remote control unit may advantageously shorten the time needed to determine settings for a set of light sources in a coded lighting system, since it does not require the remote control unit to receive coded light from the light sources one at a time.
  • the disclosed remote control unit advantageously also scales with the number of light sources in the system since the functionality of the disclosed remote control unit is independent of the number of light sources in the system.
  • the disclosed approach allows a user to focus on the desired light effects, regardless of the number and location of light sources. An example of this would be a user interface which presents to the user settings for the overall light effect affecting the selected area/object, instead of a specific user interface for each individual light source influencing the area/object. This type of interface may be preferred for quick settings, while an interface allowing the user to individually set the parameters of each light source affecting the area would be preferred when accuracy and full control of the light settings are necessary.
  • the processing unit may be further arranged to receive at least two images from the image sensor and to detect a difference between the at least two images; and determine the region and/or object in one of the two images from the difference.
  • the remote control unit may thus advantageously identify the object and/or region from captured images.
  • the processing unit may be further arranged to receive user input identifying an area in the image; and from the identified area determine an outline of the object and/or region so as to determine the object and/or region.
  • the remote control unit may thus advantageously identify the object and/or region from user input.
  • the remote control unit may further comprise a user interface arranged to provide the processing unit with the user input upon user interaction with the user interface.
  • the user interface may advantageously comprise a touch sensitive display which thus provides for easy identification of the region and/or object by the user of the remote control unit.
  • the image sensor may further be arranged to capture a plurality of images so as to form a stack of images from the plurality of images; determine the sum over all pixels per image in the stack of images to generate a conventional one-dimensional signal to determine which codes are present in a scene represented by the stack of images; and determine, from the stack of images, a footprint of the light source by correlating time dependence of all pixels with a code associated with the light source.
  • a light sensor embodied as a standard camera with a beam splitter and a photodiode may be able to maintain an overview of the environment, by means of the camera, and at the same time be able to detect codes at extremely high accuracy and speed, by means of the photodiode.
  • the processing unit may further be arranged to transmit identification of the region and/or object to the user interface. This may advantageously provide the user with feedback regarding the identified region and/or object.
  • the user interface may be arranged to provide a snapshot of the image comprising the object and/or region together with an indication of the object and/or region. This may advantageously further improve the feedback provided to the user.
  • the user interface may be arranged to provide an indication relating to which light source or light sources affect illumination of the object and/or region. This may advantageously provide the user with feedback regarding the light sources.
  • the user interface may be arranged to provide light settings available for the set of light sources. This may advantageously provide the user of the remote control unit the possibility to change one or more light settings of the light source(s) affecting the identified region and/or object.
  • the processing unit may be further arranged to identify the region and/or object by segmenting image foreground information from image background information and to identify the region and/or object by virtue of the image foreground information. This may advantageously enable improved identification of the region and/or object.
  • the processing unit may be further arranged to identify the region and/or object by performing motion detection and/or estimation between the two images in order to detect an object and/or region that has moved between the two images. This may advantageously enable improved identification of the region and/or object.
  • the processing unit may be further arranged to identify the region and/or object by performing depth segmentation so as to obtain an image depth measurement from which the region and/or object is identified. This may advantageously enable improved identification of the region and/or object.
  • the processing unit may be further arranged to identify the region and/or object by performing object detection. This may advantageously enable improved identification of the region and/or object.
  • the objective is achieved by an arrangement comprising a remote control unit according to the above and at least one luminaire controllable by the remote control unit and comprising at least one light source from the set of light sources.
  • the objective is achieved by a method for controlling a set of light sources, comprising capturing, by an image sensor, at least one image and detecting coded light in the at least one image; determining, by a processing unit, a region and/or object in an image captured by the image sensor; associating, by the processing unit, by virtue of the detected coded light, the determined region and/or object with a set of light sources emitting the detected coded light, each light source having one or more light settings, wherein the region and/or object in the image is illuminated at least by the set of light sources; receiving, by the processing unit, an input signal relating to updated light settings of the set of light sources; and transmitting, by a transmitter, a control signal corresponding to the updated light settings to the set of light sources.
  • FIGS. 1 a and 1 b illustrate lighting systems according to embodiments
  • FIG. 2 illustrates a remote control unit
  • FIGS. 3 a, 3 b, 3 c and 4 illustrate images as captured by an image sensor
  • FIG. 5 illustrates an example of a user interface of a remote control unit or an arrangement
  • FIGS. 6 a and 6 b are flowcharts according to embodiments.
  • the lighting systems 1 a and 1 b of FIGS. 1 a and 1 b comprise at least one light source arranged to emit coded light, schematically denoted by light sources with reference numerals 2 a, 2 b, 2 c.
  • the at least one light source 2 a - c may be a luminaire and/or be part of a lighting control system.
  • the lighting systems 1 a and 1 b may thus be denoted as coded lighting systems.
  • a luminaire may comprise at least one light source 2 a - c.
  • the term “light source” means a device that is used for providing light in a room, for purpose of illuminating objects in the room.
  • a room is in this context typically an apartment room or an office room, a gym hall, an indoor retail environment, a theatre scene, a room in a public place or a part of an outdoor environment, such as a part of a street.
  • Each light source 2 a - c is capable of emitting coded light, as schematically illustrated by arrows 3 a, 3 b, 3 c.
  • the emitted light thus comprises a modulated part associated with coded light comprising information sequences.
  • the emitted light may also comprise an un-modulated part associated with an illumination contribution.
  • Each light source 2 a - c may be associated with a number of light (or lighting) settings, inter alia pertaining to the illumination contribution of the light source, such as color, color temperature, intensity and frequency of the emitted light.
  • the illumination contribution of the light source may be defined as a time-averaged output of the light emitted by the light source 2 a - c.
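  • as a minimal illustration of such a light output (a sketch under assumed parameters, not the patented modulation scheme), the snippet below modulates a dimming level with a balanced ±1 code at small depth; because the code averages to zero, the time-averaged output, i.e. the illumination contribution, equals the set level:

```python
import numpy as np

def modulated_output(level, code, depth=0.05):
    """level: dimming level (the time-averaged illumination contribution).
    code: balanced +/-1 chip sequence (equal numbers of +1 and -1 chips).
    depth: modulation depth, kept small so the modulation stays invisible."""
    return level * (1.0 + depth * np.asarray(code, dtype=float))

code = np.array([+1, -1, +1, -1, -1, +1, -1, +1])  # balanced example code
out = modulated_output(0.8, code)
assert np.isclose(out.mean(), 0.8)  # the time average equals the set level
```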
  • the system 1 a further comprises a device termed a remote control unit 4 arranged to receive and detect the coded light emitted by the light sources in the system 1 a.
  • the remote control unit 4 therefore comprises an image sensor 5 for detecting the light emitted by the light source(s) in the system 1 a by capturing images comprising coded light.
  • the remote control unit 4 further comprises a processing unit 6 operatively coupled to the image sensor 5 .
  • the processing unit 6 analyzes images captured by the image sensor 5 and identifies changes in the scene defined by the captured images, and particularly objects which have been moved or inserted in the scene.
  • the remote control unit 4 further comprises a transmitter 7 operatively coupled to the processing unit.
  • the transmitter 7 is arranged to transmit data, as schematically illustrated by arrows 8 a, 8 b to one or more of the light sources in the system 1 a.
  • the remote control unit 4 may further comprise other components, such as a memory 9 operatively coupled to the processing unit 6 and a user interface 16 also operatively coupled to the processing unit 6 .
  • the remote control unit 4 may be part of a mobile phone and the herein disclosed functionality may be provided as one or more applications, so-called “Apps”.
  • the one or more applications may be stored as one or more software products stored on a computer-readable storage medium.
  • the system 1 b comprises an arrangement 20 .
  • the arrangement 20 comprises a number of physically separated (but operatively connectible) devices which, when operationally connected (or coupled), enable the arrangement 20 to receive and detect the coded light emitted by the light sources in the system 1 b.
  • the arrangement 20 therefore comprises an image sensor 5 for detecting the light emitted by the light source(s) in the system 1 b by capturing images comprising coded light.
  • the arrangement 20 further comprises a processing unit 6 operatively coupled to the image sensor 5 .
  • the processing unit 6 analyzes images captured by the image sensor 5 and identifies changes in the scene defined by the captured images, and particularly objects which have been moved or inserted in the scene.
  • the arrangement 20 further comprises a transmitter 7 operatively coupled to the processing unit 6 .
  • the transmitter 7 is arranged to transmit data, as schematically illustrated by arrows 8 a, 8 b to one or more of the light sources in the system 1 b.
  • the arrangement 20 may further comprise other components, such as a memory 9 operatively coupled to the processing unit 6 and a user interface 16 also operatively coupled to the processing unit 6 .
  • the arrangement 20 may be advantageous in scenarios in which the image sensor 5 is either fixedly mounted at the ceiling of the space to be illuminated by the light sources 2 a - c, or positioned at a certain location while an operator of the lighting system 1 b is free to move around and change light settings while being in the scene himself/herself. The change of lighting settings will be further explained below.
  • arranging the image sensor 5 fixed at the location enables the operator to move around freely in the area to be illuminated whilst interacting with the user interface 16 (which thus may be provided in a device which is physically different from the device holding the image sensor 5 ), whereby images of the scene to be illuminated could be wirelessly transmitted from the device holding the image sensor 5 to the device holding the user interface 16 .
  • the processing unit 6 may be provided either in the device holding the image sensor 5 , in the device holding the user interface 16 , or in a third device.
  • the remote control unit 4 of FIG. 1 a represents an embodiment according to which, in particular, the image sensor 5 , the processing unit 6 , and the transmitter 7 are part of one and the same physical device (i.e. the remote control unit 4 ).
  • the arrangement 20 of FIG. 1 b represents an embodiment according to which the image sensor 5 , the processing unit 6 , and the transmitter 7 are not necessarily part of one and the same device, but instead provided as two or more physically separated, but operatively coupled, devices.
  • FIG. 2 schematically illustrates, in terms of a number of functional blocks, the remote control unit 4 , the components of which may also be part of the arrangement 20 .
  • the remote control unit 4 (and hence also the arrangement 20 ) comprises an image sensor 5 for receiving coded light from at least one light source, such as the light sources 2 a - c in the lighting systems 1 a, 1 b.
  • the image sensor 5 captures at least one image.
  • the image sensor 5 detects coded light in the at least one image. How coded light is detected in the at least one image will be further disclosed below.
  • the image sensor 5 is further arranged to capture images within a field of view 10 a - 10 b along a general light detection direction.
  • the image sensor 5 is thus able to receive and detect light, particularly coded light, within the field of view 10 a - 10 b.
  • the general light detection direction can be changed by changing the direction of the image sensor 5 .
  • the field of view 10 a - 10 b may be narrowed by the remote control unit 4 performing a zoom-in operation.
  • the field of view 10 a - 10 b may be broadened by the remote control unit 4 performing a zoom-out operation.
  • the image sensor 5 is enabled to capture images in a plurality of different directions and with a plurality of different fields of view.
  • the remote control unit 4 (and hence also the arrangement 20 ) may thereby be arranged to identify an individual lighting device 2 a - 2 c from the group of said at least one light source 2 a - 2 c.
  • the image sensor 5 may be able to detect the physical direction from which the detected light is emanating. These physical directions are in FIGS. 1 a and 1 b schematically denoted by arrows 3 a - 3 c, which indicate the light emanating from the light source 2 a - 2 c.
  • the individual light source 2 a - 2 c may be identified by said lighting device identification codes, which, as discussed above, may be embedded in the emitted light contributions of the light source 2 a - 2 c.
  • since each individual light source 2 a - 2 c is associated with a unique lighting device identification code, each individual light source 2 a - 2 c may be identified.
  • the image sensor 5 may in a step S 12 first capture a plurality of images so as to form a stack of images from the plurality of images (downsampled if no higher resolution is needed).
  • the image sensor 5 may in a step S 14 determine the sum over all pixels per image in the stack of images to generate a conventional one-dimensional signal (in time) to determine which codes (and with which phase/time delay) are present in the scene represented by the stack of images.
  • the image sensor 5 may then in a step S 16 use the stack of images to determine footprints of every light source 2 a - 2 c by correlating the time dependence of all pixels with the corresponding code (and phase, from the previous step).
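  • a minimal sketch of steps S 12 -S 16 follows (Python with NumPy; synchronized capture with one code chip per image is assumed, and the function names are illustrative only): the per-frame pixel sum gives the one-dimensional signal used to find which codes are present and with which phase, after which correlating every pixel's time dependence with the phase-aligned code yields the footprint of the corresponding light source:

```python
import numpy as np

def find_code_phase(signal, code):
    """Return the cyclic shift of the code that best matches the signal."""
    corrs = [np.dot(signal, np.roll(code, s)) for s in range(len(code))]
    return int(np.argmax(np.abs(corrs)))

def footprints(stack, codes, threshold=0.1):
    """stack: (T, H, W) stack of images; codes: id -> +/-1 sequence of length T."""
    T, H, W = stack.shape
    pixels = stack.reshape(T, -1).astype(float)
    pixels -= pixels.mean(axis=0)             # keep only the modulated part
    global_signal = pixels.sum(axis=1)        # step S 14: sum over all pixels
    result = {}
    for source_id, code in codes.items():
        code = np.asarray(code, dtype=float)
        phase = find_code_phase(global_signal, code)   # code presence and phase
        aligned = np.roll(code, phase)
        per_pixel = pixels.T @ aligned / T             # step S 16: per-pixel correlation
        result[source_id] = (per_pixel > threshold).reshape(H, W)
    return result
```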
  • the remote control unit 4 (and hence also the arrangement 20 ) further comprises a processing unit 6 .
  • the processing unit 6 may be implemented by a so-called central processing unit (CPU).
  • the image sensor 5 of the remote control unit 4 detects the light emitted by the one or more light sources 2 a - 2 c (the light being within the field of view 10 a - 10 b of the image sensor 5 ).
  • the light comprises coded light defining a unique lighting device identification code.
  • such an identification code may be realized as a pulse width modulation code.
  • the identification code may be realized by using code division multiple access techniques. It is to be understood that other embodiments for the realization of identification codes are known to a person skilled in the art.
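  • purely by way of illustration, a toy pulse width modulation encoding of an identification code could look as follows (the bit and period lengths are invented for the example):

```python
def pwm_encode(identifier, bits=8, period=10, short_on=3, long_on=7):
    """Encode an integer identifier as a chip sequence in which each bit
    occupies one fixed-length period: a 0-bit becomes a short 'on' pulse
    and a 1-bit a long one, so the bits can be read back from pulse widths."""
    chips = []
    for i in reversed(range(bits)):
        on = long_on if (identifier >> i) & 1 else short_on
        chips += [1] * on + [0] * (period - on)
    return chips

print(pwm_encode(0b1010, bits=4))  # four periods with alternating pulse widths
```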
  • the remote control unit 4 may further comprise a user interface 16 through which a user is enabled to interact with the functionality of the remote control unit.
  • the user interface 16 may in particular be arranged to receive user input and to provide information to the user.
  • FIGS. 3 a, 3 b, 3 c and 4 illustrate possible embodiments of controlling a set of light sources emitting coded light using the disclosed remote control unit 4 (or the arrangement 20 ).
  • FIGS. 3 a, 3 b, 3 c and 4 commonly depict an environment with several coded light sources 2 a, 2 b, 2 c, 2 d as captured by an image sensor 5 of the remote control 4 (or the arrangement 20 ).
  • the image sensor 5 is used to detect the presence and location of the object 13 .
  • the region and/or object in the image is thus illuminated at least by the set of light sources 2 a - b emitting the coded light.
  • the region and/or object in the image may also be illuminated by additional light sources.
  • the image sensor 5 is able to detect (and thereafter select) the coded light sources which are mostly affecting the new object.
  • the light sources affecting the object 13 are those with reference numerals 2 a and 2 b.
  • Each one of the FIGS. 3 a, 3 b, 3 c and 4 will now be described in more detail.
  • FIG. 3 a illustrates a first image 11 a as captured by the image sensor 5 of the remote control unit 4 (or the arrangement 20 ).
  • the image 11 a may be presented to a user interface of the remote control unit 4 (or the arrangement 20 ).
  • the image 11 a depicts a scene comprising light sources 2 a - 2 c emitting coded light 12 a - 12 d.
  • FIG. 3 b illustrates a second image 11 b as captured by the image sensor 5 of the remote control unit 4 (or the arrangement 20 ).
  • the second image 11 b depicts the same scene as the image 11 a with the difference that an object 13 has been added to the image 11 b.
  • the added object 13 may be detected by the processing unit 6 .
  • the processing unit 6 is arranged to, in a step S 4 , determine a region and/or object in an image captured by the image sensor 5 .
  • the processing unit 6 may utilize one of a number of ways to detect the added object 13 .
  • the processing unit 6 is further arranged to in a step S 18 receive at least two images, such as the first image 11 a and the second image 11 b, from the image sensor 5 and to detect a difference between the at least two images 11 a, 11 b.
  • the difference is defined by the object 13 having been added in image 11 b.
  • the object 13 may thus be determined by the processing unit 6 , in a step S 20 , in one of the two images 11 a, 11 b from the difference between them.
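  • a minimal sketch of this differencing step (Python with OpenCV; the threshold and minimum area are illustrative choices, not values from the patent) returns the bounding box of the largest change between two captured images:

```python
import cv2

def changed_region(image_a, image_b, min_area=500):
    """Return the bounding box (x, y, w, h) of the largest change between
    two frames, or None if no sufficiently large change is found."""
    diff = cv2.absdiff(image_a, image_b)              # per-pixel difference
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 25, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)       # close small gaps
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    contours = [c for c in contours if cv2.contourArea(c) >= min_area]
    if not contours:
        return None
    return cv2.boundingRect(max(contours, key=cv2.contourArea))
```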
  • Background segmentation may be used alternatively or in combination therewith to determine the region and/or object 13 in one or more images captured by the image sensor 5 : in background segmentation a foreground segmentation mask is computed so as to separate the region and/or object from the background.
  • the processing unit 6 may further be arranged to in a step S 22 identify the region and/or object 13 by segmenting image foreground information from image background information and to identify the region and/or object 13 by virtue of the image foreground information.
  • Motion detection/estimation may, in a step S 24 , be used alternatively or in combination therewith by the processing unit 6 to determine the region and/or object 13 in one or more images captured by the image sensor 5 .
  • in motion detection/estimation a map of all areas of the image which have moved in the recent past is computed.
  • the processing unit 6 may thus in particular be arranged to detect an object and/or region that has moved between the two images.
  • Depth segmentation may, in a step S 26 , be used alternatively or in combination therewith by the processing unit 6 to determine the region and/or object 13 in one or more images captured by the image sensor 5 .
  • in depth segmentation a map of what has changed in the environment is obtained.
  • Recent advances in 3D optical ranging have shown how to obtain a depth measurement for each pixel of a regular camera.
  • the processing unit 6 may thus in particular be arranged to detect an object and/or region from an image depth measurement from which the region and/or object is identified.
  • Object detection may, in a step S 28 , be used alternatively or in combination therewith by the processing unit 6 to determine the region and/or object 13 in one or more images captured by the image sensor 5 .
  • the processing unit 6 could be trained to recognize only a specific category of objects, such as people, shoes, vegetables, etc., so that a region would be made available only if an object belonging to one of those categories appears in the scene.
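  • as one concrete (and merely illustrative) instance of such category-specific detection, OpenCV's bundled HOG person detector proposes regions only when a person appears in the scene:

```python
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_people(image):
    """Return bounding boxes of detected people; an empty list means no
    object of the trained category is present, so no region is offered."""
    boxes, _ = hog.detectMultiScale(image, winStride=(8, 8))
    return list(boxes)
```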
  • the added object 13 may alternatively be detected from user interaction.
  • the remote control unit 4 (and hence also the arrangement 20 ) may comprise a user interface 16 .
  • the user interface 16 advantageously comprises a touch sensitive display. Touch sensitive displays and their functions are as such known in the art.
  • the touch sensitive display may in particular be arranged to provide the processing unit 6 with the user input upon user interaction with the touch sensitive display.
  • the user interface may also be gaze based or based on tactile interaction.
  • the remote control unit 4 advantageously comprises a camera unit arranged to be directed towards one eye (or both eyes) of the user of the control unit 4 (or the user interface 16 of the arrangement 20 ) and where the processing unit 6 uses eye-tracking so as to determine the point of gaze of the user to determine an object and/or region in the displayed image.
  • the remote control unit may receive tactile user input, for example from a keyboard or a joystick provided on, or operatively coupled to, the remote control unit 4 (or the arrangement 20 ). As the skilled person understands, there may be other equally likely and equivalent ways of receiving user input.
  • FIG. 3 c illustrates a third image 11 c as captured by the image sensor 5 of the remote control unit 4 (or the arrangement 20 ).
  • the third image 11 c depicts the same scene as the image 11 b with the difference that a marker 14 marking the object 13 has also been added to the image 11 c.
  • a user of remote control unit 4 (or the arrangement 20 ) may be allowed to point to an object in one of the images captured by the image sensor 5 .
  • the processing unit may therefore further be arranged to, in a step S 30 , receive user input identifying an area in the image. The received user input may be acknowledged to the user by providing the marker 14 .
  • the complete object outline may then be automatically segmented exploiting computer vision techniques as disclosed above.
  • an outline of the object and/or region may be determined by the processing unit in a step S 32 so as to determine the object and/or region.
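  • one plausible realization of these steps (a sketch only; the patent does not prescribe a particular segmentation algorithm) seeds OpenCV's GrabCut with a rectangle around the tapped point and returns the segmented object mask:

```python
import cv2
import numpy as np

def outline_from_tap(image, tap_xy, half=60, iters=5):
    """Segment the object around a tapped point with GrabCut and return a
    binary mask approximating the object outline."""
    x, y = tap_xy
    h, w = image.shape[:2]
    x0, y0 = max(x - half, 0), max(y - half, 0)
    rect = (x0, y0, min(x + half, w - 1) - x0, min(y + half, h - 1) - y0)
    mask = np.zeros((h, w), np.uint8)
    bgd = np.zeros((1, 65), np.float64)
    fgd = np.zeros((1, 65), np.float64)
    cv2.grabCut(image, mask, rect, bgd, fgd, iters, cv2.GC_INIT_WITH_RECT)
    fg = (mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)
    return (fg * 255).astype(np.uint8)
```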
  • all the relevant codes, i.e. codes associated with light sources 2 a, 2 b at least partly illuminating the identified area and/or object 13 , can then be selected.
  • the user may, in a step S 34 , receive a snapshot of the image captured by the image sensor 5 .
  • the snapshot may be provided together with an indication, in a step S 36 , of which regions/objects have been selected.
  • the snapshot may be provided by the user interface 16 .
  • FIG. 4 illustrates such a snapshot image 11 d as captured by the image sensor 5 of the remote control unit 4 (or the arrangement 20 ).
  • the image 11 d depicts the same scene as the images 11 b and 11 c.
  • the image 11 d comprises an indication, in the form of a dotted line 15 , of the detected object 13 .
  • Information pertaining to the indication is thus originally generated by the processing unit 6 in response to detecting the object 13 and may then be transmitted, in a step S 38 , from the processing unit 6 to the user interface 16 so as to provide the indication to the user.
  • no image/indication is provided to the user.
  • the light sources 2 a, 2 b affecting the region/object may provide a visible indication, such as blinking one or more times, so that the user can recognize which region/object is currently selected.
  • the remote control unit 4 , upon detection of the region/object and identification of the light sources 2 a, 2 b affecting the region/object, transmits a control signal to the light sources 2 a, 2 b to provide the visible indication.
  • the processing unit 6 is further arranged to, in a step S 6 , associate the determined region and/or object with a set of light sources 2 a - b emitting the coded light, where the coded light has been detected by the image sensor 5 in step S 2 .
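  • a sketch of this association step, assuming per-source footprint masks are already available from the coded light detection (see the footprint sketch above; the overlap fraction is an invented tuning parameter):

```python
import numpy as np

def sources_for_region(region_mask, footprints, min_overlap=0.2):
    """region_mask: boolean (H, W) mask of the determined region/object.
    footprints: id -> boolean (H, W) footprint mask per light source.
    Returns the ids of the light sources whose footprint covers at least
    min_overlap of the region, i.e. the set to associate with it."""
    area = max(int(region_mask.sum()), 1)
    return [source_id for source_id, fp in footprints.items()
            if (fp & region_mask).sum() / area >= min_overlap]
```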
  • the user may additionally be presented with a user interface through which the user is enabled to change one or more applicable lighting settings or properties of the light sources 2 a, 2 b affecting the detected region/object, step S 40 . This is illustrated in FIG. 5 .
  • the processing unit 6 is in a step S 8 arranged to receive an input signal relating to updated light settings of the set of light sources 2 a, 2 b.
  • FIG. 5 shows an example of a user interface 16 of the remote control unit 4 .
  • the user interface 16 may be provided as a separate physical unit.
  • Controllable properties of the light may pertain to color of the emitted light, the color temperature of the emitted light, the intensity of the emitted light and/or the blinking frequency of the emitted light.
  • this is illustrated by a number of available settings 18 with which a user may interact by setting values of respective slidebars 19 .
  • the user interface thereby enables controlling the lights by setting the properties of the particular group of lights affecting the detected region/object.
  • the remote control unit 4 (and hence also the arrangement 20 ) further comprises a transmitter 7 .
  • the transmitter 7 is arranged to, in a step S 10 , transmit a control signal corresponding to the updated light settings to the set of light sources 2 a, 2 b.
  • the set of light sources 2 a, 2 b is thereby controlled by the remote control unit 4 .
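  • the patent leaves the wire format of the control signal open (coded light or radio, see below); purely as an illustration, a payload for the transmission step might be serialized like this (all field names are hypothetical):

```python
import json

def control_message(source_ids, settings):
    """Build a hypothetical control payload for the transmitter; the actual
    format is not specified by the patent."""
    return json.dumps({
        "targets": list(source_ids),              # e.g. ["2a", "2b"]
        "settings": settings,                     # e.g. {"intensity": 0.6}
    }).encode("utf-8")

payload = control_message(["2a", "2b"], {"intensity": 0.6, "color_temp": 3000})
```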
  • the remote control unit 4 (and hence also the arrangement 20 ) may further comprise other components, such as a memory 9 operatively coupled to the processing unit 6 .
  • the memory 9 is operated according to principles which as such are known by the skilled person. Particularly, the memory 9 may comprise a set of lighting settings which may be transmitted to light sources 2 a - 2 c in the lighting systems 1 a, 1 b.
  • the transmitter 7 may be a light transmitter configured to emit coded light. Alternatively the transmitter 7 may be a radio transmitter configured to wirelessly transmit information. The transmitter 7 may be configured for bidirectional communications.
  • the transmitter 7 may comprise a radio antenna. Alternatively the transmitter 7 may comprise a connector for wired communications.
  • the disclosed remote control unit 4 and at least one luminaire comprising at least one light source 2 a, 2 b, 2 c and being controllable by the remote control unit 4 may be provided as an arrangement.
  • the disclosed embodiments are applicable to several scenarios: in retail lighting, they allow shop personnel to easily select the luminaires affecting a new product inserted in a shop window; for a light designer, they make it possible to move to a certain location in the environment and automatically choose the lights which affect that position, without any need of a pointing device.

Abstract

A coded lighting system comprises a set of light sources and a remote control unit or an arrangement. The set of light sources emits coded light; to that end, each light source is associated with a unique identifier. The remote control unit or the arrangement comprises an image sensor which captures images comprising light emitted by at least one of the light sources in the set. By analyzing the captured images, the remote control unit or the arrangement is able to determine which light sources affect a particular region and/or object, and can thereby transmit a control signal comprising updated light settings to the set of light sources.

Description

    FIELD OF THE INVENTION
  • The present invention relates to the field of coded light and in particular to a remote control unit and a method for controlling a set of light sources emitting coded light.
  • BACKGROUND OF THE INVENTION
  • The advent of integrated lighting installations, consisting of an ever-growing number of individually controllable light sources, luminaires, lighting arrangements and the like with advanced rendering capabilities, may be regarded as transforming lighting systems for both professional and consumer markets. This brings a need for an intuitive control capable of fully exploiting the rendering capabilities of the complete lighting infrastructure.
  • Several approaches have been proposed to control light sources, luminaires, lighting arrangements and the like.
  • A first example involves wall-mounted control. At commissioning time a set of wall-mounted controls are installed, each of them controlling an individual or group of light sources or luminaires, possibly with optimized controls for each type of control within the set.
  • A second example involves having a separate remote control unit for each individual light source or luminaire. This may be regarded, by means of the remote control unit, as a more or less straightforward extension of the above disclosed wall switch control.
  • A third example involves iterative selection of the individual light sources or luminaires. A user is provided with a simple remote control unit capable of controlling all light sources or luminaires which the remote control unit has been commissioned with. The remote control unit is able to control a single luminaire at a time, but it may also allow a user to browse through all the luminaires, for example by manipulation of a user interface provided on the remote control unit (e.g. by using “previous” or “next” buttons). A digital version of such a concept has also been developed, which adopts a touch screen device as a remote control unit, so that, once a light source or luminaire is selected, a light control optimized for such a light source or luminaire is displayed to the user (e.g. color temperature for a light source or luminaire with tunable white; or a color wheel for an RGB light) by means of the touch screen on the remote control unit.
  • A fourth example involves the concept of point and control; this approach exploits the principle of coded light and a remote control unit capable of detecting the code of the light source or luminaire toward which the remote control unit is pointed, and thereby identifying the light source or luminaire emitting the coded light. Such a remote control unit typically comprises one or more photodiodes for detecting the coded light emitted by the light source or luminaire. In general, coded light has been proposed to enable advanced control of light sources. Coded light is based on the embedding of data, inter alia invisible identifiers, in the light output of the light sources. Coded light may thus be defined as the embedding of data and identifiers in the light output of a visible light source, for instance by applying CDMA modulation techniques, wherein the embedded data and/or identifier preferably do not influence the primary lighting function of the light source. Hence, any modulation of the emitted light pertaining to data and/or identifier should be invisible to humans. This allows for applications such as interactive scene setting, commissioning and re-commissioning of networked lighting systems. Coded light may be used in communications applications wherein one or more light sources in a coded lighting system are configured to emit coded light and thereby communicate information to a receiver.
  • The point and control approach shows the advantage of using coded light as a means to be able to select a luminaire by simply pointing towards it. As noted above, this approach employs a photodiode in order to detect the Coded Light message of each luminaire. It has been proposed to detect and decode coded light by means of a standard camera.
  • International application WO 2009/010926 relates to a method for processing light in a structure where the light sources emit light carrying individual codes. A camera is arranged in a camera position of the structure and registers images of spots of the light. WO 2009/010926 is based on an insight that by using a camera for registering images of the light emitted from the light sources after installation thereof, and recognizing the individual codes in the registered images, it is possible to obtain a fast and at least substantially automatic determination of light source properties. The camera may comprise an image detector comprising a matrix of detector elements each generating one pixel of the registered image. The camera registers images of illuminated areas at a frequency that corresponds to, or is adapted to, the modulation frequency of CDMA modulation. Thereby it is possible for the camera to generate images that capture different CDMA codes of the different illuminated areas.
  • SUMMARY OF THE INVENTION
  • The inventors of the enclosed embodiments have identified a number of disadvantages with the above noted first, second, third and fourth examples. For example, it may be time consuming because the user needs to either browse through all light sources or luminaires (as in the third example), or point the remote control unit toward each individual light source or luminaire (as in the fourth example).
  • For example, the above disclosed arrangements of light sources or luminaires and remote control units are not scalable. With an increasing number of light sources and luminaires in the arrangement, browsing through each light (as in the third example), or carrying along an individual remote control unit for each light (as in the second example) can be a tedious and error prone process.
  • For example, the above disclosed arrangements of light sources or luminaires and remote control units are not flexible. A wall-mounted switch would have to be modified or added for every configuration once a new type of setting, or a new subset of light sources or luminaires, needs to be defined (as in the first example).
  • It is an object of the present invention to overcome these problems, and to provide a remote control unit and a method for controlling a set of light sources emitting coded light that are less time consuming, scalable, and flexible without being complex or error prone.
  • According to a first aspect of the invention, this and other objects are achieved by a remote control unit for controlling a set of light sources, comprising an image sensor arranged to capture at least one image and to detect coded light in the at least one image; a processing unit arranged to determine a region and/or object in an image captured by the image sensor; associate, by virtue of the detected coded light, the determined region and/or object with a set of light sources emitting the detected coded light, each light source having one or more light settings, wherein the region and/or object in the image is illuminated at least by the set of light sources; receive an input signal relating to updated light settings of the set of light sources; and a transmitter arranged to transmit a control signal corresponding to the updated light settings to the set of light sources.
  • Such a remote control unit may advantageously shorten the time needed to determine settings for a set of light sources in a coded lighting system, since it does not require the remote control unit to receive coded light from the light sources one at a time. The disclosed remote control unit advantageously also scales with the number of light sources in the system since the functionality of the disclosed remote control unit is independent of the number of light sources in the system. Furthermore, the disclosed approach allows a user to focus on the desired light effects, regardless of the number and location of light sources. An example of this would be a user interface which presents to the user settings for the overall light effect affecting the selected area/object, instead of a specific user interface for each individual light source influencing the area/object. This type of interface may be preferred for quick settings, while an interface allowing the user to individually set the parameters of each light source affecting the area would be preferred when accuracy and full control of the light settings are necessary.
  • The processing unit may be further arranged to receive at least two images from the image sensor and to detect a difference between the at least two images; and determine the region and/or object in one of the two images from the difference. The remote control unit may thus advantageously identify the object and/or region from captured images.
  • The processing unit may be further arranged to receive user input identifying an area in the image; and from the identified area determine an outline of the object and/or region so as to determine the object and/or region. The remote control unit may thus advantageously identify the object and/or region from user input.
  • The remote control unit may further comprise a user interface arranged to provide the processing unit with the user input upon user interaction with the user interface. The user interface may advantageously comprise a touch sensitive display which thus provides for easy identification of the region and/or object by the user of the remote control unit.
  • The image sensor may further be arranged to capture a plurality of images so as to form a stack of images from the plurality of images; determine the sum over all pixels per image in the stack of images to generate a conventional one-dimensional signal to determine which codes are present in a scene represented by the stack of images; and determine, from the stack of images, a footprint of the light source by correlating time dependence of all pixels with a code associated with the light source.
  • This may be possible either by using synchronization of the light sensor with the set of light sources, or by exploiting the rolling shutter characteristics of the image sensor. A light sensor embodied as a standard camera with a beam splitter and a photodiode may be able to maintain an overview of the environment, by means of the camera, and at the same time be able to detect codes at extremely high accuracy and speed, by means of the photodiode.
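  • To sketch the rolling-shutter alternative (an illustration under strong assumptions, not the patented implementation): because a rolling shutter exposes successive image rows at successive instants, the per-row mean of a single frame already forms a short time series that can be correlated with a known code without multi-frame synchronization:

```python
import numpy as np

def rolling_shutter_correlation(frame, code):
    """frame: (H, W) grayscale image from a rolling-shutter sensor.
    code: +/-1 chip sequence; assumed to span the frame's row readout time.
    Returns the normalized correlation of the per-row signal with the code."""
    rows = frame.astype(float).mean(axis=1)   # one time sample per row
    rows -= rows.mean()                       # strip the DC illumination part
    chips = np.resize(np.asarray(code, dtype=float), rows.shape[0])
    return float(np.dot(rows, chips) /
                 (np.linalg.norm(rows) * np.linalg.norm(chips) + 1e-12))
```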
  • The processing unit may further be arranged to transmit identification of the region and/or object to the user interface. This may advantageously provide the user with feedback regarding the identified region and/or object.
  • The user interface may be arranged to provide a snapshot of the image comprising the object and/or region together with an indication of the object and/or region. This may advantageously further improve the feedback provided to the user.
  • The user interface may be arranged to provide an indication relating to which light source or light sources affect illumination of the object and/or region. This may advantageously provide the user with feedback regarding the light sources.
  • The user interface may be arranged to provide light settings available for the set of light sources. This may advantageously provide the user of the remote control unit the possibility to change one or more light settings of the light source(s) affecting the identified region and/or object.
  • The processing unit may be further arranged to identify the region and/or object by segmenting image foreground information from image background information and to identify the region and/or object by virtue of the image foreground information. This may advantageously enable improved identification of the region and/or object.
  • The processing unit may be further arranged to identify the region and/or object by performing motion detection and/or estimation between the two images in order to detect an object and/or region that has moved between the two images. This may advantageously enable improved identification of the region and/or object.
  • The processing unit may be further arranged to identify the region and/or object by performing depth segmentation so as to obtain an image depth measurement from which the region and/or object is identified. This may advantageously enable improved identification of the region and/or object.
  • The processing unit may be further arranged to identify the region and/or object by performing object detection. This may advantageously enable improved identification of the region and/or object.
  • According to a second aspect of the invention, the objective is achieved by an arrangement comprising a remote control unit according to the above and at least one luminaire controllable by the remote control unit and comprising at least one light source from the set of light sources.
  • According to a third aspect of the invention, the objective is achieved by a method for controlling a set of light sources, comprising capturing, by an image sensor, at least one image and detecting coded light in the at least one image; determining, by a processing unit, a region and/or object in an image captured by the image sensor; associating, by the processing unit, by virtue of the detected coded light, the determined region and/or object with a set of light sources emitting the detected coded light, each light source having one or more light settings, wherein the region and/or object in the image is illuminated at least by the set of light sources; receiving, by the processing unit, an input signal relating to updated light settings of the set of light sources; and transmitting, by a transmitter, a control signal corresponding to the updated light settings to the set of light sources.
  • It is noted that the invention relates to all possible combinations of features recited in the claims. Likewise, the advantages of the first aspect apply to the second aspect as well as the third aspect, and vice versa.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other aspects of the present invention will now be described in more detail, with reference to the appended drawings showing embodiment(s) of the invention.
  • FIGS. 1 a and 1 b illustrate lighting systems according to embodiments;
  • FIG. 2 illustrates a remote control unit;
  • FIGS. 3 a, 3 b, 3 c and 4 illustrate images as captured by an image sensor;
  • FIG. 5 illustrates an example of a user interface of a remote control unit or an arrangement; and
  • FIGS. 6 a and 6 b are flowcharts according to embodiments.
  • DETAILED DESCRIPTION
  • The below embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout.
  • Recent development, such as exemplified by International application WO 2009/010926, has shown the possibility to detect coded light with the use of standard cameras. Hereinafter, a system which exploits a similar principle, combined with computer vision algorithms to automatically detect which luminaires mostly influence an object which has been inserted, or recently moved, in the scene, will be described.
  • Operation of a lighting system will now be disclosed with reference to the lighting systems 1 a of FIG. 1 a and 1 b of FIG. 1 b, and the flowcharts of FIGS. 6 a and 6 b. The lighting systems 1 a and 1 b of FIGS. 1 a and 1 b comprise at least one light source arranged to emit coded light, schematically denoted by light sources with reference numerals 2 a, 2 b, 2 c. The at least one light source 2 a-c may be a luminaire and/or be part of a lighting control system. The lighting systems 1 a and 1 b may thus be denoted as coded lighting systems. A luminaire may comprise at least one light source 2 a-c. The term “light source” means a device that is used for providing light in a room, for the purpose of illuminating objects in the room. A room is in this context typically an apartment room or an office room, a gym hall, an indoor retail environment, a theatre scene, a room in a public place or a part of an outdoor environment, such as a part of a street. Each light source 2 a-c is capable of emitting coded light, as schematically illustrated by arrows 3 a, 3 b, 3 c. The emitted light thus comprises a modulated part associated with coded light comprising information sequences. The emitted light may also comprise an un-modulated part associated with an illumination contribution. Each light source 2 a-c may be associated with a number of light (or lighting) settings, inter alia pertaining to the illumination contribution of the light source, such as color, color temperature, intensity and frequency of the emitted light. In general terms the illumination contribution of the light source may be defined as a time-averaged output of the light emitted by the light source 2 a-c.
  • The system 1 a further comprises a device termed a remote control unit 4 arranged to receive and detect the coded light emitted by the light sources in the system 1 a. The remote control unit 4 therefore comprises an image sensor 5 for detecting the light emitted by the light source(s) in the system 1 a by capturing images comprising coded light.
  • The remote control unit 4 further comprises a processing unit 6 operatively coupled to the image sensor 5. The processing unit 6 analyzes images captured by the image sensor 5 and identifies changes in the scene defined by the captured images, and particularly objects which have been moved or inserted in the scene. The remote control unit 4 further comprises a transmitter 7 operatively coupled to the processing unit 6. The transmitter 7 is arranged to transmit data, as schematically illustrated by arrows 8 a, 8 b, to one or more of the light sources in the system 1 a. The remote control unit 4 may further comprise other components, such as a memory 9 operatively coupled to the processing unit 6 and a user interface 16 also operatively coupled to the processing unit 6. The remote control unit 4 may be part of a mobile phone and the herein disclosed functionality may be provided as one or more applications, so-called “Apps”. The one or more applications may be stored as one or more software products stored on a computer-readable storage medium.
  • In the alternative embodiment of FIG. 1 b, the system 1 b comprises an arrangement 20. The arrangement 20 comprises a number of physically separated (but operatively connectible) devices which, when operationally connected (or coupled), enable the arrangement 20 to receive and detect the coded light emitted by the light sources in the system 1 b. The arrangement 20 therefore comprises an image sensor 5 for detecting the light emitted by the light source(s) in the system 1 b by capturing images comprising coded light. The arrangement 20 further comprises a processing unit 6 operatively coupled to the image sensor 5. The processing unit 6 analyzes images captured by the image sensor 5 and identifies changes in the scene defined by the captured images, and particularly objects which have been moved or inserted in the scene. The arrangement 20 further comprises a transmitter 7 operatively coupled to the processing unit 6. The transmitter 7 is arranged to transmit data, as schematically illustrated by arrows 8 a, 8 b, to one or more of the light sources in the system 1 b. The arrangement 20 may further comprise other components, such as a memory 9 operatively coupled to the processing unit 6 and a user interface 16 also operatively coupled to the processing unit 6. The arrangement 20 may be advantageous in scenarios in which the image sensor 5 is either mounted fixedly to the ceiling of the space to be illuminated by the light sources 2 a-c, or positioned at a certain location while an operator of the lighting system 1 b is free to move around and change light settings while being in the scene himself/herself. The change of lighting settings will be further explained below. As an example, arranging the image sensor 5 at a fixed location enables the operator to move around freely in the area to be illuminated whilst interacting with the user interface 16 (which thus may be provided in a device which is physically different from the device holding the image sensor 5), whereby images of the scene to be illuminated could be wirelessly transmitted from the device holding the image sensor 5 to the device holding the user interface 16. This may also relieve the operator from having to point at the scene to be illuminated while changing the light settings. The processing unit 6 may be provided either in the device holding the image sensor 5, in the device holding the user interface 16, or in a third device.
  • Thus, whereas the remote control unit 4 of FIG. 1 a represents an embodiment according to which, in particular, the image sensor 5, the processing unit 6, and the transmitter 7 are part of one and the same physical device (i.e. the remote control unit 4), the arrangement 20 of FIG. 1 b represents an embodiment according to which the image sensor 5, the processing unit 6, and the transmitter 7 are not necessarily part of one and the same device, but are instead provided as two or more physically separated, but operatively coupled, devices.
  • FIG. 2 schematically illustrates, in terms of a number of functional blocks, the remote control unit 4, the components of which may also be part of the arrangement 20. The remote control unit 4 (and hence also the arrangement 20) comprises an image sensor 5 for receiving coded light from at least one light source, such as the light sources 2 a-c in the lighting systems 1 a, 1 b. In a step S2 the image sensor 5 captures at least one image. The image sensor 5 then detects coded light in the at least one image. How coded light is detected in the at least one image will be further disclosed below. The image sensor 5 is further arranged to capture images within a field of view 10 a-10 b along a general light detection direction. The image sensor 5 is thus able to receive and detect light, particularly coded light, within the field of view 10 a-10 b. The general light detection direction can be changed by changing the direction of the image sensor 5. The field of view 10 a-10 b may be narrowed by the remote control unit 4 performing a zoom-in operation. Similarly, the field of view 10 a-10 b may be broadened by the remote control unit 4 performing a zoom-out operation. Thereby the image sensor 5 is enabled to capture images in a plurality of different directions and with a plurality of different fields of view. The remote control unit 4 (and hence also the arrangement 20) may thereby be arranged to identify an individual light source 2 a-2 c from the group of said at least one light source 2 a-2 c. For example, the image sensor 5 may be able to detect the physical direction from which the detected light is emanating. These physical directions are in FIGS. 1 a and 1 b schematically denoted by arrows 3 a-3 c, which indicate the light emanating from the light sources 2 a-2 c. As a second example, the individual light source 2 a-2 c may be identified by said lighting device identification codes, which, as discussed above, may be embedded in the emitted light contributions of the light sources 2 a-2 c.
  • Since each individual light source 2 a-2 c is associated with a unique lighting device identification code, each individual light source 2 a-2 c may be identified. In order to do so the image sensor 5 may in a step S12 first capture a plurality of images so as to form a stack of images from the plurality of images (down-sampled if no higher resolution is needed).
  • The image sensor 5 may in a step S14 determine the sum over all pixels per image in the stack of images to generate a conventional one-dimensional signal (in time) in order to determine which codes (and with which phase/time delay) are present in the scene represented by the stack of images. The image sensor 5 may then in a step S16 use the stack of images to determine the footprint of every light source 2 a-2 c by correlating the time dependence of all pixels with the corresponding code (and phase, from the previous step). A sketch of these steps is given below.
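  • As an illustration of steps S12-S16, the following minimal Python/NumPy sketch estimates which codes are present and the per-pixel footprint of every light source from a stack of frames. It assumes codes given as ±1 chip sequences whose length equals the stack depth; the function name detect_footprints, the detection threshold and the normalization are illustrative assumptions, not part of the disclosed method.

```python
import numpy as np

def detect_footprints(stack, codes, threshold=0.2):
    """Estimate which codes are present in a stack of frames (step S14)
    and the per-pixel footprint of each detected source (step S16).

    stack: (T, H, W) array of T frames; codes: dict mapping a lighting
    device identification code id to a (T,) array of +/-1 chips.
    """
    T, H, W = stack.shape
    # Step S14: sum over all pixels per image -> one-dimensional signal in time.
    signal = stack.reshape(T, -1).sum(axis=1)
    signal = signal - signal.mean()
    detected = {}
    for cid, code in codes.items():
        # Try every cyclic shift to recover the code's phase/time delay.
        scores = [float(signal @ np.roll(code, s)) for s in range(T)]
        best = int(np.argmax(np.abs(scores)))
        if abs(scores[best]) > threshold * np.linalg.norm(signal) * np.linalg.norm(code):
            detected[cid] = best
    # Step S16: correlate every pixel's time series with each detected code.
    pixels = stack.reshape(T, -1)
    pixels = pixels - pixels.mean(axis=0)
    footprints = {}
    for cid, phase in detected.items():
        aligned = np.roll(codes[cid], phase)
        footprints[cid] = (aligned @ pixels / T).reshape(H, W)
    return footprints
```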
  • The remote control unit 4 (and hence also the arrangement 20) further comprises a processing unit 6. The processing unit 6 may be implemented by a so-called central processing unit (CPU). In the typical scenarios of FIGS. 1 a and 1 b, the image sensor 5 of the remote control unit 4 (or the arrangement 20) detects the light emitted by the one or more light sources 2 a-2 c (the light being within the field of view 10 a-10 b of the image sensor 5). The light comprises coded light defining a unique lighting device identification code. For example such an identification code may be realized as a pulse width modulation code. As a second example the identification code may be realized by using code division multiple access techniques. It is to be understood that other embodiments for the realization of identification codes are known to a person skilled in the art.
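  • Purely by way of illustration, orthogonal Walsh-Hadamard sequences are one well-known family of codes that could realize such code division multiple access identification; the embodiments do not prescribe a particular code family, so the sketch below rests on that assumption.

```python
import numpy as np

def walsh_codes(n):
    """Build 2**n mutually orthogonal Walsh-Hadamard codes of +/-1 chips,
    one possible realization of per-luminaire identification codes."""
    H = np.array([[1.0]])
    for _ in range(n):
        H = np.block([[H, H], [H, -H]])
    return H  # each row is one code

codes = walsh_codes(3)  # 8 codes, 8 chips each
assert np.allclose(codes @ codes.T, 8 * np.eye(8))  # orthogonality check
```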
  • As will be further elaborated upon below the remote control unit 4 (and hence also the arrangement 20) may further comprise a user interface 16 through which a user is enabled to interact with the functionality of the remote control unit. Thus the user interface 16 may in particular be arranged to receive user input and to provide information to the user.
  • FIGS. 3 a, 3 b, 3 c and 4 illustrate possible embodiments of controlling a set of light sources emitting coded light using the disclosed remote control unit 4 (or the arrangement 20). FIGS. 3 a, 3 b, 3 c and 4 commonly depict an environment with several coded light sources 2 a, 2 b, 2 c, 2 d as captured by an image sensor 5 of the remote control unit 4 (or the arrangement 20). Once a new object 13 is placed in the scene, the image sensor 5 is used to detect the presence and location of the object 13. The region and/or object in the image is thus illuminated at least by the set of light sources 2 a-b emitting the coded light. The region and/or object in the image may also be illuminated by additional light sources. Exploiting the principles described above for identifying individual light sources, the image sensor 5 is able to detect (and thereafter select) the coded light sources which most strongly affect the new object. In the example of FIGS. 3 a, 3 b, 3 c and 4 the light sources affecting the object 13 are those with reference numerals 2 a and 2 b. Each one of the FIGS. 3 a, 3 b, 3 c and 4 will now be described in more detail.
  • FIG. 3 a illustrates a first image 11 a as captured by the image sensor 5 of the remote control unit 4 (or the arrangement 20). The image 11 a may be presented on a user interface of the remote control unit 4 (or the arrangement 20). The image 11 a depicts a scene comprising light sources 2 a-2 d emitting coded light 12 a-12 d.
  • FIG. 3 b illustrates a second image 11 b as captured by the image sensor 5 of the remote control unit 4 (or the arrangement 20). In comparison to the first image 11 a of FIG. 3 a the second image 11 b depicts the same scene as the image 11 a with the difference that an object 13 has been added to the image 11 b. The added object 13 may be detected by the processing unit 6. Particularly the processing unit 6 is arranged to, in a step S4, determine a region and/or object in an image captured by the image sensor 5. The processing unit 6 may utilize one of a number of ways to detect the added object 13.
  • According to an embodiment the processing unit 6 is further arranged to, in a step S18, receive at least two images, such as the first image 11 a and the second image 11 b, from the image sensor 5 and to detect a difference between the at least two images 11 a, 11 b. In the exemplary scenario of the first image 11 a and the second image 11 b of FIGS. 3 a and 3 b respectively, the difference is defined by the object 13 having been added in image 11 b. The object 13 may thus be determined by the processing unit 6, in a step S20, in one of the two images 11 a, 11 b from the difference therebetween; a sketch of this is given below. Background segmentation may be used alternatively or in combination therewith to determine the region and/or object 13 in one or more images captured by the image sensor 5. In background segmentation a foreground segmentation mask is computed so as to separate the region and/or object from the background. Particularly, the processing unit 6 may further be arranged to, in a step S22, identify the region and/or object 13 by segmenting image foreground information from image background information and to identify the region and/or object 13 by virtue of the image foreground information.
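  • A minimal sketch of steps S18-S20, assuming OpenCV and two aligned captured frames; the blur kernel, binarization threshold and minimum contour area are illustrative choices.

```python
import cv2

def detect_added_object(first_image, second_image, min_area=500):
    """Detect the region that differs between two captured images (steps S18-S20).

    Returns the bounding box (x, y, w, h) of the largest changed region,
    e.g. the added object 13, or None if no change is large enough.
    """
    gray_a = cv2.cvtColor(first_image, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(second_image, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray_a, gray_b)
    diff = cv2.GaussianBlur(diff, (5, 5), 0)
    _, changed = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(changed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    large = [c for c in contours if cv2.contourArea(c) > min_area]
    if not large:
        return None
    return cv2.boundingRect(max(large, key=cv2.contourArea))
```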
  • Motion detection/estimation may, in a step S24, be used alternatively or in combination therewith by the processing unit 6 to determine the region and/or object 13 in one or more images captured by the image sensor 5. In motion detection/estimation a map of all areas of the image which moved in the recent past is computed. The processing unit 6 may thus in particular be arranged to detect an object and/or region that has moved between the two images.
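  • For step S24, a dense optical flow field is one common way to compute such a map of recently moved areas. The sketch below uses OpenCV's Farnebäck estimator; the flow parameters and the magnitude threshold are illustrative.

```python
import cv2
import numpy as np

def moving_region_mask(prev_frame, curr_frame, mag_thresh=1.0):
    """Compute a binary map of image areas that moved between two frames (step S24)."""
    g0 = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    g1 = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    # Dense Farneback optical flow: one (dx, dy) vector per pixel.
    flow = cv2.calcOpticalFlowFarneback(g0, g1, None, 0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)
    return (magnitude > mag_thresh).astype(np.uint8) * 255
```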
  • Depth segmentation may, in a step S26, be used alternatively or in combination therewith by the processing unit 6 to determine the region and/or object 13 in one or more images captured by the image sensor 5. In depth segmentation a map of what changed in the environment is obtained. Recent advances in 3D optical ranging have shown how to obtain a depth measurement for each pixel of a regular camera. The processing unit 6 may thus in particular be arranged to detect an object and/or region from an image depth measurement from which the region and/or object is identified.
  • Object detection may, in a step S28, be used alternatively or in combination therewith by the processing unit 6 to determine the region and/or object 13 in one or more images captured by the image sensor 5. In object detection the processing unit 6 could be trained to recognize only a specific category of objects, such as people, shoes, vegetables, etc., so that a region is only made available if an object belonging to one of those categories appears in the scene.
  • The added object 13 may alternatively be detected from user interaction. In order to do so the remote control unit 4 (and hence also the arrangement 20) may comprise a user interface 16. The user interface 16 advantageously comprises a touch sensitive display. Touch sensitive displays and their functions are as such known in the art. The touch sensitive display may in particular be arranged to provide the processing unit 6 with the user input upon user interaction with the touch sensitive display. The user interface may also be gaze based or based on tactile interaction. If the user interface 16 is gaze based the remote control unit 4 (or the arrangement 20) advantageously comprises a camera unit arranged to be directed towards one eye (or both eyes) of the user of the remote control unit 4 (or the user interface 16 of the arrangement 20), where the processing unit 6 uses eye-tracking to determine the point of gaze of the user and thereby an object and/or region in the displayed image. The user interface may receive tactile user input for example from a keyboard or a joystick provided on, or operatively coupled to, the remote control unit 4 (or the arrangement 20). As the skilled person understands, there may be other equally likely and equivalent ways of receiving user input.
  • FIG. 3 c illustrates a third image 11 c as captured by the image sensor 5 of the remote control unit 4 (or the arrangement 20). In comparison to the second image 11 b of FIG. 3 b the third image 11 c depicts the same scene as the image 11 b with the difference that a marker 14 marking the object 13 has also been added to the image 11 c. For example, a user of the remote control unit 4 (or the arrangement 20) may be allowed to point to an object in one of the images captured by the image sensor 5. The processing unit may therefore further be arranged to, in a step S30, receive user input identifying an area in the image. The received user input may be acknowledged to the user by providing the marker 14. The complete object outline may then be automatically segmented exploiting computer vision techniques as disclosed above; one possible realization is sketched below. Thus from the identified area an outline of the object and/or region may be determined by the processing unit in a step S32 so as to determine the object and/or region. From the selected image area, all the relevant codes, i.e. codes associated with light sources 2 a, 2 b at least partly illuminating the identified area and/or object 13, can then be selected.
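  • As one possible realization of steps S30-S32, the sketch below grows a complete object outline from a user-marked area using OpenCV's GrabCut segmentation; treating the marked area as a rectangle and running five iterations are illustrative assumptions.

```python
import cv2
import numpy as np

def outline_from_marked_area(image, rect):
    """Determine the object outline from a user-identified area (steps S30-S32).

    image: BGR image; rect: (x, y, w, h) area marked by the user (cf. marker 14).
    Returns the largest foreground contour, or None if nothing was segmented.
    """
    mask = np.zeros(image.shape[:2], np.uint8)
    bgd_model = np.zeros((1, 65), np.float64)
    fgd_model = np.zeros((1, 65), np.float64)
    cv2.grabCut(image, mask, rect, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_RECT)
    foreground = np.where(
        (mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 255, 0
    ).astype(np.uint8)
    contours, _ = cv2.findContours(foreground, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None
```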
  • Once one or several relevant regions/objects have been detected, the user may, in a step S34, receive a snapshot of the image captured by the image sensor 5. Optionally the snapshot may be provided together with an indication, in a step S36, of which regions/objects have been selected. The snapshot may be provided by the user interface 16. FIG. 4 illustrates such a snapshot image 11 d as captured by the image sensor 5 of the remote control unit 4 (or the arrangement 20). The image 11 d depicts the same scene as the images 11 b and 11 c. In addition the image 11 d comprises an indication, in the form of a dotted line 15, of the detected object 13. Information pertaining to the indication is thus originally generated by the processing unit 6 in response to detecting the object 13 and may then be transmitted, in a step S38, from the processing unit 6 to the user interface 16 so as to provide the indication to the user. Alternatively, no image/indication is provided to the user. Instead, once a region/object has been detected, the light sources 2 a, 2 b affecting the region/object may provide a visible indication, such as blinking one or more times, so that the user can recognize which region/object is currently selected. In such a case the remote control unit 4 (or the arrangement 20), upon detection of the region/object and identification of the light sources 2 a, 2 b affecting the region/object, transmits a control signal to the light sources 2 a, 2 b to provide the visible indication.
  • By virtue of the detected coded light the processing unit 6 is further arranged to, in a step S6, associate the determined region and/or object with a set of light sources 2 a-b emitting the coded light, where the coded light has been detected by the image sensor 5 in step S2; a sketch of this association is given below. Once a region/object is detected, the user may additionally be presented with a user interface through which the user is enabled to change one or more applicable lighting settings or properties of the light sources 2 a, 2 b affecting the detected region/object (step S40). This is illustrated in FIG. 5. Particularly, the processing unit 6 is in a step S8 arranged to receive an input signal relating to updated light settings of the set of light sources 2 a, 2 b.
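  • The association of step S6 can be illustrated as follows: given the per-source footprints (e.g. from the detect_footprints() sketch above) and a boolean mask of the determined region/object, select every source whose footprint overlaps the region by more than a threshold. The overlap measure and the threshold value are assumptions made for illustration.

```python
import numpy as np

def sources_affecting_region(footprints, region_mask, min_share=0.1):
    """Step S6: associate a determined region/object with the set of
    light sources whose coded-light footprints overlap it."""
    selected = []
    mask = region_mask.astype(bool)
    for cid, footprint in footprints.items():
        contribution = np.abs(footprint)
        total = contribution.sum()
        if total == 0:
            continue
        # Fraction of this source's total contribution falling inside the region.
        share = contribution[mask].sum() / total
        if share > min_share:
            selected.append(cid)
    return selected
```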
  • FIG. 5 shows an example of a user interface 16 of the remote control unit 4. With respect to the arrangement 20 the user interface 16 may be provided as a separate physical unit. Controllable properties of the light may pertain to the color of the emitted light, the color temperature of the emitted light, the intensity of the emitted light and/or the blinking frequency of the emitted light. In the example of FIG. 5 this is illustrated by a number of available settings 18 with which a user may interact by setting values of respective slidebars 19. The user interface thereby enables controlling the lights by setting the properties of the particular group of light sources affecting the detected region/object.
  • As noted above the remote control unit 4 (and hence also the arrangement 20) further comprises a transmitter 7. The transmitter 7 is arranged to, in a step S10, transmit a control signal corresponding to the updated light settings to the set of light sources 2 a, 2 b. The set of light sources 2 a, 2 b is thereby controlled by the remote control unit 4.
  • The remote control unit 4 (and hence also the arrangement 20) may further comprise other components, such as a memory 9 operatively coupled to the processing unit 6. The memory 9 is operated according to principles which as such are known by the skilled person. Particularly, the memory 9 may comprise a set of lighting settings which may be transmitted to light sources 2 a-2 c in the lighting systems 1 a, 1 b. The transmitter 7 may be a light transmitter configured to emit coded light. Alternatively the transmitter 7 may be a radio transmitter configured to wirelessly transmit information. The transmitter 7 may be configured for bidirectional communications. The transmitter 7 may comprise a radio antenna. Alternatively the transmitter 7 may comprise a connector for wired communications.
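  • Purely as an illustration of steps S8-S10, the sketch below packages updated light settings into a control payload for the transmitter 7. The field names and the JSON wire format are invented for this example; the embodiments deliberately leave the transport (coded light, radio, or wired) and the message format open.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class LightSetting:
    """Updated settings for one light source; the fields are assumptions."""
    source_id: int          # lighting device identification code
    color_temperature: int  # in kelvin
    intensity: float        # 0.0 .. 1.0

def build_control_signal(settings):
    """Steps S8-S10: serialize updated light settings for transmission."""
    return json.dumps([asdict(s) for s in settings]).encode()

payload = build_control_signal(
    [LightSetting(0x2A, 3000, 0.8), LightSetting(0x2B, 3000, 0.8)]
)
```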
  • The person skilled in the art realizes that the present invention by no means is limited to the preferred embodiments described above. On the contrary, many modifications and variations are possible within the scope of the appended claims. Particularly, the disclosed remote control unit 4 and at least one luminaire comprising at least one light source 2 a, 2 b, 2 c and being controllable by the remote control unit 4 may be provided as an arrangement.
  • In summary, the disclosed embodiments are applicable to several scenarios: in retail lighting, they allow shop personnel to easily select the luminaires affecting a new product placed in a shop window; for a lighting designer, they make it possible to move to a certain location in the environment and automatically be able to choose the lights which affect that position, without any need for a pointing device.

Claims (15)

1. A remote control unit for controlling a set of light sources, comprising
an image sensor arranged to capture at least one image and to detect coded light in said at least one image;
a processing unit arranged to
determine a region and/or object in an image captured by the image sensor;
use said detected coded light to identify a set of light sources emitting said detected coded light that are illuminating said region and/or object, each light source having one or more light settings;
receive an input signal relating to updated light settings of said set of light sources; and
a transmitter arranged to transmit a control signal corresponding to said updated light settings to said set of light sources.
2. The remote control unit according to claim 1, wherein the processing unit is further configured to
receive at least two images from the image sensor and to detect a difference between the at least two images; and
determine the region and/or object in one of the two images from said difference.
3. The remote control unit according to claim 1, wherein the processing unit is further configured to
receive user input identifying an area in the image; and
from the identified area determine an outline of the object and/or region so as to determine the object and/or region.
4. The remote control unit according to claim 3, further comprising a user interface arranged to provide the processing unit with the user input upon user interaction with the user interface.
5. The remote control unit according to claim 1, wherein the image sensor is further configured to
capture a plurality of images so as to form a stack of images from said plurality of images;
determine the sum over all pixels per image in said stack of images to generate a conventional one-dimensional signal to determine which codes are present in a scene represented by the stack of images; and
determine, from the stack of images, a footprint of the light source by correlating time dependence of all pixels with a code associated with the light source.
6. The remote control unit according to claim 5, further comprising a user interface, and wherein the processing unit is further arranged to transmit identification of said region and/or object to the user interface.
7. The remote control unit according to claim 6, wherein the user interface is arranged to provide a snapshot of the image comprising the object and/or region together with an indication of the object and/or region.
8. The remote control unit according to claim 6, wherein the user interface is arranged to provide an indication relating to which light source or light sources affect illumination of the object and/or region.
9. The remote control unit according to claim 8, wherein the user interface is arranged to provide light settings available for said set of light sources.
10. The remote control unit according to claim 1, wherein the processing unit is further arranged to identify the region and/or object by segmenting image foreground information from image background information and to identify the region and/or object by virtue of the image foreground information.
11. The remote control unit according to claim 2, wherein the processing unit is further arranged to identify the region and/or object by performing motion detection and/or estimation between the two images in order to detect an object and/or region that has moved between the two images.
12. The remote control unit according to claim 1, wherein the processing unit is further arranged to identify the region and/or object by performing depth segmentation so as to obtain an image depth measurement from which the region and/or object is identified.
13. The remote control unit according to claim 1, wherein the processing unit is further arranged to identify the region and/or object by performing object detection.
14. An arrangement comprising a remote control unit according to claim 13 and at least one luminaire controllable by the remote control unit and comprising at least one light source from said set of light sources.
15. A method for controlling a set of light sources, comprising
capturing, by an image sensor, at least one image and detecting coded light in said at least one image;
determining, by a processing unit, a region and/or object in an image captured by the image sensor;
identifying, by the processing unit, using said detected coded light, a set of light sources emitting said detected coded light that are illuminating the determined region and/or object, each light source having one or more light settings;
receiving, by the processing unit, an input signal relating to updated light settings of said set of light sources, wherein the input signal sets an overall light effect affecting the determined region and/or object; and
transmitting, by a transmitter, a control signal corresponding to said updated light settings to said set of light sources.
US14/351,153 2011-10-14 2012-09-28 Coded light detector Active US9232610B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/351,153 US9232610B2 (en) 2011-10-14 2012-09-28 Coded light detector

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201161547101P 2011-10-14 2011-10-14
PCT/IB2012/055174 WO2013054221A1 (en) 2011-10-14 2012-09-28 Coded light detector
US14/351,153 US9232610B2 (en) 2011-10-14 2012-09-28 Coded light detector

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2012/055174 A-371-Of-International WO2013054221A1 (en) 2011-10-14 2012-09-28 Coded light detector

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/848,728 Continuation US9504126B2 (en) 2011-10-14 2015-09-09 Coded light detector

Publications (2)

Publication Number Publication Date
US20140265878A1 true US20140265878A1 (en) 2014-09-18
US9232610B2 US9232610B2 (en) 2016-01-05

Family

ID=47295089

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/351,153 Active US9232610B2 (en) 2011-10-14 2012-09-28 Coded light detector
US14/848,728 Active US9504126B2 (en) 2011-10-14 2015-09-09 Coded light detector

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/848,728 Active US9504126B2 (en) 2011-10-14 2015-09-09 Coded light detector

Country Status (7)

Country Link
US (2) US9232610B2 (en)
EP (1) EP2748950B1 (en)
JP (1) JP6157011B2 (en)
CN (1) CN103858364B (en)
ES (1) ES2708695T3 (en)
IN (1) IN2014CN02382A (en)
WO (1) WO2013054221A1 (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130147366A1 (en) * 2011-12-07 2013-06-13 Charlie Huizenga System for and method of commissioning lighting devices
US20150002026A1 (en) * 2012-01-20 2015-01-01 Koninklijke Philips N.V. Method for detecting and controlling coded light sources
US20150324642A1 (en) * 2014-05-06 2015-11-12 Qualcomm Incorporated Determining an orientation of a mobile device
US20160091217A1 (en) * 2014-09-29 2016-03-31 Koninklijke Philips N.V. Systems and methods for managing environmental conditions
US20160098609A1 (en) * 2013-05-07 2016-04-07 Koninklijke Philips N.V. A video analysis device and a method of operating a video analysis device
US20160164603A1 (en) * 2013-07-23 2016-06-09 Koninklijke Philips N.V. Modulation of coded light components
US20170054960A1 * 2015-08-17 2017-02-23 Chiun Mai Communication Systems, Inc. Camera color temperature compensation system and smart terminal employing same
US20170093489A1 (en) * 2015-09-30 2017-03-30 Osram Sylvania Inc. Reconstructing light-based communication signals captured with a rolling shutter image capture device
US9664814B2 (en) 2008-06-02 2017-05-30 Abl Ip Holding Llc Wireless sensor
US20170245348A1 (en) * 2013-12-18 2017-08-24 General Electric Company Communication system for adaptive lighting control
CN107636729A (en) * 2015-04-22 2018-01-26 飞利浦照明控股有限公司 Lighting plan maker
US20180209823A1 (en) * 2015-09-15 2018-07-26 Pepperl+Fuchs Gmbh Apparatus and method for reliably determining the position of an object
US20180219624A1 (en) * 2015-07-27 2018-08-02 Philips Lighting Holding B.V. Light emitting device for generating light with embedded information
US10139787B2 (en) 2008-06-02 2018-11-27 Abl Ip Holding Llc Intelligence in distributed lighting control devices
CN109891490A (en) * 2016-10-27 2019-06-14 昕诺飞控股有限公司 The method of storage object identifier
US10354298B2 (en) * 2014-06-27 2019-07-16 Ledvance Llc Lighting audit and LED lamp retrofit
US10375800B2 (en) * 2016-04-06 2019-08-06 Signify Holding B.V. Controlling a lighting system
US10979145B2 (en) * 2019-03-28 2021-04-13 Honda Motor Co., Ltd. Optical transmitter and optical transmission method
US20210233370A1 (en) * 2020-01-29 2021-07-29 Everseen Limited System and Method for Identifying Users
US11221599B2 (en) 2014-11-28 2022-01-11 Signify Holding B.V. Systems and methods for managing environmental conditions

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150355829A1 (en) * 2013-01-11 2015-12-10 Koninklijke Philips N.V. Enabling a user to control coded light sources
WO2014174412A2 (en) * 2013-04-25 2014-10-30 Koninklijke Philips N.V. Adaptive outdoor lighting control system based on user behavior
US10298326B2 (en) * 2014-02-14 2019-05-21 Signify Holding B.V. Coded light
EP2928269A1 (en) 2014-04-04 2015-10-07 LG Electronics Inc. Lighting system and control method thereof
CN106576412A (en) * 2014-05-12 2017-04-19 飞利浦灯具控股公司 Detection of coded light
US10609795B2 (en) 2014-08-19 2020-03-31 Signify Holding B.V. Fault detection system
US20160054023A1 (en) * 2014-08-22 2016-02-25 Lutron Electronics Co., Inc. Load control system responsive to sensors and mobile devices
EP3216320B1 (en) * 2014-11-04 2018-04-04 Philips Lighting Holding B.V. Transmitter comprising a transmission queue and corresponding source device
CN104393931B (en) * 2014-11-17 2018-12-25 北京智谷睿拓技术服务有限公司 Visible light signal reception control method, control device and receiving device
CN104378164B (en) * 2014-11-17 2018-12-25 北京智谷睿拓技术服务有限公司 Visible light signal reception control method, control device and receiving device
US10020881B2 (en) * 2014-11-25 2018-07-10 Qualcomm Incorporated Method and apparatus for transmitting secure VLC identifiers
US10484827B2 (en) 2015-01-30 2019-11-19 Lutron Technology Company Llc Gesture-based load control via wearable devices
JP6688314B2 (en) * 2015-03-26 2020-04-28 シグニファイ ホールディング ビー ヴィSignify Holding B.V. Commissioning in the context of lighting devices
EP3295338B1 (en) * 2015-05-12 2023-07-19 Signify Holding B.V. Method and system for managing space configurations
WO2016206991A1 (en) * 2015-06-23 2016-12-29 Philips Lighting Holding B.V. Gesture based lighting control
US10599174B2 (en) 2015-08-05 2020-03-24 Lutron Technology Company Llc Load control system responsive to the location of an occupant and/or mobile device
CA3179801A1 (en) 2015-08-05 2017-02-09 Lutron Technology Company Llc Commissioning and controlling load control devices
WO2017100723A1 (en) 2015-12-11 2017-06-15 Lutron Electronics Co., Inc. Load control system having a visible light sensor
WO2018050590A1 (en) * 2016-09-16 2018-03-22 Philips Lighting Holding B.V. Illumination control
US20180098215A1 (en) * 2016-09-30 2018-04-05 Richard D. Roberts Data processing and authentication of light communication sources
WO2018107182A2 (en) 2016-12-09 2018-06-14 Lutron Electronics Co., Inc. Load control system having a visible light sensor
JP6897389B2 (en) * 2017-07-25 2021-06-30 富士通株式会社 Discrimination computer program, discriminating device and discriminating method, and communication system
TWI647976B (en) * 2017-08-24 2019-01-11 財團法人工業技術研究院 Illumination control system and illumination control method
WO2020016027A1 (en) 2018-07-16 2020-01-23 Lumileds Holding B.V. Controlling a plurality of lighting units
EP3935790A1 (en) 2019-03-08 2022-01-12 Lutron Technology Company LLC Commissioning and controlling load control devices

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100295457A1 (en) * 2009-05-20 2010-11-25 Pixart Imaging Inc. Light control system and control method thereof
US20120105217A1 (en) * 2010-03-12 2012-05-03 Pixart Imaging Inc. Remote device and remote control system
US20130141010A1 (en) * 2011-11-18 2013-06-06 Express Imaging Systems, Llc Adjustable output solid-state lamp with security features

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1096605A (en) * 1996-09-24 1998-04-14 Komatsu Ltd Position measuring method and device with image processing
JP2006032295A (en) * 2004-07-21 2006-02-02 Fuji Photo Film Co Ltd Illumination adjustment method of photographing studio
JP2008152950A (en) * 2006-12-14 2008-07-03 Toshiba Lighting & Technology Corp Lighting system
ATE490674T1 (en) 2007-07-18 2010-12-15 Koninkl Philips Electronics Nv METHOD FOR PROCESSING LIGHT IN A STRUCTURE AND LIGHTING SYSTEM
TW200935972A (en) * 2007-11-06 2009-08-16 Koninkl Philips Electronics Nv Light management system with automatic identification of light effects available for a home entertainment system
WO2009101570A1 (en) * 2008-02-12 2009-08-20 Koninklijke Philips Electronics N.V. Adaptive modulation and data embedding in light for advanced lighting control
JP5230371B2 (en) * 2008-11-21 2013-07-10 パナソニック株式会社 Lighting system
HUE057575T2 (en) * 2009-01-06 2022-05-28 Signify Holding Bv Control system for controlling one or more controllable devices sources and method for enabling such control
US8798316B2 (en) 2009-05-14 2014-08-05 Koninklijke Philips N.V. Method and system for controlling lighting
JP5406683B2 (en) * 2009-11-25 2014-02-05 パナソニック株式会社 Lighting control device, lighting control system, and lighting control device
US9041296B2 (en) 2009-12-15 2015-05-26 Koninklijkle Philips N.V. System and method for physical association of lighting scenes
BR112012017100A8 (en) 2010-01-15 2017-09-19 Koninklijke Philips Electronics Nv DETECTION SYSTEM FOR DETERMINING IF A LIGHT CONTRIBUTION FROM A FIRST LIGHT SOURCE OF A LIGHTING SYSTEM IS PRESENT IN A SELECTED POSITION WITHIN A SCENE, METHOD FOR DETERMINING IF A LIGHT CONTRIBUTION FROM A FIRST LIGHT SOURCE OF A LIGHTING SYSTEM LIGHTING IS PRESENT AT A SELECTED POSITION WITHIN A SCENE AND COMPUTER PROGRAM

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100295457A1 (en) * 2009-05-20 2010-11-25 Pixart Imaging Inc. Light control system and control method thereof
US20120105217A1 (en) * 2010-03-12 2012-05-03 Pixart Imaging Inc. Remote device and remote control system
US20130141010A1 (en) * 2011-11-18 2013-06-06 Express Imaging Systems, Llc Adjustable output solid-state lamp with security features

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10139787B2 (en) 2008-06-02 2018-11-27 Abl Ip Holding Llc Intelligence in distributed lighting control devices
US9664814B2 (en) 2008-06-02 2017-05-30 Abl Ip Holding Llc Wireless sensor
US20130147366A1 (en) * 2011-12-07 2013-06-13 Charlie Huizenga System for and method of commissioning lighting devices
US10111308B2 (en) 2011-12-07 2018-10-23 Abl Ip Holding Llc System for and method of commissioning lighting devices within a wireless network
US9192019B2 (en) * 2011-12-07 2015-11-17 Abl Ip Holding Llc System for and method of commissioning lighting devices
US9888548B2 (en) 2011-12-07 2018-02-06 Abl Ip Holding Llc System for and method of commissioning lighting devices
US20150002026A1 (en) * 2012-01-20 2015-01-01 Koninklijke Philips N.V. Method for detecting and controlling coded light sources
US9210777B2 (en) * 2012-01-20 2015-12-08 Koninklijke Philips N.V. Method for detecting and controlling coded light sources
US20160098609A1 (en) * 2013-05-07 2016-04-07 Koninklijke Philips N.V. A video analysis device and a method of operating a video analysis device
US20160164603A1 (en) * 2013-07-23 2016-06-09 Koninklijke Philips N.V. Modulation of coded light components
US9900092B2 (en) * 2013-07-23 2018-02-20 Philips Lighting Holding B.V. Modulation of coded light components
US20170245348A1 (en) * 2013-12-18 2017-08-24 General Electric Company Communication system for adaptive lighting control
US9317747B2 (en) * 2014-05-06 2016-04-19 Qualcomm Incorporated Determining an orientation of a mobile device
US20150324642A1 (en) * 2014-05-06 2015-11-12 Qualcomm Incorporated Determining an orientation of a mobile device
US10354298B2 (en) * 2014-06-27 2019-07-16 Ledvance Llc Lighting audit and LED lamp retrofit
US20160091217A1 (en) * 2014-09-29 2016-03-31 Koninklijke Philips N.V. Systems and methods for managing environmental conditions
US10602589B2 (en) * 2014-09-29 2020-03-24 Signify Holding B.V. Systems and methods for managing environmental conditions
US20170318647A1 (en) * 2014-09-29 2017-11-02 Philips Lighting Holding B.V. Automatic and decentralized commissioning of replacement lighting units
US10117314B2 (en) * 2014-09-29 2018-10-30 Philips Lighting Holding B.V. Automatic and decentralized commissioning of replacement lighting units
US11221599B2 (en) 2014-11-28 2022-01-11 Signify Holding B.V. Systems and methods for managing environmental conditions
CN107636729A (en) * 2015-04-22 2018-01-26 飞利浦照明控股有限公司 Lighting plan maker
US20180144213A1 (en) * 2015-04-22 2018-05-24 Philips Lighting Holding B.V. A lighting plan generator
US10762388B2 (en) * 2015-04-22 2020-09-01 Signify Holding B.V. Lighting plan generator
US10348403B2 (en) * 2015-07-27 2019-07-09 Signify Holding B.V. Light emitting device for generating light with embedded information
US20180219624A1 (en) * 2015-07-27 2018-08-02 Philips Lighting Holding B.V. Light emitting device for generating light with embedded information
US20170054960A1 * 2015-08-17 2017-02-23 Chiun Mai Communication Systems, Inc. Camera color temperature compensation system and smart terminal employing same
US9819874B2 (en) * 2015-08-17 2017-11-14 Chiun Mai Communication Systems, Inc. Camera color temperature compensation system and smart terminal employing same
US20180209823A1 (en) * 2015-09-15 2018-07-26 Pepperl+Fuchs Gmbh Apparatus and method for reliably determining the position of an object
US10508936B2 (en) * 2015-09-15 2019-12-17 Pepperl+Fuchs Gmbh Apparatus and method for reliably determining the position of an object
US9929809B2 (en) * 2015-09-30 2018-03-27 Osram Sylvania Inc. Reconstructing light-based communication signals captured with a rolling shutter image capture device
US20170302376A1 (en) * 2015-09-30 2017-10-19 Osram Sylvania Inc. Reconstructing light-based communication signals captured with a rolling shutter image capture device
US9742493B2 (en) * 2015-09-30 2017-08-22 Osram Sylvania Inc. Reconstructing light-based communication signals captured with a rolling shutter image capture device
US20170093489A1 (en) * 2015-09-30 2017-03-30 Osram Sylvania Inc. Reconstructing light-based communication signals captured with a rolling shutter image capture device
US10375800B2 (en) * 2016-04-06 2019-08-06 Signify Holding B.V. Controlling a lighting system
CN109891490A (en) * 2016-10-27 2019-06-14 昕诺飞控股有限公司 The method of storage object identifier
US10979145B2 (en) * 2019-03-28 2021-04-13 Honda Motor Co., Ltd. Optical transmitter and optical transmission method
US20210233370A1 (en) * 2020-01-29 2021-07-29 Everseen Limited System and Method for Identifying Users

Also Published As

Publication number Publication date
EP2748950B1 (en) 2018-11-28
IN2014CN02382A (en) 2015-06-19
US20150382438A1 (en) 2015-12-31
CN103858364A (en) 2014-06-11
US9504126B2 (en) 2016-11-22
EP2748950A1 (en) 2014-07-02
JP6157011B2 (en) 2017-07-05
WO2013054221A1 (en) 2013-04-18
ES2708695T3 (en) 2019-04-10
JP2014534568A (en) 2014-12-18
CN103858364B (en) 2017-02-22
US9232610B2 (en) 2016-01-05

Similar Documents

Publication Publication Date Title
US9504126B2 (en) Coded light detector
RU2557084C2 (en) System and method for interactive illumination control
EP3192330B1 (en) Lighting preference arbitration.
US20160150624A1 (en) Proximity based lighting control
RU2562805C2 (en) System and method for physical association of lighting scenes
US20150022123A1 (en) Remote control of light source
CN112655279B (en) Method for commissioning a lighting control system using a mobile device
US20120320262A1 (en) Device, system, and method for controlling light source to capture image
CN103168505A (en) A method and a user interaction system for controlling a lighting system, a portable electronic device and a computer program product
JP2015513806A (en) Visible light communication
US10694328B2 (en) Method of locating a mobile device in a group of mobile devices
US9668319B2 (en) Lighting system and control method thereof
EP3329616B1 (en) Light emitting device for generating light with embedded information
EP2805583B1 (en) Method for detecting and controlling coded light sources
US20180255625A1 (en) A method of visualizing a shape of a linear lighting device
RU2689142C2 (en) Coded light detection
KR101633455B1 (en) Apparatus and method for scanning code image
CN109891490A (en) The method of storage object identifier
KR20180026594A (en) Lighting equipment system controlled by motion and method of controlling thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: KONINKLIJKE PHILIPS ELECTRONICS N V, NETHERLANDS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GRITTI, TOMMASO;REEL/FRAME:032660/0597

Effective date: 20130522

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: PHILIPS LIGHTING HOLDING B.V., NETHERLANDS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KONINKLIJKE PHILIPS N.V.;REEL/FRAME:040060/0009

Effective date: 20160607

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

AS Assignment

Owner name: SIGNIFY HOLDING B.V., NETHERLANDS

Free format text: CHANGE OF NAME;ASSIGNOR:PHILIPS LIGHTING HOLDING B.V.;REEL/FRAME:050837/0576

Effective date: 20190201

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8