WO2012137046A1 - Adaptive illumination - Google Patents

Adaptive illumination

Info

Publication number
WO2012137046A1
WO2012137046A1
Authority
WO
WIPO (PCT)
Prior art keywords
area
activity
motion
illumination
interest
Prior art date
Application number
PCT/IB2011/054730
Other languages
English (en)
Inventor
Gianluca Monaci
Tommaso Gritti
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Publication of WO2012137046A1

Classifications

    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/16 Controlling the light source by timing means
    • H05B47/105 Controlling the light source in response to determined parameters
    • H05B47/115 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B47/125 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • the invention relates to a luminaire, a system and a method for adaptive illumination.
  • US 2008/0265799 A1 relates to providing energy efficient and intelligent illumination using distributed processing across a network of illuminators to control the illumination for a given environment.
  • the invention is based on the idea of adjusting the lighting configuration based on data mining, e.g. using a typical pattern or history of detected data to adapt the illumination behavior.
  • the data correspond to activity data detected or sensed in an environment or surrounding area to be illuminated.
  • the activity data may be detected at a single point in the vicinity of a luminaire or illumination unit.
  • the activity data may be detected within a larger predefined surrounding area of the luminaire or illumination unit.
  • information for lighting configuration may be generated depending on a spatial segmentation of an area of interest based on statistically learned patterns in the area of interest from the spatial information provided by a sensor.
  • the illumination is adaptable to changes of the usage of a particular place with time, without human intervention or cumbersome installation requiring knowledge of the expected use of the environment.
  • a luminaire for adaptive illumination comprising an illumination unit, an activity sensor unit and a control unit.
  • the illumination unit is adapted to illuminate the environment or surrounding area of the luminaire.
  • the illumination unit may comprise one or more light sources with adjustable lighting properties, such as light color, light intensity, light quality, color rendering index, correlated color temperature or the like.
  • LED elements, fluorescence lamps, incandescent lamps, HID lamps, halogen lamps, etc. or a combination thereof may be used.
  • the illumination unit may be configured to change the direction and/or shape of the light beam. Then, at least one of shape, size or location of the illuminated area may be adjustable.
  • the activity sensor unit is adapted to sense or detect activity within the environment of the luminaire or of the illumination unit. Therefore, the activity sensor unit is preferably provided or associated with the luminaire or illumination unit, so that the activity within the area to be illuminated can be measured.
  • the sensitive area of the activity sensor unit may correspond to the direct vicinity or to an area below or around the luminaire (or illumination unit). Possibly, the sensitive area of the activity sensor unit corresponds even to a single point.
  • the activity sensor unit may comprise one or more sensors adapted to sense activity, including at least one of a conventional camera, a range imaging sensor (such as a stereo camera, time-of-flight sensor, structured light 3D scanner, coded aperture camera etc.), a microphone array, an ultrasound sensor, a microwave sensor, a laser radar sensor, an infrared sensor or any combination thereof.
  • the activity sensor unit is adapted to sense motion, speed of motion and/or direction of motion.
  • the control unit is adapted to generate at least one history from the activity data sensed by the activity sensor unit.
  • the history of activity data may refer to activity data that are sensed over a predetermined data recording interval.
  • This data recording interval may be adjustable or set during commissioning.
  • the control unit may further be adapted to adjust operation characteristics of the luminaire. Therefore, the luminaire is capable of learning about the activity in its surroundings and of adjusting its functionality or operation accordingly. Thus, the luminaire is capable of learning an appropriate illumination without requiring manual re-configuration or updates.
  • the control unit may be capable of generating histories of activity data for different times of day and of adapting the operation characteristics accordingly.
  • likewise, a differentiation may be made between weekdays and weekends or holidays.
  • the predetermined data recording interval defining the detection time interval of activity data included in the history may be adjustable.
  • the data recording interval may be set to an hour, a day, a week or the like.
  • the history may be based on activity data of the recent past.
  • a sliding window algorithm or the like may be used, so that new data replace old data.
  • any other algorithm for obtaining a current history comprising activity data of the recent past may be used.
  • the speed of adaptation may be adjusted. In some circumstances, a slow adaptation may be preferred avoiding consideration of temporary changes. In other situations, a fast reactivity may be required in order to quickly adapt the operation of the luminaire.
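  • For illustration only, a time-based sliding window over the sensed activity samples could be maintained as in the following sketch; the class name ActivityHistory and the concrete window lengths are assumptions, not taken from the patent:

```python
from collections import deque
import time

class ActivityHistory:
    """Keeps only activity samples from the recent past (sliding window)."""

    def __init__(self, window_seconds=3600.0):
        self.window_seconds = window_seconds   # adjustable data recording interval
        self.samples = deque()                 # (timestamp, value) pairs

    def add(self, value, timestamp=None):
        timestamp = time.time() if timestamp is None else timestamp
        self.samples.append((timestamp, value))
        self._evict(timestamp)

    def _evict(self, now):
        # New data replace old data: drop everything older than the window.
        while self.samples and now - self.samples[0][0] > self.window_seconds:
            self.samples.popleft()

    def values(self):
        return [v for _, v in self.samples]

# A short window gives fast adaptation; a long window gives slow adaptation
# that ignores temporary changes.
history = ActivityHistory(window_seconds=24 * 3600)
history.add(0.4)   # e.g. a detected speed of motion in m/s
```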
  • the adjusted operation characteristics are used as new operation parameters of the luminaire, the illumination unit, the activity sensor unit and/or the control unit.
  • the new operation parameters may be stored for future use. Therefore, the operation characteristics of the luminaire may be considered as adaptive operation parameters or adaptive default parameters.
  • the basic operation features and thus the functionality of the luminaire may be permanently or continuously adapted.
  • the luminaire may further comprise a memory unit for storing at least one of activity data, one or more histories of activity data, adjusted operation characteristics, or the like.
  • the history may be stored as a single activity value, which is updated using new activity data provided by the activity sensor unit.
  • this single activity value may correspond to a mean, average, or the like.
  • the memory unit stores activity data recorded for a predetermined time.
  • the luminaire may further comprise means for wireless communication, so that the luminaire may be capable of communicating or exchanging data with other luminaires or external control centers in its surroundings.
  • the luminaire may comprise a user interface, so that a user can manually change at least one of operation characteristics, parameters corresponding to a current illumination, or the like.
  • the operation characteristics refer to parameters of at least one of the luminaire, the illumination unit, the activity sensor unit and the control unit, the operation characteristics characterizing the respective operation or functionality.
  • operation characteristics include at least one of: a turn-on time relating to the time for reaching the final illumination state from an off-state (no light); a turn-off time relating to the time for switching off the illumination from an on-state; an activation time relating to the time interval, during which the luminaire, the illumination unit, the activity sensor unit and/or the control unit is active; an activation schedule relating to a daytime or date, when the luminaire, the illumination unit, the activity sensor unit and/or the control unit is activated; an illumination reactivity relating to the reaction speed with respect to current changes of activity within the surroundings (e.g.
  • more than one value may be set for these parameters, e.g. for different situations. For instance, several minimum intensity levels may be set or the like.
  • operation characteristics may relate to any parameters used by the luminaire during operation, not only to parameters relating to the illumination per se.
  • these operation characteristics can be individually adjusted.
  • predefined sets of operation characteristics may be adjusted together, e.g. in dependency of each other. This may be advantageous for inter-correlated operation characteristics.
  • the activity data include at least one of an occupancy level relating to the number of persons detected within the surroundings, speed of motion, direction of motion, maximum speed, average speed and frequency of motion detection, i.e. how often motion is detected within a predefined time interval.
  • at least one of the speed of motion, the direction of motion, the maximum speed, the average speed and the frequency of motion detection may be determined for each person separately and/or simultaneously.
  • the speed of motion, the direction of motion, the frequency of motion detection and/or the maximum speed may be determined for a plurality of persons together.
  • the motion of one or more persons may be observed over a predetermined time interval in order to determine one or more of these parameters.
  • the average speed or the frequency of motion detection may be defined by a predetermined time interval.
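  • As a sketch of how such per-person motion statistics could be derived (the data layout with timestamped speed samples per person is an assumption made for illustration):

```python
def motion_statistics(speed_samples, interval_seconds):
    """speed_samples: list of (timestamp, speed) pairs for one person in the interval."""
    speeds = [s for _, s in speed_samples]
    if not speeds:
        return {"max_speed": 0.0, "avg_speed": 0.0, "detections_per_minute": 0.0}
    return {
        "max_speed": max(speeds),
        "avg_speed": sum(speeds) / len(speeds),
        # frequency of motion detection: how often motion was detected in the interval
        "detections_per_minute": 60.0 * len(speeds) / interval_seconds,
    }

# Occupancy level: number of persons currently detected in the surrounding area.
per_person = {"person_1": [(0.0, 0.2), (2.0, 0.3)], "person_2": [(1.0, 1.6)]}
occupancy = len(per_person)
stats = {p: motion_statistics(s, interval_seconds=60.0) for p, s in per_person.items()}
```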
  • the luminaire may be adapted to differentiate between surroundings having different activity based on occupancy and/or motion detected therein.
  • a luminaire is provided that can flexibly adapt its characteristics depending on typical speed patterns that are observed in its vicinity for a certain amount of time.
  • control unit may be capable of considering at least one of user control, ambient noise, ambient brightness, installation settings, time of day and date for adjusting the operation characteristics.
  • the luminaire may also comprise a light sensor and/or a sound sensor.
  • the control unit may determine an activity pattern of the surrounding area based on the detected activity data. For instance, the control unit may be capable of differentiating between desk activity and corridor activity and of adjusting the operation characteristics accordingly. This is highly advantageous, since a person working at a desk will most likely require a higher illumination quality than a person walking down a corridor. Possibly, a plurality of activity patterns is predefined, so that the activity data or the history of activity data can be discretely classified. Moreover, one or more operation characteristics or one or more sets of operation characteristics may be predefined corresponding to an activity pattern. For instance, a desk activity pattern may correspond to a set of operation characteristics including a high lighting level, high light quality, slow illumination reactivity and long turn-on and turn-off times.
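  • A minimal sketch of such predefined pattern-to-characteristics presets; the field names and concrete values are assumptions chosen for illustration, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class OperationCharacteristics:
    turn_on_time_s: float    # time to reach the final illumination state
    turn_off_time_s: float   # time to switch the illumination off
    max_intensity: float     # final illumination level Imax, normalized 0..1
    reactivity: str          # how quickly the luminaire reacts to activity changes

PRESETS = {
    # Desk activity: high lighting level, slow reactivity, long turn-on/turn-off times.
    "desk": OperationCharacteristics(turn_on_time_s=5.0, turn_off_time_s=120.0,
                                     max_intensity=1.0, reactivity="slow"),
    # Corridor activity: reactive, lower intensity suffices for people walking by.
    "corridor": OperationCharacteristics(turn_on_time_s=0.5, turn_off_time_s=10.0,
                                         max_intensity=0.6, reactivity="fast"),
}

def characteristics_for(pattern: str) -> OperationCharacteristics:
    return PRESETS[pattern]
```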
  • a histogram is generated from the detected activity data for adjusting the operation characteristics of the luminaire.
  • the histogram may correspond to the history of activity data sensed for the predetermined time.
  • a sliding window method may be used for updating the histogram with respect to new activity data provided by the activity sensor unit. Therefore, the histogram comprises activity data of the recent past.
  • a fit function may be used for analyzing the histogram.
  • a discrete quantization may be performed, e.g. counts of activity below or above a certain threshold, higher/lower speed of motion than a predetermined value etc.
  • the histogram is also used for determining the activity pattern.
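  • As an illustration of the discrete quantization and of deriving a decision from the histogram (the bin edges and the threshold are assumed example values):

```python
import numpy as np

# Example bin edges for the detected speed of motion (m/s): low, middle, high velocity.
SPEED_BINS = [0.0, 0.5, 1.5, np.inf]

def speed_histogram(speeds):
    """Quantize detected speeds into bins and count them (the activity histogram)."""
    counts, _ = np.histogram(speeds, bins=SPEED_BINS)
    return counts  # [n_low, n_middle, n_high]

def classify_from_histogram(counts, slow_fraction_threshold=0.5):
    """Simple rule: mostly slow motion -> desk activity, otherwise corridor activity."""
    total = counts.sum()
    if total == 0:
        return "no_activity"
    return "desk" if counts[0] / total >= slow_fraction_threshold else "corridor"

recent_speeds = [0.2, 0.3, 0.1, 1.8, 0.25]   # speeds from the sliding window
print(classify_from_histogram(speed_histogram(recent_speeds)))   # -> "desk"
```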
  • the operation characteristics may be adjusted based on a predefined mapping function, which links the activity data, the history of activity data, the histogram of activity data and/or the activity pattern to one or more operation characteristics.
  • this mapping function can be used without prior determination of activity patterns.
  • the mapping function may relate to a fit function, which can be either applied directly to the activity data, to pre-processed activity data, to the history of activity data and/or to the histogram of activity data.
  • the mapping function may relate to a family of curves defining a fit function for one or more operation characteristics, respectively.
  • the illuminated area or surface A may be a function of the occupancy level: A = A(occupancy).
  • the lighting intensity I may be a function of the detected ambient brightness I_ambient, the time of day t and the detected speed of motion v: I = I(I_ambient, t, v).
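  • One possible concrete form of such mapping functions (purely illustrative; the coefficients and clipping behaviour are assumptions):

```python
def illumination_intensity(i_ambient, hour_of_day, speed, i_max=1.0):
    """I = I(I_ambient, t, v): less artificial light when ambient light is high,
    dimmed during night hours, and lower for fast (corridor-like) motion."""
    ambient_term = max(0.0, 1.0 - i_ambient)      # i_ambient normalized to 0..1
    night_factor = 0.7 if (hour_of_day < 6 or hour_of_day >= 22) else 1.0
    speed_factor = 1.0 if speed < 0.5 else 0.6    # slow motion -> full working light
    return min(i_max, i_max * ambient_term * night_factor * speed_factor)

def illuminated_area(occupancy, base_area_m2=4.0):
    """A = A(occupancy): widen the illuminated area with the number of detected people."""
    return base_area_m2 * max(1, occupancy)

print(illumination_intensity(i_ambient=0.2, hour_of_day=10, speed=0.3))
```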
  • the luminaire, the activity sensor unit, the illumination unit and/or the controller is activated at predefined times of day and/or in regular time intervals and/or when the luminaire or the illumination unit is manually activated by a user.
  • the luminaire, the activity sensor unit and/or the illumination unit may be continuously activated.
  • a system for adaptive illumination comprising a plurality of luminaires according to one of the above-described embodiments, wherein the luminaires are capable of wirelessly communicating with each other and/or with a control center.
  • the luminaires may adapt their operation characteristics to each other.
  • a system for adaptive illumination comprising at least one illumination unit, an activity sensor unit associated with the illumination unit and a control unit.
  • the activity sensor unit may be integrated in or attached to an illumination unit.
  • the illumination unit and the activity sensor unit may be provided together, e.g. both comprised in a luminaire unit.
  • the activity sensor unit is capable of sensing activity data in the surroundings of the illumination unit.
  • the control unit is capable of generating at least one history of activity data for adjusting operation characteristics of the system based on the generated history of activity data. In one embodiment, the operation characteristics of one or more illumination units are adjusted. Alternatively or additionally, the operation characteristics of one or more activity sensor units are adjusted.
  • the system comprises only one control unit, which may be wirelessly connected to the illumination units and/or to the activity sensor units.
  • the control unit may be adapted to receive control commands from a remote user control interface.
  • Wireless communication may be performed using infrared communication, Bluetooth, ZigBee, WLAN, or radio communication.
  • the luminaire or system is adapted to sense the activity data in a spatially resolved manner.
  • data detected at one point of the surrounding area may be distinguished from another point of the surrounding area.
  • This spatially resolved activity data may also be referred to as spatial information of the observed surroundings or area of interest.
  • the spatial information may be used for learning patterns of activity in the surrounding area of interest based on statistical methods in order to spatially divide the area into segments.
  • the spatial information may be recorded for some time, so that a history of spatial information can be derived for adjusting operation characteristics of the illumination unit accordingly.
  • a yet further embodiment of the invention relates to a system for lighting configuration, wherein the system comprises at least one sensor for providing spatial information of an observed area of interest or surrounding area, and a control unit configured to process the spatial information provided by the at least one sensor by performing the following acts: statistically learning patterns in the area of interest from the spatial information for a predefined time interval, spatially segmenting the area of interest based on the statistically learned patterns, and generating information for lighting configuration depending on the spatial segmentation of the area of interest.
  • control unit of any embodiment of a system according to the present invention is preferably configured to perform a method of the invention as specified below.
  • a method for adaptive illumination comprising the steps of: sensing activity data in a surrounding area; generating at least one history of activity data sensed for a predetermined time; and adjusting operation characteristics based on the history of activity data. Furthermore, the method may further comprise the step of illuminating the surrounding area based on the adjusted operation characteristics. Also here, the area, in which activity data are detected, corresponds to the area to be illuminated.
  • the operation characteristics may relate to a luminaire as described above or to the operation characteristics of a system as described above.
  • the method according to the present invention may be performed for realizing a luminaire or system for adaptive illumination according to one of the above-described embodiments of the present invention.
  • an illumination or lighting is configured depending on a spatial segmentation of an area of interest based on patterns, which were statistically learned in the area of interest from the spatial information provided by a sensor for a predefined time interval.
  • the patterns may be for example patterns of activity in the area of interest, patterns of changes, patterns of colors, patterns of appearance features or the like.
  • the predefined time interval may last a few hours or even days, so that a large amount of data may be available for the statistical learning of patterns, e.g. based on a data mining technique.
  • a data mining algorithm may be employed in order to analyze the statistically learned patterns and/or to extract high-level information from a large number of observations of the area of interest.
  • the patterns learned in this way may be used to spatially segment the area of interest, for example into segments with high and low activity. This segmentation may then be used for generating information for lighting configuration, e.g. for displaying the segmentation on a computer screen, so that a user may configure the lighting in accordance with the processed segmentation, or for creating light settings for light units located in the area of interest, so that a fully automatic lighting configuration in an environment can be achieved.
  • the lighting configuration may be better adapted to the usage of an environment.
  • a method for lighting configuration comprising the steps of receiving spatial information of an observed area of interest provided by at least one sensor; statistically learning patterns in the area of interest from the sensor information for a predefined time interval; spatially segmenting the area of interest based on the statistically learned patterns; and generating information for lighting configuration depending on the spatial segmentation of the area of interest.
  • the spatial information may be information from the area of interest which contains an assignment of detected information in the area of interest, for example an activity, to a location in the area of interest.
  • Spatial information may for example be obtained from a camera capturing pictures of the area of interest, a microphone array, an array of PIRs (passive infrared receivers), ultrasound arrays, TOF (Time of Flight) cameras or a radar sensor sampling the area of interest for changes or activities.
  • the statistical learning of patterns in the area of interest from the spatial information for a predefined time interval may comprise analyzing the statistically learned patterns for partitions of the area of interest with different features, and the spatial segmentation of the area of interest based on the statistically learned patterns may comprise spatially segmenting the area of interest depending on the analyzed partitions with different features.
  • the area of interest may be spatially segmented with regard to different features, such as average motion detection, particularly slow and fast motion, direction of motion, speed of motion, detection of a change and/or color(s).
  • a partition with a clear trend in the direction of motion may be assigned to a walking area, where many people walk through (e.g. entrances in rooms).
  • a partition with slow motion and frequent detection of changes may be assigned to a working area, where people usually move slowly and things such as files and books on a desk are frequently moved from one location to another. Also, colors may be analyzed for determining partitions; for example, a partition with many color changes may be assigned to a walking area, since many people with differently colored clothes walk through this partition within a short time period.
  • the segments of the area of interest may be classified. Such a classification with three different regions, “region with no activity”, “walking area” and “desk area”, can already be performed after only a few hours of observation of the area of interest.
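  • A sketch of how segments could be classified from their per-partition feature statistics (the decision rules and thresholds are assumed example values, not from the patent):

```python
def classify_partition(slow_motion, fast_motion, changes):
    """Classify one spatial segment from its accumulated feature counts."""
    if slow_motion + fast_motion == 0 and changes == 0:
        return "region with no activity"
    # A clear trend towards fast motion -> people walking through (e.g. entrances).
    if fast_motion > slow_motion:
        return "walking area"
    # Slow motion with frequent changes (files, books moved around) -> desk area.
    return "desk area"

print(classify_partition(slow_motion=120, fast_motion=15, changes=40))  # -> desk area
```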
  • Another embodiment of the invention provides a computer program enabling a processor to carry out the method according to the invention and as specified above.
  • a record carrier storing a computer program according to the invention may be provided, for example a CD-ROM, a DVD, a memory card, a diskette, internet memory device or a similar data carrier suitable to store the computer program for optical or electronic access.
  • a yet further embodiment of the invention provides a computer programmed to perform a method according to the invention such as a PC (Personal Computer) and comprising a first interface for receiving spatial information of an observed area of interest from at least one sensor and a second interface for outputting a generated information for lighting configuration.
  • the computer may execute a program with a graphical user interface, allowing a user to comfortably configure a lighting created with one or more controllable light units of a lighting system.
  • Fig. 1 shows a luminaire according to an exemplary embodiment of the present invention.
  • Fig. 2 shows a system for adaptive illumination according to an exemplary embodiment of the present invention.
  • Fig. 3 shows a set of operation characteristics for different activity patterns according to an exemplary embodiment of the present invention.
  • Fig. 4 shows histograms of motion speed according to an exemplary embodiment of the present invention.
  • Fig. 5 shows a flow diagram of a method for adaptive illumination according to an exemplary embodiment of the present invention.
  • Fig. 6 shows a simple block diagram of another embodiment for a system for lighting configuration according to the invention.
  • Fig. 7 shows a top view of a small office environment as area of interest as captured with a camera embedded in the ceiling of the room, wherein zones of different activity of people are marked.
  • Fig. 8 shows a flowchart of an embodiment of the method for lighting configuration according to the invention.
  • Fig. 9 shows a flowchart of an embodiment of the act of spatially segmenting the area of interest according to the invention.
  • Fig. 10 shows a flowchart of a further embodiment of the act of spatially segmenting the area of interest according to the invention.
  • Fig. 11 shows a flowchart of an embodiment of the act of computing histograms according to the invention.
  • the luminaire 100 comprises an illumination unit 110, an activity sensor unit 120, a control unit 130 and a memory 140.
  • the illumination unit 110 can comprise one or more light sources, such as an LED, a halogen lamp, a fluorescence lamp or the like.
  • the illumination unit 110 can further be configured to change the direction or shape of the light beam, so that the shape, size or location of the illuminated area can be adjusted.
  • the activity sensor unit can comprise more than one sensor element, e.g. a presence sensor, a motion sensor, a speed sensor or the like.
  • the activity sensor unit 120 is configured to measure the activity in the direct neighborhood of the luminaire 100, e.g. at a single point or in a small area below or around the luminaire 100.
  • the luminaire 100 can comprise further sensors (not shown), e.g. for sensing ambient brightness, ambient sound, etc.
  • a system for adaptive illumination is provided.
  • the illumination unit 110 and the activity sensor unit 120 are co-located. In the example shown in fig. 2, they are both included in a luminaire unit 150.
  • the activity sensor unit 120 can also be included in the housing of the illumination unit 110 or vice versa.
  • the controller 130 is provided separately from the luminaire unit 150 and can comprise a memory 140.
  • the memory 140 can be provided in the luminaire units 150.
  • the controller 130 can differentiate between activity data measured for the different luminaire units 150.
  • the controller 130 and the luminaire units 150 can communicate wirelessly with each other in order to exchange operation characteristics and activity data.
  • the system may be employed as a wireless network. Additional sensors, e.g. for sensing the ambient brightness, can either be provided in the luminaire units 150, in the control unit 130, or separately.
  • the invention is explained using the example of the luminaire 100 shown in fig. 1.
  • the invention is not limited thereto, but the embodiments described below can also be transferred to the system 200.
  • the principles of the invention are described using the example of an activity sensor unit 120 capable of measuring velocity and direction of motion.
  • a conventional camera with a computer vision algorithm, a range imaging sensor (such as a stereo camera, time-of-flight sensor, structured light 3D scanner, coded aperture camera etc.), a microphone array with a speed estimation algorithm, an ultrasound based sensor, a microwave or laser radar sensor and the like or any combination thereof can be used as sensor elements of the activity sensor unit 120 in order to assess direction or speed of motion in the vicinity of the luminaire 100.
  • the invention is not limited to activity data related to speed or direction of motion, but also any other activity data may be used.
  • the velocity is measured when activating the luminaire 100, e.g. whenever an object or a person moves in the range of the luminaire 100. Then, the velocities measured in the recent past are exploited to extract simple yet reliable information about the environment in the direct vicinity of the luminaire 100. This information is then used to modify the operation characteristics of the luminaire 100, of the activity sensor unit 120 or of the illumination unit 110. For instance, as shown in fig. 3, the operation characteristics of the illumination unit 110 can include activation parameters, e.g.
  • Fig. 3A illustrates an example of an illumination pattern for an area showing "desk activity”
  • fig. 3B illustrates an example of an illumination pattern for an area showing "corridor activity”.
  • two luminaires 100 installed in an office are compared in order to illustrate the adaptation of illumination patterns based on the typical activity in the surroundings of the respective luminaires 100.
  • the first luminaire 100 is installed on top of a desk area, while the other luminaire 100 is arranged in a passage area or corridor.
  • the luminaire 100 mounted in the desk area will have observed mainly a large series of slow movements, with the exception of a few fast movements representing the instances in which a person arrived at or left the working area.
  • this luminaire 100 will detect a "desk activity" as typical activity in its surroundings.
  • the luminaire 100 in the corridor will have observed a series of fast movements, corresponding to the many people passing by, and a few instances of slow movements corresponding to the situations of people stopping to discuss.
  • the luminaire 100 located in the corridor will sense a "corridor activity" being the typical activity in its vicinity.
  • a person working at a desk and a person walking down a corridor have very different requirements with respect to illumination. While the person at the desk will require a high quality illumination for avoiding fatigue and degrading concentration, the person in the corridor will only require a basic illumination of the corridor in his walking direction.
  • the illumination for the desk area should be set to be intense and slowly reactive to changes in order to maintain a good level of light required for optimal visibility, as shown in fig. 3A.
  • the interval between the first detection of motion or presence (arrow in fig. 3A) at time t0 and the light activation start time t1 can be longer than for areas with other activity.
  • the time t2, at which the final illumination level Imax is reached, can be set to be later.
  • This final illumination level Imax will relate to a high intensity or high quality illumination.
  • the time for deactivating the illumination, i.e. the interval between the light deactivation start time t3 and the light deactivation end time t4, can be set to be longer.
  • the illumination unit 110 of the luminaire 100 installed in the desk area also does not turn itself off if no or only little movement is detected for a short time.
  • the illumination behavior of the luminaire 100 located in the corridor will be set to be reactive, both in activation and deactivation, and to a lower intensity Imax, since this is sufficient for people walking by.
  • An example for such an illumination pattern is shown in fig. 3B.
  • the detection time t0 and the light activation start time t1 lie closer together than in the desk area.
  • the time until reaching the final illumination level Imax or for deactivation is shorter than in the desk area.
  • the luminaire 100 in the corridor may be switched off faster or switched to a standby state.
  • the direction of illumination can be adjusted to the direction of motion or the size or diameter of the illuminated area can be adapted to the number of detected people. These parameters will be adjusted for the luminaire 100 in the corridor rather than for the luminaire 100 in the desk area.
  • In fig. 3C, another example of an illumination pattern for a corridor area is shown.
  • the time for light activation, i.e. the interval between the light activation start time t1 and the light activation end time t2, and the time for light deactivation, i.e. the interval between the light deactivation start time t3 and the light deactivation end time t4, are set to be short.
  • an additional intensity level Imin is defined for situations in which no activity is sensed for a predefined time period.
  • the light activation is immediately started at t1.
  • illumination at the intensity level Imax is provided for a predetermined time interval (from t2 until ta).
  • the illumination is switched at time tb to a standby illumination with a lower illumination level Imin.
  • new motion is detected, leading to re-activation of the illumination unit 110 to the higher illumination level Imax for the predefined time interval (from td until t3, possibly being identical to the interval from t2 until ta).
  • the illumination is switched to the standby illumination.
  • the deactivation is started at time t3.
  • deactivation starts, if the illumination unit 110 is manually deactivated.
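  • The activation/deactivation behaviour of fig. 3C could be sketched as a small state machine; the time constants and intensity values below are assumed examples, not values from the patent:

```python
import time

class CorridorLight:
    """Reactive light with a standby level Imin and a full level Imax (cf. fig. 3C)."""

    def __init__(self, i_max=0.8, i_min=0.2, hold_seconds=30.0, standby_seconds=120.0):
        self.i_max, self.i_min = i_max, i_min
        self.hold_seconds = hold_seconds        # time at Imax after the last motion
        self.standby_seconds = standby_seconds  # time at Imin before switching off
        self.last_motion = None

    def on_motion(self, now=None):
        self.last_motion = time.time() if now is None else now

    def level(self, now=None):
        now = time.time() if now is None else now
        if self.last_motion is None:
            return 0.0
        idle = now - self.last_motion
        if idle <= self.hold_seconds:
            return self.i_max       # full illumination while activity persists
        if idle <= self.hold_seconds + self.standby_seconds:
            return self.i_min       # standby illumination
        return 0.0                  # deactivated

light = CorridorLight()
light.on_motion(now=0.0)
print(light.level(now=10.0), light.level(now=60.0), light.level(now=500.0))  # 0.8 0.2 0.0
```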
  • the illumination rules or presets can be set a priori or can be adapted after installation, e.g. by facility management or by the users themselves.
  • the control unit 130 can also consider other additional parameters for adapting the operation characteristics.
  • the luminaire 100 can comprise further sensors, such as a brightness sensor or the like.
  • histograms for the detected speed of motion are shown for the luminaire 100 installed in the desk area and the luminaire 100 installed in the corridor, respectively.
  • the luminaire 100 in the desk area will mainly detect motion with low speed
  • the luminaire 100 installed in the corridor will mainly detect motion with high speed (shown in fig. 4B).
  • the detected speed of motion is quantized into a number of bins, e.g. low velocity, middle velocity and high velocity.
  • the size or number of bins may be adjustable or set during installation.
  • the histograms comprise speed data of the recent past, i.e. recorded during the predefined data recording interval. New detected speed data replace old data in the histogram, so that the histogram is permanently updated. This can be achieved, for instance, using a sliding window algorithm, wherein a window having a predefined width, e.g. corresponding to the predefined data recording interval, is slid over the data recorded over time and only data within the window are considered for the histogram.
  • the control unit 130 of the luminaire 100 can determine a typical activity pattern of the surroundings of the luminaire 100.
  • control unit 130 may also use any statistical parameter determined from the detected activity data for determining the typical activity pattern in the surroundings.
  • the control unit 130 fits the raw activity data, preprocessed activity data or the histogram with a predefined fit function in order to determine the typical activity pattern.
  • the control unit 130 adapts the operation characteristics of the luminaire 100 for obtaining a functional and comfortable illumination behavior. For this, different sets of operation characteristics may be defined for a particular activity pattern, e.g. for a desk lamp, a corridor lamp, an entrance lamp, etc.
  • the operation characteristics can be directly determined from the histogram without prior classification of the area with activity patterns.
  • the definition of a mapping function from the histogram space to the operation characteristics space is required.
  • This mapping function can relate to a simple step function, e.g. desk activity for an average velocity below a certain threshold and corridor activity for an average velocity above this threshold.
  • Another example for a mapping function can be based on the frequency or counts of high and low velocity.
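  • Such a mapping function could be as simple as the following sketch (the velocity threshold and the returned parameter values are assumptions made for illustration):

```python
def characteristics_from_histogram(counts, bin_centers, speed_threshold=0.75):
    """Step-function mapping from histogram space to operation characteristics space:
    average velocity below the threshold -> desk settings, above it -> corridor settings
    (counts of low vs. high velocity could be used in the same way)."""
    total = sum(counts)
    if total == 0:
        return {"max_intensity": 0.2, "turn_off_time_s": 5.0}    # idle defaults
    avg_speed = sum(c * b for c, b in zip(counts, bin_centers)) / total
    if avg_speed < speed_threshold:
        return {"max_intensity": 1.0, "turn_off_time_s": 120.0}  # desk activity
    return {"max_intensity": 0.6, "turn_off_time_s": 10.0}       # corridor activity

print(characteristics_from_histogram([8, 2, 1], bin_centers=[0.25, 1.0, 2.0]))
```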
  • the operation characteristics are determined from the raw or preprocessed activity data themselves without generation of histograms and the like.
  • the operation characteristic of light intensity I can be set depending on the detected occupancy level (i.e. the number of people present in the vicinity) and on the average speed of motion.
  • the mapping function for the light intensity I is a function of these data determined from the sensor data.
  • the control unit 130 can differentiate between times of day. In other words, the control unit 130 determines histograms or typical activity patterns for different times of day. For instance, the activity next to the luminaire 100 in the morning may be different from the activity during lunchtime. Therefore, the control unit 130 is adapted to adjust the operation characteristics of the luminaire 100 based on the activity detected during the respective time of day.
  • the difference between the proposed invention and a luminaire controlled by simple binary presence detection becomes obvious.
  • With binary presence detection, no information about the motion of objects or people is provided.
  • activity patterns can be distinguished for a corridor, where ten people pass by within one minute and stay only for a very limited time under the luminaire 100, and an office, where a person moves on his chair ten times in that minute. Therefore, in contrast to binary presence detection and subsequent illumination control, the present invention provides a luminaire 100 that can learn the typical activity in its surroundings and adapt its operation characteristics accordingly.
  • In step S510, activity data are sensed in the direct vicinity of the luminaire 100. These data are used for generating a history of activity (S520), wherein only activity data detected within the predefined data recording interval are used. Then, the activity data can be quantized in order to generate a histogram (S530). The history of activity data is updated continuously or at predefined time intervals with new activity data (S540). Based on the history of activity, the operation characteristics of the luminaire 100 are adapted (S550). Then, the surroundings of the luminaire 100 can be illuminated based on the adjusted operation characteristics (S560).
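  • The steps S510-S560 could be sketched as a simple control loop; the sensor and luminaire objects and the helper functions below are assumed placeholders (e.g. the history and histogram helpers sketched earlier), not interfaces defined by the patent:

```python
import time

def adaptive_illumination_loop(sensor, luminaire, history, make_histogram,
                               adjust_characteristics, period_s=1.0):
    """Sketch of S510-S560: sense -> update history -> histogram -> adapt -> illuminate."""
    while True:
        activity = sensor.read()                              # S510: sense activity data
        history.add(activity)                                 # S520/S540: maintain history
        histogram = make_histogram(history.values())          # S530: quantize into histogram
        characteristics = adjust_characteristics(histogram)   # S550: adapt operation params
        luminaire.apply(characteristics, activity)            # S560: illuminate accordingly
        time.sleep(period_s)
```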
  • the operation characteristics of the luminaire 100 do not only include activation characteristics of the illumination unit 110 (see fig. 3), but can also relate to illumination properties, such as light color, light temperature, light effects, periodic light amplitude fluctuations, color changes, the direction of the light beam, the shape or size of the illuminated area and the like.
  • the operation characteristics of the activity sensor unit 120 can be adjusted, e.g. the time, in which activity data are taken into account for the adaptation process, a bin size of histograms, a time of activation of the activity sensor unit 120 and the like.
  • Fig. 6 shows a simple block diagram of another embodiment of a system for lighting configuration according to the present invention.
  • the system 600 is capable of learning how an environment such as a small office is used and configuring lighting in the environment, particularly adjusting the lighting settings accordingly.
  • the system 600 may comprise at least one sensor or sensor unit 120 for observing an area of interest 14 (the office environment) and providing spatial information from the area of interest, a processing/control unit 130, for example a computer configured by software to process the spatial information provided by the sensor 120, and at least one controllable light unit or illumination unit 110.
  • the sensor unit 120 may collect data for several illumination units 110, wherein the sensor unit 120 is adapted to sense data with spatial resolution, also referred to as spatial information.
  • the control unit 130 may receive the spatial information from the sensor 120 via a wired or wireless communication connection, for example via a ZigBeeTM connection.
  • the illumination units 110 may be controlled by the control unit 130 also via a wired or wireless communication connection such as a ZigBeeTM communication connection.
  • the sensor 120 of the system 600 is able to deliver spatial information from the area of interest 14, which in general corresponds to the surrounding area of one or more illumination units 110.
  • sensors that can provide spatial information of the observed environment can be used. Examples include cameras, arrays of PIRs as described in "Video Scene Understanding Using Multi-scale Analysis", Yang, Y., Liu, J. and Shah, M. 2009, IEEE International Conference on Computer Vision, ultra-sound radar arrays, microphone arrays, thermopile arrays etc.
  • the sensor 120 collects measurements of the environment 14.
  • the processing unit 130, using data mining techniques, automatically learns partitions of the observed space 14 that reflect the usage that is made of the environment. Based on this knowledge, the control unit 130 can adapt the type of illumination (shape, color, intensity etc.) created by the illumination units 110 according to the way the different areas are typically used.
  • usage patterns are determined using statistical learning for a predefined time interval, collecting simple and robust features for a long time. These long-term observations allow building a high-level segmentation of the environment, which was found to be tightly related to the activities carried out in the different zones. This allows for a robust and flexible lighting configuration system that is able to provide the right amount and quality of light given the situation, allowing energy saving (e.g. more light on the desk than in the corridor) and a better fit to the users' needs.
  • the statistical learning can be done only once, after the system is installed, or can be updated (continuously, periodically or on certain occasions, e.g. when a room is refurbished) so that behavior and usage changes can be automatically taken into account.
  • Fig. 7 shows in a top view a small office 14 as an example of an environment (area of interest), which can be observed with the system 600 according to this embodiment of the invention.
  • two desks 143 with computers and chairs 144 for office workers are located in the left part of the office 14.
  • office furniture such as shelves 145 and the door 146 to the office are located in the right part of the office 14.
  • a camera 147 is embedded in the ceiling, which is configured to capture images of the entire office environment and to transmit the captured images to the control unit of the system. After enough information is collected (typically a few hours), the system 600 is able to "understand" which areas of the office 14 are used and how, and may adapt the light settings in the room accordingly.
  • the “understanding” is performed by applying algorithms as will be described later in detail.
  • the system "recognizes” a desk area or partition 141 with slow motion (dashed-dotted box), and lights are consequently set to be for example intense and slowly reactive to changes.
  • the system 600 also automatically “learns” the area or partition 142 with fast motion where people walk (dashed box) and illuminates it for example with reactive light only when users are detected there.
  • the rest of the room is not used and thus illumination can be, for example, low and diffuse.
  • These illumination "rules” can be set a priori or adapted, after the installation, by facility management or by the users themselves.
  • the system extracts context information from the environment, which can be used in two different ways:
  • Context information can be fed to presence detection systems to improve their performances, e.g. lowering detection threshold in areas where activity has typically occurred more frequently, as in the partitions 141 and 142 in Fig. 7.
  • Given the presence of a user (e.g. detected by the presence detection unit), the lighting system can activate the appropriate lighting according to the learned context information.
  • the system is simple because it does not require the modeling of human activities and their reliable detection in unconstrained environments.
  • information extracted from long-term observations of the environment can be used for applications that are not directly related to lighting.
  • usage patterns can be used to provide statistics of people behavior in shops or public spaces for marketing purposes or security, or to automatically extract abnormal events in video surveillance systems.
  • Vision sensors provide accurate spatial information with only one sensor, and are already deployed in smart lamps to sense light levels and occupancy levels, as in the Philips Mini300 LED luminaire.
  • the sensor can be embedded in a light fixture or can be installed as a separate module with the lighting infrastructure, for example in the ceiling.
  • the algorithm is implemented in the control unit 130 of the system 600.
  • the control unit 130 may be for example implemented by a computer with interfaces for receiving spatial information from the sensor 120 and for controlling the illumination units 110.
  • the interfaces may be for example implemented by ZigBeeTM or any other modules suitable for communication in a lighting system.
  • the algorithm may be implemented as part of a program for controlling and configuring a lighting system.
  • Fig. 8 shows in a flowchart the steps S10-S16 of an embodiment of the algorithm.
  • In step S10, spatial information, namely images captured with a vision sensor from the area of interest 14, is received.
  • In step S12, patterns in the area of interest 14 are statistically learned from the received spatial information.
  • In step S14, the area of interest 14 is then spatially segmented into two partitions 141 and 142 based on the statistically learned patterns.
  • In step S16, information for lighting configuration is generated depending on the spatial segmentation.
  • the generated information can for example be displayed on a monitor coupled with the control unit 130, so that a user may configure the lighting system by using this information, or the generated information can configure the lighting in the area of interest 14 by controlling the illumination units 110.
  • the statistically learned patterns are analyzed for partitions 141, 142 of the area of interest 14 with different features (step S121 in Fig. 9) and the area of interest 14 is then spatially segmented depending on the analyzed partitions with different features (step S141 in Fig. 9).
  • Fig. 10 An embodiment of the analysis in step S121 and of the spatially segmenting in step S14 is shown in Fig. 10:
  • images of the area of interest 14 are divided into blocks, and then very simple video features for each block can be computed (in Fig. 7, the division of an image into blocks is shown by a grid over the image of the entire office 14).
  • Features can include average motion detection such as slow or fast motion, direction of motion such as, for example, from the door 146 to a desk 143, speed of motion, detection of a change such as, for example, the change of the location of a chair 144, color etc.
  • the advantage of using such simple features is that they are very basic, making the system flexible and general, and easy and robust to compute.
  • Histograms provide a very compact representation of data, while capturing the main statistical properties of the observed phenomena.
  • a 3-bin histogram for motion (no motion, slow motion, fast motion) and a 2-bin histogram for change detection (no change detected, change detected) can be built (steps S12122 and S12123 in Fig. 11) in the following way: for each image block (x,y), the corresponding count is incremented whenever the respective feature is observed, e.g. ChangeDetection(x,y) = ChangeDetection(x,y) + 1 when a change is detected in block (x,y), and analogously for the SlowMotion and FastMotion counts.
  • FastMotion and ChangeDetection can be visualized as 2D density maps and each point in these histograms represents one image block.
  • the size of these density maps is HeightImage/8 × WidthImage/8.
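  • A sketch of how such per-block density maps could be accumulated from consecutive frames (block size 8×8 as in the text; the motion thresholds and the simple change test are assumptions made for illustration):

```python
import numpy as np

BLOCK = 8  # density maps have size HeightImage/8 x WidthImage/8

def block_means(diff, block=BLOCK):
    """Average absolute frame difference per 8x8 image block."""
    h, w = diff.shape
    cropped = diff[:h // block * block, :w // block * block]
    return cropped.reshape(h // block, block, w // block, block).mean(axis=(1, 3))

def update_density_maps(prev_frame, frame, slow_motion, fast_motion, change_detection,
                        slow_thr=2.0, fast_thr=10.0):
    """Increment the SlowMotion, FastMotion and ChangeDetection counts per block."""
    diff = block_means(np.abs(frame.astype(float) - prev_frame.astype(float)))
    slow_motion += (diff >= slow_thr) & (diff < fast_thr)
    fast_motion += diff >= fast_thr
    change_detection += diff >= slow_thr   # very simplistic change test, for illustration
    return slow_motion, fast_motion, change_detection

h, w = 120, 160
maps = [np.zeros((h // BLOCK, w // BLOCK), dtype=int) for _ in range(3)]
prev = np.zeros((h, w), dtype=np.uint8)
frame = np.random.randint(0, 255, (h, w), dtype=np.uint8)
maps = update_density_maps(prev, frame, *maps)
```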
  • In step S1411, blocks with the same characteristics are grouped together.
  • clustering methods can be used, such as hierarchical clustering (Hastie, T., Tibshirani, R. and Friedman, J. H., The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Springer, 2009; Ward, J. H., Hierarchical grouping to optimize an objective function, Journal of the American Statistical Association, Vol. 58, No. 301, 1963, pp. 236-244) or diffusion maps (Coifman, R. R. and Lafon, S., Diffusion maps, Applied and Computational Harmonic Analysis, Vol. 21, 2006, pp. 5-30).
  • image blocks may be clustered based on the similarity of the features' histograms using the minimal variance hierarchical clustering method (Ward, J. H., Hierarchical grouping to optimize an objective function, Journal of the American Statistical Association, Vol. 58, No. 301, 1963, pp. 236-244).
  • the number of clusters does not have to be specified in advance.
  • the number of clusters can be automatically computed in such a way that the average solidity (area/convex area) of all connected regions in the labeled image is maximal. This is done to ensure spatial compactness of the clusters.
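  • A sketch of the block clustering with the cluster count chosen by average solidity; it relies on SciPy and scikit-image and is one possible realization assumed for illustration, not the patented implementation:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from skimage.measure import label, regionprops

def segment_blocks(feature_histograms, map_shape, max_clusters=8):
    """feature_histograms: (n_blocks, n_features) per-block histogram features;
    map_shape: (rows, cols) of the density maps. Returns a label image (activity map)."""
    link = linkage(feature_histograms, method="ward")   # minimal variance clustering
    best_labels, best_score = None, -1.0
    for k in range(2, max_clusters + 1):
        labels = fcluster(link, t=k, criterion="maxclust").reshape(map_shape)
        # Average solidity (area / convex area) of all connected regions of the labeling.
        solidities = [region.solidity
                      for cluster_id in np.unique(labels)
                      for region in regionprops(label(labels == cluster_id))]
        score = float(np.mean(solidities))
        if score > best_score:                           # keep the most compact labeling
            best_labels, best_score = labels, score
    return best_labels

rows, cols = 15, 20
features = np.random.rand(rows * cols, 5)   # stand-in for the per-block histograms
activity_map = segment_blocks(features, (rows, cols))
```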
  • an analysis of an office room such as the one shown in Fig. 7 found essentially 4 clusters. These clusters are shown in Fig. 7 and are denoted with reference numerals 147-150.
  • the segmentation of the environment reflects the function of different regions: non-moving areas (147), walking areas (148), door/corridor (149) and desk areas (170).
  • the spatial grouping (partitions 141 and 142) of similarly behaving image regions is obtained without using information about the blocks' positions in the image. This labeling of the observed scene is clearly related to the usage that is made of the room. This image may be referred to as an activity map.
  • the activity map of the room can be used in different ways:
  • the environment shown in Fig. 7 can be automatically subdivided into these pre-defined areas (partitions 141 and 142) after a few hours of observation.
  • the 4 clusters in Fig. 7 can be reduced to 3 clusters, where the door 149 and the walking areas 148 are merged (they are both classified as walking area according to rule (b) above).
  • the third cluster, with the regions where nothing happens, is not denoted or shown in Fig. 7. For each of these areas, a predefined light setting is stored in the system and is applied to the corresponding part of the room.
  • the clusters are areas where similar activities are carried out and that are described in a language (feature representation) that is understandable for the computer.
  • a user or facility manager can select one cluster (e.g. the desk area 170 in Fig. 7) and define some lighting setting for it (e.g. intense light for reading). Then all regions observed by the system that are similar in activity can be automatically set to the same lighting setting.
  • the proposed method translates complex semantic concepts (area where people walk, area where people work etc.) into commands that are easily understandable for the lighting system.
  • a clear advantage of this approach is that, after the facility management has defined lighting for a certain activity cluster, any new zone behaving similarly will be automatically assigned to the same light setting.
  • a luminaire, a system and a method for smart reactive illumination wherein a sensor unit senses activity data in the surroundings of the luminaire (unit) or of an illumination unit, a control unit generates a history of the activity data sensed for a predetermined time and adjusts operation characteristics of luminaire (unit) or of the illumination unit based on the history of activity data for illuminating the surroundings.
  • a luminaire, system and method for adaptive illumination are provided capable of flexible, autonomous and automatic adaptation of operation characteristics (e.g. how quickly illumination is turned on/off, length of activation time, maximum intensity, shape of light beam, etc.) depending on typical speed patterns observed in the vicinity of the luminaire or luminaire unit for a certain amount of time.
  • a surrounding area can be illuminated with increased energetic efficiency, higher flexibility, improved user convenience and operation comfort, the illumination being autonomously adapted according to changing requirements.
  • the luminaires, systems and methods for adaptive illumination according to the present invention can be used for any professional or consumer illumination, e.g. in cafeterias, libraries, museums, office buildings, common spaces, corridors, and private homes in order to improve the lighting configuration, particularly to better adapt lighting to the activities in regions with different activities of an environment.
  • the invention can also be used to assist users in lighting configuration, for example by means of a computer executing a computer program implementing the present invention and processing spatial information of an environment collected over a long time, such as images captured with a video camera installed in the ceiling of a shop, covering the activity in the shop over one or more days. Shop personnel and lighting designers can then obtain lighting configuration information as output of the program, which can be visualized and makes it possible to recognize spatial segments or partitions of the environment with different activity.
  • At least some of the functionality of the invention may be performed by hardware or software.
  • a single or multiple standard microprocessors or microcontrollers may be used to process a single or multiple algorithms implementing the invention.

Landscapes

  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

For illuminating a surrounding area with increased energy efficiency, higher flexibility and improved user convenience, the illumination being autonomously adapted to changing requirements, the invention relates to a luminaire, a system and a method for adaptive illumination, wherein an activity sensor unit (120) senses activity data in the surroundings of an illumination unit (110), and a control unit (130) generates a history of the activity data sensed for a predetermined time and adjusts operation characteristics of the illumination unit (110) based on the history of activity data for illuminating the surroundings.
PCT/IB2011/054730 2011-04-04 2011-10-24 Adaptive illumination WO2012137046A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP11161030.9 2011-04-04
EP11161030 2011-04-04

Publications (1)

Publication Number Publication Date
WO2012137046A1 (fr)

Family

Family ID: 44925600

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2011/054730 WO2012137046A1 (fr) 2011-04-04 2011-10-24 Adaptive illumination

Country Status (1)

Country Link
WO (1) WO2012137046A1 (fr)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2919562A1 2014-03-11 2015-09-16 Helvar Oy Ab Learning luminaire and learning control device for a luminaire
WO2016122771A1 2015-01-28 2016-08-04 Abl Ip Holding Llc Auto-discovery of neighbor relationships and autonomous mapping of lighting installations by means of visible light communication
EP3064042A1 2013-10-29 2016-09-07 CP Electronics Limited Apparatus for controlling an electrical load
US9730293B2 (en) 2013-07-02 2017-08-08 Philips Lighting Holding B.V. Method and apparatus for conveying aggregate presence information using light
WO2017182512A1 2016-04-22 2017-10-26 Philips Lighting Holding B.V. Retail lighting system
WO2017207276A1 2016-05-30 2017-12-07 Philips Lighting Holding B.V. Lighting control
DE102016222471A1 2016-11-16 2018-05-17 Tridonic Gmbh & Co Kg System and method for creating presence profiles for building control
EP3326080A4 2015-07-23 2018-07-18 Digital Lumens Incorporated Intelligent lighting systems and methods for monitoring, analysis and automation of the built environment
LU100351B1 (en) * 2017-07-28 2019-02-12 Titian Touch Sarl Sensor system and apparatus
RU2704309C2 (ru) * 2014-09-29 2019-10-28 Филипс Лайтинг Холдинг Б.В. Системы и способы для управления освещением
US10531539B2 (en) 2016-03-02 2020-01-07 Signify Holding B.V. Method for characterizing illumination of a target surface
WO2020254227A1 (fr) * 2019-06-18 2020-12-24 Signify Holding B.V. Dispositif d'éclairage pour éclairer un environnement et procédé de commande d'un dispositif d'éclairage
EP3846591A1 (fr) * 2019-12-30 2021-07-07 Helvar Oy Ab Commande d'éclairage
EP3879936A1 (fr) * 2020-03-11 2021-09-15 Tridonic GmbH & Co KG Procédé de classification fonctionnelle de luminaires
CN113608459A (zh) * 2021-07-09 2021-11-05 佛山电器照明股份有限公司 光环境智能调控方法、光环境智能调控系统及设备
EP3972391A1 (fr) * 2020-09-18 2022-03-23 Helvar Oy Ab Modélisation de caractéristiques environnementales sur la base de données de capteurs pouvant être obtenues dans un système d'éclairage
CN114980443A (zh) * 2022-06-15 2022-08-30 安徽领电智能科技有限公司 一体化智能照明控制系统
CN115835453A (zh) * 2022-12-30 2023-03-21 东莞锐视光电科技有限公司 调节光源光线参数的方法、装置、介质及电子设备
CN116600448A (zh) * 2023-05-29 2023-08-15 深圳市帝狼光电有限公司 一种壁挂灯的控制方法、控制装置及壁挂灯
WO2024060138A1 (fr) * 2022-09-22 2024-03-28 Tridonic Gmbh & Co Kg Système de commande d'équipement d'éclairage, dispositif de commande et procédé associé

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004049767A1 (fr) * 2002-11-22 2004-06-10 Koninklijke Philips Electronics N.V. Système et procédé permettant de commander une source de lumière et dispositif d'éclairage associé
WO2007072285A1 (fr) * 2005-12-19 2007-06-28 Koninklijke Philips Electronics N. V. Procédé et appareil de commande d'éclairage
US20080265799A1 (en) 2007-04-20 2008-10-30 Sibert W Olin Illumination control network
WO2011007299A1 (fr) * 2009-07-15 2011-01-20 Koninklijke Philips Electronics N.V. Automatisation d'éclairage adaptée à l'activité

Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
BLEI, D. M., NG, A. Y., JORDAN, M. I.: "Latent Dirichlet Allocation", JOURNAL OF MACHINE LEARNING RESEARCH, vol. 3, 2003, pages 993 - 1022, XP002427366, DOI: doi:10.1162/jmlr.2003.3.4-5.993
COIFMAN, R. R., LAFON, S.: "Diffusion maps", APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, vol. 21, 2006, pages 5 - 30
DE HAAN, G., BIEZEN, P.: "Sub-pixel motion estimation with 3-D recursive search block-matching", SIGNAL PROCESSING: IMAGE COMMUNICATION, vol. 6, 1994, pages 229 - 239, XP000451927, DOI: doi:10.1016/0923-5965(94)90027-2
HASTIE, T., TIBSHIRANI, R., FRIEDMAN, J. H.: "The Elements of Statistical Learning: Data Mining, Inference, and Prediction", 2009, SPRINGER
TEH, Y. W., JORDAN, M. I., BEAL, M. J., BLEI, D. M.: "Hierarchical Dirichlet Processes", JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, vol. 101, no. 476, 2006, pages 1566 - 1581
WARD, J. H.: "Hierarchical Grouping to Optimize an Objective Function", JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, vol. 58, no. 301, 1963, pages 236 - 244
YANG, Y., LIU, J., SHAH, M.: "Video Scene Understanding Using Multi-scale Analysis", IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION, 2009

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9730293B2 (en) 2013-07-02 2017-08-08 Philips Lighting Holding B.V. Method and apparatus for conveying aggregate presence information using light
EP3064042A1 (fr) * 2013-10-29 2016-09-07 CP Electronics Limited Appareil de régulation d'une charge électrique
CN104918356A (zh) * 2014-03-11 2015-09-16 赫尔瓦有限公司 学习照明器和照明器的学习控制设备
CN104918356B (zh) * 2014-03-11 2020-04-03 赫尔瓦有限公司 学习照明器和照明器的学习控制设备
EP2919562A1 (fr) * 2014-03-11 2015-09-16 Helvar Oy Ab Luminaire d'apprentissage et dispositif de commande d'apprentissage pour un luminaire
RU2704309C2 (ru) * 2014-09-29 2019-10-28 Филипс Лайтинг Холдинг Б.В. Системы и способы для управления освещением
US9806810B2 (en) 2015-01-28 2017-10-31 Abl Ip Holding Llc Auto-discovery of neighbor relationships and lighting installation self-mapping via visual light communication
WO2016122771A1 (fr) * 2015-01-28 2016-08-04 Abl Ip Holding Llc Auto-découverte de relations voisines et cartographie autonome d'installations d'éclairage au moyen d'une communication en lumière visible
US9998219B2 (en) 2015-01-28 2018-06-12 Abl Ip Holding Llc Auto-discovery of neighbor relationships and lighting installation self-mapping via visual light communication
US10193626B2 (en) 2015-01-28 2019-01-29 Abl Ip Holding Llc Auto-discovery of neighbor relationships and lighting installation self-mapping via visual light communication
EP3326080A4 (fr) * 2015-07-23 2018-07-18 Digital Lumens Incorporated Systèmes et procédés d'éclairage intelligent pour la surveillance, l'analyse et l'automation de l'environnement bâti
US10531539B2 (en) 2016-03-02 2020-01-07 Signify Holding B.V. Method for characterizing illumination of a target surface
WO2017182512A1 (fr) * 2016-04-22 2017-10-26 Philips Lighting Holding B.V. Système d'éclairage de vente au détail
US11330693B2 (en) 2016-05-30 2022-05-10 Signify Holding B.V. Illumination control
WO2017207276A1 (fr) * 2016-05-30 2017-12-07 Philips Lighting Holding B.V. Commande d'éclairage
DE102016222471A1 (de) * 2016-11-16 2018-05-17 Tridonic Gmbh & Co Kg System und Verfahren zur Erstellung von Anwesenheitsprofilen für die Gebäudesteuerung
CN111279798B (zh) * 2017-07-28 2022-09-09 悌薰科技股份有限公司 感测器设备及其安装方法、系统及其使用方法
WO2019034389A1 (fr) * 2017-07-28 2019-02-21 Titian Touch Sàrl Système et appareil capteur
EP3659405B1 (fr) * 2017-07-28 2024-01-31 Titian Tech Inc. Système de capteur et appareil
US10928054B2 (en) 2017-07-28 2021-02-23 Titian Tech Inc. Sensor system and apparatus
CN111279798A (zh) * 2017-07-28 2020-06-12 悌薰科技股份有限公司 感测器系统及感测器设备
AU2018317644B2 (en) * 2017-07-28 2021-07-08 Titian Tech Inc. Sensor system and apparatus
AU2018317644B9 (en) * 2017-07-28 2021-08-19 Titian Tech Inc. Sensor system and apparatus
LU100351B1 (en) * 2017-07-28 2019-02-12 Titian Touch Sarl Sensor system and apparatus
WO2020254227A1 (fr) * 2019-06-18 2020-12-24 Signify Holding B.V. Dispositif d'éclairage pour éclairer un environnement et procédé de commande d'un dispositif d'éclairage
EP3846591A1 (fr) * 2019-12-30 2021-07-07 Helvar Oy Ab Commande d'éclairage
WO2021180468A1 (fr) * 2020-03-11 2021-09-16 Tridonic Gmbh & Co Kg Procédé de classification fonctionnelle de luminaires
EP3879936A1 (fr) * 2020-03-11 2021-09-15 Tridonic GmbH & Co KG Procédé de classification fonctionnelle de luminaires
US20230100783A1 (en) * 2020-03-11 2023-03-30 Tridonic Gmbh & Co Kg Method for functional classification of luminaires
EP3972391A1 (fr) * 2020-09-18 2022-03-23 Helvar Oy Ab Modélisation de caractéristiques environnementales sur la base de données de capteurs pouvant être obtenues dans un système d'éclairage
CN113608459A (zh) * 2021-07-09 2021-11-05 佛山电器照明股份有限公司 光环境智能调控方法、光环境智能调控系统及设备
CN114980443A (zh) * 2022-06-15 2022-08-30 安徽领电智能科技有限公司 一体化智能照明控制系统
WO2024060138A1 (fr) * 2022-09-22 2024-03-28 Tridonic Gmbh & Co Kg Système de commande d'équipement d'éclairage, dispositif de commande et procédé associé
CN115835453A (zh) * 2022-12-30 2023-03-21 东莞锐视光电科技有限公司 调节光源光线参数的方法、装置、介质及电子设备
CN116600448A (zh) * 2023-05-29 2023-08-15 深圳市帝狼光电有限公司 一种壁挂灯的控制方法、控制装置及壁挂灯
CN116600448B (zh) * 2023-05-29 2024-02-13 深圳市帝狼光电有限公司 一种壁挂灯的控制方法、控制装置及壁挂灯

Similar Documents

Publication Publication Date Title
WO2012137046A1 (fr) Éclairage adaptatif
US9367925B2 (en) Image detection and processing for building control
US11132881B2 (en) Electronic devices capable of communicating over multiple networks
US9915416B2 (en) Method, apparatus, and system for occupancy sensing
US20170347432A1 (en) Learning capable lighting equipment
US10412811B1 (en) Electronic devices for controlling lights
US11252378B1 (en) Batteryless doorbell with rectified power delivery
US10593174B1 (en) Automatic setup mode after disconnect from a network
US11341825B1 (en) Implementing deterrent protocols in response to detected security events
US11039520B1 (en) Electronic devices for controlling lights
US10803719B1 (en) Batteryless doorbell with energy harvesters
JP2017522692A (ja) 占有センシングスマート照明システム
US11501618B1 (en) Security device with user-configurable motion detection settings
US11164435B1 (en) Audio/video recording and communication doorbell devices with supercapacitors
US10769909B1 (en) Using sensor data to detect events
US10791607B1 (en) Configuring and controlling light emitters
US10943442B1 (en) Customized notifications based on device characteristics
CN117354986A (zh) 一种多功能led灯珠的智能控制方法及系统
US11412189B1 (en) Batteryless doorbell with multi-load power delivery
CN109844825B (zh) 存在检测系统和方法
KR20150119585A (ko) 객체 인식을 이용한 조명 제어 시스템 및 조명 제어 방법
US11423762B1 (en) Providing device power-level notifications
US11163097B1 (en) Detection and correction of optical filter position in a camera device
US11644191B2 (en) NIR motion detection system and method
US12014611B1 (en) Temporal motion zones for audio/video recording devices

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 11781646

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry in European phase

Ref document number: 11781646

Country of ref document: EP

Kind code of ref document: A1