WO2017108408A1 - Sensor system
- Publication number: WO2017108408A1
- Application: PCT/EP2016/080097
- Authority: WIPO (PCT)
- Prior art keywords: sensor, time, sensor unit, person, data
Classifications
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05B—ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
- H05B47/00—Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
- H05B47/10—Controlling the light source
- H05B47/105—Controlling the light source in response to determined parameters
- H05B47/11—Controlling the light source in response to determined parameters by determining the brightness or colour temperature of ambient light
- H05B47/115—Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
- H05B47/125—Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02B—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
- Y02B20/00—Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
- Y02B20/40—Control techniques providing energy savings, e.g. smart controller or presence detection
Definitions
- the present invention relates to synchronization of a sensor system.
- a lighting system for illuminating an environment may comprise one or more luminaires, each of which comprises one or more lamps that emit illumination into the environment, plus any associated socket, housing or support.
- Each lamp may take any suitable form, for example an LED-based lamp comprising one or more LEDs, or a filament bulb, gas discharge lamp, etc.
- Such luminaires may be inter-connected so as to form a lighting network.
- a gateway, such as a lighting bridge, may be connected to the network.
- the gateway can be used to communicate control signals via the network to each of the luminaires, for example from a general-purpose computer device such as a smartphone, tablet or laptop connected to the gateway.
- the lighting network may have a mesh topology, whereby the luminaires themselves act as relays within the lighting network, relaying control signals between the gateway and other luminaires in the network.
- the network may have a star topology, whereby luminaires communicate with the gateway "directly", i.e. without relying on other luminaires to relay the control signals (though possibly via other dedicated network components).
- the network can have any suitable network topology, e.g. based on a combination of star-like and mesh-like connections.
- the lighting network may for example operate in accordance with one of the ZigBee protocols, while the computer device connects to the gateway via another protocol such as Wi-Fi.
- the luminaires or the lighting system may also be equipped with sensor mechanisms.
- such sensor mechanisms have historically been relatively unsophisticated, e.g. simple passive infra-red (PIR) motion sensors.
- More modern lighting systems can incorporate sensors into the lighting network, so as to allow the aggregation of sensor data from multiple sensors in the environment. Using suitable sensors, this allows the luminaires to share information on, say, occupancy, activity patterns, changes in temperature or humidity, daylight levels, etc.
- These sensor signals may be communicated via the lighting network to the gateway, thereby making them available to the (or a) computer device connected to the gateway.
- Such sensors have also been used in a lighting system to extract information relating to people in the area covered by the lighting system.
- people counting techniques have been utilised to generate a count of people in the area based on the aggregation of sensor data from individual image capture devices.
- the ability to detect a count of people over a particular area has a number of applications, such as space optimization, planning and maintenance, HVAC control, and data-analytics-driven marketing.
- for such applications, a people count is needed as one of the input data for analysis.
- a count of people in (pseudo) real time may be desired to identify temporal and spatial usage patterns.
- the present invention allows the outputs of the sensors to be synchronized, thereby ensuring that accurate information can be derived from those outputs.
- a first aspect of the present invention is directed to a method of synchronizing first and second sensor units of a sensor system, the method comprising implementing, by a synchronisation system of the sensor system external to the sensor units, the following steps: receiving from the first sensor unit: at least one first measurement generated at the first sensor unit, and a timestamp of that measurement generated at the first sensor unit based on a first clock signal available thereat; receiving from the second sensor unit: a plurality of second measurements generated at the second sensor unit, and for each of the second measurements a timestamp of that measurement generated at the second sensor unit based on a second clock signal available thereat; comparing the first measurement with each of the plurality of second measurements to identify which of the second measurements has a maximum correlation with the first measurement; determining a timing offset between the first and second clock signals by determining a difference between the timestamp of the first measurement generated at the first sensor unit and the timestamp of the identified second measurement having the maximum correlation with the first measurement generated at the second sensor unit; receiving from the first sensor unit at least one subsequent first measurement and its timestamp, and from the second sensor unit a plurality of subsequent second measurements and their timestamps; and using the determined timing offset to determine which of the subsequent second measurements was performed substantially simultaneously with the subsequent first measurement, by determining that the timestamp of that subsequent second measurement differs from the timestamp of the subsequent first measurement by substantially the determined timing offset.
- the synchronization of the first aspect is entirely passive, in the sense that it is based entirely on measurements from the sensor units without requiring any special communications between the sensor units and the synchronization system outside of the normal operation of the sensor units. That is, the first aspect can be implemented without any additional signalling overhead to the sensors within the sensor system.
- the synchronization of the subsequent measurements is not achieved by adjusting the first or second sensor units (which would require communication from the external synchronization system to the sensor units, and thus additional signalling overhead), and in particular is not achieved by adjusting how they apply their respective timestamps - rather, the first and second units continue to output timestamps that are "inconsistent" (in the sense that the timing offset persists, such that the timestamp of the first measurement differs from that of the substantially simultaneous second measurement by substantially the timing offset), and this inconsistency is accounted for externally at the external synchronization system based on the earlier determination of the timing offset at the external synchronization system.
- a second aspect of the present invention is directed to a person detection sensor unit comprising: a communications interface; a sensor device configured to capture over time sensor data from an area covered by the sensor device; a processor configured to implement the following steps: detecting in a first portion of the sensor data captured at a first time a predetermined synchronization code, and measuring the first time based on a clock signal available at the sensor unit; detecting in a second portion of the sensor data captured at a second time, at least one person present in the area, and measuring the second time based on the clock signal; based on said detection at the sensor unit of the at least one person, generating from the second portion of the sensor data presence data pertaining to the detected at least one person; and outputting via the communications interface the presence data for the second time and associated timing data, which conveys the second time as measured at the sensor unit relative to the first time as measured at the sensor unit.
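To make the relative-timing idea of the second aspect concrete, here is a minimal Python sketch of a sensor unit that reports each detection time relative to the last detected synchronization code. It is illustrative only: the class and field names are hypothetical, `time.monotonic()` stands in for the unit's local clock signal, and `tx` stands in for the communications interface.

```python
import time

class PersonDetectionSensorUnit:
    """Minimal sketch of the sensor-unit behaviour of the second aspect.
    The host monotonic clock stands in for the local clock signal, and
    `tx` for the communications interface; both are assumptions."""

    def __init__(self, tx):
        self.tx = tx
        self.sync_time = None  # "first time": when the sync code was seen

    def on_sync_code_detected(self):
        # the first portion of sensor data contained the synchronization code
        self.sync_time = time.monotonic()

    def on_person_detected(self, presence_data):
        # the second portion of sensor data contained at least one person
        if self.sync_time is None:
            return  # no synchronization reference yet; skip or buffer
        second_time = time.monotonic()
        # the timing data conveys the second time *relative* to the first
        # time, so the receiver need not share this unit's absolute clock
        self.tx.send({
            "presence": presence_data,
            "time_since_sync": second_time - self.sync_time,
        })
```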
- the second aspect can provide highly accurate synchronization and, when sensor units according to the second aspect are connected in a sensor network, allows time differences due to the transport of the message over the sensor network to be accounted for. Moreover, although a dedicated synchronization code is used, it is communicated in a manner that is detectable by the sensor device of the sensor units, and thus does not create any additional signalling overhead within the sensor system.
- the first and second aspects can be combined such that synchronization of timestamps is performed locally at the sensor units (according to the second aspect) and measurements are additionally synchronized externally (according to the first aspect).
- a plurality of first measurements may be received from the first sensor unit at the synchronization system, along with, for each of the first measurements, a timestamp of that measurement generated at the first sensor unit based on the first clock signal.
- a correlation may be determined for each of a plurality of time difference values, by applying a correlation function to the first and second measurements for that time difference value, the determined time offset between the clock signals corresponding to the difference value for which the determined correlation is maximised.
- "maximum" in relation to a correlation means most correlated. Depending on how the correlation function is defined, this may correspond to a maximum value of the correlation function (e.g. for a product-based correlation) or to a minimum value (e.g. for a difference-based metric such as a sum of absolute differences).
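As a concrete illustration of the product-based correlation search described in the preceding paragraphs, the following Python sketch scans candidate offsets and keeps the one with the greatest mean product. It is a minimal sketch under assumptions not in the patent: scalar measurement series already resampled onto a common, regular time grid, and an arbitrary search range; `estimate_offset` and its parameters are hypothetical names.

```python
import numpy as np

def estimate_offset(first, second, max_offset=300):
    """Return the candidate offset (in samples) for which the product-based
    correlation between the two measurement series is maximised."""
    best_d, best_corr = 0, -np.inf
    n = min(len(first), len(second))
    for d in range(-max_offset, max_offset + 1):
        if abs(d) >= n:
            continue  # no overlap between the series at this offset
        # align first[t] with second[t + d], keeping only the overlapping part
        if d >= 0:
            a, b = first[:n - d], second[d:n]
        else:
            a, b = first[-d:n], second[:n + d]
        corr = float(np.dot(a, b)) / len(a)  # mean of element-wise products
        if corr > best_corr:
            best_d, best_corr = d, corr
    return best_d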
- the first measurement and the second measurements may pertain to an area of overlapping sensor coverage between the first and second sensor units.
- Each of the first and second measurements may comprise a respective measured location of a person detected in an area covered by the first and second sensor units respectively.
- the locations measured by both sensor units may be in the area of overlapping sensor coverage.
- Each of the first and second measurements may comprise a respective light level measured over all or part of an area covered by the first and second sensor units respectively.
- the light levels may be measured by both sensor units across the area of overlapping sensor coverage.
- the first measurement may pertain to only a part of the area covered by the first sensor unit, and each of the second measurements pertains to only a part of the area covered by the second sensor unit.
- the first measurement may be compared with each of the plurality of second measurements by multiplying the first measurement with that second measurement.
- a plurality of first measurements may be received from the first sensor unit, each with a timestamp of that measurement generated at the first sensor unit based on the first clock signal, and the comparing step may comprise determining a correlation for each of a plurality of difference values by: for each of the first measurements, multiplying that first measurement with the second measurement whose timestamp corresponds to the timestamp of that first measurement offset by that difference value, the determined time offset between the clock signals corresponding to the difference value for which the determined correlation is maximised.
- the comparing step may comprise determining a correlation for each of a plurality of difference values by: multiplying the first measurement with the second measurement whose timestamp corresponds to the timestamp of the first measurement offset by that difference value, the determined time offset between the clock signals corresponding to the difference value for which the determined correlation is maximised.
- a plurality of first measurements may be received from the first sensor unit, each with a timestamp of that measurement generated at the first sensor unit based on the first clock signal, wherein the correlation for each of a plurality of candidate timing offsets may be determined by: for each of the second measurements, comparing that second measurement with the first measurement whose timestamp corresponds to the timestamp of that second measurement offset by that candidate timing offset.
- a plurality of first measurements may be received from the first sensor unit at the synchronization system, each with a timestamp of that measurement generated at the first sensor unit based on the first clock signal, wherein the comparing step may comprise determining a correlation for each of a plurality of difference values by: determining a sum of differences between each of the first measurements and the second measurement whose timestamp corresponds to the timestamp of that first measurement offset by that difference value.
- the sum of differences may be a sum of absolute or squared differences.
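A corresponding sketch for the difference-based variant, under the same assumptions as the earlier correlation sketch; here the best offset minimises the sum of absolute (or squared) differences, which is the "minimum value of the correlation function" case noted above. Names and the search range are again assumptions.

```python
import numpy as np

def estimate_offset_sad(first, second, max_offset=300, squared=False):
    """Return the offset (in samples) minimising the mean absolute (or
    squared) difference between the aligned series."""
    best_d, best_cost = 0, float("inf")
    n = min(len(first), len(second))
    for d in range(-max_offset, max_offset + 1):
        if abs(d) >= n:
            continue  # no overlap at this offset
        a, b = (first[:n - d], second[d:n]) if d >= 0 else (first[-d:n], second[:n + d])
        diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
        cost = float(np.sum(diff ** 2 if squared else np.abs(diff))) / len(a)
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```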
- the method may comprise using the determined timing offset to account for inconsistencies in timestamps pertaining to subsequent measurements by the first and second sensor units when the sensor system is in use.
- the method may comprise estimating a people count for a desired area, which comprises a total area covered by the first and second sensor units, based on the subsequent measurements, their timestamps and the determined timing offset.
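The following sketch illustrates one plausible way the synchronization system might use the determined offset to pair subsequent first and second measurements before, e.g., estimating a people count over the total covered area. The function, report format and tolerance are assumptions, not taken from the patent.

```python
def pair_simultaneous(first_reports, second_reports, offset, tolerance=0.5):
    """first_reports/second_reports: lists of (timestamp, measurement) from
    the two sensor units; offset and tolerance in the same time units.
    Pairs each first measurement with a second measurement whose timestamp
    differs from it by substantially the determined offset."""
    pairs = []
    for t1, m1 in first_reports:
        for t2, m2 in second_reports:
            if abs((t2 - t1) - offset) <= tolerance:
                pairs.append(((t1, m1), (t2, m2)))
                break
    return pairs
```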
- Each of the sensor units may be a person detection sensor unit configured to detect locally at that sensor unit any person or people present in an area covered by that sensor unit, and to output to the synchronization system presence data pertaining to the person or people detected locally at that sensor unit.
- a sensor system comprises: a first sensor unit configured to generate: at least one first measurement, at least one subsequent first measurement, and a timestamp of each of those first measurements generated based on a first clock signal available at the first sensor unit; a second sensor unit configured to generate: a plurality of second measurements, a plurality of subsequent second measurements, and a timestamp of each of those second measurements based on a second clock signal available at the second sensor unit; a synchronisation system external to and connected to the sensor units and configured to: compare the first measurement with each of the plurality of second measurements to identify which of the second measurements has a maximum correlation with the first measurement; determine a timing offset between the first and second clock signals by determining a difference between the timestamp of the first measurement and the timestamp of the identified second measurement having the maximum correlation with the first measurement; and use the determined timing offset to determine which of the subsequent second measurements was performed substantially simultaneously with the subsequent first measurement, by determining that the timestamp of that subsequent second measurement differs from the timestamp of the subsequent first measurement by substantially the determined timing offset.
- the presence data for the second time may comprise: a presence count indicating a number of people detected by the sensor unit in the covered area at the second time; and/or
- a presence score indicating a likelihood that there is a person or people in the covered area at the second time; and/or
- a person location identifier for each person detected by the sensor unit in the covered area at the second time.
- Each person location identifier may be a two or three dimensional location vector.
- the sensor device may be a photo sensor device configured to sense visible and/or non-visible radiation.
- the photo sensor device is an image capture device and the sensor data is image data, the first portion of the image data being one or more first images captured by the image capture device of the area and the second portion being one or more second images captured by the image capture device of the area.
- the second images may not be outputted by the sensor unit.
- the processor may be configured to detect the synchronization code embedded in the radiation as amplitude and/or phase modulations.
- the processor may be configured to determine a difference between the first time as measured at the sensor unit and the second time as measured at the sensor unit, wherein the associated timing data may comprise the determined difference.
- the associated timing data may comprise a first timestamp of the first time and a second timestamp of the second time, and thereby conveys the first time relative to the second time. That is, the first and second timestamps may be outputted separately, at the same or at different times.
- a sensor system comprises: a plurality of person detection sensor units; a transmitting unit configured to emit, at a first time, a signal detectable by the sensor units and embodying the synchronization code; wherein each of the sensor units is configured according to any embodiment of the second aspect.
- the sensor system may further comprise a people counting apparatus; wherein each of the sensor units may be configured to output respective presence data for the second time and associated timing data which conveys, to the people counting apparatus, the second time relative to the first time; and wherein the people counting apparatus may be configured to use the respective presence data to estimate a people count for a total area covered by the sensor units.
- the transmitting unit may be a luminaire configured to emit illumination at the first time in which the synchronization code is embedded, the sensor device of each sensor unit being a photo sensor device.
- the synchronization code may be embedded using visible light communication, whereby it is imperceptible to a human eye.
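Because the code is embedded in the emitted light, a vision sensor can recover it from its ordinary sensor data. Below is a sketch of one plausible detector that matches a known dimming sequence against per-frame average light levels; the sequence values, threshold, frame rate and function names are all assumptions, not specified by the patent.

```python
import numpy as np

# Hypothetical dimming sequence expressed as relative light levels.
SYNC_CODE = np.array([1.0, 0.8, 1.0, 0.6, 1.0, 0.8])

def find_sync_start(frame_levels, code=SYNC_CODE, threshold=0.95):
    """Scan a series of per-frame average light levels for the start of the
    known dimming sequence; returns the frame index, or None."""
    levels = np.asarray(frame_levels, dtype=float)
    c = (code - code.mean()) / code.std()
    for i in range(len(levels) - len(code) + 1):
        w = levels[i:i + len(code)]
        if w.std() == 0:
            continue  # flat window cannot be normalised
        # normalised cross-correlation, in [-1, 1]
        ncc = float(np.dot((w - w.mean()) / w.std(), c)) / len(code)
        if ncc >= threshold:
            return i
    return None
```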
- Another aspect of the present invention is directed to a people detection method implemented by a person detection sensor unit of a sensor system, the method comprising: capturing over time sensor data from an area covered by the sensor unit;
- detecting in a first portion of the sensor data captured at a first time a predetermined synchronization code; measuring the first time based on a clock signal available at the sensor unit; detecting in a second portion of the sensor data captured at a second time at least one person present in the area; measuring the second time based on the clock signal; based on said detection at the sensor unit of the at least one person, generating from the second portion of the sensor data presence data for the second time pertaining to the detected at least one person; and outputting, to a processing apparatus external to the sensor unit, the presence data for the second time and associated timing data, which conveys to the external processing apparatus the second time as measured at the sensor unit relative to the first time as measured at the sensor unit.
- any embodiments of any of the above aspects may implement features of any of the other aspects, or embodiments thereof.
- embodiments of the first aspect may implement any feature of the second aspect or any embodiment thereof and vice versa.
- a computer program comprises executable code stored on a computer readable storage medium and configured when executed to implement any of the methods, sensor system functionality or sensor unit functionality disclosed herein.
- Figure 1 is a schematic illustration of a lighting system
- Figure 2 is a schematic block diagram of a sensor unit
- Figure 2A is a schematic block diagram of a luminaire with embedded sensor unit
- Figure 2B is a schematic block diagram of a luminaire
- Figure 3 is a perspective view of a pair of adjacent luminaires
- Figure 3A is a plan view of part of a lighting system
- Figure 4 is a schematic block diagram of a central processing apparatus for operating a lighting system
- Figure 4A is a schematic block diagram illustrating an exemplary control architecture of a lighting system
- Figure 5 illustrates how local image processors cooperate with a central processing apparatus to provide a people counting function
- Figure 6 illustrates how a correctly synchronized sensor system may be used to implement people counting
- Figure 6A is a block diagram showing how sensors may communicate timestamped measurements to a central processing apparatus
- Figure 7 is a flowchart for a sensor synchronization method
- Figure 8 shows a first example of how sensors may operate in accordance with the synchronization method of figure 7;
- Figure 9 shows a second example of how sensors may operate in accordance with the synchronization method of figure 7;
- Figure 10 shows a flow chart for another synchronization method
- Figure 11 shows a third example of how sensors may operate according to the other synchronization method of figure 10.
- Vision sensors comprising visible light cameras are useful for people counting.
- each vision sensor does not provide entire images to a central processing device when operating in real time, but only presence decisions based on performing the image recognition locally at the sensor (where the presence decision could be a soft or hard decision).
- a vision sensor configured in this manner is an example of a person detection sensor unit.
- Alternative person detection sensor units, for example which do not operate on captured images but collect and analyze other type(s) of sensor data, are also within the scope of the present disclosure.
- Each vision sensor covers (i.e. provides sensor coverage of) a respective area, defined by its field of view from which it is able to capture sensor data.
- the vision sensors may have overlapping fields of view (FoVs) and sensing areas.
- in a vision sensor system, one or more issues may arise which in turn lead to application errors at a higher system level that are not immediately noticeable, e.g. errors in an estimated number of people over a particular space.
- the vision sensors are connected so as to form a sensor network having any suitable topology, for example of the kinds described above, and which operates according to any suitable protocol (e.g. ZigBee, Wi-Fi, Ethernet etc.).
- Examples are described below in the context of a sensor system with multiple vision sensors in communication with an external processing apparatus, such as a people counting apparatus to offer data-enabled applications based on people counting.
- for further processing (for example fusion, analytics, etc.), reasonable synchronization of the measurements generated by the vision sensors is required.
- Each of the vision sensors has available to it a respective clock signal, which it uses to apply timestamps to its outputted measurements.
- a shift in the local time of each vision sensor can occur, leading to inconsistencies between the timestamps generated by different sensors. Over long periods of time, this shift can become significant and lead to increased errors in an estimated people count over the total area covered by the vision sensors as a whole.
- the outputs of the vision sensors are synchronized without requiring any additional transmission overhead for synchronization within the sensor network, i.e. without any additional network traffic via the sensor network.
- the vision sensors only send a limited amount of information (e.g. location, illumination changes) to an external processing apparatus via a bandwidth-limited sensor network.
- each vision sensor communicates to the external processing apparatus, at each of a plurality of times: an ID (a vision sensor identifier unique to the vision sensor within the system), a measurement for that time, and a timestamp of that measurement.
- the external processing apparatus determines, by correlating the reported measurements of adjacent sensors, and compensates for, any time shift within a given time window. That is, it determines and accounts for any timing offset(s) between the respective clock signals available to the different vision sensors.
- in a first example, the measurement identifies a location of any person whose presence has been detected in the area covered by the sensor at that time.
- the external processing apparatus uses knowledge of any sensing region overlap of adjacent sensors to determine the timing offset(s) by correlating the locations reported by adjacent sensors.
- in a second example, the measurement comprises one or more light levels measured over all of the area covered by the sensor, or over part of it.
- at least two light levels may be measured simultaneously at the sensor unit, each over a respective predefined region of the area covered by the sensor.
- Each of the predefined regions is large compared with a pixel size of the vision sensor, such that the measurements in this second example are effectively a heavily quantized image, having a significantly lower resolution.
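A sketch of how such a heavily quantized measurement might be computed from a captured frame, assuming a simple rectangular grid of predefined regions; the 4x4 layout and function name are assumptions.

```python
import numpy as np

def region_light_levels(frame, rows=4, cols=4):
    """Reduce a 2-D array of pixel intensities to one mean light level per
    predefined region - effectively the heavily quantized, low-resolution
    image described above."""
    h, w = frame.shape
    levels = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            block = frame[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            levels[r, c] = block.mean()  # average over the region
    return levels
```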
- the external processing apparatus uses knowledge of any sensing region overlap of adjacent sensors to determine the timing offset(s) by correlating the light levels reported by adjacent sensors.
- in a third example, the vision sensors are arranged such that there is a luminaire within the field of view of at least two adjacent vision sensors.
- the luminaire is dimmed according to a predetermined sequence, so as to emit from the luminaire a visible light synchronization code.
- Each vision sensor is configured to recognize the dimming sequence, thereby allowing it to detect a starting time of the dimming sequence from the luminaire.
- Each vision sensor communicates to the external processing apparatus, at each of a plurality of times: a measurement for that time, which is presence data conveying information about the presence of any detected person or people in the area covered by the sensor at that time; and associated timing data conveying that time relative to the detected start of the dimming sequence.
- the external processing apparatus arranges the measurements received from the sensors in chronological time with respect to the start of the dimming sequences for each vision sensor.
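A minimal sketch of that arrangement step, reusing the hypothetical `time_since_sync` field from the earlier sensor-unit sketch: because each report's time is measured relative to the shared dimming-sequence start, sorting on that field yields a common chronological ordering without further correction.

```python
def arrange_on_common_timeline(reports):
    """reports: iterable of dicts such as {'sensor_id': ..., 'presence': ...,
    'time_since_sync': ...}, where 'time_since_sync' is each sensor's locally
    measured time relative to the shared dimming-sequence start."""
    return sorted(reports, key=lambda r: r["time_since_sync"])
```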
- the first and second examples pertain to external synchronization, performed by the central processing apparatus.
- the third example pertains to internal synchronization, performed within each of the vision sensors.
- the internal synchronization techniques of the third example can be combined with the external synchronization techniques of the first or second examples, such that both types of synchronization are performed in parallel, which can allow greater accuracy and robustness.
- the presence data for a given time may for example comprise a people counting metric for that time, such as a people count for the area covered by the sensor or one or more probability scores for estimating a people count for that area.
- Each measurement generated by each vision sensor is based on a respective portion of the sensor data, such as image data, captured by that vision sensor. No person in the area covered by that vision sensor is identifiable from the measurement itself, even if they are identifiable in the portion of sensor data (e.g. image data) from which that measurement is generated.
- the measurement has a lower information content, and thus a smaller size, than the portion of sensor data from which it is generated, which reduces signaling overhead within the sensor network.
- timestamps outputted by different vision sensors may be expressed in different temporal frames of reference due to the clock signals available at the different vision sensors being out of sync - at the very least, the system does not assume that the timestamps outputted by the vision sensors are in the same frame of reference, and takes steps to identify and apply any time-base or time-stamp correction externally to the vision sensors, using measurements outputted by the vision sensors as part of their normal function within the system so that no additional signaling overhead is required.
- a synchronization code is embedded in visible light that is detectable by the vision sensors as part of their normal function, or more generally in a manner that is detectable by the vision sensors as part of their normal sensor function. That is, such that the synchronization code is received by the vision sensors in sensor data collected by them, and not via the sensor network.
- This synchronization code is used by the vision sensors to correct their timestamps locally, such that the timestamps outputted by the different vision sensors are all in substantially the same temporal frame of reference. Again, this does not require any additional signaling overhead, as the synchronization code is not sent via the sensor network.
- the respective clock signal available to each vision sensor is preferably a locally generated clock signal generated by a respective local clock of that sensor, for example a crystal oscillator (e.g. quartz) clock, as this requires no signalling overhead.
- the sensors can report the number of "ticks" (or a derivative thereof) since the last synchronization code.
- the advantage of using a stable clock is that it has limited sensitivity to drift and thus the frequency of re-synchronization can be limited.
- the present techniques may be performed repeatedly at suitable intervals to account for the variable offsets.
- alternatively, the vision sensor may receive its local clock signal from the sensor network; this could be an existing clock, e.g. in the form of the clock of the communication network (e.g. the TSF in IEEE 802.11), or a dedicated clock distributed purposefully, though this is less preferred due to the signalling overhead it requires.
- Clock signals received via the network are still prone to synchronization errors, for example due to different clock signal transmission times and/or changing network conditions.
- FIG. 1 illustrates an exemplary lighting system 1 in which the technique disclosed herein may be employed.
- the system 1 comprises a plurality of luminaires 4 installed in an environment 2, arranged to emit illumination in order to illuminate that environment 2.
- the system may further comprise a gateway 10 to which each of the luminaires 4 is connected via a first wired or wireless networking technology such as ZigBee.
- the gateway 10, sometimes referred to as a lighting bridge, connects to a computing apparatus 20 (which may or may not be physically present in the environment 2) via a second wired or wireless networking technology such as Wi-Fi or Ethernet.
- the computing apparatus 20 may for example take the form of a server (comprising one or more server units at one or more sites), or a user terminal such as a smartphone, tablet, laptop or desktop computer, or a combination of any such device. It is able to control the luminaires 4 by sending control commands to the luminaires 4 via the gateway 10, and/or is able to receive status reports from the luminaires 4 via the gateway 10. Alternatively in embodiments the gateway 10 may not be required and the computing apparatus 20 and luminaires 4 may be equipped with the same wired or wireless networking technology, by which they may be connected directly into the same network in order for the computing apparatus 20 to control the luminaires 4 and/or receive the status reports from the luminaires 4.
- the environment 2 is an indoor space within a building, such as one or more rooms and/or corridors (or part thereof).
- the luminaires 4 are ceiling-mounted, so as to be able to illuminate a surface below them (e.g. the ground or floor, or a work surface). They are arranged in a grid along two mutually perpendicular directions in the plane of the ceiling, so as to form two substantially parallel rows of luminaires 4, each row being formed by multiple luminaires 4. The rows have an approximately equal spacing, as do the individual luminaires 4 within each row. However it will be appreciated that this is not the only possible arrangement.
- For example, one or more of the luminaires 4 could be mounted on the wall, or embedded in the floor or items of furniture; and/or the luminaires 4 need not be arranged in a regular grid; and/or the environment 2 may comprise an outdoor space such as a garden or park, or a partially-covered space such as a stadium or gazebo (or part thereof), or a combination of such spaces.
- Multiple people 8 may occupy the environment, standing on the floor below the luminaires 4.
- the environment 2 is also installed with one or more "vision sensor" units 3.
- these may also be mounted on the ceiling in a regular pattern amongst the luminaires 4, and may be arranged to face downwards towards the illuminated surface beneath (e.g. the ground or floor, or a work surface).
- the sensor units 3 may be mounted in other places such as the wall, facing in other directions than downwards; and/or they need not be installed in a regular pattern.
- the luminaires 4 have known identifiers ("IDs”), unique within the system in question, and are installed at known locations.
- the vision sensor units 3 also have known IDs, unique within the system, and are installed at known locations.
- the sensor units 3 are not necessarily co-located with the luminaires 4.
- the locations of the luminaires 4 are determined during a commissioning phase of the luminaires 4.
- a commissioning technician determines the location of each of the luminaires 4, either manually or using automated means such as GPS or another such satellite based positioning system. This may be the location on any suitable reference frame, e.g. coordinates on a floorplan, map of the area, or global coordinates. By whatever means and in whatever terms determined, the commissioning technician then records the location of each luminaire 4 in a commissioning database 21 mapped to its respective luminaire ID. The commissioning technician also performs a similar commissioning of the sensor units 3.
- the sensor commissioning phase comprises storing the (believed) location of each sensor unit 3 in the commissioning database 21 mapped to its respective sensor ID.
- the commissioning database 21 could be anything from a large database down to a small look-up table. It could be implemented on a single device or multiple devices (e.g. computing apparatus 20 represents a distributed server, or a combination of server and user terminal). E.g. the table mapping the vision sensor locations to the vision sensor IDs could be implemented separately from the table mapping the luminaire locations to the luminaire IDs. Of course it will also be appreciated that the commissioning could be performed over different occasions, and/or by more than one technician. E.g. the commissioning of the vision sensors 3 could be performed by a different commissioning technician on a later occasion than the commissioning of the luminaires 4.
- Knowing the locations of the luminaires 4 and the sensor units 3 allows the position of the luminaires 4 relative to the sensor units 3 to be known. According to the present disclosure, this is advantageously exploited in order to check for commissioning errors or other problems with the sensor units 3.
- only the locations of the luminaires 4 relative to the sensor units 3 need be known (e.g. stored in terms of a vector in the commissioning database 21).
- each of one, some or all of the sensor units 3 may be incorporated into the housing of a respective one of the luminaires 4.
- the locations of the luminaires 4 are known relative to the sensor units 3 implicitly, i.e. can be assumed to be co-located.
- the commissioning database 21 is not necessarily required for the purpose of checking the sensor units 3, though may optionally be included anyway for other purposes (e.g. again to enable detection of the location of a person 8, or for indoor navigation).
- FIG. 2 shows a block diagram of a vision sensor unit 3, representing the individual configuration of each sensor unit 3 in the lighting system 1.
- the sensor unit 3 comprises: an image sensor 6 in the form of a visible light camera, a local processing module 11, a network interface 7, a local memory 13 connected to the local processing module 11, and a local clock 18 connected to provide a local clock signal 19 to the local processing module 11.
- the camera 6 is able to detect radiation from the luminaires 4 when illuminating the environment, and is preferably a visible light camera. However, the use of a thermal camera is not excluded.
- the local processing module 11 is formed of one or more processing units.
- the local memory 13 is formed of one or more memory units, such as one or more volatile or non-volatile memory units, e.g. one or more RAMs, EEPROMs ("flash" memory), magnetic memory units (such as a hard disk), or optical memory units.
- the local memory 13 stores code 12a arranged to run (e.g. execute or be interpreted) on the local processing module 11, the processing module 11 thereby being configured to perform operations of the sensor unit 3 in accordance with the following disclosure.
- the processing module 11 could be implemented in dedicated hardware circuitry, or configurable or reconfigurable hardware circuitry such as a PGA or FPGA.
- the local processing module 11 is operatively coupled to its respective camera 6 in order to receive images captured by the camera 6, and is also operatively coupled to the network interface 7 in order to be able to communicate with the processing apparatus 20.
- the processing apparatus 20 is external to each of the sensor units 3 and luminaires 4, but arranged to be able to communicate with the sensor units via the respective interfaces 7, and to communicate with the luminaires 4 via a similar interface in each luminaire 4 (not shown).
- the local clock signal 19 is a periodic, regular (i.e. having a fixed or approximately fixed period) signal, which the processing module 11 can use to generate a timestamp of an event denoting a current, local time, i.e. measured locally at the sensor unit 3, of the event.
- the timestamp can have any suitable format, and the term "timestamp" herein generally refers to any data that conveys a time of an event in any temporal frame of reference, generated based on a clock signal.
- a timestamp may be a counter value, e.g. expressing the time as a single integer value (or a set of integer values).
- a timestamp may express a time in any combination of hours, minutes, seconds, ms etc., as appropriate to the individual circumstances; or more generally as a floating point or set of floating point values.
- a timestamp may express a time to any degree of accuracy and precision that is appropriate to the individual circumstances.
- the local clock 18 comprises a crystal oscillator clock, and the clock signal 19 is derived by applying a current to a crystal (e.g. quartz) of the clock 18.
- the local clock signal 19 denotes a current time relative to a local reference time, for example a current time expressed as an integer count (where the reference time is e.g. a count of zero).
- the sensor unit 3 may derive its local clock signal 19 by some other means, for example based on a locally available AC (alternating current) signal, e.g. from a power supply that is powering the sensor, or its clock signal may be a locally received version of a clock signal broadcast through the sensor network (though this is less preferred, due to the additional signalling overhead it requires).
- the local clock signal 19 expresses the current time relative to the local reference time of the sensor unit 3, and has a frequency at which the current time is updated.
- the local clock 18 and processing module 11 are shown as separate components for the sake of illustration. However, part of the functionality of the local clock 18 may be implemented by the local processing module 11 itself, for example the local clock may provide a periodic input to the processing module 11 from which the local processing module generates the clock signal 19 itself, i.e. such that the local processing module 11 computes the current time relative to the local reference time.
- the local clock signals 19 available at any two different sensors 3, by whatever means they are generated, may have a timing offset (i.e. be out of sync), for example because they are based on different reference times - causing a substantially constant time offset between the clock signals - and/or because they have slightly different frequencies - causing a time offset between the clock signals that increases over time.
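As a worked illustration of these two offset sources, the following toy model (all values invented for illustration) shows how a constant reference-time difference and a small frequency mismatch combine:

```python
def second_clock_reading(t1, offset_s=5.0, mismatch_ppm=20):
    """Toy model: reading of the second clock when the first clock reads t1
    (seconds), for a constant 5 s reference offset plus a drift term from a
    20 ppm frequency mismatch. All values are made up for illustration."""
    return t1 + offset_s + (mismatch_ppm * 1e-6) * t1

# After one day on the first clock (86400 s), the drift term alone has
# grown to 20e-6 * 86400 = ~1.7 s on top of the constant 5 s offset.
```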
- FIG. 2B shows an example of a luminaire 4 in embodiments where the luminaires 4 are separate to the sensor units 3.
- each luminaire 4 may comprise one or more lamps 5, a respective interface 7', a local memory 13' and a local processing module 11 '.
- the local processing module 11 ' is operatively coupled to the lamp(s) and the interface 7'.
- Each lamp 5 may comprise an LED-based lamp (comprising one or more LEDs), a filament bulb, a gas-discharge lamp or any other type of light source.
- the memory 13' comprises one or more memory units and the processing module 11' comprises one or more processing units.
- the local memory 13' stores code 12b arranged to run (e.g. execute or be interpreted) on the local processing module 11', the processing module 11' thereby being configured to perform operations of a luminaire 4 in accordance with the present disclosure.
- the processing module 11 ' of the luminaire 4 could be implemented in dedicated hardware circuitry, or configurable or reconfigurable hardware circuitry such as a PGA or FPGA.
- each of the above-mentioned interfaces 7, 7' could be a wired or wireless interface, but is preferably wireless.
- the interface 7 of each of the sensor units 3, and the interface 7' of each of the luminaires 4 may be a ZigBee interface arranged to connect to the gateway 10 using a first wireless networking protocol such as one of the ZigBee standards, e.g. ZigBee Light Link; while the processing apparatus 20 (e.g. a server, or a desktop computer, laptop, tablet or smartphone running a suitable application) connects to the gateway 10 via a second wireless networking protocol such as Wi-Fi or Bluetooth.
- the gateway 10 then converts between the protocols to allow the external processing apparatus 20 to communicate in one or both directions with the sensor units 3 and luminaires 4.
- the interface 7 in each of the sensor units 3, and the interface 7' in each of the luminaires 4 may comprise an interface of a type (e.g. Wi-Fi or Bluetooth) directly compatible with that of the external processing apparatus 20, thus allowing the communication to occur directly between the processing apparatus 20 and the sensor units 3 and luminaires 4 without the need for a gateway 10.
- the network can have any suitable network topology, for example a mesh topology, star topology or any other suitable topology that allows signals to be transmitted and received between each luminaire 4 and the gateway 10 and/or processing apparatus 20.
- the external processing apparatus 20 is configured to send control commands to the sensor units 3 and luminaires 4 and to receive information back from the sensor units 3 and luminaires 4, via the relevant interfaces 7, 7'. This includes receiving soft or hard presence decisions from the sensor units 3, and in some cases receiving measured light levels (as in the second example below).
- the various communications disclosed herein between components 3, 4, 20 may be implemented by any of the above-described means or others, and for conciseness will not be repeated each time.
- Figure 2A shows a variant of the arrangement shown in Figures 1 and 2, wherein the sensor unit 3 is integrated into the same housing as one of the luminaires 4, and therefore the sensor unit 3 is substantially collocated with the respective luminaire 4.
- the combined luminaire and sensor unit 3, 4 further comprises (in addition to the components described above in relation to Figure 2) at least one lamp 5 such as an LED-based lamp (comprising one or more LEDs), gas-discharge lamp or filament bulb.
- the communication with the combined sensor unit and luminaire 3, 4 may be implemented via a shared interface 7 of the unit, and/or any control, processing or reporting associated with the sensing and/or luminaire functionality may be implemented by a shared local processing module 11. Alternatively, a separate interface 7' and/or separate local processing module 11' could be provided for each of the sensor and luminaire functions, but in the same housing.
- the local processor 11' of the luminaire 4 (or the local processor 11 of the combined unit 3, 4) is connected to the lamp(s) 5, to allow local lighting control code 12b executed on the local processor 11' (or 11) to control the dimming level of the illumination emitted by the lamp(s) 5, and/or to switch the emitted illumination on and off.
- Other illumination characteristic(s) such as colour may also be controllable.
- where the luminaire 4 comprises multiple lamps 5, these may be individually controllable by the local processor 11' (or 11), at least to some extent. For example, different coloured lamps 5 or elements of a lamp 5 may be provided, so that the overall colour balance can be controlled by separately controlling their individual illumination levels.
- the local controller 11 ' of the luminaire 4 may be configured to control one or more such properties of the emitted illumination based on lighting control commands received via the interface 7' from the external processing apparatus 20.
- the processing apparatus 20 may comprise a server arranged to receive presence metrics from the sensor units 3 indicative of where people are present in the environment 2, and make decisions as to which luminaries 4 to turn on and off, or which to dim up and down and to what extent, based on an overview of the presence detected by the different sensor units 3.
- the processing apparatus 20 may comprise a user terminal such as a smartphone, tablet or laptop running a lighting control application (or "app"), through which the user can select a desired adjustment to the emitted illumination, or select a desired lighting effect or scene to be created using the illumination.
- the application sends lighting control commands to the relevant luminaires 4 to enact the desired adjustment or effect.
- the local controller 11 ' of the luminaire 4 may be configured to control any one or more of the above properties of the illumination based on signals received from one or more other sources, such as one or more of the sensor units 3. E.g. if a sensor unit 3 detects occupancy then it may send a signal to a neighbouring luminaire 4 to trigger that luminaire to turn on or dim up.
- in each sensor unit 3 (or combined unit 3, 4), the respective image sensor 6 is connected to supply, to its local processor 11, raw image data captured by the image sensor 6, to which a local person detection algorithm is applied by local image processing code 12a executed on the local processor 11.
- the local person detection algorithm can operate in a number of ways based on any suitable image recognition techniques (e.g. facial recognition and/or body recognition). Based on this, the local person detection algorithm generates one or more "presence metrics" indicative of whether a person 8 is detected to be present in a still image or moving image (video) captured by the image sensor 6, and/or how many people 8 are detected to be so present.
- the one or more presence metrics may comprise: a hard indication of whether or not a person 8 is detected to be present in the image (yes/no), a soft indication of whether or not a person 8 is detected to be present in the image (an indication of a degree of certainty such as a percentage), a momentary count of people 8 simultaneously present in the image, a count of the number of people appearing in the image over a certain window of time, and/or a rate at which people appear in the image.
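For illustration, a report carrying such metrics might look as follows; the patent names the metrics but not a wire format, so every field name here is an assumption.

```python
# Hypothetical shape of a sensor unit's reported presence metrics.
example_presence_metrics = {
    "sensor_id": "3a",          # unique within the system
    "timestamp": 1048576,       # based on the unit's local clock signal
    "person_present": True,     # hard yes/no decision
    "presence_score": 0.93,     # soft decision: degree of certainty
    "momentary_count": 2,       # people simultaneously present in the image
    "windowed_count": 5,        # people appearing over a window of time
}
```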
- the code 12a running on the local processing module 11 reports this information to the external processing apparatus 20, for use in determining a person count centrally.
- detecting whether a person appears in an image may comprise detecting whether a whole person appears in the image, or detecting whether at least a part of a person appears in the image, or detecting whether at least a specific part or parts of a person appear in the image.
- the detection could also comprise detecting whether a specific person appears in the image, or detecting whether a specific category of person appears in the image, or detecting whether any person appears in the image.
- Figure 3 shows a perspective view of a first and a second of the sensor units 3a, 3b, as described above.
- the first and second sensor units 3a, 3b capture images from a respective sensor area 30a, 30b, each of which experiences light from one or more of the luminaires 4a, 4b.
- each sensor unit 3a, 3b may be associated with or incorporated into a different respective one of the luminaires 4a, 4b adjacent one another in a grid, or each sensor unit 3 could be associated with a different respective group of the luminaires 4 (e.g. placed at the centre of the group).
- each of the luminaires 4a, 4b is arranged to emit illumination towards a surface 29 (e.g. the floor, or a workspace plane such as a desk), thereby illuminating the surface 29 below the luminaires 4.
- the illumination provided by the luminaires 4 renders the people 8 detectable by the sensor units 3.
- each sensor unit 3a, 3b has a limited field of view.
- the field of view defines a volume of space, marked by dotted lines in Figure 3, within which visible structure is detectable by that sensor unit 3a, 3b.
- Each sensor unit 3a, 3b is positioned to capture images of the respective portion (i.e. area) 30a, 30b of the surface 29 that is within its field of view ("sensing area") below.
- the fields of view of the first and second sensor units 3a, 3b overlap in the sense that there is a region of space within which structure is detectable by both sensor units 3a, 3b.
- one of the borders 30R of the sensing area 30a of the first sensor unit 3a is within the sensing area 30b of the second sensor unit 3b ("second sensing area").
- one of the borders 30L of the sensing area 30b of the second sensor unit 3b is within the sensing area 30a of the first sensor unit 3a ("first sensing area").
- An area A is shown, which is the intersection of the first and second sensor areas 30a, 30b. The area A is the part of the surface 29 that is visible to both of the first and second sensor units 3a, 3b ("sensor overlap").
- Figure 3A shows a plan view of a part of the lighting system 1, in which a 3x3 grid of nine sensor units 3a,...,3i is shown, each having a respective sensor area 30a,...,30i, which is the sensor area of its respective image sensor 6 as described above.
- the sensing area 30 of each sensor unit 3 overlaps with that of each of its neighbouring sensor units 3, in both directions along the grid and both directions diagonal to the grid, as shown.
- every pair of neighbouring sensor units (3a, 3b), (3a, 3c), (3a, 3d), (3b, 3c), ... has an overlapping sensor area (or field of view, FoV).
- the overlapping sensing areas of the vision sensors ensure that there are no dead sensing regions.
- FIG. 4 shows a block diagram of the processing apparatus 20.
- the processing apparatus comprises at least one computer device for operating the lighting system 1.
- the computer device may take the form of a server, or a static user terminal such as a desktop computer, or a mobile user terminal such as a laptop, tablet, smartphone or smart watch.
- the computer device 20 comprises a processor 27 formed of one or more processing units, and a network interface 23.
- the network interface 23 is connected to the processor 27.
- the processor 27 has access to a memory 22, formed of one or more memory devices, such as one or more RAMs, EEPROMs, magnetic or optical memory devices.
- the memory 22 may be external or internal to the computer device 20, or a combination of both (i.e. the memory 22 can, in some cases, denote a combination of internal and external memory devices), and in the latter case may be local or remote (i.e. accessed via a network).
- the processor 27 is also connected to a display 25, which may for example be integrated in the computer device 20 or an external display.
- the processor 27 is shown executing people counting code 24, from the memory 22.
- the people counting code 24 applies an aggregation algorithm, to aggregate multiple local presence metrics received from different ones of the sensor units 3 so as to generate an estimate of the number of people 8 in the environment 2.
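A minimal sketch of one plausible aggregation step, assuming per-sensor momentary counts and knowledge of the sensor overlaps (e.g. area A) such as may be recorded in the metadata 26; the patent does not specify the aggregation algorithm, so this de-duplication scheme is an assumption.

```python
def aggregate_people_count(local_counts, overlap_counts):
    """Sum the per-sensor momentary counts, then subtract people reported in
    known overlap areas by both sensors so they are not double counted.

    local_counts: {sensor_id: count}
    overlap_counts: {frozenset({id1, id2}): count seen in that shared area}
    """
    total = sum(local_counts.values()) - sum(overlap_counts.values())
    return max(total, 0)

# e.g. aggregate_people_count({"3a": 3, "3b": 2},
#                             {frozenset({"3a", "3b"}): 1})  # -> 4
```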
- the processor 27 implements a processing module connected to receive data relating to the captured images of the image capturing device, and to thereby determine a count of the number or rate of people found in the environment 2.
- the network interface 23 can be a wired interface (e.g. Ethernet, USB) or a wireless interface (e.g. Wi-Fi or ZigBee).
- the gateway 10 operates as an interface between the computer device 20 and the lighting network, and thus allows the central processing apparatus 20 to communicate with each of the luminaires 4 and sensor units 3 via the lighting network.
- the gateway 10 provides any necessary protocol conversion to allow communication between the computer device 20 and the lighting network.
- alternatively, the interface 23 may enable the computer device 20 to connect directly to the luminaires 4 and sensor units 3. Either way, this allows the computer device 20 to transmit control signals to each of the luminaires 4 and receive measurements from each of the sensors 3.
- the computer device 20 may be local to the environment 2 (e.g. present in the environment 2 or in the same building) or may be remote from it (at a remote geographic site), or the processing apparatus 20 may even comprise a combination of local and remote computer devices. Further, it may connect to the gateway 10 via a single connection or via another network other than the lighting network.
- Figure 4A shows an exemplary lighting system control architecture for implementing a remote or networked connection between the computer device 20 and the gateway 10.
- the computer device 20 is connected to the gateway 10 via a packet-based network 42, which is a TCP/IP network in this example.
- the computer device 20 communicates with the gateway 10 via the packet-based network 42 using TCP/IP protocols, which may for example be effected at the link layer using Ethernet protocols, Wi-Fi protocols, or a combination of both.
- the network 42 may for example be a local area network (business or home network), the Internet, or simply a direct wired (e.g. Ethernet) or wireless (e.g. Wi-Fi) connection between the computer device 20 and the gateway 10.
- the lighting network 44 is a ZigBee network in this example, in which the luminaires 4a, 4b, 4c,... communicate with the gateway 10 using ZigBee protocols.
- the gateway 10 performs protocol conversion between TCP/IP and ZigBee protocols, so that the central computer 20 can communicate with the luminaires 4 and sensor units 3 via the packet-based network 42, the gateway 10 and the lighting network 44.
- “external” or “externally” means the processing apparatus 20 is not housed within any shared housing (casing) of any of the sensor units 3 and, in embodiments, not in any housing of the luminaires 4 either. Further, this means the processing apparatus communicates with all of the involved sensor units 3 (and in embodiments luminaires 4) only using an external connection via a networked and/or wireless connection, e.g. via the gateway 10, or via a direct wireless connection.
- the memory 22 of the external processing apparatus 20 stores a database 21.
- This database 21 contains a respective identifier (ID) of each sensor unit 3 and each luminaire 4 in the lighting system 1 (or just IDs of the luminaires 4 when the sensor units 3 are integrated into luminaires 4). These uniquely identify the sensor units 3 and luminaires 4 within the system 1. Further, the database 21 also contains an associated location identifier 71 of each sensor unit 3 and luminaire 4 (or again just the location identifiers of the luminaires 4 if the sensor units are integrated into luminaires). For example, each location identifier 71 may be a two dimensional identifier (x,y) or three dimensional location identifier (x,y,z) (e.g. if the sensor units 3 are installed at different heights).
- the location identifier 71 may convey only relatively basic location information, such as a grid reference denoting the position of the corresponding luminaire 4 or sensor unit in a grid - e.g. (m,n) for the mth column and nth row - or it may convey a more accurate location on a floor plan or map, e.g. in meters, feet or arbitrary units, to any desired accuracy.
- the IDs of the luminaires 4 and sensor units 3, and their locations, are thus known to the processing apparatus 20.
- the memory 22 may also store additional metadata 26, such as an indication of the sensor overlap A, and any other sensor overlaps in the system.
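- purely by way of illustration, the database 21 and metadata 26 could be represented as follows (a minimal Python sketch; the identifiers, coordinates and overlap labelling are hypothetical, not taken from this disclosure):

```python
# illustrative commissioning database 21: per-device ID -> location identifier 71
database_21 = {
    "sensor-01":    {"kind": "sensor unit", "location": (0.0, 0.0, 2.8)},  # (x, y, z)
    "sensor-02":    {"kind": "sensor unit", "location": (3.0, 0.0, 2.8)},
    "luminaire-01": {"kind": "luminaire",   "location": (1.5, 0.0, 2.8)},
}

# illustrative metadata 26: pairs of sensor units whose sensing areas overlap
metadata_26 = {
    ("sensor-01", "sensor-02"): "A",  # overlap region A of figure 3
}
```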
- Figure 5 illustrates how the processing apparatus 20 and the sensor units 3 cooperate within the system 1.
- First, second and third sensor units 3a, 3b, 3c are shown, though this is purely exemplary.
- the image sensor 6 of each sensor unit 3a, 3b, 3c captures at least one respective image 60a, 60b, 60c of its respective sensing area (each of which could be a still image or a video).
- the local processing module 11a, 11b, 11c of that sensor unit applies the local person detection algorithm to the respective image(s). That is, the local person detection algorithm is applied separately at each of the sensor units 3a, 3b, 3c, in parallel, to generate a respective local presence metric 62a, 62b, 62c at each, also referred to equivalently as a people counting metric herein.
- Each of the local presence metrics 62a, 62b, 62c is transmitted to the processing apparatus 20, e.g. via the networks 42, 44 and gateway 10.
- the images 60a, 60b, 60c themselves however are not transmitted to the central processing apparatus 20 (or at least not in a high enough resolution form for people to be recognizable or at least identifiable).
- the external processing apparatus 20 applies the aggregation algorithm to the presence metrics 62a, 62b, 62c in order to estimate the number of people 8 in the environment 2.
- the aggregation algorithm generates an indicator of this number (people count) 64, which may be outputted on the display 25 to a user of the processing apparatus 20 and/or stored in the memory 22 for later use.
- the process may be real-time, in the sense that each local processing module 11a, 11b, 11c repeatedly generates and transmits local presence metrics as new images are captured.
- the people count 64 is updated as the new presence metrics are received, for example once every few (e.g. ten or fewer) seconds.
- the process may be pseudo-real-time, e.g. such that the people count 64 is updated every minute, every few minutes, or every hour (for example), or it may be pseudo-static, e.g. a "one-time" people count may be obtained in response to a count instruction from the user of the external processing apparatus 20, to obtain a snapshot of current occupancy levels manually. That is, each count may be instructed manually.
- Each presence metric 62 may be generated over a time window, i.e. based on multiple images within that time window. This allows movements above a certain speed to be filtered out. That is, objects moving too fast to appear in all of those images may be filtered out so that they do not affect the people count 64.
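- a minimal sketch of such windowed filtering, assuming purely for illustration that each image has already been reduced to a boolean occupancy grid:

```python
import numpy as np

def filter_fast_movers(frames):
    """frames: sequence of equally-sized boolean occupancy grids captured
    within one time window. A cell counts as occupied only if it is
    occupied in every frame, so objects moving too fast to appear in all
    of the frames are filtered out of the result."""
    return np.logical_and.reduce(np.asarray(frames, dtype=bool), axis=0)
```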
- the sensor unit 3a captures images of the part of the surface 29 directly below it.
- This means the image 60a is a top-down view of the person 61, whereby the top of their head and shoulders are visible.
- if the person 61 is in the sensor overlap area A, they would be similarly detectable in an image captured by the second sensor unit 3b. That is, the same person 61 would be simultaneously visible in images from both the first and second sensor units 3a, 3b, at different respective locations in those images.
- a similar scenario can also occur even if the sensor units 3 do not face directly down, e.g. are at an angle in a corner of a room, or face sideways from the wall. It will be appreciated that the present disclosure is not limited to a top-down arrangement.
- each sensor unit 3 (or rather its local image processor 11) communicates a respective one or more presence metrics, along with its ID and a timestamp, to the external processing apparatus 20 (e.g. a centralized people counting computer device).
- the timestamp is generated based on that sensor's local clock signal 19.
- the presence metric(s) reported by each sensor unit 3 comprise at least an indication of whether a person 8 is detected, or likely to have been detected, by the sensor unit 3. For example, this may comprise a yes/no flag indicating whether a person was detected. Alternatively or additionally, it may comprise a block-pixel-by-block-pixel score matrix, e.g. a 10 by 10 matrix of binary values, with each element a "1" or "0" indicative of presence or no presence - this choice ensures that the communication from the sensor units 3 to the external processing apparatus 20 maintains privacy, and is also low rate. Another alternative or additional possibility is to report a probability score indicative of the probability that a person is present.
- the probability score may be computed over a time window, thus filtering out movements above a certain speed. These may be estimated using known statistical methods, e.g. maximum a posteriori (MAP). Further, in embodiments, the reported presence metrics may comprise a location vector denoting the location of the detected person 61, e.g. which may be expressed relative to the sensor unit 3 that captures the image 60, or as a position within the image.
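- purely as an illustrative sketch, a reported presence metric could be structured as follows (the field names are assumptions, not taken from this disclosure; any subset of the optional fields may be reported):

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class PresenceMetric:
    sensor_id: str                         # ID of the reporting sensor unit 3
    timestamp: float                       # local time t per clock signal 19
    detected: bool                         # yes/no flag
    score_grid: Optional[List[List[int]]] = None   # e.g. 10x10 binary matrix
    probability: Optional[float] = None    # e.g. MAP estimate over a window
    location: Optional[Tuple[float, float]] = None # relative to the sensor
```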
- the external processing apparatus 20 collects such metrics from all the sensor units 3 associated with a region over which a people count is of interest (e.g. all or part of the surface 29). Additionally, the external processing apparatus 20 has knowledge of the sensing region overlap of the sensor units 3, from the metadata 26. It aggregates the individual vision sensor counts while avoiding double-counts over overlapping regions within a given time window. For example, if the reported presence metric(s) from each sensor unit 3 comprise a yes/no indication of whether or not it detected a person 8, plus an associated location vector indicating where the person was detected, then the processing apparatus 20 can determine when results from two different sensor units 3 fall within an overlap region (e.g. A in the example of Figure 3) and occur at approximately the same location.
- the processing apparatus 20 can again determine when results from two different sensor units 3 fall within an overlap region and occur at approximately the same location, in order to avoid double counting.
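- a minimal sketch of such double-count avoidance (the distance threshold, and the assumption that reported locations have already been converted to a global frame, are illustrative):

```python
def aggregate_count(detections, max_dist=0.5):
    """detections: list of (x, y) person locations in the global frame,
    pooled from all sensor units within one time window. Detections
    closer together than max_dist are merged, on the assumption that
    they are the same person seen by two overlapping sensors."""
    accepted = []
    for x, y in detections:
        if any((x - ax) ** 2 + (y - ay) ** 2 < max_dist ** 2
               for ax, ay in accepted):
            continue  # likely the same person, already counted
        accepted.append((x, y))
    return len(accepted)
```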
- the regions of overlap A can be determined at the commissioning stage and pre-stored in the memory 22 as part of the metadata 26, or alternatively can be determined automatically by the processing apparatus 20 based on outputs from the sensors.
- At least part of the metadata 26 may be available to the sensors 3 themselves, such that the sensors themselves have knowledge of sensing region overlap.
- the above has described a system for detecting the presence of people 8 in an environment, and for counting the total number of people detected during a certain window of time.
- This can have a number of applications, such as marketing analysis in a retail environment; or tracking occupancy levels for safety reasons (e.g. at a sports or entertainment event); or to inform automated control of a utility such as the illumination provided by the luminaires, or heating, ventilation or air conditioning.
- in the following, each sensor is denoted S_n (n denoting the nth sensor in the system).
- a measurement performed by that sensor is denoted m_n(t), where t is a time denoted by an associated timestamp of that measurement generated locally at the sensor S_n based on its local clock signal 19. That is, t is a time measured based on the local clock signal 19 of that sensor and thus expressed relative to its local reference time and based on the frequency of its clock signal 19.
- A flowchart for one method of measurement synchronization is shown in figure 7, first and second examples of which are described below with reference to figures 8 and 9 respectively. The description of figures 8 and 9 is interleaved with that of figure 7.
- measurements performed by the sensors are synchronized externally at the central processing apparatus 20, on the assumption that the local reference times and/or the frequencies of different sensors may not be synchronized.
- synchronization of the sensors in this case is not achieved by adjusting the sensors (which would require some form of signalling between the external processing apparatus and the sensors), and in particular is not achieved by adjusting how they apply their respective timestamps to correct inconsistencies between their respective clock signals - rather, the inconsistencies between the clock signals are allowed to persist locally at the sensors, and accounted for centrally at the external processing apparatus instead.
- Step S2 represents operations performed over an interval of time, during which each sensor S_n performs and communicates to the central processing apparatus 20 a respective set of measurements m_n(t_1), m_n(t_2), ... performed by that sensor S_n at times t_1, t_2, ..., as measured locally at that sensor S_n.
- This is illustrated in figure 6A for sensors 3a (S_1) and 3b (S_2).
- the measurements pertain at least in part to the area(s) of overlap "A" between the sensor S_n and its neighbouring sensor(s) in the grid of figure 3A, as indicated by the dashed lines of figure 6A.
- in the first example, each measurement m_n(t) is a location (scalar or vector) of a person detected anywhere in the area covered by the sensor S_n (e.g. 30a or 30b), including any area(s) A of sensor overlap with neighbouring sensor(s), denoted x_n(t) below.
- the location of each person may for instance be with respect to the vision sensor; that is, relative to a location of the vision sensor itself in a spatial reference frame local to that sensor. In this local reference frame, the location of the sensor may for example be (0,0) or (0,0,0).
- the central processing apparatus 20 converts the locations to a global spatial reference frame, based on the locations of the vision sensors recorded in the commissioning database 21 relative to a common origin (for example, relative to a floorplan); this ensures that locations originating from the same person overlap over space when expressed in the global frame of reference.
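- with a two dimensional location identifier, this conversion could amount to a simple translation, as in the following sketch (which assumes, purely for illustration, that the local frame is axis-aligned with the global one):

```python
def to_global(local_xy, sensor_xy):
    """Translate a location reported relative to a sensor (whose own
    position in its local frame is (0, 0)) into the global floorplan
    frame, using the sensor's location from the commissioning database."""
    return (local_xy[0] + sensor_xy[0], local_xy[1] + sensor_xy[1])
```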
- in the second example, each measurement m_n(t) is a light level, denoted l_n(t), measured at local time t, which may for example be measured over the whole of the sensing area (e.g. 30a or 30b) including any area(s) A of sensor overlap, or over a respective sub-region of the covered area.
- the measurement m_n(t) may be a respective light level measured over each of those area(s) of overlap, i.e. one light level per area of sensor overlap.
- alternatively, the measurement m_n(t) may be a grid of light levels (e.g. a 10x10 grid of light levels), one per sub-region of a grid of sub-regions of the covered area (e.g. 30a or 30b in figure 3) - in this case, the measurement m_n(t) constitutes a very low resolution, monochromatic image derived from a full resolution image captured by the camera of sensor S_n at local time t.
- because the local reference times and/or clock signal frequencies are not necessarily synchronized, where two measurements m_n(t), m_m(t') have respective timestamps such that t ≈ t', i.e. where the measurements m_n(t), m_m(t') have substantially matching timestamps, that does not necessarily mean they correspond to the same physical time. That is, the timestamps from different sensors 3 may be inconsistent with one another.
- due to a timing offset - which can be expressed as a time difference Δt_{n,m} between the respective clock signals of S_n and S_m, arising due to those clock signals being based on different local reference times, due to them having different frequencies, or a combination of both - the fact that the measurements m_n(t), m_m(t') have substantially matching timestamps actually means they were performed a time Δt_{n,m} apart.
- the timestamps are substantially matching in the sense that the time t denoted by the timestamp generated by S_n is closer to the time t' denoted by the timestamp generated by S_m than to the time denoted by any other timestamp generated by S_m.
- a suitable correlation function is the following:
- corr(δt) = Σ_t m_n(t) · m_m(t − δt)    (1)
- that is, for each candidate difference value δt, the central processing apparatus 20 multiplies each measurement performed by the sensor S_n with the measurement performed by the sensor S_m whose timestamp corresponds to the timestamp of the first measurement offset by that difference value δt, and sums the products.
- this is just one example of a simple correlation function, and other examples are within the scope of this disclosure, such as a normalized version of the simple correlation function of equation (1) - for example normalized to a maximum value of 1, or to any desired value other than 1 (such as the known "Moravec" correlation function) - and/or a zero-mean version of the correlation function of equation (1).
- any suitable correlation function which constitutes a measure of similarity or dependence between measurements from different sensors can be used in place of correlation functions based on equations (1) and (2).
- the summation is over a suitable time window, over which correlations in the measurements from different sensors are detectable.
- the summation is also over measurements pertaining to the area of sensor overlap, such that both sets of measurements m_n(t), m_m(t) are associated with the same object and/or event, and will thus exhibit detectable correlations. That is, multiple measurements are performed over the same area over time, as the measured quantity or quantities in that same area change over time.
- the central processing apparatus estimates, for each pair of adjacent sensors S_n, S_m, the time offset Δt_{n,m} between the respective clock signals of the sensors. The time offset Δt_{n,m} is estimated to be equal to the difference value δt for which the correlation function corr(δt) is maximized, i.e. Δt_{n,m} = argmax_{δt} corr(δt).
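- a minimal sketch of this estimation, assuming purely for illustration that the two measurement series have been resampled onto a common nominal time grid so that δt can be searched in whole samples:

```python
import numpy as np

def estimate_offset(m_n, m_m, max_shift):
    """Estimate the clock offset between sensors S_n and S_m, in samples,
    as the shift dt maximizing corr(dt) = sum_t m_n[t] * m_m[t - dt],
    per equation (1). m_n, m_m: 1-D measurement arrays."""
    def corr(dt):
        if dt >= 0:
            a, b = m_n[dt:], m_m[:len(m_m) - dt]   # pair m_n(t) with m_m(t - dt)
        else:
            a, b = m_n[:dt], m_m[-dt:]
        n = min(len(a), len(b))
        return float(np.dot(a[:n], b[:n]))
    return max(range(-max_shift, max_shift + 1), key=corr)
```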
- the central processing apparatus 20 collects location information from adjacent vision sensors, using the commissioning database 21. Additionally, it has knowledge of the overlap of the vision sensors, from the metadata 26. The processing apparatus 20 correlates over time the reported location of a person over the overlapping region for a pair of adjacent vision sensors.
- Figure 8 shows the first example, in which a first vision sensor S_1 and a second vision sensor S_2 report respectively locations x_1(t) and x_2(t) to the central processing apparatus 20.
- the central processing apparatus 20 estimates the time shift Δt_{1,2} between the respective clocks of S_1 and S_2 by correlating the locations, for example based on the correlation function of equation (1), as:
- Δt_{1,2} = argmax_{δt} Σ_t x_1(t) · x_2(t − δt)    (2)
- each vision sensor 3 communicates to the central processing apparatus 20, in addition to the location of any detected person, light levels in predefined regions within its field of view, along with its vision sensor ID and a timestamp of each measured light level.
- Each predefined region may for instance be defined with respect to the vision sensor, and so the central processing apparatus 20 converts the location of these regions to a global reference; this ensures that similarly located regions overlap correctly.
- the central processing apparatus 20 collects the light levels from adjacent vision sensors. Additionally, it has knowledge of the overlap (if any) of the vision sensors, from the metadata 26. Then, it correlates over time the light levels over overlapping (or nearby) regions for a pair of adjacent vision sensors. Note that the light levels might change due to lighting control (e.g. as a result of daylight or occupancy adaptation), so that correlations between light levels reported by different sensors are detectable due to the changing light levels.
- Figure 9 shows an example of predefined regions over which vision sensor S_1 and vision sensor S_2 report light levels l_1(t) and l_2(t) respectively, which are the light levels over the sensor overlap region A between them (see figure 3).
- the central processing apparatus 20 estimates the time shift Δt_{1,2} by correlating the light levels, for example as:
- Δt_{1,2} = argmax_{δt} Σ_t l_1(t) · l_2(t − δt)
- the central processing apparatus 20 can use the estimated clock signal offsets At n m to account for inconsistencies in the timestamps applied to later measurements (S6). For example, the central processing apparatus is able to generate an accurate people count 66 from the later measurements for any desired time (e.g. based on people locations or other presence metrics reported by the sensors), accounting for the inconsistencies in the timestamps applied to the later measurements arising due to the clock signal offsets At n m .
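- as a sketch, correcting a later timestamp then reduces to subtracting the estimated offset, so that all measurements can be placed on the clock of a chosen reference sensor (the sign convention is an assumption):

```python
def align_timestamp(t, offsets, sensor, reference):
    """Express timestamp t generated by `sensor` on the clock of
    `reference`, where offsets[(n, m)] holds the estimated offset
    dt_{n,m} between the clock signals of sensors n and m."""
    return t if sensor == reference else t - offsets[(sensor, reference)]
```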
- multiple measurements from the first sensor S_1 are compared with multiple measurements from the second sensor S_2 at different time offsets δt. It is preferable to compare multiple measurements from both sensors in this manner as it provides more accurate results. Nevertheless, in some circumstances comparing a single measurement from the first sensor S_1 with measurements from the second sensor S_2 - though less preferred - is sufficient.
- a person or people are detected at each vision sensor based on a high resolution image captured by that vision sensor.
- the higher resolution used for person detection at each of the sensor units 3 may for example be at least 100x100 pixels (at least 100 pixels in each of the horizontal and vertical dimensions).
- the higher resolution used for person detection at each of the sensor units 3 may be at least 500x500 pixels (at least 500 pixels in each of the horizontal and vertical dimensions).
- the higher resolution used for person detection at each of the sensor units 3 may be at least 1000x1000 pixels (at least 1000 pixels in each of the horizontal and vertical dimensions).
- the light levels reported for each portion of the covered area may effectively constitute a lower resolution image, as also noted above.
- the lower resolution of the second example used by each of the sensor units 3 to externally report the images to the processing apparatus 20 may be no more than 10x10 pixels (no more than ten pixels in each of the horizontal and vertical dimensions) - with each pixel in the lower resolution image being a light level measured over a respective part of the area covered by the sensor.
- the lower resolution used by each of the sensor units 3 to externally report the images to the processing apparatus 20 may be no more than 25x25 pixels (no more than twenty-five pixels in each of the horizontal and vertical dimensions).
- the lower resolution used by each of the sensor units 3 to externally report the images to the processing apparatus 20 may be no more than 50x50 pixels (no more than fifty pixels in each of the horizontal and vertical dimensions).
- the lower resolution used for reporting may be reduced by at least ten times in each dimension compared to the higher resolution used for detection (ten times fewer pixels in each of the horizontal and vertical directions). Alternatively or in addition, the lower resolution used for reporting may be reduced by at least fifty times in each dimension compared to the higher resolution used for detection (fifty times fewer pixels in each of the horizontal and vertical directions).
- the lower resolution used for reporting may be reduced by at least one hundred times in each dimension compared to the higher resolution used for detection (one hundred times fewer pixels in each of the horizontal and vertical directions).
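- as an illustrative sketch, such a lower resolution image can be obtained by block-averaging the full resolution frame (the grid size and the cropping to divisible dimensions are assumptions):

```python
import numpy as np

def to_light_grid(frame, grid=(10, 10)):
    """Reduce a high resolution monochrome frame (2-D array) to a coarse
    grid of mean light levels, one value per cell - the very low
    resolution image reported externally."""
    h, w = frame.shape
    gh, gw = grid
    cropped = frame[: h - h % gh, : w - w % gw]  # make dimensions divisible
    return cropped.reshape(gh, cropped.shape[0] // gh,
                           gw, cropped.shape[1] // gw).mean(axis=(1, 3))
```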
- the sensor unit 3 may automatically and unconditionally report the lower resolution image to the external processing apparatus 20 each time an image is captured, or may report it periodically. Alternatively, the sensor unit 3 may automatically report the lower resolution image to the external processing apparatus 20 only in response to an event, e.g. whenever a local automated check performed by the local image processing code 12a determines the image does not conform to an empirical or analytical expectation, or in response to the local image processing code 12a detecting a debug sequence signalled in the illumination from one or more of the luminaires 4.
- Figure 10 shows a flowchart for another synchronization method, which is performed in the third example described below.
- the third example is illustrated in figure 11, the description of which is interleaved with that of figure 10.
- a synchronization code is embedded in the visible light outputted by at least one of the luminaires 4 at a first time, and used by the sensors 3 as a reference point for synchronization. That is, the synchronization code defines a reference time that is global across the system.
- the illumination of one of the luminaires 4 falls within the fields of view of two adjacent vision sensors 3a, 3b.
- the synchronization code when embedded in its illumination is detected by both sensors 3a, 3b (S54).
- the synchronization code is a modulation sequence, shown as 17, and can be embedded in the illumination by modulating any characteristic of the light (e.g. using any one or more of amplitude, frequency and/or phase).
- the luminaire 4 may dim its illumination according to the sequence 17, and each vision sensor 3a, 3b reports its time with respect to the perceived dimming sequence, as explained below.
- Information from each vision sensor 3a, 3b is synchronized at the central processing apparatus 20 during a synchronization phase.
- This phase may be performed repeatedly at certain intervals chosen so as not to disturb users (e.g. during intervals of global un-occupancy of the environment).
- the luminaire is dimmed with the sequence 17, which allows for timing retrieval (e.g. based on Gold codes).
- Each vision sensor 3a, 3b knows the sequence 17 (e.g. received from the central processing apparatus 20) and thus can determine time locally with respect to the start of the sequence 17.
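- a minimal sketch of locating the start of the known sequence 17 in the perceived light levels by sliding correlation (the zero-mean normalization is an illustrative choice; a real system would rely on the sequence's autocorrelation properties, e.g. a Gold code):

```python
import numpy as np

def find_sequence_start(levels, sequence):
    """levels: 1-D array of light levels perceived by the vision sensor;
    sequence: the known dimming sequence 17. Returns the sample index at
    which the sequence best aligns, which then serves as the local zero
    of the globally shared reference time."""
    scores = np.correlate(levels - levels.mean(),
                          sequence - sequence.mean(), mode="valid")
    return int(np.argmax(scores))
```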
- the dimming sequence can be transmitted using so-called "visible light communication" techniques, and thus a continuous (and more accurate) synchronization can be achieved, as the synchronization code when embedded using visible light communication is not perceptible to the human eye.
- for example, the code may be embedded using a modulation technique that does not introduce frequency components below 100 Hz.
- when a person is detected at a second time, each vision sensor communicates to the central processing apparatus 20 a people counting metric of the kind described above (e.g. a people count, location information etc.) and a timestamp denoting that second time relative to the start of the dimming sequence (i.e. the first time), along with its vision sensor ID. That is, the timestamp is expressed as an elapsed time from the synchronization code.
- the timestamp is thus generated relative to the global reference time, such that the timestamps outputted by all the vision sensors 3 in the system are mutually consistent. That is, the timestamps are globally synchronized across the sensor system.
- the people counting metric has lower information content than the image(s) from which it is generated, in this case one or more images captured by the camera of the vision sensor.
- the person(s) is not identifiable in the people counting metric even if they are identifiable in that image(s).
- the central processing apparatus 20 arranges the received information in chronological time with respect to the start of the dimming sequences for each vision sensor 3.
- a combination of the second and third embodiments may be used.
- in the event of an occupancy change, a luminaire may be turned on following a pre-defined dimming sequence, which is both detectable at the vision sensors 3 in accordance with the third embodiment and leads to correlations in their outputs detectable at the central processing apparatus 20 in accordance with the second embodiment, with both types of synchronization being performed to maximize accuracy.
- the central processing apparatus 20 uses the presence information and synchronized time stamps to provide an estimated people count over the total area covered by the sensors 3, for any desired time.
- Each of the second times can be conveyed, to the external processing apparatus 20, relative to the first time (of the synchronization code) in a number of ways, for example as a difference value obtained by subtracting the first time from the second time; or by supplying the first time and the second time to the external apparatus 20, such that the external processing apparatus 20 is able to compute the second time relative to the first time itself. In the latter case, the two times do not need to be outputted to the people counting apparatus simultaneously.
- Figure 6 shows a schematic block diagram of the sensing system, in which the vision sensors are integrated in the luminaires in the manner of figure 2B, to provide an accurate people count 64 based on the synchronized data (synchronized according to one or more of the above embodiments).
- while in the examples above the sensor units comprise sensor devices in the form of visible or infrared cameras, other types of sensor device are also viable.
- a computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.
Abstract
A person detection sensor unit comprises: a communication interface; a sensor device configured to capture, over time, sensor data of an area covered by the sensor device; and a processor configured to perform the following steps: detecting, in a first portion of the sensor data captured at a first time, a predetermined synchronization code, and measuring the first time based on a clock signal available at the sensor unit; detecting, in a second portion of the sensor data captured at a second time, at least one person present in the area, and measuring the second time based on the clock signal; based on said detection of said person at the sensor unit, generating, from the second portion of the sensor data, presence data for the second time relating to said detected person; and outputting, via the communication interface, the presence data for the second time together with associated timing data, which conveys the second time measured at the sensor unit relative to the first time measured at the sensor unit. A corresponding method and computer program are also disclosed, as well as a sensor system comprising the sensor unit.