WO2020104254A1 - Système de comptage de personnes doté de régions de détection agrégées - Google Patents

Système de comptage de personnes doté de régions de détection agrégées

Info

Publication number
WO2020104254A1
Authority
WO
WIPO (PCT)
Prior art keywords
occupant
aggregated
detection regions
variability
people counting
Prior art date
Application number
PCT/EP2019/081033
Other languages
English (en)
Inventor
Ashish Vijay Pandharipande
Meng Zhao
Original Assignee
Signify Holding B.V.
Priority date
Filing date
Publication date
Application filed by Signify Holding B.V. filed Critical Signify Holding B.V.
Publication of WO2020104254A1

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • H05B47/115Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B47/125Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • A people counting system with aggregated detection regions
  • the invention relates to a people counting system, a people counting device, an aggregation device, a people counting method, and a computer readable storage medium.
  • Counting how many people are present in an environment enables various useful applications. For instance, in an office environment, knowing how often a particular office, desk, or meeting room is typically occupied makes it possible to optimize space in the office, for instance by adjusting the number of meeting rooms or desks according to needs. For the increasingly popular “flex” offices, in which employees do not have a personal desk but instead select their desk on a day-by-day basis or even switch desks multiple times per day, having information about occupancy of workplaces is of particular importance. For instance, based on people count data, employees may be pointed to places where desks are available, and office management may decide to adjust the number of workplaces based on the needs of their employees.
  • International patent application WO 2017/080929 A1 provides a people counting system in which a plurality of vision sensors is arranged to provide sensor coverage of an area.
  • a local image processor is configured to apply a local person detection algorithm to images captured by a respective vision sensor, thereby generating a local presence metric representative of a number of detected people.
  • a central processor estimates the total number of people in the area covered by the vision sensors by applying an aggregation algorithm to the local presence metrics generated by the local image processors.
  • the people counting system enables opt-outs by a user device associated with a person in the area, for instance, somebody who does not want to reveal information related to their presence.
  • However, the way in which existing systems, such as that of WO 2017/080929 A1, provide people count data is not always satisfactory.
  • In such systems, only a total number of people in an area covered by a sensor and/or an overall total number of people are reported.
  • the opt-out in the existing system relates to a particular user.
  • people count statistics for a particular area are influenced not just by occupancy of that area but also by how many occupants happen to have provided an opt-out.
  • the only choice the user has is between being included in the people count or not being included in the people count at all.
  • more subtle privacy preferences of users and/or privacy preferences by other stakeholders such as facility managers cannot be considered by the system.
  • a people counting system is provided.
  • Image data is extracted from a vision sensor, and occupancy data is extracted from the image data for each detection region in a set of detection regions.
  • the extracted occupancy data is aggregated into people count data for each aggregated detection region in a set of aggregated detection regions.
  • the set of aggregated detection regions is determined based on occupant variability measures of the respective detection regions to effect increased aggregation of occupancy data of detection regions with a low occupant variability measure. Determination of the set of aggregated detection regions comprises identifying a detection region whose occupant variability does not exceed a given threshold and grouping said detection region into an aggregated detection region whose aggregated occupant variability is at least the given threshold.
  • Aggregating occupancy data into people count data based on aggregated detection regions determined from occupant variability measures has the advantage that more use can be made of occupancy data, while it is still aggregated when needed. For instance, detection regions with a low occupant variability measure are aggregated into larger aggregated detection regions to ensure increased aggregation, whereas detection regions with higher occupant variability measures can be aggregated into smaller aggregated detection regions to allow more use of the data.
  • the larger aggregate detection regions may comprise one or more detection regions with a low occupant variability measure or may comprise at least two detection regions with a low occupant variability measure.
  • occupant variability measures are determined for particular detection regions. This allows increased aggregation without the need to identify particular persons to process their opt-out for people counting.
  • people count data for the same aggregated detection regions may be determined at different points in time regardless of the presence of particular persons, obtaining more stable statistics.
  • the people counting system is a lighting system, the vision sensor being co-located with a luminaire of the lighting system. This allows hardware and/or software to be combined in a single system and/or luminaires to be used to light the area that the vision sensor obtains image data from.
  • a set of occupant features of an occupant of a detection region is extracted from the image data, and an occupant variability measure of the detection region is determined from sets of occupant features extracted during a time interval.
  • the occupant variability measure can be determined automatically based on actual usage trends of the detection region.
  • the set of occupant features comprises one or more of a head shape, a head-shoulder shape, a hair color, and a whorl; these features are effective for determining an occupant variability measure.
  • the extracted sets of occupant features are stored in a memory grouped according to a feature type; because the link between different features extracted from the same image data is lost by storing them in this way, less sensitive information is stored while the occupant variability can still be determined.
  • determining the occupant variability measure comprises computing a feature diversity of a feature over multiple extracted sets of occupant features, giving an indication of the amount of variation in this particular feature that can be combined with other information to obtain the occupant variability measure.
  • determining the occupant variability measure comprises clustering extracted sets of occupant features. Because extracted sets of occupant features of the same person are expected to be more similar than extracted sets of occupant features of different persons, clustering extracted sets of occupant features can be used to determine which occupant features are of the same person.
  • the occupant variability measure is indicative of a number of different occupants of the detection region during a time interval; this helps detection regions with a low number of different occupants such as personal workplaces to be aggregated into larger aggregated detection regions.
  • the aggregated detection regions are determined in order to satisfy an anonymity property with respect to the occupant variability measures.
  • the anonymity property means that the chance that an individual person can be identified is below a certain value, e.g., below 2 %, preferably below 1 %, more preferably below 0.1 %, and even more preferably below 0.01 %.
  • the anonymity property can guarantee increased aggregation of occupancy data.
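  • As a rough illustration only: if the aggregated occupant variability of an aggregated detection region is read as an estimated number of distinct occupants, the chance of singling out an individual is at most roughly the reciprocal of that number; under that assumption (one possible reading, not the only one), the threshold needed to satisfy such an anonymity property follows directly from the target probability, as in the sketch below.

```python
def required_aggregated_variability(max_identification_chance: float) -> float:
    """Minimum aggregated occupant variability, read as an estimated number of
    distinct occupants, such that the chance of identifying an individual is at
    most max_identification_chance (e.g., 0.01 for 1 %). Assumed interpretation."""
    return 1.0 / max_identification_chance

def satisfies_anonymity(aggregated_variability: float,
                        max_identification_chance: float = 0.01) -> bool:
    """Check the anonymity property for one aggregated detection region."""
    return aggregated_variability >= required_aggregated_variability(max_identification_chance)
```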
  • determining the aggregated detection regions comprises identifying a detection region with a low occupant variability and grouping said detection region into an aggregated detection region with a size exceeding a given threshold. This ensures increased aggregation of such a detection region.
  • determining the aggregated detection regions comprises computing an aggregated variability measure for an aggregated detection region and verifying that the aggregated variability measure exceeds a given threshold. This way, it is ensured that each aggregated detection region has at least a particular aggregated variability measure, and that, as a consequence, detection regions with a low occupant variability are increasingly aggregated together with other detection regions.
  • a people counting device for use in a people counting system.
  • the people counting device comprises a vision sensor, an image processing unit, and an occupancy processing unit.
  • the people counting device extracts occupancy data from image data obtained from the vision sensor and aggregates it into people count data. Thereby, people count data can be obtained without occupancy data needing to leave the people counting device.
  • an aggregation device for use in a people counting system.
  • the aggregation device obtains extracted occupancy data from one or more people counting devices and aggregates it into people count data.
  • people count data is obtained close to the people counting devices, but without the need to configure the people counting devices to perform the aggregation, determine or otherwise obtain occupant variability measures and/or determine the set of aggregated detection regions.
  • a further aspect concerns a people counting method.
  • An embodiment of the method may be implemented on a computer as a computer implemented method, or in dedicated hardware, or in a combination of both.
  • Executable code for an embodiment of the method may be stored on a computer program product.
  • Examples of computer program products include memory devices, optical storage devices, integrated circuits, servers, online software, etc.
  • the computer program product comprises non-transitory program code stored on a computer readable medium for performing an embodiment of the method when said program product is executed on a computer.
  • the people counting method described herein may be applied in a wide range of practical applications. Such practical applications include space optimization; workspace recommendation; and heating, ventilation, and air conditioning (HVAC) control.
  • the computer program comprises computer program code adapted to perform all the steps of an embodiment of the method when the computer program is run on a computer.
  • the computer program is embodied on a computer readable medium.
  • Another aspect of the invention provides a method of making the computer program available for downloading. This aspect is used when the computer program is uploaded into, e.g., Apple’s App Store, Google’s Play Store, or Microsoft’s Windows Store, and when the computer program is available for downloading from such a store.
  • Fig. la schematically shows an example of an embodiment of a people counting system
  • Fig. lb schematically shows an example of an embodiment of a people counting system
  • Fig. 2 schematically shows an example of an embodiment of a people counting system
  • Fig. 3 schematically shows an example of an embodiment of a people counting system
  • Fig. 4 schematically shows an example of an embodiment of a people counting system
  • Fig. 5a schematically shows an example of an embodiment of a luminaire
  • Fig. 5b schematically shows an example of an embodiment of a people counting device
  • Fig. 6a schematically shows an example of an embodiment of a people counting method
  • Fig. 6b schematically shows a computer readable medium having a writable part comprising a computer program according to an embodiment
  • Fig. 6c schematically shows a representation of a processor system according to an embodiment.
  • Fig. la schematically shows an example of an embodiment of a people counting system 100.
  • People counting system 100 counts people in an environment 160 in which the people counting system is installed.
  • environment 160 may be an indoor space, such as one or more rooms and/or corridors, or part thereof; a partially-covered space such as a stadium or gazebo, or part thereof; environment 160 may also be an outdoor space.
  • People counting system 100 comprises one or more vision sensors.
  • the figure shows two vision sensors 110, 111, but people counting system 100 may also comprise just one vision sensor, or more than two vision sensors, for example, at least 10 or at least 50.
  • each vision sensor is comprised in a respective people counting device, various embodiments of which are described herein.
  • People counting system 100 may be a lighting system wherein vision sensors are co-located with luminaires of the lighting system, as discussed in more detail below.
  • Vision sensors, e.g., vision sensor 110 or 111, are also known as image capture devices; e.g., vision sensor 110 and/or 111 may be a camera.
  • a vision sensor is configured to capture image data.
  • the vision sensor may be able to detect light, for example, light from luminaires of a lighting system of which the vision sensor is part.
  • the vision sensor is a visible light camera.
  • the use of a thermal camera is not excluded.
  • a vision sensor can provide much richer data than a conventional passive infrared sensor that is commonly used, e.g., in lighting systems.
  • the image data is image data of a sensor area of the vision sensor. For example, shown in the figure are respective sensor areas 151, 152 of the vision sensors 110, 111. Sensor areas of respective vision sensors may be overlapping. It is also possible that parts of environment 160 are not covered by any sensor area: for instance, a part of a floor area that does not contain any desk may not need to be covered by any sensor area.
  • a sensor area of a vision sensor may comprise one or more detection regions for which occupancy data is extracted from the image data captured by vision sensors 110, 111.
  • a detection region of a sensor area may be a predefined part of the sensor area, for example, predefined by a user. For instance, a detection region may be defined when commissioning the people counting system.
  • a detection region is preferably an area that is typically expected to be occupied by at most one person, e.g., a workspace such as a desk or an area for sitting at such a desk, a cubicle, an office with a single occupant, etcetera. In such a case, occupancy data may indicate whether or not the detection region is occupied.
  • detection regions 154, 155 and 156 each comprising a single workplace (shown in darker grey) at a large desk (shown in lighter grey).
  • a detection region is occupied by more persons and/or is typically expected to be occupied by more persons, for example, the detection region may comprise a meeting room, a coffee area, a square, etcetera, in which case occupancy data may indicate a number of people occupying the detection region.
  • the detection regions shown in the figure are rectangles, this is not necessary: detection regions may also be circles, polygons, etcetera.
  • a vision sensor 110, 111 may cover around 5 to 15 detection regions, e.g., desks; a typical sensor area 151, 152 may be around 20 to 30 square meters.
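  • As a minimal sketch of how such detection regions might be represented, the following Python fragment models rectangular regions within a sensor area and tests whether an image-plane position falls inside one; the class, field names, and coordinates are hypothetical, and regions may of course also be circles or polygons as noted above.

```python
from dataclasses import dataclass

@dataclass
class DetectionRegion:
    """Axis-aligned rectangular detection region within a sensor area (illustrative only)."""
    region_id: str
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        """Return True if the image-plane position (x, y) lies inside this region."""
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

# Example: three single-workplace regions at a large desk, cf. regions 154-156 in Fig. 1a.
regions = [
    DetectionRegion("154", 0.0, 0.0, 1.2, 0.8),
    DetectionRegion("155", 1.2, 0.0, 2.4, 0.8),
    DetectionRegion("156", 2.4, 0.0, 3.6, 0.8),
]
```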
  • An aggregated detection region may comprise one or more detection regions.
  • aggregated detection region 157 shown in the figure comprises six detection regions corresponding to the six workplaces (shown in darker grey) it comprises.
  • aggregated detection region 158 comprises three detection regions
  • aggregated detection region 159 comprises four detection regions.
  • all detection regions comprised in an aggregated detection region are from a single sensor area. For instance, this allows the same device that extracts image data of the sensor area to perform aggregation.
  • an aggregated detection region comprises detection regions of multiple sensor areas. This provides more flexibility in selecting which detection regions are comprised in which aggregated detection region.
  • People counting system 100 may aggregate the extracted occupancy data into people count data for each aggregated detection region in the set of aggregated detection regions.
  • people count data for an aggregated detection region may be a number representing a number of people in detection regions of the aggregated detection region; other examples are provided elsewhere.
  • the set of aggregated detection regions is determined according to embodiments described herein.
  • one or more aggregated detection regions may comprise just a single detection region.
  • detection regions 154, 155, and/or 156 may individually form separate aggregated detection regions.
  • at least one aggregated detection region comprises multiple detection regions. There are typically multiple aggregated detection regions.
  • the aggregated detection regions are a partitioning of the set of all detection regions, but it is also possible that some detection regions are not comprised in any aggregated detection region, for example, if a regular occupant of such a detection region has elected to opt out of the people counting system.
  • each detection region is comprised in at most one aggregated detection region.
  • an aggregated detection region comprises adjacent detection regions, facilitating the display of the people count data to a user, but also this is not necessary: in general, an aggregated detection region may comprise any nonempty subset of the set of detection regions.
  • people counting system 100 may achieve that people count data for aggregated detection regions, e.g., aggregated detection region 157, 158 and/or 159, is determined based on image data from vision sensors, e.g., vision sensors 110, 111. Thereby, at least some degree of aggregation of extracted occupancy data is achieved, while still allowing this data to be used, e.g., for space optimization, workspace recommendation, HVAC control, data analytics driven marketing, and the like.
  • Fig. lb schematically shows an example of an embodiment of a people counting system 101 installed in an environment 160, in this case, a room.
  • people counting system 101 comprises four people counting devices 112, 113, 114, and 115, mounted on the ceiling of room 160, and a gateway device 120.
  • Each people counting device may comprise a vision sensor with a respective sensor area.
  • the vision sensor of people counting device 114 has sensor area 153.
  • a gateway device such as gateway device 120 may collect information produced by one or more people counting devices, e.g., people counting devices 112-115, optionally further process the collected information, and provide it to a data collection device 130.
  • the information provided to data collection device 130 may comprise people count data 170 for each aggregated detection region in a set of aggregated detection regions. Note that data collection device 130 may or may not be part of people counting system 101.
  • data collection device 130 obtains information, e.g., people count data 170, from people counting system 101 but is not part of the system itself; in other embodiments, data collection device 130 is comprised in people counting system 101, for instance, as the party that determines the set of aggregated detection regions.
  • people counting system 101 may comprise various units, embodiments of which are described in more detail below.
  • people counting system 101 may comprise an image processing unit configured to obtain image data from a vision sensor, e.g., a vision sensor of a people counting device 112, 113, 114, or 115, and extract occupancy data from the image data for each detection region in a set of detection regions.
  • People counting system 101 may further comprise a region processing unit configured to obtain occupant variability measures of detection regions in the set of detection regions and determine a set of aggregated detection regions based on the occupant variability measures, each aggregated detection region comprising one or more detection regions, to effect increased aggregation of occupancy data of detection regions with a low occupant variability measure.
  • People counting system 101 may also comprise an occupancy processing unit configured to aggregate the extracted occupancy data into people count data 170 for each aggregated detection region in the set of aggregated detection regions.
  • a people counting device of people counting system 101 comprises a vision sensor, an image processing unit, a region processing unit, and an occupancy processing unit as described above; for example, there may be multiple such people counting devices.
  • a people counting device may provide people count data 170 to data collection device 130, e.g., by sending the people count data 170 indirectly via gateway device 120, or by sending it directly.
  • people counting system 101 may not comprise a gateway device.
  • Such embodiments may ensure that data stays as close as possible to where it is collected, thereby reducing communication complexity and improving privacy.
  • such embodiments may be more expensive to produce or maintain because units are duplicated across many devices.
  • one or more of the image processing unit, the region processing unit, and the occupancy processing unit are comprised in other devices such as gateway device 120 or data collection device 130.
  • such embodiments have the advantage that less duplication is needed and/or economies of scale in processing the data may be achieved.
  • the gateway device may comprise at least the occupancy processing unit and may be configured to obtain extracted occupancy data from one or more people counting devices; in such cases, the gateway device may be called an aggregation device.
  • each of the units can either be included in a people counting device or a gateway device.
  • the image processing unit and occupancy processing unit are preferably not included in data collection device 130, e.g., to ensure that data collection device 130 receives more aggregated information.
  • Hybrid forms in which, for instance, a unit is included in some people counting devices but not in others, in which the gateway performs these functions for the latter people counting devices, are also possible.
  • Fig. 2 schematically shows an example of an embodiment of a people counting system 200 comprising a people counting device 210 and a gateway device 211.
  • People counting device 210 comprises a communication interface 250 arranged for digital communication with gateway device 211.
  • Gateway device 211 comprises a communication interface 251 arranged for digital communication with people counting device 210.
  • communication interfaces 250, 251 may communicate over a computer network 260.
  • Computer network 260 is preferably a wireless personal area network, e.g., using ZigBee, IrDA, wireless USB, Bluetooth, or similar.
  • computer network 260 may also be another type of network, e.g., an internet, an intranet, a LAN, a WLAN, etc.
  • communication interfaces 250, 251 may comprise a connector, e.g., a wireless connector, an Ethernet connector, a Wi-Fi, 4G or 5G antenna, a ZigBee chip, etc., as appropriate for computer network 260.
  • Computer network 260 may comprise additional elements, e.g., a router, a hub, etc.
  • the figure shows a single people counting device 210 and a single gateway device 211, but people counting system 200 may comprise multiple people counting devices and/or multiple gateway devices, e.g., gateway device 211 may be arranged for digital communication with a set of multiple people counting devices and its units may be arranged to perform their respective functions for each people counting device from the set of multiple people counting devices.
  • a data collection device 212 may obtain data, e.g., people count data, from people counting system 200.
  • People counting device 210 may comprise a vision sensor 231.
  • Vision sensor 231 may be configured to capture image data 241, e.g., of a sensor area of an environment in which people counting system 200 is installed.
  • People counting device 210 may further comprise an image processing unit 232.
  • Image processing unit 232 may be configured to obtain a set of detection regions. As discussed, a detection region is typically a predefined part of a sensor area of vision sensor 231. For instance, image processing unit 232 may receive the set of detection regions from an external device, e.g., data collection device 212 or a configuration device, e.g., directly or indirectly via gateway 211. The set of detection regions may also be hardcoded, or it may be determined by people counting device 210, e.g., by detecting that a particular area of the sensor area of vision sensor 231 is typically occupied by a single person.
  • Image processing unit 232 may be further configured to obtain image data 241 from vision sensor 231. Moreover, image processing unit 232 may be configured to extract occupancy data 242 from image data 241 for each detection region in the set of detection regions.
  • occupancy data 242 of a detection region is a value, e.g., a nonnegative integer, indicative of a number of occupants according to image data 241.
  • occupancy data 242 of a detection region is a value, e.g., a binary value, indicative of whether the detection region is occupied according to image data 241.
  • occupancy data 242 of a detection region is a probability value, e.g., a value between zero and one, indicative of whether the detection region is occupied according to image data 241.
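  • A hedged sketch of how these forms of occupancy data could be derived from per-image person detections, reusing the DetectionRegion sketch above; the detection format (position plus confidence) and the helper name are assumptions, not prescribed here.

```python
def extract_occupancy(regions, detections, mode="count"):
    """Derive occupancy data per detection region from person detections.

    regions: iterable of objects with region_id and contains(x, y).
    detections: list of (x, y, confidence) tuples produced by a person
    detection algorithm (assumed format).
    mode: "count" -> nonnegative integer, "binary" -> 0 or 1,
          "probability" -> value between zero and one.
    """
    occupancy = {}
    for region in regions:
        hits = [conf for (x, y, conf) in detections if region.contains(x, y)]
        if mode == "count":
            occupancy[region.region_id] = len(hits)
        elif mode == "binary":
            occupancy[region.region_id] = 1 if hits else 0
        else:  # probability that at least one detection in the region is a person
            p_empty = 1.0
            for conf in hits:
                p_empty *= (1.0 - conf)
            occupancy[region.region_id] = 1.0 - p_empty
    return occupancy
```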
  • image processing unit 232 may apply a local person detection algorithm producing a local presence metric, as disclosed in international patent application WO 2017/080929 A1, titled “Image processing system” (incorporated herein by reference).
  • the image processing may be real-time, in the sense that image processing unit 232 repeatedly extracts occupancy data 242 as new image data 241 is captured; it may be periodic, e.g., new occupancy data 242 is determined every second or every few seconds, e.g., two or ten seconds; it may be pseudo-real-time, e.g., new occupancy data is determined every minute or every few minutes, or every hour; or it may be pseudo-static, e.g., new occupancy data 242 is determined in response to obtaining an instruction, e.g., of gateway device 211 or data collection device 212. Occupancy data 242 may be generated over a time window, e.g., based on image data 241 extracted within that time window, e.g., allowing artifacts to be filtered out or more accurate person detection to be used.
  • Image processing unit 232 may be further configured to extract from image data 241 a set of occupant features 246 of an occupant of a detection region. Such occupant features may allow people counting device 210 to determine an occupant variability measure 243, as elaborated on below.
  • a feature is typically a value, e.g., an integer or a floating-point value, indicative of a particular physical property of an occupant.
  • set of occupant features 246 comprises one or more of a head shape, a head-shoulder shape, a hair color, and a whorl; these features have been found to be suitable for determining variability measures.
  • image processing unit 232 may also extract a different set of occupant features and/or use a different method for extracting them.
  • the occupant features 246 extracted during a time interval are suitable for determining whether the detection region is occupied by a fixed user or by different users during that time interval, e.g., occupant features of the same person extracted from different image data 241 may be on average closer according to a distance metric than occupant features of different persons extracted from different image data 241.
  • extracting the set of occupant features 246 does not typically involve identifying the occupant of the detection region, decreasing the amount of processing of sensitive information.
  • the extraction of occupant features 246 may happen at the same time as extracting the occupancy data.
  • image processing unit 232 may also apply a different schedule for extracting the occupant features, e.g., periodically, in pseudo-real-time, or pseudo-statically. For instance, occupant features 246 may be extracted less frequently than occupancy data 242, allowing computational resources to be used more effectively.
  • People counting device 210 may further comprise a variability processing unit 235 configured to determine an occupant variability measure 243 of a detection region from sets of occupant features 246 extracted during a time interval.
  • occupant variability measure 243 is indicative of whether a workspace in the detection region is used by a fixed user or by different users during the time interval.
  • occupant variability measure 243 may be binary, denoting either a fixed user or different users.
  • Occupant variability measure 243 may also be indicative of a number of different occupants of the detection region during the time interval, for example, it may be an integer or floating-point value representing an estimate of the number of different occupants during the time interval.
  • Occupant variability measure 243 may also measure a degree of uncertainty, e.g., entropy, e.g., measured in bits, about the occupant of the detection region during the time interval.
  • the time interval should be sufficiently large to detect trends, while being short enough to detect changes; in an embodiment, the time interval spans at most or at least one or multiple days, for instance, one, two, or more weeks.
  • Determining occupant variability measure 243 may be performed in various ways.
  • Variability processing unit 235 may identify an occupant of the detection region based on occupant features 246 and determine occupant variability measure 243 therefrom;
  • identification is preferably avoided for privacy reasons.
  • Variability processing unit 235 may also determine occupant variability measure 243 from occupant features 246 by classifying a vector comprising occupant features 246 over a time interval using a trained classifier.
  • the classifier may determine occupant variability measure 243 in the form of a classification of the vector of features as flex or non-flex, or with a more granular classification.
  • variability processing unit 235 determines occupant variability measure 243 by clustering extracted sets of occupant features 246.
  • Clustering, e.g., k-means clustering, hierarchical clustering, or distribution-based clustering, provides insight into similarities between multiple extracted sets of occupant features. For instance, a number of clusters identified by the clustering algorithm may indicate a number of different occupants. As such, occupant variability measure 243 may be based on a number of clusters identified by the clustering algorithm.
  • even when a detection region is occupied by a single person, a clustering algorithm may still divide the sets of occupant features into multiple clusters, e.g., corresponding to different postures.
  • variability processing unit 235 may configure the clustering algorithm, e.g., based on expected variations between sets of occupant features of the same person and of different persons. Variability processing unit 235 may also determine a quality metric of the clustering. For instance, variability processing unit 235 may determine that the detection region has low occupant variability, e.g., is non-flex, if a clustering quality metric exceeds a given threshold. Variability processing unit 235 may also use quality metrics to post-process the output of the clustering algorithm, e.g., by comparing an average distance of points between two different clusters to a predefined threshold, and merging clusters if the average distance is below the threshold.
  • variability processing unit 235 may also cluster occupant features corresponding to multiple detection regions, e.g., all detection regions of people counting device 210; in such a case, occupant variability measure 243 of a particular detection region may be determined based on matching sets of occupant features in a cluster to the detection region they were extracted from, for instance, based on a cluster quality metric such as an F-measure or a Fowlkes-Mallows index.
  • for instance, if the sets of occupant features extracted for a detection region largely fall into a single cluster, variability processing unit 235 may determine that the detection region has low occupant variability, e.g., it is non-flex.
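  • A minimal sketch of such a clustering-based determination of the occupant variability measure, assuming scikit-learn is available; hierarchical (agglomerative) clustering with a distance threshold is used so that clusters closer than the threshold are merged, in the spirit of the post-processing described above. The feature format and the merge distance are assumptions.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

def occupant_variability_from_features(feature_sets, merge_distance=1.0):
    """Estimate an occupant variability measure for one detection region.

    feature_sets: array-like of shape (n_observations, n_features); each row is
    one extracted set of occupant features (e.g., head shape, head-shoulder
    shape, hair color) collected during the time interval of interest.
    merge_distance: sets of features closer than this are treated as the same
    occupant (assumed tuning parameter).

    Returns the estimated number of distinct occupants: 1 suggests a fixed
    (non-flex) workplace, larger values suggest higher occupant variability.
    """
    X = np.asarray(feature_sets, dtype=float)
    if len(X) < 2:
        return len(X)
    clustering = AgglomerativeClustering(
        n_clusters=None, distance_threshold=merge_distance, linkage="average"
    )
    clustering.fit(X)
    return int(clustering.n_clusters_)
```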
  • variability processing unit 235 determines occupant variability measure 243 using a method for which extracted occupant features do not need to be grouped according to image data 241 from which they were extracted.
  • a particularly beneficial way of organizing a memory may be used, an example of an embodiment of which is schematically shown in Fig. 5b.
  • this figure shows a people counting device 510’ comprising a vision sensor 531’ configured to capture image data 541, an image processing unit 532 configured to extract from image data 541 sets of occupant features of an occupant of a detection region, and a variability processing unit 535 configured to determine occupant variability measure 543 of the detection region from sets of occupant features extracted during a time interval.
  • Fig. 5b shows functional units that may be functional units of a processor.
  • extracted sets of occupant features 546 are stored in a memory, e.g., a memory of people counting system 510’, grouped according to a feature type. For instance, shown in the figure are a first set of features 546.1 of a first feature type, e.g., head shape, and a second set of features 546.2 of a second feature type different from the first feature type.
  • the leftmost feature of set 546.1 may have been extracted from the same image data as the leftmost feature of set 546.2, or the rightmost feature of set 546.2, etc.
  • Storing features grouped by feature type, as opposed to grouped by the image from which they are extracted, has the advantage that it is harder to identify particular persons from the stored occupant features 546, thereby improving privacy and data minimization. For instance, it may be possible to identify a person with some confidence based on a set of occupant features from the same image data 541, but this may not be possible if it is not known which features were from the same image data, as is the case with occupant features 546.
  • individual occupant features may be stored in various ways, e.g., sorted by their value, or as a count for each possible value of the occupant feature.
  • Set 546 may be reset, e.g., at the start of a new time interval.
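  • A small sketch of this per-feature-type memory organization, with illustrative names: features extracted from an image are appended to separate lists keyed by feature type, so the link between features originating from the same image data is deliberately not retained.

```python
from collections import defaultdict

class FeatureStore:
    """Stores extracted occupant features grouped by feature type (illustrative)."""

    def __init__(self):
        self._by_type = defaultdict(list)

    def add(self, features):
        """features: mapping from feature type to value, e.g.
        {"head_shape": 0.73, "hair_color": 12}; each value is stored in its own
        per-type list, so the per-image grouping is lost."""
        for feature_type, value in features.items():
            self._by_type[feature_type].append(value)

    def values(self, feature_type):
        return list(self._by_type[feature_type])

    def reset(self):
        """Clear the store, e.g., at the start of a new time interval."""
        self._by_type.clear()
```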
  • one method to determine occupant variability measure 543, which may be applied by variability processing unit 535 using the memory organization discussed above, or by other variability processing units and/or using other memory organizations, involves computing feature diversities.
  • a feature diversity may indicate an amount of variation in values of the feature, e.g., as measured by a standard deviation or a spread of a feature count histogram. For instance, if a set of features is dominant, the histogram spread may be smaller, indicating a higher likelihood of the same user occupying a workspace in the detection region.
  • a feature diversity is an entropy computed over quantized feature values.
  • variability processing unit 535 may first compute a feature diversity of a feature over multiple extracted sets of occupant features, e.g., based on features 546.1 or 546.2 grouped according to the feature type. Then, variability processing unit 535 may combine multiple computed feature diversities of respective features into occupant variability measure 543, for instance, by taking the maximum, taking the minimum, averaging, summing, or any other way appropriate for the used feature diversity. For instance, if the feature diversity is an entropy or a standard deviation, the minimum may be taken.
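  • The following sketch illustrates one possible reading of this feature-diversity approach: each feature is quantized into bins, its diversity is taken as the entropy of the resulting histogram, and the per-feature diversities are combined by taking the minimum; the bin count and combination rule are assumptions chosen from the examples above.

```python
import math
from collections import Counter

def feature_diversity(values, n_bins=8):
    """Entropy (in bits) of quantized feature values; higher means more variation."""
    if not values:
        return 0.0
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0  # avoid zero width when all values are equal
    counts = Counter(min(int((v - lo) / width), n_bins - 1) for v in values)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def occupant_variability(features_by_type):
    """features_by_type: mapping from feature type to the list of observed values
    (cf. the per-type storage above). Combines per-feature diversities by taking
    the minimum, as suggested for entropy-like diversities."""
    if not features_by_type:
        return 0.0
    return min(feature_diversity(v) for v in features_by_type.values())
```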
  • although image processing unit 532 and variability processing unit 535 are components of a people counting device in this particular example, it will be understood that one or more of these units may equally well be components of one or more other devices of a people counting system, e.g., variability processing unit 535 may be a unit of a gateway device, as further elaborated upon elsewhere.
  • the image processing unit and the variability processing unit may be components of different devices, in which case the extracted set of occupant features may be stored grouped according to a feature type in a memory of one or both of the two devices.
  • People counting device 510’ typically contains additional components, e.g., a communication interface and/or an occupancy processing unit, that for ease of exposition are not shown here.
  • variability processing unit 235 receives occupant variability measures 243 for one or more detection regions from an external device, e.g., directly from gateway device 211 or indirectly from another device via gateway device 211.
  • variability processing unit 235 may receive override occupant variability measures for one or more detection regions, and override determined occupant variability measures 243 by the override occupant variability measures or use these values instead of determining occupant variability measures 243 for these detection regions.
  • undesired results of determining the set of aggregated detection regions may be avoided and/or opt-outs for particular detection regions, e.g., provided by an occupant of such a detection region, may be enabled.
  • People counting device 210 may further comprise a region processing unit 233.
  • Region processing unit 233 may be configured to obtain occupant variability measures 243 of detection regions in the set of detection regions, e.g., from variability processing unit 235 as discussed above.
  • Region processing unit 233 may be further configured to determine a set 244 of aggregated detection regions based on occupant variability measures 243.
  • Each aggregated detection region comprises one or more detection regions.
  • the aggregated detection regions are typically disjoint, but not all detection regions need to be comprised in an aggregated detection region: for example, an occupant of a detection region may opt out of people counting system 200, in which case occupancy data 242 of the detection region is not used to compute people count data 245, or not even extracted at all.
  • at least some aggregated detection regions 244 comprise multiple detection regions.
  • Set 244 of aggregated detection regions may be determined to effect increased aggregation of occupancy data of detection regions with a low occupant variability measure.
  • low occupant variability measure may mean that the occupant variability measure is below a given threshold.
  • Such a detection region may be aggregated into an aggregated detection region to effect increased aggregation, e.g., an aggregated detection region of at least a given size, an aggregated detection region of which an aggregated variability measure exceeds a given threshold, and the like.
  • effecting increased aggregation may comprise ensuring that the aggregated detection regions satisfy an anonymity property with respect to the occupant variability measures.
  • anonymity properties are provided below.
  • a low occupant variability measure 243 may indicate that a detection region corresponds to a fixed workspace, e.g., a workspace that is typically occupied by the same person or same few persons.
  • Region processing unit 233 may group such a detection region into an aggregated detection region 244 with at least a given size, e.g., a size exceeding a given or predetermined threshold.
  • detection regions with higher occupant variability measure 243 may be grouped into aggregated detection regions 244 with smaller size, or even into an aggregated detection region 244 of size one.
  • a high occupant variability measure 243 may indicate that a detection region corresponds to a flexible workplace, of which occupancy data 242 may be reported directly.
  • region processing unit 233 identifies multiple aggregated detection regions 244, but, depending on the occupant variability measures 243, there may also be just one aggregated detection region 244.
  • an aggregated detection region comprises adjacent detection regions, e.g., geometrically adjacent regions.
  • an aggregated detection region may be defined by a shape, e.g., a square or a rectangle.
  • Region processing unit 233 may for instance obtain, e.g., generate, a candidate aggregated detection region according to a predefined shape and add such a candidate aggregated detection region to set 244 of aggregated detection regions if it effects increased aggregation, e.g., satisfies an anonymity property or has at least a given aggregated variability measure.
  • Region processing unit 233 may also achieve adjacency by specifying it as a constraint to an optimization problem, as elaborated on below.
  • Region processing unit 233 may use various approaches to determine aggregated detection regions 244. Determining aggregated detection regions 244 may be phrased as an optimization problem in which the objective is to find an optimal division of detection regions into aggregated detection regions subject to satisfying an anonymity property with respect to the occupant variability measures 243. Optimality may mean, for instance, that the number of aggregated detection regions 244 is as large as possible, that the sum of the squares of the sizes of the respective aggregated detection regions is as small as possible, etcetera.
  • the anonymity property may demand increased aggregation of occupancy data of detection regions with a low occupant variability measure. For instance, the anonymity property may demand that all detection regions with occupant variability measure 243 lower than a given value, e.g., all non-flex workplaces, are grouped in an aggregated detection region 244 with size exceeding a given first threshold, and/or that all detection regions with occupant variability measure 243 of at least the given value, e.g., all flex workplaces, are grouped in an aggregated detection region 244 with size exceeding a given second threshold.
  • the optimization problem may demand that all detection regions are included in an aggregated detection region 244, or that a penalty is incurred for any detection region that is not included in any aggregated detection region 244.
  • the optimization problem may also demand that detection regions in an aggregated detection region are adjacent and/or a penalty is incurred for any non-adjacent detection regions in an aggregated detection region.
  • Region processing unit 233 may effect increased aggregation by computing an aggregated variability measure of an aggregated detection region 244, e.g., based on occupant variability measures of the respective detection regions in the aggregated detection region, and verifying that the computed aggregated variability measure exceeds a given threshold.
  • an aggregated variability measure may be a sum of the respective occupant variability measures 243 or another function that at least is non-decreasing as additional detection regions are added to the aggregated detection region. For instance, if an occupant variability measure corresponds to a number of bits of uncertainty about the occupant of a detection region or a number of different occupants of the detection region, then the sum may be a good estimate; a discounting may be applied to the sum to account for the fact that the set of occupants of detection regions that are close to each other are typically not independent from each other. For instance, such an aggregated variability measure may be used to phrase the above optimization problem.
  • Region processing unit 233 may determine set 244 of aggregated detection regions using generic optimization techniques to solve the optimization problem phrased above, for instance, using linear programming, integer linear programming, or heuristic techniques such as hill climbing, simulated annealing, etcetera. Region processing unit 233 may also determine set 244 of aggregated detection regions without specifically solving an optimization problem as above, for instance, by starting with a detection region with low variability, and grouping it with other detection regions with low variability and/or with adjacent detection regions until the obtained aggregated detection region is sufficiently large, e.g., has an aggregated variability exceeding a given threshold, proceeding to a next detection region, etc. For instance, region processing unit 233 may select other detection regions that are adjacent to at least one detection region already in the aggregated detection region.
  • Region processing unit 233 may also include a detection region into a non-adjacent aggregated detection region 244, for example, if no suitable adjacent detection region is found to combine the detection region with.
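  • A sketch of the greedy alternative just described, under assumed inputs: occupant variability measures per detection region, an adjacency relation, and a threshold on the aggregated variability, which is here simply the sum of the member variabilities (a discounting, as mentioned above, could be applied instead).

```python
def aggregate_regions(variability, adjacency, threshold):
    """Greedily group detection regions into aggregated detection regions.

    variability: mapping region id -> occupant variability measure.
    adjacency: mapping region id -> set of adjacent region ids.
    threshold: minimum aggregated variability per aggregated detection region.

    Returns a list of sets of region ids; regions whose own variability already
    meets the threshold end up as singleton aggregated detection regions.
    """
    remaining = set(variability)
    aggregated = []
    # Start from the lowest-variability regions, which need the most aggregation.
    for seed in sorted(variability, key=variability.get):
        if seed not in remaining:
            continue
        group = {seed}
        remaining.discard(seed)
        total = variability[seed]
        while total < threshold and remaining:
            # Prefer a remaining region adjacent to the group; fall back to any
            # remaining region if no adjacent one is available (cf. above).
            candidates = {r for g in group for r in adjacency.get(g, set())} & remaining
            nxt = min(candidates or remaining, key=variability.get)
            group.add(nxt)
            remaining.discard(nxt)
            total += variability[nxt]
        aggregated.append(group)
    # If the last group could not reach the threshold, merge it into the previous one.
    if len(aggregated) > 1 and sum(variability[r] for r in aggregated[-1]) < threshold:
        aggregated[-2] |= aggregated.pop()
    return aggregated
```

  • For instance, with a threshold of 3 and non-flex desks each assigned a variability of 1, such a procedure would group at least three neighbouring non-flex desks into one aggregated detection region, while a flex desk whose variability estimate already meets the threshold could form an aggregated detection region on its own.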
  • People counting device 210 may also comprise an occupancy processing unit 234.
  • Occupancy processing unit 234 may be configured to aggregate occupancy data 242 into people count data 245 for each aggregated detection region in the set of aggregated detection regions 244.
  • People count data 245 of an aggregated detection region may be indicative of a number of people occupying the detection regions of the aggregated detection region.
  • people count data 245 may be an integer or floating-point value, for example, the sum of the occupancy data 242 of respective detection regions, e.g., rounded to the nearest integer.
  • People count data 245 may provide a snapshot at a single point in time, but people count data 245 may also represent a trend, for instance, people count data 245 may comprise an average number of occupants over a time period and/or a standard deviation of the number of occupants. Occupancy processing unit 234 may provide the determined people count data 245 to communication interface 250, which may then provide the people count data to gateway device 211. By sending people count data 245 as opposed to more detailed data such as occupancy data 242 or occupant variability measures 243 of individual detection regions, data minimization and/or data locality is improved.
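  • As a brief sketch of this aggregation step (names assumed), per-region occupancy values are summed over each aggregated detection region and may additionally be averaged over a time window to report a trend rather than a snapshot.

```python
def people_count(occupancy, aggregated_regions):
    """occupancy: mapping region id -> occupancy value (count, 0/1, or probability).
    aggregated_regions: list of sets of region ids.
    Returns people count data per aggregated detection region, rounded to the
    nearest integer as suggested above."""
    return [round(sum(occupancy.get(r, 0) for r in group)) for group in aggregated_regions]

def people_count_trend(occupancy_snapshots, aggregated_regions):
    """Average people count per aggregated detection region over a time window,
    e.g., to report a trend instead of a single point in time."""
    counts = [people_count(snap, aggregated_regions) for snap in occupancy_snapshots]
    if not counts:
        return []
    return [sum(c[i] for c in counts) / len(counts) for i in range(len(aggregated_regions))]
```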
  • Gateway device 211 shown in the figure comprises communication interfaces 251 and 252 and processor 236. As discussed above, communication interface 251 is arranged for digital communication with people counting device 210.
  • Similarly, communication interface 252 is arranged for digital communication with a data collection device 212, e.g., using a computer network 261.
  • Computer network 261 may be an internet, an intranet, a LAN, a WLAN, etc.
  • Computer network 261 may be the Internet.
  • the computer network may be wholly or partly wired, and/or wholly or partly wireless.
  • the computer network may comprise Ethernet connections.
  • the computer network may comprise wireless connections, such as Wi-Fi, ZigBee, and the like.
  • Gateway device 211 and data collection device 212 each comprise a communication interface arranged to communicate with the other device as needed.
  • the communication interface may comprise a connector, e.g., a wired connector, e.g., an Ethernet connector, or a wireless connector, e.g., an antenna, e.g., a Wi-Fi, 4G or 5G antenna.
  • Computer network 261 may comprise additional elements, e.g., a router, a hub, etc.
  • Computer networks 260 and 261 may be the same computer network, and communication interfaces 251 and 252 may be the same communication interface.
  • network 260 may be a network with lower power and/or computational requirements than network 261, and/or a separate network 260 may be used to reduce external exposure of people counting device 210.
  • Processor 236 of gateway device 211 is configured to obtain people count data from one or more people counting devices 210, e.g., multiple people counting devices, and to provide the people count data to data collection device 212.
  • Gateway device 211 may perform further aggregation on the people count data prior to providing it to data collection device 212, e.g., by computing statistics such as a mean and standard deviation over a longer time period, and/or by aggregating people count data from multiple aggregated detection regions into people count data for a larger aggregated detection region. Thereby, improved data minimization with respect to data collection device 212 may be achieved.
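  • As one example of such further aggregation at the gateway, a small sketch computing a mean and standard deviation of people count values over a longer period; this is only one of the possibilities mentioned above.

```python
import statistics

def summarize_people_count(series):
    """series: people count values for one aggregated detection region, collected
    over a longer time period. Returns (mean, standard deviation)."""
    if not series:
        return 0.0, 0.0
    return statistics.fmean(series), statistics.pstdev(series)
```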
  • Data collection device 212 shown in the figure comprises communication interface 253 for digital communication with gateway device 211, as discussed above, as well as a processor 237.
  • Processor 237 may be configured to receive people count data from gateway device 211.
  • processor 237 receives people count data from multiple devices, e.g., multiple gateway devices or multiple people counting devices.
  • Data collection device 212 may perform various services based on the received people count data.
  • data collection device 212 comprises a display showing the environment in which the people counting system is installed together with the obtained people count data, for example, superimposed on a picture of the environment, e.g., a map.
  • the display may show the environment similarly to how environment 160 is shown in Fig. la, with people counts shown for aggregated detection regions such as regions 157, 158, and 159.
  • Data collection device 212 may also provide one or more of a space optimization application; a workspace recommendation application; a heating, ventilation, and air conditioning (HVAC) control application; and the like, and/or provide data to other devices offering such applications.
  • a space optimization application may use people count data to identify temporal and spatial usage patterns.
  • a workspace recommendation application may show people count data in real-time or pseudo-real-time to direct users to potentially unoccupied workplaces, e.g., by showing availability of workspaces within an aggregated detection region, etcetera.
  • In this way, relevant services can be offered based on occupancy data obtained from vision sensors, while adhering to the data minimization principle by ensuring increased aggregation of occupancy data of detection regions with a low occupant variability measure.
  • Fig. 2 shows functional units that may be functional units of the processor.
  • the devices shown in Fig. 2 may be used as a blueprint of a possible functional organization of the processor.
  • the processor is not shown separate from the units in the figure.
  • the functional units shown in Fig. 2 may be wholly or partially implemented in computer instructions that are stored at the respective devices, e.g., people counting device 210, e.g., in an electronic memory of people counting device 210, and are executable by a microprocessor of people counting device 210.
  • Fig. 2 also shows various data elements. Such data elements are typically stored in a memory of a respective device, e.g., people counting device 210 may comprise a memory configured to store image data 241, occupancy data 242, occupant variability measures 243, aggregated detection regions 244, people count data 245, and/or occupant features 246.
  • Fig. 3 schematically shows an example of an embodiment of a people counting system 300.
  • People counting system 300 comprises a people counting device 310 and a gateway device 311. Together, the devices determine people count data 345, based on image data from a vision sensor 331, that may be provided to external devices, e.g., data collection device 312 as shown in the figure.
  • functional units are shown that may be functional units of the processor. Although similar units are shown in Fig. 3 as in Fig. 2, in this particular example the units are distributed differently over the devices of the people counting system.
  • data elements shown in the figure are typically stored in a memory of the respective device.
  • people counting device 310 comprises a vision sensor 331, e.g., vision sensor 231 or similar; an image processing unit 332, e.g., image processing unit 232, 532, or similar; and a communication interface 350, e.g., communication interface 250 or similar.
  • Vision sensor 331 may be configured to capture image data 341.
  • Image processing unit 332 may obtain image data 341 from vision sensor 331, and extract from image data 341 occupancy data 342 for each detection region in a set of detection regions.
  • Image processing unit 332 may also extract from image data 341 a set of occupant features 346 of an occupant of a detection region.
  • Image processing unit 332 may send occupancy data 342 and/or occupant features 346 to gateway device 311, using communication interface 350 over computer network 360.
  • Gateway device 311 in this example comprises a communication interface 351, e.g., communication interface 251 or similar; a variability processing unit 335, e.g., variability processing unit 235, 535, or similar; a region processing unit 333, e.g., region processing unit 233 or similar; an occupancy processing unit 334, e.g., occupancy processing unit 234 or similar; and a communication interface 352, e.g., communication interface 252 or similar.
  • Variability processing unit 335 may be configured to receive occupant features 346 from people counting device 310, e.g., using communication interface 351, and determine occupant variability measure 343 of a detection region from sets of occupant features 346 extracted from image data of the detection region during a time interval by people counting device 310.
  • Region processing unit 333 may be configured to obtain the determined occupant variability measures 343 from variability processing unit 335 and determine a set 344 of aggregated detection regions based on occupant variability measures 343.
  • Each aggregated detection region may comprise one or more detection regions to effect increased aggregation of occupancy data of detection regions with a low occupant variability measure 343.
  • Occupancy processing unit 334 may be configured to receive the extracted occupancy data 342 from people counting device 310, e.g., via communication interface 351, and to aggregate the extracted occupancy data 342 into people count data 345 for each aggregated detection region in the set 344 of aggregated detection regions. Occupancy processing unit 334 may further send people count data 345 to data collection device 312, e.g., using communication interface 352 over computer network 361.
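  • The aggregation step itself can be sketched as a simple sum of per-region occupancy over each aggregated detection region; the dictionary-based data layout below is an assumption for illustration.

```python
# Sketch: aggregate per-region occupancy data into people count data per
# aggregated detection region. The dict-based data layout is an assumption.
from typing import Dict, List


def people_count(occupancy: Dict[str, int],
                 aggregated_regions: List[List[str]]) -> Dict[str, int]:
    """Return one count per aggregated detection region (keyed by joined region ids)."""
    return {
        "+".join(group): sum(occupancy.get(r, 0) for r in group)
        for group in aggregated_regions
    }


# Example: two fixed desks reported jointly, one flex desk reported individually.
print(people_count({"r1": 1, "r2": 0, "r3": 1}, [["r1", "r2"], ["r3"]]))
# {'r1+r2': 1, 'r3': 1}
```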
  • an aggregated detection region comprises detection regions corresponding to image data from vision sensors of multiple people count devices.
  • Data collection device 312 may be similar to data collection device 212, e.g., its communication interface 353 may be similar to communication interface 253 and its processor 336 may be similar to processor 237, allowing data collection device 312 to, e.g., provide applications based on the people count data such that data minimization is guaranteed while relatively granular data remains available.
  • people counting system 300 may have the advantage of placing a lower burden on its people counting device 310 than system 200 places on people counting device 210, e.g., less software is needed, so fewer software updates are required and/or fewer security risks are posed, and fewer computational resources may be needed.
  • aggregation may be performed at a larger scale, e.g., detection regions from different people count devices may be aggregated into a single aggregated detection region. Thereby, a more globally optimal aggregation may be achieved.
  • occupancy data 342 and occupant features 346 about specific detection regions are received by gateway device 311, so less data locality is achieved than in people counting system 200.
  • Fig. 4 schematically shows an example of an embodiment of a people counting system 400.
  • People counting system 400 comprises a people counting device 410 and a gateway device 411.
  • people counting system 400 also comprises a data collection device 412.
  • together, the devices enable the production of people count data 445 that may be used, e.g., by data collection device 412 and/or by external devices.
  • functional units are shown that may be functional units of the processor.
  • Various units similar to those of people counting systems 200 and 300 are shown but are distributed differently over the various devices.
  • data elements shown in the figure are typically stored in a memory of the respective device.
  • People counting system 400 may comprise multiple people counting devices and/or gateway devices.
  • people counting device 410 comprises a vision sensor 431, e.g., vision sensor 231, 331, or similar; an image processing unit 432, e.g., image processing unit 232, 332, 532, or similar; and an occupancy processing unit 434, e.g., occupancy processing unit 234, 334, or similar.
  • Vision sensor 431 may be configured to capture image data 441.
  • Image processing unit 432 may be configured to obtain the captured image data 441 and extract occupancy data 442 from the image data for each detection region in a set of detection regions.
  • Occupancy processing unit 434 may be configured to aggregate the extracted occupancy data 442 into people count data 445 for each aggregated detection region in a set 444 of aggregated detection regions.
  • Set 444 of aggregated detection regions may be determined by data collection device 412; occupancy processing unit 434 may receive set 444 of aggregated detection regions, e.g., directly from data collection device 412 or indirectly via gateway device 411.
  • People counting device 410 may further comprise a communication interface 450 arranged for digital communication, e.g., over computer network 460, with gateway device 411, e.g., communication interface 250, 350, or similar.
  • Occupancy processing unit 434 may be configured to send people count data 445, e.g., to data collection device 412 or to gateway device 411 for providing it to data collection device 412 and/or to other devices.
  • Gateway device 411 in this example may comprise a communication interface 451 arranged for digital communication with people counting device 410 and optionally additional people counting devices, e.g., using computer network 460, e.g., communication interface 251, 351, or similar.
  • Gateway device 411 may also comprise a processor 436, e.g., processor 236 or similar.
  • Gateway device 411 may similarly comprise a communication interface 452 arranged for digital communication with data collection device 412 and/or other devices, e.g., using computer network 461, e.g., communication interface 252, 352, or similar.
  • Processor 436 may be configured to receive set 444 of aggregated detection regions from data collection device 412 and send it to people counting device 410 and optionally additional people counting devices.
  • Processor 436 may similarly be configured to receive people count data 445 from people counting device 410, optionally to obtain people count data from other people counting devices, optionally to post-process the received people count data, and to send it to data collection device 412 and/or other devices.
  • Data collection device 412 in this example may comprise a communication interface 453 arranged for digital communication with gateway device 411 and possibly other gateway devices of people counting system 400, for instance, using computer network 461, e.g., communication interface 253, 353, or similar.
  • data collection device 412 may comprise a region processing unit 433, e.g., based on region processing unit 233 or 333. Region processing unit 433 may be configured to obtain occupant variability measures 443 of detection regions in the set of detection regions.
  • the occupant variability measures 443 may be provided by a user of data collection device 412, e.g., via a user interface of the data collection device.
  • the user may be a facility manager of the environment in which the people counting system is installed.
  • An environment in which people counting system 400 is installed may be shown on a display of data collection device 412, and a user may set occupant variability measures 443 for detection regions.
  • the user may select which detection regions of the set of detection regions have a high occupant variability measure, e.g., are flex places, and which detection regions have a low occupant variability measure, e.g., are fixed places, e.g., by means of a toggle or checkbox.
  • Region processing unit 433 may also otherwise determine or obtain occupant variability measures 443, e.g., based on an external office management system, etcetera.
  • Region processing unit 433 may be further configured to determine set 444 of aggregated detection regions based on occupant variability measures 443.
  • Each aggregated detection region may comprise one or more detection regions to effect increased aggregation of occupancy data of detection regions with a low occupant variability measure.
  • Region processing unit 433 may be configured to provide set 444 of aggregated detection regions to people counting device 410 and optionally to other people counting devices, e.g., by sending set 444 directly or by sending set 444 to gateway device 411, e.g., using communication interface 453 and computer network 461, for forwarding to the people counting device.
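  • A hypothetical sketch of this user-driven path: flex/fixed selections (e.g., from checkboxes in a user interface) are mapped to binary occupant variability measures, and fixed places are merged into a single aggregated detection region. The binary encoding and the region identifiers are assumptions made for illustration.

```python
# Sketch: turning a facility manager's flex/fixed selection into occupant
# variability measures and aggregated detection regions. The binary encoding
# (flex -> high variability, fixed -> low) is an illustrative assumption.
from typing import Dict, List


def variability_from_labels(is_flex: Dict[str, bool]) -> Dict[str, float]:
    """Flex places get a high occupant variability measure, fixed places a low one."""
    return {region: (1.0 if flex else 0.0) for region, flex in is_flex.items()}


def aggregate_fixed_places(variability: Dict[str, float]) -> List[List[str]]:
    """Aggregate all fixed (low-variability) places; keep flex places individual."""
    fixed = sorted(r for r, v in variability.items() if v < 0.5)
    flex = sorted(r for r, v in variability.items() if v >= 0.5)
    return ([fixed] if fixed else []) + [[r] for r in flex]


# Example: desks A and B are fixed workplaces, C is a flex desk.
labels = {"desk_A": False, "desk_B": False, "desk_C": True}
print(aggregate_fixed_places(variability_from_labels(labels)))
# [['desk_A', 'desk_B'], ['desk_C']]
```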
  • region processing unit 433 is part of data collection device 412, in other embodiments, region processing unit 433 is part of people counting device 410 or gateway device 411, the region processing unit 433 being configured to receive occupant variability measures 443 obtained by data collection device 412 as described above, e.g., from a user.
  • Data collection device 412 may further comprise a processor 437, e.g., processor 237, 336, or similar.
  • Processor 437 may be configured to receive people count data from gateway device 411, for example in order to provide applications based on the people count data.
  • Gateway device 411 may instead or in addition provide the people count data to other devices for providing applications or enabling the provision of applications based on the people count data. Examples of such services, e.g., in space optimization, workspace recommendation, and HVAC control, have been discussed elsewhere.
  • a system is provided in which people count data can be used that ensures that, when needed, occupancy data is aggregated before being used.
  • the aggregation is achieved while still allowing a system administrator of people counting system 400 to retain a degree of control over how aggregation takes place, e.g., by selecting flex or non-flex places to ensure that non-flex places are aggregated.
  • People counting systems e.g., people counting system 100, 101, 200, 300, or 400, may be lighting systems.
  • a vision sensor in such a people counting system, e.g., vision sensor 110, 111, 231, 431, 531, or 531', may be co-located with a luminaire of the lighting system.
  • Fig. 5a schematically shows an example of an embodiment of such a luminaire 510.
  • Luminaire 510 may be a people counting device, e.g., people counting device 210, 310, 410, 510, or 510’.
  • Luminaire 510 comprises a lamp 538.
  • Lamp 538 may be an LED-based lamp, e.g., one or more LEDs, a gas-discharge lamp, or a filament bulb, plus any associated housing or support.
  • Luminaire 510 also comprises vision sensor 531. Vision sensor 531 may be able to detect radiation originating from luminaire 510, e.g., light reflected off occupants in the detection regions.
  • lamp 538 may be configured to emit illumination towards a surface comprising the detection regions of the people counting device, thereby rendering occupants of the detection regions detectable by vision sensor 531.
  • a lighting system comprises a people counting system and one or more luminaires.
  • the vision sensor of the people counting system may be used to control the one or more luminaires, e.g., in addition to providing input to the people counting system.
  • the vision sensors may be used to measure the amount of ambient light. In dependence on the amount of ambient light luminaires may be controlled, e.g., by increasing or decreasing their light output.
  • the lighting system may also comprise additional light sensors.
  • a processor may be configured to receive input from the vision sensors and optionally from the additional sensors, and to determine a lighting control signal. The lighting control signal is provided to the luminaires.
  • the processor may be additionally configured to determine a people count, occupancy variability measure, and the like.
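  • A minimal sketch of such ambient-light-dependent control, assuming mean pixel intensity as a crude ambient light proxy, a fixed setpoint, and a linear compensation rule; none of these choices are prescribed by the application.

```python
# Sketch: daylight-dependent dimming from a vision-sensor ambient light estimate.
# The mean-intensity proxy, the setpoint, and the linear compensation rule are
# illustrative assumptions, not taken from the application.
import numpy as np


def ambient_light_estimate(image: np.ndarray) -> float:
    """Very coarse ambient light proxy: mean pixel intensity, scaled to 0..1."""
    return float(np.mean(image)) / 255.0


def dim_level(ambient: float, setpoint: float = 0.6, occupied: bool = True) -> float:
    """Return a luminaire dim level in 0..1: off when unoccupied, otherwise just
    enough artificial light to reach the setpoint given the ambient contribution."""
    if not occupied:
        return 0.0
    return float(np.clip(setpoint - ambient, 0.0, 1.0))


# Example: a bright daylit frame needs little or no artificial light.
frame = np.full((120, 160), 180, dtype=np.uint8)
print(round(dim_level(ambient_light_estimate(frame)), 2))  # 0.0
```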
  • Luminaire 510 further comprises a processor 530, a memory 540, and/or a communication interface 550.
  • Processor 530 may comprise an image processing unit, a region processing unit, an occupancy processing unit, and/or a variability processing unit according to various examples described herein.
  • processor 530 comprises at least the image processing unit and the variability processing unit.
  • Memory 540 may be configured to store instructions for causing processor 530 to perform the functions of one or more of an image processing unit, a region processing unit, an occupancy processing unit, and a variability processing unit as described herein.
  • Communication interface 550 may be one of the communication interfaces as described above, e.g., communication interface 250, 350, or 450.
  • luminaire 510 and other luminaires of the lighting system may be able to form a lighting network, e.g., computer network 260, 360, or 460 may be a lighting network.
  • luminaire 510 may connect to a gateway device such as gateway device 211, 311, or 411.
  • a lighting network can have any suitable network topology, e.g., a mesh topology, a star topology, or any other suitable topology that allows signals to be transmitted and received between each luminaire 510 and the gateway device.
  • the communication interfaces may be selected from various alternatives.
  • the interface may be a network interface to a local or wide area network, e.g., the Internet, a storage interface to an internal or external data storage, a keyboard, an application interface (API), etc.
  • the devices may have a user interface, which may include well-known elements such as one or more buttons, a keyboard, display, touch screen, etc.
  • the user interface may be arranged for accommodating user interaction, e.g., setting an occupant variability measure, inspecting people count data, or using a service based on people count data such as a workspace recommendation service.
  • the various devices may comprise a storage implemented as an electronic memory, say a flash memory, or magnetic memory, say hard disk or the like.
  • the storage may comprise multiple discrete memories together making up the storage.
  • the storage may also be a temporary memory, say a RAM. In the case of a temporary storage, the storage contains some means to obtain data before use, say by obtaining them over an optional network connection (not shown).
  • the people count devices, gateway devices, and data collection devices each comprise a microprocessor which executes appropriate software stored at the respective devices; for example, that software may have been downloaded and/or stored in a corresponding memory, e.g., a volatile memory such as RAM or a non-volatile memory such as Flash.
  • the devices may also be equipped with microprocessors and memories.
  • the devices may, in whole or in part, be implemented in programmable logic, e.g., as field-programmable gate array (FPGA).
  • One or more of the devices may be implemented, in whole or in part, as a so-called application-specific integrated circuit (ASIC), e.g., an integrated circuit (IC) customized for their particular use.
  • a people counting device comprises an image processing circuit.
  • a people counting device moreover comprises an occupancy processing circuit.
  • a people counting device further comprises a region processing circuit and a variability processing circuit.
  • a gateway device comprises a region processing circuit, an occupancy processing circuit, and a variability processing circuit.
  • a data collection device comprises a region processing circuit.
  • the devices may comprise additional circuits; the circuits implement the corresponding units described herein.
  • the circuits may be a processor circuit and a storage circuit, the processor circuit executing instructions represented electronically in the storage circuit.
  • a processor circuit may be implemented in a distributed fashion, e.g., as multiple sub-processor circuits.
  • a storage may be distributed over multiple distributed sub-storages.
  • Part or all of the memory may be an electronic memory, magnetic memory, etc.
  • the storage may have a volatile and a non-volatile part.
  • Part of the storage may be read-only.
  • the circuits may also be an FPGA, an ASIC, or the like.
  • Fig. 6a schematically shows an example of an embodiment of a people counting method 900.
  • People counting method 900 may comprise obtaining 910 image data from a vision sensor. People counting method 900 may further comprise extracting 920 occupancy data from the image data for each detection region in a set of detection regions. People counting method 900 may also comprise determining 930 a set of aggregated detection regions. Each aggregated detection region may comprise one or more detection regions. Determining 930 may be based on occupant variability measures of the detection regions. People counting method 900 may also comprise aggregating 940 the extracted occupancy data into people count data for each aggregated detection region in the set of aggregated detection regions; a minimal sketch of these steps is given below.
  • steps 920 and 930 may be executed, at least partially, in parallel.
  • a given step may not have finished completely before a next step is started.
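  • A minimal orchestration sketch of method 900, with placeholder helpers standing in for steps 910-940; the helper implementations and the use of threads to run steps 920 and 930 partially in parallel are illustrative assumptions.

```python
# Sketch of people counting method 900: obtaining 910, extracting 920,
# determining 930, aggregating 940. The placeholder helpers and the threading
# used to run steps 920 and 930 in parallel are illustrative assumptions.
from concurrent.futures import ThreadPoolExecutor
from typing import Dict, List


def obtain_image_data() -> list:                       # step 910: read from the vision sensor
    return [[0] * 4 for _ in range(4)]                 # placeholder frame


def extract_occupancy(image) -> Dict[str, int]:        # step 920
    return {"r1": 1, "r2": 0, "r3": 2}                 # placeholder per-region occupancy


def determine_aggregated_regions() -> List[List[str]]:  # step 930
    return [["r1", "r2"], ["r3"]]                       # low-variability regions grouped together


def aggregate(occupancy: Dict[str, int],
              regions: List[List[str]]) -> Dict[str, int]:  # step 940
    return {"+".join(g): sum(occupancy[r] for r in g) for g in regions}


image = obtain_image_data()
with ThreadPoolExecutor() as pool:   # steps 920 and 930 do not depend on each other
    occ_future = pool.submit(extract_occupancy, image)
    reg_future = pool.submit(determine_aggregated_regions)
    counts = aggregate(occ_future.result(), reg_future.result())
print(counts)  # {'r1+r2': 1, 'r3': 2}
```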
  • Embodiments of the method may be executed using software, which comprises instructions for causing a processor system to perform method 900.
  • Software may only include those steps taken by a particular sub-entity of the system.
  • the software may be stored in a suitable storage medium, such as a hard disk, a floppy, a memory, an optical disc, etc.
  • the software may be sent as a signal along a wire, or wireless, or using a data network, e.g., the Internet.
  • the software may be made available for download and/or for remote usage on a server.
  • Embodiments of the method may be executed using a bitstream arranged to configure programmable logic, e.g., a field-programmable gate array (FPGA), to perform the method.
  • the invention also extends to computer programs, particularly computer programs on or in a carrier, adapted for putting the invention into practice.
  • the program may be in the form of source code, object code, a code intermediate source and object code such as in a partially compiled form, or in any other form suitable for use in the implementation of embodiments of the method.
  • An embodiment relating to a computer program product comprises computer executable instructions corresponding to each of the processing steps of at least one of the methods set forth. These instructions may be subdivided into subroutines and/or be stored in one or more files that may be linked statically or dynamically.
  • Another embodiment relating to a computer program product comprises computer executable instructions corresponding to each of the means of at least one of the systems and/or products set forth.
  • Fig. 6b shows a computer readable medium 1000 having a writable part 1010 comprising a computer program 1020.
  • the computer program 1020 comprises instructions for causing a processor system to perform a people counting method according to an embodiment.
  • the computer program 1020 comprises instructions for causing a processor system to perform obtaining image data from a vision sensor and extracting from the image data occupancy data for each detection region in a set of detection regions; and/or determining a set of aggregated detection regions, each aggregated detection region comprising one or more detection regions, said determining being based on occupant variability measures of the detection regions; and/or aggregating the extracted occupancy data into people count data for each aggregated detection region in the set of aggregated detection regions, for instance, to perform the obtaining, the determining, and the aggregating.
  • the computer program 1020 may be embodied on the computer readable medium 1000 as physical marks or by means of magnetization of the computer readable medium 1000. However, any other suitable embodiment is conceivable as well. Furthermore, it will be appreciated that, although the computer readable medium 1000 is shown here as an optical disc, the computer readable medium 1000 may be any suitable computer readable medium, such as a hard disk, solid state memory, flash memory, etc., and may be non-recordable or recordable.
  • the computer program 1020 comprises instructions for causing a processor system to perform said people counting method, or said obtaining and/or determining and/or aggregating.
  • Fig. 6c shows a schematic representation of a processor system 1140 according to an embodiment.
  • the processor system comprises one or more integrated circuits 1110.
  • the architecture of the one or more integrated circuits 1110 is schematically shown in Fig. 6c.
  • Circuit 1110 comprises a processing unit 1120, e.g., a CPU, for running computer program components to execute a method according to an embodiment and/or implement its modules or units.
  • Circuit 1110 comprises a memory 1122 for storing programming code, data, etc. Part of memory 1122 may be read-only.
  • Circuit 1110 may comprise a communication element 1126, e.g., an antenna, connectors, or both.
  • Circuit 1110 may comprise a dedicated integrated circuit 1124 for performing part or all of the processing defined in the method.
  • Processor 1120, memory 1122, dedicated IC 1124 and communication element 1126 may be connected to each other via an interconnect 1130, say a bus.
  • the processor system 1110 may be arranged for contact and/or contact-less communication, using an antenna and/or connectors, respectively.
  • processor system 1140, e.g., the people counting device, may comprise a processor circuit and a memory circuit, the processor circuit being arranged to execute software stored in the memory circuit.
  • the processor circuit may be an Intel Core i7 processor, ARM Cortex-R8, etc.
  • the processor circuit may be ARM Cortex M0.
  • the memory circuit may be a ROM circuit, or a non-volatile memory, e.g., a flash memory.
  • the memory circuit may be a volatile memory, e.g., an SRAM memory.
  • the device may comprise a non-volatile software interface, e.g., a hard drive, a network interface, etc., arranged for providing the software, e.g., image processing instructions, region processing instructions, occupancy processing instructions, and/or variability processing instructions for respective units described herein.
  • any reference signs placed between parentheses shall not be construed as limiting the claim.
  • Use of the verb 'comprise' and its conjugations does not exclude the presence of elements or steps other than those stated in a claim.
  • the article 'a' or 'an' preceding an element does not exclude the presence of a plurality of such elements.
  • the invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware.
  • the mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
  • the wording "a set of" means "one or more".
  • references in parentheses refer to reference signs in drawings of exemplifying embodiments or to formulas of embodiments, thus increasing the intelligibility of the claim. These references shall not be construed as limiting the claim.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

According to some embodiments, the invention relates to a people counting system. An image processing unit obtains image data from a vision sensor and extracts occupancy data from the image data for each detection region in a set of detection regions. A region processing unit obtains occupant variability measures of detection regions in the set of detection regions and determines a set of aggregated detection regions based on the occupant variability measures. Each aggregated detection region comprises one or more detection regions to effect increased aggregation of occupancy data of detection regions with a low occupant variability measure. An occupancy processing unit aggregates the extracted occupancy data into people count data for each aggregated detection region in the set of aggregated detection regions.
PCT/EP2019/081033 2018-11-20 2019-11-12 Système de comptage de personnes doté de régions de détection agrégées WO2020104254A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP18207369.2 2018-11-20
EP18207369 2018-11-20

Publications (1)

Publication Number Publication Date
WO2020104254A1 true WO2020104254A1 (fr) 2020-05-28

Family

ID=64453286

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/081033 WO2020104254A1 (fr) 2018-11-20 2019-11-12 Système de comptage de personnes doté de régions de détection agrégées

Country Status (1)

Country Link
WO (1) WO2020104254A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114494350A (zh) * 2022-01-28 2022-05-13 北京中电兴发科技有限公司 一种人员聚集检测方法及装置
CN116129361A (zh) * 2023-03-24 2023-05-16 武汉中科通达高新技术股份有限公司 一种基于距离度量的人群聚集识别方法及装置

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017060083A1 (fr) * 2015-10-06 2017-04-13 Philips Lighting Holding B.V. Système de comptage de personnes et d'éclairage intégré
WO2017072158A1 (fr) * 2015-10-30 2017-05-04 Philips Lighting Holding B.V. Système et procédé de détermination de l'emplacement et de l'occupation d'espaces de travail
WO2017080929A1 (fr) 2015-11-12 2017-05-18 Philips Lighting Holding B.V. Système de traitement d'images

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017060083A1 (fr) * 2015-10-06 2017-04-13 Philips Lighting Holding B.V. Système de comptage de personnes et d'éclairage intégré
WO2017072158A1 (fr) * 2015-10-30 2017-05-04 Philips Lighting Holding B.V. Système et procédé de détermination de l'emplacement et de l'occupation d'espaces de travail
WO2017080929A1 (fr) 2015-11-12 2017-05-18 Philips Lighting Holding B.V. Système de traitement d'images

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CARLOS DUARTE ET AL: "Revealing occupancy patterns in an office building through the use of occupancy sensor data", ENERGY AND BUILDINGS, vol. 67, December 2013 (2013-12-01), CH, pages 587 - 595, XP055590533, ISSN: 0378-7788, DOI: 10.1016/j.enbuild.2013.08.062 *
RYOTA NAKATANI ET AL.: "A Person Identification Method Using a Top-View Head Image from an Overhead Camera", JACIII, vol. 16, no. 6, 2012

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114494350A (zh) * 2022-01-28 2022-05-13 北京中电兴发科技有限公司 一种人员聚集检测方法及装置
CN116129361A (zh) * 2023-03-24 2023-05-16 武汉中科通达高新技术股份有限公司 一种基于距离度量的人群聚集识别方法及装置
CN116129361B (zh) * 2023-03-24 2023-08-08 武汉中科通达高新技术股份有限公司 一种基于距离度量的人群聚集识别方法及装置

Similar Documents

Publication Publication Date Title
US11184968B2 (en) Occupancy sensor calibration and occupancy estimation
JP6579450B2 (ja) スマート照明システム、照明を制御するための方法及び照明制御システム
US10372990B2 (en) System and method for identification of personal thermal comfort
WO2017015664A1 (fr) Systèmes et procédés d'éclairage intelligent pour la surveillance, l'analyse et l'automation de l'environnement bâti
US10430528B2 (en) Method and system for managing space configurations
JP2018514835A (ja) 建物内の環境管理システムを制御するための方法および装置
EP3374925B1 (fr) Systeme de traitement des images
US20170123386A1 (en) Method and apparatus for determining information for building information modeling
WO2017072158A1 (fr) Système et procédé de détermination de l'emplacement et de l'occupation d'espaces de travail
WO2020104254A1 (fr) Système de comptage de personnes doté de régions de détection agrégées
JP7382947B2 (ja) 等価メラノピックルクス(eml)クォータ
US20170227941A1 (en) Control apparatus and device control system
WO2017060083A1 (fr) Système de comptage de personnes et d'éclairage intégré
CN112204590A (zh) 检测智能建筑物中的异常行为
US10880973B2 (en) Sensor control device
JP7437717B2 (ja) 管理システム、管理方法及びプログラム
EP3326080B1 (fr) Systèmes et procédés d'éclairage intelligent pour la surveillance, l'analyse et l'automation de l'environnement bâti
CN109791640A (zh) 用于建筑物能源管理和空间优化的估测占用能源简档的方法
JP6833102B1 (ja) 環境調節システム、環境調節方法および環境調節プログラム
Liu Simulation of Local Climate Control in Shared Offices Based on Occupants Locations and Preferences
US10701782B2 (en) Verification device for a connected lighting system
WO2023036665A1 (fr) Réception et analyse de données de comportement de consommateur à l'aide d'une communication par lumière visible
JP2022186409A (ja) 照明制御システム、照明制御方法、及びプログラム
EP3232129A1 (fr) Procédé et système pour diriger une opération d'un système de contrôle d'environnement intérieur
JP2013101046A (ja) ホームサーバ、配分方法及び配分プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19801875

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19801875

Country of ref document: EP

Kind code of ref document: A1