WO2006109256A2 - Pattern based occupancy sensing system and method - Google Patents

Pattern based occupancy sensing system and method

Info

Publication number
WO2006109256A2
WO2006109256A2 PCT/IB2006/051119
Authority
WO
WIPO (PCT)
Prior art keywords
image
region
classifier
occupancy
pattern
Prior art date
Application number
PCT/IB2006/051119
Other languages
French (fr)
Other versions
WO2006109256A3 (en)
Inventor
Ling Wang
Original Assignee
Koninklijke Philips Electronics, N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics, N.V. filed Critical Koninklijke Philips Electronics, N.V.
Publication of WO2006109256A2
Publication of WO2006109256A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19604 Image analysis to detect motion of the intruder, e.g. by frame subtraction involving reference image or background adaptation with time to compensate for changing conditions, e.g. reference image update on detection of light level change
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19613 Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
    • G08B13/19615 Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion wherein said pattern is defined by the user
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639 Details of the system layout
    • G08B13/19652 Systems using zones in a single scene defined for different treatment, e.g. outer zone gives pre-alarm, inner zone gives alarm
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • This invention relates generally to lighting control, and more specifically to lighting control with pattern based occupancy sensing.
  • Analog sensors create problems in modern lighting systems, which are often part of building automation systems. Building automation systems are typically digital systems, making it difficult to incorporate analog sensors without costly adapters or specialized interfaces. Analog sensors also consume relatively large amounts of power, wasting power and requiring that they be wired to a power source. Wiring limits design flexibility and increases installation expense. In addition, analog sensors lack features that are desirable in a modern lighting system, such as built-in programmability and occupant counting.
  • One aspect of the present invention provides a pattern based occupancy sensor having an image array generating an image signal in response to an image, a microcontroller unit (MCU) responsive to the image signal and generating a status signal, and a communication interface responsive to the status signal and generating a communication signal.
  • the MCU processes the image signal to calculate area occupancy by analyzing an object for a pattern, determining whether the pattern matches a classifier, and processing the object when the pattern matches the classifier to calculate area occupancy.
  • Another aspect of the present invention provides a method for occupancy counting including acquiring an image, detecting an object in the image, analyzing the object for a pattern, determining whether the pattern matches a classifier, and processing the object when the pattern matches the classifier to calculate area occupancy.
  • Another aspect of the present invention provides a system for occupancy counting including means for acquiring an image, means for detecting an object in the image, means for analyzing the object for a pattern, means for determining whether the pattern matches a classifier, and means for processing the object when the pattern matches the classifier to calculate area occupancy.
  • FIG. 1 is a block diagram of a pattern based occupancy sensor made in accordance with the present invention.
  • FIG. 2 is a flowchart for a counting method employing a pattern based occupancy sensor made in accordance with the present invention.
  • FIGS. 3A & 3B are schematic diagrams of a boundary deployment and image, respectively, for a pattern based occupancy sensor made in accordance with the present invention.
  • FIG. 4 is a flowchart for a boundary counting method employing a pattern based occupancy sensor made in accordance with the present invention.
  • FIGS. 5A & 5B are schematic diagrams of an area deployment and images, respectively, for pattern based occupancy sensors made in accordance with the present invention.
  • FIGS. 6A & 6B are flowcharts for an area counting and occupancy checking method, respectively, employing pattern based occupancy sensors made in accordance with the present invention.
  • FIG. 1 is a block diagram of a pattern based occupancy sensor made in accordance with the present invention.
  • Pattern based occupancy sensor 20 includes a lens 22, an image array 24, a microcontroller unit (MCU) 26, and a communication interface 38.
  • the pattern based occupancy sensor 20 collects refracted light 40 from the lens 22 as an image on the image array 24.
  • the image array 24 generates an image signal 42 in response to the image.
  • the microcontroller unit (MCU) 26, which includes an image processor 32, memory 36, and a communication stack 34, receives the image signal 42 for processing.
  • the MCU 26 compares a pattern of an object detected in the image signal 42 with classifiers stored in the memory 36 to determine whether the pattern matches a classifier and the object is an object of interest which should be counted.
  • the MCU 26 generates status signal 46, including occupancy information, in response to the image signal 42.
  • the occupancy information can indicate whether an area is occupied and/or the number of objects of interest in the area.
  • Status information passes through the communication stack 34 to be converted to a communication packet in accordance with the communication protocol.
  • the status information, including occupancy information is included in the status signal 46.
  • the communication interface 38 includes a radio frequency (RF) transceiver 28 and an antenna 30.
  • the RF transceiver 28 receives the status signal 46 and provides a modulated signal 48 to the antenna 30, which communicates with other pattern based occupancy sensors, local lighting controls, other local controls, and/or building automation systems through a communication signal 50.
  • Power supply 40 provides power to the image array 24, the MCU 26, and the RF transceiver 28. In one embodiment, the MCU 26 provides a power control signal 52 to the power supply 40.
  • the signals of the pattern based occupancy sensor 20 can be bidirectional signals, communicating information back and forth between their associated components.
  • the pattern based occupancy sensor 20 sends communication signal 50 to other pattern based occupancy sensors, local lighting controls, other local controls, and/or building automation systems, and the pattern based occupancy sensor 20 receives other communication signals from the other pattern based occupancy sensors, local lighting controls, other local controls, and/or building automation systems.
  • the communication signals can include data and instructions, such as occupancy information, operating commands, queries, programming instructions, and the like.
  • the image signal 42 and the status signal 46 can also be bidirectional signals: the image signal 42 communicating data and instructions between the image array 24 and MCU 26, and the status signal 46 communicating data and instructions between the MCU 26 and communication interface 38.
  • the lens 22 can be any lens suitable for viewing an object and providing an image on the image array 24, such as a plastic lens, a glass lens, or the like.
  • the lens 22 can be a lens as used in mobile telephone cameras.
  • the characteristics of the lens such as the focal length, aperture, and magnification, can be selected to suit the particular application in which the pattern based occupancy sensor 20 is used.
  • the image array 24 can be any sensor suitable for converting an image received from the lens 22 to an image signal 42, while maintaining low power consumption.
  • the image array 24 is a complementary metal oxide semiconductor (CMOS) image sensing chip.
  • the image array 24 is a charge coupled device (CCD).
  • the image array 24 detects the image on an array of pixels and converts the sensed image to the image signal 42.
  • the image array 24 is a low power image array.
  • the image array 24 can include on-chip functions, such as calibration, region selection, decimation, power management, pixel correction, and color correction.
  • the image array 24 includes the lens 22.
  • the image array 24 can include the MCU 26 and/or the RF transceiver 28.
  • a suitable image array is the W6501 VGA CMOS Color Image Sensor manufactured by STMicroelectronics of Geneva, Switzerland.
  • the microcontroller unit (MCU) 26 can be any microcontroller suitable for storing and processing instructions and data.
  • the MCU 26 processes the image signal 42 to calculate area occupancy by analyzing an object for a pattern, determining whether the pattern matches a classifier, and processing the object when the pattern matches the classifier to calculate area occupancy.
  • the MCU 26 includes the image processor 32, memory 36, and communication stack 34. Instructions can be programmed into the memory 36 during manufacture of the pattern based occupancy sensor 20 and/or can be programmed through the communication signal 50 during operation.
  • the particular characteristics of the MCU 26, such as n-bit architecture, clock speed, memory size, and the like, can be selected for the particular application.
  • Examples of suitable microcontrollers include the 8-bit HCS08 and 16-bit HCS12 families manufactured by Freescale Semiconductor, Inc., of Austin, Texas, the AVR 8-bit RISC Flash Microcontroller manufactured by Atmel Corporation of San Jose, California, and the STV0767 Imaging Digital Signal Processor manufactured by STMicroelectronics of Geneva, Switzerland.
  • the MCU 26 can be a single chip, or can be combined on a single chip with one or both of the image array 24 and the RF transceiver 28. Besides performing image processing, in various embodiments the MCU 26 can manage communications in and out of the pattern based occupancy sensor 20 through the communication interface 38 and manage power usage in the pattern based occupancy sensor 20.
  • the radio frequency (RF) transceiver 28 can be any transceiver for communicating between the pattern based occupancy sensor 20 and other pattern based occupancy sensors, local lighting controls, other local controls, and/or building automation systems. Typically, the RF transceiver 28 operates at low voltage and with low power consumption, and can incorporate power management features. In one embodiment, the RF transceiver 28 communicates at 2.4 GHz in accordance with the IEEE 802.15.4 short-range wireless standard and the ZigBee networking standard protocol. In another embodiment, the RF transceiver 28 communicates at 15 GHz. An example of a suitable transceiver is the MC13193 short range, low power, 2.4 GHz ISM band transceiver manufactured by Freescale Semiconductor, Inc.
  • the RF transceiver 28 can operate at various frequencies and with various protocols as desired for a particular application.
  • the RF transceiver 28 can be a single chip, or can be combined on a single chip with one or both of the image array 24 and the MCU 26.
  • the power supply 40 can be any wired or wireless power supply suitable for powering the components of the pattern based occupancy sensor 20.
  • Wireless power supplies can include batteries or scavenging power supplies, such as power supplies scavenging solar, vibrational, electromagnetic field energy, or the like. Vibrational scavenging power supplies can be fabricated using micro electromechanical systems (MEMS) and nanotechnology.
  • the MCU 26 provides a power control signal 52 to the power supply 40 to manage power consumption in the pattern based occupancy sensor 20.
  • the individual components of the pattern based occupancy sensor 20 include power management features.
  • FIG. 2 is a flowchart for a counting method employing a pattern based occupancy sensor made in accordance with the present invention.
  • the counting method includes acquiring an image at 100; detecting an object in the image at 102; analyzing the object for a pattern at 104; determining whether the pattern matches a classifier at 106; and processing the object when the pattern matches the classifier to calculate area occupancy at 108.
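The counting steps above can be sketched as a simple loop. The helper names (`acquire_image`, `detect_objects`, `extract_pattern`) and the classifier object are illustrative assumptions for this sketch; the patent does not prescribe a particular implementation.

```python
# Hypothetical sketch of the counting method of FIG. 2. The callables
# stand in for whatever image pipeline the sensor's MCU provides.

def count_occupancy(acquire_image, detect_objects, extract_pattern, classifier):
    """Acquire one image and count objects whose pattern matches the classifier."""
    image = acquire_image()                  # step 100: acquire an image
    occupancy = 0
    for obj in detect_objects(image):        # step 102: detect objects
        pattern = extract_pattern(obj)       # step 104: analyze object for a pattern
        if classifier.matches(pattern):      # step 106: pattern matches classifier?
            occupancy += 1                   # step 108: process matched object
    return occupancy
```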
  • the method further includes acquiring training images, and generating the classifier from the training images.
  • the area occupancy can be transmitted to other pattern based occupancy sensors and/or the building automation system for emergency roll calls, security, utility control, lighting control, and the like.
  • FIGS. 3A & 3B are schematic diagrams of a boundary deployment and image, respectively, for a pattern based occupancy sensor made in accordance with the present invention.
  • FIG. 3A shows one example of a boundary deployment of a pattern based occupancy sensor.
  • FIG. 3B shows one example of an image detected by a pattern based occupancy sensor.
  • a pattern based occupancy sensor 120 is installed near a doorway 122 to view the boundary 124 between a first area 126 and a second area 128.
  • Objects having a pattern, such as people, animals, machines, parts, or the like, move across the boundary 124 as shown by arrows 132.
  • the object 130 is a person and the pattern based occupancy sensor 120 is installed near the top of the doorway 122 to observe a head-and-shoulder pattern of the object 130.
  • the pattern based occupancy sensor 120 can be mounted as suited to the particular application.
  • the pattern based occupancy sensor 120 is installed low on the doorway 122 to observe the leg or hip pattern of a person.
  • multiple pattern based occupancy sensors 120 can be installed to cover multiple directions, i.e., looking down and looking across the doorway 122.
  • the multiple pattern based occupancy sensors 120 can communicate among themselves or with the building automation system and crosscheck the area occupancy determined by each of the pattern based occupancy sensors 120.
  • an image 150 is divided into a first region 152 and a second region 154 by an image boundary 156.
  • the first region 152 and the second region 154 correspond respectively to the first area 126 and the second area 128 of FIG. 3A.
  • the image boundary 156 can be any continuous curve dividing the image 150 into two regions.
  • Pattern 158 is the pattern corresponding to the object 130 of FIG. 3A, shown in the first region 152. In this example, the pattern 158 is the head-and-shoulder pattern of a person.
  • FIG. 4 is a flowchart for a boundary counting method employing a pattern based occupancy sensor made in accordance with the present invention.
  • the boundary counting method includes initializing the pattern based occupancy sensor at 400, acquiring an image having first and second regions at 402, detecting an object in the image at 404, analyzing the object for a pattern at 406, and determining whether the pattern matches a classifier at 408.
  • When the pattern does not match a classifier, the method returns to the acquiring an image at 402.
  • When the pattern matches a classifier, the method continues with identifying the object as being in one of the first region and the second region at 410, acquiring a next image at 412, and determining whether the object is in the other of the first region and the second region in the next image at 414.
  • When the object is not in the other region, the method returns to the acquiring an image at 402.
  • When the object is in the other region, the method continues with changing an occupancy counter at 416.
  • The method then continues with returning to the acquiring an image at 402.
  • the time for the next acquisition of an image at 402 can be varied depending on the expected speed of the object across the image. In one embodiment, the acquisition of images at 402 can be repeated on the order of hundreds of milliseconds, such as about every 100 milliseconds.
  • the initializing the pattern based occupancy sensor at 400 can include initializing counters, defining boundaries and image regions, acquiring classifiers by downloading, and/or acquiring classifiers by training.
  • Counters, such as the occupancy counter, can be zeroed out to start a counting session.
  • the image boundary and regions can be generated automatically from natural features in sample images, such as floor color changes or doorway edges, or can be selected manually. The image boundary and regions remain constant for a pattern based occupancy sensor from one image to the next.
  • When a library of classifiers is available, at least one classifier for checking patterns, such as a head-and-shoulder classifier, can be downloaded from the library into the pattern based occupancy sensor.
  • the classifier can be obtained by training the pattern based occupancy sensor. Training includes acquiring training images, extracting features from objects of interest in the training images, and generating the classifier from the features.
  • the classifiers can include shape as a primary feature, without regard to size. Other embodiments can include color or grayness as a feature determining the classifier.
  • Classifiers can be individual classifiers that identify specific individuals or can be group classifiers that identify different groups, such as people and dogs.
  • the analyzing the object for a pattern at 406 is similar to generating the classifiers, i.e., features of objects in the image are determined and analyzed to determine the pattern for the object, such as a head-and-shoulders pattern.
  • a tolerance can be applied so that the match is defined to occur as long as the pattern falls within the tolerance.
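As a sketch of the tolerance test, one assumed approach compares a feature vector extracted from the object against the classifier's reference features; the Euclidean distance metric here is an illustrative choice, since the text does not fix a matching metric.

```python
import math

def matches_classifier(features, reference, tolerance):
    """Declare a match when the pattern's feature vector falls within the
    tolerance of the classifier's reference vector (Euclidean distance is
    an assumed metric for this sketch)."""
    distance = math.sqrt(sum((f - r) ** 2 for f, r in zip(features, reference)))
    return distance <= tolerance
```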
  • the movement can be from the first region to the second region or from the second region to the first region, i.e., any movement across the boundary requiring a change to the occupancy counter.
  • the occupancy counter can tally boundary crossings or matched objects within an area, such as a room.
  • the occupancy counter increments each time a matched object moves from the first region to the second region or from the second region to the first region.
  • the occupancy counter includes an IN counter and an OUT counter.
  • the occupancy counter changes by incrementing the IN counter when the object is in the first region in the first image and in the second region in the second image, and incrementing the OUT counter when the object is in the second region in the first image and in the first region in the second image.
  • Area occupancy, such as the number of people in a room, can be calculated from the difference between the IN counter and the OUT counter.
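A minimal sketch of the IN/OUT bookkeeping described above, assuming each matched object's region ("first" or "second") in consecutive images is already known from the boundary method of FIG. 4:

```python
# Illustrative IN/OUT counting for the boundary method; region labels
# are assumed strings for the sketch.

class BoundaryCounter:
    def __init__(self):
        self.in_count = 0
        self.out_count = 0

    def update(self, prev_region, next_region):
        """Record a boundary crossing between consecutive images, if any."""
        if prev_region == "first" and next_region == "second":
            self.in_count += 1        # object crossed IN
        elif prev_region == "second" and next_region == "first":
            self.out_count += 1       # object crossed OUT

    @property
    def area_occupancy(self):
        # Occupancy is the difference between the IN and OUT counters.
        return self.in_count - self.out_count
```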
  • the boundary crossings and/or area occupancy can be transmitted to other pattern based occupancy sensors and/or the building automation system for emergency roll calls, security, utility control, lighting control, and the like.
  • Individual classifiers can be used to track individuals from one area to the other. Group classifiers can be used to track numbers of a group crossing from one area to the other. In another embodiment, the group classifiers can be used to track individuals by assigning an individual tag to the identified object of the group and tracking the individual tag from one area to the other.
  • FIGS. 5A & 5B are schematic diagrams of an area deployment and images, respectively, for pattern based occupancy sensors made in accordance with the present invention.
  • FIG. 5A shows one example of an area deployment of a pattern based occupancy sensor.
  • FIG. 5B shows one example of images detected by pattern based occupancy sensors.
  • pattern based occupancy sensors 220, 221, 222, 223 are installed near the ceiling of room 242 looking toward the floor 240, which includes first area 226, second area 228, third area 230, and fourth area 232.
  • the pattern based occupancy sensors 220, 221, 222, 223 view the first area 226, the second area 228, the third area 230, and the fourth area 232, respectively.
  • Each pattern based occupancy sensor views a single area.
  • Objects having a pattern, such as people, animals, machines, parts, or the like, can be in the room 242.
  • the objects 210 are people and the object 212 is an animal.
  • the pattern based occupancy sensor 221 observes the head-and-shoulder patterns of the objects 210 in the second area 228, the pattern based occupancy sensor 222 observes the head-and-shoulder pattern of the object 210 in the third area 230, and the pattern based occupancy sensor 223 observes the body pattern of the object 212 in the fourth area 232.
  • the pattern based occupancy sensors can be mounted as suited for the particular application.
  • the pattern based occupancy sensors can be installed low on walls to observe the leg or hip pattern of a person or profile of an animal.
  • multiple pattern based occupancy sensors can be installed to cover multiple directions. The multiple pattern based occupancy sensors can communicate among themselves or with the building automation system and crosscheck the area occupancy determined by each of the pattern based occupancy sensors.
  • a room image 250 is divided into a first image 266, a second image 268, a third image 270, and a fourth image 272.
  • Each pattern based occupancy sensor has a single image.
  • the first image 266, second image 268, third image 270, and fourth image 272 correspond respectively to the first area 226, second area 228, third area 230, and fourth area 232 of FIG. 5A.
  • Patterns 280 correspond to the objects 210 of FIG. 5A; in this example, the patterns 280 are the head-and-shoulder patterns of people.
  • Pattern 282 corresponds to the object 212 of FIG. 5A; in this example, the pattern 282 is the back pattern of an animal.
  • FIGS. 6A & 6B are flowcharts for an area counting and occupancy checking method, respectively, employing pattern based occupancy sensors made in accordance with the present invention.
  • the occupancy checking method can be optionally included within the area counting method to check and correct the area occupancy.
  • the method returns to the detecting object i in the object image at 612.
  • the method returns to the detecting object i in the object image at 612.
  • the method returns to the acquiring a background image at 602.
  • the initializing the pattern based occupancy sensor at 600 can include initializing counters, acquiring classifiers by downloading, and/or acquiring classifiers by training. Counters, such as the occupancy counter, can be zeroed out to start a counting session.
  • When a library of classifiers is available, at least one classifier for checking patterns, such as a head-and-shoulder classifier, can be downloaded from the library into the pattern based occupancy sensor.
  • the classifier can be obtained by training the pattern based occupancy sensor. Training includes acquiring training images, extracting features from objects of interest in the training images, and generating the classifier from the features.
  • the classifiers can include shape as a primary feature, without regard to size. Other embodiments can include color or grayness as a feature determining the classifier.
  • Acquiring a background image at 602 and calculating an object space PO and object range RL to RU at 604 are optional, and can be used in the optional occupancy checking method 622.
  • Acquiring a background image at 602 includes acquiring a background image with no objects present. The background image provides a baseline for determining the accuracy of the indicated area occupancy. The background image is also used in calculating the object space PO and the object range RL to RU at 604.
  • the object space PO is the space in pixels changed by the shape of an object of interest.
  • the object space PO can be determined from measuring pixel changes from sample objects of interest against the background image, and averaging the measured pixel changes for a number of sample objects.
  • the object range, from lower range RL to upper range RU, can be a band about the object space PO, such as ±10% of the object space PO value.
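The calculation of the object space PO and the object range RL to RU might be sketched as follows; the list of per-sample pixel changes is assumed to come from measuring sample objects against the background image, and the 10% band follows the example in the text:

```python
def object_space_and_range(sample_pixel_changes, band=0.10):
    """Average the pixel changes measured for sample objects against the
    background image, then derive the lower/upper object range RL..RU
    as a band about the object space PO."""
    po = sum(sample_pixel_changes) / len(sample_pixel_changes)  # object space PO
    rl = po * (1.0 - band)   # lower range RL
    ru = po * (1.0 + band)   # upper range RU
    return po, rl, ru
```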
  • The objects in an object image are detected, and objects of interest matching a classifier are counted by incrementing an area occupancy counter, between 606 and 620.
  • The area occupancy indicated by the area occupancy counter can be checked at 622. Checking the area occupancy can account for errors, such as counting objects not of interest but with a similar shape to an object of interest, changes in ambient light, shadows, and the like. In an alternate embodiment, checking the area occupancy at 622 can be omitted.
  • the background update time at 624 is selected on the order of hours to refresh the background image as desired and depending on the particular installation.
  • the monitoring time at 626 is selected on the order of minutes to refresh the object image as desired and depending on the particular installation.
  • the area occupancy can be transmitted to other pattern based occupancy sensors and/or the building automation system for emergency roll calls, security, utility control, lighting control, and the like.
  • the area occupancies from a number of pattern based occupancy sensors can be added to calculate the occupancy for the larger area.
  • the area occupancies from a number of pattern based occupancy sensors covering particular portions of a room can be added to calculate the occupancy for the room.
  • the optional occupancy checking method 622 includes starting at 700, calculating the pixel change P between the object image and the background image at 702, and determining whether N*RL ≤ P ≤ N*RU at 704.
  • When N*RL ≤ P ≤ N*RU, the method continues with setting the final area occupancy NFinal to the indicated area occupancy N at 706, and the method ends at 712.
  • Otherwise, the method continues with correcting the final area occupancy NFinal before the method ends at 712.
  • Calculating pixel change P between the object image and the background image at 702 uses the object space PO and object range RL to RU calculated at 604.
  • the pixel change P is the number of pixels changed from the background image to the object image.
  • the object range for the indicated area occupancy is between the indicated area occupancy N times the lower range RL pixels per object and the indicated area occupancy N times the upper range RU pixels per object.
  • When the pixel change P falls within this range, the indicated area occupancy is acceptable and the final area occupancy NFinal is taken as the indicated area occupancy N. Otherwise, the indicated area occupancy can be corrected: the final area occupancy NFinal is set to the lesser of the indicated area occupancy N and the integer value of the pixel change P divided by the object space PO.
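The acceptance test and correction can be written directly from the description above; the names mirror the text (N, P, PO, RL, RU), and the sketch assumes these values are already available from the counting and calibration steps.

```python
def check_occupancy(n, p, po, rl, ru):
    """Return the final area occupancy NFinal, given the indicated
    occupancy N, measured pixel change P, object space PO, and the
    per-object range RL..RU."""
    if n * rl <= p <= n * ru:
        return n                      # indicated occupancy is acceptable
    # Otherwise correct: the lesser of N and the integer value of P / PO.
    return min(n, int(p / po))
```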

Abstract

A pattern based occupancy sensing system and method includes an image array (24) generating an image signal (42) in response to an image, a microcontroller unit (MCU) (26) responsive to the image signal (42) and generating a status signal (46), and a communication interface (38) responsive to the status signal (46) and generating a communication signal (50). The MCU (26) processes the image signal (42) to calculate area occupancy by analyzing an object for a pattern, determining whether the pattern matches a classifier, and processing the object when the pattern matches the classifier to calculate area occupancy.

Description

PATTERN BASED OCCUPANCY SENSING SYSTEM AND METHOD
[0001] This invention relates generally to lighting control, and more specifically to lighting control with pattern based occupancy sensing.
[0002] For energy conservation, lights should be turned off when no one is in a room. For security and safety, lights should be turned on when someone enters a room. To this end, lighting controls for lighting systems often incorporate occupancy sensors to switch the lights on and off depending on whether someone is in the room. Presently, analog sensors, such as passive infrared (PIR) sensors or ultrasound motion detectors, hard-wired into the lighting systems are used to detect room occupancy.
[0003] Unfortunately, such analog sensors create problems in modern lighting systems, which are often part of building automation systems. Building automation systems are typically digital systems, making it difficult to incorporate analog sensors without costly adapters or specialized interfaces. Analog sensors also consume relatively large amounts of power, wasting power and requiring that they be wired to a power source. Wiring limits design flexibility and increases installation expense. In addition, analog sensors lack features that are desirable in a modern lighting system, such as built-in programmability and occupant counting.
[0004] It would be desirable to have a pattern based occupancy sensing system and method that overcomes the above disadvantages.
[0005] One aspect of the present invention provides a pattern based occupancy sensor having an image array generating an image signal in response to an image, a microcontroller unit (MCU) responsive to the image signal and generating a status signal, and a communication interface responsive to the status signal and generating a communication signal. The MCU processes the image signal to calculate area occupancy by analyzing an object for a pattern, determining whether the pattern matches a classifier, and processing the object when the pattern matches the classifier to calculate area occupancy. [0006] Another aspect of the present invention provides a method for occupancy counting including acquiring an image, detecting an object in the image, analyzing the object for a pattern, determining whether the pattern matches a classifier, and processing the object when the pattern matches the classifier to calculate area occupancy.
Another aspect of the present invention provides a system for occupancy counting including means for acquiring an image, means for detecting an object in the image, means for analyzing the object for a pattern, means for determining whether the pattern matches a classifier, and means for processing the object when the pattern matches the classifier to calculate area occupancy.
[0007] The foregoing and other features and advantages of the invention will become further apparent from the following detailed description of the presently preferred embodiments read in conjunction with the accompanying drawings. The detailed description and drawings are merely illustrative of the invention rather than limiting, the scope of the invention being defined by the appended claims and equivalents thereof.
[0008] FIG. 1 is a block diagram of a pattern based occupancy sensor made in accordance with the present invention;
[0009] FIG. 2 is a flowchart for a counting method employing a pattern based occupancy sensor made in accordance with the present invention;
[00010] FIGS. 3A & 3B are schematic diagrams of a boundary deployment and image, respectively, for a pattern based occupancy sensor made in accordance with the present invention;
[00011] FIG. 4 is a flowchart for a boundary counting method employing a pattern based occupancy sensor made in accordance with the present invention;
[00012] FIGS. 5A & 5B are schematic diagrams of an area deployment and images, respectively, for pattern based occupancy sensors made in accordance with the present invention; and
[00013] FIGS. 6A & 6B are flowcharts for an area counting and occupancy checking method, respectively, employing pattern based occupancy sensors made in accordance with the present invention.
[00014] FIG. 1 is a block diagram of a pattern based occupancy sensor made in accordance with the present invention. Pattern based occupancy sensor 20 includes a lens 22, an image array 24, a microcontroller unit (MCU) 26, and a communication interface 38.
[00015] The pattern based occupancy sensor 20 collects refracted light 40 from the lens 22 as an image on the image array 24. The image array 24 generates an image signal 42 in response to the image. The microcontroller unit (MCU) 26, which includes an image processor 32, memory 36, and a communication stack 34, receives the image signal 42 for processing. In accordance with instructions stored in the memory 36, the MCU 26 compares a pattern of an object detected in the image signal 42 with classifiers stored in the memory 36 to determine whether the pattern matches a classifier and the object is an object of interest which should be counted. The MCU 26 generates status signal 46, including occupancy information, in response to the image signal 42. The occupancy information can indicate whether an area is occupied and/or the number of objects of interest in the area. Status information passes through the communication stack 34 to be converted to a communication packet in accordance with the communication protocol. The status information, including occupancy information, is included in the status signal 46.
[00016] The communication interface 38 includes a radio frequency (RF) transceiver 28 and an antenna 30. The RF transceiver 28 receives the status signal 46 and provides a modulated signal 48 to the antenna 30, which communicates with other pattern based occupancy sensors, local lighting controls, other local controls, and/or building automation systems through a communication signal 50. Power supply 40 provides power to the image array 24, the MCU 26, and the RF transceiver 28. In one embodiment, the MCU 26 provides a power control signal 52 to the power supply 40.
[00017] Those skilled in the art will appreciate that the signals of the pattern based occupancy sensor 20 can be bidirectional signals, communicating information back and forth between their associated components. For example, the pattern based occupancy sensor 20 sends communication signal 50 to other pattern based occupancy sensors, local lighting controls, other local controls, and/or building automation systems, and the pattern based occupancy sensor 20 receives other communication signals from the other pattern based occupancy sensors, local lighting controls, other local controls, and/or building automation systems. The communication signals can include data and instructions, such as occupancy information, operating commands, queries, programming instructions, and the like. The image signal 42 and the status signal 46 can also be bidirectional signals: the image signal 42 communicating data and instructions between the image array 24 and MCU 26, and the status signal 46 communicating data and instructions between the MCU 26 and communication interface 38.
[00018] The lens 22 can be any lens suitable for viewing an object and providing an image on the image array 24, such as a plastic lens, a glass lens, or the like. In one embodiment, the lens 22 can be a lens as used in mobile telephone cameras. Those skilled in the art will appreciate that the characteristics of the lens, such as the focal length, aperture, and magnification, can be selected to suit the particular application in which the pattern based occupancy sensor 20 is used.
[00019] The image array 24 can be any sensor suitable for converting an image received from the lens 22 to an image signal 42, while maintaining low power consumption. In one embodiment, the image array 24 is a complementary metal oxide semiconductor (CMOS) image sensing chip. In another embodiment, the image array 24 is a charge coupled device (CCD). The image array 24 detects the image on an array of pixels and converts the sensed image to the image signal 42. Typically, the image array 24 is a low power image array. The image array 24 can include on-chip functions, such as calibration, region selection, decimation, power management, pixel correction, and color correction. In one embodiment, the image array 24 includes the lens 22. In other embodiments, the image array 24 can include the MCU 26 and/or the RF transceiver 28. One example of a suitable image array is the W6501 VGA CMOS Color Image Sensor manufactured by STMicroelectronics of Geneva, Switzerland.
[00020] The microcontroller unit (MCU) 26 can be any microcontroller suitable for storing and processing instructions and data. The MCU 26 processes the image signal 42 to calculate area occupancy by analyzing an object for a pattern, determining whether the pattern matches a classifier, and processing the object when the pattern matches the classifier to calculate area occupancy. Typically, the MCU 26 includes the image processor 32, memory 36, and communication stack 34. Instructions can be programmed into the memory 36 during manufacture of the pattern based occupancy sensor 20 and/or can be programmed through the communication signal 50 during operation. The particular characteristics of the MCU 26, such as n-bit architecture, clock speed, memory size, and the like, can be selected for the particular application. Examples of suitable microcontrollers are the 8-bit HCS08 and 16-bit HCS12 families manufactured by Freescale Semiconductor, Inc., of Austin, Texas, the AVR 8-Bit RISC Flash Microcontroller manufactured by Atmel Corporation of San Jose, California, and the STV0767 Imaging Digital Signal Processor manufactured by STMicroelectronics of Geneva, Switzerland. Those skilled in the art will appreciate that the MCU 26 can be a single chip, or can be combined on a single chip with one or both of the image array 24 and the RF transceiver 28. Besides performing image processing, in various embodiments the MCU 26 can manage communications in and out of the pattern based occupancy sensor 20 through the communication interface 38 and manage power usage in the pattern based occupancy sensor 20.
[00021] The radio frequency (RF) transceiver 28 can be any transceiver for communicating between the pattern based occupancy sensor 20 and other pattern based occupancy sensors, local lighting controls, other local controls, and/or building automation systems. Typically, the RF transceiver 28 operates at low voltage and with low power consumption, and can incorporate power management features. In one embodiment, the RF transceiver 28 communicates at 2.4 GHz in accordance with the IEEE 802.15.4 short-range wireless standard and the ZigBee networking standard protocol. In another embodiment, the RF transceiver 28 communicates at 15 GHz. Examples of suitable transceivers are the MC13193 short range, low power, 2.4 GHz ISM band transceiver manufactured by Freescale Semiconductor, Inc. of Austin, Texas, the EM2420 transceiver manufactured by Ember Corporation of Boston, Massachusetts, and the CC2420 RF transceiver manufactured by Chipcon AS, of Oslo, Norway. Those skilled in the art will appreciate that the RF transceiver 28 can operate at various frequencies and with various protocols as desired for a particular application. The RF transceiver 28 can be a single chip, or can be combined on a single chip with one or both of the image array 24 and the MCU 26.
[00022] The power supply 40 can be any wired or wireless power supply suitable for powering the components of the pattern based occupancy sensor 20. Wireless power supplies can include batteries or scavenging power supplies, such as power supplies scavenging solar, vibrational, electromagnetic field energy, or the like. Vibrational scavenging power supplies can be fabricated using micro electromechanical systems (MEMS) and nanotechnology. In one embodiment, the MCU 26 provides a power control signal 52 to the power supply 40 to manage power consumption in the pattern based occupancy sensor 20. In another embodiment, the individual components of the pattern based occupancy sensor 20 include power management features.
[00023] FIG. 2 is a flowchart for a counting method employing a pattern based occupancy sensor made in accordance with the present invention. The counting method includes acquiring an image at 100; detecting an object in the image at 102; analyzing the object for a pattern at 104; determining whether the pattern matches a classifier at 106; and processing the object when the pattern matches the classifier to calculate area occupancy at 108. In one embodiment, the method further includes acquiring training images, and generating the classifier from the training images. The area occupancy can be transmitted to other pattern based occupancy sensors and/or the building automation system for emergency roll calls, security, utility control, lighting control, and the like.
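The steps at 100-108 can be sketched as a single pass in Python; the function names and the callable classifier test are illustrative assumptions, not part of the disclosure:

```python
def count_occupancy(acquire_image, detect_objects, extract_pattern, matches_classifier):
    """One pass of the counting method of FIG. 2 (steps 100-108)."""
    image = acquire_image()                 # 100: acquire an image
    occupancy = 0
    for obj in detect_objects(image):       # 102: detect objects in the image
        pattern = extract_pattern(obj)      # 104: analyze each object for a pattern
        if matches_classifier(pattern):     # 106: does the pattern match a classifier?
            occupancy += 1                  # 108: process matched objects into a count
    return occupancy
```

Passing the classifier test in as a callable lets the same loop serve head-and-shoulder, leg, or body classifiers.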
[00024] FIGS. 3A & 3B are schematic diagrams of a boundary deployment and image, respectively, for a pattern based occupancy sensor made in accordance with the present invention. FIG. 3A shows one example of a boundary deployment of a pattern based occupancy sensor. FIG. 3B shows one example of an image detected by a pattern based occupancy sensor.
[00025] Referring to FIG. 3A, a pattern based occupancy sensor 120 is installed near a doorway 122 to view the boundary 124 between a first area 126 and a second area 128. Objects having a pattern, such as people, animals, machines, parts, or the like, move across the boundary 124 as shown by arrows 132. In this example, the object 130 is a person and the pattern based occupancy sensor 120 is installed near the top of the doorway 122 to observe a head-and-shoulder pattern of the object 130. In other embodiments, the pattern based occupancy sensor 120 can be mounted as suited to the particular application. In an alternate embodiment, the pattern based occupancy sensor 120 is installed low on the doorway 122 to observe the leg or hip pattern of a person. In yet another embodiment, multiple pattern based occupancy sensors 120 can be installed to cover multiple directions, i.e., looking down and looking across the doorway 122. The multiple pattern based occupancy sensors 120 can communicate among themselves or with the building automation system and crosscheck the area occupancy determined by each of the pattern based occupancy sensors 120.
[00026] Referring to FIG. 3B, an image 150 is divided into a first region 152 and a second region 154 by an image boundary 156. The first region 152 and the second region 154 correspond respectively to the first area 126 and the second area 128 of FIG. 3A. The image boundary 156 can be any continuous curve dividing the image 150 into two regions. Pattern 158 is the pattern corresponding to the object 130 of FIG. 3A, shown in the first region 152. In this example, the pattern 158 is the head-and-shoulder pattern of a person.
[00027] FIG. 4 is a flowchart for a boundary counting method employing a pattern based occupancy sensor made in accordance with the present invention. The boundary counting method includes initializing the pattern based occupancy sensor at 400, acquiring an image having first and second regions at 402, detecting an object in the image at 404, analyzing the object for a pattern at 406, and determining whether the pattern matches a classifier at 408. When the pattern does not match the classifier, the method returns to the acquiring an image at 402. When the pattern matches the classifier, the method continues with identifying the object as being in one of the first region and the second region at 410, acquiring a next image at 412, and determining whether the object is in the other of the first region and the second region in the next image at 414. When the object is not in the other of the first region and the second region in the next image, the method returns to the acquiring an image at 402. When the object is in the other of the first region and the second region in the next image, the method continues with changing an occupancy counter at 416 and returning to the acquiring an image at 402. The time for the next acquisition of an image at 402 can be varied depending on the expected speed of the object across the image. In one embodiment, the acquisition of images at 402 can be repeated on the order of hundreds of milliseconds, such as about every 100 milliseconds.
[00028] The initializing the pattern based occupancy sensor at 400 can include initializing counters, defining boundaries and image regions, acquiring classifiers by downloading, and/or acquiring classifiers by training. Counters, such as the occupancy counter, can be zeroed out to start a counting session. The image boundary and regions can be generated automatically from natural features in sample images, such as floor color changes or doorway edges, or can be selected manually. The image boundary and regions remain constant for a pattern based occupancy sensor from one image to the next. When a library of classifiers is available, at least one classifier for checking patterns, such as a head-and-shoulder classifier, can be downloaded from the library into the pattern based occupancy sensor. When a library of classifiers is not available or a custom classifier for a particular application is desired, the classifier can be obtained by training the pattern based occupancy sensor. Training includes acquiring training images, extracting features from objects of interest in the training images, and generating the classifier from the features. The classifiers can include shape as a primary feature, without regard to size. Other embodiments can include color or grayness as a feature determining the classifier. Classifiers can be individual classifiers that identify specific individuals or can be group classifiers that identify different groups, such as people and dogs.
[00029] The analyzing the object for a pattern at 406 is similar to generating the classifiers, i.e., features of objects in the image are determined and analyzed to determine the pattern for the object, such as a head-and-shoulder pattern. In determining whether the pattern matches a classifier at 408, a tolerance can be applied so that the match is defined to occur as long as the pattern falls within the tolerance.
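As a minimal sketch of the tolerance test at 408 (the feature-vector representation and the per-feature distance are assumptions; the patent does not fix a metric):

```python
def pattern_matches(pattern, classifier, tolerance):
    """Declare a match when every pattern feature lies within the tolerance
    of the corresponding classifier feature (step 408)."""
    return all(abs(p - c) <= tolerance for p, c in zip(pattern, classifier))
```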
[00030] The identifying the object as being in one of the first region and the second region at 410, acquiring a next image at 412, and determining whether the object is in the other of the first region and the second region in the next image at 414 check whether a matched object moves across the boundary between one region and the other from one image to the next. The movement can be from the first region to the second region or from the second region to the first region, i.e., any movement across the boundary requiring a change to the occupancy counter. [00031] The occupancy counter can tally boundary crossings or matched objects within an area, such as a room. For boundary crossings, the occupancy counter increments each time a matched object moves from the first region to the second region or from the second region to the first region. For matched objects within an area, the occupancy counter includes an IN counter and an OUT counter. The occupancy counter changes by incrementing the IN counter when the object is in the first region in the first image and in the second region in the second image, and incrementing the OUT counter when the object is in the second region in the first image and in the first region in the second image. Area occupancy, such as the number of people in a room, can be calculated from the difference between the IN counter and the OUT counter. The boundary crossings and/or area occupancy can be transmitted to other pattern based occupancy sensors and/or the building automation system for emergency roll calls, security, utility control, lighting control, and the like.
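The IN/OUT bookkeeping of paragraph [0031] reduces to a few lines; the region numbering (1 and 2) and the dictionary layout are illustrative assumptions:

```python
def update_counters(counters, region_before, region_after):
    """Increment IN on a crossing from the first region to the second,
    and OUT on a crossing from the second region to the first ([0031])."""
    if region_before == 1 and region_after == 2:
        counters["IN"] += 1
    elif region_before == 2 and region_after == 1:
        counters["OUT"] += 1

def area_occupancy(counters):
    """Area occupancy is the difference between the IN and OUT counters."""
    return counters["IN"] - counters["OUT"]
```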
[00032] Classifiers can be individual classifiers that identify specific individuals or can be group classifiers that identify different groups, such as people and dogs. Individual classifiers can be used to track individuals from one area to the other. Group classifiers can be used to track numbers of a group crossing from one area to the other. In another embodiment, the group classifiers can be used to track individuals by assigning an individual tag to the identified object of the group and tracking the individual tag from one area to the other.
[00033] FIGS. 5A & 5B are schematic diagrams of an area deployment and images, respectively, for pattern based occupancy sensors made in accordance with the present invention. FIG. 5A shows one example of an area deployment of a pattern based occupancy sensor. FIG. 5B shows one example of images detected by pattern based occupancy sensors.
[00034] Referring to FIG. 5A, pattern based occupancy sensors 220, 221, 222, 223 are installed near the ceiling of room 242 looking toward the floor 240, which includes first area 226, second area 228, third area 230, and fourth area 232. The pattern based occupancy sensors 220, 221, 222, 223 view the first area 226, the second area 228, the third area 230, and the fourth area 232, respectively. Each pattern based occupancy sensor views a single area. Objects having a pattern, such as people, animals, machines, parts, or the like, can be in the room 242. In this example, the objects 210 are people and the object 212 is an animal. The pattern based occupancy sensor 221 observes the head-and-shoulder patterns of the objects 210 in the second area 228, the pattern based occupancy sensor 222 observes the head-and-shoulder pattern of the object 210 in the third area 230, and the pattern based occupancy sensor 223 observes the body pattern of the object 212 in the fourth area 232. In other embodiments, the pattern based occupancy sensors can be mounted as suited for the particular application. In an alternate embodiment, the pattern based occupancy sensors can be installed low on walls to observe the leg or hip pattern of a person or profile of an animal. In yet another embodiment, multiple pattern based occupancy sensors can be installed to cover multiple directions. The multiple pattern based occupancy sensors can communicate among themselves or with the building automation system and crosscheck the area occupancy determined by each of the pattern based occupancy sensors.
[00035] Referring to FIG. 5B, a room image 250 is divided into a first image 266, a second image 268, a third image 270, and a fourth image 272. Each pattern based occupancy sensor has a single image. The first image 266, second image 268, third image 270, and fourth image 272 correspond respectively to the first area 226, second area 228, third area 230, and fourth area 232 of FIG. 5A. Patterns 280 correspond to the objects 210 of FIG. 5A; in this example, the patterns 280 are the head-and-shoulder pattern of a person. Pattern 282 corresponds to the object 212 of FIG. 5A; in this example, the pattern 282 is the back pattern of an animal.
[00036] FIGS. 6A & 6B are flowcharts for an area counting and occupancy checking method, respectively, employing pattern based occupancy sensors made in accordance with the present invention. The occupancy checking method can be optionally included within the area counting method to check and correct the area occupancy. [00037] Referring to FIG. 6A, the area counting method includes initializing the pattern based occupancy sensor at 600, acquiring a background image at 602, calculating an object space PO and object range RL to RU at 604, setting occupancy counter N = 0 at 606, acquiring an object image at 608, counting objects M in the object image at 610, detecting object i in the object image at 612, analyzing the object i for a pattern i at 614, and determining whether the pattern i matches a classifier at 616. When the pattern i does not match a classifier, the method returns to the detecting object i in the object image at 612. When the pattern i matches a classifier, the method continues with incrementing area occupancy counter N at 618, and determining whether the object i = Mth object at 620. When the object i is not the Mth object, the method returns to the detecting object i in the object image at 612. When the object i is the Mth object, the method continues with checking area occupancy at 622, and determining whether time = background update time at 624. When the time is the background update time at 624, the method returns to the acquiring a background image at 602. When the time is not the background update time at 624, the method continues with determining whether the time = monitoring time at 626. When the time is the monitoring time, the method returns to the setting occupancy counter N = 0 at 606.
[00038] The initializing the pattern based occupancy sensor at 600 can include initializing counters, acquiring classifiers by downloading, and/or acquiring classifiers by training. Counters, such as the occupancy counter, can be zeroed out to start a counting session. When a library of classifiers is available, at least one classifier for checking patterns, such as a head-and-shoulder classifier, can be downloaded from the library into the pattern based occupancy sensor. When a library of classifiers is not available or a custom classifier for a particular application is desired, the classifier can be obtained by training the pattern based occupancy sensor. Training includes acquiring training images, extracting features from objects of interest in the training images, and generating the classifier from the features. The classifiers can include shape as a primary feature, without regard to size. Other embodiments can include color or grayness as a feature determining the classifier. [00039] Acquiring a background image at 602 and calculating an object space PO and object range RL to RU at 604 are optional, and can be used in the optional occupancy checking method 622. Acquiring a background image at 602 includes acquiring a background image with no objects present. The background image provides a baseline for determining the accuracy of the indicated area occupancy. The background image is also used in calculating the object space PO and the object range RL to RU at 604. The object space PO is the space in pixels changed by the shape of an object of interest. The object space PO can be determined from measuring pixel changes from sample objects of interest against the background image, and averaging the measured pixel changes for a number of sample objects. The object range, lower range RL to upper range RU, can be a band about the object space PO, such as ± 10% of the object space PO value.
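The calibration of paragraph [0039] can be sketched as follows; the averaging and the ±10% band come from the text, while the function name is an assumption:

```python
def calibrate(sample_pixel_changes, band=0.10):
    """Average the measured pixel changes of sample objects against the
    background image to get the object space PO, then take the object range
    RL..RU as a band about PO (step 604)."""
    po = sum(sample_pixel_changes) / len(sample_pixel_changes)
    return po, po * (1 - band), po * (1 + band)  # PO, RL, RU
```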
[00040] The objects in an object image are counted and objects of interest matching a classifier are counted by incrementing an area occupancy counter between 606 and 620. The area occupancy indicated by the area occupancy counter can be checked at 622. Checking the area occupancy indicated by the area occupancy counter can account for errors, such as counting objects not of interest but with a similar shape to an object of interest, changes in ambient light, shadows, and the like. In an alternate embodiment, checking the area occupancy at 622 can be omitted. The background update time at 624 is selected on the order of hours to refresh the background image as desired and depending on the particular installation. The monitoring time at 626 is selected on the order of minutes to refresh the object image as desired and depending on the particular installation. The area occupancy can be transmitted to other pattern based occupancy sensors and/or the building automation system for emergency roll calls, security, utility control, lighting control, and the like. In one embodiment, the area occupancies from a number of pattern based occupancy sensors can be added to calculate the occupancy for a larger area. For example, the area occupancies from a number of pattern based occupancy sensors covering particular portions of a room can be added to calculate the occupancy for the room. [00041] Referring to FIG. 6B, the optional occupancy checking method 622 includes starting at 700, calculating pixel change P between the object image and the background image at 702, and determining whether N*RL ≤ P ≤ N*RU at 704. When it is the case that N*RL ≤ P ≤ N*RU, NFinal = N at 706, and the method ends at 712. When it is not the case that N*RL ≤ P ≤ N*RU, the method continues with setting NCalc = Int(P/PO) at 708, setting NFinal = Min(N, NCalc) at 710, and the method ends at 712.
[00042] Calculating pixel change P between the object image and the background image at 702 uses the object space PO and object range RL to RU calculated at 604. The pixel change P is the number of pixels changed from the background image to the object image. The object range for the indicated area occupancy is between the indicated area occupancy N times the lower range RL pixels per object and the indicated area occupancy N times the upper range RU pixels per object. When the pixel change is in the object range for the indicated area occupancy, the indicated area occupancy is acceptable and the final area occupancy NFinal is taken as the indicated area occupancy N. Otherwise, the indicated area occupancy can be corrected: the final area occupancy NFinal is set to the lesser of the indicated area occupancy N and the integer value of the pixel change P divided by the object space PO.
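The correction rule of paragraphs [0041]-[0042] is a two-branch computation; a sketch, with the function name assumed:

```python
def check_occupancy(n, p, po, rl, ru):
    """Accept the indicated occupancy N when the pixel change P lies within
    N*RL..N*RU (step 704); otherwise correct it to Min(N, Int(P/PO)) (708-710)."""
    if n * rl <= p <= n * ru:
        return n                    # NFinal = N (706)
    return min(n, int(p / po))      # NFinal = Min(N, NCalc) (710)
```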
[00043] While the embodiments of the invention disclosed herein are presently considered to be preferred, various changes and modifications can be made without departing from the scope of the invention. The scope of the invention is indicated in the appended claims, and all changes that come within the meaning and range of equivalents are intended to be embraced therein.

Claims

1. A pattern based occupancy sensor comprising: an image array 24, the image array 24 generating an image signal 42 in response to an image; a microcontroller unit (MCU) 26, the MCU 26 being responsive to the image signal 42 and generating a status signal 46; and a communication interface 38, the communication interface 38 being responsive to the status signal 46 and generating a communication signal 50; wherein the MCU 26 processes the image signal 42 to calculate area occupancy by analyzing an object for a pattern, determining whether the pattern matches a classifier, and processing the object when the pattern matches the classifier to calculate area occupancy.
2. The sensor of claim 1 wherein the image array 24 is selected from the group consisting of a complementary metal oxide semiconductor (CMOS) image sensing chip and a charge coupled device (CCD).
3. The sensor of claim 1 wherein the MCU 26 has an image processor 32, memory 36, and communication stack 34.
4. The sensor of claim 1 wherein the communication interface 38 has a radio frequency (RF) transceiver 28 and an antenna 30, the RF transceiver 28 is responsive to the status signal 46 to generate a modulated signal 48, and the antenna 30 is responsive to the modulated signal 48 to generate the communication signal 50.
5. The sensor of claim 1 wherein the communication signal 50 communicates at 2.4 GHz in accordance with the IEEE 802.15.4 short-range wireless standard and the ZigBee networking standard protocol.
6. The sensor of claim 1 further comprising a power supply 40 to provide power to the image array 24, the power supply 40 being selected from the group consisting of a battery and a scavenging power supply.
7. The sensor of claim 5 wherein the scavenging power supply scavenges power from a source selected from the group consisting of solar energy, vibrational energy, electromagnetic field energy.
8. The sensor of claim 6 wherein the power supply 40 is responsive to a power control signal 52 from the MCU 26.
9. A method for occupancy counting comprising:
acquiring an image 100;
detecting an object in the image 102;
analyzing the object for a pattern 104;
determining whether the pattern matches a classifier 106; and
processing the object when the pattern matches the classifier to calculate area occupancy 108.
10. The method of claim 9 further comprising: acquiring training images; and generating the classifier from the training images.
11. The method of claim 9 wherein each image comprises a first region and a second region, and the processing comprises:
identifying the object as being in one of the first region and the second region 410;
acquiring a second image 412;
determining whether the object is in the other of the first region and the second region in the second image 414; and
changing an occupancy counter when the object is in the other of the first region and the second region 416.
12. The method of claim 11 wherein the occupancy counter comprises an IN counter and an OUT counter, and the changing comprises incrementing the IN counter when the object is in the first region in the first image and in the second region in the second image, and incrementing the OUT counter when the object is in the second region in the first image and in the first region in the second image.
13. The method of claim 12 further comprising determining a difference between the IN counter and the OUT counter to calculate area occupancy.
14. The method of claim 11 wherein the classifier is an individual classifier.
15. The method of claim 11 wherein the classifier is a group classifier and further comprising assigning an individual tag to the identified object.
16. The method of claim 9 wherein:
the detecting comprises detecting each object in the image;
the analyzing comprises analyzing each of the objects for a pattern;
the determining comprises determining when each of the patterns matches a classifier; and
the processing comprises incrementing an area occupancy counter when each one of the patterns matches a classifier.
17. The method of claim 16 further comprising checking the area occupancy indicated by the area occupancy counter.
18. The method of claim 17 wherein the image is an object image and the checking comprises:
acquiring a background image;
calculating an object space and an object range from sample objects against the background image;
calculating a pixel change between the object image and the background image;
determining whether the pixel change is in the object range for the indicated area occupancy;
accepting the indicated area occupancy when the pixel change is in the object range for the indicated area occupancy; and
setting the area occupancy to the lesser of the indicated area occupancy and the integer value of the pixel change divided by the object space when the pixel change is not in the object range for the indicated area occupancy.
19. A system for occupancy counting comprising:
means for acquiring an image;
means for detecting an object in the image;
means for analyzing the object for a pattern;
means for determining whether the pattern matches a classifier; and
means for processing the object when the pattern matches the classifier to calculate area occupancy.
20. The system of claim 19 further comprising: means for acquiring training images; and means for generating the classifier from the training images.
21. The system of claim 19 further comprising means for powering the means for acquiring an image.
22. The system of claim 19 wherein the image comprises a first region and a second region, and the means for processing comprises:
means for identifying the object as being in one of the first region and the second region;
means for acquiring a second image;
means for determining whether the object is in the other of the first region and the second region in the second image; and
means for changing an occupancy counter when the object is in the other of the first region and the second region.
23. The system of claim 19 wherein:
the means for detecting comprises means for detecting each object in the image;
the means for analyzing comprises means for analyzing each of the objects for a pattern;
the means for determining comprises means for determining when each of the patterns matches a classifier; and
the means for processing comprises means for incrementing an area occupancy counter when each one of the patterns matches a classifier.
24. The system of claim 23 further comprising means for checking the area occupancy indicated by the area occupancy counter.
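As a rough illustration of the occupancy check recited in claim 18, the pattern-based count can be cross-checked against the number of pixels that changed relative to a background image. The function name, parameter names, and sample values below are hypothetical, chosen only to make the arithmetic of the claim concrete:

```python
# Hypothetical sketch of the claim 18 occupancy check: the count produced by
# pattern matching is accepted when the number of changed pixels relative to
# the background falls in the range expected for that count; otherwise the
# count is capped by how many object-sized areas of change are present.
def check_occupancy(indicated_count: int, pixel_change: int,
                    object_space: int, object_range: tuple) -> int:
    low, high = object_range          # expected pixel-change range, measured
                                      # from sample objects against background
    if low <= pixel_change <= high:
        return indicated_count        # pixel evidence supports the count
    # otherwise: lesser of the indicated count and the integer value of the
    # pixel change divided by the object space
    return min(indicated_count, pixel_change // object_space)
```

For instance, if each sample object changes roughly 500 pixels (the object space) and three objects are expected to change 1200 to 1800 pixels, a change of 1500 pixels confirms a count of 3, while a change of only 800 pixels reduces the count to min(3, 800 // 500) = 1.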
PCT/IB2006/051119 2005-04-12 2006-04-11 Pattern based occupancy sensing system and method WO2006109256A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US67040005P 2005-04-12 2005-04-12
US60/670,400 2005-04-12

Publications (2)

Publication Number Publication Date
WO2006109256A2 true WO2006109256A2 (en) 2006-10-19
WO2006109256A3 WO2006109256A3 (en) 2007-01-04

Family

ID=36999837

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2006/051119 WO2006109256A2 (en) 2005-04-12 2006-04-11 Pattern based occupancy sensing system and method

Country Status (3)

Country Link
KR (1) KR20080005265A (en)
TW (1) TW200643824A (en)
WO (1) WO2006109256A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013179175A1 (en) * 2012-05-29 2013-12-05 Koninklijke Philips N.V. Processing module for use in a presence sensing system
WO2017122119A1 (en) * 2016-01-11 2017-07-20 Brown Gregory A M Systems, and methods for detecting, counting, and tracking people and assets

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI387283B (en) * 2008-04-25 2013-02-21 Univ Nat Taiwan Wireless smart controlling display panel
TWI559236B (en) * 2012-06-20 2016-11-21 原相科技股份有限公司 Background model update method for image process

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0716402A1 (en) * 1994-12-09 1996-06-12 Matsushita Electric Industrial Co., Ltd. Human occupancy detection method and system for implementing the same
GB2337146A (en) * 1998-05-08 1999-11-10 Primary Image Limited Detecting motion across a surveillance area
US6340864B1 (en) * 1999-08-10 2002-01-22 Philips Electronics North America Corporation Lighting control system including a wireless remote sensor
US20030183767A1 (en) * 2000-05-18 2003-10-02 Meunier Gilbert Bruno System for counting living beings
US20040066500A1 (en) * 2002-10-02 2004-04-08 Gokturk Salih Burak Occupancy detection and measurement system and method
US20040161133A1 (en) * 2002-02-06 2004-08-19 Avishai Elazar System and method for video content analysis-based detection, surveillance and alarm management



Also Published As

Publication number Publication date
WO2006109256A3 (en) 2007-01-04
KR20080005265A (en) 2008-01-10
TW200643824A (en) 2006-12-16


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase (Ref country code: DE)
WWW Wipo information: withdrawn in national office (Ref document number: DE)
WWE Wipo information: entry into national phase (Ref document number: 1020077026200; Country of ref document: KR)
NENP Non-entry into the national phase (Ref country code: RU)
WWW Wipo information: withdrawn in national office (Ref document number: RU)
122 Ep: pct application non-entry in european phase (Ref document number: 06727891; Country of ref document: EP; Kind code of ref document: A2)
WWW Wipo information: withdrawn in national office (Ref document number: 6727891; Country of ref document: EP)