CN110634209B - Occupancy sensing system for supervisory service management - Google Patents
- Publication number
- CN110634209B (application CN201910548911.0A)
- Authority
- CN
- China
- Prior art keywords
- occupancy
- server
- sensing system
- processor
- occupancy sensing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C1/00—Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people
- G07C1/10—Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people together with the recording, indicating or registering of other data, e.g. of signs of identity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20024—Filtering details
- G06T2207/20032—Median filtering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
Abstract
An occupancy sensing system for supervisory service management. An occupancy-sensing-based system for supervisory service management is disclosed that allows supervisory services to be scheduled based on the actual use of a venue, such as a restroom, rather than following a static schedule, and thus can potentially save significant labor costs. The system is accurate, privacy-aware, and easy to use. In addition to allowing supervisory management to optimize its workforce, the system is also able to detect theft of supplies and perform inventory management. It may be useful in airports, hospitals, stadiums, theaters, academic buildings, convention centers, and many other dynamic locations.
Description
The present application claims the benefit of priority from U.S. provisional application serial No. 62/689,484, filed June 25, 2018, the disclosure of which is incorporated herein by reference in its entirety.
Technical Field
The apparatus and method disclosed in this document relate to occupancy sensing and, more particularly, to an occupancy sensing system for supervising service management.
Background
Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.
Supervisory services provide cleaning services in buildings. They are expensive, representing a multi-million-dollar market in airports, hospitals, stadiums, theaters, academic buildings, and convention centers. Current supervisory services typically follow static schedules and neither monitor nor consider the number of people using a facility in real time. Thus, some heavily used lavatories may be under-serviced by supervisory personnel while rarely used lavatories are over-serviced, resulting in inefficient labor management and increased costs.
Disclosure of Invention
An occupancy sensing system is disclosed. The occupancy sensing system comprises: a plurality of occupancy sensors, each occupancy sensor mounted at an entryway to a respective venue of the plurality of venues, each occupancy sensor having a depth sensor facing the entryway to the respective venue and configured to provide depth sensor data, each occupancy sensor configured to (i) determine, based on the depth sensor data, a volume of people passing through the entryway to the respective venue and (ii) determine, based on the determined volume of people passing through the entryway to the respective venue, a number of people occupying the respective venue at a plurality of different times; and a server communicatively coupled to the plurality of occupancy sensors and configured to receive data from the plurality of occupancy sensors, the data indicating a number of people occupying each of the plurality of places at a plurality of different times.
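The claimed data flow, in which each sensor reports the number of people occupying its venue at a plurality of different times to the server, can be sketched as a minimal data model. This is an illustrative sketch only; the names (`OccupancyReport`, `venue_id`) are assumptions, not identifiers from the patent:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class OccupancyReport:
    """Metadata a sensor uploads to the server: the number of people
    occupying its venue at a plurality of different times.
    (Field names are illustrative, not from the patent.)"""
    venue_id: str
    # (timestamp, number of people occupying the venue at that time)
    counts: List[Tuple[float, int]] = field(default_factory=list)

    def latest_occupancy(self) -> int:
        return self.counts[-1][1] if self.counts else 0

report = OccupancyReport("restroom-2F-east")
report.counts.append((1561470000.0, 3))
report.counts.append((1561470300.0, 5))
print(report.latest_occupancy())  # 5
```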
Drawings
The foregoing aspects and other features of the occupancy sensing system are explained in the following description, taken in connection with the accompanying drawings.
FIG. 1 is a diagrammatic view of an exemplary embodiment of an occupancy sensing system.
FIG. 2 is a diagrammatic view of an exemplary installation of occupancy sensing devices at an entryway of a venue.
FIG. 3 is a block diagram of exemplary components of the occupancy sensing device(s) of FIGS. 1-2.
Fig. 4 is a block diagram of exemplary components of the server of fig. 1.
Fig. 5 is a block diagram of exemplary components of the client device(s) of fig. 1.
FIG. 6 illustrates a logic flow diagram of an exemplary method of operating an occupancy sensing device to monitor traffic through an entryway and occupancy of a corresponding site.
Fig. 7 illustrates an exemplary depth image frame.
Detailed Description
The following description is presented to enable any person skilled in the art to make and use the described embodiments, and is provided in the context of a particular application and its requirements. Various modifications to the described embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the described embodiments. Thus, the described embodiments are not limited to the embodiments shown, but are to be accorded the widest scope consistent with the principles and features disclosed herein.
In the present disclosure, an occupancy sensing system for supervisory service management is described. As used herein, "supervisory services" include, but are not limited to, cleaning, resupplying, repairing, inspecting, or otherwise maintaining public, semi-public, and/or private locations, such as restrooms, public areas, cafeterias, or other locations that need to be maintained. An occupancy-sensing-based supervisory service management system may be particularly useful in places where occupancy patterns are irregular and dynamic, such as airports, hospitals, stadiums, theaters, academic buildings, and convention centers. In such places, there are substantial potential labor-cost savings in optimizing work schedules based on the actual use of the facility. In addition, occupancy-sensing-based supervisory service management systems are also useful for inventory management and theft detection.
While occupancy-sensing-based supervisory service management systems are useful for a wide variety of purposes, one particular problem addressed and discussed in detail herein is monitoring the actual use of a restroom in real time and making this information available for supervisory service management. More specifically, the system counts the number of people entering and exiting each restroom area in real time and determines how the restroom is being used. However, it will be appreciated that the system is similarly suitable for other purposes.
There are several challenges in counting the number of people using a restroom. In particular, placing sensors inside the restroom is often discouraged. Additionally, the solution should not intrude on privacy, which rules out sensors such as RGB cameras. Furthermore, it is challenging to determine how a person uses a restroom, i.e., how long they spend in the restroom and what restroom supplies they use.
Referring to fig. 1-2, an exemplary embodiment of an administrative service management system 10 is depicted. As shown in FIG. 1, the administrative service management system 10 includes a plurality of occupancy sensing devices 20. The occupancy sensing device 20 is configured to monitor occupancy and usage of one or more venues, in particular one or more restrooms of a building. In some embodiments, the occupancy sensing device 20 is configured to track how many people have used the corresponding location and how long each person has used the corresponding location.
The administrative services management system 10 further includes a server 30. The server 30 is configured to collect and process occupancy and usage information provided by the occupancy sensing device 20 in order to provide various administrative service management functions, such as: estimating the consumption and/or depletion of supplies at the site (e.g., toilet paper, soap, and paper towels in the case of a restroom), determining whether a particular site needs to be serviced, and generating an optimized service schedule for the site and for supervising service workers. The server 30 may be an application server, a certificate server, a mobile information server, an e-commerce server, an FTP server, a directory server, a CMS server, a printer server, a management server, a mail server, a public/private access server, a real-time communication server, a database server, a proxy server, a streaming server, etc.
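One of the server functions described above, estimating consumption and depletion of supplies and deciding when a site needs service, might be sketched as follows. The per-visit consumption rates and the 20% threshold are illustrative assumptions, not values from the patent:

```python
# Hypothetical per-visit consumption rates; these numbers are
# illustrative assumptions, not values from the patent.
RATES = {"toilet_paper_m": 0.9, "soap_ml": 1.5, "paper_towels": 2.0}

def estimate_remaining(full_stock, visits_since_service):
    """Estimate supplies remaining given the visit count since the
    venue was last serviced."""
    return {item: full_stock[item] - RATES[item] * visits_since_service
            for item in full_stock}

def needs_service(full_stock, visits_since_service, threshold=0.2):
    """Flag the venue for service when any estimated supply falls
    below a fraction of its fully-stocked level."""
    remaining = estimate_remaining(full_stock, visits_since_service)
    return any(remaining[item] < threshold * full_stock[item]
               for item in full_stock)

stock = {"toilet_paper_m": 100.0, "soap_ml": 500.0, "paper_towels": 300.0}
print(needs_service(stock, 90))   # True  (toilet paper down to ~19 m)
print(needs_service(stock, 10))   # False
```

In practice the server would feed the per-venue visit counts received from the occupancy sensing devices into an estimator of this kind and fold the flagged venues into the generated service schedule.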
The administrative service management system 10 further includes one or more client devices 40. The client device 40 enables a user, such as a supervisory service worker, to view information about occupancy and usage of a venue, estimates of the supplies at a particular venue, which venues need to be serviced, and the generated service schedules. Each client device 40 may be a personal computer or desktop computer, laptop computer, cellular or smart phone, tablet device, Personal Digital Assistant (PDA), wearable device, game console, audio device, video device, entertainment device such as a television, vehicle infotainment system, or the like.
The occupancy sensing device 20, the server 30, and the client device 40 are configured to communicate with each other via a communication link L of the network 50. The communication link L may be wired, wireless, or a combination thereof. The network 50 may include one or more sub-networks that cooperate to enable communication between the occupancy sensing device 20, the server 30, and the client device 40. The network 50 may include, in part, a pre-existing network, such as an enterprise-wide computer network, an intranet, the internet, a public computer network, or any combination thereof. The network 50 may include, for example, a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a primary public network having a private sub-network, a primary private network having a public sub-network, or a primary private network having a private sub-network. Still further embodiments of the network 50 may include network types such as point-to-point networks, broadcast networks, telecommunications networks, data communication networks, computer networks, ATM (Asynchronous Transfer Mode) networks, SONET (Synchronous Optical Network) networks, SDH (Synchronous Digital Hierarchy) networks, wireless networks, and wired networks. The particular network topology of the network 50 may vary in different embodiments, and may include a bus, star, ring, repeater-based, or hierarchical star network topology. Additional embodiments of the network 50 may utilize a mobile phone network that uses a protocol to communicate among mobile devices, such as AMPS, TDMA, CDMA, GSM, GPRS, UMTS, LTE, or any other protocol capable of transmitting data among mobile devices.
The wireless communication link may include a cellular protocol, a data packet protocol, a radio frequency protocol, a satellite band, an infrared channel, or any other protocol capable of transmitting data among the devices of the network 50.
Referring to FIG. 2, an exemplary installation of an occupancy sensing device 20 at an entryway 11 of a venue is shown. In at least one embodiment, the venue is a restroom, but it may alternatively be any room or space in a building, convention center, airport, auditorium, or theater, whether open or enclosed, public or private-access, and so forth. In the illustrated installation, the occupancy sensing device 20 and/or its depth sensor 110 is mounted to and/or integrated with the top portion 12 of the entryway 11. The occupancy sensing device 20 is mounted to and/or integrated with the entryway 11 such that the depth sensor 110 faces in a direction orthogonal to an interior portion or frame of the entryway 11. For example, in the illustrated embodiment, the depth sensor 110 faces in a direction D toward the floor 18. However, the occupancy sensing device 20 is robust to mounting errors or mounting constraints: it can still sense a person in the frame even when the depth sensor 110 is mounted at a non-orthogonal angle. It will be appreciated that in alternative embodiments, the occupancy sensing device 20 and/or its depth sensor 110 may be mounted to and/or integrated with one of the side portions 14 or 16 of the entryway 11 or the floor 18 of the entryway 11, depending on the method used to track the flow of people through the entryway 11.
In some embodiments, the display 160 of the occupancy sensing device 20 is mounted above the entryway and is configured to display occupancy and/or usage information regarding the venue. For example, in some embodiments, the display 160 is configured to display the number of people currently occupying the location.
Each occupancy sensing device 20 is associated with a particular venue, such as a particular restroom of a building, and is configured to monitor occupancy and usage of the particular venue. Occupancy sensing devices 20 associated with a particular site are communicatively coupled to server 30 via communication link L of network 50. Where a particular site includes multiple entryways, an occupancy sensing device 20 may be installed at each entryway. Alternatively, a single occupancy sensing device 20 associated with the site may include a respective depth sensor 110 mounted at each entryway. The occupancy sensing device 20 and/or its depth sensor 110 work together to monitor occupancy and usage of a particular site having multiple entryways.
It will be appreciated that using a depth sensor 110 mounted at the entryway is a far less privacy-intrusive approach to occupancy sensing than many alternative solutions. In particular, the depth sensor 110 only measures the distance from the sensor to the nearest object. Thus, the occupancy sensing device 20 can see the shape of a human body but, unlike RGB-camera-based solutions, cannot see the color or other uniquely identifying features of the face, clothing, or hair of a particular person occupying the site. Additionally, as will be discussed in more detail below, only metadata such as occupancy counts are uploaded to the server 30. This solution is thus less privacy-intrusive because no images are stored or uploaded to the server 30. Furthermore, if the occupancy sensing device 20 is compromised by a malicious actor, only the depth sensor data and the occupancy and/or usage metadata are exposed.
FIG. 3 illustrates a block diagram showing exemplary components of the occupancy sensing device 20 of FIGS. 1-2. In the illustrated embodiment, the occupancy sensing device 20 includes a depth sensor 110, a processor 120, a memory 130, one or more transceivers 140, an input/output (I/O) interface 150, and a display 160, which are communicatively coupled to each other via one or more system buses B. The system bus B may be any of several types of bus structures including a memory or memory controller, a peripheral bus, a local bus, and any type of bus architecture. In at least one embodiment, the occupancy sensing device 20 further comprises: a battery (not shown) configured to provide power to other components of the occupancy sensing device 20.
The depth sensor 110 is configured to provide depth data that is used by the processor 120 to detect and track, in real time, the flow of people entering and exiting the site through the entryway 11. Although one depth sensor 110 is illustrated, the occupancy sensing device 20 may include more than one depth sensor 110. The depth sensor 110 may include an array of individual depth sensor elements (not shown) arranged in a grid, each individual sensor element configured to detect a depth value and provide sensor data representing the detected depth value. In one embodiment, the depth sensor 110 includes a readout controller (not shown) configured to control the individual depth sensor elements of the depth sensor 110, receive sensor data from the individual depth sensor elements, and perform various preprocessing steps on the sensor data, such as digitizing the sensor data, timestamping the sensor data, and/or packaging the sensor data into image frames at a predetermined or adjustable frame rate. However, it will be appreciated that such processing by the readout controller may alternatively be performed by the processor 120 of the occupancy sensing device 20.
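The readout steps described above, digitizing the sensor-element values, timestamping them, and packaging them into image frames, might look like the following sketch. The grid size and the flat-floor depth value are hypothetical, not taken from the patent:

```python
import time

GRID_W, GRID_H = 16, 12   # hypothetical sensor-element grid size

def read_elements():
    # Stand-in for reading raw depth values (in millimetres) from the
    # grid of individual depth sensor elements; here, a flat floor
    # 2400 mm below the sensor with no person in the frame.
    return [[2400 for _ in range(GRID_W)] for _ in range(GRID_H)]

def package_frame():
    """Digitize, timestamp, and package the sensor data into one
    depth image frame, as the readout controller is described as doing."""
    return {"timestamp": time.time(), "depth_mm": read_elements()}

frame = package_frame()
print(len(frame["depth_mm"]), len(frame["depth_mm"][0]))  # 12 16
```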
In some embodiments, the occupancy sensing device 20 may further include other types of sensors, such as optical sensors, imaging sensors, acoustic sensors, motion sensors, global positioning system sensors, thermal sensors, IR array sensors, and environmental sensors, which may be utilized in conjunction with or in place of the depth sensor 110 to monitor occupancy and usage of the venue.
The processor 120 may be any of a variety of processors as will be appreciated by one of ordinary skill in the art. One of ordinary skill in the art will appreciate that "processor" as used herein includes any hardware system, hardware mechanism, or hardware component that processes data, signals, and/or other information. A processor can include a system with a central processing unit, multiple processing units, dedicated circuitry for achieving functionality, and/or other systems. Exemplary processors include a microprocessor (μP), a microcontroller (μC), a Digital Signal Processor (DSP), or any combination thereof. The processor 120 may include one or more levels of cache (such as a level-one cache), one or more processor cores, and one or more registers. Example processor cores may include an Arithmetic Logic Unit (ALU), a Floating Point Unit (FPU), a digital signal processing core (DSP core), or any combination thereof. In at least one embodiment, the processor 120 comprises an ARM processor. The processor 120 is operatively connected to the depth sensor 110, the memory 130, the transceivers 140, the input/output interface 150, and the display 160, and is configured to receive sensor data (pre-processed or otherwise) from the depth sensor 110.
The one or more transceivers 140 may be any of a variety of devices configured for communicating with other electronic devices, including the ability to send and receive communication signals. The transceivers 140 may include different types of transceivers configured to communicate with different networks and systems, including at least the server 30 via a communication link L of the network 50. The transceivers 140 may include, for example, a modem, radio, network interface, communication port, PCMCIA slot and card, and the like. In one embodiment, the transceiver 140 is configured to exchange data using a protocol such as LoRa, Wi-Fi, Bluetooth, RFID, NFC, ZigBee, Z-Wave, or Ethernet.
The I/O interface 150 includes: hardware and/or software configured to facilitate communication with one or more peripherals and/or user interfaces including a display 160, which may include an LCD display, a 7-segment digital display, one or more indicator lights, and the like. The input/output interface 150 is in communication with the display 160 and is configured to visually display graphics, text, and other data to a user via the display 160. Additional peripherals and/or user interfaces may include a keyboard, joystick, mouse, trackball, touchpad, touch screen or tablet device input, foot pedal control, servo control, joystick input, infrared or laser pointer, camera-based gesture input, and the like, capable of controlling different aspects of the operation of occupancy-sensing device 20.
The memory 130 of the occupancy sensing device 20 is configured to store information, including both data and instructions. The memory 130 may be any type of device capable of storing information accessible by the processor 120, such as a memory card, ROM, RAM, writable memory, read-only memory, hard drive, magnetic disk, flash memory, or any of a variety of other computer-readable media serving as data storage devices as will be appreciated by one of ordinary skill in the art. In at least one embodiment, the memory 130 stores occupancy data 170, such as data regarding the current or previous occupancy of the restroom, entry or exit events through the entryway, and/or the amount of time particular individuals spent in the restroom. The data may further include various other operational data, logs, and the like, such as the last time the site was serviced by a supervisory service worker, the estimated supply volume consumed and/or depleted since the site was last serviced, and/or the estimated supply volume remaining at the site.
The memory 130 is further configured to store program instructions that, when executed by the processor 120, enable the occupancy sensing device 20 to provide the features, functionalities, characteristics, and/or the like as described herein. In particular, the memory 130 includes: occupancy sensing program 180 that enables monitoring of traffic through the entryway and occupancy at the respective location.
In at least one embodiment, the processor 120 is configured to (i) determine, based on the depth sensor data, a volume of people passing through the entryway to the respective venue, and (ii) determine, based on the determined volume of people passing through the entryway to the respective venue, a number of people occupying the respective venue at a plurality of different times. In particular, the processor 120 is configured to detect each time a person enters the site through the entryway and each time a person exits the site through the entryway. Based on these detected entry and exit events, the processor 120 is configured to determine the number of people occupying the place by subtracting the number of exit events from the number of entry events. In at least one embodiment, the processor 120 is configured to operate the display 160 to display occupancy information, such as the number of people occupying the respective location. In at least one embodiment, the processor 120 is configured to operate the transceiver 140 to transmit data to the server 30, the data including occupancy counts with associated timestamps and entry and exit events with associated timestamps. It will be appreciated that such monitoring of the flow of people through the entryway and of occupancy at the respective location may be performed using a variety of different methods, models, and/or algorithms. Exemplary methods for monitoring traffic through an entryway and occupancy at a corresponding location are discussed in more detail below with reference to FIGS. 6-7.
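The counting rule described above (occupancy equals the number of entry events minus the number of exit events, with each event timestamped) can be sketched as follows; the class and method names are illustrative, not from the patent:

```python
class OccupancyCounter:
    """Occupancy from timestamped entry/exit events detected at the
    entryway: occupancy = entry events - exit events.
    (Illustrative sketch; names are not from the patent.)"""

    def __init__(self):
        self.events = []   # (timestamp, "entry" or "exit")

    def record_entry(self, timestamp):
        self.events.append((timestamp, "entry"))

    def record_exit(self, timestamp):
        self.events.append((timestamp, "exit"))

    @property
    def occupancy(self):
        entries = sum(1 for _, kind in self.events if kind == "entry")
        exits = len(self.events) - entries
        # Clamp at zero so a missed entry detection cannot drive the
        # count negative.
        return max(entries - exits, 0)

counter = OccupancyCounter()
counter.record_entry(100.0)
counter.record_entry(105.0)
counter.record_exit(160.0)
print(counter.occupancy)  # 1
```

The timestamped event list is exactly the metadata the device would transmit to the server alongside the derived occupancy counts.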
In at least one embodiment, the processor 120 is configured to determine the occupancy time of each person entering the respective site. In particular, in one embodiment, to determine how long a person stays in the location, as each person enters or exits through the entryway, the processor 120 is configured to extract biometric features about the person based on the depth sensor data, including but not limited to height, shoulder size, and head radius. Additionally, as each person enters or exits through the entryway, the processor 120 is configured to record a timestamp associated with the particular entry event or exit event. As each person exits through the entryway, the processor 120 is configured to compare the extracted biometric features with the previously extracted biometric features of persons who previously entered the location. Based on the comparison, the processor 120 is configured to match the exit event with a previous entry event and compare their respective timestamps to determine how long the person occupied the venue, referred to herein as the occupancy time. In at least one embodiment, the processor 120 is configured to operate the transceiver 140 to transmit data to the server 30, the data including the occupancy time of each person who entered and then exited through the entryway.
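The matching step described above, comparing the biometric features extracted at exit against those of people who previously entered and then differencing the matched timestamps, might be implemented as a nearest-neighbor match. This is a hypothetical sketch; the feature units and distance metric are assumptions:

```python
import math

def feature_distance(a, b):
    # Euclidean distance between biometric feature tuples
    # (height_cm, shoulder_width_cm, head_radius_cm).
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_exit_to_entry(exit_features, open_entries):
    """Return the index of the open entry event whose extracted
    biometric features best match the exiting person."""
    return min(range(len(open_entries)),
               key=lambda i: feature_distance(open_entries[i][1],
                                              exit_features))

# Open entry events: (entry_timestamp, (height, shoulder, head_radius)).
open_entries = [(100.0, (182.0, 46.0, 9.5)),
                (130.0, (165.0, 40.0, 8.8))]
exit_time, exit_features = 400.0, (181.0, 45.5, 9.4)

i = match_exit_to_entry(exit_features, open_entries)
entry_time, _ = open_entries.pop(i)
occupancy_time = exit_time - entry_time
print(occupancy_time)  # 300.0
```

Note that only coarse shape features and timestamps are retained here, consistent with the privacy-preserving design: no image ever needs to be stored.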
Fig. 4 illustrates a block diagram showing exemplary components of the server 30 of figs. 1-2. In the illustrated embodiment, server 30 includes a processor 210, a memory 220, a user interface 230, and a network communication module 240. It will be appreciated that the illustrated embodiment of the server 30 is only one exemplary embodiment, and is merely representative of any of a variety of ways or configurations of a server, a remote computer, or any other data processing system operating in the manner set forth herein.
The processor 210 may be any of a variety of processors as will be appreciated by one of ordinary skill in the art. One of ordinary skill in the art will appreciate that "processor" as used herein includes any hardware system, hardware mechanism, or hardware component that processes data, signals, and/or other information. A processor can include a system with a central processing unit, multiple processing units, dedicated circuitry for achieving functionality, and/or other systems. Exemplary processors include a microprocessor (μP), a microcontroller (μC), a Digital Signal Processor (DSP), or any combination thereof. The processor 210 may include one or more levels of cache (such as a level one cache), one or more processor cores, and one or more registers. Example processor cores may include an Arithmetic Logic Unit (ALU), a Floating Point Unit (FPU), a digital signal processing core (DSP core), or any combination thereof. The processor 210 is operatively connected to the memory 220, the user interface 230, and the network communication module 240.
The server 30 may be operated locally or remotely by a user. To facilitate local operations, the server 30 may include an interactive user interface 230. Via the user interface 230, a user may modify and/or update program instructions stored on the memory 220, as well as collect data from the memory 220 and store data to the memory 220. In one embodiment, the user interface 230 may suitably include an LCD display screen or the like, a mouse or other pointing device, a keyboard or other keypad, a speaker, and a microphone, as will be appreciated by those of ordinary skill in the art. Alternatively, in some embodiments, the user may operate the server 30 remotely from another computing device that is in communication with the server 30 via the network communication module 240 and has a similar user interface.
The network communication module 240 of the server 30 provides an interface that allows communication with any of a variety of devices or networks, and includes at least a transceiver or other hardware configured to communicate with the occupancy sensing device 20 and with the client device 40. In particular, the network communication module 240 may include: a local area network port that allows communication with any of a variety of local computers housed in the same or nearby facility. In some embodiments, the network communication module 240 further comprises: a wide area network port that allows communication with a remote computer over the internet. Alternatively, the server 30 communicates with the internet via a separate modem and/or router of a local area network. In one embodiment, the network communication module 240 is equipped with a Wi-Fi transceiver or other wireless communication device. Thus, it will be appreciated that communication with the server 30 may occur via wired communication or via wireless communication. The communication may be accomplished using any of a variety of known communication protocols.
The memory 220 of the server 30 is configured to store information including both data and instructions. The memory 220 may be any type of device capable of storing information accessible by the processor 210, such as a memory card, ROM, RAM, writable memory, read-only memory, hard drive, magnetic disk, flash memory, or any of a variety of other computer-readable media serving as data storage devices as will be appreciated by one of ordinary skill in the art. In at least one embodiment, memory 220 includes occupancy data 260, such as data received from the occupancy sensing devices 20 regarding the current or previous occupancy of each restroom, entry or exit events through the entryway, and/or the amount of time particular individuals spent in the restroom. In at least one embodiment, memory 220 includes administrative services data 270, which includes various other operational data, logs, etc., such as information about the last time an administrative service worker provided service to each site, the estimated supply volume consumed and/or depleted since each site was last serviced, the estimated supplies remaining at each site, the service schedule for each site, and information about the administrative service workers and their work schedules.
The memory 220 is further configured to store program instructions that, when executed by the processor 210, enable the server 30 to provide features, functionalities, characteristics, etc. as described herein. In particular, the memory 220 includes: a supervisory service manager 250 capable of monitoring occupancy of a plurality of sites where occupancy sensing devices 20 are installed, estimating usage of supplies at the sites, and managing service to the sites by supervisory service workers based on the occupancy and usage at the sites. In particular, in at least one embodiment, where the venue is a restroom, the administrative services manager 250 enables monitoring of restroom occupancy, estimating use of restroom supplies (e.g., toilet paper, soap, and paper towels), and managing services of the restroom by administrative service workers.
In particular, the server 30 is configured to receive occupancy and/or usage related metadata from each of the occupancy sensing devices 20 via the network communication module 240 or otherwise, such as occupancy counts with associated timestamps, entry and exit events with associated timestamps, and the occupancy time of each person at the respective venue.
In some embodiments, based on the received occupancy and/or usage related metadata, the processor 210 is configured to, for each person entering one of the sites, estimate usage of the offer by the respective person at the respective site. In particular, in at least one embodiment, processor 210 is configured to estimate usage of the supply by each person based on the occupancy time of the respective person at the respective location. In one embodiment, processor 210 is configured to estimate usage of the supply by each person based on a comparison of the occupancy time of the respective person to one or more predetermined threshold durations T. In particular, in one embodiment, processor 210 is configured to estimate a first supply amount used by a respective person at a respective location in response to the occupancy time of the respective person being less than a predetermined threshold duration T. Similarly, the processor 210 is configured to estimate a second supply amount used by the respective person at the respective location in response to the occupancy time of the respective person being greater than the predetermined threshold duration T.
For example, where the venue is a restroom, if a person spends less than a first predetermined threshold duration T1 (e.g., 3 minutes) in the restroom, then processor 210 is configured to determine that a first amount of the restroom supplies (e.g., toilet paper, soap, and paper towels) is used. Similarly, if a person spends more than the first predetermined threshold duration T1 (e.g., 3 minutes) in the restroom, processor 210 is configured to determine that a second amount of the restroom supplies is used. In at least one embodiment, the second amount is greater than the first amount. The first and second quantities of restroom supplies may be set according to implicit assumptions about how a person uses the restroom based on how long the person is in the restroom (i.e., a longer occupancy may indicate that more toilet paper is used). Additionally, in some embodiments, the first and second quantities of restroom supplies may be set differently depending on whether the restroom is a men's restroom or a women's restroom (i.e., a short occupancy in a men's restroom may indicate no toilet paper usage, while a short occupancy in a women's restroom may indicate a small amount of toilet paper usage). For some types of supplies (e.g., soap and paper towels), the first and second amounts may be the same regardless of the elapsed time. In some embodiments, processor 210 is configured to adjust the estimated first and second amounts of restroom supplies over time based on actual usage reported by the administrative service workers.
In some embodiments, multiple thresholds may be used. For example, in one embodiment, if a person spends more than a second predetermined threshold duration T2 (e.g., 45 minutes) in the restroom, processor 210 is configured to determine that a third amount of the restroom supplies is used. If a person spends more than the first predetermined threshold duration T1 (e.g., 3 minutes) but less than the second predetermined threshold duration T2 (e.g., 45 minutes) in the restroom, processor 210 is configured to determine that the second quantity of the restroom supplies is used. In one embodiment, the third amount of restroom supplies is less than the second amount, based on the assumption that the person may be on the phone in the restroom or engaged in some other atypical use of the restroom.
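The tiered estimate above can be sketched as a simple dwell-time lookup. The thresholds follow the text's examples (T1 = 3 minutes, T2 = 45 minutes), but the concrete supply amounts are hypothetical placeholders; the patent leaves the actual quantities to the deployment and allows them to be adjusted from reported usage.

```python
def estimate_supply_use(occupancy_seconds,
                        t1=3 * 60, t2=45 * 60,
                        first_amount=1.0, second_amount=3.0, third_amount=2.0):
    """Return an estimated number of supply units for one visit.

    Amounts are illustrative; per the text, the third amount (an atypical
    long stay, e.g., a phone call) is less than the second.
    """
    if occupancy_seconds < t1:
        return first_amount    # short visit
    if occupancy_seconds <= t2:
        return second_amount   # typical visit
    return third_amount        # atypical long stay
```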
However, it will be appreciated that in an alternative embodiment, the processor 120 of the occupancy sensing device 20 is configured to estimate the usage of the supply by each person entering the respective location in the manner described above. In such embodiments, the processor 120 is configured to operate the transceiver 140 to transmit data to the server 30, the data including the estimated supply usage for each person entering the site.
In some embodiments, the processor 210 is configured to store in the memory 220 an administrative services log indicating the service history for each location, including at least the most recent times that administrative services workers provided service for each respective location. In at least one embodiment, processor 210 is configured to estimate total supply usage at each of the plurality of sites since a most recent time that the service was provided for each respective site. In one embodiment, processor 210 is configured to determine total supply usage at each respective location based on the number of people that have occupied each respective location since the last time that service was provided for each respective location. More particularly, in one embodiment, processor 210 is configured to determine a total supply usage at each respective site based on an estimated supply usage of each person entering the site since the most recent time that the site was serviced.
However, it will be appreciated that in an alternative embodiment, the processor 120 of the occupancy sensing device 20 is configured to store in the memory 130 the most recent times that the administrative service workers provided service to the respective locations, and to estimate the total supply usage at the respective locations in the manner described above. In such embodiments, the processor 120 is configured to operate the transceiver 140 to transmit data to the server 30, the data including the estimated total supply usage at the respective site.
In one embodiment, the processor 210 is configured to: for each of a plurality of sites, a remaining supply volume at each respective site is estimated based on the estimated total supply usage at the respective site since the most recent time that the service was provided for the respective site. In particular, in one embodiment, processor 210 is configured to estimate the remaining supply volume at each respective site by subtracting the total supply usage from the starting volume of supplies that existed after the most recent time that each respective site was serviced.
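The bookkeeping in the two paragraphs above reduces to a running sum and a subtraction, sketched below. The function name and the zero floor are illustrative assumptions; the patent only specifies subtracting total usage from the starting volume present after the most recent service.

```python
def remaining_supply(starting_volume, per_person_estimates):
    """Starting volume after the last service, minus the sum of the
    per-person usage estimates accumulated since then (floored at zero)."""
    total_usage = sum(per_person_estimates)
    return max(0.0, starting_volume - total_usage)
```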
In one embodiment, the processor 210 is configured to: for each of the plurality of sites, a next time at which the administrative service worker should next provide service for the respective site is determined based on the estimated total supply usage at the respective site since the most recent time at which service was provided for each respective site, and/or based on the estimated remaining supplies at the respective site.
In some embodiments, the processor 210 is configured to determine a usage pattern for each of the plurality of venues based on the occupancy and/or usage related data received from the occupancy sensing devices 20. Based on the usage pattern, the processor 210 is configured to generate a service schedule for each of the plurality of places. Additionally, when there is a deviation from the expected consumption of supplies based on the regular usage pattern, the determined usage pattern may be used to identify theft activity. For example, if it is determined that no one used the restroom over the weekend, but the cleaner is replenishing a bulk supply, it may be a sign of theft of the restroom supplies.
Fig. 5 illustrates an exemplary embodiment of one of the client devices 40, which may include a smart phone, a smart watch, a laptop computer, a tablet computer, a desktop computer, and so forth. In the illustrated embodiment, client device 40 includes a processor 310, a memory 320, a transceiver 330, an I/O interface 340, and a display 350. It will be appreciated that the illustrated embodiment of the client device 40 is merely one exemplary embodiment, and is merely representative of any of a variety of manners or configurations of a client device, personal electronic device, or other device operating in the manner set forth herein.
The processor 310 may be any of a variety of processors as will be appreciated by one of ordinary skill in the art. One of ordinary skill in the art will appreciate that "processor" as used herein includes any hardware system, hardware mechanism, or hardware component that processes data, signals, and/or other information. A processor can include a system with a central processing unit, multiple processing units, dedicated circuitry for achieving functionality, and/or other systems. Exemplary processors include a microprocessor (μP), a microcontroller (μC), a Digital Signal Processor (DSP), or any combination thereof. The processor 310 may include one or more levels of cache (such as a level one cache), one or more processor cores, and one or more registers. Example processor cores may include an Arithmetic Logic Unit (ALU), a Floating Point Unit (FPU), a digital signal processing core (DSP core), or any combination thereof. The processor 310 is operatively connected to the memory 320, the transceiver 330, the I/O interface 340, and the display screen 350.
The transceiver 330 includes at least a transceiver, such as a Wi-Fi transceiver, configured to communicate with the server 30 via the network 50, but may also include any of a variety of other devices configured for communication with other electronic devices, including the ability to send and receive communication signals. In one embodiment, the transceiver 330 further includes additional transceivers common to smartphones, smartwatches, laptop computers, tablet computers, desktop computers, such as bluetooth transceivers, ethernet adapters, and transceivers configured to communicate via a wireless telephone network.
The I/O interface 340 includes software and hardware configured to facilitate communication with one or more interfaces of the client device 40, including the display screen 350, as well as other interfaces such as tactile buttons, switches and/or toggle switches, microphones, speakers, and connection ports. Display screen 350 may be an LED screen or any of a variety of other screens suitable for personal electronic devices. I/O interface 340 is in communication with display screen 350 and is configured to visually display graphics, text, and other data to a user via display screen 350.
The memory 320 of the client device 40 is configured to store information including both data and instructions. The memory 320 may be any type of device capable of storing information accessible by the processor 310, such as a memory card, ROM, RAM, writable memory, read-only memory, hard drive, magnetic disk, flash memory, or any of a variety of other computer-readable media serving as data storage devices as will be appreciated by one of ordinary skill in the art. In at least one embodiment, memory 320 includes user data 360, which includes various types of information relating to one or more users, and in particular, one or more administrative service workers. User data 360 may include user and/or employee profiles, work schedules, and the like.
In particular, in one embodiment, the processor 310 of the client device 40 is configured to operate the transceiver 330 to transmit an information request to the server 30, which may include a unique employee identification number or the like. In response to the information request, processor 210 of server 30 is configured to transmit data that the user is authorized to view back to client device 40 via network communication module 240. In particular, in some embodiments, in response to the information request, processor 210 is configured to transmit data to client device 40, such as: the number of people occupying each of the plurality of places at the current time or at a previous time, the estimated remaining supply volume at each place, the most recent time that the administrative service workers provided service to each place, the next time that each place should be provided service next, and/or a service schedule generated for one or more places.
In at least one embodiment, processor 310 of client device 40 is configured to receive input from a user via I/O interface 340 indicating that the user has provided service for a particular location. In response to input indicating that a particular venue has been serviced, the processor 310 is configured to operate the transceiver 330 to transmit a message to the server 30 indicating that the venue has been serviced, which message may include a timestamp and an identifier for the particular venue. In response to the message to the server 30 indicating that the location has been serviced, the processor 210 of the server 30 is configured to record a timestamp in the memory 220 as the most recent time that the location has been serviced. In one embodiment, the processor 210 of the server 30 is configured to forward the message to the corresponding occupancy sensing device 20 at the venue, and the processor 120 of the occupancy sensing device 20 is configured to record the timestamp in the memory 130.
In at least one embodiment, the processor 310 of the client device 40 is configured to receive input from a user via the I/O interface 340 indicating how many supplies were needed to re-stock a particular site at the time of service and/or how many supplies actually remained. The processor 310 is configured to include this re-supply information with the message to the server 30 indicating that the site has been serviced. The processor 210 of the server 30 is configured to compare the reported re-supply information to expected values based on the observed usage and occupancy and on the estimated supply quantities remaining at each site. If there is a mismatch between the reported re-supply information and the expected values, there may have been theft of supplies, and the processor 210 is configured to generate an alarm or otherwise flag the possible theft.
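The mismatch check above can be sketched as a relative-discrepancy test between the worker's reported re-stock amount and the server's usage-based expectation. The tolerance value and function name are assumptions for illustration; the patent does not specify how large a mismatch must be before an alarm is raised.

```python
def flag_possible_theft(reported_restock, expected_usage, tolerance=0.25):
    """Return True when the reported re-supply exceeds the expected usage
    by more than the given relative tolerance (illustrative threshold)."""
    if expected_usage <= 0:
        # Nothing should have been consumed; any re-stock is suspicious.
        return reported_restock > 0
    return (reported_restock - expected_usage) / expected_usage > tolerance
```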
FIG. 6 illustrates a logic flow diagram of an exemplary method 400 for operating the occupancy sensing device 20 to monitor the flow of people through the entryway and the occupancy of the corresponding location. In the description of the method, the statement that the method is performing a certain task or function refers to: a controller or general purpose processor (e.g., processor 120) that executes programming instructions (e.g., occupancy sensing program 180) stored in a non-transitory computer-readable storage medium (e.g., memory 130) operatively connected to the controller or processor to manipulate data or operate one or more components in the occupancy sensing device 20 to perform tasks or functions. Additionally, the steps of the method may be performed in any feasible temporal order, regardless of the order shown in the figures or the order in which the steps are described.
In step 410, the processor 120 receives depth sensor data from the depth sensor 110. As discussed above, the data provided by the depth sensor 110 may be in the form of image frames. In particular, the processor 120 is configured to receive a chronological sequence of image frames, each image frame comprising depth sensor data detected by the depth sensor 110 at a respective time. Each pixel in each depth image frame provides a distance from the depth sensor 110 to the nearest object. In one embodiment, the depth sensor 110 is configured to output image frames at a defined resolution and at a defined frame rate (e.g., 30 frames per second).
In step 420, the processor 120 is configured to pre-process the received depth sensor data. In particular, in the presence of noise, the corresponding pixels of the depth image frame may have a value of 0 or some other outlier value. Thus, in one embodiment, the processor 120 is configured to reset the noise pixels and outliers (depths exceeding a predetermined threshold) to the depth of the floor 18. In one embodiment, the processor 120 is configured to calculate the depth of the floor 18 by computing a histogram of the depth values of the frame, where the bin having the greatest number of depth data points is considered the floor. Additionally, in some embodiments, the processor 120 is configured to apply median filtering to smooth each depth image frame. Fig. 7 shows an exemplary pre-processed depth image frame 500 having a person 502 entering through an entrance passageway and a person 504 exiting through an exit passageway.
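The floor-estimation and denoising steps above can be sketched as follows, treating a depth frame as a flat list of millimetre values for simplicity. The 100 mm bin size and the outlier cutoff are illustrative choices; the patent specifies only that the most populated histogram bin is taken as the floor and that noise pixels are reset to the floor depth.

```python
def estimate_floor_depth(frame, bin_mm=100):
    """Floor depth = midpoint of the most populated depth-histogram bin."""
    bins = {}
    for d in frame:
        if d > 0:  # ignore zero-valued noise pixels
            bins[d // bin_mm] = bins.get(d // bin_mm, 0) + 1
    best = max(bins, key=bins.get)       # bin with the most depth points
    return best * bin_mm + bin_mm // 2   # bin midpoint as floor depth


def denoise(frame, floor_depth, max_valid=5000):
    """Reset zero pixels and outliers beyond max_valid to the floor depth."""
    return [floor_depth if d == 0 or d > max_valid else d for d in frame]
```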
With continued reference to fig. 6, in steps 430 and 440, the processor 120 is configured to perform a multi-level scan in which a plurality of potential depth levels are scanned to detect a human and, for each scanned depth level, the contours of potential human heads are extracted by ignoring depth data below that level. In particular, the goal of this step is to determine the center and radius of the minimum enclosing circle of each potential head. To this end, the processor 120 is configured to perform a multi-level scan and determine the center and radius of the head by detecting contours at different height levels. In one embodiment, the processor 120 is configured to scan the depth sensor data at a predetermined interval (e.g., every 6 inches) from a first height (e.g., 6 feet from the floor) to a second height (e.g., 2 feet from the floor). Note that the average height of an adult male is about 5 feet 7 inches to 5 feet 11 inches, and the average height of an adult female is about 5 feet 2 inches to 5 feet 7 inches; the starting and ending heights are conservatively chosen so that humans are not missed during the multi-level scan. As the processor 120 scans the depth data at each height level, the processor 120 is configured to discard all depth data below that height level. The processor 120 is configured to find the complete contours at each depth level. For each contour, the processor 120 is configured to use an iterative algorithm to find the minimum enclosing circle (such as the circle 506 shown in fig. 7). The center and radius of the minimum enclosing circle are taken as the center and radius of the head. As discussed below, for each detected center and radius, the processor 120 is configured to verify that it is a person by verifying the presence of a head and shoulders. However, a single person may be detected at multiple levels. To avoid this, the processor 120 is configured to scan from the top and, when it verifies a person at a higher level, discard all nearby centers at lower levels.
In some embodiments, processor 120 utilizes two policies to accelerate processing. First, when performing a multi-level scan, the processor 120 is configured to execute it out of order. Instead of scanning from top (e.g., 6 feet from the floor) to bottom (2 feet from the floor) in serial order, the processor 120 is configured to scan first at the top-most level, and then at the bottom-most level, and then at the remaining levels. The intuition is that if there is someone there, the depth sensor 110 should capture the body while scanning at the bottom level. If the bottom level scan returns that nothing is there compared to the approximate background (described below), the processor 120 is configured to continue processing the next depth image frame. Otherwise, the processor 120 is configured to scan the remaining levels in serial order (top to bottom) to determine the precise location of the head. Second, the processor 120 is configured not to scan at levels that do not have sufficient depth data compared to the approximate background. The processor 120 is configured to determine the approximate background by constructing histograms of depth data points at different scan levels (e.g., 6 inch bin sizes). Each time processor 120 sees a new frame, processor 120 is configured to update the histogram with the assumption that: the minimum number of depth data points seen at one level to date are from the background, which reasonably captures walls, doors, tables, etc. in the environment. This approximate background detection technique enables processor 120 to quickly continue moving to the next frame when there is no person in the scene.
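The out-of-order scan schedule and the cheap presence check described above can be sketched as follows. Heights are in inches with the 6-inch spacing from the text; the skip margin is an illustrative assumption, since the patent only says a frame is skipped when the bottom-level scan shows nothing beyond the approximate background.

```python
def scan_order(top=72, bottom=24, step=6):
    """Top-most level first, then bottom-most (presence check), then the
    remaining levels in top-to-bottom order."""
    levels = list(range(top, bottom - 1, -step))  # 72, 66, ..., 24
    middle = levels[1:-1]
    return [levels[0], levels[-1]] + middle


def should_skip_frame(bottom_level_points, background_points, margin=50):
    """Skip the frame when the bottom-level scan adds nothing over the
    approximate background (margin is an illustrative tolerance)."""
    return bottom_level_points <= background_points + margin
```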
In steps 450 and 460, the processor 120 is configured to verify whether each extracted contour represents a real person by verifying the presence of the head and shoulders using anthropometric properties of the human body. In particular, in step 450, given the center of the head (c_x, c_y) and a radius r, the purpose of this step is to verify whether there is a human head at this position. The processor 120 is configured to model the human head using a semi-ellipsoid (the upper half of an ellipsoid). In particular, an ellipsoid in Cartesian coordinates is represented by the equation (x − c_x)²/a² + (y − c_y)²/b² + (z − c_z)²/c² = 1, where a, b and c are the lengths of the semi-axes, and (c_x, c_y, c_z) is the center of the ellipsoid. The processor 120 is configured to set a = b = r (in pixel coordinates) and c = 0.5 × D (in depth coordinates), where D is the depth of a human head (e.g., 220 mm). The processor 120 is configured to set c_z = T + 0.5 × D, where T is the minimum distance between the depth sensor 110 and the head. The processor 120 is configured to iterate over the x and y values of the detected contour, calculate the z value for each (x, y), and compare it to the corresponding z value in the depth frame. If the average difference is less than a threshold T_head, then the processor 120 is configured to report that a head is detected.
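The semi-ellipsoid head check above can be sketched as follows: with a = b = r and c = 0.5 × D, the model depth for each pixel inside the circle is z = c_z − c·sqrt(1 − ((x − c_x)² + (y − c_y)²)/r²), and the mean absolute error against the measured depth is compared to a threshold. The threshold value and the depth-lookup callback are illustrative assumptions.

```python
import math

def verify_head(depth_at, cx, cy, r, T, D=220.0, t_head=25.0):
    """depth_at(x, y) returns the measured depth in mm; True if a
    semi-ellipsoidal head fits at (cx, cy) with radius r."""
    c = 0.5 * D          # semi-axis along the depth direction
    cz = T + c           # ellipsoid center depth
    errors = []
    for x in range(cx - r, cx + r + 1):
        for y in range(cy - r, cy + r + 1):
            q = ((x - cx) ** 2 + (y - cy) ** 2) / float(r * r)
            if q <= 1.0:  # pixel lies inside the detected circle
                model_z = cz - c * math.sqrt(1.0 - q)
                errors.append(abs(model_z - depth_at(x, y)))
    return sum(errors) / len(errors) < t_head
```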
In step 460, given the center of the head (c_x, c_y) and a radius r, the purpose of this step is to verify whether there is a human shoulder near this position. To verify the shoulder, the processor 120 is configured to perform four steps. First, the processor 120 is configured to consider a region of interest (ROI) around the head and shoulders. The end-to-end distance between a person's two shoulders is approximately three times his head diameter, and thus the processor 120 is configured to select a slightly larger square ROI around the head. Second, the processor 120 is configured to extract the head from the ROI by discarding all depth data above T + D (calculated in the head verification step). Third, the processor 120 is configured to subtract the head from the region of interest to obtain shoulder depth data. Note that starting from the first step, the processor 120 is configured to discard all depth data above T + D + S by setting these values to 0, where S is the depth of the shoulder. In one embodiment, the processor 120 is configured to set S to 250 mm, as a depth of 10 inches is sufficient to reasonably capture the shoulder. Fourth, the processor 120 is configured to determine whether the obtained depth data conforms to a shoulder by attempting several techniques. For example, in one embodiment, the processor 120 is configured to detect the contour and measure its goodness of fit to an ellipsoid. In another embodiment, the processor 120 is configured to compute histograms of depth data at different height levels and check whether there is at least one bin at the shoulder depth level around the head position with enough depth data points to represent a shoulder. If no shoulder is present, such as for a ball, the depth data at that location will be near the floor level, and bins at the shoulder level will not have enough depth data points.
The purpose of shoulder verification is to avoid spherical objects such as balls, balloons and spherical lamps. For counting people, head verification is usually sufficient. However, the shoulder size is a useful characteristic for identifying and tracking an occupant.
In step 470, the processor 120 is configured to determine the position of the door. In particular, in one embodiment, the processor 120 is configured to automatically determine the position of the door in six steps. First, starting with a pre-processed image, the processor 120 is configured to perform median filtering with kernel size 5. Second, the processor 120 is configured to discard depth data that is very close to the ground (within 1 foot) or more than 2 feet above the ground by replacing these depth data with the maximum floor depth. Third, the processor 120 is configured to perform Canny edge detection to increase contrast and reduce noise. Fourth, the processor 120 is configured to perform a Hough line transformation on the Canny edges to detect straight lines. Although Canny edge detection and the Hough line transformation are computationally expensive, they do not degrade real-time performance because door detection is only performed at the beginning. Fifth, from the candidate Hough lines, the processor 120 is configured to select the line that is most parallel to the X-axis of the depth frame and has the highest accumulator vote (e.g., the line gate1 shown in fig. 7).
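The final line-selection step above can be sketched as follows, given candidate Hough lines as (rho, theta, votes) tuples in the usual convention where a horizontal line has theta near π/2. The tuple layout, the angle tolerance, and the tie-breaking order are illustrative assumptions.

```python
import math

def select_door_line(candidates, angle_tolerance=0.2):
    """candidates: list of (rho, theta, votes) Hough lines.
    Returns the line most parallel to the X-axis, ties broken by votes."""
    horizontal = [c for c in candidates
                  if abs(c[1] - math.pi / 2) < angle_tolerance]
    if not horizontal:
        return None
    # Most parallel to the X-axis first, then the highest accumulator vote.
    return min(horizontal,
               key=lambda c: (abs(c[1] - math.pi / 2), -c[2]))
```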
In step 480, the processor 120 is configured to track individuals entering and leaving through the entryway. The processor 120 performs two types of tracking: (i) basic tracking, to determine whether a person enters or exits through the door so as to accurately count them; and (ii) biometric tracking, to identify and track individuals. The processor 120 is configured to implement a lightweight greedy bipartite matching algorithm by utilizing each person's position, height, and head radius. Assume that the processor 120 detects N and M persons in the previous and current frames, respectively. For each pair of persons (i, j), where i ∈ {1, …, N} and j ∈ {1, …, M}, the processor 120 is configured to normalize the distance between the head centers, the difference in head radii, and the difference in height of the pair. The processor 120 is then configured to use these three distances to calculate a weighted distance (e.g., with weights of 1, 1, and 0.5, respectively). The reason for the smaller weight on the height difference is the observation that, when a person walks from side to side, his or her measured height varies by up to 40 millimeters. The processor 120 is then configured to sort the distances in ascending order and pair persons in that order. If someone in the current frame is not paired, the processor 120 is configured to add him as a new person. However, if someone in the previous frame is not paired, the processor 120 is configured not to discard him immediately, since it is possible that the depth sensor 110 may miss someone in one frame and detect him in the next frame. For the missing person, the processor 120 is configured to predict the current position of the person based on his average walking speed and direction, and to update the position of his head center accordingly. To do this, the processor 120 is configured to update the average walking speed and direction of each person whenever there is a pairing.
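The greedy matching described above can be sketched as follows: all pairwise weighted distances over head-center position, head radius, and height are computed (using the weights 1, 1, 0.5 from the text's example), sorted in ascending order, and accepted greedily so that each person is matched at most once. The tuple layout and function name are illustrative.

```python
def greedy_match(prev, curr, w=(1.0, 1.0, 0.5)):
    """prev/curr: lists of (center_x, center_y, radius, height) tuples.
    Returns a list of (i, j) index pairs matching prev[i] to curr[j]."""
    scored = []
    for i, p in enumerate(prev):
        for j, c in enumerate(curr):
            center_d = ((p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2) ** 0.5
            d = (w[0] * center_d
                 + w[1] * abs(p[2] - c[2])    # head radius difference
                 + w[2] * abs(p[3] - c[3]))   # height difference, down-weighted
            scored.append((d, i, j))
    scored.sort()  # smallest weighted distance first
    used_i, used_j, pairs = set(), set(), []
    for d, i, j in scored:
        if i not in used_i and j not in used_j:
            pairs.append((i, j))
            used_i.add(i)
            used_j.add(j)
    return pairs
```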
At low frame rates, a person may move a considerable distance between consecutive frames, which can negatively affect tracking, for example, when someone (P1) leaves through the door and another person (P2) enters from the other side of the door in the next frame. Because the head of P1 is missing in the current frame, greedy bipartite matching attempts to match P1 of the earlier frame with P2 of the current frame. It may then appear as if P1 moved in the opposite direction, which may erroneously increment or decrement the occupancy count. To avoid this, the processor 120 is configured to consider the walking direction, and if a match would require a direction reversal, the processor 120 is configured to check whether there is a depth shadow of P1 in the current and previous frames at his respective predicted positions. As used herein, a depth shadow means that the head is missing but a partial body contour is visible near that location. If there is a depth shadow, the processor 120 is configured to assume that P1 is still there when P2 comes in, and the match is not allowed.
Whenever someone enters or exits through the entryway, the processor 120 is configured to extract a number of simple features regarding the subject's height, head radius, shoulder size, entry/exit direction, and walking speed. More specifically, for height, the processor 120 is configured to extract several features from the depth data as the person passes through door₁, including the minimum, maximum, average, and exact height while crossing the door, and the overall minimum, maximum, average, and median height during the entry/exit event. Similar features regarding head radius and shoulder size are extracted. The processor 120 is configured to match these features to identify individuals.
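As an illustration, the per-event height features and the entry/exit matching described above might look like the following sketch. The reduced feature set and the L1 comparison are assumptions made here for brevity, not the patent's exact matching rule.

```python
import statistics

def height_features(samples):
    """Summarize height samples collected during one entry/exit event
    (sketch: a subset of the features named in the text)."""
    return {"min": min(samples), "max": max(samples),
            "mean": statistics.fmean(samples),
            "median": statistics.median(samples)}

def match_exit_to_entry(exit_feats, entry_feats_list,
                        keys=("min", "max", "mean", "median")):
    """Pair an exit event with the stored entry event whose features are
    closest under an L1 distance; returns the index of the best entry."""
    def dist(f):
        return sum(abs(f[k] - exit_feats[k]) for k in keys)
    return min(range(len(entry_feats_list)),
               key=lambda i: dist(entry_feats_list[i]))
```

Matching an exit event back to its entry event in this way is what lets the system compute a per-person occupancy duration, as recited in claim 1.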
In step 490, the processor 120 is configured to determine a count of people currently occupying the venue. In particular, for each frame, for each person i within the frame, where 1 ≤ i ≤ N, the processor 120 is configured to determine an indicator D_i: D_i is 1 if the person is outside door₁, and 0 otherwise. If a person's D_i changes from 1 (at the previous frame) to 0 (at the current frame), the processor 120 is configured to increment the occupancy count. The processor 120 is configured to record the direction of the person, and if his D_j (j ≠ i) then changes from 1 to 0, the processor 120 is configured not to increment the count again. However, if D_i or D_j then changes from 0 to 1, the processor 120 is configured to decrement the occupancy count and thereafter to ignore similar changes (0 to 1).
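The counting rule of step 490 can be sketched per tracked identity as follows. The `OccupancyCounter` class and its state encoding are assumptions for illustration: the D_i/D_j bookkeeping is simplified to one entered/exited flag pair per tracked person, so that the first 1 → 0 transition increments the count once and a later 0 → 1 transition decrements it once, with similar subsequent changes ignored.

```python
class OccupancyCounter:
    """Sketch of the per-person door-indicator counting logic."""

    def __init__(self):
        self.count = 0
        self.state = {}  # person id -> (last D, entered flag, exited flag)

    def update(self, person_id, d_outside):
        """Feed one frame's indicator D (1 = outside the door, 0 = inside)
        for one person; returns the current occupancy count."""
        prev = self.state.get(person_id)
        if prev is None:
            self.state[person_id] = (d_outside, False, False)
            return self.count
        last, entered, exited = prev
        if last == 1 and d_outside == 0 and not entered:
            self.count += 1      # first 1 -> 0 transition: person entered
            entered = True
        elif last == 0 and d_outside == 1 and entered and not exited:
            self.count -= 1      # first 0 -> 1 transition: person exited
            exited = True        # later similar changes are ignored
        self.state[person_id] = (d_outside, entered, exited)
        return self.count
```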
The embodiments described above have been shown by way of example, and it should be understood that they may be susceptible to various modifications and alternative forms. It should be further understood that the claims are not intended to be limited to the particular forms disclosed, but rather to cover all modifications, equivalents, and alternatives falling within the spirit and scope of this disclosure.
While this patent has been described with reference to various embodiments, it will be understood that these embodiments are illustrative and that the scope of the disclosure is not limited to them. Many variations, modifications, additions, and improvements are possible. More generally, embodiments according to this patent have been described in the context of particular embodiments. Functionality may be separated or combined differently in blocks, or described using different terminology, in various embodiments of the disclosure. These and other variations, modifications, additions, and improvements may fall within the scope of the disclosure as defined in the claims that follow.
Claims (20)
1. An occupancy sensing system, comprising:
a plurality of occupancy sensors, each occupancy sensor mounted at an entryway to a respective venue of a plurality of venues, each occupancy sensor having a depth sensor facing the entryway to the respective venue and configured to provide depth sensor data, each occupancy sensor configured (i) to perform a multi-level scanning process to extract a depth profile of a body of each respective person as the respective person enters and exits the respective venue through the entryway and to determine a height, a head radius, and a shoulder size of the respective person based on the extracted depth profile, and (ii) to determine an occupancy duration for each person entering the respective venue by matching the respective person's entry event to the respective person's exit event based on the determined heights, head radii, and shoulder sizes of the entry and exit events; and
a server communicatively coupled to the plurality of occupancy sensors and configured to receive data from the plurality of occupancy sensors, the data indicating a number of people occupying each of the plurality of places at a plurality of different times.
2. The occupancy sensing system of claim 1, wherein:
each occupancy sensor of the plurality of occupancy sensors is configured to: as each respective person enters and exits the respective location through the entryway, (i) verifying that the extracted depth profile includes a human head by comparing the extracted depth profile to an ellipsoid, and (ii) verifying that the extracted depth profile includes a human shoulder by discarding the extracted depth profile corresponding to the head and checking whether the remaining extracted depth profiles conform to the shoulder.
3. The occupancy sensing system of claim 2, wherein the server is configured to: for each person entering a location of the plurality of locations, estimating usage of a supply by the respective person at the respective location based on an occupancy time of the respective person.
4. The occupancy sensing system of claim 3, wherein the server is configured to: for each person entering a location of the plurality of locations, (i) in response to the occupancy time of the respective person being less than a predetermined threshold duration, estimating a first supply volume used by the respective person at the respective location, and (ii) in response to the occupancy time of the respective person being greater than the predetermined threshold duration, estimating a second supply volume used by the respective person at the respective location.
5. The occupancy sensing system of claim 1, wherein the server is configured to: for each of the plurality of sites, storing in memory a most recent time that the administrative service worker provided service for the respective site.
6. The occupancy sensing system of claim 5, wherein the server is configured to: for each of the plurality of sites, determining a next time at which the administrative service worker should next provide service for the respective site based on the estimated total supply usage at the respective site since the most recent time.
7. The occupancy sensing system of claim 5, wherein the server is configured to estimate the total supply usage at each of the plurality of places since the most recent time based on a number of people occupying each of the plurality of places at a plurality of different times.
8. The occupancy sensing system of claim 5, wherein the server is configured to: for each of the plurality of sites, storing in memory an amount of supply at the site at a most recent time that the administrative service worker provided service for the respective site.
9. The occupancy sensing system of claim 8, wherein the server is configured to: for each of the plurality of sites, estimating a remaining supply volume at the respective site based on the estimated supply usage at the respective site since the most recent time and the supply volume at the site at the most recent time.
10. The occupancy sensing system of claim 5, wherein the server is configured to receive data from a personal electronic device communicatively coupled to the server, the data indicating a most recent time that the administrative service worker provided service for a particular location of the plurality of locations.
11. The occupancy sensing system of claim 10, wherein the server is configured to receive data from the personal electronic device, the data indicating actual supply amounts replenished during the most recent time that the administrative service worker provided service to the particular one of the plurality of sites.
12. The occupancy sensing system of claim 11, wherein the server is configured to:
determining an expected supply amount required to replenish supplies at the particular one of the plurality of sites;
comparing the desired supply amount required with the actual supply amount replenished; and
theft of the supply is detected based on the comparison.
13. The occupancy sensing system of claim 1, wherein the server is configured to: (i) determine a usage pattern for each of the plurality of places based on a number of people occupying each of the plurality of places at a plurality of different times, and (ii) generate a service schedule for each of the plurality of places based on the respective usage pattern for each of the plurality of places.
14. The occupancy sensing system of claim 1, wherein the server is configured to transmit data to a personal electronic device communicatively coupled to the server, the data indicating a number of people occupying each of the plurality of places at a current time.
15. The occupancy sensing system of claim 5, wherein the server is configured to transmit data to a personal electronic device communicatively coupled to the server, the data indicating a most recent time that the administrative service worker provided service for a particular location of the plurality of locations.
16. The occupancy sensing system of claim 6, wherein the server is configured to transmit data to a personal electronic device communicatively coupled to the server, the data indicating a next time at which the administrative service worker should next provide service for a particular location of the plurality of locations.
17. The occupancy sensing system of claim 9, wherein the server is configured to transmit data to a personal electronic device communicatively coupled to the server, the data indicating the estimated remaining supply at a particular one of the plurality of sites.
18. The occupancy sensing system of claim 13, wherein the server is configured to transmit data to a personal electronic device communicatively coupled to the server, the data indicating a service schedule generated for at least one of the plurality of venues.
19. The occupancy sensing system of claim 1, wherein each of the plurality of occupancy sensors includes a display configured to display a number of people occupying the respective location.
20. The occupancy sensing system of claim 1, wherein at least one of the plurality of venues is a restroom.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862689484P | 2018-06-25 | 2018-06-25 | |
US62/689484 | 2018-06-25 | ||
US16/148,206 US11367041B2 (en) | 2018-06-25 | 2018-10-01 | Occupancy sensing system for custodial services management |
US16/148206 | 2018-10-01 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110634209A CN110634209A (en) | 2019-12-31 |
CN110634209B true CN110634209B (en) | 2023-03-14 |
Family
ID=68886414
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910548911.0A Active CN110634209B (en) | 2018-06-25 | 2019-06-24 | Occupancy sensing system for supervisory service management |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110634209B (en) |
DE (1) | DE102019209139A1 (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101861606A (en) * | 2007-09-19 | 2010-10-13 | 联合工艺公司 | System and method for occupancy estimation |
CN101911108A (en) * | 2007-11-05 | 2010-12-08 | 斯洛文阀门公司 | Restroom convenience center |
CN102621886A (en) * | 2012-02-15 | 2012-08-01 | 清华大学 | Method for controlling energy equipment according to regional population distribution in building |
CN104731048A (en) * | 2013-09-23 | 2015-06-24 | 中山润涛商贸有限公司 | Washroom monitoring and consumable supply management system |
US9408041B1 (en) * | 2015-04-24 | 2016-08-02 | Insensi, Inc. | Premise occupancy detection based on smartphone presence |
CN106407317A (en) * | 2016-08-31 | 2017-02-15 | 乐视控股(北京)有限公司 | Washroom information acquisition method and device |
WO2017114846A1 (en) * | 2015-12-28 | 2017-07-06 | Robert Bosch Gmbh | Depth sensing based system for detecting, tracking, estimating, and identifying occupancy in real-time |
CN107688897A (en) * | 2017-08-16 | 2018-02-13 | 成都新东冠智慧科技有限公司 | Management method and system between a kind of tourist attraction Intelligent sanitary |
2019
- 2019-06-24 CN CN201910548911.0A patent/CN110634209B/en active Active
- 2019-06-25 DE DE102019209139.3A patent/DE102019209139A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
DE102019209139A1 (en) | 2020-01-02 |
CN110634209A (en) | 2019-12-31 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||