CN115398464A - Identifying, reducing health risks in a facility and tracking occupancy of a facility - Google Patents


Info

Publication number
CN115398464A
Authority
CN
China
Prior art keywords
sensor
facility
individual
sensor system
local network
Prior art date
Legal status
Withdrawn
Application number
CN202180024055.4A
Other languages
Chinese (zh)
Inventor
N·特里卡
R·P·穆尔普里
A·古普塔
T·马克尔
E·普斯
K·易卜拉希米
A·达亚尔
J·K·拉斯姆斯-沃拉斯
R·M·马丁森
A·马利克
A·M·史密斯
P·I·I·O·麦克诺顿
Current Assignee
View Inc
Original Assignee
View Inc
Priority date
Priority claimed from PCT/US2021/015378 (published as WO2021154915A1)
Application filed by View Inc filed Critical View Inc
Publication of CN115398464A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10: Services
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1113: Local tracking of patients, e.g. in a hospital or private home
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 19/00: Alarms responsive to two or more different undesired or abnormal conditions, e.g. burglary and fire, abnormal temperature and abnormal rate of flow
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00: Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/02: Alarms for ensuring the safety of persons
    • G08B 21/12: Alarms for ensuring the safety of persons responsive to undesired emission of substances, e.g. pollution alarms
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00: Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/18: Status alarms
    • G08B 21/22: Status alarms responsive to presence or absence of persons

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Tourism & Hospitality (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physiology (AREA)
  • Biophysics (AREA)
  • Primary Health Care (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Marketing (AREA)
  • Theoretical Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Economics (AREA)
  • Strategic Management (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Alarm Systems (AREA)
  • Arrangements For Transmission Of Measured Signals (AREA)
  • Telephonic Communication Services (AREA)

Abstract

Disclosed herein are methods, devices, non-transitory computer-readable media, and systems for reducing and/or identifying one or more health risks in a facility, for example by sensing a physical characteristic of an individual in the facility, by sensing at least one environmental characteristic, by sensing surface cleanliness, by tracking personnel in the facility, or by suggesting routes in the peripheral structure based at least in part on the concentration of people in the facility. Also disclosed herein are methods, devices, non-transitory computer-readable media, and systems for monitoring occupancy of a facility.

Description

Identifying, reducing health risks in a facility and tracking occupancy of a facility
RELATED APPLICATIONS
The present application claims priority to U.S. provisional patent application Ser. No. 63/159,814, entitled "IDENTIFYING, REDUCING HEALTH RISKS, AND TRACKING OCCUPANCY IN A FACILITY," filed March 11, 2021; U.S. provisional patent application Ser. No. 63/115,886, entitled "IDENTIFYING AND REDUCING HEALTH RISKS IN A FACILITY," filed November 19, 2020; U.S. provisional patent application Ser. No. 63/041,002, entitled "SENSING ABNORMAL BODY CHARACTERISTICS OF ENCLOSURE OCCUPANTS," filed June 18, 2020; and U.S. provisional patent application Ser. No. 62/993,617, entitled "SENSING ABNORMAL BODY CHARACTERISTICS OF ENCLOSURE OCCUPANTS," filed March 23, 2020. This application also claims priority to international patent application Ser. No. PCT/US21/15378, entitled "SENSOR CALIBRATION AND OPERATION," filed January 28, 2021, which claims priority to U.S. provisional patent application Ser. No. 62/967,204, entitled "SENSOR CALIBRATION AND OPERATION," filed January 29, 2020. International patent application Ser. No. PCT/US21/15378 is also a continuation-in-part of U.S. patent application Ser. No. 17/083,128, entitled "BUILDING NETWORK," filed October 28, 2020, which is a continuation-in-part of U.S. patent application Ser. No. 16/664,089, entitled "BUILDING NETWORK," filed October 25, 2019. U.S. patent application Ser. No. 17/083,128 is also a continuation-in-part of international patent application Ser. No. PCT/US19/30467, entitled "EDGE NETWORK FOR BUILDING SERVICES," filed May 2, 2019, which claims priority to U.S. provisional patent application Ser. No. 62/666,033, entitled "EDGE NETWORK FOR BUILDING SERVICES," filed May 2, 2018. U.S. patent application Ser. No. 17/083,128 is also a continuation-in-part of international patent application Ser. No. PCT/US18/29460, entitled "TINTABLE WINDOW SYSTEM FOR BUILDING SERVICES," filed April 25, 2018, which claims priority to U.S. provisional patent application Ser. No. 62/607,618, filed December 19, 2017; U.S. provisional patent application Ser. No. 62/523,606, filed June 22, 2017; U.S. provisional patent application Ser. No. 62/507,704, filed May 17, 2017; U.S. provisional patent application Ser. No. 62/506,514, filed May 15, 2017; and U.S. provisional patent application Ser. No. 62/490,457, filed April 26, 2017. International patent application Ser. No. PCT/US21/15378 is also a continuation-in-part of U.S. patent application Ser. No. 16/447,169, entitled "SENSING AND COMMUNICATIONS UNIT FOR OPTICAL SWITCH WINDOW SYSTEMS," filed June 20, 2019, which claims priority to U.S. provisional patent application Ser. No. 62/858,100, entitled "SENSING AND COMMUNICATIONS UNIT FOR OPTICAL SWITCH WINDOW SYSTEMS," filed June 6, 2019. U.S. patent application Ser. No. 16/447,169 also claims priority to U.S. provisional patent application Ser. No. 62/803,324, entitled "SENSING AND COMMUNICATIONS UNIT FOR OPTICAL SWITCH WINDOW SYSTEMS," filed February 8, 2019; U.S. provisional patent application Ser. No. 62/768,775, entitled "SENSING AND COMMUNICATIONS UNIT FOR OPTICAL SWITCH WINDOW SYSTEMS," filed November 16, 2018; U.S. provisional patent application Ser. No. 62/688,957, entitled "SENSING AND COMMUNICATIONS UNIT FOR OPTICAL SWITCH WINDOW SYSTEMS," filed June 22, 2018; and U.S. provisional patent application Ser. No. 62/666,033. U.S. patent application Ser. No. 16/447,169 is also a continuation-in-part of international patent application Ser. No. PCT/US19/30467. Each of the above-listed patent applications is incorporated herein by reference in its entirety.
Background
A sensor may be configured (e.g., designed) to measure one or more environmental characteristics, such as temperature, humidity, ambient noise, carbon dioxide, and/or other aspects of the ambient environment. It may be desirable to know whether one or more physical characteristics of an individual are abnormal or even indicative of a disease. For example, it may be beneficial to know whether an individual has a higher than normal body temperature, in various environments: in a private environment such as a home, or in a non-private environment such as a workplace, hospital, airport, restaurant, or correctional facility (e.g., a jail or prison). This need may arise in view of any potential harm the affected individual may inflict on others. Individuals may test themselves for fever when they perceive or suspect an increase in their body temperature, for example when they are noticeably uncomfortable. In some environments, an organization may have an incentive to prevent a sick occupant from entering its premises. For example, an industrial kitchen may wish to prevent a sick employee from processing any food, such as food distributed to the public. Detecting sick individuals (e.g., individuals with abnormal characteristics) can sometimes be challenging, for example due to the individual's level of awareness and/or reluctance to disclose such information.
Accurately detecting an abnormal physical characteristic (e.g., fever) of an individual can be challenging when that characteristic varies (e.g., varies widely) across a population. Sensors deployed to measure body characteristics may be uncalibrated and/or may require daily calibration, which can be expensive and/or inconvenient. Furthermore, certain physical characteristics have natural variations that do not indicate an abnormality. Once an anomaly is detected, additional challenges arise in tracking the interactions of the affected individual and notifying potentially exposed individuals for purposes of contact tracing, while maintaining the privacy and security of the data. At times, it may be advantageous to provide a route that reduces an individual's exposure to other occupants in the peripheral structure, for example to reduce the individual's risk of infection.
Disclosure of Invention
Various aspects disclosed herein mitigate at least a portion of one or more disadvantages associated with physical characteristics of an individual.
Various aspects disclosed herein relate to one or more sensors disposed in an environment having occupants. The sensors sense various characteristics of the environment, such as environmental characteristics affected by individuals in the facility (e.g., having a peripheral structure). For example, temperature sensors, carbon dioxide sensors, humidity sensors, and/or sensors that sense volatile organic compounds may be affected by the presence of individuals in the facility. The environment in the facility may be mapped by the sensors in the facility. A characteristic of the environment may have a gradient, for example across the environment and/or due to the presence of individuals in the environment. For example, a thermal sensor array may show trends in an environmental characteristic. The environment may be an ordinary environment, such as a workplace, an airport, or a restaurant. The environment may be a dedicated inspection environment, for example one allowing a single individual to be tested therein. During the test, the individual may be stationary or non-stationary. For example, an individual may traverse the sensed environment during a test.
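The mapping of an environmental gradient by a sensor array, described above, can be illustrated by the following sketch. It is a minimal, hypothetical example (grid layout, values, and function names are illustrative assumptions, not part of the disclosure): the warmest cell of a small temperature grid is located, and its deviation from the remaining cells gives the gradient that may indicate an occupant's presence.

```python
# Hypothetical sketch: locate the peak of a sensed environmental
# characteristic (here, temperature in degrees C) over a grid of sensors.
# A pronounced local gradient may indicate the presence of an occupant.

def locate_peak(readings):
    """readings: dict mapping (row, col) grid position -> sensed value.
    Returns the position with the maximum value and the gradient, i.e.
    the difference between the peak and the mean of the other cells."""
    peak_pos = max(readings, key=readings.get)
    peak_val = readings[peak_pos]
    others = [v for p, v in readings.items() if p != peak_pos]
    gradient = peak_val - sum(others) / len(others)
    return peak_pos, gradient

# 3x3 thermal sensor array with a warm spot near the center cell (1, 1):
grid = {
    (0, 0): 21.0, (0, 1): 21.2, (0, 2): 21.1,
    (1, 0): 21.3, (1, 1): 23.8, (1, 2): 21.4,
    (2, 0): 21.1, (2, 1): 21.5, (2, 2): 21.0,
}
pos, grad = locate_peak(grid)  # peak at (1, 1), about 2.6 degrees above ambient
```

A real deployment would of course use sensor identifiers and positions from the facility's local network rather than a hand-built grid.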
Various aspects disclosed herein relate to using relative body characteristic (e.g., temperature) measurements sensed by a sensor. In some embodiments, an assessment of the relative change in the characteristic is used (e.g., as opposed to making a determination based on an absolute magnitude), because a temperature sensor need not accurately measure the absolute temperature of an occupant in order to accurately measure the relative difference between temperature measurements made at different times. A sensor may read 100 °F when the occupant's actual temperature is 98.6 °F. However, a particular sensor (e.g., even an uncalibrated one) may consistently measure changes accurately (e.g., within measurement error), for example with a later measured temperature falling within 3 °F of 100 °F. The sensor may be in a high-traffic area of a building (e.g., a bathroom entrance or an office entrance), may read a physical characteristic of the individual (e.g., forehead temperature), and may broadcast the reading to a receiver via a communication network (e.g., via an electromagnetic beacon). The communication network may be communicatively coupled to a phone application (e.g., if the application remains active, without the need for identification of the individual) or to a control system of the building. The control system may require identification (ID) of the individual. The identification may be generic (e.g., not including personally identifiable information (PII)), or it may be a unique identifier. A storage device may store the sensor readings along with the ID of the sensor. Individuals with (e.g., substantially) the same physical characteristic may be tested at different locations in a facility (e.g., a building) on the same day and obtain different readings (e.g., due to different sensor calibration states).
Over time (e.g., over a certain period of time), the stored sensor readings corresponding to a particular sensor may be compared to analyze the physical characteristic in terms of relative change.
Various aspects disclosed herein relate to artificial intelligence (e.g., machine learning) software, in an application (app) and/or control system, that can learn an individual's typical body temperature over a period of time (e.g., a month), for example across a plurality of sensors in the facility (e.g., having a peripheral structure) that interact with the individual over that period. In some embodiments, if an individual develops an abnormal physical characteristic (e.g., fever), the abnormality will be detected, may be recorded, and/or may trigger an event, for example if the physical characteristic exceeds a threshold (e.g., if the body temperature differs from a normal reading by more than 3 °F).
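The per-sensor, relative-change analysis described in the preceding paragraphs can be sketched as follows. This is a hypothetical illustration (the identifiers, the median baseline, and the data are assumptions; only the 3 °F threshold comes from the text): each reading is compared against the history of that same sensor for that same anonymized individual, so a constant calibration offset in the sensor cancels out.

```python
# Hypothetical sketch: flag an abnormal body-characteristic reading by
# comparing it to a per-sensor, per-individual learned baseline rather
# than to an absolute scale, so uncalibrated sensors remain usable.
from collections import defaultdict
from statistics import median

THRESHOLD_F = 3.0  # deviation (degrees F) treated as abnormal, per the text
history = defaultdict(list)  # (sensor_id, person_id) -> past readings (degrees F)

def record_and_check(sensor_id, person_id, reading_f):
    """Store a reading; return True if it deviates from this sensor's
    learned baseline for this individual by more than the threshold."""
    key = (sensor_id, person_id)
    baseline = median(history[key]) if history[key] else None
    history[key].append(reading_f)
    if baseline is None:
        return False  # no baseline learned yet
    return abs(reading_f - baseline) > THRESHOLD_F

# An uncalibrated sensor reading ~1.4 degrees F high still yields a stable baseline:
for _day in range(5):
    record_and_check("lobby-ir-7", "anon-42", 100.0)
normal = record_and_check("lobby-ir-7", "anon-42", 100.5)    # within variation
feverish = record_and_check("lobby-ir-7", "anon-42", 104.2)  # exceeds threshold
```

A learned model (e.g., one accounting for time of day) could replace the median baseline; the median merely keeps the sketch self-contained.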
Various aspects disclosed herein relate to using contact tracing to track contacts of individuals identified as potentially infected (e.g., having abnormal physical characteristics as measured in the facility (e.g., having a peripheral structure), or otherwise reported as testing positive for an abnormal physical characteristic and/or a disease).
In another aspect, a method of detecting a physical characteristic of an individual in a facility (e.g., having a peripheral structure) includes: (a) Sensing and/or identifying, using at least one sensor, an environmental feature in the presence of an individual in an environment, the environmental feature being detectably perturbed by the presence of the individual compared to the absence of the individual in the environment; (b) Analyzing (i) the sensed environmental feature relative to (ii) a threshold indicative of an abnormal physical feature to generate an analysis; and (c) using the analysis to generate a report of the presence or absence of an indication of the abnormal physical characteristic of the individual.
In some embodiments, the environmental characteristic comprises temperature, carbon dioxide, humidity, or volatile organic compounds. In some embodiments, the physical feature comprises fever, respiration, or sweat. In some embodiments, the environment includes an environment inside the facility (e.g., having a peripheral structure) or an environment at an opening of the facility. In some embodiments, the individual passes through the environment during use of the at least one sensor to sense and/or identify the environmental feature. In some embodiments, the individual is stationary for at most five seconds during the use of the at least one sensor to sense the environmental characteristic. In some embodiments, the individual walks onto a test platform to facilitate sensing the environmental characteristic. In some embodiments, during sensing of the environmental feature, the individual passes through an opening to which the at least one sensor is attached. In some embodiments, at least a portion of the report comprises the results of the report. In some embodiments, at least a portion of the report is sent to the individual in legible form. In some embodiments, at least a portion of the report is sent to the individual in audio or tactile form. In some embodiments, the at least one sensor comprises a plurality of sensors. In some embodiments, the at least one sensor comprises a plurality of sensors disposed in an aggregate. In some embodiments, the at least one sensor is communicatively coupled (e.g., connected) to a communication network. In some implementations, the report is communicated to the individual's mobile device via a communication network. The mobile device may include circuitry (e.g., a processor). The mobile device may be an electronic mobile device.
In some embodiments, the at least one sensor is communicatively coupled (e.g., connected) to a control network that controls one or more devices of the facility (e.g., having a peripheral structure). In some embodiments, the at least one sensor is communicatively coupled (e.g., wired and/or wirelessly connected) to a control network that controls one or more devices in a facility in which the peripheral structure is disposed. In some embodiments, the at least one sensor is communicatively coupled (e.g., connected) to a control network that controls one or more devices in the facility, for example by taking the analysis and/or report into account. In some embodiments, the at least one sensor is disposed in a framework (e.g., having an outer wall) located in the facility interior. In some embodiments, the method further comprises altering at least one portion of the framework to construct a new framework. In some embodiments, the framework receives an object in the framework. In some embodiments, the method further comprises changing the object in the framework to construct a new framework. In some embodiments, the method further comprises interacting, by a user, with the object in the framework. In some embodiments, the method further comprises providing, by the user, input to the object in the framework. In some embodiments, the method further comprises receiving, in response to the user, an output from the object in the framework. In some embodiments, the output from the object in the framework is received in response to (I) a user input and/or (II) an identification of the user. In some embodiments, the framework accommodates an object including a display construct, a panel, a window, or a device. In some embodiments, the device comprises an emitter, a sensor, an antenna, a radar, a dispenser, and/or an identification reader.
In some embodiments, the emitter comprises a lighting device, a buzzer, or a speaker. In some embodiments, the method further comprises using the identification reader to identify a code, including a visual code, an electromagnetic code, or an audible code. In some embodiments, the visual code comprises writing or a picture. In some embodiments, the visual code comprises letters, numbers, lines, or geometric shapes. In some embodiments, the visual code is a barcode or a Quick Response (QR) code. In some embodiments, the visual code comprises a machine-readable code. In some embodiments, the electromagnetic code comprises ultra-wideband (UWB) radio waves, ultra-high frequency (UHF) radio waves, or radio waves utilized in the Global Positioning System (GPS). In some embodiments, the electromagnetic code comprises electromagnetic waves having a frequency of at least about 300 MHz, 500 MHz, or 1200 MHz. In some embodiments, the electromagnetic code includes location or time data. In some embodiments, the identification utilizes Bluetooth, UWB, UHF, and/or Global Positioning System (GPS) technology. In some embodiments, the electromagnetic code has a spatial capacity of at least about 10^13 bits per second per square meter (bits/s/m²). In some embodiments, the at least one sensor is disposed in a barrier configured to controllably block or allow passage of a user therethrough. In some embodiments, the barrier includes a fixed first portion and a second portion that controllably changes its position. In some embodiments, the first portion is a divider configured to separate one user from another user. In some embodiments, the method further comprises controlling, at least in part, the position of the second portion by a control system.
In some embodiments, the method further comprises using the control system to control one or more components in a facility in which the peripheral structure is disposed. In some embodiments, the method further comprises controlling, at least in part, by a user, a position of the second portion. In some embodiments, the second portion comprises a transparent door. In some embodiments, the second portion comprises a rotary gate. In some embodiments, the method further comprises altering at least one portion of the barrier to construct a new barrier. In some embodiments, the barrier houses, or is communicatively coupled to, a device. In some embodiments, the device comprises an emitter, a sensor, an antenna, a radar, a dispenser, and/or a badge reader. In some embodiments, the emitter comprises a lighting device, a buzzer, or a speaker. In some embodiments, the method further comprises modifying and/or exchanging the device to construct a new barrier. In some embodiments, the method further comprises interacting with the barrier by the user. In some embodiments, the method further comprises providing an input to the barrier by the user. In some embodiments, the method further comprises receiving an output from the barrier in response to the user. In some embodiments, the output from the barrier is received in response to (I) a user input and/or (II) an identification of the user. In some embodiments, the barrier comprises a panel or a display construct.
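Steps (a) to (c) of the detection method above (sense an environmental feature perturbed by the individual's presence, analyze it against a threshold indicative of an abnormal physical feature, and generate a report) can be sketched as follows. The baseline, threshold, units, and values below are illustrative assumptions chosen for the example, not figures from the disclosure; carbon dioxide stands in for any of the listed environmental characteristics.

```python
# Hypothetical sketch of steps (a)-(c): a CO2 reading (ppm) perturbed by
# an occupant is compared to a threshold that would indicate an abnormal
# body characteristic (e.g., elevated respiration), and a report of the
# presence or absence of the indication is generated.

BASELINE_PPM = 420.0        # assumed ambient CO2 with no occupant present
ABNORMAL_DELTA_PPM = 300.0  # assumed perturbation indicating abnormality

def analyze(sensed_ppm, baseline=BASELINE_PPM, threshold=ABNORMAL_DELTA_PPM):
    """Step (b): compare the sensed perturbation against the threshold."""
    return sensed_ppm - baseline > threshold

def report(sensed_ppm):
    """Step (c): report presence or absence of the abnormal indication."""
    return {
        "sensed_ppm": sensed_ppm,
        "abnormal_indication": analyze(sensed_ppm),
    }

elevated = report(790.0)  # perturbation of 370 ppm above the assumed baseline
nominal = report(500.0)   # perturbation of 80 ppm above the assumed baseline
```

In the claimed method the report could then be sent to the individual's mobile device or to the control system, as described in the embodiments above.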
In another aspect, non-transitory computer-readable program instructions for tracking a plurality of individuals in a facility, when read by one or more processors, cause the one or more processors to perform the operations of any of the above-described methods.
In some embodiments, the program instructions are embodied in a non-transitory computer-readable medium. In some embodiments, at least two of the operations are performed by one of the one or more processors. In some embodiments, at least two of the operations are each performed by a different processor of the one or more processors.
In another aspect, non-transitory computer-readable program instructions for tracking individuals in a facility, when read by one or more processors, cause the one or more processors to perform operations comprising: (a) using, or directing use of, a sensor system to sense a first identity having a first location at a first time and a second identity having a second location at a second time, wherein the sensor system is operatively coupled to a local network disposed in the facility, the sensor system comprising a plurality of sensors configured to sense and/or identify the first identity, the first location, the first time, the second identity, the second location, and the second time; (b) tracking, or directing tracking of, movement of the first identity over a period of time to generate first tracking information, and tracking movement of the second identity over the period of time to generate second tracking information; and (c) evaluating, or directing evaluation of, the distance from the first tracking information to the second tracking information relative to a distance threshold. In some embodiments, the one or more processors are operatively (e.g., communicatively) coupled to at least one sensor configured to sense and/or identify at least one environmental characteristic.
In some embodiments, the local network is configured to facilitate control of at least one other device of the facility. In some embodiments, the program instructions are embodied in a non-transitory computer-readable medium. In some embodiments, at least two of the operations are performed by one of the one or more processors. In some embodiments, at least two of the operations are each performed by a different processor of the one or more processors.
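The distance evaluation of the tracking aspect above can be sketched as follows. All identifiers, coordinates, timestamps, and the 2-meter threshold are illustrative assumptions for the example: two time-stamped location tracks are compared at shared timestamps, and the timestamps at which the two identities were within the threshold are returned, as might be done for contact tracing.

```python
# Hypothetical sketch: evaluate whether two tracked identities came within
# a distance threshold, given time-stamped (x, y) positions in meters.
from math import hypot

DISTANCE_THRESHOLD_M = 2.0  # assumed proximity threshold for a "contact"

def close_contacts(track_a, track_b, threshold=DISTANCE_THRESHOLD_M):
    """track_*: dict mapping timestamp -> (x, y) position in meters.
    Returns the sorted timestamps at which both identities were sensed
    and their separation was within the threshold."""
    shared_times = track_a.keys() & track_b.keys()
    return sorted(
        t for t in shared_times
        if hypot(track_a[t][0] - track_b[t][0],
                 track_a[t][1] - track_b[t][1]) <= threshold
    )

# Two illustrative tracks sampled at the same three timestamps:
track_1 = {10: (0.0, 0.0), 20: (5.0, 0.0), 30: (9.0, 1.0)}
track_2 = {10: (8.0, 0.0), 20: (6.5, 0.0), 30: (9.5, 1.5)}
contacts = close_contacts(track_1, track_2)  # within 2 m at t=20 and t=30
```

In practice the tracks would come from the sensor system on the facility's local network, and the result could feed the exposure notifications discussed earlier while preserving anonymized identities.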
In another aspect, an apparatus for detecting physical characteristics of an individual in a facility (e.g., having a peripheral structure) includes one or more controllers operatively (e.g., communicatively) coupled to at least one sensor configured to sense and/or identify at least one environmental characteristic, the one or more controllers (e.g., including circuitry) configured to: (a) Directing the at least one sensor to sense and/or identify an environmental feature in the presence of an individual in an environment, the environmental feature being detectably perturbed by the presence of the individual compared to the absence of the individual in the environment; (b) Analyzing or directing analysis of (i) the sensed environmental feature relative to (ii) a threshold indicative of an abnormal physical feature to generate an analysis; and (c) using or directing use of the analysis to generate a report indicative of the presence and/or absence of abnormal physical characteristics of the individual.
In some embodiments, the one or more controllers are communicatively coupled to a network (e.g., a local network) configured for communication. In some embodiments, at least two of the at least one sensor are configured to sense and/or identify the same type of environmental feature. For example, the plurality of sensors may include a plurality of temperature sensors. In some embodiments, at least two sensors of the at least one sensor are configured to sense and/or identify different types of environmental features. For example, the plurality of sensors may include a temperature sensor and a humidity sensor. In some embodiments, the environmental characteristic comprises temperature, carbon dioxide, humidity, or volatile organic compounds. The multiple sensors may sense the same environmental characteristic and have the same sensor type; for example, the plurality of sensors may be temperature sensors that are all thermocouples. The multiple sensors may sense the same environmental characteristic and have different sensor types; for example, the plurality of sensors may be temperature sensors including thermocouples and infrared (IR) sensors. In some embodiments, the physical feature comprises fever, respiration, or sweat. In some embodiments, the environment includes an environment inside the facility (e.g., having a peripheral structure) or an environment at an opening of the facility. In some embodiments, the one or more sensors are communicatively coupled to the at least one controller. In some embodiments, the one or more controllers are configured to direct the one or more sensors to detect individuals passing through the environment during use of the at least one sensor to sense and/or identify the environmental feature. In some embodiments, the one or more sensors and the at least one sensor have at least one sensor in common. In some embodiments, the one or more sensors and the at least one sensor have at least one different sensor.
In some embodiments, the one or more sensors and the at least one sensor have at least one common sensor type. In some embodiments, the one or more sensors and the at least one sensor are of at least one different sensor type. In some embodiments, the one or more sensors are communicatively coupled to the at least one controller. In some embodiments, the one or more controllers are configured to direct the one or more sensors to detect that the individual is stationary for at most five seconds during the use of the at least one sensor to sense and/or identify the environmental feature. In some embodiments, the one or more controllers are configured to direct the one or more sensors to detect an individual walking onto the test platform to facilitate sensing of the environmental characteristic. In some embodiments, the one or more controllers are configured to direct the one or more sensors to detect that an individual passes through an opening to which the at least one sensor is attached during sensing of the environmental feature. In some embodiments, the one or more controllers are configured to send, or direct the sending of, at least a portion of the report to the individual. In some embodiments, at least a portion of the report includes the results of the report. In some embodiments, at least a portion of the report is sent to the individual in legible form. In some embodiments, the at least one sensor comprises a plurality of sensors. In some embodiments, the at least one sensor comprises a plurality of sensors disposed in an aggregate. In some embodiments, the at least one sensor is communicatively coupled (e.g., connected) to a communication network.
In some embodiments, the one or more controllers are configured to communicate or direct the transmission of the report to the individual's mobile device via a communication network. In some embodiments, the one or more controllers are configured to control or direct control of one or more devices in a facility (e.g., having a peripheral structure). In some embodiments, the one or more controllers are configured to control or direct control of one or more devices in a facility in which the facility (e.g., having a peripheral structure) is disposed. In some embodiments, the one or more controllers are configured to control or direct the operation of one or more devices in the control facility by taking into account the analysis and/or reporting. In some embodiments, at least two of (a), (b), and (c) are performed by the same controller of the one or more controllers. In some embodiments, at least two of (a), (b), and (c) are performed by different controllers of the one or more controllers. In some embodiments, the at least one sensor is disposed in a frame located inside the facility (e.g., having a peripheral structure). In some embodiments, at least one portion of the frame is configured to be changeable to construct a new frame. In some embodiments, the at least one controller is configured to (i) identify or direct identification of the new framework and/or one or more components of the new framework and/or (ii) establish or direct establishment of communication between the at least one controller and the new framework and/or establishment of communication between the at least one controller and one or more components of the new framework. In some embodiments, the frame is configured to receive an object in the frame. In some embodiments, the object in the box is configured for interaction with a user. In some embodiments, the object in the box is configured for providing input by a user. 
In some embodiments, the frame is operatively coupled to the at least one controller. In some embodiments, the at least one controller is configured to receive an output from the object in the frame in response to a user. In some embodiments, the at least one controller is configured to receive the output in response to (I) a user input and/or (II) an identification of the user. In some embodiments, the frame is configured to receive an object in the frame that includes a display construction, a panel, a window, or a device. In some embodiments, the device comprises an emitter, a sensor, an antenna, a radar, a dispenser, and/or an identification reader. In some embodiments, the emitter comprises a lighting device, a buzzer, or a speaker. In some embodiments, the identification reader is configured to recognize a visual code, an electromagnetic code, or an audible code. In some embodiments, the visual code comprises writing or pictures. In some embodiments, the visual code comprises letters, numbers, lines, or geometric shapes. In some embodiments, the visual code is a barcode or a Quick Response (QR) code. In some embodiments, the visual code comprises a machine-readable code. In some embodiments, the electromagnetic code comprises ultra-wideband (UWB) radio waves, ultra-high frequency (UHF) radio waves, or radio waves utilized in the Global Positioning System (GPS). In some embodiments, the electromagnetic code comprises an electromagnetic wave having a frequency of at least about 300 MHz, 500 MHz, or 1200 MHz. In some embodiments, the electromagnetic code includes location or time data. In some embodiments, identification utilizes Bluetooth, UWB, UHF, and/or Global Positioning System (GPS) technology. In some embodiments, the electromagnetic code has a spatial capacity of at least about 10¹³ bits per second per square meter (bit/s/m²).
In some embodiments, the at least one sensor is disposed in a barrier configured to be operatively coupled to the at least one controller. In some embodiments, the at least one controller is configured to (I) block or direct blocking of passage of a user through the barrier, or (II) permit or direct permitting passage of a user through the barrier. In some embodiments, the barrier includes a fixed first portion and a second portion configured to controllably change its position. In some embodiments, the at least one controller is configured to change or direct a change in position of the barrier. In some embodiments, the first portion is a divider configured to separate one user from another user. In some embodiments, the at least one controller is configured to at least partially control or direct control of the position of the second portion. In some embodiments, the at least one controller is configured to control or direct control of one or more components of a facility in which the peripheral structure is disposed. In some embodiments, the position of the second portion is configured to be at least partially controlled by a user. In some embodiments, the second portion comprises a transparent door. In some embodiments, the second portion comprises a rotary gate. In some embodiments, at least one portion of the barrier is configured to facilitate changing the configuration of the barrier to construct a new barrier. In some embodiments, the at least one controller is configured to (i) identify or direct identification of the new barrier and/or one or more components of the new barrier, and/or (ii) establish or direct establishment of communication between the at least one controller and the new barrier and/or between the at least one controller and one or more components of the new barrier. In some embodiments, the barrier is configured to receive, or to be communicatively coupled to, the device.
In some embodiments, the device comprises an emitter, a sensor, an antenna, a radar, a dispenser, and/or a badge reader. In some embodiments, the emitter comprises a lighting device, a buzzer, or a speaker. In some embodiments, at least one portion of the barrier is configured to facilitate changing and/or exchanging the device to construct a new barrier. In some embodiments, the at least one controller is configured to (i) identify or direct identification of the new barrier and/or one or more components of the new barrier, and/or (ii) establish or direct establishment of communication between the at least one controller and the new barrier and/or between the at least one controller and one or more components of the new barrier. In some embodiments, the barrier is configured for interaction with a user. In some embodiments, the barrier is configured to receive input from a user. In some embodiments, the at least one controller is configured to be communicatively coupled to the barrier. In some embodiments, the at least one controller is configured to receive an output from the barrier in response to the user. In some embodiments, the output from the barrier is received in response to (I) user input and/or (II) identification of the user. In some embodiments, the barrier comprises a panel or a display construction.
In another aspect, a non-transitory computer-readable product for detecting a physical characteristic of an individual in a facility (e.g., having a peripheral structure) contains instructions inscribed thereon that, when executed by one or more processors, cause the one or more processors to perform operations comprising: (a) sensing and/or identifying, using at least one sensor, an environmental feature of an environment that is detectably perturbed by the presence of the individual as compared to the absence of the individual in the environment; (b) analyzing (i) the sensed environmental feature relative to (ii) a threshold indicative of an abnormal physical characteristic to generate an analysis; and (c) using the analysis to generate a report of the presence or absence of an indication of the abnormal physical characteristic of the individual.
In some embodiments, the environmental feature comprises temperature, carbon dioxide, humidity, or volatile organic compounds. In some embodiments, the physical characteristic comprises fever, respiration, or perspiration. In some embodiments, the environment includes an environment inside the facility (e.g., having a peripheral structure) or an environment at an opening of the facility. In some embodiments, the operations further comprise detecting an individual passing through the environment during use of the at least one sensor to sense and/or identify the environmental feature. In some embodiments, the operations further comprise detecting that the individual is stationary for at most five seconds during the use of the at least one sensor to sense and/or identify the environmental feature. In some embodiments, the operations further comprise detecting an individual stepping onto a test platform to facilitate sensing of the environmental feature. In some embodiments, the operations further comprise detecting an individual passing through the opening to which the at least one sensor is attached during sensing of the environmental feature. In some embodiments, the operations further comprise sending at least a portion of the report to the individual. In some embodiments, at least a portion of the report comprises the results of the report. In some embodiments, at least a portion of the report is sent to the individual in a legible form. In some embodiments, at least a portion of the report is sent to the individual. In some embodiments, the at least one sensor comprises a plurality of sensors. In some embodiments, the at least one sensor comprises a plurality of sensors disposed in a device aggregate. In some embodiments, the at least one sensor is communicatively coupled (e.g., connected) to a communication network.
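As an illustration of operations (a)-(c) above, a minimal sketch in Python follows. The feature names, baseline values, and perturbation thresholds are hypothetical assumptions chosen for illustration, not values taken from this disclosure.

```python
# Hypothetical sketch: compare a sensed environmental feature against a
# threshold indicative of an abnormal physical characteristic, and report.
from dataclasses import dataclass

@dataclass
class Reading:
    feature: str      # e.g., "temperature" or "co2" (illustrative names)
    value: float      # value sensed with the individual present
    baseline: float   # value sensed with the individual absent

# Illustrative perturbation thresholds (delta above baseline) per feature.
THRESHOLDS = {"temperature": 1.5, "co2": 400.0, "humidity": 10.0}

def analyze(reading: Reading) -> bool:
    """Return True if the perturbation suggests an abnormal physical characteristic."""
    delta = reading.value - reading.baseline
    return delta > THRESHOLDS.get(reading.feature, float("inf"))

def report(reading: Reading) -> str:
    """Generate the legible report of operation (c)."""
    flag = "abnormal indication present" if analyze(reading) else "no abnormal indication"
    return f"{reading.feature}: {flag}"

print(report(Reading("temperature", 24.6, 22.5)))  # perturbation 2.1 > 1.5
```

In practice the report could then be transmitted to the individual's mobile device over the communication network, as described above.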
In some embodiments, the operations further comprise transmitting the report to a mobile device of the individual via a communication network. In some embodiments, the at least one sensor is communicatively coupled (e.g., wired and/or wirelessly connected) to a control network that controls one or more devices in a facility (e.g., having a peripheral structure). In some embodiments, the at least one sensor is communicatively coupled (e.g., wired and/or wirelessly connected) to a control network that controls one or more devices of a facility in which the peripheral structure is disposed. In some embodiments, the at least one sensor is communicatively coupled (e.g., connected) to a control network configured to control one or more devices in the facility, e.g., by taking into account the analysis and/or the report. In some embodiments, the at least one sensor is disposed in a frame located inside the facility (e.g., having a peripheral structure). In some embodiments, the frame is configured to facilitate changing at least one portion of the frame to construct a new frame. In some embodiments, the operations include (i) identifying or directing identification of the new frame and/or one or more components of the new frame, and/or (ii) establishing or directing establishment of communication between the control system and the new frame and/or between the control system and one or more components of the new frame. In some embodiments, the frame is configured to receive an object in the frame. In some embodiments, the frame is configured to facilitate changing the object in the frame to construct a new frame.
In some embodiments, these operations include (i) identifying or directing identification of the new frame and/or the object in the changed frame, and/or (ii) establishing or directing establishment of communication between the control system and the new frame and/or between the control system and the object in the changed frame. In some embodiments, the object in the frame is configured for interaction with a user. In some embodiments, the object in the frame is configured to receive input provided by a user. In some embodiments, the operations include receiving, in response to a user, an output from the object in the frame. In some embodiments, the output from the object in the frame is received in response to (I) a user input and/or (II) an identification of the user. In some embodiments, the frame is configured to receive an object in the frame that includes a display construction, a panel, a window, or a device. In some embodiments, the device comprises an emitter, a sensor, an antenna, a radar, a dispenser, and/or an identification reader. In some embodiments, the emitter comprises a lighting device, a buzzer, or a speaker. In some embodiments, the operations include identifying a code, including a visual code, an electromagnetic code, or an audible code, using an identification reader. In some embodiments, the visual code comprises writing or pictures. In some embodiments, the visual code comprises letters, numbers, lines, or geometric shapes. In some embodiments, the visual code is a barcode or a Quick Response (QR) code. In some embodiments, the visual code comprises a machine-readable code. In some embodiments, the electromagnetic code comprises ultra-wideband (UWB) radio waves, ultra-high frequency (UHF) radio waves, or radio waves utilized in the Global Positioning System (GPS). In some embodiments, the electromagnetic code comprises an electromagnetic wave having a frequency of at least 300 MHz, 500 MHz, or 1200 MHz.
In some embodiments, the electromagnetic code includes location or time data. In some embodiments, the operations include identifying an identification code based at least in part on Bluetooth, UWB, UHF, and/or Global Positioning System (GPS) technology. In some embodiments, the electromagnetic code has a spatial capacity of at least about 10¹³ bits per second per square meter (bit/s/m²). In some embodiments, the at least one sensor is disposed in a barrier configured to controllably obstruct or permit passage of a user therethrough. In some embodiments, the barrier comprises a fixed first portion and a second portion that controllably alters its position. In some embodiments, the first portion is a divider configured to separate one user from another user. In some embodiments, the operations include controlling, at least in part, the position of the second portion by a control system. In some embodiments, the operations include using a control system to control one or more components of a facility in which the peripheral structure is disposed. In some embodiments, the operations include controlling, at least in part, the position of the second portion by a user. In some embodiments, the second portion comprises a transparent door. In some embodiments, the second portion comprises a rotary gate. In some embodiments, the operations include facilitating a change to at least one portion of the barrier to construct a new barrier. In some embodiments, facilitating the change includes (i) identifying the new barrier and/or one or more components of the new barrier, and/or (ii) establishing communication between the control system and the new barrier and/or between the control system and one or more components of the new barrier. In some embodiments, the barrier houses, or is communicatively coupled to, a device.
In some embodiments, the device comprises an emitter, a sensor, an antenna, a radar, a dispenser, and/or a badge reader. In some embodiments, the emitter comprises a lighting device, a buzzer, or a speaker. In some embodiments, the operations include facilitating changing and/or exchanging the device to construct a new barrier. In some embodiments, facilitating the change includes (i) identifying the new barrier and/or one or more components of the new barrier, and/or (ii) establishing communication between the control system and the new barrier and/or between the control system and one or more components of the new barrier. In some embodiments, the operations include facilitating user interaction with the barrier. In some embodiments, the operations include facilitating a user providing input to the barrier. In some embodiments, the operations include facilitating receiving an output from the barrier in response to a user. In some embodiments, the output from the barrier is received in response to (I) user input and/or (II) an identification of the user. In some embodiments, the barrier comprises a panel or a display construction.
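A minimal sketch of the controllable second portion follows, assuming a hypothetical list of cleared badge identifiers in place of the identification reader and report logic described above; the class and badge names are illustrative, not from this disclosure.

```python
# Hypothetical barrier controller: the fixed first portion is a divider, and
# the movable second portion (e.g., a transparent door or rotary gate) is
# opened only for identified users on an assumed cleared list.
class BarrierController:
    def __init__(self, cleared_ids):
        self.cleared_ids = set(cleared_ids)
        self.second_portion_open = False

    def on_identification(self, badge_id):
        """Permit or block passage based on the (assumed) cleared list."""
        self.second_portion_open = badge_id in self.cleared_ids
        return self.second_portion_open

gate = BarrierController(cleared_ids=["badge-001"])
print(gate.on_identification("badge-001"))  # True: passage permitted
print(gate.on_identification("badge-002"))  # False: passage blocked
```

In a deployment, the cleared list could instead be derived from the health-report analysis, so that the controller blocks or permits passage by taking the report into account.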
In another aspect, a method of tracking a plurality of individuals in a facility (e.g., having a peripheral structure) includes: (a) Sensing and/or identifying a first identity having a first location at a first time and a second identity having a second location at a second time using a sensor system, wherein the sensor system is operatively coupled to a network (e.g., a local network) disposed in a facility (e.g., having a peripheral structure), the sensor system comprising a plurality of sensors configured to sense and/or identify the first identity, the first location, the first time, the second identity, the second location, and the second time; (b) Tracking movement of the first identity over time (e.g., over a period of time) to generate first tracking information, and tracking movement of the second identity over time (e.g., over a period of time) to generate second tracking information; and (c) evaluating a distance from the first tracking information to the second tracking information relative to a distance threshold.
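Steps (a)-(c) of this tracking method can be sketched as follows. The coordinates, sample times, and the 2 m distance threshold are illustrative assumptions, not values from this disclosure.

```python
# Minimal sketch: track two identities' sensed positions over time and
# evaluate their separation against a distance threshold.
import math

def track(samples):
    """samples: list of (time, x, y) tuples sensed for one identity."""
    return sorted(samples)

def min_separation(track1, track2):
    """Minimum pairwise distance at timestamps shared by both tracks."""
    pos2 = {t: (x, y) for t, x, y in track2}
    dists = [math.dist((x, y), pos2[t]) for t, x, y in track1 if t in pos2]
    return min(dists) if dists else None

# Two identities sensed at two times (positions in metres, illustrative).
t1 = track([(0, 0.0, 0.0), (1, 1.0, 0.0)])
t2 = track([(0, 5.0, 0.0), (1, 2.5, 0.0)])
DISTANCE_THRESHOLD = 2.0  # metres, illustrative
print(min_separation(t1, t2) < DISTANCE_THRESHOLD)  # 1.5 m at t=1 -> True
```

The evaluation in step (c) would then flag the pair when the separation falls below the threshold.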
In some embodiments, the local network is configured to facilitate control of at least one other device of the facility. In some embodiments, the local network is configured to facilitate control of the facility, for example using a building management system. In some embodiments, the at least one other device (e.g., respectively) comprises at least one other device type. In some embodiments, the at least one other device comprises a sensor, a transmitter, an antenna, a router, a media display, or a tintable window. In some embodiments, the at least one other device comprises a service device (e.g., a printer, a maker, or a beverage dispenser). In some embodiments, at least some of the plurality of sensors are integrated in one or more device assemblies. In some embodiments, the plurality of sensors and/or device aggregates comprise accelerometers. In some embodiments, the plurality of sensors includes at least one geolocation sensor for detecting (i) the first location and the second location and/or (ii) the first identity and the second identity. In some embodiments, the at least one geolocation sensor includes an ultra-wideband (UWB) sensor or a bluetooth sensor. In some embodiments, the plurality of sensors includes a plurality of geolocation sensors that are synchronized in time (e.g., time synchronized). In some embodiments, the geolocation sensor comprises an ultra-wideband (UWB) sensor or a bluetooth sensor. In some embodiments, the sensor system comprises a camera for detecting (i) the first location and the second location and/or (ii) the first identity and the second identity. In some embodiments, the camera is comprised of a sensor array having at least about 4000 sensors on its fundamental length scale. In some embodiments, the base length dimension comprises a length, a width, a radius, or a boundary radius. In some embodiments, the sensor array comprises a sensor comprising a Charge Coupled Device (CCD). 
In some embodiments, the sensor system is comprised of a device aggregate that includes (i) a plurality of sensors or (ii) a sensor and an emitter. In some embodiments, the sensor system is comprised of an aggregate of devices that integrate the controller and/or processor. In some embodiments, the sensor system is comprised of an assembly of devices that includes a controller and/or a processor. In some embodiments, the sensor system is operatively coupled to a hierarchical control system comprising a plurality of controllers. In some embodiments, the sensor system is comprised of a device aggregate that is mounted at a fixed location in the facility (e.g., having a peripheral structure). In some embodiments, the fixed location is at or attached to a fixed structure of the facility (e.g., having a peripheral structure). In some embodiments, the fixed location is a controlled access to a facility (e.g., having a peripheral structure). In some embodiments, a facility (e.g., having a perimeter structure) includes a facility, a building, and/or a room. In some embodiments, the sensor system is comprised of a collection of devices mounted to or embedded in a non-fixed structure in a facility (e.g., having a peripheral structure). In some embodiments, the method further comprises: (d) Associating a first location and a first time with a first identity to generate a first association, and associating a second location and a second time with a second identity to generate a second association; and (e) comparing the first association to the second association to evaluate a distance from the first identity to the second identity relative to a distance threshold. In some embodiments, the method further comprises evaluating, relative to a time threshold, whether the first tracking information and the second tracking information are separated by a distance below a distance threshold for a cumulative time. 
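The cumulative-time evaluation described above can be sketched as summing the sampled time during which the two tracks are closer than the distance threshold, then comparing that total against a time threshold. The sampling interval and both thresholds are illustrative assumptions.

```python
# Hedged sketch of the cumulative-time evaluation over two tracks.
import math

def cumulative_close_time(track1, track2, dist_thresh, dt=1.0):
    """track1/track2: dicts mapping sample time -> (x, y); dt: seconds per sample."""
    shared = track1.keys() & track2.keys()
    return sum(dt for t in shared if math.dist(track1[t], track2[t]) < dist_thresh)

# Illustrative tracks sampled once per second (positions in metres).
t1 = {0: (0.0, 0.0), 1: (0.0, 0.0), 2: (0.0, 0.0)}
t2 = {0: (5.0, 0.0), 1: (1.0, 0.0), 2: (1.5, 0.0)}
total = cumulative_close_time(t1, t2, dist_thresh=2.0)
TIME_THRESHOLD = 2.0  # seconds, illustrative
print(total >= TIME_THRESHOLD)  # 2 of 3 samples within 2 m -> True
```

Using a cumulative time rather than a single instantaneous distance avoids flagging brief pass-bys.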
In some embodiments, the sensor system is coupled to a network (e.g., a local network) disposed in a facility (e.g., having a peripheral structure). In some embodiments, a network (e.g., a local network) includes wiring disposed in an enclosure of a facility (e.g., disposed in an enclosure of a peripheral structure such as a building). In some embodiments, the method further comprises transmitting power and communications over a single cable of a network (e.g., a local network). In some embodiments, the method further includes transmitting at least a fourth or fifth generation cellular communication using a network (e.g., a local network). In some embodiments, the method further includes using a network (e.g., a local network) to transmit the data including the media. In some embodiments, the method further comprises using a network (e.g., a local network) to control the atmosphere of a facility (e.g., having a peripheral structure). In some embodiments, the method further includes using a network (e.g., a local network) to control a tintable window disposed in a facility (e.g., having a peripheral structure). In some embodiments, the method further comprises using a network (e.g., a local network) to control the facility (e.g., having a peripheral structure).
In another aspect, an apparatus for tracking a plurality of individuals in a facility comprises one or more processors configured to execute program instructions that cause the one or more processors to perform operations comprising: (a) using a sensor system to sense and/or identify a first identity having a first location at a first time and a second identity having a second location at a second time, wherein the sensor system is operatively coupled to a local network disposed in the facility; (b) tracking movement of the first identity over a period of time to generate first tracking information, and tracking movement of the second identity over the period of time to generate second tracking information; and (c) evaluating a distance from the first tracking information to the second tracking information relative to a distance threshold.
In some embodiments, the program instructions are embodied in a non-transitory computer readable medium. In some embodiments, at least two of the operations are performed by one of the one or more processors. In some embodiments, at least two of the operations are each performed by a different processor of the one or more processors.
In another aspect, non-transitory computer-readable program instructions for tracking individuals in a facility, when read by one or more processors, cause the one or more processors to perform operations comprising: (a) using or directing use of a sensor system to sense a first identity having a first location at a first time and a second identity having a second location at a second time, wherein the sensor system is operatively coupled to a local network disposed in the facility, the sensor system comprising a plurality of sensors configured to sense and/or identify the first identity, the first location, the first time, the second identity, the second location, and the second time; (b) tracking or directing tracking of movement of the first identity over a period of time to generate first tracking information, and tracking movement of the second identity over the period of time to generate second tracking information; and (c) evaluating or directing evaluation of the distance from the first tracking information to the second tracking information relative to a distance threshold. In some embodiments, the one or more processors are operatively coupled to a sensor system comprising a plurality of sensors configured to sense and/or identify a first identity having a first location at a first time and a second identity having a second location at a second time. In some embodiments, the sensor system is operatively coupled to a local network disposed in the facility. In some embodiments, the one or more processors are configured to control at least one other device of the facility.
In some embodiments, the local network is configured to facilitate control of at least one other device of the facility. In some embodiments, the program instructions are embodied in a non-transitory computer readable medium. In some embodiments, at least two of the operations are performed by one of the one or more processors. In some embodiments, at least two of the operations are each performed by a different processor of the one or more processors.
In another aspect, an apparatus for tracking a plurality of individuals in a facility (e.g., having a peripheral structure) includes at least one controller (e.g., including circuitry) configured to: (a) operatively couple to a sensor system comprising a plurality of sensors configured to sense and/or identify a first identity having a first location at a first time and a second identity having a second location at a second time, the sensor system operatively coupled to a network (e.g., a local network) disposed in the facility; (b) use or direct use of the sensor system to sense and/or identify the first identity having the first location at the first time and the second identity having the second location at the second time; (c) track or direct tracking of movement of the first identity over time (e.g., over a certain period of time) to generate first tracking information, and track or direct tracking of movement of the second identity over time to generate second tracking information; and (d) evaluate or direct evaluation of the distance from the first tracking information to the second tracking information relative to a distance threshold.
In some embodiments, the at least one controller is configured to control at least one other device of the facility. In some embodiments, the at least one controller is configured to control the facility, for example, using a building management system. In some embodiments, the at least one other device (e.g., respectively) comprises at least one other device type. In some embodiments, the at least one other device comprises a sensor, a transmitter, an antenna, a router, a media display, or a tintable window. In some embodiments, the at least one other device comprises a service device (e.g., a printer, a maker, or a beverage dispenser). In some embodiments, at least one sensor of the plurality of sensors is integrated in a device aggregate that includes (i) the sensor or (ii) the sensor and the emitter. In some embodiments, the plurality of sensors includes at least one geolocation sensor for detecting (i) the first location and the second location and/or (ii) the first identity and the second identity. In some embodiments, the at least one geolocation sensor includes an ultra-wideband (UWB) sensor or a bluetooth sensor. In some embodiments, the plurality of sensors includes a plurality of geolocation sensors that are synchronized in time (e.g., time synchronized). In some embodiments, the geolocation sensor comprises an ultra-wideband (UWB) sensor or a bluetooth sensor. In some embodiments, the sensor system comprises a camera for detecting (i) the first location and the second location and/or (ii) the first identity and the second identity. In some embodiments, the camera is comprised of a sensor array having at least about 4000 sensors on its fundamental length scale. In some embodiments, the base length dimension comprises a length, a width, a radius, or a boundary radius. In some embodiments, the sensor array comprises a sensor comprising a Charge Coupled Device (CCD). 
In some embodiments, the sensor system is comprised of a device aggregate that includes (i) a plurality of sensors or (ii) a sensor and an emitter. In some embodiments, the sensor system is comprised of a device aggregate that integrates a controller. In some embodiments, the sensor system is operatively coupled to a hierarchical control system comprising a plurality of controllers. In some embodiments, the sensor system is comprised of a collection of devices that are mounted at fixed locations in a facility (e.g., having a peripheral structure). In some embodiments, the fixed location is at or attached to a fixed structure of the facility (e.g., having a peripheral structure). In some embodiments, the fixed location is a controlled access to a facility (e.g., having a peripheral structure). In some embodiments, a facility (e.g., having a perimeter structure) includes a facility, a building, and/or a room. In some embodiments, the sensor system is comprised of a collection of devices mounted to or embedded in a non-fixed structure in a facility (e.g., having a peripheral structure). In some embodiments, the at least one controller is further configured to: (d) associate or direct association of the first location and the first time with the first identity to generate a first association, and associate the second location and the second time with the second identity to generate a second association; and (e) compare or direct comparison of the first association to the second association to evaluate a distance from the first identity to the second identity relative to a distance threshold.
In some embodiments, the at least one controller is further configured to evaluate, relative to a time threshold, whether the first tracking information and the second tracking information are separated by a distance below the distance threshold for a cumulative time. In some embodiments, the sensor system is coupled to a network (e.g., a local network) disposed in a facility (e.g., having a peripheral structure). In some embodiments, a network (e.g., a local network) includes wiring disposed in an enclosure of a facility (e.g., disposed in an enclosure of a peripheral structure such as a building). In some embodiments, a network (e.g., a local network) is configured for power and communications transmission over a single cable. In some embodiments, a network (e.g., a local network) is configured for at least fourth or fifth generation cellular communication. In some embodiments, a network (e.g., a local network) is configured for data transfer including media. In some embodiments, a network (e.g., a local network) is configured to be coupled to at least one antenna. In some embodiments, a network (e.g., a local network) is configured to couple to a plurality of different sensor types. In some embodiments, a network (e.g., a local network) is configured to be coupled to a building management system configured to manage the atmosphere of a facility (e.g., having a peripheral structure). In some embodiments, a network (e.g., a local network) is configured to be coupled to a tintable window. In some embodiments, a network (e.g., a local network) is configured to be coupled to a hierarchical control system configured to control a facility (e.g., having a peripheral structure).
In another aspect, a method for tracking a plurality of individuals in a facility (e.g., having a peripheral structure) includes: (A) Using a sensor system to detect identities of a first individual and a second individual disposed in a facility (e.g., having a peripheral structure); (B) Tracking, using a sensor system, movement of a first individual across a first set of locations (e.g., of peripheral structure) in a facility during a first set of times, and tracking movement of a second individual across a second set of locations (e.g., of peripheral structure) in the facility during a second set of times; (C) Associating the first set of locations and the first set of times with a first individual to generate a first association, and associating the second set of locations and the second set of times with a second individual to generate a second association; and (D) comparing the first association to the second association to assess a distance from the first individual to the second individual relative to a threshold.
In some embodiments, the sensor system is operatively coupled to a local network configured to facilitate control of at least one other device of the facility. In some embodiments, the local network is configured to facilitate control of facilities, for example, using a building management system. In some embodiments, the at least one other device (e.g., respectively) comprises at least one other device type. In some embodiments, the at least one other device comprises a sensor, a transmitter, an antenna, a router, a media display, or a tintable window. In some embodiments, the at least one other device comprises a service device (e.g., a printer, a maker, or a beverage dispenser). In some embodiments, detecting the identity of the first individual and the second individual is performed upon entry of the first individual and the second individual into a facility (e.g., having a peripheral structure). In some embodiments, the first set of locations and the second set of locations are specified in terms of subdividing a plurality of zones of a facility (e.g., having a peripheral structure). In some embodiments, associating the set of locations with the set of times includes storing a plurality of times and zone locations according to the monitored movements in a plurality of caches corresponding to the first individual and the second individual. In some embodiments, the plurality of caches includes a first cache corresponding to a first individual and a second cache corresponding to a second individual. In some embodiments, the first association is compared to the second association conditionally upon detection of a predetermined event associated with the first individual and/or the second individual. In some embodiments, (a) detecting an identity and (b) tracking movement is performed at least in part by using a sensor system comprising a device ensemble comprising (i) a plurality of sensors or (ii) a sensor and an emitter. 
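The per-individual caches of times and zone locations described above can be sketched as follows; the zone names, timestamps, and the shared-zone comparison are illustrative assumptions standing in for the monitored movements and the predetermined-event trigger.

```python
# Illustrative per-individual caches storing (time, zone) entries as
# movements are monitored; the caches are compared only when needed,
# e.g., upon a predetermined event associated with one individual.
from collections import defaultdict

caches = defaultdict(list)  # individual id -> list of (time, zone)

def record(individual, time, zone):
    """Store a monitored movement in the cache for this individual."""
    caches[individual].append((time, zone))

def shared_zone_times(a, b):
    """Times at which individuals a and b were recorded in the same zone."""
    zones_b = {(t, z) for t, z in caches[b]}
    return [t for t, z in caches[a] if (t, z) in zones_b]

record("first", 0, "lobby"); record("first", 1, "zone-3")
record("second", 0, "zone-3"); record("second", 1, "zone-3")
print(shared_zone_times("first", "second"))  # [1]
```

Subdividing the facility into zones keeps the caches small relative to storing raw coordinates, at the cost of coarser distance evaluation.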
In some embodiments, the plurality of sensors includes a geolocation sensor for detecting (i) the first set of locations and the second set of locations and/or (ii) the first identity and the second identity. In some embodiments, the geolocation sensor comprises an ultra-wideband (UWB) sensor or a Bluetooth sensor. In some embodiments, the plurality of sensors includes a plurality of geolocation sensors that are synchronized in time (e.g., time synchronized). In some embodiments, at least one sensor of the plurality of sensors is included in the device aggregate. In some embodiments, the device aggregate and/or the plurality of sensors comprises an accelerometer. In some embodiments, the device aggregate comprises (i) a sensor or (ii) a sensor and an emitter. In some embodiments, the plurality of sensors includes a camera for detecting the first identity and/or the second identity. In some embodiments, the camera comprises a sensor array having at least about 4000 sensors on its fundamental length scale. In some embodiments, the fundamental length scale comprises a length, a width, a radius, or a bounding radius. In some embodiments, the sensor array comprises a sensor comprising a charge-coupled device (CCD). In some embodiments, the sensor system comprises a plurality of sensors, wherein at least some of the plurality of sensors are integrated in one or more device aggregates. In some embodiments, the sensor system comprises a device aggregate that includes a controller and/or a processor. In some embodiments, the sensor system is operatively coupled to a hierarchical control system comprising a plurality of controllers.
In some embodiments, the sensor system includes at least one sensor housed in a device aggregate mounted or located in a fixed structure of the facility (e.g., having a peripheral structure). In some embodiments, the sensor system is located in a controlled portal of the facility (e.g., having a peripheral structure). In some embodiments, the peripheral structure comprises a facility, a building, and/or a room. In some embodiments, at least one sensor of the sensor system is housed in a device aggregate that is mounted to or located in a non-fixed structure in a facility (e.g., having a peripheral structure). In some embodiments, the sensor system is coupled to a network (e.g., a local network) disposed in a facility (e.g., having a peripheral structure). In some embodiments, a network (e.g., a local network) includes wiring disposed in an enclosure of a facility (e.g., disposed in an enclosure of a peripheral structure such as a building). In some embodiments, the method further comprises transmitting power and communications over a single cable of a network (e.g., a local network). In some embodiments, the method further includes transmitting at least a fourth or fifth generation cellular communication using a network (e.g., a local network). In some embodiments, the method further includes using a network (e.g., a local network) to transmit data including media. In some embodiments, the method further comprises using a network (e.g., a local network) to control the atmosphere of a facility (e.g., having a peripheral structure). In some embodiments, the method further includes using a network (e.g., a local network) to control a tintable window disposed in a facility (e.g., having a peripheral structure). In some embodiments, the method further comprises using a network (e.g., a local network) to control the facility (e.g., having a peripheral structure).
In another aspect, non-transitory computer-readable program instructions for tracking a plurality of individuals in a facility, when read by one or more processors, cause the one or more processors to perform the operations of any of the above-described methods.
In some embodiments, the program instructions are embodied in a non-transitory computer readable medium. In some embodiments, at least two of the operations are performed by one of the one or more processors. In some embodiments, at least two of the operations are each performed by a different processor of the one or more processors.
In another aspect, non-transitory computer-readable program instructions for tracking individuals in a facility, when read by one or more processors, cause the one or more processors to perform operations comprising: (A) Using or directing use of a sensor system to detect identities of a first individual and a second individual disposed in a facility, the sensor system operatively coupled to a local network; (B) Using or directing use of the sensor system to track movement of the first individual across a first set of locations in the facility during a first set of times, and to track movement of the second individual across a second set of locations in the facility during a second set of times; (C) Associating or directing association of the first set of locations and the first set of times with the first individual to generate a first association, and associating or directing association of the second set of locations and the second set of times with the second individual to generate a second association; and (D) comparing or directing a comparison of the first association to the second association to assess a distance from the first individual to the second individual relative to a threshold. In some embodiments, the one or more processors are operatively coupled to a sensor system configured to detect identities of a first individual and a second individual disposed in a facility (e.g., having a peripheral structure).
In some embodiments, the local network is configured to facilitate control of at least one other device of the facility. In some embodiments, the program instructions are embodied in a non-transitory computer readable medium. In some embodiments, at least two of the operations are performed by one of the one or more processors. In some embodiments, at least two of the operations are each performed by a different processor of the one or more processors.
In another aspect, an apparatus for tracking a plurality of individuals in a facility (e.g., having a peripheral structure) includes at least one controller (e.g., including circuitry) configured to: (A) Operatively couple to a sensor system configured to detect identities of a first individual and a second individual disposed in a facility (e.g., having a peripheral structure); (B) Use or direct use of the sensor system to detect identities of the first individual and the second individual disposed in the facility (e.g., having a peripheral structure); (C) Use or direct use of the sensor system to track movement of the first individual across a first set of locations in the facility (e.g., having a peripheral structure) during a first set of times and to track movement of the second individual across a second set of locations in the facility (e.g., having a peripheral structure) during a second set of times; (D) Associate or direct association of the first set of locations and the first set of times with the first individual to generate a first association, and associate or direct association of the second set of locations and the second set of times with the second individual to generate a second association; and (E) Compare or direct a comparison of the first association to the second association to assess a distance from the first individual to the second individual relative to a threshold.
In some embodiments, the at least one controller is configured to control at least one other device of the facility. In some embodiments, the at least one controller is configured to control the facility, for example, using a building management system. In some embodiments, the at least one other device (e.g., respectively) comprises at least one other device type. In some embodiments, the at least one other device comprises a sensor, a transmitter, an antenna, a router, a media display, or a tintable window. In some embodiments, the at least one other device comprises a service device (e.g., a printer, a maker, or a beverage dispenser). In some embodiments, the at least one controller is configured to detect the identity of the first individual and the second individual upon entry of the first individual and the second individual into the facility (e.g., having a peripheral structure). In some embodiments, the first set of locations and the second set of locations are specified in terms of a plurality of zones subdividing a facility (e.g., having a peripheral structure). In some embodiments, the at least one controller is configured to associate or direct association of a set of locations and a set of times by storing a plurality of times and zone locations according to the monitored movements in the plurality of caches corresponding to the first individual and the second individual. In some embodiments, the plurality of caches includes a first cache corresponding to the first individual and a second cache corresponding to the second individual. In some embodiments, the at least one controller is configured to conditionally compare or direct a comparison of the first association with the second association upon detection of a predetermined event related to the first individual and/or the second individual.
In some embodiments, the at least one controller is configured to perform (a) detecting or directing detection of an identity and (b) tracking or directing tracking of movement at least in part by using a sensor system comprising a device aggregate comprising (i) a plurality of sensors or (ii) a sensor and an emitter. In some embodiments, the plurality of sensors includes a geolocation sensor for detecting (i) the first set of locations and the second set of locations and/or (ii) the first identity and the second identity. In some embodiments, the geolocation sensor comprises an ultra-wideband (UWB) sensor or a Bluetooth sensor. In some embodiments, the plurality of sensors includes a plurality of geolocation sensors that are synchronized in time (e.g., time synchronized). In some embodiments, at least one sensor of the plurality of sensors is included in the device aggregate. In some embodiments, the device aggregate and/or the plurality of sensors comprise an accelerometer. In some embodiments, the device aggregate comprises (i) a sensor or (ii) a sensor and an emitter. In some embodiments, the plurality of sensors includes a camera for detecting the first identity and/or the second identity. In some embodiments, the camera comprises a sensor array having at least about 4000 sensors on its fundamental length scale. In some embodiments, the fundamental length scale comprises a length, a width, a radius, or a bounding radius. In some embodiments, the sensor array comprises a sensor comprising a charge-coupled device (CCD). In some embodiments, the sensor system comprises a plurality of sensors, wherein at least some of the plurality of sensors are integrated in one or more device aggregates.
In some embodiments, the sensor system comprises a device aggregate that includes a controller and/or a processor. In some embodiments, the sensor system is operatively coupled to a hierarchical control system comprising a plurality of controllers. In some embodiments, the sensor system includes at least one sensor housed in a device aggregate mounted or located in a fixed structure of the facility (e.g., having a peripheral structure). In some embodiments, the sensor system is located in a controlled portal of the facility (e.g., having a peripheral structure). In some embodiments, the peripheral structure comprises a facility, a building, and/or a room. In some embodiments, at least one sensor of the sensor system is housed in a device aggregate that is mounted to or located in a non-fixed structure in a facility (e.g., having a peripheral structure). In some embodiments, the sensor system is coupled to a network (e.g., a local network) disposed in a facility (e.g., having a peripheral structure). In some embodiments, the network (e.g., local network) includes wiring disposed in the enclosure of the facility (e.g., disposed in the enclosure of a peripheral structure). In some embodiments, a network (e.g., a local network) is configured for power and communications transmission over a single cable. In some embodiments, a network (e.g., a local network) is configured for at least fourth or fifth generation cellular communication. In some embodiments, a network (e.g., a local network) is configured for data transfer including media. In some embodiments, a network (e.g., a local network) is configured to be coupled to at least one antenna. In some embodiments, a network (e.g., a local network) is configured to couple to a plurality of different sensor types.
In some embodiments, a network (e.g., a local network) is configured to be coupled to a building management system configured to manage the atmosphere of a facility (e.g., having a peripheral structure). In some embodiments, a network (e.g., a local network) is configured to be coupled to a tintable window. In some embodiments, a network (e.g., a local network) is configured to be coupled to a hierarchical control system configured to control a facility (e.g., having a peripheral structure).
In another aspect, a method for monitoring surface disinfection includes: (A) Sensing and/or identifying a plurality of temperature samples of a surface of an object at a plurality of sample times using a sensor system; (B) Comparing successive ones of the plurality of temperature samples to generate a comparison result; (C) Detecting a cleaning event when the comparison indicates that the temperature falls below a temperature threshold; (D) Monitoring the elapsed time since the last cleaning event; and (E) Generating a notification when the elapsed time exceeds a time threshold.
In some embodiments, the sensor system is disposed in a facility and is operatively coupled to a local network of the facility. In some embodiments, the local network is configured to control at least one other apparatus of a facility operatively coupled to the local network. In some embodiments, the local network is configured to facilitate control of the facility, for example using a building management system. In some embodiments, the at least one other device (e.g., respectively) comprises at least one other device type. In some embodiments, the sensor system is coupled to a network (e.g., a local network) disposed in a facility (e.g., having a peripheral structure) in which the surface (e.g., an object surface) is disposed. In some embodiments, the network (e.g., local network) includes wiring disposed in the enclosure of the facility (e.g., disposed in the enclosure of a peripheral structure). In some embodiments, the method further comprises transmitting power and communications over a single cable of a network (e.g., a local network). In some embodiments, the method further comprises using a network (e.g., a local network) to transmit at least a fourth or fifth generation cellular communication. In some embodiments, the method further includes using a network (e.g., a local network) to transmit the data including the media. In some embodiments, the method further comprises using a network (e.g., a local network) to control the atmosphere of a facility (e.g., having a peripheral structure). In some embodiments, the method further includes using a network (e.g., a local network) to control a tintable window disposed in a facility (e.g., having a peripheral structure). In some embodiments, the method further comprises using a network (e.g., a local network) to control the facility (e.g., having a peripheral structure). In some embodiments, the sensor system senses the temperature sample remotely. 
In some embodiments, the sample time is repeated according to a predetermined sampling frequency. In some embodiments, the predetermined sampling frequency comprises one sample per interval of at most one minute. In some embodiments, the number of consecutive temperature samples to compare comprises at least about 2, 5, 10, or 20 temperature samples. In some embodiments, the notification is sent to a designated recipient or a requesting recipient. In some embodiments, the sensor system includes at least one sensor integrated in a device aggregate that includes (i) a sensor or (ii) a sensor and an emitter. In some embodiments, the sensor system includes at least one sensor integrated in a device aggregate that includes (i) at least one controller or (ii) at least one processor. In some embodiments, the sensor system is operatively coupled to a hierarchical control system comprising a plurality of controllers.
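The disinfection-monitoring steps can be sketched as follows. This is a hypothetical Python illustration of the disclosed logic, treating a sudden drop of the surface temperature below a threshold (e.g., evaporative cooling from a disinfectant wipe) as a cleaning event; the class name, the 20 °C threshold, and the one-hour time threshold are illustrative assumptions:

```python
# Hypothetical sketch: successive surface-temperature samples are
# compared (step B); a drop below the temperature threshold marks a
# cleaning event (step C); elapsed time since the last cleaning is
# monitored (step D); a notification fires when it exceeds the time
# threshold (step E).
class DisinfectionMonitor:
    def __init__(self, temp_threshold=20.0, time_threshold=3600):
        self.temp_threshold = temp_threshold    # deg C, illustrative
        self.time_threshold = time_threshold    # seconds between cleanings
        self.last_sample = None
        self.last_cleaning_time = None
        self.notifications = []

    def add_sample(self, time, temperature):
        # Compare with the previous sample; a drop that crosses below
        # the threshold indicates a cleaning event.
        if (self.last_sample is not None
                and temperature < self.last_sample
                and temperature < self.temp_threshold):
            self.last_cleaning_time = time
        self.last_sample = temperature
        # Monitor elapsed time and generate a notification when exceeded.
        if (self.last_cleaning_time is not None
                and time - self.last_cleaning_time > self.time_threshold):
            self.notifications.append(time)

monitor = DisinfectionMonitor(temp_threshold=20.0, time_threshold=3600)
monitor.add_sample(0, 22.0)       # ambient surface temperature
monitor.add_sample(60, 18.5)      # drop below threshold -> cleaning event
monitor.add_sample(4000, 22.0)    # > 1 h since cleaning -> notification
```

In a deployment, `notifications` would instead drive a message to the designated or requesting recipient described above.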
In another aspect, non-transitory computer-readable program instructions for monitoring surface disinfection of a facility, when read by one or more processors, cause the one or more processors to perform the operations of any of the methods described above.
In some embodiments, the program instructions are embodied in a non-transitory computer readable medium. In some embodiments, at least two of the operations are performed by one of the one or more processors. In some embodiments, at least two of the operations are each performed by a different processor of the one or more processors.
In another aspect, non-transitory computer-readable program instructions for monitoring surface disinfection of a facility, when read by one or more processors, cause the one or more processors to perform operations comprising: (A) Using or directing use of a sensor system to sense a plurality of temperature samples of a surface of an object at a plurality of sample times, the sensor system disposed in a facility and operatively coupled to a local network of the facility; (B) Comparing or directing a comparison of successive ones of the plurality of temperature samples to generate a comparison result; (C) Detecting or directing detection of a cleaning event when the comparison indicates that the temperature falls below a temperature threshold; (D) Monitoring or directing monitoring of the time elapsed since the last cleaning event; and (E) Generating or directing generation of a notification when the elapsed time exceeds a time threshold. In some embodiments, the one or more processors are operatively coupled to the sensor system.
In some embodiments, the local network is configured to control at least one other device of the facility operatively coupled to the local network. In some embodiments, the program instructions are embodied in a non-transitory computer readable medium. In some embodiments, at least two of the operations are performed by one of the one or more processors. In some embodiments, at least two of the operations are each performed by a different processor of the one or more processors.
In another aspect, an apparatus for monitoring surface disinfection (e.g., of a facility) includes at least one controller (e.g., including circuitry) configured to: (A) Use or direct use of a sensor system to sense and/or identify a plurality of temperature samples of a surface of an object at a plurality of sample times; (B) Compare or direct a comparison of successive ones of the plurality of temperature samples to generate a comparison result; (C) Detect or direct detection of a cleaning event when the comparison indicates that the temperature has dropped below a temperature threshold; (D) Monitor or direct monitoring of the time elapsed since the last cleaning event; and (E) Generate or direct generation of a notification when the elapsed time exceeds a time threshold.
In some embodiments, the at least one controller is operatively coupled to the sensor system. In some embodiments, the at least one controller is configured to control, or direct control of, at least one other device of the facility (e.g., the at least one controller is configured to be operatively coupled with the at least one other device). In some embodiments, the at least one controller is configured to facilitate control of the facility, for example, using a building management system. In some embodiments, the at least one other device (e.g., respectively) comprises at least one other device type. In some embodiments, the sensor system is configured to remotely sense the temperature sample. In some embodiments, the at least one controller is configured to repeat the sample time according to a predetermined sampling frequency. In some embodiments, the predetermined sampling frequency comprises one sample per interval of at most one minute. In some embodiments, the number of consecutive temperature samples to compare comprises at least about 2, 5, 10, or 20 temperature samples. In some embodiments, the at least one controller is configured to send, or direct sending of, a notification to a designated recipient or a requesting recipient. In some embodiments, the sensor system includes at least one sensor integrated in a device aggregate that includes (i) a sensor or (ii) a sensor and an emitter. In some embodiments, the sensor system includes at least one sensor integrated in a device aggregate that includes (i) at least one controller or (ii) at least one processor. In some embodiments, the sensor system is operatively coupled to a hierarchical control system comprising a plurality of controllers. In some embodiments, the sensor system is coupled to a network (e.g., a local network) disposed in a facility (e.g., having a peripheral structure) in which the surface (e.g., an object surface) is disposed.
In some embodiments, the network (e.g., local network) includes wiring disposed in the enclosure of the facility (e.g., disposed in the enclosure of a peripheral structure). In some embodiments, a network (e.g., a local network) is configured for power and communications transmission over a single cable. In some embodiments, a network (e.g., a local network) is configured for at least fourth or fifth generation cellular communication. In some embodiments, a network (e.g., a local network) is configured for data transfer including media. In some embodiments, a network (e.g., a local network) is configured to be coupled to at least one antenna. In some embodiments, a network (e.g., a local network) is configured to couple to a plurality of different sensor types. In some embodiments, a network (e.g., a local network) is configured to be coupled to a building management system configured to manage an atmosphere of a facility (e.g., having a peripheral structure). In some embodiments, a network (e.g., a local network) is configured to couple to a tintable window. In some embodiments, a network (e.g., a local network) is configured to be coupled to a hierarchical control system configured to control a facility (e.g., having a peripheral structure).
In another aspect, a method of detecting a physical characteristic of an individual in a facility (e.g., having a peripheral structure) includes: (a) Sensing and/or identifying environmental characteristics in the presence of the individual on a plurality of occasions using a sensor system; (b) Analyzing (i) a plurality of environmental characteristic data samples and (ii) a threshold indicative of an abnormal physical characteristic to generate an analysis; and (c) Using the analysis to generate a report indicative of the presence and/or absence of an abnormal physical characteristic of the individual.
In some embodiments, the sensor system is disposed in a facility. In some embodiments, the sensor system is operatively coupled to a local network configured to facilitate control of at least one other device of the facility. In some embodiments, the local network is configured to facilitate control of the facility, for example using a building management system. In some embodiments, the at least one other device (e.g., respectively) comprises at least one other device type. In some embodiments, the environmental characteristic is detectably perturbed by the presence of the individual compared to the absence of the individual in the environment (e.g., of a facility (e.g., having a peripheral structure)). In some embodiments, a plurality of environmental characteristic data samples of the individual are collected on a plurality of occasions for quantifying a normal physical characteristic of the individual, wherein analyzing further comprises analyzing a relative difference between a most recent one of the data samples and the quantified normal value, and wherein the threshold is a difference threshold. In some embodiments, the sensor system comprises a plurality of sensors. In some embodiments, the sensor system includes a device aggregate that houses (i) a sensor and/or (ii) a sensor and an emitter. In some embodiments, the sensor system is communicatively coupled to a network (e.g., a local network) disposed in a facility (e.g., having a peripheral structure). In some embodiments, the network (e.g., local network) includes wiring disposed in the enclosure of the facility (e.g., disposed in the enclosure of a peripheral structure). In some embodiments, the method further comprises transmitting power and communications over a single cable of a network (e.g., a local network). In some embodiments, the method further includes transmitting at least a fourth or fifth generation cellular communication using a network (e.g., a local network).
In some embodiments, the method further includes using a network (e.g., a local network) to transmit the data including the media. In some embodiments, the method further comprises using a network (e.g., a local network) to control the atmosphere of a facility (e.g., having a peripheral structure). In some embodiments, the method further includes using a network (e.g., a local network) to control a tintable window disposed in a facility (e.g., having a peripheral structure). In some embodiments, the method further comprises using a network (e.g., a local network) to control the facility (e.g., having a peripheral structure). In some embodiments, the sensor system comprises a plurality of devices disposed in a device aggregate, and wherein the device aggregate comprises a transmitter, a sensor, an antenna, radar, a dispenser, a badge reader, a geolocation technology, an accelerometer, and/or an identification reader. In some embodiments, the sensor system comprises an electromagnetic sensor. In some embodiments, the sensor system includes a first electromagnetic sensor configured to detect a first range of radiation and a second electromagnetic sensor configured to detect a second range of radiation, the second range of radiation having at least one portion that does not overlap with the first range of radiation. In some embodiments, the sensor system comprises an infrared sensor, a visible light sensor, or a depth camera. In some embodiments, the sensor system comprises a webcam. In some embodiments, the sensor system comprises a visible light sensor and a non-visible light sensor. In some embodiments, the non-visible light comprises infrared light. In some embodiments, the sensor system includes a camera configured to distinguish the individual from its surroundings based at least in part on infrared radiation readings and/or visible radiation readings. 
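The baseline-deviation embodiment described above (a most recent data sample compared against an individual's quantified normal value, with a difference threshold) can be sketched in Python. The function name, the example temperatures, and the 0.5 °C difference threshold are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical sketch: samples collected on multiple occasions quantify
# an individual's "normal" characteristic (e.g., skin temperature); the
# most recent sample is flagged when its deviation from that baseline
# exceeds a difference threshold.
def assess_characteristic(history, latest, difference_threshold=0.5):
    """Return (is_abnormal, deviation) for the latest sample."""
    baseline = sum(history) / len(history)   # quantified normal value
    deviation = latest - baseline
    return deviation > difference_threshold, deviation

history = [36.4, 36.5, 36.6, 36.5]           # prior occasions, deg C
abnormal, deviation = assess_characteristic(history, 37.4)
```

The returned flag corresponds to the report of step (c); a deployment would also handle the case of too few prior occasions to form a baseline.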
In some embodiments, using the sensor system comprises measuring at a rate of at least about every 2 seconds (sec), 1sec, 0.5sec, 0.25sec, 0.2sec, 0.15sec, 0.1sec, 0.05sec, or 0.025 sec. The sensor system may include a camera. The camera may be configured to capture at least about 30 frames per second (frm/sec), 20frm/sec, 10frm/sec, 8frm/sec, 6frm/sec, 4frm/sec, or 2 frm/sec. The sensing frequency (e.g., the number of measurements taken per second, such as the number of frames taken per second) may be adjusted (e.g., manually and/or automatically using at least one controller (e.g., as part of a control system)). The adjustment of the sensing rate may depend at least in part on the expected and/or average movement of occupants in the facility. For example, in an office environment, the average rate of movement (e.g., velocity or speed) of occupants may be slower than in other environments. The adjustment of the sensing rate may depend at least in part on an expected and/or average movement of occupants in the space of the facility in which the sensor system is disposed. For example, at an airport or train station, the average rate of movement (e.g., velocity or speed) of occupants in the waiting area may be slower than the average rate of movement of occupants in the transition area (e.g., from a security gate to an airport terminal). The average rate may include a mean, median, or mode of the rate. A higher sensor sampling rate (e.g., a higher sensor measurement rate, such as a higher number of frames taken by the camera per second) may correspond to a higher average movement rate of individuals in the facility or a portion thereof (e.g., a space in the facility). In some embodiments, the individual is positioned at a horizontal distance of at least about 1 foot, 2 feet, or 3 feet from a sensor of the sensor system that senses the environmental characteristic.
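The rate adjustment described above (faster occupant movement warranting a higher sensing rate) can be sketched as a simple mapping. The function, the 1.5 m/s normalization, and the frame-rate bounds are illustrative assumptions; the disclosure does not specify a particular mapping:

```python
# Hypothetical sketch: map the average occupant movement rate in a space
# to a camera frame rate, clamped to the camera's supported range.
def frames_per_second(avg_speed_m_per_s, min_fps=2, max_fps=30):
    # A seated office (~0 m/s) samples slowly; a transit corridor
    # (~1.4 m/s walking speed) samples near the maximum.
    fps = min_fps + (max_fps - min_fps) * min(avg_speed_m_per_s / 1.5, 1.0)
    return round(fps)

office_fps = frames_per_second(0.1)    # mostly stationary occupants
transit_fps = frames_per_second(1.4)   # walking occupants
```

A controller of the control system could periodically re-evaluate this mapping as measured occupant speeds change.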
In some embodiments, a sensor that senses an environmental feature has a horizontal and/or vertical field of view of at least about 45 degrees, 55 degrees, 75 degrees, or 110 degrees, the sensor being included in the sensor system. Sometimes, lower resolution cameras that are sensitive to a range of wavelengths have a larger field of view than higher resolution cameras that are sensitive to the same range of wavelengths. The wavelength range may include visible, ultraviolet, or infrared wavelength ranges. In some embodiments, the method further comprises focusing at least one sensor of the sensor system on one or more facial landmark features of the individual to measure the environmental features. In some embodiments, the facial landmark features include the eyes, eyebrows, and/or nose of the individual. In some embodiments, the method further comprises focusing at least one sensor of the sensor system on a placement depth of the individual to measure the environmental characteristic. In some embodiments, the method further comprises focusing at least one sensor of the sensor system at a horizontal distance from the at least one sensor to measure the environmental characteristic. In some embodiments, the method further comprises focusing the measurements of at least one sensor of the sensor system at least in part by considering (i) at least one facial feature of the individual and/or (ii) a horizontal displacement of the individual relative to the one or more sensors. In some embodiments, the horizontal displacement is determined, at least in part, by using distances between facial landmark features of the individual, depth camera measurements, and/or visible and non-visible electromagnetic sensor measurements.
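Determining horizontal displacement from the spacing of facial landmarks, as mentioned above, is commonly done with a pinhole-camera model. The sketch below is one such approach under stated assumptions: the ~63 mm interpupillary distance is a population average, and the focal length in pixels is a property of a hypothetical camera, not a value from the disclosure:

```python
# Hypothetical sketch: estimate the individual's distance from the sensor
# using the pixel spacing of two facial landmarks (the eyes) and the
# pinhole-camera relation: apparent size shrinks linearly with distance.
def horizontal_distance_m(eye_pixel_distance, focal_length_px=1000,
                          interpupillary_m=0.063):
    return focal_length_px * interpupillary_m / eye_pixel_distance

near = horizontal_distance_m(eye_pixel_distance=63)    # ~1.0 m away
far = horizontal_distance_m(eye_pixel_distance=21)     # ~3.0 m away
```

A depth camera or visible/non-visible sensor fusion, as the disclosure contemplates, could replace or refine this single-landmark estimate.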
In some embodiments, analyzing the plurality of environmental feature data samples includes using a machine learning model that utilizes a learning set that includes measurements when individuals are present, measurements when blackbodies are present, ground truth measurements, and/or simulation measurements. In some embodiments, evaluating the feature comprises filtering the background measurement. In some embodiments, evaluating the feature includes filtering environmental features attributed to the background. In some embodiments, analyzing the plurality of environmental feature data samples comprises using a machine learning model, the machine learning model comprising a regression model or a classification model. In some embodiments, the sensor system is disposed in or attached to a frame. In some embodiments, the sensor system is disposed in a kiosk configured to service at least one user. In some embodiments, the sensor system is provided in a kiosk configured to simultaneously service multiple users located on opposite sides of the kiosk. In some embodiments, the sensor system is provided in a kiosk, and the method further comprises simultaneously servicing a plurality of users located on one side of the kiosk. In some embodiments, the sensor system is disposed in a kiosk that includes a modular unit. In some embodiments, the sensor system is disposed in a kiosk that includes one or more media displays. In some embodiments, the sensor system is provided in a kiosk, and the method further comprises remotely and/or contactlessly interacting with one or more users. In some embodiments, the method further comprises conditionally allowing access to at least a portion of the facility.
In another aspect, a method for detecting a physical characteristic of an individual in a facility comprises detecting the physical characteristic of the individual in the facility, and determining, by one or more processors, a time of day at which the method is performed.
In some embodiments, the program instructions are embodied in a non-transitory computer readable medium. In some embodiments, at least two of the operations are performed by one of the one or more processors. In some embodiments, at least two of the operations are each performed by a different processor of the one or more processors.
In another aspect, non-transitory computer-readable program instructions for detecting a physical characteristic of an individual in a facility, when read by one or more processors, cause the one or more processors to perform operations comprising: (a) using or directing use of a sensor system to sense environmental characteristics in the presence of an individual on a plurality of occasions, the sensor system being disposed in a facility and operatively coupled to a local network; (b) analyzing or directing analysis of (i) a plurality of environmental characteristic data samples and (ii) a threshold indicative of an abnormal physical characteristic to generate an analysis; and (c) using or directing use of the analysis to generate a report indicative of the presence and/or absence of an abnormal physical characteristic of the individual. In some embodiments, the one or more processors are operatively coupled to a sensor system configured to sense and/or identify environmental features.
In some embodiments, the local network is configured to facilitate control of at least one other device of the facility. In some embodiments, the program instructions are embodied in a non-transitory computer readable medium. In some embodiments, at least two of the operations are performed by one of the one or more processors. In some embodiments, at least two of the operations are each performed by a different processor of the one or more processors.
In another aspect, an apparatus for detecting physical characteristics of an individual in a facility (e.g., having a peripheral structure) includes at least one controller (e.g., including circuitry) configured to: (a) be operatively coupled to a sensor system configured to sense and/or identify environmental features; (b) use or direct use of the sensor system to sense and/or identify environmental characteristics in the presence of an individual on a plurality of occasions; (c) analyze or direct analysis of (i) a plurality of environmental characteristic data samples and (ii) a difference threshold indicative of an abnormal physical characteristic to generate an analysis; and (d) use or direct use of the analysis to generate a report indicative of the presence and/or absence of abnormal physical characteristics of the individual.
In some embodiments, the at least one controller is configured to control, or direct control of, at least one other device of the facility. In some embodiments, the at least one controller is configured to control the facility, for example, using a building management system. In some embodiments, the at least one other device (e.g., respectively) comprises at least one other device type. In some embodiments, the environmental characteristic is detectably perturbed by the presence of the individual compared to the absence of the individual in the environment. In some embodiments, the at least one controller is configured to (i) collect or direct collection of a plurality of environmental characteristic data samples of the individual over a plurality of occasions to quantify a normal physical characteristic of the individual, and (ii) analyze or direct analysis of a relative difference between a most recent one of the data samples and the quantified normal value, wherein the threshold is a difference threshold. In some embodiments, the sensor system comprises a plurality of sensors. In some embodiments, the sensor system includes a device aggregate that houses (i) the sensor and/or (ii) the sensor and the emitter. In some embodiments, the sensor system is communicatively coupled to a network (e.g., a local network) disposed in a facility (e.g., having a peripheral structure). In some embodiments, the network (e.g., the local network) includes cables disposed in an enclosure of the facility (e.g., disposed in an enclosure of a peripheral structure). In some embodiments, the at least one controller is configured to transmit or direct transmission of power and communications over a single cable of a network (e.g., a local network). In some embodiments, the at least one controller is configured to use or direct use of a network (e.g., a local network) to transmit at least a fourth or fifth generation cellular communication.
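The baseline-and-difference-threshold scheme described above (quantifying an individual's normal value over a plurality of occasions, then comparing the most recent sample against it) might be sketched as follows; the threshold value, sample units, and function name are illustrative assumptions.

```python
from statistics import mean

def is_abnormal(samples, latest, diff_threshold=0.8):
    """Flag the latest measurement if it deviates from the individual's
    quantified normal value by more than the difference threshold.

    samples: prior environmental-characteristic measurements of the
             individual (e.g., skin temperatures in degrees C) collected
             on a plurality of occasions.
    """
    if not samples:
        return False  # no baseline yet; cannot assess deviation
    normal = mean(samples)  # the individual's quantified normal value
    return abs(latest - normal) > diff_threshold
```

Because the comparison is relative to the individual's own history, it tolerates person-to-person baseline differences better than a single absolute cutoff would.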
In some embodiments, the at least one controller is further configured to transmit data comprising media (e.g., video, presentations, web video and/or web pages) using a network (e.g., a local network). In some embodiments, the at least one controller is configured to use or direct use of a network (e.g., a local network) to control an atmosphere of a facility (e.g., having a peripheral structure). In some embodiments, the at least one controller is configured to use or direct use of a network (e.g., a local network) to control a tintable window disposed in a facility (e.g., having a peripheral structure). In some embodiments, the at least one controller is configured to use or direct use of a network (e.g., a local network) to control a facility (e.g., having a peripheral structure). In some embodiments, the sensor system comprises a plurality of devices disposed in a device aggregate, and wherein the device aggregate comprises a transmitter, a sensor, an antenna, radar, a dispenser, a badge reader, a geolocation technology, an accelerometer, and/or an identification reader. In some embodiments, the sensor system comprises an electromagnetic sensor. In some embodiments, the sensor system includes a first electromagnetic sensor configured to detect a first radiation range and a second electromagnetic sensor configured to detect a second radiation range having at least one portion that does not overlap with the first radiation range. In some embodiments, the sensor system comprises an infrared sensor, a visible light sensor, or a depth camera. In some embodiments, the sensor system comprises a webcam. In some embodiments, the sensor system comprises a visible light sensor and a non-visible light sensor. In some embodiments, the non-visible light comprises infrared light.
In some embodiments, the sensor system includes a camera configured to distinguish the individual from its surroundings based at least in part on infrared radiation readings and/or visible radiation readings. In some embodiments, the at least one controller is configured to use or direct use of the sensor system at least in part by making measurements at a rate of at least about once every 2 seconds or once every 1 second. In some embodiments, the sensor system is configured to measure the environmental characteristic when the individual is positioned at a horizontal distance of at least about 1 foot, 2 feet, or 3 feet from a sensor of the sensor system, the sensor configured to sense and/or identify the environmental characteristic. In some embodiments, a sensor that senses an environmental feature has a horizontal and/or vertical field of view of at least about 45 degrees, 55 degrees, 75 degrees, or 110 degrees, the sensor being included in a sensor system. In some embodiments, at least one sensor of the sensor system is configured to focus on one or more facial landmark features of an individual to measure environmental features. In some embodiments, the facial landmark features include eyes, eyebrows, and/or nose of the individual. In some embodiments, the at least one controller is configured to direct the at least one sensor of the sensor system to focus on a depth at which the individual is disposed to measure the environmental characteristic. In some embodiments, the at least one controller is configured to direct the at least one sensor of the sensor system to focus on a horizontal distance from the at least one sensor to the individual to measure the environmental characteristic.
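For a given horizontal field of view and subject distance, the scene width a sensor covers follows from the geometric relation w = 2 * d * tan(theta / 2). A small illustrative helper (the function name is an assumption):

```python
import math

def coverage_width(distance, fov_degrees):
    """Horizontal width covered by a sensor with the given field of view,
    at the given subject distance (same length unit in and out)."""
    return 2.0 * distance * math.tan(math.radians(fov_degrees) / 2.0)
```

For example, at about 3 feet with a 110 degree field of view, the covered width is roughly 8.6 feet, which illustrates why the wider-angle, lower-resolution cameras mentioned above can monitor a larger area.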
In some embodiments, the at least one controller is configured to direct the at least one sensor of the sensor system to focus the measurements of the at least one sensor at least in part by considering (i) at least one facial feature of the individual and/or (ii) a horizontal displacement of the individual relative to the one or more sensors. In some embodiments, the horizontal displacement is determined at least in part by using distances between facial landmark features of the individual, depth camera measurements, and/or visible and non-visible electromagnetic sensor measurements. In some embodiments, the at least one controller is configured to analyze or direct analysis of the plurality of environmental characteristic data samples at least in part by using a machine learning model that utilizes a learning set that includes measurements when individuals are present, measurements when a blackbody is present, ground truth measurements, and/or simulated measurements. In some embodiments, the at least one controller is configured to evaluate or direct evaluation of the characteristic at least in part by filtering the background measurement. In some embodiments, the at least one controller is configured to evaluate or direct evaluation of the feature at least in part by filtering environmental features attributed to the background. In some embodiments, the at least one controller is configured to analyze or direct analysis of the plurality of environmental feature data samples at least in part by using a machine learning model, the machine learning model comprising a regression model or a classification model. In some embodiments, the sensor system is disposed in or attached to the frame. In some embodiments, the sensor system is disposed in a kiosk configured to service at least one user. In some embodiments, the sensor system is provided in a kiosk configured to simultaneously service multiple users located on opposite sides of the kiosk.
In some embodiments, the sensor system is provided in a kiosk configured to simultaneously service multiple users located on one side of the kiosk. In some embodiments, the sensor system is disposed in a kiosk that includes a modular unit. In some embodiments, the sensor system is disposed in a kiosk that includes one or more media displays. In some embodiments, the sensor system is provided in a kiosk configured to remotely and/or contactlessly interact with one or more users. In some embodiments, the sensor system is provided in a kiosk configured to conditionally allow access to at least a portion of the facility.
In another aspect, a method of detecting occupancy in at least one peripheral structure of a facility, the method comprising: (a) Using a sensor system to sense a physical characteristic of at least one individual disposed in the at least one peripheral structure of the facility over a period of time, the physical characteristic being a characteristic of the individual; (b) Analyzing the sensed physical characteristics over the time period to generate an analysis; and (c) determining occupancy of the at least one peripheral structure of the facility based at least in part on the analysis.
In some embodiments, the sensor system is operatively coupled to a local network of the facility. In some embodiments, the local network is configured to control at least one other device of the facility. In some embodiments, the local network is configured to facilitate control of the facility, for example, using a building management system. In some embodiments, the at least one other device (e.g., respectively) comprises at least one other device type. In some embodiments, the at least one sensor of the sensor system is configured to sense electromagnetic radiation. In some embodiments, the at least one sensor of the sensor system is configured to sense infrared and/or visible radiation. In some embodiments, the at least one sensor of the sensor system is configured to sense a depth of the at least one peripheral structure. In some embodiments, the at least one sensor of the sensor system is configured to sense ultra-wideband frequency radiation. In some embodiments, the at least one sensor of the sensor system is configured to sense movement of the at least one individual. In some embodiments, the method further comprises performing image processing of the body feature measured by the sensor system. In some embodiments, the method further comprises performing a movement analysis of the body feature measured by the sensor system. In some embodiments, the method further comprises performing a movement direction analysis of the body feature measured by the sensor system. In some embodiments, the at least one sensor of the sensor system is disposed at a ceiling of the facility. In some embodiments, the at least one peripheral structure is a plurality of peripheral structures, and wherein determining occupancy of the at least one peripheral structure of the facility comprises determining occupancy in each peripheral structure of the plurality of peripheral structures over time.
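Movement-direction analysis of the kind described above is often implemented, under a ceiling-mounted sensor, as a two-zone crossing counter: an individual triggering an outer zone and then an inner zone is counted as entering, and the reverse sequence as exiting. A minimal sketch under that assumption (the zone labels and event format are hypothetical):

```python
def count_occupancy(events):
    """Track occupancy from ordered zone-trigger events ("A" or "B").

    A non-overlapping ("A", "B") pair is an entry; ("B", "A") is an exit.
    Occupancy never drops below zero (e.g., when an entry was missed).
    """
    occupancy = 0
    i = 0
    while i + 1 < len(events):
        pair = (events[i], events[i + 1])
        if pair == ("A", "B"):      # outer zone, then inner zone: entry
            occupancy += 1
            i += 2
        elif pair == ("B", "A"):    # inner zone, then outer zone: exit
            occupancy = max(0, occupancy - 1)
            i += 2
        else:
            i += 1                  # incomplete crossing; skip one event
    return occupancy
```

For example, the event stream ["A", "B", "A", "B", "B", "A"] counts two entries and one exit, leaving an occupancy of 1. Running one such counter per peripheral structure gives per-room occupancy over time, as the passage above recites.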
In some embodiments, the analysis comprises using a database of occupancy, schedules, and/or scheduling of the facility. In some embodiments, the analysis comprises using a database of occupancy, schedules, and/or scheduling of one or more individuals in the facility.
In another aspect, an apparatus for detecting occupancy in at least one peripheral structure of a facility, the apparatus comprising at least one controller configured to: (a) be operatively coupled to a sensor system; (b) use or direct use of the sensor system to sense a physical characteristic of at least one individual disposed in the at least one peripheral structure of the facility over a period of time, the physical characteristic being a characteristic of the individual; (c) analyze or direct analysis of the sensed physical characteristics over the time period to generate an analysis; and (d) determine or direct a determination of occupancy of the at least one peripheral structure of the facility based at least in part on the analysis.
In some embodiments, the at least one controller is configured to control, or direct control of, at least one other device of the facility. In some embodiments, the at least one controller is configured to control the facility, for example, using a building management system. In some embodiments, the at least one other device (e.g., respectively) comprises at least one other device type. In some embodiments, at least one sensor of the sensor system is configured to sense electromagnetic radiation. In some embodiments, at least one sensor of the sensor system is configured to sense infrared and/or visible radiation. In some embodiments, at least one sensor of the sensor system is configured to sense a depth of the at least one peripheral structure. In some embodiments, at least one sensor of the sensor system is configured to sense ultra-wideband frequency radiation. In some embodiments, at least one sensor of the sensor system is configured to sense movement of the at least one individual. In some embodiments, the at least one controller is configured to perform or direct the performance of image processing of the body feature measured by the sensor system. In some embodiments, the at least one controller is configured to perform or direct performance of movement analysis of the body characteristic measured by the sensor system. In some embodiments, the at least one controller is configured to perform or direct performance of movement direction analysis of the body feature measured by the sensor system. In some embodiments, at least one sensor of the sensor system is disposed at a ceiling of the facility. In some embodiments, the at least one peripheral structure is a plurality of peripheral structures, and wherein determining occupancy of the at least one peripheral structure of the facility comprises determining occupancy in each peripheral structure of the plurality of peripheral structures over time.
In some embodiments, the analysis comprises using a database of occupancy, schedules, and/or scheduling of the facility. In some embodiments, the analysis comprises using a database of occupancy, schedules, and/or scheduling of one or more individuals in the facility.
In some embodiments, the at least one controller disclosed herein includes circuitry (e.g., electrical circuitry). The at least one controller may comprise, or be included in, a processor.
In another aspect, the present disclosure provides methods of using (e.g., for its intended purpose) any of the systems and/or apparatuses disclosed herein.
In another aspect, the present disclosure provides systems, devices (e.g., controllers), and/or non-transitory computer-readable media (e.g., software) that implement any of the methods disclosed herein.
In another aspect, an apparatus comprises at least one controller programmed to direct a mechanism for carrying out (e.g., implementing) any of the methods disclosed herein, wherein the at least one controller is operably coupled to the mechanism.
In another aspect, an apparatus includes at least one controller configured (e.g., programmed) to implement (e.g., realize) the methods disclosed herein. The at least one controller may implement any of the methods disclosed herein.
In some embodiments, a controller of the at least one controller is configured to perform two or more operations. In some embodiments, two different controllers of the at least one controller are configured to each perform different operations.
In another aspect, a system comprises: at least one controller programmed to direct operation of at least one other device (or component thereof); and the device (or components thereof), wherein the at least one controller is operatively coupled to the device (or components thereof). The device (or components thereof) may comprise any device (or components thereof) disclosed herein. The at least one controller may direct any of the devices (or components thereof) disclosed herein.
In another aspect, a computer software product comprises a non-transitory computer-readable medium having program instructions stored therein, which when read by a computer, cause the computer to direct a mechanism disclosed herein to implement (e.g., realize) any method disclosed herein, wherein the non-transitory computer-readable medium is operatively coupled to the mechanism. The mechanism may comprise any of the devices (or any component thereof) disclosed herein.
In another aspect, the present disclosure provides a non-transitory computer-readable medium comprising machine-executable code that, when executed by one or more computer processors, implements any of the methods disclosed herein.
In another aspect, the present disclosure provides a non-transitory computer-readable medium comprising machine-executable code that, when executed by one or more computer processors, implements guidance of a controller (e.g., as disclosed herein).
In another aspect, the present disclosure provides a computer system comprising one or more computer processors and a non-transitory computer-readable medium coupled to the one or more computer processors. The non-transitory computer-readable medium includes machine-executable code that, when executed by one or more computer processors, implements any of the methods disclosed herein and/or implements the guidance of the controller disclosed herein.
In another aspect, the present disclosure provides non-transitory computer-readable program instructions that, when read by one or more processors, cause the one or more processors to perform any of the operations of the methods disclosed herein, perform any of the operations performed by (or configured to be performed by) an apparatus disclosed herein, and/or direct any of the operations directed by (or configured to be directed by) an apparatus disclosed herein.
In some embodiments, the program instructions are embodied in a non-transitory computer readable medium. In some embodiments, at least two of the operations are performed by one of the one or more processors. In some embodiments, at least two of the operations are each performed by a different processor of the one or more processors.
The contents of this summary section are provided as a simplified introduction to the disclosure and are not intended to limit the scope of any invention disclosed herein or the scope of the appended claims.
Other aspects and advantages of the present disclosure will become apparent to those skilled in the art from the following detailed description, wherein only illustrative embodiments of the present disclosure are shown and described. As will be realized, the disclosure is capable of other and different embodiments and its several details are capable of modifications in various obvious respects, all without departing from the disclosure. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
These and other features and embodiments will be described in more detail below with reference to the accompanying drawings.
Incorporation by Reference
All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.
Drawings
The novel features believed characteristic of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings (also referred to herein as "figures"), wherein:
FIGS. 1A and 1B schematically illustrate various peripheral structures in which an individual is placed;
FIGS. 2A and 2B schematically illustrate various peripheral structures in which an individual is placed;
FIGS. 3A-3D schematically illustrate various admission related devices;
FIGS. 4A-4B schematically illustrate various admission related devices in a functional environment;
FIGS. 5A-5B schematically illustrate various admission related devices in a functional environment;
FIGS. 6A-6B schematically illustrate various admission related devices in a functional environment;
FIG. 7 shows an admission flow diagram;
FIG. 8 shows an admission flow diagram;
FIG. 9 shows a schematic example of a sensor arrangement;
FIG. 10 shows a schematic example of a sensor arrangement and sensor data;
FIG. 11 depicts a time-dependent plot of carbon dioxide concentration;
FIG. 12 shows a topographical map of measured property values;
FIG. 13 schematically illustrates a device and its components and connectivity options;
FIG. 14 shows a schematic example of a sensor arrangement and sensor data;
FIG. 15 schematically illustrates the connectivity of devices and components used to detect abnormal conditions and provide reports;
FIG. 16 shows a flow chart for comparing relative sensor data;
FIG. 17 schematically shows devices and interactions for monitoring building occupants;
FIG. 18 illustrates a flow chart relating to contact tracing;
FIG. 19 shows a flow chart relating to movement prediction;
FIG. 20 shows a flow chart relating to monitoring the cleaning status of a surface;
FIG. 21 schematically illustrates a control system and various components thereof;
FIG. 22 shows a flow chart relating to sensor data processing;
FIG. 23 schematically depicts a controller;
FIG. 24 schematically depicts a processing system;
FIG. 25 schematically illustrates a person and a sensor;
FIG. 26 schematically depicts a flow chart for reporting a measured characteristic (e.g., temperature) of a user;
FIGS. 27A-27B schematically depict various sensor ensembles and data processing sequences;
FIG. 28 schematically illustrates various framework systems and users;
FIG. 29 schematically illustrates various frame systems; and
Fig. 30 schematically illustrates various frame system portions.
The drawings and components therein may not be drawn to scale.
Detailed Description
While various embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed.
Terms such as "a", "an", and "the" are not intended to refer to only a single entity, but include the general class of which a specific example may be used for illustration. The terminology herein is used to describe specific embodiments of the invention, but its usage does not limit the invention.
When referring to ranges, ranges are meant to include the endpoints unless otherwise indicated. For example, a range between a value of 1 and a value of 2 is meant to be inclusive and includes both values of 1 and 2. The range, inclusive, will span any value from about the value 1 to about the value 2. As used herein, the term "adjacent" or "adjacent to" includes "immediately adjacent", "abutting", "contacting", and "proximate".
As used herein, the conjunction "and/or" (such as "including X, Y, and/or Z") in the phrases included in the claims refers to including any combination of X, Y, and Z, or multiples of X, Y, and Z. For example, such phrases are meant to include X. For example, such phrases are meant to include Y. For example, such phrases are meant to include Z. For example, such phrases are meant to include X and Y. For example, such phrases are meant to include X and Z. For example, such phrases are meant to include Y and Z. For example, such phrases are meant to include multiple Xs. For example, such phrases are meant to include multiple Ys. For example, such phrases are meant to include multiple Zs. For example, such phrases are meant to include multiple Xs and multiple Ys. For example, such phrases are meant to include multiple Xs and multiple Zs. For example, such phrases are meant to include multiple Ys and multiple Zs. For example, such phrases are meant to include multiple Xs and one Y. For example, such phrases are meant to include multiple Xs and one Z. For example, such phrases are meant to include multiple Ys and one Z. For example, such phrases are meant to include one X and multiple Ys. For example, such phrases are meant to include one X and multiple Zs. For example, such phrases are meant to include one Y and multiple Zs.
The terms "operatively coupled" or "operatively connected" refer to a first element (e.g., a mechanism) coupled (e.g., connected) to a second element to allow for the intended operation of the second element and/or the first element. Coupling may include physical or non-physical coupling. The non-physical coupling may include signal inductive coupling (e.g., wireless coupling). Coupling may include physical coupling (e.g., a physical connection) or non-physical coupling (e.g., via wireless communication).
An element (e.g., a mechanism) that is "configured to" perform a function includes a structural feature that causes the element to perform the function. The structural features may include electrical features such as circuitry or circuit elements. The structural features may include circuitry (e.g., including electrical or optical circuitry). The electrical circuitry may include one or more wires. The optical circuitry may include at least one optical element (e.g., a beam splitter, a mirror, a lens, and/or an optical fiber). The structural feature may comprise a mechanical feature. The mechanical features may include latches, springs, closures, hinges, chassis, supports, fasteners, or cantilevers, etc. Performing the function may include utilizing the logic feature. The logic features may include programming instructions. The programming instructions are executable by at least one processor. The programming instructions may be stored or encoded on a medium accessible by one or more processors.
In some embodiments, the peripheral structure includes a region defined by at least one structure. The at least one structure may include at least one wall. The peripheral structure may include and/or surround one or more sub-peripheral structures. The at least one wall may comprise metal (e.g., steel), clay, stone, plastic, glass, stucco (e.g., gypsum), polymer (e.g., polyurethane, styrene, or vinyl), asbestos, fiberglass, concrete (e.g., reinforced concrete), wood, paper, or ceramic. The at least one wall may comprise wires, bricks, blocks (e.g., cinder blocks), tiles, drywall, or framing (e.g., steel framing).
In some embodiments, a plurality of devices are operatively (e.g., communicatively) coupled to a control system. The plurality of devices may be disposed in a facility (e.g., which includes a building and/or a room). The control system may include a hierarchy of controllers. The device may include an emitter, a sensor, or a (e.g., tintable) window (e.g., IGU). The device may be any device disclosed herein. At least two of the plurality of devices may be of the same type. For example, two or more IGUs may be coupled to the control system. At least two of the plurality of devices may be of different types. For example, a sensor and an emitter may be coupled to the control system. Sometimes, the plurality of devices may include at least about 20, 50, 100, 500, 1000, 2500, 5000, 7500, 10000, 50000, 100000, or 500000 devices. The plurality of devices may be any number between the aforementioned numbers (e.g., from about 20 devices to about 500000 devices, from about 20 devices to about 50 devices, from about 50 devices to about 500 devices, from about 500 devices to about 2500 devices, from about 1000 devices to about 5000 devices, from about 5000 devices to about 10000 devices, from about 10000 devices to about 100000 devices, or from about 100000 devices to about 500000 devices). For example, the number of windows in a floor may be at least about 5, 10, 15, 20, 25, 30, 40, or 50. The number of windows in a floor may be any number between the above numbers (e.g., from 5 to 50, from 5 to 25, or from 25 to 50). Sometimes, these devices may be located in a multi-storey building. At least a portion of the floors of the multi-storey building may have devices controlled by the control system (e.g., at least a portion of the floors of the multi-storey building may be controlled by the control system). For example, a multi-storey building may have at least about 2, 8, 10, 25, 50, 80, 100, 120, 140, or 160 floors controlled by the control system.
The number of floors (e.g., devices therein) controlled by the control system can be any number between the above numbers (e.g., from about 2 to about 50, from about 25 to about 100, or from about 80 to about 160). The floor may have an area of at least about 150 m², 250 m², 500 m², 1000 m², 1500 m², or 2000 square meters (m²). The floor may have an area between any of the aforementioned values (e.g., from about 150 m² to about 2000 m², from about 150 m² to about 500 m², from about 250 m² to about 1000 m², or from about 1000 m² to about 2000 m²).
In some embodiments, the peripheral structure includes one or more openings. The one or more openings may be reversibly closable. The one or more openings may be permanently open. The fundamental length scale of the one or more openings may be small relative to the fundamental length scale of the walls defining the peripheral structure. The fundamental length scale may include a diameter of a bounding circle, a length, a width, or a height. The fundamental length scale may be abbreviated herein as "FLS". The surface of the one or more openings may be small relative to the surface of the walls defining the peripheral structure. The opening surface may be a certain percentage of the total wall surface. For example, the opening surface may be at most about 30%, 20%, 10%, 5%, or 1% of the wall surface. The walls may include floors, ceilings, or side walls. The closable opening may be closed by at least one window or door. The peripheral structure may be at least a portion of a facility. The peripheral structure may comprise at least a portion of a building. The building may be a private building and/or a commercial building. A building may include one or more floors. The building (e.g., a floor thereof) may include at least one of: a room, a hallway, a rooftop, a basement, a balcony (e.g., an interior or exterior balcony), a stairwell, an aisle, an elevator shaft, a façade, a mezzanine, an attic, a garage, a porch (e.g., an enclosed porch), a balcony (e.g., an enclosed balcony), a cafeteria, and/or a duct. In some embodiments, the peripheral structure may be stationary and/or movable (e.g., a train, an airplane, a ship, a vehicle, or a rocket).
In some embodiments, the peripheral structure encloses an atmosphere. The atmosphere may include one or more gases. The gas may include an inert gas (e.g., argon or nitrogen) and/or a non-inert gas (e.g., oxygen or carbon dioxide). The peripheral structure atmosphere may be similar to the atmosphere outside the peripheral structure (e.g., the ambient atmosphere) in at least one external atmospheric characteristic, including: temperature, relative gas content, gas type (e.g., humidity and/or oxygen content), debris (e.g., dust and/or pollen), and/or gas velocity. The peripheral structure atmosphere may differ from the atmosphere outside the peripheral structure in at least one external atmospheric characteristic, including: temperature, relative gas content, gas type (e.g., humidity and/or oxygen content), debris (e.g., dust and/or pollen), and/or gas velocity. For example, the peripheral structure atmosphere may be less humid (e.g., drier) than the external (e.g., ambient) atmosphere. For example, the peripheral structure atmosphere may contain the same (e.g., or substantially similar) oxygen-to-nitrogen ratio as the atmosphere outside the peripheral structure. The velocity of the gas in the peripheral structure may be (e.g., substantially) similar throughout the peripheral structure. The velocity of the gas in the peripheral structure may be different in different portions of the peripheral structure (e.g., by flowing the gas through a vent coupled to the peripheral structure).
Certain disclosed embodiments provide a network infrastructure in a peripheral structure (e.g., a facility such as a building). The network may be installed when the envelope of the peripheral structure (e.g., a building) is constructed and/or prior to constructing the interior of the peripheral structure. The network may include wiring (e.g., cables) located in the envelope of the peripheral structure (e.g., in an exterior wall of a building). The network infrastructure may be used for various purposes, such as providing communication and/or power services. The communication services may include high-bandwidth (e.g., wireless and/or wired) communication services. The communication services may be available to occupants of the facility and/or to users outside the facility (e.g., building). The network infrastructure may operate in conjunction with, or as a partial replacement for, the infrastructure of one or more cellular carriers. The network infrastructure may be provided in a facility comprising electrically switchable windows. Examples of components of the network infrastructure include a high-speed backhaul. The network infrastructure may include at least one cable, switch, physical antenna, transceiver, sensor, transmitter, receiver, radio, processor, and/or controller (which may include a processor). The network infrastructure may be operatively coupled to and/or include a wireless network. The network infrastructure may include wiring. One or more sensors may be deployed (e.g., installed) in the environment as part of installing the network and/or after the network is installed. The network may be an encrypted network. The network may include one or more levels of encryption.
In some embodiments, the network infrastructure is operatively coupled to one or more sensors disposed in the peripheral structure. The sensor may be configured to detect a characteristic of the environment that is affected by individuals present in the peripheral structure, and/or any gradient of that characteristic in the environment. The sensor may be capable of sensing and/or identifying an abnormal effect of the individual on an environmental characteristic (e.g., residual heat, sweat, VOCs, and/or carbon dioxide emissions). In some embodiments, the sensor is configured to sense and/or identify (e.g., directly) one or more physical characteristics (e.g., heat, sweat (e.g., humidity), VOCs (e.g., odor), and/or carbon dioxide) emitted and/or excreted by the individual. The physical characteristic may include sebum or urine. The VOCs can be reaction products of compounds secreted by the individual, for example, formaldehyde or (e.g., cyclic) methylsiloxanes. In some embodiments, the sensor is configured to sense and/or identify one or more physical characteristics of the individual (e.g., directly sense the individual). The sensors may be operatively coupled to a network (e.g., the network infrastructure disclosed herein). The sensed data may be analyzed and/or recorded. The analyzed data may be used to provide reports and/or alerts. For example, an alert may be sent if the sensed data triggers a rule. A rule may test whether the data is outside a threshold window, within a threshold window, above a threshold, or below a threshold. The threshold may be a value or a function. The threshold may evolve over time, for example, as more data accumulates. The rules may be predetermined, for example, by the user, by a third party, by global and/or jurisdictional standards, or by any combination thereof.
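The rule logic above (threshold windows that may be one-sided, and that may trigger when data falls inside or outside the window) can be sketched as follows. This is an illustrative sketch only; the `ThresholdRule` class and the example temperature window are assumptions, not part of the disclosed system:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ThresholdRule:
    """A rule flagging sensed data against a threshold window.

    `low` and/or `high` may be None, expressing one-sided rules
    (above-threshold only, or below-threshold only).
    """
    low: Optional[float] = None
    high: Optional[float] = None
    inside_is_alert: bool = False  # True: alert when value is *within* the window

    def alert(self, value: float) -> bool:
        lo_ok = self.low is None or value >= self.low
        hi_ok = self.high is None or value <= self.high
        inside = lo_ok and hi_ok
        return inside if self.inside_is_alert else not inside

# Hypothetical example: alert when a sensed temperature leaves a normal window.
temperature_rule = ThresholdRule(low=35.5, high=37.5)
print(temperature_rule.alert(36.8))  # in window -> no alert (False)
print(temperature_rule.alert(38.2))  # above window -> alert (True)
```

An evolving threshold, as described, could be modeled by periodically replacing `low`/`high` from newly accumulated data rather than keeping them fixed.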
In some embodiments, the abnormal physical characteristic is associated with a disease. The disease may involve inflammation. Diseases may include infections associated with pathogens (e.g., bacteria or viruses). Pathogens may cause common cold, pneumonia, or flu-like symptoms. The pathogen may include streptococcus. Pathogens may include rhinoviruses, influenza viruses, or coronaviruses. The disease may involve an infectious disease. The disease may involve respiratory and/or inflammatory diseases. The disease may involve epithelial tissue and/or neuronal infectious diseases. The disease may involve a severe and/or acute illness. The disease may involve nasal, ear, and/or throat infections. The disease may be transmissible (e.g., by an individual). The disease may infect the bronchi and/or lungs. The disease may infect one or more internal organs and/or the blood. The disease may include pneumonia or influenza. The disease may be caused by, for example, microorganisms invading at least one body tissue. The disease may be infectious with an R₀ factor of at least about 0.5, 1, 1.5, 2, 2.5, or 3.
In some embodiments, one or more sensors may transmit analyzed data, reports, and/or alarms. The transmission may be to an individual and/or to a user. For example, the report may be transmitted to interested parties. The interested party may be a competent authority, a facility owner, a user, and/or the individual being tested. The data may be stored, analyzed, and/or reported in compliance with jurisdictional rules and regulations. The reports, notifications, alerts, and/or data may be transmitted to wired and/or wireless devices (e.g., cell phones). For example, the individual being tested may receive an audible, vibratory, pictorial, or written notification (e.g., an alarm). The transmission may include encoding. The code may include one signal for an abnormal characteristic (e.g., indicating a diseased individual) and a second signal for a normal characteristic (e.g., indicating a healthy individual). For example, a person's cell phone may vibrate once when the temperature sensor does not sense an increase in the person's temperature, and vibrate twice when the temperature sensor senses an increase in the person's temperature.
In some embodiments, the sensor is disposed in an environment. The sensor may be attached to (or disposed on) at least one wall, ceiling, or floor of the peripheral structure. In an example, the sensor or sensor ensemble may be disposed on or in the floor. The sensor ensemble may include weight sensors, movement sensors and accelerometers, audio sensors, or optical sensors. At least one sensor in the ensemble may be capable of sensing and/or identifying the presence of the individual, for example, by sensing the weight of the individual, by sensing the sound of the individual, by sensing the blocking of light by the individual, or by sensing a proximate object associated with the individual. For example, when an individual stands on a platform coupled to a weight sensor, the weight sensor may be able to sense and/or identify a weight associated with the individual. For example, when an individual blocks light (e.g., IR light) sensed by an optical sensor, the optical sensor may be able to sense and/or identify the blocking of the light. The requested location (e.g., test station) of the individual may be marked (e.g., in the form of a shape such as a circle, a square, or the letter X) on the floor. The sensor may be mounted on a door (e.g., a door frame) or a window (e.g., a window frame). The peripheral structure may facilitate testing of a single individual or multiple individuals. A single test station (e.g., a kiosk) or multiple test stations may be provided in the peripheral structure. The test stations in the plurality of test stations may be far enough apart to prevent contamination between individuals in the same room (e.g., to prevent the individuals from contaminating one another). The distance may be predefined and/or adjustable, and/or defined according to jurisdictional, global, and/or third-party standards. The individual may walk freely, pass through the designated test area, and/or remain there for a period of time.
The time that an individual may need to remain (e.g., substantially) stationary over a test area may be at most about 1 second (sec), 3 sec, 5 sec, 10 sec, 30 sec, or 60 sec. The time that an individual may need to remain (e.g., substantially) stationary over a test area may be any value between the above-described values (e.g., about 1 sec to about 60 sec, about 1 sec to about 5 sec, about 5 sec to about 30 sec, or about 30 sec to about 60 sec). An individual may pass the sensor and may be tested on each pass. This may allow an individual to be automatically tested multiple times during a day, depending on how many times the individual passes the testing station (e.g., passes the sensor). Multiple test results may increase confidence in the results, decrease error values, and/or enable identification of relative changes between test results obtained at different times. If a sensor has high fidelity, it may be able to provide trend information in absolute and/or relative terms for an individual's characteristics, for example, over a certain period of time, such as at least one hour, day, week, month, or year. If a sensor is uncalibrated or miscalibrated, but its results are consistent over a period of use, it may be able to provide trend information in relative terms for an individual's characteristics, for example, over a certain period of time, such as at least one hour, day, week, month, or year.
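A minimal sketch of deriving relative trend information from repeated, possibly uncalibrated readings, as described above. The function name and the sample readings are hypothetical; the point is that a consistent (even uncalibrated) sensor still supports relative comparisons:

```python
import statistics

def relative_trend(readings):
    """Given repeated readings of one individual's characteristic over time,
    return the baseline mean of all earlier readings and the relative change
    of the latest reading versus that baseline.

    Absolute calibration is not required: as long as the sensor is consistent
    over the period of use, the relative change is meaningful.
    """
    *baseline, latest = readings
    base = statistics.mean(baseline)
    return base, (latest - base) / base

# Hypothetical: four passes by the test station during a day (arbitrary units).
base, change = relative_trend([100.2, 99.8, 100.0, 104.0])
print(round(base, 1), round(change, 3))  # baseline 100.0, latest up by 4%
```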
Fig. 1A shows a vertical cross-sectional example of a peripheral structure 100 (e.g., a room) in which an individual 101 is present, the peripheral structure having an environment 102 (e.g., the atmosphere) and a test platform 111. The peripheral structure has an inlet direction 104 and an outlet direction 105. The individual may enter the peripheral structure from the entrance direction and exit the peripheral structure through the exit direction. The individual 101 may wait on the test platform 111 for a period of time (e.g., as indicated herein) while the sensor senses and/or identifies environmental characteristics, or may naturally walk through the test station. The individual 101 may have a communication device 106 that may receive an analysis of the test. The communication device may be communicatively coupled to a network (e.g., a network infrastructure) of the facility in which the peripheral structure is disposed. The communication device may be a mobile device (e.g., a cellular phone, iPad, or laptop). The communication device may be a handheld device. The network may be configured to support at least fourth or fifth generation cellular network communications.
The testing station (e.g., kiosk) may contain one or more sensors, for example, to perform environmental localization and/or to detect the presence of individuals. The test station may be free of at least one sensor that performs environmental localization. The test station may be free of at least one sensor that detects the presence of an individual. For example, a weight sensor may detect the presence of an individual and be disposed in the test platform. A temperature sensor that can detect the ambient temperature in the peripheral structure may be disposed in the test platform or elsewhere in the peripheral structure. For example, an optical sensor (e.g., a sensor sensitive to electromagnetic radiation such as in the infrared and/or visible range) may detect the presence of individuals and be disposed in the peripheral structure external to the platform (e.g., not in the test platform). The platform may be free of the one or more sensors. The testing platform may guide the individual into a position that facilitates (e.g., optimizes) performing the test. Access to the peripheral structure (e.g., a door) may be closed and/or prevented during testing. The sensors may be located at a fixed structure of the facility (e.g., a test station, an entrance lobby, or a meeting room). The fixed structure may comprise a frame, a wall, a ceiling, or a floor. The frame may comprise a door frame or a window frame.
In some embodiments, the occupancy sensor is operatively coupled to a network. The occupancy sensor may be configured to sense and/or identify movement of an occupant in (or entering) the facility, electromagnetic radiation associated with the occupant, and/or an identification tag thereof. For example, the sensor may comprise a visible-light or IR sensor. At times, the occupancy sensors may not be able to identify the identity of the occupant. The IR sensor may detect a thermal signature of the individual. The IR sensor may be part of a sensor system (e.g., a device ensemble). The data of the IR sensor may be integrated with visual data and/or motion data. The sensor system may be operatively coupled to a network and/or control system of the facility. Occupancy sensors (e.g., as part of a sensing system) may be used to determine the density (e.g., over time) of individuals in, and/or users of, the peripheral structure of the facility in which the occupancy sensors are disposed. The occupancy sensor may be operatively coupled to at least one processor. The occupancy sensor may be configured to detect movement of an occupant. For example, the occupancy sensor may detect electromagnetic radiation (e.g., IR radiation, such as forming a thermal signature of an individual) emitted and/or reflected by an occupant, which detection may occur over time. In some embodiments, using an occupancy sensor (e.g., as part of a sensor system, such as an ensemble of devices) includes measuring at a rate of at least about every 2 seconds (sec), 1 sec, 0.5 sec, 0.25 sec, 0.2 sec, 0.15 sec, 0.1 sec, 0.05 sec, or 0.025 sec. The occupancy sensor may comprise a camera. The camera may be configured to capture at least about 30 frames per second (frm/sec), 20 frm/sec, 10 frm/sec, 8 frm/sec, 6 frm/sec, 4 frm/sec, or 2 frm/sec.
The sensing frequency (e.g., the number of measurements taken per second, such as the number of frames taken per second) may be adjusted (e.g., manually and/or automatically using at least one controller (e.g., as part of a control system)). The camera may include an IR and/or visible-light camera. The sensor system may be configured to detect the position of an occupant (e.g., using GPS or any other anchored geolocation technique), for example, relative to the facility, relative to the peripheral structure in which the occupant is located, and/or in absolute coordinates. Movement (e.g., including direction of movement) of an occupant may be determined by detecting the occupant over time and in space. Determining the direction of movement of the occupant may utilize a tracking algorithm. Determining occupancy and/or movement of an occupant may be performed to an accuracy of at least about 80%, 85%, 90%, 95%, or 99%. The tracking algorithm may be one such as disclosed in "Simple object tracking with OpenCV" by Adrian Rosebrock, published on July 23, 2018 and available at https://www.pyimagesearch.com/2018/07/23/simple-object-tracking-with-opencv/, which is incorporated herein by reference in its entirety.
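A minimal sketch of nearest-centroid tracking in the spirit of the referenced tutorial, simplified to omit disappearance timeouts and bounding-box handling; the class name, the distance threshold, and the sample centroids are illustrative assumptions:

```python
import math

class CentroidTracker:
    """Match each detected occupant centroid in a new frame to the nearest
    previously tracked centroid; unmatched detections receive new IDs.
    (A sketch only -- a full tracker also ages out disappeared objects.)
    """
    def __init__(self, max_distance=50.0):
        self.next_id = 0
        self.objects = {}  # object id -> (x, y) centroid
        self.max_distance = max_distance

    def update(self, centroids):
        assigned = {}
        free = dict(self.objects)  # not-yet-matched objects from last frame
        for c in centroids:
            # nearest existing object, if close enough; otherwise a new ID
            best = min(free, key=lambda i: math.dist(free[i], c), default=None)
            if best is not None and math.dist(free[best], c) <= self.max_distance:
                assigned[best] = c
                del free[best]
            else:
                assigned[self.next_id] = c
                self.next_id += 1
        self.objects = assigned
        return assigned

tracker = CentroidTracker()
tracker.update([(10, 10), (200, 50)])              # frame 1: IDs 0 and 1
positions = tracker.update([(14, 12), (205, 55)])  # frame 2: same IDs, moved
print(positions)  # {0: (14, 12), 1: (205, 55)}
```

Because IDs persist across frames, the per-ID displacement between frames yields the direction of movement described in the text.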
Fig. 1B shows an example of a vertical cross-section of a peripheral structure 150 (e.g., a room) in which individuals 151, 152, and 153 are present, the peripheral structure having an environment 170 (e.g., the atmosphere) and test platforms 161, 162 on which the individuals stand and/or pass. The peripheral structure has an inlet direction 181 and an outlet direction 182. An individual may enter the peripheral structure from the entrance direction and exit the peripheral structure through an opening (e.g., a door) in the exit direction. Fig. 2A shows a vertical cross-sectional example of a peripheral structure 200 (e.g., a room) in which an individual 201 is present, the peripheral structure having an environment 202 (e.g., the atmosphere) and a sensor 211. The peripheral structure has an inlet direction 204 and an outlet direction 205. The individual may enter the peripheral structure from the entrance direction and exit through the exit direction. The individual 201 may enter through an opening (e.g., a door) to which one or more sensors 211 are attached so that the sensors can sense environmental characteristics. The individual may naturally walk past the sensor 211 or wait for a period of time. The individual 201 may have a communication device 206 that may receive an analysis of the test.
Fig. 2B shows an example of a vertical cross-section of a peripheral structure 250 (e.g., a room) in which individuals 251, 252, and 253 are present, the peripheral structure having an environment 270 (e.g., the atmosphere) and a sensor 261. Individuals 251, 252, and 253 may enter through an opening (e.g., a door) to which one or more sensors 261 are attached so that the sensors may sense environmental characteristics. The peripheral structure has an inlet direction 281 and an outlet direction 282. An individual may enter the peripheral structure from an entrance direction and exit the peripheral structure through an opening (e.g., a door) through an exit direction.
In some embodiments, the control system is operatively (e.g., communicatively) coupled to an ensemble of devices (e.g., sensors and/or emitters). The ensemble facilitates control of the environment and/or alarms. Control may utilize a control scheme, such as feedback control or any other control scheme described herein. The ensemble may include at least one sensor configured to sense and/or identify electromagnetic radiation. The electromagnetic radiation may be (human-)visible, infrared (IR), or ultraviolet (UV) radiation. The at least one sensor may comprise an array of sensors. For example, the ensemble may include an array of IR sensors (e.g., a far-infrared thermal array, such as the far-infrared thermal array of Melexis). The IR sensor array may have a resolution of at least 32 x 24 pixels or 640 x 480 pixels. The IR sensor may be coupled to a digital interface. The ensemble may include an IR camera. The ensemble may comprise a sound detector. The ensemble may comprise a microphone. The ensemble may include any of the sensors and/or emitters disclosed herein. The ensemble may comprise CO₂, VOC, temperature, humidity, electromagnetic (e.g., light), pressure, and/or noise sensors. The sensor may comprise a gesture sensor (e.g., an RGB gesture sensor), an accelerometer, or an acoustic sensor. The sound sensor may include an audio decibel level detector. The sensor may comprise a meter driver. The ensemble may include a microphone and/or a processor. The ensemble may include a camera (e.g., a 4K-pixel camera), a UWB sensor and/or transmitter, a UHF sensor and/or transmitter, a Bluetooth Low Energy (abbreviated herein as "BLE") sensor and/or transmitter, and/or a processor. The camera may have any of the camera resolutions disclosed herein. One or more of the devices (e.g., sensors) may be integrated on a chip. The sensor ensemble may be used to determine the presence, number, and/or identity of occupants in the peripheral structure (e.g., using a camera).
The sensor ensemble may be used to control (e.g., monitor and/or adjust) one or more environmental features in the surrounding structural environment.
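To illustrate how a far-IR thermal array of the stated resolution (32 x 24 pixels) might reveal an occupant's thermal signature, here is a hedged sketch: the frame values, the hot-pixel threshold, and the helper name are assumptions for illustration, not a vendor API:

```python
import random

# A hypothetical 32 x 24 far-infrared thermal frame (the resolution named in
# the text for a Melexis-style far-IR array), as rows of temperatures in degC.
random.seed(0)
frame = [[21.0 + random.uniform(-0.5, 0.5) for _ in range(32)] for _ in range(24)]
frame[10][16] = 36.4  # warm pixels where an occupant is present (simulated)
frame[11][16] = 36.1

def hot_pixels(frame, threshold_c=30.0):
    """Return (row, col, temp) for pixels above a presence threshold."""
    return [(r, c, t) for r, row in enumerate(frame)
            for c, t in enumerate(row) if t > threshold_c]

occupied = hot_pixels(frame)
print(len(occupied))  # 2 warm pixels -> a thermal signature was detected
```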
The sensor attached to the opening through which the individual passes (e.g., the door and/or window) may comprise one or more sensors. The one or more sensors may perform environmental localization and/or detect the presence of an individual. The sensor attached to the opening may be free of at least one sensor that performs the environmental localization. The sensor attached to the opening may be free of at least one sensor that detects the presence of an individual. For example, a sensor that detects an environmental characteristic (e.g., temperature) affected by the individual may be attached to the opening or provided elsewhere in the peripheral structure. For example, a sensor (e.g., an optical sensor) that senses the presence of an individual may be attached to the opening or disposed elsewhere in the peripheral structure. A system that detects environmental characteristics affected by an individual may be free of any sensors that do not detect environmental characteristics. Sensors that do not detect environmental characteristics may detect the presence of an individual (e.g., a weight sensor or a camera). In some embodiments, the sensor that detects the environmental characteristic also detects the presence of an individual (e.g., an infrared sensor, a humidity sensor, or a carbon dioxide sensor). In some embodiments, one or more openings in the peripheral structure remain at least partially open during testing. For example, the door may remain open while the sensor senses the temperature of the environment of the opening (e.g., and thus any disturbance thereof). The perturbation may be caused by an individual. An individual may have a typical pattern of perturbing environmental characteristics (e.g., within normal and/or abnormal ranges).
In some embodiments, a sensor that detects whether an individual affects the environment in a way that indicates normal and/or abnormal personal characteristics is sufficient, e.g., without the additional need for a sensor dedicated to indicating the presence or absence of an individual. For example, a high-fidelity (e.g., high-resolution) sensor that measures a characteristic of an individual may be sufficient. For example, a sensor that detects a (e.g., significant) perturbation of ambient environmental characteristics may be sufficient. The high-fidelity sensor may have more than one single-pixel detector. The high-fidelity sensor may comprise a plurality of single-pixel detectors (e.g., an array). The high-fidelity sensor may be configured to detect a span of values of the detectable characteristic, for example, an amplitude span (e.g., range), or a span of wavelengths (e.g., including multiple wavelengths). For example, the thermal detector may be an optical detector (e.g., an infrared (IR) detector) and may be capable of detecting a plurality of IR wavelengths (e.g., within a wavelength range). The perturbation signature may have a high signal-to-noise ratio, being detectably different from the signal under ambient conditions. In some embodiments, the sensors that detect the environmental characteristic (e.g., temperature) form a sensor array. The sensor array may be included in an ensemble. The ensemble and/or the sensors therein may have a small FLS and a small form factor (e.g., volume-to-height ratio), may be low cost, may support Bluetooth (e.g., include a Bluetooth adapter), and/or may require low power. The small FLS may be at most about 20 millimeters (mm), 10 mm, 8 mm, 6 mm, 5 mm, 4 mm, 3 mm, 2 mm, or 1 mm. The small FLS can be any value between the above-described values (e.g., about 20 mm to about 1 mm, about 20 mm to about 8 mm, or about 8 mm to about 1 mm).
The low power required by the ensemble and/or the sensors therein may be at most about 1500 microwatts (μW), 1000 μW, 800 μW, 500 μW, or 250 μW. The low power required by the ensemble and/or the sensors therein can be between any of the above power values (e.g., about 1500 μW to about 250 μW, about 1500 μW to about 500 μW, or about 500 μW to about 250 μW).
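The perturbation-signature criterion described above (a reading detectably different from the ambient signal, i.e., a high signal-to-noise deviation) might be implemented as follows. The k-sigma rule, function name, and sample values are illustrative assumptions:

```python
import statistics

def perturbation_detected(ambient_samples, reading, k=5.0):
    """Flag a reading whose deviation from the ambient baseline exceeds
    k standard deviations of the ambient noise -- a simple signal-to-noise
    criterion for 'detectably different from ambient conditions'.
    """
    mu = statistics.mean(ambient_samples)
    sigma = statistics.stdev(ambient_samples)
    return abs(reading - mu) > k * sigma

# Hypothetical ambient temperature samples (degC) with an individual absent:
ambient = [21.0, 21.1, 20.9, 21.0, 21.2, 20.8]
print(perturbation_detected(ambient, 21.05))  # within ambient noise -> False
print(perturbation_detected(ambient, 23.5))   # individual perturbs it -> True
```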
In some embodiments, the test station comprises a barrier. The barrier may selectively allow an individual to pass therethrough. The barrier may include one or more doors that (i) block an occupant from walking through (e.g., block the occupant's passage) when the doors are closed, or (ii) allow the occupant to walk through the doors when the doors are open. The door may comprise a transparent material or an opaque material. The door may comprise a glass, polymer, or metal material. The door may rotate about a hinge. The door may slide, for example, on bearings. The door may be, for example, a pivoting door. The door may be operatively coupled (e.g., connected) to one or more posts. The door may be operatively coupled (e.g., connected) to a divider. The divider may facilitate separating occupants. The divider may include a cavity. The cavity may include one or more circuitries. The door may be opened and closed automatically and/or reversibly. The closing and opening of the door may be controlled by one or more controllers (e.g., as disclosed herein). The door may be operatively coupled to an actuator configured to facilitate opening and/or closing of the door, for example, upon receiving a signal from the controller. The actuator may be operatively coupled to the controller, for example, via a network. The divider may or may not include at least one controller configured to control the opening and/or closing of the door. The divider may include an actuator. The barrier apparatus may include a security gate, a rod, a gate such as a swing gate (e.g., a tri-roll gate), and/or a revolving gate. The swing gate may replace or be added to the door. The rod and/or gate may comprise any of the materials disclosed herein (e.g., gate materials). The barrier may comprise an access control barrier. The gate may be a fraction of the average individual height. The fraction may be at least about 0.25, 0.3, 0.4, 0.5, 0.75, 1, 1.25, or 1.5 times the average individual height.
Fig. 3A shows an example of a door 310 coupled to a divider 311 and a door 312 coupled to a divider 316 by a hinge 313. The divider 316 includes compartments 314 and 315, which may include devices for authentication of, and/or interaction with, an individual. The cavity may include one or more sensors, for example, to sense one or more characteristics of the environment and/or one or more characteristics of the individual, for example, as the individual passes along and/or stands alongside the compartment. For example, the sensor may sense the passage of an individual and/or the identity of an individual. The compartment may include a signaling emitter, such as a sound and/or light emitter. The signaling may indicate whether an individual is allowed to pass through the door, or whether the door mechanism is malfunctioning. For example, a red light may signal that entry is not authorized, a green light may signal that entry is authorized, and an orange light may signal a fault. The sound may include a mechanical sound (e.g., a buzz) or a recorded sound (e.g., a human voice). The sound may comprise an audibly read message (e.g., a sentence). The divider may be fixed. The divider may include a non-corrugated surface (e.g., a smooth surface), such as surface 317. The divider may include a rough and/or corrugated surface, such as 318. The divider may include one or more slits and/or holes. The divider may include one or more lighting devices (e.g., disposed in the compartment). The slits and/or holes may facilitate transmission of environmental characteristics from the ambient environment to sensors disposed in the divider, and/or transmission of emissions (e.g., sound and/or light) traveling from the interior of the divider into the ambient environment.
In some embodiments, the test station includes one or more frames. The frame may receive an object. The frame and/or the objects in the frame may be modular. For example, the relative positions of the frame and/or the objects in the frame may change. The objects in the frame may include windows (e.g., tintable windows), panels, and/or display constructs. The display construct may include a monitor (e.g., a computer or television monitor). The display construct may include an array of light-emitting diodes (LEDs), such as an array of organic LEDs. The organic LED array may be at least partially transparent (T-OLED). Examples of DISPLAY constructs, tintable WINDOWs, their operation AND control can be found in U.S. provisional patent application serial No. 62/975,706, filed on February 12, 2020, which is incorporated herein by reference in its entirety. The display construct may be provided in the frame as an object, or in a separate frame. The display construct may be disposed on a window (e.g., a tintable window). The objects in the frame may include one or more dispensing devices and/or sterilization-related devices. The dispensing apparatus may be configured to dispense gloves, masks, sanitizing liquids, badges, and/or paper (e.g., toilet paper). The dispensing device may be a printer (e.g., dispensing printed stickers or paper). The frame may include wiring. The frame may include one or more devices including sensors, transmitters, antennas, controllers, circuitry, and/or power sources. The test station may be operatively (e.g., communicatively) coupled to a network (e.g., and to one or more controllers). A display construct (e.g., an OLED display) may display a message to an individual attempting to enter the facility. The display construct may display an image of a front-desk receptionist. The front desk may be positioned in the facility or outside the facility.
The receptionist image may be a real-time image (e.g., of a receptionist interacting with the individual via a video conference). The receptionist image may be an animation. The test station may include a signaling image. The signaling image may be static or non-static. The signaling image may or may not change over time. The signaling image may be permanent or non-permanent (e.g., temporary). The signaling image may be transparent or non-transparent. The signaling image may be attached (e.g., to a fixed structure of the peripheral structure, such as a floor) or non-attached. For example, the signaling image may include a sticker. The signaling image may include a drawing. The signaling image may comprise an engraving, a relief, or an embroidery. The signaling image may be a part of a fabric, such as a carpet. The signaling image may be a projection of light. The projection of light may or may not change over time. The test frame may include a projector that projects light onto the individual and/or the floor to form the signaling image. The projected light may signal where the individual should stand (e.g., a circle may be projected, signaling that the individual should stand in the circle). The projected light may signal the direction in which the individual should stand (e.g., the projected light may include a footprint of a stationary individual disposed in a certain direction, thereby signaling the individual to stand in that direction). The projected light may signal the direction in which the individual should walk (e.g., the projected light may include footprints of a walking individual set in a certain direction, signaling the individual to walk in that direction). The projection may be colored. The projection may be a white projection. The color may signal that the individual is ready to proceed to the next stage in the admission process (e.g., green indicates that the individual is free to enter the facility).
The color may signal that the individual is not ready to proceed to the next stage in the admission process (e.g., red indicates that the individual cannot enter the facility). The color may signal an error or fault (e.g., orange or yellow may indicate an error in the process and/or in one of the components of the test station).
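The color-to-state signaling described above can be sketched in software. The following Python fragment is illustrative only; the state names and the exact mapping are assumptions for illustration, not part of this disclosure:

```python
# Hypothetical sketch of the projection color signaling: the projected color
# encodes the individual's state in the admission process. The state names
# and this particular mapping are illustrative assumptions.

ADMISSION_COLOR = {
    "cleared": "green",   # individual is free to enter the facility
    "denied": "red",      # individual cannot proceed to the next stage
    "error": "orange",    # error in the process or a test-station component
}

def projection_color(state: str) -> str:
    """Return the color the projector should use for a given admission state."""
    try:
        return ADMISSION_COLOR[state]
    except KeyError:
        raise ValueError(f"unknown admission state: {state!r}")
```

A controller coupled to the projector could, for example, call `projection_color("cleared")` after a successful test and switch the projected circle to green.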
Fig. 3B shows an example of a test station comprising a frame 320 and objects in the frame, such as a plate 321 having dispensing equipment 322 attached thereto, a display construct 323 displaying messages, and a window 324. The frame includes a projector configured to project signaling images (e.g., 325 and 326) of white light. The signaling images include a circle 325 and footprints 326 of a stationary individual. The signaling images signal the individual to stand in circle 325, facing frame 320. The signaling images may facilitate sensing of the proximity and/or orientation of the individual by one or more sensors disposed in frame 320. Any of the objects in the frame and/or the frame itself may be modular. The signaling images may facilitate the approach of the individual to the dispensing equipment and/or to the image displayed by the display construct, thereby facilitating the admission process to the facility. The frame can comprise any material disclosed herein (e.g., polymeric, metallic, transparent, and/or non-transparent materials). The frame may include one or more holes and/or slits. The slits and/or holes may facilitate transmission of environmental features from the ambient environment to sensors disposed in the frame, and/or transmission of emissions (e.g., sound and/or light) from the interior of the frame to the ambient environment.
Fig. 3C shows an example of a test station comprising a frame 330 and objects in the frame, such as a plate 331 having a dispensing device 332 attached thereto, a display construct 333 displaying messages, a display construct 334 displaying a front desk (e.g., a receptionist), a plate 335 having a poster attached thereto, a plate 336, and a window 337. The test station depicted in the example shown in Fig. 3C projects a signaling image 338. Any of the objects in the frames and/or the frames themselves (e.g., Figs. 3B-3D) may be modular.
In some embodiments, a frame device (e.g., frame system) and a barrier device are disposed in a reception area of a facility. The frame device may be provided in close proximity to (e.g., in contact or not in contact with) the barrier device. For example, the frame device may be arranged within a certain walking distance from the barrier device. The walking distance may comprise up to about 10, 8, 5, 3, or 2 average steps of an individual. Fig. 3D shows an example of a test station comprising a door-and-divider apparatus 340 and a frame 341, which are arranged adjacent to each other within a walking distance of about one or two steps. The frame 341 includes a projector 342, disposed in an interior volume of the frame, that projects light 434 to form a signaling image 346 on the floor adjacent to the frame 341. The frame device may serve as a self-service kiosk that serves one or more users (e.g., simultaneously).
The test station may include a frame apparatus that includes a frame (e.g., including mullions and transoms) framing one or more objects. The outer surface of an object in the frame may comprise smooth or rough portions. The roughness may be formed by embossing, sanding, or scratching of the surface. The roughened portion may include an indentation, a protrusion, or an applied covering (e.g., an applique). The applique may comprise a mesh or cloth. The roughness may follow a regular pattern or an irregular pattern. The objects in the frame may be hollow or non-hollow. An object in the frame may include an internal cavity. The internal cavity may include at least one device (e.g., a sensor, a transmitter, a controller, circuitry (e.g., a processor), a computer, a memory, a radar, or an antenna). The internal cavity may include a transceiver. The internal cavity may include circuitry. The internal cavity may include wiring. The internal cavity may house Bluetooth, Global Positioning System (GPS), UHF, and/or ultra-wideband (UWB) devices. The internal cavity may include a modem. An object in the frame may comprise one or more slits and/or holes. An object in the frame may include one or more camera devices (e.g., disposed in a compartment). The slits and/or holes may facilitate transmission of environmental features from the ambient environment to sensors disposed in the frame, and/or transmission of emissions (e.g., sound and/or light) from the interior of the frame to the ambient environment.
In some embodiments, the frame apparatus and/or barrier apparatus is disposed in a reception area of a facility. The entrance into the facility may include one or more signaling images, e.g., signaling where individuals should stand. The signaling images may signal individuals to stand at a prescribed distance from each other. The signaling image may include a region in which standing is allowed (e.g., an inner shape) and a region in which standing is not allowed (e.g., an edge). The signaling image may include a shape (e.g., a geometric shape). The shape may include an ellipse (e.g., a circle) or a polygon. The polygon may include a triangle (e.g., an equilateral triangle), a rectangle (e.g., a square), a pentagon, a hexagon, a heptagon, or an octagon. The polygon may be a regular polygon. The edge may have the same shape as the inner shape. The edge may have a different shape than the inner shape. Fig. 4A shows an example of a reception area in a facility, the reception area including a signaling shape having an inner shape 411, in which an individual is allowed to stand (e.g., while waiting to engage with a test station), and an edge 412, in which an individual should not stand, the inner shape and the edge both being circular. The signaling shape signals the position in which an individual such as 413 stands while waiting to interact with a testing station 414, the testing station 414 including a barrier apparatus 415, a dispensing station 417, and a device housing 416 (e.g., including any device disclosed herein, such as a sensor and/or transmitter). The device housing may include a device ensemble. In the example shown in Fig. 4A, the test station is disposed in an entrance 418 to the facility. An individual 413 wishing to access the facility will have to pass through the test station 414, for example by opening a door of the test station 414.
In some embodiments, the dispensing device is operatively coupled to a network. The dispensing device may be controlled manually or automatically. For example, the dispensing device may be controlled by a controller. The dispensing device may be operatively coupled (e.g., by wire and/or wirelessly) to the network. The dispensing device may be operatively (e.g., communicatively) coupled to the barrier device and/or the frame device. The dispensing device may be physically coupled (e.g., using an intermediate plate) to the frame device and/or the barrier device, e.g., for support and/or stability. Alternatively, the dispensing device may be uncoupled from and/or not connected to the network. Fig. 4A shows an example in which the dispensing apparatus 417 is coupled to the barrier apparatus via a transparent plate 418. The plate may be any of the plates disclosed herein.
Fig. 4B shows an example of a test station 424 that includes barrier apparatuses 425 and 426 and a frame apparatus 427 disposed beside barrier apparatus 426. The test station 424 is disposed in an entrance 428 into the facility such that an individual 423 will have to pass through a door of the test station 424 to enter the facility. The individuals are guided along the signaling images (e.g., 430, including inner circles and circular edges) in the direction shown by the dashed arrow (e.g., 429) toward the test station 424. The individual is further guided by the projected image to interact with the frame apparatus 427 and, if permitted, to pass through the door of the barrier apparatus 426 to enter the facility through the entrance 428.
In some embodiments, the test station is disposed at an entrance area of the facility. The entrance may include one or more doors. A door may or may not be part of the test station. The entrance may be restrictive, allowing only a small number of individuals (e.g., up to 5, 4, 3, 2, or 1) to pass therethrough, or may be less restrictive, allowing a larger number of individuals to pass therethrough. The test station may include one or more barrier apparatuses and/or one or more frame apparatuses. At least two barrier apparatuses in the test station may be different from each other. At least two barrier apparatuses in the test station may be similar to each other (e.g., 425 and 426 in Fig. 4B). At least two frame apparatuses in the test station may be different from each other. At least two frame apparatuses in the test station may be similar to each other. For example, in Fig. 4B, all of the frame apparatuses (e.g., 427) are similar to each other.
Fig. 5A shows an example of a facility having an entryway, including a door 510 and an entrance 511. A test station including a frame apparatus 512, which projects a signaling image 513, is disposed beside the entrance 511; the test station is disposed on a passage 514 from the door 510 to the entrance 511. The test station does not include a barrier apparatus. The walkway to the entrance 511 is not restrictive, and allows individuals to access the peripheral structure 500 and the entrance 511.
Fig. 5B shows an example of a facility having an entryway including a door 520 and an entrance 521. The test station includes frame apparatuses 522 and 523. The test station is positioned on the pathway along which an individual 525 will walk from the door 520 to the entrance 521. The test station does not include a barrier apparatus. The walkway from the door 520 to the entrance 521 is more restrictive than that in Fig. 5A, in that it allows individuals to enter the peripheral structure 529 and the entrance 521 one at a time (rather than several at a time). In the example shown in Fig. 5B, the frame apparatuses 522 and 523 are disposed at an angle (e.g., about a 90-degree angle) to each other. The frame apparatuses 522 and 523 are different. The frame apparatus 522 includes windows such as 526 and a dispensing device 527. The frame apparatus 523 includes windows such as 531, a display construct 532 that displays, for example, a front desk (e.g., a receptionist), a plate 533 having a poster attached thereto, a whiteboard 534 that displays messages, a dispensing device 535, and an opaque (e.g., and textured) plate 537.
In some embodiments, the reception area may include components of the testing station (e.g., barrier apparatuses and frame apparatuses). At least two of the components may be identical. At least two of the components may be different. Two components may be arranged (e.g., substantially) parallel to each other. Two components may be arranged at an angle to each other. For example, two components may form an angle with each other of at least about 30 degrees (°), 60°, 90°, 120°, or 180°. The angle may have any value between the aforementioned values.
In some embodiments, the test stations are disposed in a peripheral structure that includes a secondary (e.g., manual and/or more rigorous) test station. Fig. 6A shows an example of a test station 612 comprising barrier apparatuses disposed at an entryway into a facility 600, the test station 612 being disposed in the walkway of an individual entering the facility, the individual passing through an outer door 611, an inner door 614, and a reception area 615, and entering the facility 616. The barrier apparatuses of 612 are arranged parallel to each other. When the individual is unable to pass through the barrier apparatuses 612 (e.g., because the individual fails the test), the individual is directed 613 to the secondary testing station 610. A nurse and/or security officer (physically present, or present via a display construct, e.g., graphically or by live video) may interview and/or test the individual more rigorously than the examination in the testing station 612. Fig. 6B shows an example of a testing station 622 that includes parallel barrier and frame apparatuses disposed at an entryway into a facility 630, the testing station 622 being disposed in the walkway of an individual entering the facility through an outer door 621 in a direction 631, the individual passing through an inner door 624 and a reception area 625, and entering the facility 626. The barrier apparatuses of 622 are arranged parallel to each other, and parallel to the frame apparatuses. Each frame apparatus of testing station 622 is disposed adjacent to (e.g., in contact or not in contact with) at least one barrier apparatus. Each barrier apparatus of testing station 622 is disposed adjacent to (e.g., in contact or not in contact with) at least one frame apparatus. Each frame apparatus of the testing station 622 forms a pair with a barrier apparatus.
When an individual cannot pass through the testing station 622 (e.g., because the individual has failed the test), the individual is directed 623 to the secondary testing station 620.
In some embodiments, the frame device forms a pair with the barrier device. The frame device and/or the barrier device may be coupled to the network (e.g., via wired and/or wireless communication). The frame device and the barrier device may or may not be directly operatively (e.g., communicatively) coupled to each other. The frame device and the barrier device may be operatively (e.g., communicatively) coupled to each other via a network. The frame device and/or the barrier device may comprise (e.g. be provided in or on) at least one controller. The frame device and the barrier device may be operatively (e.g., communicatively) coupled (e.g., via a network) to at least one controller.
In some embodiments, an individual entering a facility undergoes one or more reception procedures (e.g., operations). The reception operations may involve an admission process, equipment distribution, physical characteristics (e.g., disinfection), barriers, and/or a secondary inspection. If the individual is a non-employee, the operations may involve a receptionist accompanying the individual into the facility and/or the individual signing an agreement (e.g., a privacy agreement). The operations may be performed in any order.
Fig. 7 shows a flow diagram depicting various operations involving a person (e.g., an employee) undergoing an admission process 710, equipment assignment 720, physical characteristics 730, a barrier 740, and a secondary check 750. Optional operations are depicted in dashed lines in Fig. 7. In the admission operation 710, the employee may optionally scan 711 a badge, allowing the employee to enter the facility. The test system evaluates whether to grant the employee access to the facility at 712. If the employee is not allowed access, the employee is optionally directed away from the facility and/or to call an administrator in operation 713. For example, the testing system may analyze whether the employee is pre-authorized for entry (e.g., the employee is in an employee roster and no special instructions are recorded that restrict the employee's entry). If the employee is authorized to enter, the door is opened at operation 715 and the employee either enters the facility directly or is directed to an equipment assignment operation 720 at 716. The test system then analyzes whether the employee has a tag (e.g., by scanning and/or sensing the identity of the employee using the tag) in operation 721. If the employee does not have a tag, the testing system may assign and/or give the employee a new tag in operation 722. In operation 723, the testing system analyzes whether the employee is in compliance with equipment-related safety protocols of the jurisdiction in which the facility is located and/or safety protocols of the facility (e.g., wearing protective equipment such as masks, lab coats, safety shoes, and/or safety goggles). If the employee is not in compliance, the testing system may dispense the required equipment at 724, or otherwise block entry of the employee until the employee complies with the safety dress requirements (the blocking operation is not shown).
In optional operation 725, the test system may verify that the employee has taken the equipment. The employee then undergoes an operation 730 related to physical characteristics (e.g., temperature or cough), wherein the employee optionally (e.g., per jurisdictional and/or facility protocols) disinfects any required body part (e.g., hands) in operation 731, one or more physical characteristics are sensed (e.g., using one or more sensors) in operation 732, and the test system then analyzes whether the physical characteristics are normal or abnormal (e.g., using thresholds and/or signatures). If the employee's physical characteristics are normal, the test system unlocks the barrier in operation 741 of the barrier operation stage 740, and the employee enters the facility in operation 742. The results of the analysis of the physical characteristics may optionally be transmitted to the employee and/or the facility in operation 736. If a physical-characteristic abnormality is found at 733, the employee is notified at 734 and the facility is notified at 735. The employee is then notified of the various options in operation 751. The employee selects in operation 752 (e.g., by entering information into the system) whether the employee wishes to continue the admission process. If the employee wishes to continue, the employee is directed to a secondary check in operation 753 (e.g., a manual and/or more rigorous check of the employee). If the employee does not wish to continue the admission process, the employee is sent away. The result of the employee's decision may be sent to the facility and/or recorded.
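The employee flow of Fig. 7 can be condensed into a single decision routine. The sketch below is a simplified illustration only: the dictionary keys, the temperature threshold, and the return strings are assumptions, and the operation numbers in the comments refer to Fig. 7:

```python
# Simplified, hypothetical sketch of the employee admission flow of Fig. 7.
# Field names, the 38.0 C threshold, and return strings are illustrative.

def admit_employee(employee: dict) -> str:
    # Operation 712: verify pre-authorization (e.g., employee roster).
    if not employee.get("authorized"):
        return "directed away / call administrator"   # operation 713
    # Operations 721-722: ensure the employee carries an assigned tag.
    if not employee.get("has_tag"):
        employee["has_tag"] = True                    # assign a new tag (722)
    # Operations 723-724: protective-equipment compliance; dispense if needed.
    if not employee.get("compliant"):
        employee["compliant"] = True                  # dispense equipment (724)
    # Operations 731-733: sense physical characteristics against a threshold.
    if employee.get("temperature_c", 36.8) > 38.0:    # illustrative threshold
        return "notified; offered secondary check"    # operations 734, 751-753
    return "barrier unlocked; enters facility"        # operations 741-742
```

A control system coupled to the test station could run such a routine per individual, with the sensed physical characteristics filled in from the sensors of operation 732.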
Fig. 8 shows a flow diagram depicting various operations involving a person (e.g., a visitor, such as a non-employee) undergoing an admission process 810, sensing of physical characteristics 820, equipment assignment 830, barriers 840, and entry into the facility 850. Optional operations are depicted in dashed lines in Fig. 8. In the admission operation 810, the testing system attempts to detect an employee badge (e.g., tag). If an employee tag is detected, the operations of Fig. 7 take place. If no tag is detected (and the individual is not in the employee roster), the testing system classifies the individual as a visitor. The test system verifies whether the visitor is allowed access to the facility in operation 812, thereby checking whether the visitor is expected. For example, the testing system may analyze whether the visitor is pre-authorized to enter (e.g., the visitor is in a visitor registration book and no special instructions are recorded that restrict the visitor's entry). If the visitor is not allowed to enter the facility, the visitor optionally leaves the facility and/or calls a receptionist in operation 813. If the visitor is authorized to enter, the visitor's receptionist is called at 815 and, once the receptionist arrives at 816, the door is opened and the visitor enters the facility directly at 821 or is directed to the physical-characteristics operation 820. The testing system then senses whether the physical characteristics of the visitor are within a normal range or abnormal (e.g., using a sensor) in operation 823. If the physical characteristics are abnormal, the receptionist is notified in operation 825 and the visitor is requested to leave the facility in operation 855. If the physical characteristics are not abnormal (e.g., are within normal ranges), the facility and/or the receptionist is notified of (e.g., transmitted) the results in operation 826 and the visitor proceeds to the equipment and disinfection operation stage.
In operation 831, the testing system analyzes whether the visitor is complying with equipment-related safety protocols (e.g., standards) of the jurisdiction in which the facility is disposed and/or safety protocols of the facility (e.g., wearing protective equipment such as masks, lab coats, safety shoes, and/or safety goggles). If the visitor is not in compliance, the testing system may dispense the required equipment at 832, or otherwise block entry of the visitor until the visitor complies with the safety dress requirements (the blocking operation is not shown). In optional operation 833, the system may verify that the visitor has taken the equipment. If the visitor is in compliance with the equipment-related safety protocols (e.g., standards), the visitor may optionally be requested to disinfect a body part (e.g., hands) in operation 834. The test system may or may not verify the disinfection (e.g., using a sensor). Once the visitor completes the equipment and disinfection operation stage 830, the visitor begins the protocol-related operations 840. The visitor is directed to wait for the receptionist (e.g., if the receptionist has not yet arrived) in operation 841, and is requested to sign an agreement (e.g., a privacy agreement) in operation 842. If the visitor signs the agreement, the visitor may be provided with a badge (e.g., tag) in operation 843. If the visitor does not sign the agreement, the receptionist takes responsibility for the visitor at 844 (e.g., the visitor may be asked to leave at operation 855). Once the visitor signs the agreement, the test system unlocks the barrier in operation 851 of the entry operation stage 850, and the visitor enters the facility in operation 852. The results of the analysis of the physical characteristics may optionally be transmitted to the visitor and/or the facility.
If a physical characteristic of the visitor is found to be abnormal, the visitor may be notified (e.g., using a message, such as by email, telephone, text, sound, image, or printed matter). The visitor may be notified of various options. The visitor selects (e.g., by entering information into the system) whether the visitor wishes to continue the admission process. If the visitor wishes to continue, the visitor is directed to a secondary check (e.g., a manual and/or more rigorous check of the visitor). If the visitor does not wish to continue the admission process, the visitor is sent out of the facility. The result of the visitor's decision may be sent to the facility and/or recorded.
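The visitor flow of Fig. 8 can be sketched analogously to the employee flow (admission check, physical-characteristic sensing, equipment compliance, agreement signing, entry). As before, the dictionary keys and return strings are illustrative assumptions, with operation numbers of Fig. 8 noted in the comments:

```python
# Simplified, hypothetical sketch of the visitor admission flow of Fig. 8.
# Field names and return strings are illustrative assumptions.

def admit_visitor(visitor: dict) -> str:
    # Operation 812: verify the visitor is expected (e.g., visitor register).
    if not visitor.get("expected"):
        return "leave / call receptionist"             # operation 813
    # Operation 823: sense physical characteristics.
    if visitor.get("abnormal_characteristic"):
        return "receptionist notified; visitor offered options"  # 825 ff.
    # Operations 831-832: equipment compliance; dispense if needed.
    if not visitor.get("compliant"):
        visitor["compliant"] = True                    # dispense equipment (832)
    # Operations 842-844: agreement signing gates entry.
    if not visitor.get("signed_agreement"):
        return "receptionist responsible for visitor"  # operation 844
    return "barrier unlocked; enters facility"         # operations 851-852
```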
In some embodiments, the test system is operatively coupled to one or more controllers. The controllers may be configured to control one or more devices of the facility. The analysis of the test system may be performed at least in part by the one or more controllers that control the facility (e.g., which may include, or may be operatively coupled to, a building management system).
In some embodiments, one or more sensors are attached to a mobile traveler. The traveler may be animate or inanimate. For example, the sensors may be carried by an individual walking around the facility. For example, the sensors may be carried by a transiting robot (e.g., a self-propelled vehicle), such as an airborne vehicle (e.g., a drone) or a ground-based vehicle (e.g., a wheeled vehicle). The location of the traveler may be known. The traveler and/or the individual being tested may identify their location. The location of the traveler and/or of the individual being tested may be identified at least in part via electromagnetic-radiation-related technology, such as via satellite (e.g., GPS), Bluetooth, UHF, and/or ultra-wideband (UWB) technology. The traveler's location may be identified at least in part via at least one sensor in the facility (e.g., a radar, motion sensor, antenna, and/or optical sensor (e.g., a camera, such as a video camera)). The location of the traveler and/or of the individual being tested may be identified at least in part via a mobile device (e.g., a cellular telephone) carried by the individual and/or the traveler.
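One common way to locate a traveler from radio-frequency technology such as UWB is trilateration from distance estimates to fixed transceivers (anchors). The sketch below solves the 2-D case by linearizing the three circle equations; the anchor layout and distances in the example are invented for illustration and are not part of this disclosure:

```python
# Hedged sketch: 2-D trilateration from three fixed anchors (e.g., UWB
# transceivers disposed in frames) and measured distances to the traveler.

def trilaterate(anchors, distances):
    """Solve for (x, y) given three anchor points and measured ranges.

    Subtracting the first circle equation from the other two yields a
    2x2 linear system, solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Example: anchors at three frame positions, traveler actually at (3, 4).
import math
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [5.0, math.sqrt(65.0), math.sqrt(45.0)]
print(trilaterate(anchors, dists))  # approximately (3.0, 4.0)
```

With more than three anchors, a least-squares variant of the same linearization is typically used to average out ranging noise.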
In some embodiments, an environmental characteristic secreted by and/or affected by an individual has a propagation pattern in the environment. The propagation pattern may be distinct and/or recognizable, e.g., it may be measured by one or more sensors. For example, the heat emitted by an individual has a diffusion pattern in the ambient atmosphere. For example, an individual's carbon dioxide emission has an emission pattern in the environment. For example, carbon dioxide may be more pronounced around the mouth and nose of an individual, and, for example, exhalation rate, direction, pressure, and/or velocity may be measured. Fig. 12 shows an example of a contour plot of a horizontal (e.g., top) view of an office environment, depicting various CO2 concentrations. Based on the exhaled CO2 and its concentration variation in the room, the locations of nine individuals 1201-1209 can be identified. Furthermore, individuals 1201-1204 can be identified as facing individuals 1205-1208, and individual 1209 can be identified as being positioned between, and orthogonal to, the group of individuals 1201-1204 and the group of individuals 1205-1208. Additionally, by considering the contour map, it can be assumed that wind is blowing in a direction from individuals 1201-1204 towards individuals 1205-1208, where the wind has a non-uniform effect in the room. The wind appears to have a greater effect in the direction from individuals 1201 and 1205 towards individual 1209. The wind affected individuals 1201 and 1205 the least, affected individuals 1202 and 1206 more, affected individuals 1203 and 1207 even more, affected individuals 1204 and 1208 still more, and affected individual 1209 the most. Wind may enter through an opening such as a vent, an open window, or an open door. Thus, the sensed environmental characteristics in the room can be used (at least in part) to locate individuals in the peripheral structure.
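The idea of recovering occupant locations from a sensed CO2 field, as discussed for Fig. 12, can be illustrated with a toy peak-finding sketch. The grid values, threshold, and neighborhood rule below are invented for illustration:

```python
# Illustrative sketch: occupants appear as local maxima in a sensed CO2
# concentration map. Grid values (ppm) and the threshold are invented.

def find_local_maxima(grid, threshold):
    """Return (row, col) cells at or above `threshold` that exceed all
    4-neighbors (a crude stand-in for locating exhalation sources)."""
    peaks = []
    rows, cols = len(grid), len(grid[0])
    for r in range(rows):
        for c in range(cols):
            v = grid[r][c]
            if v < threshold:
                continue
            neighbors = [grid[rr][cc]
                         for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                         if 0 <= rr < rows and 0 <= cc < cols]
            if all(v > n for n in neighbors):
                peaks.append((r, c))
    return peaks

# Example: two occupants show up as two concentration peaks.
co2_ppm = [
    [420, 430, 425, 420],
    [430, 900, 435, 880],
    [425, 440, 430, 425],
]
print(find_local_maxima(co2_ppm, threshold=600))  # → [(1, 1), (1, 3)]
```

A practical system would additionally account for advection (the wind effect described above), e.g., by shifting the inferred source positions upwind of the measured peaks.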
The peripheral structure may have an ambient pressure and/or temperature. The peripheral structure may have a pressure different from the ambient pressure (e.g., lower and/or higher than the ambient pressure). The peripheral structure may have a temperature different from (e.g., lower and/or higher than) the ambient temperature. The peripheral structure may be a specialized peripheral structure (e.g., configured to maintain a pressure and/or temperature different from the ambient pressure and/or temperature). The ambient pressure may be one atmosphere. The ambient temperature may be about 25 °C. The ambient pressure and temperature may be standard pressure and temperature.
In some examples, data from multiple sensors (e.g., of the same type or of different types) may be used to assess the environment and the individuals acting on the environment. Such an assessment may reveal the health of an individual (e.g., having normal or abnormal physical characteristics). For example, the temperature, humidity, and/or carbon dioxide patterns in the environment may be used to locate and/or assess the health of an individual. Data from the multiple sensors (e.g., of different types) may be correlated (e.g., cross-correlated).
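Correlating readings from two sensor types can be illustrated with a Pearson correlation coefficient. The sample series below (temperature and CO2 both rising with occupancy) are invented for illustration:

```python
import math

# Sketch: correlate two sensor time series (e.g., temperature and CO2).
# The sample data are invented; both rise as occupancy grows.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

temperature_c = [21.0, 21.4, 21.9, 22.3, 22.8]
co2_ppm = [420, 480, 550, 610, 680]
print(pearson(temperature_c, co2_ppm))  # close to 1.0 (strong correlation)
```

A strong correlation between two sensor types under normal conditions also allows one sensor's readings to flag anomalies in the other's.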
In some embodiments, the owner and/or user of the data may take action if an individual exhibits one or more abnormal characteristics. For example, they may initiate medical treatment of the individual, isolate the individual, and/or take steps to direct the individual away (e.g., to reduce the risk of injury to others). In some embodiments, a control system of the facility may initiate an action. For example, the control system may change, or direct a change in, the state of one or more devices operatively coupled to the building management system. For example, the control system may change, or direct a change in, the state of an HVAC system, a lighting system, or a buzzer. For example, in response to the result, a lighting device may blink, a buzzer may sound, or ventilation may increase.
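The responsive behavior described above can be sketched as a simple mapping from an abnormal result to device-state changes. The device names and action strings below are illustrative assumptions, not an actual building-management-system API:

```python
# Hypothetical sketch: on an abnormal result, the control system directs
# state changes in coupled devices (lighting, buzzer, HVAC). Names invented.

def respond_to_result(abnormal: bool) -> list:
    """Return (device, action) commands to issue for a test result."""
    actions = []
    if abnormal:
        actions.append(("lighting", "blink"))
        actions.append(("buzzer", "sound"))
        actions.append(("hvac", "increase_ventilation"))
    return actions
```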
In some embodiments, the peripheral structure includes one or more sensors. The sensors can help control the environment of the peripheral structure such that occupants of the peripheral structure can have an environment that is more comfortable, pleasant, beautiful, healthy, productive (e.g., in terms of occupant performance), easier to live (e.g., work) in, or any combination thereof. The sensor may be configured as a low-resolution sensor or a high-resolution sensor. The sensor may provide an on/off indication of the occurrence and/or presence of a particular environmental event (e.g., a pixel sensor). In some embodiments, the accuracy and/or resolution of a sensor may be improved via artificial-intelligence analysis of its measurements. Examples of artificial-intelligence techniques that may be used include reactive, limited-memory, theory-of-mind, and/or self-aware techniques known to those skilled in the art. The sensors may be configured to process, measure, analyze, detect, and/or react to one or more of: data, temperature, humidity, sound, force, pressure, electromagnetic waves, position, distance, motion, flow, acceleration, velocity, vibration, dust, light, glare, color, gas, and/or other aspects (e.g., characteristics) of the environment (e.g., of the peripheral structure). The gas may include volatile organic compounds (VOCs). The gas may include carbon monoxide, carbon dioxide, water vapor (e.g., moisture), oxygen, radon, and/or hydrogen sulfide.
In some embodiments, processing the sensor data comprises performing sensor data analysis. The sensor data analysis may include at least one rational decision-making process and/or learning. The sensor data analysis may be used to adjust the environment, for example, by adjusting one or more components that affect the environment of the peripheral structure. The data analysis may be performed by a machine-based system (e.g., circuitry). The circuitry may be a processor. The sensor data analysis may utilize artificial intelligence. The sensor data analysis may rely on one or more models (e.g., mathematical models). In some embodiments, the sensor data analysis includes linear regression, least-squares fitting, Gaussian process regression, kernel regression, nonparametric multiplicative regression (NPMR), regression trees, local regression, semi-parametric regression, isotonic regression, multivariate adaptive regression splines (MARS), logistic regression, robust regression, polynomial regression, stepwise regression, ridge regression, lasso regression, elastic net regression, principal component analysis (PCA), singular value decomposition, fuzzy measure theory, Borel measures, Haar measures, risk-neutral measures, Lebesgue measures, the group method of data handling (GMDH), Bayesian classifiers, k-nearest-neighbor algorithms (k-NN), support vector machines (SVM), neural networks, classification and regression trees (CART), random forest methods, gradient boosting, and/or generalized linear model (GLM) techniques. Fig. 9 shows an example of a diagram 900 of a sensor arrangement distributed among peripheral structures. In the example shown in Fig. 9, controller 905 is communicatively linked 908 with sensors located in peripheral structure A (sensors 910A, 910B, 910C, …, 910Z), peripheral structure B (sensors 915A, 915B, 915C, …, 915Z), peripheral structure C (sensors 920A, 920B, 920C, …, 920Z), and peripheral structure Z (sensors 985A, 985B, 985C, …, 985Z). The communication links include wired and/or wireless communication. In some embodiments, a sensor ensemble includes at least two sensors of different types. In some embodiments, a sensor ensemble comprises at least two sensors of the same type. In the example shown in Fig. 9, sensors 910A, 910B, 910C, …, 910Z of peripheral structure A represent an ensemble. A sensor ensemble may refer to a collection of various different sensors. In some embodiments, at least two of the sensors in the ensemble cooperate to determine environmental parameters of, for example, the peripheral structure in which they are disposed. For example, the sensor ensemble may include a carbon dioxide sensor, a carbon monoxide sensor, a volatile organic chemical sensor, an environmental noise sensor, a visible light sensor, a temperature sensor, and/or a humidity sensor. The sensor ensemble may include other types of sensors, and claimed subject matter is not limited in this respect. The peripheral structure may include one or more sensors that are not part of a sensor ensemble. The peripheral structure may include a plurality of ensembles. At least two of the plurality of ensembles may differ in at least one of their sensors. At least two of the plurality of ensembles may have at least one sensor that is similar (e.g., of the same type). For example, an ensemble may have two motion sensors and one temperature sensor. For example, an ensemble may have a carbon dioxide sensor and an IR sensor. The ensemble may include one or more devices that are not sensors.
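As one concrete instance of the sensor data analysis techniques listed above (linear regression via least squares), the closed-form fit below predicts one quantity from another; the occupancy/CO2 data are invented for illustration:

```python
# Sketch of linear regression by closed-form least squares: predict CO2
# from occupancy count. The data (50 ppm per occupant) are invented.

def fit_line(xs, ys):
    """Return (slope, intercept) minimizing the sum of squared errors."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    slope = num / den
    return slope, my - slope * mx

occupants = [0, 1, 2, 3, 4]
co2_ppm = [400, 450, 500, 550, 600]
slope, intercept = fit_line(occupants, co2_ppm)
print(slope, intercept)  # → 50.0 400.0
```

Such a fitted model could, for example, let a controller estimate occupancy from a CO2 reading, or flag readings that deviate markedly from the model's prediction.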
The one or more devices other than sensors may include an acoustic emitter (e.g., a buzzer) and/or an electromagnetic radiation emitter (e.g., a light-emitting diode). In some embodiments, a single sensor (e.g., one not in an aggregate) may be disposed adjacent to (e.g., in close proximity to, such as in contact with) another device that is not a sensor. The controller may be communicatively coupled to the sensor assembly or may be part of the sensor assembly.
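As an illustrative sketch of the simplest of the analysis techniques listed above (ordinary least-squares linear regression), the following relates hypothetical occupancy counts to CO2 readings. All names and values here are invented for illustration and are not taken from the disclosure.

```python
# Minimal ordinary-least-squares fit relating one sensor's readings to
# another's, as one of the regression techniques mentioned in the text.

def fit_linear(xs, ys):
    """Return slope and intercept of the least-squares line y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Hypothetical readings: occupancy count vs. CO2 concentration (ppm).
occupancy = [0, 2, 4, 6, 8]
co2_ppm = [410, 520, 640, 745, 860]

slope, intercept = fit_linear(occupancy, co2_ppm)
predicted_ppm = slope * 10 + intercept  # extrapolate to 10 occupants
```

A model fit in this way could then serve as the reference against which later readings are compared, as described in the fault-detection passages below.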
The sensors of the sensor aggregate may cooperate with each other. One type of sensor may have a correlation with at least one other type of sensor. Conditions in the peripheral structure may affect one or more of the different sensors. The sensor readings of the one or more different sensors may be correlated to and/or influenced by the condition. The correlation may be predetermined. The correlation may be determined over a period of time (e.g., using a learning process). The time period may be predetermined. The time period may have a cutoff value. The cutoff value may take into account, for example, a threshold value (e.g., a percentage value) of error between the predicted sensor data and the measured sensor data under similar circumstances. The time may be continuous. The correlation may be derived from a learning set (also referred to herein as a "training set"). The learning set may include and/or may be derived from real-time observations in the peripheral structure. The observation may include data collection (e.g., from sensors). The learning set may include sensor data from similar peripheral structures. The learning set may include a third party data set (e.g., sensor data). The learning set may be derived from, for example, a simulation of one or more environmental conditions affecting the peripheral structure. The learning set may constitute detected (e.g., historical) signal data with one or more types of noise added. The correlation may utilize historical data, third party data, and/or real-time (e.g., sensor) data. A correlation between the two sensor types may be assigned a value. The value may be a relative value (e.g., strong correlation, medium correlation, or weak correlation). A learning set that is not derived from real-time measurements may be used as a reference (e.g., baseline) to initiate operation of sensors and/or various components affecting the environment (e.g., HVAC systems and/or tinted windows). 
The real-time sensor data may supplement the learning set, for example, on an ongoing basis or over a defined period of time. The (e.g., supplemented) learning set may increase in size during deployment of the sensors in the environment. The initial learning set may increase in size by, for example, including additional (i) real-time measurements, (ii) sensor data from other (e.g., similar) peripheral structures, (iii) third-party data, and/or (iv) other and/or updated simulations.
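The supplementing of a baseline learning set with real-time readings described above can be sketched as follows. This is a minimal illustration; the class name and sample values are assumptions, not part of the disclosure.

```python
# Sketch of a learning set seeded with a non-real-time baseline (e.g., a
# simulation or third-party data) and supplemented with real-time sensor
# readings as they arrive during deployment.

class LearningSet:
    def __init__(self, baseline):
        self.samples = list(baseline)  # e.g., simulated baseline data

    def add_realtime(self, reading):
        self.samples.append(reading)   # learning set grows during deployment

    def size(self):
        return len(self.samples)

simulated_baseline = [21.0, 21.5, 22.0]   # hypothetical temperatures (deg C)
training = LearningSet(simulated_baseline)
for reading in [21.7, 22.3]:              # readings arriving in real time
    training.add_realtime(reading)
```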
In some embodiments, the data from the sensors may be correlated. Once a correlation between two or more sensor types is established, deviations from the correlation (e.g., deviations from the correlation values) may indicate an irregular condition and/or a failure of one of the correlated sensors. The fault may include a calibration slip (e.g., drift). The fault may indicate a need for recalibration of the sensor. The failure may include a complete failure of the sensor. In one example, a movement sensor may cooperate with a carbon dioxide sensor. In one example, in response to the movement sensor detecting movement of one or more individuals in the peripheral structure, the carbon dioxide sensor may be activated to begin taking carbon dioxide measurements. An increase in movement in the peripheral structure may be associated with an increase in carbon dioxide levels. In another example, a movement sensor detecting an individual in a peripheral structure may be associated with an increase in noise detected by a noise sensor in the peripheral structure. In some embodiments, a detection by a first type of sensor that is not accompanied by a corresponding detection by a second type of sensor may cause an error message to be issued. For example, if the motion sensor detects many individuals in the peripheral structure without an increase in carbon dioxide and/or noise, the carbon dioxide sensor and/or noise sensor may be identified as faulty or as having a faulty output. An error message may be issued. A first plurality of different associated sensors in a first aggregate may include one sensor of a first type and a second plurality of sensors of a different type. If the second plurality of sensors indicate a correlation and the one sensor indicates a reading that departs from the correlation, the likelihood that the one sensor has failed increases. 
If a first plurality of sensors in the first aggregate detect a first correlation and a third plurality of correlated sensors in the second aggregate detect a second correlation different from the first correlation, the likelihood that the first sensor aggregate is exposed to a different condition than the third sensor aggregate is increased.
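A minimal sketch of the correlation-deviation check described above, assuming a hypothetical motion/CO2 pairing and an illustrative rise threshold (both the function name and the threshold are assumptions for illustration):

```python
# Correlation-based fault flagging: once motion and CO2 readings are expected
# to correlate, a window where motion rises but CO2 does not may mark the CO2
# sensor as suspect, as described in the text.

def flag_suspect_sensor(motion_counts, co2_ppm, min_rise=50):
    """Return True if motion increased but CO2 failed to rise accordingly."""
    motion_increased = motion_counts[-1] > motion_counts[0]
    co2_rise = co2_ppm[-1] - co2_ppm[0]
    return motion_increased and co2_rise < min_rise

# Motion rises from 0 to 8 occupants while CO2 stays flat: suspicious.
suspect = flag_suspect_sensor([0, 3, 8], [415, 416, 414])
# Motion rises and CO2 rises with it: consistent, not suspicious.
consistent = flag_suspect_sensor([0, 3, 8], [415, 520, 660])
```

In a deployment, a positive flag would trigger the error message the text mentions, rather than a silent return value.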
The sensors of a sensor aggregate may cooperate with one another. The cooperation may include considering the sensor data of another sensor (e.g., of a different type) in the aggregate. The cooperation may include considering a trend predicted by another sensor (e.g., sensor type) in the aggregate, or a trend predicted from data related to another sensor in the aggregate. The other data may be derived from another sensor in the aggregate, from a sensor of the same type in another aggregate, or from a source other than the aggregate's sensors. For example, a first aggregate may include a pressure sensor and a temperature sensor. The cooperation between the pressure sensor and the temperature sensor may comprise taking the pressure sensor data into account when analyzing and/or predicting the temperature data of the temperature sensor in the first aggregate. The pressure data may be (i) pressure data of the pressure sensor in the first aggregate, (ii) pressure data of pressure sensors in one or more other aggregates, (iii) pressure data of other sensors, and/or (iv) pressure data of a third party.
In some embodiments, the collection of sensors is distributed throughout the peripheral structure. Sensors of the same type may be dispersed in the peripheral structure, for example, to allow measurement of environmental parameters at various locations of the peripheral structure. Sensors of the same type may measure gradients along one or more dimensions of the peripheral structure. The gradient may include a temperature gradient, an ambient noise gradient, or any other change (e.g., increase or decrease) in a measured parameter as a function of position relative to a point. The gradient may be utilized to determine that a sensor is providing an erroneous measurement (e.g., that a sensor has failed). Fig. 10 shows an example of a diagram 1090 of the arrangement of sensor aggregates in a peripheral structure. In the example of fig. 10, the aggregate 1092A is positioned at a distance D1 from the vent 1096. The sensor assembly 1092B is positioned at a distance D2 from the vent 1096. The sensor assembly 1092C is positioned at a distance D3 from the vent 1096. In the example of fig. 10, vent 1096 corresponds to an air conditioning vent representing a relatively constant source of cooling air and a relatively constant source of white noise. Temperature and noise measurements made by the sensor aggregate 1092A are shown by the output reading profile 1094A. The output reading profile 1094A indicates a relatively low temperature and a significant amount of noise. Temperature and noise measurements made by the sensor aggregate 1092B are shown by the output reading profile 1094B. The output reading profile 1094B indicates a slightly higher temperature and a slightly reduced noise level. Temperature and noise measurements made by the sensor aggregate 1092C are shown by the output reading profile 1094C. 
The output reading profile 1094C indicates a slightly higher temperature than the temperatures measured by the sensor assemblies 1092B and 1092A. The noise measured by sensor cluster 1092C is at a lower level than the noise measured by sensor clusters 1092A and 1092B. In one example, if the temperature measured by the sensor aggregate 1092C is lower than the temperature measured by the sensor aggregate 1092A, the one or more processors and/or controllers may identify the sensor of aggregate 1092C as providing erroneous data.
In another example of a temperature gradient, a temperature sensor mounted near a window may measure larger temperature fluctuations than a temperature sensor mounted at a location opposite the window. A sensor mounted near the midpoint between the window and the opposite location may measure temperature fluctuations intermediate between those measured near the window and those measured at the opposite location. In one example, an ambient noise sensor mounted near an air conditioner (or near a heating vent) may measure greater ambient noise than an ambient noise sensor mounted far from the air conditioner or heating vent.
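The gradient-based error detection described above can be sketched as follows. The distances, temperatures, and the assumption that readings warm monotonically with distance from a cooling vent are illustrative, mirroring the fig. 10 example rather than implementing the disclosure.

```python
# Use an expected spatial gradient for fault detection: aggregates farther
# from a cooling vent should read warmer, so a reading that breaks the
# monotonic trend is flagged as potentially erroneous.

def flag_gradient_outliers(readings_by_distance):
    """readings_by_distance: list of (distance_m, temperature_c), any order.
    Return distances whose temperature breaks the warm-with-distance trend."""
    ordered = sorted(readings_by_distance)
    flagged = []
    for (d_near, t_near), (d_far, t_far) in zip(ordered, ordered[1:]):
        if t_far < t_near:  # farther aggregate reads colder than nearer one
            flagged.append(d_far)
    return flagged

# Aggregate at 6 m reads colder than the one at 3 m: likely erroneous.
outliers = flag_gradient_outliers([(1.0, 18.0), (3.0, 20.5), (6.0, 19.0)])
```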
In some embodiments, the first type of sensor is coordinated with the second type of sensor. In one example, an infrared radiation sensor may be coordinated with a temperature sensor. Coordination between sensor types may include establishing a correlation (e.g., negative or positive) between readings from the same type or different types of sensors. For example, an infrared radiation sensor measuring an increase in infrared energy may be accompanied by (e.g., positively correlated to) an increase in measured temperature. A decrease in the measured infrared radiation may be accompanied by a decrease in the measured temperature. In one example, an infrared radiation sensor that measures an increase in infrared energy without a measurable increase in temperature may indicate a failure or degradation of the operation of the temperature sensor.
In some embodiments, one or more sensors are included in the peripheral structure. For example, the peripheral structure may include at least 1, 2, 4, 5, 8, 10, 20, 50, or 500 sensors. The peripheral structure may include a number of sensors within a range between any two of the above values (e.g., from about 1 to about 1000, from about 1 to about 500, or from about 500 to about 1000). The sensor may be of any type. For example, the sensor may be configured (e.g., and/or designed) to measure the concentration of a gas (e.g., carbon monoxide, carbon dioxide, hydrogen sulfide, volatile organic chemicals, or radon). For example, the sensor may be configured (e.g., and/or designed) to measure ambient noise. For example, the sensor may be configured (e.g., and/or designed) to measure electromagnetic radiation (e.g., RF, microwave, infrared, visible, and/or ultraviolet radiation). For example, the sensor may be configured (e.g., and/or designed) to measure safety-related parameters, such as (e.g., glass) breakage and/or unauthorized presence of personnel in a restricted area. The sensor may be associated with one or more (e.g., active) devices such as radar or lidar. The device is operable to detect the physical dimensions of the peripheral structure, persons present in the peripheral structure, stationary objects in the peripheral structure, and/or moving objects in the peripheral structure.
In some embodiments, the sensors are operatively coupled to at least one controller. The coupling may comprise a communication link. The communication link (e.g., fig. 9, 908) may include any suitable communication medium (e.g., wired and/or wireless). The communication link may include wires, such as one or more conductors arranged in twisted pairs, coaxial cables, and/or optical fibers. The communication link may comprise a wireless communication link, such as Wi-Fi, Bluetooth, ZigBee, or cellular. One or more segments of the communication link may include a conductive (e.g., wired) medium, while one or more other segments of the communication link may include a wireless link. The communication network may include one or more levels of encryption. The communication network may be communicatively coupled to the cloud and/or to one or more servers external to the facility. The communication network may support at least third-generation (3G), fourth-generation (4G), or fifth-generation (5G) wireless communication. The communication network may support cellular signals external and/or internal to the facility. The downlink communication network speed may have a peak data rate of at least about 5 gigabits per second (Gb/s), 10 Gb/s, or 20 Gb/s. The uplink communication network speed may have a peak data rate of at least about 2 Gb/s, 5 Gb/s, or 10 Gb/s.
In some embodiments, the peripheral structure is a facility (e.g., a building). The peripheral structure may comprise a wall, a door, or a window. In some embodiments, at least two peripheral structures of the plurality of peripheral structures are disposed in the same facility. In some embodiments, at least two peripheral structures of the plurality of peripheral structures are disposed in different facilities. The different facilities may be part of a campus (e.g., belonging to the same entity). At least two of the plurality of peripheral structures may reside on the same floor of the facility. At least two of the plurality of peripheral structures may reside on different floors of the facility. The peripheral structures of fig. 9, such as peripheral structures A, B, C, and Z, may correspond to peripheral structures located on the same floor of a building, or to peripheral structures located on different floors of a building. The peripheral structures of fig. 9 may be located in different buildings of a multi-building campus. The peripheral structures of fig. 9 may be located on different campuses of a multi-campus neighborhood.
In one example, for a gas sensor disposed in a room (e.g., in an office environment), the relevant parameter may correspond to a gas (e.g., CO2) level, wherein the desired level is typically in the range of about 1000 ppm or less. In one example, the CO2 sensor can determine that self-calibration should occur during a time window in which the CO2 level is at a minimum, such as when there is no occupant near the sensor (see, e.g., the CO2 level before 18000 seconds in FIG. 11). The time window during which CO2 level fluctuation is minimal may correspond to, for example, a one-hour lunch period from about 12 noon to about 1 pm, and to off-business hours. FIG. 12 shows an example contour map of a horizontal (e.g., top) view of an office environment depicting various CO2 concentrations. The gas (e.g., CO2) concentration may be measured by sensors placed at various locations of a peripheral structure (e.g., an office).
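A minimal sketch of selecting such a low-CO2 self-calibration window, assuming per-hour mean readings have already been aggregated (the hours and values below are invented for illustration):

```python
# Choose a self-calibration window for a CO2 sensor: pick the hour-of-day
# with the lowest mean reading (e.g., an unoccupied period) and treat its
# mean as the fresh-air baseline, as described in the text.

def best_calibration_window(hourly_means):
    """hourly_means: dict mapping hour-of-day -> mean CO2 reading (ppm).
    Return (hour, mean_ppm) for the quietest hour."""
    hour = min(hourly_means, key=hourly_means.get)
    return hour, hourly_means[hour]

# Hypothetical hourly means: 3 am (unoccupied) is the minimum.
readings = {9: 620.0, 12: 480.0, 13: 455.0, 15: 700.0, 3: 412.0}
calib_hour, baseline_ppm = best_calibration_window(readings)
```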
The position of the peripheral structure and/or its fixed features (e.g., placement of walls and/or windows) may be utilized in measuring characteristics of a given environment. The position and/or fixed features of the peripheral structure may be derived independently (e.g., from third-party data and/or non-sensor data). Data from one or more sensors disposed in the environment may be used to derive the position and/or fixed features of the peripheral structure. When the environment is minimally disturbed relative to the measured environmental characteristics (e.g., when no person is present in the environment, and/or when the environment is quiet), some sensor data may be used to sense and/or identify the location of (e.g., fixed and/or non-fixed) objects in the environment. Determining the position of objects may include determining occupancy (e.g., by humans) of the environment. Range- and/or position-related measurements may utilize sensors such as radar and/or ultrasonic sensors. Distance- and location-related measurements may also be derived from sensors that are not traditionally location- and/or distance-related. Objects disposed in, or forming part of, the peripheral structure may have different sensor characteristics. For example, the location of a person in the peripheral structure may be correlated with distinct temperature, humidity, and/or CO2 characteristics. For example, the location of a wall may be correlated with an abrupt change in the distribution of temperature, humidity, and/or CO2 in the peripheral structure. For example, the position of a window or door (whether open or closed) may be correlated with a change in the distribution of temperature, humidity, and/or CO2 in its vicinity. The one or more sensors in the peripheral structure may monitor any environmental changes and/or correlate such changes with changes in subsequently monitored values. 
In some cases, the lack of fluctuation in the monitored values may be taken as an indication of sensor damage, and the sensor may need to be removed or replaced.
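The lack-of-fluctuation check can be sketched as a variance test over recent readings. The tolerance value and sample readings are illustrative assumptions.

```python
# Flat-line check: a sensor whose recent readings show essentially no
# fluctuation (variance near zero) in an environment expected to vary may be
# damaged and in need of removal or replacement, as noted in the text.

def is_flatlined(readings, tolerance=1e-6):
    """Return True if the readings show essentially no fluctuation."""
    mean = sum(readings) / len(readings)
    variance = sum((r - mean) ** 2 for r in readings) / len(readings)
    return variance <= tolerance

stuck = is_flatlined([21.4, 21.4, 21.4, 21.4])    # suspiciously constant
healthy = is_flatlined([21.1, 21.6, 22.0, 21.3])  # normal fluctuation
```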
In some embodiments, the sensor transmits (e.g., directs) data to a receiver, such as a sensor or sensor suite. The sensor package may also be referred to as a "sensor assembly". The sensors in the sensor suite may be similar to sensors deployed in the space of the peripheral structure.
In some embodiments, a plurality of sensors are assembled into a sensor kit (e.g., a sensor ensemble). At least two sensors of the plurality of sensors may be of different types (e.g., configured to measure different characteristics). The various sensor types may be assembled together (e.g., bundled) and form a sensor suite. The plurality of sensors may be coupled to an electronic board. The electrical connection of at least two of the plurality of sensors in the sensor suite can be controlled (e.g., manually and/or automatically). For example, the sensor suite may be operatively coupled to or include a controller (e.g., a microcontroller). The controller may control the on/off connection of the sensor to the power source. Thus, the controller may control the time (e.g., period) at which the sensor will operate.
In some embodiments, one or more sensors are added to or removed from a sensor group, such as one disposed in a peripheral structure and/or in a sensor suite (e.g., a sensor ensemble). A newly added sensor may inform (e.g., direct a message to) other members of the sensor group of its presence and its relative location within the topology of the group. An example of a sensor cluster can be found, for example, in U.S. provisional patent application serial No. 62/958,653, entitled "SENSOR automation," filed January 8, 2020, which is incorporated herein by reference in its entirety. Examples of sensors and sensor assemblies can be found, for example, in U.S. provisional patent application serial No. 62/967,204, entitled "SENSOR CALIBRATION AND OPERATION," filed January 29, 2020, which is incorporated herein by reference in its entirety. These examples include methods of their use, and software and devices in which they are utilized and/or included.
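A toy sketch of group membership with presence announcements follows. This is not the protocol of the referenced applications; the identifiers, message format, and coordinates are invented for illustration.

```python
# Sensor group where a newly added sensor is announced to existing members,
# and sensors can later be removed, mirroring the add/remove behavior
# described in the text.

class SensorGroup:
    def __init__(self):
        self.members = {}          # sensor_id -> location in group topology
        self.announcements = []    # (recipient_id, message) pairs

    def add_sensor(self, sensor_id, location):
        # Inform every existing member of the newcomer's presence/location.
        for existing_id in self.members:
            self.announcements.append(
                (existing_id, f"{sensor_id} joined at {location}"))
        self.members[sensor_id] = location

    def remove_sensor(self, sensor_id):
        self.members.pop(sensor_id, None)

group = SensorGroup()
group.add_sensor("temp-1", (0, 0, 2))
group.add_sensor("co2-1", (1, 0, 2))   # announced to temp-1
group.remove_sensor("temp-1")
```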
The sensors of the sensor ensemble may be organized into sensor modules. The sensor assembly may include a circuit board (such as a printed circuit board) to which a plurality of sensors are adhered or attached. The sensor may be removed from the sensor module. For example, the sensor may be plugged into and/or unplugged from the circuit board. The sensors may be individually activated and/or deactivated (e.g., using a switch). The circuit board may comprise a polymer. The circuit board may be transparent or non-transparent. The circuit board may include a metal (e.g., an elemental metal and/or a metal alloy). The circuit board may include a conductor. The circuit board may include an insulator. The circuit board may comprise any geometric shape (e.g., rectangular or oval). The circuit board may be configured (e.g., may have a shape) to allow the aggregate to be disposed in a mullion (e.g., of a window). The circuit board may be configured (e.g., may have a shape) to allow the aggregate to be disposed in a frame (e.g., a door frame and/or a window frame). The mullion and/or the frame may include one or more apertures to allow the sensor to obtain (e.g., accurate) readings. The circuit board may include an electrical connection port (e.g., a socket). The circuit board may be connected to a power source (e.g., electrical power). The power source may include a renewable power source or a non-renewable power source.
FIG. 13 shows an example of a diagram 1300 of an ensemble of sensors organized into sensor modules. Sensors 1310A, 1310B, 1310C, and 1310D are shown as being included in sensor ensemble 1305. An ensemble of sensors organized into a sensor module may include at least 1, 2, 4, 5, 8, 10, 20, 50, or 500 sensors. The sensor module may include a number of sensors within a range between any two of the above values (e.g., from about 1 to about 1000, from about 1 to about 500, or from about 500 to about 1000). The sensors of the sensor module may include sensors configured or designed to sense parameters including temperature, humidity, carbon dioxide, particulate matter (e.g., between 2.5 μm and 10 μm), total volatile organic compounds (e.g., via changes in voltage potential caused by surface adsorption of volatile organic compounds), ambient light, audio noise level, pressure (e.g., of gases and/or liquids), acceleration, time, radar, lidar, radio signals (e.g., ultra-wideband radio signals), passive infrared, glass breakage, or movement. The sensor ensemble (e.g., 1305) may include non-sensor devices, such as a buzzer and a light-emitting diode. An example of a sensor assembly and its uses can be found in U.S. patent application Ser. No. 16/447,169, entitled "SENSING AND COMMUNICATIONS UNIT FOR OPTICALLY SWITCHABLE WINDOW SYSTEMS," filed June 20, 2019, which is incorporated herein by reference in its entirety.
In some embodiments, an increase in the number and/or type of sensors may be used to increase the accuracy of one or more measurement characteristics and/or the probability that a particular event measured by one or more sensors has occurred. In some embodiments, the sensors of the sensor assembly may cooperate with each other. In one example, a radar sensor of a sensor ensemble may determine the presence of multiple individuals in a peripheral structure. A processor (e.g., processor 1315) may determine that detection of the presence of the plurality of individuals in the peripheral structure is positively correlated with an increase in the concentration of carbon dioxide. In one example, a memory accessible to the processor may determine that the increase in detected infrared energy is positively correlated with an increase in temperature detected by the temperature sensor. In some embodiments, the network interface (e.g., 1350) may communicate with other sensor ensembles similar to the sensor ensemble. The network interface may additionally be in communication with the controller.
The individual sensors of the sensor ensemble (e.g., sensor 1310A, sensor 1310D, etc.) may include and/or utilize at least one dedicated processor. The sensor ensemble may utilize a remote processor (e.g., 1354) via wireless and/or wired communication links. The sensor ensemble may utilize at least one processor (e.g., processor 1352), which may represent a cloud-based processor coupled to the sensor ensemble via the cloud (e.g., 1350). The processors (e.g., 1352 and/or 1354) may be located in the same building, in different buildings, in buildings owned by the same or different entities, in a facility owned by the manufacturer of the window/controller/sensor ensemble, or at any other location. In various embodiments, the sensor ensemble 1305 need not include a dedicated processor and network interface, as indicated by the dashed lines of fig. 13; these entities may be separate entities that are operatively coupled to the ensemble 1305. The dashed lines in fig. 13 indicate optional features. The sensor assembly may be an architectural element. In some embodiments, the on-board processing and/or memory of one or more sensor ensembles may be used to support other functions (e.g., via allocation of ensemble memory and/or processing power to the network infrastructure of a building).
In some embodiments, multiple sensors of the same type may be distributed in the peripheral structure. At least one sensor of the plurality of sensors of the same type may be part of the aggregate. For example, at least two sensors of the plurality of sensors of the same type may be part of at least two aggregates. The sensor assembly may be distributed in a peripheral structure. The peripheral structure may include a conference room. For example, multiple sensors of the same type may measure environmental parameters in a conference room. A parametric topology of the peripheral structure may be generated in response to measuring the environmental parameter of the peripheral structure. The output signals from any type of sensor of the sensor ensemble may be utilized to generate the parametric topology, e.g., as disclosed herein. The parameter topology may be generated for any peripheral structure of a facility, such as a conference room, hallway, bathroom, cafeteria, garage, auditorium, utility room, storage facility, equipment room, and/or lift.
Fig. 14 shows an example of a diagram 1400 of an arrangement of sensor aggregates distributed within a peripheral structure. In the example shown in fig. 14, a group 1410 of individuals is seated in a conference room 1402. The conference room includes an "X" dimension indicating length, a "Y" dimension indicating height, and a "Z" dimension indicating depth. X, Y, and Z are directions in a Cartesian coordinate system. The sensor aggregates 1405A, 1405B, and 1405C include sensors that may operate similarly to the sensors described with reference to the sensor aggregate 1305 of fig. 13. At least two sensor aggregates (e.g., 1405A, 1405B, and 1405C) may be integrated into a single sensor module. The sensor assemblies 1405A, 1405B, and 1405C may include a carbon dioxide (CO2) sensor, an ambient noise sensor, or any other sensor disclosed herein. In the example shown in fig. 14, the first sensor aggregate 1405A is disposed (e.g., mounted) near a point 1415A, which may correspond to a location in the ceiling, in a wall, or elsewhere to one side of the table at which the group 1410 of individuals is seated. In the example shown in fig. 14, the second sensor aggregate 1405B is disposed (e.g., mounted) near a point 1415B, which may correspond to a location in the ceiling, in a wall, or elsewhere above (e.g., directly above) the table at which the group 1410 of individuals is seated. In the example shown in fig. 14, the third sensor aggregate 1405C may be disposed (e.g., mounted) at or near a point 1415C, which may correspond to a location in the ceiling, in a wall, or elsewhere to one side of the table at which the relatively small group 1410 of individuals is seated. Any number of additional sensors and/or sensor modules may be located at other locations in conference room 1402. A sensor assembly may be disposed anywhere in the peripheral structure. The location of a sensor assembly in the peripheral structure may have coordinates (e.g., in a Cartesian coordinate system). 
At least one coordinate (e.g., of x, y, and z) may differ between two or more sensor ensembles disposed, for example, in a peripheral structure. The at least two coordinates (e.g., of x, y, and z) may differ between two or more sensor assemblies disposed, for example, in the peripheral structure. All coordinates (e.g., of x, y, and z) may differ between two or more sensor assemblies disposed, for example, in a peripheral structure. For example, two sensor ensembles may have the same x-coordinate and different y and z coordinates. For example, two sensor ensembles may have the same x and y coordinates and different z coordinates. For example, two sensor ensembles may have different x, y, and z coordinates.
In certain embodiments, one or more sensors of the sensor assembly provide readings. In some embodiments, the sensor is configured to sense and/or identify a parameter. Parameters may include temperature, particulate matter, volatile organic compounds, electromagnetic energy, pressure, acceleration, time, radar, lidar, glass breakage, movement, or gas. The gas may comprise an inert gas. The gas may be a gas harmful to humans. The gas may be a gas present in the ambient atmosphere (e.g., oxygen, carbon dioxide, ozone, chlorinated carbon compounds, or nitrogen). The gas may include radon, carbon monoxide, hydrogen sulfide, hydrogen, oxygen, or water (e.g., moisture). The electromagnetic sensor may comprise an infrared, visible, and/or ultraviolet sensor. The infrared radiation may be passive infrared radiation (e.g., black-body radiation). The electromagnetic sensor may sense radio waves. The radio waves may include broadband or ultra-wideband radio signals. The radio waves may include pulsed radio waves. The radio waves may include radio waves utilized in communications. The gas sensor may sense gas type, flow (e.g., velocity and/or acceleration), pressure, and/or concentration. The readings may have an amplitude range. The readings may have a parameter range. For example, the parameter may be electromagnetic wavelength, and the range may be the range of detected wavelengths.
In some embodiments, the sensor data is responsive to the environment in the peripheral structure and/or any causative factor of the change in the environment (e.g., any environmental disturbance factor). The sensor data may be responsive to a transmitter (e.g., an occupant, an appliance (e.g., a heater, cooler, ventilator, and/or vacuum), an opening) operatively coupled to (e.g., in) the peripheral structure. For example, the sensor data may be in response to an air conditioning duct or in response to an open window. The sensor data may be responsive to activity occurring in the room. The activities may include human activities and/or non-human activities. The activity may include an electronic activity, a gaseous activity, and/or a chemical activity. The activity may include a sensory activity (e.g., visual, tactile, olfactory, auditory, and/or taste). The activity may comprise an electronic and/or magnetic activity. The activity may be perceived by a person. The activity may not be perceived by a person. The sensor data may be responsive to an occupant, a substance (e.g., gas) flow, a substance (e.g., gas) pressure, and/or a temperature in the peripheral structure.
In one example, the sensor aggregates 1405A, 1405B, and 1405C include carbon dioxide (CO2) sensors and ambient noise sensors. The carbon dioxide sensor of sensor aggregate 1405A may provide a reading as depicted in sensor output reading profile 1425A. The noise sensor of sensor aggregate 1405A may provide a reading as also depicted in sensor output reading profile 1425A. The carbon dioxide sensor of sensor aggregate 1405B may provide a reading as depicted in sensor output reading profile 1425B. The noise sensor of sensor aggregate 1405B may provide a reading as also depicted in sensor output reading profile 1425B. The sensor output reading profile 1425B may indicate higher carbon dioxide and noise levels relative to the sensor output reading profile 1425A. The sensor output reading profile 1425C may indicate lower carbon dioxide and noise levels relative to the sensor output reading profile 1425B. The sensor output reading profile 1425C may indicate carbon dioxide and noise levels similar to those of the sensor output reading profile 1425A. Sensor output reading profiles 1425A, 1425B, and 1425C may include indications of other sensor readings, such as temperature, humidity, particulate matter, volatile organic compounds, ambient light, pressure, acceleration, time, radar, lidar, ultra-wideband radio signals, passive infrared, glass breakage, and/or movement.
In some embodiments, data from sensors in the peripheral structure (e.g., and in the sensor ensemble) is collected and/or processed (e.g., analyzed). The data processing may be performed by a processor of the sensor, by a processor of the sensor ensemble, by another sensor, by another ensemble, in the cloud, by a processor of the controller, by a processor in the peripheral structure, by a processor outside the peripheral structure, by a remote processor (e.g., in a different facility), by a manufacturer (e.g., of the sensor, the window, and/or the building network). The data of the sensor may have a time-indicative identification (e.g., may be time-stamped). The data of the sensor may have an identification of the sensor location (e.g., a location stamp). The sensors may be identifiably coupled to one or more controllers.
In particular embodiments, sensor output reading profiles 1425A, 1425B, and 1425C may be processed. For example, as part of the processing (e.g., analysis), a profile of sensor output readings may be plotted on a graph depicting sensor readings as a function of a dimension (e.g., the "X" dimension) of a peripheral structure (e.g., conference room 1402). In one example, the carbon dioxide level indicated in sensor output reading profile 1425A may be indicated as point 1435A of the CO2 graph 1430 of fig. 14. In one example, the carbon dioxide level of sensor output reading profile 1425B may be indicated as point 1435B of the CO2 graph 1430. In one example, the carbon dioxide level indicated in sensor output reading profile 1425C may be indicated as point 1435C of the CO2 graph 1430. In one example, the ambient noise level indicated in sensor output reading profile 1425A may be indicated as point 1445A of the noise graph 1440. In one example, the ambient noise level indicated in sensor output reading profile 1425B may be indicated as point 1445B of the noise graph 1440. In one example, the ambient noise level indicated in sensor output reading profile 1425C may be indicated as point 1445C of the noise graph 1440.
In some embodiments, processing data derived from the sensors includes applying one or more models. The model may comprise a mathematical model. The processing may include fitting (e.g., curve fitting) of the model. The model may be multi-dimensional (e.g., two-dimensional or three-dimensional). The model may be represented as a graph (e.g., a 2-dimensional graph or a 3-dimensional graph). For example, the model may be represented as a contour map (e.g., as depicted in fig. 13). The modeling may include one or more matrices. The model may comprise a topological model. The model may relate to the topology of the sensed parameter in the peripheral structure. The model may relate to temporal variations of the topology of the sensed parameter in the peripheral structure. The model may be environment and/or peripheral structure specific. The model may take into account one or more characteristics of the peripheral structure (e.g., size, opening, and/or environmental interference factors (e.g., transmitters)). The processing of the sensor data may utilize historical sensor data and/or current (e.g., real-time) sensor data. Data processing (e.g., utilizing a model) can be used to predict environmental changes in the peripheral structure and/or recommend actions that mitigate, adjust, or otherwise react to such changes.
In particular embodiments, sensor aggregates 1405A, 1405B, and/or 1405C are capable of accessing the model to allow curve fitting of sensor readings as a function of one or more dimensions of the peripheral structure. In one example, the model can be accessed to generate sensor distribution curves 1450A, 1450B, 1450C, 1450D, and 1450E using points 1435A, 1435B, and 1435C of the CO2 graph 1430. In one example, the model can be accessed to generate sensor distribution curves 1451A, 1451B, 1451C, 1451D, and 1451E using points 1445A, 1445B, and 1445C of the noise graph 1440. In addition to the sensor distribution curves 1450 and 1451 of fig. 14, additional models may utilize additional readings from the sensor aggregates (e.g., 1405A, 1405B, and/or 1405C) to provide further distribution curves. A curve generated in response to use of a model may indicate a value of a particular environmental parameter as a function of a dimension of the peripheral structure (e.g., the "X" dimension, the "Y" dimension, and/or the "Z" dimension).
In certain embodiments, one or more models used to form curves 1450A-1450E and 1451A-1451E may provide a parametric topology of the peripheral structure. In one example, the parameter topology (as represented by the curves 1450A-1450E and 1451A-1451E) may be synthesized or generated from the sensor output reading distribution. The parameter topology may be the topology of any sensed parameter disclosed herein. In one example, the parameter topology of a conference room (e.g., conference room 1402) may include a carbon dioxide profile having a relatively low value at a location away from a conference room table and a relatively high value at a location above (e.g., directly above) the conference room table. In one example, the parameter topology of the conference room may include a multi-dimensional noise distribution having a relatively low value at a location away from the conference room table and a slightly higher value above (e.g., directly above) the conference room table.
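The curve fitting of sensor readings as a function of a room dimension, described above, can be sketched as follows. This is an illustrative sketch only: the sensor positions, the CO2 values, and the choice of a quadratic fitted through three ensemble readings are assumptions for illustration, not part of the disclosure.

```python
# Fit a sensor-reading profile along one room dimension, in the spirit
# of curves 1450A-1450E: three ensemble readings determine a quadratic,
# which can then be evaluated anywhere along the "X" dimension.

def fit_quadratic(points):
    """Return (a, b, c) of y = a*x^2 + b*x + c through three (x, y) points."""
    (x1, _), (x2, _), (x3, _) = points

    def basis(xa, xb, xc):
        # Coefficients of the Lagrange basis polynomial for node xa.
        denom = (xa - xb) * (xa - xc)
        return (1.0 / denom, -(xb + xc) / denom, (xb * xc) / denom)

    a = b = c = 0.0
    for (xa, ya), xb, xc in ((points[0], x2, x3),
                             (points[1], x1, x3),
                             (points[2], x1, x2)):
        ca, cb, cc = basis(xa, xb, xc)
        a += ya * ca
        b += ya * cb
        c += ya * cc
    return a, b, c

def profile(points, x):
    """Evaluate the fitted parameter topology at position x."""
    a, b, c = fit_quadratic(points)
    return a * x * x + b * x + c

# Hypothetical ensembles along the room's X dimension (metres), CO2 in ppm:
readings = [(1.0, 450.0), (4.0, 900.0), (7.0, 480.0)]
```

The fitted curve peaks near the middle of the room, consistent with the example of higher CO2 directly above a conference-room table.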
In some embodiments, sensor data corresponding to an individual is collected upon the individual's arrival at an entry station or admittance location of a peripheral structure (e.g., facility). See, e.g., the admittance location examples of fig. 4A and 4B. In some embodiments, additional sensors deployed throughout the facility are used to continue collecting sensor data corresponding to the individual, e.g., as the individual moves within the facility and is tracked. Thus, abnormal body features may be detected at different times with at least one sensor located in low-flow, regular-flow, and/or high-flow areas of the facility. At different times, individuals may be located in various areas of the facility (e.g., an individual moves into another area from time to time). The at least one sensor may be a plurality of sensors distributed in the peripheral structure and configured to track physical features of the individual at different locations of the peripheral structure. Sensor data may be obtained from a single sensor with measurement consistency, so that readings taken at different times may be compared. In some embodiments, sensor readings obtained at different times are compared to detect relative changes that may indicate a corresponding abnormality affecting the individual (e.g., if the change exceeds a threshold).
In some embodiments, sensor data is collected, compiled, and correlated with the individual from whom the measurement is obtained. The correlation may include an identification (ID) of the individual. The identification may or may not include personal data (e.g., name, home address, telephone number, government identification number, fingerprint, retinal scan, body features, and/or facial features) of the individual. The sensor data (e.g., including raw and/or processed measurements) may be transmitted (e.g., via a controller or compiler) to a database along with corresponding sensor IDs and timestamps (e.g., including time and/or date). Individuals may be identified using tags (e.g., including geolocation technology), facial image processing, body shape, gait, blood pressure, infrared (IR) features, heart rate, and/or any other personal features (e.g., sensed using radar, IR, etc.). The geolocation technology may include radio frequency identification (RFID) chips. The geolocation technology may include BLE, GPS, or UWB technology. The communication may reach at least one software application (e.g., executing in a mobile device of the individual or of another individual) and/or a control system operatively coupled to a peripheral structure (e.g., a building). An application (e.g., a control system and/or phone app) may maintain a table (for one or more users) of sensor IDs, physical characteristics (e.g., temperature), and timestamps (e.g., time and date). Data may be collected over time to determine what is "normal" for each uniquely identified individual (e.g., building occupant). The collected data can be used as a baseline for detecting deviations from the individual's standard. Analysis of collected data relating to an individual in a database may be performed based on relative sensor data (e.g., for a particular sensor). For example, relative sensor measurements of the individual collected at different times may be compared.
Because relative measurements are being used, there may be no need to calibrate the sensors that obtain the data. The absolute value of the sensor data may be unimportant as long as the increment measured by the sensor is accurate (e.g., as long as the relative measurement is accurate).
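The relative-measurement idea can be illustrated with a minimal sketch in which each (individual, sensor) pair keeps its own baseline, so a constant offset in an uncalibrated sensor cancels out of the delta. The class, identifiers, and values below are hypothetical.

```python
# Minimal sketch: per-(user, sensor) baselines make a fixed sensor bias
# irrelevant, because the same bias appears in the baseline and in every
# later reading, and so cancels out of the relative difference.

class RelativeMonitor:
    def __init__(self):
        self.baseline = {}  # (user_id, sensor_id) -> first observed reading

    def delta(self, user_id, sensor_id, value):
        """Return the change relative to this user's baseline on this sensor."""
        key = (user_id, sensor_id)
        if key not in self.baseline:
            self.baseline[key] = value  # first reading establishes the baseline
            return 0.0
        return value - self.baseline[key]

# A sensor that reads, say, 0.8 degC high still yields the true change:
m = RelativeMonitor()
first = m.delta("user123", "ir_cam_7", 37.5)   # biased reading, sets baseline
change = m.delta("user123", "ir_cam_7", 38.6)  # the same bias cancels
```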
In some embodiments, an event (e.g., a notification event) is triggered if the relative difference between sensor measurements exceeds a threshold. To determine a relative-difference threshold to be used as a triggering event for identifying abnormal conditions, an individual may be monitored over time to learn the individual's body feature behaviors (e.g., swings) and behavior patterns. The monitoring may be performed using a learning module. The learning module may include artificial intelligence (AI) comprising machine learning. In one example, an individual occupant and/or user of a peripheral structure (e.g., where the user has a unique identifier such as "123") may routinely exercise during lunch hours, and thus may typically have a sharp rise in temperature and heart rate during that time period. The AI may take this into account when considering undesirable variations in characteristics of user #123. For example, during the course of a day, the temperature, blood pressure, heart rate, etc. of an individual may change according to a normal cycle; these patterns (which can be unique to an individual) can be recorded and a standard for the individual can be established. Seasonal variables and/or other extrinsic factors (referred to herein as "examples") may be input to the machine learning to improve accuracy. Some such examples may be quantified when sensor data is collected. The quantified values may be stored with the sensor data for data analysis, e.g., to determine normal ranges of physical characteristics of the individual under various examples. Once an abnormal condition is detected for an individual, a report may be generated. Optionally, a notification system may be activated to provide notifications to the affected individual. In some embodiments, a notification is (e.g., also) sent to contacts who may have been exposed to the affected individual.
FIG. 15 illustrates a monitoring system 1500 that monitors one or more characteristics of an individual. To facilitate identification of individual users within a facility, an individual may hold a trackable device 1501 such as a phone, smartphone, or RFID tag (e.g., incorporated into an identification/access badge). One or more sensors 1502 (e.g., a sensor ensemble) may be deployed at controlled access points and/or anywhere throughout the facility to sense characteristics of a user at multiple times. In some embodiments, a trackable device is not required, provided that the individual can be identified solely by relying on sensor data (e.g., facial recognition using a camera image or any other uniquely identifiable feature of the individual).
The controller system includes an ID and location tracking module 1503 and a data compiler module 1504. Using the sensor data from the sensors 1502 and the user identifier from the ID tracking module 1503, the data compiler 1504 organizes the sensor data according to individual (e.g., user), sensor ID, time, and date to enable analysis of the physical characteristics of each tracked user over a span of time. The organized data may be stored in a personal database 1505 and/or a collective database 1506. The personal database 1505 may be stored in a mobile device (e.g., a smartphone executing a corresponding app) or other personal device carried by the user, such as a laptop computer. The collective database 1506 may be stored in a networked controller in the facility or outside the facility (e.g., remotely coupled via the cloud). It may be beneficial to use the collective database 1506 in conjunction with tracking multiple users, and optionally to perform notification and contact tracing as described below (e.g., fig. 17).
For the users represented in databases 1505 and/or 1506, the stored sensor data is analyzed at 1507 according to relative changes in physical characteristics. The analysis may use a learning module (e.g., artificial intelligence including machine learning). The learning module 1507 may be implemented in a networked controller or a dedicated user device (e.g., a smartphone). In some embodiments, when new sensor data for a corresponding user is obtained and input into the database 1505 or 1506, the analysis module 1507 determines whether sufficient data is stored to identify relevant normal values for the sensed features that may be used for comparison. Seasonal data 1508 and/or other environmental and extrinsic factors may be provided to the analysis module 1507, for example, to improve the determination of the appropriate normal values to apply. The analysis module provides a report at 1509. When newly collected sensor data indicates an abnormal condition (e.g., the difference between the new sensor data and the calculated standard is greater than a threshold), the notification system 1510 is optionally activated to issue various notifications (e.g., text messages or emails) to the affected user 1511, and/or to contacts 1512 who may have approached the user 1511 during the abnormal condition (e.g., for a period of time above a time threshold), and/or to a central administrator and/or health officer 1513.
Fig. 16 shows a flow diagram of a method 1600 for processing sensor data and detecting abnormal physical characteristics (e.g., conditions). The method of fig. 16 may be performed by a centralized controller (e.g., in a network and/or cloud-based server for a facility) and/or a dedicated user device. At 1601, new sensor data is received, which may include sensor measurements of physical characteristics of an individual, a sensor ID, a time and date of the measurement, a user ID, and/or (optionally) any example values that may characterize extrinsic factors (e.g., a user activity schedule) having a predictable impact on the physical characteristics. At 1602, incoming data is parsed into personal and/or collective databases, e.g., such that a distribution can be established for each user, with sensor data of users aggregated according to, e.g., shared example values. At 1603, a body characteristic analysis is performed/updated, where corresponding normal values for the user may be established in conjunction with the sensor ID and/or any relevant example values. Establishing normal values corresponding to the sensor ID may enable the analysis to rely on relative changes in sensor measurements. Where available sensors have a calibration accuracy that allows reliance on the absolute magnitude of sensor measurements, interpretation of the data and establishment of normal values can be performed without considering the sensor ID. At 1604, the established normal values are compared to the new sensor data. If the difference is less than the threshold, the method returns to 1601 to continue receiving and processing new sensor data. If the difference is greater than the threshold, a notification of an abnormal condition is generated at 1605. The appropriate value for the threshold may depend on the type of physical characteristic being monitored and/or the demographic classification (e.g., gender, age, and/or body type) of the particular user.
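The flow of method 1600 might be sketched as follows. The record layout, the running-mean normal, and the fixed threshold are illustrative assumptions (the patent leaves the model and the demographic-dependent threshold unspecified).

```python
# Sketch of one pass through method 1600: parse a record into the
# database (1602), establish a normal value keyed by (user, sensor)
# (1603), compare the new reading against it (1604), and emit a
# notification when the deviation exceeds the threshold (1605).

def process_record(db, record, threshold):
    """Return a notification dict (step 1605), or None (return to 1601)."""
    key = (record["user_id"], record["sensor_id"])
    history = db.setdefault(key, [])
    notification = None
    if history:
        normal = sum(history) / len(history)           # step 1603
        if abs(record["value"] - normal) > threshold:  # step 1604
            notification = {"user_id": record["user_id"],
                            "deviation": record["value"] - normal}
    history.append(record["value"])                    # step 1602
    return notification

db = {}
for v in (36.6, 36.7, 36.5):  # baseline body-temperature readings
    process_record(db, {"user_id": "X", "sensor_id": "s1", "value": v}, 1.0)
alert = process_record(db, {"user_id": "X", "sensor_id": "s1", "value": 38.4}, 1.0)
```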
In some embodiments, tracking a user in a facility provides a basis for identifying potential contacts of a user in whom an abnormal physical feature is detected. For example, an opt-in building-secure contact tracing system may store the times and locations of the user's movement for correlation. The tracking system may utilize (1) (e.g., anonymized or identifiable) occupancy data of the application user, and (2) sufficient data infrastructure for preserving the time ranges during which the user was present. As part of, or independent of, user tracking and user sensor data compilation during a user's occupancy of a facility (e.g., including at least one building), each user may carry a user device that automatically connects to the network infrastructure upon entering the facility. Based at least in part on interaction with the user device, the time range occupied by the user may be retained locally and/or remotely (e.g., stored in a tracking database). In some embodiments, tracking of a user is based at least in part on communication between an ID device (e.g., a smartphone, RFID badge, or laptop) carried by the user and the wireless and/or wired network infrastructure, based at least in part on geolocation methods such as GPS tracking, ranging, triangulation, WiFi presence, short-range communications such as UWB, and/or other methods. In some embodiments, user tracking is achieved without a device carried by the user, for example, by utilizing remote sensing and identification (e.g., using facial recognition) that relies on sensors deployed throughout the facility (e.g., in a device ensemble).
In some embodiments, anonymous or non-anonymous tracking statistics are maintained over a rolling period of time (e.g., days, weeks, months, or years). For each individual with a registered user ID, the individual's location and time in the facility may be stored. When a particular user is identified as having an abnormal condition, the locations and times of the user's movement in the building may be retrieved as search criteria (e.g., within a certain distance and optionally for a time above a time threshold) for extracting the user IDs and times at which other facility occupants converged with the affected user. Notifications may be pushed, in an anonymized or identified manner, to other facility users (e.g., colleagues or co-residents) who were present during the potential exposure time range. In some embodiments, local preservation of potential exposure time ranges is used to facilitate adherence to medical guidelines (e.g., within a jurisdiction) for appropriate isolation and/or disinfection procedures, e.g., to enable targeted responses to affected individuals and/or to those facility regions most frequently visited by affected individuals.
Fig. 17 shows a tracking system 1700 that may operate independently or as part of a physical characteristic monitoring system embedded in a network in a facility. The presence record database 1701 includes multiple presence records or caches for each registered user ID, such as a cache 1702 for user X and a cache 1703 for user Y. Each cache compiles an array of the times and locations of the respective user tracked during a rolling period defined by a specified retention time, the array coupled with measurements of one or more sensor types. Data that has been stored in the cache for longer than the retention time may optionally be discarded at 1740 as old data (e.g., to alleviate personal security concerns when the data is no longer required for the contact tracing functionality). New tracking data is input into the database 1701 via an interface 1704, which may include resources on the network. The network may be a network of controllers. Optionally, an anonymizing program 1705 may be coupled to the interface 1704 that converts tracking data based on actual user IDs into anonymous identifiers for storage in the database 1701. The anonymizing program 1705 may comprise a high-security device that prevents user tracking information stored in the database 1701 from revealing user movements in case the database 1701 is compromised. When anonymized data is used to identify potential contacts, the anonymizing program 1705 may be configured to obscure user data when storing it, and to de-obscure data only for the limited purpose of sending notifications. At 1710 in fig. 17, each individual entering the facility presents his or her authorized access credentials upon entry. The time and place of entry are forwarded through the interface 1704 (and optionally through the anonymizer 1705) for storage in the person's corresponding presence record. While the individual remains in the facility, the movement of the individual is tracked at 1720.
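One way the anonymizing program 1705 might obscure stored user IDs is with a keyed hash: presence records are filed under a pseudonym that cannot be reversed without the key, while the key holder can recompute the pseudonym of a known user for the limited purpose of sending a notification. This mechanism, the key, and the names below are assumptions for illustration; the patent does not specify an algorithm.

```python
# Keyed-hash pseudonyms (HMAC-SHA256): a compromised database reveals
# only opaque identifiers, yet the key holder can look up a known user
# by recomputing that user's pseudonym.

import hashlib
import hmac

SECRET_KEY = b"facility-secret"  # illustrative; would be held securely

def pseudonym(user_id: str) -> str:
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

presence_db = {}  # pseudonym -> list of (zone, time) presence entries

def record_presence(user_id, zone, t):
    presence_db.setdefault(pseudonym(user_id), []).append((zone, t))

record_presence("userX", "lobby", 1000)
record_presence("userX", "room-1402", 1060)
```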
Tracking may be recorded as the time spent within a predetermined area (e.g., zone) of a facility, for example, to reduce data storage requirements and/or processing time. The zones may be fixed areas or may be dynamically defined according to the type of use and/or the changing number of occupants in the space. The continuous tracking data (e.g., by sensors located throughout the facility) may be forwarded through interface 1704 (and optionally through anonymizer 1705) for storage in the corresponding presence records. When the individual leaves the facility at 1730, tracking is stopped.
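A per-user presence cache with the rolling retention window of 1740 can be sketched as follows. Times are plain seconds and the retention period is arbitrary; both are assumptions for illustration.

```python
# Rolling presence cache: each new tracking entry is appended, and any
# entry older than the retention window is discarded (step 1740), so the
# cache never holds data past the specified retention time.

class PresenceCache:
    def __init__(self, retention):
        self.retention = retention
        self.entries = []  # list of (time, zone, sensor_readings)

    def add(self, t, zone, readings=None):
        self.entries.append((t, zone, readings or {}))
        # Drop anything that has aged past the retention window.
        cutoff = t - self.retention
        self.entries = [e for e in self.entries if e[0] >= cutoff]

cache = PresenceCache(retention=3600)  # keep one hour of tracking data
cache.add(0, "lobby")
cache.add(1800, "zone-A", {"temp": 36.6})
cache.add(4000, "zone-B")              # the lobby entry (t=0) now expires
```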
In some embodiments, the body feature tracking sensors may be disposed throughout the facility. Multiple sensor types may be required throughout a facility to track (e.g., accurately) one feature of an individual. A single sensor type may be required to track (e.g., accurately) one feature of an individual throughout a facility. In some embodiments, the sensor network tracks a physical characteristic of an individual in the facility. In some embodiments, the sensor network tracks a plurality of different physical characteristics of individuals in the facility. Tracking the physical characteristics of an individual in a facility may include recording the identification of the individual, the location of the individual in the facility, the time and date of the measurement, the type of sensor measurement (e.g., an infrared sensor measurement), and optionally another type of sensor measurement (if needed to accurately reflect the physical characteristics of the individual, e.g., a visible sensor measurement).
FIG. 18 illustrates a flow chart of a method of contact tracing based at least in part on a presence record database. A search for the facility regions and times at which a particular user was present in the facility is performed at 1801, which may be initiated upon automatic detection that the user has an abnormal physical characteristic (e.g., an increase in temperature) or upon the occurrence of another triggering event, such as a report from the user and/or a health officer to the facility and/or network administrator that the user has suffered a physical abnormality (e.g., a disease). At 1802, the retrieved locations (e.g., regions) and times are compared to the rolling data of times and locations stored for other users, e.g., to find neighboring locations within a distance threshold and optionally for a time exceeding a time threshold, and the results are generated in a report at 1803. The results may optionally trigger a notification 1804 to be sent (i) to the particular user, (ii) to potentially affected users who were proximate to the particular user for an excessive amount of time, (iii) to a responsible party of the facility, an organization resident in the facility, and/or an organization owning the facility, and/or (iv) to a jurisdictional officer (e.g., a health and/or government officer). At times, the generation of reports and optional notifications may be made selective according to the type of abnormal physical feature identified (e.g., fever rather than abnormal sweating).
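The search at 1801-1802 can be sketched minimally as follows, with zone co-presence standing in for the distance threshold; the trail format and threshold are assumptions for illustration.

```python
# Contact search: given per-user (zone, time) trails, find every other
# user who shared a zone with the affected user within a time threshold.

def find_contacts(trails, affected_id, time_threshold):
    """trails: {user_id: [(zone, time), ...]}; returns set of contact IDs."""
    contacts = set()
    for zone, t in trails[affected_id]:
        for user, trail in trails.items():
            if user == affected_id:
                continue
            for z2, t2 in trail:
                if z2 == zone and abs(t2 - t) <= time_threshold:
                    contacts.add(user)
    return contacts

trails = {
    "userX": [("zone-A", 100), ("zone-B", 500)],
    "userY": [("zone-A", 150)],   # overlaps userX in zone-A within 300 s
    "userZ": [("zone-B", 5000)],  # same zone as userX, but far later
}
```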
In some embodiments, the availability of tracking data of facility occupants is used to help maintain physical distances between occupants (e.g., maintain social distances beyond a distance threshold). A user may inform the network of his destination (e.g., using a software application, i.e., app). For example, a user sitting at a desk wants to enter conference room X. The app may use the tracking data and suggest the least congested occupant route as the best route from the occupant's current location (e.g., desk) to the occupant's destination in the facility (e.g., conference room X). During the movement time requested by the requesting user, the app may use projection analysis (e.g., using machine learning, occupancy schedules for the facility, and/or activity schedules for the facility) to anticipate occupancy of the peripheral structure. The app may suggest the route that is least crowded with occupants during the expected travel time as the best route from the occupant's current location (e.g., desk) to the occupant's destination in the facility (e.g., conference room X). As the user begins traveling from their location (e.g., desk) toward their requested destination (e.g., conference room X), the controller can view the tracking database (e.g., in real-time during occupant travel) to check for clusters of other occupants along the projected path. When such a cluster is found, the alternative route may be evaluated to find and suggest another available route that is less congested to the destination. Such less congested routing options may be automatically shared with (e.g., pushed to) the user through the app.
Fig. 19 shows an example of a flow chart 1900 depicting operations for suggesting an optimal route to an occupant requesting to reach a destination in a facility along a least congested route. In 1901, the occupant inputs a target destination. The identity and/or route of the occupant may be input (e.g., automatically) through a network of the facility (e.g., using sensors). The occupant may manually enter their current location and/or identity into the app. An assessment of occupant density in the various routes in the peripheral structure from the occupant's current location to the occupant's target destination is performed in 1902, and the least congested route is then identified and suggested in 1903. The suggested route may optionally be tracked during travel in 1908, the tracking including tracking movement of the occupant along the route (e.g., in real time) in 1904, evaluating occupant density along the route (e.g., in real time) in 1905, and deciding (e.g., in real time) whether a less congested route is available in 1906. The occupant is notified of the alternative, less congested route when one is available.
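The least-congested-route suggestion of 1902-1903 can be sketched as a cheapest-path search over a zone graph whose edge weights are current occupant counts drawn from the tracking database. The zone graph, occupant counts, and cost function are assumptions for illustration, not the patent's specified algorithm.

```python
# Least-congested route via a Dijkstra-style search: the path whose
# cumulative occupant count is lowest is the one suggested to the user.

import heapq

def least_congested_route(graph, start, goal):
    """graph: {zone: [(neighbor, occupant_count), ...]}."""
    heap = [(0, start, [start])]
    seen = set()
    while heap:
        congestion, zone, path = heapq.heappop(heap)
        if zone == goal:
            return path, congestion
        if zone in seen:
            continue
        seen.add(zone)
        for nxt, occupants in graph.get(zone, []):
            if nxt not in seen:
                heapq.heappush(heap, (congestion + occupants, nxt, path + [nxt]))
    return None, float("inf")

graph = {
    "desk":   [("hall-1", 8), ("hall-2", 1)],
    "hall-1": [("room-X", 0)],
    "hall-2": [("atrium", 2)],
    "atrium": [("room-X", 1)],
}
route, load = least_congested_route(graph, "desk", "room-X")
```

The shorter hallway route passes 8 occupants; the detour through the atrium passes only 4, so the detour is suggested.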
In some embodiments, the availability of tracking data of facility occupants is used to help maintain physical distances (e.g., social distances) between occupants. As occupants are tracked within the facility, a learning module may compile the occurrence of typical movements and/or repetitive events of an individual at various times of the day. Using learned trends of user movement, the learning module can anticipate where people will go as they move. When the learning module identifies a projected path of movement for a particular user, the controller (e.g., using a processor) may evaluate a condition (e.g., congested or not) on the expected path. In the event that the user's expected path is congested, a (e.g., pushed) notification with a recommended route may be sent to the user to avoid the congested area (e.g., to maintain the user's social distance while walking the path). For example, a user may have a tracking history showing a repeating pattern of moving from their desk to a printer station, from the printer station to a file room, and then from the file room back to their desk. As the user begins traveling from their desk toward the printer station, the controller may anticipate such a round trip and then consult the tracking database to check for clusters of other occupants along the projected path. When such clusters are found, alternative routes may be evaluated to find another available route to the destination that is less congested or least congested (e.g., to avoid crowds). The alternative route may be automatically shared with the user through the app.
Fig. 19 shows an example of a flow chart 1950 depicting operations for suggesting an optimal route to an occupant requesting to reach a destination in a facility along a least congested route. At 1951, the user ID of an occupant present in the facility is logged. This may be done by identifying the occupant as the occupant enters the facility or by any other login of the occupant (e.g., manual login by the occupant). While the user is present in the facility, the user's location is tracked at 1952. As data points for the user accumulate over time, the learning (and prediction) module analyzes the movement of the user as a function of time and location and, as the data builds up (e.g., used as a learning set for the learning module), learns at 1953 the movement patterns of the occupant in the peripheral structure (e.g., as a function of time). A check is made at 1957 to determine whether the learning module predicts a likelihood of a particular movement based at least in part on the user's tracking (e.g., based on previous movement routes, movement starting locations, movement destinations, and/or movement times). If no movement is predicted, further monitoring and refinement of the predictive model is performed at 1952. If a particular movement is predicted, the occupant density (e.g., potential interaction with building occupants and/or other potential risks) of the predicted route is evaluated at 1954. At 1955, the degrees of congestion of occupants in the respective routes to the predicted destination are determined and compared. If no better route is available, further monitoring and refinement of the prediction model and/or of the route congestion levels is performed at 1955. If a less congested route is found, the occupant is notified at 1956 to optionally take the better route, and the flow then returns to 1952 for further monitoring and refinement of the prediction model.
Monitoring of whether the occupant takes the route may continue. Monitoring may continue in real time during occupant travel along the route. The congestion estimation may take into account the real-time congestion level and the forecasted congestion level (e.g., using projection analysis, such as using machine learning, occupancy schedules for the facility, and/or activity schedules for the facility).
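The learning step 1953 and the prediction check 1957 might be approximated by counting repeated (origin, destination) transitions per user and predicting the most frequent destination from the user's current location. The counting scheme and the minimum-count cutoff are assumptions for illustration, not the patent's specified learning module.

```python
# Movement-pattern learning by transition counts: repeated trips build
# up evidence, and a prediction is made only once a transition has been
# seen often enough (min_count) to be considered a learned pattern.

from collections import Counter

class MovementModel:
    def __init__(self, min_count=2):
        self.transitions = Counter()  # (user, origin, dest) -> count
        self.min_count = min_count

    def record(self, user, origin, dest):
        self.transitions[(user, origin, dest)] += 1

    def predict(self, user, origin):
        """Most frequent destination from origin, or None if unlearned."""
        best, best_n = None, 0
        for (u, o, d), n in self.transitions.items():
            if u == user and o == origin and n > best_n:
                best, best_n = d, n
        return best if best_n >= self.min_count else None

model = MovementModel()
for _ in range(3):                        # the desk->printer->files loop
    model.record("userX", "desk", "printer")
    model.record("userX", "printer", "files")
    model.record("userX", "files", "desk")
model.record("userX", "desk", "kitchen")  # a one-off trip, below min_count
```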
In some embodiments, environmental characteristics are monitored using any of the sensor modes to discover anomalous characteristics of people and/or surfaces in a facility. For example, changes in humidity can be monitored as an indication of excessive sweating by the patient. Similarly, gas or chemical component concentrations may be monitored as an indication of various diseases. In some embodiments, a sensor or collection of sensors deployed in a facility is used to sense and/or identify the temperature of a designated surface (e.g., furniture and/or fixture surfaces such as countertops, doors, desktops, handles, windows, frames, etc.) that is a target of conventional disinfection. Such surfaces may be selected for being a significant reservoir of infectious agents (e.g., pathogens), e.g., if not sterilized, they may collect the infectious agent and potentially transfer it to other occupants. Periodic cleaning and monitoring of the surface may be performed. Monitoring of the surface may be initiated manually or by otherwise detecting cleaning activity. After cleaning (e.g., using typical disinfectants and/or other liquids), evaporation of the cleaning liquid (e.g., solvent) from the surface may cool the surface. A cleaning event may be detected by measuring the surface temperature at a plurality of sample times and comparing a plurality of consecutive samples (e.g., two or more) to detect a temperature drop (e.g., exceeding a threshold and/or a rate of temperature drop), for example, because the temperature of the surface decreases due to evaporation. Once a cleaning event has been identified, the elapsed time since the last cleaning of the surface may be detected. 
In some embodiments, the surface temperature may be measured intermittently or (e.g., substantially) continuously (e.g., at a predetermined sampling rate) to determine how much time has elapsed since the last cleaning and, optionally, whether the surface requires another cleaning (e.g., because the time elapsed since the last cleaning event is too long).
FIG. 20 shows a flow chart of one example of a method for monitoring surface cleaning. At 2001, at least one surface to be cleaned (e.g., and disinfected) is identified. For example, the surface may be specified by a facility administrator. At 2002, a temperature at or near the surface is measured with a sensor (e.g., a remote temperature sensor, an IR thermal camera, and/or an air temperature sensor), e.g., along with the ambient (e.g., room) temperature. At 2003, a check is performed to determine if there is a predetermined drop in the measured temperature of the particular surface (e.g., one not accompanied by a matching drop in ambient temperature). The purpose of simultaneously monitoring the ambient temperature may be to exclude the possibility that the surface temperature decreased in response to a change in the ambient temperature. If the surface temperature drops beyond any simultaneous drop in ambient temperature, it is determined that cleaning has occurred, and a cleaning event is recorded at 2004. At 2005, the elapsed time since the last cleaning event is compared to a time threshold (e.g., threshold X, where X is a specified time period according to the requested cleaning schedule). If the elapsed time exceeds the threshold, a notification is sent at 2006 to initiate cleaning, for example, to inform a designated individual (e.g., a building manager) that the surface cleaning is overdue, or to activate an automatic cleaner (e.g., an automatic wiper). After 2005 or 2006, a time delay 2007 may optionally be inserted before returning to 2002, e.g., for re-measuring the surface temperature (and optionally also the ambient temperature).
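The temperature-drop check of 2003 and the elapsed-time check of 2005 can be sketched in a few lines of code. This is a minimal illustration, not part of the disclosure: the function names and the 2 °C drop threshold are hypothetical, and a deployed system would draw its samples from the sensors described herein.

```python
# Hypothetical sketch of the surface-cleaning monitor of FIG. 20.
# The 2.0 degC drop threshold is an illustrative assumption.

def detect_cleaning_event(surface_temps, ambient_temps, drop_threshold=2.0):
    """Return True if the surface temperature drops between consecutive
    samples by more than `drop_threshold` beyond any simultaneous ambient
    drop (i.e., evaporative cooling rather than the room getting colder)."""
    for i in range(1, len(surface_temps)):
        surface_drop = surface_temps[i - 1] - surface_temps[i]
        ambient_drop = ambient_temps[i - 1] - ambient_temps[i]
        # Excess drop not explained by the ambient temperature change.
        if surface_drop - ambient_drop > drop_threshold:
            return True
    return False

def cleaning_overdue(last_cleaned_s, now_s, max_interval_s):
    """Compare elapsed time since the last cleaning event to threshold X."""
    return (now_s - last_cleaned_s) > max_interval_s
```

For example, a surface series of [22.0, 22.1, 18.5] °C against a flat ambient series registers a cleaning event, while a surface drop that merely tracks a falling room temperature does not.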
In some embodiments, the sensor is operatively coupled to at least one controller and/or processor. The sensor readings may be obtained by one or more processors and/or controllers. The controller may include a processing unit (e.g., a CPU or GPU). The controller may receive input (e.g., from at least one sensor). The controller may include electrical circuitry, electrical wiring, optical wiring, and/or an electrical outlet. The controller may deliver an output. The controller may include a plurality of (e.g., sub-) controllers. The controller may be part of a control system. The control system may include a master controller, a floor controller (e.g., including a network controller), and a local controller. The local controller may be a window controller (e.g., controlling an optically switchable window), a peripheral structure controller, or a component controller. For example, the controller can be part of a hierarchical control system (e.g., including a master controller that directs one or more controllers, such as a floor controller, a local controller (e.g., a window controller), a peripheral structure controller, and/or a component controller). The physical location of the controller types in the hierarchical control system may change over time. For example, at a first time, the first processor may assume the role of a master controller, the second processor may assume the role of a floor controller, and the third processor may assume the role of a local controller. At a second time, the second processor may assume the role of a master controller, the first processor may assume the role of a floor controller, and the third processor may retain the role of a local controller. At a third time, the third processor may assume the role of a master controller, the second processor may assume the role of a floor controller, and the first processor may assume the role of a local controller.
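The role reassignment among processors described above amounts to remapping a fixed set of roles onto a pool of processors over time. A minimal sketch follows; the processor names and data structures are illustrative assumptions, not the disclosed implementation.

```python
# Hypothetical sketch: roles are fixed, but which processor fills each
# role may change from one time to the next.

ROLES = ("master", "floor", "local")

def assign_roles(processors, ordering):
    """Map each role to a processor. `ordering` is a tuple of indices
    into `processors`, one per role (master, floor, local)."""
    return {role: processors[i] for role, i in zip(ROLES, ordering)}

processors = ["p1", "p2", "p3"]
t1 = assign_roles(processors, (0, 1, 2))  # first time: p1 is master
t2 = assign_roles(processors, (1, 0, 2))  # second time: p2 takes over as master
```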
The controller may control one or more devices (e.g., be directly coupled to the devices). The controller may be located in proximity to the one or more devices it controls. For example, the controller may control a light-switchable device (e.g., an IGU), an antenna, a sensor, and/or an output device (e.g., a light source, a sound source, an odor source, a gas source, an HVAC outlet, or a heater). In one embodiment, the floor controller may direct one or more window controllers, one or more peripheral structure controllers, one or more component controllers, or any combination thereof. The floor controller may comprise a network controller. For example, a floor (e.g., including a network) controller may control a plurality of local (e.g., including window) controllers. A plurality of local controllers may be disposed in a portion of a facility (e.g., in a portion of a building). A portion of a facility may be a floor of the facility. For example, a floor controller may be assigned to a floor. In some embodiments, a floor may include multiple floor controllers, for example, depending on the size of the floor and/or the number of local controllers coupled to the floor controller. For example, a floor controller may be assigned to a portion of a floor. For example, a floor controller may be assigned to a portion of the local controllers disposed in the facility. For example, a floor controller may be assigned to a portion of a floor of a facility. The master controller may be coupled to one or more floor controllers. The floor controllers may be located in the facility. The master controller may be located within the facility or outside the facility. The master controller may be disposed in the cloud. The controller may be part of, or operatively coupled to, a building management system. The controller may receive one or more inputs. The controller may generate one or more outputs.
The controller may be a single-input single-output (SISO) controller or a multiple-input multiple-output (MIMO) controller. The controller may interpret received input signals. The controller may acquire data from one or more components (e.g., sensors). The acquiring may include receiving or extracting. The data may include measurements, estimates, determinations, generations, or any combination thereof. The controller may include feedback control. The controller may include feed-forward control. The control may include on-off control, proportional-integral (PI) control, or proportional-integral-derivative (PID) control. The control may include open-loop control or closed-loop control. The controller may include a user interface. The user interface may include (or be operatively coupled to) a keyboard, a keypad, a mouse, a touch screen, a microphone, a voice recognition package, a camera, an imaging system, or any combination thereof. The output may include a display (e.g., a screen), speakers, or a printer. Fig. 21 shows an example of a control system architecture 2100 that includes a master controller 2108 that controls floor controllers 2106, which in turn control local controllers 2104. In some embodiments, the local controller controls one or more IGUs, one or more sensors, one or more output devices (e.g., one or more transmitters), or any combination thereof. Fig. 21 shows an example of a configuration in which the master controller is operatively coupled (e.g., wirelessly and/or by wire) to a Building Management System (BMS) 2124 and a database 2120. Arrows in fig. 21 indicate communication paths. The controller can be operatively coupled (e.g., directly/indirectly and/or wired and/or wirelessly) to an external source 2110. The external source may comprise a network. The external source may include one or more sensors or output devices.
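As a hedged illustration of the PID control mentioned above, a discrete-time PID loop might look like the following. The class name, gains, and time step are arbitrary choices for illustration, not taken from the disclosure.

```python
# Illustrative discrete PID controller: u = Kp*e + Ki*sum(e)*dt + Kd*de/dt.
# Gains and time step are hypothetical.

class PIDController:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        """One control step: return the actuator command for this sample."""
        error = setpoint - measurement
        self.integral += error * self.dt
        # No derivative term on the very first sample.
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Setting the integral and derivative gains to zero reduces this to on-off-like proportional control; a PI controller uses only the first two terms.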
The external source may include a cloud-based application and/or a database. The communication may be wired and/or wireless. The external source may be located outside the facility. For example, the external source may include one or more sensors and/or antennas disposed, for example, on a wall or ceiling of the facility. The communication may be unidirectional or bidirectional. In the example shown in fig. 21, all communication arrows are meant to be bidirectional.
FIG. 22 shows a flow diagram of a method 2200 for detecting perturbations in environmental features of a peripheral structure. The method of fig. 22 may be performed by an individual sensor of a sensor ensemble. The method of fig. 22 may be performed by a first sensor coupled to (e.g., in communication with) a second sensor. The method of fig. 22 may be directed by a controller coupled to (e.g., in communication with) the first sensor and/or the second sensor. The method of fig. 22 begins at 2210 with obtaining sensor readings from one or more sensors of a sensor ensemble. At 2220, the readings are processed (e.g., by considering the peripheral structure, historical readings, benchmarks, and/or modeling) to generate results. At 2230, the results are utilized to detect environmental changes (e.g., at a particular time and/or location) and/or predict future readings of the one or more sensors. At 2240, the results may optionally be communicated, for example, to an interested party.
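Steps 2210-2230 can be illustrated with a minimal sketch that compares current readings to a historical baseline and flags deviations as environmental changes. The function name and the deviation threshold are illustrative assumptions, not the disclosed processing.

```python
# Hypothetical sketch of steps 2210-2230: obtain readings, process them
# against a baseline, and report which sensors saw a change.

def detect_perturbation(readings, baseline, threshold):
    """Return {sensor_id: deviation} for sensors whose reading deviates
    from the historical baseline by more than `threshold`."""
    deviations = {sid: value - baseline[sid] for sid, value in readings.items()}
    return {sid: dev for sid, dev in deviations.items() if abs(dev) > threshold}
```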
In particular embodiments, sensor readings from a particular sensor may be correlated with sensor readings from sensors of the same type or of different types. Receipt of a sensor reading may cause the sensor to access correlation data from other sensors disposed within the same peripheral structure. Based at least in part on the accessed correlation data, a reliability of the sensor can be determined or estimated. In response to determining or estimating the reliability of the sensor, the sensor output reading may be adjusted (e.g., increased/decreased). A reliability value may be assigned to the sensor based on the adjusted sensor reading.
The sensor readings may be any type of reading, such as detection of movement of an individual within the peripheral structure, temperature, humidity, or any other characteristic detected by the sensor. The sensor readings may be correlated with the correlation data. The correlation data may be accessed from other sensors disposed in the peripheral structure. The correlation data may relate to output readings of the same type of sensor or of different types of sensors operating within the peripheral structure. In one example, a noise sensor may access data from a movement sensor to determine whether one or more individuals have entered the peripheral structure. One or more individuals moving within the peripheral structure may emit noise. In one example, the output signal from the noise sensor may be confirmed by a second noise sensor and/or by the movement detector. Based at least in part on the accessed correlation data, a running analysis (e.g., assessment) of environmental characteristics can be performed at one or more locations of the environment. The one or more locations may be correlated to the location of the individual in the environment (e.g., to detect any abnormal physical features of the individual). Environmental features that are perturbed by the individual can be analyzed for any abnormal physical feature. Once the detected environmental features have been analyzed for abnormal physical features, those features may be reported (e.g., as disclosed herein).
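One plausible way to realize the correlation-based reliability check described above is to compute a correlation coefficient between the primary sensor's reading series and those of corroborating sensors, then score reliability by how many corroborating sensors agree. This is an assumed sketch, not the disclosed algorithm; the `min_corr` cutoff is arbitrary.

```python
# Hypothetical reliability check built on Pearson correlation between
# reading series from sensors in the same peripheral structure.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series
    (between -1 for negatively and +1 for positively correlated readings)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def reliability(primary, corroborating, min_corr=0.5):
    """Fraction of corroborating sensors whose reading series correlates
    with the primary sensor's series at or above `min_corr`."""
    agreeing = sum(1 for series in corroborating if pearson(primary, series) >= min_corr)
    return agreeing / len(corroborating)
```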
Fig. 23 shows an example of a controller 2305 for controlling one or more sensors. The controller 2305 includes a sensor correlator 2310, a model generator 2315, an event detector 2320, a processor and memory 2325, and a network interface 2350. The sensor correlator 2310 operates to detect correlations between various sensor types. For example, an infrared radiation sensor measuring an increase in infrared energy may be positively correlated with an increase in measured temperature. The sensor correlator may establish a correlation coefficient, such as a coefficient for negatively correlated sensor readings (e.g., a correlation coefficient between -1 and 0). For example, the sensor correlator may establish a coefficient for positively correlated sensor readings (e.g., a correlation coefficient between 0 and 1).
In some embodiments, the sensor data may be time-dependent. In some embodiments, the sensor data may be space-dependent. The model may utilize temporal and/or spatial dependencies of the sensed parameters. The model generator may allow fitting of sensor readings as a function of one or more dimensions of the peripheral structure. In one example, a model providing a sensor profile of carbon dioxide may utilize various gas diffusion models, which may allow for prediction of carbon dioxide levels at points between sensor locations. The processor and memory may facilitate processing of the model.
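As a simple stand-in for the gas-diffusion models mentioned above, the carbon dioxide level at a point between two sensor locations could be estimated by interpolation along one dimension of the peripheral structure. This is a deliberately crude assumption for illustration; a real diffusion model would account for airflow, sources, and sinks.

```python
# Hypothetical 1-D interpolation of CO2 (ppm) between sensor locations,
# standing in for the richer gas-diffusion models the text mentions.

def interpolate_co2(x, sensor_positions, sensor_readings):
    """Estimate the CO2 level at position `x` (same units as
    `sensor_positions`) by linear interpolation between the two
    bracketing sensors."""
    pts = sorted(zip(sensor_positions, sensor_readings))
    for (x0, c0), (x1, c1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return c0 + t * (c1 - c0)
    raise ValueError("x lies outside the sensed span")
```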
In some embodiments, the sensor and/or sensor ensemble may act as an event detector. The event detector may be operable to guide the activity of the sensors in the peripheral structure. In one example, in response to the event detector determining that very few individuals remain in the peripheral structure, the event detector may direct the carbon dioxide sensor to decrease its sampling rate. The reduction in sampling rate may extend the life of the sensor (e.g., carbon dioxide sensor). In another example, the event detector may increase the sampling rate of the carbon dioxide sensor in response to the event detector determining that a large number of individuals are present in the room. In one example, in response to the event detector receiving a signal from a glass-break sensor, the event detector may activate one or more movement detectors and/or one or more radar units of the peripheral structure. The network interface (e.g., 2350) may be configured or designed to communicate with one or more sensors via a wireless communication link, a wired communication link, or any combination thereof.
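The occupancy-driven sampling-rate adjustment in the examples above could be sketched as follows. The occupant-count thresholds and interval values are illustrative assumptions, not from the disclosure.

```python
# Hypothetical policy: sparse occupancy -> sample less often (extends
# sensor life); crowded room -> sample more often.

def co2_sampling_interval_s(occupant_count, base_interval_s=300):
    """Pick a CO2 sampling interval (seconds) from the occupant count."""
    if occupant_count == 0:
        return base_interval_s * 4   # empty room: slow way down
    if occupant_count <= 5:
        return base_interval_s       # few occupants: normal rate
    return base_interval_s // 5      # crowded room: sample more often
```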
The controller can monitor and/or direct a (e.g., physical) change in an operating condition of the devices, software, and/or methods described herein. Control may include regulating, manipulating, limiting, directing, monitoring, adjusting, modulating, changing, altering, inhibiting, inspecting, or managing. Being controlled (e.g., by a controller) may include being attenuated, modulated, changed, managed, suppressed, normalized, adjusted, constrained, supervised, manipulated, and/or directed. Control may include controlling a control variable (e.g., temperature, power, voltage, and/or profile). Control may include real-time or offline control. The calculations utilized by the controller may be done in real time and/or offline. The controller may be a manual or non-manual controller. The controller may be an automatic controller. The controller may operate on request. The controller may be a programmable controller. The controller may be programmed. The controller may include a processing unit (e.g., a CPU or GPU). The controller may receive input (e.g., from at least one sensor). The controller may deliver an output. The controller may include a plurality of (e.g., sub-) controllers. The controller may be part of a control system. The control system may include a master controller, a floor controller, and a local controller (e.g., a peripheral structure controller or a window controller). The controller may receive one or more inputs. The controller may generate one or more outputs. The controller may be a single-input single-output (SISO) controller or a multiple-input multiple-output (MIMO) controller. The controller may interpret received input signals. The controller may acquire data from one or more sensors. The acquiring may include receiving or extracting. The data may include measurements, estimates, determinations, generations, or any combination thereof. The controller may include feedback control.
The controller may include feed-forward control. The control may include on-off control, proportional-integral (PI) control, or proportional-integral-derivative (PID) control. The control may include open-loop control or closed-loop control. The controller may include a user interface. The user interface may include (or be operatively coupled to) a keyboard, keypad, mouse, touch screen, microphone, voice recognition package, camera, imaging system, or any combination thereof. The output may include a display (e.g., a screen), speakers, or a printer.
The methods, systems, and/or devices described herein may include a control system. The control system may be in communication with any of the devices (e.g., sensors) described herein. The sensors may be of the same type or of different types, e.g., as described herein. For example, the control system may be in communication with the first sensor and/or the second sensor. The control system may control the one or more sensors. The control system may control one or more components of a building management system (e.g., a lighting, security, and/or air conditioning system). The controller may adjust at least one (e.g., environmental) characteristic of the peripheral structure. The control system may use any component of the building management system to regulate the peripheral structure environment. For example, the control system may regulate the energy supplied by a heating element and/or by a cooling element. For example, the control system may regulate the velocity of air flowing into and/or out of the peripheral structure through vents. The control system may include a processor. The processor may be a processing unit. The controller may include a processing unit. The processing unit may be central. The processing unit may include a central processing unit (abbreviated herein as "CPU"). The processing unit may be a graphics processing unit (abbreviated herein as "GPU"). A controller or control mechanism (e.g., including a computer system) may be programmed to implement one or more methods of the present disclosure. The processor may be programmed to implement the methods of the present disclosure. A controller may control at least one component of the systems and/or apparatuses disclosed herein.
Fig. 24 shows a schematic example of a computer system 2400 programmed or otherwise configured to perform one or more operations of any of the methods provided herein. The computer system may control (e.g., direct, monitor, and/or regulate) various features of the methods, apparatus, and systems of the present disclosure, such as controlling heating, cooling, lighting, and/or ventilation of peripheral structures, or any combination thereof. The computer system may be part of or in communication with any of the sensors or sensor assemblies disclosed herein. The computer can be coupled to one or more of the mechanisms disclosed herein and/or any portion thereof. For example, the computer may be coupled to one or more sensors, valves, switches, lights, windows (e.g., IGUs), motors, pumps, optical components, or any combination thereof. The computer system may include any of the at least one processor and/or circuit board disclosed herein, for example in the context of temperature measurement.
The computer system may include a processing unit (e.g., 2406) (also referred to herein as a "processor," "computer," or "computer processor"). The computer system may include a memory or memory location (e.g., 2402) (e.g., random access memory, read-only memory, flash memory), an electronic storage unit (e.g., 2404) (e.g., hard disk), a communication interface (e.g., 2403) (e.g., a network adapter) for communicating with one or more other systems, and peripheral devices (e.g., 2405), such as a cache, other memory, data storage, and/or an electronic display adapter. In the example shown in fig. 24, memory 2402, storage unit 2404, interface 2403, and peripheral device 2405 communicate with processing unit 2406 through a communication bus (solid lines), such as a motherboard. The storage unit may be a data storage unit (or data repository) for storing data. The computer system may be operatively coupled to a computer network ("network") (e.g., 2401) with the aid of the communication interface. The network may be the internet, an internet and/or extranet, or an intranet and/or extranet that is in communication with the internet. In some cases, the network is a telecommunications and/or data network. The network may include one or more computer servers that may implement distributed computing, such as cloud computing. In some cases, the network may implement a peer-to-peer network with the help of the computer system, which may enable devices coupled to the computer system to act as clients or servers.
The processing unit may execute a series of machine-readable instructions that may be embodied in a program or software. The instructions may be stored in a memory location such as memory 2402. The instructions may be directed to a processing unit, which may then program or otherwise configure the processing unit to implement the methods of the present disclosure. Examples of operations performed by a processing unit may include fetch, decode, execute, and write-back. The processing unit may interpret and/or execute the instructions. The processor may include a microprocessor, a data processor, a central processing unit (CPU), a graphics processing unit (GPU), a system on a chip (SOC), a coprocessor, a network processor, an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a controller, a programmable logic device (PLD), a chipset, a field-programmable gate array (FPGA), or any combination thereof. The processing unit may be part of a circuit, such as an integrated circuit. One or more other components of computer system 2400 may be included in a circuit.
The storage unit may store files such as drivers, libraries, and saved programs. The storage unit may store user data (e.g., user preferences and user programs). In some cases, the computer system may include one or more additional data storage units located external to the computer system, such as on a remote server in communication with the computer system via an intranet or the internet.
The computer system can communicate with one or more remote computer systems through the network. For example, the computer system may communicate with a remote computer system of a user (e.g., an operator). Examples of remote computer systems include personal computers (e.g., laptop PCs), slate or tablet PCs (e.g., Apple® iPad, Samsung® Galaxy Tab), telephones, smartphones (e.g., Apple® iPhone, Android-enabled devices, Blackberry®), or personal digital assistants. A user (e.g., a client) may access the computer system via the network.
The methods described herein may be implemented by machine (e.g., computer processor) executable code stored on an electronic storage location of the computer system, such as memory 2402 or electronic storage unit 2404. The machine-executable or machine-readable code may be provided in the form of software. During use, processor 2406 may execute the code. In some cases, the code may be retrieved from the storage unit and stored in memory for ready access by the processor. In some cases, the electronic storage unit may be eliminated, and the machine-executable instructions stored in memory.
The code may be pre-compiled and configured for use with a machine adapted to execute the code, or may be compiled at runtime. The code may be provided in a programming language that may be selected to enable the code to be executed in a pre-compiled or compiled manner.
In some embodiments, the processor includes code. The code may be program instructions. The program instructions may cause at least one processor (e.g., a computer) to direct a feed-forward and/or a feedback control loop. In some embodiments, the program instructions cause the at least one processor to direct a closed-loop and/or open-loop control scheme. The control may be based at least in part on one or more sensor readings (e.g., sensor data). One controller may direct multiple operations. At least two operations may be directed by different controllers. In some embodiments, different controllers may direct at least two of operations (a), (b), and (c). In some embodiments, the non-transitory computer readable medium causes each different computer to direct at least two of operations (a), (b), and (c). In some embodiments, different non-transitory computer readable media cause each different computer to direct at least two of operations (a), (b), and (c). The controller and/or computer readable medium may direct any of the devices disclosed herein or components thereof. The controller and/or computer readable medium may direct any operation of the methods disclosed herein.
In some embodiments, the at least one sensor is operatively coupled to a control system (e.g., a computer control system). The sensor may include an optical sensor, an acoustic sensor, a vibration sensor, a chemical sensor, an electrical sensor, a magnetic sensor, a flow sensor, a movement sensor, a speed sensor, a position sensor, a pressure sensor, a force sensor, a density sensor, a distance sensor, or a proximity sensor. The sensors may include temperature sensors, weight sensors, material (e.g., powder) level sensors, metrology sensors, gas sensors, or humidity sensors. The metrology sensors may include measurement sensors (e.g., of height, length, width, angle, and/or volume). The metrology sensor may comprise a magnetic sensor, an acceleration sensor, an orientation sensor, or an optical sensor. The sensor may send and/or receive acoustic (e.g., echo, including ultrasonic) signals, magnetic signals, electronic signals, or electromagnetic signals. The electromagnetic signal may include a visible light signal, an infrared signal, an ultraviolet signal, a radio wave signal, or a microwave signal. The gas sensor may sense any of the gases described herein. The distance sensor may be a type of metrology sensor. The distance sensor may comprise an optical sensor or a capacitive sensor. The temperature sensor may comprise a bolometer, bimetallic strip, calorimeter, exhaust gas thermometer, flame detector, Gardon gauge, Golay cell, heat flux sensor, infrared thermometer, microbolometer, microwave radiometer, net radiometer, quartz thermometer, resistance temperature detector, silicon bandgap temperature sensor, special sensor microwave/imager, thermistor, thermocouple, thermometer (e.g., resistance thermometer), or pyrometer. The temperature sensor may comprise an optical sensor. The temperature sensor may include image processing. The temperature sensor may include a camera (e.g., an IR camera, a CCD camera).
The pressure sensor may include a barograph (self-recording barometer), a boost gauge, a bourdon tube gauge, a hot-filament ionization gauge, an ionization gauge, a McLeod gauge, an oscillating U-tube, a permanent downhole gauge, a piezometer, a Pirani gauge, a pressure sensor, a pressure gauge, a tactile sensor, or a time gauge. The position sensor may include an accelerometer, a capacitive displacement sensor, a capacitive sensing device, a free-fall sensor, a gravimeter, a gyroscope sensor, a shock sensor, an inclinometer, an integrated circuit piezoelectric sensor, a laser rangefinder, a laser surface velocimeter, a lidar, a linear encoder, a linear variable differential transformer (LVDT), a liquid capacitance inclinometer, an odometer, a photosensor, a piezoelectric accelerometer, a rate sensor, a rotary encoder, a rotary variable differential transformer, an autosynchronizer, a shock detector, a shock data recorder, a tilt sensor, a tachometer, an ultrasonic thickness meter, a variable reluctance sensor, or a velocity receiver. The optical sensor can include a charge-coupled device, a colorimeter, a contact image sensor, an electro-optic sensor, an infrared sensor, a kinetic inductance detector, a light-emitting diode (e.g., used as a photosensor), a light-addressable potentiometric sensor, a Nichols radiometer, a fiber optic sensor, an optical position sensor, a photodetector, a photodiode, a photomultiplier, a phototransistor, a photosensor, a photoionization detector, a photoresistor, a photoswitch, a phototube, a scintillator, a Shack-Hartmann wavefront sensor, a single-photon avalanche diode, a superconducting nanowire single-photon detector, a transition-edge sensor, a visible-light photon counter, or a wavefront sensor. The one or more sensors may be connected to a control system (e.g., to a processor, a computer). The sensor may comprise a complementary metal oxide semiconductor (CMOS).
In various embodiments, the network infrastructure supports a control system for one or more windows, such as electrochromic (e.g., tintable) windows. The control system may include one or more controllers operatively coupled (e.g., directly or indirectly) to one or more windows. Although the disclosed embodiments describe electrochromic windows (also referred to herein as "optically switchable windows," "tintable windows," or "smart windows"), the concepts disclosed herein are applicable to other types of switchable optical devices, including, for example, liquid crystal devices and suspended particle devices. For example, liquid crystal devices and/or suspended particle devices may be implemented instead of, or in addition to, electrochromic devices. The tintable window may operate using a liquid crystal device, a suspended particle device, a micro-electromechanical systems (MEMS) device such as a micro-shutter, or any technique now known or later developed that is configured to control light transmission through the window. Windows (e.g., having MEMS devices for tinting) are described in U.S. patent application Ser. No. 14/443,353, entitled "MULTI-PANE WINDOWS INCLUDING ELECTROCHROMIC DEVICES AND ELECTROMECHANICAL SYSTEMS DEVICES," filed May 15, 2015, and incorporated herein by reference in its entirety. In some cases, one or more tintable windows may be located within the interior of a building, for example between a conference room and a hallway. In some cases, one or more tintable windows may be used in automobiles, trains, airplanes, and other vehicles, for example, in place of passive and/or non-tinted windows.
In some embodiments, physical characteristic measurements of an individual may be performed at certain locations of the body. For example, the temperature may be measured on the forehead. At least one sensor may be disposed in a structure (e.g., a fixed structure) to measure a body characteristic. The sensors may be focused on the body location to sense and/or identify body features. For example, the sensor may be focused on the forehead to measure temperature. The sensor may be directed to focus at a lateral (e.g., horizontal) distance from the sensor. The distance may correspond to a distance from the body position. The distance may correspond to a specified location of the individual from the sensor. The sensor may be directed to focus at a vertical distance from the sensor. The vertical distance may correspond to a vertical distance from the body position. The distance may correspond to an average location of body locations in the population, e.g., depending on age group and/or gender.
In some embodiments, the sensor is operatively coupled to a network. At least one controller operatively coupled to the network (e.g., one or more processors operatively coupled to the network) may direct the sensor to focus on the body location based at least in part on image recognition of facial landmarks. Facial landmark recognition may utilize one or more (e.g., other) sensors, such as visible light sensors and/or IR sensors. The sensor may be configured to identify the individual based at least in part on (i) a temperature of the individual as compared to the surroundings and/or background and/or (ii) facial landmark features of the individual. The distance between facial features and/or the size of the facial features, compared to population averages, may be used to estimate the lateral (e.g., horizontal) distance of an individual from the sensor; for example, the distance between the eyes, or the size of the pupils. The lateral distance may be estimated using a combination of IR sensor and visible sensor (e.g., camera) data that identifies individuals based at least in part on thermal signatures in environments with different (e.g., lower) thermal signatures. The sensor used at least in part to identify the body position (e.g., forehead) of the individual may be the same as, or different from, the sensor used to measure the body characteristic (e.g., temperature) of the individual.
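The distance-between-the-eyes idea above can be illustrated with a pinhole-camera estimate: given the camera's focal length in pixels and an assumed population-average interpupillary distance (IPD), the apparent eye separation in pixels yields a lateral distance via similar triangles. The 63 mm average IPD is an assumption for illustration; as the text notes, the average varies with age group and gender.

```python
# Hypothetical pinhole-camera distance estimate: Z = f * IPD / ipd_pixels.
# The 63 mm (0.063 m) population-average IPD is an illustrative assumption.

def lateral_distance_m(focal_length_px, assumed_ipd_m, measured_ipd_px):
    """Estimate the individual's lateral distance (meters) from the sensor
    using the apparent distance between the eyes in the image."""
    return focal_length_px * assumed_ipd_m / measured_ipd_px
```

With a 1000-pixel focal length and a measured eye separation of 100 pixels, the estimate is 0.63 m, within the ~0.1 m to ~1 m measurement range discussed below.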
In some embodiments, a body characteristic (e.g., temperature) is measured at a lateral distance from the sensor (e.g., at a focal distance from the sensor). The individual may be positioned at about the measured lateral distance. The lateral distance may be at least about 0.1 meters (m), 0.3m, 0.6m, 0.9m, or 1m. The lateral distance measured from the sensor may be between any of the distances described above (e.g., about 0.1m to about 1m). The sensor may have a dwell time during which the physical characteristic may be measured. The dwell time may be up to about 0.25 seconds (s), 0.5s, 1s, 1.5s, 2s, 3s, 4s, or 5s. The dwell time may be any value in between the above values (e.g., about 0.25s to about 5s). The thermal characteristic (e.g., temperature) may be measured with an accuracy of at least about +/-0.7 ℃, +/-0.5 ℃, +/-0.25 ℃, or better. The sensor used for the measurement (e.g., as part of the camera) may have a horizontal field of view and a vertical field of view. The vertical field of view may be at least about 30 degrees (°), 40°, 55°, 60°, 70°, 80°, 90°, 100°, 110°, 120°, or 150°. The horizontal field of view may be at least about 30°, 40°, 55°, 60°, 65°, 70°, 75°, 80°, 90°, or 100°. The vertical field of view may be larger than the horizontal field of view, e.g., by a factor of about 1.1, 1.2, 1.4, 1.5, 1.7, 1.9, or 2.0.
Sometimes, multiple sensors may be utilized to view the body features. The measurement density (e.g., sampling rate) of the plurality of sensors over time may be (e.g., substantially) similar. For example, a thermal IR camera and a depth camera may be utilized to determine a body position (e.g., forehead) and/or measure a body characteristic (e.g., temperature) of a user. The sensor may be positioned at a location that facilitates focusing on the body location (e.g., within the user's horizontal field of view and/or vertical field of view). The sensor may be operatively coupled to an actuator that facilitates its translation (e.g., horizontally and/or vertically) to adjust the capture of the user's body position within the sensor's field of view. The sensor may be stationary. The translation may be manual or automatic (e.g., using at least one controller).
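One way such fusion of a thermal camera with a depth camera could be sketched (a simplified, hypothetical combination of a thermal grid and a co-registered depth grid; the distance band and grid values are illustrative assumptions, not the disclosed method):

```python
def forehead_temperature(thermal, depth, d_min=0.3, d_max=1.0):
    """Return the hottest thermal reading (deg C) among pixels whose
    co-registered depth falls inside the expected user-distance band
    [d_min, d_max] in meters; None if no pixel qualifies.
    `thermal` and `depth` are equally sized 2D grids (lists of rows)."""
    candidates = [t
                  for row_t, row_d in zip(thermal, depth)
                  for t, d in zip(row_t, row_d)
                  if d_min <= d <= d_max]
    return max(candidates) if candidates else None

# A warm pixel at 0.6 m is kept; background pixels at 2.5 m are rejected,
# which is how the depth channel filters out the scene behind the user.
```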
Fig. 25 shows an example of a user 2505 undergoing temperature measurement at a body feature. An IR sensor 2502 is disposed in or attached to the structure 2503. A depth camera 2501 is similarly disposed in or attached to the structure 2503. For example, at least one of the sensors may be disposed in the frame (e.g., in a cavity of the frame). For example, at least one of the sensors may be attached to the outside of the frame. The user 2505 is located at a designated location 2504 (e.g., 362 of fig. 3). The IR sensor 2502 is provided at the average chin height h1 of the user, and at a specified distance d1 from the designated location 2504. The IR sensor 2502 has a vertical field of view of 90 degrees.
In some embodiments, assessing a physical characteristic of an individual comprises a plurality of operations. For example, the evaluation may include identifying facial landmarks of the user and/or a (e.g., actual and/or expected) distance of the user from the sensor. The control system may direct the sensor to focus on the body position (e.g., based at least in part on facial landmark recognition and/or the user's distance from the sensor). Once the sensor is configured to focus on the body location, the sensor obtains measurements from the body location. The measurements may be processed (e.g., adjusted according to various adjustment methods) to produce a result of the measured physical characteristic. The result may be compared to a threshold (e.g., value) and a report may be generated. The report may be sent to a user, an organization, a manager, or any combination thereof. The report may contain the adjusted physical characteristic (e.g., adjusted to reflect the actual value of the physical characteristic), the variance from the normal value, and any remedial and/or mitigating measures (e.g., recommendations and/or directions). The report may be used as disclosed herein. The non-manipulated temperature measurements, processed results, comparisons to thresholds, and/or reports may be saved in one or more databases. The database may be operatively coupled to a network. The accuracy of the report can be improved by using the data in the database. In some embodiments, the measurement of the physical characteristic is performed in a contactless manner.
In some embodiments, the raw data of the physical characteristics of the user measured by the sensor does not accurately reflect the actual physical characteristics of the user. The raw data needs to be adjusted. The adjustment may be performed automatically (e.g., by a processor). The adjustment may be suggested by an Artificial Intelligence (AI) computation scheme. The AI may include a learning module. The learning module may utilize historical measurements of individuals and/or objects having subject characteristics (e.g., temperature) as compared to ground truth (e.g., local thermometers that provide accurate readings). The learning module may utilize the composite measurements as part of its learning set. The learning module may utilize simulations as part of its learning set. For example, an IR sensor measuring the temperature of an individual at time t may be compared to a thermometer measuring the temperature of an individual at time t. The adjustment may take into account modeling (e.g., physical modeling) that simulates the characteristics of the subject measured by the sensor. For example, when the characteristic is temperature, the blackbody radiation at an average individual forehead position may be simulated and fed to the learning module as part of its learning set as a composite measurement. The learning module (including machine learning) may utilize a regression algorithm and/or a classification algorithm. The output of the machine learning module may be the (e.g., lateral) distance of the subject from the sensor and/or the measured physical characteristic (e.g., temperature).
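A minimal sketch of such a learning-based adjustment, assuming a simple linear regression of raw IR readings against thermometer ground truth (the disclosure contemplates richer machine-learning schemes; this is illustrative only):

```python
def fit_linear_calibration(raw_readings, ground_truth):
    """Ordinary least-squares fit of truth ~ slope * raw + intercept,
    using stdlib arithmetic only. Inputs are paired readings taken at
    (approximately) the same time t."""
    n = len(raw_readings)
    mean_x = sum(raw_readings) / n
    mean_y = sum(ground_truth) / n
    sxx = sum((x - mean_x) ** 2 for x in raw_readings)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(raw_readings, ground_truth))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

def adjust(raw_value, slope, intercept):
    """Map a raw sensor reading onto the calibrated (ground-truth) scale."""
    return slope * raw_value + intercept
```

For instance, if the IR sensor consistently reads 1 ℃ below a reference thermometer, the fit recovers a slope of about 1 and an intercept of about 1, and `adjust` corrects subsequent readings accordingly.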
In some embodiments, the body feature is measured using a sensor array (e.g., a camera with a sensor array, such as an IR or visible light camera). The sensor array may include at least about 25 pixels, 32 pixels, 800 pixels, 1080 pixels, 1280 pixels, or 1920 pixels in its FLS. (For example, the array may be a 25 × 25, 25 × 32, or 32 × 32 sensor array.) The sensor array may include at least about 1 megapixel (Mpxl), 4Mpxl, 8Mpxl, 10Mpxl, or 12Mpxl at its FLS. The sensor can measure over a wide operating range, for example, a wide span of the body feature. For example, the temperature sensor may be configured to measure temperatures spanning an operating temperature range of about -40 ℃ to about 85 ℃ or an operating temperature range of about -40 ℃ to about 300 ℃. The sensor may be a high accuracy sensor. For example, the temperature sensor may have an accuracy of at least about ± 1 ℃, or ± 0.5 ℃ (or any other temperature accuracy value disclosed herein). A sensor array (e.g., a camera with a sensor array) may have an adjustable focus (e.g., automatically adjusted using at least one controller). The camera may have one or more lenses. The sensor array may be sensitive to at least the visible spectrum (e.g., including RGB sensors). The sensor array may be sensitive to at least the infrared spectrum. The sensor may be included in a camera that includes stereo vision. The sensor may be part of a camera. The camera may have a shutter (e.g., a rolling shutter). The sensor may have a small pixel size. A small pixel size may have an FLS of up to about 1.2 microns (μm), 1.4 μm, 1.5 μm, 2.0 μm, 2.5 μm, or 3 μm.
In some embodiments, the sensor may be free to move. The movement of the sensor may be controlled (e.g., automatically controlled by at least one controller). Movement of the sensor may be achieved by an actuator operatively coupled to the sensor. The sensor may be configured to have at least 1, 2, 3, 4, 5, or 6 degrees of freedom. The six degrees of freedom may include translation (forward/backward, upward/downward, and leftward/rightward) and rotation (pitch, yaw, and roll).
In some embodiments, a depth camera is used to filter noise and/or distinguish a user from a background. The camera may include stereo vision. The camera may include a plurality of sensors spaced apart, e.g., configured to enable stereo vision capability. The camera module may include a processor. The depth camera may include a sensor (e.g., a red-green-blue (RGB) sensor) that is sensitive to the visible spectrum (e.g., as part of a sensor array) and/or a rolling shutter. The depth camera module may integrate data from IR and visible sensors. The depth camera may include an IR sensor, an IR laser, or a visible sensor. For example, the depth camera may include multiple visible sensors and one IR sensor and/or laser. For example, the depth camera may include multiple IR sensors and/or lasers and one visible sensor. The depth camera may include a laser (e.g., an IR laser). The depth camera may be a web camera. The depth camera may project and/or sense infrared radiation. The depth camera may utilize a comparison of captured data (e.g., sensed information) from two sensors spaced apart (e.g., horizontally and/or vertically) from each other.
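The comparison of captured data from two spaced-apart sensors reduces, in the simplest case, to triangulating depth from disparity; a hypothetical sketch (the focal length and baseline values are assumed, not taken from the disclosure):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic stereo relation Z = f * B / d, where f is the focal length
    in pixels, B the baseline between the two sensors in meters, and d the
    pixel disparity of the same feature between the two views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# With an assumed 700 px focal length and a 6 cm baseline, a 42 px
# disparity corresponds to a feature about 1.0 m from the camera; larger
# disparities correspond to nearer features (e.g., the user vs. background).
```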
Fig. 26 shows an example of a flow chart depicting (e.g., contactless) evaluation of an individual's temperature. In operation 2601, identification of facial landmarks of the user and/or a (e.g., actual and/or expected) distance of the user from the sensor is performed. In operation 2602, focusing of the sensor on the body location is performed (e.g., based at least in part on facial landmark recognition and/or the distance of the user from the sensor). Once the sensor is configured to focus on the body location, the sensor obtains measurements from the body location in operation 2603. In operation 2604, the measurements are processed (e.g., adjusted according to various adjustment methods) to produce a result of the measured body characteristic. The result is compared to a threshold (e.g., value) in operation 2605 and a report is generated in operation 2606. Non-manipulated temperature measurements, processed results, comparisons to thresholds, and/or reports are optionally saved in one or more databases at operation 2608. The database may be operatively coupled to a network. The data in the database is optionally utilized to improve the accuracy of the report in operation 2607. The temperature can be measured, processed, and provided (e.g., to a user) as an output within at most about 0.25 seconds (sec), 0.5sec, 1sec, 1.5sec, 2sec, 3sec, or 5sec.
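The measure-adjust-compare-report-save flow of fig. 26 can be sketched end to end as follows (a hypothetical simplification: the linear distance-dependent adjustment, the normal value, and the threshold are assumed illustrative values, not the disclosed adjustment methods):

```python
def assess_temperature(raw_reading_c, distance_m, threshold_c=38.0, log=None):
    """Sketch of operations 2603-2608: adjust the raw reading for the
    user's distance (assumed linear falloff with distance), compare
    against a threshold, generate a report, and optionally save it."""
    DRIFT_PER_METER_C = 0.4          # assumed reading drop per meter
    adjusted = raw_reading_c + DRIFT_PER_METER_C * distance_m   # op 2604
    flagged = adjusted >= threshold_c                           # op 2605
    report = {                                                  # op 2606
        "adjusted_c": round(adjusted, 2),
        "deviation_c": round(adjusted - 37.0, 2),  # vs. assumed normal value
        "flagged": flagged,
    }
    if log is not None:                                         # op 2608
        log.append({"raw_c": raw_reading_c, **report})
    return report
```

Keeping both the raw reading and the adjusted result in the log mirrors saving the non-manipulated measurements alongside the processed results, so later calibration passes (operation 2607) can refit the adjustment.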
Fig. 27A and 27B show various options for performing temperature measurement, for example, in a configuration similar to that shown in fig. 25. Fig. 27A shows an example of two sensors including a thermal (IR) camera 2701 and an optical camera 2704. The thermal camera is disposed in a housing 2702 that is separate from the housing of the optical camera 2704. For example, the thermal camera may be housed in the device aggregate, while the optical camera is not housed in the device aggregate. Thermal camera 2701 may be coupled to at least one processor 2713 via cable 2707 and connectors 2703 and 2708 (e.g., a USB connector or any other connector, for example, as disclosed herein). Optical camera 2704 may be coupled to the processor via a cable 2705 and a connector 2706 (e.g., a USB connector or any other connector, e.g., as disclosed herein). The at least one processor may be configured to record the accumulated results of the two cameras in operation 2709, process them in operation 2710, and generate results (e.g., insights) in operation 2711. The at least one processor 2713 is connected to a power supply 2712. The at least one processor 2713 is operatively coupled to a storage device (e.g., a database) 2715 via a link 2714 (e.g., a wired and/or wireless connection, such as a WiFi connection). The measurement system 2700 and the database 2715 are operatively coupled to a network 2716, e.g., controlling the facility in which the cameras are disposed. The at least one processor 2713 and database 2715 may or may not be located in the facility. Fig. 27B shows an example of two sensors including a thermal (IR) camera 2751 and an optical camera 2754. The thermal camera and the optical camera are disposed in a housing 2752. For example, the thermal camera and the optical camera may be housed in the device aggregate. Cameras 2751 and 2754 may be coupled to at least one processor 2763 via a cable.
The optical camera may utilize a separate cable from the cable of the thermal camera. For example, a cable 2757 may couple the optical camera to the at least one processor 2763 via a connector 2758, and the thermal camera 2751 may be coupled to the at least one processor 2763 via connectors 2753 and 2756 and a cable 2755. The connector may be a USB connector or any other connector (e.g., as disclosed herein). The at least one processor 2763 may be configured to record the accumulated results of the two cameras in operation 2759, process the results in operation 2760, and generate results (e.g., insights) in operation 2761. The at least one processor 2763 is connected to a power supply 2762. The at least one processor 2763 is operatively coupled to a storage device (e.g., a database) 2765 via a link 2764 (e.g., a wired and/or wireless connection, such as a WiFi connection). Measurement system 2750 and database 2765 are operatively coupled to network 2766, for example, to control the facility in which the cameras are disposed. The at least one processor 2763 and database 2765 may or may not be located in the facility. The optical camera may be a depth camera. The aggregate of devices may include processing capabilities. For example, the at least one processor (e.g., 2763 or 2713) may reside at least partially within the device aggregate. For example, the at least one processor may be a plurality of processors, at least one of which resides in the aggregate of devices. Processing the results may include removing noise and/or calibrating the captured sensor measurements. Noise may be generated by the background surrounding the user. Saving in the database may include saving a record file. The power supply (e.g., 2712 or 2762) required for data processing may provide up to 2 volts (V), 4V, 5V, or 10V, and up to 1 ampere (A), 2A, or 3A.
The at least one processor (e.g., 2763 or 2713) may include a CPU or GPU. The at least one processor may include a media player. The at least one processor may be included in a circuit board. The circuit board may include a Jetson Nano™ development kit (e.g., a 2GB or 4GB development kit) or a Raspberry Pi kit (e.g., a 1GB, 2GB, 4GB, or 8GB kit). The at least one processor may be operatively coupled to a plurality of ports, including at least one media port (e.g., DisplayPort, HDMI, and/or micro-HDMI), USB port, or audio-video jack, such as may be included in a circuit board. The at least one processor may be operatively coupled to a Camera Serial Interface (CSI) or a Display Serial Interface (DSI), for example, as part of a circuit board. The at least one processor may be configured to support communications such as Ethernet (e.g., gigabit Ethernet). The circuit board may include Wi-Fi functionality, Bluetooth functionality, or a wireless adapter. The wireless adapter may be configured to conform to a wireless networking standard in the 802.11 protocol suite (e.g., USB 802.11ac). The wireless adapter may be configured to provide a high-throughput Wireless Local Area Network (WLAN), for example, over a frequency band of at least about 5 GHz. The USB port may have a transfer speed of at least about 480 megabits per second (Mbps), 4,800Mbps, or 10,000Mbps. The at least one processor may include a synchronous (e.g., clocked) processor. The clock speed of the processor may be at least about 1.2 gigahertz (GHz), 1.3GHz, 1.4GHz, 1.5GHz, or 1.6GHz. The at least one processor may include Random Access Memory (RAM). The RAM may comprise double data rate Synchronous Dynamic RAM (SDRAM). The RAM may be configured for a mobile device (e.g., a laptop, tablet, or mobile phone, such as a cellular phone). The RAM may include Low-Power Double Data Rate (LPDDR) RAM. The RAM may be configured to allow channels at least approximately 16, 32, or 64 bits wide.
The at least one processor may comprise a single-board computer (SBC). The at least one processor may be configured to run multiple neural networks in parallel (e.g., for image classification, object detection, segmentation, and/or speech processing). The at least one processor may be powered by up to about 10 watts (W), 8W, 5W, or 4W.
In some embodiments, a frame system is utilized to provide an interactive experience for a user. FIG. 28 shows an example of a frame system 2800 and a user 2802 having an interactive experience with the frame system. The frame system frames a slate 2803, which may include a static display or a media display that displays time-varying digital media. The frame system frames a transparent panel such as 2805, which may be a window (e.g., a glass window or a polymer window). The window may or may not be a tintable window. The frame system may have a cavity 2804 accessible to the user. The cavity may have a depth of up to the maximum depth of the frame system. The frame system may include one or more sensors and/or emitters operatively coupled to a network (e.g., and a control system). The frame system may be modular. Fig. 28 shows an example of a frame system 2850 constructed from a plurality of modular units 2857a, 2857b, and 2857c. The modular frame system unit includes a transparent panel (e.g., a window), such as 2855, and information boards 2853a, 2853b, and 2853c, which may include poster boards, white boards, magnetic boards, cork boards, or media displays. Each of the frame systems 2857a, 2857b, and 2857c is divided into a 2 x 2 matrix that frames four sections (e.g., 2853a and 2855). Modular frame system unit 2857b includes an opaque panel 2856 that covers a portion of the matrix while facilitating user 2852 access to a cavity in the frame system in a manner similar to access to cavity 2804. Frame system 2857b is similar to frame system 2800. The frame system may be used as a self-service kiosk that serves one or more individuals (e.g., simultaneously).
In some embodiments, the frame system includes internal frame portions that are integrated together into a single frame system. The framework system may include a cable and one or more interactive devices. The one or more interactive devices may include a dispenser, a sensor, and/or a transmitter. The sensor may comprise an optical (e.g. electromagnetic) sensor. The emitter may comprise a lighting device, a projector, or a media display. The internal frame portions may be integrated with one another in one or more of a variety of joint types. The type of engagement may include linear engagement or non-linear (e.g., staggered) engagement. The joint may comprise a butt joint, dovetail, miter joint, mortise and tenon joint, wood tenon, picket, scarf joint, V-joint, lap joint, butt-strap joint, or tongue and groove joint. The butt-joint may comprise a simple butt-joint or a double butt lap joint. The lap joint may comprise a double butt lap joint, a half lap joint, a flat lap joint, a diagonal lap joint, a double lap joint, or a tenon lap joint. The strap joint may include a single strap joint, a double strap joint, a female double strap joint, or a beveled double strap joint. The tongue and groove engagement may comprise a shoulder rest tongue and groove engagement.
In some embodiments, a frame system (also referred to herein as a "frame device") includes a housing configured to house one or more processors and/or wiring. The shell may include one or more openings. The opening may facilitate operatively coupling (e.g., wired and/or wirelessly connecting) the frame system to a network (e.g., a local network of a facility). The opening may facilitate maintenance of one or more components of a frame system disposed in the housing. For example, the opening may facilitate maintenance of wiring and/or a processor disposed in the housing. Maintenance includes repair, replacement, or introduction of new components. The opening may comprise a reversibly openable and closable lid or door. The cover or door may be secured to the body of the housing by one or more hinges or screws. The cover or door may include a mechanism that facilitates snapping the cover or door (respectively) to the body of the housing. The opening of the housing may be provided on a front, back or side of the housing, where the front is the side that is facing the user designed to interact with the interactive frame system. For example, frame system 2900 depicts a front side of the frame system, and frame system 2940 depicts a rear side view thereof. Sometimes, interactive framework systems are designed to interact with multiple users on opposite sides thereof. In this case, the cover or door of the housing may be located on one side designed for interaction with a user, or on an opposite side thereof designed for interaction with another user (e.g., whether at the same time or not). The interior space of the frame portion (forming the frame system) may be used to house wiring and/or devices (e.g., sensors or emitters; e.g., projector 342 of fig. 3). 
The interior space of the frame portion (forming the frame system) may be used to house wiring, while a device (e.g., a sensor or transmitter) may be attached to the exterior of the frame portion (e.g., as sensor 2922 of fig. 29).
Fig. 29 shows an example of a frame system 2900 that frames a display panel 2903, which may be a media display, and an opaque panel 2907 having two openings: opening 2904, through which the level of sterilant disposed in a dispenser of the frame system may be visible, and opening 2902, for one or more sensors (e.g., a camera or any other electromagnetic sensing device). The frame 2900 is configured to frame a transparent panel, such as 2905 (e.g., a window). Frame 2900 is configured to house network, communication, and/or control related devices in housing 2906. The frame system 2900 is configured as a 2 x 2 matrix that can house various panels, such as opaque panel 2907, display panel 2903, and transparent panel 2905. The frame system 2900 includes, as part of the housing 2906, a maintenance opening 2909 (e.g., for maintaining electronic components, such as a processor and/or wiring) and a wiring opening 2910 (e.g., to facilitate operatively coupling (e.g., connecting) the frame system to a network (e.g., a local network of a facility)).
FIG. 29 shows an example of a frame system 2920 that frames a display panel 2923, which may be a media display, and sensors 2922 that are attached to the outside of the frame 2920. The frame 2920 houses a projector 2924. The projector 2924 may project words and/or images. For example, the projector may project a guidance signal that suggests a location where the user is to stand (e.g., similar to image 346 projected by projector 342 in fig. 3). The projector 2924 is disposed so that it is aligned with the edge of the opaque plate 2907 to be able to project an image without being blocked by the display panel. The frame 2920 houses a dispenser 2927 (e.g., to dispense disinfectant or another hygienic substance, such as a liquid, foam, or gel). The frame 2920 is configured to frame a transparent plate, such as 2925 (e.g., a window). The frame 2920 is configured to house network, communication, and/or control related devices in the housing 2926. The frame system 2920 is configured as a 2 x 2 matrix that can house various panels, such as opaque panels, display panels (e.g., 2923), and transparent panels (e.g., 2925). Frame system 2920 is another depiction of frame system 2900 with opaque plate 2907 removed therefrom. The frame system 2920 includes, as part of the housing 2926, a maintenance opening 2929 (e.g., for maintaining electronic components such as processors and/or wiring) and a wiring opening 2930 (e.g., to facilitate operatively coupling (e.g., connecting) the frame system to a network (e.g., a local network of a facility)).
Fig. 29 shows an example of a frame system 2940, which is a representation of the back of frame system 2900. Frame system 2940 frames opaque plates 2941a and 2941b. Region 2942 indicates where the dispenser is located between the back opaque panel 2941b and the front opaque panel 2907, the back and front directions being relative to the intended user of the interactive frame system. The frame 2940 is configured to frame transparent plates 2945a and 2945b (e.g., windows). The frame 2940 is configured to house network, communication, and/or control related devices in the housing 2946. Frame system 2940 shows an example of frame portions supporting transparent plates 2945a and 2945b that are flush, having a linear boundary 2948 in common (e.g., a butt joint); and frame portions supporting opaque plates 2941a and 2941b, the two frame portions being interleaved and having a boundary 2949 in common (e.g., a dovetail joint).
Fig. 29 shows an example of a frame system 2980, which is a representation of frame system 2900 with housing 2906 open. The frame system 2980 is configured to house network, communication, and/or control related devices in the housing 2986, in which is disposed a processor 2985 that is operatively coupled to the network (e.g., for power and communication) and operatively coupled to the sensors, projector, dispenser, and any media display framed by the frame system (e.g., 2903 may be a media display). The frame system may be configured to stand alone in a space (e.g., in a room), or may be disposed adjacent to a wall. For example, the back plates 2941a and 2941b may face the wall. The frame system 2980 includes a wiring opening 2990 (e.g., to facilitate operatively coupling (e.g., connecting) the frame system to a network (e.g., a local network of a facility)).
Fig. 30 shows an example of a frame system 3020 that frames a display panel 3026, which may be a media display, and sensors 3022a and 3022b, which are attached to the outside of the frame 3020. The frame 3020 houses a projector 3028. The projector 3028 may project words or images. For example, the projector may project a guidance signal that suggests a location where the user is to stand (e.g., similar to image 346 projected by projector 342 in fig. 3). The projector 3028 is disposed such that it is aligned with an edge of the opaque plate 2907 to be able to project an image without being obstructed by the display panel. Frame 3020 houses a dispenser 3024 (e.g., to dispense disinfectant or another sanitary substance, such as a liquid, foam, or gel). The frame 3020 is configured to frame a transparent panel, such as 3030 (e.g., a window). Frame system 3020 is an enlarged version of frame system 2920. Frame system 3020 depicts a staggered connection 3029 between the frame portion supporting the display panel 3026 and the frame portion housing sensors 3022a and 3022b, projector 3028, and dispenser 3024. Each part of the staggered connection is secured by screws such as 3023.
Fig. 30 shows an example of a frame system 3040, depicting at another angle the frame system housing a dispenser 3044, a projector 3048, and a sensor 3042 attached to the outside of the frame 3040. Portions of the interior of the frame system 3040 are exposed at 3050. This exposed portion is at least partially covered by the opaque plates 2907 and 2941b and is not accessible to the user.
In some embodiments, the interactive frame system is configured to serve users on one side thereof. In some embodiments, the interactive frame system is configured to serve users on both sides thereof. FIG. 29 shows various representations of an interactive frame system, such as 2900, 2920, 2940, and 2980. The frame system shown in fig. 29 is configured to serve users on its front side, shown in 2900, and not to serve any users on its back side, shown in 2940. However, the frame system may also be interactive on the back side. For example, opaque plate 2941b may be replaced by a media display or billboard. For example, opaque plate 2941a may be replaced with an opaque plate such as 2907 that allows a user to access a cavity such as 2908 and facilitates any frame system functions, such as being operatively coupled to one or more sensors (e.g., 2922), one or more emitters (e.g., projector 2924), a disinfectant dispenser, a sanitary dispenser, a medication dispenser, a medical device dispenser, a device dispenser such as a tool dispenser (e.g., for screws and/or bolts), and/or a dispenser configured to dispense protective equipment (e.g., gloves, goggles, and/or masks). The interactive frame system may be provided in a factory, medical facility, bank, hotel, shopping mall, restaurant, educational facility (e.g., school, college, or university), office building, public transportation station (e.g., train station or airport), or government building. The interactive frame system may be provided in a residential building such as an apartment complex.
While preferred embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. The invention is not intended to be limited to the specific examples provided within the specification. While the invention has been described with reference to the foregoing specification, the description and illustrations of the embodiments herein are not meant to be construed in a limiting sense. Numerous variations, changes, and substitutions will occur to those skilled in the art without departing from the invention herein. Further, it is to be understood that all aspects of the present invention are not limited to the specific depictions, configurations or relative proportions set forth herein which depend upon a variety of conditions and variables. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is therefore contemplated that the present invention will also cover any such alternatives, modifications, variations or equivalents. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.

Claims (60)

1. A method for tracking a plurality of individuals in a facility, the method comprising:
(a) Sensing a first identity having a first location at a first time and a second identity having a second location at a second time using a sensor system, wherein the sensor system is operatively coupled to a local network disposed in the facility, the sensor system comprising a plurality of sensors configured to sense and/or identify the first identity, the first location, the first time, the second identity, the second location, and the second time, the local network configured to facilitate control of at least one other apparatus of the facility;
(b) Tracking movement of the first identity over a period of time to generate first tracking information and tracking movement of the second identity over the period of time to generate second tracking information; and
(c) Evaluating a distance from the first tracking information to the second tracking information relative to a distance threshold.
2. The method of claim 1, wherein the plurality of sensors comprises a plurality of geo-location sensors that are time synchronized.
3. The method of claim 1, wherein the sensor system is comprised of an aggregate of devices installed at fixed locations in the facility.
4. The method of claim 1, further comprising:
(d) Associating the first location and the first time with the first identity to generate a first association, and associating the second location and the second time with the second identity to generate a second association; and
(e) Comparing the first association to the second association to evaluate a distance from the first identity to the second identity relative to the distance threshold.
5. The method of claim 1, further comprising evaluating, relative to a time threshold, a cumulative time during which the first tracking information and the second tracking information are separated by a distance below the distance threshold.
6. The method of claim 1, wherein the local network comprises wiring disposed in an enclosure of at least one peripheral structure of the facility.
7. The method of claim 1, further comprising transmitting power and communications over a single cable of the local network.
8. The method of claim 1, further comprising transmitting at least a fourth or fifth generation cellular communication using the local network.
9. The method of claim 1, further comprising using the local network to transfer data comprising media.
10. The method of claim 1, further comprising controlling an atmosphere of the facility using the local network.
11. The method of claim 1, further comprising using the local network to control a tintable window disposed in the facility.
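The tracking and cumulative-exposure evaluation recited in claims 1, 2, and 5 can be sketched as follows. The data model, function names, and per-second sampling are illustrative assumptions only and do not appear in the claims.

```python
import math
from dataclasses import dataclass

@dataclass
class Fix:
    """A time-stamped position fix for one tracked identity (hypothetical model)."""
    t: float  # seconds since tracking began
    x: float  # metres, in facility coordinates
    y: float  # metres

def cumulative_close_time(track_a, track_b, distance_threshold):
    """Sum the time during which two tracks are closer than distance_threshold,
    as in claims 1(c) and 5. Fixes are paired by index, i.e. the geo-location
    sensors are assumed to be time-synchronized (claim 2)."""
    total = 0.0
    for prev_a, cur_a, cur_b in zip(track_a, track_a[1:], track_b[1:]):
        distance = math.hypot(cur_a.x - cur_b.x, cur_a.y - cur_b.y)
        if distance < distance_threshold:
            total += cur_a.t - prev_a.t  # credit the elapsed sampling interval
    return total

# Two identities sampled once per second; they are within 2 m for the last two samples.
a = [Fix(0, 0.0, 0.0), Fix(1, 1.0, 0.0), Fix(2, 2.0, 0.0)]
b = [Fix(0, 10.0, 0.0), Fix(1, 2.0, 0.0), Fix(2, 2.5, 0.0)]
print(cumulative_close_time(a, b, distance_threshold=2.0))  # 2.0
```

The cumulative total, rather than a single instantaneous distance, is what claim 5 compares against the time threshold.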
12. Non-transitory computer-readable program instructions for tracking a plurality of individuals in a facility, which, when read by one or more processors, cause the one or more processors to perform operations of any of the methods of claims 1-11.
13. Non-transitory computer-readable program instructions for tracking a plurality of individuals in a facility, which, when read by one or more processors, cause the one or more processors to perform operations comprising:
(a) Using or directing use of a sensor system to sense a first identity having a first location at a first time and a second identity having a second location at a second time, wherein the sensor system is operatively coupled to a local network disposed in the facility, the sensor system comprising a plurality of sensors configured to sense and/or identify the first identity, the first location, the first time, the second identity, the second location, and the second time, the local network configured to facilitate control of at least one other apparatus of the facility;
(b) Tracking or directing tracking of movement of the first identity over a period of time to generate first tracking information, and tracking or directing tracking of movement of the second identity over the period of time to generate second tracking information; and
(c) Evaluating or directing evaluation of a distance from the first tracking information to the second tracking information relative to a distance threshold.
14. An apparatus for tracking a plurality of individuals in a facility, the apparatus comprising at least one controller configured to:
(a) Operatively couple to a sensor system comprising a plurality of sensors configured to sense and/or identify a first identity having a first location at a first time and a second identity having a second location at a second time, the sensor system operatively coupled to a local network disposed in the facility, the at least one controller configured to control at least one other apparatus of the facility;
(b) Using or directing use of the sensor system to sense the first identity having the first location at the first time and the second identity having the second location at the second time;
(c) Tracking or directing tracking of movement of the first identity over a period of time to generate first tracking information, and tracking or directing tracking of movement of the second identity over the period of time to generate second tracking information; and
(d) Evaluating or directing evaluation of a distance from the first tracking information to the second tracking information relative to a distance threshold.
15. The apparatus of claim 14, wherein the plurality of sensors comprises at least one geolocation sensor to detect (i) the first location and the second location and/or (ii) the first identity and the second identity.
16. The apparatus of claim 14, wherein the sensor system is operatively coupled to a hierarchical control system comprising a plurality of controllers.
17. A method for tracking a plurality of individuals in a facility, the method comprising:
(A) Detecting identities of a first individual and a second individual disposed in the facility using a sensor system operatively coupled to a local network configured to facilitate control of at least one other device of the facility;
(B) Tracking, using the sensor system, movement of the first individual across a first set of locations in the facility during a first set of times and tracking movement of the second individual across a second set of locations in the facility during a second set of times;
(C) Associating the first set of locations and the first set of times with the first individual to generate a first association, and associating the second set of locations and the second set of times with the second individual to generate a second association; and
(D) Comparing the first association to the second association to evaluate a distance from the first individual to the second individual relative to a threshold.
18. The method of claim 17, wherein detecting the identity of the first individual and the second individual is performed upon the first individual and the second individual entering the facility.
19. The method of claim 17, wherein the first set of locations and the second set of locations are specified in accordance with subdividing a plurality of zones of the facility.
20. The method of claim 17, wherein the first association is compared to the second association conditionally upon detection of a predetermined event related to the first individual and/or the second individual.
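The association-and-comparison scheme of claims 17-20 can be sketched at zone granularity. All names, the (zone, time-slot) representation, and the reading of a shared zone/slot as a sub-threshold distance are assumptions for illustration, not definitions from the claims.

```python
def build_association(individual_id, observations):
    """Associate an individual with the (zone, time-slot) pairs in which the
    sensor system observed them, per claim 17(C). Zones are assumed to
    subdivide the facility as in claim 19."""
    return {"id": individual_id, "visits": set(observations)}

def shared_visits(assoc_a, assoc_b):
    """Compare two associations (claim 17(D)). Occupying the same zone during
    the same time slot is one plausible proxy for a distance below the
    threshold between the two individuals."""
    return sorted(assoc_a["visits"] & assoc_b["visits"])

alice = build_association("alice", [("lobby", 9), ("lab-2", 10), ("cafe", 12)])
bob = build_association("bob", [("lab-2", 10), ("cafe", 13)])
print(shared_visits(alice, bob))  # [('lab-2', 10)]
```

Per claim 20, such a comparison could be run only conditionally, e.g. after one of the two individuals reports a predetermined event.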
21. Non-transitory computer-readable program instructions for tracking a plurality of individuals in a facility, which, when read by one or more processors, cause the one or more processors to perform operations of any of the methods of claims 17-20.
22. Non-transitory computer-readable program instructions for tracking a plurality of individuals in a facility, which, when read by one or more processors, cause the one or more processors to perform operations comprising:
(A) Using or directing use of a sensor system to detect identities of a first individual and a second individual disposed in the facility, the sensor system operatively coupled to a local network configured to facilitate control of at least one other device of the facility;
(B) Tracking, using the sensor system, movement of the first individual across a first set of locations in the facility during a first set of times and tracking movement of the second individual across a second set of locations in the facility during a second set of times;
(C) Associating or directing association of the first set of locations and the first set of times with the first individual to generate a first association, and associating the second set of locations and the second set of times with the second individual to generate a second association; and
(D) Comparing or directing a comparison of the first association to the second association to assess a distance from the first individual to the second individual relative to a threshold.
23. An apparatus for tracking a plurality of individuals in a facility, the apparatus comprising at least one controller configured to:
(A) Operatively couple to a sensor system configured to detect the identities of a first individual and a second individual disposed in the facility, the at least one controller configured to control at least one other apparatus of the facility;
(B) Using or directing use of the sensor system to detect the identities of the first individual and the second individual disposed in the facility;
(C) Using or directing use of the sensor system to track movement of the first individual across a first set of locations in the facility during a first set of times and to track movement of the second individual across a second set of locations in the facility during a second set of times;
(D) Associating or directing association of the first set of locations and the first set of times with the first individual to generate a first association, and associating or directing association of the second set of locations and the second set of times with the second individual to generate a second association; and
(E) Comparing or directing a comparison of the first association to the second association to assess a distance from the first individual to the second individual relative to a threshold.
24. The apparatus of claim 23, wherein the sensor system comprises a geolocation sensor to detect (i) the first set of locations and the second set of locations and/or (ii) the first identity and the second identity.
25. The apparatus of claim 24, wherein the geolocation sensor comprises an ultra-wideband (UWB) sensor or a bluetooth sensor.
26. The apparatus of claim 24, wherein the local network comprises wiring disposed in an enclosure of the facility.
27. The apparatus of claim 26, wherein the local network is configured for power and communications transmission over a single cable.
28. A method for monitoring surface disinfection of a facility, the method comprising:
(A) Sensing a plurality of temperature samples of an object surface at a plurality of sample times using a sensor system disposed in the facility and operatively coupled to a local network of the facility, the local network configured to control at least one other device in the facility operatively coupled to the local network;
(B) Comparing successive ones of the plurality of temperature samples to generate comparison results;
(C) Detecting a cleaning event when the comparison indicates that the temperature has fallen below a temperature threshold;
(D) Monitoring the elapsed time since the last cleaning event; and
(E) Generating a notification when the elapsed time exceeds a time threshold.
29. The method of claim 28, wherein the sensor system is coupled to a local network disposed in the facility in which the object surface is disposed.
30. The method of claim 29, further comprising using the local network to transmit data comprising media.
31. The method of claim 28, wherein the sensor system remotely senses the temperature sample.
32. The method of claim 28, wherein the sensor system is operatively coupled to a hierarchical control system comprising a plurality of controllers.
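The surface-disinfection monitoring of claims 28-32 amounts to change detection on a temperature time series. The sketch below assumes a cleaning event manifests as a sudden temperature drop (e.g. from a cool disinfectant wipe); the sample format, thresholds, and function names are hypothetical.

```python
def detect_cleaning_events(samples, drop_threshold):
    """Compare successive surface-temperature samples (claim 28(B)) and record
    a cleaning event whenever the temperature falls by more than
    drop_threshold between consecutive samples (claim 28(C)). Samples are
    (timestamp_seconds, temperature_celsius) pairs."""
    events = []
    for (t0, temp0), (t1, temp1) in zip(samples, samples[1:]):
        if temp0 - temp1 > drop_threshold:
            events.append(t1)  # time of the sample showing the drop
    return events

def overdue(samples, events, time_threshold):
    """Flag the surface when the elapsed time since the last cleaning event
    exceeds the time threshold (claim 28, monitoring and notification steps)."""
    last = events[-1] if events else samples[0][0]
    return samples[-1][0] - last > time_threshold

samples = [(0, 22.0), (60, 22.1), (120, 18.5), (180, 21.0), (240, 21.8)]
events = detect_cleaning_events(samples, drop_threshold=2.0)
print(events)                                        # [120]
print(overdue(samples, events, time_threshold=90))   # True (240 - 120 > 90)
```

The gradual rebound to ambient temperature after the event does not trigger false detections, since only drops between consecutive samples are compared.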
33. Non-transitory computer-readable program instructions for monitoring surface disinfection of a facility, which, when read by one or more processors, cause the one or more processors to perform operations of any of the methods of claims 28-32.
34. Non-transitory computer-readable program instructions for monitoring surface disinfection of a facility, which, when read by one or more processors, cause the one or more processors to perform operations comprising:
(A) Using or directing use of a sensor system to sense a plurality of temperature samples of an object surface at a plurality of sample times, the sensor system disposed in the facility and operatively coupled to a local network of the facility, the local network configured to control at least one other apparatus in the facility operatively coupled to the local network;
(B) Comparing or directing a comparison of successive temperature samples of the plurality of temperature samples to generate a comparison result;
(C) Detecting or directing detection of a cleaning event when the comparison indicates a temperature drop below a temperature threshold;
(D) Monitoring or directing monitoring of the time elapsed since the last cleaning event; and
(E) Generating or directing generation of a notification when the elapsed time exceeds a time threshold.
35. An apparatus for monitoring surface disinfection of a facility, the apparatus comprising at least one controller configured to:
(A) Using or directing use of a sensor system to sense a plurality of temperature samples of a surface of an object at a plurality of sample times;
(B) Comparing or directing a comparison of successive temperature samples of the plurality of temperature samples to generate a comparison result;
(C) Detecting or directing detection of a cleaning event when the comparison indicates a temperature drop below a temperature threshold;
(D) Monitoring or directing monitoring of the time elapsed since the last cleaning event;
(E) Generating or directing generation of a notification when the elapsed time exceeds a time threshold; and
(F) Controlling or directing control of at least one other apparatus of the facility.
36. The apparatus of claim 35, wherein the at least one controller is configured to send or direct the notification to a designated recipient or a requesting recipient.
37. The apparatus of claim 35, wherein the sensor system comprises at least one sensor integrated in a collection of devices, the collection of devices comprising (i) a sensor or (ii) a sensor and an emitter.
38. The apparatus of claim 35, wherein the sensor system is coupled to a local network disposed in the facility in which the object surface is disposed.
39. The apparatus of claim 38, wherein the local network is configured to be coupled to a tintable window.
40. A method of detecting a physical characteristic of an individual in a facility, comprising:
(a) Sensing environmental characteristics in the presence of the individual at a plurality of occasions using a sensor system disposed in the facility and operatively coupled to a local network configured to facilitate control of at least one other apparatus of the facility;
(b) Analyzing (i) the plurality of environmental characteristic data samples and (ii) a threshold indicative of an abnormal physical characteristic to generate an analysis; and
(c) Generating a report of the presence and/or absence of an indication of the abnormal physical characteristic of the individual using the analysis.
41. The method of claim 40, wherein the environmental feature is detectably perturbed by the presence of the individual compared to the absence of the individual in the environment.
42. The method of claim 40, wherein a plurality of environmental characteristic data samples of the individual are collected for the plurality of occasions for quantifying a normal physical characteristic of the individual, wherein the analyzing further comprises analyzing a relative difference between a most recent one of the data samples and the normal quantification value, and wherein the threshold is a difference threshold.
43. The method of claim 40, wherein the sensor system comprises an electromagnetic sensor.
44. The method of claim 40, wherein the sensor system comprises a first electromagnetic sensor configured to detect a first range of radiation and a second electromagnetic sensor configured to detect a second range of radiation, the second range of radiation having at least one portion that does not overlap with the first range of radiation.
45. The method of claim 40, further comprising focusing at least one sensor of the sensor system on one or more facial landmark features of the individual to measure the environmental feature.
46. The method of claim 40, further comprising focusing at least one sensor of the sensor system on a placement depth of the individual to measure the environmental feature.
47. The method of claim 40, wherein the analyzing comprises filtering out environmental features attributable to background.
48. Non-transitory computer-readable program instructions for detecting a physical feature of an individual in a facility, which, when read by one or more processors, cause the one or more processors to perform operations of any of the methods of claims 40-47.
49. Non-transitory computer-readable program instructions for detecting a physical feature of an individual in a facility, which, when read by one or more processors, cause the one or more processors to perform operations comprising:
(a) Using or directing use of a sensor system to sense environmental characteristics in the presence of the individual at a plurality of occasions, the sensor system disposed in the facility and operatively coupled to a local network configured to facilitate control of at least one other apparatus of the facility;
(b) Analyzing or directing analysis of (i) the plurality of environmental characteristic data samples and (ii) a threshold indicative of an abnormal physical characteristic to generate an analysis; and
(c) Using or directing use of the analysis to generate a report of the presence and/or absence of the indication of the abnormal physical characteristic of the individual.
50. An apparatus for detecting physical features of an individual in a facility, the apparatus comprising at least one controller configured to:
(a) Operatively couple to a sensor system configured to sense an environmental characteristic;
(b) Using or directing use of the sensor system to sense environmental characteristics in the presence of the individual at a plurality of occasions;
(c) Analyzing or directing an analysis of (i) the plurality of environmental characteristic data samples and (ii) a difference threshold indicative of an abnormal physical feature to generate an analysis;
(d) Using or directing use of the analysis to generate a report of the presence and/or absence of an indication of the abnormal physical feature of the individual; and
(e) Controlling or directing control of at least one other apparatus of the facility.
51. The apparatus of claim 50, wherein the sensor system is communicatively coupled to a local network disposed in the facility.
52. The apparatus of claim 51, wherein the local network comprises a cable disposed in an enclosure of the facility.
53. The apparatus of claim 51, wherein the at least one controller is configured to use or direct use of the local network to transmit at least a fourth or fifth generation cellular communication.
54. The apparatus of claim 51, wherein the at least one controller is configured to use or direct use of the local network to control the facility.
55. The apparatus of claim 50, wherein the sensor system comprises an infrared sensor, a visible light sensor, or a depth camera.
56. The apparatus of claim 50, wherein the sensor system comprises a visible light sensor and a non-visible light sensor.
57. The apparatus of claim 50, wherein the sensor system comprises a camera configured to distinguish an individual from its surroundings based at least in part on infrared radiation readings and/or visible radiation readings.
58. The apparatus of claim 50, wherein the at least one controller is configured to direct at least one sensor of the sensor system to focus the measurements of the at least one sensor at least in part by taking into account (i) at least one facial feature of the individual and/or (ii) a horizontal displacement of the individual relative to the at least one sensor.
59. The apparatus of claim 50, wherein the at least one controller is configured to analyze or direct analysis of the plurality of environmental feature data samples at least in part by using a machine learning model that utilizes a learning set comprising measurements when individuals are present, measurements when blackbodies are present, ground truth measurements, and/or simulation measurements.
60. The apparatus of claim 50, wherein the at least one controller is configured to analyze or direct analysis of the plurality of environmental feature data samples at least in part by using a machine learning model, the machine learning model comprising a regression model or a classification model.
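The baseline-difference analysis of claims 40 and 42 can be sketched as follows. The choice of skin temperature as the environmental characteristic, the mean as the "normal quantification value", and all names and units are illustrative assumptions; the claims themselves leave the characteristic and model unspecified (claims 59-60 contemplate machine learning models instead of this simple rule).

```python
from statistics import mean

def abnormality_report(history, latest, difference_threshold):
    """One plausible reading of claims 40 and 42: quantify the individual's
    normal value from samples collected on prior occasions, then compare the
    most recent sample's deviation from that baseline against a difference
    threshold, and report the result."""
    baseline = mean(history)
    deviation = latest - baseline
    return {
        "baseline": round(baseline, 2),
        "deviation": round(deviation, 2),
        "abnormal": deviation > difference_threshold,
    }

history = [36.5, 36.6, 36.4, 36.5]  # prior occasions, degrees Celsius
print(abnormality_report(history, latest=37.9, difference_threshold=0.8))
# {'baseline': 36.5, 'deviation': 1.4, 'abnormal': True}
```

Using a per-individual baseline rather than a fixed absolute cutoff is what distinguishes claim 42 from the simpler threshold of claim 40.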
CN202180024055.4A 2020-03-23 2021-03-22 Identifying, reducing health risks in a facility and tracking occupancy of a facility Withdrawn CN115398464A (en)

Applications Claiming Priority (11)

Application Number Priority Date Filing Date Title
US202062993617P 2020-03-23 2020-03-23
US62/993,617 2020-03-23
US202063041002P 2020-06-18 2020-06-18
US63/041,002 2020-06-18
US202063115886P 2020-11-19 2020-11-19
US63/115,886 2020-11-19
PCT/US2021/015378 WO2021154915A1 (en) 2020-01-29 2021-01-28 Sensor calibration and operation
USPCT/US2021/015378 2021-01-28
US202163159814P 2021-03-11 2021-03-11
US63/159,814 2021-03-11
PCT/US2021/023433 WO2021194944A1 (en) 2020-03-23 2021-03-22 Identifying, reducing health risks, and tracking occupancy in a facility

Publications (1)

Publication Number Publication Date
CN115398464A true CN115398464A (en) 2022-11-25

Family

ID=77892258

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180024055.4A Withdrawn CN115398464A (en) 2020-03-23 2021-03-22 Identifying, reducing health risks in a facility and tracking occupancy of a facility

Country Status (4)

Country Link
EP (1) EP4128130A1 (en)
CN (1) CN115398464A (en)
TW (1) TW202205310A (en)
WO (1) WO2021194944A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11978562B2 (en) * 2019-12-17 2024-05-07 Vayyar Imaging Ltd. System and method for monitoring passengers and controlling an elevator system
DE102021131553A1 (en) 2021-12-01 2023-06-01 Philipps-Universität Marburg, Körperschaft des öffentlichen Rechts access control system
EP4372711A1 (en) * 2022-11-16 2024-05-22 EMB Co., Ltd. Artificial intelligence based harmful environment control system connected to internet of things and its harmful environment control method
US20240183830A1 (en) * 2022-12-05 2024-06-06 Cisco Technology, Inc. Radar-assisted environment monitoring

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2441512C (en) * 2001-03-09 2010-06-29 Radianse, Inc. Location system and methods
CA2745519A1 (en) * 2008-12-08 2010-06-17 Infonaut Inc. Disease mapping and infection control system and method
EP2565857B1 (en) * 2011-09-01 2017-05-31 Siemens Schweiz AG Method and system for evaluating the security situation in a building with living areas with access authorisation
KR20170021692A (en) * 2015-08-18 2017-02-28 주식회사 에스원 Moving Path Chasing System of The epidemic infected suspect and Method thereof
KR101857081B1 (en) * 2016-07-27 2018-06-25 주식회사 바이테크 Management system of environment control installed in building

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117475608A (en) * 2023-12-28 2024-01-30 云南远信科技有限公司 Multi-scene security monitoring method based on unmanned aerial vehicle cluster
CN117475608B (en) * 2023-12-28 2024-03-08 云南远信科技有限公司 Multi-scene security monitoring method based on unmanned aerial vehicle cluster

Also Published As

Publication number Publication date
WO2021194944A1 (en) 2021-09-30
EP4128130A1 (en) 2023-02-08
TW202205310A (en) 2022-02-01

Similar Documents

Publication Publication Date Title
US20230152652A1 (en) Identifying, reducing health risks, and tracking occupancy in a facility
US11783658B2 (en) Methods and systems for maintaining a healthy building
US11783652B2 (en) Occupant health monitoring for buildings
CN115398464A (en) Identifying, reducing health risks in a facility and tracking occupancy of a facility
US20210391089A1 (en) Methods and systems for reducing a risk of spread of an illness in a building
US9997046B2 (en) Visitor flow management
Gupta et al. Future smart connected communities to fight covid-19 outbreak
US11625964B2 (en) Methods and systems for temperature screening using a mobile device
Azimi et al. Fit-for-purpose: Measuring occupancy to support commercial building operations: A review
WO2018051349A1 (en) Facility monitoring by a distributed robotic system
US20220293278A1 (en) Connected contact tracing
US20230104820A1 (en) Self-Contained, Portable, Sanitization Device and Telemedicine Station
US20230070313A1 (en) Building data platform with air quality analysis based on mobile air quality sensors
CN116137941A (en) Atmospheric regulation in peripheral structures
Azizi et al. Effects of positioning of multi-sensor devices on occupancy and indoor environmental monitoring in single-occupant offices
KR102496002B1 (en) Entrance management kiosk of non-face-to-face based on artificial intelligence and access management system including thereof
WO2021234379A1 (en) Methods and systems for monitoring compliance with health and/or sanitation requirements at a site or building
US11631279B2 (en) Smart cleaning system
US20220087498A1 (en) Self-cleaning environment
Feagin Jr et al. A Review of Existing Test Methods for Occupancy Sensors
KR102534056B1 (en) Indoor air quality management system and operation method thereof
Anand et al. Wireless Sensor‐based IoT System with Distributed Optimization for Healthcare
Choudhury et al. Developing an IoT based Mass Crowd Management System Reviewing Existing Methodologies
US20230039967A1 (en) Airborne pathogen detection through networked biosensors
Azimi Comprehensive simulation-based workflow to assess the performance of occupancy-based controls and operations in office buildings

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20221125)