CN115087846A - Sensor calibration and operation - Google Patents


Info

Publication number
CN115087846A
Authority
CN
China
Prior art keywords
sensor
sensors
data
parameter
duration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202180014420.3A
Other languages
Chinese (zh)
Inventor
N·特里卡
A·马利克
A·古普塔
M·万加蒂
T·马克尔
Current Assignee
View Inc
Original Assignee
View Inc
Priority date
Filing date
Publication date
Priority claimed from US17/083,128 external-priority patent/US20210063836A1/en
Application filed by View Inc filed Critical View Inc
Publication of CN115087846A publication Critical patent/CN115087846A/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01D: MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D18/00: Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00

Abstract

Methods, devices, non-transitory computer-readable media, and systems for sensor calibration in at least one peripheral structure are disclosed herein. The calibration may comprise, for example, automatic self-calibration of the sensor. The calibration may be performed automatically when or after the sensor is deployed in the peripheral structure. The calibration may utilize data of the sensor to be calibrated and/or sensor data of neighboring sensors in the peripheral structure.

Description

Sensor calibration and operation
RELATED APPLICATIONS
The present application claims the benefit of U.S. provisional patent application Ser. No. 62/967,204 entitled "SENSOR CALIBRATION AND OPERATION" filed on January 29, 2020. The present application is also a continuation-in-part of U.S. patent application Ser. No. 17/083,128 entitled "BUILDING NETWORK" filed on October 28, 2020, which is a continuation-in-part of (i) U.S. patent application Ser. No. 16/664,089 entitled "BUILDING NETWORK" filed on October 25, 2019, and (ii) international patent application Ser. No. PCT/US18/29460 filed on April 25, 2018. The present application is also a continuation-in-part of U.S. patent application Ser. No. 16/447,169 entitled "SENSING AND COMMUNICATIONS UNIT FOR OPTICAL SWITCH WINDOW SYSTEMS" filed on June 20, 2019, which (I) claims the benefit of U.S. provisional patent application Ser. No. 62/858,100 entitled "SENSING AND COMMUNICATIONS UNIT FOR OPTICAL SWITCH WINDOW SYSTEMS" filed on June 6, 2019, and (II) is a continuation-in-part of international patent application Ser. No. PCT/US19/30467 entitled "EDGE NETWORK FOR BUILDING SERVICES" filed on May 2, 2019. International patent application Ser. No. PCT/US19/30467 claims the benefit of U.S. provisional patent applications including Ser. No. 62/768,775, Ser. No. 62/688,957 entitled "SENSING AND COMMUNICATIONS UNIT FOR OPTICAL SWITCH WINDOW SYSTEMS" filed on June 22, 2018, and Ser. No. 62/666,033 entitled "EDGE NETWORK FOR BUILDING SERVICES" filed on May 2, 2018. Each of the foregoing applications is incorporated herein by reference in its entirety.
Background
The sensor may be configured (e.g., designed) to measure one or more environmental characteristics, such as temperature, humidity, ambient noise, carbon dioxide, and/or other aspects of the ambient environment. The sensor may require calibration to accurately measure the one or more environmental characteristics. Calibration of the sensor may be performed in a factory setting (e.g., where it is manufactured). The ambient environment of the factory setting may differ from the environment in which the sensor is eventually installed. A sensor operating in its installation environment may therefore perform worse than it did in the factory setting. Poor operation of the sensor may include providing sensor readings with, for example, reduced and/or impaired accuracy. In response to a sensor being installed in a target (e.g., expected and/or deployed) environment, an installer may calibrate (and/or recalibrate) the sensor. Calibration (and/or recalibration) of sensors in a target environment may have one or more drawbacks, including being time consuming, expensive, and/or labor intensive.
Disclosure of Invention
Various aspects disclosed herein mitigate at least some of the disadvantages associated with sensor calibration in a target setting.
Various aspects disclosed herein may relate to groups (e.g., assemblies and/or networks) of sensors capable of self-calibration. The self-calibration may be performed in an environment (e.g., a factory and/or target environment). Self-calibration of the sensor may include self-learning of the sensor over a period of time (e.g., to find a target environmental baseline). Self-calibration may include a comparison of the target environment baseline to a factory-determined baseline. A factory may be an environment in which sensors are assembled (e.g., built). Any increment between the factory calibration specification and the specification measured in the target environment (e.g., in the field) that is beyond the factory baseline error range can become the new (target environment) baseline for the sensor. Self-calibration may include monitoring the drift of the field baseline over time.
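The self-calibration flow described above (learn a field baseline, adopt it when it departs from the factory baseline by more than the factory error range, then keep monitoring drift) can be sketched as follows. This is a minimal illustration under assumed names; the use of a simple mean as the self-learned baseline and the tolerance check are illustrative assumptions, not a prescribed implementation.

```python
from statistics import mean

def self_calibrate(readings, factory_baseline, factory_error):
    """Return the baseline to use for a sensor deployed in the field.

    readings: sensed data self-learned over a period of time in the
    target environment. If the learned field baseline departs from the
    factory baseline by more than the factory error range, the field
    value becomes the new baseline; otherwise the factory baseline holds.
    """
    field_baseline = mean(readings)              # self-learned target baseline
    delta = field_baseline - factory_baseline    # increment vs. factory spec
    if abs(delta) > factory_error:               # beyond factory error range
        return field_baseline                    # adopt new field baseline
    return factory_baseline

# Factory calibration said 20.0 +/- 0.5; the field data centers on 22.0,
# so the increment (2.0) exceeds the error range and becomes the baseline.
baseline = self_calibrate([21.8, 22.1, 22.0, 22.1], 20.0, 0.5)
```

Monitoring drift then amounts to re-running this comparison periodically against the most recently adopted baseline.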
In another aspect, a method for sensor calibration includes: (a) collecting sensing data (e.g., a data set) using a sensor during a time window; (b) evaluating the sensed data (e.g., data set) to obtain optimal sensed data (e.g., data set) during a time duration equal to or shorter than the time window, the optimal sensed data having a minimum variability greater than zero; and (c) assigning a baseline to the sensor by considering the best sensing data. In some embodiments, the sensed data includes first sensed data (e.g., a data set) collected during a first time duration, and second sensed data (e.g., a data set) collected during a second time duration. In some embodiments, the first duration is shorter than the time window. In some embodiments, the second duration is shorter than the time window. In some embodiments, evaluating the sensed data includes comparing the first sensed data to the second sensed data to find the best sensed data (e.g., the best set of sensed data). In some embodiments, the length of time of the first duration is different from the length of time of the second duration. In some embodiments, the length of time of the first duration is equal to or substantially equal to the length of time (e.g., time span) of the second duration. 
For example, a method for sensor calibration, the method comprising: (a) using the sensor to perform the following operations: (i) collecting first sensed data during a first time duration; and (ii) collecting second sensing data during a second duration, the first duration and the second duration occurring during a time window, the first duration having a first start time and the second duration having a second start time, wherein a time span of the first duration is equal or approximately equal to a time span of the second duration; (b) evaluating the first sensed data and the second sensed data to obtain optimal sensed data, the optimal sensed data having a minimum variability greater than zero; and (c) assigning a baseline to the sensor by taking into account the optimal sensing data. For example, a method for sensor calibration in a facility, the method comprising: (a) using a sensor to perform the following operations: (i) collecting first sensed data during a first time duration; and (ii) collecting second sensing data during a second duration, the first duration and the second duration occurring during a time window, the first duration having a first start time and the second duration having a second start time, wherein a time span of the first duration is equal or approximately equal to a time span of the second duration, wherein the sensor is included in a sensor array disposed in the facility; (b) evaluating the first sensed data and the second sensed data to obtain optimal sensed data, the optimal sensed data having a minimum variability greater than zero; and (c) assigning a baseline to the sensor by taking into account the optimal sensing data.
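A minimal sketch of operations (a) through (c) above: slide equal-length durations across the time window, score each by its variability, and derive the baseline from the least-variable sensed data whose variability is still greater than zero. Function and variable names are illustrative assumptions.

```python
from statistics import mean, pstdev

def best_window_baseline(samples, span):
    """Compare equal-span durations within the time window (the `samples`
    list) and return the mean of the duration whose sensed data shows the
    smallest variability that is still greater than zero."""
    best_window = None
    best_variability = None
    for start in range(len(samples) - span + 1):
        window = samples[start:start + span]
        variability = pstdev(window)  # population standard deviation
        if variability > 0 and (best_variability is None
                                or variability < best_variability):
            best_window, best_variability = window, variability
    return mean(best_window) if best_window else None

# The quiet stretch (21.0, 21.1, 21.0) wins over the noisier readings
# around it, so the baseline is derived from that duration.
readings = [20.0, 25.0, 22.0, 21.0, 21.1, 21.0, 24.0, 19.0, 26.0]
baseline = best_window_baseline(readings, span=3)
```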
In some embodiments, the sensor calibration is performed in a facility. In some embodiments, the sensors are included in a sensor array disposed in the facility. In some embodiments, the sensor is housed in a housing that includes (i) the sensor or (ii) the sensor and the emitter as part of a device ensemble. In some embodiments, the sensor arrays (e.g., sensors in a sensor array) are configured to operate in a coordinated manner. In some embodiments, the method includes cooperatively adjusting the environment of the facility at least in part by using data from the sensor array (e.g., from different sensors, and/or from different types of sensors, in the array). In some embodiments, the time span is predetermined. In some embodiments, the method further comprises collecting third sensing data during the time window and assigning a time span by considering the third sensing data such that the time span comprises a plurality of data that facilitates separation of the signal data from the noise data. In some embodiments, the third sensed data is collected prior to using the sensor to: (i) collect first sensed data during the first duration; and (ii) collect second sensed data during the second duration. In some embodiments, the time window is at least about one day.
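One hedged way to realize the "third sensing data" step above is to use pilot readings to pick a span just long enough that one window's worth of data separates signal from noise. The standard-error criterion below is an illustrative choice under assumed names, not a method specified by the text.

```python
from math import sqrt
from statistics import pstdev

def choose_time_span(pilot_samples, noise_tolerance):
    """Grow the candidate span over pilot ("third") sensed data until the
    standard error of a window mean falls below noise_tolerance, i.e.
    until a window of that span holds enough data to separate the signal
    from the noise."""
    for span in range(2, len(pilot_samples) + 1):
        stderr = pstdev(pilot_samples[:span]) / sqrt(span)
        if stderr < noise_tolerance:
            return span
    return len(pilot_samples)  # fall back to all available pilot data

# Example pilot readings; a looser tolerance is satisfied by a short span.
pilot = [21.0, 21.2, 20.8, 21.1, 20.9, 21.0]
span = choose_time_span(pilot, noise_tolerance=0.08)
```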
In some embodiments, the duration is at least about thirty (30) minutes. In some embodiments, the baseline comprises the mean, median, mode, or intermediate range of the best sensing dataset. In some embodiments, the sensor is factory calibrated prior to using the sensor to: (i) collect first sensed data during a first time duration; and (ii) collect second sensed data during a second time duration. In some embodiments, the sensor is a first sensor of a first type, and assigning the baseline takes into account sensed data of a second sensor of the first type. In some embodiments, the second sensor is proximate to the first sensor such that there is no additional sensor of the first type between the first sensor and the second sensor. In some embodiments, the sensor is a first sensor of a first type, and assigning the baseline takes into account sensed data of a second sensor of a second type. In some embodiments, assigning a baseline to the sensor takes into account external data. In some embodiments, the external data is not obtained by the sensor. In some embodiments, the external data includes historical data. In some embodiments, the external data includes third party data. In some embodiments, the method is performed at least twice during the lifetime of the sensor. In some embodiments, the sensor is disposed at a location different from at least one of its production locations. In some embodiments, the sensor is disposed in the location where it is deployed. In some embodiments, the first duration partially overlaps the second duration. In some embodiments, the first duration does not overlap with the second duration. In some embodiments, the end of the first duration contacts the beginning of the second duration. In some embodiments, the end of the first duration is the beginning of the second duration. In some embodiments, the collecting of the sensed data using the first sensor and/or the second sensor is in a natural setting.
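The baseline statistics listed above (mean, median, mode, or intermediate range of the best sensing dataset) could be computed as follows; the function name and the `method` parameter are assumptions for illustration.

```python
from statistics import mean, median, mode

def baseline_statistic(best_data, method="mean"):
    """Derive a baseline from the best sensing dataset using one of the
    statistics named in the text. 'Intermediate range' is taken here to
    mean the midpoint between the minimum and maximum values."""
    if method == "mean":
        return mean(best_data)
    if method == "median":
        return median(best_data)
    if method == "mode":
        return mode(best_data)
    if method == "intermediate_range":
        return (min(best_data) + max(best_data)) / 2
    raise ValueError(f"unknown baseline statistic: {method}")

# e.g., the least-variable window's sensed data
best = [21.0, 21.1, 21.0, 21.2]
```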
In some embodiments, collecting sensed data using the first sensor and/or the second sensor is in a setting that is not artificially perturbed for the purpose of collecting sensed data. In some embodiments, the first sensor and/or the second sensor is different from a single pixel sensor. In some embodiments, a plurality of data types related to the attribute are collected using the first sensor and/or the second sensor. In some embodiments, the plurality of data types includes different intensities. In some embodiments, the attribute is electromagnetic radiation, and wherein the plurality of data types includes different wavelengths or different intensities. In some embodiments, the attribute is an acoustic wave, and wherein the plurality of data types includes different frequencies or different intensities. In some embodiments, the plurality of data types includes different physical locations. In some embodiments, the physical location is a relative location. In some embodiments, the sensor is included in a sensor array. In some embodiments, the physical location relates to a relative position in the sensor array.
In another aspect, an apparatus for sensor calibration includes one or more controllers configured to: (a) operatively couple to the sensor; (b) collect or direct collection of sensing data (e.g., a data set) during a time window; (c) evaluate or direct evaluation of the sensed data to obtain optimal sensed data (e.g., a data set) during a time duration equal to or shorter than the time window, the optimal sensed data having a minimum variability greater than zero; and (d) assign a baseline to the sensor, or direct assignment of a baseline to the sensor, by considering the best sensing data. For example, an apparatus for sensor calibration includes one or more controllers configured to: (a) operatively couple to the first sensor and to the second sensor; (b) perform the following collection operations: (i) collect or direct collection of first sensed data during a first time duration; and (ii) collect or direct collection of second sensed data during a second duration, the first duration and the second duration occurring during a time window, the first duration having a first start time and the second duration having a second start time, wherein a time span of the first duration is equal or approximately equal to a time span of the second duration; (c) evaluate or direct evaluation of the first sensed data and the second sensed data to obtain best sensed data, the best sensed data having a minimum variability greater than zero; and (d) assign a baseline to the sensor, or direct assignment of a baseline to the sensor, by considering the best sensing data.
For example, an apparatus for self-calibration of sensors in a facility, the apparatus comprising one or more controllers configured to: (a) operatively couple to a sensor included in a sensor array disposed in the facility; (b) perform the following collection operations: (i) collect or direct collection of first sensed data during a first time duration; and (ii) collect or direct collection of second sensed data during a second duration, the first duration and the second duration occurring during a time window, the first duration having a first start time and the second duration having a second start time, wherein a time span of the first duration is equal or approximately equal to a time span of the second duration; (c) evaluate or direct evaluation of the first sensed data and the second sensed data to obtain best sensed data, the best sensed data having a minimum variability greater than zero; and (d) assign a baseline to the sensor, or direct assignment of a baseline to the sensor, by considering the best sensing data.
In some embodiments, sensor self-calibration occurs for a sensor disposed in a facility. In some embodiments, the sensors are included in a sensor array disposed in the facility. In some embodiments, the sensor is housed in a housing that includes (i) the sensor or (ii) the sensor and the emitter as part of a device ensemble. In some embodiments, the sensor arrays are configured to operate in a coordinated manner. In some embodiments, the one or more controllers are configured to cooperatively adjust or direct the adjustment of the environment of the facility at least in part by using data from the sensor array. In some embodiments, the one or more controllers comprise circuitry. In some embodiments, the one or more controllers are configured to collect or direct collection of sensed data including first sensed data (e.g., a data set) collected during a first time duration and second sensed data (e.g., a data set) collected during a second time duration. In some embodiments, the first duration is shorter than the time window. In some embodiments, the second duration is shorter than the time window. In some embodiments, the one or more controllers are configured to evaluate or direct evaluation of the sensed data, the evaluation including comparing the first sensed data with the second sensed data to find the best sensed data (e.g., the best sensed data set). In some embodiments, the length of time of the first duration is different from the length of time of the second duration. In some embodiments, the length of time of the first duration is equal to or substantially equal to the length of time (e.g., the time span) of the second duration. In some embodiments, the one or more controllers being configured includes being programmed to perform operations (b), (c), and (d). In some embodiments, the one or more controllers include a feedback control scheme.
In some embodiments, the one or more controllers include a feed-forward control scheme. In some embodiments, the one or more controllers are operatively coupled to a data processing center that includes a cloud, a processor, or another sensor. In some embodiments, the data processing center may comprise a remote data processing center or a local data processing center. In some embodiments, the first sensor and the second sensor are disposed in the peripheral structure. In some embodiments, the data processing center/processor is located at a different location than the peripheral structure. In some embodiments, the one or more controllers comprise a wireless transceiver. In some embodiments, the one or more controllers include a processor and a memory including instructions that direct the processor to perform the obtaining, estimating, determining, and/or considering. In some embodiments, the second sensed data and the first sensed data have the same parameter. In some embodiments, the one or more controllers are configured to collect or direct collection of first sensed data of a first parameter that includes a characteristic of an environment of a peripheral structure in which the sensor is disposed and/or to which the sensor is attached. In some embodiments, the one or more controllers are configured to collect or direct collection of parameters including temperature, humidity, sound, force, pressure, electromagnetic waves, position, distance, movement, flow, acceleration, velocity, vibration, dust, light, glare, color, gas, or volatile compounds (e.g., other than gas). In some embodiments, at least two of operations (a), (b), and (c) are performed by the same controller of the one or more controllers. In some embodiments, at least two of operations (a), (b), and (c) are performed by different controllers of the one or more controllers.
In some embodiments, the one or more controllers are configured to perform or direct performance of at least one of operations (b) and (c) at a peripheral structure in which the sensor is disposed and/or to which the sensor is attached, or at a facility in which the peripheral structure is located. In some embodiments, the one or more controllers are operatively coupled to a first sensor that is part of a sensor ensemble that includes another sensor. In some embodiments, the other sensor measures a second parameter different from the first parameter measured by the sensor. In some embodiments, at least one controller of the one or more controllers is configured to be: (i) included in the ensemble; or (ii) communicatively coupled to a processor. In some embodiments, the at least one of the one or more controllers is configured to be directly coupled to the sensor ensemble. In some embodiments, direct coupling excludes intermediate devices. In some embodiments, the direct coupling comprises a cable. In some embodiments, direct coupling includes wired and/or wireless communication. In some embodiments, the one or more controllers are configured to perform or direct performance of at least one of operations (b) and (c) at a facility in which the peripheral structure is located, wherein the sensor is disposed in and/or attached to the peripheral structure. In some embodiments, the one or more controllers utilize a control scheme that includes feedback control to adjust at least one characteristic of an environment of a peripheral structure in which the sensor is disposed and/or to which the sensor is attached. In some embodiments, the control scheme utilizes data collected by the first sensor and/or the second sensor. In some embodiments, the one or more controllers are configured to determine or direct the determination of a time span and/or a time window.
In some embodiments, the one or more controllers are configured to collect or direct collection of third sensed data during the time window, and to allocate a time span by taking into account the third sensed data, such that the time span includes a plurality of data that facilitates separation of signal data from noise data. In some embodiments, the one or more controllers are configured to collect or direct collection of the third sensed data prior to using the sensor to: (i) collect first sensed data during the first duration; and (ii) collect second sensed data during the second duration. In some embodiments, the one or more controllers are configured to collect or direct collection of data during a time window of at least about one day. In some embodiments, the one or more controllers are configured to collect or direct collection of data over a duration of at least about thirty (30) minutes. In some embodiments, the one or more controllers are configured to assign or direct assignment of a baseline comprising a mean, median, mode, or intermediate range of the best sensing dataset. In some embodiments, the sensor is factory calibrated prior to using the sensor to: (i) collect first sensed data during the first duration; and (ii) collect second sensed data during the second duration.
In some embodiments, the sensor is a first sensor of a first type, and wherein assigning the baseline takes into account sensing data of a second sensor of the first type. In some embodiments, the second sensor is in close proximity to the first sensor such that there is no additional sensor of the first type between the first sensor and the second sensor. In some embodiments, the sensor is a first sensor of a first type, and wherein assigning the baseline takes into account sensing data of a second sensor of a second type. In some embodiments, the one or more controllers are configured to assign a baseline to the sensors or direct assignment of a baseline to the sensors, at least in part by taking into account external data. In some embodiments, the external data is not obtained by the sensor. In some embodiments, the external data includes historical data. In some embodiments, the external data includes using third party data. In some embodiments, the one or more controllers are configured to perform operations (a), (b), and (c) at least twice during the lifetime of the sensor. In some embodiments, the sensor is disposed at a location different from at least one production location thereof. In some embodiments, the first sensor and/or the second sensor are disposed in the location in which they are deployed. In some embodiments, the first duration partially overlaps the second duration. In some embodiments, the first duration does not overlap with the second duration. In some embodiments, the end of the first duration contacts the beginning of the second duration. In some embodiments, the end of the first duration is the beginning of the second duration. In some embodiments, the one or more controllers are configured to collect the first sensed data and the second sensed data when the first sensor and/or the second sensor are in their natural settings. 
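Where the baseline "takes into account sensing data of a second sensor" of the same type nearby, one simple scheme is a weighted blend of the sensor's own learned baseline with its neighbors'; the text does not specify a weighting, so the function below and its `weight` parameter are purely illustrative assumptions.

```python
def neighbor_adjusted_baseline(own_baseline, neighbor_baselines, weight=0.5):
    """Blend a sensor's self-learned baseline with the baselines of
    adjacent sensors of the same type. weight=1.0 ignores neighbors;
    weight=0.0 trusts them entirely."""
    if not neighbor_baselines:
        return own_baseline  # no neighbors: keep the self-learned value
    neighbor_avg = sum(neighbor_baselines) / len(neighbor_baselines)
    return weight * own_baseline + (1 - weight) * neighbor_avg

# A sensor reading 22.0 flanked by neighbors at 21.0 and 23.0 keeps 22.0,
# since the neighbor average agrees with its own baseline.
adjusted = neighbor_adjusted_baseline(22.0, [21.0, 23.0])
```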
In some embodiments, the one or more controllers are configured to collect or direct collection of the first sensed data and the second sensed data when the first sensor and/or the second sensor are not artificially perturbed for the purpose of collecting the sensed data. In some embodiments, the one or more controllers are configured to collect or direct collection of the first sensed data and the second sensed data from a first sensor and/or a second sensor different from the single pixel sensor. In some embodiments, the one or more controllers are configured to collect or direct collection of the first sensed data and/or the second sensed data as a plurality of data types related to the attribute. In some embodiments, the plurality of data types includes different intensities. In some embodiments, the property is electromagnetic radiation, and wherein the plurality of data types includes different wavelengths or different intensities. In some embodiments, the property is an acoustic wave, and wherein the plurality of data types includes different frequencies or different intensities. In some embodiments, the plurality of data types includes different physical locations. In some embodiments, the physical location is a relative location. In some embodiments, the sensor is included in a sensor array. In some embodiments, the physical location relates to a relative position in the sensor array.
In another aspect, a non-transitory computer program product for sensor calibration includes instructions recorded thereon which, when executed by one or more processors (e.g., operatively coupled to a first sensor and to a second sensor), cause the one or more processors to perform a method comprising: (a) collecting or directing collection of sensing data from the sensor during the time window; (b) evaluating or directing evaluation of the sensed data (e.g., the dataset) to obtain optimal sensed data (e.g., the dataset) with minimal variability greater than zero; and (c) assigning a baseline to the sensor, or directing assignment of a baseline to the sensor, by considering the best sensing data. In some embodiments, the sensed data includes first sensed data (e.g., a data set) collected during a first time duration, and second sensed data (e.g., a data set) collected during a second time duration. In some embodiments, the first duration is shorter than the time window. In some embodiments, the second duration is shorter than the time window. In some embodiments, evaluating the sensed data includes comparing the first sensed data to the second sensed data to find the best sensed data (e.g., the best set of sensed data). In some embodiments, the length of time of the first duration is different from the length of time of the second duration. In some embodiments, the length of time of the first duration is equal to or substantially equal to the length of time (e.g., the time span) of the second duration.
For example, a non-transitory computer program product for sensor calibration, the non-transitory computer program product including instructions recorded thereon which, when executed by one or more processors, cause the one or more processors to perform a method comprising: (a) using the sensor to: (i) collect first sensing data during a first duration, and (ii) collect second sensing data during a second duration, the first duration and the second duration occurring during a time window, the first duration having a first start time, and the second duration having a second start time, wherein a time span of the first duration is equal or approximately equal to a time span of the second duration; (b) evaluating the first sensed data and the second sensed data to obtain optimal sensed data, the optimal sensed data having a minimum variability greater than zero; and (c) assigning a baseline to the sensor by taking into account the optimal sensing data.
For example, a non-transitory computer program product for sensor calibration in a facility, the non-transitory computer program product containing instructions recorded thereon that, when executed by one or more processors operatively coupled to a sensor, cause the one or more processors to perform operations comprising: (a) using the sensor to: (i) collect or direct collection of first sensed data during a first duration, and (ii) collect or direct collection of second sensed data during a second duration, the first duration and the second duration occurring during a time window, the first duration having a first start time and the second duration having a second start time, wherein a time span of the first duration is equal or approximately equal to a time span of the second duration, wherein the sensor is included in a sensor array disposed in the facility; (b) evaluating or directing evaluation of the first sensed data and the second sensed data to obtain best sensed data, the best sensed data having a minimum variability greater than zero; and (c) assigning a baseline to the sensor, or directing assignment of a baseline to the sensor, by considering the best sensing data.
In some embodiments, the sensor is housed in a housing that includes (i) the sensor or (ii) the sensor and the emitter as part of a device ensemble. In some embodiments, the sensor arrays are configured to operate in a coordinated manner. In some embodiments, the operations include cooperatively adjusting, or directing adjustment of, the environment of the facility at least in part by using data from the sensor array. In some embodiments, the one or more controllers are operatively coupled to a data processing center that includes a cloud, a processor, or another sensor. In some embodiments, the data processing center may comprise a remote data processing center or a local data processing center. In some embodiments, the sensor is disposed in and/or attached to the peripheral structure. In some embodiments, the data processing center is disposed at a location that is different from the peripheral structure in which the sensor is disposed and/or to which the sensor is attached. In some embodiments, the operations further comprise determining a time window and/or a time span prior to operation (a). In some embodiments, the operations further comprise collecting third sensing data during the time window and assigning a time span by considering the third sensing data such that the time span comprises a plurality of data that facilitates separation of signal data from noise data.
In some embodiments, the time window is at least about one day. In some embodiments, the duration is at least about thirty (30) minutes. In some embodiments, the baseline comprises an average, median, mode, or mid-range of the best sensed dataset. In some embodiments, the sensor is factory calibrated prior to using the sensor to: (i) collect first sensed data during the first duration; and (ii) collect second sensed data during the second duration. In some embodiments, the sensor is a first sensor of a first type, and wherein assigning the baseline takes into account sensing data of a second sensor of the first type. In some embodiments, the second sensor is proximate to the first sensor such that there is no additional sensor of the first type between the first sensor and the second sensor. In some embodiments, the sensor is a first sensor of a first type, and wherein assigning the baseline takes into account sensing data of a second sensor of a second type. In some embodiments, assigning a baseline to the sensor takes into account external data. In some embodiments, the external data is not obtained by the sensor. In some embodiments, the external data includes historical data. In some embodiments, the external data includes third-party data. In some embodiments, this operation is performed at least twice during the lifetime of the sensor. In some embodiments, the sensor is disposed at a location different from at least one production location thereof. In some embodiments, the sensor is disposed in the location where it is deployed. In some embodiments, the first duration partially overlaps with the second duration. In some embodiments, the first duration does not overlap with the second duration. 
In some embodiments, the end of the first duration abuts the beginning of the second duration. In some embodiments, the end of the first duration is the beginning of the second duration. In some embodiments, the collecting of the sensed data using the first sensor and/or the second sensor is in a natural setting. In some embodiments, collecting sensed data using the first sensor and/or the second sensor is in a setting that is not artificially perturbed for the purpose of collecting sensed data. In some embodiments, the first sensor and/or the second sensor is different from a single-pixel sensor. In some embodiments, a plurality of data types related to an attribute are collected using the first sensor and/or the second sensor. In some embodiments, the plurality of data types includes different intensities. In some embodiments, the attribute is electromagnetic radiation, and wherein the plurality of data types includes different wavelengths or different intensities. In some embodiments, the attribute is an acoustic wave, and wherein the plurality of data types includes different frequencies or different intensities. In some embodiments, the plurality of data types includes different physical locations. In some embodiments, the physical location is a relative location. In some embodiments, the sensor is included in a sensor array. In some embodiments, the physical location relates to a relative position in the sensor array. Members of an array of devices (e.g., sensors) can cooperate to facilitate comprehensive analysis. For example, data from members of the device array may be complementary to each other. 
For example, data corresponding to one member of the device array may be analyzed and a conclusion drawn; data corresponding to a different member of the device array may likewise be analyzed and a conclusion drawn (e.g., the conclusions may complement each other to generate a more complete analysis of the environment of the peripheral structure to which the device array relates and/or in which the device array is disposed).
In another aspect, a system for sensor calibration includes a sensor and circuitry configured to perform a method, the method comprising: (a) performing, via one or more controllers: (i) collecting first sensed data during a first duration, and (ii) collecting second sensed data during a second duration, the first duration and the second duration occurring during a time window, the first duration having a first start time, and the second duration having a second start time, wherein a time span of the first duration is at least approximately equal to a time span of the second duration; (b) evaluating the first sensed data and the second sensed data to obtain best sensed data, the best sensed data having a minimum variability greater than zero; and (c) assigning the best sensed data to the sensor as a baseline in response to considering the best sensed data. For example, a system for sensor calibration in a facility, the system comprising a sensor and circuitry, the sensor and the circuitry configured to perform a method comprising: (a) performing, via one or more controllers: (i) collecting first sensed data during a first duration; and (ii) collecting second sensed data during a second duration, the first duration and the second duration occurring during a time window, the first duration having a first start time and the second duration having a second start time, wherein a time span of the first duration is at least approximately equal to a time span of the second duration, the first sensed data and the second sensed data collected from sensors included in a sensor array disposed in the facility; (b) evaluating the first sensed data and the second sensed data to obtain best sensed data, the best sensed data having a minimum variability greater than zero; and (c) assigning the best sensed data to the sensor as a baseline in response to considering the best sensed data.
In another aspect, a method of sensor calibration includes: (a) obtaining a first reading of a first parameter from a first sensor disposed at a first location in a peripheral structure and a second reading of the first parameter from a second sensor disposed at a second location in the peripheral structure; (b) estimating a predicted value of the first parameter at the first location using (e.g., based at least in part on) the second reading; (c) determining a difference between (I) the estimated predicted value of the first parameter and (II) the first reading of the first parameter; and (d) considering the difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter to modify the first reading of the first parameter. For example, a method of sensor calibration in a facility, the method comprising: (a) obtaining a first reading of a first parameter from a first sensor disposed at a first location in the peripheral structure and a second reading of the first parameter from a second sensor disposed at a second location in the peripheral structure, wherein the first sensor and the second sensor are included in a sensor array disposed in the facility; (b) estimating a predicted value of the first parameter at the first location using the second reading; (c) determining a difference between (I) the estimated predicted value of the first parameter and (II) the first reading of the first parameter; and (d) taking into account the difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter to modify the first reading of the first parameter.
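Operations (a)-(d) can be sketched as follows. Equating the predicted value at the first location with the second sensor's reading, and applying the difference as an additive offset, are illustrative assumptions; the class name and the temperature figures are hypothetical.

```python
class NeighborCalibrator:
    """Sketch of operations (a)-(d): the difference between a predicted
    value (here, simply the second sensor's reading of the same
    parameter) and the first sensor's reading is treated as an offset
    used to modify the first sensor's readings."""

    def __init__(self):
        self.offset = 0.0

    def update(self, first_reading, second_reading):
        predicted = second_reading                 # (b) predicted value at first location
        self.offset = predicted - first_reading    # (c) difference
        return self.modify(first_reading)          # (d) modified first reading

    def modify(self, reading):
        # Subsequent readings are shifted by the same offset.
        return reading + self.offset

cal = NeighborCalibrator()
# First sensor reads 21.4; a nearby second sensor reads 22.0.
print(cal.update(21.4, 22.0))  # prints 22.0
print(cal.modify(21.9))        # prints 22.5
```

In practice the prediction could instead come from a model of the space (e.g., accounting for distance to a window or an HVAC vent); the additive offset is only the simplest case.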
In some embodiments, the calibration is of a sensor disposed in the facility. In some embodiments, the first sensor and the second sensor are included in a sensor array disposed in the facility. In some embodiments, the sensor arrays are configured to operate in a coordinated manner. In some embodiments, the method further comprises cooperatively adjusting the environment of the facility, at least in part, by using data from the sensor array. In some embodiments, the first parameter includes a characteristic of an environment of the peripheral structure. In some embodiments, the characteristic of the environment of the peripheral structure comprises temperature, humidity, sound, force, pressure, electromagnetic waves, position, distance, movement, flow, acceleration, velocity, vibration, dust, light, lux, glare, color, gas, and/or a volatile compound (e.g., other than a gas). Lux measures luminous flux per unit area (e.g., lumens per square meter). Lux can be used as a measure of the intensity of illumination, or of light passing through a surface (e.g., through a window), as perceived by the human eye. In some embodiments, at least one of operations (b), (c), and (d) is performed in real time. In some embodiments, real time includes a period of up to one hour from the end of obtaining the first reading of the first parameter. In some embodiments, at least one of operations (b), (c), and (d) is performed at the peripheral structure, or at a facility in which the peripheral structure is located. In some embodiments, the first sensor is part of (e.g., included in) a device ensemble that includes another sensor and/or an emitter. In some embodiments, the other sensor measures a second parameter different from the first parameter. In some embodiments, the sensor ensemble (i) includes a processor, or (ii) is communicatively coupled to a processor. In some embodiments, the processor is directly coupled to the sensor ensemble. 
In some embodiments, direct coupling excludes intermediate devices. In some embodiments, the direct coupling comprises a cable. In some embodiments, direct coupling includes wired and/or wireless communication. In some embodiments, at least one of operations (b), (c), and (d) is performed by a processor, or at a facility in which the peripheral structure is located. In some embodiments, the method further comprises, prior to operation (b): obtaining a first reading of the first parameter from one or more additional sensors disposed at one or more locations different from the first location and the second location, the one or more locations being in the peripheral structure, and wherein estimating a predicted value of the first parameter at the first location utilizes the second reading and the one or more readings of the one or more additional sensors. In some embodiments, the one or more locations are different from the first location and the second location.
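When readings from one or more additional sensors are available, the predicted value of the first parameter at the first location can be estimated from all of them. The disclosure only requires that the second reading and the additional readings be utilized; inverse-distance weighting, and the 2-D coordinate representation, are illustrative assumptions.

```python
def predict_at_location(target_pos, neighbors):
    """Estimate the first parameter at `target_pos` from readings of
    other sensors of the same type. `neighbors` is a list of
    (position, reading) pairs, with positions as (x, y) tuples in
    arbitrary units. Inverse-distance weighting is one plausible
    scheme, not the one mandated by the disclosure."""
    num = den = 0.0
    for (x, y), reading in neighbors:
        d2 = (x - target_pos[0]) ** 2 + (y - target_pos[1]) ** 2
        if d2 == 0:          # co-located sensor: use its reading directly
            return reading
        w = 1.0 / d2         # closer sensors weigh more
        num += w * reading
        den += w
    return num / den
```

For two equidistant neighbors the estimate reduces to their average, which matches the intuition that the first location sits "between" them.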
In another aspect, an apparatus for sensor calibration includes one or more controllers configured to: (a) operatively couple to a first sensor and to a second sensor; (b) obtain, or direct obtaining of, a first reading of a first parameter from the first sensor disposed at a first location in a peripheral structure, and obtain, or direct obtaining of, a second reading of the first parameter from the second sensor disposed at a second location in the peripheral structure; (c) estimate, or direct estimation of, a predicted value of the first parameter at the first location based at least in part on the second reading; (d) determine, or direct determination of, a difference between (I) the estimated predicted value of the first parameter and (II) the first reading of the first parameter; and (e) consider, or direct consideration of, the difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter to modify the first reading of the first parameter. 
For example, an apparatus for sensor calibration in a facility, the apparatus comprising one or more controllers (e.g., comprising circuitry) configured to: (a) operatively couple to a first sensor and to a second sensor, the first and second sensors being included in a sensor array disposed in the facility; (b) obtain, or direct obtaining of, a first reading of a first parameter from the first sensor disposed at a first location in the peripheral structure, and obtain, or direct obtaining of, a second reading of the first parameter from the second sensor disposed at a second location in the peripheral structure; (c) estimate, or direct estimation of, a predicted value of the first parameter at the first location based at least in part on the second reading; (d) determine, or direct determination of, a difference between (I) the estimated predicted value of the first parameter and (II) the first reading of the first parameter; and (e) consider, or direct consideration of, the difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter to modify the first reading of the first parameter.
In some embodiments, the sensor is disposed in a facility. In some embodiments, the first sensor and the second sensor are included in a sensor array disposed in the facility. In some embodiments, the sensor arrays are configured to operate in a coordinated manner. In some embodiments, the one or more controllers are configured to cooperatively adjust, or direct adjustment of, the environment of the facility at least in part by using data from the sensor array. In some embodiments, the one or more controllers comprise circuitry. In some embodiments, the one or more controllers comprise a wireless transceiver. In some embodiments, the one or more controllers include a processor and a memory including instructions to direct the processor to perform the obtaining, estimating, determining, and/or considering. In some embodiments, the one or more controllers are configured to obtain or direct the obtaining of a first reading of a first parameter comprising a characteristic of an environment of the peripheral structure. In some embodiments, at least two of operations (a), (b), (c), and (d) are performed by the same controller of the one or more controllers. In some embodiments, at least two of operations (a), (b), (c), and (d) are performed by different controllers of the one or more controllers. In some embodiments, the one or more controllers are configured to obtain or direct the obtaining of a first reading of a first parameter comprising temperature, humidity, sound, force, pressure, electromagnetic waves, position, distance, movement, flow, acceleration, velocity, vibration, dust, light, glare, color, gas, or a volatile compound (e.g., other than a gas). In some embodiments, the one or more controllers are configured to perform or direct the performance of at least one of operations (b), (c), and (d) in real time. In some embodiments, real time includes a time period of up to one hour from the end of obtaining the first reading of the first parameter. 
In some embodiments, the one or more controllers are configured to perform or direct performance of at least one of operations (b), (c), and (d) at the peripheral structure or at a facility in which the peripheral structure is located. In some embodiments, the one or more controllers are operatively coupled to a first sensor that is part of a device ensemble that includes another sensor or an emitter. In some embodiments, the other sensor measures a second parameter different from the first parameter. In some embodiments, at least one of the one or more controllers is configured to be: (i) included in the ensemble; or (ii) communicatively coupled to a processor. In some embodiments, the at least one of the one or more controllers is configured to be directly coupled to the sensor ensemble. In some embodiments, direct coupling excludes intermediate devices. In some embodiments, the direct coupling comprises a cable. In some embodiments, the direct coupling includes wired and/or wireless communication. In some embodiments, the one or more controllers are configured to perform or direct performance of at least one of operations (b), (c), and (d) at a facility in which the peripheral structure is located. In some embodiments, the one or more controllers are further operatively coupled to one or more additional sensors, and wherein prior to operation (b), the one or more controllers are configured to obtain or direct obtaining of one or more readings of the first parameter from the one or more additional sensors disposed at one or more locations different from the first location and the second location, the one or more locations being in the peripheral structure, and wherein the one or more controllers are configured to estimate or direct estimation of the predicted value of the first parameter at the first location by utilizing the second reading and the one or more readings from the one or more additional sensors. 
In some embodiments, the one or more locations are different from the first location and the second location. In some embodiments, the one or more controllers adjust at least one characteristic of the environment of the peripheral structure using a control scheme that includes feedback control. In some embodiments, the control scheme utilizes data collected by the first sensor and/or the second sensor.
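The feedback-control embodiment can be sketched as a simple proportional loop driven by the (calibrated) sensor data. The proportional form and the gain value are illustrative assumptions; the disclosure only states that the control scheme includes feedback control and utilizes data collected by the first and/or second sensor.

```python
def feedback_step(setpoint, sensed, gain=0.5):
    """One iteration of a feedback control scheme that adjusts a
    characteristic of the environment (e.g., temperature) toward a
    setpoint using a sensor reading."""
    error = setpoint - sensed   # deviation reported by the sensor
    return gain * error         # adjustment applied to the environment

# Hypothetical usage: drive a sensed temperature toward a setpoint of 22.0,
# assuming the environment responds directly to each adjustment.
temperature = 19.0
for _ in range(10):
    temperature += feedback_step(setpoint=22.0, sensed=temperature)
```

Because each step removes half the remaining error, the sensed value converges toward the setpoint; a miscalibrated sensor would make the loop converge to the wrong temperature, which is why the calibration operations above matter for control.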
In another aspect, a non-transitory computer program product for sensor calibration, the non-transitory computer program product containing instructions recorded thereon, which when executed by one or more processors (e.g., operatively coupled to a first sensor and to a second sensor) cause the one or more processors to perform a method comprising: (a) obtaining or directing to obtain a first reading of a first parameter from the first sensor disposed at a first location in a peripheral structure and obtaining or directing to obtain a second reading of the first parameter from the second sensor disposed at a second location in the peripheral structure; (b) estimating or directing estimation of a predicted value of the first parameter at the first location based at least in part on the second reading; (c) determining or directing a determination of a difference between (I) the estimated predicted value of the first parameter and (II) the first reading of the first parameter; and (d) considering or directing consideration of the difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter to modify the first reading of the first parameter. 
For example, a non-transitory computer program product for sensor self-calibration in a facility, the non-transitory computer program product containing instructions recorded thereon, which when executed by one or more processors operatively coupled to first and second sensors included in a sensor array disposed in the facility, cause the one or more processors to perform a method comprising: (a) obtaining or directing to obtain a first reading of a first parameter from the first sensor disposed at a first location in the peripheral structure and obtaining or directing to obtain a second reading of the first parameter from the second sensor disposed at a second location in the peripheral structure; (b) estimating or directing estimation of a predicted value of the first parameter at the first location based at least in part on the second reading; (c) determining or directing a determination of a difference between (I) the estimated predicted value of the first parameter and (II) the first reading of the first parameter; and (d) considering or directing to consider the difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter to modify the first reading of the first parameter.
In some embodiments, the first sensor and the second sensor are included in a sensor array disposed in the facility. In some embodiments, the sensor arrays are configured to operate in a coordinated manner. In some embodiments, the operations comprise cooperatively adjusting, or directing adjustment of, an environment of the facility at least in part by using data from the sensor array. In some embodiments, the one or more processors are coupled to or have access to one or more memory circuits. In some embodiments, the first parameter includes a characteristic of an environment of the peripheral structure. In some embodiments, the characteristic of the environment of the peripheral structure comprises temperature, humidity, sound, force, pressure, electromagnetic waves, position, distance, movement, flow, acceleration, velocity, vibration, dust, light, glare, color, gas, or a volatile compound (e.g., other than a gas). In some embodiments, at least one of operations (b), (c), and (d) is performed in real time. In some embodiments, real time includes a period of up to one hour from the end of obtaining the first reading of the first parameter. In some embodiments, at least one of (b), (c), and (d) is performed by at least one of the one or more processors disposed at the peripheral structure or at a facility in which the peripheral structure is located. In some embodiments, the first sensor is part of a sensor ensemble that includes another sensor. In some embodiments, the other sensor measures a second parameter different from the first parameter. In some embodiments, the sensor ensemble (i) includes at least one processor of the one or more processors, or (ii) is communicatively coupled to the one or more processors. In some embodiments, at least one processor of the one or more processors is directly coupled to the sensor ensemble. In some embodiments, direct coupling excludes intermediate devices. In some embodiments, the direct coupling comprises a cable. 
In some embodiments, direct coupling includes wired and/or wireless communication. In some embodiments, at least one of operations (b), (c), and (d) is performed by at least one of the one or more processors of the ensemble, or at a facility in which the peripheral structure is located. In some embodiments, the operations further comprise, prior to operation (b): obtaining a first reading of the first parameter from one or more additional sensors disposed at one or more locations different from the first location and the second location, the one or more locations being in the peripheral structure, and wherein estimating a predicted value of the first parameter at the first location utilizes the second reading and the one or more readings of the one or more sensors. In some embodiments, the one or more locations are different from the first location and the second location.
In another aspect, a system for performing sensor calibration includes: one or more first sensors disposed in a facility (e.g., a peripheral structure), wherein the one or more first sensors are calibrated, wherein the peripheral structure is a target location for the one or more first sensors; one or more second sensors in the peripheral structure, wherein the one or more second sensors are uncalibrated or miscalibrated; and one or more controllers operatively coupled with the one or more first sensors and the one or more second sensors, the one or more controllers utilizing sensor measurements obtained from the one or more first sensors to calibrate and/or recalibrate the one or more second sensors.
In some embodiments, the calibration of the sensors is performed in the facility. In some embodiments, the one or more first sensors and the one or more second sensors are included in a sensor array disposed in the facility. In some embodiments, the first sensor is part of a device ensemble that includes another sensor or an emitter. In some embodiments, the other sensor measures a second parameter different from the first parameter. In some embodiments, the sensor arrays are configured to operate in a coordinated manner. In some embodiments, the operations include cooperatively adjusting, or directing adjustment of, an environment of the facility at least in part by using data from the sensor array. In some embodiments, the one or more first sensors are calibrated in the peripheral structure. In some embodiments, the one or more controllers are at least partially wired to: (A) the one or more first sensors; and (B) the one or more second sensors. In some embodiments, at least one of the one or more controllers is disposed on an electronic board, and at least one of the one or more first sensors is disposed on the electronic board. In some embodiments, the one or more controllers are at least partially wirelessly coupled to: (A) the one or more first sensors; and (B) the one or more second sensors. In some embodiments, the one or more controllers operate to provide adjustment values to the one or more second sensors. In some embodiments, at least a portion of the one or more controllers are wirelessly coupled to each other. In some embodiments, the one or more controllers are wirelessly coupled to at least a portion of the one or more first sensors and/or at least a portion of the one or more second sensors. In some embodiments, the coupling between the one or more controllers and (i) at least a portion of the one or more first sensors and (ii) at least a portion of the one or more second sensors is at least partially wireless.
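The controller's role of deriving adjustment values for the uncalibrated (or miscalibrated) second sensors from the calibrated first sensors can be sketched as follows. Using the mean of the calibrated readings as the reference, and an additive adjustment, are illustrative assumptions; any statistic of the first sensors' measurements could serve as the reference.

```python
def adjustment_values(calibrated_readings, uncalibrated_readings):
    """Sketch of the controller operation above: compute per-sensor
    adjustment values for the second sensors from measurements of
    calibrated first sensors co-sensing the same parameter."""
    reference = sum(calibrated_readings) / len(calibrated_readings)
    # Each second sensor receives the offset that would bring its
    # reading onto the reference established by the first sensors.
    return [reference - r for r in uncalibrated_readings]
```

A second sensor would then add its adjustment value to subsequent raw readings, effectively being recalibrated in place without returning to a production location.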
In another aspect, a system for calibration in a sensor cluster, the system comprising: a first sensor of a plurality of sensors, the first sensor disposed at a first location; and a second sensor of the plurality of sensors, the second sensor disposed at a second location, the second sensor operatively coupled to the first sensor, the second sensor configured to: (a) obtain a first reading of a first parameter from the first sensor; (b) receive an estimate of a predicted value of the first parameter, or estimate the predicted value of the first parameter and generate an estimated predicted value; (c) receive a determination of a difference between (I) the estimated predicted value of the first parameter and (II) the first reading of the first parameter, or determine the difference; and (d) receive consideration of, or take into account, the difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter to modify the first reading of the first parameter.
In some embodiments, the calibration of the sensor is performed in the facility. In some embodiments, the sensors are included in a sensor array disposed in the facility. In some embodiments, the first sensor is part of a device ensemble that includes another sensor or an emitter. In some embodiments, the other sensor measures a second parameter different from the first parameter. In some embodiments, the sensor arrays are configured to operate in a coordinated manner. In some embodiments, the operations include cooperatively adjusting, or directing adjustment of, an environment of the facility at least in part by using data from the sensor array. In some embodiments, the first sensor and the second sensor are disposed within the peripheral structure. In some embodiments, an estimate of the predicted value of the first parameter is received from a cloud, a factory, and/or a data processing center. In some embodiments, the determination of the predicted value of the first parameter is performed by the cloud, the factory, and/or the data processing center. In some embodiments, the consideration of the predicted value of the first parameter is performed by the cloud, the factory, and/or the data processing center. In some embodiments, the first reading of the first parameter is modified by the second sensor to generate a modified first reading of the first parameter. In some embodiments, the second sensor operates to convert the modified first reading of the first parameter to a correction factor used by the first sensor.
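The conversion of a modified first reading into a correction factor for the first sensor can be sketched as follows. A multiplicative (rather than additive) factor, and the temperature figures, are illustrative assumptions; the disclosure does not specify the form of the factor.

```python
def correction_factor(modified_reading, raw_reading):
    """Convert the modified first reading of the first parameter into
    a correction factor used by the first sensor: the ratio of the
    modified reading to the raw reading."""
    return modified_reading / raw_reading

# Hypothetical usage: the first sensor scales later raw readings by the factor.
factor = correction_factor(22.0, 20.0)  # modified vs. raw reading
corrected = 20.5 * factor               # a subsequent raw reading, corrected
```

A multiplicative factor is natural for gain errors, while the additive offset shown earlier suits bias errors; a real sensor model might combine both.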
In another aspect, a method for adjusting an environment includes: (a) connecting to a virtual reality module to view selected sensed characteristics of the environment; and (b) adjusting a sensed characteristic of the environment using the virtual reality module (e.g., to adjust the subsequently sensed characteristic of the environment). For example, a method for adjusting an environment of a facility, the method comprising: (a) connecting to a virtual reality module to view selected sensed characteristics of the environment, the selected sensed characteristics being sensed by a sensor array disposed in the facility; and (b) adjusting the sensed characteristic of the environment using the virtual reality module.
In some embodiments, connecting to the virtual reality module comprises connecting to a virtual reality portal.
In some embodiments, connecting to the virtual reality module includes wearing a virtual reality device (e.g., including glasses). In some embodiments, the virtual reality simulates one or more fixed structures in the environment. In some embodiments, the simulation of one or more fixed structures in the environment occurs in real-time. In some embodiments, the virtual reality module is communicatively coupled to one or more sensors that sense a characteristic of the environment, e.g., the one or more sensors are part of a sensor array.
In some embodiments, the sensor arrays are configured to operate in a coordinated manner. In some embodiments, the method further comprises cooperatively adjusting the environment of the facility, at least in part, by using data from the sensor array. In some embodiments, the one or more sensors are part of one or more sensor ensembles or one or more device ensembles. In some embodiments, the device ensemble comprises (i) a sensor or (ii) a sensor and an emitter. In some embodiments, at least two of the one or more sensors are of the same type and are disposed in different locations in the environment. In some embodiments, at least two sensors of the one or more sensors are of different types and are disposed in the same sensor ensemble of the one or more sensor ensembles. In some embodiments, the virtual reality module facilitates viewing changes in the sensed characteristic of the environment as a result of adjusting the sensed characteristic (e.g., a feature of the environment). In some embodiments, adjusting the sensed characteristic includes changing operation of one or more components disposed in and/or affecting the environment. In some embodiments, the one or more components comprise a window, an HVAC system, or a light. In some embodiments, the sensed characteristic is a first sensed characteristic, and wherein the method further comprises selecting a second sensed characteristic to view and/or adjust using the virtual reality module. In some embodiments, the virtual reality module facilitates viewing and/or adjusting a plurality of environmental characteristics. In some embodiments, the viewing and/or adjusting of at least two of the plurality of environmental characteristics is performed sequentially. In some embodiments, the viewing and/or adjusting of at least two of the plurality of environmental characteristics at least partially overlap in their occurrence. 
In some embodiments, the viewing and/or adjusting of at least two of the plurality of environmental characteristics is performed simultaneously. In some embodiments, the virtual reality module is communicatively coupled to a network. In some embodiments, the network comprises a building management network. In some embodiments, the network includes a hierarchy of controllers.
In another aspect, a non-transitory computer program product for adjusting an environment, the non-transitory computer program product including instructions recorded thereon, which, when executed by one or more processors, cause the one or more processors to perform operations of a method comprising: (a) simulating or directing simulation of a virtual reality projection of the environment to view selected sensed characteristics of the environment; and (b) using or directing use of the virtual reality projection to facilitate adjustment (e.g., adjusting or directing adjustment) of a sensed characteristic of the environment (e.g., to adjust a subsequently sensed characteristic of the environment). For example, a non-transitory computer program product for adjusting an environment of a facility, the non-transitory computer program product containing instructions recorded thereon that, when executed by one or more processors, cause the one or more processors to perform operations comprising: (a) simulating or directing simulation of a virtual reality projection of the environment to view selected sensed characteristics of the environment, the selected sensed characteristics being sensed by at least one sensor of a sensor array disposed in the facility; and (b) using or directing use of the virtual reality projection to facilitate adjustment of a sensed characteristic of the environment (e.g., adjusting a characteristic of the sensed environment).
In some embodiments, the sensor array comprises an assembly of devices comprising (i) a sensor or (ii) a sensor and an emitter. In some embodiments, the sensor arrays are configured to operate in a coordinated manner. In some embodiments, the operations include adjusting or directing the environment of the adjustment facility cooperatively, at least in part, by using data from the sensor array. In some embodiments, the operations further comprise connecting or instructing connection to a virtual reality portal. In some embodiments, connecting to the virtual reality portal includes wearing virtual reality devices (e.g., including glasses). In some embodiments, the virtual reality projection of the simulated environment includes one or more fixed structures in the simulated environment. In some embodiments, the simulation of one or more fixed structures in the environment occurs in real-time. In some embodiments, the one or more processors are communicatively coupled to one or more sensors that sense a characteristic of the environment. In some embodiments, the one or more sensors are part of one or more sensor assemblies. In some embodiments, at least two of the one or more sensors are of the same type and are disposed in different locations in the environment. In some embodiments, at least two sensors of the one or more sensors are of different types and are disposed in the same sensor complex of the one or more sensor complexes. In some embodiments, the virtual reality projection facilitates viewing changes in the sensed characteristic of the environment as a result of adjusting the sensed characteristic (e.g., a feature of the environment). In some embodiments, adjusting the sensing characteristic includes changing operation of one or more components disposed in and/or affecting the environment. In some embodiments, the one or more components include a window, an HVAC system, or a light. 
In some embodiments, the sensed characteristic is a first sensed characteristic, and wherein the operations further comprise selecting or directing selection of a second sensed characteristic for viewing and/or adjustment using the virtual reality projection. In some embodiments, the virtual reality projection facilitates viewing and/or adjusting a plurality of environmental characteristics. In some embodiments, the viewing and/or adjusting of at least two of the plurality of environmental characteristics is performed sequentially. In some embodiments, the viewing and/or adjustment of at least two of the plurality of environmental characteristics at least partially overlaps in their occurrence. In some embodiments, the viewing and/or adjusting of at least two of the plurality of environmental characteristics is performed simultaneously. In some embodiments, the one or more processors are communicatively coupled to a network. In some embodiments, the network comprises a building management network. In some embodiments, the network includes a hierarchy of controllers.
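The two recited operations (projecting a simulated environment so that a sensed characteristic can be viewed, then using the projection to help adjust that characteristic) can be sketched in code. The Python sketch below is illustrative only; the names SensorArray, VRProjection, and adjust, as well as the proportional-adjustment rule, are assumptions for the sake of the example and are not defined in the disclosure.

```python
# Illustrative sketch, not the disclosed implementation: operation (a) views a
# sensed characteristic through a simulated projection; operation (b) derives
# an adjustment (e.g., one that could be sent to an HVAC or window controller).
from dataclasses import dataclass, field

@dataclass
class SensorArray:
    """Readings keyed by (sensor_type, location); an assumed data layout."""
    readings: dict = field(default_factory=dict)

    def sense(self, characteristic: str) -> float:
        # Average all sensors of the requested type (e.g., "temperature").
        values = [v for (kind, _), v in self.readings.items() if kind == characteristic]
        return sum(values) / len(values)

@dataclass
class VRProjection:
    """Minimal stand-in for operation (a): render a sensed value in the simulation."""
    frames: list = field(default_factory=list)

    def view(self, characteristic: str, value: float) -> None:
        self.frames.append(f"{characteristic}={value:.1f}")

def adjust(array: SensorArray, projection: VRProjection,
           characteristic: str, setpoint: float, gain: float = 0.5) -> float:
    """Operation (b): nudge the environment toward a setpoint; returns a
    predicted post-adjustment value under a simple proportional rule."""
    current = array.sense(characteristic)
    projection.view(characteristic, current)  # characteristic viewed via the projection
    correction = gain * (setpoint - current)  # e.g., relayed to a component controller
    return current + correction
```

For example, two temperature sensors reading 26.0 and 24.0 average to 25.0; with a setpoint of 22.0 and gain 0.5, the predicted adjusted value is 23.5, and the projection records one frame showing the viewed value.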
In another aspect, an apparatus for environmental adjustment includes one or more controllers configured to, individually or simultaneously: (a) operatively couple to a virtual reality simulator; (b) direct the virtual reality simulator to project a virtual reality projection of an environment to view a selected sensed characteristic of the environment; and (c) use or direct use of the virtual reality projection to facilitate adjusting (e.g., adjust or direct adjustment of) the sensed characteristic of the environment (e.g., to adjust a characteristic of the environment that is subsequently sensed). For example, an apparatus for environmental adjustment of a facility comprises one or more controllers comprising circuitry configured to, separately or simultaneously: (a) operatively couple to a virtual reality simulator; (b) direct the virtual reality simulator to project a virtual reality projection of the environment to view a selected sensed characteristic of the environment, the selected sensed characteristic being sensed by a sensor array disposed in the facility; and (c) use, or direct the virtual reality simulator to use, the virtual reality projection to facilitate adjustment of the sensed characteristic of the environment.
In some embodiments, the selected sensing characteristic is sensed by at least one sensor of an array of sensors disposed in the facility. In some embodiments, the sensors of the sensor array are housed in a housing that includes (i) the sensors or (ii) the sensors and the emitter as part of the device aggregate. In some embodiments, the sensor arrays are configured to operate in a coordinated manner. In some embodiments, the one or more controllers are configured to cooperatively adjust or direct the adjustment of the environment of the facility at least in part by using data from the sensor array. In some embodiments, the one or more controllers comprise circuitry. In some embodiments, the one or more controllers are configured to facilitate connection of the user to the virtual reality portal. In some embodiments, the virtual reality portal includes virtual reality devices (e.g., including glasses). In some embodiments, the one or more controllers are configured to direct the virtual reality simulator to simulate the environment, including simulating one or more fixed structures in the environment. In some embodiments, the one or more controllers are configured to direct the virtual reality simulator to simulate the one or more fixed structures in the environment in real-time. In some embodiments, the one or more controllers are configured to be communicatively coupled to one or more sensors that sense a characteristic of the environment. In some embodiments, the one or more sensors are part of one or more sensor assemblies. In some embodiments, at least two of the one or more sensors are of the same type and are disposed in different locations in the environment. In some embodiments, at least two sensors of the one or more sensors are of different types and are disposed in the same sensor complex of the one or more sensor complexes. 
In some embodiments, the virtual reality projection facilitates viewing changes in the sensed characteristic of the environment as a result of adjusting the sensed characteristic (e.g., a feature of the environment). In some embodiments, the one or more controllers are configured to direct adjustments to the sensed characteristic, including directing altered operation of one or more components disposed in and/or affecting the environment. In some embodiments, the one or more components include a window, an HVAC system, or a light. In some embodiments, the sensing characteristic is a first sensing characteristic, and wherein the one or more controllers are configured to facilitate selection of a second sensing characteristic for viewing and/or adjustment using the virtual reality projection, the selection being made by the user. In some embodiments, the one or more controllers are configured to facilitate viewing and/or adjusting a plurality of environmental characteristics by using virtual reality projections. In some embodiments, the one or more controllers are configured to facilitate sequentially viewing and/or adjusting at least two of the plurality of environmental characteristics by using virtual reality projections. In some embodiments, the one or more controllers are configured to facilitate viewing and/or adjusting at least two of the plurality of environmental characteristics to at least partially overlap in their occurrence by using virtual reality projections. In some embodiments, the one or more controllers are configured to view and/or adjust at least two of the plurality of environmental characteristics simultaneously by using virtual reality projections. In some embodiments, the one or more controllers are communicatively coupled to a network. In some embodiments, the network comprises a building management network. In some embodiments, the network includes a hierarchy of controllers.
In some embodiments, a non-transitory computer program product includes at least one medium (e.g., a non-transitory computer-readable medium).
In another aspect, the present disclosure provides a system, apparatus (e.g., controller) and/or non-transitory computer-readable medium (e.g., software) that implements any of the methods disclosed herein.
In another aspect, the present disclosure provides a method of using any of the systems, computer-readable media, and/or devices disclosed herein, for example, for their intended purposes.
In another aspect, an apparatus includes at least one controller programmed to direct a mechanism used in implementing any of the methods disclosed herein, the at least one controller configured to be operatively coupled to the mechanism. In some embodiments, at least two operations (e.g., at least two operations of a method) are directed/performed by the same controller. In some embodiments, at least two operations are directed/performed by different controllers.
In another aspect, an apparatus includes at least one controller configured (e.g., programmed) to implement (e.g., realize) any of the methods disclosed herein. The at least one controller may implement any of the methods disclosed herein. In some embodiments, at least two operations (e.g., at least two operations of a method) are directed/performed by the same controller. In some embodiments, at least two operations are directed/performed by different controllers.
In another aspect, a system comprises: at least one controller programmed to direct operation of at least one other device (or component thereof); and the device (or components thereof), wherein the at least one controller is operatively coupled to the device (or components thereof). The device (or components thereof) may comprise any device (or components thereof) disclosed herein. The at least one controller may be configured to direct any of the devices (or components thereof) disclosed herein. The at least one controller may be configured to be operatively coupled to any of the devices (or components thereof) disclosed herein. In some embodiments, at least two operations (e.g., at least two operations of a device) are directed by the same controller. In some embodiments, at least two operations are directed by different controllers.
In another aspect, a computer software product comprises a non-transitory computer-readable medium having program instructions stored therein, which when read by at least one processor (e.g., a computer) causes the at least one processor to direct the mechanism disclosed herein to perform (e.g., implement) any of the methods disclosed herein, wherein the at least one processor is configured to be operatively coupled to the mechanism. The mechanism may comprise any of the devices (or any component thereof) disclosed herein. In some embodiments, at least two operations (e.g., at least two operations of a device) are directed/performed by the same processor. In some embodiments, at least two operations are directed/performed by different processors.
In another aspect, the present disclosure provides a non-transitory computer-readable medium comprising machine-executable code that, when executed by one or more processors, performs any of the methods disclosed herein. In some embodiments, at least two operations (e.g., at least two operations of a method) are directed/performed by the same processor. In some embodiments, at least two operations are directed/performed by different processors.
In another aspect, the present disclosure provides a non-transitory computer-readable medium comprising machine-executable code that, when executed by one or more computer processors, implements booting of a controller (e.g., as disclosed herein). In some embodiments, at least two operations (e.g., of a controller) are directed/executed by the same processor. In some embodiments, at least two operations are directed/performed by different processors.
In another aspect, the present disclosure provides a computer system comprising one or more computer processors and a non-transitory computer-readable medium coupled to the one or more computer processors. The non-transitory computer-readable medium comprises machine-executable code that, when executed by one or more processors, implements any of the methods disclosed herein and/or implements the booting of the controller disclosed herein.
The contents of this summary section are provided as a simplified introduction to the present disclosure and are not intended to limit the scope of any invention disclosed herein or the scope of the appended claims.
Other aspects and advantages of the present disclosure will become apparent to those skilled in the art from the following detailed description, wherein only illustrative embodiments of the present disclosure are shown and described. As will be realized, the disclosure is capable of other and different embodiments and its several details are capable of modifications in various obvious respects, all without departing from the disclosure. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
These and other features and embodiments will be described in more detail below with reference to the drawings.
Incorporation by Reference
All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.
Drawings
The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description, which sets forth illustrative embodiments in which the principles of the invention are utilized, and to the accompanying drawings or figures (also referred to herein as "figures"), in which:
FIG. 1 schematically depicts an electrochromic device;
FIG. 2 schematically shows a cross-section of an Integrated Glass Unit (IGU);
FIG. 3 shows a schematic example of a sensor arrangement;
FIG. 4 shows a schematic diagram of a sensor arrangement and sensor data;
FIGS. 5A to 5E show graphs as a function of time;
FIG. 6 depicts a time-dependent plot of carbon dioxide concentration;
FIG. 7 shows a topographical map of measured property values;
FIG. 8 shows a schematic flow diagram;
FIG. 9 shows a schematic flow diagram;
FIG. 10 shows a schematic flow diagram;
FIG. 11 illustrates a device and its components and connectivity options;
FIG. 12 shows a schematic diagram of a sensor arrangement and sensor data;
FIG. 13 shows a schematic diagram of a sensor arrangement and sensor data;
FIG. 14 illustrates a control system and its various components;
FIG. 15 shows a schematic flow diagram;
FIG. 16 shows a schematic flow chart;
FIG. 17 schematically depicts a controller; and
FIG. 18 schematically depicts a processing system.
The drawings and components therein may not be to scale.
Detailed Description
While various embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed.
Terms such as "a," "an," and "the" are not intended to refer to only a singular entity, but include the general class of which a specific example may be used for illustration. The terminology herein is used to describe specific embodiments of the invention, but their usage does not limit the invention.
When referring to ranges, the ranges are meant to be inclusive of the endpoints unless otherwise indicated. For example, a range between a value of 1 and a value of 2 is meant to be inclusive and includes both the value of 1 and the value of 2. The inclusive range will span any value from about the value of 1 to about the value of 2. As used herein, the term "adjacent" or "adjacent to" includes "immediately adjacent," "abutting," "contacting," and "proximate."
The terms "operatively coupled" or "operatively connected" refer to a first element (e.g., a mechanism) that is coupled (e.g., connected) to a second element to allow for the intended operation of the second element and/or the first element. Coupling may include physical or non-physical coupling. The non-physical coupling may include signal inductive coupling (e.g., wireless coupling). Coupling may include physical coupling (e.g., a physical connection) or non-physical coupling (e.g., via wireless communication).
An element (e.g., a mechanism) that is "configured to" perform a function includes a structural feature that causes the element to perform the function. The structural features may include electrical features such as circuitry or circuit elements. The structural feature may comprise an actuator. The structural features may include circuitry (e.g., including electrical or optical circuitry). The electrical circuitry may include one or more wires. The optical circuitry may include at least one optical element (e.g., a beam splitter, a mirror, a lens, and/or an optical fiber). The structural features may include mechanical features. The mechanical features may include latches, springs, closures, hinges, chassis, supports, fasteners, or cantilevers, etc.
Performing the function may include utilizing the logic feature. The logic features may include programming instructions. The programming instructions may be executable by at least one processor. The programming instructions may be stored or encoded on a medium accessible by one or more processors. In addition, in the following description, the phrases "operable," "adapted," "configured," "designed," "programmed," or "capable" may be used interchangeably where appropriate.
In some embodiments, the peripheral structure includes a region defined by at least one structure. The at least one structure may include at least one wall. The peripheral structure may include and/or surround one or more sub-peripheral structures. The at least one wall may comprise metal (e.g., steel), clay, stone, plastic, glass, stucco (e.g., gypsum), polymer (e.g., polyurethane, styrene, or vinyl), asbestos, fiberglass, concrete (e.g., reinforced concrete), wood, paper, or ceramic. The at least one wall may comprise wires, bricks, blocks (e.g., cinder blocks), tiles, drywall, or framing (e.g., steel framing).
In some embodiments, the peripheral structure includes one or more openings. The one or more openings may be reversibly closable. The one or more openings may be permanently open. The base length dimension of the one or more openings may be smaller relative to the base length dimension of the walls defining the peripheral structure. The base length dimension may include a diameter, length, width, or height of the bounding circle. The surface of the one or more openings may be smaller relative to the surface of the wall defining the peripheral structure. The open surface may be a certain percentage of the total surface of the wall. For example, the open surface may be measured as about 30%, 20%, 10%, 5%, or 1% of the wall. The wall may comprise a floor, ceiling or side wall. The closable opening may be closed by at least one window or door. The peripheral structure may be at least a portion of a facility. The peripheral structure may comprise at least a portion of a building. The building may be a private building and/or a commercial building. A building may include one or more floors. The building (e.g., a floor thereof) may include at least one of: a room, a hallway, a rooftop, a basement, a balcony (e.g., an interior or exterior balcony), a stairwell, an aisle, an elevator shaft, a facade, a mid-floor, an attic, a garage, a porch (e.g., an enclosed porch), a balcony (e.g., an enclosed balcony), a cafeteria, and/or a duct. In some embodiments, the peripheral structure may be fixed and/or movable (e.g., a train, airplane, ship, vehicle, or rocket).
In some embodiments, a plurality of devices may be operably (e.g., communicatively) coupled to a control system. The devices may include a sensor, a transmitter, a transceiver, an antenna, a radar, a media display configuration, a processor, and/or a controller. The display (e.g., display matrix) may include Light Emitting Diodes (LEDs). The LEDs may comprise organic materials (e.g., organic light emitting diodes, abbreviated herein as "OLEDs"). The OLED may comprise a transparent organic light emitting diode display (abbreviated herein as "TOLED"), which is at least partially transparent. The plurality of devices may be disposed in a facility (e.g., which includes a building and/or a room). The control system may include a hierarchy of controllers. The device may include an emitter, a sensor, or a window (e.g., an IGU). The device may be any device disclosed herein. At least two of the plurality of devices may be of the same type. For example, two or more IGUs may be coupled to the control system. At least two of the plurality of devices may be of different types. For example, a sensor and an emitter may be coupled to the control system. Sometimes, the plurality of devices may include at least 20, 50, 100, 500, 1000, 2500, 5000, 7500, 10000, 50000, 100000, or 500000 devices. The plurality of devices may be any number between the aforementioned numbers (e.g., from 20 devices to 500000 devices, from 20 devices to 50 devices, from 50 devices to 500 devices, from 500 devices to 2500 devices, from 1000 devices to 5000 devices, from 5000 devices to 10000 devices, from 10000 devices to 100000 devices, or from 100000 devices to 500000 devices). For example, the number of windows in a floor may be at least 5, 10, 15, 20, 25, 30, 40, or 50. The number of windows in a floor may be any number between the aforementioned values (e.g., from 5 to 50, from 5 to 25, or from 25 to 50).
Sometimes, these devices may be located in a multi-storey building. At least a portion of the floors of the multi-storey building may have devices controlled by the control system (e.g., at least a portion of the floors of the multi-storey building may be controlled by the control system). For example, a multi-storey building may have at least 2, 8, 10, 25, 50, 80, 100, 120, 140, or 160 storeys controlled by the control system. The number of floors (e.g., devices therein) controlled by the control system may be any number between the aforementioned numbers (e.g., from 2 to 50, from 25 to 100, or from 80 to 160). A floor may have an area of at least about 150 m², 250 m², 500 m², 1000 m², 1500 m², or 2000 square meters (m²). The floor may have an area between any of the aforementioned floor area values (e.g., from about 150 m² to about 2000 m², from about 150 m² to about 500 m², from about 250 m² to about 1000 m², or from about 1000 m² to about 2000 m²). The facility may comprise a commercial or residential building. A residential facility may include a multi-family or a single-family building.
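The hierarchy of controllers recited above can be modeled as a tree in which a command issued at a higher-level controller is relayed down to lower-level (e.g., floor-level and then device-level) controllers. The sketch below is a minimal illustration under that assumption; the Controller class and its methods are invented for this example and are not part of the disclosure.

```python
# Illustrative controller-hierarchy sketch: a master controller relays a
# command (e.g., a tint command for windows) to every controller beneath it.
class Controller:
    def __init__(self, name: str):
        self.name = name
        self.children = []   # lower-level controllers in the hierarchy
        self.received = []   # commands this controller has been directed with

    def add(self, child: "Controller") -> "Controller":
        """Attach a lower-level controller and return it for chaining."""
        self.children.append(child)
        return child

    def direct(self, command: str) -> None:
        """Record the command, then relay it down the hierarchy."""
        self.received.append(command)
        for child in self.children:
            child.direct(command)

    def leaf_count(self) -> int:
        """Number of leaf (device-level) controllers under this node."""
        if not self.children:
            return 1
        return sum(c.leaf_count() for c in self.children)
```

For instance, a master controller over two floor controllers, each with three window controllers, has six device-level leaves, and a single command issued at the master reaches every window controller.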
In some embodiments, the device may include a display construct (e.g., a TOLED display construct). The display may have 2000, 3000, 4000, 5000, 6000, 7000, or 8000 pixels in its fundamental length scale. The display may have any number of pixels between the aforementioned numbers of pixels in its fundamental length scale (e.g., from about 2000 pixels to about 4000 pixels, from about 4000 pixels to about 8000 pixels, or from about 2000 pixels to about 8000 pixels). The fundamental length scale may include a diameter of a bounding circle, a length, a width, or a height. The fundamental length scale may be abbreviated herein as "FLS." The display construct may include a high-resolution display. For example, the display construct may have a resolution of at least about 550, 576, 680, 720, 768, 1024, 1080, 1280, 1920, 2160, 3840, 4096, 4320, or 7680 pixels by at least about 550, 576, 680, 720, 768, 1024, 1080, 1280, 1920, 2160, 3840, 4096, 4320, or 7680 pixels (at 30 Hz or at 60 Hz). The first number of pixels may specify the height of the display and the second number of pixels may specify the length of the display. For example, the display may be a high-resolution display having a resolution of 1920 × 1080, 3840 × 2160, 4096 × 2160, or 7680 × 4320. The display may be a standard-definition display, an enhanced-definition display, a high-definition display, or an ultra-high-definition display. The display may be rectangular. The image projected by the display matrix may be refreshed at a frequency (e.g., a refresh rate) of at least about 20 Hz, 30 Hz, 60 Hz, 70 Hz, 75 Hz, 80 Hz, 100 Hz, or 120 hertz (Hz). The FLS of the display construct may be at least 20", 25", 30", 35", 40", 45", 50", 55", 60", 65", 80", or 90 inches ("). The FLS of the display construct may be any value between the aforementioned values (e.g., from about 20" to about 55", from about 55" to about 100", or from about 20" to about 100"). The display construct may be operatively (e.g., physically) coupled to the tintable window.
The display construct may operate in tandem with the tintable window. Examples of display constructs, tintable windows, their operation, control, and any related software may be found in U.S. Provisional Patent Application Serial No. 63/085,254, entitled "TANDEM VISION WINDOW AND MEDIA DISPLAY," filed September 30, 2020, which is incorporated herein by reference in its entirety.
In some embodiments, the peripheral structure surrounds the atmosphere. The atmosphere may include one or more gases. The gas may include an inert gas (e.g., argon or nitrogen) and/or a non-inert gas (e.g., oxygen or carbon dioxide). The peripheral structure atmosphere may be similar to the atmosphere outside the peripheral structure (e.g., ambient atmosphere) in at least one external atmospheric feature, including: temperature, relative gas content, gas type (e.g., humidity and/or oxygen content), debris (e.g., dust and/or pollen), and/or gas velocity. The peripheral structure atmosphere may differ from the atmosphere outside the peripheral structure in at least one external atmospheric characteristic, the at least one external atmospheric characteristic comprising: temperature, relative gas content, gas type (e.g., humidity and/or oxygen content), debris (e.g., dust and/or pollen), and/or gas velocity. For example, the peripheral structural atmosphere may be less humid (e.g., drier) than the external (e.g., ambient) atmosphere. For example, the peripheral structure atmosphere may contain the same (e.g., or substantially similar) oxygen-nitrogen ratio as the atmosphere outside the peripheral structure. The velocity of the gas in the peripheral structure may be (e.g., substantially) similar throughout the peripheral structure. The velocity of the gas in the peripheral structure may be different in different portions of the peripheral structure (e.g., by flowing the gas through a vent coupled to the peripheral structure).
Certain disclosed embodiments provide a network infrastructure in a peripheral structure (e.g., a facility such as a building). The network infrastructure may be used for various purposes, such as for providing communication and/or power services. The communication services may include high-bandwidth (e.g., wireless and/or wired) communication services. The communication services may be available to occupants of the facility and/or to users outside the facility (e.g., building). The network infrastructure may operate in conjunction with, or as a partial replacement for, the infrastructure of one or more cellular carriers. The network infrastructure may be provided in a facility comprising electrically switchable windows. Examples of components of the network infrastructure include a high-speed backhaul. The network infrastructure may include at least one cable, switching device, physical antenna, transceiver, sensor, transmitter, receiver, radio, processor, and/or controller (which may include a processor). The network infrastructure may be operatively coupled to and/or include a wireless network. The network infrastructure may include wires. One or more sensors may be deployed (e.g., installed) in the environment as part of installing the network and/or after the network is installed. The network may be configured to transmit multiple communication types and power over the same cable. The communication types may include data. The communication types may include cellular communication (e.g., compliant with at least third-generation (3G), fourth-generation (4G), or fifth-generation (5G) cellular communication). The communication types may include BACnet (building automation and control network) protocol communication. The communication types may include media streaming. The media streaming may support HDMI, Digital Visual Interface (DVI), DisplayPort (DP), and/or Serial Digital Interface (SDI). The streaming may be a compressed or uncompressed digital media stream (e.g., Moving Picture Experts Group (MPEG) or Advanced Video Coding (AVC, also known as H.264)).
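The idea that one cable carries several communication types (data, cellular, BACnet, media) alongside power implies that traffic must be separable by type at the receiving end. A minimal demultiplexing sketch under that assumption follows; the frame layout (a type tag paired with a payload) and the tag names are invented for illustration and are not a protocol defined in the text.

```python
# Illustrative demultiplexer: split a stream of tagged frames into per-type
# queues, so each communication type can be handled by its own subsystem.
from collections import defaultdict

FRAME_TYPES = {"data", "cellular", "bacnet", "media"}  # assumed tag values

def demux(frames):
    """Split (frame_type, payload) tuples into per-type queues.

    Raises ValueError for a tag outside the assumed set, standing in for
    a malformed frame on the shared cable.
    """
    queues = defaultdict(list)
    for frame_type, payload in frames:
        if frame_type not in FRAME_TYPES:
            raise ValueError(f"unknown frame type: {frame_type}")
        queues[frame_type].append(payload)
    return dict(queues)
```

For example, interleaved data, BACnet, and media frames come out as three ordered queues, one per communication type.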
In various embodiments, the network infrastructure supports a control system for one or more windows, such as tintable (e.g., electrochromic) windows. The control system may include one or more controllers operatively coupled (e.g., directly or indirectly) to the one or more windows. Although the disclosed embodiments describe tintable windows (also referred to herein as "optically switchable windows" or "smart windows"), such as electrochromic windows, the concepts disclosed herein may be applied to other types of switchable optical devices, including liquid crystal devices, electrochromic devices, Suspended Particle Devices (SPDs), NanoChromics displays (NCDs), or organic electroluminescent displays (OELDs). The display element may be attached to a portion of a transparent body, such as a window. The tintable window may be provided in a (non-transitory) facility, such as a building, and/or may be provided in a transitory facility (e.g., a vehicle), such as a car, RV, bus, train, airplane, helicopter, ship, or boat.
In some embodiments, the tintable window exhibits a (e.g., controllable and/or reversible) change in at least one optical property of the window, e.g., when a stimulus is applied. The change may be a continuous change. The change may be to discrete hue gradations (e.g., to at least about 2, 4, 8, 16, or 32 hue gradations). The optical property may include hue or transmittance. The hue may comprise a color. The transmittance may be of one or more wavelengths. The wavelengths may include ultraviolet, visible, or infrared wavelengths. The stimulus may include an optical, electrical, and/or magnetic stimulus. For example, the stimulus may include an applied voltage and/or current. One or more tintable windows may be used to control lighting and/or glare conditions, for example by regulating the transmission of solar energy propagating through the one or more tintable windows. One or more tintable windows may be used to control the temperature within a building, for example by regulating the transmission of solar energy propagating through the one or more tintable windows. Controlling the solar energy can control the thermal load applied inside a facility (e.g., a building). The control may be manual and/or automatic. The control may be used to maintain one or more requested (e.g., environmental) conditions, such as human comfort. The control may include reducing energy consumption of a heating system, a ventilation system, an air-conditioning system, and/or a lighting system. At least two of heating, ventilation, and air conditioning may be implemented by separate systems. At least two of heating, ventilation, and air conditioning may be implemented by one system. Heating, ventilation, and air conditioning may be implemented by a single system (abbreviated herein as "HVAC"). In some cases, the tintable window may be responsive to (e.g., and communicatively coupled to) one or more environmental sensors and/or user controls.
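The discrete hue gradations described above can be pictured as a small lookup from tint level to visible-light transmittance, with a controller choosing the level closest to a requested transmittance. The sketch below is a hedged illustration: the four-level table and its transmittance values are invented for this example, since real electrochromic windows have device-specific tint curves.

```python
# Illustrative only: a 4-level tint table (level 0 = clear, level 3 = darkest).
# The transmittance values are assumptions, not measured device data.
TINT_TRANSMITTANCE = {0: 0.60, 1: 0.40, 2: 0.20, 3: 0.01}

def choose_tint(target_transmittance: float) -> int:
    """Pick the discrete tint level whose transmittance is closest to target.

    This stands in for the control step that maps a requested lighting/glare
    condition onto one of the window's discrete hue gradations.
    """
    return min(TINT_TRANSMITTANCE,
               key=lambda lvl: abs(TINT_TRANSMITTANCE[lvl] - target_transmittance))
```

For example, a requested transmittance of 0.35 selects level 1 (0.40 is the nearest available value), while a glare-reduction request of 0.05 selects the darkest level.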
The tintable window may comprise (e.g., may be) an electrochromic window. The window may be located at the boundary between the interior and the exterior of a structure (e.g., a facility such as a building). However, this need not be the case. The tintable window may operate using a liquid crystal device, a suspended particle device, a micro-electro-mechanical system (MEMS) device such as a micro-shutter, or any technique now known or later developed that is configured to control light transmission through the window. A window with a MEMS device for tinting is described in U.S. patent application serial No. 14/443,353, filed May 15, 2015, issued July 23, 2019 as U.S. patent No. 10,359,681, and entitled "MULTI-PANE WINDOWS INCLUDING ELECTROCHROMIC DEVICES AND ELECTROMECHANICAL SYSTEMS DEVICES," which is incorporated herein by reference in its entirety. In some cases, one or more tintable windows may be located within the interior of a building, for example between a conference room and a hallway. In some cases, one or more tintable windows may be used in automobiles, trains, airplanes, and other vehicles, for example, in place of passive and/or non-tinted windows.
In some embodiments, the tintable window comprises an electrochromic device (referred to herein as an "EC device" (abbreviated herein as ECD) or "EC"). The EC device may include at least one coating having at least one layer. The at least one layer may comprise an electrochromic material. In some embodiments, the electrochromic material exhibits a change from one optical state to another, e.g., when an electrical potential is applied across the EC device. The transition of the electrochromic layer from one optical state to another may result from, for example, reversible, semi-reversible, or irreversible ion insertion into the electrochromic material (e.g., by intercalation) and corresponding charge-balancing electron injection. For example, the transition may be caused by reversible ion insertion into the electrochromic material (e.g., by intercalation) and corresponding charge-balancing electron injection. The transition may be reversible over the expected lifetime of the ECD. Semi-reversible refers to a measurable (e.g., significant) degradation in the reversibility of the tint of the window during one or more tinting cycles. In some cases, a portion of the ions responsible for the optical transition is irreversibly bound in the electrochromic material (e.g., such that the window cannot fully revert from its induced (altered) tint state to its original tint state). In many EC devices, at least some (e.g., all) of the irreversibly bound ions can be used to compensate for "blind charge" in the material (e.g., the ECD).
In some embodiments, suitable ions include cations. The cation may comprise lithium ions (Li+) and/or hydrogen ions (H+) (i.e., protons). In some implementations, other ions may be suitable. The cations may be intercalated into a (e.g., metal) oxide. A change in the state of intercalation of ions (e.g., cations) into the oxide can induce a visible change in the hue (e.g., color) of the oxide. For example, the oxide may transition from a colorless state to a colored state. For example, lithium ion intercalation into tungsten oxide (WO3-y, 0 < y ≤ ~0.3) can change the tungsten oxide from a transparent state to a colored (e.g., blue) state. An EC device coating as described herein is located within a visible portion of the tintable window, such that the coloration of the EC device coating can be used to control the optical state of the tintable window.
Fig. 1 shows an example of a schematic cross-section of an electrochromic device 100 according to some embodiments. The EC device coating is attached to a substrate 102 and includes a transparent conductive layer (TCL) 104, an electrochromic layer (EC) 106 (sometimes also referred to as a cathodically coloring layer), an ion conducting layer or region (IC) 108, a counter electrode layer (CE) 110 (sometimes also referred to as an anodically coloring layer), and a second TCL 114.
Elements 104, 106, 108, 110, and 114 are collectively referred to as an electrochromic stack 120. The voltage source 116, which is operable to apply an electrical potential across the electrochromic stack 120, effects a transition of the electrochromic coating from, for example, a clear state to a colored state. In other embodiments, the order of the layers is reversed relative to the substrate. That is, the layers are in the following order: substrate, TCL, counter electrode layer, ion conducting layer, electrochromic material layer, TCL.
In various implementations, the ion conductor region (e.g., 108) may be formed from a portion of the EC layer (e.g., 106) and/or from a portion of the CE layer (e.g., 110). In such embodiments, the electrochromic stack (e.g., 120) may be deposited to include a cathodically coloring electrochromic material (the EC layer) in direct physical contact with an anodically coloring counter electrode material (the CE layer). The ion conductor region (sometimes referred to as an interface region, or as an ion conducting, substantially electrically insulating layer or region) may be formed where the EC and CE layers meet, for example, by heating and/or other processing steps. An example of an electrochromic device (e.g., including those manufactured without depositing a distinct ion conductor material) can be seen in U.S. patent application serial No. 13/462,725, entitled "ELECTROCHROMIC DEVICES," filed May 2, 2012, which is incorporated herein by reference in its entirety. In some embodiments, the EC device coating may comprise one or more additional layers, such as one or more passive layers. The passive layers can be used to improve certain optical properties, to provide moisture resistance, and/or to provide scratch resistance. These and/or other passive layers may be used to hermetically seal the EC stack 120. Various layers, including the transparent conductive layers (e.g., 104 and 114), may be treated with antireflective and/or protective layers (e.g., oxide and/or nitride layers).
In certain embodiments, the electrochromic device is configured to (e.g., substantially) reversibly cycle between a clear state and a colored state. The cycling may be reversible over the expected lifetime of the ECD. The expected lifetime may be at least about 5, 10, 15, 25, 50, 75, or 100 years. The expected lifetime may be any value between the above values (e.g., about 5 years to about 100 years, about 5 years to about 50 years, or about 50 years to about 100 years). When the window is in a first tint (e.g., clear) state, an electrical potential may be applied to the electrochromic stack (e.g., 120) such that available ions in the stack that can place the electrochromic material (e.g., 106) in a colored state reside primarily in the counter electrode (e.g., 110). When the potential applied to the electrochromic stack is reversed, the ions can be transported across the ion conducting layer (e.g., 108) to the electrochromic material, bringing the material into a second tint state (e.g., a colored state).
It should be understood that reference to a transition between clear and colored states is non-limiting and represents only one of many examples of electrochromic transitions that may be implemented. Unless otherwise specified herein, whenever reference is made to a clear-to-colored transition, the corresponding device or process encompasses other optical state transitions, such as reflective-to-non-reflective and/or transparent-to-opaque transitions. In some embodiments, the terms "clear" and "bleached" refer to an optically neutral state, e.g., uncolored, transparent, and/or translucent. In some embodiments, the "color" or "tint" of an electrochromic transition is not limited to any particular wavelength or range of wavelengths. Selection of appropriate electrochromic and counter electrode materials can control the associated optical transition (e.g., from a colored state to an uncolored state).
In certain embodiments, at least a portion (e.g., all) of the materials comprising the electrochromic stack are inorganic, solid (i.e., in the solid state), or both inorganic and solid. Because many organic materials tend to degrade over time, especially when exposed to heat and UV light as tinted architectural windows are, inorganic materials offer the advantage of a reliable electrochromic stack that can function over an extended period of time. In some embodiments, solid materials provide the advantage of minimizing contamination and leakage problems, which are sometimes associated with liquid materials. One or more layers in the stack may contain some (e.g., measurable) amount of organic material. The ECD or any portion thereof (e.g., one or more layers) may contain little or no measurable organic matter. The ECD or any portion thereof (e.g., one or more layers) may contain one or more liquids, which may be present in small amounts, e.g., up to about 100 ppm, 10 ppm, or 1 ppm of the ECD. Solid materials may be deposited (or otherwise formed) using one or more processes employing liquid components, such as certain processes employing sol-gel, physical vapor deposition, and/or chemical vapor deposition.
Fig. 2 illustrates an example of a cross-sectional view of a tintable window embodied as an insulated glass unit ("IGU") 200, according to some embodiments. The terms "IGU," "tintable window," and "optically switchable window" are used interchangeably herein. It may be desirable to use the IGU as a basic construction for holding electrochromic panes (also referred to herein as "tiles") when provided for installation in a building. The IGU sheet may be a single substrate or a multi-substrate configuration. The sheet may comprise a laminate of, for example, two substrates. IGUs (e.g., having a two-pane or three-pane configuration) may provide a number of advantages over single-pane configurations. For example, a multi-pane configuration may provide enhanced thermal insulation, noise insulation, environmental protection, and/or durability when compared to a single-pane configuration. The multi-pane configuration may provide increased protection for the ECD. For example, an electrochromic film (e.g., and associated layers and conductive interconnects) may be formed on an interior surface of a multi-window IGU and protected by an inert gas filled in the interior volume (e.g., 208) of the IGU. The inert gas fill may provide at least some of the (thermal) insulating function of the IGU. The electrochromic IGU may have thermal barrier capabilities, for example, by virtue of a tintable coating that absorbs (and/or reflects) heat and light.
In some embodiments, an "IGU" comprises two (or more) substantially transparent substrates. For example, an IGU may include two panes of glass. At least one substrate of the IGU may include an electrochromic device disposed thereon. One or more panes of the IGU can have a separator disposed therebetween. The IGU may be a hermetically sealed construction, e.g., having an interior region isolated from the surrounding environment. The "window assembly" may include an IGU. A "window assembly" may include a (e.g., free-standing) laminate. The "window assembly" may include one or more electrical leads, for example, for connecting the IGU and/or the laminate. Electrical leads may operatively couple (e.g., connect) one or more electrochromic devices to a voltage source, switch, etc., and may include a frame that supports an IGU or laminate. The window assembly may include a window controller, and/or a component of a window controller (e.g., a dock).
Fig. 2 illustrates an exemplary embodiment of an IGU 200 including a first pane 204 having a first surface S1 and a second surface S2. In some embodiments, the first surface S1 of the first pane 204 faces an external environment, such as the outdoors. The IGU 200 also includes a second pane 206 having a first surface S3 and a second surface S4. In some embodiments, the second surface (e.g., S4) of the second pane (e.g., 206) faces an internal environment, such as an internal environment of a home, building, vehicle, or compartment thereof (e.g., a peripheral structure therein, such as a room).
In some embodiments, the first and second panes (e.g., 204 and 206) are transparent or translucent, e.g., at least for light in the visible spectrum. For example, each of the panes (e.g., 204 and 206) can be formed from a glass material. The glass material may include architectural glass and/or blast resistant glass. The glass may comprise silicon oxide (SOx). The glass may comprise soda lime glass or float glass. The glass may contain at least about 75% silicon dioxide (SiO2). The glass may contain oxides, such as Na2O or CaO. The glass may contain alkali or alkaline earth metal oxides. The glass may contain one or more additives. The first pane and/or the second pane may comprise any material having suitable optical, electrical, thermal, and/or mechanical properties. Other materials (e.g., substrates) that may be included in the first pane and/or the second pane are plastic, semi-plastic, and/or thermoplastic materials, such as poly(methyl methacrylate), polystyrene, polycarbonate, allyl diglycol carbonate, SAN (styrene acrylonitrile copolymer), poly(4-methyl-1-pentene), polyesters, and/or polyamides. The first pane and/or the second pane may contain a specular material (e.g., silver). In some embodiments, the first pane and/or the second pane may be strengthened. Strengthening may include tempering, heating, and/or chemical strengthening.
In some embodiments, the peripheral structure includes one or more sensors. The sensors can help control the environment of the peripheral structure such that the occupants of the peripheral structure can have an environment that is more comfortable, pleasant, beautiful, healthy, productive (e.g., in terms of occupant performance), easier to live (e.g., work) in, or any combination thereof. The sensor may be configured as a low resolution sensor or a high resolution sensor. The sensor may provide an on/off indication of the occurrence and/or presence of a particular environmental event (e.g., a pixel sensor). In some embodiments, the accuracy and/or resolution of the sensor may be improved via artificial intelligence analysis of its measurements. Examples of artificial intelligence techniques that may be used include: reactive, limited memory, theory of mind, and/or self-aware techniques known to those skilled in the art. The sensors may be configured to process, measure, analyze, detect, and/or react to one or more of: data, temperature, humidity, sound, force, pressure, electromagnetic waves, location, distance, motion, flow, acceleration, velocity, vibration, dust, light, glare, color, gas, and/or other aspects (e.g., characteristics) of the environment (e.g., of the peripheral structure). The gas may include volatile organic compounds (VOCs). The gas may include carbon monoxide, carbon dioxide, water vapor (e.g., moisture), oxygen, radon, and/or hydrogen sulfide. The one or more sensors may be calibrated in a factory setting. The sensors may be optimized to perform accurate measurements of one or more environmental features present in the factory setting. In some cases, factory calibrated sensors may be less than optimal for operation in a target environment. For example, the factory setting may present an environment that is different from the target environment. The target environment may be the environment in which the sensors are deployed.
The target environment may be an environment in which the sensor is expected and/or intended to operate. The target environment may be different from the factory environment. The factory environment corresponds to the location where the sensors are assembled and/or built. The target environment may include a factory where the sensors are not assembled and/or built. In some cases, the factory setting may differ from the target environment to the extent that sensor readings captured in the target environment are erroneous (e.g., to a measurable extent). In this context, "erroneous" may refer to a sensor reading that deviates from a specified accuracy (e.g., specified by the manufacturer of the sensor). In some cases, a factory calibrated sensor, when operating in a target environment, may provide readings that do not meet the (e.g., manufacturer-specified) accuracy specification.
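The notion of an "erroneous" reading above, i.e., one that deviates from a specified accuracy relative to a trusted reference, can be sketched as a simple check. The reference value and accuracy specification below are hypothetical, not values from this disclosure.

```python
# Illustrative sketch: deciding whether a sensor reading is "erroneous"
# in the sense used above, i.e., deviating from a manufacturer-specified
# accuracy relative to a trusted reference value.

def reading_is_erroneous(reading: float, reference: float,
                         spec_accuracy: float) -> bool:
    """True if the reading deviates from the reference by more than the
    specified accuracy (in absolute units of the measured quantity)."""
    return abs(reading - reference) > spec_accuracy

# A temperature sensor hypothetically specified to +/-0.5 deg C:
print(reading_is_erroneous(22.8, 22.0, 0.5))  # True: off by 0.8
print(reading_is_erroneous(22.3, 22.0, 0.5))  # False: within spec
```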
In certain embodiments, one or more disadvantages in sensor operation may be corrected and/or mitigated, at least in part, by allowing the sensor to self-calibrate in its target environment (e.g., where the sensor is installed). In some cases, the sensor may be calibrated and/or recalibrated after installation in a target environment. In some cases, the sensors may be calibrated and/or recalibrated after a certain period of operation in the target environment. The target environment may be a location where the sensor is mounted in a peripheral structure. Sensors calibrated and/or recalibrated after installation in a target environment may provide measurements with increased (e.g., measurable) accuracy compared to sensors calibrated before installation. In certain embodiments, one or more previously installed sensors in the peripheral structure provide readings for calibrating and/or recalibrating newly installed sensors in the peripheral structure.
In some embodiments, the target environment corresponding to a first peripheral structure is different from the target environment corresponding to a second peripheral structure. For example, a target environment corresponding to a peripheral structure of a cafeteria or auditorium may present sensor readings that differ from those of a target environment corresponding to a conference room. The sensor may take the target environment (e.g., one or more characteristics thereof) into account when performing sensor readings and/or outputting sensor data. For example, during lunch hours, a carbon dioxide sensor installed in an occupied cafeteria may provide a higher reading than a sensor installed in an empty conference room. In another example, an ambient noise sensor located in an occupied cafeteria during lunch may provide a higher reading than an ambient noise sensor located in a library.
In some implementations, the sensor provides an output signal indicative of an erroneous measurement (e.g., occasionally). The sensors may be operatively coupled to at least one controller. The controller may obtain an erroneous sensor reading from one sensor. The controller may obtain readings of the same type from one or more other (e.g., nearby) sensors at a similar time (e.g., or simultaneously). The one or more other sensors may be disposed in the same environment as the one sensor. The controller may evaluate the erroneous sensor reading in combination with one or more readings of the same type given by one or more other sensors of the same type to identify the erroneous sensor reading as an outlier. For example, the controller may evaluate an erroneous temperature sensor reading together with one or more temperature readings given by one or more other temperature sensors. The controller can determine that the sensor reading is erroneous in response to considering (e.g., including evaluating and/or comparing) the sensor reading with one or more readings from other sensors in the same environment (e.g., in the same peripheral structure). The controller may direct the one sensor that provided the erroneous reading to undergo recalibration (e.g., by undergoing a recalibration procedure). For example, the controller may transmit one or more values and/or parameters to the sensor that provided the erroneous reading. The sensor that provided the erroneous reading may utilize the transmitted values and/or parameters to adjust its subsequent sensor readings. For example, the sensor may utilize the transmitted values and/or parameters to adjust its baseline for subsequent sensor readings. The baseline may be a value, a set of values, or a function.
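A minimal sketch of the outlier-and-recalibrate flow described above, using a robust (median-based) outlier test and a mean-of-peers baseline offset. The statistic (modified z-score), the threshold, and the sensor readings are illustrative choices, not values from this disclosure.

```python
# Hedged sketch: a controller compares one sensor's reading with same-type
# readings from neighboring sensors, flags an outlier, and computes a
# baseline offset for the flagged sensor to apply to subsequent readings.
from statistics import mean, median

def find_outliers(readings, threshold=3.5):
    """Flag sensors whose modified z-score (median/MAD based) exceeds
    the threshold; robust to a single bad reading among few sensors."""
    values = list(readings.values())
    med = median(values)
    mad = median(abs(v - med) for v in values)  # median absolute deviation
    if mad == 0:
        return []
    return [sid for sid, v in readings.items()
            if 0.6745 * abs(v - med) / mad > threshold]

def baseline_offset(readings, outlier_id):
    """Offset the flagged sensor should add to align with its peers."""
    peers = [v for sid, v in readings.items() if sid != outlier_id]
    return mean(peers) - readings[outlier_id]

readings = {"t1": 21.9, "t2": 22.1, "t3": 22.0, "t4": 27.5}
print(find_outliers(readings))          # ['t4']
print(baseline_offset(readings, "t4"))  # approximately -5.5
```

The median/MAD statistic is used instead of mean/standard deviation because, with only a handful of peers, a single bad reading inflates the standard deviation enough to mask itself.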
In some embodiments, the sensor has an operational lifetime. The operational lifetime of the sensor may be related to one or more readings taken by the sensor. Sensor readings from certain sensors may be more valuable and/or variable during certain time periods and less valuable and/or variable during other time periods. For example, motion sensor readings vary more during the day than at night. The operational lifetime of the sensor can be extended. An extension of operational lifetime may be achieved by allowing the sensor to reduce sampling of environmental parameters (e.g., those with lower beneficial value) over certain time periods. Some sensors may modify (e.g., increase or decrease) the frequency at which sensor readings are sampled. The timing and/or frequency of sensor operation may depend on the type of sensor, the location in the (e.g., target) environment, and/or the time of day. Some sensor types may require constant and/or more frequent operation throughout the day (e.g., CO2, volatile organic compound (VOC), occupancy, and/or lighting sensors). The volatile organic compounds may be animal and/or human derived. VOCs may include compounds related to human-generated odors. Some sensors may need to operate only infrequently during at least a portion of the night. Some sensor types may require infrequent operation during at least a portion of the day (e.g., temperature and/or pressure sensors). The sensors may be assigned a timing and/or frequency of operation. The assignment may be controlled (e.g., changed) manually and/or automatically (e.g., using at least one controller operatively coupled to the sensor). Operatively coupled may include communicatively coupled, electrically coupled, optically coupled, or any combination thereof. The modification of the timing and/or frequency of taking sensor readings may be in response to detection of an event by the same type of sensor or by different types of sensors.
The modification of the timing and/or frequency of taking sensor readings may utilize sensor data analysis. Sensor data analysis may utilize artificial intelligence (abbreviated herein as "AI"). The control may be fully automatic or partially automatic. Partially automatic control may allow a user to (i) override a direction of the controller, and/or (ii) indicate any preferences (e.g., of the user).
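The type- and time-dependent assignment of sampling timing/frequency described above can be sketched as a simple lookup policy. The sensor types, day/night boundary, and interval values below are hypothetical examples, not values from this disclosure.

```python
# Illustrative sketch of a type/time-of-day sampling policy: sensors whose
# readings are valuable around the clock (e.g., CO2) sample frequently at
# all hours, while others back off at night to extend operational lifetime.

# Sampling interval in minutes, keyed by (sensor_type, period_of_day).
SAMPLING_POLICY = {
    ("co2", "day"): 1, ("co2", "night"): 1,            # frequent all day
    ("occupancy", "day"): 1, ("occupancy", "night"): 15,
    ("temperature", "day"): 10, ("temperature", "night"): 30,
}

def sampling_interval(sensor_type: str, hour: int) -> int:
    """Return the sampling interval (minutes) for a sensor type at a
    given hour; 06:00-22:00 counts as 'day' in this sketch."""
    period = "day" if 6 <= hour < 22 else "night"
    return SAMPLING_POLICY[(sensor_type, period)]

print(sampling_interval("occupancy", 14))  # 1  (daytime: sample often)
print(sampling_interval("occupancy", 3))   # 15 (night: conserve lifetime)
```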
In some embodiments, processing the sensor data comprises performing sensor data analysis. The sensor data analysis may include at least one rational decision process and/or learning. The sensor data analysis may be used to adjust the environment, for example, by adjusting one or more components that affect the environment of the peripheral structure. The data analysis may be performed by a machine-based system (e.g., circuitry). The circuitry may be a processor. Sensor data analysis may utilize artificial intelligence. The sensor data analysis may rely on one or more models (e.g., mathematical models). In some embodiments, the sensor data analysis includes linear regression, least squares fitting, Gaussian process regression, kernel regression, nonparametric multiplicative regression (NPMR), regression trees, local regression, semi-parametric regression, isotonic regression, multivariate adaptive regression splines (MARS), logistic regression, robust regression, polynomial regression, stepwise regression, ridge regression, lasso regression, elastic net regression, principal component analysis (PCA), singular value decomposition, fuzzy measure theory, Borel measures, Haar measures, risk-neutral measures, Lebesgue measures, group method of data handling (GMDH), Bayesian classifiers, k-nearest neighbor algorithms (k-NN), support vector machines (SVM), neural networks, classification and regression trees (CART), random forest methods, gradient boosting, or generalized linear model (GLM) techniques.
Fig. 3 shows an example of a diagram 300 of a sensor arrangement distributed among peripheral structures. In the example shown in fig. 3, the controller 305 is communicatively linked 308 with sensors located in peripheral structure A (sensors 310A, 310B, 310C, … 310Z), peripheral structure B (sensors 315A, 315B, 315C, … 315Z), peripheral structure C (sensors 320A, 320B, 320C, … 320Z), and peripheral structure Z (sensors 385A, 385B, 385C, … 385Z). The communication link may include wired communication and/or wireless communication. In some embodiments, the sensor assembly comprises at least two sensors of different types. In some embodiments, the sensor assembly comprises at least two sensors of the same type. In the example shown in fig. 3, sensors 310A, 310B, 310C, … 310Z of peripheral structure A represent an aggregate. A sensor ensemble may refer to a collection of various different sensors. In some embodiments, at least two of the sensors in the aggregate cooperate to determine environmental parameters of, for example, the peripheral structure in which they are disposed. For example, the sensor assembly may include a carbon dioxide sensor, a carbon monoxide sensor, a volatile organic chemical sensor, an environmental noise sensor, a visible light sensor, a temperature sensor, and/or a humidity sensor. The sensor ensemble may include other types of sensors, and claimed subject matter is not limited in this respect. The peripheral structure may include one or more sensors that are not part of a sensor ensemble. The peripheral structure may include a plurality of aggregates. At least two of the plurality of ensembles may differ in at least one of their sensors. At least two of the plurality of ensembles may have at least one sensor that is similar (e.g., of the same type). For example, an aggregate may have two motion sensors and one temperature sensor. For example, an aggregate may have a carbon dioxide sensor and an IR sensor. The aggregate may include one or more devices that are not sensors.
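As a minimal sketch of one technique from the analysis list above (ordinary least squares linear regression), the following fits one sensor quantity against another. The occupancy/CO2 values are invented for illustration and are not data from this disclosure.

```python
# Illustrative sketch: ordinary least squares fit relating one sensor
# quantity (occupancy count) to another (CO2 in ppm). Data are invented.

def fit_linear(xs, ys):
    """Return (slope, intercept) minimizing squared error."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical learning-set pairs: occupancy count vs. measured CO2 (ppm).
occupancy = [0.0, 2.0, 4.0, 6.0, 8.0]
co2_ppm = [400.0, 450.0, 500.0, 550.0, 600.0]
slope, intercept = fit_linear(occupancy, co2_ppm)
print(slope, intercept)  # 25.0 400.0
```

A fitted model of this kind can then serve as the expected relationship between two sensor types, against which live readings are compared.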
The one or more devices other than sensors may include an acoustic emitter (e.g., a buzzer) and/or an electromagnetic radiation emitter (e.g., a light emitting diode). In some embodiments, a single sensor (e.g., one not in an aggregate) may be disposed adjacent to (e.g., in close proximity to, or in contact with) another device that is not a sensor.
In some embodiments, the sensor assembly is disposed in a housing. The housing may include one or more circuit boards. The housing may include a processor and/or a transmitter. The housing may include temperature exchange components (e.g., a heat sink, a cooler, and/or a gas flow). The temperature exchange component may be active or passive. The processor may include a GPU or CPU processing unit. The circuitry may be programmable. The circuit board may be arranged in a manner that allows for temperature exchange (e.g., through another medium). The other medium may comprise a thermally conductive metal (e.g., an elemental metal or a metal alloy), for example, copper and/or aluminum. The housing may comprise a polymer or resin. The housing may include a plurality of sensors, emitters, temperature regulators, and/or processors. The housing may include any of the devices disclosed herein. The housing (e.g., container or enclosure) may comprise a transparent material or a non-transparent material. The housing may include a body and a cover. The housing may include one or more apertures. The housing may be operatively coupled to a power and/or communication network. The communication may be wired and/or wireless. An example of a sensor assembly, housing, control, and coupling to a network is seen in U.S. provisional patent application serial No. 63/079,851, entitled "DEVICE SENSORS AND COMMUNITY MANAGEMENT OF DEVICES," filed September 17, 2020, which is incorporated herein by reference in its entirety.
The sensors of a sensor aggregate may cooperate with one another. One type of sensor may have a correlation with at least one other type of sensor. Conditions in the peripheral structure may affect one or more of the different sensors. The sensor readings of the one or more different sensors may be correlated with, and/or influenced by, the condition. The correlation may be predetermined. The correlation may be determined over a period of time (e.g., using a learning process). The time period may be predetermined. The time period may have a cutoff value. The cutoff value may take into account, for example, a threshold value (e.g., a percentage value) of error between predicted sensor data and measured sensor data under similar circumstances. The time may be continuous. The correlation may be derived from a learning set (also referred to herein as a "training set"). The learning set may include, and/or may be derived from, real-time observations in the peripheral structure. The observation may include data collection (e.g., from sensors). The learning set may include sensor data from similar peripheral structures. The learning set may include a third-party data set (e.g., sensor data). The learning set may be derived from, for example, a simulation of one or more environmental conditions affecting the peripheral structure. The learning set may comprise detected (e.g., historical) signal data with one or more types of noise added. The correlation may utilize historical data, third-party data, and/or real-time (e.g., sensor) data. A correlation between two sensor types may be assigned a value. The value may be a relative value (e.g., strong correlation, medium correlation, or weak correlation). A learning set that is not derived from real-time measurements may be used as a reference (e.g., baseline) to initiate operation of sensors and/or various components affecting the environment (e.g., HVAC systems and/or tintable windows).
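The assignment of a relative correlation value (e.g., strong, medium, or weak) between two sensor types from a learning set might be sketched as follows. The use of a Pearson estimate and the cutoff values are illustrative assumptions, and the motion/noise data are invented.

```python
# Hedged sketch: estimate the correlation between two sensor-type data
# streams from a learning set and map it to a relative strength value.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def correlation_strength(r, strong=0.8, medium=0.5):
    """Map |r| to a relative value; cutoffs are illustrative choices."""
    r = abs(r)
    return "strong" if r >= strong else "medium" if r >= medium else "weak"

# Invented learning-set samples: motion events vs. ambient noise (dB).
motion = [0, 1, 3, 5, 8, 9]
noise_db = [30, 33, 40, 47, 58, 61]
r = pearson(motion, noise_db)
print(correlation_strength(r))  # strong
```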
The real-time sensor data may supplement the learning set, for example, on an ongoing basis or over a defined period of time. The (e.g., supplemented) learning set may increase in size during deployment of the sensors in the environment. The initial learning set may increase in size, for example, by including additional (i) real-time measurements, (ii) sensor data from other (e.g., similar) peripheral structures, (iii) third-party data, and/or (iv) other and/or updated simulations.
In some embodiments, data from the sensors may be correlated. Once a correlation between two or more sensor types is established, deviations from the correlation (e.g., deviations from the correlation values) may indicate an irregular condition and/or a failure of one of the correlated sensors. The fault may include a calibration drift. The fault may indicate a need for recalibration of the sensor. The failure may include a complete failure of the sensor. In one example, a movement sensor may cooperate with a carbon dioxide sensor. For example, in response to the movement sensor detecting movement of one or more individuals in the peripheral structure, the carbon dioxide sensor may be activated to begin taking carbon dioxide measurements. An increase in movement in the peripheral structure may be associated with an increase in carbon dioxide levels. In another example, a movement sensor detecting individuals in a peripheral structure may be associated with an increase in noise detected by a noise sensor in the peripheral structure. In some embodiments, when a detection by a first type of sensor is not accompanied by a corresponding detection by a second type of sensor, an error message may be issued. For example, if the motion sensor detects many individuals in the peripheral structure without an increase in carbon dioxide and/or noise, the carbon dioxide sensor and/or noise sensor may be identified as faulty or as having an erroneous output. An error message may be issued. A first plurality of different correlated sensors in a first aggregate may include one sensor of a first type and a second plurality of sensors of different types. If the second plurality of sensors indicate a correlation and the one sensor indicates a reading that departs from the correlation, the likelihood that the one sensor has failed increases.
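The deviation-from-correlation fault check described above can be sketched as follows: once a relationship between two sensor types has been learned, a large residual between the predicted and the measured value flags a possible sensor fault. The linear model coefficients (occupancy to CO2) and the tolerance are assumed values for illustration.

```python
# Illustrative sketch: flag a CO2 sensor as suspect when its reading
# deviates from the value predicted by a learned occupancy-CO2
# correlation. Coefficients and tolerance are hypothetical.

def expected_co2(occupant_count, slope=25.0, baseline=400.0):
    """CO2 (ppm) predicted from occupancy via a learned linear model."""
    return baseline + slope * occupant_count

def co2_sensor_suspect(occupant_count, measured_co2, tolerance=75.0):
    """True when the measured CO2 departs from the prediction by more
    than the tolerance, suggesting drift or failure of the CO2 sensor."""
    return abs(measured_co2 - expected_co2(occupant_count)) > tolerance

# Many occupants detected, yet CO2 reads near baseline -> suspect sensor.
print(co2_sensor_suspect(10, 410.0))  # True
print(co2_sensor_suspect(10, 640.0))  # False
```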
If a first plurality of sensors in the first aggregate detect a first correlation and a third plurality of correlated sensors in the second aggregate detect a second correlation different from the first correlation, the likelihood that the first sensor aggregate is exposed to a different condition than the third sensor aggregate is increased.
The sensors of a sensor aggregate may cooperate with each other. The cooperation may include considering the sensor data of another sensor (e.g., of a different type) in the aggregate. The cooperation may include a trend predicted from another sensor (e.g., sensor type) in the aggregate, and/or a trend predicted from data related to another sensor (e.g., sensor type) in the aggregate. The other sensor data may be derived from another sensor in the aggregate, from a sensor of the same type in another aggregate, or from data of that type that is not derived from a sensor in the aggregates (e.g., third-party data). For example, a first aggregate may include a pressure sensor and a temperature sensor. The cooperation between the pressure sensor and the temperature sensor may comprise taking the pressure sensor data into account when analyzing and/or predicting the temperature data of the temperature sensor in the first aggregate. The pressure data may be (i) pressure data of the pressure sensor in the first aggregate, (ii) pressure data of pressure sensors in one or more other aggregates, (iii) pressure data of other sensors, and/or (iv) third-party pressure data.
In some embodiments, sensor aggregates are distributed throughout the peripheral structure. Sensors of the same type may be dispersed in the peripheral structure, for example, to allow measurement of an environmental parameter at various locations of the peripheral structure. Sensors of the same type may measure a gradient along one or more dimensions of the peripheral structure. The gradient may include a temperature gradient, an ambient noise gradient, or any other change (e.g., increase or decrease) in a measured parameter as a function of distance from a point. The gradient may be utilized to determine that a sensor is providing an erroneous measurement (e.g., that the sensor has failed). Fig. 4 shows an example of a diagram 490 of the arrangement of sensor aggregates in a peripheral structure. In the example of fig. 4, sensor aggregate 492A is positioned at a distance D1 from vent 496, sensor aggregate 492B is positioned at a distance D2 from vent 496, and sensor aggregate 492C is positioned at a distance D3 from vent 496. Vent 496 corresponds to an air conditioning vent that represents a relatively constant source of cooling air and a relatively constant source of white noise. The temperature and noise measurements made by sensor aggregate 492A are shown by output reading profile 494A, which indicates a relatively low temperature and a significant amount of noise. The temperature and noise measurements made by sensor aggregate 492B are shown by output reading profile 494B, which indicates a slightly higher temperature and a slightly reduced noise level. The temperature and noise measurements made by sensor aggregate 492C are shown by output reading profile 494C.
Output reading profile 494C indicates a slightly higher temperature than the temperatures measured by sensor aggregates 492B and 492A. The noise measured by sensor aggregate 492C indicates a lower level than the noise measured by sensor aggregates 492A and 492B. In one example, if the temperature measured by sensor aggregate 492C indicates a lower temperature than the temperature measured by sensor aggregate 492A, the one or more processors and/or controllers may identify a sensor of sensor aggregate 492C as providing erroneous data.
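The gradient check in the fig. 4 example can be sketched as below. This is an assumed illustration (not the disclosed implementation): aggregates are taken as sorted by distance from a cooling vent, so temperatures are expected to be non-decreasing with distance, and any aggregate reading cooler than a nearer one is flagged; the numeric readings are hypothetical.

```python
# Illustrative sketch: flag sensor aggregates whose temperature readings
# break the expected gradient away from a constant cooling source.

def gradient_outliers(readings):
    """readings: (aggregate_id, temp_c) pairs sorted by distance from the
    vent. Returns ids reading cooler than a nearer aggregate, which breaks
    the expected non-decreasing temperature gradient."""
    outliers = []
    warmest_so_far = float("-inf")
    for agg_id, temp in readings:
        if temp < warmest_so_far:
            outliers.append(agg_id)   # breaks the gradient -> suspect sensor
        else:
            warmest_so_far = temp     # extends the valid gradient
    return outliers

# Hypothetical readings: 492C is farther from the vent yet reads cooler.
readings = [("492A", 19.5), ("492B", 20.2), ("492C", 18.9)]
assert gradient_outliers(readings) == ["492C"]
```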
In another example of a temperature gradient, a temperature sensor mounted near a window may measure a temperature fluctuation that increases relative to a temperature fluctuation measured by a temperature sensor mounted at a location opposite the window. A sensor mounted near a midpoint between the window and the location opposite the window may measure temperature fluctuations between temperature fluctuations measured near the window relative to temperature fluctuations measured at the location opposite the window. In one example, an ambient noise sensor mounted near an air conditioner (or near a heating vent) may measure greater ambient noise than an ambient noise sensor mounted remotely from an air conditioner or heating vent.
In some embodiments, the first type of sensor is coordinated with the second type of sensor. In one example, an infrared radiation sensor may be coordinated with a temperature sensor. Coordination between sensor types may include establishing a correlation (e.g., negative or positive) between readings from the same type or different types of sensors. For example, an infrared radiation sensor that measures an increase in infrared energy may be accompanied by (e.g., positively correlated with) an increase in measured temperature. A decrease in the measured infrared radiation may be accompanied by a decrease in the measured temperature. In one example, an infrared radiation sensor that measures an increase in infrared energy without a measurable increase in temperature may indicate a failure or degradation of the operation of the temperature sensor.
In some embodiments, one or more sensors are included in the peripheral structure. For example, the peripheral structure may include at least 1, 2, 4, 5, 8, 10, 20, 50, or 500 sensors. The peripheral structure may include a number of sensors within a range between any of the foregoing values (e.g., from about 1 to about 1000, from about 1 to about 500, or from about 500 to about 1000). The sensor may be of any type. For example, the sensor may be configured (e.g., and/or designed) to measure the concentration of a gas (e.g., carbon monoxide, carbon dioxide, hydrogen sulfide, volatile organic chemicals, or radon). For example, the sensor may be configured (e.g., and/or designed) to measure ambient noise. For example, the sensor may be configured (e.g., and/or designed) to measure electromagnetic radiation (e.g., RF, microwave, infrared, visible, and/or ultraviolet radiation). For example, the sensors may be configured (e.g., and/or designed) to measure security-related parameters, such as (e.g., glass) breakage and/or unauthorized presence of personnel in a restricted area. The sensor may be coupled to one or more (e.g., active) devices such as radar or lidar. The device may be operable to detect a physical dimension of the peripheral structure, a person present in the peripheral structure, a fixed object in the peripheral structure, and/or a moving object in the peripheral structure.
In some embodiments, the sensors are operatively coupled to at least one controller. The coupling may comprise a communication link. The communication link (e.g., fig. 3, 308) may include any suitable communication medium (e.g., wired and/or wireless). The communication link may include a wire, such as one or more conductors arranged as a twisted pair, a coaxial cable, and/or an optical fiber. The communication link may comprise a wireless communication link, such as Wi-Fi, Bluetooth, ZigBee, or cellular. One or more segments of the communication link may include a conductive (e.g., wired) medium, while one or more other segments of the communication link may include a wireless link.
In some embodiments, the peripheral structure is a facility (e.g., a building). The peripheral structure may comprise a wall, a door, or a window. In some embodiments, at least two peripheral structures of the plurality of peripheral structures are disposed in the same facility. In some embodiments, at least two peripheral structures of the plurality of peripheral structures are disposed in different facilities. The different facilities may be part of a campus (e.g., belonging to the same entity). At least two of the plurality of peripheral structures may reside on the same floor of the facility. At least two of the plurality of peripheral structures may reside on different floors of the facility. The peripheral structures shown in fig. 4, such as peripheral structures A, B, C, and Z, may correspond to peripheral structures located on the same floor of a building or to peripheral structures located on different floors of the building. The peripheral structures of fig. 4 may be located in different buildings on a multi-building campus. The peripheral structures of fig. 4 may be located on different campuses of a multi-campus neighborhood.
In some embodiments, after installation of a sensor, the sensor performs self-calibration to establish an operating baseline. The self-calibration operation may be initiated by the sensor itself, by a nearby second sensor, or by one or more controllers. For example, upon and/or after installation, sensors deployed in the peripheral structure may perform a self-calibration procedure. The baseline may correspond to a lower threshold, in which case collected sensor readings may be expected to include values above the lower threshold. The baseline may correspond to a higher threshold, in which case collected sensor readings may be expected to include values below the higher threshold. The self-calibration procedure may begin with the sensor searching for a time window during which fluctuations or disturbances of the relevant parameter are minimal. In some embodiments, the time window is sufficient to collect sensed data (e.g., sensor readings) that allows for separation and/or identification of signal and noise in the sensed data. The time window may be predetermined. The time window may be undefined. The time window may remain open (e.g., persist) until a calibration value is obtained.
In some embodiments, the sensor may search for the best time to measure the baseline (e.g., in a time window). The optimal time (e.g., in a time window) may be the time span during which (i) the measured signal is most stable and/or (ii) the signal-to-noise ratio is highest. The measured signal may contain a certain degree of noise. A complete absence of noise may indicate sensor failure or unsuitability of the environment. The sensed signal (e.g., sensor data) may include a timestamp of the measurement of the data. The sensor may be assigned a time window during which the sensor may sense the environment. The time window may be predetermined (e.g., using third-party information and/or historical data regarding the characteristic measured by the sensor). The signal may be analyzed during the time window, and an optimal time span may be found in the time window, in which time span the measured signal is most stable and/or the signal-to-noise ratio is highest. The time span may be equal to or shorter than the time window. The time span may occupy the entire time window, or a portion of the time window. FIG. 5E shows an example of a time window 553 indicated as having a start time 551 and an end time 552. In the time window 553, a time span 554 is indicated, having a start time 555 and an end time 556. The sensor can sense the characteristic (e.g., VOC level) it is configured to sense during time window 553 in order to find a time span during which the best sensed data (e.g., best sensed data set) is collected, the best data (e.g., data set) having the highest signal-to-noise ratio and/or indicating collection of a stable signal. The best sensed data may have a (e.g., low) noise level (e.g., to rule out faulty sensors). For example, the time window may be the 12 hours between 5 pm and 5 am. During this time window, sensed VOC data is collected.
The collected sensed data set may be analyzed (e.g., using a processor) to find a time span during the 12 hours in which there is a minimum noise level (e.g., indicating that the sensor is operating) and (i) the highest signal-to-noise ratio (e.g., the signal is discernable) and/or (ii) the signal is most stable (e.g., has low variability). The time span may be, for example, a 1 hour duration between 4 a.m. and 5 a.m. In this example, the time window is the 12 hours between 5 pm and 5 am, and the time span is the 1 hour between 4 am and 5 am.
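The span search described above can be sketched as a sliding-window scan. This is a hedged illustration, not the disclosed algorithm: stability is approximated by the lowest variance, and spans with (near-)zero noise are rejected, consistent with the text's treatment of a complete absence of noise as a fault indicator. The function name and thresholds are hypothetical.

```python
# Illustrative sketch: slide a fixed-length span over the window's samples
# and pick the most stable span, rejecting implausibly flat (dead) signals.

import statistics

def best_span(samples, span_len, min_noise=1e-6):
    """samples: readings over the time window. Returns the start index of
    the most stable span of span_len samples, or None if no span has a
    plausible (non-zero) noise level."""
    best_start, best_var = None, float("inf")
    for start in range(len(samples) - span_len + 1):
        span = samples[start:start + span_len]
        var = statistics.pvariance(span)
        if min_noise < var < best_var:   # reject flat signals, keep stablest
            best_start, best_var = start, var
    return best_start

# A noisy 10-sample window with a quiet stretch starting at index 4:
samples = [400, 480, 430, 500, 421, 420, 422, 421, 470, 510]
assert best_span(samples, span_len=4) == 4   # 421, 420, 422, 421
```

A production version would also score signal-to-noise ratio and work on timestamped data rather than a plain list.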
In some embodiments, finding the best data (e.g., data set) to use for calibration includes comparing sensor data collected during time spans (e.g., in a time window). In the time window, the sensor may sense the environment during several time spans of (e.g., substantially) equal duration. Multiple time spans may fit into a time window. The time spans may or may not overlap. The time spans may be shifted with respect to each other. The data collected by the sensor over the various time spans may be compared. The time span with the highest signal-to-noise ratio and/or with the most stable signal may be selected to determine the baseline signal. For example, the time window may include a first time span and a second time span. The first time span (e.g., having a first duration or first length of time) may be shorter than the time window. The second time span (e.g., having a second duration) may be shorter than the time window. In some embodiments, evaluating the sensed data (e.g., finding the best sensed data for calibration) includes comparing a first set of sensed data sensed (e.g., collected) during the first time span with a second set of sensed data sensed (e.g., collected) during the second time span. The length of the first time span may be different from the length of the second time span. The length of the first time span may be equal (or substantially equal) to the length of the second time span. The first time span may have a different start time and/or end time than the second time span. The start times and/or end times of the first time span and the second time span may be in the time window. The start time of the first time span and/or the second time span may be equal to the start time of the time window. The end time of the first time span and/or the second time span may be equal to the end time of the time window. Fig. 5D shows an example of a time window 543 having a start time 540 and an end time 549, a first time span 541 having a start time 545 and an end time 546, and a second time span 542 having a start time 547 and an end time 548. In the example shown in fig. 5D, the start times 545 and 547 are in the time window 543, and the end times 546 and 548 are in the time window 543.
Fig. 5A-5D illustrate examples of various time windows that include time spans. Fig. 5A depicts a time lapse graph in which a time window 510 is indicated as having a start time 511 and an end time 512. In the time window 510, various time spans 501-507 are indicated, which overlap each other. The sensor may sense a characteristic (e.g., humidity, temperature, or CO2 level) that it is configured to sense during at least two of the time spans (e.g., 501-507), for example, for the purpose of comparing the signals to find the time when the signals are most stable and/or have the highest signal-to-noise ratio. For example, the time window (e.g., 510) may be days and the time span may be 50 minutes. The sensor may measure the characteristic (e.g., CO2 level) during the overall time 501-507, and the data can afterwards be divided into the different (overlapping) 50 minute spans, for example by using time-stamped measurements. The 50 minutes indicating a stable CO2 signal (e.g., at night) and/or having the highest signal-to-noise ratio may be designated as the optimal time for measuring the baseline CO2 signal. The measured signal may be selected as the baseline of the sensor. Once the optimal time span is selected, other CO2 sensors (e.g., at other locations) may utilize the time span for baseline determination. Finding the optimal time for baseline determination may speed up the calibration process. Once the optimal time is found, the other sensors can be programmed to measure the signal at the optimal time to record their signals, which can be used for baseline calibration. Fig. 5B depicts a time lapse graph in which a time window 523 is indicated, during which two time spans 521 and 522 are indicated, which overlap each other. Fig. 5C depicts a time lapse graph in which a time window 533 is indicated, during which two time spans 531 and 532 are indicated, which are in contact with each other, i.e., the end of the first time span 531 is the beginning of the second time span 532. Fig. 5D depicts a time lapse graph in which a time window 543 is indicated, during which two time spans 541 and 542 are indicated, which are separated by a time gap 544.
In one example, for a carbon dioxide sensor, the relevant parameter may correspond to a carbon dioxide concentration. In one example, the carbon dioxide sensor may determine that the time window in which the fluctuation in carbon dioxide concentration is likely to be minimal corresponds to a two hour time period (e.g., between 5:00 am and 7:00 am). Self-calibration may begin at 5:00 a.m. and continue while searching for a duration of time within the two hours during which the measurements are stable (e.g., with minimal fluctuation). In some embodiments, the duration is long enough to allow separation between the signal and noise. In one example, data from the carbon dioxide sensor may help determine that a 5 minute duration within a time window between 5:00 am and 7:00 am (e.g., between 5:25 am and 5:30 am) forms the optimal time period for collecting a lower baseline. The determination may be performed at least partially (e.g., entirely) at the sensor level. The determination may be performed by one or more processors operatively coupled to the sensor. During the selected duration, the sensor may collect readings to establish a baseline, which may correspond to a lower threshold.
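The baseline-collection step above can be sketched as follows. This is an assumed illustration (function names, the averaging choice, and the margin are hypothetical, not from the disclosure): readings taken during the selected stable duration are averaged into a lower baseline, and later readings well below it are treated as suspect.

```python
# Illustrative sketch: derive a lower baseline from readings collected
# during the selected stable duration, then use it as a lower threshold.

def lower_baseline(readings):
    """Baseline from readings taken during the stable duration."""
    return sum(readings) / len(readings)

def below_baseline(value, baseline, margin=0.05):
    """True if a later reading falls more than `margin` below the baseline,
    which the text treats as an implausible (suspect) value."""
    return value < baseline * (1.0 - margin)

# Hypothetical calm-period CO2 readings (ppm), e.g. 5:25-5:30 am:
calm_co2 = [412.0, 415.0, 413.0, 414.0, 411.0]
base = lower_baseline(calm_co2)
assert base == 413.0
assert below_baseline(380.0, base)       # implausibly low -> suspect
assert not below_baseline(650.0, base)   # occupied-room level is plausible
```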
In one example, for a gas sensor disposed in a room (e.g., in an office environment), the relevant parameter may correspond to a gas (e.g., CO2) level, wherein the required level is in the range of about 1000 ppm or less. In one example, the CO2 sensor may determine that self-calibration should occur during a time window in which the CO2 level is at a minimum, such as when there is no occupant near the sensor (see, e.g., the CO2 level before 18000 seconds in fig. 6). The time window during which CO2 level fluctuation is minimal may correspond, for example, to a one hour period during lunch from about 12:00 pm to about 1:00 pm, or to off-business hours. Fig. 7 illustrates an example of a contour diagram of a horizontal (e.g., top) view of an office environment depicting various CO2 concentrations. The office environment can include a first occupant 701, a second occupant 702, a third occupant 703, a fourth occupant 704, a fifth occupant 705, a sixth occupant 706, a seventh occupant 707, an eighth occupant 708, and a ninth occupant 709. The gas (e.g., CO2) concentration may be measured by sensors placed at various locations of the peripheral structure (e.g., office).
In some examples, a plurality of sensors in a room is used to locate a source of a chemical constituent of an atmospheric substance (e.g., a VOC). A spatial distribution indicative of the distribution of the chemical species in the peripheral structure may indicate various (e.g., relative or absolute) concentrations of the chemical species as a function of space. The distribution may be a two-dimensional or three-dimensional distribution. The sensors may be placed in different locations of the room to allow sensing of the chemical at the different room locations. Mapping the (e.g., entire) peripheral structure (e.g., a room) may require (i) overlap of the sensing areas of the sensors and/or (ii) inferring the distribution of the chemical in the peripheral structure (e.g., in areas of low or no sensor coverage). For example, fig. 7 shows an example of a relatively steep and high concentration of carbon dioxide toward occupant 705 relative to a low concentration 710 in an unoccupied area of the peripheral structure. This may indicate the presence of a carbon dioxide emission source at location 705. Similarly, one may find a location of chemical removal (e.g., a sink) by finding a (e.g., relatively steep) low concentration of the chemical relative to the general distribution of the chemical species in the peripheral structure.
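The source-localization idea above can be sketched as follows. This is a hedged, hypothetical illustration (not the disclosed method): a likely emission source is taken to be the cell of a 2D concentration map with the peak value; a sink would analogously be the minimum. The grid values and function name are invented for the example.

```python
# Illustrative sketch: locate a likely emission source as the peak cell
# of a 2D concentration map built from spatially distributed sensors.

def locate_source(grid):
    """grid: 2D list of concentrations; returns (row, col) of the maximum."""
    best = (0, 0)
    for r, row in enumerate(grid):
        for c, val in enumerate(row):
            if val > grid[best[0]][best[1]]:
                best = (r, c)
    return best

# Hypothetical CO2 map (ppm) with a sharp peak near an occupant:
co2_map = [
    [420, 430, 425],
    [440, 620, 450],
    [425, 445, 430],
]
assert locate_source(co2_map) == (1, 1)
```

A fuller treatment would interpolate between sensor positions to cover areas of low or no sensor coverage, as the text notes.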
In some examples, the one or more sensors in the peripheral structure are VOC sensors. A VOC sensor may be specific to a VOC compound (e.g., as disclosed herein) or to a class of compounds (e.g., having similar chemical characteristics). For example, the sensor may be sensitive to aldehydes, esters, thiophenes, alcohols, aromatic hydrocarbons (e.g., benzene and/or toluene), or olefins. In some examples, a set of sensors (e.g., an array of sensors) senses various chemical compounds (VOCs) (e.g., having different chemical characteristics). The set of compounds may include identified or unidentified compounds. The chemical sensor may output a sensed value for a particular compound, class of compounds, or group of compounds. The sensor output may be a total (e.g., cumulative) measurement of the sensed compound class or group. The sensor output may be a total (e.g., cumulative) measurement of multiple sensor outputs for (i) individual compounds, (ii) multiple compound classes, or (iii) multiple compound groups. The one or more sensors may output a total VOC output (also referred to herein as TVOC). Sensing may be performed over a period of time. VOCs may be from human or other sources, such as sweat, aldehydes from carpets/furnishings, and the like.
In some embodiments, at least one of the atmospheric components is a VOC. Atmospheric components (e.g., VOCs) can include benzopyrrole volatiles (e.g., indole and skatole), ammonia, short chain fatty acids (e.g., having up to six carbons), and/or volatile sulfur compounds (e.g., hydrogen sulfide, methanethiol (also known as methyl mercaptan), dimethyl sulfide, dimethyl disulfide, and dimethyl trisulfide). Atmospheric components (e.g., VOCs) may include 2-propanone (acetone), 1-butanol, 4-ethyl-morpholine, pyridine, 3-hexanol, 2-methyl-cyclopentanone, 2-hexanol, 3-methyl-cyclopentanone, 1-methyl-cyclopentanol, p-cymene, octanal, 2-methyl-cyclopentanol, lactic acid methyl ester, 1,6-heptadien-4-ol, 3-methyl-cyclopentanol, 6-methyl-5-hepten-2-one, 1-methoxy-hexane, ethyl lactate, nonanal, 1-octen-3-ol, acetic acid, 2,6-dimethyl-7-octen-2-ol (dihydromyrcenol), 2-ethylhexanol, decanal, 2,5-hexanedione, 1-(2-methoxypropoxy)-2-propanol, 1,7,7-trimethylbicyclo[2.2.1]heptan-2-one (camphor), benzaldehyde, 3,7-dimethyl-1,6-octadien-3-ol (linalool), 1-methylhexyl acetate, propionic acid, 6-hydroxy-hexan-2-one, 4-cyanocyclohexene, 3,5,5-trimethyl-2-cyclohexen-1-one (isophorone), butyric acid, 2-(2-propyl)-5-methyl-1-cyclohexanol (menthol), furfuryl alcohol, 1-phenyl-ethanone (acetophenone), isovaleric acid, ethyl carbamate (urethane), 4-tert-butylcyclohexyl acetate (vertenex), p-menth-1-en-8-ol (α-terpineol), dodecanal, 1-phenylethyl acetate, 3-methyl-2(5H)-furanone, 2-ethylhexyl 2-ethylhexanoate, 3,7-dimethyl-6-octen-1-ol (citronellol), bis(2-hydroxypropyl) ether, 3-hexene-2,5-diol, 3,7-dimethyl-2,6-octadien-1-ol (geraniol), hexanoic acid, geranylacetone, 2,4,6-tri-tert-butylphenol, an unknown compound, 2,6-bis(1,1-dimethylethyl)-4-(1-oxypropyl)phenol, phenethyl alcohol, dimethyl sulfone, 2-ethylhexanoic acid, another unknown compound, benzothiazole, phenol, myristic acid 1-methylethyl ester (isopropyl myristate), 2-(4-tert-butylphenyl)propionaldehyde (p-tert-butylphenyl dihydrocinnamaldehyde), octanoic acid, α-methyl-β-(p-tert-butylphenyl)propionaldehyde (lilac aldehyde), triacetin, p-cresol, cedrol, lactic acid, hexadecanoic acid 1-methylethyl ester (isopropyl palmitate), hexyl 2-hydroxybenzoate (hexyl salicylate), palmitic acid ethyl ester, methyl 2-pentyl-3-oxo-1-cyclopentylacetate (methyl dihydrojasmonate or hedione), 1,3,4,6,7,8-hexahydro-4,6,6,7,8,8-hexamethyl-cyclopenta-γ-2-benzopyran (galaxolide), 2-ethylhexyl salicylate, propane-1,2,3-triol (glycerol), methoxyacetic acid dodecyl ester, dioctyl cinnamate, benzoic acid, dodecanoic acid, 5-(hydroxymethyl)-2-furaldehyde, p-ethylsalicylate, 4-vinylimidazole, methoxyacetic acid tetradecyl ester, tridecanoic acid, tetradecanoic acid, pentadecanoic acid, hexadecanoic acid, 9-hexadecenoic acid, heptadecanoic acid, 2,6,10,15,19,23-hexamethyl-2,6,10,14,18,22-tetracosahexaene (squalene), hexadecanoic acid, and/or its 2-hydroxyethyl ester.
In one example, for an ambient noise sensor disposed in a crowded area, such as a cafeteria, the relevant parameter may correspond to a sound pressure (e.g., noise) level measured in decibels above background atmospheric pressure. In one example, the ambient noise sensor may determine that self-calibration should occur during a time window in which sound pressure level fluctuations are minimal. The time window of minimal sound pressure fluctuation may correspond to an hour period from 12:00 a.m. to 1:00 a.m. Self-calibration may continue with the sensor determining a duration within the window during which a baseline (e.g., a higher threshold) may be established. In one example, the ambient noise sensor may determine that a 10 minute duration within a time window of about 12:00 a.m. to about 1:00 a.m. (e.g., about 12:30 a.m. to about 12:40 a.m.) forms the best time to collect a higher baseline, which may correspond to a higher threshold.
In some embodiments, a first reading of a first parameter is obtained from a first sensor and a second reading of the first parameter is obtained from a second sensor. The first sensor may be disposed at a first location in the peripheral structure, and the second sensor may be disposed at a second location in the peripheral structure. A predicted value of the first parameter measured at the first location may be estimated based at least in part on the second reading. A difference between the estimated predicted value of the first parameter and the first reading of the first parameter may be determined. The first reading of the first parameter may be modified by taking into account and/or using the difference between the estimated predicted value of the first parameter and the first reading of the first parameter.
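The prediction-and-correction steps above can be sketched as follows. This is a hedged illustration only: how the prediction is formed and how the difference is applied are not specified by the text, so a fixed inter-location offset and a simple blending weight are assumed here, and all names and constants are hypothetical.

```python
# Illustrative sketch: predict the first location's value from the second
# sensor's reading, then nudge the first reading toward the prediction.

def predict_at_first(second_reading, location_offset=-0.5):
    """Estimate the first location's value from the second sensor, using an
    assumed fixed offset between the two locations."""
    return second_reading + location_offset

def corrected_reading(first_reading, second_reading, weight=0.5):
    """Modify the first reading using the difference between the estimated
    predicted value and the raw first reading."""
    predicted = predict_at_first(second_reading)
    difference = predicted - first_reading
    return first_reading + weight * difference

# Second sensor reads 22.0 C; first sensor reads 23.0 C:
assert predict_at_first(22.0) == 21.5
assert corrected_reading(23.0, 22.0) == 22.25
```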
In some embodiments, self-calibration measurements performed in the field (e.g., in a target setting such as a deployment site) may be used to monitor measurable features (e.g., noise, objects, CO2 level, and/or temperature) over a time window (e.g., at least one hour, one day, or one week). The values may be monitored within the time window to obtain a best known value. The best known value may include a value that remains within an error range over a time window (also referred to herein as a "minimum sampling period"). The optimum may be interpolated, expected, and/or calculated. The minimum sampling period may be a function of the number of samples and/or the frequency required to establish a reliable baseline. The best known value may be the most stable value sensed during at least the sampling period (e.g., having the smallest fluctuation range and/or the lowest value). In some cases, the best known value may be obtained during periods of low disturbance in the environment, when fluctuations in environmental characteristics are at a minimum. For example, the best known value may be obtained at night or on weekends, such as during periods of low occupancy in the environment (e.g., a building) when noise surges are absent and/or the concentration of gases (such as CO2) is minimal. The time window during which the field baseline is measured (e.g., during a sampling period) may be pre-assigned, or may be set using (e.g., repeated) occurrences of the minimum sampling period. The minimum sampling period may be a period sufficient to allow the measured signal and noise to be distinguished. Any pre-assigned time window may be adjusted using (e.g., repeated) occurrences of the minimum sampling period. The position of the peripheral structure and/or of fixed features (e.g., placement of walls and/or windows) may be utilized in measuring characteristics of a given environment.
The position and/or fixed features of the peripheral structure may be derived independently (e.g., from third-party data and/or non-sensor data). Data from one or more sensors disposed in the environment may be used to derive the position and/or fixed features of the peripheral structure. Some sensor data may be used to sense the location of (e.g., fixed and/or non-fixed) objects in the environment when the environment is minimally disturbed relative to the measured environmental characteristic (e.g., when no person is present in the environment, and/or when the environment is quiet). Determining the position of objects may include determining an occupancy (e.g., by humans) of the environment. Distance and/or position related measurements may utilize sensors such as radar and/or ultrasonic sensors. Distance and position related measurements may also be derived from sensors that are not traditionally position and/or distance related. Objects disposed in, or forming part of, the peripheral structure may have different sensor signatures. For example, the location of a person in the peripheral structure may be correlated with distinct temperature, humidity, and/or CO2 features. For example, the location of a wall may be correlated with an abrupt change in the distribution of temperature, humidity, and/or CO2 in the peripheral structure. For example, the position of a window or door (whether open or closed) may be correlated with a change in the distribution of temperature, humidity, and/or CO2 in the vicinity of the window or door. The one or more sensors in the peripheral structure may monitor any environmental changes and/or correlate such changes with changes in subsequently monitored values. In some cases, a lack of fluctuation in the monitored values may be taken as an indication of sensor damage, and the sensor may need to be removed or replaced.
In some embodiments, the best known value is specified. The best known value may be designated as the site baseline, which may be compared to the factory baseline, for example. The field baseline may be equal to (e.g., and/or replaced by) the factory baseline if the field baseline is within error of the factory baseline. Otherwise, a new baseline may be assigned to the live baseline (e.g., the baseline of the sensor deployed in the target location). In some cases, the best known value may be compared to and/or derived from historical values and/or third party values. The accuracy of the field baseline can be monitored over time. If a drift in the field baseline is detected, which is (i) above a threshold (e.g., about 5% drop in the field baseline value), or (ii) outside of the field baseline error range, the field baseline may be reset to a new (e.g., drifted) field baseline value. The threshold may be a decrease in value of at least 2%, 4%, 5%, 10%, 15%, 20%, or 30% relative to a previously determined baseline.
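The drift check described above can be sketched as follows. This is an illustrative sketch only: the 5% threshold comes from the text's example, while the function name, the return convention, and restricting the check to downward drift are assumptions made for the example.

```python
# Illustrative sketch: reset the stored field baseline when a newly
# measured field baseline has dropped by more than a threshold fraction.

def check_drift(stored_baseline, new_baseline, threshold=0.05):
    """Return (drifted, baseline_to_use). `threshold` is the fractional
    drop (e.g., 0.05 for the ~5% example in the text)."""
    drop = (stored_baseline - new_baseline) / stored_baseline
    if drop > threshold:
        return True, new_baseline      # reset to the drifted field baseline
    return False, stored_baseline      # keep the existing baseline

assert check_drift(400.0, 376.0) == (True, 376.0)    # 6% drop -> reset
assert check_drift(400.0, 392.0) == (False, 400.0)   # 2% drop -> keep
```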
In some embodiments, a device (e.g., a sensor) may be designated as a "gold device" that serves as a reference (e.g., as a gold standard) for calibrating other devices (e.g., sensors of the same type in the facility or in another facility). The gold device may be the most calibrated and/or most precisely located device in the facility or in a portion of the facility (e.g., in a building, on a floor, and/or in a room), for example among devices of the same type. Such a calibrated and/or located device may be used as a standard for calibrating and/or locating other devices (e.g., of the same type).
In some embodiments, self-calibration is performed based at least in part on one or more learning techniques (e.g., machine learning, artificial intelligence (AI), heuristics, and/or collaboration/correlation between different sensor types). Self-calibration may be performed on an individual sensor and/or on a remote processor operatively coupled to the sensor (e.g., on a central processor and/or in the cloud). Self-calibration may periodically determine any need for a new calibration of the sensor (e.g., by monitoring drift). Self-calibration may account for multiple sensors (e.g., a group of sensors). The sensors of the group may be of the same type, in the same environment, in the same peripheral structure (e.g., space), and/or in the vicinity of one another (e.g., in proximity). For example, the sensors of the group may be in the same peripheral structure, in the same space, in the same building, on the same floor, in the same portion of a room, within at most a predetermined distance of each other, or any combination thereof. The sensor group may include sleeping sensors, shut-down sensors, and/or actively running sensors. The baseline from one or more actively operating sensors may be compared to those of other sensors to find any baseline outliers. The baseline from one or more operational sensors may be compared to previously calibrated (e.g., dormant) sensors. Non-operational (e.g., dormant) sensors may be used as "memory sensors," e.g., for baseline comparison purposes. For example, the sleep state of a sensor may retain its calibration values. A faulty sensor may be functionally replaced by activating an inactive sensor previously installed in the environment (e.g., rather than being physically replaced by installing a new sensor introduced into the environment). The environment may be a peripheral structure. When a sensor is added to a group of sensors, it may take a baseline value that takes into account the baseline values of the group's neighboring sensors.
For example, a new sensor may adopt a baseline value (e.g., the mean, median, mode, or midrange) of its (e.g., directly) neighboring sensors. Directly adjacent sensors 1 and 2 are two sensors adjacent to each other, with no other sensor (e.g., of the same type) in the span between sensor 1 and sensor 2. As another example, the new sensor may adopt a baseline value (e.g., the mean, median, mode, or midrange) of a plurality of sensors (e.g., all sensors) in the environment.
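A minimal sketch of how a newly added sensor might adopt a baseline from its neighbors (the function name and the use of Python's `statistics` module are illustrative assumptions; "midrange" stands in for the text's "intermediate range"):

```python
import statistics


def adopt_baseline(neighbor_baselines, method="median"):
    """Derive a baseline for a newly added sensor from the baseline
    values of its (directly) neighboring sensors of the same type.
    """
    if method == "mean":
        return statistics.mean(neighbor_baselines)
    if method == "mode":
        return statistics.mode(neighbor_baselines)
    if method == "midrange":  # the text's "intermediate range"
        return (min(neighbor_baselines) + max(neighbor_baselines)) / 2
    return statistics.median(neighbor_baselines)
```

For example, a new CO2 sensor whose neighbors report baselines of 400, 410, and 420 ppm would adopt 410 ppm under the median (or mean) rule.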
In some embodiments, the self-calibration takes into account ground truth sensed values. Ground truth sensed values may be monitored by alternative (e.g., more sensitive) methods. For example, an individual sensor may be physically monitored (e.g., manually and/or automatically) by comparison to a known and/or different measurement method. In some cases, the ground truth may be determined by a traveler (e.g., a robot or a field service engineer) or from external data (e.g., from a third party). The robot may comprise a drone or a vehicle.
In some embodiments, the sensor transmits (e.g., directs) data to a receiver, such as a sensor or a sensor suite. The sensor package may also be referred to as a "sensor assembly." The sensors in the sensor suite may be similar to sensors deployed in the space of the peripheral structure. At least one sensor in the sensor suite may lack calibration or be miscalibrated (e.g., at the time of deployment or thereafter). The sensors may be calibrated using ground truth measurements (e.g., performed by the traveler). The traveler may carry a sensor similar to the sensor to be calibrated/recalibrated. The traveler may detect that the sensor is uncalibrated or lacks calibration. The traveler may be a field service engineer. The traveler may be a robot. The robot may be mobile. The robot may include a wheel (e.g., a plurality of wheels). The robot may include a vehicle. The robot may be airborne. The robot may include or be integrated into a drone, a helicopter, and/or an airplane. A mobile peripheral structure (e.g., a car or drone) may be free of a human operator. The receiver may be carried by the traveler, e.g., into a space. The traveler (e.g., using the receiver) may take one or more readings to determine the ground truth value. Readings corresponding to ground truth values may be sent directly or indirectly (e.g., via the cloud) to nearby uncalibrated and/or miscalibrated sensors. Sensors reprogrammed with the ground truth values can thus be calibrated. The traveler's sensor (or sensor suite) can be programmed to transmit (e.g., direct) its new calibration values to the uncalibrated or miscalibrated sensor. The transmission of new calibration values may be sent to sensors within a particular radius, e.g., depending on the characteristic measured by the sensor and its susceptibility to (e.g., geographic) location.
In one example, a signal is emitted when a field service engineer (abbreviated herein as "FSE") is within a radius of the sensor and the ground truth reading has been successfully programmed into the sensor, which is now calibrated using the ground truth reading. In some embodiments, the signal is indicative of successful calibration of the sensor. Calibration of the sensor may include transmitting data to and/or reprogramming the sensor. The signal may include a sound (e.g., a chime), a light, or another signal type that can be detected (e.g., by an instrument and/or by a user). The FSE may then move to the next sensor for calibration evaluation, calibration, and/or recalibration. Such a procedure may allow a traveler (e.g., an FSE) to enter the space of the peripheral structure and travel (e.g., walk) through the space. The traveler may input one or more characteristics of the sensor. The one or more characteristics of the sensor may include a measured characteristic, a range (e.g., radius), a sensor type, a sensor fidelity, a sampling frequency, an operating temperature (e.g., or a range thereof), or an operating pressure. The traveler can wait for a signal from the sensor (e.g., indicating that calibration is complete) and proceed to recalibrate other sensors in the space. Evaluation of the calibration and/or recalibration of the sensor may require physical coupling (e.g., via a wire) to the sensor. Alternatively, evaluation of the calibration and/or recalibration of the sensor may not require physical coupling to the sensor (e.g., wireless). Wireless calibration may be automated (e.g., using a robot as the traveler). Wireless calibration with a traveler may still require physical travel within the environment in which the sensors are deployed. To ensure accuracy, the transmitted data may be compared (e.g., in real time or at a later time) to standard and/or alternative measurement methods. The transmitted and/or compared sensor data may be stored and/or used for calibration of the sensor.
In some cases, the position of the sensor may be calibrated. For example, there may be a deviation between the registered position of the sensor and the position of the sensor measured by the traveler. This may occur whether the sensor is calibrated or uncalibrated with respect to the characteristic (e.g., humidity or pressure) it is designed to measure. The traveler can transmit the deviation to allow correction of any data previously measured by the (position-miscalibrated or position-uncalibrated) sensor. The transmission may be to a controller and/or to a processor operatively coupled to the controller, the controller being operatively coupled to the sensor. The traveler can initiate a position-correction operation for the sensors, for example, to calibrate/recalibrate their positions.
In some embodiments, the position of the sensor carried by the traveler differs from the position of the sensor to be calibrated. For example, the traveler's sensor may be in the middle of the room, while the sensor to be calibrated is attached to a wall. The deviation between these positions can introduce calibration errors (e.g., in the characteristic measured by the sensors). The traveler can transmit (e.g., with or separate from the calibration data) the location at which the calibration data was measured (e.g., the location of the traveler's sensor), e.g., to allow compensation for any positional deviation. Variability in the sensed quality (e.g., characteristic) may be calculated, anticipated, and/or applied to the sensed data during calibration to compensate for any variability between the traveler's sensor and the sensor being recalibrated/calibrated. The calculation may include a simulation, such as a real-time simulation. The simulation may take into account the peripheral structure (e.g., a room), the fixed structures in and/or defining the peripheral structure, the direction of any peripheral structure boundary (e.g., walls, floors, ceilings, and/or windows), and/or any expected variability in the environment of the peripheral structure (e.g., including at least one characteristic such as the location, volume, airflow, or temperature of the vents of the peripheral structure). The simulation may also contemplate structures (e.g., tables, chairs, and/or lights) and/or bodies in the peripheral structure. The bodies may include occupants disposed in the peripheral structure (i) during a particular time period and/or (ii) according to an occupant traffic pattern. The simulation may be analogous to predicting the presence, location, mass, and/or other characteristics of a black hole from the behavior around the black hole (e.g., as opposed to measuring the black hole directly). The simulation may constitute an indirect method of calibration.
The simulation may include a recursive fitting method. The simulation may include (i) a structural grid of the environment (e.g., the building walls) and/or (ii) automatic positioning, within the grid, of the locations to which the sensors are attached. The calibration/recalibration may be adjusted in situ and/or in real time. Calibration/recalibration of the sensor may utilize relative position information. The relative position may be with respect to at least one fixed structural element (e.g., with respect to at least one fixed sensor).
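The positional compensation described above can be sketched under a deliberately simple assumption, namely that the sensed quantity varies linearly with distance between the two positions (in practice, the simulation the text describes would supply a richer model; the function and parameter names are hypothetical):

```python
def compensate_for_offset(traveler_reading, gradient_per_meter, offset_meters):
    """Adjust a traveler's ground-truth reading to the mounted sensor's
    position, assuming a linear spatial gradient of the sensed quantity
    (e.g., degrees C per meter toward a wall).  The linear model is an
    illustrative stand-in for the simulation described in the text.
    """
    return traveler_reading + gradient_per_meter * offset_meters
```

For example, with a traveler reading of 21.0 in the middle of the room, a gradient of 0.5 per meter toward the wall, and a 2 m offset, the compensated value at the wall-mounted sensor would be 22.0.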
In some embodiments, a plurality of sensors is assembled into a sensor suite (e.g., a sensor ensemble). At least two sensors of the plurality may be of different types (e.g., configured to measure different characteristics). The various sensor types may be assembled together (e.g., bundled) to form the sensor suite. The plurality of sensors may be coupled to an electronic board. The electrical connection of at least two sensors of the plurality in the sensor suite may be controlled (e.g., manually and/or automatically). For example, the sensor suite may be operatively coupled to, or include, a controller (e.g., a microcontroller). The controller may control the on/off connection of a sensor to a power source. Thus, the controller may control the times (e.g., periods) during which the sensor operates.
In some embodiments, the baseline of one or more sensors of the sensor ensemble may drift. The recalibration may include one or more (e.g., but not all) of the sensors of the sensor suite. For example, baseline drift may occur in at least two sensor types in a given sensor suite. Baseline drift in a single sensor of the sensor suite may indicate a failure of that sensor. Baseline drift measured in a plurality of sensors in a sensor suite may instead indicate a change in the environment sensed by the sensors of the suite (e.g., rather than a failure of those drifting sensors). Such sensor-data baseline drift can be utilized to detect environmental changes, for example: (i) a building is constructed/demolished in the vicinity of the sensor suite, (ii) a ventilation channel is altered (e.g., damaged) in the vicinity of the sensor suite, (iii) a refrigerator is installed/removed in the vicinity of the sensor suite, (iv) a person's work position is altered relative to (e.g., adjacent to) the sensor suite, (v) the sensor suite experiences an electronic change (e.g., a malfunction), (vi) a structure (e.g., an interior wall) has changed, or (vii) any combination thereof. In this way, the data may be used, for example, to update a three-dimensional (3D) model of the peripheral structure.
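The drift-diagnosis heuristic above, distinguishing a single drifting sensor (likely a sensor failure) from suite-wide drift (likely an environmental change), might be sketched as follows (names and return strings are illustrative):

```python
def diagnose_drift(drift_flags):
    """Interpret baseline drift across a sensor suite.

    `drift_flags` maps sensor name -> True if that sensor's baseline
    drifted.  Per the heuristic in the text: drift in a single sensor
    suggests failure of that sensor, while drift across multiple
    sensors of the suite suggests a change in the sensed environment.
    """
    drifted = [name for name, flag in drift_flags.items() if flag]
    if not drifted:
        return "no drift"
    if len(drifted) == 1:
        return f"possible failure: {drifted[0]}"
    return "environmental change"
```

A suite where only the CO2 sensor drifts would be flagged for possible failure, whereas drift in both the CO2 and temperature sensors would point to, e.g., an altered ventilation channel.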
In some embodiments, one or more sensors are added to or removed from a sensor group, e.g., in a peripheral structure and/or in a sensor suite. A newly added sensor may inform (e.g., direct to) the other members of the sensor group its presence and relative location within the topology of the group. An example of sensor auto-location may be found, for example, in U.S. provisional patent application serial No. 62/958,653 entitled "SENSOR AUTOLOCATION," filed January 8, 2020, which is incorporated herein by reference in its entirety.
Fig. 8 shows an example of a flow chart of a method 800 for obtaining a sensor baseline. Operation 802 optionally defines a pre-deployment baseline for the sensed sensor characteristic. Operation 804 defines a time period, and optionally a time of day, for acquiring baseline sensor readings. At operation 806, baseline readings are taken over one or more iterations of the same or a similar time period and used to establish a baseline and error range for the sensor. At operation 808, after the baseline is obtained, environmental characteristics of the peripheral structure are monitored using the sensor.
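Method 800 might be sketched as follows, with `read_sensor` standing in for one baseline reading taken during the defined time period; using the mean for the baseline and the sample standard deviation for the error range is an illustrative assumption:

```python
import statistics


def acquire_baseline(read_sensor, iterations=5):
    """Sketch of method 800 (operations 804-806): take repeated
    readings over a defined (e.g., quiet) time window and derive a
    baseline plus an error range for the sensor.

    `read_sensor` is a callable returning one reading per invocation.
    """
    readings = [read_sensor() for _ in range(iterations)]
    baseline = statistics.mean(readings)
    error = statistics.stdev(readings) if iterations > 1 else 0.0
    return baseline, error
```

After the baseline and error range are established, the sensor proceeds to ordinary environmental monitoring (operation 808).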
FIG. 9 shows an example of a flow chart of a method 900 employed on a deployed, factory-calibrated sensor. At operation 910, the sensor reading is evaluated to determine whether it equals or exceeds a saturation value. If the evaluation at operation 910 is true (e.g., yes), it is determined that the sensor is not suitable for use, does not operate properly, and/or is experiencing an environmental anomaly. At operation 912, further analysis is performed and appropriate action may be taken (e.g., repair or replacement of the sensor). If the evaluation at operation 910 is false (e.g., no), a field baseline for the sensor is obtained at operation 916. If the field baseline is within the error range of its pre-deployment baseline (depicted in 918), no change is made to the sensor's baseline value and the factory baseline is maintained (depicted in 920). If the field baseline is outside of its pre-deployment baseline, the pre-deployment baseline is changed (e.g., recalibrated) to the deployment (e.g., field) baseline (depicted in 922).
The operations and methods illustrated and described herein need not necessarily be performed in the order indicated in figs. 8 and 9 (e.g., or in the other figures of this disclosure). It should be noted that the methods may include more or fewer operations than indicated. In some implementations, operations described as separate operations may be combined into a smaller number of operations. Conversely, what is described herein as a single operation may instead be implemented by way of multiple operations. The operations shown in the examples of figs. 8 and 9 may be performed by a single sensor in a sensor ensemble. The operations shown in the examples of figs. 8 and 9 may be performed by a first sensor coupled to and in communication with a second sensor. The operations shown in the examples of figs. 8 and 9 may be directed by at least one controller coupled to and in communication with the first sensor and/or the second sensor.
Fig. 10 shows an example of a flow chart of a method 1000 for calibrating sensors in a peripheral structure. Operation 1010 includes (i) obtaining a first reading of a first parameter from a first sensor and (ii) obtaining a second reading of the first parameter from a second sensor. The first sensor may be disposed at a first location. The second sensor may be disposed at a second location. The first and second positions may be proximate to each other. In one example, the first location and the second location correspond to locations within a single peripheral structure of the facility. In one example, the first sensor and the second sensor correspond to temperature sensors installed in the same peripheral structure of a facility (e.g., a building). In one example, the first sensor may correspond to a sensor calibrated in a factory environment that has not been calibrated in a target environment. In one example, the second sensor may correspond to a sensor that has been calibrated after installation in the target environment.
The method may continue at operation 1020, shown in the example of fig. 10, which includes estimating a predicted value of the first parameter at the first location based at least in part on the second reading. In one example, the second sensor corresponds to a temperature sensor mounted in the peripheral structure near the air conditioning vent. In one example, the first sensor corresponds to a temperature sensor located remotely from the air conditioning vent. The second sensor may estimate a predicted value of the first parameter (e.g., temperature). Based at least in part on the first sensor being located away from the air conditioning vent, the second sensor may estimate that the first sensor will read a higher temperature than the temperature read by the second sensor.
The method may continue at operation 1030, shown in the example of fig. 10, which includes determining a difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter. In one example, the temperature reading from the second temperature sensor may deviate from the temperature reading from the first temperature sensor by 1 ℃. At operation 1040, shown in the example of fig. 10, the second sensor may use the difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter to modify the first reading of the first parameter. In one example, the second temperature sensor located in the peripheral structure may determine that the temperature readings should differ by, for example, 1 ℃ or less. In one example, a temperature reading difference between the first sensor and the second sensor exceeding 1.0 ℃ (e.g., a 5 ℃ difference) may cause the second temperature sensor to adjust the temperature reading reported by the first temperature sensor downward.
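Operations 1010-1040 might be sketched as follows, assuming a simple additive positional offset for the predicted value (the linear offset model, tolerance, and names are illustrative assumptions, not the patent's exact procedure):

```python
def peer_calibrate(first_reading, second_reading, expected_offset,
                   tolerance=1.0):
    """Sketch of method 1000 (fig. 10): a calibrated second sensor
    predicts what a first sensor at another location should read (its
    own reading plus an expected positional offset, e.g. due to
    distance from an air-conditioning vent) and substitutes that
    prediction when the discrepancy exceeds `tolerance`.
    """
    predicted = second_reading + expected_offset   # operation 1020
    difference = first_reading - predicted         # operation 1030
    if abs(difference) > tolerance:                # operation 1040
        return predicted                           # modify the first reading
    return first_reading                           # within tolerance
```

For example, if the second sensor reads 20 ℃ and expects the first sensor (far from the vent) to read 2 ℃ higher, a first reading of 26 ℃ exceeds the 1 ℃ tolerance and is adjusted down to the predicted 22 ℃.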
The sensors of the sensor ensemble may be organized into sensor modules. The sensor assembly may include a circuit board (such as a printed circuit board) to which a plurality of sensors are adhered or attached. The sensor may be removed from the sensor module. For example, the sensor may be plugged into and/or unplugged from the circuit board. The sensors may be individually activated and/or deactivated (e.g., using a switch). The circuit board may include a polymer. The circuit board may be transparent or non-transparent. The circuit board may include a metal (e.g., an elemental metal and/or a metal alloy). The circuit board may include a conductor. The circuit board may include an insulator. The circuit board may comprise any geometric shape (e.g., rectangular or oval). The circuit board may be configured (e.g., may have a shape) to allow the aggregate to be disposed in a mullion (e.g., of a window). The circuit board may be configured (e.g., may have a shape) to allow the aggregate to be disposed in a frame (e.g., a door frame and/or a window frame). The mullion and/or the frame may include one or more apertures to allow the sensor to obtain (e.g., accurate) readings. The circuit board may include an electrical connection port (e.g., a socket). The circuit board may be connected to a power source (e.g., electrical power). The power source may include a renewable power source or a non-renewable power source.
FIG. 11 shows an example of a diagram 1100 of a sensor ensemble organized into a sensor module. Sensors 1110A, 1110B, 1110C, and 1110D are shown as being included in a sensor aggregate 1105. An aggregate of sensors organized into a sensor module may include at least 1, 2, 4, 5, 8, 10, 20, 50, or 500 sensors. The sensor module may include a number of sensors in a range between any of the above values (e.g., from about 1 to about 1000, from about 1 to about 500, or from about 500 to about 1000). The sensors of the sensor module may include sensors configured or designed to sense parameters including temperature, humidity, carbon dioxide, particulate matter (e.g., between 2.5 μm and 10 μm), total volatile organic compounds (e.g., via changes in voltage potential caused by surface adsorption of volatile organic compounds), ambient light, audio noise level, pressure (e.g., of gases and/or liquids), acceleration, time, radar, lidar, radio signals (e.g., ultra-wideband radio signals), passive infrared, glass breakage, or movement. The sensor ensemble (e.g., 1105) may include non-sensor devices (e.g., emitters), such as buzzers and light-emitting diodes. An example of a sensor assembly and its uses can be found in U.S. patent application serial No. 16/447,169 entitled "SENSING AND COMMUNICATIONS UNIT FOR OPTICALLY SWITCHABLE WINDOW SYSTEMS," filed June 20, 2019, which is incorporated herein by reference in its entirety.
In some embodiments, the one or more devices include a sensor (e.g., as part of a transceiver). In some embodiments, the transceiver may be configured to transmit and receive one or more signals using a Personal Area Network (PAN) standard (e.g., IEEE 802.15.4). In some embodiments, the signal may comprise a Bluetooth, Wi-Fi, or EnOcean signal (e.g., wide bandwidth). The one or more signals may include ultra-wide bandwidth (UWB) signals (e.g., having a frequency in a range of about 2.4 gigahertz (GHz) to about 10.6 GHz, or about 7.5 GHz to about 10.6 GHz). An ultra-wideband signal may be a signal having a fractional bandwidth greater than about 20%. An ultra-wideband (UWB) radio-frequency signal may have a bandwidth of at least about 500 megahertz (MHz). For short ranges, a very low energy level may be used for the one or more signals. The signal (e.g., having a radio frequency) may employ a spectrum capable of penetrating solid structures (e.g., walls, doors, and/or windows). The low power may be at most about 25 milliwatts (mW), 50 mW, 75 mW, or 100 mW. The low power may be any value between the above values (e.g., 25 mW to 100 mW, 25 mW to 50 mW, or 75 mW to 100 mW). The sensors and/or transceivers may be configured to support a wireless technology standard for exchanging data between stationary and mobile devices (e.g., over short distances). The signals may include ultra-high frequency (UHF) radio waves, for example, from about 2.402 gigahertz (GHz) to about 2.480 GHz. The signal may be configured for a building Personal Area Network (PAN).
In some embodiments, the device is configured to implement geolocation techniques (e.g., Global Positioning System (GPS), Bluetooth Low Energy (BLE), ultra-wideband (UWB), and/or dead reckoning). Geolocation techniques may facilitate determining the location of a signal source (e.g., the location of a tag) with an accuracy of at least 100 centimeters (cm), 75 cm, 50 cm, 25 cm, 20 cm, 10 cm, or 5 cm. In some embodiments, the electromagnetic radiation of the signal comprises ultra-wideband (UWB) radio waves, ultra-high frequency (UHF) radio waves, or radio waves utilized in the Global Positioning System (GPS). In some embodiments, the electromagnetic radiation comprises electromagnetic waves having a frequency of at least about 300 MHz, 500 MHz, or 1200 MHz. In some embodiments, the signal comprises location and/or time data. In some embodiments, the geolocation techniques include Bluetooth, UWB, UHF, and/or Global Positioning System (GPS) techniques. In some embodiments, the signal has a spatial capacity of at least about 10^13 bits per second per square meter (bits/s/m²).
In some embodiments, a pulse-based ultra-wideband (UWB) technology (e.g., ECMA-368 or ECMA-369) is a wireless technology used to transmit large amounts of data at low power (e.g., less than about 1 milliwatt (mW), 0.75 mW, 0.5 mW, or 0.25 mW) over short distances (e.g., up to about 300 feet, 250 feet, 230 feet, 200 feet, or 150 feet). The UWB signal may occupy at least about 750 MHz, 500 MHz, or 250 MHz of bandwidth spectrum and/or at least about 30%, 20%, or 10% of its center frequency. The UWB signal may be transmitted by one or more pulses. A component broadcasts precisely timed digital signal pulses on a carrier signal across multiple frequency channels simultaneously. Information may be transmitted, for example, by modulating the timing and/or position of the signal (e.g., pulses). Signal information may also be transmitted by encoding the polarity of the signal (e.g., pulses), its amplitude, and/or by using orthogonal signals (e.g., pulses). UWB may serve as a low-power information transfer protocol. UWB technology may be used for (e.g., indoor) positioning applications. The broad UWB spectrum includes low frequencies with long wavelengths, which allow UWB signals to penetrate a variety of materials, including various building fixtures (e.g., walls). The wide range of frequencies, including these low penetrating frequencies, may reduce the likelihood of multipath propagation errors (without wishing to be bound by theory, as some wavelengths may have a line-of-sight trajectory). UWB communication signals (e.g., pulses) may be relatively short (e.g., spanning up to about 70 cm, 60 cm, or 50 cm for pulses about 600 MHz, 500 MHz, or 400 MHz wide; or up to about 20 cm, 23 cm, 25 cm, or 30 cm for pulses having a bandwidth of about 1 GHz, 1.2 GHz, 1.3 GHz, or 1.5 GHz). Short communication signals (e.g., pulses) may reduce the likelihood that reflected signals (e.g., pulses) will overlap with the original signal (e.g., pulses).
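The pulse lengths quoted above are consistent with approximating a pulse's spatial extent as the speed of light divided by its bandwidth, which the following sketch illustrates:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s


def pulse_length_cm(bandwidth_hz):
    """Approximate spatial extent of a UWB pulse as c / bandwidth.

    This reproduces the figures in the text: a ~500 MHz-wide pulse
    spans roughly 60 cm, and a ~1.3 GHz-wide pulse roughly 23 cm.
    """
    return C / bandwidth_hz * 100.0
```

Shorter pulses (wider bandwidth) make it less likely that a reflection overlaps the original pulse, which is why wider-band UWB signals are attractive for indoor positioning.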
In some embodiments, an increase in the number and/or types of sensors may be used to increase the probability that one or more measured characteristics are accurate and/or that a particular event measured by one or more sensors has occurred. In some embodiments, the sensors of the sensor ensemble may cooperate with one another. In one example, a radar sensor of a sensor ensemble may determine the presence of multiple individuals in a peripheral structure. A processor (e.g., processor 1152) may determine that the detected presence of the plurality of individuals in the peripheral structure is positively correlated with an increase in carbon dioxide concentration. In one example, using a memory accessible to the processor, it may be determined that an increase in detected infrared energy is positively correlated with an increase in the temperature detected by the temperature sensor. In some embodiments, a network interface (e.g., 1150) may communicate with other sensor ensembles similar to this sensor ensemble. The network interface may additionally communicate with a controller.
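The positive correlation between, e.g., radar-detected occupancy and CO2 concentration can be checked with an ordinary Pearson correlation over co-located time series (a generic sketch, not a method from the disclosure):

```python
def correlation(xs, ys):
    """Pearson correlation between two equal-length sensor time
    series, e.g. occupancy counts from a radar sensor and CO2
    concentration readings.  Values near +1 indicate the positive
    correlation the cooperating sensors are expected to exhibit.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

A strongly positive value across, say, occupancy counts [0, 2, 4, 6] and CO2 readings [400, 450, 520, 600] ppm supports the inference that both sensors are responding to the same event.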
The individual sensors of the sensor ensemble (e.g., sensor 1110A, sensor 1110D, etc.) may include and/or utilize at least one dedicated processor. The sensor ensemble may utilize a remote processor (e.g., 1154) via wireless and/or wired communication links. The sensor ensemble may utilize at least one processor (e.g., processor 1152), which may represent a cloud-based processor coupled to the sensor ensemble via a cloud (e.g., 1150). The processors (e.g., 1152 and/or 1154) may be located in the same building, in different buildings, in buildings owned by the same or different entities, in a facility owned by the manufacturer of the window/controller/sensor ensemble, or at any other location. In various embodiments, as indicated by the dashed lines in FIG. 11, the sensor ensemble 1105 need not include a separate processor and network interface. These entities may be separate entities and may be operatively coupled to the ensemble 1105. The dashed lines in fig. 11 indicate optional features. In some embodiments, on-board processing and/or memory of one or more sensor ensembles may be used to support other functions (e.g., by allocating ensemble memory and/or processing power to the network infrastructure of a building).
In some embodiments, multiple sensors of the same type may be distributed in the peripheral structure. At least one sensor of the plurality of sensors of the same type may be part of the aggregate. For example, at least two sensors of the plurality of sensors of the same type may be part of at least two aggregates. The sensor ensemble may be distributed in a peripheral structure. The peripheral structure may include a conference room. For example, multiple sensors of the same type may measure environmental parameters in a conference room. A parametric topology of the peripheral structure may be generated in response to measuring the environmental parameter of the peripheral structure. The output signals from any type of sensor of the sensor ensemble may be utilized to generate the parametric topology, e.g., as disclosed herein. The parametric topology may be generated for any peripheral structure of a facility, such as a conference room, hallway, bathroom, cafeteria, garage, auditorium, utility room, storage facility, equipment room, and/or lift.
Fig. 12 shows an example of a diagram 1200 of an arrangement of sensor ensembles distributed within a peripheral structure. In the example shown in fig. 12, a group 1210 of individuals is seated in a meeting room 1202. The conference room includes an "X" dimension indicating length, a "Y" dimension indicating height, and a "Z" dimension indicating depth; X, Y, and Z are directions in a Cartesian coordinate system. Sensor assemblies 1205A, 1205B, and 1205C include sensors that may operate similarly to the sensors described with reference to sensor assembly 1105 of fig. 11. At least two of the sensor assemblies (e.g., 1205A, 1205B, and 1205C) may be integrated into a single sensor module. The sensor assemblies 1205A, 1205B, and 1205C can include carbon dioxide (CO2) sensors, ambient noise sensors, or any other sensor disclosed herein. In the example shown in fig. 12, the first sensor assembly 1205A is disposed (e.g., mounted) near a point 1215A, which may correspond to a location in a ceiling, in a wall, or at another location to one side of the table at which the group 1210 of individuals is seated. In the example shown in fig. 12, the second sensor assembly 1205B is disposed (e.g., mounted) near a point 1215B, which may correspond to a location in a ceiling, in a wall, or at another location above (e.g., directly above) the table at which the group 1210 of individuals is seated. In the example shown in fig. 12, the third sensor aggregate 1205C may be disposed (e.g., mounted) at or near a point 1215C, which may correspond to a location in a ceiling or wall, or to a side of the table at which relatively few of the group 1210 of individuals are seated. Any number of additional sensors and/or sensor modules may be positioned at other locations in conference room 1202. A sensor assembly may be disposed anywhere in the peripheral structure. The location of a sensor assembly in the peripheral structure may have coordinates (e.g., in a Cartesian coordinate system).
At least one coordinate (e.g., of x, y, and z) may differ between two or more sensor ensembles disposed, for example, in a peripheral structure. At least two coordinates (e.g., of x, y, and z) may differ between two or more sensor ensembles disposed, for example, in a peripheral structure. All coordinates (e.g., of x, y, and z) may differ between two or more sensor ensembles disposed, for example, in a peripheral structure. For example, two sensor ensembles may have the same x coordinate and different y and z coordinates. For example, two sensor ensembles may have the same x and y coordinates and different z coordinates. For example, two sensor ensembles may have different x, y, and z coordinates.
In certain embodiments, one or more sensors of the sensor assembly provide readings. In some embodiments, the sensor is configured to sense a parameter. The parameter may include temperature, particulate matter, volatile organic compounds, electromagnetic energy, pressure, acceleration, time, radar, lidar, glass breakage, movement, or gas. The gas may comprise an inert gas. The gas may be a gas harmful to the average human. The gas may be a gas present in the ambient atmosphere (e.g., oxygen, carbon dioxide, ozone, chlorinated carbon compounds, or nitrogen compounds such as nitric oxide (NO) and/or nitrogen dioxide (NO2)). The gas may include oxygen, nitrogen, carbon dioxide, carbon monoxide, hydrogen sulfide, nitrogen dioxide, inert gases, noble gases (e.g., radon), chlorine, ozone, formaldehyde, methane, or ethane. The gas may include radon, carbon monoxide, hydrogen sulfide, hydrogen, oxygen, or water (e.g., moisture). The electromagnetic sensor may comprise an infrared, visible-light, or ultraviolet sensor. The infrared radiation may be passive infrared radiation (e.g., black-body radiation). The electromagnetic sensor may sense radio waves. The radio waves may include wideband or ultra-wideband radio signals. The radio waves may include pulsed radio waves. The radio waves may include radio waves utilized in communications. The gas sensor may sense gas type, flow (e.g., velocity and/or acceleration), pressure, and/or concentration. The readings may have an amplitude range. The readings may have a parameter range. For example, the parameter may be electromagnetic wavelength, and the range may be the range of detected wavelengths.
In some embodiments, the sensor data is responsive to the environment in the peripheral structure and/or any causative factor of the change in the environment (e.g., any environmental disturbance factor). The sensor data may be responsive to a transmitter (e.g., an occupant, an appliance (e.g., a heater, cooler, ventilator, and/or vacuum), an opening) operatively coupled to (e.g., in) the peripheral structure. For example, the sensor data may be in response to an air conditioning duct or in response to an open window. The sensor data may be responsive to activity occurring in the room. The activities may include human activities and/or non-human activities. The activity may include an electronic activity, a gaseous activity, and/or a chemical activity. The activity may include a sensory activity (e.g., visual, tactile, olfactory, auditory, and/or taste). The activity may comprise electronic and/or magnetic activity. The activity may be perceived by a person. The activity may not be perceived by a person. The sensor data may be responsive to an occupant, a substance (e.g., gas) flow, a substance (e.g., gas) pressure, and/or a temperature in the peripheral structure.
In one example, sensor ensembles 1205A, 1205B, and 1205C include carbon dioxide (CO2) sensors and ambient noise sensors. The carbon dioxide sensor of the sensor ensemble 1205A can provide readings as depicted in the sensor output reading profile 1225A. The noise sensor of the sensor ensemble 1205A may provide readings as also depicted in the sensor output reading profile 1225A. The carbon dioxide sensor of the sensor ensemble 1205B can provide readings as depicted in the sensor output reading profile 1225B. The noise sensor of the sensor ensemble 1205B may provide readings as also depicted in the sensor output reading profile 1225B. The sensor output reading profile 1225B may indicate higher carbon dioxide and noise levels relative to the sensor output reading profile 1225A. The sensor output reading profile 1225C can indicate lower carbon dioxide and noise levels relative to the sensor output reading profile 1225B. The sensor output reading profile 1225C can indicate carbon dioxide and noise levels similar to the sensor output reading profile 1225A. The sensor output reading profiles 1225A, 1225B, and 1225C can include indications representing other sensor readings, such as temperature, humidity, particulate matter, volatile organic compounds, ambient light, pressure, acceleration, time, radar, lidar, ultra-wideband radio signals, passive infrared, glass breakage, and/or movement detection.
In some embodiments, data from sensors in the peripheral structure (e.g., and in the sensor ensemble) is collected and/or processed (e.g., analyzed). The data processing may be performed by a processor of the sensor, by a processor of the ensemble of sensors, by another sensor, by another ensemble, in the cloud, by a processor of the controller, by a processor in the peripheral structure, by a processor outside the peripheral structure, by a remote processor (e.g., in a different facility), by a manufacturer (e.g., of the sensor, the window, and/or the building network). The data of the sensor may have a time-indicative identification (e.g., time-stamps). The data of the sensor may have an identification of the sensor location (e.g., a location stamp). The sensors may be identifiably coupled to one or more controllers.
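The time-indicative identification (time stamp) and location identification (location stamp) described above can be sketched as a simple record type. The names and field layout below are illustrative assumptions for discussion, not a record format prescribed by the disclosure:

```python
from dataclasses import dataclass, field
import time

@dataclass
class SensorReading:
    """One sensor reading carrying a time stamp and a location stamp."""
    sensor_id: str            # identifiable coupling, e.g., "ensemble-A/co2"
    parameter: str            # sensed parameter, e.g., "co2" or "noise"
    value: float
    timestamp: float = field(default_factory=time.time)  # time-indicative identification
    location: tuple = (0.0, 0.0, 0.0)                    # (x, y, z) location stamp

def readings_in_window(readings, t_start, t_end):
    """Select readings whose time stamp falls in [t_start, t_end)."""
    return [r for r in readings if t_start <= r.timestamp < t_end]

# Example: stamped readings from two ensembles at different locations.
r1 = SensorReading("ensemble-A/co2", "co2", 415.0, timestamp=100.0, location=(1.0, 2.5, 0.0))
r2 = SensorReading("ensemble-B/co2", "co2", 560.0, timestamp=205.0, location=(4.0, 2.5, 0.0))
window = readings_in_window([r1, r2], 0.0, 200.0)
```

Such stamps let a processor (local, in the cloud, or remote) attribute each datum to a sensor, a position in the peripheral structure, and a moment in time before analysis.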
In particular embodiments, sensor output reading profiles 1225A, 1225B, and 1225C may be processed. For example, as part of the processing (e.g., analysis), a distribution of sensor output readings may be plotted on a graph depicting sensor readings as a function of a dimension (e.g., the "X" dimension) of the peripheral structure (e.g., conference room 1202). In one example, the carbon dioxide level indicated in the sensor output reading profile 1225A may be indicated as point 1235A of the CO2 graph 1230 of fig. 12. In one example, the carbon dioxide level of the sensor output reading profile 1225B may be indicated as point 1235B of the CO2 graph 1230. In one example, the carbon dioxide level indicated in the sensor output reading profile 1225C may be indicated as point 1235C of the CO2 graph 1230. In one example, the ambient noise level indicated in sensor output reading profile 1225A may be indicated as point 1245A of noise plot 1240. In one example, the ambient noise level indicated in sensor output reading profile 1225B may be indicated as point 1245B of noise plot 1240. In one example, the ambient noise level indicated in sensor output reading profile 1225C may be indicated as point 1245C of noise plot 1240.
In some embodiments, processing data derived from the sensors includes applying one or more models. The model may comprise a mathematical model. The processing may include fitting (e.g., curve fitting) of the model. The model may be multi-dimensional (e.g., two-dimensional or three-dimensional). The model may be represented as a graph (e.g., a 2-dimensional graph or a 3-dimensional graph). For example, the model may be represented as a contour map (e.g., as depicted in fig. 7). The model may include one or more matrices. The model may comprise a topological model. The model may relate to the topology of the sensed parameter in the peripheral structure. The model may relate to temporal variations of the topology of the sensed parameter in the peripheral structure. The model may be environment and/or peripheral structure specific. The model may take into account one or more characteristics of the peripheral structure (e.g., size, openings, and/or environmental interference factors (e.g., transmitters)). The processing of the sensor data may utilize historical sensor data and/or current (e.g., real-time) sensor data. Data processing (e.g., utilizing a model) may be used to predict environmental changes in the peripheral structure and/or recommend actions that mitigate, adjust, or otherwise react to such changes.
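As an illustrative sketch of such curve fitting, the snippet below fits an exact quadratic through three sensor readings taken as a function of one dimension of the peripheral structure. The quadratic form and the numeric readings are assumptions for the example, not a model mandated by the disclosure:

```python
def fit_quadratic(points):
    """Return the parabola passing exactly through three (x, y) readings.

    Uses the Lagrange form of the interpolating polynomial; with three
    points this yields y = a*x^2 + b*x + c without solving a system.
    """
    (x1, y1), (x2, y2), (x3, y3) = points
    def model(x):
        return (y1 * (x - x2) * (x - x3) / ((x1 - x2) * (x1 - x3))
                + y2 * (x - x1) * (x - x3) / ((x2 - x1) * (x2 - x3))
                + y3 * (x - x1) * (x - x2) / ((x3 - x1) * (x3 - x2)))
    return model

# Hypothetical CO2 readings (ppm) from three ensembles along the "X" dimension:
# low near the walls, high above the center of the room.
co2_model = fit_quadratic([(0.0, 450.0), (5.0, 700.0), (10.0, 430.0)])
```

The fitted model can then be evaluated between sensor locations (e.g., `co2_model(2.5)`), which is the kind of interpolated sensor profile the curves 1250A-1250E represent.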
In particular embodiments, the sensor ensembles 1205A, 1205B, and/or 1205C can access a model to allow curve fitting of sensor readings as a function of one or more dimensions of the peripheral structure. In one example, a model can be accessed to generate sensor profiles 1250A, 1250B, 1250C, 1250D, and 1250E using points 1235A, 1235B, and 1235C of the CO2 graph 1230. In one example, the model can be accessed to generate sensor profiles 1251A, 1251B, 1251C, 1251D, and 1251E using points 1245A, 1245B, and 1245C of noise graph 1240. In addition to the sensor distribution curves 1250 and 1251 of FIG. 12, additional models may utilize additional readings from a sensor ensemble (e.g., 1205A, 1205B, and/or 1205C) to provide further curves. The sensor profile generated in response to use of the model may indicate a distribution of sensor output readings, i.e., values of a particular environmental parameter as a function of a dimension of the peripheral structure (e.g., the "X" dimension, the "Y" dimension, and/or the "Z" dimension).
In certain embodiments, one or more models used to form the curves 1250A-1250E and 1251A-1251E may provide a parametric topology of the peripheral structure. In one example, the parameter topology (as represented by the curves 1250A-1250E and 1251A-1251E) can be synthesized or generated from a distribution of sensor output readings. The parameter topology may be the topology of any sensed parameter disclosed herein. In one example, the parameter topology of a conference room (e.g., conference room 1202) may include a carbon dioxide profile having a relatively low value at a location away from a conference room table and a relatively high value at a location above (e.g., directly above) the conference room table. In one example, the parameter topology of the conference room may include a multi-dimensional noise distribution having a relatively low value at a location away from the conference room table and a slightly higher value above (e.g., directly above) the conference room table.
Fig. 13 shows an example of a diagram 1300 of an arrangement of sensor ensembles distributed within a peripheral structure. In the example shown in fig. 13, a relatively large group 1310 of individuals (e.g., larger relative to the group 1010 of the conference room) is gathered in an auditorium 1302. The auditorium includes an "X" dimension for indicating length, a "Y" dimension for indicating height, and a "Z" dimension for indicating depth. The sensor ensembles 1305A, 1305B, and 1305C may include sensors that operate similarly to the sensors described with reference to the sensor ensemble 905 of fig. 9. At least two sensor ensembles (e.g., of 1305A, 1305B, and 1305C) may be integrated into a single sensor module. The sensor ensembles 1305A, 1305B, and 1305C may include a carbon dioxide (CO2) sensor, an ambient noise sensor, or any other sensor disclosed herein. In the example shown in fig. 13, a first sensor ensemble 1305A is disposed (e.g., mounted) near a point 1315A, which may correspond to a location in a ceiling, wall, or other location on the side of the seating area where the relatively large group 1310 of individuals is seated. In the example shown in fig. 13, a second sensor ensemble 1305B may be disposed (e.g., mounted) at point 1315B, which may correspond to a location in a ceiling, wall, or other location above (e.g., directly above) the area where the relatively large group 1310 of individuals gathers. The third sensor ensemble 1305C may be disposed (e.g., mounted) at or near a point 1315C, which may correspond to a location in a ceiling, wall, or other location on a side of a table where the relatively large group 1310 of individuals is located. Any number of additional sensors and/or sensor modules may be positioned at other locations of the auditorium 1302. The sensor ensembles may be disposed anywhere in the peripheral structure.
In one example, the sensor ensembles 1305A, 1305B, and 1305C include carbon dioxide sensors and ambient noise sensors. The carbon dioxide sensor of the sensor ensemble 1305A may provide readings as depicted in the sensor output reading distribution 1325A. The noise sensor of the sensor ensemble 1305A may provide readings as also depicted in the sensor output reading distribution 1325A. The carbon dioxide sensor of the sensor ensemble 1305B may provide readings as depicted in the sensor output reading distribution 1325B. The noise sensor of the sensor ensemble 1305B may provide readings as also depicted in the sensor output reading distribution 1325B. The sensor output reading distribution 1325B may indicate higher carbon dioxide and noise levels relative to the sensor output reading distribution 1325A. The sensor output reading distribution 1325C may indicate lower carbon dioxide and noise levels relative to the sensor output reading distribution 1325B. The sensor output reading distribution 1325C may indicate carbon dioxide and noise levels similar to the sensor output reading distribution 1325A. Sensor output reading distributions 1325A, 1325B, and 1325C can include indications of other sensor readings representative of any of the sensed parameters disclosed herein.
In a particular implementation, the sensor output reading distributions 1325A, 1325B, and 1325C can be plotted on a graph depicting sensor readings as a function of a dimension (e.g., the "X" dimension) of the peripheral structure (e.g., the auditorium 1302). In one example, the carbon dioxide level indicated in sensor output reading profile 1325A may be indicated as point 1335A of the CO2 graph 1330 (as shown in fig. 13). In one example, the carbon dioxide level of sensor output reading profile 1325B may be indicated as point 1335B of the CO2 graph 1330. In one example, the carbon dioxide level indicated in the sensor output reading profile 1325C may be indicated as point 1335C of the CO2 graph 1330. In one example, the ambient noise level indicated in the sensor output reading profile 1325A may be indicated as point 1345A of the noise plot 1340. In one example, the ambient noise level indicated in the sensor output reading profile 1325B may be indicated as point 1345B of the noise plot 1340. In one example, the ambient noise level indicated in the sensor output reading profile 1325C may be indicated as point 1345C of the noise plot 1340.
In particular embodiments, the sensor ensembles 1305A, 1305B, and/or 1305C can utilize and/or access (e.g., be configured to utilize and/or access) a model to allow curve fitting of sensor readings as a function of one or more dimensions of the peripheral structure. In the example shown in FIG. 13, the model may be accessed using points 1335A, 1335B, and 1335C of the CO2 graph 1330 to provide a sensor distribution. In the example shown in fig. 13, the model may be accessed to provide a sensor distribution 1351 using points 1345A, 1345B, and 1345C of the noise plot 1340. Additional models can utilize additional readings from a sensor ensemble (e.g., 1305A, 1305B, and/or 1305C) to provide the sensor profiles (e.g., sensor profiles 1350A, 1350B, 1350C, 1350D, and 1350E) of fig. 13. The model may be utilized to provide sensor profiles (e.g., sensor profiles 1351A, 1351B, 1351C, 1351D, and 1351E) corresponding to ambient noise levels. The sensor profiles generated in response to use of the model may indicate values of particular environmental parameters as a function of dimensions of the peripheral structure (e.g., the "X" dimension, the "Y" dimension, and/or the "Z" dimension). In certain embodiments, one or more models used to form the sensor profiles 1350 and 1351 can provide a parameter topology of the peripheral structure. The parameter topology may indicate a particular type of peripheral structure. In one example, a parameter topology may be synthesized or generated from the sensor distribution curves 1350 and 1351, and may correspond to a parameter topology of an auditorium. In one example, the parameter topology of the auditorium may include a carbon dioxide distribution having at least moderately high values at all locations and very high values near the center of the auditorium.
In one example, the parameter topology of the auditorium may include a noise distribution having relatively high values at all locations of the auditorium and higher values near the center of the auditorium. In particular embodiments, sensor readings may be obtained from one or more sensors of a sensor ensemble. The sensor readings may be obtained by the sensor itself. Sensor readings may be obtained by cooperating sensors, which may be the same type or different types of sensors. The sensor readings may be obtained by one or more processors and/or controllers. The sensor readings may be processed by considering one or more other readings, historical readings, benchmarks, and/or modeling from other sensors disposed (e.g., mounted) within the peripheral structure to generate results (e.g., predictions or estimates of the sensor readings). The generated results may be used to detect outliers of sensor readings and/or outlier sensors. The generated results may be utilized to detect environmental changes at time and/or location. The generated results may be utilized to predict future readings of the one or more sensors in the peripheral structure.
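One way the outlier detection described above might be realized is to compare each sensor's reading against the consensus of its peers in the same peripheral structure. The sketch below uses the median and the median absolute deviation (a robust spread estimate); the threshold, sensor names, and readings are illustrative assumptions, not values from the disclosure:

```python
from statistics import median

def detect_outliers(readings, k=3.0):
    """Flag sensors whose reading is far from the ensemble median.

    `readings` maps sensor id -> reading; a sensor is flagged when its
    absolute deviation from the median exceeds k times the median
    absolute deviation (MAD) of the group.
    """
    values = list(readings.values())
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:
        mad = 1e-9  # all peers agree exactly; any deviation is suspect
    return sorted(sid for sid, v in readings.items() if abs(v - med) / mad > k)

# Hypothetical CO2 readings (ppm): sensor "D" disagrees with its neighbors.
flagged = detect_outliers({"A": 500.0, "B": 510.0, "C": 505.0, "D": 900.0})
```

A flagged sensor could then be treated as an outlier sensor to recalibrate, or its reading as outlier data to discount, per the generated results described in the text.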
In some embodiments, the sensor is operatively coupled to at least one controller and/or processor. The sensor readings may be obtained by one or more processors and/or controllers. The controller may include a processing unit (e.g., a CPU or GPU). The controller may receive input (e.g., from at least one sensor). The controller may include circuitry, wires, cables, sockets, and/or power sockets. The controller may deliver an output. The controller may include a plurality of (e.g., sub-) controllers. The controller may be part of a control system. The control system may include a master controller, a floor controller (e.g., including a network controller), and a local controller. The local controller may be a window controller (e.g., controlling an optically switchable window), a peripheral structure controller, or a component controller. For example, the controller can be part of a hierarchical control system (e.g., including a master controller that directs one or more controllers, such as a floor controller, a local controller (e.g., a window controller), a housing controller, and/or a component controller). The physical location of the controller types in the hierarchical control system may change over time. For example: at a first time: the first processor may assume the role of a master controller, the second processor may assume the role of a floor controller, and the third processor may assume the role of a local controller. At a second time: the second processor may assume the role of a master controller, the first processor may assume the role of a floor controller, and the third processor may retain the role of a local controller. At a third time: the third processor may assume the role of a master controller, the second processor may assume the role of a floor controller, and the first processor may assume the role of a local controller. The controller may control one or more devices (e.g., directly coupled to the devices).
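The time-varying role assignment above can be sketched as a lookup over a schedule. The processor names, role names, and the schedule itself are placeholders mirroring the three times described in the text, not an interface defined by the disclosure:

```python
# Which physical processor holds each controller role at each time.
SCHEDULE = {
    "t1": {"p1": "master", "p2": "floor", "p3": "local"},
    "t2": {"p1": "floor", "p2": "master", "p3": "local"},
    "t3": {"p1": "local", "p2": "floor", "p3": "master"},
}

def role_of(processor, time_key):
    """Role a given processor assumes at a given time."""
    return SCHEDULE[time_key][processor]

def holder_of(role, time_key):
    """Physical processor that holds a given controller role at a given time."""
    for proc, r in SCHEDULE[time_key].items():
        if r == role:
            return proc
    raise LookupError(role)
```

The point of the indirection is that the logical hierarchy (master over floor over local) stays fixed while the physical processor filling each slot may change.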
The controller may be located in proximity to one or more devices it controls. For example, the controller may control a light-switchable device (e.g., an IGU), an antenna, a sensor, and/or an output device (e.g., a light source, a sound source, an odor source, a gas source, an HVAC outlet, or a heater). In one embodiment, the floor controller may direct one or more window controllers, one or more enclosure controllers, one or more component controllers, or any combination thereof. The floor controller may comprise a network controller. For example, a floor (e.g., including a network) controller may control a plurality of local (e.g., including window) controllers. A plurality of local controllers may be provided in a portion of a facility (e.g., in a portion of a building). A portion of a facility may be a floor of the facility. For example, a floor controller may be assigned to a floor. In some embodiments, a floor may include multiple floor controllers, for example, depending on the size of the floor and/or the number of local controllers coupled to the floor controller. For example, a floor controller may be assigned to a portion of a floor. For example, a floor controller may be assigned to a portion of the local controllers disposed in a facility. For example, a floor controller may be assigned to a portion of a floor of a facility. The master controller may be coupled to one or more floor controllers. The floor controller may be located in the facility. The master controller may be located within the facility or outside the facility. The master controller may be disposed in the cloud. The controller may be part of or operatively coupled to a building management system. The controller may receive one or more inputs. The controller may generate one or more outputs. The controller may be a single input single output controller (SISO) or a multiple input multiple output controller (MIMO). The controller may interpret the received input signal.
The controller may acquire data from one or more components (e.g., sensors). The obtaining may include receiving or extracting. The data may include measurements, estimates, determinations, generations, or any combination thereof. The controller may include feedback control. The controller may include a feed forward control. The control may include on-off control, Proportional Integral (PI) control, or Proportional Integral Derivative (PID) control. The control may include open loop control or closed loop control. The controller may comprise a closed loop control. The controller may include open loop control. The controller may include a user interface. The user interface may include (or be operatively coupled to) a keyboard, a keypad, a mouse, a touch screen, a microphone, a voice recognition package, a camera, an imaging system, or any combination thereof. The output may include a display (e.g., a screen), speakers, or a printer. Fig. 14 shows an example of a control system architecture 1400 that includes a master controller 1408 that controls a floor controller 1406, which in turn controls a local controller 1404. In some embodiments, the local controller controls one or more IGUs, one or more sensors, one or more output devices (e.g., one or more transmitters), or any combination thereof. Fig. 14 shows an example of a configuration in which a master controller is operatively coupled (e.g., wirelessly and/or wired) to a Building Management System (BMS) 1424 and a database 1420. Arrows in fig. 14 indicate communication paths. The controller can be operatively coupled (e.g., directly/indirectly and/or wired and/or wireless) to an external source 1410. The external source may comprise a network. The external source may include one or more sensors or output devices. The external source may include a cloud-based application and/or a database. The communication may be wired and/or wireless. The external source may be located outside the facility.
For example, the external source may include one or more sensors and/or antennas disposed, for example, on a wall or ceiling of the facility. The communication may be unidirectional or bidirectional. In the example shown in fig. 14, the communication along all communication arrows is meant to be bidirectional.
FIG. 15 shows a flow diagram of a method 1500 for detecting outliers based at least in part on sensor readings. The method of fig. 15 may be performed by individual sensors of a sensor ensemble. The method of fig. 15 may be performed by a first sensor coupled to (e.g., in communication with) a second sensor. The method of fig. 15 may be directed by a controller coupled to (e.g., in communication with) a first sensor and/or a second sensor. The method of FIG. 15 begins at 1510, where sensor readings are obtained from one or more sensors of a sensor ensemble. At 1520, the readings are processed (e.g., by taking into account peripheral structures, historical readings, benchmarks, and/or modeling) to generate results. At 1530, the results are utilized to detect outlier data, detect outlier sensors, detect environmental changes (e.g., at a particular time and/or location), and/or predict future readings of the one or more sensors.
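The 1510→1520→1530 flow can be sketched as a small pipeline whose three steps are interchangeable callables. The benchmark value, stub readings, and threshold below are hypothetical illustrations, not values from the disclosure:

```python
def method_1500(obtain, process, act):
    """Sketch of the FIG. 15 flow: obtain readings (1510), process them
    against benchmarks/history/models (1520), then utilize the result
    to detect outliers or predict readings (1530)."""
    readings = obtain()          # 1510: obtain readings from the ensemble
    result = process(readings)   # 1520: consider benchmarks and/or models
    return act(result)           # 1530: detect outlier data / sensors

# Illustrative stubs: deviation of each reading from a benchmark CO2 level.
BENCHMARK = 450.0  # hypothetical expected CO2 level (ppm)
flagged = method_1500(
    obtain=lambda: {"s1": 452.0, "s2": 449.0, "s3": 910.0},
    process=lambda r: {sid: abs(v - BENCHMARK) for sid, v in r.items()},
    act=lambda dev: sorted(sid for sid, d in dev.items() if d > 100.0),
)
```

Keeping the steps as separate callables reflects the text's point that each step may run on a sensor, another sensor, or a controller.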
In particular embodiments, sensor readings from a particular sensor may be correlated with sensor readings from the same type or different types of sensors. Receipt of a sensor reading may cause the sensor to access correlation data from other sensors disposed within the same peripheral structure. Based at least in part on the access correlation data, a reliability of the sensor can be determined or estimated. In response to determining or estimating the reliability of the sensor, the sensor output reading may be adjusted (e.g., increased/decreased). A reliability value may be assigned to the sensor based on the adjusted sensor reading.
Fig. 16 illustrates a flow chart of a method 1650 for detecting and adjusting outliers based at least in part on sensor readings. The method of fig. 16 may be performed by individual sensors of a sensor ensemble. The method of fig. 16 may be performed by a first sensor coupled to (e.g., and in communication with) a second sensor. The method of fig. 16 may be directed by at least one controller (e.g., a processor) coupled to (e.g., in communication with) the first sensor and/or the second sensor. The method of FIG. 16 begins at 1655, where sensor readings are obtained from one or more sensors of a collection of sensors disposed in a peripheral structure. The sensor readings may be any type of reading, such as detecting movement of an individual within the peripheral structure, temperature, humidity, or any other characteristic detected by the sensor. At 1660, correlation data can be accessed from other sensors disposed in the peripheral structure. The correlation data may relate to output readings of the same type of sensor or different types of sensors operating within the peripheral structure. In one example, the noise sensor may access data from the movement sensor to determine whether one or more individuals have entered the peripheral structure. One or more individuals moving within the peripheral structure may generate noise. In one example, the output signal from the noise sensor may be confirmed by a second noise sensor and/or by the movement detector. At 1665, based at least in part on the accessed correlation data, a reliability of the obtained sensor readings can be determined. In one example, the output signal from a faulty (e.g., uncalibrated, miscalibrated, or otherwise faulty) noise sensor may be determined to have reduced reliability when no movement is detected by the movement detector.
In one example, a sensor reading from a calibrated noise sensor may be determined to have increased reliability in response to the calibrated noise sensor reporting an increase in detected noise and concurrent movement detection. At 1670, the sensor readings may be adjusted (e.g., and recalibrated) based at least in part on the determined reliability of the obtained sensor readings. In one example, a faulty (e.g., uncalibrated, miscalibrated, or otherwise faulty) noise sensor sensing a large increase in noise when the motion sensor detects very small movements may cause an adjustment (e.g., a decrease) in the noise sensor output reading. In one example, a faulty noise sensor sensing only a small increase in noise when the movement detector detects a large number of individuals entering the peripheral structure may cause an adjustment (e.g., an increase) in the noise sensor output reading. At 1675, assigning or updating the reliability value of the one or more sensors may be performed based at least in part on the adjusted sensor readings. In one example, a newly installed sensor that repeatedly (e.g., two or more times) provides output readings that are inconsistent with other sensors of the same or different types may: (i) is assigned a lower reliability value; (ii) is calibrated or recalibrated; and/or (iii) check for any other reliability issues. In one example, a calibrated sensor that repeatedly provides output readings consistent with other sensors of the same or different types may be assigned a higher reliability value.
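The reliability assignment and reading adjustment described above can be sketched as two small update rules. The step size, clamping to [0, 1], and linear blending toward a consensus value are illustrative design choices, not rules prescribed by the disclosure:

```python
def update_reliability(reliability, consistent, step=0.1):
    """Raise a sensor's reliability value when its reading agrees with
    correlated sensors; lower it when it disagrees.  Clamped to [0, 1]."""
    delta = step if consistent else -step
    return min(1.0, max(0.0, reliability + delta))

def adjusted_reading(raw, reliability, consensus):
    """Blend a raw reading toward the consensus of correlated sensors in
    proportion to how unreliable the sensor currently is."""
    return reliability * raw + (1.0 - reliability) * consensus

# Hypothetical faulty noise sensor: a large reading while the movement
# detector sees very small movement, so the reading is adjusted downward.
rel = update_reliability(0.5, consistent=False)      # reliability drops
value = adjusted_reading(80.0, rel, consensus=40.0)  # reading pulled toward peers
```

Repeated inconsistency drives the reliability value toward zero (the newly installed sensor case), while repeated agreement drives it toward one (the calibrated sensor case).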
Fig. 17 shows an example of a controller 1705 for controlling one or more sensors. Controller 1705 includes a sensor correlator 1710, a model generator 1715, an event detector 1720, a processor and memory 1725, and a network interface 1750. The sensor correlator 1710 operates to detect correlation between various sensor types. For example, an infrared radiation sensor measuring an increase in infrared energy may be positively correlated to an increase in measured temperature. The sensor correlator may establish a correlation coefficient, such as a coefficient for a negative correlation sensor reading (e.g., a correlation coefficient between-1 and 0). For example, the sensor correlator may establish coefficients of positively correlated sensor readings (e.g., correlation coefficients between 0 and 1).
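The correlation coefficients the sensor correlator establishes can be computed, for example, as Pearson coefficients. The infrared/temperature series below are hypothetical numbers chosen to mirror the positive-correlation example in the text:

```python
from math import sqrt

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length reading series.

    Falls in [-1, 1]: between -1 and 0 for negatively correlated sensor
    readings, between 0 and 1 for positively correlated ones.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

# Hypothetical series: measured infrared energy rising with temperature.
ir_energy = [1.0, 2.0, 3.0, 4.0]
temperature_c = [20.0, 21.0, 22.0, 23.0]
```

Here `correlation(ir_energy, temperature_c)` is strongly positive, the situation the text gives for an infrared sensor and a temperature sensor.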
In some embodiments, the sensor data may be time-dependent. In some embodiments, the sensor data may be space-dependent. The model may utilize temporal and/or spatial dependencies of the sensed parameters. The model generator may allow fitting of sensor readings as a function of one or more dimensions of the peripheral structure. In one example, a model providing a sensor profile of carbon dioxide may utilize various gas diffusion models, which may allow for prediction of carbon dioxide levels at points between sensor locations. The processor and memory (e.g., 1725) may facilitate processing of the model.
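As one illustrative stand-in for the "various gas diffusion models" mentioned above, the sketch below runs an explicit one-dimensional diffusion update over CO2 samples to estimate levels between and around sensor locations. The diffusion number, step count, and readings are assumptions for the example:

```python
def diffuse(profile, d=0.2, steps=10):
    """Explicit 1-D diffusion update over equally spaced samples.

    `d` is a dimensionless diffusion number (stable for d <= 0.5); the end
    samples are held fixed as boundary conditions.
    """
    p = list(profile)
    for _ in range(steps):
        interior = [p[i] + d * (p[i - 1] - 2 * p[i] + p[i + 1])
                    for i in range(1, len(p) - 1)]
        p = [p[0]] + interior + [p[-1]]
    return p

# Hypothetical CO2 samples (ppm) along the "X" dimension, peaked mid-room;
# diffusion spreads the peak toward neighboring positions over time.
smoothed = diffuse([400.0, 400.0, 800.0, 400.0, 400.0])
```

Evaluating the evolved profile at intermediate indices gives the kind of between-sensor prediction the model generator is described as enabling.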
In some embodiments, the sensor and/or sensor ensemble may act as an event detector. The event detector may be operable to guide the activity of the sensors in the peripheral structure. In one example, in response to the event detector determining that very few individuals remain in the peripheral structure, the event detector may direct the carbon dioxide sensor to decrease its sampling rate. The reduction in sampling rate may extend the life of the sensor (e.g., the carbon dioxide sensor). In another example, the event detector may increase the sampling rate of the carbon dioxide sensor in response to the event detector determining that a large number of individuals are present in the room. In one example, in response to the event detector receiving a signal from the glass break sensor, the event detector may activate one or more movement detectors and/or one or more radar units of the peripheral structure. The network interface (e.g., 1750) may be configured or designed to communicate with one or more sensors via a wireless communication link, a wired communication link, or any combination thereof.
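The occupancy-dependent sampling-rate policy described above might look like the following sketch. The base and minimum periods and the inverse-occupancy rule are illustrative assumptions, not a policy given in the disclosure:

```python
def sampling_period(occupancy, base_period=60.0, min_period=5.0):
    """Choose a CO2-sensor sampling period (seconds) from estimated occupancy.

    A near-empty peripheral structure is sampled slowly (extending sensor
    life); a crowded one is sampled faster, down to a floor of min_period.
    """
    if occupancy <= 1:
        return base_period                      # very few individuals remain
    return max(min_period, base_period / occupancy)
```

For example, an empty room keeps the slow 60 s period, while thirty occupants drive the period down to the 5 s floor.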
The controller can monitor and/or direct changes (e.g., physical) in the operating conditions of the devices, software, and/or methods described herein. Control may include regulation, manipulation, restriction, guidance, monitoring, adjustment, modulation, alteration, suppression, inspection, guidance, or management. Controlled (e.g., by a controller) may include attenuating, modulating, changing, managing, suppressing, normalizing, adjusting, constraining, supervising, manipulating, and/or directing. Controlling may include controlling a control variable (e.g., temperature, power, voltage, and/or profile). Control may include real-time or offline control. The calculations utilized by the controller may be done in real time and/or off-line. The controller may be a manual or non-manual controller. The controller may be an automatic controller. The controller may operate on request. The controller may be a programmable controller. The controller may be programmed. The controller may include a processing unit (e.g., a CPU or GPU). The controller may receive input (e.g., from at least one sensor). The controller may deliver an output. The controller may include a plurality (e.g., sub) of controllers. The controller may be part of a control system. The control system may include a master controller, a floor controller, a local controller (e.g., a peripheral structure controller or a window controller). The controller may receive one or more inputs. The controller may generate one or more outputs. The controller may be a single input single output controller (SISO) or a multiple input multiple output controller (MIMO). The controller may interpret the received input signal. The controller may acquire data from one or more sensors. The obtaining may include receiving or extracting. The data may include measurements, estimates, determinations, generations, or any combination thereof. The controller may include feedback control. 
The controller may include a feed forward control. The control may include on-off control, Proportional Integral (PI) control, or Proportional Integral Derivative (PID) control. The control may include open loop control or closed loop control. The controller may comprise a closed loop control. The controller may include open loop control. The controller may include a user interface. The user interface may include (or be operatively coupled to) a keyboard, keypad, mouse, touch screen, microphone, voice recognition package, camera, imaging system, or any combination thereof. The output may include a display (e.g., a screen), speakers, or a printer.
The methods, systems, and/or devices described herein may include a control system. The control system may be in communication with any of the devices (e.g., sensors) described herein. The sensors may be of the same type or different types, for example as described herein. For example, the control system may be in communication with the first sensor and/or the second sensor. The control system may control one or more sensors. The control system may control one or more components of a building management system (e.g., a lighting, security, and/or air conditioning system). The controller may adjust at least one (e.g., environmental) characteristic of the enclosure. The control system may use any component of the building management system to regulate the enclosure environment. For example, the control system may regulate the energy supplied by the heating element and/or by the cooling element. For example, the control system may regulate the velocity of air flowing into and/or out of the enclosure through the vent. The control system may include a processor. The processor may be a processing unit. The controller may comprise a processing unit. The processing unit may be central. The processing unit may include a central processing unit (abbreviated herein as "CPU"). The processing unit may be a graphics processing unit (abbreviated herein as "GPU"). A controller or control mechanism (e.g., including a computer system) may be programmed to implement one or more methods of the present disclosure. A processor may be programmed to implement the methods of the present disclosure. A controller may control at least one component of the forming systems and/or apparatus disclosed herein.
Fig. 18 shows an illustrative example of a computer system 1800 programmed or otherwise configured to perform one or more operations of any of the methods provided herein. The computer system may control (e.g., direct, monitor, and/or regulate) various features of the methods, apparatus, and systems of the present disclosure, such as controlling heating, cooling, lighting, and/or ventilation of peripheral structures, or any combination thereof. The computer system may be part of or in communication with any of the sensors or sensor assemblies disclosed herein. The computer can be coupled to one or more of the mechanisms disclosed herein and/or any portion thereof. For example, the computer may be coupled to one or more sensors, valves, switches, lights, windows (e.g., IGUs), motors, pumps, optical components, or any combination thereof. The sensor may be integrated in the transceiver.
The computer system may include a processing unit (e.g., 1806) (also referred to herein as a "processor," "computer," or "computer processor"). The computer system may include memory or memory locations (e.g., 1802) (e.g., random access memory, read-only memory, flash memory), electronic storage units (e.g., 1804) (e.g., hard disk), communication interfaces (e.g., 1803) (e.g., network adapter) for communicating with one or more other systems, and peripheral devices (e.g., 1805), such as cache, other memory, data storage, and/or electronic display adapters. In the example shown in fig. 18, memory 1802, storage unit 1804, interface 1803, and peripherals 1805 communicate with processing unit 1806 through a communication bus (solid lines), such as a motherboard. The storage unit may be a data storage unit (or data repository) for storing data. The computer system may be operatively coupled to a computer network ("network") (e.g., 1801) with the aid of a communication interface. The network may be the Internet, an internet and/or extranet, or an intranet and/or extranet that is in communication with the Internet. In some cases, the network is a telecommunications and/or data network. The network may include one or more computer servers that may implement distributed computing, such as cloud computing. In some cases, with the aid of the computer system, the network may implement a peer-to-peer network, which may enable devices coupled to the computer system to act as clients or servers.
The processing unit may execute a series of machine-readable instructions that may be embodied in a program or software. The instructions may be stored in a memory location, such as memory 1802. The instructions may be directed to a processing unit, which may then program or otherwise configure the processing unit to implement the methods of the present disclosure. Examples of operations performed by a processing unit may include fetch, decode, execute, and write-back. The processing unit may interpret and/or execute the instructions. The processor may include a microprocessor, a data processor, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a system on a chip (SOC), a coprocessor, a network processor, an Application Specific Integrated Circuit (ASIC), an application specific instruction set processor (ASIP), a controller, a Programmable Logic Device (PLD), a chipset, a Field Programmable Gate Array (FPGA), or any combination thereof. The processing unit may be part of a circuit, such as an integrated circuit. One or more other components of system 1800 may be included in a circuit.
The storage unit may store files such as drivers, libraries, and saved programs. The storage unit may store user data (e.g., user preferences and user programs). In some cases, the computer system may include one or more additional data storage units located external to the computer system, such as on a remote server in communication with the computer system via an intranet or the Internet.
The computer system may communicate with one or more remote computer systems over a network. For example, the computer system may communicate with a remote computer system of a user (e.g., an operator). Examples of remote computer systems include a personal computer (e.g., a laptop PC), a tablet personal computer or tablet computer (e.g., an iPad or a Galaxy Tab), a telephone, a smartphone (e.g., an iPhone or an Android-enabled device), or a personal digital assistant. A user (e.g., a client) may access the computer system via the network.
The methods described herein may be implemented by machine (e.g., computer processor) executable code stored on an electronic storage location of a computer system, such as the memory 1802 or the electronic storage unit 1804. The machine-executable or machine-readable code may be provided in the form of software. During use, the processor 1806 may execute the code. In some cases, the code may be retrieved from the storage unit and stored on the memory for ready access by the processor. In some cases, the electronic storage unit may be eliminated, and the machine-executable instructions are stored on the memory.
The code may be pre-compiled and configured for use with a machine adapted to execute the code, or may be compiled at runtime. The code may be provided in a programming language that is selected to enable the code to be executed in a pre-compiled or compiled form.
In some embodiments, the processor includes code. The code may be program instructions. The program instructions may cause at least one processor (e.g., a computer) to direct a feed-forward and/or feedback control loop. In some embodiments, the program instructions cause the at least one processor to direct a closed-loop and/or open-loop control scheme. The control may be based at least in part on one or more sensor readings (e.g., sensor data). One controller may direct multiple operations. At least two operations may be directed by different controllers. In some embodiments, different controllers may direct at least two of operations (a), (b), and (c). In some embodiments, a non-transitory computer-readable medium (or a different such medium for each computer) causes each different computer to direct at least two of operations (a), (b), and (c). The controller and/or computer-readable medium may direct any of the devices disclosed herein or components thereof. The controller and/or computer-readable medium may direct any of the operations of the methods disclosed herein.
In some embodiments, a user can adjust an environment, for example, using a virtual reality (VR) module (e.g., an augmented reality module). The VR module may receive data from one or more sensors regarding various environmental characteristics (e.g., features) sensed by the one or more sensors. The VR module may receive structural information about the environment, for example, to account for any surrounding walls, windows, and/or doors of the enclosed environment. The VR module may receive visual information about the environment, for example, from one or more sensors (e.g., including a camera such as a video camera). The VR module may be operated by a controller (e.g., including a processor). The VR module may be operatively (e.g., communicatively) coupled to a projection assistance device. The projection assistance device may include a screen (e.g., an electronic or non-electronic screen), a projector, or a head-mounted device (e.g., glasses or goggles). The one or more sensors may be disposed on a circuit board (e.g., a motherboard). The one or more sensors may be part of a sensor ensemble. The sensor ensemble may be an assembly of devices that includes (i) sensors or (ii) sensors and emitters. The peripheral structure may include sensors of the same type disposed at different locations in the environment. The peripheral structure may include ensembles disposed at different locations in the environment. The VR module may allow a user to select one type of environmental characteristic to view and/or control (e.g., from among different characteristic types). The VR module may simulate any variability of the characteristics in the environment. The characteristic variability can be modeled as a three-dimensional map superimposed on any fixed structure of the peripheral structure of the enclosure environment. The characteristic variability in the environment may change in real time. The VR module may update the characteristic variability in real time.
The VR module may use data from the one or more sensors (e.g., measuring a requested characteristic in the environment), simulation, and/or third-party data to simulate characteristic variability. The simulation may utilize artificial intelligence. The simulation may be any of the simulations described herein. The VR module can project a plurality of different characteristics in the environment, for example, simultaneously and/or in real time. The user may request a change in any of the characteristics displayed by the VR module. The VR module can send a command (e.g., directly or indirectly) to one or more components (e.g., HVAC, lighting, or a tintable window) that affect the environment of the peripheral structure. The indirect command may be sent via one or more controllers communicatively coupled to the VR module. The VR module may operate via one or more processors. The VR module may reside on a network operatively coupled to the one or more components affecting the environment, to one or more controllers, and/or to one or more processors. For example, the VR module can facilitate controlling a tint of a window disposed in the peripheral structure. The VR projection may show the transmissive window and a menu or bar (e.g., a slider) depicting various tint levels. The menu may be superimposed on the VR projection of the peripheral structure. The user can view the window and select the desired tint level. Upon receiving a command (e.g., over a network), the window controller may direct the user-selected window to change its tint. For example, the VR module may facilitate controlling temperature in the peripheral structure. In another example, the VR module can simulate a temperature distribution in the peripheral structure. The user may view the temperature ranges displayed on the menu or bar (e.g., a slider bar) and select a desired temperature in the peripheral structure and/or in a portion of the peripheral structure.
The request may be directed to a local controller that directs the HVAC system (e.g., including any vents) to adjust the temperature according to the request. Upon request, the VR module may simulate a change in a characteristic (e.g., glass tint and/or temperature), for example, as a function of changes occurring in the peripheral structure. The user may be able to view both the temperature distribution and the window tint level in the same VR experience (e.g., projected time frame of the VR environment) or in different VR experiences. The user can request both a new temperature and a new window tint level in the same VR experience or in different VR experiences. The user can view changes in both the new temperature and the new window tint level in the same VR experience or in different VR experiences. At times, the VR projection update of the change in the first characteristic may lag (e.g., due to processing time of the sensor data) relative to the update of the change in the at least one second characteristic, where the user requests a change in both the first characteristic and the at least one second characteristic. At times, the VR projection update of the change in the first characteristic may coincide with the update of the change in the at least one second characteristic, where the user requests a change in both the first characteristic and the at least one second characteristic. The selection may use any VR tool and/or any other user input tool, such as a touch screen, joystick, console, keyboard, controller (e.g., remote controller and/or game controller), digital pen, camera, or microphone.
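One way a module could turn scattered sensor readings into the simulated "characteristic variability" map described above is spatial interpolation. The sketch below uses inverse-distance weighting over a two-dimensional grid; the sensor layout, grid size, and function names are illustrative assumptions, not part of this disclosure.

```python
# Illustrative sketch: estimate a characteristic (e.g., temperature) across a
# room grid from scattered sensor readings via inverse-distance weighting.
# Sensor positions/values and the 5 m x 5 m grid are hypothetical.
def idw(sensors, x, y, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from (sx, sy, value) triples."""
    num = den = 0.0
    for sx, sy, value in sensors:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return value  # query point coincides with a sensor
        weight = 1.0 / d2 ** (power / 2.0)
        num += weight * value
        den += weight
    return num / den

# Three temperature sensors at known positions (meters) with readings in degC.
sensors = [(0.0, 0.0, 20.0), (4.0, 0.0, 24.0), (2.0, 3.0, 22.0)]
grid = [[idw(sensors, float(x), float(y)) for x in range(5)] for y in range(5)]
```

Each grid value is a convex combination of the readings, so the rendered map never overshoots the sensed range; a real deployment might instead use kriging or a physics-based simulation.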
In some embodiments, the at least one sensor is operatively coupled to a control system (e.g., a computer control system). The sensor may include an optical sensor, an acoustic sensor, a vibration sensor, a chemical sensor, an electrical sensor, a magnetic sensor, a flow sensor, a movement sensor, a velocity sensor, a position sensor, a pressure sensor, a force sensor, a density sensor, a distance sensor, or a proximity sensor. The sensors may include temperature sensors, weight sensors, material (e.g., powder) level sensors, metrology sensors, gas sensors, or humidity sensors. The metrology sensors may include measurement sensors (e.g., of height, length, width, angle, and/or volume). The metrology sensor may comprise a magnetic sensor, an acceleration sensor, an orientation sensor, or an optical sensor. The sensor may send and/or receive acoustic (e.g., echo) signals, magnetic signals, electronic signals, or electromagnetic signals. The electromagnetic signals may include visible light signals, infrared signals, ultraviolet signals, ultrasonic signals, radio wave signals, or microwave signals. The gas sensor may sense any of the gases described herein. The distance sensor may be a type of metrology sensor. The distance sensor may comprise an optical sensor or a capacitive sensor. The sensor may comprise an accelerometer. The temperature sensor may comprise a bolometer, bimetallic strip, calorimeter, exhaust gas thermometer, flame detector, Gardon gauge, Golay cell, heat flux sensor, infrared thermometer, microbolometer, microwave radiometer, net radiometer, quartz thermometer, resistance temperature detector, resistance thermometer, silicon bandgap temperature sensor, special sensor microwave/imager, thermistor, thermocouple, thermometer (e.g., resistance thermometer), or pyrometer. The temperature sensor may comprise an optical sensor. The temperature sensor may utilize image processing.
The sensor may include an IR camera, a visible-light camera, and/or a depth camera. The temperature sensor may include a camera (e.g., an IR camera or a CCD camera). The pressure sensor may include a barograph, a boost gauge, a Bourdon tube gauge, a hot-filament ionization gauge, an ionization gauge, a McLeod gauge, an oscillating U-tube, a permanent downhole gauge, a piezometer, a Pirani gauge, a pressure sensor, a pressure gauge, a tactile sensor, or a time pressure gauge. The position sensor may include an accelerometer, a capacitive displacement sensor, a capacitive sensing device, a free-fall sensor, a gravimeter, a gyroscope sensor, a shock sensor, an inclinometer, an integrated circuit piezoelectric sensor, a laser rangefinder, a laser surface velocimeter, a lidar, a linear encoder, a linear variable differential transformer (LVDT), a liquid capacitive inclinometer, an odometer, a photosensor, a piezoelectric accelerometer, a rate sensor, a rotary encoder, a rotary variable differential transformer, a synchro, a shock detector, a shock data recorder, a tilt sensor, a tachometer, an ultrasonic thickness gauge, a variable reluctance sensor, or a velocity receiver. The optical sensor may include a charge-coupled device, a colorimeter, a contact image sensor, an electro-optical sensor, an infrared sensor, a kinetic inductance detector, a light-emitting diode (e.g., used as a photosensor), a light-addressable potentiometric sensor, a Nichols radiometer, a fiber-optic sensor, an optical position sensor, a photodetector, a photodiode, a photomultiplier tube, a phototransistor, a photosensor, a photoionization detector, a photoresistor, a photoswitch, a phototube, a scintillator, a Shack-Hartmann wavefront sensor, a single-photon avalanche diode, a superconducting nanowire single-photon detector, a transition-edge sensor, a visible-light photon counter, or a wavefront sensor.
The one or more sensors may be connected to a control system (e.g., to a processor, computer).
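The two calibration ideas this disclosure revolves around, (i) deriving a baseline from the sampling window whose readings show the smallest variability greater than zero, and (ii) correcting a sensor against a value predicted from a neighboring sensor, can be sketched as follows. This is a hedged illustration: the function names, the use of population variance as the variability measure, and the mean-as-baseline and offset-correction choices are assumptions, not the claimed method.

```python
# Illustrative sketches of (i) windowed baseline self-calibration and
# (ii) neighbor-based reading correction. The variability measure
# (population variance) and baseline choice (window mean) are assumptions.
from statistics import mean, pvariance

def assign_baseline(windows):
    """From equal-length sampling windows, pick the one with the smallest
    variability greater than zero and use its mean as the sensor baseline."""
    candidates = [w for w in windows if pvariance(w) > 0.0]
    best = min(candidates, key=pvariance)  # the "best sensed data"
    return mean(best)

def calibration_offset(reading, predicted):
    """Offset from the difference between a sensor's reading and the value
    predicted at its location from a neighboring sensor; later raw readings
    are corrected by adding this offset."""
    return predicted - reading

# Two equal-length sampling windows from one sensor; the quiet window wins.
baseline = assign_baseline([[20.0, 20.1, 19.9], [20.0, 25.0, 15.0]])

# A calibrated neighbor predicts 22.0 degC here; the sensor reads 21.5 degC.
offset = calibration_offset(reading=21.5, predicted=22.0)
corrected_later_reading = 21.7 + offset
```

In this sketch the baseline comes from the low-variability window (its readings are dominated by signal rather than transient noise), while the offset is reused to correct subsequent raw readings from the same sensor.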
While preferred embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. The invention is not intended to be limited to the specific examples provided within the specification. While the invention has been described with reference to the foregoing specification, the description and illustrations of the embodiments herein are not meant to be construed in a limiting sense. Numerous variations, changes, and substitutions will occur to those skilled in the art without departing from the invention herein. Further, it is to be understood that all aspects of the present invention are not limited to the specific depictions, configurations or relative proportions set forth herein which depend upon a variety of conditions and variables. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is therefore contemplated that the present invention will also cover any such alternatives, modifications, variations or equivalents. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.

Claims (77)

1. A method for sensor calibration in a facility, the method comprising:
(a) using a sensor to perform the following operations: (i) collecting first sensing data during a first duration, and (ii) collecting second sensing data during a second duration, the first duration and the second duration occurring during a time window, the first duration having a first start time and the second duration having a second start time, wherein a time span of the first duration is equal or approximately equal to a time span of the second duration, and wherein the sensor is included in a sensor array disposed in the facility;
(b) evaluating the first sensed data and the second sensed data to obtain best sensed data, the best sensed data having a minimum variability greater than zero; and
(c) assigning a baseline to the sensor by taking into account the best sensed data.
2. The method of claim 1, wherein the time span is predetermined.
3. The method of claim 1, further comprising collecting third sensing data during the time window and assigning the time span by considering the third sensing data such that the time span includes a plurality of data that facilitates separation of signal data from noise data.
4. The method of claim 3, wherein the third sensing data is collected prior to using the sensor to (i) collect the first sensing data during the first duration and (ii) collect the second sensing data during the second duration.
5. The method of claim 1, wherein the sensor is housed in a housing that includes (i) the sensor or (ii) the sensor and an emitter as part of a device aggregate.
6. The method of claim 1, wherein the sensor array is configured to operate in a coordinated manner.
7. The method of claim 1, further comprising adjusting an environment of the facility cooperatively at least in part by using data from the sensor array.
8. An apparatus for sensor self-calibration in a facility, the apparatus comprising one or more controllers configured to:
(a) operatively coupled to a sensor included in a sensor array disposed in the facility;
(b) collecting or directing collection of: (i) first sensed data during a first duration; and (ii) second sensed data during a second duration, the first duration and the second duration occurring during a time window, the first duration having a first start time and the second duration having a second start time, wherein a time span of the first duration is equal or approximately equal to a time span of the second duration;
(c) evaluating or directing evaluation of the first sensed data and the second sensed data to obtain best sensed data, the best sensed data having a minimum variability greater than zero; and
(d) assigning or directing assignment of a baseline to the sensor by taking into account the best sensed data.
9. The apparatus of claim 8, wherein the one or more controllers are configured to collect or direct collection of the first sensed data for a first parameter comprising a characteristic of an environment of a peripheral structure in which the sensor is disposed and/or to which the sensor is attached.
10. The apparatus of claim 8, wherein the one or more controllers are configured to collect or direct collection of parameters including temperature, humidity, sound, force, pressure, electromagnetic waves, position, distance, movement, flow, acceleration, velocity, vibration, dust, light, glare, color, gas, or volatile compounds.
11. The apparatus of claim 8, wherein the sensor is housed in a housing that includes (i) the sensor or (ii) the sensor and an emitter as part of a device aggregate.
12. The apparatus of claim 8, wherein the sensor array is configured to operate in a coordinated manner.
13. The apparatus of claim 8, wherein the one or more controllers are configured to cooperatively adjust or direct adjustment of the environment of the facility at least in part by using data from the sensor array.
14. A non-transitory computer program product for sensor calibration in a facility, the non-transitory computer program product including instructions recorded thereon, which, when executed by one or more processors operatively coupled to a sensor, cause the one or more processors to perform operations comprising:
(a) collecting or directing collection, from the sensor, of: (i) first sensing data during a first duration, and (ii) second sensing data during a second duration, the first duration and the second duration occurring during a time window, the first duration having a first start time and the second duration having a second start time, wherein a time span of the first duration is equal or approximately equal to a time span of the second duration, and wherein the sensor is included in a sensor array disposed in the facility;
(b) evaluating or directing evaluation of the first sensed data and the second sensed data to obtain best sensed data, the best sensed data having a minimum variability greater than zero; and
(c) assigning or directing assignment of a baseline to the sensor by taking into account the best sensed data.
15. The non-transitory computer program product of claim 14, wherein the operations further comprise, prior to (a), determining the time window and/or the time span.
16. The non-transitory computer program product of claim 14, wherein the operations further comprise collecting third sensing data during the time window and assigning the time span by considering the third sensing data such that the time span includes a plurality of data that facilitates separation of signal data from noise data.
17. The non-transitory computer program product of claim 14, wherein the sensor is housed in a housing that includes (i) the sensor or (ii) the sensor and an emitter as part of a device aggregate.
18. The non-transitory computer program product of claim 14, wherein the sensor array is configured to operate in a coordinated manner.
19. The non-transitory computer program product of claim 14, wherein the operations comprise adjusting or directing adjustment of an environment of the facility cooperatively, at least in part, by using data from the sensor array.
20. The non-transitory computer program product of claim 14, wherein the operations further comprise collecting third sensing data during the time window and assigning the time span by considering the third sensing data such that the time span includes a plurality of data that facilitates separation of signal data from noise data.
21. A system for sensor calibration in a facility, the system comprising a sensor and one or more circuits, the sensor and the one or more circuits configured to perform a method, the method comprising:
(a) performing, via one or more controllers: (i) collecting first sensed data during a first time duration; and (ii) collecting second sensed data during a second time duration, the first time duration and the second time duration occurring during a time window, the first time duration having a first start time and the second time duration having a second start time, wherein a time span of the first time duration is at least approximately equal to a time span of the second time duration, the first sensed data and the second sensed data being collected from a sensor included in a sensor array disposed in the facility;
(b) evaluating the first sensed data and the second sensed data to obtain best sensed data, the best sensed data having a minimum variability greater than zero; and
(c) assigning the best sensed data to the sensor as a baseline.
22. A method of sensor calibration in a facility, the method comprising:
(a) obtaining a first reading of a first parameter from a first sensor disposed at a first location in a peripheral structure and a second reading of the first parameter from a second sensor disposed at a second location in the peripheral structure, wherein the first sensor and the second sensor are included in a sensor array disposed in the facility;
(b) estimating a predicted value of the first parameter at the first location using the second reading;
(c) determining a difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter; and
(d) considering the difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter to modify the first reading of the first parameter.
23. The method of claim 22, wherein the first parameter comprises a characteristic of an environment of the peripheral structure.
24. The method of claim 23, wherein the characteristic of the peripheral structure comprises temperature, humidity, sound, force, pressure, electromagnetic waves, position, distance, movement, flow, acceleration, velocity, vibration, dust, light, glare, color, gas, or a volatile compound.
25. The method of claim 22, wherein the first sensor is part of an assembly of devices that includes another sensor or emitter.
26. The method of claim 25, wherein the other sensor measures a second parameter different from the first parameter.
27. The method of claim 22, wherein the sensor array is configured to operate in a coordinated manner.
28. The method of claim 22, further comprising adjusting an environment of the facility cooperatively at least in part by using data from the sensor array.
29. An apparatus for sensor calibration in a facility, the apparatus comprising one or more controllers configured to:
(a) operatively coupled to a first sensor and to a second sensor, the first sensor and the second sensor included in a sensor array disposed in the facility;
(b) obtaining or directing to obtain a first reading of a first parameter from a first sensor disposed at a first location in a peripheral structure and obtaining or directing to obtain a second reading of the first parameter from a second sensor disposed at a second location in the peripheral structure;
(c) estimating or directing estimation of a predicted value of the first parameter at the first location based, at least in part, on the second reading;
(d) determining or directing a determination of a difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter; and
(e) considering or directing consideration of the difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter to modify the first reading of the first parameter.
30. The apparatus of claim 29, wherein the one or more controllers are operatively coupled to the first sensor that is part of a device ensemble comprising another sensor or emitter.
31. The apparatus of claim 30, wherein the other sensor measures a second parameter different from the first parameter.
32. The apparatus of claim 29, wherein the sensor array is configured to operate in a coordinated manner.
33. The apparatus of claim 29, wherein the one or more controllers are configured to cooperatively adjust or direct adjustment of the environment of the facility at least in part by using data from the sensor array.
34. A non-transitory computer program product for sensor self-calibration in a facility, the non-transitory computer program product containing instructions recorded thereon which, when executed by one or more processors operatively coupled to first and second sensors included in a sensor array disposed in the facility, cause the one or more processors to perform a method comprising:
(a) obtaining or directing to obtain a first reading of a first parameter from a first sensor disposed at a first location in a peripheral structure and obtaining or directing to obtain a second reading of the first parameter from a second sensor disposed at a second location in the peripheral structure;
(b) estimating or directing estimation of a predicted value of the first parameter at the first location based, at least in part, on the second reading;
(c) determining or directing a determination of a difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter; and
(d) considering or directing consideration of the difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter to modify the first reading of the first parameter.
35. The non-transitory computer program product of claim 34, wherein the first parameter comprises a characteristic of an environment of the peripheral structure.
36. The non-transitory computer program product of claim 34, wherein the first sensor is part of a device ensemble comprising another sensor or emitter.
37. The non-transitory computer program product of claim 36, wherein the other sensor measures a second parameter different from the first parameter.
38. The non-transitory computer program product of claim 34, wherein the sensor array is configured to operate in a coordinated manner.
39. The non-transitory computer program product of claim 34, wherein the operations comprise adjusting or directing adjustment of an environment of the facility cooperatively, at least in part, by using data from the sensor array.
40. A system for performing sensor calibration in a facility, the system comprising:
one or more first sensors disposed in the facility, wherein the one or more first sensors are calibrated, and wherein a peripheral structure of the facility is a target location for the one or more first sensors;
one or more second sensors located in the peripheral structure, wherein the one or more second sensors are uncalibrated or miscalibrated; and
one or more controllers operatively coupled with the one or more first sensors and the one or more second sensors, the one or more controllers calibrating and/or recalibrating the one or more second sensors using sensor measurements obtained from the one or more first sensors, the one or more first sensors and the one or more second sensors included in a sensor array disposed in the facility.
41. The system of claim 40, wherein the one or more first sensors are calibrated in the peripheral structure.
42. The system of claim 40, wherein at least one of the one or more controllers is disposed on an electronic board, at least one of the one or more first sensors being disposed on the electronic board.
43. The system of claim 40, wherein the first sensor is part of a device ensemble comprising another sensor or emitter.
44. The system of claim 43, wherein the other sensor measures a second parameter different from a first parameter measured by the first sensor.
45. The system of claim 40, wherein the sensor array is configured to operate in a coordinated manner.
46. The system of claim 40, wherein the one or more controllers are configured to adjust or direct adjustment of an environment of the facility cooperatively, at least in part, by using data from the sensor array.
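The cross-sensor calibration recited in claims 40–46 can be sketched as follows. This is an illustrative reading of the claims, not the patent's implementation; all function and variable names are assumptions. A calibrated first sensor and an uncalibrated second sensor observe the same environment, and a gain/offset correction for the second sensor is fit from paired readings by least squares:

```python
from statistics import fmean

def fit_correction(reference, raw):
    """Least-squares gain/offset mapping an uncalibrated sensor's raw
    readings onto a co-located calibrated sensor's readings."""
    mr, mx = fmean(reference), fmean(raw)
    gain = (sum((x - mx) * (r - mr) for x, r in zip(raw, reference))
            / sum((x - mx) ** 2 for x in raw))
    offset = mr - gain * mx
    return gain, offset

def apply_correction(reading, gain, offset):
    """Correct a single raw reading from the second sensor."""
    return gain * reading + offset

# Paired readings taken while both sensors sense the same parameter
# (illustrative values, e.g. CO2 concentration in ppm).
reference = [400.0, 600.0, 800.0, 1000.0]   # calibrated first sensor
raw       = [420.0, 610.0, 800.0,  990.0]   # uncalibrated second sensor

gain, offset = fit_correction(reference, raw)
corrected = [apply_correction(x, gain, offset) for x in raw]
```

With these illustrative values the fit reproduces the reference readings exactly, because the assumed drift is linear; real sensor drift would leave a residual.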
47. A system for calibration in a sensor group of a facility, the system comprising:
a first sensor of a plurality of sensors, the first sensor disposed at a first location;
a second sensor of the plurality of sensors disposed at a second location, the second sensor operatively coupled to the first sensor, the second sensor configured to:
(a) obtain a first reading of a first parameter from the first sensor;
(b) receive an estimate of a predicted value of the first parameter, or estimate the predicted value of the first parameter to generate an estimated predicted value;
(c) receive a determination of a difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter, or determine the difference; and
(d) receive consideration of, or take into account, the difference between (i) the estimated predicted value of the first parameter and (ii) the first reading of the first parameter, to modify the first reading of the first parameter;
wherein the plurality of sensors are included in a sensor array disposed in the facility.
48. The system of claim 47, wherein the estimate of the predicted value of the first parameter is received from a cloud, a plant, and/or a data processing center.
49. The system of claim 47, wherein the determination of the difference is performed by a cloud, a plant, and/or a data processing center.
50. The system of claim 47, wherein the consideration of the difference is performed by a cloud, a plant, and/or a data processing center.
51. The system of claim 47, wherein the first reading of the first parameter is modified by the second sensor to generate a modified first reading of the first parameter.
52. The system of claim 51, wherein the second sensor is operative to convert the modified first reading of the first parameter into a correction factor used by the first sensor.
53. The system of claim 47, wherein the first sensor is part of a device ensemble comprising another sensor or emitter.
54. The system of claim 53, wherein the other sensor measures a second parameter different from the first parameter.
55. The system of claim 47, wherein the sensor array is configured to operate in a coordinated manner.
56. The system of claim 47, wherein the system is configured to adjust or direct adjustment of an environment of the facility cooperatively, at least in part, by using data from the sensor array.
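Claims 47–56 describe a self-calibration loop in which a sensor's reading is modified by its difference from a predicted value, and claim 52 further converts the modified reading into a correction factor. A minimal sketch, assuming the predicted value is estimated from neighboring sensors of the same type and that the correction is additive; the names below are illustrative, not from the patent:

```python
def predict_from_neighbors(neighbor_readings):
    """Estimate the expected value at a sensor from neighboring sensors of the
    same type (here simply their mean; a model could weight by distance)."""
    return sum(neighbor_readings) / len(neighbor_readings)

def calibrate_reading(raw_reading, neighbor_readings, tolerance=0.0):
    """Modify a raw reading by its difference from the predicted value and
    return the additive correction factor for subsequent readings."""
    predicted = predict_from_neighbors(neighbor_readings)
    difference = predicted - raw_reading
    if abs(difference) <= tolerance:
        return raw_reading, 0.0      # reading already within tolerance
    modified = raw_reading + difference   # modified first reading
    correction = difference               # correction factor for the sensor
    return modified, correction

modified, correction = calibrate_reading(415.0, [400.0, 405.0, 395.0])
# predicted = 400.0, difference = -15.0 → modified = 400.0, correction = -15.0
```

The `tolerance` parameter is an added assumption: it keeps the loop from chasing noise when the sensor already agrees with its neighbors.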
57. A method for adjusting an environment of a facility, the method comprising: (a) connecting to a virtual reality module to view a selected sensed characteristic of the environment, the selected sensed characteristic being sensed by a sensor array disposed in the facility; and (b) adjusting the sensed characteristic of the environment using the virtual reality module.
58. The method of claim 57, wherein the virtual reality module is communicatively coupled to one or more sensors that sense the characteristic of the environment, the one or more sensors being part of a sensor array.
59. The method of claim 58, wherein the one or more sensors are part of one or more device ensembles including a device ensemble having (i) a sensor or (ii) a sensor and an emitter.
60. The method of claim 59, wherein at least two of the one or more sensors are of the same type and are disposed in different locations in the environment.
61. The method of claim 59, wherein at least two sensors of the one or more sensors are of different types and are disposed in a same device ensemble of the one or more device ensembles.
62. The method of claim 58, wherein the sensor array is configured to operate in a coordinated manner.
63. The method of claim 58, further comprising cooperatively adjusting an environment of the facility at least in part by using data from the sensor array.
64. The method of claim 57, wherein the virtual reality module facilitates viewing changes in the sensed characteristic of the environment as a result of adjusting the sensed characteristic.
65. A non-transitory computer program product for adjusting an environment of a facility, the non-transitory computer program product including instructions recorded thereon, which when executed by one or more processors, cause the one or more processors to perform a method, the method comprising:
(a) simulating or directing simulation of a virtual reality projection of the environment to view a selected sensed characteristic of the environment, the selected sensed characteristic being sensed by at least one sensor of a sensor array disposed in the facility; and
(b) using or directing use of the virtual reality projection to facilitate adjusting the sensed characteristic of the environment.
66. The non-transitory computer program product of claim 65, wherein the virtual reality projection facilitates viewing changes in the sensed characteristic of the environment that result from adjusting the sensed characteristic.
67. The non-transitory computer program product of claim 66, wherein adjusting the sensed characteristic comprises changing operation of one or more components disposed in and/or affecting the environment.
68. The non-transitory computer program product of claim 65, wherein the sensed characteristic is a first sensed characteristic, and wherein the method further comprises selecting or directing selection of a second sensed characteristic for viewing and/or adjustment using the virtual reality projection.
69. The non-transitory computer program product of claim 65, wherein the sensor array comprises a device ensemble comprising (i) sensors or (ii) sensors and emitters.
70. The non-transitory computer program product of claim 65, wherein the sensor array is configured to operate in a coordinated manner.
71. The non-transitory computer program product of claim 65, wherein the method comprises cooperatively adjusting an environment of the facility at least in part by using data from the sensor array.
72. An apparatus for environmental adjustment of a facility, the apparatus comprising one or more controllers comprising circuitry configured to, separately or simultaneously:
(a) operatively couple to a virtual reality simulator;
(b) direct the virtual reality simulator to project a virtual reality projection of the environment to view a selected sensed characteristic of the environment, the selected sensed characteristic being sensed by at least one sensor of a sensor array disposed in the facility; and
(c) direct the virtual reality simulator to facilitate adjustment of the sensed characteristic of the environment by using or directing use of the virtual reality projection.
73. The device of claim 72, wherein the sensed characteristic is a first sensed characteristic, and wherein the one or more controllers are configured to facilitate selection of a second sensed characteristic for viewing and/or adjustment using the virtual reality projection, the selection being made by a user.
74. The device of claim 72, wherein the one or more controllers are configured to facilitate viewing and/or adjusting a plurality of environmental characteristics by using the virtual reality projection.
75. The apparatus of claim 72, wherein sensors of the sensor array are part of a device ensemble housed in a housing that includes (i) the sensors or (ii) the sensors and an emitter.
76. The device of claim 72, wherein the sensor array is configured to operate in a coordinated manner.
77. The apparatus of claim 72, wherein the one or more controllers are configured to cooperatively adjust or direct adjustment of the environment of the facility at least in part by using data from the sensor array.
CN202180014420.3A 2020-01-29 2021-01-28 Sensor calibration and operation Withdrawn CN115087846A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US202062967204P 2020-01-29 2020-01-29
US62/967,204 2020-01-29
US17/083,128 US20210063836A1 (en) 2017-04-26 2020-10-28 Building network
US17/083,128 2020-10-28
PCT/US2021/015378 WO2021154915A1 (en) 2020-01-29 2021-01-28 Sensor calibration and operation

Publications (1)

Publication Number Publication Date
CN115087846A true CN115087846A (en) 2022-09-20

Family

ID=77079826

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180014420.3A Withdrawn CN115087846A (en) 2020-01-29 2021-01-28 Sensor calibration and operation

Country Status (5)

Country Link
EP (1) EP4097425A1 (en)
CN (1) CN115087846A (en)
CA (1) CA3165657A1 (en)
TW (1) TW202202924A (en)
WO (1) WO2021154915A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11747698B2 (en) 2017-04-26 2023-09-05 View, Inc. Tandem vision window and media display
TW201907213A (en) 2017-04-26 2019-02-16 美商唯景公司 Colored window system computing platform
US11493819B2 (en) 2017-04-26 2022-11-08 View, Inc. Displays for tintable windows
US11892738B2 (en) 2017-04-26 2024-02-06 View, Inc. Tandem vision window and media display
US11747696B2 (en) 2017-04-26 2023-09-05 View, Inc. Tandem vision window and media display
EP4285085A1 (en) * 2021-01-28 2023-12-06 View, Inc. Multi-sensor synergy

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0611477A (en) * 1992-04-20 1994-01-21 Matsushita Seiko Co Ltd Carbon dioxide gas concentration sensing device
JP3019904B2 (en) * 1993-04-28 2000-03-15 松下精工株式会社 Carbon dioxide concentration detector
US6526801B2 (en) * 2000-12-29 2003-03-04 Edwards Systems Technology, Inc. Method of compensating for drift in gas sensing equipment
US6588250B2 (en) * 2001-04-27 2003-07-08 Edwards Systems Technology, Inc. Automatic calibration mode for carbon dioxide sensor
JP6786817B2 (en) * 2016-02-29 2020-11-18 株式会社デンソーウェーブ Reference value correction device for CO2 sensor, reference value correction method for CO2 sensor

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI817736B (en) * 2022-09-22 2023-10-01 宏達國際電子股份有限公司 Control device and correcting method of control device with strength feedback function
CN116429189A (en) * 2023-06-13 2023-07-14 武汉能钠智能装备技术股份有限公司 Low-noise frequency source monitoring method and monitoring device
CN116975938A (en) * 2023-09-25 2023-10-31 北京谷器数据科技有限公司 Sensor data processing method in product manufacturing process
CN116975938B (en) * 2023-09-25 2023-11-24 北京谷器数据科技有限公司 Sensor data processing method in product manufacturing process

Also Published As

Publication number Publication date
TW202202924A (en) 2022-01-16
CA3165657A1 (en) 2021-08-05
WO2021154915A1 (en) 2021-08-05
EP4097425A1 (en) 2022-12-07

Similar Documents

Publication Publication Date Title
CN115087846A (en) Sensor calibration and operation
US20230065864A1 (en) Sensor calibration and operation
JP7146872B2 (en) Multiple interacting systems in the field
US20230194115A1 (en) Environmental adjustment using artificial intelligence
US20230176669A1 (en) Device ensembles and coexistence management of devices
CN115485614A (en) Interaction between peripheral structures and one or more occupant-related applications
CA3173667A1 (en) Atmospheric adjustment in an enclosure
CA3047110A1 (en) Tester and electrical connectors for insulated glass units
CA3169820A1 (en) Virtually viewing devices in a facility
CN115668048A (en) Environmental adjustment using artificial intelligence
US20230076947A1 (en) Predictive modeling for tintable windows
TW202204939A (en) Predictive modeling for tintable windows
CN115398464A (en) Identifying, reducing health risks in a facility and tracking occupancy of a facility
CN112262300A (en) Scanning motion average radiation temperature sensor application
CN116848380A (en) Multi-sensor synergy
CN115968454A (en) Device aggregate and coexistence management of devices
US20230288770A1 (en) Atmospheric adjustment in an enclosure
WO2023010016A1 (en) Locally initiated wireless emergency alerts
US20240126130A1 (en) Multi-sensor synergy
US20230393443A1 (en) Virtually viewing devices in a facility
US20240013162A1 (en) Failure prediction of at least one tintable window
EP4237908A1 (en) Failure prediction of at least one tintable window
WO2022093629A1 (en) Failure prediction of at least one tintable window
WO2022261426A1 (en) Automatic location of devices of a facility
CN117178227A (en) Failure prediction for at least one tintable window

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20220920