US20200033163A1 - Virtual sensor system - Google Patents
- Publication number: US20200033163A1
- Authority: US (United States)
- Prior art keywords: sensor, sensors, data, sensor assembly, event
- Prior art date
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06N20/20—Ensemble learning
- G01D9/005—Solid-state data loggers
- G01D5/00—Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable
- G06N20/00—Machine learning
- G06N20/10—Machine learning using kernel methods, e.g. support vector machines [SVM]
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
- H04W4/38—Services specially adapted for particular environments, situations or purposes for collecting sensor information
- G01D3/08—Indicating or recording apparatus with provision for safeguarding the apparatus, e.g. against abnormal operation, against breakdown
- H04L12/2823—Reporting information sensed by appliance or service execution status of appliance services in a home automation network
Definitions
- One option is for users to upgrade their environments with newly released “smart” devices (e.g., light switches, kitchen appliances), many of which contain sensing functionality.
- this sensing is generally limited to the appliance itself (e.g., a smart light sensing whether it is on or off) or to a single parameter associated with its core function (e.g., a smart thermostat sensing whether the room is occupied).
- few smart devices are interoperable, forming silos of sensed data that thwart a holistic experience. Instead of achieving a smart home, the best one can currently hope for is small islands of smartness. This approach also carries a significant upgrade cost, which so far has proven unpopular with consumers, who generally upgrade appliances in a piecemeal manner.
- sensing modalities have been described in the context of environmental sensing, including special-purpose sensing systems, distributed sensing systems, infrastructure-mediated sensing systems, and general-purpose sensing systems. These sensing modalities can be organized according to the number of sensors that they utilize and the number of facets or parameters that they sense.
- special-purpose sensing systems utilize a single sensor
- infrastructure-mediated and general-purpose sensing systems utilize one or a few sensors
- distributed sensing systems utilize many sensors.
- special-purpose sensing systems sense a single facet
- infrastructure-mediated sensing systems tend to sense one or a few facets
- general-purpose sensing systems tend to sense many facets
- distributed sensing systems can sense anywhere from a single facet to many facets of an environment.
- sensing systems typically transfer all the sensed data to a backend server for processing and/or storage, leading to problems relating to, for example, bandwidth usage and processing speed.
- the present invention is directed to a ubiquitous sensing system utilizing one or more sensors that are capable of directly or indirectly detecting events in the environment surrounding the sensor assembly. While the sensing system is configured for indirect sensing, such that each and every object and/or person in the environment need not be instrumented in a location in order to sense their state or events associated with them, the sensors may also be coupled to objects and/or humans for direct sensing without any modifications.
- the sensing system includes a sensor assembly that can be positioned within an environment or location and that is capable of communicating with a server or other type of computer system for processing. The sensing system may optionally process sensor data locally and transmit the processed data to the server.
- the server utilizes machine learning to characterize received sensor data and/or training data in association with an event or events to learn to detect the occurrence of the designated event(s).
- the user can annotate the sensor data stream to indicate when certain events occurred and the machine learning algorithm then learns what characteristics of the data stream correlate to the event, allowing the sensing system to then detect future occurrences of the event.
- the sensing system utilizes deep machine learning to determine when events have occurred and what characteristics of the sensor data stream correlate to those events.
- the server can have a library of previously trained machine learning models and/or may train machine learning models from prior data collection steps, crowd sourcing, or the like, for different activities and events, and the sensing system can directly send sensor data and have the server determine what events have occurred.
- the server thus can define a set of machine learning-trained “virtual sensors” that are each capable of detecting events from combinations of sensor data that are correlated with the occurrences of the events, but that are not necessarily provided by sensors that are directly affixed or otherwise associated with the object(s) being sensed. More specifically, these types of virtual sensors can be referred to as “first order” virtual sensors.
- the server can further implement higher order virtual sensors that are capable of detecting events or conditions from a combination of data from lower order virtual sensors (e.g., second order virtual sensors detect an event from the output of first order virtual sensors).
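As a toy sketch of this hierarchy (in Python, with hypothetical event names not drawn from this disclosure), a second order virtual sensor can be a simple function of first order virtual sensor outputs:

```python
# Hypothetical first order virtual sensor outputs for one time step.
first_order = {"microwave_door_open": False,
               "microwave_running": True,
               "kitchen_occupied": True}

def microwave_in_use(states):
    """A second order virtual sensor: infers a higher-level event
    purely from the outputs of first order virtual sensors, without
    touching raw sensor data."""
    return states["microwave_running"] and states["kitchen_occupied"]
```

In a real deployment each first order output would itself come from a trained classifier; the point of the sketch is only that the second order sensor consumes classifier outputs rather than sensor streams.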
- the sensing system comprises a sensor assembly with processing and communication capabilities and a back end server system.
- the sensor assembly comprises a control circuit and one or more sensors. Each of the sensors senses one or more different physical phenomenon in an environment of the sensor assembly.
- the back end server system which comprises at least one server, is in communication with the sensor assembly.
- the control circuit of the sensor assembly is configured to, among other things: (i) extract features from raw sensor data from the plurality of sensors; and (ii) transmit data packets to the back end server system, wherein the data packets comprise featurized data from the plurality of sensors.
- the at least one server of the back end server system is configured to implement the first order virtual sensors, where each of the first order virtual sensors is trained through machine learning to detect, based on the featurized data transmitted from the sensor assembly, an event or condition in the environment of the sensor assembly.
- the back end server system is programmed to receive the featurized sensor data from the sensor assembly; determine an occurrence of an event via the featurized sensor data; train, via machine learning, a virtual sensor implemented by the server to detect the event by characterizing the featurized sensor data for the plurality of sensors that are activated in association with the event; and monitor, via the virtual sensor, for subsequent occurrences of the event.
- FIG. 1A illustrates a block diagram of a sensing system, in accordance with at least one aspect of the present disclosure.
- FIG. 1B illustrates a block diagram of the sensing system of FIG. 1A with a trained virtual sensor, in accordance with at least one aspect of the present disclosure.
- FIG. 2 illustrates a block diagram of a sensing system including virtual sensors receiving data from various sensors of the sensor assembly, in accordance with at least one aspect of the present disclosure.
- FIG. 3A illustrates a block diagram of a sensing system, in accordance with at least one aspect of the present disclosure.
- FIG. 3B illustrates a block diagram of the sensing system of FIG. 3A with a trained second order virtual sensor, in accordance with at least one aspect of the present disclosure.
- FIG. 4 illustrates a block diagram of a sensing system including sensors, first order virtual sensors, and second order virtual sensors receiving data hierarchically, in accordance with at least one aspect of the present disclosure.
- FIG. 5 illustrates a block diagram of a sensing system including multiple sensor assemblies communicably coupled to a computer system, in accordance with at least one aspect of the present disclosure.
- FIG. 6 illustrates a perspective view of a sensor assembly, in accordance with at least one aspect of the present disclosure.
- FIG. 7 illustrates a timeline of sampling rates for various sensors, in accordance with at least one aspect of the present disclosure.
- FIG. 8 illustrates a first sensor data graphical display annotated with events detected by the sensing system, in accordance with at least one aspect of the present disclosure.
- FIG. 9 illustrates a second sensor data graphical display annotated with events detected by the sensing system, in accordance with at least one aspect of the present disclosure.
- FIG. 10 illustrates a third sensor data graphical display annotated with events detected by the sensing system, in accordance with at least one aspect of the present disclosure.
- FIG. 11 illustrates a fourth sensor data graphical display annotated with events detected by the sensing system, in accordance with at least one aspect of the present disclosure.
- FIG. 12 illustrates a fifth sensor data graphical display annotated with events detected by the sensing system, in accordance with at least one aspect of the present disclosure.
- FIG. 13 illustrates a sixth sensor data graphical display annotated with events detected by the sensing system, in accordance with at least one aspect of the present disclosure.
- FIG. 14 illustrates a block diagram of a microwave second order virtual sensor represented as a state machine, in accordance with at least one aspect of the present disclosure.
- FIG. 15A illustrates a graphical user interface utilized to annotate an event being detected by the sensing system, in accordance with at least one aspect of the present disclosure.
- FIG. 15B illustrates a graphical user interface displaying an event being detected by the sensing system, in accordance with at least one aspect of the present disclosure.
- FIG. 16 illustrates a logic flow diagram of a process of detecting events via virtual sensors, in accordance with at least one aspect of the present disclosure.
- FIG. 17 illustrates a block diagram of a general computing or data processing system, in accordance with at least one aspect of the present disclosure.
- FIG. 1A illustrates a block diagram of a sensing system 100 , in accordance with at least one aspect of the present disclosure.
- the sensing system 100 comprises a sensor assembly 102 having one or more sensors 110 and a computer system 104 (e.g., one or a number of networked servers) to which the sensor assembly 102 can be communicably connected via a network 108 ( FIG. 5 ).
- the sensors 110 include a variety of sensors for detecting various physical or natural phenomena in the vicinity of the sensor assembly 102 , such as vibration, sound, ambient temperature, light color, light intensity, electromagnetic interference (EMI), motion, ambient pressure, humidity, composition of gases (e.g., allowing certain types of gases and pollutants to be detected), distance to an object or person, presence of a user device, infrared radiation (e.g., for thermal imaging), or the like. While FIG. 1A illustrates one sensor assembly 102 included in the sensing system 100 , a plurality of sensor assemblies communicably connected with each other and/or with a computer system 104 are within the scope of this disclosure (as shown in FIG. 5 ).
- the sensing system 100 is configured to train and implement one or more virtual sensors 118 , which are machine learning based classification systems or algorithms trained to detect particular events to which the virtual sensors 118 are assigned as correlated to the data sensed by the sensors 110 of the sensor assembly 102 and/or other virtual sensors 118 .
- the training and implementation of various aspects of the virtual sensors 118 are described in more detail below.
- the sensing system 100 may be configured to be, without limitation, a special-purpose sensing system, a distributed sensing system, an infrastructure-mediated sensing system, and/or a general-purpose sensing system.
- special-purpose sensing systems may include a single sensor assembly 102 configured to monitor a single facet of an environment.
- a sensor assembly 102 including a microphone can be affixed to a faucet so that water consumption can be inferred (which, in turn, is used to power behavior-changing feedback).
- a sensor assembly 102 including a temperature sensor and/or an occupancy sensor can be placed in a room to sense environmental data that can be used by a heating, ventilation, and air conditioning (HVAC) system to manage the HVAC system.
- infrastructure-mediated sensing systems may include one or more sensor assemblies 102 installed within a structure at strategic infrastructure probe points.
- sensor assemblies 102 can be coupled to a building's power lines to detect “events” caused by electrical appliances. Since home electrical lines are shared, a single sensor assembly can observe activities across an entire house.
- Infrastructure-mediated sensing systems may also be coupled to, e.g., HVAC, plumbing, natural gas lines, and electric lighting.
- a sensor assembly including one or more sensors may be installed at a probe point, enabling the sensing system to monitor aspects of the building.
- a plumbing-attached sensor assembly may be configured to detect sink, shower, and toilet use.
- Infrastructure-mediated sensing systems may include one sensor assembly and/or a plurality of sensor assemblies utilized to monitor a few facets of an environment.
- distributed sensing systems may include many sensor assemblies 102 deployed in an environment that are networked together. Such a sensing system may be used to enlarge the sensed area (e.g., occupancy sensing across an entire warehouse) or increase sensing fidelity through complementary readings (e.g., sensing seismic events utilizing sensors deployed across an area).
- the distributed sensor assemblies 102 can be homogenous (e.g., an array of identical infrared occupancy sensors) and/or heterogeneous. Also, the array can sense one facet (e.g., fire detection) or many facets (e.g., appliance use).
- a home security system is a heterogeneous distributed system, where door sensors, window sensors, noise sensors, occupancy sensors, and even cameras in one or more sensor assemblies work together to sense a single facet of the environment: “Is there an intruder in the home?”
- a homogenous array of sensor assemblies comprising magnetic sensors can be utilized to detect object interactions throughout an entire house.
- distributed sensing systems may be configured to include as many sensor assemblies as needed to monitor anywhere from a single facet to many facets of an environment, depending upon the particular implementation of the distributed sensing system.
- a general purpose sensing system may include a wide variety of underlying sensor assemblies 102 that can be utilized flexibly such that they can be attached to a variety of objects and can sense many facets without any modification to the sensor assembly 102 .
- the sensing system 100 may be a direct sensing system and/or an indirect sensing system.
- in a direct sensing system, a sensor assembly 102 is physically coupled to an object or infrastructure of interest and may provide excellent signal quality.
- Some direct sensing systems may utilize batteries or other power sources to power the sensor assembly.
- Indirect sensing systems seek to sense state and events indirectly, without having to physically couple to objects.
- a sensor assembly including an electromagnetic interference (EMI) sensor can be installed near an appliance and/or its power source to detect usage of the appliance, because when an appliance is in different modes of operation (e.g., refrigerator compressor running, interior lights on/off), the object and/or the power source emits characteristic electromagnetic noise that can be captured and recognized.
- a sensor assembly including an acoustic sensor can be installed in a workshop to recognize tool usage according to the detected acoustic characteristics of each tool.
- Example sensors to be included in the sensor assembly 102 that are configured for indirect sensing can include, without limitation, noncontact thermometers, rangefinders, motion sensors, EMI sensors, acoustic sensors, vibration sensors, magnetic field sensors, cameras, ultrasonic sensors, laser based sensors (e.g., lidar), or the like.
- Indirect sensing systems have greater flexibility in sensor placement, which allows sensors to be better integrated into the environment or even hidden. Further, it may be possible to place the sensor assembly of an indirect sensing system at a nearby wall power outlet, eliminating the need for batteries.
- the sensor assembly 102 further includes a featurization module 112 (which can be implemented with firmware executed by a microcontroller(s) or other programmable circuit(s) of the sensor assembly 102 ) that processes and converts raw data from the sensors 110 into various forms of processed data and extracts measurable properties or characteristics of the data, i.e., features.
- the featurization module 112 can output the processed raw sensor 110 data in the form of, e.g., a feature vector, to be provided to a machine learning-based classification system, a statistical classification system, and/or a clustering system that utilizes pattern recognition, machine learning techniques (e.g., classifier), logistic regression, decision tree, random forest, or the like, at the computer system 104 .
- the featurization module 112 can ingest data from both high sample rate (e.g., several kHz to several MHz) and low sample rate (e.g., 0.1 Hz to 1 kHz) sensors 110 .
- high sample rate sensors may include, without limitation, vibration sensors, EMI sensors, microphones, cameras, or the like.
- Examples of low sample rate sensors may include, without limitation, temperature sensors, humidity sensors, light level sensors, or the like.
- the featurization module 112 can determine or extract various features from, for example, the time domain and/or the frequency domain representations (e.g., by transformation of the time domain representations) of the sensor data.
- the features from the raw data from the sensors 110 can be extracted utilizing a number of different techniques, which can vary according to sample rate at which the sensor data was collected or the particular type of sensors 110 .
- the number and/or types of features extracted from the raw sensor data and/or transmitted to the computer system 104 by the featurization module 112 can be based on, for example, the sample rate, the types of sensors, user input, or the like.
- the number and/or types of features extracted by the featurization module 112 can be controlled by the featurization module 112 itself, the computer system 104 , a client 106 ( FIG. 5 ), and/or another system or device that is part of the sensing system 100 or can access the sensing system 100 .
- the data from one or more high sample rate sensors of the sensor assembly 102 can be featurized by transforming the data into a spectral representation via a sliding window Fast Fourier Transform (FFT) (e.g., 256 samples, 10% overlapping) at a particular rate (e.g., 10 Hz), with phase information either utilized or discarded. This technique may also be used to featurize data from low sample rate sensors.
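A minimal sketch of this sliding-window FFT featurization, assuming raw samples arrive as a NumPy array (the 256-sample window, 10% overlap, and the choice to discard phase follow the example values above):

```python
import numpy as np

def featurize_spectral(samples, window=256, overlap=0.1):
    """Sliding-window FFT featurization: split the signal into
    fixed-size windows with a small overlap and keep only the
    magnitude spectrum of each window, discarding phase."""
    hop = int(window * (1 - overlap))          # step between window starts
    frames = []
    for start in range(0, len(samples) - window + 1, hop):
        frame = samples[start:start + window]
        spectrum = np.fft.rfft(frame)          # one-sided FFT of the window
        frames.append(np.abs(spectrum))        # magnitude only; phase discarded
    return np.array(frames)                    # shape: (n_windows, window//2 + 1)
```

The emission rate (e.g., 10 Hz) would be set by how often this routine runs on the rolling sample buffer, not by anything inside it.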
- the data from a high sample rate acoustic sensor (e.g., a microphone) of the sensor assembly 102 can be transformed into the frequency domain first and then one or more of the mel-frequency cepstral coefficient (MFCC) features (e.g., 14 or 40 MFCC coefficients, on a sliding window of audio data), the delta features, and/or the double delta features can be extracted from the frequency domain.
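The delta (and, by reapplying the same operator, double delta) features can be computed as regression-style differences over neighboring frames; a minimal NumPy sketch, assuming a (frames × coefficients) MFCC matrix and the common regression half-window n=2, which is not specified in the text:

```python
import numpy as np

def delta(features, n=2):
    """Regression-based delta features over a +/- n frame window,
    the standard formulation used with MFCCs. Apply twice, i.e.
    delta(delta(mfcc)), for double delta features."""
    padded = np.pad(features, ((n, n), (0, 0)), mode="edge")  # repeat edge frames
    denom = 2 * sum(i * i for i in range(1, n + 1))
    return sum(i * (padded[n + i:len(features) + n + i]
                    - padded[n - i:len(features) + n - i])
               for i in range(1, n + 1)) / denom
```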
- the data from the low and/or high sample rate sensors 110 can be featurized by calculating various statistical features (e.g., min, max, range, mean, median, mode, sum, standard deviation, and/or centroid) on a rolling buffer with different time granularities (e.g., 100 ms, 500 ms, and/or one second) at a particular rate (e.g., 10 Hz).
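A sketch of the statistical featurization applied to one rolling-buffer snapshot (mode is omitted for brevity, and the index-weighted centroid shown is one plausible reading of "centroid"):

```python
import numpy as np

def statistical_features(buffer):
    """Compute the statistical features named in the text on one
    rolling-buffer snapshot of sensor readings."""
    buffer = np.asarray(buffer, dtype=float)
    weights = np.abs(buffer)
    # Index-weighted centroid of the buffer, analogous to a spectral centroid.
    centroid = ((np.arange(len(buffer)) * weights).sum() / weights.sum()
                if weights.sum() else 0.0)
    return {
        "min": buffer.min(),
        "max": buffer.max(),
        "range": buffer.max() - buffer.min(),
        "mean": buffer.mean(),
        "median": np.median(buffer),
        "sum": buffer.sum(),
        "std": buffer.std(),
        "centroid": centroid,
    }
```

Running this at, say, 10 Hz over 100 ms, 500 ms, and one-second buffers yields the multi-granularity feature set described above.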
- the raw sensor data can be featurized by transforming the data into a spectral representation.
- the featurized data for every sensor can be independently transmitted to the computer system 104 for further processing thereon.
- the featurized data for a subset of sensors can be packaged together and transmitted to the computer system 104 .
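Packaging featurized data from a subset of sensors into a single packet might look like the following sketch; the JSON layout and field names are illustrative assumptions, not taken from this disclosure:

```python
import json
import time

def package_features(assembly_id, features_by_sensor):
    """Bundle featurized data from several sensors into one data
    packet for transmission to the back end server system."""
    return json.dumps({
        "assembly": assembly_id,          # which sensor assembly sent this
        "timestamp": time.time(),         # capture time of the feature window
        "features": features_by_sensor,   # e.g. {"emi": [...], "mic": [...]}
    })
```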
- the back-end server system can transmit a signal or instruction to the sensor assembly to inform the sensor assembly what features should be extracted for a particular sensor. That is, the back-end server system can modify or change when or which features are extracted by the sensor assembly (for embodiments where the sensor assembly is extracting the features in the raw sensor data).
- the featurization module 112 can be present onboard the sensor assembly 102 so that the data from the sensors 110 can be featurized prior to being transmitted to the computer system 104 .
- the raw sensor data is not transmitted or otherwise stored outside of the sensor assembly 102 .
- the featurization denatures the sensor 110 data, providing an additional degree of data privacy: even if the transmitted data were intercepted by an unintended recipient, the original signal could not be reconstructed from it.
- data from an acoustic sensor can be converted into a low-fidelity spectral representation in combination with basic statistical features, which precludes the ability to reconstruct the spoken content from the featurized data that is transmitted.
- data from a vision based sensor may be featurized and denatured.
- the featurization reduces the data packet size, which is useful for conserving transmission bandwidth and storage of the sensor 110 data.
- the featurization module 112 can be present on the computer system 104 .
- some or all of the features are extracted from the raw sensor data after the raw sensor data and/or partially featurized data is transmitted to the computer system 104 . This likewise may be advantageous for various reasons, such as by reducing the computational power that is required onboard the sensor assembly 102 .
- the featurized data can be processed and/or analyzed by a machine learning module 116 of the computer system 104 included in the sensing system 100 .
- the machine learning module 116 can generate a machine learning model to detect correlations between the data and events that have occurred.
- the machine learning module 116 generates a classifier, which is an algorithm that is trained via a machine learning model to assign an input to one or more categories based upon the training that the classifier received.
- the classifier can be trained to identify the occurrence of a given event based upon the grouped, featurized data that is provided to the machine learning module 116 as training data.
- the machine learning module 116 can assess the informational power of different sensor channels and may select appropriate thresholds for optimal accuracy in characterizing the training data.
- the training by the machine learning module 116 causes the classifier to learn what sensor data streams are associated with an event type and, further, what characteristics of those data streams identify the event type with particularity.
- a virtual sensor 118 can output a notification and/or signal when the event is detected that causes a graphical user interface 500 , an example of which is depicted in FIG. 15B , to display an icon, ideogram, textual alert, or other indicator 506 indicating that the event is being detected.
- the machine learning module 116 can utilize supervised learning, unsupervised learning, and/or both techniques in training the classifier.
- the advantage of using both supervised and unsupervised methods may be that it is an effective way to correlate different types of features from multimodal data; it also enables fine tuning of unsupervised training with supervised training results.
- Supervised learning is the machine learning task of inferring a function from labeled training data.
- the training data consists of a set of training examples.
- each example is a pair consisting of an input object, typically a vector, and a desired output value or target. The goal is to learn a general rule that maps inputs to outputs.
- a supervised method may be advantageous because a supervised learning algorithm analyzes the training data and produces an inferred function, which can be used for mapping new examples. An unsupervised method, by contrast, tries to find hidden structure in unlabeled data and includes an algorithm with no target value, i.e., there is no error or reward signal to evaluate a potential solution. Instead, the algorithm has a halting criterion. Examples of halting criteria include, but are not limited to, precision, recall, accuracy, number of cycles, and time. An unsupervised method may be advantageous for model training when the only data available is unlabeled data.
- the machine learning module 116 can utilize now or hereafter known machine learning methods to detect correlations between the data and events that have occurred, such as various deep learning algorithms, clustering, etc.
- the featurized data can be processed by other classification modules, such as a logistic regression module, a clustering module (e.g., k-means, spectral, density based spatial clustering of applications with noise (DBSCAN) and mean-shift), a decision tree module, or a random forest module.
- the machine learning module 116 comprises an ensemble classification model utilizing, e.g., an algebraic combination technique or a voting (plurality) combination technique.
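Both combination techniques can be sketched in a few lines; the base-classifier outputs shown are hypothetical:

```python
from collections import Counter
import numpy as np

def plurality_vote(labels):
    """Voting (plurality) combination: the event label predicted by
    the most base classifiers wins."""
    return Counter(labels).most_common(1)[0][0]

def algebraic_mean(prob_rows):
    """Algebraic combination: average the per-class probability
    estimates of the base classifiers, then take the argmax class."""
    return int(np.argmax(np.mean(prob_rows, axis=0)))
```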
- the machine learning module 116 uses base-level support vector machines (SVMs) trained for each virtual sensor, along with a global (multi-class) SVM trained on all sensors.
- the virtual sensors 118 could all use the same machine learning technique or they could use different machine learning techniques.
- some virtual sensors 118 could use support vector machines, some decision trees, some neural networks, etc.
- the featurized data may be organized as feature vectors and the feature vectors are fed into the machine learning module 116 as the training data for the classifiers.
- the featurized data can optionally be processed by an activation group module prior to being transmitted to and/or prior to being processed by the machine learning module 116 .
- the activation group module further processes and converts featurized data from the featurization module 112 into various forms of processed data, as discussed below.
- the activation group module can be executed on the sensor assembly 102 , i.e., prior to the featurized data being transmitted to the computer system 104 .
- the activation group module can be executed by the computer system 104 after the featurized data has been received thereby.
- the activation group module can be a part of a node between the sensor assembly 102 and the computer system 104 (e.g., a gateway). In aspects including such an intermediate node, the intermediate node can be considered part of the computer system 104 as described herein. In an aspect, the activation group module may also process raw sensor data without featurization.
- the activation group module can determine which of the sensors 110 have been triggered at a given time or within a given time window and extract a subset of the data from the sensors as activation group data corresponding to only the activated sensors. Determination of which sensor channels have been activated may reduce the effects of environmental noise on the received sensor data.
- the activation group module can determine which of the sensors 110 have been triggered or activated by, for example, determining a baseline or background profile of the environment as a calibration routine and using the baseline or background profile to determine which sensors or sensor channels are “activated” by subtracting or otherwise removing the baseline or background profile from the featurized sensor data.
- the activation group module determines whether a given sensor 110 has been activated by utilizing an adaptive background model for each sensor channel (e.g., rolling mean and standard deviation).
- all received data streams can be compared against the background profile using, e.g., a normalized Euclidean distance metric.
- Sensor channels that exceed the baseline by a predetermined threshold are tagged as “activated.”
- the activation group module can further utilize hysteresis to avoid detection jitter. Thresholds can be, e.g., empirically obtained by running the sensors 110 for several days while tracking their longitudinal variances or set by the user or system administrator.
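The detection logic described above (a per-channel adaptive background model based on a rolling mean and standard deviation, a normalized distance threshold, and hysteresis) can be sketched in Python. The window length, threshold values, and channel names below are illustrative assumptions, not values from the disclosure:

```python
from collections import deque
import statistics

class ActivationDetector:
    """Sketch of the activation group module's channel-activation logic:
    a rolling background model per sensor channel, with hysteresis
    (separate on/off thresholds) to avoid detection jitter."""

    def __init__(self, window=50, on_threshold=3.0, off_threshold=1.5):
        self.window = window
        self.on_threshold = on_threshold    # std-devs above baseline to activate
        self.off_threshold = off_threshold  # lower bound to deactivate (hysteresis)
        self.history = {}                   # channel -> deque of recent samples
        self.active = {}                    # channel -> activated flag

    def update(self, channel, value):
        buf = self.history.setdefault(channel, deque(maxlen=self.window))
        was_active = self.active.get(channel, False)
        if len(buf) >= 2:
            mean = statistics.fmean(buf)
            std = statistics.pstdev(buf) or 1e-9
            # normalized distance of the new sample from the background profile
            score = abs(value - mean) / std
            if not was_active and score > self.on_threshold:
                self.active[channel] = True
            elif was_active and score < self.off_threshold:
                self.active[channel] = False
        # only fold quiescent samples into the background model, so that
        # events do not pollute the baseline
        if not self.active.get(channel, False):
            buf.append(value)
        return self.active.get(channel, False)

    def activated_channels(self):
        return sorted(c for c, a in self.active.items() if a)
```

A channel fed a quiet signal stays inactive; a sample far above its rolling baseline tags it as "activated" until the signal falls back below the lower (hysteresis) threshold.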
- the background profile refers to data sensed by a sensor based on ambient conditions of an environment that are not related to events of interest, and which the sensing system 100 can obtain when a sensor assembly 102 is initially activated or deployed in a new environment.
- the sensing system 100 can periodically obtain the environmental background profile for each sensor assembly 102 .
- Obtaining an environmental background profile helps reduce false positives for the same activity as baselines change (e.g., when detecting the sound of a particular machine in a factory setting, the constant drone of a fan or other such consistent sounds may be subtracted from the featurized data as a baseline or background profile).
- the activation group module can create data sets by subtracting the baseline or background profile from any sensor signals detected by the sensors 110 . Such data sets will require less bandwidth for transmission (if the activation group module is part of or executed by the sensor assembly 102 ) to the computer system 104 for training by the machine learning module 116 . Still further, the activation group module can tag the data sets with identification information corresponding to the activated sensors such that the machine learning module 116 knows which sensor streams to consider, and which sensor streams to ignore, when training the machine learning model (e.g., a classifier). This assists in classification by reducing the feature space in which the classifier is trained.
- the identification of a particular grouping of sensors 110 that have been activated in association with an event can serve as useful metadata to describe the event, which can in turn assist in classifying the event type that has been detected.
- a boiling kettle can activate infrared, vibration, and/or acoustic sensors of the sensor assembly 102 and a determination by the activation group module that infrared, vibration, and/or acoustic sensors have been activated from amongst a group of sensors 110 of the sensor assembly 102 can itself be used as a feature to assist in classifying the event as a kettle boiling within the detection area of the sensor assembly 102 .
- the activation group module can, optionally, assemble an amalgamated feature vector of the featurized data from the activated sensors, which is then provided to the machine learning module 116 .
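As a minimal sketch of this assembly step, the amalgamated feature vector can simply be a concatenation of the per-channel feature vectors for the activated sensors, tagged with the identification information described above. The channel names and feature layout here are assumptions for illustration:

```python
def amalgamate(featurized, activated):
    """Concatenate per-channel feature vectors for the activated sensors
    into one amalgamated vector, tagged with the channels it came from.
    Sorting the channels keeps the vector layout deterministic."""
    vector = []
    for channel in sorted(activated):
        vector.extend(featurized[channel])
    return {"channels": sorted(activated), "vector": vector}
```

The `"channels"` tag lets the machine learning module know which sensor streams contributed, so it can ignore the rest when training.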
- the sensing system 100 can be configured to provide labels for the featurized data.
- the labels can be provided by users and/or generated by the sensing system 100 .
- the sensing system 100 can include an interface for users to indicate when and what types of events have occurred, which can then be correlated to the data sensed by the sensor assembly 102 .
- FIG. 15A illustrates a graphical user interface 500 utilized to annotate an event being detected by the sensing system 100 , in accordance with at least one aspect of the present disclosure.
- the graphical user interface 500 could be displayed on a client 106 ( FIG. 5 ) connected to the sensing system 100 , the computer system 104 , or another computer system or device that is in communication with the sensing system 100 .
- the graphical user interface 500 can allow users to visualize the sensor data streams (which can be either the raw sensor data or the featurized sensor data) and then indicate when various event types occurred, such as by annotating the sensor data streams with event types and the times at which they occurred. By indicating when various event types occurred, the machine learning module 116 can then train a machine learning model, such as a classifier, to correlate various characteristics of the featurized sensor data streams with the occurrences of the particular event types. For example, the graphical user interface 500 could be utilized to provide a “knocking” annotation 502 at the time on the sensor data stream 504 corresponding to when there was knocking on a door.
- the annotation 502 thus provides a label for the sensor data for the machine learning module 116 to train a virtual sensor 118 to detect the corresponding event.
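One hedged sketch of how such annotations could be correlated with time-stamped sensor data is to assign each annotation's label to the feature window whose time span contains it. The window and annotation shapes below are assumptions for illustration:

```python
def label_windows(windows, annotations, default="background"):
    """Attach user annotations (timestamp, label) to time-stamped feature
    windows: a window whose [start, end) span contains an annotation
    inherits its label; all other windows get the default label."""
    labeled = []
    for start, end, features in windows:
        label = default
        for t, name in annotations:
            if start <= t < end:
                label = name
                break
        labeled.append((features, label))
    return labeled
```

The resulting (features, label) pairs are exactly the form of training data the machine learning module consumes.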
- a user could annotate a “faulty” label to a vibration sensor reading from a machine in a factory that is vibrating due to mechanical misalignment.
- the sensing system 100 can automatically generate the labels, by, for example, clustering, deep learning, or other now or hereafter known methods that can be implemented by the machine learning module 116 .
- a user could use the user interface 500 to verify whether any labels automatically generated by the sensing system 100 are correct or incorrect. The machine learning module 116 could then adjust the training of the machine learning model being used to generate the labels to avoid characterizing such false positives.
- a user could use the user interface 500 to supplement the labels automatically generated by the sensing system 100 or otherwise apply additional labels to the sensor data streams, as described above. The machine learning module 116 could then adjust the training of the machine learning model being used to generate the labels to properly characterize such false negatives.
- the feature vectors along with their associated labels may be fed into the machine learning module 116 as the training data for the classifiers.
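The training step can be sketched with a deliberately simple stand-in classifier; a real deployment might use an SVM, decision tree, or neural network as the disclosure notes. This nearest-centroid model is an assumption chosen only to keep the example self-contained:

```python
import math
from collections import defaultdict

class NearestCentroidClassifier:
    """Toy stand-in for the trained classifier: learns one centroid per
    label from (feature vector, label) training pairs and predicts the
    label of the nearest centroid."""

    def fit(self, vectors, labels):
        sums, counts = defaultdict(list), defaultdict(int)
        for v, y in zip(vectors, labels):
            if not sums[y]:
                sums[y] = list(v)
            else:
                sums[y] = [a + b for a, b in zip(sums[y], v)]
            counts[y] += 1
        self.centroids = {y: [a / counts[y] for a in s] for y, s in sums.items()}
        return self

    def predict(self, v):
        # label whose centroid is closest in Euclidean distance
        return min(self.centroids, key=lambda y: math.dist(v, self.centroids[y]))
```

Once fit on labeled feature vectors, `predict` maps a new feature vector to an event label, which is the role a trained virtual sensor's classifier plays.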
- the machine learning module 116 can comprise a deep neural network configured to perform deep learning.
- deep learning refers to a form of machine learning that utilizes multiple interconnected neural network layers along with feedback mechanisms or other methods to improve the performance of the underlying neural network. Deep learning systems are usually based on several interconnected layers of a convolution neural network, among other layers, interconnections, or feedback mechanisms.
- Deep-learning models may be trained to learn representations of data using supervised and/or unsupervised learning. From a computational standpoint, the methods used in deep learning involve a large number of matrix-to-matrix and matrix-to-vector calculations. The number and nature of these calculations makes them essentially impossible for a human to perform by hand, or by any manual process, within any practical amount of time.
- the machine learning module 116 uses a two-stage clustering process.
- the machine learning module 116 reduces the dimensionality of the data set using a multi-layer perceptron configured as an autoencoder.
- the autoencoder can have, e.g., multiple nonoverlapping sigmoid functions in the hidden layer(s). Because the autoencoder is trained to reproduce its input values at its output, the hidden layer(s) will learn the best reduced representation of the feature set.
- this reduced feature set is used as input to an expectation maximization (EM) clustering algorithm.
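The second stage can be sketched with a compact one-dimensional expectation-maximization fit of a two-component Gaussian mixture; the first stage (the autoencoder) is represented here only by the assumption that `xs` is already the reduced feature set, and the initialization and iteration count are illustrative choices:

```python
import math

def em_1d(xs, k=2, iters=50):
    """Minimal 1-D EM for a Gaussian mixture: alternate computing each
    component's responsibility for each point (E-step) with re-estimating
    component weights, means, and variances (M-step)."""
    means = [min(xs), max(xs)]          # simple deterministic initialization
    vars_ = [1.0] * k
    weights = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in xs:
            ps = [w * math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
                  for w, m, v in zip(weights, means, vars_)]
            s = sum(ps) or 1e-12
            resp.append([p / s for p in ps])
        # M-step: re-estimate weights, means, and variances
        for j in range(k):
            nj = sum(r[j] for r in resp) or 1e-12
            weights[j] = nj / len(xs)
            means[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
            vars_[j] = max(sum(r[j] * (x - means[j]) ** 2
                               for r, x in zip(resp, xs)) / nj, 1e-6)
    return means
```

On a reduced feature set with two well-separated groups, the fitted means land near the two cluster centers.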
- the machine learning module 116 can comprise a decision tree, a logistic regression model, a random forest, etc.
- when the machine learning module 116 utilizes deep learning, some or all of the featurization or feature extraction may be accomplished automatically using learning from the training data.
- pre-processing of the training data (e.g., using featurization by the featurization module 112 , using the activation group module, by providing labels, etc.) and selection of training data may be used to improve accuracy of the model.
- Selection of training data includes, for example, using domain-specific knowledge to improve performance of the machine learning system. Domain expertise, as used herein, provides a context in which a deep learning system can operate and can be used to select elements of training data, the order in which the training data is presented to the deep learning system, and certain sorts of invariances.
- the computer system 104 can be further programmed to perform featurization (i.e., featurization at the computer system 104 ), in addition to and/or as an alternative to the onboard featurization performed by the sensor assembly 102 .
- the additional featurization can include extracting features that would require computationally expensive processing for the sensor assembly 102 hardware to handle or that would be too large to transmit to the computer system 104 .
- the additional features can be computed by the computer system 104 (e.g., the cloud or a remote server) or an intermediate node between the sensor assembly 102 and the computer system 104 , such as a gateway. In aspects including such an intermediate node, the intermediate node can be considered part of the computer system 104 as described herein.
- the computer system 104 (including, potentially, an intermediate node) can be configured to compute additional features from data corresponding to one or more high sample rate sensors and/or one or more low sample rate sensors of the sensor assembly 102 .
- the additional features computable by the computer system 104 can include, without limitation, band ratios, fractional harmonics, first or second order signal derivatives, MFCCs, and/or statistical features (e.g., min, max, range, mean, median, mode, sum, standard deviation, and centroid) from raw data from the acoustic, EMI, vibration, or other sensors 110 , and/or from already featurized data from the featurization module 112 .
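The statistical features named above can be computed from a window of raw samples with a few lines of Python; the "centroid" here is interpreted as an amplitude-weighted mean sample index, which is an assumption on our part:

```python
import statistics

def statistical_features(window):
    """Server-side statistical feature set for one window of raw samples:
    min, max, range, mean, median, mode, sum, standard deviation, and
    an amplitude-weighted centroid."""
    centroid = (sum(i * abs(v) for i, v in enumerate(window)) /
                (sum(abs(v) for v in window) or 1e-12))
    return {
        "min": min(window),
        "max": max(window),
        "range": max(window) - min(window),
        "mean": statistics.fmean(window),
        "median": statistics.median(window),
        "mode": statistics.mode(window),
        "sum": sum(window),
        "std": statistics.pstdev(window),
        "centroid": centroid,
    }
```

Such a dictionary (or its values flattened into a vector) can be merged with the onboard-featurized data before classification.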
- the computer system 104 can be configured to normalize data from other sensors 110 .
- This server-side featurized sensor data can then be fed, either alone or in combination with the data featurized onboard the sensor assembly 102 , to the machine learning module 116 (or, in some aspects, to the activation group module, which in turn feeds into the machine learning module 116 ) for classification, as described above.
- the machine learning module 116 of the computer system 104 is also configured to train one or more virtual sensors 118 .
- the classifier trained by the machine learning module 116 on the provided training data and/or sensor data associated with a given event can be considered a virtual (or synthetic) sensor 118 for that event.
- a classifier trained by the machine learning module 116 to recognize a boiling kettle according to featurized data from various combinations of infrared, vibration, and/or acoustic sensors of the sensor assembly 102 can be defined as a “kettle boiling” virtual sensor 118 .
- a classifier trained by the machine learning module 116 to recognize the movement of a door according to featurized data from acoustic and/or vibration sensors 110 can be defined as a “door movement” virtual sensor 118 .
- each virtual sensor 118 is trained on data collected by a sensor assembly 102 or combinations of sensor assemblies 102 , and because such sensor assemblies 102 will be located in a number of different types of environments when utilized in the field, the subset of sensors 110 included in each sensor assembly 102 that are correlated with an event type can vary.
- the machine learning model associated with the same event and the same subset of sensors 110 can also vary depending on the environment of the sensor assemblies 102 (e.g., if the background profile is different for the different environments).
- each virtual sensor 118 could also have different parameters and/or weights for an event based on the environment in which the sensor assemblies 102 are located. Therefore, different virtual sensors 118 (i.e. virtual sensors 118 utilizing different subsets of sensors 110 or having different parameters and/or weights assigned to the event) may be implemented to detect the same event in different locations, different environments, or even over different time periods in the same location. In other words, each virtual sensor 118 will be uniquely trained to detect the occurrence of an event according to the sensor data unique to the particular environment(s) in which the sensor assemblies 102 are located and/or at a particular time.
- a “door movement” virtual sensor 118 in a first environment and/or during a first time period could be trained to identify the movement of a door based on a combination of acoustic and vibration data and/or a first machine learning model.
- a “door movement” virtual sensor 118 in a second environment and/or during a second time period could be trained to identify a door closing based solely upon acoustic data and/or a second machine learning model.
- the machine learning model for each virtual sensor 118 could also have different parameters and/or weights based on the environment in which the sensor assemblies 102 are located.
- the sensing system 100 does not utilize any pre-established restrictions on the training of the virtual sensors 118 , thus each virtual sensor 118 will be uniquely trained to detect events according to its environment.
- the virtual sensor 118 can receive and/or subscribe to the data streams from the sensors 110 (i.e., the data streams are transmitted to or pulled by the virtual sensors 118 ) that were activated in accordance with the event (i.e., the sensors 110 that were correlated with the event). For example, if a “kettle boiling” virtual sensor is correlated with infrared and acoustic sensors, the “kettle boiling” virtual sensor will receive and/or subscribe to the data streams from those particular sensors 110 .
- the virtual sensor 118 can subscribe to the data streams of the sensors 110 related to the event when the activation group module determines those sensors 110 are activated (i.e., in the above example, when the infrared and acoustic sensors are determined to be activated). Thereafter, the virtual sensor 118 monitors for the occurrence of the event that the virtual sensor 118 was trained to detect from the data feed transmitted by the correlated sensors 110 of the sensor assembly 102 to the computer system 104 .
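The subscription and monitoring behavior can be sketched as a small class; the classifier callable, channel names, and pub/sub mechanics are assumptions made only to keep the sketch self-contained:

```python
class VirtualSensor:
    """Sketch of a trained virtual sensor that subscribes only to the
    data streams of the sensors correlated with its event. `classify`
    stands in for the trained classifier and `on_event` for whatever
    downstream application consumes the detection."""

    def __init__(self, name, channels, classify, on_event):
        self.name = name
        self.channels = set(channels)   # e.g. {"infrared", "acoustic"}
        self.classify = classify        # dict of featurized frames -> bool
        self.on_event = on_event
        self.latest = {}

    def receive(self, channel, features):
        if channel not in self.channels:
            return                      # not subscribed to this stream
        self.latest[channel] = features
        # classify once a frame from every correlated channel is present
        if self.channels <= self.latest.keys():
            if self.classify(self.latest):
                self.on_event(self.name)
            self.latest.clear()
```

A "kettle boiling" sensor correlated with infrared and acoustic channels would then ignore, say, vibration frames entirely:

```python
events = []
kettle = VirtualSensor(
    "kettle boiling", ["infrared", "acoustic"],
    classify=lambda f: f["infrared"][0] > 0.5 and f["acoustic"][0] > 0.5,
    on_event=events.append)
kettle.receive("vibration", [0.9])   # ignored: not a correlated channel
kettle.receive("infrared", [0.8])
kettle.receive("acoustic", [0.7])    # both frames present -> classify fires
```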
- the virtual sensors 118 can thus detect actions or events directly and/or indirectly (i.e., without requiring that a sensor 110 be physically connected or otherwise associated with the object or person being sensed) by being trained to correlate stimuli detected by the sensors 110 incorporated within the sensor assembly 102 with the occurrences of the particular events.
- the virtual sensors 118 can be implemented by the computer system 104 , i.e., the same computer system 104 on which the virtual sensors 118 are trained by the machine learning module 116 .
- the virtual sensors 118 can be stored on and/or implemented by a second computer system.
- the virtual sensors 118 can be stored in a library after they are trained.
- Other computer systems can then access the library of previously trained virtual sensors 118 for different activities and events and then utilize the previously trained virtual sensors 118 to sense the occurrence of events according to data from their own sensor assemblies 102 .
- Such other computer systems may also update a previously trained virtual sensor 118 .
- While FIG. 1A depicts a virtual sensor 118 being trained and FIG. 1B depicts a resulting trained virtual sensor 118 subscribing to data streams from the sensor assembly 102 , this is described for the sake of convenience and not with an intent of limiting the disclosure as comprising a series and/or a number of steps. It is to be understood that the creation and implementation of virtual sensors 118 to detect events does not need to be performed as a series of steps and/or the steps do not need to be performed in the order shown and described with respect to FIGS. 1A and 1B . In other words, the creation and implementation steps may be integrated and/or may be performed together, or the steps may be performed in the order disclosed or in an alternate order. Furthermore, creation of the virtual sensor 118 is an iterative process and the training of the classifier that forms the virtual sensor 118 may continue to improve and/or modify the virtual sensor 118 (e.g., using a feedback loop).
- Although FIGS. 1A and 1B depict a single virtual sensor 118 being trained and then implemented, the computer system 104 can train any number of virtual sensors 118 to detect the same event in a variety of environments (or at different times) and/or a variety of events.
- FIG. 2 depicts the sensing system 100 wherein the computer system 104 has been trained to implement n virtual sensors 118 based on data from m sensors 110 incorporated with the sensor assembly 102 , where n can be greater than, equal to, or less than m.
- Each virtual sensor 118 can subscribe to (i.e., receive data from) the data stream of one or multiple sensors 110 , in any combination.
- each of the sensor assemblies 102 is in communication with the computer system 104 as described above.
- a virtual sensor 118 could rely on featurized data from different sensor assemblies 102 in making a classification.
- FIG. 3A illustrates a block diagram of a sensing system 100 , in accordance with at least one aspect of the present disclosure, with various components such as the machine learning module 116 omitted for clarity.
- the sensing system 100 can include a hierarchical structure of virtual sensors 118 .
- the virtual sensors 118 that receive the featurized data from the sensors 110 of the sensor assembly 102 to make their classifications can be referred to as first order virtual sensors 120 .
- the computer system 104 can further be configured to implement second order virtual sensors 124 that receive and process, among other things, the outputs of one or more first order virtual sensors 120 to make their “second order” classifications; third order virtual sensors that receive and process, among other things, the outputs of one or more second order virtual sensors 124 to make their “third order” classifications; and so on for subsequent orders of virtual sensors 118 .
- the computer system 104 can be configured to implement xth order virtual sensors that receive the outputs from one or more (x−1)th or lower order virtual sensors to detect the occurrence of an event or condition (e.g., make a classification that the event occurred or that a condition is or is not present).
- the second order virtual sensors 124 could also subscribe to and/or receive other, non-first order virtual sensor data.
- a second order virtual sensor 124 could receive data from at least one first order virtual sensor 120 , as well as featurized data from one or more of the sensors 110 of the sensor assembly 102 , in order to make its classification. This applies to higher order sensors as well.
- an xth order virtual sensor could receive data from at least one (x−1)th order sensor, as well as (i) data from lower order sensors (e.g., (x−2)th, (x−3)th, etc.) and/or (ii) featurized data from one or more of the sensors 110 of the sensor assembly 102 , in order to make its classification.
- the higher order virtual sensors can include algorithms that, for example and without limitation, count the number of occurrences or duration of an event detected by a lower order virtual sensor, algorithms that smooth the outputs of lower order virtual sensors (and, in some cases, the sensors 110 of the sensor assembly 102 ), algorithms that combine the outputs of multiple lower order virtual sensors and/or sensors 110 (featurized and/or raw data), or the like.
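One of the smoothing algorithms mentioned above can be sketched as a majority vote over a sliding window of a lower order sensor's binary output; the window length is an illustrative assumption:

```python
from collections import deque

class SmoothingSensor:
    """Higher order virtual sensor that smooths a lower order sensor's
    binary output with a majority vote over a sliding window,
    suppressing single-frame glitches."""

    def __init__(self, window=5):
        self.buf = deque(maxlen=window)

    def update(self, detection):
        self.buf.append(bool(detection))
        # output True only when more than half the recent frames agree
        return sum(self.buf) > len(self.buf) / 2
```

A single spurious False in a run of True detections is voted away rather than reported as the event ending.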
- a second order virtual sensor 124 could indicate whether an occupant is present within a home by analyzing the outputs of multiple human activity-related first order virtual sensors 120 , such as a “washing dishes” first order virtual sensor 120 , a “movement in the kitchen” first order virtual sensors 120 , and so on.
- a third order virtual sensor could output an alarm if the second order virtual sensor 124 determines that a home owner is not present (e.g., by determining that for a threshold period of time the first order virtual sensors 120 have indicated that no lights have been turned on) and another first order virtual sensor 120 detects a fire event.
- the higher order virtual sensors can receive the outputs of one or more lower order virtual sensors and/or sensors 110 of the sensor assembly and then make its corresponding classification accordingly.
- the higher order virtual sensors can, as with the first order virtual sensors 120 , include classifiers trained by a machine learning module on the output from one or more lower order sensors to identify the occurrence of a trained-for event or condition.
- the higher order virtual sensors can be trained on the outputs of at least one immediately lower order of virtual sensor in the hierarchical structure, rather than strictly on the outputs of the sensors 110 of the sensor assembly 102 .
- FIG. 3A depicts the computer system 104 including a second machine learning module 122 that receives the outputs (data) from the first order virtual sensors 120 to generate a second order virtual sensor 124 .
- the data from the first order virtual sensors 120 can additionally be processed by a featurization module and/or an activation group module, as described above with respect to FIGS. 1A and 1B , prior to being processed by the second machine learning module 122 .
- the computer system 104 can also implement a featurization module for featurizing the data from a virtual sensor 118 .
- the second order virtual sensor 124 receives data streams from the first order virtual sensors 120 that the second machine learning module 122 determined to be correlated with the event on which the particular second order virtual sensor 124 was being trained. Thereafter, the second order virtual sensor 124 monitors for the occurrence of the event that the second order virtual sensor 124 was trained to detect from the data feed generated by the first order virtual sensors 120 .
- Although FIGS. 3A and 3B depict a single second order virtual sensor 124 being trained and then implemented, the computer system 104 can train any number of higher order virtual sensors 118 to detect a variety of events.
- FIG. 4 depicts the sensing system 100 wherein the computer system 104 has been trained to implement p second order virtual sensors 124 and n first order virtual sensors 120 based on data from m sensors 110 incorporated with the sensor assembly 102 , where p can be greater than, equal to, or less than n.
- Each second order virtual sensor 124 can subscribe to (i.e., receive data from), at least, the data stream of one or multiple first order virtual sensors 120 , in any combination. Further, these same principles apply to third and higher order virtual sensors implemented by the computer system 104 .
- FIG. 4 depicts the higher order virtual sensors receiving data from different levels or orders of sensors.
- the pth second order virtual sensor 124 is depicted as receiving data from the mth sensor 110 , in addition to data from the directly preceding 2nd and nth first order virtual sensors 120 .
- the first order virtual sensors 120 can produce a binary output (e.g., are binary classifiers). For example, a first order virtual sensor 120 trained to identify whether a faucet is running or whether someone is at their desk working could produce continuous, time-stamped binary “yes” or “no” outputs. In this aspect, higher order virtual sensors can further produce nonbinary outputs, such as state (of an object or environment), count, and duration. For example, the sensing system 100 could implement five separate first order virtual sensors 120 that track five separate aspects of a microwave: whether the microwave is running, whether the keypad has been pressed, whether the door has been opened, whether the door has been closed, and whether the completion chime has sounded.
- a second order virtual sensor 124 could generate a nonbinary output of the states of the microwave: available, door ajar, in-use, interrupted, or finished.
- the microwave state output of the second order virtual sensor 124 can change from “in-use” to “finished.”
- the microwave state output can stay as “finished” until a “door closed” event is detected (i.e., the “door closed” first order virtual sensor 120 is activated), after which the items inside the microwave are presumed to have been removed and the microwave state output of the second order virtual sensor 124 is changed to “available.”
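The microwave example above is, in effect, a small state machine driven by the binary first order detections. The transition table below is an illustrative reading of the behavior described in the text, not a definitive specification:

```python
class MicrowaveStateSensor:
    """Second order virtual sensor deriving a nonbinary microwave state
    (available, door ajar, in-use, interrupted, finished) from binary
    first order detections."""

    def __init__(self):
        self.state = "available"

    def on_event(self, event):
        if event == "running":
            self.state = "in-use"
        elif event == "chime":
            self.state = "finished"
        elif event == "door opened":
            if self.state == "in-use":
                self.state = "interrupted"      # opened mid-run
            elif self.state != "finished":      # stays "finished" until closed
                self.state = "door ajar"
        elif event == "door closed":
            if self.state in ("finished", "door ajar", "interrupted"):
                self.state = "available"        # contents presumed removed
        return self.state
```

Running through a normal cycle, the state moves from in-use to finished at the chime, stays finished while the door is open, and returns to available once the door closes.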
- Second order virtual sensors 124 need not be connected to multiple first order virtual sensors 120 to produce nonbinary outputs though.
- the sensing system 100 could implement a first order virtual sensor 120 that detects when a door is opened and a second order virtual sensor 124 that counts the number of times that the first order virtual sensor 120 has been activated.
- the sensing system 100 could implement a first order virtual sensor 120 that detects when a faucet is running and a second order virtual sensor 124 that tracks the time duration that the first order virtual sensor 120 is activated. That way, an approximation of the total amount of water used could be computed.
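The duration-tracking example can be sketched as a second order sensor that accumulates active time from the first order sensor's binary output; the flow-rate constant used to approximate water usage is an illustrative assumption:

```python
class DurationSensor:
    """Second order virtual sensor accumulating how long a lower order
    binary sensor (e.g. "faucet running") has been active."""

    def __init__(self):
        self.total = 0.0        # accumulated active seconds
        self._since = None      # timestamp of the current active run, if any

    def update(self, active, timestamp):
        if active and self._since is None:
            self._since = timestamp
        elif not active and self._since is not None:
            self.total += timestamp - self._since
            self._since = None
        return self.total

def estimate_water_liters(seconds, flow_lps=0.13):
    """Approximate water used, assuming a nominal faucet flow rate."""
    return seconds * flow_lps
```

Two runs of the faucet (15 s and 5 s) accumulate 20 seconds of active time, from which total water usage can be approximated.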
- the first order virtual sensors 120 can produce nonbinary outputs.
- the first order virtual sensors 120 could include multi-class classifiers trained to output one of several labels. As described above, the first and second (or higher order) virtual sensors could be trained to detect other, binary or nonbinary, conditions, states, durations, etc.
- the sensing system 100 can infer increasingly richer details about the environment in which the sensor assembly 102 is located.
- multiple sensor assemblies 102 can be communicably connected to the computer system 104 and the data feeds from the multiple sensor assemblies 102 can be combined to provide additional data that can be processed by machine learning to infer information about the environment from correlated data from the sensor assemblies 102 .
- one or more appliance-level second order virtual sensors could feed into a kitchen-level third order virtual sensor, which could in turn feed into a house-level fourth order virtual sensor, and so on.
- a house-level virtual sensor drawing on multiple lower order sensors (whether they are virtual sensors or actual sensors disposed on one of the sensor assemblies within the house) across many rooms can classify complex facets like human activity. Tracking human activities accurately can be very useful in a variety of contexts, such as with smart homes, healthcare tracking, managed care for the elderly, and security and safety of human occupants.
- the outputs of the various virtual sensors 118 can further be fed into applications executed by the computer system 104 , an external client 106 ( FIG. 5 ), and/or other local or remote computer systems via, e.g., an application program interface (API).
- the computer system 104 , a client 106 ( FIG. 5 ), and/or other computer systems may execute a virtual machine receiver program or application to display the output in an application window, a browser, or other output window.
- the output of a virtual sensor 118 counting the number of times that paper towels are dispensed from a particular dispenser could be fed into a local or remote application that automatically orders paper towels once the count has reached a threshold.
- the output of a virtual sensor 118 which tracks anomalous conditions of a machine by considering vibrations and audio signatures could be fed into a local or remote application that notifies the machine's maintainer by sounding an alarm and safely shuts down the machine.
- the output of a virtual sensor 118 tracking the duration that a light is on in a room could be fed into a local or remote application that automatically turns the light off once the duration has reached a threshold.
- the output of a virtual sensor 118 monitoring the state of a washing machine could be fed into a local or remote application that automatically notifies the user (e.g., via a text message or a push notification on the user's mobile phone) when the drying cycle of the washing machine has completed.
- event detections from multiple first order virtual sensors 120 located in a home could be fed to a second order virtual sensor 124 to track the various activities of the occupant of a home (e.g., daily routines), which are then fed into an anomaly detection system (which may be an even higher order virtual sensor or a separate application) to notify a caregiver if an elderly person's patterns deviate from their normal patterns (e.g., if the tracked individual falls down, fails to wake up at the usual time, etc.).
- FIG. 5 illustrates a block diagram of a sensing system 100 including multiple sensor assemblies 102 communicably coupled to a computer system 104 , in accordance with at least one aspect of the present disclosure.
- Each of the sensor assemblies 102 includes a plurality of sensors 110 for detecting various physical or natural phenomena in the environment in which the sensor assembly 102 is located and a control circuit for executing the various functions of the sensor assembly 102 .
- the control circuit can include, for example, a processor coupled to primary and/or secondary computer memory for executing instructions stored on the memory, a microcontroller, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), and other such devices.
- the sensor assembly 102 includes a microcontroller 121 that includes a processor 123 coupled to a memory 125 .
- the microcontroller 121 executes firmware 129 , including system firmware 129 A and application firmware 129 B, stored in the memory 125 , and includes a clock 127 .
- the firmware 129 can include, for example, firmware for the featurization module 112 (see FIG. 1 ), which can be executed by the control circuit (e.g., microcontroller 121 ).
- the control circuit can be embodied as a system on a chip (SoC).
- the sensor assembly 102 is communicably connectable to the computer system 104 (e.g., one or a number of networked servers) such that the computer system 104 can receive the signals or data generated by the sensors 110 for processing thereon, as described above.
- each sensor assembly 102 is communicably connectable to the computer system 104 via a data communication network 108 , such as the Internet, a LAN, a WAN, a MAN, or any other suitable data communication network.
- the sensor assembly 102 can include an appropriate network interface for connecting to the data communication network 108 such as, for example, a Wi-Fi network interface controller.
- the sensor assembly 102 can communicably connect to the computer system 104 utilizing other wired or wireless communication protocols or other communication networks (e.g., a cellular telecommunication network or Ethernet).
- the network interface controller of the sensor assembly 102 may include a network interface controller suitable to implement wireless or wired communication utilizing a variety of communication protocols and/or access methods, such as cellular, Bluetooth, ZigBee, RFID, Bluetooth low energy, NFC, IEEE 802.11, IEEE 802.15, IEEE 802.16, Z-Wave, HomePlug, global system for mobile (GSM), general packet radio service (GPRS), enhanced data rates for GSM evolution (EDGE), code division multiple access (CDMA), universal mobile telecommunications system (UMTS), long-term evolution (LTE), LTE-advanced (LTE-A), LoRa (or another lower power wide-area network communication protocol), or any other suitable wired and/or wireless communication method or combination thereof.
- the network 108 may include one or more switches and/or routers, including wireless routers that connect the wireless communication channels with other wired networks (e.g., the Internet).
- the data communicated in the network 108 may include data communicated via short messaging service (SMS), multimedia messaging service (MMS), hypertext transfer protocol (HTTP), direct data connection, wireless application protocol (WAP), email, smart energy profile (SEP), ECHONET Lite, OpenADR, or any other protocol that may be implemented with the sensor assemblies 102 , physical hubs, cloud server communication, or gateway modules.
- one or more of the sensor assemblies 102 may also be communicably connected to each other, and/or to a client 106 (e.g., a user device) via the network 108 and/or via a separate network.
- a network may be a local network established by a local router or a local switch.
- the sensor assemblies 102 may form a peer-to-peer (P2P) network and may communicate with each other directly.
- service discovery schemes can multicast the presence of nodes, their capabilities, and group membership.
- the peer-to-peer devices can establish associations and subsequent interactions based on this information.
- a sensor assembly 102 may implement one or more application-layer communication protocols, such as constrained application protocol (CoAP), message queue telemetry transport (MQTT), OPC UA, HTTP, and REST APIs, for implementing a respective messaging protocol. The sensor assembly 102 may also implement lower-layer communication protocols, i.e., layers of a communication protocol stack below the application layer.
- Example layers implemented may include one or more of the physical, data link, network, transport, session, internet, and presentation protocols.
- Example protocols implemented include one or more of: Ethernet, Internet Protocol, Transmission Control Protocol (TCP), protocols for the 802.11 standard (e.g., PHY, Medium Access Control, Logical Link Control), and the like.
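By way of illustration, an application-layer payload of the kind a sensor assembly 102 might publish over MQTT or CoAP can be sketched as follows; the topic scheme and field names are illustrative assumptions, not part of the disclosure:

```python
import json

def make_reading_message(assembly_id, sensor_name, features, timestamp):
    """Build an application-layer message (topic plus JSON body) of the
    kind an assembly might publish over MQTT or CoAP. The topic
    hierarchy and field names here are assumptions for illustration."""
    topic = f"sensors/{assembly_id}/{sensor_name}"
    body = json.dumps({"t": timestamp, "features": features})
    return topic, body

topic, body = make_reading_message("asm-01", "vibration", {"rms": 0.12}, 1700000000)
```

An actual deployment would hand such a payload to an MQTT or CoAP client library; the sketch only shows the application-layer content.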
- the computer system 104 may be a virtual machine.
- the virtual machine may be any virtual machine; in some embodiments, the virtual machine may be managed by a Type 1 or Type 2 hypervisor, for example, a hypervisor developed by Citrix Systems, IBM, VMware, or any other hypervisor.
- in some aspects, the virtual machine may be managed by a hypervisor executing on a server or by a hypervisor executing on a user device.
- the client 106 may display application output generated by an application remotely executing on a server or other remotely located machine (e.g., for controlling, communicating with, and/or accessing data from a sensor assembly 102 and/or for controlling and/or communicating with various objects of an environment being sensed).
- the client device may execute a virtual machine receiver program or application to display the output in an application window, a browser, or other output window.
- FIG. 6 depicts an embodiment of the sensor assembly 102 according to various embodiments.
- the sensor assembly 102 comprises a single circuit board 128 with the various sensors and control circuit connected thereto.
- sensors and control circuit can be positioned on one side of the circuit board, and the sensor assembly 102 can further comprise a connector 126 (e.g., a USB connector, power plug, or Ethernet connector for providing power via a power-over-Ethernet interface) on the opposite side of the circuit board for supplying power to the sensors 110 , microcontroller 121 , and other electronic components of the sensor assembly 102 .
- the sensor assembly 102 is intended to be plugged into an electrical outlet within the area or environment to be monitored by the sensor assembly 102 .
- the sensor assembly 102 can be held in a stationary, non-mobile position relative to the object to which it is connected or a reference frame.
- the sensor assembly 102 does not strictly require a power source that must be replaced or recharged, obviating the need to limit the processing power of the sensor assembly 102 and the number and/or utilization of the sensors 110 in order to attempt to conserve power.
- the sensor assembly 102 can utilize certain types of sensors (e.g., vibration sensors) to detect changes in the environment within the vicinity of the sensor assembly 102 relative to a fixed position.
- the sensors 110 can be selected to account for the potential suboptimal placement of the sensor assembly 102 relative to the object or location being sensed.
- the sensor assembly 102 may need to be placed in a suboptimal location because the placement of the sensor assembly 102 will be contingent upon the location of an electrical outlet, which means that the sensor assembly 102 could potentially be located a relatively far distance from the object or location being sensed. Therefore, the sensors 110 can utilize indirect sensing techniques.
- the sensors 110 , connector 126 , microcontroller 121 , and various other components of the sensor assembly 102 can be supported upon a printed circuit board (PCB) substrate 128 .
- the sensors 110 can be disposed on a first surface of the PCB substrate 128 and the connector 126 can be disposed on a second, opposing surface of the PCB substrate 128 so that the sensors 110 are oriented outwardly towards the environment when the connector 126 is plugged into or connected to a corresponding socket.
- the sensors 110 can be mounted on various layers of the PCB substrate 128 .
- sensors 110 such as, without limitation, an EMI sensor configured to measure the electromagnetic interference in the line voltage of a power circuit caused by an electrical device may be included in a first one or more layer(s) of the PCB substrate 128 and other sensors 110 may be included in a different layer of the PCB substrate.
- the sensor assembly 102 further includes a housing enclosing the various components. That is, the housing can house the PCB substrate 128 and the sensors 110 connected thereto. The housing can protect against physical damage and electrostatic discharge. Further, the housing can be designed to accommodate sensors 110 that require line of sight and access to the environment's air by, for example, having access cutouts for the relevant sensors 110 .
- the housing could be constructed from, for example, laser cut cast acrylic and/or constructed via injection molding or 3D printing processes.
- the sensor assembly 102 could comprise multiple PCB substrates, with the sensors 110 on different PCB substrates.
- the housing can enclose all of the sensors 110 and PCB substrates.
- the sensors 110 can include various combinations of sensing devices that are configured to detect various different physical or natural phenomena.
- the sensors 110 include an infrared radiation sensor 130 (e.g., a Panasonic Grid-EYE AMG8833 infrared array sensor), an ambient light color and/or intensity sensor 132 (e.g., a TAOS TCS34725 RGB sensor), a magnetic field sensor 134 (e.g., a Freescale Xtrinsic MAG3110 magnetometer), a temperature sensor 136 , an ambient pressure sensor, a humidity sensor (e.g., all part of a Bosch BME280 environmental sensor), an air quality or air composition sensor (e.g., a Bosch BME680 sensor for sensing the presence of certain volatile organic compounds), a vibration sensor 138 (e.g., an InvenSense MPU-6500 six-axis accelerometer and gyroscope motion tracking sensor, which can detect vibrations through the structure when the sensor assembly 102 is secured to an electrical outlet), an external device detection sensor 140 (e.g., a 2.4 GHz network interface controller for detecting the presence and/or activity of external electronic devices connected to the Wi-Fi network or a Bluetooth LE sensor for detecting the presence of external electronic devices in the vicinity of the sensor assembly 102 ), a motion sensor 142 (e.g., a Panasonic AMN21111 PIR motion sensor), an acoustic sensor 144 (e.g., an Analog Devices ADMP401 microphone), and an EMI sensor 146 (e.g., a 100 mH inductor to capture over-air EMI and/or a passive RC network to sense EMI changes in the line voltage of the power source to which the sensor assembly 102 is connected).
- the sensor assembly 102 can utilize any number and combination of the aforementioned sensors and any other types of sensors 110 for detecting physical or natural phenomena.
- the sensors 110 can be analog or digital sensors.
- the sensor assembly 102 does not comprise a high-resolution camera (i.e., higher resolution than a thermal imager, such as an infrared radiation sensor 130 ).
- the sensing system 100 can make the detections and classifications described herein without use of a camera, which decreases the cost and power consumption of the sensor assembly 102 . It also decreases the amount of data that needs to be featurized onboard and transmitted to the computer system 104 since there is no image data to featurize and transmit.
- the sensor assembly 102 can further include one or more interfaces that can be utilized to connect to or communicate with additional sensors external to the sensor assembly 102 .
- the interfaces can include, for example, Serial Peripheral Interface (SPI), Inter-Integrated Circuit (I 2 C), General Purpose Input/Output pins (GPIOs), and/or universal asynchronous receiver-transmitter (UART).
- the interfaces allow additional external sensors to be connected to the sensor assembly 102 in order to supplement and/or extend the functionality of the sensor assembly 102 .
- Additional sensors that could be modularly connected to the sensor assembly 102 via the interfaces could include, for example, motion sensors (e.g., Doppler radar sensors), EMI sensors configured to detect the transients caused in the line voltage of the power source directly to which the sensor assembly 102 is connected, a lidar sensor, an ultrasonic sensor, and/or an active noise management system.
- the sensors 110 of the sensor assembly 102 can include passive sensors and/or active sensors.
- a passive sensor is a sensor that simply detects or senses various physical or natural phenomena of an environment. Examples of such passive sensors are described above and may include, without limitation, vibration sensors, microphones, EMI sensors, infrared radiation sensors, acoustic sensors, temperature sensors, humidity sensors, camera, motion sensors (e.g., accelerometer, gyroscope, etc.), electric field sensors, chemical sensors, photo sensors, or the like.
- An active sensor is a sensor that measures signals transmitted by the sensor that were reflected, refracted, or scattered by an object in the environment and/or disturbances caused by the transmitted signals in the environment.
- an output device (described below) of a sensor assembly 102 may be configured to transmit a signal that may be reflected, refracted or scattered by an object of the environment and/or may cause disturbances in the environment, where such reflection, refraction, scattering, and/or disturbance is subsequently sensed by a sensor 110 of the sensor assembly 102 , thereby forming an active sensor assembly 102 without an actual active sensor 110 .
- the data from such active sensors could be featurized and used to detect events/conditions by the first or second (or higher) order virtual sensors.
- the active sensors could also be used to calibrate a space in which the sensor assembly 102 is located, as described further below.
- an active sensor can be used for authentication of an object and/or a person as described below.
- the acoustic sensor can be utilized via an active sound management system that transmits a sound signal and receives the reflected, refracted, and/or scattered signal to determine, for example, the sensor assembly's 102 position relative to walls or other structures within its vicinity and calibrate the acoustic sensor (and/or other sensors 110 ) accordingly.
- Such calibrations can be utilized to, for example, compensate for echoes or other audio artifacts that could interfere with the detection of certain events.
- the audio artifacts can be compensated for by, for example, signal processing techniques executed onboard the sensor assembly 102 to reduce errors.
- the ultrasonic sensor can be utilized to emit sound waves in order to calibrate other sensor assemblies 102 that are within the detection distance.
- Such audio signals can be utilized to pair sensor assemblies 102 together and/or allow the sensing system 100 to determine the spatial orientation of the various sensor assemblies 102 relative to each other within an environment.
- in examples in which the sensor assembly 102 includes a speaker, the sensor assembly 102 can output a particular sound pattern (e.g., a frequency sweep tone from configurable low frequency values to high frequency values) and have either the microphone on the same sensor assembly 102 , or a different sensor assembly 102 in the vicinity, detect the audio signal using a microphone sensor to actively measure and calibrate for the environment.
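The frequency sweep tone mentioned above can be sketched as a linear chirp; the linear sweep shape, parameter names, and rates are illustrative assumptions:

```python
import math

def chirp_samples(f_start, f_end, duration_s, sample_rate):
    """Generate a linear frequency sweep (chirp) from f_start to f_end
    over duration_s seconds at the given sample rate. A linear sweep is
    assumed; the disclosure only specifies configurable low-to-high
    frequency values."""
    n = int(duration_s * sample_rate)
    k = (f_end - f_start) / duration_s  # sweep rate in Hz per second
    out = []
    for i in range(n):
        t = i / sample_rate
        # instantaneous phase of a linear chirp: 2*pi*(f0*t + (k/2)*t^2)
        phase = 2 * math.pi * (f_start * t + 0.5 * k * t * t)
        out.append(math.sin(phase))
    return out

samples = chirp_samples(100.0, 1000.0, 0.1, 8000)
```

A speaker would play these samples while a microphone on the same or a nearby assembly records the response for calibration.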
- the sampling rate of each of the sensor assembly's 102 sensors 110 can be automatically varied according to the sensor 110 type or the property or phenomena being sensed.
- the vibration sensor 138 could have a high sample rate and the temperature sensor 136 could have a low sample rate (because temperature generally changes relatively slowly). Varying the sampling rate according to the property being sensed by the sensors 110 allows data to be collected at the rate needed to capture environmental events, without unnecessary fidelity and the accompanying processing and transmission requirements.
- the temperature sensor 136 , humidity sensor, ambient pressure sensor, light color and/or light intensity sensors 132 , magnetic field sensor 134 , electronic device sensor 140 , infrared radiation sensor 130 , and motion sensor 142 are each sampled at, for example, about 8 Hz to about 12 Hz, and preferably at about 9 Hz, 10 Hz, or 11 Hz;
- the vibration sensor 138 is sampled at, for example, about 3 kHz to about 5 kHz, and preferably at about 3.8 kHz, 3.9 kHz, 4 kHz, 4.1 kHz, or 4.2 kHz (e.g., each axis of a three-axis accelerometer is sampled at, for example, 3.8 kHz, 3.9 kHz, 4 kHz, 4.1 kHz, or 4.2 kHz);
- the acoustic sensor 144 is sampled at, for example, about 15 kHz to about 19 kHz, about 16 kHz to about 18 kHz, or at about 17 kHz.
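As a rough sketch, the per-sensor rates above can be captured in a configuration table; the dictionary keys and helper function are illustrative assumptions, with the rates taken from the preferred values listed above:

```python
# Assumed per-sensor sampling rates (Hz), drawn from the ranges above.
SAMPLE_RATES_HZ = {
    "temperature": 10, "humidity": 10, "ambient_pressure": 10,
    "light_color": 10, "magnetic_field": 10, "electronic_device": 10,
    "infrared": 10, "motion": 10,
    "vibration": 4000,   # per accelerometer axis
    "acoustic": 17000,
}

def samples_per_window(sensor_name, window_s):
    """Number of raw samples a sensor produces over a window of
    window_s seconds at its configured rate."""
    return int(SAMPLE_RATES_HZ[sensor_name] * window_s)
```

Such a table makes the asymmetry concrete: one second of acoustic data yields 17,000 samples while one second of temperature data yields only 10, which is why per-sensor rates matter for processing and transmission load.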
- FIG. 7 depicts a timeline 200 illustrating example data sampling timescales for a variety of sensors 110 and the events detectable by those sensors at those sampling rates.
- the vibration sensor 138 , acoustic sensor 144 , and EMI sensor 146 can sample on timescales on the order of milliseconds to minutes;
- the light color sensor 132 can sample on the order of seconds to days;
- the illumination sensor 132 can sample on the order of seconds to months;
- the motion sensor 142 and the infrared sensor 130 can sample on the order of seconds to weeks;
- the electronic device sensor 140 can sample on the order of minutes;
- the temperature sensor 136 , the ambient pressure sensor, the humidity sensor, and the magnetic field sensor 134 can sample on the order of minutes to months.
- Sampling this array of sensors 110 at these rates allows the sensor assembly 102 to detect events 202 ranging from EMI spikes (e.g., from a microwave) via the EMI sensor 146 , door knocks via the vibration and acoustic sensors 138 , 144 , and/or other events that occur on the order of milliseconds, to daylight changes via the ambient light sensor 132 , seasonal changes via a variety of sensors, and/or other events that occur on the order of months. As depicted, a variety of other events such as tools running, light usage, and appliance usage can be detected at timescales between these extremes.
- each sensor 110 can be buffered (e.g., a rolling 256-point buffer) in case communication is lost with the computer system 104 or the sensor assembly 102 is otherwise unable to transmit data for a period of time.
- This buffered data can be stored on the internal memory of the microprocessor on the sensor assembly 102 or an external memory module connected to the sensor assembly 102 .
- the sensor assembly 102 can be configured to resume sending the buffered data when communication resumes and can overwrite older sensor data to keep only the most recent samples if the memory space runs out.
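The rolling buffer and resume-on-reconnect behavior described above can be sketched as follows; the class name and API are assumptions, with only the 256-point capacity and drop-oldest policy taken from the description:

```python
from collections import deque

class SensorBuffer:
    """Rolling buffer in the spirit of the 256-point buffer described
    above: the oldest samples are overwritten when space runs out, and
    pending samples are drained when the link comes back."""
    def __init__(self, capacity=256):
        self._buf = deque(maxlen=capacity)  # deque drops oldest when full

    def push(self, sample):
        self._buf.append(sample)

    def drain(self):
        """Return and clear all buffered samples, oldest first (i.e.,
        what the assembly would transmit when communication resumes)."""
        out = list(self._buf)
        self._buf.clear()
        return out

buf = SensorBuffer(capacity=4)
for s in range(6):        # simulate 6 samples arriving during an outage
    buf.push(s)
pending = buf.drain()     # only the 4 most recent samples survive
```

`deque(maxlen=...)` gives the overwrite-oldest policy for free, which is why it is a natural stand-in here.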
- the sensor assemblies 102 communicate with a computer system 104 , such as a server, server system, or cloud-based computing architecture, that provides the back end computational analysis for the sensing system 100 .
- Data can be processed at multiple stages in the sensing system 100 , such as onboard the sensor assemblies 102 , at the computer system 104 , and/or at another computer system (e.g., a gateway that is part of the computer system 104 ).
- the sensor assembly 102 performs onboard featurization of the sensor 110 data and the computer system 104 processes the data through a machine learning model.
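A minimal sketch of onboard featurization, assuming a simple summary-statistics feature set (the disclosure leaves the exact features configurable):

```python
import math

def featurize(window):
    """Compress a raw sample window into summary statistics before
    transmission to the computer system. The exact feature set
    (min/max/mean/RMS here) is an assumption for illustration."""
    n = len(window)
    mean = sum(window) / n
    rms = math.sqrt(sum(x * x for x in window) / n)
    return {"min": min(window), "max": max(window), "mean": mean, "rms": rms}

feats = featurize([0.0, 1.0, -1.0, 2.0])
```

Featurizing onboard reduces a high-rate window (e.g., thousands of vibration samples) to a handful of numbers, which is the transmission saving the description relies on.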
- the sensing system 100 can be a distributed computing system, wherein computational resources can be dynamically shifted between the sensor assembly 102 and distributed computers/servers of the computer system 104 .
- a central computer/server in the computer system 104 can control the amount of computational resources spent by each node in the distributed computing system.
- the central computer/server can offload computational resources (e.g., for data featurization) to the sensor assembly 102 and/or an intermediate gateway if other servers/computers in the computer system 104 begin to slow or become overtaxed, and vice versa.
- the computer system 104 can be accessible via a client 106 , such as a personal computer, laptop or mobile device, through a console user interface or a graphical user interface (GUI), such as a web browser or mobile app.
- When the client 106 connects to the computer system 104 , the computer system 104 can permit the client 106 to access the data from the sensor assemblies 102 .
- the client 106 may only access the data registered to the user account through which the client 106 has accessed the computer system 104 or otherwise allow the user of the client 106 to access the data from the sensor assemblies 102 associated with the user.
- a user can visualize the featurized data transmitted from the sensor assembly 102 to the computer system 104 through the GUI.
- the GUI can provide spectrograms, line charts, and other graphical and/or numerical formats for viewing the received data. Further, sensor streams could be separated into time and frequency domain components.
- the GUI can be customized to visualize only a subset of the featurized sensor streams, as desired by the user. For example, FIGS. 7-13 depict various graphical formats in which the featurized data can be presented via a GUI on a client to a user.
- the GUI can be configured to automatically provide an alert when the data from one or more sensor channels exceeds a particular threshold, a particular event or condition is detected from the received data, and/or other rules programmed or otherwise specified by the user are satisfied. Further, the threshold(s) and/or other rules for providing an alert can be configurable via the GUI.
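The threshold-based alerting described above can be sketched as a rule check over the latest featurized readings; the rule format (channel name mapped to a user-configured threshold) is an illustrative assumption:

```python
def check_alerts(readings, rules):
    """Return the sensor channels whose latest reading exceeds the
    user-configured threshold. 'rules' maps channel name -> threshold;
    this format is an assumption for illustration."""
    return [channel for channel, threshold in rules.items()
            if readings.get(channel, 0) > threshold]

alerts = check_alerts({"emi": 0.9, "humidity": 41.0},
                      {"emi": 0.5, "humidity": 80.0})
```

A GUI would surface each returned channel as an alert and let the user edit the thresholds in `rules`.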
- the GUI can allow users to enable and disable particular sensor 110 streams from the sensor assembly 102 (i.e., cause the sensor assembly 102 to deactivate or stop sampling the particular sensor(s) 110 ), modify sampling frequencies of the sensor(s) 110 , allow the user to permit other users to access the sensor data associated with his or her user account, and configure other features associated with the back end computer system 104 .
- the interface can be configured to control whether some or part of the data featurization occurs onboard the sensor assembly 102 , at an intermediate gateway between the sensor assembly 102 and the computer system 104 , and/or at the computer system 104 .
- the interface can be configured to assist a user in providing and/or verifying labels for raw data and/or featurized data, as discussed above.
- the computer system 104 can further implement a management module that allows firmware and/or software updates to be transmitted to the sensor assemblies 102 .
- This management module can be controlled via an interface of the client 106 , e.g. the GUI. Further, the management module of the interface could allow custom code to be deployed at each sensor assembly 102 , as desired by the user. Still further, the management module could collect and store telemetry information, such as uptime of the sensors 110 and/or sensor assembly 102 , data rates for the sensors 110 , reboots of the sensor assembly 102 , and so on.
- the management module could further allow users to adjust the sampling rates of the sensors 110 of the sensor assemblies 102 (e.g., on a sensor-by-sensor basis and/or on a categorical basis across all of the sensor assemblies 102 ). Still further, the management module can instruct the sensor assembly 102 as to which features should be extracted for a particular sensor 110 and at what rate.
- FIG. 8 illustrates a first sensor data graphical display 210 annotated with events detected by the sensing system 100 , in accordance with at least one aspect of the present disclosure.
- the sensor assembly 102 includes a motion sensor 142 , a temperature sensor 136 , a humidity sensor, an electronic device sensor 140 , an EMI sensor 146 , and an ambient light color sensor 132 .
- the first sensor data graphical display 210 depicts the sensor data captured by a sensor assembly 102 placed within a studio apartment over the course of a 24-hour period. Based on the depicted sensor data, a variety of virtual sensors 116 could be trained to detect events correlated with the data captured by the sensor assembly 102 .
- a virtual sensor 116 could be trained to detect when a person is awake according to a combination of the motion data 212 and the ambient light color data 222 , which detect the movement of the occupant and the occupant turning on a lamp, respectively.
- a virtual sensor 116 could be trained to detect when the occupant is showering according to the humidity data 216 .
- a virtual sensor 116 could be trained to detect when the occupant is streaming TV according to variations in the ambient light color data 222 and the electronic device data 218 , which in this case is a Wi-Fi sensor configured to detect when electronic devices are being utilized according to the Received Signal Strength Indicator (RSSI) of the Wi-Fi.
- a virtual sensor 116 could be trained to detect when the occupant has come home according to a combination of the motion data 212 and/or the temperature data 214 (wherein the rising temperature could result from the occupant increasing the thermostat when he or she comes home).
- a virtual sensor 116 could be trained to detect when the microwave was being utilized according to the EMI data 220 , which can detect the EMI spike from the microwave being activated.
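Detecting the microwave's EMI spike amounts to flagging samples that jump well above a trailing baseline; a minimal sketch, in which the window size and threshold factor are assumptions:

```python
def detect_spikes(stream, baseline_window=8, factor=3.0):
    """Return indices of samples that exceed 'factor' times the mean of
    the preceding 'baseline_window' samples -- a simple stand-in for
    spotting a microwave-style EMI transient. Window size and factor
    are illustrative assumptions."""
    hits = []
    for i in range(baseline_window, len(stream)):
        baseline = sum(stream[i - baseline_window:i]) / baseline_window
        if stream[i] > factor * max(baseline, 1e-9):
            hits.append(i)
    return hits

# Simulated EMI stream: steady background with one transient spike.
hits = detect_spikes([1.0] * 10 + [10.0] + [1.0] * 5)
```

A trained virtual sensor 116 would learn this pattern from labeled data rather than use a fixed factor, but the underlying signal shape is the same.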
- FIG. 9 illustrates a second sensor data graphical display 230 annotated with events detected by the sensing system 100 , in accordance with at least one aspect of the present disclosure.
- the sensor assembly 102 includes a motion sensor 142 , a temperature sensor 136 , a humidity sensor, an ambient pressure sensor, an electronic device sensor 140 , an EMI sensor 146 , and an ambient light color/illumination sensor 132 .
- the second sensor data graphical display 230 depicts the sensor data captured by a sensor assembly 102 placed within an apartment over the course of a 72-hour period. Based on the depicted sensor data, a variety of virtual sensors 116 could be trained to detect events correlated with the data captured by the sensor assembly 102 .
- a virtual sensor 116 could be trained to detect day and night cycles according to the ambient light color data 244 and ambient light illumination data 246 .
- a virtual sensor 116 could be trained to detect when the occupant is present and active within the apartment according to some combination of the motion data 232 , the ambient light color data 244 (because the ambient light color sensor 132 can detect when a lamp and the kitchen lights are on), and/or the temperature data 234 (because the motion data 232 correlates to a temperature increase).
- a virtual sensor 116 could be trained to detect when the microwave was being utilized according to the EMI data 220 .
- a virtual sensor 116 could be trained to detect when the occupant is streaming TV according to variations in the ambient light color data 222 and the electronic device data 218 .
- the humidity data 236 and the ambient pressure data 238 could be utilized to detect additional longer term environmental changes, such as the weather.
- FIGS. 10-11 illustrate a third sensor data graphical display 250 and a fourth sensor data graphical display 270 annotated with events detected by the sensing system 100 , in accordance with at least one aspect of the present disclosure.
- the sensor assembly 102 includes a temperature sensor 136 , a humidity sensor, a magnetic field sensor, and an ambient light color/illumination sensor 132 .
- the third sensor data graphical display 250 and the fourth sensor data graphical display 270 each depict the sensor data captured by a sensor assembly 102 placed within a garage over the course of approximately a 24-hour period. Based on the depicted sensor data, a variety of virtual sensors 116 could be trained to detect events correlated with the data captured by the sensor assembly 102 .
- a virtual sensor 116 could be trained to detect rain, as depicted in FIG. 10 , according to the temperature data 252 and/or the humidity data 254 due to the fact that rain is correlated with a drop in the temperature and an increase in the humidity.
- a virtual sensor 116 could be trained to detect night time according to the ambient light color data 256 (which can detect the light from the street lights, as depicted in FIG. 10 ) and/or the ambient light illumination data 258 .
- a virtual sensor 116 could be trained to detect when the garage door opens, as depicted in FIG. 11 , according to the temperature data 252 and the humidity data 254 , which drop and rise, respectively, when the garage door is opened during the winter.
- the magnetic field data 255 could be utilized to detect additional events or parameters, such as environmental changes or seasonal changes, as discussed above in the context of other examples.
- sensor assemblies 102 with different combinations or arrangements of sensors 110 can be utilized for different applications or locations.
- the sensor assemblies 102 described in connection with FIGS. 8-9 have a more expansive suite of sensors 110 because they are intended to be utilized in a domicile to track a wide array of behaviors and activities.
- a sensor assembly 102 intended to be utilized in a location where there is less activity or less data that needs to be tracked (e.g., in a garage, as with FIGS. 10-11 ) can include a smaller suite of sensors 110 .
- FIG. 12 illustrates a fifth sensor data graphical display 290 annotated with events detected by the sensing system 100 , in accordance with at least one aspect of the present disclosure.
- the sensor assembly 102 includes an acceleration sensor, an acoustic sensor 144 , a temperature sensor 136 , a humidity sensor, an ambient pressure sensor, a magnetic field sensor 134 , and an ambient light color/illumination sensor 132 .
- the fifth sensor data graphical display 290 depicts the sensor data captured by a sensor assembly 102 placed within an automobile over the course of a trip. Based on the depicted sensor data, a variety of virtual sensors 116 could be trained to detect events correlated with the data captured by the sensor assembly 102 .
- This particular example showcases how the sensor assembly 102 can be utilized in a mobile setting, e.g. an automobile, to detect a variety of events by training different types of virtual sensors 116 .
- This example also illustrates a sensor assembly 102 attached or coupled to an object (i.e., the automobile) being sensed.
- the sensing system 100 in this example represents a direct sensing system with respect to the object to which the sensor assembly 102 is attached (i.e., the automobile) and an indirect sensing system with respect to the environment in which the sensor assembly 102 is located (i.e., the interior of the automobile).
- a virtual sensor 116 could be trained to detect when the automobile is approaching a highway according to the acceleration data 292 , which indicates that the automobile has been gradually accelerating for an extended period of time.
- a virtual sensor 116 could be trained to detect when a window has been lowered according to some combination of the acoustic data 294 (which detects an increase in the amount of noise within the automobile), the temperature data 296 (which detects a temperature drop), the ambient humidity data 298 (which detects a humidity increase), and/or the ambient pressure data 300 (which detects a pressure drop).
- a virtual sensor 116 could likewise be trained to detect when the window has been closed according to these same data streams.
- a second order virtual sensor 124 could be trained from first order virtual sensors 120 to track the state of the automobile window. Instead of outputting a binary output as with the first order virtual sensors 120 (e.g., “Is the window closed? Yes or no?” or “Is the window open? Yes or no?”), the second order virtual sensor 124 could be trained from the outputs of the first order virtual sensors 120 to provide a nonbinary output directed to the window's state (e.g., “Is the window open, being opened, partially opened, closed, or being closed?”) based on this data. As another example, a virtual sensor 116 could be trained to detect heading of the vehicle according to the magnetic field data 302 .
- the magnetic field data 302 could train a number of first order virtual sensors 120 (e.g., “Is this vehicle heading north?” or “Is the vehicle heading west?”) and a second order virtual sensor 124 could be trained from the output of the first order virtual sensors 120 to provide a nonbinary output directed to the vehicle's state (e.g., “What direction is the vehicle heading in?”).
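The fusion of binary first-order outputs into a nonbinary second-order output can be sketched as follows; in practice the disclosure trains a model on the first-order outputs, whereas this sketch simply picks the highest-confidence affirmative answer (the input format and tie-breaking are assumptions):

```python
def second_order_heading(first_order):
    """Fuse binary first-order outputs ('Is the vehicle heading
    north?', 'west?', ...) into a single nonbinary heading. Input maps
    direction -> (is_heading, confidence); this format and the
    pick-highest-confidence rule are assumptions for illustration."""
    candidates = [(conf, d) for d, (yes, conf) in first_order.items() if yes]
    if not candidates:
        return "unknown"
    return max(candidates)[1]  # direction with the highest confidence

heading = second_order_heading({
    "north": (True, 0.9), "south": (False, 0.95),
    "east": (True, 0.4), "west": (False, 0.1),
})
```

A trained second order virtual sensor 124 would replace the `max` rule with a learned model over the same first-order outputs.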
- a virtual sensor 116 could be trained to detect the degree of cloudiness according to the ambient light illumination data 306 , which can indicate the number and length of the instances that the sun is obscured during the course of the vehicle's trip.
- the ambient light color data 304 could be utilized to detect additional events or parameters associated with the vehicle or the vehicle's environment, such as what time of day it is, as discussed above in the context of other examples, or whether the vehicle is proceeding through a tunnel.
- FIG. 13 illustrates a sixth sensor data graphical display 310 annotated with events detected by the sensing system 100 , in accordance with at least one aspect of the present disclosure.
- in the sixth sensor data graphical display 310, a subset of the sensor data streams is depicted as featurized spectrograms. While the figure illustrates a spectrogram, other display methods, such as a user interface illustrating time domain data, frequency domain data, and/or both, are within the scope of this disclosure.
- the sensor assembly 102 includes a vibration sensor 138 , an acoustic sensor 144 , and an EMI sensor 146 .
- the data stream from the vibration sensor 138 is broken down into X-, Y-, and Z-axis constituent parts, which can be provided by a three-axis accelerometer, for example.
- the sixth sensor data graphical display 310 depicts the sensor data captured by a sensor assembly 102 placed within a workshop. Based on the depicted sensor data, a variety of virtual sensors 116 could be trained to detect events correlated with the data captured by the sensor assembly 102 .
- a variety of virtual sensors 116 could be trained to detect when the faucet is running, a urinal has been flushed, a kettle has been put on the stove, and/or various tools are being utilized according to a combination of vibration data 312 and acoustic data 314 .
- it should be noted that although certain events can be detected utilizing the same combinations of featurized data streams (e.g., the vibration data 312 and the acoustic data 314), they are nonetheless detectably discernible because the different events have different patterns or characteristics within the sensor data streams. The different patterns or characteristics exhibited in the data streams for the sensors 110 activated by each event can be utilized by the machine learning of the sensing system 100 to characterize that event and generate a virtual sensor 116 that can reliably identify future occurrences of the event.
- a faucet running, a urinal flushing, an electric saw running, and the other annotated events each generate a unique signature in the vibration data 312 and/or the acoustic data 314 that can be characterized by the machine learning of the sensing system 100 to identify those events.
- a virtual sensor 116 could be trained to detect when the microwave door is opened or closed according to the acoustic data 314. Further, a virtual sensor 116 could be trained to detect when the microwave has completed a heating cycle according to the acoustic data 314 (by detecting the microwave's completion chime). Still further, a virtual sensor 116 could be trained to detect when the microwave is running according to the EMI data 316. These virtual sensors 116 can represent first order virtual sensors 120 detecting binary properties of the microwave. The outputs of these first order virtual sensors 120 can be fed into a second order virtual sensor 124 trained to track the state of the microwave.
- the microwave second order virtual sensor 330 includes five states: available for use 332 , door ajar 334 , in-use 336 , interrupted 338 , and finished 340 .
- the microwave second order virtual sensor 330 moves from the available state 332 to the door ajar state 334 and back again if acoustic data 314 indicates that the door has been opened and then closed.
- the microwave second order virtual sensor 330 moves from the available state 332 to the in-use state 336 when the EMI data 316 indicates that the microwave is running.
- the microwave second order virtual sensor 330 moves from the in-use state 336 to the interrupted state 338 when the acoustic data 314 indicates that the door has been opened.
- the microwave second order virtual sensor 330 moves from the interrupted state 338 back to the in-use state 336 when the EMI data 316 indicates that the microwave is once again running.
- the microwave second order virtual sensor 330 moves from the in-use state 336 to the finished state 340 when the acoustic data 314 indicates that the completion chime has sounded.
- the microwave second order virtual sensor 330 then moves from the finished state 340 to the available state 332 when the acoustic data 314 indicates that the microwave door has been closed (thereby indicating that the user has removed his or her food from the microwave).
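The five-state microwave second order virtual sensor 330 described above behaves as a finite state machine driven by the binary first order outputs. A minimal sketch follows; the event names ("door_opened", "running", "chime", "door_closed") are assumed labels for the first order sensor outputs, and the finished-state door-opened transition is an added assumption for completeness:

```python
# Illustrative finite state machine for the microwave second order virtual
# sensor 330: available, door ajar, in-use, interrupted, and finished.

TRANSITIONS = {
    ("available", "door_opened"):   "door_ajar",
    ("door_ajar", "door_closed"):   "available",
    ("available", "running"):       "in_use",       # EMI data 316 shows power
    ("in_use", "door_opened"):      "interrupted",  # acoustic data 314
    ("interrupted", "running"):     "in_use",
    ("in_use", "chime"):            "finished",     # completion chime
    ("finished", "door_opened"):    "finished",     # user retrieving food
    ("finished", "door_closed"):    "available",
}

class MicrowaveSensor:
    def __init__(self):
        self.state = "available"

    def feed(self, event):
        # Events with no defined transition from the current state are ignored.
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

mw = MicrowaveSensor()
for ev in ["running", "chime", "door_opened", "door_closed"]:
    mw.feed(ev)
print(mw.state)  # → available
```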
- a second order virtual sensor 124 can produce a nonbinary output by being trained on and fed the outputs of a number of first order virtual sensors 120 that produce binary outputs, such as “Has the microwave door been closed?” or “Is the microwave running?”.
- the sensing system 100 can implement, in various embodiments, end-to-end encryption between the sensor assembly 102 and the computer system 104 to ensure confidentiality and authenticity of the data.
- the sensor assemblies 102 and the computer system 104 mutually authenticate themselves using asymmetric keys.
- the sensor assembly 102 can authenticate that it is talking to the correct computer system 104 (e.g., specified by a hostname/IP) and then encrypt the data that it is transmitting to the computer system 104 so that only the computer system 104 can decrypt it.
- the computer system 104 can authenticate the sensor assembly 102 by the sensor assembly 102 registering its public key with the computer system 104 and signing any data item it sends to the computer system 104 with its own associated private key so that the computer system 104 can verify its authenticity.
- the sensing system 100 can utilize asymmetric key cryptography to establish the communication channel between each sensor assembly 102 and the computer system 104 and then establish a symmetric key cryptographic channel thereafter.
- because the sensor assemblies 102 initiate the outgoing transmission protocol (e.g., TCP or UDP) to connect to a known server, they can punch a hole through a network address translation (NAT) device or firewall and thus can be deployed both at homes with a single public IP address and at enterprises where each sensor assembly 102 has its own public address. All data communication between the sensor assembly 102 and the computer system 104 can occur over such a single, persistent, encrypted TCP socket. Further, each data packet transmitted by the sensor assembly 102 can contain a header denoting the sensor channel's payload, allowing the computer system 104 to demultiplex the source and type of sensed data.
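A packet header of the kind described above can be sketched with Python's `struct` module. The field widths (one-byte channel identifier, two-byte payload length, eight-byte epoch-millisecond timestamp) are assumptions for illustration; the disclosure specifies only that the header denotes the sensor channel so the server can demultiplex the data:

```python
import struct

# Hypothetical packet layout: channel id, payload length, epoch-ms timestamp,
# followed by the (possibly featurized) sensor payload.
HEADER = struct.Struct("!BHQ")  # network byte order

def pack_packet(channel_id, timestamp_ms, payload):
    return HEADER.pack(channel_id, len(payload), timestamp_ms) + payload

def unpack_packet(packet):
    channel_id, length, ts = HEADER.unpack_from(packet)
    payload = packet[HEADER.size:HEADER.size + length]
    return channel_id, ts, payload

pkt = pack_packet(3, 1_700_000_000_000, b"\x01\x02\x03")
print(unpack_packet(pkt))  # → (3, 1700000000000, b'\x01\x02\x03')
```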
- the sensor assembly 102 can implement appropriate serializing and deserializing routines to package the data into chunks for transmission via, e.g., a Wi-Fi connection.
- data send routines are executed asynchronously by the sensor assembly 102 so that sensor data reading, featurization, and transmission can proceed independently.
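The decoupling of reading, featurization, and transmission described above can be sketched with a queue and a background sender thread. This is a minimal sketch only; the `send` stand-in simply records packets in a list, where a real sensor assembly would write them to its socket:

```python
import queue
import threading

# Readings are pushed onto a queue; a background thread drains it, so the
# sampling/featurization loop never blocks on the network.
outbox = queue.Queue()
sent = []  # stand-in for the transmission channel

def sender():
    while True:
        packet = outbox.get()
        if packet is None:   # sentinel value: stop the sender thread
            break
        sent.append(packet)  # a real implementation would transmit here

t = threading.Thread(target=sender, daemon=True)
t.start()

for reading in ["accel:0.1", "accel:0.2", "acoustic:fft..."]:
    outbox.put(reading)      # non-blocking from the sampling loop's view

outbox.put(None)
t.join()
print(sent)  # → ['accel:0.1', 'accel:0.2', 'acoustic:fft...']
```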
- the sensor assemblies 102 can transmit or stream the sensor data to a local computer or a computer external to the computer system 104 .
- the local computer can include a client 106 that is capable of executing the interface for visualizing the data from the sensor assemblies 102 and/or controlling the functions of the sensor assemblies 102 , as described above.
- the local computer can be executing the machine learning module 116 , as described above.
- the local computer to which the data is streamed can be, for example, behind a system configured to perform network address translation (NAT), common in residential settings with a single public IP address shared by many computers.
- the computer system 104 can control whether the sensor assemblies 102 are streaming data to a local computer according to whether there is a substantial distance between the sensor assemblies 102 and the computer system 104 , whether the communication roundtrip time exceeds a particular threshold, or whether the available bandwidth falls below a particular threshold.
- a user can control whether the data from the sensor assemblies 102 is streamed to a local computer via the interface described above.
- the sensor assemblies 102 could also be programmed to automatically locate the nearest server (e.g., of the computer system 104 ) to which to stream its data based on a variety of metrics, such as distance, roundtrip communication time, and/or bandwidth.
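The nearest-server selection described above can be sketched as a simple scoring function. The disclosure only names distance, round-trip time, and bandwidth as candidate metrics; the particular score (round-trip time divided by bandwidth) and the server records are illustrative assumptions:

```python
# Hypothetical server-selection sketch: lower round-trip time and higher
# bandwidth both improve a server's score.

def pick_server(servers):
    """servers: list of dicts with 'name', 'rtt_ms', and 'bandwidth_mbps'."""
    return min(servers, key=lambda s: s["rtt_ms"] / max(s["bandwidth_mbps"], 1e-9))

servers = [
    {"name": "local", "rtt_ms": 5, "bandwidth_mbps": 100},
    {"name": "cloud", "rtt_ms": 80, "bandwidth_mbps": 500},
]
print(pick_server(servers)["name"])  # → local
```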
- Users can access or log into the computer system 104 via a client 106 to view the data from the sensor assemblies 102 associated with their user account, modify features or settings of the sensor assemblies 102 , and update their security preferences.
- users can selectively permit other users to view, access, and/or modify their associated sensor assemblies 102 . This could permit, for example, all family members to view the data and events detected by the sensor assemblies 102 within the family's home.
- the permissions provided to the invited users can be controlled from a master account, for example.
- sensor streams from the sensors 110 within a single sensor assembly 102 and from the sensors 110 across multiple sensor assemblies 102 within the environment are preferably temporally correlated and/or synchronized.
- a “door closing” event typically causes a synchronous increase in air pressure and a structural vibration, which could be detected across a number of sensor assemblies 102 located throughout the building.
- This co-occurrence of signals that are detectable with different sensors 110 and across different sensor assemblies 102 is what enables the virtual sensors 116 to robustly detect and classify events, such as a door closing in the above example.
- the sensing system 100 can be configured to temporally synchronize or correlate the data streams from sensors 110 both within a single sensor assembly 102 and across multiple sensor assemblies 102 to allow events detected by different data streams to be temporally associated together.
- the computer system 104 utilizes a Network Time Protocol (NTP) to synchronize its own clock periodically, which is then used to keep the clocks 127 of all of the sensor assemblies 102 connected to the computer system 104 in synchronization with the computer system 104.
- Each sensor assembly 102 can include, for example, a quartz clock to keep track of time between these time synchronizations to minimize any clock drift between the different sensor assemblies 102 .
- the sensor assembly 102 timestamps all sensor data with the system epoch time from its synchronized clock 127 to, e.g., millisecond granularity. Further, synchronizing the sensor assemblies 102 and timestamping all sensor data addresses any reordering problems from the sensors 110 being sampled asynchronously and any processing or transmission delays before the data packet reaches the computer system 104 .
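The millisecond-granularity epoch timestamping described above can be sketched as follows. The record format is an illustrative assumption; the point is that tagging every reading with the synchronized clock's epoch time lets the receiving server restore the order of asynchronously sampled or delayed data:

```python
import time

# Tag each sensor reading with epoch time in milliseconds from the
# (assumed synchronized) system clock.
def timestamp_reading(value, clock=time.time):
    epoch_ms = int(clock() * 1000)
    return {"ts": epoch_ms, "value": value}

# Out-of-order arrivals are restored by sorting on the timestamp.
readings = [{"ts": 1002, "value": "b"}, {"ts": 1001, "value": "a"}]
ordered = sorted(readings, key=lambda r: r["ts"])
print([r["value"] for r in ordered])  # → ['a', 'b']
```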
- all data for each sensor stream is continuously buffered in an onboard buffer (e.g., a buffer onboard the sensor assembly 102 ), with each sensor reading timestamped according to the clock 127 synchronized across the sensing system 100 network. If communication between the sensor assembly 102 and the computer system 104 is lost or congested, the sensor assembly 102 can continually attempt to re-establish a communication channel with the computer system 104 .
- the system firmware 129 A of the sensor assembly 102 can be configured to periodically check for Wi-Fi connectivity and an active connection to the computer system 104 (or a server or other computer thereof).
- the sensor assembly 102 can execute an exponential back-off algorithm to periodically attempt to reconnect to the communications channel (e.g., the Wi-Fi network). If the sensor assembly 102 is unable to reconnect to the computer system 104 for a particular length of time (e.g., one hour), the system firmware 129 A can be configured to reboot and then once again attempt to reconnect to the computer system 104. Upon reconnecting to the computer system 104, the sensor assembly 102 can then upload all of the buffered, timestamped sensor data to the computer system 104, which can then reorganize the data with the data from other sensor assemblies 102 as necessary.
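The exponential back-off schedule described above can be sketched as follows. The base delay, cap, and optional jitter are illustrative assumptions; the disclosure specifies only that reconnection attempts back off exponentially and that the firmware reboots after a prolonged outage:

```python
import random

# Compute the reconnection delay schedule: each attempt doubles the delay,
# capped at `cap` seconds. Optional jitter spreads out simultaneous
# reconnects from many sensor assemblies.
def backoff_delays(base=1.0, cap=300.0, attempts=10, jitter=False):
    delays = []
    for attempt in range(attempts):
        delay = min(cap, base * (2 ** attempt))
        if jitter:
            delay = random.uniform(0, delay)
        delays.append(delay)
    return delays

print(backoff_delays(attempts=6))  # → [1.0, 2.0, 4.0, 8.0, 16.0, 32.0]
```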
- the sensor assembly 102 comprises a software watchdog to monitor the status of all of the sensors 110 . If any sensor 110 does not report new data within a configurable period (e.g., 60 seconds), the software watchdog executed by the system firmware 129 A can automatically restart the sensor assembly 102 .
- the sensor assembly 102 comprises a hardware watchdog that reboots the sensor assembly 102 if the application or the system firmware 129 A fails to respond in a timely manner (e.g., within one minute). After reset, the sensor assembly 102 re-initializes all the sensors 110 and resumes operation.
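The software watchdog described above can be sketched as a staleness check: each sensor 110 records the time of its last report, and any sensor silent for longer than the configurable period triggers a restart. The restart itself is stubbed out here, and the 60-second period follows the example in the text:

```python
import time

STALE_AFTER = 60.0  # configurable period (seconds) without new data

# Return the names of sensors that have not reported within STALE_AFTER
# seconds; a nonempty result would trigger a restart of the assembly.
def stale_sensors(last_report, now=None):
    now = now if now is not None else time.time()
    return [name for name, ts in last_report.items() if now - ts > STALE_AFTER]

last_report = {"accel": 1000.0, "acoustic": 1055.0, "emi": 930.0}
print(stale_sensors(last_report, now=1060.0))  # → ['emi']
```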
- the sensor assembly 102 can further include one or more output devices, such as a light emitting device assembly including one or more LEDs, microphones, vibration motors, displays, and other audio, visual, and/or haptic indicators (not shown here).
- the output devices can be utilized to provide alerts or feedback to the user when various events occur.
- the sensor assembly 102 can cause an LED assembly to illuminate or flash, a vibration motor to vibrate, a speaker to emit an audible alert, or the like when, for example, a particular event has been detected.
- the computer system 104 can detect the occurrence of the event via an appropriate virtual sensor 118 and then transmit a command or signal to the sensor assembly 102 to cause the corresponding output device to provide feedback, as previously described.
- Such audible or visual feedback can be utilized to provide notifications to hearing or vision-impaired individuals that an event has occurred (e.g., a kettle is boiling) or otherwise alert users that an event that the user may wish to be aware of has occurred.
- alerts can be configured via the interface, for example. Users can thus access the computer system 104 via a client 106 and then program or configure the desired alerts in accordance with virtual sensors 118 and/or the trigger action rules described above.
- the sensor assembly 102 can include a wireless communication circuit (e.g., a Bluetooth LE transmitter, WiFi, Zigbee, Z-Wave) for transmitting alerts (e.g., push notifications) to the user (e.g., the client 106 ) when a selected event has been detected.
- the output devices can also be utilized to confirm or authenticate the identity and/or location of particular sensor assemblies 102 .
- a user visualizing the data streams from a number of sensor assemblies 102 within a sensing system 100 can (e.g., via a client device 106 ) cause the output device of a particular sensor assembly 102 to begin emitting an alert so that the user can confirm the identity and location of the sensor assemblies 102 .
- a user could select “sensor assembly #3” and cause it to emit an alert.
- the computer system 104 can transmit a command to the sensor assembly 102 corresponding in identity to “sensor assembly #3” to cause it to emit a sequence of flashes by the light source or a sequence of beeps from the speaker.
- the user can then enter the sequence into the client device 106 to authenticate the client device 106 to the sensor assembly 102 .
- the output may also be provided to a user mobile device.
- the sensing system 100 can authenticate a user based upon their mobile electronic devices (e.g., smart phone, wearables, and other such devices). For example, the sensing system 100 can utilize the wireless communication circuit to determine whether human activity detected by the sensor assemblies 102 is being performed by a user of interest, based upon whether the wireless communication circuit is able to detect the mobile electronic devices of the user or other authorized individuals present within the vicinity of the relevant sensor assemblies 102 .
- the mobile electronic devices of the user and/or other authorized individuals can be, for example, pre-registered with the sensing system 100 or pre-paired with the sensor assemblies 102 .
- Various applications and/or trigger rules could then be programmed to send alerts to an authorized user and/or take other actions if human activity detected by the sensing system 100 at certain locations, at certain times, or according to other such constraints, is not being performed by the users of interest.
- Such aspects could be utilized to, for example, detect when an unauthorized individual is in the user's home at night or while the user is at work.
- Such aspects could also be utilized to, for example, confirm the identity of an individual making changes to the configurations or settings of the sensing system 100 via a client 106 according to their proximity to a sensor assembly 102 of the sensing system 100 .
- a client 106 can also be utilized to access the computer system 104 to define conditions (rules) and associated actions to take in response to those conditions via an interface.
- users can program explicit actions for the sensor assembly 102 and/or computer system 104 to take when certain conditions have been satisfied. For example, a user could define a condition where if the motion sensor 142 is triggered and the time is after midnight, then send a text message to a particular cellular number.
- the trigger action rules can be defined with multiple conditions (triggers) and multiple actions.
- Triggers may capture specific conditions on sensor values (e.g., temperature > 20° C.) or behaviors detected by a virtual sensor 116 (e.g., window is open). Both triggers and actions can refer either to specific devices (e.g., the temperature on sensor assembly #25) or to locations (e.g., the kitchen). For instances where a trigger and/or action is specified by location, the computer system 104 can compute the average value from a given sensor channel (e.g., temperature) across all of the sensor assemblies 102 in that specific location (e.g., the kitchen).
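The multi-trigger, multi-action rules described above can be sketched as follows. The representation (predicate functions for triggers, callables for actions) is an illustrative assumption; the example mirrors the text's motion-after-midnight rule, with the text message replaced by a recorded alert:

```python
# Fire all actions only when every trigger condition holds.
def evaluate_rule(triggers, actions, state):
    if all(trigger(state) for trigger in triggers):
        for action in actions:
            action(state)
        return True
    return False

alerts = []
rule_triggers = [
    lambda s: s["motion"],              # motion sensor 142 triggered
    lambda s: 0 <= s["hour"] < 6,       # time is after midnight
]
rule_actions = [lambda s: alerts.append("text message sent")]

evaluate_rule(rule_triggers, rule_actions, {"motion": True, "hour": 1})
print(alerts)  # → ['text message sent']
```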
- the virtual sensors 118 can output data in a format that is consistent with one or more known API architectures, such as Representational State Transfer (REST) or publish-subscribe APIs. Such output formats can utilize appropriate authentication, access control primitives, and other controls. Outputting data from the virtual sensors 118 in accordance with a known API can allow apps to seamlessly make use of the wide variety of data (whether raw sensor data, featurized sensor data, or higher order inferences) generated by the sensing system 100.
- FIG. 16 illustrates a logic flow diagram of a process 600 of detecting events via virtual sensors 118 , in accordance with at least one aspect of the present disclosure.
- the first and/or higher order virtual sensors 118 of the sensing system 100 are trained. As mentioned herein, training the virtual sensors 118 may involve providing annotated training examples.
- the various sensors 110 of the sensor assembly 102 sense physical phenomena in the environment of the sensor assembly 102 .
- features from the raw sensor data are extracted, as described above.
- the featurization of the sensor data can be performed, for example, by the sensor assembly 102 (e.g., the microcontroller 121 thereof executing the featurization module 112 ), by the computer system 104 (e.g., by a programmed server thereof executing the featurization module 112 ), or by both components in a distributed manner.
- the sensor assembly 102 could transmit the featurized data in encrypted, periodic data packets, where the data packets might have concatenated sensor data from multiple sensors 110 .
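The feature extraction described above can be sketched by reducing a window of raw samples to compact summary statistics. The particular feature set (min, max, mean, standard deviation) is an illustrative assumption; the disclosure also describes spectral featurizations such as spectrograms:

```python
import math

# Reduce a window of raw sensor samples to a small feature vector suitable
# for packing into periodic data packets.
def featurize(window):
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n  # population variance
    return {
        "min": min(window),
        "max": max(window),
        "mean": mean,
        "std": math.sqrt(var),
    }

features = featurize([1.0, 2.0, 3.0, 4.0])
print(features["mean"], features["min"], features["max"])  # → 2.5 1.0 4.0
```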
- one or more first order sensors 120 may detect occurrences of events that they are trained to detect at step 601 based on the featurized raw sensor data.
- the higher order virtual sensors can detect the events, conditions, durations, etc. that they are trained to detect.
- the back-end server system 104 can transmit data about the detections by the virtual sensors to a remote system over a data communication network via wired and/or wireless communication links.
- the computer system 104 could transmit detection of virtual events to the sensor assembly 102 so that the sensor assembly could trigger an output device (e.g., a light source or speaker) or transmit a notification to a user device (e.g., a user's smartphone, laptop, tablet, etc.). Also, the computer system 104 could transmit a notification directly to the user device or to another networked, computer-based system, such as an alarm or emergency response system, an ordering system, a log, a monitoring system, etc.
- although the process 600 is described as a series and/or a number of steps for the sake of convenience and not with an intent of limiting the disclosure, it is to be understood that the process need not be performed as a discrete series of steps in the order shown and described with respect to FIG. 16; rather, the process may be integrated, one or more steps may be performed together, or the steps may be performed in the order disclosed or in an alternate order.
- higher-order virtual sensors could be trained (e.g., step 601 ) after lower-order virtual sensors start detecting their associated events, conditions, durations, etc.
- the times for performance of the steps illustrated in FIG. 16 are not necessarily discrete, but instead can be ongoing and continuous. That is, for example, the training of the virtual sensors 118 may be ongoing.
- steps 602 - 606 can be performed continuously.
- a computing or data processing system 1700 suitable for storing and/or executing program code may take many forms and in one embodiment may include at least one processor 1702 , which may be or be part of a controller, coupled directly or indirectly to memory devices or elements through a system bus, as shown in FIG. 17 .
- Computing system 1700 in FIG. 17 is shown with a processor 1702 , random access memory (RAM) 1703 , nonvolatile memory 1704 , device specific circuits 1701 , and I/O interface 1705 .
- the RAM 1703 and/or nonvolatile memory 1704 may be contained in the processor 1702 as could the device specific circuits 1701 and I/O interface 1705 .
- the processor 1702 may comprise, for example, an off-the-shelf microprocessor, custom processor, FPGA, ASIC, discrete logic, etc., or generally any device for executing instructions.
- the RAM 1703 is typically used to hold variable data, stack data, executable instructions, etc., and may include dynamic random access memory (DRAM).
- Such a computing system 1700 may be used as one of the servers of the computer system 104 , as a user device (e.g., mobile device), a remote computing system that receives notifications from the virtual sensors 118 , etc.
- the nonvolatile memory 1704 may comprise any type of nonvolatile memory such as, but not limited to, electrically erasable programmable read only memory (EEPROM), flash programmable read only memory (PROM), battery backup RAM, hard disk drives, etc.
- the nonvolatile memory 1704 is typically used to hold the executable firmware and any nonvolatile data containing programming instructions that can be executed to cause the processor 1702 to perform certain functions.
- the I/O interface 1705 may include a communication interface that allows the processor 1702 to communicate with devices external to the controller.
- Examples of the communication interface may comprise, but are not limited to, serial interfaces such as RS-232, Universal Serial Bus (USB), Small Computer Systems Interface (SCSI), RS-422, or a wireless communication interface such as Bluetooth, near-field communication (NFC) or other wireless interfaces.
- the computing system 1700 may communicate with an external device via the communication interface 1705 in any communication protocol such as Automation/Drive Interface (ADI).
- the sensing system 100 described herein can be utilized in a number of different contexts.
- the sensing system 100 can be utilized to assist in monitoring patients and providing effective healthcare to patients. With a rapidly aging population, providing care at home for this population will become a necessity. A key capability that caregivers need is the ability to track the Activities of Daily Living (ADLs) of individuals and to detect anomalies when those activities deviate from the norm for each individual.
- the sensing system 100 including virtual sensors 118 can be used to provide a comprehensive system to track ADLs, such as bathroom usage, movement within the house, daily chores like cooking and eating, adherence to medications, and falls, all without requiring intrusive instrumentation or sensing.
- virtual sensors to track various contexts within a person's home can also be used for predictive interventions (e.g., predicting an impending fall), and not just for reactive events, particularly by customizing them to each individual.
- the sensing system 100 can be utilized in industrial settings. In industrial settings with mechanical and electrical machinery, preventative maintenance and predictive analytics can be of huge help to increase equipment lifetime, as well as reduce downtime due to failures. Most older equipment does not have any sensing capability, and newer equipment may have only limited, purpose-specific sensing.
- the sensing system 100 including virtual sensors 118 as described herein, can be utilized to learn different states of industrial equipment by training a variety of virtual sensors 118 and then building applications and notifications to help with usage tracking, tracking error conditions, scheduling maintenance operations automatically, and other such tasks.
- the sensing system 100 can be utilized in environmental sustainability efforts. Over two-thirds of the electricity generated in the US and over 30% of the potable water is used by human occupants of buildings, both commercial and residential. Tracking resource usage (i.e., water and energy) at a fine granularity and notifying human occupants as well as building managers of wastage can lead to a significant reduction in the usage of these natural resources.
- the sensing system 100 including virtual sensors 118 as described herein, can be utilized to train a variety of virtual sensors 118 in home and buildings to track individual appliance usage and water consumption and correspondingly provide comprehensive user interfaces and notifications to promote behavioral changes.
- the sensing system 100 can be utilized for managing facilities. Given that humans spend over one third of their lives inside commercial buildings (i.e., their workplaces), methods to make those buildings better serve occupant needs can lead to improved productivity, comfort, and happiness. While sensing systems do get deployed in modern buildings for heating, ventilation, air-conditioning (HVAC), and lighting management, they are all purpose-specific, costly to deploy and maintain, and not easy to repurpose.
- the sensing system 100 including virtual sensors 118 can be utilized to provide a uniform sensing substrate for all things related to smart building management, including control of HVAC systems, space utilization, power consumption tracking, occupancy and people movement tracking, fault detection, etc.
- the sensing system 100 can be utilized for home-based consumer applications. With the advent of the IoT and integration with voice assistants, such as Amazon Alexa and Google Home, the presently described sensing system 100 can be utilized in a number of different ways as part of a “smart home.” For example, the sensing system 100 can implement virtual sensors 118 to track the usage of consumables like toilet paper and soap to notify users when they are running low (or even directly order replenishments) or notify users about the status of appliances in the home (e.g., a dishwasher or laundry machine). The sensing system 100 can also implement virtual sensors 118 trained to detect meta events, such as any movements or sounds within the home, for security purposes. The sensing system 100 can also implement virtual sensors 118 trained to non-intrusively detect sleep duration and patterns, without the user(s) being required to wear any device(s).
- the sensing system 100 can be utilized in a variety of implementations for smart cities. There is a major push across the US and the globe to make cities smarter by adding sensing to street lights and other public infrastructure, such as buses, trolleys, and roads.
- the sensing system 100 including virtual sensors 118 as described herein and suitably outfitted for outdoor environments, can sense a wide variety of facets of the city environment.
- a number of virtual sensors 118 can be trained to detect events of interest that have distinct signatures. For example, virtual sensors 118 can detect a traffic jam, an accident, or a gunshot, or estimate traffic volume, street illumination, environmental quality, etc.
- a key advantage of the sensing system 100 implementing virtual sensors 118 is that all of the processing and featurization can be performed at the sensor assembly 102 itself, thereby addressing many of the privacy concerns in a smart city environment and also reducing the amount of data that needs to be transmitted at the scale of a city.
- the present invention is directed to a sensing system comprising a sensor assembly and a back-end server system.
- the sensor assembly comprises a control circuit and one or more sensors, where each of the sensors is configured to sense one or more physical phenomena in an environment of the sensor assembly that are indicative of events.
- the back end server system comprises at least one server that is in communication with the sensor assembly.
- the control circuit of the sensor assembly is configured to extract a plurality of features from raw sensor data collected by the one or more sensors to form featurized data and to transmit data packets to the back end server system, where the data packets comprise the featurized data.
- the at least one server of the back end server system is configured to implement one or more first order virtual sensors, where each of the one or more first order virtual sensors is trained through machine learning to detect, based on the featurized data, an event in the environment of the sensor assembly.
- the back end server system comprises at least one server that comprises a processor and a memory for storing instructions that, when executed by the processor, cause the server to: (i) receive the featurized data from the sensor assembly; (ii) determine an occurrence of one or more events via the featurized data; (iii) train, via machine learning, one or more first order virtual sensors implemented by the server to detect the one or more events based on the featurized data; and (iv) monitor, via the virtual sensors, for subsequent occurrences of the one or more events based on featurized data from the sensor assembly.
- the events detected by the one or more first order virtual sensors are not directly sensed by any of the one or more sensors of the sensor assembly.
- the sensor assembly may be in wireless communication with the back end server system.
- the at least one server of the back end server system may be further configured to implement one or more second order virtual sensors, wherein the one or more second order virtual sensors are trained to detect, based at least in part on outputs of one or more of the first order virtual sensors, a second order condition in the environment of the sensor assembly.
- At least one of the one or more second order virtual sensors may produce a non-binary output and the first order virtual sensors may produce binary outputs, non-binary outputs, or a set of labels.
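As a concrete illustration of the hierarchy described above, a second order virtual sensor might fuse the binary outputs of several first order virtual sensors into a single non-binary condition. The following Python sketch is illustrative only; the sensor names and the mapping are invented for this example and are not taken from the disclosure.

```python
# Hypothetical second order virtual sensor: it fuses the binary
# outputs of three first order virtual sensors (all names invented
# for this sketch) into a single non-binary "kitchen activity" level.
def kitchen_activity(faucet_running, microwave_running, speech_detected):
    """Map binary first order outputs to a coarse, non-binary condition."""
    active_count = sum([faucet_running, microwave_running, speech_detected])
    return {0: "idle", 1: "low", 2: "moderate", 3: "high"}[active_count]

print(kitchen_activity(True, True, False))  # two first order sensors firing -> "moderate"
```

In practice the fusion could itself be a machine-learned classifier rather than a fixed lookup, which is closer to what the text describes.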
- the first, second and/or higher order virtual sensors may comprise machine-learned classifiers that are trained to detect events, conditions, durations, etc. in the environment of the sensor assembly.
- the classifiers could be support vector machines or deep learning algorithms/networks, for example, that may be trained through supervised or unsupervised learning. Labeled data for supervised learning may be collected from annotations of events by a user that are captured via a user interface provided by the back end server system.
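To make the supervised training step concrete, the sketch below trains a toy first order virtual sensor from user-annotated feature vectors. The disclosure names support vector machines and deep networks as candidate classifiers; this example substitutes a minimal nearest-centroid classifier so it stays self-contained, and all class, label, and variable names are invented.

```python
import math

class VirtualSensor:
    """Toy stand-in for a machine-learned first order virtual sensor."""
    def __init__(self):
        self.centroids = {}  # label -> mean feature vector

    def train(self, featurized, labels):
        """featurized: feature vectors; labels: the user's annotations
        (e.g. "door knock" vs. "background") captured via the UI."""
        by_label = {}
        for vec, lab in zip(featurized, labels):
            by_label.setdefault(lab, []).append(vec)
        for lab, vecs in by_label.items():
            # Mean of each feature dimension across this label's examples.
            self.centroids[lab] = [sum(dim) / len(vecs) for dim in zip(*vecs)]

    def detect(self, vec):
        """Classify a new feature vector by its nearest centroid."""
        return min(self.centroids, key=lambda lab: math.dist(vec, self.centroids[lab]))

sensor = VirtualSensor()
sensor.train([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]],
             ["background", "background", "door knock", "door knock"])
print(sensor.detect([0.85, 0.95]))  # nearest to the "door knock" centroid
```

A production system would swap the centroid rule for an SVM or deep network trained on the same annotated featurized data; the train/detect interface would stay essentially the same.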
- the sensors may comprise passive and/or active sensors.
- examples of passive sensors include an infrared radiation sensor, an ambient light color sensor, an ambient light intensity sensor, a magnetic field sensor, a temperature sensor, an ambient pressure sensor, a humidity sensor, a vibration sensor, an external device communication sensor, a motion sensor, an acoustic sensor, an indoor air quality sensor, a chemical sensor, a vision sensor, and an electromagnetic interference sensor.
- examples of active sensors include a sonar sensor, an ultrasonic sensor, a light emitting sensor, a radar based sensor, an acoustic sensor, an infrared camera, an active infrared sensor, an indoor positioning system, an x-ray based sensor, a seismic sensor, and an active sound measurement system.
- the sensor assembly may also comprise an output feedback device, such as a speaker, a light source, or a vibration source. Additionally, the sensor assembly may be positionally stationary and need not include a high-resolution camera.
- the back-end server system is configured to transmit a notification to the sensor assembly when a particular event is detected by the one or more first order virtual sensors.
- the sensor assembly may transmit a notification to a user via the output feedback device in response to receiving the notification from the back end server system that the particular event was detected.
- the sensor assembly comprises one or more circuit boards, where the control circuit and the one or more sensors are connected to the one or more circuit boards.
- the sensor assembly may further comprise a housing that houses the one or more circuit boards, the one or more sensors, and the control circuit.
- the sensor assembly may comprise a single circuit board and a housing.
- the control circuit and the one or more sensors may be connected to the single circuit board and the housing may house the single circuit board, the one or more sensors, and the control circuit.
- a first sensor of the one or more sensors may have an adjustable sampling rate.
- the at least one server of the back end server system may be further configured to transmit an adjustment for the adjustable sampling rate for the first sensor to the sensor assembly.
- the featurized data for a sensor of the sensor assembly may comprise a statistical measure of raw sensor data for the sensor over a time window.
- the statistical measure may include: the minimum value over the time window; the maximum value over the time window; the range over the time window; the mean over the time window; the median over the time window; the mode over the time window; the sum of the raw sensor values over the time window; the standard deviation over the time window; and/or the centroid of the raw sensor values over the time window.
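A minimal sketch of this window featurization, using only the Python standard library; the function name and the dictionary output are assumptions for illustration. The "centroid" here is computed as the readings' center of mass along the window index, which is one plausible reading of that term.

```python
import statistics

def featurize_window(samples):
    """Reduce a window of raw sensor readings to the statistical
    measures listed above. Names are illustrative, not from the patent."""
    return {
        "min": min(samples),
        "max": max(samples),
        "range": max(samples) - min(samples),
        "mean": statistics.mean(samples),
        "median": statistics.median(samples),
        "mode": statistics.mode(samples),
        "sum": sum(samples),
        "std": statistics.pstdev(samples),
        # Center of mass of the readings along the window index.
        "centroid": sum(i * s for i, s in enumerate(samples)) / sum(samples),
    }

window = [2, 4, 4, 4, 5, 5, 7, 9]
print(featurize_window(window))
```

Featurizing on the sensor assembly in this way is what lets the system transmit a handful of numbers per window instead of the raw sample stream.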
- the control circuit of the sensor assembly may be configured to transmit periodic data packets to the back end server system, where the data packets comprise concatenated featurized data for two or more sensors of the plurality of sensors.
- the data packets may also be encrypted by the sensor assembly prior to being transmitted.
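One possible shape for such a periodic packet is sketched below. The byte layout (sensor id, feature count, 32-bit floats) is invented for illustration; the disclosure does not specify a wire format, and the encryption step is only noted in a comment rather than implemented.

```python
import struct

def build_packet(featurized_by_sensor):
    """Concatenate featurized data for several sensors into one packet:
    per sensor, a 1-byte sensor id, a 1-byte feature count, then the
    features as 32-bit floats. This layout is a hypothetical example."""
    payload = b""
    for sensor_id, feats in featurized_by_sensor.items():
        payload += struct.pack("BB", sensor_id, len(feats))
        payload += struct.pack(f"{len(feats)}f", *feats)
    # A real assembly would encrypt `payload` (e.g., with an AES cipher)
    # before transmitting it to the back end server system.
    return payload

pkt = build_packet({1: [0.5, 2.0], 2: [9.75]})
print(len(pkt))  # 2 + 8 + 2 + 4 = 16 bytes
```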
- the sensor assembly further comprises a wireless communication circuit for communicating wirelessly with a user device.
- the wireless communication circuit may comprise a wireless communication circuit selected from the group consisting of a Bluetooth circuit, a WiFi circuit, a Z-Wave circuit, a Zigbee circuit, an RFID circuit, a LoRa radio circuit, and a LoRaWAN radio circuit.
- the back-end server system may be configured to transmit a notification to the sensor assembly when a particular event is detected by the one or more first order virtual sensors.
- the sensor assembly is configured to transmit a notification to the user device via the wireless communication circuit in response to receiving the notification from the back end server system that the event was detected.
- the back-end server system may be configured to transmit a notification to a remote computer-based system when a particular condition is detected by a first, second or higher order virtual sensor.
- the sensing system may include a plurality of such sensor assemblies that are distributed throughout an environment or location.
- the first, second, and/or higher order virtual sensors may use data from sensors on more than one of the sensor assemblies to detect the events, conditions, durations, etc. that they are trained to detect throughout the environment or location.
- the present invention is directed to a method that comprises the steps of (i) sensing, by a sensor assembly that comprises one or more sensors, one or more physical phenomena in an environment of the sensor assembly; (ii) extracting a plurality of features from raw sensor data collected by the one or more sensors to form featurized data; and (iii) detecting, by a machine-learning first order virtual sensor of a back end server system, based on the featurized data, an event in the environment of the sensor assembly.
- the sensor assembly extracts the plurality of features from the raw sensor data and the method further comprises the step of transmitting, by the sensor assembly, the featurized data to the back end server system.
- the method may also comprise the step of, prior to detecting the event, training the first order virtual sensor to detect the event from featurized data.
- the method may also comprise the step of receiving, by the back end server system via a user interface, annotations of occurrences of the event to use as the labeled data for the supervised training.
- the back end server system comprises a plurality of machine-learning first order virtual sensors, and the detecting step comprises detecting, by each of the plurality of machine-learning first order virtual sensors, based on the featurized data, a different event in the environment of the sensor assembly.
- the back end server system may further comprise a machine-learning second order virtual sensor that is trained through machine learning to detect, based on output from at least one of the plurality of first order virtual sensors, a second order condition in the environment of the sensor assembly, in which case the method may further comprise the step of detecting, by the machine-learning second order virtual sensor, the second order condition in the environment of the sensor assembly based on the output from at least one of the plurality of first order virtual sensors.
- a first sensor of the one or more sensors may have an adjustable sampling rate, in which case the method may further comprise the step of transmitting, by the back end server system, an adjustment for the adjustable sampling rate to the sensor assembly.
- the sensor assembly further comprises an output feedback device.
- the method may further comprise the step of outputting, by the output feedback device, a code for authentication of the sensor assembly to a user device.
- the sensor assembly may further comprise a wireless communication circuit for communicating wirelessly with a user device.
- the method may further comprise the steps of (i) transmitting, by the back-end server system, a notification to the sensor assembly when the event is detected by the first order virtual sensor; and (ii) transmitting, by the sensor assembly, a notification to the user device via the wireless communication circuit in response to receiving the notification from the back end server system that the event was detected.
- the various virtual sensors 118, 120, 124 described herein may be implemented with software stored in primary and/or secondary memory of the computer system 104 that, when executed by a processor(s) of the computer system, causes the processor(s) to perform virtual sensor classifications as described herein.
- the activation group and machine learning modules 116 may be implemented with software stored in primary or secondary memory of the computer system 104 that, when executed by a processor(s) of the computer system, causes the processor(s) to perform their respective functions as described herein.
- a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memory (RAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic or optical cards, flash memory, or a tangible, machine-readable storage medium used in the transmission of information over the Internet via electrical, optical, acoustical, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.).
- control circuit may refer to, for example, hardwired circuitry, programmable circuitry (e.g., a computer processor comprising one or more individual instruction processing cores, processing unit, processor, microcontroller, microcontroller unit, controller, digital signal processor (DSP), programmable logic device (PLD), programmable logic array (PLA), or field programmable gate array (FPGA)), state machine circuitry, firmware that stores instructions executed by programmable circuitry, and any combination thereof.
- the control circuit may, collectively or individually, be embodied as circuitry that forms part of a larger system, for example, an integrated circuit (IC), an application-specific integrated circuit (ASIC), a system on-chip (SoC), desktop computers, laptop computers, tablet computers, servers, smart phones, etc.
- control circuit includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program, which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program, which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of random access memory), and/or electrical circuitry forming a communications device (e.g., a modem, communications switch, or optical-electrical equipment).
- logic may refer to an app, software, firmware and/or circuitry configured to perform any of the aforementioned operations.
- Software may be embodied as a software package, code, instructions, instruction sets and/or data recorded on non-transitory computer readable storage medium.
- Firmware may be embodied as code, instructions or instruction sets and/or data that are hard-coded (e.g., nonvolatile) in memory devices.
- the terms “component,” “system,” “module,” and the like can refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution.
- an “algorithm” refers to a self-consistent sequence of steps leading to a desired result, where a “step” refers to a manipulation of physical quantities and/or logic states which may, though need not necessarily, take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It is common usage to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. These and similar terms may be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities and/or states.
- One or more components may be referred to herein as “configured to,” “configurable to,” “operable/operative to,” “adapted/adaptable,” “able to,” “conformable/conformed to,” etc.
- “configured to” can generally encompass active-state components and/or inactive-state components and/or standby-state components, unless context requires otherwise.
- any reference to “one aspect,” “an aspect,” “an exemplification,” “one exemplification,” and the like means that a particular feature, structure, or characteristic described in connection with the aspect is included in at least one aspect.
- appearances of the phrases “in one aspect,” “in an aspect,” “in an exemplification,” and “in one exemplification” in various places throughout the specification are not necessarily all referring to the same aspect.
- the particular features, structures or characteristics may be combined in any suitable manner in one or more aspects.
Abstract
Description
- This application is a continuation of U.S. Nonprovisional patent application Ser. No. 15/961,537, titled VIRTUAL SENSOR SYSTEM, filed Apr. 24, 2018, which claims the benefit of and priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/602,487, titled SYNTHETIC SENSORS, filed Apr. 24, 2017; U.S. Provisional Patent Application No. 62/602,543, titled GENERAL PURPOSE SYNTHETIC SENSOR SYSTEM, filed Apr. 27, 2017; and U.S. Provisional Patent Application No. 62/605,675, titled SECURE UBIQUITOUS SENSING SYSTEM, filed Aug. 22, 2017; the disclosure for each of which is hereby incorporated by reference in its entirety.
- This invention was made with government support under grant CNS1526237 awarded by the National Science Foundation. The government has certain rights in the invention.
- The background description provided herein is for the purpose of generally presenting the context of the disclosure. Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.
- The promise of smart environments (e.g., the “smart home”) and the Internet of Things (IoT) relies on robust sensing of diverse environmental facets. Traditional approaches rely on measuring one particular aspect of an environment with special-purpose sensors. Regardless of the approach taken, the goal remains the same: to apply sensing and computation to enhance the human experience, especially as it pertains to physical contexts (e.g., home, office, workshop) and the amenities contained within. Numerous approaches have been attempted and articulated, though none have reached widespread use to date.
- One option is for users to upgrade their environments with newly released “smart” devices (e.g., light switches, kitchen appliances), many of which contain sensing functionality. However, this sensing is generally limited to the appliance itself (e.g., a smart light sensing whether it is on or off) or to a single parameter associated with its core function (e.g., a smart thermostat sensing whether the room is occupied). Likewise, few smart devices are interoperable, forming silos of sensed data that thwart a holistic experience. Instead of achieving a smart home, the best one can currently hope for is small islands of smartness. This approach also carries a significant upgrade cost, which so far has proven unpopular with consumers, who generally upgrade appliances in a piecemeal manner.
- A variety of different sensing modalities have been described in the context of environmental sensing, including special-purpose sensing systems, distributed sensing systems, infrastructure-mediated sensing systems, and general-purpose sensing systems. These sensing modalities can be organized according to the number of sensors that they utilize and the number of facets or parameters that they sense. In particular, special-purpose sensing systems utilize a single sensor, infrastructure-mediated and general-purpose sensing systems utilize one or a few sensors, and distributed sensing systems utilize many sensors. Further, special-purpose sensing systems sense a single facet, infrastructure-mediated sensing systems tend to sense one or a few facets, general-purpose sensing systems tend to sense many facets, and distributed sensing systems can sense anywhere from a single facet to many facets of an environment.
- However, currently existing sensing systems typically transfer all of the sensed data to a back end server system for processing and/or storage, leading to problems relating to, for example, bandwidth usage and processing speed.
- In one general aspect, the present invention is directed to a ubiquitous sensing system utilizing one or more sensors that are capable of directly or indirectly detecting events in the environment surrounding the sensor assembly. While the sensing system is configured for indirect sensing, such that each and every object and/or person in the environment need not be instrumented in a location in order to sense their state or events associated with them, the sensors may also be coupled to objects and/or humans for direct sensing without any modifications. The sensing system includes a sensor assembly that can be positioned within an environment or location and that is capable of communicating with a server or other type of computer system for processing. The sensing system may optionally process sensor data locally and transmit the processed data to the server. The server utilizes machine learning to characterize received sensor data and/or training data in association with an event or events to learn to detect the occurrence of the designated event(s). In one aspect, the user can annotate the sensor data stream to indicate when certain events occurred and the machine learning algorithm then learns what characteristics of the data stream correlate to the event, allowing the sensing system to then detect future occurrences of the event. In another aspect, the sensing system utilizes deep machine learning to determine when events have occurred and what characteristics of the sensor data stream correlate to those events. In yet another aspect, the server can have a library of previously trained machine learning models and/or may train machine learning models from prior data collection steps, crowd sourcing, or the like, for different activities and events, and the sensing system can directly send sensor data and have the server determine what events have occurred. 
The server thus can define a set of machine learning-trained “virtual sensors” that are each capable of detecting events from combinations of sensor data that are correlated with the occurrences of the events, but that are not necessarily provided by sensors that are directly affixed or otherwise associated with the object(s) being sensed. More specifically, these types of virtual sensors can be referred to as “first order” virtual sensors. The server can further implement higher order virtual sensors that are capable of detecting events or conditions from a combination of data from lower order virtual sensors (e.g., second order virtual sensors detect an event from the output of first order virtual sensors).
- In that connection, in one embodiment, the sensing system comprises a sensor assembly with processing and communication capabilities and a back end server system. The sensor assembly comprises a control circuit and one or more sensors. Each of the sensors senses one or more different physical phenomena in an environment of the sensor assembly. The back end server system, which comprises at least one server, is in communication with the sensor assembly. Further, the control circuit of the sensor assembly is configured to, among other things: (i) extract features from raw sensor data from the plurality of sensors; and (ii) transmit data packets to the back end server system, wherein the data packets comprise featurized data from the plurality of sensors. The at least one server of the back end server system is configured to implement the first order virtual sensors, where each of the first order virtual sensors is trained through machine learning to detect, based on the featurized data transmitted from the sensor assembly, an event or condition in the environment of the sensor assembly.
- In another general embodiment, the back end server system is programmed to receive the featurized sensor data from the sensor assembly; determine an occurrence of an event via the featurized sensor data; train, via machine learning, a virtual sensor implemented by the server to detect the event by characterizing the featurized sensor data for the plurality of sensors that are activated in association with the event; and monitor, via the virtual sensor, for subsequent occurrences of the event.
- Thus, various embodiments of the present invention provide a highly capable sensor that can directly or indirectly monitor a large environment. These and other benefits of the present invention will be apparent from the description that follows.
- The features of various aspects are set forth with particularity in the appended claims. The various aspects, however, both as to organization and methods of operation, together with further objects and advantages thereof, may best be understood by reference to the following description, taken in conjunction with the accompanying drawings as follows.
-
FIG. 1A illustrates a block diagram of a sensing system, in accordance with at least one aspect of the present disclosure. -
FIG. 1B illustrates a block diagram of the sensing system ofFIG. 1A with a trained virtual sensor, in accordance with at least one aspect of the present disclosure. -
FIG. 2 illustrates a block diagram of a sensing system including virtual sensors receiving data from various sensors of the sensor assembly, in accordance with at least one aspect of the present disclosure. -
FIG. 3A illustrates a block diagram of a sensing system, in accordance with at least one aspect of the present disclosure. -
FIG. 3B illustrates a block diagram of the sensing system ofFIG. 3A with a trained second order virtual sensor, in accordance with at least one aspect of the present disclosure. -
FIG. 4 illustrates a block diagram of a sensing system including sensors, first order virtual sensors, and second order virtual sensors receiving data hierarchically, in accordance with at least one aspect of the present disclosure. -
FIG. 5 illustrates a block diagram of a sensing system including multiple sensor assemblies communicably coupled to a computer system, in accordance with at least one aspect of the present disclosure. -
FIG. 6 illustrates a perspective view of a sensor assembly, in accordance with at least one aspect of the present disclosure. -
FIG. 7 illustrates a timeline of sampling rates for various sensors, in accordance with at least one aspect of the present disclosure. -
FIG. 8 illustrates a first sensor data graphical display annotated with events detected by the sensing system, in accordance with at least one aspect of the present disclosure. -
FIG. 9 illustrates a second sensor data graphical display annotated with events detected by the sensing system, in accordance with at least one aspect of the present disclosure. -
FIG. 10 illustrates a third sensor data graphical display annotated with events detected by the sensing system, in accordance with at least one aspect of the present disclosure. -
FIG. 11 illustrates a fourth sensor data graphical display annotated with events detected by the sensing system, in accordance with at least one aspect of the present disclosure. -
FIG. 12 illustrates a fifth sensor data graphical display annotated with events detected by the sensing system, in accordance with at least one aspect of the present disclosure. -
FIG. 13 illustrates a sixth sensor data graphical display annotated with events detected by the sensing system, in accordance with at least one aspect of the present disclosure. -
FIG. 14 illustrates a block diagram of a microwave second order virtual sensor represented as a state machine, in accordance with at least one aspect of the present disclosure. -
FIG. 15A illustrates a graphical user interface utilized to annotate an event being detected by the sensing system, in accordance with at least one aspect of the present disclosure. -
FIG. 15B illustrates a graphical user interface displaying an event being detected by the sensing system, in accordance with at least one aspect of the present disclosure. -
FIG. 16 illustrates a logic flow diagram of a process of detecting events via virtual sensors, in accordance with at least one aspect of the present disclosure. -
FIG. 17 illustrates a block diagram of a general computing or data processing system, in accordance with at least one aspect of the present disclosure. - Certain aspects will now be described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the devices and methods disclosed herein. One or more examples of these aspects are illustrated in the accompanying drawings. Those of ordinary skill in the art will understand that the devices and methods specifically described herein and illustrated in the accompanying drawings are non-limiting example aspects and that the scope of the various aspects is defined solely by the claims. The features illustrated or described in connection with one aspect may be combined with the features of other aspects. Such modifications and variations are intended to be included within the scope of the claims. Furthermore, unless otherwise indicated, the terms and expressions employed herein have been chosen for the purpose of describing the illustrative aspects for the convenience of the reader and are not intended to limit the scope thereof.
-
FIG. 1A illustrates a block diagram of asensing system 100, in accordance with at least one aspect of the present disclosure. Thesensing system 100 comprises asensor assembly 102 having one ormore sensors 110 and a computer system 104 (e.g., one or a number of networked servers) to which thesensor assembly 102 can be communicably connected via a network 108 (FIG. 5 ). Thesensors 110 include a variety of sensors for detecting various physical or natural phenomena in the vicinity of thesensor assembly 102, such as vibration, sound, ambient temperature, light color, light intensity, electromagnetic interference (EMI), motion, ambient pressure, humidity, composition of gases (e.g., allowing certain types of gases and pollutants to be detected), distance to an object or person, presence of a user device, infrared radiation (e.g., for thermal imaging), or the like. WhileFIG. 1A illustrates onesensor assembly 102 included in thesensing system 100, a plurality of sensor assemblies communicably connected with each other and/or with acomputer system 104 are within the scope of this disclosure (as shown inFIG. 5 ). - The
sensing system 100 is configured to train and implement one or morevirtual sensors 118, which are machine learning based classification systems or algorithms trained to detect particular events to which thevirtual sensors 118 are assigned as correlated to the data sensed by thesensors 110 of thesensor assembly 102 and/or othervirtual sensors 118. The training and implementation of various aspects of thevirtual sensors 118 are described in more detail below. - The
sensing system 100 may be configured to be, without limitation, a special-purpose sensing system, a distributed sensing system, infrastructure-mediated sensing system, and/or a general-purpose sensing system. - In an aspect, special-purpose sensing systems may include a
single sensor assembly 102 configured to monitor a single facet of an environment. For example, asensor assembly 102 including a microphone can be affixed to a faucet so that water consumption can be inferred (which, in turn, is used to power behavior-changing feedback). As another example, asensor assembly 102 including a temperature sensor and/or an occupancy sensor can be placed in a room to sense environmental data that can be used by a heating, ventilation, and air conditioning (HVAC) system to manage the HVAC system. - In an aspect, infrastructure-mediated sensing systems may include one or
more sensor assemblies 102 installed within a structure at strategic infrastructure probe points. For example,sensor assemblies 102 can be coupled to a building's power lines to detect “events” caused by electrical appliances. Since home electrical lines are shared, a single sensor assembly can observe activities across an entire house. Infrastructure-mediated sensing systems may also be coupled to, e.g., HVAC, plumbing, natural gas lines, and electric lighting. For example, a sensing assembly including one or more sensors may be installed at a probe point, enabling the sensing system to monitor aspects of the building. For example, a plumbing-attached sensor assembly may be configured to detect sink, shower, and toilet use. Infrastructure-mediated sensing systems may include one sensor assembly and/or a plurality of sensor assemblies utilized to monitor a few facts of an environment. - In an aspect, distributed sensing systems may include
many sensor assemblies 102 deployed in an environment that are networked together. Such a sensing system may be used to enlarge the sensed area (e.g., occupancy sensing across an entire warehouse) or increase sensing fidelity through complementary readings (e.g., sensing seismic events utilizing sensors deployed across an area). The distributed sensor assemblies 102 can be homogeneous (e.g., an array of identical infrared occupancy sensors) and/or heterogeneous. Also, the array can sense one facet (e.g., fire detection) or many facets (e.g., appliance use). For example, a home security system is a heterogeneous distributed system, where one or more sensor assemblies may include door sensors, window sensors, noise sensors, occupancy sensors, and even cameras that work together to sense a single facet of the environment: “Is there an intruder in the home?” As another example, a homogeneous array of sensor assemblies comprising magnetic sensors can be utilized to detect object interactions throughout an entire house. Thus, distributed sensing systems may be configured to include as many sensor assemblies as are needed to monitor anywhere from a single facet to many facets of an environment, depending upon the particular implementation of the distributed sensing system. - In an aspect, a general-purpose sensing system may include a wide variety of
underlying sensor assemblies 102 that can be utilized flexibly such that they can be attached to a variety of objects and can sense many facets without any modification to the sensor assembly 102. - In certain aspects, the
sensing system 100 may be a direct sensing system and/or an indirect sensing system. For direct sensing, a sensor assembly 102 is physically coupled to an object or infrastructure of interest and may provide excellent signal quality. Some direct sensing systems may utilize batteries or other power sources to power the sensor assembly. Indirect sensing systems seek to sense state and events indirectly, without having to physically couple to objects. For example, a sensor assembly including an electromagnetic interference (EMI) sensor can be installed near an appliance and/or its power source to detect usage of the appliance because, when an appliance is in different modes of operation (e.g., refrigerator compressor running, interior lights on/off), the object and/or the power source emits characteristic electromagnetic noise that can be captured and recognized. As another example, a sensor assembly including an acoustic sensor can be installed in a workshop to recognize tool usage according to the detected acoustic characteristics of each tool. Example sensors to be included in the sensor assembly 102 that are configured for indirect sensing can include, without limitation, noncontact thermometers, rangefinders, motion sensors, EMI sensors, acoustic sensors, vibration sensors, magnetic field sensors, cameras, ultrasonic sensors, laser-based sensors (e.g., lidar), or the like. Indirect sensing systems have greater flexibility in sensor placement, which allows sensors to be better integrated into the environment or even hidden. Further, it may be possible to place the sensor assembly of an indirect sensing system at a nearby wall power outlet, eliminating the need for batteries. - Referring back to
FIG. 1A, the sensor assembly 102 further includes a featurization module 112 (which can be implemented with firmware executed by a microcontroller(s) or other programmable circuit(s) of the sensor assembly 102) that processes and converts raw data from the sensors 110 into various forms of processed data and extracts measurable properties or characteristics of the data, i.e., features. The featurization module 112 can output the processed sensor 110 data in the form of, e.g., a feature vector, to be provided to a machine learning-based classification system, a statistical classification system, and/or a clustering system that utilizes pattern recognition, machine learning techniques (e.g., a classifier), logistic regression, a decision tree, a random forest, or the like, at the computer system 104. In various aspects, the featurization module 112 can ingest data from both high sample rate (e.g., several kHz to several MHz) and low sample rate (e.g., 0.1 Hz to 1 kHz) sensors 110. Examples of high sample rate sensors may include, without limitation, vibration sensors, EMI sensors, microphones, cameras, or the like. Examples of low sample rate sensors may include, without limitation, temperature sensors, humidity sensors, light level sensors, or the like. - The
featurization module 112 can determine or extract various features from, for example, the time domain and/or the frequency domain representations (e.g., by transformation of the time domain representations) of the sensor data. The features can be extracted from the raw data from the sensors 110 utilizing a number of different techniques, which can vary according to the sample rate at which the sensor data was collected or the particular type of sensors 110. Furthermore, the number and/or types of features extracted from the raw sensor data and/or transmitted to the computer system 104 by the featurization module 112 can be based on, for example, the sample rate, the types of sensors, user input, or the like. In various aspects, the number and/or types of features extracted by the featurization module 112 can be controlled by the featurization module 112 itself, the computer system 104, a client 106 (FIG. 5), and/or another system or device that is part of the sensing system 100 or can access the sensing system 100. In one example, the data from one or more high sample rate sensors of the sensor assembly 102 can be featurized by transforming the data into a spectral representation via a sliding window Fast Fourier Transform (FFT) (e.g., 256 samples, 10% overlapping) at a particular rate (e.g., 10 Hz), with phase information either utilized or discarded. This technique may also be used to featurize data from low sample rate sensors. In another example, the data from a high sample rate acoustic sensor (e.g., a microphone) of the sensor assembly 102 can be transformed into the frequency domain first, and then one or more of the mel-frequency cepstral coefficient (MFCC) features (e.g., 14 or 40 MFCC coefficients, on a sliding window of audio data), the delta features, and/or the double delta features can be extracted from the frequency domain.
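The sliding-window FFT featurization just described can be sketched minimally in numpy; the window size and overlap match the example values above, while the sample rate and test signal are illustrative assumptions rather than part of the disclosure.

```python
import numpy as np

def fft_featurize(signal, window=256, overlap=0.10):
    """Sliding-window FFT featurization: each window of the raw stream
    becomes one magnitude-spectrum feature frame (phase is discarded)."""
    hop = int(window * (1.0 - overlap))           # step between windows
    frames = []
    for start in range(0, len(signal) - window + 1, hop):
        spectrum = np.fft.rfft(signal[start:start + window])
        frames.append(np.abs(spectrum))           # magnitude only
    return np.asarray(frames)                     # (n_frames, window//2 + 1)

# Illustrative input: one second of a 1 kHz tone sampled at 4 kHz, which
# concentrates energy in FFT bin 64 (1000 / (4000 / 256)).
fs = 4000
t = np.arange(fs) / fs
features = fft_featurize(np.sin(2 * np.pi * 1000 * t))
```

Keeping only the magnitude, as here, corresponds to the "phase information discarded" option in the text.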
In another example, the data from the low and/or high sample rate sensors 110 can be featurized by calculating various statistical features (e.g., min, max, range, mean, median, mode, sum, standard deviation, and/or centroid) on a rolling buffer with different time granularities (e.g., 100 ms, 500 ms, and/or one second) at a particular rate (e.g., 10 Hz). Also, particularly for high sample rate sensors, the raw sensor data can be featurized by transforming the data into a spectral representation. In one aspect, the featurized data for every sensor can be independently transmitted to the computer system 104 for further processing thereon. In another aspect, the featurized data for a subset of sensors can be packaged together and transmitted to the computer system 104. Also, the back-end server system can transmit a signal or instruction to the sensor assembly to inform the sensor assembly which features should be extracted for a particular sensor. That is, the back-end server system can modify or change when or which features are extracted by the sensor assembly (for embodiments where the sensor assembly is extracting the features from the raw sensor data). - In one aspect depicted in
FIGS. 1A and 1B, the featurization module 112 can be present onboard the sensor assembly 102 so that the data from the sensors 110 can be featurized prior to being transmitted to the computer system 104. In this aspect, the raw sensor data is not transmitted or otherwise stored outside of the sensor assembly 102. This may be advantageous for various reasons. First, the featurization denatures the sensor 110 data, providing an additional degree of data privacy by precluding the transmitted data from being intercepted by an unintended recipient and then reconstructed. For example, data from an acoustic sensor can be converted into a low-fidelity spectral representation in combination with basic statistical features, which precludes the ability to reconstruct the spoken content from the featurized data that is transmitted. Similarly, data from a vision-based sensor (e.g., a camera) may be featurized and denatured. Second, the featurization reduces the data packet size, which is useful for conserving transmission bandwidth and storage of the sensor 110 data. Alternatively and/or additionally, the featurization module 112 can be present on the computer system 104. In this aspect, some or all of the features are extracted from the raw sensor data after the raw sensor data and/or partially featurized data is transmitted to the computer system 104. This likewise may be advantageous for various reasons, such as by reducing the computational power that is required onboard the sensor assembly 102. - After the data is processed by the
featurization module 112, the featurized data can be processed and/or analyzed by a machine learning module 116 of the computer system 104 included in the sensing system 100. In one aspect, the machine learning module 116 can generate a machine learning model to detect correlations between the data and events that have occurred. In one aspect, the machine learning module 116 generates a classifier, which is an algorithm that is trained via a machine learning model to assign an input to one or more categories based upon the training that the classifier received. In this aspect, the classifier can be trained to identify the occurrence of a given event based upon the grouped, featurized data that is provided to the machine learning module 116 as training data. In training the classifier to identify an event, the machine learning module 116 can assess the informational power of different sensor channels and may select appropriate thresholds for optimal accuracy in characterizing the training data. The training by the machine learning module 116 causes the classifier to learn which sensor data streams are associated with an event type and, further, which characteristics of those data streams identify the event type with particularity. Once trained to identify an event, a virtual sensor 118 can output a notification and/or signal when the event is detected that causes a graphical user interface 500, an example of which is depicted in FIG. 15B, to display an icon, ideogram, textual alert, or other indicator 506 indicating that the event is being detected. - The
machine learning module 116 can utilize supervised learning techniques, unsupervised learning techniques, or both in training the classifier. The advantage of using both supervised and unsupervised methods may be that it is an effective method for correlating different types of features from multimodal data. Using both supervised and unsupervised methods may also be advantageous because it enables fine-tuning of unsupervised training with supervised training results. Supervised learning is the machine learning task of inferring a function from labeled training data. The training data consists of a set of training examples. In supervised learning, each example is a pair consisting of an input object, typically a vector, and a desired output value or target. The goal is to learn a general rule that maps inputs to outputs. A supervised method may be advantageous because a supervised learning algorithm analyzes the training data and produces an inferred function, which can be used for mapping new examples. An unsupervised method, by contrast, tries to find hidden structure in unlabeled data and includes an algorithm with no target value, i.e., there is no error or reward signal to evaluate a potential solution. Instead, the algorithm has a halting criterion. Examples of halting criteria include, but are not limited to, precision, recall, accuracy, number of cycles, and time. An unsupervised method may be advantageous for use in model training when the only data available is unlabeled data. - In an aspect, the
machine learning module 116 can utilize now or hereafter known machine learning methods to detect correlations between the data and events that have occurred, such as various deep learning algorithms, clustering, etc. In other aspects, after the data is processed by the featurization module 112, the featurized data can be processed by other classification modules, such as a logistic regression module, a clustering module (e.g., k-means, spectral, density-based spatial clustering of applications with noise (DBSCAN), and mean-shift), a decision tree module, or a random forest module. In one aspect, the machine learning module 116 comprises an ensemble classification model utilizing, e.g., an algebraic combination technique or a voting (plurality) combination technique. Ensemble classification models can promote robustness against false positives, while supporting the ability to detect simultaneous events. In one aspect, the machine learning module 116 uses base-level support vector machines (SVMs) trained for each virtual sensor, along with a global (multi-class) SVM trained on all sensors. In embodiments where the computer system 104 implements numerous first or higher order virtual sensors 118, the virtual sensors 118 could all use the same machine learning technique or they could use different machine learning techniques. For example, some virtual sensors 118 could use support vector machines, some decision trees, some neural networks, etc. In an aspect, the featurized data may be organized as feature vectors, and the feature vectors are fed into the machine learning module 116 as the training data for the classifiers. - In some aspects, after the
raw sensor 110 data is processed by the featurization module 112, the featurized data can optionally be processed by an activation group module prior to being transmitted to and/or prior to being processed by the machine learning module 116. In an aspect, the activation group module further processes and converts featurized data from the featurization module 112 into various forms of processed data, as discussed below. In one aspect, the activation group module can be executed on the sensor assembly 102, i.e., prior to the featurized data being transmitted to the computer system 104. In another aspect, the activation group module can be executed by the computer system 104 after the featurized data has been received thereby. Additionally and/or alternatively, the activation group module can be a part of a node between the sensor assembly 102 and the computer system 104 (e.g., a gateway). In aspects including such an intermediate node, the intermediate node can be considered part of the computer system 104 as described herein. In an aspect, the activation group module may also process raw sensor data without featurization. - In an aspect, the activation group module can determine which of the
sensors 110 have been triggered at a given time or within a given time window and extract a subset of the data from the sensors as activation group data corresponding to only the activated sensors. Determination of which sensor channels have been activated may reduce the effects of environmental noise on the received sensor data. The activation group module can determine which of the sensors 110 have been triggered or activated by, for example, determining a baseline or background profile of the environment as a calibration routine and using the baseline or background profile to determine which sensors or sensor channels are “activated” by subtracting or otherwise removing the baseline or background profile from the featurized sensor data. In one aspect, the activation group module determines whether a given sensor 110 has been activated by utilizing an adaptive background model for each sensor channel (e.g., rolling mean and standard deviation). In various aspects, all received data streams can be compared against the background profile using, e.g., a normalized Euclidean distance metric. Sensor channels that exceed the baseline by a predetermined threshold (which may be unique for each sensor) are tagged as “activated.” In one aspect, the activation group module can further utilize hysteresis to avoid detection jitter. Thresholds can be, e.g., empirically obtained by running the sensors 110 for several days while tracking their longitudinal variances, or set by the user or system administrator. In one aspect, the background profile refers to data sensed by a sensor based on ambient conditions of an environment that are not related to events of interest, and which the sensing system 100 can obtain when a sensor assembly 102 is initially activated or deployed in a new environment. Alternatively and/or additionally, the sensing system 100 can periodically obtain the environmental background profile for each sensor assembly 102.
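The adaptive background model described above (a rolling mean and standard deviation per channel, with frames tagged as activated when they exceed the baseline by a threshold) might be sketched as follows; the threshold, history length, and test signal are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def tag_activated(stream, threshold=4.0, history=20):
    """Tag each frame of one sensor channel as activated when it deviates
    from an adaptive background model (rolling mean and standard deviation)
    by more than `threshold` standard deviations."""
    background = list(stream[:history])      # seed with calibration frames
    tags = [False] * history                 # calibration frames: not activated
    for value in stream[history:]:
        mu, sigma = np.mean(background), np.std(background) + 1e-9
        active = abs(value - mu) / sigma > threshold
        tags.append(bool(active))
        if not active:                       # only quiet frames update the
            background.pop(0)                # background, so a sustained event
            background.append(value)         # does not absorb into the baseline
    return tags

# Illustrative channel: a small ambient oscillation with one burst (an
# "event") injected at frames 60-64.
channel = 0.1 * np.sin(0.5 * np.arange(100))
channel[60:65] += 5.0
tags = tag_activated(channel)
```

Updating the background only on quiet frames is one simple way to keep the baseline from drifting during an event; the hysteresis mentioned in the text could be layered on top of this tagging.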
Obtaining an environmental background profile helps reduce false positives for the same activity as baselines change (e.g., for detecting the sound of a particular machine in a factory setting, the constant drone of a fan or other such consistent sounds may be subtracted from the featurized data as a baseline or background profile). - Further, the activation group module can create data sets by subtracting the baseline or background profile from any sensor signals detected by the
sensors 110. Such data sets will require less bandwidth for transmission (if the activation group module is part of or executed by the sensor assembly 102) to the computer system 104 for training by the machine learning module 116. Still further, the activation group module can tag the data sets with identification information corresponding to the activated sensors such that the machine learning module 116 knows which sensor streams to consider, and which sensor streams to ignore, when training the machine learning model (e.g., a classifier). This assists in classification by reducing the feature space in which the classifier is trained. The identification of a particular grouping of sensors 110 that have been activated in association with an event can serve as useful metadata to describe the event, which can in turn assist in classifying the event type that has been detected. For example, a boiling kettle can activate infrared, vibration, and/or acoustic sensors of the sensor assembly 102, and a determination by the activation group module that the infrared, vibration, and/or acoustic sensors have been activated from amongst a group of sensors 110 of the sensor assembly 102 can itself be used as a feature to assist in classifying the event as a kettle boiling within the detection area of the sensor assembly 102. In some aspects, the activation group module can, optionally, assemble an amalgamated feature vector of the featurized data from the activated sensors, which is then provided to the machine learning module 116. - Referring back to the
machine learning module 116, in one aspect where the machine learning module 116 utilizes supervised learning, the sensing system 100 can be configured to provide labels for the featurized data. The labels can be provided by users and/or generated by the sensing system 100. - In aspects where the labels are provided by users, the
sensing system 100 can include an interface for users to indicate when and what types of events have occurred, which can then be correlated to the data sensed by the sensor assembly 102. For example, FIG. 15A illustrates a graphical user interface 500 utilized to annotate an event being detected by the sensing system 100, in accordance with at least one aspect of the present disclosure. The graphical user interface 500 could be displayed on a client 106 (FIG. 5) connected to the sensing system 100, the computer system 104, or another computer system or device that is in communication with the sensing system 100. The graphical user interface 500 can allow users to visualize the sensor data streams (which can be either the raw sensor data or the featurized sensor data) and then indicate when various event types occurred, such as by annotating the sensor data streams with event types and the times that the event types occurred. By indicating when various event types occurred, the machine learning module 116 can then train a machine learning model, such as a classifier, to correlate various characteristics of the featurized sensor data streams with the occurrences of the particular event types. For example, the graphical user interface 500 could be utilized to provide a “knocking” annotation 502 at the time on the sensor data stream 504 corresponding to when there was knocking on a door. The annotation 502 thus provides a label for the sensor data for the machine learning module 116 to train a virtual sensor 118 to detect the corresponding event. As another example, a user could apply a “faulty” label to a vibration sensor reading from a machine in a factory that is vibrating due to mechanical misalignment. - In aspects where the labels are generated by the
sensing system 100, the sensing system 100 can automatically generate the labels by, for example, clustering, deep learning, or other now or hereafter known methods that can be implemented by the machine learning module 116. In one aspect, a user could use the user interface 500 to verify whether any labels automatically generated by the sensing system 100 are correct or incorrect. The machine learning module 116 could then adjust the training of the machine learning model being used to generate the labels to avoid characterizing such false positives. In another aspect, a user could use the user interface 500 to supplement the labels automatically generated by the sensing system 100 or otherwise apply additional labels to the sensor data streams, as described above. The machine learning module 116 could then adjust the training of the machine learning model being used to generate the labels to properly characterize such false negatives. Additionally and/or alternatively, the feature vectors, along with their associated labels, may be fed into the machine learning module 116 as the training data for the classifiers. - In another aspect where the
machine learning module 116 utilizes unsupervised learning, the machine learning module 116 can comprise a deep neural network configured to perform deep learning. As used herein, “deep learning” refers to a form of machine learning that utilizes multiple interconnected neural network layers along with feedback mechanisms or other methods to improve the performance of the underlying neural network. Deep learning systems are usually based on several interconnected layers of a convolutional neural network, among other layers, interconnections, or feedback mechanisms. There are many variants of neural networks with deep architecture depending on the probability specification and network architecture, including, but not limited to, deep belief networks (DBNs), restricted Boltzmann machines (RBMs), convolutional neural networks (CNNs), deep neural networks (DNNs), recurrent neural network (RNN)-enhanced models capable of sequential data pattern learning, and autoencoders. Deep learning models may be trained to learn representations of data using supervised and/or unsupervised learning. From a computational standpoint, the methods used in deep learning involve several mathematical calculations of matrix-to-matrix and matrix-to-vector calculations. The number and nature of these calculations makes it essentially impossible for a human to perform them by hand, or by any manual process, within any practical amount of time. In one such aspect, the machine learning module 116 uses a two-stage clustering process. First, the machine learning module 116 reduces the dimensionality of the data set using a multi-layer perceptron configured as an autoencoder. The autoencoder can have, e.g., multiple nonoverlapping sigmoid functions in the hidden layer(s). Because the output of the autoencoder is the same as the input values, the hidden layer(s) will learn the best reduced representation of the feature set.
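A minimal sketch of this two-stage clustering process (a sigmoid-hidden-layer autoencoder for dimensionality reduction, then expectation-maximization clustering on the reduced features) is shown below. The use of scikit-learn's MLPRegressor as a stand-in autoencoder, its GaussianMixture as the EM implementation, and the synthetic data are all assumptions for illustration only.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic unlabeled featurized data: two latent clusters in 6 dimensions.
X = np.vstack([rng.normal(0.0, 0.2, (40, 6)), rng.normal(3.0, 0.2, (40, 6))])

# Stage 1: a multi-layer perceptron trained as an autoencoder (targets equal
# inputs), whose sigmoid hidden layer learns a 2-D reduced representation.
ae = MLPRegressor(hidden_layer_sizes=(2,), activation="logistic",
                  max_iter=5000, random_state=0).fit(X, X)
hidden = 1.0 / (1.0 + np.exp(-(X @ ae.coefs_[0] + ae.intercepts_[0])))

# Stage 2: expectation-maximization (Gaussian mixture) clustering on the
# reduced feature set from the hidden layer.
labels = GaussianMixture(n_components=2, random_state=0).fit_predict(hidden)
```

The hidden activations are recomputed manually from the learned weights because the regressor itself only exposes the reconstructed outputs.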
Second, this reduced feature set is used as input to an expectation maximization (EM) clustering algorithm. In other embodiments, the machine learning module 116 can comprise a decision tree, a logistic regression model, a random forest, etc. In one aspect wherein the machine learning module 116 utilizes deep learning, some or all of the featurization or feature extraction may be accomplished automatically using learning from the training data. However, since the accuracy of a deep learning system depends, at least in part, on the sequence in which training data is provided to the deep learning system, pre-processing of the training data (e.g., using featurization by the featurization module 112, using the activation group module, by providing labels, etc.) and selection of training data may be used to improve the accuracy of the model. Selection of training data includes, for example, using domain-specific knowledge to improve performance of the machine learning system. Domain expertise, as used herein, provides a context in which a deep learning system can operate and can be used to select elements of training data, the order in which the training data is presented to the deep learning system, and certain sorts of invariances. - In one aspect, the
computer system 104 can be further programmed to perform featurization (i.e., featurization at the computer system 104), in addition to and/or as an alternative to the onboard featurization performed by the sensor assembly 102. The additional featurization can include extracting features that would require computationally expensive processing for the sensor assembly 102 hardware to handle or that would be too large to transmit to the computer system 104. The additional features can be computed by the computer system 104 (e.g., the cloud or a remote server) or an intermediate node between the sensor assembly 102 and the computer system 104, such as a gateway. In aspects including such an intermediate node, the intermediate node can be considered part of the computer system 104 as described herein. In various aspects, the computer system 104 (including, potentially, an intermediate node) can be configured to compute additional features from data corresponding to one or more high sample rate sensors and/or one or more low sample rate sensors of the sensor assembly 102. The additional features computable by the computer system 104 can include, without limitation, band ratios, fractional harmonics, first or second order signal derivatives, MFCCs, and/or statistical features (e.g., min, max, range, mean, median, mode, sum, standard deviation, and centroid) from raw data from the acoustic, EMI, vibration, or other sensors 110, and/or from already featurized data from the featurization module 112. In one aspect, the computer system 104 can be configured to normalize data from other sensors 110. This server-side featurized sensor data can then be fed, either alone or in combination with the data featurized onboard the sensor assembly 102, to the machine learning module 116 (or, in some aspects, to the activation group module, which in turn feeds into the machine learning module 116) for classification, as described above. - The
machine learning module 116 of the computer system 104 is also configured to train one or more virtual sensors 118. In one aspect, the classifier trained by the machine learning module 116 on the provided training data and/or sensor data associated with a given event can be considered a virtual (or synthetic) sensor 118 for that event. For example, a classifier trained by the machine learning module 116 to recognize a boiling kettle according to featurized data from various combinations of infrared, vibration, and/or acoustic sensors of the sensor assembly 102 can be defined as a “kettle boiling” virtual sensor 118. As another example, a classifier trained by the machine learning module 116 to recognize the movement of a door according to featurized data from acoustic and/or vibration sensors 110 can be defined as a “door movement” virtual sensor 118. It should be noted that because each virtual sensor 118 is trained on data collected by a sensor assembly 102 or combinations of sensor assemblies 102, and because such sensor assemblies 102 will be located in a number of different types of environments when utilized in the field, the subset of sensors 110 included in each sensor assembly 102 that are correlated with an event type can vary. Furthermore, the machine learning model associated with the same event and same subset of sensors 110 can also vary depending on the environment of the sensor assemblies 102 (e.g., if the background profile is different for the different environments). Specifically, the machine learning model for each virtual sensor 118 could also have different parameters and/or weights for an event based on the environment in which the sensor assemblies 102 are located.
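A classifier-as-virtual-sensor of this kind might be trained as sketched below. This is a hedged illustration only: the choice of a support vector machine, the four-dimensional feature vectors, and the synthetic training data are assumptions, not the disclosure's method.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical featurized frames from infrared + acoustic channels:
# rows 0-49 are background, rows 50-99 were collected while a kettle boiled.
background = rng.normal(0.0, 0.3, (50, 4))
boiling = rng.normal(2.0, 0.3, (50, 4))       # hotter IR, louder acoustics
X = np.vstack([background, boiling])
y = np.array([0] * 50 + [1] * 50)             # 1 = "kettle boiling"

# The trained classifier is, in the disclosure's terms, the "kettle boiling"
# virtual sensor for the particular environment the data came from.
kettle_sensor = SVC().fit(X, y)
pred = int(kettle_sensor.predict(np.full((1, 4), 2.0))[0])
```

Retraining the same code on data from a different environment would yield a differently parameterized model, which is the environment-specificity the surrounding text describes.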
Therefore, different virtual sensors 118 (i.e., virtual sensors 118 utilizing different subsets of sensors 110 or having different parameters and/or weights assigned to the event) may be implemented to detect the same event in different locations, different environments, or even over different time periods in the same location. In other words, each virtual sensor 118 will be uniquely trained to detect the occurrence of an event according to the sensor data unique to the particular environment(s) in which the sensor assemblies 102 are located and/or at a particular time. For example, a “door movement” virtual sensor 118 in a first environment and/or during a first time period (e.g., during working hours) could be trained to identify the movement of a door based on a combination of acoustic and vibration data and/or a first machine learning model. However, a “door movement” virtual sensor 118 in a second environment and/or during a second time period (e.g., at night) could be trained to identify a door closing based solely upon acoustic data and/or a second machine learning model. Further, the machine learning model for each virtual sensor 118 could also have different parameters and/or weights based on the environment in which the sensor assemblies 102 are located. The sensing system 100 does not utilize any pre-established restrictions on the training of the virtual sensors 118; thus, each virtual sensor 118 will be uniquely trained to detect events according to its environment. - While the above description describes a different virtual sensor 118 (i.e., a classifier or a machine learning model) for each set of unique conditions associated with an event, a single generic
virtual sensor 118 could be trained for each event, taking into account a wide variety of conditions related to different environments and over a period of time, without deviating from the principles of this disclosure. - As depicted in
FIG. 1B, upon the machine learning module 116 completing the training of the virtual sensor 118 with respect to an event, the virtual sensor 118 can receive and/or subscribe to the data streams from the sensors 110 (i.e., the data streams are transmitted to or pulled by the virtual sensors 118) that were activated in accordance with the event (i.e., the sensors 110 that were correlated with the event). For example, if a “kettle boiling” virtual sensor is correlated with infrared and acoustic sensors, the “kettle boiling” virtual sensor will receive and/or subscribe to the data streams from those particular sensors 110. In one aspect where the sensor assembly 102 and/or the computer system 104 includes an activation group module, the virtual sensor 118 can subscribe to the data streams of the sensors 110 related to the event when the activation group module determines those sensors 110 are activated (i.e., in the above example, when the infrared and acoustic sensors are determined to be activated). Thereafter, the virtual sensor 118 monitors for the occurrence of the event that the virtual sensor 118 was trained to detect from the data feed transmitted by the correlated sensors 110 of the sensor assembly 102 to the computer system 104. The virtual sensors 118 can thus detect actions or events directly and/or indirectly (i.e., without requiring that a sensor 110 be physically connected or otherwise associated with the object or person being sensed) by being trained to correlate stimuli detected by the sensors 110 incorporated within the sensor assembly 102 with the occurrences of the particular events. - In one aspect depicted in
FIG. 1B, the virtual sensors 118 can be implemented by the computer system 104, i.e., the same computer system 104 on which the virtual sensors 118 are trained by the machine learning module 116. In other aspects, the virtual sensors 118 can be stored on and/or implemented by a second computer system. For example, the virtual sensors 118 can be stored in a library after they are trained. Other computer systems can then access the library of previously trained virtual sensors 118 for different activities and events and then utilize the previously trained virtual sensors 118 to sense the occurrence of events according to data from their own sensor assemblies 102. Such other computer systems may also update a previously trained virtual sensor 118. - Although
FIG. 1A depicts a virtual sensor 118 being trained and FIG. 1B depicts a resulting trained virtual sensor 118 subscribing to data streams from the sensor assembly 102, this is described for the sake of convenience and not with an intent of limiting the disclosure as comprising a series and/or a number of steps. It is to be understood that the creation and implementation of virtual sensors 118 to detect events does not need to be performed as a series of steps and/or the steps do not need to be performed in the order shown and described with respect to FIGS. 1A and 1B. In other words, the creation and implementation steps may be integrated and/or may be performed together, or the steps may be performed in the order disclosed or in an alternate order. Furthermore, creation of the virtual sensor 118 is an iterative process, and the training of the classifier that forms the virtual sensor 118 may continue to improve and/or modify the virtual sensor 118 (e.g., using a feedback loop). - Furthermore, although
FIGS. 1A and 1B depict a single virtual sensor 118 being trained and then implemented, the computer system 104 can train any number of virtual sensors 118 to detect the same event in a variety of environments (or at different times) and/or a variety of events. For example, FIG. 2 depicts the sensing system 100 wherein the computer system 104 has been trained to implement n virtual sensors 118 based on data from m sensors 110 incorporated with the sensor assembly 102, where n can be greater than, equal to, or less than m. Each virtual sensor 118 can subscribe to (i.e., receive data from) the data stream of one or multiple sensors 110, in any combination. In various embodiments, as described further below, there could be multiple sensor assemblies 102 sensing different spaces of a larger environment (e.g., a building or campus), and each of the sensor assemblies 102 is in communication with the computer system 104 as described above. In that case, a virtual sensor 118 could rely on featurized data from different sensor assemblies 102 in making a classification. -
FIG. 3A illustrates a block diagram of a sensing system 100, in accordance with at least one aspect of the present disclosure, with various components such as the machine learning module 116 omitted for clarity. In some aspects, the sensing system 100 can include a hierarchical structure of virtual sensors 118. The virtual sensors 118 that receive the featurized data from the sensors 110 of the sensor assembly 102 to make their classifications can be referred to as first order virtual sensors 120. The computer system 104 can further be configured to implement second order virtual sensors 124 that receive and process, among other things, the outputs of one or more first order virtual sensors 120 to make their “second order” classifications; third order virtual sensors that receive and process, among other things, the outputs of one or more second order virtual sensors 124 to make their “third order” classifications; and so on for subsequent orders of virtual sensors 118. Described generally, the computer system 104 can be configured to implement xth order virtual sensors that receive the outputs from one or more (x−1)th or lower order virtual sensors to detect the occurrence of an event or condition (e.g., make a classification that the event occurred or that a condition is or is not present). - The second order
virtual sensors 124 could also subscribe to and/or receive other, non-first order virtual sensor data. For example, in some embodiments, a second order virtual sensor 124 could receive data from at least one first order virtual sensor 120, as well as featurized data from one or more of the sensors 110 of the sensor assembly 102, in order to make its classification. This applies to higher order sensors as well. An xth order virtual sensor could receive data from at least one (x−1)th order sensor, as well as either (i) data from lower order sensors (e.g., (x−2)th, (x−3)th, etc.) and/or (ii) featurized data from one or more of the sensors 110 of the sensor assembly 102, in order to make its classification. - In some aspects, the higher order virtual sensors can include algorithms that, for example and without limitation, count the number of occurrences or duration of an event detected by a lower order virtual sensor, algorithms that smooth the outputs of lower order virtual sensors (and, in some cases, the
sensors 110 of the sensor assembly 102), algorithms that combine the outputs of multiple lower order virtual sensors and/or sensors 110 (featurized and/or raw data), or the like. As an example of a higher order virtual sensor that can combine the outputs of lower order sensors, a second order virtual sensor 124 could indicate whether an occupant is present within a home by analyzing the outputs of multiple human activity-related first order virtual sensors 120, such as a “washing dishes” first order virtual sensor 120, a “movement in the kitchen” first order virtual sensor 120, and so on. In another example, a third order virtual sensor could output an alarm if the second order virtual sensor 124 determines that a home owner is not present (e.g., by determining that for a threshold period of time the first order virtual sensors 120 have indicated that no lights have been turned on) and another first order virtual sensor 120 detects a fire event. In these aspects, the higher order virtual sensors can receive the outputs of one or more lower order virtual sensors and/or sensors 110 of the sensor assembly and then make their corresponding classifications accordingly. - In some aspects, the higher order virtual sensors can, as with the first order
virtual sensors 120, include classifiers trained by a machine learning module on the output from one or more lower order sensors to identify the occurrence of a trained-for event or condition. The higher order virtual sensors can be trained on the outputs of at least one immediately lower order of virtual sensor in the hierarchical structure, rather than strictly on the outputs of the sensors 110 of the sensor assembly 102. For example, FIG. 3A depicts the computer system 104 including a second machine learning module 122 that receives the outputs (data) from the first order virtual sensors 120 to generate a second order virtual sensor 124. In some aspects, the data from the first order virtual sensors 120 can additionally be processed by a featurization module and/or an activation group module, as described above with respect to FIGS. 1A and 1B, prior to being processed by the second machine learning module 122. In such aspects, the computer system 104 can also implement a featurization module for featurizing the data from a virtual sensor 118. In the aspect depicted in FIG. 3B, upon the second machine learning module 122 completing the training of the second order virtual sensor 124, the second order virtual sensor 124 receives data streams from the first order virtual sensors 120 that the second machine learning module 122 determined correlated to the event that the particular second order virtual sensor 124 was being trained on. Thereafter, the second order virtual sensor 124 monitors for the occurrence of the event that the second order virtual sensor 124 was trained to detect from the data feed generated by the first order virtual sensors 120. - Although
FIGS. 3A and 3B depict a single second order virtual sensor 124 being trained and then implemented, the computer system 104 can train any number of higher order virtual sensors 118 to detect a variety of events. For example, FIG. 4 depicts the sensing system 100 wherein the computer system 104 has been trained to implement p second order virtual sensors 124 and n first order virtual sensors 120 based on data from m sensors 110 incorporated with the sensor assembly 102, where p can be greater than, equal to, or less than n. Each second order virtual sensor 124 can subscribe to (i.e., receive data from), at least, the data stream of one or multiple first order virtual sensors 120, in any combination. Further, these same principles apply to third and higher order virtual sensors implemented by the computer system 104. Additionally, FIG. 4 depicts the higher order virtual sensors receiving data from different levels or orders of sensors. For example, the pth second order virtual sensor 124 is depicted as receiving data from the mth sensor 110, in addition to data from the directly preceding 2nd and nth first order virtual sensors 120. - In one aspect, the first order
virtual sensors 120 can produce a binary output (e.g., are binary classifiers). For example, a first order virtual sensor 120 trained to identify whether a faucet is running or whether someone is at their desk working could produce continuous, time-stamped binary “yes” or “no” outputs. In this aspect, higher order virtual sensors can further produce nonbinary outputs, such as state (of an object or environment), count, and duration. For example, the sensing system 100 could implement five separate first order virtual sensors 120 that track five separate aspects of a microwave: whether the microwave is running, the keypad has been pressed, the door has been opened, the door has been closed, and the completion chime has sounded. From these time-stamped binary outputs of the first order virtual sensors 120, a second order virtual sensor 124 could generate a nonbinary output of the states of the microwave: available, door ajar, in-use, interrupted, or finished. In one implementation, when the completion chime is detected (i.e., the “completion chime” first order sensor 120 is activated), the microwave state output of the second order virtual sensor 124 can change from “in-use” to “finished.” The microwave state output can stay as “finished” until a “door closed” event is detected (i.e., the “door closed” first order virtual sensor 120 is activated), after which the items inside the microwave are presumed to have been removed and the microwave state output of the second order virtual sensor 124 is changed to “available.” Second order virtual sensors 124 need not be connected to multiple first order virtual sensors 120 to produce nonbinary outputs, though. As another example, the sensing system 100 could implement a first order virtual sensor 120 that detects when a door is opened and a second order virtual sensor 124 that counts the number of times that the first order virtual sensor 120 has been activated.
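The microwave example above amounts to a small state machine driven by binary first order events. The following Python sketch illustrates the idea; the transition table is one plausible reading of the described behavior, not an implementation disclosed herein:

```python
# Hypothetical second order "microwave state" virtual sensor: maps
# time-stamped binary events from first order sensors onto the
# nonbinary states named in the text. Transitions not listed leave
# the state unchanged (e.g., "finished" persists until "door_closed").
class MicrowaveStateSensor:
    TRANSITIONS = {
        ("available", "door_opened"): "door ajar",
        ("door ajar", "door_closed"): "available",
        ("available", "running"): "in-use",
        ("in-use", "door_opened"): "interrupted",
        ("interrupted", "running"): "in-use",
        ("in-use", "completion_chime"): "finished",
        ("finished", "door_closed"): "available",
    }

    def __init__(self):
        self.state = "available"

    def on_event(self, event):
        # event: name of the first order virtual sensor that activated
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state
```

For instance, a completion chime while "in-use" yields "finished", and the state only returns to "available" once a "door closed" event is observed, mirroring the presumption that the items have been removed.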
As yet another example, the sensing system 100 could implement a first order virtual sensor 120 that detects when a faucet is running and a second order virtual sensor 124 that tracks the time duration that the first order virtual sensor 120 is activated. That way, an approximation of the total amount of water used could be computed. These are examples of just a few of the first and second order virtual sensors that can be implemented in various embodiments of the present invention. In other aspects, the first order virtual sensors 120 can produce nonbinary outputs. For example, the first order virtual sensors 120 could include multi-class classifiers trained to output one of several labels. As described above, the first and second (or higher order) virtual sensors could be trained to detect other, binary or nonbinary, conditions, states, durations, etc. - By having lower order virtual sensors feed into higher order virtual sensors, the
sensing system 100 can infer increasingly richer details about the environment in which the sensor assembly 102 is located. Further, multiple sensor assemblies 102 can be communicably connected to the computer system 104, and the data feeds from the multiple sensor assemblies 102 can be combined to provide additional data that can be processed by machine learning to infer information about the environment from correlated data from the sensor assemblies 102. For example, one or more appliance-level second order virtual sensors could feed into a kitchen-level third order virtual sensor, which could in turn feed into a house-level fourth order virtual sensor, and so on. A house-level virtual sensor drawing on multiple lower order sensors (whether they are virtual sensors or actual sensors disposed on one of the sensor assemblies within the house) across many rooms can classify complex facets like human activity. Tracking human activities accurately can be very useful in a variety of contexts, such as with smart homes, healthcare tracking, managed care for the elderly, and the security and safety of human occupants. - The outputs of the various
virtual sensors 118 can further be fed into applications executed by the computer system 104, an external client 106 (FIG. 5), and/or other local or remote computer systems via, e.g., an application program interface (API). In these aspects, the computer system 104, a client 106 (FIG. 5), and/or other computer systems may execute a virtual machine receiver program or application to display the output in an application window, a browser, or other output window. For example, the output of a virtual sensor 118 counting the number of times that paper towels are dispensed from a particular dispenser could be fed into a local or remote application that automatically orders paper towels once the count has reached a threshold. As another example, the output of a virtual sensor 118 which tracks anomalous conditions of a machine by considering vibrations and audio signatures could be fed into a local or remote application that notifies the machine maintainer by sounding an alarm and shuts down the machine safely. As another example, the output of a virtual sensor 118 tracking the duration that a light is on in a room could be fed into a local or remote application that automatically turns the light off once the duration has reached a threshold. As yet another example, the output of a virtual sensor 118 monitoring the state of a washing machine could be fed into a local or remote application that automatically notifies the user (e.g., via a text message or a push notification on the user's mobile phone) when the drying cycle of the washing machine has completed.
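As a concrete illustration of wiring a virtual sensor output into an application, the light-duration example above can be sketched as follows. The API surface, sample format, and threshold are hypothetical assumptions for illustration, not part of this disclosure:

```python
# Hypothetical sketch: a duration-style virtual sensor accumulates the
# time a binary "light on" output has been active, and an application
# issues an "off" command once a threshold duration is reached.
def active_duration(samples):
    """samples: time-ordered (timestamp_seconds, is_active) pairs from a
    binary virtual sensor output; returns total seconds active."""
    total = 0.0
    for (t0, active), (t1, _) in zip(samples, samples[1:]):
        if active:
            total += t1 - t0
    return total

class AutoOffApp:
    """Application fed by the duration output; commands the light off
    once it has been on for threshold_s seconds."""
    def __init__(self, threshold_s):
        self.threshold_s = threshold_s
        self.commands = []

    def on_duration(self, duration_s):
        if duration_s >= self.threshold_s and "light_off" not in self.commands:
            self.commands.append("light_off")

samples = [(0, True), (600, True), (1200, True), (1800, False)]
duration = active_duration(samples)   # light on for 1800 seconds total
app = AutoOffApp(threshold_s=1500)
app.on_duration(duration)             # threshold exceeded: issue command
```

The same accumulation logic would serve the faucet example, with the duration multiplied by an assumed flow rate to approximate water usage.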
As yet another example, event detections from multiple first order virtual sensors 120 located in a home could be fed to a second order virtual sensor 124 to track the various activities of the occupant of a home (e.g., daily routines), which are then fed into an anomaly detection system (which may be an even higher order virtual sensor or a separate application) to notify a caregiver if an elderly person's patterns deviate from their normal patterns (e.g., if the tracked individual falls down, fails to wake up at the usual time, etc.). The different manners in which the outputs of the virtual sensors 118 can be utilized are essentially limitless. -
FIG. 5 illustrates a block diagram of a sensing system 100 including multiple sensor assemblies 102 communicably coupled to a computer system 104, in accordance with at least one aspect of the present disclosure. Each of the sensor assemblies 102 includes a plurality of sensors 110 for detecting various physical or natural phenomena in the environment in which the sensor assembly 102 is located and a control circuit for executing the various functions of the sensor assembly 102. The control circuit can include, for example, a processor coupled to primary and/or secondary computer memory for executing instructions stored on the memory, a microcontroller, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), and other such devices. In the depicted example, the sensor assembly 102 includes a microcontroller 121 that includes a processor 123 coupled to a memory 125. The microcontroller 121 executes firmware 129, including system firmware 129A and application firmware 129B, stored in the memory 125, and includes a clock 127. The firmware 129 can include, for example, firmware for the featurization module 112 (see FIG. 1), which can be executed by the control circuit (e.g., microcontroller 121). In some aspects, the control circuit (e.g., microcontroller 121) can be embodied as a system on a chip (SoC). - The
sensor assembly 102 is communicably connectable to the computer system 104 (e.g., one or a number of networked servers) such that the computer system 104 can receive the signals or data generated by the sensors 110 for processing thereon, as described above. In the depicted example, each sensor assembly 102 is communicably connectable to the computer system 104 via a data communication network 108, such as the Internet, a LAN, a WAN, a MAN, or any other suitable data communication network. In this aspect, the sensor assembly 102 can include an appropriate network interface for connecting to the data communication network 108 such as, for example, a Wi-Fi network interface controller. In other aspects, the sensor assembly 102 can communicably connect to the computer system 104 utilizing other wired or wireless communication protocols or other communication networks (e.g., a cellular telecommunication network or Ethernet). The network interface controller of the sensor assembly 102 may include a network interface controller suitable to implement wireless or wired communication utilizing a variety of communication protocols and/or access methods, such as cellular, Bluetooth, ZigBee, RFID, Bluetooth low energy, NFC, IEEE 802.11, IEEE 802.15, IEEE 802.16, Z-Wave, HomePlug, global system for mobile (GSM), general packet radio service (GPRS), enhanced data rates for GSM evolution (EDGE), code division multiple access (CDMA), universal mobile telecommunications system (UMTS), long-term evolution (LTE), LTE-advanced (LTE-A), LoRa (or another low power wide-area network communication protocol), or any other suitable wired and/or wireless communication method or combination thereof. The network 108 may include one or more switches and/or routers, including wireless routers that connect the wireless communication channels with other wired networks (e.g., the Internet).
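Whichever transport is used, the sensor assembly 102 ultimately ships featurized sensor data to the computer system 104. The snippet below sketches what one such message might look like; the JSON schema and field names are illustrative assumptions, not a format specified by this disclosure:

```python
# Hypothetical featurized-data message from a sensor assembly to the
# computer system. Only the structure is illustrated; a real payload
# would carry the features produced by the featurization module.
import json

def build_payload(assembly_id, timestamp_s, features):
    """Serialize one batch of featurized samples for transmission."""
    return json.dumps({
        "assembly": assembly_id,
        "timestamp": timestamp_s,
        "features": features,
    }, sort_keys=True)

payload = build_payload("assembly-01", 1690000000,
                        {"acoustic_rms": 0.42, "temp_c": 21.5})
decoded = json.loads(payload)  # what the computer system would parse
```

A message like this could equally be carried over MQTT, CoAP, HTTP, or any of the other protocols listed; the transport is orthogonal to the payload structure.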
The data communicated in the network 108 may include data communicated via short messaging service (SMS), multimedia messaging service (MMS), hypertext transfer protocol (HTTP), direct data connection, wireless application protocol (WAP), email, smart energy profile (SEP), ECHONET Lite, OpenADR, or any other protocol that may be implemented with the sensor assemblies 102, physical hubs, cloud server communication, or gateway modules. - In certain aspects, one or more of the
sensor assemblies 102 may also be communicably connected to each other, and/or to a client 106 (e.g., a user device), via the network 108 and/or via a separate network. For example, such a network may be a local network established by a local router or a local switch. Optionally, the sensor assemblies 102 may form a peer-to-peer (P2P) network and may communicate with each other directly. In a peer-to-peer network, service discovery schemes can multicast the presence of nodes, their capabilities, and group membership. The peer-to-peer devices can establish associations and subsequent interactions based on this information. - A
sensor assembly 102 may implement one or more application-layer communication protocols. Examples include constrained application protocol (CoAP), message queue telemetry transport (MQTT), OPC UA, HTTP, REST APIs, and the like for implementing a respective messaging protocol. The sensor assembly 102 may also implement lower-layer communication protocols, which may implement layers of a communication protocol stack lower than the application layer. Example layers implemented may include one or more of the physical, data link, network, transport, session, internet, and presentation protocols. Example protocols implemented include one or more of: Ethernet, Internet Protocol, Transmission Control Protocol (TCP), protocols for the 802.11 standard (e.g., PHY, Medium Access Control, Logical Link Control, and the like), and the like. - In one embodiment, the
computer system 104 may be a virtual machine. The virtual machine may be any virtual machine, while in some embodiments the virtual machine may be any virtual machine managed by a Type 1 or Type 2 hypervisor, for example, a hypervisor developed by Citrix Systems, IBM, VMware, or any other hypervisor. In some aspects, the virtual machine may be managed by a hypervisor, while in other aspects the virtual machine may be managed by a hypervisor executing on a server or a hypervisor executing on a user device. - In some embodiments, the
client 106 may display application output generated by an application remotely executing on a server or other remotely located machine (e.g., for controlling, communicating with, and/or accessing data from a sensor assembly 102 and/or controlling and/or communicating with various objects of an environment being sensed). In these embodiments, the client device may execute a virtual machine receiver program or application to display the output in an application window, a browser, or other output window. -
FIG. 6 depicts an embodiment of the sensor assembly 102 according to various embodiments. As shown in FIG. 6, in various embodiments, the sensor assembly 102 comprises a single circuit board 128 with the various sensors and control circuit connected thereto. As depicted in FIG. 6, the sensors and control circuit can be positioned on one side of the circuit board, and the sensor assembly 102 can further comprise a connector 126 (e.g., a USB connector, power plug, or Ethernet connector for providing power via a power-over-Ethernet interface) on the opposite side of the circuit board for supplying power to the sensors 110, microcontroller 121, and other electronic components of the sensor assembly 102. In one aspect, the sensor assembly 102 is intended to be plugged into an electrical outlet within the area or environment to be monitored by the sensor assembly 102. When secured to the power source via the connector 126, the sensor assembly 102 can be held in a stationary, non-mobile position relative to the object to which it is connected or a reference frame. By being plugged directly into a power source (e.g., an electrical outlet), the sensor assembly 102 does not strictly require a power source that must be replaced or recharged, obviating the need to limit the processing power of the sensor assembly 102 and the number and/or utilization of the sensors 110 in order to attempt to conserve power. Further, by being held stationary with respect to the object to which it is connected or a reference frame, the sensor assembly 102 can utilize certain types of sensors (e.g., vibration sensors) to detect changes in the environment within the vicinity of the sensor assembly 102 relative to a fixed position. In these aspects where the sensor assembly 102 is configured to be plugged directly into an electrical outlet, the sensors 110 can be selected to account for the potential suboptimal placement of the sensor assembly 102 relative to the object or location being sensed.
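Because the assembly is held stationary, structure-borne vibration events (e.g., a door knock) can be picked out of an accelerometer stream with simple signal processing. The sketch below uses an assumed deviation-from-baseline rule as a stand-in for the trained detection actually described in this disclosure; the window and threshold values are illustrative assumptions:

```python
# Illustrative vibration-event detector for a stationary assembly:
# flags samples that deviate from the mean of the preceding window
# by more than a threshold. Window size and threshold are assumed.
def detect_vibration_events(samples, window=8, threshold=0.5):
    """samples: accelerometer magnitudes; returns indices where a
    sample deviates from the mean of the preceding window."""
    events = []
    for i in range(window, len(samples)):
        baseline = sum(samples[i - window:i]) / window
        if abs(samples[i] - baseline) > threshold:
            events.append(i)
    return events

quiet = [0.0, 0.01, -0.02, 0.0, 0.01, 0.0, -0.01, 0.0]
knock = quiet + [2.5] + quiet   # a single sharp impulse mid-stream
events = detect_vibration_events(knock)
```

A fixed mounting position is what makes a stable baseline like this meaningful; on a mobile device, ordinary motion would swamp the structural signal.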
The sensor assembly 102 may need to be placed in a suboptimal location because the placement of the sensor assembly 102 will be contingent upon the location of an electrical outlet, which means that the sensor assembly 102 could potentially be located relatively far from the object or location being sensed. Therefore, the sensors 110 can utilize indirect sensing techniques. - As mentioned above, in one aspect, the
sensors 110, connector 126, microcontroller 121, and various other components of the sensor assembly 102 can be supported upon a printed circuit board (PCB) substrate 128. In one aspect, the sensors 110 can be disposed on a first surface of the PCB substrate 128 and the connector 126 can be disposed on a second, opposing surface of the PCB substrate 128 so that the sensors 110 are oriented outwardly towards the environment when the connector 126 is plugged into or connected to a corresponding socket. In other aspects, the sensors 110 can be mounted on various layers of the PCB substrate 128. For example, sensors 110 such as, without limitation, an EMI sensor configured to measure the electromagnetic interference in the line voltage of a power circuit caused by an electrical device may be included in a first one or more layer(s) of the PCB substrate 128, and other sensors 110 may be included in a different layer of the PCB substrate. In one aspect, the sensor assembly 102 further includes a housing enclosing the various components. That is, the housing can house the PCB substrate 128 and the sensors 110 connected thereto. The housing can protect against physical damage and electrostatic discharge. Further, the housing can be designed to accommodate sensors 110 that require line of sight and access to the environment's air by, for example, having access cutouts for the relevant sensors 110. The housing could be constructed from, for example, laser-cut cast acrylic and/or constructed via injection molding or 3D printing processes. In other aspects, the sensor assembly 102 could comprise multiple PCB substrates, with the sensors 110 on different PCB substrates. In such embodiments, the housing can enclose all of the sensors 110 and PCB substrates. - The
sensors 110 can include various combinations of sensing devices that are configured to detect various different physical or natural phenomena. In one aspect, the sensors 110 include an infrared radiation sensor 130 (e.g., a Panasonic Grid-EYE AMG8833 infrared array sensor), an ambient light color and/or intensity sensor 132 (e.g., a TAOS TCS34725 RGB sensor), a magnetic field sensor 134 (e.g., a Freescale Xtrinsic MAG3110 magnetometer), a temperature sensor 136, an ambient pressure sensor, a humidity sensor (e.g., all part of a Bosch BME280 environmental sensor), an air quality or air composition sensor (e.g., a Bosch BME680 sensor for sensing the presence of certain volatile organic compounds), a vibration sensor 138 (e.g., an InvenSense MPU-6500 six-axis accelerometer and gyroscope motion tracking sensor, which can detect vibrations through the structure when the sensor assembly 102 is secured to an electrical outlet), an external device detection sensor 140 (e.g., a 2.4 GHz network interface controller for detecting the presence and/or activity of external electronic devices connected to the Wi-Fi network or a Bluetooth LE sensor for detecting the presence of external electronic devices in the vicinity of the sensor assembly 102), a motion sensor 142 (e.g., a Panasonic AMN21111 PIR motion sensor), an acoustic sensor 144 (e.g., an Analog Devices ADMP401 microphone), and/or an EMI sensor 146 (e.g., a 100 mH inductor to capture over-air EMI and/or a passive RC network to sense EMI changes in the line voltage of the power source to which the sensor assembly 102 is connected). Various implementations of the sensor assembly 102 can utilize any number and combination of the aforementioned sensors and any other types of sensors 110 for detecting physical or natural phenomena. Further, the sensors 110 can be analog or digital sensors.
Preferably, the sensor assembly 102 does not comprise a high-resolution camera (i.e., a sensor with higher resolution than a thermal imager, such as the infrared radiation sensor 130). As such, the sensing system 100 can make the detections and classifications described herein without the use of a camera, which decreases the cost and power consumption of the sensor assembly 102. It also decreases the amount of data that needs to be featurized onboard and transmitted to the computer system 104, since there is no image data to featurize and transmit. Further, not employing a camera reduces privacy concerns because image data of the environment of the sensor assembly 102 are not captured. Even when a high-resolution camera is employed on the sensor assembly 102, the image data can be featurized onboard the sensor assembly 102 so that privacy is maintained in the data sent to the computer system 104. - In one aspect, the
sensor assembly 102 can further include one or more interfaces that can be utilized to connect to or communicate with additional sensors external to the sensor assembly 102. The interfaces can include, for example, Serial Peripheral Interface (SPI), Inter-Integrated Circuit (I2C), general purpose input/output pins (GPIOs), and/or universal asynchronous receiver-transmitter (UART). The interfaces allow additional external sensors to be connected to the sensor assembly 102 in order to supplement and/or extend the functionality of the sensor assembly 102. Additional sensors that could be modularly connected to the sensor assembly 102 via the interfaces could include, for example, motion sensors (e.g., Doppler radar sensors), EMI sensors configured to detect the transients caused in the line voltage of the power source to which the sensor assembly 102 is directly connected, a lidar sensor, an ultrasonic sensor, and/or an active noise management system. - The
sensors 110 of the sensor assembly 102 can include passive sensors and/or active sensors. A passive sensor is a sensor that simply detects or senses various physical or natural phenomena of an environment. Examples of such passive sensors are described above and may include, without limitation, vibration sensors, microphones, EMI sensors, infrared radiation sensors, acoustic sensors, temperature sensors, humidity sensors, cameras, motion sensors (e.g., accelerometers, gyroscopes, etc.), electric field sensors, chemical sensors, photo sensors, or the like. An active sensor is a sensor that measures signals transmitted by the sensor that were reflected, refracted, or scattered by an object of the environment and/or disturbances caused by the transmitted signals in the environment. Examples of such active sensors include, without limitation, sonar sensors (e.g., Doppler sensors), ultrasonic sensors, radar sensors, lidar sensors, acoustic sensors, infrared cameras, active IR sensors, indoor positioning systems, x-ray sensors, seismic sensors, active sound measurement systems, light emitting systems, or the like. In an aspect, an output device (described below) of a sensor assembly 102 may be configured to transmit a signal that may be reflected, refracted, or scattered by an object of the environment and/or may cause disturbances in the environment, where such reflection, refraction, scattering, and/or disturbance is subsequently sensed by a sensor 110 of the sensor assembly 102, thereby forming an active sensor assembly 102 without an actual active sensor 110. The data from such active sensors could be featurized and used to detect events/conditions by the first or second (or higher) order virtual sensors. The active sensors could also be used to calibrate a space in which the sensor assembly 102 is located, as described further below.
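One such calibration signal, described further below, is a frequency sweep tone emitted by a speaker and picked up by a microphone. It can be sketched as a linear chirp; the frequency bounds, duration, and sample rate here are illustrative assumptions:

```python
# Illustrative generator for an active calibration tone: a linear
# frequency sweep (chirp) from a configurable low frequency to a
# high one, as discrete samples a speaker could emit and a nearby
# microphone could detect. Parameters are assumptions.
import math

def chirp(f_start_hz, f_end_hz, duration_s, sample_rate_hz):
    """Return amplitude samples in [-1, 1] for a linear sweep."""
    n = int(duration_s * sample_rate_hz)
    samples = []
    for i in range(n):
        t = i / sample_rate_hz
        # phase of a linear chirp: 2*pi*(f0*t + (f1 - f0)*t^2/(2*T))
        phase = 2 * math.pi * (
            f_start_hz * t
            + (f_end_hz - f_start_hz) * t * t / (2 * duration_s)
        )
        samples.append(math.sin(phase))
    return samples

tone = chirp(100.0, 8000.0, duration_s=0.01, sample_rate_hz=44100)
```

Comparing the emitted sweep with the received echo across frequencies is one way an assembly could characterize nearby reflecting surfaces and compensate for echoes, as described in the surrounding text.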
Furthermore, an active sensor can be used for authentication of an object and/or a person, as described below. - In aspects wherein the
sensor assembly 102 includes an active sensor, such as an acoustic sensor (e.g., a microphone), the acoustic sensor can be utilized via an active sound management system that transmits a sound signal and receives the reflected, refracted, and/or scattered signal to determine, for example, the sensor assembly's 102 position relative to walls or other structures within its vicinity and calibrate the acoustic sensor (and/or other sensors 110) accordingly. Such calibrations can be utilized to, for example, compensate for echoes or other audio artifacts that could interfere with the detection of certain events. The audio artifacts can be compensated for by, for example, signal processing techniques executed onboard the sensor assembly 102 to reduce errors. In still other aspects wherein the sensor assembly 102 includes an ultrasonic sensor, the ultrasonic sensor can be utilized to emit sound waves in order to calibrate other sensor assemblies 102 that are within the detection distance. Such audio signals can be utilized to pair sensor assemblies 102 together and/or allow the sensing system 100 to determine the spatial orientation of the various sensor assemblies 102 relative to each other within an environment. In yet another aspect, wherein the sensor assembly includes a speaker, the sensor assembly can output a particular sound pattern (e.g., a frequency sweep tone from configurable low frequency values to high frequency values) and have either the microphone on the same sensor assembly 102, or a different sensor assembly 102 in the vicinity, detect the audio signal using a microphone sensor to actively measure and calibrate for the environment. - In one aspect, the sampling rate of each of the sensor assembly's 102
sensors 110 can be automatically varied according to the sensor 110 type or the property or phenomena being sensed. For example, the vibration sensor 138 could have a high sample rate and the temperature sensor 136 could have a low sample rate (because temperature generally changes relatively slowly). Varying the sampling rate according to the property being sensed by the sensors 110 allows data to be collected at the rate needed to capture environmental events, without unnecessary fidelity and the accompanying processing and transmission requirements. In an example, the temperature sensor 136, humidity sensor, ambient pressure sensor, light color and/or light intensity sensors 132, magnetic field sensor 134, electronic device sensor 140, infrared radiation sensor 130, and motion sensor 142 are each sampled at, for example, about 8 Hz to about 12 Hz, and preferably at about 9 Hz, 10 Hz, or 11 Hz; the vibration sensor 138 is sampled at, for example, about 3 kHz to about 5 kHz, and preferably at about 3.8 kHz, 3.9 kHz, 4 kHz, 4.1 kHz, or 4.2 kHz (e.g., each axis of a three-axis accelerometer is sampled at, for example, 3.8 kHz, 3.9 kHz, 4 kHz, 4.1 kHz, or 4.2 kHz); the acoustic sensor 144 is sampled at, for example, about 15 kHz to about 19 kHz, about 16 kHz to about 18 kHz, or at about 17 kHz; and the EMI sensor 146 is sampled at, for example, about 250 kHz to about 750 kHz, about 490 kHz to about 510 kHz, about 495 kHz to about 505 kHz, or at about 500 kHz. It should be noted that when accelerometers are sampled at high speed, they can detect minute oscillatory vibrations propagating through structural elements in an environment (e.g., drywall, studs, and joists). -
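The per-sensor sampling rates in the example above can be summarized as a small configuration table. The following sketch is illustrative only (the channel names and the helper function are assumptions, not part of the disclosure); it uses the mid-range example rates from the text:

```python
# Nominal per-channel sampling rates (Hz), following the example values above.
# Channel names are hypothetical, not taken from any actual firmware.
SAMPLE_RATES_HZ = {
    "temperature": 10,
    "humidity": 10,
    "pressure": 10,
    "light_color": 10,
    "magnetic_field": 10,
    "electronic_device": 10,
    "infrared": 10,
    "motion": 10,
    "vibration": 4_000,    # per accelerometer axis
    "acoustic": 17_000,
    "emi": 500_000,
}

def sample_period_ms(channel: str) -> float:
    """Return the sampling period in milliseconds for a sensor channel."""
    return 1000.0 / SAMPLE_RATES_HZ[channel]
```

For instance, `sample_period_ms("vibration")` yields 0.25 ms, while the slowly changing temperature channel is read only once every 100 ms.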
FIG. 7 depicts a timeline 200 illustrating various illustrative data sampling timescales for a variety of sensors and the events detectable by various sensors 110 at those sampling rates. In this example illustration, the vibration sensor 138, acoustic sensor 144, and EMI sensor 146 can sample on timescales on the order of milliseconds to minutes; the light color sensor 132 can sample on the order of seconds to days; the illumination sensor 132 can sample on the order of seconds to months; the motion sensor 142 and the infrared sensor 130 can sample on the order of seconds to weeks; the electronic device sensor 140 can sample on the order of minutes; and the temperature sensor 136, the ambient pressure sensor, the humidity sensor, and the magnetic field sensor 134 can sample on the order of minutes to months. Sampling this array of sensors 110 at these rates allows the sensor assembly 102 to detect events 202 ranging from EMI spikes (e.g., from a microwave) via the EMI sensor 146, door knocks via the vibration and acoustic sensors, lighting changes via the light sensor 132, seasonal changes via a variety of sensors, and/or other events that occur on the order of months. As depicted, a variety of other events such as tools running, light usage, and appliance usage can be detected at timescales between these extremes. Further, the data stream of each sensor 110 can be buffered (e.g., a rolling 256-point buffer) in case communication is lost with the computer system 104 or the sensor assembly 102 is otherwise unable to transmit data for a period of time. This buffered data can be stored on the internal memory of the microprocessor on the sensor assembly 102 or an external memory module connected to the sensor assembly 102. The sensor assembly 102 can be configured to resume sending the buffered data when communication resumes and can overwrite older sensor data to keep only the most recent samples if the memory space runs out. - Referring back to
FIG. 5, the sensor assemblies 102 communicate with a computer system 104, such as a server, server system, or cloud-based computing architecture, that provides the back-end computational analysis for the sensing system 100. Data can be processed at multiple stages in the sensing system 100, such as onboard the sensor assemblies 102, at the computer system 104, and/or at another computer system (e.g., a gateway that is part of the computer system 104). For example, in the aspects depicted in FIGS. 1A, 1B, 3A, and 3B, the sensor assembly 102 performs onboard featurization of the sensor 110 data and the computer system 104 processes the data through a machine learning model. In some aspects, the sensing system 100 can be a distributed computing system, wherein computational resources can be dynamically shifted between the sensor assembly 102 and distributed computers/servers of the computer system 104. In one aspect, a central computer/server in the computer system 104 can control the amount of computational resources spent by each node in the distributed computing system 104. For example, the central computer/server can offload computational work (e.g., data featurization) to the sensor assembly 102 and/or an intermediate gateway if other servers/computers in the computer system 104 begin to slow or become overtaxed, and vice versa. - The
computer system 104 can be accessible via a client 106, such as a personal computer, laptop, or mobile device, through a console user interface or a graphical user interface (GUI), such as a web browser or mobile app. When the client 106 connects to the computer system 104, the computer system 104 can permit the client to access the data from the sensor assemblies 102. In one aspect, the client 106 may only access the data registered to the user account through which the client 106 has accessed the computer system 104 or otherwise allow the user of the client 106 to access the data from the sensor assemblies 102 associated with the user. In one aspect, a user can visualize the featurized data transmitted from the sensor assembly 102 to the computer system 104 through the GUI. The GUI can provide spectrograms, line charts, and other graphical and/or numerical formats for viewing the received data. Further, sensor streams could be separated into time and frequency domain components. In one aspect, the GUI can be customized to visualize only a subset of the featurized sensor streams, as desired by the user. For example, FIGS. 7-13 depict various graphical formats in which the featurized data can be presented via a GUI on a client to a user. In another aspect, the GUI can be configured to automatically provide an alert when the data from one or more sensor channels exceeds a particular threshold, a particular event or condition is detected from the received data, and/or other rules programmed or otherwise specified by the user are satisfied. Further, the threshold(s) and/or other rules for providing an alert can be configurable via the GUI. 
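A threshold-based alert rule of the kind configurable via such a GUI could be sketched as follows. The rule structure, channel names, and threshold values below are hypothetical illustrations, not part of the disclosure:

```python
# Hypothetical user-configured alert rules: each rule names a sensor channel
# and a limit, and an alert fires when the latest reading crosses that limit.
ALERT_RULES = [
    {"channel": "temperature", "above": 40.0},  # e.g., possible overheating
    {"channel": "humidity", "above": 90.0},     # e.g., possible water leak
]

def check_alerts(reading: dict) -> list:
    """Return the channels whose configured thresholds are exceeded.
    `reading` maps a channel name to its latest featurized value."""
    fired = []
    for rule in ALERT_RULES:
        value = reading.get(rule["channel"])
        if value is not None and value > rule["above"]:
            fired.append(rule["channel"])
    return fired
```

A richer implementation would also cover event/condition rules produced by the virtual sensors, but the thresholding shown here is the simplest case the text describes.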
In another aspect, the GUI can allow users to enable and disable particular sensor 110 streams from the sensor assembly 102 (i.e., cause the sensor assembly 102 to deactivate or stop sampling the particular sensor(s) 110), modify sampling frequencies of the sensor(s) 110, allow the user to permit other users to access the sensor data associated with his or her user account, and configure other features associated with the back-end computer system 104. In yet another aspect, the interface can be configured to control whether some or all of the data featurization occurs onboard the sensor assembly 102, at an intermediate gateway between the sensor assembly 102 and the computer system 104, and/or at the computer system 104. In another aspect, the interface can be configured to assist a user in providing and/or verifying labels for raw data and/or featurized data, as discussed above. - In some aspects, the
computer system 104 can further implement a management module that allows firmware and/or software updates to be transmitted to the sensor assemblies 102. This management module can be controlled via an interface of the client 106, e.g., the GUI. Further, the management module of the interface could allow custom code to be deployed at each sensor assembly 102, as desired by the user. Still further, the management module could collect and store telemetry information, such as uptime of the sensors 110 and/or sensor assembly 102, data rates for the sensors 110, reboots of the sensor assembly 102, and so on. The management module could further allow users to adjust the sampling rates of the sensors 110 of the sensor assemblies 102 (e.g., on a sensor-by-sensor basis and/or on a categorical basis across all of the sensor assemblies 102). Still further, the management module can instruct the sensor assembly as to which features should be extracted for a particular sensor 110 and at what rate. - In order to assist in the understanding of the presently described sensing system, illustrative implementations of the sensing system will now be described. The following examples are intended for representative purposes only and should not be interpreted as limiting in any way.
-
FIG. 8 illustrates a first sensor data graphical display 210 annotated with events detected by the sensing system 100, in accordance with at least one aspect of the present disclosure. In one aspect, the sensor assembly 102 includes a motion sensor 142, a temperature sensor 136, a humidity sensor, an electronic device sensor 140, an EMI sensor 146, and an ambient light color sensor 132. The first sensor data graphical display 210 depicts the sensor data captured by a sensor assembly 102 placed within a studio apartment over the course of a 24-hour period. Based on the depicted sensor data, a variety of virtual sensors 116 could be trained to detect events correlated with the data captured by the sensor assembly 102. - For example, a
virtual sensor 116 could be trained to detect when a person is awake according to a combination of the motion data 212 and the ambient light color data 222, which detect the movement of the occupant and the occupant turning on a lamp, respectively. As another example, a virtual sensor 116 could be trained to detect when the occupant is showering according to the humidity data 216. As another example, a virtual sensor 116 could be trained to detect when the occupant is streaming TV according to variations in the ambient light color data 222 and the electronic device data 218, which in this case is a Wi-Fi sensor configured to detect when electronic devices are being utilized according to the Received Signal Strength Indicator (RSSI) of the Wi-Fi. As yet another example, a virtual sensor 116 could be trained to detect when the occupant has come home according to a combination of the motion data 212 and/or the temperature data 214 (wherein the rising temperature could result from the occupant increasing the thermostat when he or she comes home). As still yet another example, a virtual sensor 116 could be trained to detect when the microwave was being utilized according to the EMI data 220, which can detect the EMI spike from the microwave being activated. -
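As a toy illustration of how a first order virtual sensor might combine two featurized streams such as the motion data 212 and ambient light color data 222, the sketch below uses a hand-written AND rule standing in for the trained machine learning model described in the disclosure; the function name and thresholds are assumptions:

```python
# Illustrative "occupant awake" first-order virtual sensor: a binary detector
# combining a featurized motion level and a lamp-light level. A real system
# would learn this decision boundary; the fixed thresholds here are assumed.
def occupant_awake(motion_level: float, lamp_light_level: float,
                   motion_threshold: float = 0.2,
                   light_threshold: float = 0.5) -> bool:
    """Return True when both occupant movement and lamp light are detected."""
    return motion_level > motion_threshold and lamp_light_level > light_threshold
```

The same shape (featurized inputs in, binary answer out) applies to the showering, streaming-TV, and microwave examples, each keyed to its own data streams.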
FIG. 9 illustrates a second sensor data graphical display 230 annotated with events detected by the sensing system 100, in accordance with at least one aspect of the present disclosure. In one aspect, the sensor assembly 102 includes a motion sensor 142, a temperature sensor 136, a humidity sensor, an ambient pressure sensor, an electronic device sensor 140, an EMI sensor 146, and an ambient light color/illumination sensor 132. The second sensor data graphical display 230 depicts the sensor data captured by a sensor assembly 102 placed within an apartment over the course of a 72-hour period. Based on the depicted sensor data, a variety of virtual sensors 116 could be trained to detect events correlated with the data captured by the sensor assembly 102. - For example, a
virtual sensor 116 could be trained to detect day and night cycles according to the ambient light color data 244 and ambient light illumination data 246. As another example, a virtual sensor 116 could be trained to detect when the occupant is present and active within the apartment according to some combination of the motion data 232, the ambient light color data 244 (because the ambient light color sensor 132 can detect when a lamp and the kitchen lights are on), and/or the temperature data 234 (because the motion data 232 correlates to a temperature increase). As another example, a virtual sensor 116 could be trained to detect when the microwave was being utilized according to the EMI data 220. As yet another example, a virtual sensor 116 could be trained to detect when the occupant is streaming TV according to variations in the ambient light color data 222 and the electronic device data 218. The humidity data 236 and the ambient pressure data 238 could be utilized to detect additional longer-term environmental changes, such as the weather. -
FIGS. 10-11 illustrate a third sensor data graphical display 250 and a fourth sensor data graphical display 270 annotated with events detected by the sensing system 100, in accordance with at least one aspect of the present disclosure. In one aspect, the sensor assembly 102 includes a temperature sensor 136, a humidity sensor, a magnetic field sensor, and an ambient light color/illumination sensor 132. The third sensor data graphical display 250 and the fourth sensor data graphical display 270 each depict the sensor data captured by a sensor assembly 102 placed within a garage over the course of approximately a 24-hour period. Based on the depicted sensor data, a variety of virtual sensors 116 could be trained to detect events correlated with the data captured by the sensor assembly 102. - For example, a
virtual sensor 116 could be trained to detect rain, as depicted in FIG. 10, according to the temperature data 252 and/or the humidity data 254 due to the fact that rain is correlated with a drop in the temperature and an increase in the humidity. As another example, a virtual sensor 116 could be trained to detect nighttime according to the ambient light color data 256 (which can detect the light from the street lights, as depicted in FIG. 10) and/or the ambient light illumination data 258. As yet another example, a virtual sensor 116 could be trained to detect when the garage door opens, as depicted in FIG. 11, according to the temperature data 252 and the humidity data 254, which drop and rise, respectively, when the garage door is opened during the winter. The magnetic field data 255 could be utilized to detect additional events or parameters, such as environmental changes or seasonal changes, as discussed above in the context of other examples. - It should be noted that
sensor assemblies 102 with different combinations or arrangements of sensors 110 can be utilized for different applications or locations. For example, the sensor assemblies 102 described in connection with FIGS. 8-9 have a more expansive suite of sensors 110 because they are intended to be utilized in a domicile to track a wide array of behaviors and activities. Conversely, a sensor assembly 102 intended to be utilized in a location where there is less activity or less data that needs to be tracked (e.g., in a garage, as with FIGS. 10-11) could have a more minimal suite of sensors 110. -
FIG. 12 illustrates a fifth sensor data graphical display 290 annotated with events detected by the sensing system 100, in accordance with at least one aspect of the present disclosure. In one aspect, the sensor assembly 102 includes an acceleration sensor, an acoustic sensor 144, a temperature sensor 136, a humidity sensor, an ambient pressure sensor, a magnetic field sensor 134, and an ambient light color/illumination sensor 132. The fifth sensor data graphical display 290 depicts the sensor data captured by a sensor assembly 102 placed within an automobile over the course of a trip. Based on the depicted sensor data, a variety of virtual sensors 116 could be trained to detect events correlated with the data captured by the sensor assembly 102. This particular example showcases how the sensor assembly 102 can be utilized in a mobile setting, e.g., an automobile, to detect a variety of events by training different types of virtual sensors 116. This example also illustrates a sensor assembly 102 attached or coupled to an object (i.e., the automobile) being sensed. Thus, the sensing system 100 in this example represents a direct sensing system with respect to the object to which the sensor assembly 102 is attached (i.e., the automobile) and an indirect sensing system with respect to the environment in which the sensor assembly 102 is located (i.e., the interior of the automobile). - For example, a
virtual sensor 116 could be trained to detect when the automobile is approaching a highway according to the acceleration data 292, which indicates that the automobile has been gradually accelerating for an extended period of time. As another example, a virtual sensor 116 could be trained to detect when a window has been lowered according to some combination of the acoustic data 294 (which detects an increase in the amount of noise within the automobile), the temperature data 296 (which detects a temperature drop), the ambient humidity data 298 (which detects a humidity increase), and/or the ambient pressure data 300 (which detects a pressure drop). Further, a virtual sensor 116 could likewise be trained to detect when the window has been closed according to these same data streams. Additionally, a second order virtual sensor 124 could be trained from first order virtual sensors 120 to track the state of the automobile window. Instead of outputting a binary output as with the first order virtual sensors 120 (e.g., "Is the window closed? Yes or no?" or "Is the window open? Yes or no?"), the second order virtual sensor 124 could be trained from the outputs of the first order virtual sensors 120 to provide a nonbinary output directed to the window's state (e.g., "Is the window open, being opened, partially opened, closed, or being closed?") based on this data. As another example, a virtual sensor 116 could be trained to detect the heading of the vehicle according to the magnetic field data 302. Similarly to the above example, the magnetic field data 302 could train a number of first order virtual sensors 120 (e.g., "Is the vehicle heading north?" or "Is the vehicle heading west?") and a second order virtual sensor 124 could be trained from the output of the first order virtual sensors 120 to provide a nonbinary output directed to the vehicle's state (e.g., "What direction is the vehicle heading in?").
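The heading example above can be sketched as a minimal second order virtual sensor: each first order virtual sensor answers one binary question with some confidence, and the second order output is the single nonbinary heading. The dictionary shape and function name are illustrative assumptions:

```python
# Hypothetical sketch: four first-order virtual sensors each output a binary
# confidence ("Is the vehicle heading north?"), and a second-order virtual
# sensor combines them into one nonbinary heading estimate.
def second_order_heading(first_order: dict) -> str:
    """`first_order` maps a compass direction to a confidence in [0, 1];
    the second-order output is the direction with the highest confidence."""
    return max(first_order, key=first_order.get)
```

In practice the second order sensor would itself be trained on the first order outputs rather than using a fixed argmax, but the input/output relationship is the same.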
As yet another example, a virtual sensor 116 could be trained to detect the degree of cloudiness according to the ambient light illumination data 306, which can indicate the number and length of the instances that the sun is obscured during the course of the vehicle's trip. The ambient light color data 304 could be utilized to detect additional events or parameters associated with the vehicle or the vehicle's environment, such as what time of day it is, as discussed above in the context of other examples, or whether the vehicle is proceeding through a tunnel. -
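One simple way to turn the illumination stream into a degree-of-cloudiness figure, as suggested above, is to measure what fraction of the trip's samples fall below a dim threshold (i.e., the sun is obscured). The threshold, units, and function name below are assumptions, not values from the disclosure:

```python
# Illustrative cloudiness estimate from ambient light illumination samples:
# the fraction of samples during the trip in which the reading is dim enough
# to suggest the sun is obscured. Threshold and scale are assumed.
def cloudiness(illumination_samples: list, dim_threshold: float = 0.4) -> float:
    """Return the fraction of samples with illumination below the threshold."""
    if not illumination_samples:
        return 0.0
    dim = sum(1 for s in illumination_samples if s < dim_threshold)
    return dim / len(illumination_samples)
```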
FIG. 13 illustrates a sixth sensor data graphical display 310 annotated with events detected by the sensing system 100, in accordance with at least one aspect of the present disclosure. In the sixth sensor data graphical display 310, a subset of the sensor data streams are depicted as featurized spectrograms. While the figure illustrates a spectrogram, other display methods, such as a user interface illustrating time domain data, frequency domain data, and/or both, are within the scope of this disclosure. In one aspect, the sensor assembly 102 includes a vibration sensor 138, an acoustic sensor 144, and an EMI sensor 146. It should be noted that in this aspect, the data stream from the vibration sensor 138 is broken down into X-, Y-, and Z-axis constituent parts, which can be provided by a three-axis accelerometer, for example. The sixth sensor data graphical display 310 depicts the sensor data captured by a sensor assembly 102 placed within a workshop. Based on the depicted sensor data, a variety of virtual sensors 116 could be trained to detect events correlated with the data captured by the sensor assembly 102. - For example, a variety of
virtual sensors 116 could be trained to detect when the faucet is running, a urinal has been flushed, a kettle has been put on the stove, and/or various tools are being utilized according to a combination of vibration data 312 and acoustic data 314. It should be noted that although certain events can be detected utilizing the same combinations of featurized data streams, they are nonetheless detectably discernible because the different events have different patterns or characteristics within the sensor data streams. The different patterns or characteristics exhibited in the data streams for the sensors 110 activated by each event can be utilized by the machine learning of the sensing system 100 to characterize that event to generate a virtual sensor 116 that can reliably identify future occurrences of the event. In these particular examples, a faucet running, a urinal flushing, an electric saw running, and the other annotated events each generate a unique signature in the vibration data 312 and/or the acoustic data 314 that can be characterized by the machine learning of the sensing system 100 to identify those events. - As another example, a
virtual sensor 116 could be trained to detect when the microwave door is opened or closed according to the acoustic data 314. Further, a virtual sensor 116 could be trained to detect when the microwave has completed a heating cycle according to the acoustic data 314 (by detecting the microwave's completion chime). Further, a virtual sensor 116 could be trained to detect when the microwave is running according to the EMI data 316. These virtual sensors 116 can represent first order virtual sensors 120 detecting binary properties of the microwave. The outputs of these first order virtual sensors 120 can be fed into a second order virtual sensor 124 trained to track the state of the microwave. FIG. 14, for example, illustrates a block diagram of a microwave second order virtual sensor 330 represented as a state machine, in accordance with at least one aspect of the present disclosure. The microwave second order virtual sensor 330 includes five states: available for use 332, door ajar 334, in-use 336, interrupted 338, and finished 340. The microwave second order virtual sensor 330 moves from the available state 332 to the door ajar state 334 and back again if acoustic data 314 indicates that the door has been opened and then closed. The microwave second order virtual sensor 330 moves from the available state 332 to the in-use state 336 when the EMI data 316 indicates that the microwave is running. The microwave second order virtual sensor 330 moves from the in-use state 336 to the interrupted state 338 when the acoustic data 314 indicates that the door has been opened. The microwave second order virtual sensor 330 moves from the interrupted state 338 back to the in-use state 336 when the EMI data 316 indicates that the microwave is once again running. The microwave second order virtual sensor 330 moves from the in-use state 336 to the finished state 340 when the acoustic data 314 indicates that the completion chime has sounded.
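The FIG. 14 state machine can be sketched directly in code. The event names standing in for the first order virtual sensor outputs ("emi_detected", "door_opened", etc.) are illustrative labels, not terms from the disclosure:

```python
class MicrowaveVirtualSensor:
    """Second-order virtual sensor for a microwave, modeled as the FIG. 14
    state machine. Inputs are events produced by first-order virtual sensors
    watching the acoustic data (door, chime) and EMI data (magnetron running)."""

    TRANSITIONS = {
        ("available", "door_opened"): "door_ajar",
        ("door_ajar", "door_closed"): "available",
        ("available", "emi_detected"): "in_use",
        ("in_use", "door_opened"): "interrupted",
        ("interrupted", "emi_detected"): "in_use",
        ("in_use", "chime_detected"): "finished",
        ("finished", "door_closed"): "available",
    }

    def __init__(self):
        self.state = "available"  # the nonbinary output is the current state

    def on_event(self, event: str) -> str:
        """Apply one first-order event; unknown pairs leave the state unchanged."""
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state
```

Feeding the sensor the sequence run, door-open, run, chime, door-close walks it through in-use, interrupted, in-use, finished, and back to available.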
The microwave second order virtual sensor 330 then moves from the finished state 340 to the available state 332 when the acoustic data 314 indicates that the microwave door has been closed (thereby indicating that the user has removed his or her food from the microwave). The state that the microwave second order virtual sensor 330 is in dictates its output. As can be seen from this example, a second order virtual sensor 124 can produce a nonbinary output by being trained on and fed a number of first order virtual sensors 120 that detect binary outputs, such as "Has the microwave door been closed?" or "Is the microwave running?". - Referring back to
FIG. 5, the sensing system 100 can implement, in various embodiments, end-to-end encryption between the sensor assembly 102 and the computer system 104 to ensure confidentiality and authenticity of the data. In one aspect, the sensor assemblies 102 and the computer system 104 mutually authenticate themselves using asymmetric keys. For example, the sensor assembly 102 can authenticate that it is talking to the correct computer system 104 (e.g., specified by a hostname/IP) and then encrypt the data that it is transmitting to the computer system 104 so that only the computer system 104 can decrypt it. Similarly, the computer system 104 can authenticate the sensor assembly 102 by the sensor assembly 102 adding its public signature to the computer system 104 and the sensor assembly 102 signing any data item it sends to the computer system 104 with its own associated private key so that the computer system 104 can verify its authenticity. As another example, the sensing system 100 can utilize asymmetric key cryptography to establish the communication channel between each sensor assembly 102 and the computer system 104 and then establish a symmetric key cryptographic channel thereafter. Since the sensor assemblies 102 initiate the outgoing transmission protocol (e.g., TCP or UDP) to connect to a known server, they can punch a hole through a network address translation (NAT) device or firewall and thus be deployed at homes with a single public IP address, as well as enterprises with each sensor assembly 102 having its own public address. All data communication between the sensor assembly 102 and the computer system 104 can occur over such a single, persistent, encrypted TCP socket. Further, each data packet transmitted by the sensor assembly 102 can contain a header denoting the sensor channel's payload, allowing the computer system 104 to demultiplex the source and type of sensed data.
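One possible wire format for such a self-describing packet is sketched below; the field layout (length-prefixed channel name, then a length-prefixed JSON payload) is purely an assumption for illustration, not the disclosed protocol:

```python
import json
import struct

def pack_sensor_frame(channel: str, samples: list) -> bytes:
    """Serialize one channel's samples with a header so the receiver can
    demultiplex the source and type of the sensed data.
    Illustrative layout: 1-byte channel-name length, the channel name,
    a 4-byte big-endian payload length, then a JSON payload."""
    name = channel.encode("utf-8")
    payload = json.dumps(samples).encode("utf-8")
    return bytes([len(name)]) + name + struct.pack(">I", len(payload)) + payload

def unpack_sensor_frame(frame: bytes):
    """Inverse of pack_sensor_frame: recover (channel, samples)."""
    n = frame[0]
    channel = frame[1:1 + n].decode("utf-8")
    (plen,) = struct.unpack(">I", frame[1 + n:5 + n])
    samples = json.loads(frame[5 + n:5 + n + plen].decode("utf-8"))
    return channel, samples
```

Because every frame names its channel, many sensor streams can share the single persistent TCP socket described above and still be separated on arrival.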
Further, the sensor assembly 102 can implement appropriate serializing and deserializing routines to package the data into chunks for transmission via, e.g., a Wi-Fi connection. Finally, data send routines are executed asynchronously by the sensor assembly 102 so that sensor data reading, featurization, and transmission can proceed independently. - In one aspect, the
sensor assemblies 102 can transmit or stream the sensor data to a local computer or a computer external to the computer system 104. In one aspect, the local computer can include a client 106 that is capable of executing the interface for visualizing the data from the sensor assemblies 102 and/or controlling the functions of the sensor assemblies 102, as described above. In another aspect, the local computer can be executing the machine learning module 116, as described above. The local computer to which the data is streamed can be, for example, behind a system configured to perform network address translation (NAT), common in residential settings with a single public IP address shared by many computers. In this way, the data streams from the sensor assemblies 102 do not necessarily need to be transmitted all the way to the computer system 104 and then back to the interface for visualization on the client 106 and/or the machine learning module 116 for processing of the featurized data. In one aspect, the computer system 104 can control whether the sensor assemblies 102 are streaming data to a local computer according to whether there is a substantial distance between the sensor assemblies 102 and the computer system 104, whether the communication roundtrip time exceeds a particular threshold, or whether the available bandwidth falls below a particular threshold. In another aspect, a user can control whether the data from the sensor assemblies 102 is streamed to a local computer via the interface described above. In one aspect, the sensor assemblies 102 could also be programmed to automatically locate the nearest server (e.g., of the computer system 104) to which to stream their data based on a variety of metrics, such as distance, roundtrip communication time, and/or bandwidth. - Users can access or log into the
computer system 104 via a client 106 to view the data from the sensor assemblies 102 associated with their user account, modify features or settings of the sensor assemblies 102, and update their security preferences. In one aspect, users can selectively permit other users to view, access, and/or modify their associated sensor assemblies 102. This could permit, for example, all family members to view the data and events detected by the sensor assemblies 102 within the family's home. The permissions provided to the invited users can be controlled from a master account, for example. - In various aspects, in order to properly capture, model, and classify events in an environment, sensor streams from the
sensors 110 within a single sensor assembly 102 and the sensors 110 across multiple sensor assemblies 102 within the environment are preferably temporally correlated and/or synchronized. For example, a "door closing" event typically causes a synchronous increase in air pressure and a structural vibration, which could be detected across a number of sensor assemblies 102 located throughout the building. This co-occurrence of signals that are detectable with different sensors 110 and across different sensor assemblies 102 is what enables the virtual sensors 116 to robustly detect and classify events, such as a door closing in the above example. Even a minor temporal decorrelation between sensors 110 and/or sensor assemblies 102 could reduce the segmentation confidence of the classifier (i.e., virtual sensor 116). In one aspect, the sensing system 100 can be configured to temporally synchronize or correlate the data streams from sensors 110 both within a single sensor assembly 102 and across multiple sensor assemblies 102 to allow events detected by different data streams to be temporally associated together. - In one aspect, the
computer system 104 utilizes a Network Time Protocol (NTP) to synchronize its own clock periodically, which is then used to keep all of the clocks 127 of all of the sensor assemblies 102 connected to the computer system 104 in synchronization with the computer system 104. Each sensor assembly 102 can include, for example, a quartz clock to keep track of time between these time synchronizations to minimize any clock drift between the different sensor assemblies 102. The sensor assembly 102 timestamps all sensor data with the system epoch time from its synchronized clock 127 to, e.g., millisecond granularity. Further, synchronizing the sensor assemblies 102 and timestamping all sensor data addresses any reordering problems from the sensors 110 being sampled asynchronously and any processing or transmission delays before the data packet reaches the computer system 104. - In one aspect, all data for each sensor stream is continuously buffered in an onboard buffer (e.g., a buffer onboard the sensor assembly 102), with each sensor reading timestamped according to the
clock 127 synchronized across the sensing system 100 network. If communication between the sensor assembly 102 and the computer system 104 is lost or congested, the sensor assembly 102 can continually attempt to re-establish a communication channel with the computer system 104. The system firmware 129A of the sensor assembly 102 can be configured to periodically check for Wi-Fi connectivity and an active connection to the computer system 104 (or a server or other computer thereof). If the system firmware 129A determines that the communication channel is lost or congested, the sensor assembly 102 can execute an exponential back-off algorithm to periodically attempt to reconnect to the communications channel (e.g., Wi-Fi network). If the sensor assembly 102 is unable to reconnect to the computer system 104 for a particular length of time (e.g., one hour), the system firmware 129A can be configured to reboot and then once again attempt to reconnect to the computer system 104. Upon the sensor assembly 102 reconnecting to the computer system 104, the sensor assembly 102 can then upload all of the buffered, timestamped sensor data to the computer system 104, which can then reorganize the data with the data from other sensor assemblies 102 as necessary. - In one aspect, the
sensor assembly 102 comprises a software watchdog to monitor the status of all of the sensors 110. If any sensor 110 does not report new data within a configurable period (e.g., 60 seconds), the software watchdog executed by the system firmware 129A can automatically restart the sensor assembly 102. In another aspect, the sensor assembly 102 comprises a hardware watchdog that reboots the sensor assembly 102 if the application or the system firmware 129A fails to respond in a timely manner (e.g., within one minute). After reset, the sensor assembly 102 re-initializes all the sensors 110 and resumes operation. - In one aspect, the
sensor assembly 102 can further include one or more output devices, such as a light emitting device assembly including one or more LEDs, speakers, vibration motors, displays, and other audio, visual, and/or haptic indicators (not shown here). The output devices can be utilized to provide alerts or feedback to the user when various events occur. For example, the sensor assembly 102 can cause an LED assembly to illuminate or flash, a vibration motor to vibrate, a speaker to emit an audible alert, or the like when, for example, a particular event has been detected. The computer system 104 can detect the occurrence of the event via an appropriate virtual sensor 118 and then transmit a command or signal to the sensor assembly 102 to cause the corresponding output device to provide feedback, as previously described. Such audible or visual feedback can be utilized to provide notifications to hearing- or vision-impaired individuals that an event has occurred (e.g., a kettle is boiling) or otherwise alert users that an event that the user may wish to be aware of has occurred. These alerts can be configured via the interface, for example. Users can thus access the computer system 104 via a client 106 and then program or configure the desired alerts in accordance with virtual sensors 118 and/or the trigger action rules described above. In yet another aspect, the sensor assembly 102 can include a wireless communication circuit (e.g., a Bluetooth LE transmitter, WiFi, Zigbee, Z-Wave) for transmitting alerts (e.g., push notifications) to the user (e.g., the client 106) when a selected event has been detected. - In aspects where the
sensor assembly 102 includes one or more output devices, the output devices can also be utilized to confirm or authenticate the identity and/or location of particular sensor assemblies 102. For example, a user visualizing the data streams from a number of sensor assemblies 102 within a sensing system 100 can (e.g., via a client device 106) cause the output device of a particular sensor assembly 102 to begin emitting an alert so that the user can confirm the identity and location of the sensor assemblies 102. For example, a user could select “sensor assembly #3” and cause it to emit an alert. Thereafter, the computer system 104 can transmit a command to the sensor assembly 102 corresponding in identity to “sensor assembly #3” to cause it to emit a sequence of flashes by the light source or a sequence of beeps from the speaker. The user can then enter the sequence into the client device 106 to authenticate the client device 106 to the sensor assembly 102. In various aspects, the output may also be provided to a user mobile device. - In aspects wherein the
sensor assembly 102 includes a wireless communication circuit (e.g., Wi-Fi, Bluetooth, radio frequency identification, or the like), the sensing system 100 can authenticate a user based upon their mobile electronic devices (e.g., smart phones, wearables, and other such devices). For example, the sensing system 100 can utilize the wireless communication circuit to determine whether human activity detected by the sensor assemblies 102 is being performed by a user of interest, based upon whether the wireless communication circuit is able to detect the mobile electronic devices of the user or other authorized individuals present within the vicinity of the relevant sensor assemblies 102. The mobile electronic devices of the user and/or other authorized individuals can be, for example, pre-registered with the sensing system 100 or pre-paired with the sensor assemblies 102. Various applications and/or trigger rules could then be programmed to send alerts to an authorized user and/or take other actions if human activity detected by the sensing system 100 at certain locations, at certain times, or according to other such constraints, is not being performed by the users of interest. Such aspects could be utilized to, for example, detect when an unauthorized individual is in the user's home at night or while the user is at work. Such aspects could also be utilized to, for example, confirm the identity of an individual making changes to the configurations or settings of the sensing system 100 via a client 106 according to their proximity to a sensor assembly 102 of the sensing system 100. - In addition to utilizing machine learning to train
virtual sensors 118 to automatically characterize and detect various events within the detection range of each sensor assembly 102, a client 106 can also be utilized to access the computer system 104 to define conditions (rules) and associated actions to take in response to those conditions via an interface. In other words, users can program explicit actions for the sensor assembly 102 and/or computer system 104 to take when certain conditions have been satisfied. For example, a user could define a condition where, if the motion sensor 142 is triggered and the time is after midnight, then a text message is sent to a particular cellular number. The trigger action rules can be defined with multiple conditions (triggers) and multiple actions. Triggers may capture specific conditions on sensor values (e.g., temperature>20° C.) or demonstrated behaviors detected by a virtual sensor 118 (e.g., window is open). Both triggers and actions can refer either to specific devices (e.g., temperature on sensor assembly #25) or to locations (e.g., kitchen). For instances where a trigger and/or action is specified to a location, the computer system 104 can compute the average value of all of the sensor assemblies 102 from a given sensor channel (e.g., temperature) for that specific location (e.g., kitchen). - In one aspect, the
virtual sensors 118 can output data in a format that is consistent with one or more known API architectures, such as Representational State Transfer (REST) or publish-subscribe APIs. Such output formats can utilize appropriate authentication, access control primitives, and other controls. Outputting data from the virtual sensors 118 in accordance with a known API can allow apps to seamlessly make use of the wide variety of data (whether it be raw sensor data, featurized sensor data, or higher order inferences) generated by the sensing system 100. -
FIG. 16 illustrates a logic flow diagram of a process 600 of detecting events via virtual sensors 118, in accordance with at least one aspect of the present disclosure. Various additional details regarding the steps of the process 600 are described in more detail above. At step 601, the first and/or higher order virtual sensors 118 of the sensing system 100 are trained. As mentioned herein, training of the virtual sensors 118 may involve training the virtual sensors 118 with annotated training examples. Next, in operation, at step 602 the various sensors 110 of the sensor assembly 102 sense physical phenomena in the environment of the sensor assembly 102. Next, at step 603, features from the raw sensor data are extracted, as described above. The featurization of the sensor data can be performed, for example, by the sensor assembly 102 (e.g., the microcontroller 121 thereof executing the featurization module 112), by the computer system 104 (e.g., by a programmed server thereof executing the featurization module 112), or by both components in a distributed manner. As described above, the sensor assembly 102 could transmit the featurized data in encrypted, periodic data packets, where the data packets might have concatenated sensor data from multiple sensors 110. Next, at step 604, one or more first order virtual sensors 120 may detect occurrences of the events that they were trained at step 601 to detect based on the featurized raw sensor data. For aspects of the sensing system 100 with higher order virtual sensors (e.g., 2nd, 3rd, . . . , Nth order virtual sensors), at step 605, the higher order virtual sensors can detect the events, conditions, durations, etc. that they are trained to detect. Next, at step 606, the back-end server system 104 can transmit data about the detections by the virtual sensors to a remote system via a data communication network over wired and/or wireless communication links.
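By way of illustration, the flow of steps 602 through 605 can be sketched as a small pipeline. The featurizer and detector functions below are hypothetical stand-ins for the featurization module 112 and the trained virtual sensors, not the actual models contemplated by the disclosure.

```python
def run_pipeline(raw_data, featurize, first_order, higher_order):
    """Mirror steps 603-605 of process 600: featurize the raw sensor
    data, run each first order virtual sensor on the features, then
    feed the first order outputs to the higher order virtual sensors."""
    features = featurize(raw_data)                                            # step 603
    first = {name: detect(features) for name, detect in first_order.items()}  # step 604
    higher = {name: detect(first) for name, detect in higher_order.items()}   # step 605
    return first, higher


# Illustrative use with trivial stand-in detectors:
first, higher = run_pipeline(
    raw_data=[0.1, 0.9, 0.2],
    featurize=lambda xs: {"max": max(xs)},
    first_order={"faucet_running": lambda f: f["max"] > 0.5},
    higher_order={"water_wasted": lambda outputs: outputs["faucet_running"]},
)
```

In a real deployment, the first and higher order entries would be trained classifiers rather than thresholds, and step 606 would forward `first` and `higher` to the remote system.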
For example, as explained above, the computer system 104 could transmit detection of virtual events to the sensor assembly 102 so that the sensor assembly could trigger an output device (e.g., a light source or speaker) or transmit a notification to a user device (e.g., a user's smartphone, laptop, tablet, etc.). Also, the computer system 104 could transmit a notification directly to the user device or to another networked, computer-based system, such as an alarm or emergency response system, an ordering system, a log, a monitoring system, etc. - While the
process 600 is described for the sake of convenience and not with an intent of limiting the disclosure as comprising a series and/or a number of steps, it is to be understood that the process does not need to be performed as a series of steps and/or the steps do not need to be performed in the order shown and described with respect to FIG. 16 ; rather, the process may be integrated and/or one or more steps may be performed together, or the steps may be performed in the order disclosed or in an alternate order. For example, higher-order virtual sensors could be trained (e.g., step 601) after lower-order virtual sensors start detecting their associated events, conditions, durations, etc. In that connection, the times for performance of the steps illustrated in FIG. 16 are not necessarily discrete, but instead can be ongoing and continuous. That is, for example, the training of the virtual sensors 118 may be ongoing. Similarly, steps 602-606 can be performed continuously. - A computing or
data processing system 1700 suitable for storing and/or executing program code may take many forms and in one embodiment may include at least one processor 1702, which may be or be part of a controller, coupled directly or indirectly to memory devices or elements through a system bus, as shown in FIG. 17 . Computing system 1700 in FIG. 17 is shown with a processor 1702, random access memory (RAM) 1703, nonvolatile memory 1704, device specific circuits 1701, and I/O interface 1705. Alternatively, the RAM 1703 and/or nonvolatile memory 1704 may be contained in the processor 1702, as could the device specific circuits 1701 and I/O interface 1705. The processor 1702 may comprise, for example, an off-the-shelf microprocessor, custom processor, FPGA, ASIC, discrete logic, etc., or generally any device for executing instructions. The RAM 1703 is typically used to hold variable data, stack data, executable instructions, etc., and may include dynamic random access memory (DRAM). Such a computing system 1700 may be used as one of the servers of the computer system 104, as a user device (e.g., mobile device), a remote computing system that receives notifications from the virtual sensors 118, etc. - According to various approaches, the
nonvolatile memory 1704 may comprise any type of nonvolatile memory such as, but not limited to, electrically erasable programmable read only memory (EEPROM), flash programmable read only memory (PROM), battery backup RAM, hard disk drives, etc. The nonvolatile memory 1704 is typically used to hold the executable firmware and any nonvolatile data containing programming instructions that can be executed to cause the processor 1702 to perform certain functions. - In some embodiments, the I/O interface 1705 may include a communication interface that allows the processor 1702 to communicate with devices external to the controller. Examples of the communication interface may comprise, but are not limited to, serial interfaces such as RS-232, Universal Serial Bus (USB), Small Computer Systems Interface (SCSI), RS-422, or a wireless communication interface such as Bluetooth, near-field communication (NFC), or other wireless interfaces. The computing system 1700 may communicate with an external device via the communication interface 1705 in any communication protocol, such as Automation/Drive Interface (ADI). - The
sensing system 100 described herein can be utilized in a number of different contexts. For example, the sensing system 100 can be utilized to assist in monitoring patients and providing effective healthcare to patients. With a rapidly aging population, providing care at home for this population will become a necessity. A key capability that caregivers need is the ability to track the Activities of Daily Living (ADL) of people and to detect anomalies when these activities deviate from the norm for each individual. The sensing system 100 including virtual sensors 118, as described herein, can be used to provide a comprehensive system to track ADLs, such as bathroom usage, movement within the house, daily chores like cooking and eating, adherence to medications, and fall detection, all without intrusive instrumentation and sensing. In the future, virtual sensors that track various contexts within a person's home can also be used for predictive interventions (e.g., predicting an impending fall), and not just for reactive events, particularly by customizing them to each individual. - As another example, the
sensing system 100 can be utilized in industrial settings. In industrial settings with mechanical and electrical machinery, preventative maintenance and predictive analytics can be of great help in increasing equipment lifetime, as well as reducing downtime due to failures. Most older equipment does not have any sensing capability, and newer equipment may have limited, purpose-specific sensing. The sensing system 100 including virtual sensors 118, as described herein, can be utilized to learn different states of industrial equipment by training a variety of virtual sensors 118 and then building applications and notifications to help with usage tracking, tracking error conditions, scheduling maintenance operations automatically, and other such tasks. - As another example, the
sensing system 100 can be utilized in environmental sustainability efforts. Over two-thirds of the electricity generated in the US and over 30% of the potable water is used by human occupants of buildings, both commercial and residential. Tracking resource usage (i.e., water and energy) at a fine granularity and notifying human occupants as well as building managers of wastage can lead to a significant reduction in the usage of these natural resources. The sensing system 100 including virtual sensors 118, as described herein, can be utilized to train a variety of virtual sensors 118 in homes and buildings to track individual appliance usage and water consumption and correspondingly provide comprehensive user interfaces and notifications to promote behavioral changes. - As another example, the
sensing system 100 can be utilized for managing facilities. Given that humans spend over one third of their lives inside commercial buildings (i.e., their work places), methods to make those buildings better serve occupant needs can lead to improved productivity, comfort, and happiness. While sensing systems do get deployed in modern buildings for heating, ventilation, air-conditioning (HVAC), and lighting management, they are all purpose-specific, costly to deploy and maintain, and not easy to repurpose. The sensing system 100 including virtual sensors 118, as described herein, can be utilized to provide a uniform sensing substrate for all things related to smart building management, including control of HVAC systems, space utilization, power consumption tracking, occupancy and people movement tracking, fault detection, etc. - As another example, the
sensing system 100 can be utilized for home-based consumer applications. With the advent of the IoT and integration with voice assistants, such as Amazon Alexa and Google Home, the presently described sensing system 100 can be utilized in a number of different ways as part of a “smart home.” For example, the sensing system 100 can implement virtual sensors 118 to track the usage of consumables like toilet paper and soap to notify users when they are running low (or even directly order replenishments) or notify users about the status of appliances in the home (e.g., a dishwasher or laundry machine). The sensing system 100 can also implement virtual sensors 118 trained to detect meta events, such as any movements or sounds within the home, for security purposes. The sensing system 100 can also implement virtual sensors 118 trained to non-intrusively detect sleep duration and patterns, without the user(s) being required to wear any device(s). - As another example, the
sensing system 100 can be utilized in a variety of implementations for smart cities. There is a major push across the US and the globe to make cities smarter by adding sensing to street lights and other public infrastructure, such as buses, trolleys, and roads. The sensing system 100 including virtual sensors 118, as described herein and suitably outfitted for outdoor environments, can sense a wide variety of facets of the city environment. In addition, a number of virtual sensors 118 can be trained to detect events of interest that have distinct signatures. For example, virtual sensors 118 can detect a traffic jam, an accident, or a gun shot, or estimate traffic levels, street illumination, environmental quality, etc. A key advantage of the sensing system 100 implementing virtual sensors 118 is that all of the processing and featurization is performed at the sensor assembly 102 itself, thereby addressing many of the privacy concerns in a smart city environment and also reducing the data that needs to be transmitted at the scale of a city. - In one general aspect, therefore, the present invention is directed to a sensing system comprising a sensor assembly and a back-end server system. The sensor assembly comprises a control circuit and one or more sensors, where each of the sensors is configured to sense one or more physical phenomena in an environment of the sensor assembly that are indicative of events. The back end server system comprises at least one server that is in communication with the sensor assembly. The control circuit of the sensor assembly is configured to extract a plurality of features from raw sensor data collected by the one or more sensors to form featurized data and to transmit data packets to the back end server system, where the data packets comprise the featurized data.
The at least one server of the back end server system is configured to implement one or more first order virtual sensors, where each of the one or more first order virtual sensors is trained through machine learning to detect, based on the featurized data, an event in the environment of the sensor assembly.
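A first order virtual sensor of this kind can be sketched as a minimal trained classifier over featurized data. The nearest-centroid model below is a toy stand-in for the support vector machines or deep networks the disclosure actually contemplates; the feature values and event labels are purely illustrative.

```python
def train_virtual_sensor(labeled_examples):
    """Train a toy first order 'virtual sensor' from annotated featurized
    examples by averaging the feature vectors seen for each event label."""
    sums, counts = {}, {}
    for features, label in labeled_examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, value in enumerate(features):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    # Per-label centroids of the training features.
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}


def detect_event(centroids, features):
    """Classify a new featurized reading as the label of the nearest centroid."""
    def sq_dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, features))
    return min(centroids, key=lambda label: sq_dist(centroids[label]))
```

In practice the back end server would train one such detector per event of interest from the user-annotated examples described below, rather than this simplistic centroid rule.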
- In another general aspect, the back end server system comprises at least one server that comprises a processor and a memory for storing instructions that, when executed by the processor, cause the server to: (i) receive the featurized data from the sensor assembly; (ii) determine an occurrence of one or more events via the featurized data; (iii) train, via machine learning, one or more first order virtual sensors implemented by the server to detect the one or more events based on the featurized data; and (iv) monitor, via the one or more virtual sensors, for subsequent occurrences of the one or more events based on featurized data from the sensor assembly.
- According to various implementations, the events detected by the one or more first order virtual sensors are not directly sensed by any of the one or more sensors of the sensor assembly. Also, the sensor assembly may be in wireless communication with the back end server system.
- Additionally, the at least one server of the back end server system may be further configured to implement one or more second order virtual sensors, wherein the one or more second order virtual sensors are trained to detect, based at least in part on outputs of one or more of the first order virtual sensors, a second order condition in the environment of the sensor assembly. At least one of the one or more second order virtual sensors may produce a non-binary output, and the first order virtual sensors may produce binary outputs, non-binary outputs, or a set of labels. For example, the first, second, and/or higher order virtual sensors may comprise machine-learned classifiers that are trained to detect events, conditions, durations, etc. in the environment of the sensor assembly. The classifiers could be support vector machines or deep learning algorithms/networks, for example, that may be trained through supervised or unsupervised learning. Labeled data for supervised learning may be collected from annotations of events by a user that are captured via a user interface provided by the back end server system.
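One simple shape such a second order virtual sensor can take is a duration condition computed over a first order sensor's binary outputs. The sketch below is a hypothetical example (the window count and the "left running" condition are illustrative, not taken from the disclosure):

```python
def second_order_duration(first_order_outputs, min_windows=3):
    """A toy second order virtual sensor: report a condition when a binary
    first order output (one value per time window) has been continuously
    true for at least `min_windows` consecutive windows, e.g. turning a
    'faucet on' event stream into a 'faucet left running' condition."""
    consecutive = 0
    for is_on in first_order_outputs:
        consecutive = consecutive + 1 if is_on else 0
        if consecutive >= min_windows:
            return True
    return False
```

This illustrates how a second order sensor can consume first order outputs rather than raw or featurized sensor data, and why its output need not be binary in general (it could just as well report the run length).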
- The sensors may comprise passive and/or active sensors. Examples of passive sensors are an infrared radiation sensor, an ambient light color sensor, an ambient light intensity sensor, a magnetic field sensor, a temperature sensor, an ambient pressure sensor, a humidity sensor, a vibration sensor, an external device communication sensor, a motion sensor, an acoustic sensor, an indoor air quality sensor, a chemical sensor, a vision sensor, and an electromagnetic interference sensor. Examples of active sensors are a sonar sensor, an ultrasonic sensor, a light emitting sensor, a radar based sensor, an acoustic sensor, an infrared camera, an active infrared sensor, an indoor positioning system, an x-ray based sensor, a seismic sensor, and an active sound measurement system. The sensor assembly may also comprise an output feedback device, such as a speaker, a light source, and a vibration source. Additionally, the sensor assembly may be positionally stationary. And it need not include a high-resolution camera.
- In various implementations, the back-end server system is configured to transmit a notification to the sensor assembly when a particular event is detected by the one or more first order virtual sensors. In turn, the sensor assembly may transmit a notification to a user via the output feedback device in response to receiving the notification from the back end server system that the particular event was detected.
- In various implementations, the sensor assembly comprises one or more circuit boards, where the control circuit and the one or more sensors are connected to the one or more circuit boards. The sensor assembly may further comprise a housing that houses the one or more circuit boards, the one or more sensors, and the control circuit. In particular, the sensor assembly may comprise a single circuit board and a housing. The control circuit and the one or more sensors may be connected to the single circuit board, and the housing may house the single circuit board, the one or more sensors, and the control circuit.
- In yet other implementations, a first sensor of the one or more sensors may have an adjustable sampling rate. In such an embodiment, the at least one server of the back end server system may be further configured to transmit an adjustment for the adjustable sampling rate for the first sensor to the sensor assembly.
- Additionally, the featurized data for a sensor of the sensor assembly may comprise a statistical measure of raw sensor data for the sensor over a time window. The statistical measure may include: the minimum value over the time window; the maximum value over the time window; the range over the time window; the mean over the time window; the median over the time window; the mode over the time window; the sum of the raw sensor values over the time window; the standard deviation over the time window; and/or the centroid of the raw sensor values over the time window.
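Most of the statistical measures listed above can be computed directly with standard-library routines; the following sketch covers all but the mode and centroid for brevity:

```python
import statistics

def featurize_window(values):
    """Compute several of the statistical measures listed above over one
    time window of raw sensor values (mode and centroid omitted)."""
    return {
        "min": min(values),
        "max": max(values),
        "range": max(values) - min(values),
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        "sum": sum(values),
        "std": statistics.pstdev(values),  # population standard deviation
    }
```

Whether the sample or population standard deviation is used (here, population via `pstdev`) is an implementation choice the disclosure leaves open.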
- Furthermore, the control circuit of the sensor assembly may be configured to transmit periodic data packets to the back end server system, where the data packets comprise concatenated featurized data for two or more sensors of the plurality of sensors. The data packets may also be encrypted by the sensor assembly prior to being transmitted.
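The concatenation of featurized data from multiple sensors into one periodic packet can be sketched as follows. JSON is purely an illustrative wire format (the disclosure does not specify one), and a real assembly would encrypt the resulting bytes before transmission:

```python
import json

def build_packet(assembly_id, sequence, featurized_by_sensor):
    """Concatenate featurized data for several sensors into one periodic
    data packet, tagged with the assembly identity and a sequence number.
    Encryption of the returned bytes is left to the transport layer."""
    payload = {
        "assembly": assembly_id,
        "seq": sequence,
        "sensors": featurized_by_sensor,  # e.g. {"temperature": {...}, ...}
    }
    return json.dumps(payload).encode("utf-8")
```

The field names here are assumptions for illustration; the back end server would decode such packets and demultiplex the per-sensor feature sets.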
- In various implementations, the sensor assembly further comprises a wireless communication circuit for communicating wirelessly with a user device. The wireless communication circuit may comprise a wireless communication circuit selected from the group consisting of a Bluetooth circuit, a WiFi circuit, a Z-Wave circuit, a Zigbee circuit, an RFID circuit, a LoRa radio circuit, and a LoRaWAN radio circuit. Additionally, the back-end server system may be configured to transmit a notification to the sensor assembly when a particular event is detected by the one or more first order virtual sensors. In turn, the sensor assembly is configured to transmit a notification to the user device via the wireless communication circuit in response to receiving the notification from the back end server system that the event was detected. Also, the back-end server system may be configured to transmit a notification to a remote computer-based system when a particular condition is detected by a first, second, or higher order virtual sensor.
- In another general aspect, the sensing system may include a plurality of such sensor assemblies that are distributed throughout an environment or location. In such an embodiment, the first, second, and/or higher order virtual sensors may use data from sensors on more than one of the sensor assemblies to detect their corresponding events, conditions, durations, etc. that they are trained to detect throughout the environment or location.
- In another general aspect, the present invention is directed to a method that comprises the steps of (i) sensing, by a sensor assembly that comprises one or more sensors, one or more physical phenomena in an environment of the sensor assembly; (ii) extracting a plurality of features from raw sensor data collected by the one or more sensors to form featurized data; and (iii) detecting, by a machine-learning first order virtual sensor of a back end server system, based on the featurized data, an event in the environment of the sensor assembly.
- In various implementations, the sensor assembly extracts the plurality of features from the raw sensor data, and the method further comprises the step of transmitting, by the sensor assembly, the featurized data to the back end server system. The method may also comprise the step of, prior to detecting the event, training the first order virtual sensor to detect the event from featurized data. The method may also comprise the step of receiving, by the back end server system via a user interface, annotations of occurrences of the event to use as the labeled data for the supervised training. In various embodiments, the back end server system comprises a plurality of machine-learning first order virtual sensors, and the detecting step comprises detecting, by each of the plurality of machine-learning first order virtual sensors, based on the featurized data, a different event in the environment of the sensor assembly. Also, the back end server system may further comprise a machine-learning second order virtual sensor that is trained through machine learning to detect, based on output from at least one of the plurality of first order virtual sensors, a second order condition in the environment of the sensor assembly, in which case the method may further comprise the step of detecting, by the machine-learning second order virtual sensor, the second order condition in the environment of the sensor assembly based on the output from at least one of the plurality of first order virtual sensors. In various implementations, one of the sensors has an adjustable sampling rate, in which case the method may further comprise the step of transmitting, by the back end server system, an adjustment for the sampling rate to the first sensor.
- In various implementations, the sensor assembly further comprises an output feedback device. In such an embodiment, the method may further comprise the step of outputting, by the output feedback device, a code for authentication of the sensor assembly to a user device. Also, the sensor assembly may further comprise a wireless communication circuit for communicating wirelessly with a user device. In such an embodiment, the method may further comprise the steps of (i) transmitting, by the back-end server system, a notification to the sensor assembly when the event is detected by the first order virtual sensor; and (ii) transmitting, by the sensor assembly, a notification to the user device via the wireless communication circuit in response to receiving the notification from the back end server system that the event was detected.
- While several forms have been illustrated and described, it is not the intention of the applicant to restrict or limit the scope of the appended claims to such detail. Numerous modifications, variations, changes, substitutions, combinations, and equivalents to those forms may be implemented and will occur to those skilled in the art without departing from the scope of the present disclosure. Moreover, the structure of each element associated with the described forms can be alternatively described as a means for providing the function performed by the element. Also, where materials are disclosed for certain components, other materials may be used. It is therefore to be understood that the foregoing description and the appended claims are intended to cover all such modifications, combinations, and variations as falling within the scope of the disclosed forms. The appended claims are intended to cover all such modifications, variations, changes, substitutions, combinations, and equivalents.
- The foregoing detailed description has set forth various forms of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, and/or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. Those skilled in the art will recognize that some aspects of the forms disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as one or more program products in a variety of forms, and that an illustrative form of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution. For example, the various
virtual sensors may be implemented with software stored in primary or secondary memory of the computer system 104, that when executed by a processor(s) of the computer system, causes the processor(s) to perform virtual sensor classifications as described herein. Similarly, the activation group and machine learning modules 116 (see FIG. 1 ) may be implemented with software stored in primary or secondary memory of the computer system 104, that when executed by a processor(s) of the computer system, causes the processor(s) to perform their respective functions as described herein. - Instructions used to program logic to perform various disclosed aspects can be stored within a memory in the system, such as dynamic random access memory (DRAM), cache, flash memory, or other storage. Furthermore, the instructions can be distributed via a network or by way of other computer readable media. Thus a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memory (CD-ROMs), magneto-optical disks, read-only memory (ROMs), random access memory (RAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic or optical cards, flash memory, or a tangible, machine-readable storage used in the transmission of information over the Internet via electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Accordingly, the non-transitory computer-readable medium includes any type of tangible machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
- As used in any aspect herein, the term “control circuit” may refer to, for example, hardwired circuitry, programmable circuitry (e.g., a computer processor comprising one or more individual instruction processing cores, processing unit, processor, microcontroller, microcontroller unit, controller, digital signal processor (DSP), programmable logic device (PLD), programmable logic array (PLA), or field programmable gate array (FPGA)), state machine circuitry, firmware that stores instructions executed by programmable circuitry, and any combination thereof. The control circuit may, collectively or individually, be embodied as circuitry that forms part of a larger system, for example, an integrated circuit (IC), an application-specific integrated circuit (ASIC), a system on-chip (SoC), desktop computers, laptop computers, tablet computers, servers, smart phones, etc. Accordingly, as used herein “control circuit” includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program, which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program, which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of random access memory), and/or electrical circuitry forming a communications device (e.g., a modem, communications switch, or optical-electrical equipment). Those having skill in the art will recognize that the subject matter described herein may be implemented in an analog or digital fashion or some combination thereof.
- As used in any aspect herein, the term “logic” may refer to an app, software, firmware and/or circuitry configured to perform any of the aforementioned operations. Software may be embodied as a software package, code, instructions, instruction sets and/or data recorded on non-transitory computer readable storage medium. Firmware may be embodied as code, instructions or instruction sets and/or data that are hard-coded (e.g., nonvolatile) in memory devices.
- As used in any aspect herein, the terms “component,” “system,” “module,” and the like can refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution.
- As used in any aspect herein, an “algorithm” refers to a self-consistent sequence of steps leading to a desired result, where a “step” refers to a manipulation of physical quantities and/or logic states which may, though need not necessarily, take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It is common usage to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. These and similar terms may be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities and/or states.
- Unless specifically stated otherwise as apparent from the foregoing disclosure, it is appreciated that, throughout the foregoing disclosure, discussions using terms such as “processing,” “computing,” “calculating,” “determining,” “displaying,” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission, or display devices.
- One or more components may be referred to herein as “configured to,” “configurable to,” “operable/operative to,” “adapted/adaptable,” “able to,” “conformable/conformed to,” etc. Those skilled in the art will recognize that “configured to” can generally encompass active-state components and/or inactive-state components and/or standby-state components, unless context requires otherwise.
- It is worthy to note that any reference to “one aspect,” “an aspect,” “an exemplification,” “one exemplification,” and the like means that a particular feature, structure, or characteristic described in connection with the aspect is included in at least one aspect. Thus, appearances of the phrases “in one aspect,” “in an aspect,” “in an exemplification,” and “in one exemplification” in various places throughout the specification are not necessarily all referring to the same aspect. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more aspects.
- Any patent, publication, or other disclosure material, in whole or in part, that is said to be incorporated by reference herein is incorporated herein only to the extent that the incorporated materials do not conflict with existing definitions, statements, or other disclosure material set forth in this disclosure. As such, and to the extent necessary, the disclosure as explicitly set forth herein supersedes any conflicting material incorporated herein by reference. Any material, or portion thereof, that is said to be incorporated by reference herein, but which conflicts with existing definitions, statements, or other disclosure material set forth herein will only be incorporated to the extent that no conflict arises between that incorporated material and the existing disclosure material.
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/591,987 US20200033163A1 (en) | 2017-04-24 | 2019-10-03 | Virtual sensor system |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762602487P | 2017-04-24 | 2017-04-24 | |
US201762602543P | 2017-04-27 | 2017-04-27 | |
US201762605675P | 2017-08-22 | 2017-08-22 | |
US15/961,537 US10436615B2 (en) | 2017-04-24 | 2018-04-24 | Virtual sensor system |
US16/591,987 US20200033163A1 (en) | 2017-04-24 | 2019-10-03 | Virtual sensor system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/961,537 Continuation US10436615B2 (en) | 2017-04-24 | 2018-04-24 | Virtual sensor system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200033163A1 (en) | 2020-01-30 |
Family
ID=62218309
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/961,537 Active US10436615B2 (en) | 2017-04-24 | 2018-04-24 | Virtual sensor system |
US16/591,987 Abandoned US20200033163A1 (en) | 2017-04-24 | 2019-10-03 | Virtual sensor system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/961,537 Active US10436615B2 (en) | 2017-04-24 | 2018-04-24 | Virtual sensor system |
Country Status (4)
Country | Link |
---|---|
US (2) | US10436615B2 (en) |
EP (1) | EP3616387A1 (en) |
CN (1) | CN110800273B (en) |
WO (1) | WO2018200541A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022165062A1 (en) * | 2021-01-28 | 2022-08-04 | View, Inc. | Multi-sensor synergy |
US20220249945A1 (en) * | 2020-07-27 | 2022-08-11 | Tencent Technology (Shenzhen) Company Limited | Picture update method and apparatus, computer device, and storage medium |
WO2023114121A1 (en) * | 2021-12-13 | 2023-06-22 | Mars, Incorporated | A computer-implemented method of predicting quality of a food product sample |
US11886089B2 (en) | 2017-04-26 | 2024-01-30 | View, Inc. | Displays for tintable windows |
US11892738B2 (en) | 2017-04-26 | 2024-02-06 | View, Inc. | Tandem vision window and media display |
US12057220B2 (en) | 2020-05-27 | 2024-08-06 | View Operating Corporation | Systems and methods for managing building wellness |
Families Citing this family (113)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10303035B2 (en) | 2009-12-22 | 2019-05-28 | View, Inc. | Self-contained EC IGU |
US20230076947A1 (en) * | 2012-04-13 | 2023-03-09 | View, Inc. | Predictive modeling for tintable windows |
US10700944B2 (en) * | 2012-06-07 | 2020-06-30 | Wormhole Labs, Inc. | Sensor data aggregation system |
US10374863B2 (en) * | 2012-12-05 | 2019-08-06 | Origin Wireless, Inc. | Apparatus, systems and methods for event recognition based on a wireless signal |
US10599818B2 (en) * | 2012-10-02 | 2020-03-24 | Banjo, Inc. | Event-based vehicle operation and event remediation |
US9135437B1 (en) * | 2014-03-24 | 2015-09-15 | Amazon Technologies, Inc. | Hypervisor enforcement of cryptographic policy |
US11743071B2 (en) | 2018-05-02 | 2023-08-29 | View, Inc. | Sensing and communications unit for optically switchable window systems |
JP6528567B2 (en) * | 2015-07-02 | 2019-06-12 | オムロン株式会社 | Vibration sensor and threshold adjustment method |
US10785643B2 (en) * | 2016-10-24 | 2020-09-22 | Lg Electronics Inc. | Deep learning neural network based security system and control method therefor |
WO2018176000A1 (en) | 2017-03-23 | 2018-09-27 | DeepScale, Inc. | Data synthesis for autonomous control systems |
WO2018217665A1 (en) * | 2017-05-23 | 2018-11-29 | BrainofT Inc. | Multi-modal interactive home-automation system |
US10671349B2 (en) | 2017-07-24 | 2020-06-02 | Tesla, Inc. | Accelerated mathematical engine |
US11409692B2 (en) | 2017-07-24 | 2022-08-09 | Tesla, Inc. | Vector computational unit |
US11893393B2 (en) | 2017-07-24 | 2024-02-06 | Tesla, Inc. | Computational array microprocessor system with hardware arbiter managing memory requests |
US11157441B2 (en) | 2017-07-24 | 2021-10-26 | Tesla, Inc. | Computational array microprocessor system using non-consecutive data formatting |
US20190095847A1 (en) * | 2017-09-28 | 2019-03-28 | Honeywell International Inc. | System and method for monitoring workflow checklist with an unobtrusive sensor |
JP7106847B2 (en) * | 2017-11-28 | 2022-07-27 | 横河電機株式会社 | Diagnostic device, diagnostic method, program, and recording medium |
US11410103B2 (en) * | 2017-12-06 | 2022-08-09 | International Business Machines Corporation | Cognitive ride scheduling |
US11429807B2 (en) * | 2018-01-12 | 2022-08-30 | Microsoft Technology Licensing, Llc | Automated collection of machine learning training data |
US11481571B2 (en) | 2018-01-12 | 2022-10-25 | Microsoft Technology Licensing, Llc | Automated localized machine learning training |
JP7006296B2 (en) * | 2018-01-19 | 2022-01-24 | 富士通株式会社 | Learning programs, learning methods and learning devices |
US11561791B2 (en) | 2018-02-01 | 2023-01-24 | Tesla, Inc. | Vector computational unit receiving data elements in parallel from a last row of a computational array |
JP6528869B1 (en) * | 2018-02-13 | 2019-06-12 | オムロン株式会社 | Session control apparatus, session control method and program |
JP6477943B1 (en) * | 2018-02-27 | 2019-03-06 | オムロン株式会社 | Metadata generation apparatus, metadata generation method and program |
US10310079B1 (en) * | 2018-03-02 | 2019-06-04 | Amazon Technologies, Inc. | Presence detection using wireless signals confirmed with ultrasound and/or audio |
GB201805898D0 (en) * | 2018-04-10 | 2018-05-23 | Rolls Royce Plc | Machine sensor network management |
US10906536B2 (en) * | 2018-04-11 | 2021-02-02 | Aurora Innovation, Inc. | Control of autonomous vehicle based on determined yaw parameter(s) of additional vehicle |
US11099539B2 (en) * | 2018-05-17 | 2021-08-24 | Ut-Battelle, Llc | Multi-sensor agent devices |
JP2019215728A (en) * | 2018-06-13 | 2019-12-19 | キヤノン株式会社 | Information processing apparatus, information processing method and program |
US11215999B2 (en) | 2018-06-20 | 2022-01-04 | Tesla, Inc. | Data pipeline and deep learning system for autonomous driving |
US11375404B2 (en) * | 2018-07-16 | 2022-06-28 | Revokind, Inc. | Decentralized infrastructure methods and systems |
US11361457B2 (en) | 2018-07-20 | 2022-06-14 | Tesla, Inc. | Annotation cross-labeling for autonomous control systems |
US11636333B2 (en) | 2018-07-26 | 2023-04-25 | Tesla, Inc. | Optimizing neural network structures for embedded systems |
US20190050732A1 (en) * | 2018-08-28 | 2019-02-14 | Intel Corporation | Dynamic responsiveness prediction |
US11562231B2 (en) | 2018-09-03 | 2023-01-24 | Tesla, Inc. | Neural networks for embedded devices |
EP3847433B1 (en) * | 2018-09-04 | 2023-07-05 | SeeScan, Inc. | Video pipe inspection systems with video integrated with additional sensor data |
US11169865B2 (en) * | 2018-09-18 | 2021-11-09 | Nec Corporation | Anomalous account detection from transaction data |
WO2020077117A1 (en) | 2018-10-11 | 2020-04-16 | Tesla, Inc. | Systems and methods for training machine models with augmented data |
KR102095653B1 (en) * | 2018-10-12 | 2020-03-31 | 한국수력원자력 주식회사 | System and method for abnormal operation state judgment using neural network model |
US11196678B2 (en) | 2018-10-25 | 2021-12-07 | Tesla, Inc. | QOS manager for system on a chip communications |
US11403492B2 (en) | 2018-11-02 | 2022-08-02 | Aurora Operations, Inc. | Generating labeled training instances for autonomous vehicles |
US11256263B2 (en) | 2018-11-02 | 2022-02-22 | Aurora Operations, Inc. | Generating targeted training instances for autonomous vehicles |
US10679481B2 (en) * | 2018-11-14 | 2020-06-09 | Chaoyang Semiconductor Jiangyin Technology, CO. LTD. | Method and apparatus for processor-based mini-sensor assembly |
US12020123B2 (en) * | 2018-11-20 | 2024-06-25 | Koninklijke Philips N.V. | User-customisable machine learning models |
US10991249B2 (en) | 2018-11-30 | 2021-04-27 | Parkifi, Inc. | Radar-augmentation of parking space sensors |
US11089654B2 (en) * | 2018-11-30 | 2021-08-10 | Dish Network L.L.C. | Universal narrow-band internet of things communication node for use with environmental sensors and stations |
US11816585B2 (en) | 2018-12-03 | 2023-11-14 | Tesla, Inc. | Machine learning models operating at different frequencies for autonomous vehicles |
US11537811B2 (en) | 2018-12-04 | 2022-12-27 | Tesla, Inc. | Enhanced object detection for autonomous vehicles based on field view |
AU2019393332A1 (en) * | 2018-12-06 | 2021-06-03 | Saber Astronautics Australia Pty. Ltd. | A method and a system for assessing aspects of an electromagnetic signal |
CN109464075A (en) * | 2018-12-07 | 2019-03-15 | 江苏美的清洁电器股份有限公司 | The cleaning control method and its device and sweeping robot of sweeping robot |
WO2020123394A1 (en) * | 2018-12-09 | 2020-06-18 | Spot AI, Inc. | Systems and methods for distributed image processing |
US11286058B2 (en) * | 2018-12-18 | 2022-03-29 | Textron Innovations Inc. | Heliport docking system |
EP3670415A3 (en) | 2018-12-21 | 2020-07-15 | Otis Elevator Company | Virtual sensor for elevator monitoring |
JP2022515266A (en) | 2018-12-24 | 2022-02-17 | ディーティーエス・インコーポレイテッド | Room acoustic simulation using deep learning image analysis |
US11610117B2 (en) | 2018-12-27 | 2023-03-21 | Tesla, Inc. | System and method for adapting a neural network model on a hardware platform |
US20200210804A1 (en) * | 2018-12-31 | 2020-07-02 | Qi Lu | Intelligent enclosure systems and computing methods |
CN109613610B (en) * | 2019-01-17 | 2020-04-14 | 中南大学 | Automatic picking method for microseism signal arrival time difference |
TWI734330B (en) * | 2019-01-31 | 2021-07-21 | 日商住友重機械工業股份有限公司 | Support device, support method and recording medium |
US11150664B2 (en) | 2019-02-01 | 2021-10-19 | Tesla, Inc. | Predicting three-dimensional features for autonomous driving |
US10997461B2 (en) | 2019-02-01 | 2021-05-04 | Tesla, Inc. | Generating ground truth for machine learning from time series elements |
JP6975188B2 (en) * | 2019-02-07 | 2021-12-01 | ファナック株式会社 | Status determination device and status determination method |
US11567514B2 (en) | 2019-02-11 | 2023-01-31 | Tesla, Inc. | Autonomous and user controlled vehicle summon to a target |
US10956755B2 (en) | 2019-02-19 | 2021-03-23 | Tesla, Inc. | Estimating object properties using visual image data |
CN109916627A (en) * | 2019-03-27 | 2019-06-21 | 西南石油大学 | Bearing fault detection and diagnosis based on Active Learning |
USD942995S1 (en) * | 2019-04-03 | 2022-02-08 | Siemens Aktiengesellschaft | Display screen or portion thereof with graphical user interface |
US11488025B2 (en) | 2019-04-29 | 2022-11-01 | Landmark Graphics Corporation | Hybrid neural network and autoencoder |
US11599830B1 (en) * | 2019-05-01 | 2023-03-07 | ClearCare, Inc. | Automatic change in condition monitoring by passive sensor monitoring and machine learning |
US11768129B2 (en) * | 2019-05-13 | 2023-09-26 | Volvo Car Corporation | Machine-learning based vehicle diagnostics and maintenance |
US11082521B2 (en) * | 2019-05-24 | 2021-08-03 | California Eastern Laboratories, Inc. | Single source of information apparatuses, methods, and systems |
EP3751532A1 (en) * | 2019-06-13 | 2020-12-16 | Rohde & Schwarz GmbH & Co. KG | Remote access and control system and corresponding method |
US20210000356A1 (en) * | 2019-07-02 | 2021-01-07 | Tata Consultancy Services Limited | Method and system for screening and monitoring of cardiac diseases |
US11656337B2 (en) * | 2019-07-11 | 2023-05-23 | Toyota Motor Engineering & Manufacturing North America, Inc. | Photonic apparatus integrating optical sensing and optical processing components |
BE1027428B1 (en) | 2019-07-14 | 2021-02-11 | Niko Nv | SENSOR SYSTEM FOR ACTIVITY RECOGNITION |
EP3785581A4 (en) * | 2019-07-15 | 2021-06-23 | LG Electronics Inc. | Artificial intelligence cooking apparatus |
KR102667036B1 (en) | 2019-07-19 | 2024-05-17 | 엘지전자 주식회사 | Method and heating apparatus for estimating status of heated object |
JP7268736B2 (en) * | 2019-07-24 | 2023-05-08 | 日本電信電話株式会社 | Synchronous controller |
US11727303B2 (en) * | 2019-08-22 | 2023-08-15 | Kyndryl, Inc. | Precipitation detection using mobile devices |
US11983678B1 (en) * | 2019-09-26 | 2024-05-14 | American Paper Converting Inc. | Systems and methods for restroom consumables monitoring and maintenance scheduling |
US20210097436A1 (en) * | 2019-10-01 | 2021-04-01 | Kelley S. Weiland | Automated system for generating properly tagged training data for and verifying the efficacy of artificial intelligence algorithms |
CN110913357B (en) * | 2019-11-13 | 2020-10-09 | 绍兴文理学院 | Sensing cloud double-layer network defense system and method based on security situation awareness |
WO2021108113A1 (en) * | 2019-11-27 | 2021-06-03 | Corning Incorporated | Sensing vessels for cell cultures |
US11533457B2 (en) | 2019-11-27 | 2022-12-20 | Aob Products Company | Smart home and security system |
US20210201191A1 (en) * | 2019-12-27 | 2021-07-01 | Stmicroelectronics, Inc. | Method and system for generating machine learning based classifiers for reconfigurable sensor |
US20210272025A1 (en) * | 2019-12-27 | 2021-09-02 | Stmicroelectronics, Inc. | Method and system for updating machine learning based classifiers for reconfigurable sensors |
FR3105873B1 (en) * | 2019-12-31 | 2022-03-25 | Awox | METHOD AND SYSTEM FOR PREDICTIVE AND AUTOMATIC EMISSION OF A DOMOTIC SIGNAL |
US11582152B2 (en) * | 2020-01-29 | 2023-02-14 | Hewlett Packard Enterprise Development Lp | Method and system for data management in an edge server |
JP7503752B2 (en) * | 2020-03-05 | 2024-06-21 | パナソニックIpマネジメント株式会社 | Information processing system and information processing method |
TW202206925A (en) | 2020-03-26 | 2022-02-16 | 美商視野公司 | Access and messaging in a multi client network |
GB2594328B (en) * | 2020-04-24 | 2024-04-10 | Novosound Ltd | Secure ultrasound system |
EP4158505A4 (en) * | 2020-05-27 | 2024-05-08 | Helios Sports, Inc. | Intelligent sports video and data generation from ai recognition events |
CN111766557A (en) * | 2020-05-31 | 2020-10-13 | 宁夏隆基宁光仪表股份有限公司 | Method for analyzing influence on detection precision of electric energy meter based on K-Means algorithm |
CN111682925B (en) * | 2020-06-05 | 2022-07-12 | 四川艾贝斯科技发展有限公司 | Data acquisition and processing method for intelligent street lamp |
US11667265B2 (en) * | 2020-07-14 | 2023-06-06 | Micron Technology, Inc. | Activating a security mode for a vehicle based on driver identification |
US11436864B2 (en) | 2020-07-14 | 2022-09-06 | Micron Technology, Inc. | Driver recognition to control vehicle systems |
CA3186688A1 (en) * | 2020-07-20 | 2022-01-27 | Arris Enterprises Llc | Determination of extender onboarding completion status |
US11727091B2 (en) * | 2020-09-11 | 2023-08-15 | Qeexo, Co. | Method and system for training machine learning models on sensor nodes |
SE544849C2 (en) * | 2020-10-02 | 2022-12-13 | Assa Abloy Ab | Providing data for training a machine learning model |
CN112710353B (en) * | 2020-12-23 | 2022-10-21 | 龙海建设集团有限公司 | Intelligent building monitoring system |
US11989403B2 (en) | 2021-01-27 | 2024-05-21 | Sintokogio, Ltd. | Information processing device and information processing method |
US11480358B2 (en) | 2021-02-25 | 2022-10-25 | Synapse Wireless, Inc. | Machine learning systems for modeling and balancing the activity of air quality devices in industrial applications |
WO2022199524A1 (en) * | 2021-03-25 | 2022-09-29 | 北京灵汐科技有限公司 | Sensor assembly and electronic device |
EP4075112B1 (en) | 2021-04-14 | 2024-10-02 | ABB Schweiz AG | A fault detection system |
US11636752B2 (en) * | 2021-04-26 | 2023-04-25 | Rockwell Automation Technologies, Inc. | Monitoring machine operation with different sensor types to identify typical operation for derivation of a signature |
US11545023B1 (en) * | 2021-07-06 | 2023-01-03 | Verizon Patent And Licensing Inc. | Systems and methods for monitoring a physical environment using virtual sensors |
US20230011547A1 (en) * | 2021-07-12 | 2023-01-12 | Getac Technology Corporation | Optimizing continuous media collection |
NL2028792B1 (en) * | 2021-07-20 | 2023-01-27 | Schreder Sa | Remote management framework in sensor and actuator network |
US11335203B1 (en) * | 2021-08-20 | 2022-05-17 | Beta Air, Llc | Methods and systems for voice recognition in autonomous flight of an electric aircraft |
CN113887552A (en) * | 2021-08-25 | 2022-01-04 | 苏州数言信息技术有限公司 | Sensor-based carbon metering method and device |
NO20220004A1 (en) * | 2022-01-03 | 2023-07-04 | Elliptic Laboratories Asa | Robust virtual sensor |
WO2023168361A1 (en) * | 2022-03-02 | 2023-09-07 | Qualcomm Technologies, Inc. | Piezoelectric mems contact detection system |
CN115031357B (en) * | 2022-05-10 | 2023-05-30 | 南京信息工程大学 | Voting strategy-based fault diagnosis method suitable for different types of fault characteristics |
CN115830578B (en) * | 2023-02-23 | 2023-05-23 | 北京金石视觉数字科技有限公司 | Article inspection method and device and electronic equipment |
CN118604074A (en) * | 2024-08-07 | 2024-09-06 | 上海科泽智慧环境科技有限公司 | Contaminant monitoring system and method using an electrochemical biosensor array |
Family Cites Families (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5502773A (en) | 1991-09-20 | 1996-03-26 | Vanderbilt University | Method and apparatus for automated processing of DNA sequence data |
US6408227B1 (en) * | 1999-09-29 | 2002-06-18 | The University Of Iowa Research Foundation | System and method for controlling effluents in treatment systems |
US7827131B2 (en) | 2002-08-22 | 2010-11-02 | Knowm Tech, Llc | High density synapse chip using nanoparticles |
US7519452B2 (en) | 2004-04-15 | 2009-04-14 | Neurosciences Research Foundation, Inc. | Mobile brain-based device for use in a real world environment |
US9101279B2 (en) | 2006-02-15 | 2015-08-11 | Virtual Video Reality By Ritchey, Llc | Mobile user borne brain activity data and surrounding environment data correlation system |
US20090017910A1 (en) | 2007-06-22 | 2009-01-15 | Broadcom Corporation | Position and motion tracking of an object |
US8812261B2 (en) | 2007-08-23 | 2014-08-19 | Applied Materials, Inc. | Method and apparatus to automatically create virtual sensors with templates |
CN100538761C (en) * | 2007-08-27 | 2009-09-09 | 北京交通大学 | Built-in intelligent fault diagnosing device and method based on the data fusion pattern-recognition |
US9203912B2 (en) | 2007-11-14 | 2015-12-01 | Qualcomm Incorporated | Method and system for message value calculation in a mobile environment |
US8510234B2 (en) | 2010-01-05 | 2013-08-13 | American Gnc Corporation | Embedded health monitoring system based upon Optimized Neuro Genetic Fast Estimator (ONGFE) |
US20140342343A1 (en) | 2011-09-13 | 2014-11-20 | Monk Akarshala Design Private Limited | Tutoring interfaces for learning applications in a modular learning system |
US8775337B2 (en) * | 2011-12-19 | 2014-07-08 | Microsoft Corporation | Virtual sensor development |
EP2801049B1 (en) | 2012-01-08 | 2018-11-14 | ImagiStar LLC | System and method for item self-assessment as being extant or displaced |
US10068396B2 (en) | 2016-01-15 | 2018-09-04 | John C. S. Koo | Light-based security systems |
US9740773B2 (en) | 2012-11-02 | 2017-08-22 | Qualcomm Incorporated | Context labels for data clusters |
US20140200855A1 (en) * | 2013-01-17 | 2014-07-17 | Stephen Oonk | Coremicro Reconfigurable Embedded Smart Sensor Node |
US9355368B2 (en) | 2013-03-14 | 2016-05-31 | Toyota Motor Engineering & Manufacturing North America, Inc. | Computer-based method and system for providing active and automatic personal assistance using a robotic device/platform |
US9448962B2 (en) | 2013-08-09 | 2016-09-20 | Facebook, Inc. | User experience/user interface based on interaction history |
US20160080835A1 (en) | 2014-02-24 | 2016-03-17 | Lyve Minds, Inc. | Synopsis video creation based on video metadata |
US20160071549A1 (en) | 2014-02-24 | 2016-03-10 | Lyve Minds, Inc. | Synopsis video creation based on relevance score |
US9823764B2 (en) | 2014-12-03 | 2017-11-21 | Microsoft Technology Licensing, Llc | Pointer projection for natural user input |
US10088971B2 (en) | 2014-12-10 | 2018-10-02 | Microsoft Technology Licensing, Llc | Natural user interface camera calibration |
CN104510475A (en) * | 2014-12-15 | 2015-04-15 | 中国科学院计算技术研究所 | Human body falling-down detection method and system |
US9766818B2 (en) | 2014-12-31 | 2017-09-19 | Samsung Electronics Co., Ltd. | Electronic system with learning mechanism and method of operation thereof |
IL237235B (en) | 2015-02-11 | 2019-08-29 | Friedlander Yehudah | System for analyzing electricity consumption |
US20160247043A1 (en) | 2015-02-19 | 2016-08-25 | Warren Rieutort-Louis | Thin-film Sensing and Classification System |
US9761220B2 (en) | 2015-05-13 | 2017-09-12 | Microsoft Technology Licensing, Llc | Language modeling based on spoken and unspeakable corpuses |
US10438112B2 (en) | 2015-05-26 | 2019-10-08 | Samsung Electronics Co., Ltd. | Method and apparatus of learning neural network via hierarchical ensemble learning |
EP3308302A4 (en) | 2015-06-09 | 2019-02-13 | Intuitive Surgical Operations Inc. | Video content searches in a medical context |
US9676098B2 (en) * | 2015-07-31 | 2017-06-13 | Heinz Hemken | Data collection from living subjects and controlling an autonomous robot using the data |
US10007513B2 (en) * | 2015-08-27 | 2018-06-26 | FogHorn Systems, Inc. | Edge intelligence platform, and internet of things sensor streams system |
US20170169358A1 (en) | 2015-12-09 | 2017-06-15 | Samsung Electronics Co., Ltd. | In-storage computing apparatus and method for decentralized machine learning |
SE539526C2 (en) | 2016-01-12 | 2017-10-10 | Scania Cv Ab | Method and control unit for a self-learning map |
EP3436966A4 (en) | 2016-04-01 | 2019-11-13 | INTEL Corporation | Entropic classification of objects |
KR102607216B1 (en) | 2016-04-01 | 2023-11-29 | 삼성전자주식회사 | Method of generating a diagnosis model and apparatus generating a diagnosis model thereof |
US10049284B2 (en) * | 2016-04-11 | 2018-08-14 | Ford Global Technologies | Vision-based rain detection using deep learning |
CN106407996A (en) * | 2016-06-30 | 2017-02-15 | 华南理工大学 | Machine learning based detection method and detection system for the fall of the old |
US11107016B2 (en) | 2016-08-18 | 2021-08-31 | Virtual Power Systems, Inc. | Augmented power control within a datacenter using predictive modeling |
2018
- 2018-04-24 EP EP18726268.8A patent/EP3616387A1/en not_active Withdrawn
- 2018-04-24 WO PCT/US2018/029165 patent/WO2018200541A1/en unknown
- 2018-04-24 CN CN201880042348.3A patent/CN110800273B/en active Active
- 2018-04-24 US US15/961,537 patent/US10436615B2/en active Active
2019
- 2019-10-03 US US16/591,987 patent/US20200033163A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
EP3616387A1 (en) | 2020-03-04 |
CN110800273A (en) | 2020-02-14 |
WO2018200541A1 (en) | 2018-11-01 |
US10436615B2 (en) | 2019-10-08 |
US20180306609A1 (en) | 2018-10-25 |
CN110800273B (en) | 2024-02-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10436615B2 (en) | Virtual sensor system | |
US10395494B2 (en) | Systems and methods of home-specific sound event detection | |
US9618918B2 (en) | System and method for estimating the number of people in a smart building | |
US10359741B2 (en) | Switch terminal system with spatial relationship information | |
US9240111B2 (en) | Inferring building metadata from distributed sensors | |
EP2314031B1 (en) | System for delivering and presenting a message within a network | |
BR102013014516A2 (en) | System and method for determining the location of wireless communication devices / persons for controlling / adjusting location-based device operation | |
EP2918062B1 (en) | Aggregation framework using low-power alert sensor | |
US10621855B2 (en) | Online occupancy state estimation | |
US20220027725A1 (en) | Sound model localization within an environment | |
US11748993B2 (en) | Anomalous path detection within cameras' fields of view | |
US10973103B2 (en) | Presence and directional motion detection | |
US10599105B2 (en) | Switch terminal with wiring components secured to circuitry wiring without external live points of contact | |
US20230237897A1 (en) | Method of reducing a false trigger alarm on a security ecosystem | |
US10254722B2 (en) | Switch terminal system with display | |
US20230267815A1 (en) | Ear bud integration with property monitoring | |
US20240071083A1 (en) | Using implicit event ground truth for video cameras | |
Boovaraghavan | Enabling System Support for General-Purpose Sensing in Smart Environments | |
WO2024137818A1 (en) | Computerized systems and methods for dynamically and automatically arming and disarming a security system | |
Albinali | Activity-aware computing: Modeling of human activity and behavior |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
AS | Assignment |
Owner name: CARNEGIE MELLON UNIVERSITY, PENNSYLVANIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AGARWAL, YUVRAJ;HARRISON, CHRISTOPHER;LAPUT, GIERAD;AND OTHERS;SIGNING DATES FROM 20180430 TO 20180521;REEL/FRAME:053552/0480 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |