WO2022046541A1 - Mapping acoustic properties in an enclosure - Google Patents
- Publication number
- WO2022046541A1 (PCT/US2021/046838)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- acoustic
- sensor
- enclosure
- sound
- emitter
- Prior art date
Classifications
-
- E—FIXED CONSTRUCTIONS
- E06—DOORS, WINDOWS, SHUTTERS, OR ROLLER BLINDS IN GENERAL; LADDERS
- E06B—FIXED OR MOVABLE CLOSURES FOR OPENINGS IN BUILDINGS, VEHICLES, FENCES OR LIKE ENCLOSURES IN GENERAL, e.g. DOORS, WINDOWS, BLINDS, GATES
- E06B9/00—Screening or protective devices for wall or similar openings, with or without operating or securing mechanisms; Closures of similar construction
- E06B9/24—Screens or other constructions affording protection against light, especially against sunshine; Similar screens for privacy or appearance; Slat blinds
- E06B2009/2464—Screens or other constructions affording protection against light, especially against sunshine; Similar screens for privacy or appearance; Slat blinds featuring transparency control by applying voltage, e.g. LCD, electrochromic panels
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/26—Pc applications
- G05B2219/2642—Domotique, domestic, home control, automation, smart house
Definitions
- International Patent Application Serial No. PCT/US21/15378 is also a Continuation-in-Part of U.S. Patent Application Serial No. 17/083,128 and its priority chain recited herein.
- International Patent Application Serial No. PCT/US21/15378 is also a Continuation-in-Part of U.S. Patent Application Serial No. 16/447,169 and its priority chain recited herein.
- This application is also a Continuation-in-Part of International Patent Application Serial No. PCT/US19/36571 filed June 11, 2019, titled “OPTICALLY SWITCHABLE WINDOWS FOR SELECTIVELY IMPEDING PROPAGATION OF LIGHT FROM AN ARTIFICIAL SOURCE,” which claims priority from U.S. Provisional Patent Application Serial No.
- a processing system may have a plurality of nodes that may be linked together in a network.
- the processing system can be, can be included in, or can include a control system.
- Some of the nodes may include software and/or hardware that may be configured to operate various systems in one or more facilities (i.e., enclosures).
- Facilities can include at least one building or any portion(s) of the building.
- the systems to be controlled can include smart windows (e.g., having insulated glass units such as electrochromic devices), building management systems, environmental sensors, and/or actuators (e.g., HVAC systems).
- Optically switchable windows exhibit a controllable and reversible change in an optical property when appropriately stimulated by, for example, a voltage change.
- the optical property is typically color, transmittance, absorbance, and/or reflectance.
- Electrochromic (EC) devices are sometimes used in optically switchable windows.
- One well-known electrochromic material, for example, is tungsten oxide (WO3).
- Tungsten oxide is a cathodic electrochromic material in which a coloration transition, transparent to blue, occurs by electrochemical reduction.
- Switchable windows may be used in buildings to control transmission of solar energy. Switchable windows may be manually or automatically tinted and cleared to reduce energy consumed by heating, air conditioning, and/or lighting systems, while maintaining occupant comfort.
- Electrochromic materials may be incorporated into, for example, windows for home, commercial and other uses as thin film coatings on the window glass.
- a small voltage applied to an electrochromic device of the window will cause it to darken; reversing the voltage polarity causes it to lighten. This capability allows control of the amount of light that passes through the window and presents an opportunity for electrochromic windows to be used as energy-saving devices.
- a community of components may be placed at various locations in an enclosure (e.g., a facility, a building, and/or a room) to analyze, detect, and/or react to data and/or (e.g., environmental) aspects of the enclosure.
- the various aspects may include temperature, humidity, sound, electromagnetic waves, position, distance, movement, speed, vibration, volatile organic compounds (VOCs), dust, light, glare, color, gases, and/or other aspects of the enclosure.
- Components may be deployed in an ensemble in a common assembly having a housing (e.g., box) containing a requested grouping of such components (e.g., modules).
- the components may include sensors, emitters, actuators, controllers, processors, antenna, electronic memory, and/or other peripheral electronics.
- the peripheral electronics may interconnect in a hierarchical manner, such as is shown in U.S. Patent Serial No. 10,495,939, issued December 3, 2019, titled “CONTROLLERS FOR OPTICALLY- SWITCHABLE DEVICES,” that is incorporated herein by reference in its entirety.
- To establish, manipulate, and/or maximize acoustic comfort, accurate sound mapping (e.g., and tuning) of various facility environments (e.g., of rooms) may be beneficial, e.g., to ensure that these environments are acoustically suitable for their intended purpose.
- a conference room or library may have stricter sound requirements than an entrance hall or cafeteria.
- the acoustics of an environment depend, e.g., on the various fixtures and non-fixtures in that environment, their arrangement, material properties, and the like. The acoustics of an environment may be subject to change when these fixtures and non-fixtures are altered.
- Fixtures may include non-movable objects such as walls, ceilings, floors, light fixtures and/or other immovable or semi-permanent objects.
- Non-fixtures may include movable objects, e.g., furniture, appliances, portable light fixtures, plants, blinds, shutters, computers and/or people.
- a current acoustic map of the facility areas is established.
- An update of the acoustic mapping during operation of the facility may be required as any of the fixtures and/or non-fixtures change.
- updates of the acoustic map of the facility can be done automatically and/or as close as possible to the change made to the facility (e.g., in real time).
- sound emitters (e.g., speakers)
- sound sensors (e.g., microphones)
- the sound emitters and sensors have a known location and are communicatively coupled via a network, e.g., a building communications and power network, e.g., as described in International Patent Application Serial No. PCT/US21/17946 filed February 12, 2021, titled “DATA AND POWER NETWORK OF A FACILITY,” which is incorporated herein by reference in its entirety.
- the sound emitters and sensors may use (i) sound frequency sweeping, (ii) their location, and (iii) mutual coordination, to generate the acoustic mapping of the facility. Such acoustic mapping can be done automatically, in situ, and/or in real time.
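The sweep-based mapping described above can be sketched as follows; this is a minimal illustration (assuming a linear chirp test signal, a simulated room response, and NumPy), not the disclosed implementation:

```python
import numpy as np

def estimate_transfer_function(emitted, measured, fs):
    """Estimate the acoustic transfer function between an emitter and a
    sensor as the per-frequency ratio of the measured spectrum to the
    emitted spectrum (a simple frequency-domain deconvolution)."""
    E = np.fft.rfft(emitted)
    M = np.fft.rfft(measured)
    H = M / np.where(np.abs(E) < 1e-12, 1e-12, E)  # guard near-zero bins
    freqs = np.fft.rfftfreq(len(emitted), d=1.0 / fs)
    return freqs, H

fs = 48000                        # sample rate (an assumption)
t = np.arange(0, 1.0, 1.0 / fs)
f0, f1 = 10.0, 20000.0            # sweep covering the ~10 Hz to 20 kHz range
sweep = np.sin(2 * np.pi * (f0 * t + (f1 - f0) * t**2 / 2))  # linear chirp, T = 1 s

# Simulated room: the sensor hears the sweep attenuated by half, delayed 100 samples.
measured = 0.5 * np.concatenate([np.zeros(100), sweep[:-100]])

freqs, H = estimate_transfer_function(sweep, measured, fs)
```

The magnitude of H recovers the simulated attenuation (about 0.5) across the swept band, while its phase encodes the propagation delay between the two locations.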
- acoustic mapping allows one to know how well various facility environments are isolated from noise, and to identify those that are not sufficiently isolated. From this data, insufficiently acoustically isolated facility environments can be modified to improve their acoustic isolation, e.g., by including sound absorbers, diffusers, and/or deflectors in specific areas.
- acoustics of a space can be tuned for a specific purpose, such as interpersonal communication, musical listening, and the like.
- a method of acoustic mapping comprises: (A) using an emitter to emit a first acoustic test signal, which first emitter is disposed at a first location in an enclosure; (B) using a sensor to measure a first acoustic response corresponding to the first acoustic test signal, which sensor is disposed at a second location; (C) storing a first acoustic map indicative of an acoustic transfer function between the first location and the second location; (D) using the emitter to emit a second acoustic test signal; (E) measuring a second acoustic response corresponding to the second acoustic test signal; (F) determining a second acoustic map; and (G) generating a notification and/or a report when a difference between the second acoustic map and the first acoustic map is greater than a threshold.
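Steps (A) through (G) amount to: store a baseline map, re-measure, compare, and notify once a threshold is exceeded. A minimal sketch follows; the array-of-magnitudes map representation and the 0.1 threshold are assumptions for illustration:

```python
def map_difference(map_a, map_b):
    """Mean absolute difference between two acoustic maps, represented here
    as equal-length sequences of transfer-function magnitudes."""
    assert len(map_a) == len(map_b)
    return sum(abs(a - b) for a, b in zip(map_a, map_b)) / len(map_a)

def check_for_change(baseline_map, new_map, threshold=0.1):
    """Steps (F)-(G): return a notification string when the second map
    diverges from the baseline by more than the threshold, else None."""
    diff = map_difference(baseline_map, new_map)
    if diff > threshold:
        return f"acoustic map changed: difference {diff:.3f} exceeds {threshold}"
    return None

baseline = [1.0, 0.8, 0.6, 0.4]    # first acoustic map (step C)
drifted  = [1.0, 0.81, 0.6, 0.4]   # second map with small measurement drift
altered  = [0.5, 0.4, 0.3, 0.2]    # second map after, e.g., furniture is rearranged
```

Here `check_for_change(baseline, drifted)` stays silent while `check_for_change(baseline, altered)` produces a notification, mirroring step (G).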
- the emitter is part of a device ensemble housing at least one sensor and at least one emitter.
- the threshold is a function.
- the emitter is operatively coupled to a control system.
- the method further comprises controlling at least one apparatus in the enclosure and/or in a facility in which the enclosure is disposed, which controlling is by the control system.
- the at least one apparatus comprises a lighting device, a tintable window, another sensor, another emitter, a media display, a dispenser, a processor, a power source, a security system, a fire alarm system, a sound media, a heater, a cooler, a vent, or a heating ventilation and air conditioning system (HVAC).
- the control system comprises a hierarchy of controllers.
- the emitter is operatively coupled to a network.
- the sensor is communicatively coupled to a network in a wired and/or wireless manner.
- the emitter and/or the sensor are communicatively coupled to a network in a wired and/or wireless manner.
- the network is configured to transmit power and/or data.
- the network is configured to transmit broadband cellular network technology communication of at least a third generation, fourth generation, or fifth generation cellular communication protocol.
- the network is operatively coupled to a router, multiplier, antenna, and/or transceiver.
- the network is disposed at least in an envelope of the enclosure and/or a building in which the enclosure is disposed.
- the emitter comprises a buzzer.
- the method further comprises using the emitter to emit sounds including discrete sounds of a sound spectrum.
- the sensor is configured to detect sounds including continuous sounds of a sound spectrum.
- the method further comprises using the emitter to emit sounds including sounds having a frequency of from about 10 Hz to about 20 kHz.
- the method further comprises using the emitter to emit the first acoustic test signal and/or the second acoustic test signal according to a schedule.
- the method further comprises using the emitter to emit the first acoustic test signal and/or the second acoustic test signal when the enclosure is non-inhabited. In some embodiments, the method further comprises using the emitter to emit the first acoustic test signal and/or the second acoustic test signal outside standard work hours in the enclosure and/or in a facility in which the enclosure is disposed. In some embodiments, the method further comprises using the emitter to emit the first acoustic test signal and/or the second acoustic test signal when the enclosure is forecasted to experience a quiet period of a length that is at least sufficient to generate the first acoustic map and/or the second acoustic map.
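The scheduling conditions above (unoccupied enclosure, outside standard work hours, and a forecast quiet window long enough to complete a map) can be combined as a simple gate. A sketch, with 08:00-18:00 work hours as an assumed default:

```python
from datetime import datetime, time

def can_run_test(now, occupied, required_quiet_min, forecast_quiet_min,
                 work_start=time(8, 0), work_end=time(18, 0)):
    """True when a test signal may be emitted: the enclosure is unoccupied,
    the current time is outside standard work hours, and the forecast quiet
    period is at least long enough to generate the acoustic map."""
    outside_hours = now.time() < work_start or now.time() >= work_end
    return (not occupied) and outside_hours and forecast_quiet_min >= required_quiet_min
```

A controller could evaluate this gate periodically, emitting the first or second acoustic test signal only when it returns true.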
- the method further comprises using the emitter to emit the second acoustic test signal according to a schedule that considers a change in a fixture of the enclosure and/or of the facility in which the enclosure is disposed. In some embodiments, the method further comprises using the emitter to emit the second acoustic test signal according to a schedule that considers a change in a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed.
- the enclosure is at least part of a building, or a vehicle. In some embodiments, the enclosure comprises a room. In some embodiments, the enclosure is configured for one or more occupants.
- the sensor is comprised in a device ensemble housing another device that includes at least one sensor and/or at least one emitter.
- the second location is in the enclosure. In some embodiments, the second location is outside of the enclosure.
- the storing of the first acoustic map utilizes a memory disposed in the enclosure, and/or in a building in which the enclosure is disposed. In some embodiments, storing of the first acoustic map utilizes a memory disposed outside of the enclosure and/or outside of a building in which the enclosure is disposed. In some embodiments, storing of the first acoustic map is in an ensemble housing at least one other device including at least one sensor and/or at least one emitter.
- storing of the first acoustic map utilizes a network to which the sensor and emitter are coupled.
- the first acoustic map and/or the second acoustic map is generated by a processor that is part of, or is operatively coupled to, a control system.
- the acoustic map is generated by a processor that is part of, or is operatively coupled to, a network to which the sensor and emitter are coupled.
- generation of the acoustic map excludes utilizing a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed.
- generation of the acoustic map comprises utilizing a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed.
- measurement of the second acoustic response is by the same sensor measuring the first acoustic response.
- the sensor is a first sensor, and wherein measurement of the second acoustic response is at least in part by a second sensor.
- the second sensor is disposed in the enclosure. In some embodiments, the second sensor is disposed outside of the enclosure.
- the sensor is a first sensor
- the method further comprises operations: (H) using a second sensor disposed at a third location to measure a third acoustic response to the second acoustic test signal, wherein the second acoustic response measured in (E) is sensed at the second location by the first sensor; and (I) comparing the second acoustic response and the third acoustic response to detect a fault in the emitter or in one of the sensors.
- the emitter is a first emitter
- the method further comprises operations: (H) using a second emitter at a third location to emit a third acoustic test signal; (I) measuring a third acoustic response corresponding to the third acoustic test signal; and (J) comparing the third acoustic response to the acoustic response to the second acoustic test signal to detect a fault in the sensor, in the first emitter, or in the second emitter.
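The fault-detection comparisons in the two embodiments above reduce to checking whether two acoustic responses that should match actually do. A sketch; the average-magnitude comparison and the 20% tolerance are illustrative assumptions, not the disclosed criterion:

```python
def responses_disagree(response_a, response_b, tolerance=0.2):
    """Compare two acoustic responses by average magnitude; a relative
    discrepancy above the tolerance suggests a fault in an emitter or
    sensor involved in producing one of them."""
    level_a = sum(abs(x) for x in response_a) / len(response_a)
    level_b = sum(abs(x) for x in response_b) / len(response_b)
    mean_level = (level_a + level_b) / 2 or 1e-12  # avoid dividing by zero
    return abs(level_a - level_b) / mean_level > tolerance
```

Which device is at fault can then be narrowed down by repeating the check across emitter/sensor pairings, as the (H)-(J) operations suggest.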
- the method further comprises operations: (H) detecting an irregular sound event in the enclosure utilizing a plurality of sensors that include the sensor; (I) compensating the detected sound event according to a corresponding acoustic transfer function from the first acoustic map and/or the second acoustic map; (J) recognizing an event type utilizing the compensated detected sound event; and (K) generating a notification of the event type to a user.
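Step (I), compensating a detected event using the stored transfer function, can be sketched as a frequency-domain division (NumPy; the assumption here is that the map stores the transfer function H as complex values per real-FFT bin):

```python
import numpy as np

def compensate_event(measured, H):
    """Approximate the sound as emitted at its source by dividing the
    measured spectrum by the room's transfer function H, then returning
    to the time domain; the compensated event aids recognition (step J)."""
    M = np.fft.rfft(measured)
    H = np.asarray(H, dtype=complex)
    source = M / np.where(np.abs(H) < 1e-12, 1e-12, H)  # guard tiny bins
    return np.fft.irfft(source, n=len(measured))
```

For a room that uniformly attenuates by half (H = 0.5 in every bin), the compensated signal is simply the measurement doubled.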
- the method further comprises localizing an origination of the irregular sound event based at least in part on relative magnitudes of the detected irregular sound event by at least two, or by at least three of the plurality of sensors.
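Localization from relative magnitudes can be illustrated with a magnitude-weighted centroid of the known sensor positions, louder sensors being assumed closer to the source. This weighting scheme is an illustrative assumption, not the disclosed method:

```python
def localize_by_magnitude(detections):
    """Rough origin estimate for an irregular sound event from at least
    three sensors with known 2-D positions: a centroid of the positions
    weighted by detected magnitude. detections: list of ((x, y), magnitude)."""
    total = sum(mag for _, mag in detections)
    x = sum(pos[0] * mag for pos, mag in detections) / total
    y = sum(pos[1] * mag for pos, mag in detections) / total
    return (x, y)
```

With an event near the origin and sensors at (0, 0), (10, 0), and (0, 10), the estimate is pulled strongly toward the loud nearby sensor.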
- a non-transitory computer readable media for acoustic mapping, when read by one or more processors, is configured to execute operations comprising: (A) using, or directing usage of, an emitter to emit a first acoustic test signal, which first emitter is disposed in a first location in an enclosure; (B) using, or directing usage of, a sensor to measure a first acoustic response corresponding to the first acoustic test signal, which sensor is disposed in a second location; (C) storing, or directing storage of, a first acoustic map indicative of an acoustic transfer function between the first location and the second location; (D) using, or directing usage of, the emitter to emit a second acoustic test signal; (E) measuring, or directing measurement of, a second acoustic response corresponding to the second acoustic test signal; (F) determining, or directing determination of,
- the emitter is part of a device ensemble housing at least one sensor and at least one emitter.
- the threshold is a function.
- the emitter is operatively coupled to a control system.
- the operations comprise controlling at least one apparatus in the enclosure and/or in a facility in which the enclosure is disposed, which controlling is by the control system.
- the at least one apparatus comprises a lighting device, a tintable window, another sensor, another emitter, a media display, a dispenser, a processor, a power source, a security system, a fire alarm system, a sound media, a heater, a cooler, a vent, or a heating ventilation and air conditioning system (HVAC).
- the control system comprises a hierarchy of controllers.
- the emitter is operatively coupled to a network.
- the sensor is communicatively coupled to a network in a wired and/or wireless manner.
- the emitter and/or the sensor are communicatively coupled to a network in a wired and/or wireless manner.
- the network is configured to transmit power and/or data.
- the network is configured to transmit broadband cellular network technology communication of at least a third generation, fourth generation, or fifth generation cellular communication protocol.
- the network is operatively coupled to a router, multiplier, antenna, and/or transceiver.
- the network is disposed at least in an envelope of the enclosure and/or a building in which the enclosure is disposed.
- the emitter comprises a buzzer.
- the operations comprise using the emitter to emit sounds including discrete sounds of a sound spectrum.
- the sensor is configured to detect sounds including continuous sounds of a sound spectrum.
- the operations comprise using the emitter to emit sounds including sounds having a frequency of from about 10 Hz to about 20 kHz.
- the operations comprise using the emitter to emit the first acoustic test signal and/or the second acoustic test signal according to a schedule.
- the operations comprise using the emitter to emit the first acoustic test signal and/or the second acoustic test signal when the enclosure is non-inhabited. In some embodiments, the operations comprise using the emitter to emit the first acoustic test signal and/or the second acoustic test signal outside standard work hours in the enclosure and/or in a facility in which the enclosure is disposed. In some embodiments, the operations comprise using the emitter to emit the first acoustic test signal and/or the second acoustic test signal when the enclosure is forecasted to experience a quiet period of a length that is at least sufficient to generate the first acoustic map and/or the second acoustic map.
- the operations comprise using the emitter to emit the second acoustic test signal according to a schedule that considers a change in a fixture of the enclosure and/or of the facility in which the enclosure is disposed. In some embodiments, the operations comprise using the emitter to emit the second acoustic test signal according to a schedule that considers a change in a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed.
- the enclosure is at least part of a building, or a vehicle. In some embodiments, the enclosure comprises a room. In some embodiments, the enclosure is configured for one or more occupants.
- the sensor is comprised in a device ensemble housing another device that includes at least one sensor and/or at least one emitter.
- the second location is in the enclosure. In some embodiments, the second location is outside of the enclosure.
- storing of the first acoustic map utilizes a memory disposed in the enclosure, and/or in a building in which the enclosure is disposed. In some embodiments, storing of the first acoustic map utilizes a memory disposed outside of the enclosure and/or outside of a building in which the enclosure is disposed. In some embodiments, storing of the first acoustic map is in an ensemble housing at least one other device including at least one sensor and/or at least one emitter.
- storing of the first acoustic map utilizes a network to which the sensor and emitter are coupled.
- the first acoustic map and/or the second acoustic map is generated by a processor that is part of, or is operatively coupled to, a control system.
- the operations further comprise generating, or directing generation of, the acoustic map by at least one processor that is part of, or is operatively coupled to, a network to which the sensor and emitter are coupled.
- generation of the acoustic map excludes utilizing a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed.
- generation of the acoustic map comprises utilizing a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed.
- measurement of the second acoustic response is by the same sensor measuring the first acoustic response.
- the sensor is a first sensor, and wherein measurement of the second acoustic response is at least in part by a second sensor.
- the second sensor is disposed in the enclosure. In some embodiments, the second sensor is disposed outside of the enclosure.
- the sensor is a first sensor
- the operations comprise: (H) using a second sensor disposed at a third location to measure a third acoustic response to the second acoustic test signal, wherein the second acoustic response measured in (E) is sensed at the second location by the first sensor; and (I) comparing the second acoustic response and the third acoustic response to detect a fault in the emitter or in one of the sensors.
- the emitter is a first emitter
- the operations comprise: (H) using a second emitter at a third location to emit a third acoustic test signal; (I) measuring a third acoustic response corresponding to the third acoustic test signal; and (J) comparing the third acoustic response to the acoustic response to the second acoustic test signal to detect a fault in the sensor, in the first emitter, or in the second emitter.
- the operations comprise: (H) detecting an irregular sound event in the enclosure utilizing a plurality of sensors that include the sensor; (I) compensating the detected sound event according to a corresponding acoustic transfer function from the first acoustic map and/or the second acoustic map; (J) recognizing an event type utilizing the compensated detected sound event; and (K) generating a notification of the event type to a user.
- an apparatus for acoustic mapping comprises at least one controller comprising circuitry, which at least one controller is configured to: (A) operatively couple to a first emitter, a second emitter, and to a sensor; (B) direct the first emitter to emit, a first acoustic test signal, which first emitter is disposed in a first location in an enclosure; (C) direct the sensor to measure a first acoustic response corresponding to the first acoustic test signal, which sensor is disposed in a second location; (D) store, or direct storage of, a first acoustic map indicative of an acoustic transfer function between the first location and the second location; (E) direct the first emitter to emit a second acoustic test signal; (F) direct measurement of a second acoustic response corresponding to the second acoustic test signal; (G) determine, or direct determination of, a second acoustic map; and (H)
- the emitter is included in a device ensemble housing at least one sensor and at least one emitter.
- the threshold is a function.
- the emitter is operatively coupled to a control system.
- the at least one controller is configured to control at least one apparatus in the enclosure and/or in a facility in which the enclosure is disposed, which controlling is by the control system.
- the at least one apparatus in the enclosure comprises a lighting device, a tintable window, another sensor, another emitter, a media display, a dispenser, a processor, a power source, a security system, a fire alarm system, a sound media, a heater, a cooler, a vent, or a heating ventilation and air conditioning system (HVAC).
- the control system is configured to include a hierarchy of controllers.
- the emitter is operatively coupled to a network.
- the sensor is communicatively coupled to a network in a wired and/or wireless manner.
- the emitter and/or the sensor are communicatively coupled to a network in a wired and/or wireless manner.
- the network is configured to transmit power and/or data. In some embodiments, the network is configured to transmit broadband cellular network technology communication of at least a third generation, fourth generation, or fifth generation cellular communication protocol. In some embodiments, the network is operatively coupled to a router, multiplier, antenna, and/or transceiver. In some embodiments, the network is disposed at least in an envelope of the enclosure and/or a building in which the enclosure is disposed. In some embodiments, the emitter comprises a buzzer. In some embodiments, the at least one controller is configured to direct the emitter to emit sounds including discrete sounds of a sound spectrum. In some embodiments, the at least one controller is configured to direct the sensor to detect sounds including continuous sounds of a sound spectrum.
- the at least one controller is configured to direct the emitter to emit sounds including sounds having a frequency of from about 10 Hz to about 20 kHz. In some embodiments, the at least one controller is configured to direct the emitter to emit the first acoustic test signal and/or the second acoustic test signal according to a schedule. In some embodiments, the at least one controller is configured to direct the emitter to emit the first acoustic test signal and/or the second acoustic test signal when the enclosure is non-inhabited.
- the at least one controller is configured to direct the emitter to emit the first acoustic test signal and/or the second acoustic test signal outside standard work hours in the enclosure and/or in a facility in which the enclosure is disposed. In some embodiments, the at least one controller is configured to direct the emitter to emit the first acoustic test signal and/or the second acoustic test signal when the enclosure is forecasted to experience a quiet period of a length that is at least sufficient to generate the first acoustic map and/or the second acoustic map.
- the at least one controller is configured to direct the emitter to emit the second acoustic test signal according to a schedule that considers a change in a fixture of the enclosure and/or of the facility in which the enclosure is disposed. In some embodiments, the at least one controller is configured to direct the emitter to emit the second acoustic test signal according to a schedule that considers a change in a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed. In some embodiments, the enclosure is at least part of a building, or a vehicle. In some embodiments, the enclosure comprises a room. In some embodiments, the enclosure is configured for one or more occupants.
- the sensor is comprised in a device ensemble housing another device that includes at least one sensor and/or at least one emitter.
- the second location is in the enclosure. In some embodiments, the second location is outside of the enclosure.
- the apparatus further comprises a memory storing the first acoustic map, wherein the memory is disposed in the enclosure, and/or in a building in which the enclosure is disposed. In some embodiments, the apparatus further comprises a memory storing the first acoustic map, wherein the memory is disposed outside of the enclosure and/or outside of a building in which the enclosure is disposed.
- the apparatus further comprises an ensemble housing at least one other device including at least one sensor and/or at least one emitter, wherein the first acoustic map is stored in the ensemble.
- the at least one controller is configured to store the first acoustic map in a network to which the sensor and emitter are coupled.
- the at least one controller is configured to generate the first acoustic map and/or the second acoustic map, and wherein the at least one controller is part of, or is operatively coupled to, a control system.
- the at least one controller is configured to generate the acoustic map, and wherein the at least one controller is part of, or is operatively coupled to, a network to which the sensor and emitter are coupled. In some embodiments, the at least one controller is configured to generate the acoustic map without utilizing a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed. In some embodiments, the at least one controller is configured to generate the acoustic map utilizing a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed. In some embodiments, the same sensor measuring the first acoustic response is configured to measure the second acoustic response.
- the sensor is a first sensor, and wherein measurement of the second acoustic response is at least in part by a second sensor.
- the second sensor is disposed in the enclosure. In some embodiments, the second sensor is disposed outside of the enclosure.
- the sensor is a first sensor
- the at least one controller is configured to: (H) operatively couple to a second sensor disposed at a third location; (I) direct the second sensor to measure a third acoustic response to the second acoustic test signal, wherein the second acoustic response measured in (E) is sensed at the second location by the first sensor; and (J) compare, or direct comparison of, the second acoustic response and the third acoustic response to detect a fault in the emitter or in one of the sensors.
- the emitter is a first emitter
- the at least one controller is configured to: (H) operatively couple to a second emitter disposed at a third location; (I) direct the second emitter to emit a third acoustic test signal; (J) measure, or direct measurement of, a third acoustic response corresponding to the third acoustic test signal; and (K) compare, or direct comparison of, the third acoustic response to the acoustic response to the second acoustic test signal to detect a fault in the sensor, in the first emitter, or in the second emitter.
- the at least one controller is configured to: (H) operatively couple to a plurality of sensors that include the sensor; (I) direct the plurality of sensors to detect an irregular sound event in the enclosure; (J) compensate, or direct compensation of, the detected sound event according to a corresponding acoustic transfer function from the first acoustic map and/or the second acoustic map; (K) recognize, or direct recognition of, an event type utilizing the compensated detected sound event; and (L) generate, or direct generation of, a notification of the event type to a user.
- the at least one controller is configured to localize, or direct localization of, an origination of the irregular sound event based at least in part on relative magnitudes of the detected irregular sound event by at least two, or by at least three of the plurality of sensors.
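One simple reading of localization from relative magnitudes is a magnitude-weighted centroid of the sensor positions, so louder sensors pull the estimate toward them. This is a stand-in heuristic, not the method of the disclosure; names and geometry are hypothetical:

```python
import numpy as np

def localize_event(sensor_positions, magnitudes):
    """Estimate the origin of an irregular sound event as a centroid of
    sensor positions weighted by measured magnitude. A real system might
    instead fit a propagation model or use time differences of arrival."""
    pos = np.asarray(sensor_positions, dtype=float)
    w = np.asarray(magnitudes, dtype=float)
    return (pos * w[:, None]).sum(axis=0) / w.sum()

# Three sensors at room corners; the first reports the loudest magnitude,
# so the estimate lands near its corner.
sensors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
origin = localize_event(sensors, [8.0, 1.0, 1.0])
```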
- a method of acoustic mapping comprises: (A) using an emitter to emit an acoustic test signal, which emitter is disposed at a first location in an enclosure; (B) using a sensor to measure an acoustic response corresponding to the acoustic test signal, which sensor is disposed at a second location; and (C) using information pertaining to an inanimate alteration to generate an acoustic map indicative of an acoustic transfer function between the first location and the second location, which inanimate alteration is projected to affect the acoustic mapping of the enclosure.
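Steps (A)-(B) can be sketched by estimating the transfer function as the frequency-domain ratio of the measured response to the emitted test signal. The FFT-ratio estimator, function names, and sampling rate below are illustrative assumptions; the disclosure does not prescribe an estimation algorithm:

```python
import numpy as np

def estimate_transfer_function(test_signal, response, fs=48_000):
    """Estimate the acoustic transfer function between the emitter and
    sensor locations as the ratio of response spectrum to test-signal
    spectrum."""
    S = np.fft.rfft(test_signal)
    R = np.fft.rfft(response)
    eps = 1e-12  # guard against division by near-zero spectral bins
    H = R / (S + eps)
    freqs = np.fft.rfftfreq(len(test_signal), d=1.0 / fs)
    return freqs, H

# Toy check: a channel that only attenuates by half should give |H| ~ 0.5.
fs = 48_000
t = np.arange(fs) / fs
sweep = np.sin(2 * np.pi * (100.0 + 400.0 * t) * t)  # 1 s rising-tone sweep
response = 0.5 * sweep                               # idealized attenuation
freqs, H = estimate_transfer_function(sweep, response, fs)
```

An acoustic map in this sketch would then be a store of such `H` arrays keyed by (emitter location, sensor location) pairs.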
- the emitter is included in a device ensemble housing that includes at least one sensor and/or at least one emitter. In some embodiments, the emitter is operatively coupled to a control system. In some embodiments, the method further comprises controlling at least one apparatus in the enclosure and/or in a facility in which the enclosure is disposed, which controlling is by the control system. In some embodiments, the at least one apparatus comprises a lighting device, a tintable window, another sensor, another emitter, a media display, a dispenser, a processor, a power source, a security system, a fire alarm system, a sound media, a heater, a cooler, a vent, or a heating ventilation and air conditioning system (HVAC).
- the control system comprises a hierarchy of controllers.
- the emitter is operatively coupled to a network.
- the sensor is communicatively coupled to a network in a wired and/or wireless manner.
- the emitter and/or the sensor are communicatively coupled to a network in a wired and/or wireless manner.
- the network is configured to transmit power and/or data.
- the network is configured to transmit broadband cellular network technology communication of at least a third generation, fourth generation, or fifth generation cellular communication protocol.
- the network is operatively coupled to a router, multiplier, antenna, and/or transceiver.
- the network is disposed at least in an envelope of the enclosure and/or a building in which the enclosure is disposed.
- the emitter comprises a buzzer.
- the method further comprises using the emitter to emit sounds including discrete sounds of a sound spectrum.
- the method further comprises using the sensor to sense sounds including discrete sounds of a sound spectrum.
- the method further comprises using the emitter to emit sounds including sounds having a spectrum frequency from about 10 Hz to about 20 kHz.
- the method further comprises using the emitter to emit the first acoustic test signal and/or the second acoustic test signal according to a schedule.
- the method further comprises using the emitter to emit the first acoustic test signal and/or the second acoustic test signal when the enclosure is non-inhabited. In some embodiments, the method further comprises using the emitter to emit the first acoustic test signal and/or the second acoustic test signal outside standard work hours in the enclosure and/or in a facility in which the enclosure is disposed. In some embodiments, the method further comprises using the emitter to emit the second acoustic test signal according to a schedule that considers a change in a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed. In some embodiments, the enclosure is at least part of a building, or a vehicle.
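The scheduling embodiments above (emit the test signal only when the enclosure is non-inhabited and outside standard work hours) amount to a simple gate. A sketch with assumed default hours; a deployment would read occupancy and hours from the facility's systems:

```python
from datetime import datetime

def ok_to_run_sweep(now, occupied, work_start=8, work_end=18):
    """Gate the acoustic test emission on occupancy and standard work
    hours. The 8:00-18:00 default window is a placeholder assumption."""
    outside_hours = now.hour < work_start or now.hour >= work_end
    return (not occupied) and outside_hours

# 2 a.m. in an empty enclosure is an acceptable time for a test sweep.
run_at_2am = ok_to_run_sweep(datetime(2021, 8, 20, 2, 0), occupied=False)
```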
- the enclosure comprises a room. In some embodiments, the enclosure is configured for one or more occupants. In some embodiments, the emitter is a first emitter, and wherein the method further comprises using a second emitter disposed at a third location to emit at least one other acoustic test signal. In some embodiments, the third location is different from the first location and from the second location. In some embodiments, one or more of the locations is disposed in the enclosure. In some embodiments, one or more of the locations is disposed outside the enclosure. In some embodiments, generation of the acoustic map comprises utilizing sensor measurements responsive to the at least one other acoustic test signal. In some embodiments, the second location is in the enclosure.
- the second location is outside of the enclosure.
- the sensor is a first sensor, and wherein the method further comprises using a second sensor to measure at least one other acoustic response corresponding to the first acoustic test signal, which second sensor is disposed at a third location different from the second location.
- the third location is different from the first location.
- one or more of the locations is disposed in the enclosure.
- one or more of the locations is disposed outside the enclosure.
- generation of the acoustic map comprises utilizing measurements of the second sensor.
- the second sensor is at least two other sensors.
- the second location differs from the third location horizontally and/or vertically.
- the method further comprises generating a second acoustic mapping at a second time after the inanimate alteration to detect the alteration in the acoustic transfer function.
- the information is based at least in part on a Building Information Modeling file.
- the information comprises a shape, or a material property of the one or more fixtures.
- the inanimate alteration is of one or more fixtures and/or non-fixtures.
- the alteration comprises an alteration in the enclosure.
- the alteration comprises an alteration out of the enclosure.
- the fixture comprises a wall, a window, a shelf, a lighting, or a door.
- the non-fixtures comprise a desk, or a chair.
- the inanimate alteration is of an inanimate object.
- the first acoustic map is stored in a memory disposed in the enclosure, and/or in a building in which the enclosure is disposed. In some embodiments, the first acoustic map is stored in a memory disposed outside of the enclosure and/or outside of a building in which the enclosure is disposed. In some embodiments, storing the first acoustic map utilizes a network to which the sensor and emitter are coupled.
- the first acoustic map and/or the second acoustic map is generated by a processor that is part of, or is operatively coupled to, a control system. In some embodiments, the acoustic map is generated by a processor that is part of, or is operatively coupled to, a network to which the sensor and emitter are coupled. In some embodiments, generation of the acoustic map comprises utilizing a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed. In some embodiments, the first acoustic map is generated within at most about a day, 8h, 4h, 2h, or 1h.
- generation of the acoustic map utilizes information of (i) sound frequency sweeping, (ii) location, and (iii) coordination, of the emitter, of the sensor, of the at least one other emitter, and/or of the at least one sensor.
- coordination comprises coordination of sound emission times, or coordination of sound sensing times.
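Coordinating sound emission and sensing times can be as simple as assigning each emitter a non-overlapping sweep slot, so that a sensor can attribute each measured response to exactly one emitter. The slot and gap lengths below are illustrative assumptions:

```python
from datetime import datetime, timedelta

def sweep_schedule(emitter_ids, start, sweep_seconds=30, gap_seconds=5):
    """Assign each emitter a non-overlapping (start, end) time slot for
    its frequency sweep. Durations are assumed, tunable values."""
    slots = {}
    t = start
    for eid in emitter_ids:
        slots[eid] = (t, t + timedelta(seconds=sweep_seconds))
        t += timedelta(seconds=sweep_seconds + gap_seconds)
    return slots

schedule = sweep_schedule(["emitter-1", "emitter-2"],
                          datetime(2021, 8, 20, 2, 0))
```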
- a non-transitory computer readable media for acoustic mapping, when read by one or more processors, is configured to execute operations comprising: (A) using, or directing usage of, an emitter to emit an acoustic test signal, which emitter is disposed in a first location in an enclosure; (B) using, or directing usage of, a sensor to measure an acoustic response corresponding to the acoustic test signal, which sensor is disposed in a second location; and (C) using, or directing usage of, information pertaining to an inanimate alteration to generate an acoustic map indicative of an acoustic transfer function between the first location and the second location, which inanimate alteration is projected to affect the acoustic mapping of the enclosure.
- the emitter is included in a device ensemble housing that includes at least one sensor and/or at least one emitter. In some embodiments, the emitter is operatively coupled to a control system. In some embodiments, the operations further comprise controlling, or directing control of, at least one apparatus in the enclosure and/or in a facility in which the enclosure is disposed, which control is by the control system. In some embodiments, the at least one apparatus comprises a lighting device, a tintable window, another sensor, another emitter, a media display, a dispenser, a processor, a power source, a security system, a fire alarm system, a sound media, a heater, a cooler, a vent, or a heating ventilation and air conditioning system (HVAC).
- the control system comprises a hierarchy of controllers.
- the emitter is operatively coupled to a network.
- the sensor is communicatively coupled to a network in a wired and/or wireless manner.
- the emitter and/or the sensor are communicatively coupled to a network in a wired and/or wireless manner.
- the network is configured to transmit power and/or data.
- the network is configured to transmit broadband cellular network technology communication of at least a third generation, fourth generation, or fifth generation cellular communication protocol.
- the network is operatively coupled to a router, multiplier, antenna, and/or transceiver.
- the network is disposed at least in an envelope of the enclosure and/or a building in which the enclosure is disposed.
- the emitter comprises a buzzer.
- the operations further comprise using, or directing usage of, the emitter to emit sounds including discrete sounds of a sound spectrum.
- the operations further comprise using, or directing usage of, the sensor to sense sounds including discrete sounds of a sound spectrum.
- the operations further comprise using, or directing usage of, the emitter to emit sounds including sounds having a spectrum frequency from about 10 Hz to about 20 kHz.
- the operations further comprise using, or directing usage of, the emitter to emit the first acoustic test signal and/or the second acoustic test signal according to a schedule. In some embodiments, the operations further comprise using, or directing usage of, the emitter to emit the first acoustic test signal and/or the second acoustic test signal when the enclosure is non-inhabited. In some embodiments, the operations further comprise using, or directing usage of, the emitter to emit the first acoustic test signal and/or the second acoustic test signal outside standard work hours in the enclosure and/or in a facility in which the enclosure is disposed.
- the operations further comprise using, or directing usage of, the emitter to emit the second acoustic test signal according to a schedule that considers a change in a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed.
- the enclosure is at least part of a building, or a vehicle.
- the enclosure comprises a room.
- the enclosure is configured for one or more occupants.
- the emitter is a first emitter, and wherein the operations further comprise using, or directing usage of, a second emitter disposed at a third location to emit at least one other acoustic test signal.
- the third location is different from the first location and from the second location.
- one or more of the locations is disposed in the enclosure. In some embodiments, one or more of the locations is disposed outside the enclosure.
- generation of the acoustic map comprises the operation of utilizing sensor measurements responsive to the at least one other acoustic test signal.
- the second location is in the enclosure. In some embodiments, the second location is outside of the enclosure.
- the sensor is a first sensor, and wherein the operations further comprise using, or directing usage of, a second sensor to measure at least one other acoustic response corresponding to the first acoustic test signal, which second sensor is disposed at a third location different from the second location.
- the third location is different from the first location.
- one or more of the locations is disposed in the enclosure.
- one or more of the locations is disposed outside the enclosure.
- generation of the acoustic map comprises the operation of utilizing measurements of the second sensor.
- the second sensor is at least two other sensors.
- the second location differs from the third location horizontally and/or vertically.
- the operations further comprise generating, or directing generation of, a second acoustic mapping at a second time after the inanimate alteration to detect the alteration in the acoustic transfer function.
- the information is based at least in part on a Building Information Modeling file.
- the information comprises a shape, or a material property of the one or more fixtures.
- the inanimate alteration is of one or more fixtures and/or non-fixtures.
- the alteration comprises an alteration in the enclosure.
- the alteration comprises an alteration out of the enclosure.
- the fixture comprises a wall, a window, a shelf, a lighting, or a door.
- the non-fixtures comprise a desk, or a chair.
- the inanimate alteration is of an inanimate object.
- the operations further comprise storing, or directing storage of, the first acoustic map in a memory disposed in the enclosure, and/or in a building in which the enclosure is disposed.
- the operations further comprise storing, or directing storage of, the first acoustic map in a memory disposed outside of the enclosure and/or outside of a building in which the enclosure is disposed.
- storage of the first acoustic map comprises an operation of utilizing a network to which the sensor and emitter are coupled.
- the first acoustic map and/or the second acoustic map is generated by a processor of the one or more processors, which processor is included in, or is operatively coupled to, a control system.
- the acoustic map is generated by a processor of the one or more processors, which processor is included in, or is operatively coupled to, a network to which the sensor and emitter are coupled.
- generation of the acoustic map further comprises the operation of utilizing, or directing utilization of, a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed.
- the first acoustic map is generated within at most about a day, 8h, 4h, 2h, or 1h.
- the operations further comprise generating, or directing generation of, the acoustic map utilizing information of (i) sound frequency sweeping, (ii) location, and (iii) coordination, of the emitter, of the sensor, of the at least one other emitter, and/or of the at least one sensor.
- coordination comprises coordination of sound emission times, and/or coordination of sound sensing times.
- an apparatus for acoustic mapping comprises at least one controller comprising circuitry, which at least one controller is configured to: (A) operatively couple to an emitter and to a sensor, (B) direct the emitter to emit an acoustic test signal, which emitter is disposed in a first location in an enclosure; (C) direct the sensor to measure an acoustic response corresponding to the acoustic test signal, which sensor is disposed in a second location; and (D) use, or direct usage of, information pertaining to an inanimate alteration to generate an acoustic map indicative of an acoustic transfer function between the first location and the second location, which inanimate alteration is projected to affect the acoustic mapping of the enclosure.
- the apparatus further comprises a device ensemble housing devices that include at least one sensor and/or at least one emitter, wherein the emitter is included in the device ensemble.
- the emitter is operatively coupled to a control system.
- the at least one controller is included in, or is operatively coupled to, the control system.
- the at least one controller is configured to control at least one apparatus in the enclosure and/or in a facility in which the enclosure is disposed, which controlling is by the control system.
- the at least one apparatus comprises a lighting device, a tintable window, another sensor, another emitter, a media display, a dispenser, a processor, a power source, a security system, a fire alarm system, a sound media, a heater, a cooler, a vent, or a heating ventilation and air conditioning system (HVAC).
- the control system comprises a hierarchy of controllers.
- the emitter is operatively coupled to a network in a wired and/or wireless manner.
- the sensor is communicatively coupled to a network in a wired and/or wireless manner.
- the emitter and/or the sensor are communicatively coupled to a network in a wired and/or wireless manner.
- the network is configured to transmit power and/or data.
- the network is configured to transmit broadband cellular network technology communication of at least a third generation, fourth generation, or fifth generation cellular communication protocol.
- the network is operatively coupled to a router, multiplier, antenna, and/or transceiver.
- the network is disposed at least in an envelope of the enclosure and/or a building in which the enclosure is disposed.
- the emitter comprises a buzzer.
- the at least one controller is configured to use, or direct usage of, the emitter to emit sounds including discrete sounds of a sound spectrum. In some embodiments, the at least one controller is configured to use, or direct usage of, the sensor to sense sounds including discrete sounds of a sound spectrum. In some embodiments, the at least one controller is configured to use, or direct usage of, the emitter to emit sounds including sounds having a spectrum frequency from about 10 Hz to about 20 kHz. In some embodiments, the at least one controller is configured to use, or direct usage of, the emitter to emit the first acoustic test signal and/or the second acoustic test signal according to a schedule.
- the at least one controller is configured to use, or direct usage of, the emitter to emit the first acoustic test signal and/or the second acoustic test signal when the enclosure is non-inhabited. In some embodiments, the at least one controller is configured to use, or direct usage of, the emitter to emit the first acoustic test signal and/or the second acoustic test signal outside standard work hours in the enclosure and/or in a facility in which the enclosure is disposed. In some embodiments, the at least one controller is configured to use, or direct usage of, the emitter to emit the second acoustic test signal according to a schedule that considers a change in a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed.
- the enclosure is at least part of a building, or a vehicle. In some embodiments, the enclosure comprises a room. In some embodiments, the enclosure is configured for one or more occupants. In some embodiments, the emitter is a first emitter, and wherein the at least one controller is configured to use, or direct usage of, a second emitter disposed at a third location to emit at least one other acoustic test signal. In some embodiments, the third location is different from the first location and from the second location. In some embodiments, one or more of the locations is disposed in the enclosure. In some embodiments, one or more of the locations is disposed outside the enclosure.
- the at least one controller is configured to generate the acoustic map utilizing sensor measurements responsive to the at least one other acoustic test signal.
- the second location is in the enclosure. In some embodiments, the second location is outside of the enclosure.
- the sensor is a first sensor, and wherein the at least one controller is configured to use a second sensor to measure at least one other acoustic response corresponding to the first acoustic test signal, which second sensor is disposed at a third location different from the second location. In some embodiments, the third location is different from the first location.
- one or more of the locations is disposed in the enclosure. In some embodiments, one or more of the locations is disposed outside the enclosure.
- the at least one controller is configured to generate, or direct generation of, the acoustic map utilizing measurements of the second sensor.
- the second sensor is at least two other sensors.
- the second location differs from the third location horizontally and/or vertically.
- the at least one controller is configured to generate, or direct generation of, a second acoustic mapping at a second time after the inanimate alteration to detect the alteration in the acoustic transfer function.
- the information is based at least in part on a Building Information Modeling file.
- the information comprises a shape, or a material property of the one or more fixtures.
- the inanimate alteration is of one or more fixtures and/or non-fixtures.
- the alteration comprises an alteration in the enclosure.
- the alteration comprises an alteration out of the enclosure.
- the fixture comprises a wall, a window, a shelf, a lighting, or a door.
- the non-fixtures comprise a desk, or a chair.
- the inanimate alteration is of an inanimate object.
- the at least one controller is configured to store the first acoustic map in a memory disposed in the enclosure, and/or in a building in which the enclosure is disposed.
- the at least one controller is configured to store, or direct storage of, the first acoustic map in a memory disposed outside of the enclosure and/or outside of a building in which the enclosure is disposed. In some embodiments, the at least one controller is configured to store, or direct storage of, the first acoustic map in a network to which the sensor and emitter are coupled. In some embodiments, the first acoustic map and/or the second acoustic map is generated by a processor that is part of, or is operatively coupled to, a control system. In some embodiments, the at least one controller is part of, or is operatively coupled to, a control system.
- the acoustic map is generated by a processor and/or a controller that is part of, or is operatively coupled to, a network to which the sensor and emitter are coupled.
- the at least one controller is configured to generate, or direct generation of, the acoustic map utilizing a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed.
- the at least one controller is configured to generate, or direct generation of, the first acoustic map within at most about a day, 8h, 4h, 2h, or 1h.
- the at least one controller is configured to generate, or direct generation of, the acoustic map utilizing information of (i) sound frequency sweeping, (ii) location, and (iii) coordination, of the emitter, of the sensor, of the at least one other emitter, and/or of the at least one sensor.
- coordination comprises the at least one controller being configured to coordinate, or direct coordination of, sound emission times and/or sound sensing times.
- a method of acoustic mapping comprises: (A) sensing a present sound event in an enclosure by using a plurality of sensors; (B) comparing the present sound event sensed by the plurality of sensors to historic sensed data by the plurality of sensors to generate a result; (C) using the result to determine any irregular sound event in the enclosure by comparing to a threshold; and (D) compensating for the irregular sound event according to a corresponding acoustic transfer function of the enclosure, which transfer function is determined utilizing at least one sensor of the plurality of sensors.
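Steps (B)-(C), comparing present sensing against historic data and thresholding the result, can be sketched as follows. A fixed dB margin is assumed for the threshold; per the later embodiments, the threshold may instead be a (time-dependent) function:

```python
import numpy as np

def is_irregular(present_level_db, historic_levels_db, threshold_db=10.0):
    """Flag an irregular sound event when the present level deviates
    from the historic mean by more than a threshold. The 10 dB margin
    is an illustrative assumption."""
    baseline = float(np.mean(historic_levels_db))
    return abs(present_level_db - baseline) > threshold_db

history_db = [42.0, 44.0, 41.0, 43.0]  # typical ambient levels for a sensor
```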
- the method further comprises localizing an origination of the irregular sound event based at least in part on relative magnitudes of the detected irregular sound event sensed by at least two, or by at least three of the plurality of sensors.
- the method further comprises recognizing an event type of the irregular sound event, and generating a notification of the event type to a user.
- recognizing the event type includes using machine learning to determine an identifying signature of the irregular sound event.
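As a toy illustration of a machine-learned identifying signature, the sketch below uses a nearest-centroid classifier over precomputed sound features. The feature layout, labels, and model choice are hypothetical, not taken from the disclosure:

```python
import numpy as np

def train_centroids(features, labels):
    """Per-class mean feature vectors: a tiny nearest-centroid stand-in
    for a learned identifying signature. Features (e.g. band energies)
    are assumed to be extracted upstream."""
    X = np.asarray(features, dtype=float)
    y = np.asarray(labels)
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def classify(centroids, feature):
    """Assign the label whose centroid is nearest to the feature vector."""
    f = np.asarray(feature, dtype=float)
    return min(centroids, key=lambda label: np.linalg.norm(centroids[label] - f))

# Hypothetical two-band energy features for two event types.
X = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]]
y = ["gunshot", "gunshot", "cough", "cough"]
model = train_centroids(X, y)
```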
- the recognized event type is associated with anticipated recurring sounds, and wherein the method further comprises preemptively adjusting acoustic properties in the enclosure to obtain an acoustic transfer function that mitigates effects of the anticipated recurring sounds.
- the sound event comprises a gathering such as a meeting, a conference, or a party.
- the sound event comprises a gunshot, earthquake, strong wind, or a cry.
- the strong wind comprises a tornado, a hurricane, or a tsunami-initiated wind.
- the event type comprises a safety event, a health event, and/or a security event.
- the notification comprises an event category, a subtype, or an event location.
- the event category comprises a gunshot and the subtype comprises a type of gun.
- the event category comprises a cough and the subtype comprises a suspected type of a cough.
- the event category comprises a weather phenomenon.
- the threshold comprises a value. In some embodiments, the threshold comprises a function. In some embodiments, the function is a time dependent function. In some embodiments, the compensation is done in real time during the present sound event. In some embodiments, the compensation is automatic. In some embodiments, the compensation utilizes one or more acoustic modification devices operatively coupled to a network to which the plurality of sensors are operatively coupled. In some embodiments, the acoustic modification devices comprise at least one sound emitter, sound dampener, actuator, lever, and/or vent. In some embodiments, the network is operatively coupled to a control system. In some embodiments, the control system comprises a hierarchy of controllers.
- the acoustic transfer function is determined utilizing at least one emitter, and wherein the method further comprises: (E) using the emitter to emit an acoustic test signal, which emitter is disposed at a first location in the enclosure; (F) using the one sensor to measure an acoustic response corresponding to the acoustic test signal, which one sensor is disposed at a second location; and (G) storing an acoustic map indicative of the acoustic transfer function between the first location and the second location.
- the emitter comprises a buzzer.
- the method further comprises using the emitter to emit sounds including discrete sounds of a sound spectrum.
- the at least one sensor is configured to detect sounds including continuous sounds of a sound spectrum.
- the method further comprises using the emitter to emit sounds including sounds having a spectrum frequency from about 10 Hz to about 20 kHz.
- the method further comprises using the emitter to emit the acoustic test signal according to a schedule.
- the method further comprises using the emitter to emit the acoustic test signal when the enclosure is non-inhabited.
- the method further comprises using the emitter to emit the acoustic test signal outside standard work hours in the enclosure and/or in a facility in which the enclosure is disposed.
- the method further comprises using the emitter to emit the acoustic test signal when the enclosure is forecasted to experience a quiet period of a length that is at least sufficient to generate the acoustic map. In some embodiments, the method further comprises using the emitter to emit the acoustic test signal according to a schedule that considers a change in a fixture of the enclosure and/or of the facility in which the enclosure is disposed. In some embodiments, the method further comprises using the emitter to emit the acoustic test signal according to a schedule that considers a change in a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed. In some embodiments, the enclosure is at least part of a building, or a vehicle.
- a non-transitory computer readable media for acoustic mapping, when read by one or more processors, is configured to execute operations comprising: (A) using, or directing usage of, a plurality of sensors to sense a present sound event in an enclosure; (B) comparing, or directing comparison of, the present sound event sensed by the plurality of sensors to historic sensed data by the plurality of sensors to generate a result; (C) using, or directing usage of, the result to determine any irregular sound event in the enclosure by comparing to a threshold; and (D) compensating, or directing compensation, for the irregular sound event according to a corresponding acoustic transfer function of the enclosure, which transfer function is determined utilizing at least one sensor of the plurality of sensors.
- the operations comprise localizing an origination of the irregular sound event based at least in part on relative magnitudes of the detected irregular sound event sensed by at least two, or by at least three of the plurality of sensors.
- the operations comprise recognizing, or directing recognition of, an event type of the irregular sound event, and (i) generating a notification of the event type to a user or (ii) directing generation of a notification of the event type to a user.
- the operation of recognizing, or directing recognition of, the event type includes using machine learning to determine an identifying signature of the irregular sound event.
- the recognized event type is associated with anticipated recurring sounds, and wherein the operations further comprise preemptively adjusting, or directing adjustment of, acoustic properties in the enclosure to obtain an acoustic transfer function that mitigates effects of the anticipated recurring sounds.
- the sound event comprises a gathering such as a meeting, a conference, or a party.
- the sound event comprises a gunshot, earthquake, strong wind, or a cry.
- the strong wind comprises a hurricane, a tornado, or a tsunami-initiated wind.
- the event type comprises a safety event, a health event, and/or a security event.
- the notification comprises an event category, a subtype, or an event location.
- the event category comprises a gunshot and the subtype comprises a type of gun. In some embodiments, the event category comprises a cough and the subtype comprises a suspected type of a cough. In some embodiments, the event category comprises a weather phenomenon. In some embodiments, the threshold is comprised of a value. In some embodiments, the threshold is comprised of a function. In some embodiments, the function is a time dependent function. In some embodiments, the compensation is done in real time during the present sound event. In some embodiments, the compensation is automatic. In some embodiments, the operation of compensation utilizes one or more acoustic modification devices operatively coupled to a network to which the plurality of sensors are operatively coupled to.
- operating the acoustic modification devices comprises adjusting at least one sound emitter, sound dampener, actuator, lever, and/or vent.
- the network is operatively coupled to a control system.
- the control system comprises a hierarchy of controllers.
- the one or more processors are operatively coupled to, or are included in, the control system.
- the acoustic transfer function is determined utilizing at least one emitter, wherein the operations further comprise: (E) using the emitter to emit an acoustic test signal, which emitter is disposed at a first location in the enclosure; (F) using the one sensor to measure an acoustic response corresponding to the acoustic test signal, which one sensor is disposed at a second location; and (G) storing an acoustic map indicative of the acoustic transfer function between the first location and the second location.
- the emitter comprises a buzzer.
- the operations further comprise using, or directing usage of, the emitter to emit sounds including discrete sounds of a sound spectrum.
- the at least one sensor is configured to detect sounds including continuous sounds of a sound spectrum.
- the operations further comprise using, or directing usage of, the emitter to emit sounds having a spectral frequency of from about 10 Hz to about 20 kHz.
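A test signal spanning roughly the 10 Hz to 20 kHz band is commonly generated as a frequency sweep (chirp), consistent with the time-dependent frequency sweeps of Figs. 8A-9B. The sketch below is illustrative only; the function name and parameter defaults are assumptions, not the disclosed implementation.

```python
import math

def linear_chirp(f0=10.0, f1=20000.0, duration=1.0, rate=48000):
    """Linear frequency sweep usable as an acoustic test signal.

    The instantaneous frequency rises linearly from f0 to f1 over
    `duration` seconds, sampled at `rate` Hz.
    """
    k = (f1 - f0) / duration  # sweep rate in Hz per second
    n = int(duration * rate)
    return [math.sin(2 * math.pi * (f0 * t + 0.5 * k * t * t))
            for t in (i / rate for i in range(n))]

sweep = linear_chirp(duration=0.1)
print(len(sweep))  # 4800 samples for 0.1 s at 48 kHz
```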
- the operations further comprise using, or directing usage of, the emitter to emit the acoustic test signal according to a schedule.
- the operations further comprise using, or directing usage of, the emitter to emit the acoustic test signal when the enclosure is non-inhabited.
- the operations further comprise using, or directing usage of, the emitter to emit the acoustic test signal outside standard work hours in the enclosure and/or in a facility in which the enclosure is disposed. In some embodiments, the operations further comprise using, or directing usage of the emitter to emit the acoustic test signal when the enclosure is forecasted to experience a quiet period of a length that is at least sufficient to generate the acoustic map. In some embodiments, the operations further comprise using, or directing usage of, the emitter to emit the acoustic test signal according to a schedule that considers a change in a fixture of the enclosure and/or of the facility in which the enclosure is disposed.
- the operations further comprise using, or directing usage of, the emitter to emit the acoustic test signal according to a schedule that considers a change in a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed.
- the enclosure is at least part of a building, or a vehicle.
- an apparatus for acoustic mapping comprises at least one controller comprising circuitry, which at least one controller is configured to: (A) operatively couple to a plurality of sensors; (B) direct a plurality of sensors to sense a present sound event in an enclosure; (C) compare, or direct comparison of, the present sound event sensed by the plurality of sensors to historic sensed data by the plurality of sensors to generate a result; (D) use, or direct the use of, the result to determine any irregular sound event in the enclosure by comparing to a threshold; and (E) compensate, or direct compensation, for the irregular sound event according to a corresponding acoustic transfer function of the enclosure, which transfer function is determined utilizing at least the one sensor of the plurality of sensors.
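Steps (C) and (D) above, comparing a present sound event against historic sensed data and a threshold, can be sketched as a simple baseline-deviation check. All names here are hypothetical; the disclosure also permits the threshold to be a (time-dependent) function rather than the scalar used below.

```python
def detect_irregular(present, historic_mean, historic_std, threshold=3.0):
    """Flag a present sound level as irregular when it deviates from the
    historic baseline by more than `threshold` standard deviations.

    `threshold` is a scalar here; a time-dependent function could, for
    example, apply a looser limit during standard work hours.
    """
    if historic_std == 0:
        return present != historic_mean
    return abs(present - historic_mean) / historic_std > threshold

# Historic baseline: 40 dB mean, 2 dB standard deviation.
print(detect_irregular(70.0, 40.0, 2.0))  # 15-sigma deviation: irregular
print(detect_irregular(42.0, 40.0, 2.0))  # 1-sigma deviation: regular
```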
- the at least one controller is configured to localize an origination of the irregular sound event based at least in part on relative magnitudes of the irregular sound event sensed by at least two, or at least three of the plurality of sensors. In some embodiments, the at least one controller is configured to recognize an event type of the irregular sound event, and to (i) generate a notification of the event type to a user or (ii) direct generation of a notification of the event type to a user. In some embodiments, the at least one controller is configured to recognize, or direct recognition of, the event type by use of machine learning to determine an identifying signature of the irregular sound event.
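Localization from relative magnitudes at three or more sensors can be illustrated by a grid search over candidate origins, scoring each candidate by how well an assumed free-field inverse-square decay reproduces the sensed magnitudes. This is only a sketch under that stated decay assumption; a real system would fold in the enclosure's acoustic map, and all names below are hypothetical.

```python
def localize(sensors, magnitudes, grid=50, size=10.0):
    """Grid-search the point whose inverse-square distance profile best
    matches the relative magnitudes sensed at three or more sensors."""
    best, best_err = None, float("inf")
    for i in range(grid + 1):
        for j in range(grid + 1):
            x, y = size * i / grid, size * j / grid
            # predicted (unscaled) magnitude at each sensor, 1/d^2 decay
            pred = [1.0 / max((x - sx) ** 2 + (y - sy) ** 2, 1e-6)
                    for sx, sy in sensors]
            # least-squares scale factor for the unknown source loudness
            scale = (sum(m * p for m, p in zip(magnitudes, pred))
                     / sum(p * p for p in pred))
            err = sum((m - scale * p) ** 2 for m, p in zip(magnitudes, pred))
            if err < best_err:
                best, best_err = (x, y), err
    return best

sensors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
source = (4.0, 6.0)
mags = [1.0 / ((source[0] - sx) ** 2 + (source[1] - sy) ** 2)
        for sx, sy in sensors]
print(localize(sensors, mags))  # recovers a point near (4.0, 6.0)
```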
- the at least one controller is configured to recognize, or direct recognition of, the event type associated with anticipated recurring sounds, and wherein the at least one controller is configured to preemptively adjust, or direct adjustment of, one or more acoustic properties in the enclosure to obtain an acoustic transfer function that mitigates effects of the anticipated recurring sounds.
- the sound event comprises a gathering such as a meeting, a conference, or a party.
- the sound event comprises a gunshot, an earthquake, a strong wind, or a cry.
- the strong wind comprises a hurricane, a tornado, or a tsunami-initiated wind.
- the event type comprises a safety event, a health event, and/or a security event.
- the notification comprises an event category, a subtype, or an event location.
- the event category comprises a gunshot and the subtype comprises a type of gun.
- the event category comprises a cough and the subtype comprises a suspected type of a cough.
- the event category comprises a weather phenomenon.
- the threshold is comprised of a value.
- the threshold is comprised of a function.
- the function is a time dependent function.
- the at least one controller is configured to compensate in real time during the present sound event. In some embodiments, the compensation is automatic.
- the at least one controller is configured to compensate, or direct compensation, by the use of one or more acoustic modification devices operatively coupled to a network to which the plurality of sensors are operatively coupled. In some embodiments, the at least one controller is configured to adjust, or direct adjustment of, at least one sound emitter, sound dampener, actuator, lever, and/or vent.
- the network is operatively coupled to a control system. In some embodiments, the control system comprises a hierarchy of controllers. In some embodiments, the control system is operatively coupled to, or includes, the at least one controller.
- the at least one controller is configured to determine the acoustic transfer function by use of at least one emitter, wherein the at least one controller is further configured to: (E) use the emitter to emit an acoustic test signal, which emitter is disposed at a first location in the enclosure; (F) use the one sensor to measure an acoustic response corresponding to the acoustic test signal, which one sensor is disposed at a second location; and (G) store an acoustic map indicative of the acoustic transfer function between the first location and the second location.
- the emitter comprises a buzzer.
- the at least one controller is configured to use, or direct usage of, the emitter to emit sounds including discrete sounds of a sound spectrum. In some embodiments, the at least one controller is configured to use, or direct usage of, the at least one sensor to detect sounds including continuous sounds of a sound spectrum. In some embodiments, the at least one controller is configured to use, or direct usage of, the emitter to emit sounds having a spectral frequency of from about 10 Hz to about 20 kHz. In some embodiments, the at least one controller is configured to use, or direct usage of, the emitter to emit the acoustic test signal according to a schedule.
- the at least one controller is configured to use, or direct usage of, the emitter to emit the acoustic test signal when the enclosure is non-inhabited. In some embodiments, the at least one controller is configured to use, or direct usage of, the emitter to emit the acoustic test signal outside standard work hours in the enclosure and/or in a facility in which the enclosure is disposed. In some embodiments, the at least one controller is configured to use, or direct usage of, the emitter to emit the acoustic test signal when the enclosure is forecasted to experience a quiet period of a length that is at least sufficient to generate the acoustic map.
- the at least one controller is configured to use, or direct usage of, the emitter to emit the acoustic test signal according to a schedule that considers a change in a fixture of the enclosure and/or of the facility in which the enclosure is disposed. In some embodiments, the at least one controller is configured to use, or direct usage of, the emitter to emit the acoustic test signal according to a schedule that considers a change in a Building Information Modeling file of the enclosure and/or of the facility in which the enclosure is disposed. In some embodiments, the enclosure is at least part of a building, or a vehicle.
- an apparatus for acoustic (e.g., sound) conditioning in a facility
- the apparatus comprises at least one controller configured to: (i) operatively couple to at least one sound sensor disposed in a facility; (ii) direct the at least one sound sensor to collect sound measurements over a first time; and (iii) use, or direct usage of, the sound measurements to condition the sound in at least a portion of the facility at a second time after the first time.
- the at least one controller is configured to use, or direct usage of, the sound measurements at least in part by using artificial intelligence, wherein the artificial intelligence optionally comprises machine learning.
- the artificial intelligence uses a learning set comprising (i) historical sound measurements in the facility, (ii) historical sound measurements in another facility, or (iii) synthesized sound measurements.
- the artificial intelligence is based at least in part on artificial intelligence computational schemes.
- at least one of the artificial intelligence computational schemes has a weight different than at least one other of the artificial intelligence computational schemes.
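Unequal weighting of computational schemes, as described above, can be sketched as a weighted combination of per-scheme scores. The names and weights below are hypothetical and stand in for whatever classifiers over sound measurements an actual implementation would use.

```python
def weighted_ensemble(scheme_scores, weights):
    """Combine per-scheme confidence scores with unequal weights.

    Each scheme (e.g., a different classifier over sound measurements)
    contributes a score in [0, 1]; one scheme may carry a different
    weight than another.
    """
    total = sum(weights)
    return sum(s * w for s, w in zip(scheme_scores, weights)) / total

# Three hypothetical schemes scoring an event, with the third scheme
# weighted twice as heavily as the other two.
score = weighted_ensemble([0.9, 0.6, 0.8], [1.0, 1.0, 2.0])
print(round(score, 3))  # (0.9 + 0.6 + 1.6) / 4.0 = 0.775
```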
- the at least one controller is configured to damp, or direct damping of, sound in the facility, and optionally wherein the at least one controller is configured to damp, or direct damping of, sound in the at least the portion of the facility.
- the at least one controller is configured to damp, or direct damping of, sound in the facility at least in part by being configured to direct vibrating at least one window of the facility, and wherein the window is optionally disposed in the at least the portion of the facility.
- the at least one controller is configured to damp, or direct damping of, sound in the facility at least in part by being configured to direct imposing a passive and/or an active damping aid.
- the at least one controller is configured to use, or direct usage of the sound measurements at least in part by using measurements of at least one other sensor.
- the at least one sound sensor is disposed in a first enclosure of the facility, and wherein the at least one other sensor is disposed in the first enclosure.
- the at least one sound sensor is disposed in a first enclosure of the facility, and wherein the at least one other sensor is disposed in a second enclosure of the facility different from the first enclosure.
- the at least one sound sensor is disposed in a first enclosure of the facility, and wherein the at least one other sensor comprises a first sensor and a second sensor, and wherein the first sensor is disposed in the first enclosure, and wherein the second sensor is disposed in a second enclosure of the facility different from the first enclosure.
- the at least one other sensor is of the same type as the at least one sound sensor.
- the at least one other sensor is of a different type as the at least one sound sensor.
- the at least one controller is configured to use, or direct usage of, (i) measurements of the at least one sound sensor and (ii) measurements of the at least one other sensor synergistically and/or symbiotically.
- the at least one controller is configured to use, or direct usage of, (i) measurements of the at least one sound sensor and (ii) measurements of the at least one other sensor.
- the at least one other sensor is configured to measure an attribute comprising: temperature, electromagnetic radiation, pressure, gas, volatile organic compounds, particulate matter, or movement.
- the gas comprises carbon dioxide, carbon monoxide, nitrogen monoxide, nitrogen dioxide, radon, phosgene, organohalogens, halogen, formaldehyde, or water.
- the at least one other sensor is configured to measure color temperature.
- the at least one other sensor is configured to measure an attribute comprising: gas type, gas velocity, gas pressure, or gas concentration. In some embodiments, the at least one other sensor is configured to measure an attribute comprising: electromagnetic radiation wavelength, electromagnetic radiation wavelength phase, electromagnetic radiation frequency, or electromagnetic radiation amplitude. In some embodiments, the at least one other sensor is configured to measure an attribute comprising: visible, infrared, ultraviolet, or radio frequency. In some embodiments, the radio frequency comprises ultrawide bandwidth. In some embodiments, the at least one other sensor comprises an accelerometer. In some embodiments, the at least one sound sensor comprises a sensor disposed in a device ensemble.
- the device ensemble comprises (i) sensors, (ii) a sensor and an emitter, or (iii) a sensor and a transceiver, and/or (B) the device ensemble is disposed in, or attached to, a fixture of the facility.
- the at least one other sensor comprises a sensor disposed in a device ensemble.
- the device ensemble comprises (i) sensors, (ii) a sensor and an emitter, or (iii) a sensor and a transceiver, and/or (B) the device ensemble is disposed in, or attached to, a fixture of the facility.
- the at least one controller is configured to generate, or direct generation of, sound mapping of at least a portion of the facility. In some embodiments, the at least one controller is configured to damp, or direct damping of, sound in at least a portion of the facility on an intermittent basis, or on a continuous basis. In some embodiments, the intermittent basis is based at least in part on activity scheduling in the at least the portion of the facility, and/or on a detected activity in the at least the portion of the facility. In some embodiments, the at least one controller comprises circuitry, memory, and/or control logic. In some embodiments, the at least one controller comprises a hierarchical control system comprising at least three levels of hierarchy.
- non-transitory computer-readable program instructions for acoustic (e.g., sound) conditioning in a facility
- the non-transitory computer-readable program instructions, when read by one or more processors operatively coupled to the at least one sound sensor, cause the one or more processors to execute, or direct execution of, operations comprising any operation of the apparatus disclosed above.
- the one or more processors include: a processor disposed in a fixture of the facility, a processor disposed in an envelope of the facility, and/or a processor as part of a controller.
- the one or more processors include: a microprocessor, or a graphical processing unit.
- a method of acoustic (e.g., sound) conditioning in a facility, the method comprising any operation of the apparatus disclosed above.
- a system for acoustic (e.g., sound) conditioning in a facility
- the system comprises a network configured to operatively couple to the at least one sound sensor, the network further configured to transmit one or more signals associated with any operation of the apparatus disclosed above.
- the network is configured to transmit a control automation protocol. In some embodiments, the network is configured to transmit power and communication on a single cable. In some embodiments, the network is configured to transmit cellular communication abiding by at least a fourth generation and/or a fifth generation cellular communication protocol. In some embodiments, the network is configured to transmit control communication, cellular communication, media, and/or other data. In some embodiments, the network is operatively coupled to one or more devices comprising: a sensor, an emitter, a controller, a communication interface, a power supply, controlled entrances, lighting, memory, a ventilation system, a heating system, a cooling system, or a heating, ventilation, and air conditioning (HVAC) system.
- the network is configured to facilitate conditioning the environment of the at least the portion of the facility. In some embodiments, the network is configured (i) to allow entry of authorized users and/or (ii) block entry of unauthorized users. In some embodiments, the network is configured as a secure network.
- an apparatus for sound conditioning comprises a compartment housing an ensemble of devices comprising (A) the at least one sound sensor and (B) (i) a sensor of a different type, (ii) an emitter, or (iii) a transceiver, which device ensemble is configured to facilitate any operation of the apparatus disclosed above.
- the housing comprises at least one circuit board having at least one circuitry operatively coupled to the devices.
- the devices are configured to operatively couple to a power and/or communication network.
- the devices are configured for synergetic and/or symbiotic collaboration in controlling the facility.
- the devices comprise a communication interface, an accelerometer, a graphical processing unit, a heat sink, a microcontroller, and/or geolocation technology.
- the compartment comprises one or more holes configured to facilitate operations of at least a portion of the devices disposed in the compartment.
- the compartment comprises a body and a lid comprising the one or more holes.
- one or more devices may include windows.
- the one or more devices may include a controller configured to control functions of at least one of the windows.
- the one or more devices may include a device selected from the group consisting of an Internet of Things (IoT) device, a wireless device, a sensor, an antenna, a fifth generation communication protocol (5G) compatible device, an Ultra-Wide Band (UWB) device, a millimeter (mm) Wave device, a microphone, a speaker, and a microprocessor.
- the method may (e.g., further) include installing the one or more devices in, or on, a structural element of the enclosure (e.g., building).
- the network may facilitate communication to, from, and/or inter communication of the devices.
- forming the network may be performed during construction of the building. In some examples, forming the network may include coupling the circuits to windows of the building.
- the one or more devices may be selected from the group consisting of Internet of Things (IoT) devices, wireless devices, sensors, antennas, fifth generation communication protocol (5G) compatible devices, microphones, microprocessors, and speakers.
- the one or more devices may be in, or on, a structure of the building.
- the one or more devices may include an optically switchable window.
- the optically switchable window may include an electrochromic window.
- the optically switchable window may include a digital display technology.
- the present disclosure provides systems, apparatuses (e.g., controllers), and/or non-transitory computer-readable medium (e.g., software) that implement any of the methods disclosed herein.
- the network is a local network.
- the network comprises a single cable configured to transmit both power and communication.
- the communication can be one or more types of communication.
- the communication can comprise cellular communication abiding by at least a second generation (2G), third generation (3G), fourth generation (4G) or fifth generation (5G) cellular communication protocol.
- the communication comprises media communication facilitating stills, music, or moving picture streams (e.g., movies or videos).
- the communication comprises data communication (e.g., sensor data).
- the communication comprises control communication, e.g., to control the one or more nodes operatively coupled to the networks.
- the network comprises a first (e.g., cabling) network installed in the facility. In some embodiments, the network comprises a (e.g., cabling) network installed in an envelope of the facility (e.g., in an envelope of a building included in the facility).
- the present disclosure provides systems, apparatuses (e.g., controllers), and/or non-transitory computer-readable medium or media (e.g., software) that implement any of the methods disclosed herein.
- the present disclosure provides methods that use any of the systems, computer readable media, and/or apparatuses disclosed herein, e.g., for their intended purpose.
- an apparatus comprises at least one controller that is programmed to direct a mechanism used to implement (e.g., effectuate) any of the methods disclosed herein, which at least one controller is configured to operatively couple to the mechanism.
- at least two operations (e.g., of the method) are directed/executed by the same controller.
- at least two operations are directed/executed by different controllers.
- an apparatus comprises at least one controller that is configured (e.g., programmed) to implement (e.g., effectuate) any of the methods disclosed herein.
- the at least one controller may implement any of the methods disclosed herein.
- at least two operations (e.g., of the method) are directed/executed by the same controller.
- at least two operations are directed/executed by different controllers.
- one controller of the at least one controller is configured to perform two or more operations. In some embodiments, two different controllers of the at least one controller are configured to each perform a different operation.
- a system comprises at least one controller that is programmed to direct operation of at least one another apparatus (or component thereof), and the apparatus (or component thereof), wherein the at least one controller is operatively coupled to the apparatus (or to the component thereof).
- the apparatus (or component thereof) may include any apparatus (or component thereof) disclosed herein.
- the at least one controller may be configured to direct any apparatus (or component thereof) disclosed herein.
- the at least one controller may be configured to operatively couple to any apparatus (or component thereof) disclosed herein.
- at least two operations (e.g., of the apparatus) are directed by the same controller.
- at least two operations are directed by different controllers.
- a computer software product (e.g., inscribed on one or more non-transitory media) comprises program instructions that, when read by at least one processor (e.g., computer), cause the at least one processor to direct a mechanism disclosed herein to implement (e.g., effectuate) any of the methods disclosed herein, wherein the at least one processor is configured to operatively couple to the mechanism.
- the mechanism can comprise any apparatus (or any component thereof) disclosed herein.
- at least two operations (e.g., of the apparatus) are directed/executed by the same processor.
- at least two operations are directed/executed by different processors.
- the present disclosure provides non-transitory computer-readable program instructions (e.g., included in a program product comprising one or more non-transitory media) comprising machine-executable code that, upon execution by one or more processors, implements any of the methods disclosed herein.
- at least two operations are directed/executed by the same processor. In some embodiments, at least two operations are directed/executed by different processors.
- the present disclosure provides a non-transitory computer-readable medium or media comprising machine-executable code that, upon execution by one or more processors, effectuates directions of the controller(s) (e.g., as disclosed herein). In some embodiments, at least two operations (e.g., of the controller) are directed/executed by the same processor. In some embodiments, at least two operations are directed/executed by different processors.
- the present disclosure provides a computer system comprising one or more computer processors and a non-transitory computer-readable medium or media coupled thereto.
- the non-transitory computer-readable medium comprises machine-executable code that, upon execution by the one or more processors, implements any of the methods disclosed herein and/or effectuates directions of the controller(s) disclosed herein.
- the present disclosure provides non-transitory computer-readable program instructions that, when read by one or more processors, cause the one or more processors to execute any operation of the methods disclosed herein, any operation performed (or configured to be performed) by the apparatuses disclosed herein, and/or any operation directed (or configured to be directed) by the apparatuses disclosed herein.
- the program instructions are inscribed in a non-transitory computer-readable medium or media.
- at least two of the operations are executed by one of the one or more processors.
- at least two of the operations are each executed by different processors of the one or more processors.
- the present disclosure provides networks that are configured for transmission of any communication (e.g., signal) and/or (e.g., electrical) power facilitating any of the operations disclosed herein.
- the communication may comprise control communication, cellular communication, media communication, and/or data communication.
- the data communication may comprise sensor data communication and/or processed data communication.
- the networks may be configured to abide by one or more protocols facilitating such communication.
- a communications protocol used by the network can be a building automation and control networks protocol (BACnet).
- a communication protocol may facilitate cellular communication abiding by at least a 2nd, 3rd, 4th, or 5th generation cellular communication protocol.
- Fig. 1 shows various network linking topologies coupling a network in an enclosure
- FIG. 2 schematically shows a control system architecture
- FIG. 3 schematically shows a control system architecture
- Fig. 4 schematically shows a block diagram showing various devices (e.g., a digital architectural element), and their connectivity to a network.
- FIG. 5 shows a schematic architectural diagram depicting an enclosure layout
- FIG. 6 shows a schematic architectural diagram depicting an enclosure layout
- Figs. 7A-7B schematically show sensors and sound emitters with propagating sounds
- Figs. 8A-8B show plots of time dependent frequency sweeps
- Figs. 9A-9B show plots of time dependent frequency sweeps
- Figs. 10A-10B show plots of sound levels depending on frequency
- FIG. 11 is a flowchart showing a testing operation relating to acoustic mapping
- Fig. 12 is a flowchart showing fault detection operations
- Figs. 13A, 13B, and 13C show fault detection matrices
- Fig. 14 is a flowchart relating to sound event detection
- Fig. 15 shows a schematic block diagram of an enclosure with sound related components
- FIGs. 16A-16B schematically show block diagrams of control systems
- Figs. 17, 18, and 19 list digital architectural element features
- Fig. 20 depicts a digital architectural element having various functionalities
- Fig. 21 illustrates a control related flow chart
- Fig. 22 illustrates an example of a suite of functional modules
- Fig. 23 illustrates an example physical representation of a digital architectural element and its placement in a framing
- Fig. 24 shows an example of a portion of a data and power distribution system having a digital architectural element (DAE);
- Fig. 25 illustrates a DAE that can support a plurality of communication types
- Fig. 26 illustrates a system of components that may be incorporated in or associated with a DAE
- FIG. 27 schematically depicts a processing system
- Fig. 28 schematically shows an electrochromic device
- Fig. 29 schematically shows a cross section of an Integrated Glass Unit (IGU);
- Fig. 30 shows various components of a device ensemble; and
- Fig. 31 shows a graph of sound measurements as a function of time.
- the figures and components therein may not be drawn to scale.
- the conjunction “and/or” in a phrase such as “including X, Y, and/or Z”, refers to the inclusion of any combination or plurality of X, Y, and Z.
- such phrase is meant to include X.
- such phrase is meant to include Y.
- such phrase is meant to include Z.
- such phrase is meant to include X and Y.
- such phrase is meant to include X and Z.
- such phrase is meant to include Y and Z.
- such phrase is meant to include a plurality of Xs.
- such phrase is meant to include a plurality of Ys.
- such phrase is meant to include a plurality of Zs.
- such phrase is meant to include a plurality of Xs and a plurality of Ys.
- such phrase is meant to include a plurality of Xs and a plurality of Zs.
- such phrase is meant to include a plurality of Ys and a plurality of Zs.
- such phrase is meant to include a plurality of Xs and Y.
- such phrase is meant to include a plurality of Xs and Z.
- such phrase is meant to include a plurality of Ys and Z.
- such phrase is meant to include X and a plurality of Ys.
- such phrase is meant to include X and a plurality of Zs.
- such phrase is meant to include Y and a plurality of Zs.
- the conjunction “and/or” is meant to have the same effect as the phrase “X, Y, Z, or any combination or plurality thereof.”
- the conjunction “and/or” is meant to have the same effect as the phrase “one or more X, Y, Z, or any combination thereof.”
- the term “operatively coupled” or “operatively connected” refers to a first element (e.g., mechanism) that is coupled (e.g., connected) to a second element, to allow the intended operation of the second and/or first element.
- the coupling may comprise physical or nonphysical coupling (e.g., communicative coupling).
- the non-physical coupling may comprise signal-induced coupling (e.g., wireless coupling). Coupled can include physical coupling (e.g., physically connected), or non-physical coupling (e.g., via wireless communication). Operatively coupled may comprise communicatively coupled.
- An element that is “configured to” perform a function includes a structural feature that causes the element to perform this function.
- a structural feature may include an electrical feature, such as a circuitry or a circuit element.
- a structural feature may include an actuator.
- a structural feature may include a circuitry (e.g., comprising electrical or optical circuitry).
- Electrical circuitry may comprise one or more wires.
- Optical circuitry may comprise at least one optical element (e.g., beam splitter, mirror, lens and/or optical fiber).
- a structural feature may include a mechanical feature.
- a mechanical feature may comprise a latch, a spring, a closure, a hinge, a chassis, a support, a fastener, or a cantilever, and so forth.
- Performing the function may comprise utilizing a logical feature.
- a logical feature may include programming instructions. Programming instructions may be executable by at least one processor. Programming instructions may be stored or encoded on a medium accessible by one or more processors. Additionally, in the following description, the phrases “operable to,” “adapted to,” “configured to,” “designed to,” “programmed to,” or “capable of” may be used interchangeably where appropriate.
- an enclosure comprises an area defined by at least one structure.
- the at least one structure may comprise at least one wall.
- An enclosure may comprise and/or enclose one or more sub-enclosures.
- the at least one wall may comprise metal (e.g., steel), clay, stone, plastic, glass, plaster (e.g., gypsum), polymer (e.g., polyurethane, styrene, or vinyl), asbestos, fiber-glass, concrete (e.g., reinforced concrete), wood, paper, or a ceramic.
- the at least one wall may comprise wire, bricks, blocks (e.g., cinder blocks), tile, drywall, or frame (e.g., steel frame).
- the enclosure comprises one or more openings.
- the one or more openings may be reversibly closable.
- the one or more openings may be permanently open.
- a fundamental length scale of the one or more openings may be smaller relative to the fundamental length scale of the wall(s) that define the enclosure.
- a fundamental length scale may comprise a diameter of a bounding circle, a length, a width, or a height.
- a surface of the one or more openings may be smaller relative to the surface of the wall(s) that define the enclosure.
- the opening surface may be a percentage of the total surface of the wall(s). For example, the opening surface can measure about 30%, 20%, 10%, 5%, or 1% of the wall(s) surface.
- the wall(s) may comprise a floor, a ceiling, or a side wall.
- the closable opening may be closed by at least one window or door.
- the enclosure may be at least a portion of a facility.
- the facility may comprise a building.
- the enclosure may comprise at least a portion of a building.
- the building may be a private building and/or a commercial building.
- the building may comprise one or more floors.
- the building may include at least one of: a room, hall, foyer, attic, basement, balcony (e.g., inner or outer balcony), stairwell, corridor, elevator shaft, façade, mezzanine, penthouse, garage, porch (e.g., enclosed porch), terrace (e.g., enclosed terrace), cafeteria, and/or duct.
- an enclosure may be stationary and/or movable (e.g., a train, an airplane, a ship, a vehicle, or a rocket).
- the enclosure may comprise a building such as a multi-story building.
- the multi-story building may have at least about 2, 8, 10, 25, 50, 80, 100, 120, 140, or 160 floors that are controlled by the control system.
- the number of floors controlled by the control system may be any number between the aforementioned numbers (e.g., from 2 to 50, from 25 to 100, or from 80 to 160).
- the floor may be of an area of at least about 150 m², 250 m², 500 m², 1000 m², 1500 m², or 2000 square meters (m²).
- the floor may have an area between any of the aforementioned floor area values (e.g., from about 150 m² to about 2000 m², from about 150 m² to about 500 m², from about 250 m² to about 1000 m², or from about 1000 m² to about 2000 m²).
- Certain disclosed embodiments provide a network infrastructure in the enclosure (e.g., a facility such as a building).
- the network infrastructure is available for various purposes such as for providing communication and/or power services.
- the communication services may comprise high bandwidth (e.g., wireless and/or wired) communications services.
- the communication services can be to occupants of a facility and/or users outside the facility (e.g., building).
- the network infrastructure may work in concert with, or as a partial replacement of, the infrastructure of one or more cellular carriers.
- the network infrastructure can be provided in a facility that includes electrically switchable windows. Examples of components of the network infrastructure include a high speed backhaul.
- the network infrastructure may include at least one cable, switch, physical antenna, transceivers, sensor, transmitter, receiver, radio, processor and/or controller (that may comprise a processor).
- the network infrastructure may be operatively coupled to, and/or include a wireless network.
- the network infrastructure may comprise wiring.
- One or more sensors can be deployed (e.g., installed) in an environment as part of installing the network and/or after installing the network.
- the network may be a local network.
- the network may comprise a cable configured to transmit power and communication in a single cable.
- the communication can be one or more types of communication.
- the communication can comprise cellular communication abiding by at least a second generation (2G), third generation (3G), fourth generation (4G) or fifth generation (5G) cellular communication protocol.
- the communication may comprise media communication facilitating stills, music, or moving picture streams (e.g., movies or videos).
- the communication may comprise data communication (e.g., sensor data).
- the communication may comprise control communication, e.g., to control the one or more nodes operatively coupled to the networks.
- the network may comprise a first (e.g., cabling) network installed in the facility.
- the network may comprise a (e.g., cabling) network installed in an envelope of the facility (e.g., in an envelope of an enclosure of the facility, such as in an envelope of a building included in the facility).
- the present disclosure provides networks that are configured for transmission of any communication (e.g., signal) and/or (e.g., electrical) power facilitating any of the operations disclosed herein.
- the communication may comprise control communication, cellular communication, media communication, and/or data communication.
- the data communication may comprise sensor data communication and/or processed data communication.
- the networks may be configured to abide by one or more protocols facilitating such communication.
- a communications protocol used by the network (e.g., with a BMS) may comprise the BACnet (building automation and control networks) protocol.
- the network may be configured for (e.g., include hardware facilitating) communication protocols comprising BACnet (e.g., BACnet/SC), LonWorks, Modbus, KNX, European Home Systems Protocol (EHS), BatiBUS, European Installation Bus (EIB or Instabus), zigbee, Z-wave, Insteon, X10, Bluetooth, or WiFi.
- the network may be configured to transmit the control-related protocol.
- a communication protocol may facilitate cellular communication abiding by at least a 2nd, 3rd, 4th, or 5th generation cellular communication protocol.
- the (e.g., cabling) network may comprise tree, line, or star topologies.
- the network may comprise interworking and/or distributed application models for various tasks of the building automation.
- the control system may provide schemes for configuration and/or management of resources on the network.
- the network may permit binding of parts of a distributed application in different nodes operatively coupled to the network.
- the network may provide a communication system with a message protocol and models for the communication stack in each node capable of hosting distributed applications (e.g., having a common kernel).
- the control system may comprise programmable logic controller(s) (PLC(s)).
- a network infrastructure supports a control system for one or more windows such as electrochromic (e.g., tintable) windows.
- the control system may comprise one or more controllers operatively coupled (e.g., directly or indirectly) to one or more windows. While the disclosed embodiments describe electrochromic windows as one type of window (referred to herein as “optically switchable windows,” “tintable windows”, or “smart windows”), the concepts disclosed herein may apply to other types of switchable optical devices comprising a liquid crystal device, an electrochromic device, a suspended particle device (SPD), a NanoChromics display (NCD), or an organic electroluminescent display (OELD).
- the display element may be attached to a part of a transparent body (such as the windows).
- the tintable window may be disposed in a (non-transitory) facility such as a building, and/or in a transitory vehicle such as a car, RV, bus, train, airplane, helicopter, ship, or boat.
- a building management system is a computer-based control system installed in a building that controls (e.g., monitors) the building's mechanical and electrical equipment such as one or more ventilation, lighting, power system, elevator, fire system, and/or security system. Controllers (e.g., nodes and/or processors) described herein may be suited for integration with a BMS.
- a BMS may consist of hardware, including interconnections by communication channels to processor(s) (e.g., computer(s)) and/or associated software for maintaining conditions in the building, e.g., according to preferences set by at least one user.
- the user can be an occupant, an owner, a lessor, and/or a building manager.
- a BMS may be implemented using a local area network, such as Ethernet.
- the software can be based at least in part on, for example, internet protocols and/or open standards.
- One example is software from Tridium, Inc. (of Richmond, Va.).
- One communication protocol commonly used with a BMS is BACnet (building automation and control networks).
- a BMS is disposed in an enclosure such as a facility.
- the facility can comprise a building such as a multistory building.
- the BMS may function at least to control the environment in the facility (e.g., in the building).
- the control system and/or BMS may control at least one environmental characteristic of the enclosure.
- the at least one environmental characteristic may comprise temperature, humidity, fine spray (e.g., aerosol), sound, electromagnetic waves (e.g., light glare, color), gas makeup, gas concentration, gas speed, vibration, volatile organic compounds (VOCs), debris (e.g., dust), or biological matter (e.g., gas borne bacteria and/or viruses).
- the gas(es) may comprise oxygen, nitrogen, carbon dioxide, carbon monoxide, hydrogen sulfide, nitrogen dioxide, inert gas, noble gas (e.g., radon), chlorophore, ozone, formaldehyde, methane, or ethane.
- a BMS may control temperature, carbon dioxide levels, and/or humidity within an enclosure.
- Mechanical devices that can be controlled by a BMS and/or control system may comprise lighting, a heater, air conditioner, blower, or vent.
- a BMS and/or control system may adjust (e.g., turn on and off) one or more of the devices it controls, e.g., under defined conditions.
- a (e.g., core) function of a modern BMS and/or control system may be to maintain a comfortable environment for the occupants of the enclosure, e.g., while minimizing energy consumption (e.g., while minimizing heating and cooling costs/demand).
- a modern BMS and/or control system can be used to control (e.g., monitor), and/or to optimize the synergy between various systems, for example, to conserve energy and/or lower enclosure (e.g., facility) operation costs.
- the control system is operatively (e.g., communicatively) coupled to an ensemble of devices (e.g., sensors and/or emitters).
- the ensemble facilitates the control of the environment and/or the alert.
- the control may utilize a control scheme such as feedback control, or any other control scheme delineated herein.
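The feedback control scheme referenced above can be illustrated with a minimal sketch. The proportional gain, setpoint, and trivial plant response below are assumptions for illustration only, not the disclosed control scheme: a sensed value (e.g., temperature) is repeatedly nudged toward a target by an actuation proportional to the error.

```python
# Minimal proportional feedback loop (illustrative; gain, setpoint,
# and plant model are assumed for this sketch).
def feedback_step(measured: float, setpoint: float, gain: float = 0.5) -> float:
    """Return an actuation (e.g., heating/cooling) from the sensed error."""
    return gain * (setpoint - measured)

def simulate(initial: float, setpoint: float, steps: int = 20) -> float:
    """Apply the feedback step repeatedly; the plant simply adds the actuation."""
    value = initial
    for _ in range(steps):
        value += feedback_step(value, setpoint)
    return value

final = simulate(initial=18.0, setpoint=22.0)  # converges toward 22.0
```

In a real control system the "plant" would be the enclosure environment and the actuation would drive a device such as a heater, vent, or tintable window.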
- the ensemble may comprise at least one sensor configured to sense electromagnetic radiation.
- the electromagnetic radiation may be (humanly) visible, infrared (IR), or ultraviolet (UV) radiation.
- the at least one sensor may comprise an array of sensors.
- the ensemble may comprise an IR sensor array (e.g., a far infrared thermal array such as the one by Melexis).
- the IR sensor array may have a resolution of at least 32x24 pixels.
- the IR sensor may be coupled to a digital interface.
- the ensemble may comprise an IR camera.
- the ensemble may comprise a sound detector.
- the ensemble may comprise a microphone.
- the ensemble may comprise any sensor and/or emitter disclosed herein.
- the ensemble may include CO 2 , VOC, temperature, humidity, electromagnetic light, pressure, and/or noise sensors.
- the sensor may comprise a gesture sensor (e.g., RGB gesture sensor), an accelerometer, or a sound sensor.
- the sound sensor may comprise an audio decibel level detector.
- the sensor may comprise a meter driver.
- the ensemble may include a microphone and/or a processor.
- the ensemble may comprise a camera (e.g., a 4K pixel camera), an ultra-wideband (UWB) sensor and/or emitter, a Bluetooth Low Energy (BLE) sensor and/or emitter, and/or a processor.
- the camera may have any camera resolution disclosed herein.
- the sensor ensemble may be utilized to determine presence of occupants in an enclosure, their number and/or identity (e.g., using the camera).
- the sensor ensemble may be utilized to control (e.g., monitor and/or adjust) one or more environmental characteristics in the enclosure environment (e.g., as disclosed herein).
- the sound sensor may comprise a microphone.
- the sound sensor may comprise an acoustic noise sensor.
- the sound sensor may comprise a PUI Audio TOM 1545-P-R sensor.
- the sound sensor may be omnidirectional.
- the sound sensor may have a sensitivity of at most about -34 dB, -38 dB, -40 dB, -42 dB, -46 dB, or -48 dB.
- the sound sensor may require a power supply of at most about 1.0 volts (V), 1.5 V, or 2.0 V.
- the sound sensor may have a FLS of at most about 10 millimeters (mm), 9mm, 6mm, or 4mm.
- the sound sensor may have an impedance of at most about 0.1 kilo-ohms (kOhm), 0.5 kOhm, 1.0 kOhm, 1.5 kOhm, 2.0 kOhm, 2.2 kOhm, 2.5 kOhm, or 3.0 kOhm.
- a plurality of devices may be operatively (e.g., communicatively) coupled to the control system.
- the plurality of devices may be disposed in a facility (e.g., including a building and/or room).
- the control system may comprise the hierarchy of controllers.
- the devices may comprise an emitter, a sensor, or a window (e.g., IGU).
- the device may be any device as disclosed herein. At least two of the plurality of devices may be of the same type. For example, two or more IGUs may be coupled to the control system. At least two of the plurality of devices may be of different types. For example, a sensor and an emitter may be coupled to the control system.
- the plurality of devices may comprise at least 20, 50, 100, 500, 1000, 2500, 5000, 7500, 10000, 50000, 100000, or 500000 devices.
- the plurality of devices may be of any number between the aforementioned numbers (e.g., from 20 devices to 500000 devices, from 20 devices to 50 devices, from 50 devices to 500 devices, from 500 devices to 2500 devices, from 1000 devices to 5000 devices, from 5000 devices to 10000 devices, from 10000 devices to 100000 devices, or from 100000 devices to 500000 devices).
- the number of windows in a floor may be at least 5, 10, 15, 20, 25, 30, 40, or 50.
- the number of windows in a floor can be any number between the aforementioned numbers (e.g., from 5 to 50, from 5 to 25, or from 25 to 50).
- the devices may be in a multi-story building. At least a portion of the floors of the multi-story building may have devices controlled by the control system (e.g., at least a portion of the floors of the multi-story building may be controlled by the control system).
- the building may comprise an area of at least about 1000 square feet (sqft), 2000 sqft, 5000 sqft, 10000 sqft, 100000 sqft, 150000 sqft, 200000 sqft, or 500000 sqft.
- the building may comprise an area between any of the above mentioned areas (e.g., from about 1000 sqft to about 5000 sqft, from about 5000 sqft to about 500000 sqft, or from about 1000 sqft to about 500000 sqft).
- the building may comprise an area of at least about 100 m², 200 m², 500 m², 1000 m², 5000 m², 10000 m², 25000 m², or 50000 m².
- the building may comprise an area between any of the above mentioned areas (e.g., from about 100 m² to about 1000 m², from about 500 m² to about 25000 m², or from about 100 m² to about 50000 m²).
- the facility may comprise a commercial or a residential building.
- the commercial building may include tenant(s) and/or owner(s).
- the residential facility may comprise a multi or a single family building.
- the residential facility may comprise an apartment complex.
- the residential facility may comprise a single family home.
- the residential facility may comprise multifamily homes (e.g., apartments).
- the residential facility may comprise townhouses.
- the facility may comprise residential and commercial portions.
- the facility may comprise at least about 1, 2, 5, 10, 50, 100, 150, 200, 250, 300, 350, 400, 420, 450, 500, or 550 windows (e.g., tintable windows).
- the windows may be divided into zones, e.g., based at least in part on the location, façade, floor, ownership, and/or utilization of the enclosure (e.g., room) in which they are disposed, any other assignment metric, random assignment, or any combination thereof. Allocation of windows to the zone may be static or dynamic (e.g., based on a heuristic). There may be at least about 2, 5, 10, 12, 15, 30, 40, or 46 windows per zone.
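The static zone allocation described above can be sketched as follows; the window records, attribute names, and grouping key are hypothetical, shown only to illustrate grouping windows by location attributes such as floor and façade.

```python
from collections import defaultdict

# Hypothetical window records: (window_id, floor, facade).
windows = [
    ("w1", 1, "north"), ("w2", 1, "north"),
    ("w3", 1, "south"), ("w4", 2, "north"),
]

# Static zone allocation keyed on (floor, facade); a dynamic scheme
# could instead reassign windows to zones from a heuristic at runtime.
zones = defaultdict(list)
for win_id, floor, facade in windows:
    zones[(floor, facade)].append(win_id)
```

A dynamic allocation would replace the fixed key with a function re-evaluated as conditions (e.g., occupancy or ownership) change.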
- the window systems and associated components disclosed in these embodiments can facilitate high bandwidth (e.g., gigabit) communication and associated data processing. These communications and data processing may employ optically switchable window systems components and facilitate various window and non-window functions as described herein and in International Patent Application Serial No.
- optically switchable window system components include components of a communications network and power distribution system for powering window transitions as described in U.S. Patent Application Serial No. 15/365,685, filed November 30, 2016.
- the network comprises a communication network.
- Example components for enhancing functionality of a communications network that serves optically switchable windows may include: (1) a control panel with a high bandwidth switching and/or routing capability (e.g., one gigabit or faster Ethernet switch); (2) a backbone that includes control panels and high bandwidth links (e.g., 10 gigabit or faster Ethernet capability) between the control panels; (3) a digital element (e.g., device ensemble) including sensors, display drivers, and/or logic for various functions that employ high data rate processing.
- the digital element can be configured as a digital wall interface or a digital architectural element such as a digital mullion insert; (4) an enhanced functionality window controller that includes an access point for wireless communication, e.g., a Wi-Fi access point; and (5) high bandwidth data communication links between the control panels and digital elements and/or enhanced functionality window controllers, the data communication links configured, for example, as trunk lines or to follow paths that at least partially overlap with the paths of trunk lines.
- Fig. 1 shows a (e.g., simplified) top level view of a system 100 that includes a building 101 that includes a number of (e.g., EC) windows.
- a subset of the (e.g., EC) windows is connected by way of (e.g., EC window) power and communications lines to a "Control Panel" (CP) 103a.
- the building's windows are grouped in three subsets, each connected to a respective CP of 103a-c, but it will be appreciated that fewer or more than three CPs may be contemplated for any given building.
- the three CPs 103a-c are communicatively coupled by a (e.g., high bandwidth such as 10 Gigabits per second (Gbps)) communication backbone, and to an external network 105.
- the network links provide data transmission to other elements (e.g., devices) such as digital wall interfaces, enhanced functionality window controllers, digital architectural elements, and the like.
- a hierarchical network may be used wherein a distributed network includes at least two of a master controller, an intermediate controller (that can be floor controllers and/or network controllers), and a local controller (e.g., end or leaf controllers such as window controllers).
- a master controller may or may not be in physical proximity to a BMS.
- a master controller may be operatively coupled to a BMS.
- At least one floor (e.g., each floor) of a building may have one or more intermediate controllers.
- At least one device (e.g., window) may have its own local controller.
- a local controller may control at least 1, 2, 3, 4, 5, 6, 7, 8, 9, or 10 devices.
- the control system may or may not have intermediate controller(s).
- the control system may have 1, 2, 3, or more hierarchical control levels.
- a local controller may control a plurality of devices.
- the devices may comprise a (e.g., smart) window, a sensor, an emitter, an antenna, a receiver, or a transceiver, for example.
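The master/intermediate/local hierarchy described above can be illustrated as command fan-out. This is a minimal sketch under assumptions: the class names, the `apply` method, and the string command are invented for illustration and are not the disclosed implementation.

```python
class LocalController:
    """Leaf (e.g., window) controller driving one or more devices."""
    def __init__(self, device_ids):
        self.device_ids = device_ids
        self.applied = {}  # device id -> last command received

    def apply(self, command):
        for dev in self.device_ids:
            self.applied[dev] = command  # e.g., a tint level

class IntermediateController:
    """Floor/network controller forwarding commands to local controllers."""
    def __init__(self, local_controllers):
        self.local_controllers = local_controllers

    def apply(self, command):
        for lc in self.local_controllers:
            lc.apply(command)

class MasterController:
    """Top of the hierarchy; fans a command out to all intermediates."""
    def __init__(self, intermediates):
        self.intermediates = intermediates

    def apply(self, command):
        for ic in self.intermediates:
            ic.apply(command)

# A master commanding one floor with two local controllers.
lc1 = LocalController(["igu-1", "igu-2"])
lc2 = LocalController(["igu-3"])
master = MasterController([IntermediateController([lc1, lc2])])
master.apply("tint-3")
```

A control system with fewer hierarchy levels would simply omit the intermediate layer, with the master coupled directly to local controllers.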
- Fig. 2 shows an example of a control system architecture 200 comprising a master controller 208 that controls intermediate (e.g., floor) controllers 206, that in turn control local controllers 204.
- a local controller controls one or more integrated glass units (IGUs), one or more sensors, one or more output devices (e.g., one or more emitters), one or more antennas, or any combination thereof.
- Fig. 2 shows an example of a configuration in which the master controller is operatively coupled (e.g., wirelessly and/or wired) to a building management system (BMS) 224 and to a database 220. Arrows in Fig. 2 represent communication pathways.
- a controller may be operatively coupled (e.g., directly/indirectly and/or wired and wirelessly) to an external source 210.
- the external source may comprise a network.
- the external source may comprise one or more sensor or output devices.
- the external source may comprise a cloud-based application and/or database.
- the communication may be wired and/or wireless.
- the external source may be disposed external to the facility.
- the external source may comprise one or more sensors and/or antennas disposed, e.g., on a wall or on a ceiling of the facility.
- the communication may be mono-directional or bidirectional. In the example shown in Fig. 2, all communication arrows are meant to be bidirectional.
- Fig. 3 illustrates a block diagram of a control panel 303 interfacing with a plurality of EC windows 312.
- the control panel 303 includes a master control and power module 304 and two network controllers (NCs) 310. It will be appreciated that the control panel 303 may include fewer or more NCs 310 than illustrated.
- Each NC 310 is respectively coupled with two or more window controllers (WCs) 311, each window controller 311 being associated with a respective EC window 312.
- the control system in Fig. 3 is an example of the more general control system illustrated in Fig. 2.
- a controller network may provide data transmission for standard window controllers (WC2's) dedicated to controlling optically switchable windows.
- the controller network may provide data transmission supporting enhanced functionality window controllers (WC3's) that may have a Wi-Fi access point, cellular capability, etc.
- enhanced functionality window controllers connect to a controller network bus to send and receive data relating to controlling optically switchable windows assigned to the window controllers.
- the enhanced functionality window controllers may connect to a high bandwidth line such as a gigabit Ethernet line to send and receive data relating to non-window functions such as Wi-Fi and/or cellular communications.
- the enclosure includes at least one digital architectural element (e.g., device ensemble) disposed in each of a plurality of separate areas (e.g., rooms).
- a digital architectural element may contain a sensor, an emitter, processor (e.g., a microcontroller and/or a non-volatile memory), network interface, and/or peripheral interface.
- DAE can refer to any device, device ensemble, or interface configured to be mounted to and/or retained in, or on, any structural component (e.g., framework, beam, joist, wall, ceiling, floor, window, fascia, transom, and/or casement) of an enclosure.
- a DAE may include, for example, a window-mullion interface, a digital wall interface, and/or a ceiling-mounted interface.
- DAE sensor examples include a light sensor.
- the DAE may include an image capture sensor such as a camera, an audio sensor such as a voice coil and/or microphone, an air quality sensor, and a proximity sensor (e.g., certain IR and/or RF sensors).
- the network interface may be a high bandwidth interface such as a gigabit (or faster) Ethernet interface.
- DAE peripherals include video display monitors, add-on speakers, mobile devices, battery chargers, and the like.
- peripheral interfaces include standard Bluetooth modules, ports such as USB ports and network ports, etc. Ports may include any of various proprietary ports for third party devices.
- the DAE operates in conjunction with other hardware and/or software provided for an optically switchable window system, e.g., to a media display construct coupled to window, and/or to a display projected on the window.
- the DAE includes a controller (e.g., any controller disclosed herein). Examples of display constructs, windows, control system, network, and related touch screen, can be found in US Provisional Patent Application Serial No. 62/975,706, filed on February 12, 2020, titled “TANDEM VISION WINDOW AND MEDIA DISPLAY,” that is incorporated herein by reference in its entirety.
- a DAE includes one or more signal generating devices such as a speaker, a light source (e.g., an LED), a beacon, an antenna (e.g., a Wi-Fi or cellular communications antenna), and the like.
- the signal generating device can be an emitter.
- a DAE includes an energy storage component and/or a power harvesting component.
- a DAE may contain one or more batteries and/or capacitors, e.g., as energy storage devices. The DAE may include a photovoltaic cell, e.g., as a power harvesting component.
- a DAE has one or more user interface components (e.g., a microphone or a speaker), one or more sensors (e.g., a proximity sensor), and a network interface (e.g., for high bandwidth communications).
- a DAE is designed, or configured to, attach to (or otherwise be collocated with) a structural element of an enclosure (e.g., a building).
- a DAE has an appearance that blends in with the structural element with which it is associated.
- a DAE may have a shape, size, and/or color that blends with the associated structural element.
- a DAE may not be easily visible to occupants of a building; e.g., the element is fully or partially camouflaged in the surrounding in which it is disposed.
- such element may interface with other component(s) that do not blend in, such as one or more video display monitors, touch screens, projectors, and the like.
- the building structural elements to which DAE may be attached include any of various building structures.
- building structures to which DAEs attach are installed and/or constructed during building construction, in some cases early in building construction when the building skeleton or envelope is constructed.
- the building structural elements for DAEs are elements that serve a building structural function. Such elements may be permanent, e.g., not easily removable from a building. Examples include columns, piers (e.g., elevator, communication, or electrical piers), walls, partitions (e.g., office space partitions), doors, beams, stairs, façades, moldings, mullions, and/or transoms.
- the structural elements are located on a perimeter of the enclosure.
- the DAE is provided as a separate modular unit or as a housing (e.g., a box) that attaches to the building structural element.
- a DAE is provided in a façade for a building structural element.
- a DAE may be provided as a cover for a portion of a mullion, transom, or door.
- a DAE is configured as a mullion or disposed in or on a mullion. If it is attached to a frame portion (e.g., mullion), the DAE may be bolted on, snapped to, or otherwise attached to the rigid parts of the mullion.
- a DAE can snap onto a structural element of the enclosure.
- a DAE serves as a molding, e.g., a crown molding.
- a DAE is modular; e.g., it serves as a module for part of a larger system such as a communications network, a power distribution network, and/or computational system.
- the computation system can employ an external video display and/or other user interface component(s).
- the DAE is a digital frame portion (e.g., mullion portion) designed to be deployed on one or more frame portions (e.g., mullions) in an enclosure.
- digital frame portions are deployed in a regular or periodic fashion. For example, digital frame portions may be deployed on every (e.g., second, fourth, sixth, or tenth) successive frame.
- the DAE has a network connection.
- the DAE houses one or more devices (e.g., digital and/or analog components).
- in addition to the (e.g., high bandwidth) network connection (port, switch, and/or router) and housing, the DAE may include one or more of the following digital and/or analog components.
- the devices may include: a camera, a proximity or movement sensor, an occupancy sensor, a color temperature sensor, an infrared sensor, an ultraviolet sensor, a visible light sensor, a biometric sensor, a speaker, a microphone, an air quality sensor, a hub for power and/or data connectivity, display video driver, a Wi-Fi access point, an antenna, a location service (e.g., Bluetooth, Global Positioning System, or ultra-wide band) via beacons or other mechanism, a power source, a light source, a processor, a memory, and/or a circuitry (e.g., ancillary processing device).
- One or more cameras may include a sensor and/or processing logic for imaging features in the visible, IR, or other wavelength region; various resolutions of the camera are possible including high definition (HD) and greater.
- the DAE may include one or more of the devices disclosed herein.
- the camera and/or display construct may have at its fundamental length scale 2000, 3000, 4000, 5000, 6000, 7000, or 8000 pixels.
- the camera and/or display construct may have at its fundamental length scale any number of pixels between the aforementioned number of pixels (e.g., from about 2000 pixels to about 4000 pixels, from about 4000 pixels to about 8000 pixels, or from about 2000 pixels to about 8000 pixels).
- a fundamental length scale may comprise a diameter of a bounding circle, a length, a width, or a height.
- the fundamental length scale may be abbreviated herein as “FLS.”
- the camera and/or display construct may comprise a high resolution display.
- the camera and/or display construct may have a resolution of at least about 550, 576, 680, 720, 768, 1024, 1080, 1920, 1280, 2160, 3840, 4096, 4320, or 7680 pixels, by at least about 550, 576, 680, 720, 768, 1024, 1080, 1280, 1920, 2160, 3840, 4096, 4320, or 7680 pixels (at 30Hz or at 60Hz).
- the first number of pixels may designate the height of the display and the second number of pixels may designate the length of the display.
- the camera and/or display construct may have a resolution of 1920 x 1080, 3840 x 2160, 4096 x 2160, or 7680 x 4320.
- the camera and/or display construct may be a standard definition, enhanced definition, high definition, or ultra-high definition display.
- One or more proximity or movement sensors may include an infrared sensor (abbreviated herein as an “IR” sensor).
- a proximity sensor is a radar or radar-like device that detects distances from and between objects using a ranging function. Radar sensors can also be used to distinguish between closely spaced occupants via detection of their biometric functions, for example, detection of their different breathing movements. When radar or radar-like sensors are used, better operation may be facilitated when disposed unobstructed or behind a plastic case of a DAE.
- One or more occupancy sensors may include a multi-pixel thermal imager, which when configured with an appropriate computer implemented algorithm can be used to detect and/or count the number of occupants in a room.
- data from a thermal imager or thermal camera is correlated with data from a radar sensor to provide a better level of confidence in a particular determination being made.
- thermal imager measurements can be used to evaluate other thermal events in a particular location, for example, changes in air flow caused by open windows and doors, the presence of intruders, and/or fires.
- One or more color temperature sensors may be used to analyze the spectrum of illumination present in a particular location and to provide outputs that can be used to implement changes in the illumination as needed or desired, for example, to alter (e.g., improve) an occupant's health, comfort, or mood.
- One or more biometric sensors (e.g., for fingerprint, retina, or facial recognition) may be included.
- One or more speakers and associated power amplifiers may be included as part of a DAE or separate from it.
- two or more speakers and an amplifier are configured as a sound bar; e.g., a bar-shaped device containing multiple speakers.
- the device may be designed (e.g., configured) to provide high fidelity sound.
- One or more microphones and/or logic for detecting and processing sounds may be provided as part of a DAE or separate from it.
- the microphone(s) may be configured to detect internally and/or externally generated sounds. Internal may refer to internal to the enclosure. External may refer to external to the enclosure.
- processing and analysis of the sounds is performed by logic (embodied in software, firmware, and/or hardware) in one or more digital structural elements and/or by logic in one or more other devices coupled to the network, for example, in one or more controllers coupled to the network.
- the logic is configured to (e.g., automatically) adjust a sound output of one or more speakers to mask and/or cancel sounds, frequency variations, echoes, and other factors detected by one or more microphones, e.g., that negatively impact (or potentially could negatively impact) occupants present in a location within the enclosure (e.g., the building).
- the sounds comprise sounds generated by, but not limited to: indoor machinery, indoor office equipment, outdoor construction, outdoor traffic, and/or airplanes.
- the DAE comprises one or more air quality sensors.
- the one or more air quality sensors (optionally able to measure one or more of the following air components: volatile organic compounds (VOC), carbon dioxide, temperature, and/or humidity) may be used in conjunction with a heating, ventilation, and air-conditioning system (HVAC system) to adjust (e.g., improve) air circulation.
- the DAE may include a connectivity and/or power hub.
- One or more hubs for power and/or data connectivity to sensor(s), speakers, microphone, and the like may be provided by the DAE.
- the hub may comprise a USB hub, or a Bluetooth hub.
- the hub may include one or more ports such as USB ports, High Definition Multimedia Interface (HDMI) ports, or any other port, plug, or socket disclosed herein.
- the DAE may include a connector dock for external sensors, light fixtures, peripherals (e.g., a camera, microphone, speaker(s)), network connectivity, power sources, etc.
- one or more video drivers may be provided in the DAE.
- the driver may be utilized for a media display (e.g., a transparent OLED media display construct) on or proximate to a window (such as an integrated glass unit (IGU)) associated with the DAE element.
- the driver may be operatively coupled (e.g., wirelessly, by physical wire, and/or optically) to the DAE.
- the optical signal may be launched into the window by optical transmission, such as via a switchable Bragg grating that includes a display with a light engine and a lens that focuses into glass waveguides; the light transmits through the glass and travels perpendicular to the line of sight.
- the DAE or a faceplate that covers all or a portion of the DAE may serve as an antenna.
- Various approaches may be employed to insulate the DAE and use it to transmit and/or receive directionally.
- a prefabricated antenna may be employed in the enclosure.
- a window antenna may be employed. Examples of antennas and their integration in a facility and deployment may be found in International Patent Application Serial No. PCT/US17/31106, filed May 4, 2017, which is incorporated herein by reference in its entirety.
- One or more power sources such as an energy storage device (e.g., a rechargeable battery and/or a capacitor), and the like may be provided.
- the power source may be renewable or non-renewable.
- the plurality of power sources may comprise renewable or nonrenewable power sources.
- a power harvesting device is included; e.g., a photovoltaic cell or panel of cells. This may allow the device to be self-contained or partially self-contained.
- the light harvesting device may be transparent or opaque, e.g., depending on where it is attached.
- a photovoltaic cell may be attached to, e.g., and partially or fully cover, the exterior of a digital mullion.
- a transparent photovoltaic cell may cover a display and/or user interface (e.g., a dial, button, etc.), e.g., on the DAE.
- One or more processors may be configured to provide various embedded or nonembedded applications.
- the processor may comprise a microcontroller.
- the processor is a low-power mobile computing unit (MCU) with memory, configured to run a lightweight secure operating system hosting applications and data.
- the processor is an embedded system, a system on chip, or an extension thereof.
- One or more ancillary processing devices such as a graphical processing unit, an equalizer, or other audio processing device may be used to interpret audio signals.
- the speaker, microphone, and associated logic are configured to use acoustic information to characterize the acoustic map of the enclosure, its air quality, and/or air conditions.
- an algorithm may issue ultrasonic pulses, and detect the transmitted and/or reflected pulses coming back to a microphone.
- the algorithm may be configured to analyze the detected acoustic signal, sometimes using a transmitted vs. received differential audio signal, to determine air density, particulate deflection, and the like, e.g., to characterize air quality in the enclosure.
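The transmitted-vs-received differential analysis described above can be sketched in a few lines of Python. This is an illustrative fragment, not the patent's implementation: the names `rms`, `attenuation_db`, and `differential` are hypothetical, and relating the measured attenuation to air density or particulates would require calibrated hardware and additional models.

```python
import math

def rms(samples):
    """Root-mean-square amplitude of a buffer of audio samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def attenuation_db(transmitted, received):
    """Attenuation (dB) of a pulse between emission and reception,
    from the RMS amplitudes of the two aligned sample buffers."""
    return 20.0 * math.log10(rms(transmitted) / rms(received))

def differential(transmitted, received):
    """Sample-wise transmitted-vs-received differential audio signal."""
    return [r - t for t, r in zip(transmitted, received)]
```

A received buffer at half the transmitted amplitude, for example, yields an attenuation of about 6 dB.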
- the DAE is coupled to a signal (e.g., sound) equalizer.
- the equalizer can facilitate adjustment of room acoustics using, e.g., real time, time delay reflectometry.
- the equalizer (and associated components) can compensate for unwanted audio artifacts, e.g., produced by interactions of the sound waves with items that are in the enclosure (e.g., a room) or otherwise in close proximity with an occupant.
- a signal pulse is generated by a speaker associated with the DAE.
- One or more microphones can pick up the pulse (e.g., directly) and as reflected and/or attenuated by items in the room (e.g., wall roughness, or shelf angle).
- the system can infer boundaries of the enclosure (e.g., room boundaries), etc.
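The pulse-echo (time delay reflectometry) reasoning above reduces to the standard round-trip relation distance = speed × delay / 2. A minimal sketch follows; the function names and the nominal 343 m/s speed of sound are assumptions, not values from this document.

```python
SPEED_OF_SOUND_M_S = 343.0  # nominal speed of sound in air at ~20 C (an assumption)

def echo_delay(direct_arrival_s, echo_arrival_s):
    """Extra travel time of a reflected pulse relative to the direct path."""
    return echo_arrival_s - direct_arrival_s

def boundary_distance(round_trip_delay_s, speed=SPEED_OF_SOUND_M_S):
    """Distance to a reflecting surface (e.g., a wall) from the round-trip
    delay between pulse emission and echo reception at a collocated sensor."""
    return speed * round_trip_delay_s / 2.0
```

A 20 ms round trip, for instance, places a reflecting surface roughly 3.4 m away.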
- a user’s mobile device (e.g., smart phone, pad, or laptop) may serve as an additional sound sensor; as the user (e.g., with the mobile device enabled) moves about the enclosure, the DAE can determine how to optimize speaker output.
- the optimization may be after the acoustic profile of the room is mapped.
- the optimization may be a corrective action.
- the optimization may comprise (e.g., controllably and/or automatically) adjusting one or more sound absorbers, diffusers, and/or deflectors in specific areas that affect the sound map in the enclosure.
- the optimization may be automatically controlled.
- the optimization may comprise altering a white noise level, a fixture (e.g., wall or ceiling) roughness, adjustable shelves (e.g., vents), and/or speaker output.
- the DAE can be programmed to tune its speaker output based on various factors such as where the user is located in the enclosure.
- the DAE (e.g., device ensemble) can, in some embodiments, detect the user location using any of a number of proximity techniques, such as those described in International Patent Application Serial No. PCT/US17/31106, filed May 4, 2017, which is incorporated herein by reference in its entirety.
- Fig. 4 schematically shows an example of components related to a digital architectural element (DAE).
- an arrangement 400 includes a DAE 430 and a processor (e.g., computer) 440.
- the processor 440 is connected (e.g., via Ethernet connection) to an external network 441.
- the external network can include internet and/or a cloud-based content and/or service provider.
- the connection of the processor to the external network may include an appropriate modem, router, switch, and/or a high bandwidth backbone such as a 10 Gigabit backbone.
- the processor 440 may also be connected to a display 409 (e.g., video display) via, in this example, a High-Definition Multimedia Interface (HDMI) link.
- the processor 440 is connected to ports 411 (e.g., USB, Wi-Fi, Bluetooth, or any other port and/or socket disclosed herein), e.g., to make available additional internal and/or external resources for the DAE 430.
- a DAE may include any device disclosed herein (e.g., various sensors and peripheral elements).
- In the example illustrated in Fig. 4, DAE 430 includes speakers 417, microphone 419, and various sensors 421 such as temperature, humidity, pressure, and gas flow sensors. Any one or more of these components may be coupled to the computer or processor 440 via the ports 411. Any of the devices may be reversibly plugged in and out of the electronic circuitry of the DAE, e.g., via connectors 421-423. Any of the devices may communicate via wired or wireless (e.g., 425) communication. The communication may be to the network, to the processor 440, or to any other processor configured to receive the communication. The communication can be monodirectional or bidirectional. In the example shown in Fig. 4, bidirectional communication is designated by bidirectional arrows, e.g., 431-436.
- the DAE is coupled to an equalizer 413 configured to provide tone control to adjust for acoustics of the enclosure in which the DAE is disposed.
- the DAE may be also referred to herein as “device ensemble,” “ensemble of devices,” or a “device assembly.”
- a plurality of transducers such as sound emitters (e.g., speakers) and sound sensors (e.g., microphones) are disposed in the facility to acoustically map enclosure environments.
- the sound transducers may each have known locations.
- the sound transducers may be communicatively coupled together via a network, e.g., a communications and power network.
- the sound emitters and sensors may use (i) sound frequency sweeping, (ii) their location (e.g., relative and/or absolute location), and (iii) mutual timing coordination, to generate the acoustic mapping of the enclosure (e.g., facility).
- the acoustic mapping can be done automatically, in situ, and/or in real time during a sound event (e.g., a conference).
- the acoustic mapping may be done outside of the sound event (e.g., after work hours). Any change in the enclosure (e.g., facility) affecting the acoustic mapping can be accounted for in initial acoustic mapping and/or updated testing.
- acoustic mapping allows one to know how well various enclosure (e.g., facility) environments (e.g., rooms) are isolated from noise generated in other areas of the enclosure (e.g., other rooms), allowing areas that are not sufficiently isolated to be identified for corrective action (e.g., sound optimization). From this data, insufficiently acoustically isolated environments can be made more isolated by taking any of the corrective measures disclosed herein, for example, by adding or configuring sound absorbers, diffusers, and/or deflectors in specific areas.
- a two-dimensional or three-dimensional virtual representation of an environment helps define the areas of interest for which acoustic properties are to be determined and managed.
- a representation may utilize a model such as a Building Information Modeling (BIM) model (e.g., an IMS model or an Autodesk Revit file), e.g., to derive a representation of (e.g., basic) fixed structures and movable items such as doors, windows, and elevators.
- the model may be annotated with representations of other elements (e.g., fixtures and non-fixtures) which may be permanent or non-permanent elements.
- the installed locations of transducers (e.g., speakers and microphones) and/or sensor ensembles (e.g., DAE) may be annotated in the model.
- a user may annotate the model to include information regarding requested acoustic properties, e.g., for corresponding zones (e.g., rooms) in the model.
- a zone may be designated as a one-person office which implies requesting a high degree of acoustic isolation (e.g., so that use of a speaker-phone can be conducted in the office without sound interference from outside the office and so that the sounds made by the user of the office and the sounds from the speaker-phone itself do not become distractions for people outside the office).
- Modifications within an enclosure may alter the areas of interest.
- office cubicles may be introduced and/or reconfigured in ways that change the acoustic properties (e.g., transfer function(s) defining an acoustic attenuation) of one or more zones in ways that could be undesirable.
- the roughness and/or material of fixture surfaces facing the enclosure interior may be altered (e.g., to alter the sound map in the enclosure). The angle of various shelves may be altered to change the sound map.
- the enclosure comprises one or more sound transducers (e.g., emitters such as speakers) and/or sound sensors.
- the sound transducers and/or sensors may be installed to occupy regularly spaced locations.
- an interplay between emitters and sensors can be attuned to the expected acoustics of an enclosure.
- the emitters and sensors are spaced according to an occupant density in a building to achieve a finer acoustic tuning in more heavily occupied spaces.
- the emitters and sensors are spaced according to area of interest in a building to achieve a finer and/or rougher acoustic tuning according to the area of interest requirements.
- Transducer locations may be chosen toward the top and sides of a particular zone so that the interplay between emitters and sensors can be used to create a 3D acoustic mapping.
- Fig. 5 shows an example representation 500 of an enclosure as a floor 501 of a (e.g., multi-story) building.
- an outer hallway 502 has access to an office suite 503 via a main door 504.
- the office suite includes inner offices 505, 506, and 507 and a conference room 508.
- Doors 509 provide access to inner offices 505-507 and conference room 508 as shown.
- a plurality of digital architectural elements (e.g., device ensembles) 510 are installed, e.g., throughout office suite 503.
- Each ensemble 510 may include at least one sound emitter (e.g., speaker or buzzer) and at least one sound sensor (e.g., microphone) integrated into a common housing.
- Each ensemble 510 may be mounted in a window frame portion (e.g., mullion), on a wall surface, or suspended from a ceiling, for example.
- the sound transducers of the ensembles (e.g., 510) may be capable of testing and monitoring a respective zone which may vary depending on surrounding structures such as fixtures (e.g., walls, windows, and doors) and non-fixtures (e.g., furniture and movable partitions for cubicles).
- a center area of suite 503 includes partitions 511 for cubicles.
- acoustic properties may be acquired using a data collection process that establishes an initial acoustic map that is valid for the office configuration at the time of data collection.
- Fig. 6 shows an example representation 600 of floor 601, which is floor 501 shown in the example of Fig. 5, after being reconfigured in ways that impact the acoustic properties of interest.
- groups of cubicle partitions 605 and 606 are added to office suite 603.
- Sensor ensembles 610 are shown as having the same configuration as in Fig. 5, although some reconfiguration of emitters/sensors could occur (e.g., additional emitter/sensor locations).
- alteration of fixtures and/or nonfixtures may follow additions and/or relocation of DAE(s).
- Revised acoustic properties may be acquired to establish an updated acoustic map that is valid for the enclosure configuration, e.g., at the time of an updated data collection.
- a processing system may be configured to analyze (e.g., compare) the sound emitted by the emitter and the sound sensed by the sound sensor, and based on the locations of these emitters and sensors, form the acoustic mapping of the environment. Changes in fixtures and/or nonfixtures may happen during use of an enclosure (e.g., office). The changes may or may not have a noticeable and/or measurable effect on the acoustics of the enclosure.
- a notification and/or a report is issued to a user (e.g., building or office manager) only when a difference between the updated acoustic map and the initial acoustic map is greater than a threshold (e.g., predetermined difference).
- the threshold may be a value or a function (e.g., time dependent function and/or space dependent function).
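The notify-only-above-threshold behavior, with a threshold that may be a plain value or a time- and/or space-dependent function, can be sketched as below. The function signature is a hypothetical illustration, not an interface from this document.

```python
def exceeds_threshold(diff_db, threshold, t=None, zone=None):
    """True when an acoustic-map difference (in dB) exceeds the threshold.
    The threshold may be a plain value, or a callable of time and/or zone
    (i.e., a time- and/or space-dependent threshold function)."""
    limit = threshold(t, zone) if callable(threshold) else threshold
    return diff_db > limit
```

A scalar threshold and a zone-dependent callable are then handled uniformly by the same check.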
- a communication network, e.g., that of a facility, is communicatively coupled to a plurality of sound emitters (e.g., speakers) and a plurality of sound sensors (e.g., microphones) disposed in the facility.
- the emitters may be configured to emit sound in a (e.g., wide) range of frequencies and/or at a sound intensity (e.g., sound pressure level or power).
- the frequency may be at least about 1 Hertz (Hz), 10 Hz, 100 Hz, 1 kHz, 10 kHz, 20 kHz, or 50 kHz.
- the frequency may be at most about 10 Hz, 100 Hz, 1 kHz, 10 kHz, 20 kHz, or 50 kHz.
- the frequency may be between any of the aforementioned frequency values (e.g., from about 1 Hz to about 50kHz, from about 10Hz to about 20kHz, from about 100Hz to about 50kHz).
- the frequency range may comprise (i) a continuous frequency range or (ii) a discrete frequency range.
- the sound intensity may be predetermined.
- the sound intensity may comprise a range of sound intensities. In some embodiments, the sound may or may not be perceptible to the human ear.
- the sensors are configured to receive the sound(s) and convert it to an electrical signal.
- the emitter(s) and sensor(s) may be utilized for creation and/or alteration of an acoustic transfer function.
- the emitter(s) and sensor(s) may be utilized for detection of faults and/or changes in an acoustic transfer function (e.g., due to a new obstruction, and/or fixture change).
- data collection for acoustic mapping utilizes a first sound sensor at a first location, a second sound sensor at a second location different from the first location, and a sound emitter at a third location.
- the third location may be different from the first and second locations.
- the locations may differ in X, Y, and/or Z Cartesian coordinates. In some embodiments, the third location may coincide with one of the first and second locations.
- a greater number of sensors and emitters is used with a greater number of (e.g., predetermined) locations, e.g., in order to obtain a greater mapping resolution (e.g., with the distribution of sensors and emitters providing appropriate overlap of zones so that test signals from an emitter can be sensed at a greater number of sensors).
- Testing may be performed at a time of low occupancy in the facility, e.g., at night, on a weekend, and/or on a holiday.
- a time for the sound sweeping subroutine may be scheduled using a calendar function.
- Network interaction between modules or nodes (e.g., device ensembles) may be used to coordinate a sound sweeping subroutine among the various emitters and sensors.
- sequential frequency sweeping may be performed by selected (e.g., some or all) emitters in the facility.
- the sweeping of sound frequencies may extend to any frequency range delineated herein (e.g., from about 10 Hz to about 20 kHz, or from about 1 Hz to about 50kHz).
- a first sound emitter may emit a sound at a frequency range (e.g., using frequency sweeping), and (e.g., selected or all) sensor(s) in the facility may be programmed to “listen” and sense the emitted sound frequencies.
- the sound emitter may be in an enclosure (e.g., room), and the sound sensors may be in the same enclosure (e.g., room) and/or in a different enclosure (e.g., anywhere else in the facility) where a sound may be detectable.
- the emitters may be included in window frames of the facility envelope, e.g., in a transom and/or mullion, as shown for example in U.S. Patent Application Serial No. 16/608,157, filed on October 24, 2019, titled “DISPLAYS FOR TINTABLE WINDOWS,” which is incorporated herein by reference in its entirety.
- at least a portion of the emitters may be located farther within the interior of the facility.
- a second sound emitter may emit a sound at a frequency range (e.g., using frequency sweeping), and (e.g., selected or all) sensor(s) in the facility may be programmed to “listen” and sense the emitted sound frequencies. This process may be continued with other sound emitter(s) until all requested emitters have completed their frequency sweep, and each of the sound sensors have measured a respective acoustic response corresponding to each acoustic test signal.
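The sequential procedure described above (each emitter performs its frequency sweep in turn while all requested sensors listen) can be sketched as a coordination loop. The `sweep` callable is a hypothetical stand-in for driving one emitter through its sweep and reading back one sensor's measured response.

```python
def run_testing_sequence(emitters, sensors, sweep):
    """Sequentially trigger each requested emitter while every requested
    sensor listens, returning responses keyed by (emitter, sensor) pair.
    `sweep(emitter, sensor)` is a hypothetical measurement callable."""
    responses = {}
    for e in emitters:          # one emitter performs its sweep at a time
        for s in sensors:       # all requested sensors "listen"
            responses[(e, s)] = sweep(e, s)
    return responses
```

The sequence completes once every requested emitter has swept and each sensor has recorded a response for each test signal.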
- a sound attenuation (e.g., acoustic transfer function) may be determined for each pair of a sound emitter and a sound sensor.
- the testing/mapping may be performed per enclosure or enclosure portion (e.g., per room, per group of rooms, per floor, per group of floors, per building, or per facility). All the emitters (e.g., speakers) and sensors (e.g., microphones) may have known locations in the enclosure. The locations may be determined at the time of installation (e.g., by a traveler such as an installer, or by a robot such as a wheeled robot or a drone), obtained from an architectural planning, computer-aided design (CAD) file (e.g., Revit file), or detected using an autolocation procedure (e.g., as disclosed in US Provisional Patent Application Serial No.
- the amount of time required for data collection according to the frequency mapping procedure and to generate a corresponding mapping of the facility may be at most a day, 8h, 4h, 2h, or 1h.
- Fig. 7A shows an example of device ensembles 701-705, each containing a respective sound emitter and sound sensor. It should be understood however, that not every DAE must include a sensor and an emitter.
- a DAE may comprise a sound emitter and be devoid of a sound sensor.
- a DAE may comprise a sound sensor and be devoid of a sound emitter.
- ensemble 701 includes a collocated sound emitter E1 and sound sensor S1.
- Ensemble 702 includes a collocated sound emitter E2 and a sound sensor S2.
- Ensemble 703 includes a collocated sound emitter E3 and a sound sensor S3.
- Ensemble 704 includes a collocated sound emitter E4 and a sound sensor S4.
- Ensemble 705 includes a collocated sound emitter E5 and a sound sensor S5.
- Fig. 7A depicts an example of a sound pressure wave 700 propagating from emitter E1 (in ensemble 701) to sound sensors S2, S3, S4, and S5 (in ensembles 702-705, respectively) during a respective test signal.
- the sound may propagate along a plurality of sound paths (e.g., through respective zones or rooms in the enclosure).
- Fig. 7B shows an example of a succeeding step in the acoustic mapping process, wherein a sound pressure wave 710 propagates from sound emitter E2 (in ensemble 702) to sound sensors S1, S3, S4, and S5 during a second test signal.
- each testing signal uses a frequency sweeping signal which is continuous in a frequency range.
- a transfer function defining the acoustic attenuation from one location (e.g., zone) to another may be determined as a change in sound intensity according to sound frequency (e.g., some frequencies are attenuated more quickly than others), in a space (e.g., a space of the enclosure).
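A transfer function of this kind can be sketched as a per-frequency difference between emitted and received intensity in dB. This is illustrative only (real measurements would be banded and averaged); the dict layout is an assumption.

```python
def transfer_function(emitted_db, received_db):
    """Per-frequency attenuation (dB) along one sound path, given dicts
    mapping frequency (Hz) to intensity (dB) at the emitter and at the
    sensor; larger values mean stronger attenuation at that frequency."""
    return {f: emitted_db[f] - received_db[f] for f in emitted_db}
```

Frequencies attenuated more quickly than others then simply show larger per-frequency values.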
- the transducers and support electronics (e.g., drivers) may have frequency-dependent response characteristics.
- an acoustic mapping taking frequency into account may be used.
- a tone with a (e.g., continuous and/or discrete) frequency sweep (e.g., ramping) between a first frequency and a second frequency may be emitted.
- discrete frequency steps may be used (e.g., discrete frequencies that are detectably separable from each other).
- the discrete steps may follow continuously or may be spaced apart in time.
- the sound sweeping may be partially continuous (e.g., continuous ramping in a first frequency range) and partially discrete (e.g., discrete steps in a second frequency range).
- Fig. 8A shows an example of a frequency ramp 800 conducted over a testing interval for a particular sound emitter.
- Ramp 800 begins at a first time at a minimum frequency and increases over time to a maximum frequency.
- a frequency ramp begins at the maximum frequency and decreases over the testing interval to the minimum frequency.
- Fig. 8B shows a testing signal 810 having discrete steps of increasing frequency during a testing interval. Likewise, the discrete steps may be decreasing from the maximum frequency to the minimum frequency during the testing interval. When discrete steps are used, the frequency progression may follow any arbitrary ordering of frequencies.
- At least two of the discrete frequency steps may be of different durations and/or different frequency spans (e.g., one spanning 5 Hz, and the other spanning 10 Hz). At least two (e.g., all) of the discrete frequency steps may be of the same duration and/or the same frequency span (e.g., both spanning 10 Hz).
- Fig. 8B shows an example in which all of the discrete frequency steps are of the same duration and of the same frequency span.
- Fig. 9A shows an example of a combined continuous and discrete testing signal 900 having an initial continuously-increasing ramp phase 901, an intermediate discrete phase 902, and a final continuously-increasing ramp phase 903.
- Fig. 9B shows an example of a testing sequence 910 with discrete steps (e.g., 911, 912, and 913) spaced in time. Although monotonically-decreasing frequencies are shown, an increasing or arbitrary ordering of (e.g., increasing and decreasing) frequencies may be used.
- the sound intensity generated by a sound emitter as it sweeps through various frequencies is kept substantially constant.
- the sound intensity is a function of the emitted frequency (e.g., following a loudness curve according to the sensitivity of human hearing).
- the sensor(s) may measure an intensity (e.g., sound pressure level (SPL), and/or sound power expressed in dB).
- a difference between the emitted intensity and the detected intensity at each frequency may specify an acoustic transfer function between a respective pair of a sound emitter and a sound sensor, for example.
- the corresponding attenuation (e.g., in dB) at various frequencies enables analysis of how well various frequencies are activated and/or damped by the fixtures (e.g., wall) and non-fixtures (e.g., table) of the enclosure.
- An acoustic map may comprise a compilation of attenuation data for each pair of a sound emitter and a sound sensor, which may enable analysis of whether different zones (or locations) provide the requested acoustic attenuation for intended uses of the space, and/or detection of any changes in suitability of the space over time.
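Compiling the attenuation data into such a map, and checking a zone's requested attenuation against it, might look like the following sketch. The data layout (dicts keyed by emitter/sensor pair and frequency) is an assumption, not the patent's format.

```python
def build_acoustic_map(responses, emitted_db):
    """Compile an acoustic map: for each (emitter, sensor) pair, the
    per-frequency attenuation in dB (emitted minus received)."""
    return {
        pair: {f: emitted_db[f] - rx_db[f] for f in emitted_db}
        for pair, rx_db in responses.items()
    }

def meets_requirement(acoustic_map, pair, min_attenuation_db):
    """True when every tested frequency along one path is attenuated by
    at least the requested amount (e.g., the isolation requested for a
    one-person office)."""
    return all(a >= min_attenuation_db for a in acoustic_map[pair].values())
```

Paths failing the check identify zones that are candidates for corrective action such as added absorbers, diffusers, or deflectors.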
- Fig. 10A shows an example of sound intensity diagrams for sounds emitted according to a sweeping of the frequency of a test tone during a testing interval.
- a plot 1000 shows a constant sound intensity whereby all emitted frequencies are produced at the same SPL or power.
- a plot 1001 shows a variable sound intensity generally following an “equal loudness contour” wherein midrange frequencies (e.g., from about 1 kHz to about 4 kHz, corresponding to the greatest sensitivity of human hearing) have a lowest emitted intensity and the frequencies at the low end and at the high end have a higher emitted intensity.
- Fig. 10B shows an example of a sound intensity diagram for sounds received at different sensors according to the sweeping of the test tone frequency from one emitter.
- a plot 1010 shows a received intensity according to frequency for a sensor (e.g., microphone) at a first location
- a plot 1011 shows a received intensity according to frequency for a sensor at a second location.
- differences in distance traveled, intervening objects, and/or reflective surfaces for the respective sound paths from the emitter to each of the sensors result in a lower overall sound intensity for plot 1011.
- a received intensity for the same testing signal may change at a later time for one particular sensor due to modifications of the fixtures and/or non-fixtures present within the enclosure.
- generating an acoustic mapping is done based on experimental results alone, e.g., (I) without considering a map (e.g., a 3D map) of the space, such as a BIM (e.g., Revit) file of the facility, and/or (II) without any other information regarding fixtures and non-fixtures in the enclosure.
- the BIM file may assist in determining a testing sequence, e.g., using (I) emitter and sensor locations, and/or (II) how the acoustic zones align with fixtures and non-fixtures in the enclosure.
- a BIM file may be created before or upon construction of the facility. Since the BIM file may not be constantly updated (e.g., it may be cumbersome and/or time consuming to update), a testing sequence may be determined without consulting a BIM file. Selections of emitters and sensors to participate in a testing sequence may be preselected, automatically generated (e.g., by at least one controller), and/or user selected.
- a testing sequence may correspond to an indicated size of a portion of a facility (e.g., a multi-story building or a floor).
- a testing sequence may be conducted, e.g., at a user selected time and zone(s) (e.g., point of interest).
- a testing sequence may be automatically or manually triggered, e.g., after a big change has occurred in the facility (e.g., wall restructuring, and/or revised placement of furniture).
- an initial acoustic map is generated for a facility upon installation of the digital architectural elements and processor network. Based at least in part on an initial acoustic map and the desired acoustic performance (e.g., isolation) between different areas within the enclosure, alterations of and/or additions to, the fixtures and/or non-fixtures in the enclosure may be made in order to achieve the requested acoustic performance.
- a sound map is generated for the enclosure.
- data collection may proceed by selecting a first sound emitter for producing a test tone.
- the sound emitter may be commanded to generate the test tone while one or more sensors are commanded to monitor for reception of the test tone.
- the sound sensors may record a measured intensity at which the test tone is received.
- the sound emitters are sequentially triggered while corresponding sound sensors monitor the received intensities.
- the sound attenuation data is mapped for the areas (e.g., zones) of interest in the enclosure.
- the acoustic map comprises transfer functions according to the sound attenuation along (e.g., each) relevant sound path.
- a newly generated acoustic map can be analyzed relative to (e.g., compared to) a previous map (e.g., the initial acoustic map or an acoustic map from a (e.g., the most) recent performance of the testing sequence). If significant (e.g., above a threshold) changes are found between the successive acoustic mappings, then an electronic notification and/or report may be generated to inform a user (e.g., facility owner, tenant, and/or building manager) of the changed situation, e.g., so that mitigating actions can be taken.
- generation of an acoustic map includes sound simulation(s) according to a model of the enclosure (e.g., Revit file and information concerning contents such as furniture).
- An accurate sound simulation may take an extensive amount of time (e.g., on the order of days, depending on the requested resolution), computing power, and/or cost.
- acoustic mapping relies on experimental results without use of a previously generated physical simulation (e.g., using physics modeling considering enclosure fixture and non-fixture structure, material, and surface texture, and sound interacting with those).
- a mapping function may run a simulation of lower complexity (e.g., without considering (e.g., surface) material properties of the facility fixtures and/or non-fixtures).
- Fig. 11 shows a scheme for performing a testing sequence and detecting changes in acoustic properties in an enclosure.
- the testing sequence begins in block 1101 with the selection of an emitter to be activated.
- the selected emitter emits a test tone (e.g., a frequency sweep) while the relevant sensors record its characteristics (e.g., frequency, optionally intensity, and optionally time).
- a check is performed in block 1103 to determine whether there are more sound emitters to be included in a testing sequence.
- a return is made to block 1101 to proceed to the next sound emitter. If not (e.g., all the sound emitters have been activated) then the collected data is used to create an acoustic map in block 1104.
- the acoustic map is analyzed (e.g., compared) with an acoustic map as created from a historic (e.g., prior) data collection.
- a check is performed in block 1106 to determine whether the analysis indicates that a significant (e.g., above a threshold) change has occurred.
- a significant change may be detected when a transfer function (e.g., attenuation of sound intensity) associated with a particular sound path has changed by an amount greater than a predetermined difference (e.g., a threshold specified as a particular value in dB). If there is not a significant change, then the procedure returns to block 1100 to wait for a time at which a next testing event will occur (e.g., according to a schedule). If there is a significant change, then actions may be taken in block 1107 to deliver a notification (e.g., report) to a person and/or to mitigate a deteriorated acoustic property by reconfiguring, adding, or removing a fixture or non-fixture that impacts the associated sound path.
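The Fig. 11 testing-sequence loop and comparison step can be sketched as follows. The emitter/sensor interfaces (`emit_test_tone`, `measure`, `location`, `name`) and the 3 dB threshold are hypothetical placeholders, not the patented implementation:

```python
THRESHOLD_DB = 3.0  # hypothetical "significant change" threshold

def run_testing_sequence(emitters, sensors):
    """Blocks 1101-1104: trigger each emitter in turn and record the
    intensity received by every sensor, keyed by sound path."""
    acoustic_map = {}
    for emitter in emitters:                  # block 1101, looped via 1103
        for sensor in sensors:
            if sensor.location != emitter.location:
                level = sensor.measure(emitter.emit_test_tone())
                acoustic_map[(emitter.name, sensor.name)] = level
    return acoustic_map                       # block 1104

def significant_changes(new_map, old_map, threshold=THRESHOLD_DB):
    """Blocks 1105-1106: compare against a prior map and flag paths whose
    measured level changed by more than the threshold."""
    return {
        path: (old_map[path], new_map[path])
        for path in new_map
        if path in old_map and abs(new_map[path] - old_map[path]) > threshold
    }
```

Any non-empty result from `significant_changes` would correspond to the block 1107 branch (notification and/or mitigation).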
- the scheduling of the time event may depend at least in part on a calendar, a time interval in which low sound activity in the enclosure is projected (e.g., night time, out of working hours, closure, holiday, and/or vacation).
- the scheduling of the time event may depend at least in part on a projected or occurred change in a fixture and/or non-fixture in the enclosure.
- the scheduling of the time event may depend at least in part on a user request.
- changes in measured sound attenuation from one testing sequence to another are used to distinguish between changes caused by faults occurring in a sound transducer (e.g., speaker or microphone) and changes caused by altered acoustic properties along the sound paths.
- a first sound emitter (e.g., a first buzzer) of a first ensemble at a first known location emits a first sound.
- the emitted first sound is picked up at a second sound sensor of a second ensemble and optionally at a third sound sensor in a third location (e.g., and at other additional sound sensors of other ensembles) at other known locations.
- a second sound emitter (e.g., a second buzzer) emits a second sound.
- the emitted second sound is picked up at the second sound sensor of the second ensemble and optionally at the third sound sensor in the third location (e.g., and at other additional sound sensors of other ensembles) at other known locations, without change.
- the second sound emitter (e.g., second buzzer) and a third sound emitter (e.g., third buzzer) emit sounds.
- the sounds are picked up at a first sound sensor of the first ensemble (e.g., and at the third, and at other sound sensors).
- Fig. 12 shows an example of a flowchart depicting operations for acoustic mapping and fault detection.
- a first sound emitter is activated and corresponding sound is detected at one or more sound sensors at other locations (each paired sensor defining a respective sound path).
- the detected sounds are measured (e.g., a sound intensity is recorded at respective frequencies in a frequency swept signal).
- second, third, and/or other sound emitters are activated at second, third, and/or other locations respectively, whose emitted sound is detected by a first sound sensor co-located with the first sound emitter (e.g., in a same device ensemble or digital architectural element).
- the detected (e.g., measured and recorded) sounds are compared to an initial or other previously measured mapping of the corresponding sound paths. If each sound intensity is (e.g., substantially) as expected (e.g., the same as its previous measurement), then the process determines at block 1204 that there are no sensor/emitter (e.g., transducer) faults and there have been no changes in the acoustic properties of the sound paths.
- the process determines at block 1208 that there is a high likelihood of a fault in the first sensor. Otherwise, at block 1209 there is a determination of a high likelihood that there has been a change in the acoustic properties of the corresponding sound path.
- the system may send a notification or direct the initiation of an appropriate corrective action. For example, a technician can be sent to verify the situation of the suspected sensor, emitter, and/or surrounding. For example, a technician can be sent to replace the suspected sensor and/or emitter. For example, a technician can be sent to record a change in the surrounding (e.g., in a BIM such as a Revit file).
- Figs. 13A, 13B, and 13C show examples of decision matrices as embodiments in the flowchart of Fig. 12.
- a cell in each matrix contains a “Y” (e.g., Yes) to indicate that an unexpected change has occurred (e.g., a difference between measured intensities M1 and M2 of certain sound frequency(ies) at first and second respective testing times is greater than a threshold difference Δ, designated by “M1 - M2 > Δ”) or an “N” (e.g., No) if an unexpected change is not detected.
- the letter “X” signifies a non-applicable matrix rubric.
- a test signal emitted by sound emitter E1 at a first location results in unexpected changes in the sound sensed by sound sensors S2 and S3 at second and third locations.
- a change in a sensor refers to an alteration in a sound emitted by an emitter that is sensed by the sensor, as compared to historic measurement(s) of a sound previously emitted by the emitter, which was sensed by that sensor.
- a test signal emitted by sound emitter E2 at the second location results in no changes in the sound as sensed at sound sensors S1 and S2 at the first and second locations.
- one test signal along the first acoustic path between the first and second locations resulted in a change and another test signal sent in the opposite direction resulted in no change. Therefore, it is likely that sound emitter E1 and/or sensor S1 is faulty.
- a determination with high likelihood may be made as to which sound emitter is faulty.
- a faulty sound emitter emits a faulty sound in frequency and/or intensity. Since the emitted test signal from emitter E1 also resulted in a changed result at sensor S3, and since the emitted test signal from emitter E3 resulted in no change at sensor S1, it may be deduced that sound emitter E1 is faulty with high likelihood.
- Fig. 13B shows an example in which a test signal emitted by sound emitter E1 at a first location results in no change in the sound sensed by sound sensors S2 and S3 (as compared to historic measurements).
- a test signal emitted by sound emitter E2 at the second location results in an unexpected change in the sound sensed by sound sensor S1.
- one test signal along the first acoustic path between the first and second locations resulted in no change and another test signal sent in the opposite direction resulted in an unexpected change. Therefore, it is likely that sound emitter E1 and/or sound sensor S1 is faulty.
- Fig. 13C shows an example in which a test signal emitted by sound emitter E1 results in an unexpected change in the sound sensed by sound sensor S2 but no change in the sound sensed by sound sensor S3.
- a test signal emitted by sound emitter E2 results in an unexpected change in the sound sensed by sound sensor S1 but no change in the sound sensed by sound sensor S3. Therefore, it is unlikely that either sound emitter E1 or sound sensor S1 is faulty.
- an ability to differentiate between a faulted transducer and an actual change in acoustic properties is obtained (e.g., with high likelihood) without requiring a co-location pairing of emitters and sensors.
- Emitters and sensors may be separately and/or arbitrarily placed, provided that there is sufficient overlap of their operational zones (e.g., each emitter is receivable by one or more sensors and each sensor is able to receive from one or more emitters).
- Acoustic mapping may proceed, for example, by considering measured attenuation between each respective pairing of an emitter and a sensor at locations within a normally receivable range.
- the sensors at locations where the same emitter is receivable may be analyzed (e.g., evaluated and/or checked) to determine whether they detected a similar change.
- the lack of a similar change may indicate the possibility of a fault in the sensor of the particular pairing.
- other measurements from the sensor of the particular pairing made in response to other emitters may be analyzed (e.g., evaluated and/or checked) to determine whether they detected a similar change.
- a similar change for all measurements made by the sensor of the particular pair may indicate the possibility of a fault in that sensor.
- a possible emitter fault may be detected by checking whether it is true that (A) all the sensors within a receivable range of the particular emitter detected a similar change, and (B) all the sensors detected no substantial change for (e.g., any) other emitters.
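The sensor- and emitter-fault checks above can be expressed over a matrix of change flags like those in Figs. 13A-13C. The data layout and function names below are hypothetical, and the rules are a simplified reading of the described likelihood heuristics:

```python
def suspect_emitter(changes, emitter):
    """Suspect an emitter fault when (A) every sensor in its receivable
    range reports a change and (B) no sensor reports a change for any
    other emitter.  `changes[e][s]` is True when emitter e's sound, as
    measured at sensor s, differs from the historic measurement;
    non-applicable pairings ("X" in the figures) are simply absent."""
    all_changed = all(changes[emitter].values())
    others_quiet = all(
        not flag
        for e, row in changes.items() if e != emitter
        for flag in row.values()
    )
    return all_changed and others_quiet

def suspect_sensor(changes, sensor):
    """Suspect a sensor fault when every measurement made by that sensor
    changed, regardless of which emitter produced the sound."""
    flags = [row[sensor] for row in changes.values() if sensor in row]
    return bool(flags) and all(flags)

# Fig. 13A-like scenario: E1's tone changed at S2 and S3, while E2 and E3
# produced no changes anywhere, so E1 is the likely fault.
fig_13a = {
    "E1": {"S2": True, "S3": True},
    "E2": {"S1": False, "S3": False},
    "E3": {"S1": False, "S2": False},
}
```

Running the checks on this matrix flags E1 as the suspected faulty transducer while clearing the sensors, matching the deduction described for Fig. 13A.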
- a suspicion or detection of a fault may be reported (e.g., via an immediate notification or a periodic report).
- updated acoustic properties may be analyzed (e.g., and if detected, also reported and/or updated in the BIM).
- the availability of a mapping of acoustic properties in an enclosure enables the detection and/or characterization of predetermined sound events (e.g., loud and/or abrupt sounds being detected for building safety, health, and/or security purposes). For example, a location of a sound event and/or a classification of the sound may be (e.g., automatically) detected. While a single sensor (e.g., microphone) may be able to record a sound that is then compared to prototypical sound samples for possible classification, the localization may only be within a range of the microphone, and the classification accuracy may be limited.
- the sound event may have a (e.g., predetermined) sound signature.
- the sound event may be an emergency event.
- the sound event may be a plea for help.
- the occurrence of the event may have two or more levels of classification, for example, a general event type (e.g., cough, wind, breakage, gun-shot, or explosion) and a more specific event sub-type.
- an origin (e.g., point or area) of the sound may be detectable.
- an occurrence of a gun-shot, the type of gun-shot, and an origin of gun-shot can be detected (e.g., wherever there is sufficient acoustic mapping resolution).
- a sound event may be recognizable as a cough, which may be characterized according to cough type (e.g., dry vs. wet cough, deep vs. shallow) and location of cough origin (e.g., floor, room, or location within a room) depending on mapping resolution.
- a cough detection differentiates between types of coughs, e.g., Covid-19 cough, pneumonia cough, or common cold cough.
- screams may be recognized as accompanied by a prototypical pattern or by other kinds of acoustical recognition.
- For example, an abrupt and/or intense sound may be due to: wind (e.g., due to hurricane, tornado, tsunami, typhoon, or derecho), earthquake (e.g., tectonic, volcanic, collapse, or explosion), explosion, breakage (e.g., of a fixture such as a window or wall), or volcanic eruption.
- steps may be detected (e.g., running direction of a person can be tracked).
- a potential sound event is detected in response to a sound of interest (e.g., an irregular (e.g., loud and/or abrupt) sound burst) detected (e.g., substantially) simultaneously at two or more sensors.
- the relative detected sound intensities at the sensors may be used to interpolate a location where the sound was generated (e.g., having a generation signature) and/or where it is most intense.
- the detected sound may be compensated according to the known acoustical transfer function(s) of the sound paths, e.g., from the interpolated location of the sound to the locations of the sensors.
- the acoustic transfer function may involve greater attenuation at certain frequencies and/or intensities.
- the compensation may include applying an inverse transfer function which boosts a sound signal at the frequencies attenuated by the sound path.
- Compensated sound signals from different sensors may be combined prior to classification (e.g., pattern recognition or matching) to further improve the accuracy of recognition.
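As an illustration, compensation by an inverse transfer function can divide each bin of the received spectrum by the path's magnitude response, boosting the frequencies the path attenuated; the combined spectrum can then be passed to classification. The representation below (lists of magnitude bins) and function names are hypothetical, a sketch rather than the patented method:

```python
def compensate(received_spectrum, path_response):
    """Apply the inverse transfer function: boost each frequency bin by
    the reciprocal of the path's magnitude response at that bin."""
    eps = 1e-9  # guard against division by zero in deeply attenuated bins
    return [r / max(h, eps) for r, h in zip(received_spectrum, path_response)]

def combine(compensated_spectra):
    """Average compensated spectra from several sensors before
    classification to improve recognition accuracy."""
    n = len(compensated_spectra)
    return [sum(bins) / n for bins in zip(*compensated_spectra)]

# A path that halves the second bin is undone by compensation.
restored = compensate([1.0, 0.5], [1.0, 0.5])  # -> [1.0, 1.0]
```

The averaging in `combine` is the simplest possible fusion; weighting by each sensor's signal-to-noise ratio would be a natural refinement.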
- upon detection of a predetermined sound event, it may be reported to a user, and/or some types of events may have corresponding automatic mitigating actions that may be taken (e.g., activating an alarm).
- the sound event may be (e.g., automatically and/or manually) notified to (e.g., all) enclosure occupants (e.g., via their mobile devices and/or ID tags), to enclosure owners, to enclosure lessor, to enclosure lessee, to authorities (e.g., police, firefighters, hospitals).
- the notification may be an electronic notification (e.g., to e-mail and/or mobile devices of the notified personnel).
- a notification may be issued to an individual, to a population within the enclosure, and/or remotely (e.g., to authorities such as police, fire, health officials, building owner, tenants, building manager).
- Fig. 14 shows an example of a flowchart for detecting irregular sound events.
- a network of sensors continuously monitors zones of an enclosure for sound signatures (e.g., loud and/or abrupt sounds).
- Sensor units may be synchronized such that when one sensor detects a potential sound event in block 1401 it subsequently notifies other (e.g., adjacent) sensor units, which can confirm whether the potential sound event was likewise detected and/or the manner of its evolution.
- the relative sound intensity(ies) and/or frequency(ies) at each sensor location is used to interpolate a location for the sound event in block 1402 and/or its propagation (e.g., evolution) in space and/or time.
- Sound sample data as recorded by the sensors can be optionally compensated according to the acoustical transfer function for each corresponding sound path in block 1403.
- sound recognition techniques are used to identify a type of sound event (e.g., optionally using classification of the compensated sound data and/or the originally detected sound data).
- a notification of the sound event is optionally generated and/or a corresponding mitigation action is taken according to the type of sound event.
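The location-interpolation step of block 1402 could, under simple free-field assumptions, be approximated by an intensity-weighted centroid of the detecting sensors' positions. This is a hypothetical first-order estimate; an actual implementation would also exploit the mapped transfer functions and timing data:

```python
def interpolate_location(detections):
    """Rough origin estimate for a sound burst heard (substantially)
    simultaneously by several sensors.

    `detections` is a list of ((x, y), intensity) pairs: sensor position
    and the intensity it measured.  Louder sensors pull the estimate
    toward themselves.
    """
    total = sum(intensity for _, intensity in detections)
    x = sum(pos[0] * i for pos, i in detections) / total
    y = sum(pos[1] * i for pos, i in detections) / total
    return (x, y)

# Heard equally loudly by two sensors, the burst lies midway between them.
origin = interpolate_location([((0.0, 0.0), 1.0), ((4.0, 0.0), 1.0)])
```

With unequal intensities the estimate shifts toward the louder sensor, which is the qualitative behavior the interpolation step relies on.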
- additional sensor(s) are added to the (e.g., deployed) sound sensors and emitters (speaker/microphone).
- A DAE can include a sound sensor and/or a sound emitter.
- a sensor ensemble includes an accelerometer to detect motion of the enclosure structures. Accelerometer data may be used to correlate readings of different sensors. It may be used to subtract outside noises impacting at least a portion of the (e.g., the whole) facility.
- a sound emitter (e.g., a speaker) may be emplaced exterior to the enclosure.
- an exterior emitter may be used to test the acoustics of a wall, a window, a ceiling, a floor and/or other building features. Based on such tests, a building owner, tenant, or system installer may make adjustments in space (e.g., locations for types of uses such as private offices, conference rooms, etc.) considering the sound mapping.
- the use of the enclosure may be adjusted for acoustic privacy and/or lack thereof depending on room type specification, and/or area or point of interest.
- the selection of locations for sound emitters and sensors (e.g., device ensembles) to be used for continuous monitoring may be adjusted according to a mapping based at least in part on exterior noises.
- mounting on lower side walls may be subject to hindrance by occupants and/or furniture while mounting toward the tops of walls may have little hindrance from occupants and/or furniture.
- Fig. 15 shows a horizontal cross sectional example of digital architectural elements DAEs (e.g., device ensembles) and processing system for an enclosure with an exterior wall 1500 and an interior wall 1501.
- a DAE 1502 provides a first digital architectural element with network connection 1503 (arrows designating bidirectional communication) as part of a network of the enclosure (e.g., a control network as disclosed herein).
- a second DAE (digital architectural element) 1504 in another room separated by interior wall 1501 is also connected to the network by network connection 1503.
- An exterior emitter 1505 emplaced outside the enclosure is also connected in the network by network connection 1503.
- DAEs 1502 and 1504 each include respective emitters (speaker abbreviated as “Spk”), sound sensors (e.g., microphone abbreviated as “Mic”), and accelerometers (abbreviated as “Acc”). Even though separated by interior wall 1501, sounds emitted by one DAE may be receivable at the other during a data collection event.
- digital elements may be provided in various formats and housings that allow, as the purpose dictates, installation on building structural elements, which may include permanent elements (e.g., fixtures), and/or on building walls, floors, ceilings, mullions, transoms, any other frame portion, openings, or roofs.
- the chassis or housing of a digital element is no greater than about 5 meters in any dimension, or no greater than about 3 meters in any dimension.
- the digital architectural element may have a housing with a lid (e.g., configured to face the interior of the enclosure).
- the lid can have an aspect ratio that differs from 1:1.
- the lid can have an aspect ratio of 1:X, where X is at least about 1, 2, 3, 4, 5, 6, or 8.
- the housing is rigid or semi-rigid and encompasses some or all components of the DAE.
- the housing provides a frame and/or scaffold for attaching one or more components including a speaker, a display, an antenna, and/or a sensor.
- the housing provides external access to one or more ports or cables such as ports or cables for attaching to network links, video displays, mobile electronic devices, power, battery chargers, etc.
- Window controller networks and associated digital elements may be installed during and/or upon construction (e.g., relatively early in the construction) of the enclosure (e.g., office buildings and other types of buildings).
- the network (e.g., control network) can be installed before any other network, e.g., before networks for other building functions such as Building Management Systems (BMSs), security systems, Information Technology (IT) systems of tenants, etc.
- sensors on a window network are installed close to where building occupants spend their time, thereby improving the sensors’ effectiveness in providing occupant comfort.
- digital elements as described herein that are connected to a high bandwidth network may be deployed in various locations throughout a building. Examples of such locations include building structural elements in offices, lobbies, mezzanines, bathrooms, stairwells, terraces, and the like. Within any of these locations, digital elements may be positioned and/or oriented proximate to occupant positions, thereby collecting environment data that is most appropriate for triggering building systems to act in a way that maintains or enhances occupant comfort.
- a digital architectural element contains sensor(s), emitter(s), a circuitry (such as a processor (e.g., a microcontroller)), a network interface, and/or one or more peripheral interfaces.
- DAE sensors include a light sensor (optionally including an image capture sensor such as a camera), an audio sensor (such as voice coils or microphones), an air quality sensor, and/or a proximity sensor (e.g., certain IR and/or RF sensors).
- the network interface may be a high bandwidth interface such as a gigabit (or faster) Ethernet interface.
- DAE peripherals include video display monitors, add-on speakers, mobile devices, battery chargers, and the like.
- peripheral interfaces include standard Bluetooth modules, ports such as USB ports, network ports, power ports, image ports, etc. Ports may include any of various proprietary ports for third party devices.
- the digital architectural element works in conjunction with other hardware and/or software provided for an optically switchable window system and/or a display on window.
- the digital architectural element includes a local (e.g., window) controller or other controller such as a master controller, a network controller, etc.
- a digital architectural element includes one or more signal generating devices such as a speaker, a light source (e.g., an LED), a beacon, an antenna (e.g., a Wi-Fi or cellular communications antenna), and the like.
- a digital architectural element includes an energy storage component and/or a power harvesting component.
- an element may contain one or more batteries or capacitors as energy storage devices.
- Such elements may additionally include a photovoltaic cell.
- the DAE may include a power source, or may be operatively coupled to a power source (e.g., via a connector).
- a digital architectural element has one or more user interface components (e.g., a microphone or a speaker) and one or more sensors (e.g., a proximity sensor), as well as a network interface for high bandwidth communications.
- a digital architectural element is designed or configured to attach to, or otherwise be collocated with, a structural element of a building.
- a digital architectural element has an appearance that blends in with the structural element with which it is associated.
- a digital architectural element may have a shape, size, and color that blends with the associated structural element.
- a digital architectural element is not easily visible to occupants of a building; e.g., the element is fully or partially camouflaged. However, such element may interface with other components that do not blend in such as video display monitors, touch screens, projectors, and the like.
- the building structural elements to which digital architectural elements may be attached include any of various building structures.
- building structures to which digital architectural elements attach are structures that are installed during building construction, in some cases early in building construction.
- the building structural elements for digital architectural elements are elements that serve a building structural function. Such elements may be permanent, i.e., not easy to remove from a building, such as fixtures. Examples include walls, partitions (e.g., office space partitions), doors, beams, stairs, façades, moldings, mullions and transoms, etc.
- the building structural elements are located on a building or room perimeter.
- digital architectural elements are provided as separate modular units or boxes that attach to the building structural element.
- digital architectural elements are provided as façades for building structural elements.
- a digital architectural element may be provided as a cover for a portion of a mullion, transom, or door.
- a digital architectural element is configured as a mullion or disposed in or on a mullion. If it is attached to a mullion, it may be bolted on or otherwise attached to the rigid parts of the mullion.
- a digital architectural element can snap onto a building structural element.
- a digital architectural element serves as a molding, e.g., a crown molding.
- a digital architectural element is modular; i.e., it serves as a module for part of a larger system such as a communications network, a power distribution network, and/or computational system that employs an external video display and/or other user interface components.
- the digital architectural element is a digital mullion designed to be deployed on some but not all mullions in a room, floor, or building. In some cases, digital mullions are deployed in a regular or periodic fashion. For example, digital mullions may be deployed on every sixth mullion.
- the DAE may be configured for a high bandwidth network connection (port, switch, router, etc.) and have a housing.
- the digital architectural element may include the following digital and/or analog component(s): a camera, a proximity and/or movement sensor, an occupancy sensor, a color temperature sensor, a biometric sensor, a speaker, a microphone, an air quality sensor, a hub for power and/or data connectivity, display video driver, a Wi-Fi access point, an antenna, a location service via beacons or other mechanism, a power source, a light source, a processor and/or ancillary processing device.
- One or more cameras may include a sensor and processing logic for imaging features in the visible, IR (see use of thermal imager below), or other wavelength region; various resolutions are possible including high definition (e.g., HD) and greater such as at least about 2K, 4K, 6K, 8K, or 10K resolution (one thousand is abbreviated as “K”).
- One or more proximity and/or movement sensors may include an infrared (IR) sensor.
- a proximity sensor is a radar or radar-like device that detects distances from and between objects using a ranging function. Radar sensors can also be used to distinguish between closely spaced occupants via detection of their biometric functions, for example, detection of their different breathing movements. When radar or radar-like sensors are used, better operation may be facilitated when the sensor is disposed unobstructed or behind a plastic case of a digital architectural element.
- One or more occupancy sensors may include a multi-pixel thermal imager, which when configured with an appropriate computer implemented algorithm can be used to detect and/or count the number of occupants in a room.
- data from a thermal imager or thermal camera is correlated with data from a radar sensor to provide a better level of confidence in a particular determination being made.
- thermal imager measurements can be used to evaluate other thermal events in a particular location, for example, changes in air flow caused by open windows and doors, the presence of intruders, and/or fires.
- One or more color temperature sensors may be used to analyze the spectrum of illumination present in a particular location and to provide outputs that can be used to implement changes in the illumination as needed or desired, for example, to improve an occupant's health or mood.
- One or more biometric sensors may be provided as a stand-alone sensor or be integrated with another sensor such as a camera.
- One or more speakers and associated power amplifiers may be included as part of a digital architectural element or separate from it. In some embodiments, two or more speakers and an amplifier may, collectively, be configured as a sound bar; e.g., a bar-shaped device containing multiple speakers. The device may be designed or configured to provide high fidelity sound.
- One or more microphones and logic for detecting and processing sounds may be provided as part of a digital architectural element or separate from it.
- the microphones may be configured to detect one or both of internally or externally generated sounds.
- processing and analysis of the sounds is performed by logic embodied as software, firmware, or hardware in one or more digital structural element and/or by logic in one or more other devices coupled to the network, for example, one or more controllers coupled to the network.
- the logic is configured to automatically adjust a sound output of one or more speaker to mask and/or cancel sounds, frequency variations, echoes, and other factors detected by one or more microphone that negatively impact (or potentially could negatively impact) occupants present in a particular location within a building.
- the sounds comprise sounds generated by, but not limited to: indoor machinery, indoor office equipment, outdoor construction, outdoor traffic, and/or airplanes.
- one or more microphones are positioned on, or next to, windows of a building; on ceilings of the building; and/or on other interior structures of the building.
- the logic may be configured in a singular or arrayed fashion to analyze and determine the type, intensity, spectrum, location, and/or direction of interior sounds present in a building.
- the logic is functionally connected to other fixed or moving network connected devices that may be in use in a building, for example, devices such as computers, smart phones, tablets, and the like, and is configured to receive and analyze sounds or related signals from such devices.
- the logic is configured to measure and analyze real time delays in signals from microphones to predict the amount and type of sound needed to mask or cancel unwanted external and/or internal sound present at a particular location in the building.
- the logic is configured to detect changes in the level and/or location of the unwanted external and/or internal sound where, for example, the changes can be caused by movements of objects and people within and outside a building, and to dynamically adjust the amount of the masking and/or canceling sound based on the changes.
- the logic is configured to use signals from tracking sensors in a building and, according to the signals, to cause the masking and/or canceling sounds to be increased or decreased at a particular location in the building according to a presence and/or location of one or more occupant.
- one or more of the speakers are positioned to generate masking and/or canceling sounds that propagate substantially in a plane of travel of unwanted sound, including in a horizontal plane, vertical plane, and/or combinations of the two.
- the logic comprises a calculation and/or an algorithm designed to acoustically map an interior of a building, to locate in-office noise source locations, and to improve speech privacy.
- the logic may be used to perform an acoustical sweep so as to cause each speaker to generate sound that in turn is detected by each microphone.
- time delays, sound level decreases, and spectrum differences in the detected sounds are used to calculate and map effective acoustical distances between speakers and microphones.
- an acoustical transfer function of an interior of a building may be obtained from the acoustical sweep.
- the logic can make appropriate masking and/or canceling level determinations when sources of unwanted sounds generated in the spaces are present.
- the logic can adjust speaker generated sounds to correct for absorption of certain absorptive surfaces; for example, a sound that may otherwise be muffled after bouncing off of a soft partition can be adjusted to sound crisp again.
- the acoustical map of a space can also be used to determine what is direct versus indirect sound and adjust time delays of masking and/or canceling sounds so that they arrive at a desired location at the same time.
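The acoustical sweep described above can be sketched in code. The following is a non-limiting illustration (the function name, the 48 kHz sample rate, and the 343 m/s speed of sound are illustrative assumptions, not part of the disclosure): the time delay between an emitted sweep and the signal detected at a microphone is located at the peak of their cross-correlation and converted to an effective acoustical distance.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 deg C (assumption)

def effective_acoustic_distance(emitted, received, sample_rate):
    """Estimate the effective acoustical distance between a speaker and a
    microphone from the time delay at the peak of the cross-correlation
    between the emitted and received waveforms."""
    corr = np.correlate(received, emitted, mode="full")
    # Zero lag sits at index len(emitted) - 1 of the 'full' correlation.
    lag = int(np.argmax(corr)) - (len(emitted) - 1)
    delay_s = lag / sample_rate
    return delay_s * SPEED_OF_SOUND

# Example: a sweep burst arriving 100 samples late at 48 kHz (~0.71 m away)
rng = np.random.default_rng(0)
burst = rng.standard_normal(256)
received = np.concatenate([np.zeros(100), burst])
distance_m = effective_acoustic_distance(burst, received, 48_000)
```

Repeating this estimate for every speaker-microphone pair yields the pairwise distance map from which an acoustical transfer function of the space may be assembled.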
- One or more air quality sensors may be used in conjunction with HVAC to improve air circulation control.
- One or more hubs for power and/or data connectivity to sensor(s), speakers, microphone, and the like may be provided.
- the hub may be a USB hub, a Bluetooth hub, etc.
- the hub may include one or more ports such as USB ports, High Definition Multimedia Interface (HDMI) ports, etc.
- the element may include a connector dock for external sensors, light fixtures, peripherals (e.g., a camera, microphone, speaker(s)), network connectivity, power sources, etc.
- the architectural element itself or faceplate that covers all or a portion of the architectural element serves as an antenna.
- Various approaches may be employed to insulate the architectural element and make it transmit or receive directionally.
- a prefabricated antenna may be employed or a window antenna as described in International Patent Application Serial No. PCT/US17/31106, filed May 4, 2017, incorporated herein by reference in its entirety.
- One or more power sources such as an energy storage device (e.g., a rechargeable battery or a capacitor), and the like may be provided.
- a power harvesting device is included; e.g., a photovoltaic cell or panel of cells. This allows the device to be self-contained or partially self-contained.
- the light harvesting device may be transparent or opaque, depending on where it is attached.
- a photovoltaic cell may be attached to, and partially or fully cover, the exterior of a digital mullion, while a transparent photovoltaic cell may cover a display or user interface (e.g., a dial, button, etc.) on the digital architectural element.
- One or more light sources may be configured with the processor to emit light under certain conditions, such as signaling when the device is active.
- One or more processors may be configured to provide various embedded or non-embedded applications.
- the processor may be a microcontroller.
- the processor is a low-power mobile computing unit (MCU) with memory, configured to run a lightweight secure operating system hosting applications and data.
- the processor is an embedded system, system on chip, or an extension.
- One or more ancillary processing devices may be provided, such as a graphics processing unit, an equalizer, or another audio processing device configured to interpret audio signals.
- a camera of a digital architectural element is configured to capture images in the visible portion of the electromagnetic spectrum.
- the camera provides images in high resolution, e.g., high definition, of at least about 720 pixels or at least about 1080 pixels in one dimension.
- the camera resolution may be any camera resolution disclosed herein.
- the camera may also capture images having information about the intensity of wavelengths outside the visible range.
- a camera may be able to capture infrared signals.
- a digital architectural element includes a near infrared device such as a forward looking infrared (FLIR) camera or near-infrared (NIR) camera. Examples of suitable infrared cameras include the Boson™ or Lepton™ from FLIR Systems, of Wilsonville, OR. Such infrared cameras may be employed to augment a visible camera in a digital architectural element.
- the camera may be configured to map the heat signature of a room such that it may serve as a temperature sensor with three-dimensional awareness.
- such cameras in a digital architectural element enable occupancy detection, augment visible cameras to facilitate detecting a human instead of a hot wall, provide quantitative measurements of solar heating (e.g., image the floor or desks and see what the sun is actually illuminating), etc.
- the speaker, microphone, and associated logic are configured to use acoustic information to characterize air quality or air conditions.
- an algorithm may issue ultrasonic pulses, and detect the transmitted and/or reflected pulses coming back to a microphone.
- the algorithm may be configured to analyze the detected acoustic signal, sometimes using a transmitted vs. received differential audio signal, to determine air density, particulate deflection, and the like to characterize air quality.
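A minimal sketch of the transmitted vs. received differential analysis described above, assuming an RMS-based attenuation measure (the function name and the 40 kHz example pulse are illustrative assumptions, not part of the disclosure):

```python
import numpy as np

def differential_attenuation_db(transmitted, received):
    """Attenuation in dB between transmitted and received ultrasonic pulses,
    computed from the ratio of their RMS amplitudes."""
    rms_tx = np.sqrt(np.mean(np.square(transmitted)))
    rms_rx = np.sqrt(np.mean(np.square(received)))
    return 20.0 * np.log10(rms_tx / rms_rx)

# Example: a 40 kHz burst received at half amplitude (about 6 dB of loss)
t = np.linspace(0.0, 1e-3, 48, endpoint=False)
pulse = np.sin(2 * np.pi * 40_000.0 * t)
loss_db = differential_attenuation_db(pulse, 0.5 * pulse)
```

Changes in this differential attenuation over time or frequency could then be correlated with air density or particulate load to characterize air quality.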
- an enhanced functionality window controller may include a Wi-Fi access point, and optionally also has cellular communications capability. It is often configured to connect to multiple networks (e.g., a Controller Area Network (CAN) bus and Ethernet).
- an enhanced functionality local (e.g., window) controller may have the basic structure and function as described above herein, but with an added gigabit Ethernet interface and a processor with enhanced computing power.
- the enhanced functionality window controller may have a CAN bus interface or similar controller network.
- the controller has video capability and/or may include features described in U.S. Patent Application Serial No. 15/287,646, filed October 6, 2016, which is incorporated herein by reference in its entirety.
- the enhanced functionality local (e.g., window) controller is implemented as a module having (i) a processor with sufficiently high processing power to handle video and other functions requiring significant processing power, (ii) an Ethernet connection, (iii) optionally video processing capabilities, (iv) optionally a Wi-Fi access point or other wireless communications capability, etc.
- This module may be attached to a base board having other, more conventional, window controller functionality such as a power amplifier, or to another base board that is used with a (e.g., ring) sensor.
- the sensor may be disposed externally or internally to the enclosure.
- the sensor may be disposed in the ambient environment external to the enclosure.
- the resulting device may be used to control an optically switchable window, or it may be used simply to provide wireless communications, video, and/or other functions not necessarily associated with controlling the states of optically switchable windows.
- the enhanced functionality window controller is provisioned, controlled, alarmed, etc. by a CAN bus or similar controller network protocol, as with a conventional window controller described herein, but additionally it provides video, WiFi, and/or other extra functions.
- Figure 16A illustrates an example of a comparison between a block diagram of a local controller that is a window controller WC2 (Detail A) and, according to some implementations, a block diagram of a WC3 (Detail B).
- the WC2 block diagram is an example of a conventional window controller such as those available from View, Inc. of Milpitas, CA.
- Some of the depicted components include at least one voltage regulator 1641, a controller network interface (CAN) 1642, a processing unit (microcontroller) 1643, and various ports and connectors.
- Some of these components and example architectures are described in U.S. Patent Application Serial No. 13/449,251, filed April 17, 2012, and U.S. Patent Application Serial No. 15/334,835, filed October 26, 2016, which are incorporated herein by reference in their entireties.
- Fig. 16B depicts an example of an enhanced functionality local controller that is a window controller, WC3.
- the conventional window controller (WC2) and the enhanced functionality window controller (WC3) have a similar architecture and some common components.
- the enhanced functionality window controller WC3 has a more capable microcontroller 1653, a gigabit Ethernet interface 1654, a wireless (e.g., Wi-Fi, Bluetooth or cellular) interface 1655 and an optional MoCA interface 1656.
- the gigabit Ethernet interface may be a conventional unshielded twisted pair (e.g., UTP/CAT5-6) interface and/or a MoCA (GbE over coaxial cable) interface.
- connection to the enhanced functionality window controller is via a conventional RJ45 modular connector (jack).
- the controller includes a separate adaptor feeding the jack.
- such adaptor may be an Actiontec (Actiontec Electronics, Inc. of Sunnyvale, CA) adaptor such as the ECB6250 MoCA 2.5 network adapter, e.g., an adaptor that provides data communication speeds up to about 2.5 Gbps.
- Figs. 17 through 20 illustrate a number of examples of applications and uses of the digital architectural element and related elements contemplated by the present disclosure.
- the network and/or high bandwidth backbone described herein may be used for various functions, some of which are not related to controlling DAEs, their components, and/or windows.
- One such function is the providing of internet, local network, and/or computational services for tenants or other building occupants, construction personnel on site during the construction of the building, and the like.
- the network and computation resources provided by the backbone and digital elements may be used for more than commissioning windows. For example, they may be used to provide architectural information, construction instructions, and the like. In this way, construction personnel have ready access to construction information they need via a high bandwidth, on-site network.
- the network, communications, and/or computational services provided by the network and computational infrastructure as described herein are utilized in multi-tenant buildings or shared workspaces such as those provided by WeWork.com.
- shared workspace buildings need only provide temporary connectivity and processing power as needed.
- a building network such as described herein affords central control and flexible assignment of computational resources to particular building locations. Such flexibility may allow assignment of different resources to different occupants (e.g., tenants).
- Readings from sensor(s) in a digital element may provide information about the enclosure environment, e.g., in the vicinity of the digital architectural element.
- sensors include sensors for any one or more of temperature, humidity, volatile organic compounds (VOCs), carbon dioxide, dust, light level, glare, and color temperature.
- readings from one or more such sensors are input to an algorithm (e.g., comprising a calculation) that determines actions that other building systems should take, e.g., to offset the deviation in measured readings to get these readings to target values for occupant's comfort or building efficiency, depending on the contextual index of occupant's presence and other signals.
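The offsetting of measured readings toward target values can be illustrated with a simple proportional correction (a non-limiting sketch; the function name, gain value, and CO2 figures are illustrative assumptions, not part of the disclosure):

```python
def correction_signal(measured, target, gain=0.5):
    """Proportional correction toward a target reading (a simple P-control
    sketch); the sign tells a building system which way to actuate."""
    return gain * (target - measured)

# Hypothetical CO2 reading of 900 ppm against a 600 ppm comfort target;
# a negative value here would indicate a need to increase ventilation.
adjust = correction_signal(measured=900.0, target=600.0, gain=0.01)
```

In practice, such a correction could be weighted by the contextual index of occupant presence before being dispatched to the HVAC or other building systems.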
- a digital element may be provided on a roof of a building, optionally collocated with a sky sensor and/or a ring sensor such as described in U.S. Patent Application Serial No. 15/287,646, filed October 6, 2016, that is incorporated herein by reference in its entirety. Such element may be outfitted with some or all features presented elsewhere herein for a digital architectural element. Examples include sensors, antenna, radio, radar, air quality detectors, etc.
- the digital element on the roof or other building exterior location provides information about air quality and/or the weather; in this way, digital elements may provide information about the air quality both inside and outside of the enclosure, and/or about the weather. This allows decisions about window tint states and other environmental conditions to be made using a full set of information (e.g., when conditions outside the building are unhealthy (or at least worse than they are inside), a decision may be made to prohibit venting air from outside).
- the light levels, glare, color temperature, and/or other characteristics of ambient or artificial light in a region of a building are used to make decisions about whether to change the tint state of an electrochromic device.
- these decisions employ one or more algorithms or analyses as described in U.S. Patent Application Serial No. 15/347,677, filed November 9, 2016, and U.S. Patent Application Serial No. 15/742,015, each of which is incorporated herein by reference in its entirety.
- tinting decisions are made by using a solar calculator and/or a reflection model in conjunction with an algorithm for interpreting light information from sensors of the digital architectural element.
- the algorithm may in some cases use information about the presence of occupants, how many there are, and/or where they are located (data that can be obtained with a digital architectural element) to assist in making decisions about whether to tint a window and what tint state should be chosen.
- a digital architectural element is used in lieu of or in conjunction with a sky sensor such as described in U.S. Patent Application No. 15/287,646, filed October 6, 2016, which is incorporated herein by reference in its entirety.
- sensors in a digital element may provide feedback about local light, temperature, color, glare, etc. in a room or other portion of a building.
- the logic associated with a digital element may then determine that the light intensity, direction, color, etc. should be changed in the room or portion of a building and may also determine how to effect such change.
- a change may be necessary for user comfort (e.g., reduce glare at the user’s workspace, increase contrast, or correct a color profile for sensitive users), privacy, or security.
- the logic may then send instructions to change one or more lighting or solar components such as optically switchable window tint states, display device output, switched particle device film states (e.g., transparent, translucent, opaque), light projection onto a surface, artificial light output (color, intensity, direction, etc.), and the like. All such decisions may be made with or without assistance from building-wide tint state processing logic such as described in U.S. Patent Application Serial No. 15/347,677, filed November 9, 2016, and U.S. Patent Application Serial No. 15/742,015, filed January 4, 2018, each of which is incorporated herein by reference in its entirety.
- An array of digital architectural elements in a building may form a mesh edge access network enabling interactions between building occupants and the building or machines in the building.
- a digital architectural element and/or a digital wall interface and/or an enhanced functionality window controller can be used as a digital compute mesh network node providing connectivity, communication, application execution, etc. within building structural elements (e.g., mullions) for ambient compute processing. It may be powered, monitored, and controlled in a similar or identical manner as an edge sensor node in a mesh network set up in the buildings. It may be used as a gateway for other sensor nodes.
- a non-exhaustive list of functions or uses for the high bandwidth window network and associated digital elements contemplated by the present disclosure includes: (a) Speaker phone - a digital wall interface or a digital architectural element may be configured to provide all the functions of a speaker phone; (b) Personalization of space - an occupant’s preferences and/or roles may be stored and then implemented in particular locations where the occupant is present. In some cases, the preferences and/or roles are implemented only temporarily, when a user is at a particular location.
- the preferences and/or roles remain in effect so long as the occupant is assigned a work space or living space;
- (c) Control HVAC air quality;
- (d) Noise cancellation - e.g., a microphone detects white noise, and the sound bar cancels the white noise;
- (e) Enhancements to personal digital assistants such as Amazon’s Alexa, Microsoft’s Cortana, Google’s Google Home, Apple’s Siri, and/or other personal digital assistants;
- Conditions may be determined using one or more of the following types of sensed conditions, for example: temperature and humidity, volatile organic compounds (VOCs), CO2, dust, smoke, and lighting (light levels, glare, color temperature).
- data from at least two different sensors are used synergistically.
- the sensors can be of different type or of the same type.
- data from at least two different device ensembles are used synergistically.
- the two different device ensembles can have the same sensors (e.g., the same sensor combination) or different sensors (e.g., a different sensor combination).
- the device ensemble may be deployed throughout an enclosure of the facility and/or across the facility.
- the window may have a pane configured to generate vibrations.
- the window may contain, or may be operatively coupled to, a vibration generator.
- the vibration generator may be acoustic or mechanical.
- the vibration generator may comprise an actuator.
- the vibration generator may comprise a speaker.
- Vibration generators may operate synergistically.
- a first window may include, or be operatively coupled to, a first vibration generator.
- a second window may include, or be operatively coupled to, a second vibration generator.
- the first vibration generator and the second vibration generator may operate in tandem (e.g., synergistically or symbiotically).
- Operation of the first vibration generator may consider operation and/or status of the second vibration generator.
- Operation of the second vibration generator may consider operation and/or status of the first vibration generator.
- the consideration may include taking into account respective sensor(s) measurements (e.g., sensor(s) disposed in a framing of the window, or operatively coupled to the window).
- the sensor(s) may be incorporated in a device ensemble.
- the consideration may comprise using artificial intelligence (e.g., a learning module).
- the vibration generator and/or sensor(s) may be operatively coupled to the control system (e.g., of the facility).
- Operatively coupled may comprise electrically coupled, communicatively coupled, wirelessly coupled, and/or physically connected via wire(s).
- the consideration may comprise input of various sensors.
- At least two of the various sensors may be of the same type. At least two of the various sensors may be of a different type (e.g., different kind). At least two of the various sensors may be disposed in an enclosure (e.g., room) in which the first window and/or the second window is disposed. At least one of the various sensors may be disposed in a different enclosure (e.g., room) from the one in which the first window and/or the second window is disposed.
- the sensor may be a sound sensor.
- the sound sensor may measure vibrations in the enclosure (e.g., room).
- the sound sensor may measure vibrations arising from the window(s).
- the sound sensor may measure vibrations in the enclosure (e.g., different from the ones arising from the window(s)).
- the framing may comprise a mullion or a transom. The sensor may or may not be in direct contact with the window (e.g., whether an internally facing window-pane, or an externally facing windowpane).
- the artificial intelligence may comprise data analysis (e.g., data gathered by one or more sensors).
- the data analysis (e.g., analysis of the sensor measurements) may be performed by circuitry.
- the circuitry may be of a processor.
- the sensor data analysis may utilize artificial intelligence.
- the sensor data analysis may rely on one or more models (e.g., mathematical models).
- the sensor data analysis comprises linear regression, least squares fit, Gaussian process regression, kernel regression, nonparametric multiplicative regression (NPMR), regression trees, local regression, semiparametric regression, isotonic regression, multivariate adaptive regression splines (MARS), logistic regression, robust regression, polynomial regression, stepwise regression, ridge regression, lasso regression, elastic net regression, principal component analysis (PCA), singular value decomposition, fuzzy measure theory, Borel measure, Haar measure, risk-neutral measure, Lebesgue measure, group method of data handling (GMDH), Naive Bayes classifiers, k-nearest neighbors algorithm (k-NN), support vector machines (SVMs), neural networks, classification and regression trees (CART), random forest, gradient boosting, or generalized linear model (GLM) techniques.
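As an illustration of the simplest of the listed techniques, a least squares fit of sensor readings can be computed with NumPy (a non-limiting sketch; the data values and the two-input-plus-bias model are hypothetical):

```python
import numpy as np

def least_squares_fit(X, y):
    """Ordinary least-squares fit: coefficients minimizing ||X c - y||."""
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# Hypothetical design matrix: two sensor inputs plus a constant bias column
X = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 1.0],
              [2.0, 1.0, 1.0]])
y = np.array([2.0, 3.0, 4.0, 5.0])
c = least_squares_fit(X, y)   # y = 1*x1 + 2*x2 + 1 fits these data exactly
```

The same fitted-model interface generalizes to the richer regression and classification techniques in the list above.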
- the data analysis may comprise vector regression.
- the data analysis may comprise at least one software library.
- the software library may provide a regularizing gradient boosting framework.
- the software library may be configured to provide a scalable, portable and/or distributed gradient boosting (GBM, GBRT, GBDT) library (e.g., XGBoost library).
- the software library may be configured to run on a single processor, as well as on distributed processing frameworks.
- the software library may be configured to offer clever penalization of trees, proportional shrinking of leaf nodes, Newton boosting, an extra randomization parameter, implementation on single and distributed systems with out-of-core computation, and/or automatic feature selection.
- the root-mean-square error (RMSE) of the simulation as compared to real data may be at most about 5, 10, 15, 20, 25, 30, 35, 40, or 45.
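The RMSE comparison of simulated against real data can be computed as follows (the readings in the example are hypothetical):

```python
import numpy as np

def rmse(simulated, observed):
    """Root-mean-square error between simulated and observed readings."""
    diff = np.asarray(simulated, dtype=float) - np.asarray(observed, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))

# Hypothetical simulated vs. measured temperature readings (deg C)
error = rmse([20.0, 22.0, 25.0], [21.0, 21.0, 23.0])
```

A simulation would be accepted when this error falls below the chosen threshold from the range above.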
- the control system may utilize a learning module (e.g., for environmental adjustment and/or forecasting such as for acoustic conditioning and/or forecasting).
- the learning module may comprise machine learning.
- the learning module may comprise a multilayer neural network (e.g., a deep learning algorithm).
- the learning module may include an unbounded number of layers of bounded size, e.g., to progressively extract higher-level features from the raw (e.g., sensor) input measurements.
- the layers in the multilayer neural network may be hierarchical (e.g., each layer’s output may be a higher-level abstraction of inputs from previous layers).
- the learning module may utilize a heuristic technique (e.g., gross model and sensor data) that will accelerate outputting a reliable prediction as a result.
- the learning module may optimize for prediction accuracy and/or computational speed.
- the learning module may consider the neural network size (number of layers and number of units per layer), learning rate, and/or initial weights (e.g., of artificial neurons and/or algorithms (when several algorithms are utilized to generate the result)).
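The hierarchical, layer-by-layer feature extraction described above can be sketched as a minimal forward pass (a non-limiting sketch; the layer sizes, ReLU activation, random initial weights, and input values are illustrative assumptions, not part of the disclosure):

```python
import numpy as np

def relu(x):
    """Rectified linear activation."""
    return np.maximum(0.0, x)

def forward(x, layers):
    """Forward pass through a small multilayer network; each layer's output
    is a higher-level representation of the previous layer's output."""
    h = x
    for weights, bias in layers:
        h = relu(h @ weights + bias)
    return h

rng = np.random.default_rng(1)
# Two layers mapping 4 raw sensor inputs -> 8 hidden units -> 3 features
layers = [(0.1 * rng.standard_normal((4, 8)), np.zeros(8)),
          (0.1 * rng.standard_normal((8, 3)), np.zeros(3))]
# Hypothetical raw inputs: temperature, humidity, CO2, light level
features = forward(np.array([21.5, 45.0, 410.0, 0.3]), layers)
```

Tuning the number of layers, units per layer, learning rate, and initial weights corresponds to the considerations listed above.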
- the learning module may learn from measurements with respect to failure of tintable windows, by using sensor measurements (e.g., real time, historical, or synthetic sensor measurements).
- a learning module comprises a computational scheme, an algorithm and/or a calculation.
- the learning model may comprise machine learning, artificial intelligence (AI), and/or a statistical validation layer.
- the learning module can be trained to identify a threshold (e.g., value or function) for failure. Alternatively, the learning module may not be trained to identify a failure threshold.
- the learning module can be trained using historical, real-time, and/or synthesized data, used as a training set.
- a machine learning (ML) ensemble can be used to implement the learning module.
- the machine learning ensemble can include a plurality of models (e.g., at least about 2, 3, 4, 5, 7, or 10 models) working together, e.g., using a voting scheme. At least two of the models in the plurality of models can be given different weights. At least two of the models in the plurality of models can be given the same weight.
- the ML ensemble can include at least one model. Usage of the ML ensemble may be automatic, scheduled, and/or controlled.
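A weighted voting scheme over a plurality of models can be sketched as follows (the class labels and weights are hypothetical; the function name is an illustrative assumption):

```python
def weighted_vote(predictions, weights):
    """Combine class labels from several models by weighted voting; the
    label with the largest total weight wins."""
    tally = {}
    for label, weight in zip(predictions, weights):
        tally[label] = tally.get(label, 0.0) + weight
    return max(tally, key=tally.get)

# Three hypothetical models vote on a window's condition, unequally weighted
label = weighted_vote(["failing", "ok", "failing"], [0.5, 1.0, 0.6])
```

Setting all weights equal recovers simple majority voting; unequal weights let better-validated models dominate.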
- the learning module incorporates a validation mechanism that is configured to perform data management.
- the learning module can utilize one or more models.
- One model (or model combination) may be more appropriate in a situation than another. For example, rare circumstances may require use of specific models.
- the model can use adaptive synthetic oversampling.
- the model can use deep learning techniques (e.g., convolutional neural networks).
- the model can use AI techniques that exclude deep learning algorithms and/or new AI techniques that include deep learning algorithms.
- the learning set may comprise real data.
- the learning set may comprise synthetic data.
- the synthetic data may be synthesized using real data. For example, the synthetic data may use a real data backbone to which different types of non-substantial information (e.g., noise) have been added.
- the non-substantial information may be characteristic of sensor measurements (e.g., of failed, failing, and/or properly functioning tintable windows).
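Synthesizing training data from a real-data backbone plus non-substantial noise can be sketched as follows (the Gaussian noise model, its scale, and the example readings are illustrative assumptions, not part of the disclosure):

```python
import numpy as np

def synthesize(real_readings, noise_scale=0.05, n_copies=10, seed=0):
    """Produce synthetic samples from a real-data backbone by adding small
    Gaussian perturbations (non-substantial information)."""
    rng = np.random.default_rng(seed)
    backbone = np.asarray(real_readings, dtype=float)
    noise = rng.normal(0.0, noise_scale, size=(n_copies, backbone.size))
    return backbone + noise

# Hypothetical backbone: temperature (deg C), humidity (%), CO2 (ppm)
synthetic = synthesize([21.5, 45.0, 410.0])
```

The synthetic rows preserve the structure of the real measurement while enlarging the training set for the learning module.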
- the learning model can use a temporal convolution neural network.
- the learning model can incorporate a computation scheme also utilized for analyzing visual imagery.
- the learning model can use data collected in a first enclosure (e.g., a first facility), or from another, second enclosure (e.g., from the same first facility or from another, second facility).
- the second facility can be geographically separated (e.g., distant) from the first facility in which the first tintable window is disposed.
- the vibrations of the window are configured for sound dampening (e.g., reducing or blocking sound).
- the sounds may be noise (e.g., mechanical noise such as from a motor, or human generated noise).
- the noise may be external to the enclosure.
- the noise may be internal to the enclosure (e.g., arising from a motor in the enclosure).
- the vibrations in the window (e.g., glass) may be configured to at least partially cancel out certain sound (e.g., certain vibrational frequencies).
- the vibrations in the window (e.g., glass) may be configured to at least partially destructively interfere with sound frequencies (e.g., at least a portion of the frequencies are subject to destructive interference by vibrations created by the window).
- the vibrations may be optically measured (e.g., using a laser).
- vibrations generated in an enclosure cause vibration of the window (e.g., of an internal pane of the window), which window vibrations may be measured and deciphered.
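Destructive interference by window vibrations can be illustrated with an anti-phase drive signal (a non-limiting sketch; the 120 Hz example tone is hypothetical, and the complete cancellation shown is an idealization of the partial cancellation achievable in practice):

```python
import numpy as np

def cancellation_signal(noise, amplitude=1.0):
    """Anti-phase drive signal; superposed on the noise it destructively
    interferes (amplitude < 1 gives partial cancellation)."""
    return -amplitude * np.asarray(noise, dtype=float)

t = np.linspace(0.0, 1.0, 1000, endpoint=False)
noise = np.sin(2 * np.pi * 120.0 * t)            # 120 Hz hum, e.g., a motor
residual = noise + cancellation_signal(noise)    # idealized full cancellation
```

A vibration generator coupled to the window pane would play the anti-phase signal, with sensor feedback (e.g., the optical vibration measurements above) closing the loop.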
- the presently disclosed logic and computational processing resources may be provided within a digital element such as a digital wall interface or a digital architectural element as described herein, and/or it may be provided via a network connection to a remote location such as another building using the same or similar resources and services, servers on the internet, cloud-based resources, etc.
- Certain embodiments disclosed herein relate to systems for generating and/or using functionality for a building such as the uses described in the preceding “Applications and Uses” section.
- a programmed or configured system for performing the functions and uses may be configured to (i) receive input such as sensor data characterizing conditions within a building, occupancy details, and/or exterior environmental conditions, and (ii) execute instructions that determine the effect of such conditions or details on a building environment, and optionally take actions to maintain or change the building environment.
- Many types of computing systems having any of various computer architectures may be employed as the disclosed systems for implementing the functions and uses described herein.
- the systems may include software components executing on one or more general purpose processors or specially designed processors such as programmable logic devices (e.g., Field Programmable Gate Arrays (FPGAs)). Further, the systems may be implemented on a single device or distributed across multiple devices. The functions of the computational elements may be merged into one another or further split into multiple sub-modules.
- the computing system contains a microcontroller. In certain embodiments, the computing system contains a general purpose microprocessor. Frequently, the computing system is configured to run an operating system and one or more applications.
- code for performing a function or use described herein can be embodied in the form of software elements which can be stored in a nonvolatile storage medium (such as optical disk, flash storage device, mobile hard disk, etc.).
- a software element is implemented as a set of commands prepared by the programmer/developer.
- the module software that can be executed by the computer hardware is executable code committed to memory using “machine codes” selected from the specific machine language instruction set, or “native instructions,” designed into the hardware processor.
- the machine language instruction set, or native instruction set is known to, and essentially built into, the hardware processor(s). This is the “language” by which the system and application software communicates with the hardware processors.
- Each native instruction is a discrete code that is recognized by the processing architecture and that can specify particular registers for arithmetic, addressing, or control functions; particular memory locations or offsets; and particular addressing modes used to interpret operands. More complex operations are built up by combining these simple native instructions, which are executed sequentially, or as otherwise directed by control flow instructions.
- the algorithms used herein may be configured to execute on a single machine at a single location, on multiple machines at a single location, or on multiple machines at multiple locations. When multiple machines are employed, the individual machines may be tailored for their particular tasks. For example, operations requiring large blocks of code and/or significant processing capacity may be implemented on large and/or stationary machines.
- certain embodiments relate to tangible and/or non-transitory computer readable media or computer program products that include program instructions and/or data (including data structures) for performing various computer-implemented operations.
- Examples of computer-readable media include, but are not limited to, semiconductor memory devices, phase-change devices, magnetic media such as disk drives, magnetic tape, optical media such as CDs, magneto-optical media, and hardware devices that are specially configured to store and perform program instructions, such as read-only memory devices (ROM) and random access memory (RAM).
- the computer readable media may be directly controlled by an end user or the media may be indirectly controlled by the end user. Examples of directly controlled media include the media located at a user facility and/or media that are not shared with other entities.
- Examples of indirectly controlled media include media that is indirectly accessible to the user via an external network and/or via a service providing shared resources such as the “cloud.”
- Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.
- the data or information employed in the disclosed methods and apparatus is provided in a digital format.
- Such data or information may include sensor data, building architectural information, floor plans, operating or environment conditions, schedules, and the like.
- data or other information provided in digital format is available for storage on a machine and transmission between machines.
- data may be stored as bits and/or bytes in various data structures, lists, databases, etc.
- the data may be embodied electronically, optically, etc.
- algorithms for implementing functions and uses described herein may be viewed as a form of application software that interfaces with a user and with system software.
- System software typically interfaces with computer hardware and associated memory.
- the system software includes operating system software and/or firmware, as well as any middleware and drivers installed in the system.
- the system software provides basic non-task-specific functions of the computer.
- the modules and other application software are used to accomplish specific tasks.
- Each native instruction for a module is stored in a memory device and is represented by a numeric value.
- the presently disclosed techniques contemplate a network of digital architectural elements (DAEs) capable of collecting a rich set of data related to environmental, occupancy and security conditions of a building's interior and/or exterior.
- the digital architectural elements may include optically switchable windows and/or mullions or other architectural features associated with optically switchable windows.
- the digital architectural elements may be widely distributed throughout all or much of, at least, a building's perimeter. As a result, the collected data may provide a highly granular, detailed representation of environmental, occupancy and security conditions associated with much or all of a building's interior and/or exterior.
- many or all of the building's windows may include, or be associated with, a digital architectural element that includes a suite of sensors such as light sensors and/or cameras (visible and/or IR), acoustic sensors such as microphone arrays, temperature and humidity sensors and air quality sensors that detect VOCs, CO2, carbon monoxide (CO) and/or dust.
- occupancy levels of a room in a building may be determined by light sensors, cameras, and/or acoustic sensors, and a correlation may be made between a particular change in level of occupancy and a desired change in HVAC function. For example, an increased occupancy level may be correlated with a need to increase airflow and/or lower a thermostat setting. As a further example, data from air quality sensors that detect levels of dust may be correlated with a need to perform building maintenance or introduce or exclude outside air from interior spaces.
- measured interior air-quality may be observed to (i) improve or (ii) degrade when a window is opened.
- air circulation ducts or filters of an HVAC system should be serviced.
- exterior air quality is poor, and that windows of the building should preferentially be maintained in a closed position.
- a correlation may be drawn between the number of occupants in a conference room, and whether doors and/or windows are open or closed, with CO2 levels and/or rate of change of CO2 levels.
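- the correlation described above can be illustrated with a short sketch. The function names, data shapes, and thresholds below are illustrative assumptions, not part of the disclosed system:

```python
# Illustrative sketch only: correlating occupancy, door/window state, and
# CO2 rise rate, per the correlations described above. Function names,
# data shapes, and thresholds are assumptions, not from the specification.

def co2_rise_rate(samples):
    """CO2 rise rate in ppm per minute from (minute, ppm) samples."""
    (t0, c0), (t1, c1) = samples[0], samples[-1]
    return (c1 - c0) / (t1 - t0)

def needs_more_airflow(occupants, doors_open, samples, ppm_per_min_limit=2.0):
    """Flag a room whose CO2 climbs fast relative to its ventilation state."""
    rate = co2_rise_rate(samples)
    # An open door roughly doubles the tolerated rise rate in this sketch.
    limit = ppm_per_min_limit * (2.0 if doors_open else 1.0)
    return occupants > 0 and rate > limit
```

In this sketch, a closed, occupied room climbing 5 ppm/minute would be flagged, while the same room with doors open tolerates twice the rise rate before triggering.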
- an "enclosure (e.g., building) condition” may refer to a physical, measurable condition in an enclosure (e.g., building)or a portion of an enclosure (e.g., building). Examples include temperature, air flow rate, light flux and color, occupancy, air quality and composition (particulate count, gas concentration of carbon dioxide, carbon monoxide, water (i.e., humidity)).
- a " enclosure (e.g., building) system” may refer to a system that can control or adjust an enclosure (e.g., building) operation parameter. Examples include an HVAC system, a lighting system, a security system, a window optical condition control system.
- An enclosure (e.g., building) operation parameter may refer to a parameter that can be controlled by one or more enclosure (e.g., building) systems to adjust or control an enclosure (e.g., building) condition. Examples include heat flux from or to heaters or air conditioners, heat flux from windows or lighting in a room, air flow through a room, and light flux from artificial lights or natural light through an optically switchable window.
- a method 2100 may include collecting inputs, block 2110, from a plurality of sensors. Some or all of the sensors may be disposed on or associated with a respective window, with a respective digital architectural element (associated or not associated with a window), and/or with a digital wall interface.
- the sensors may include visible and/or IR light sensors or cameras, acoustic sensors, temperature sensors, humidity sensors, and/or air quality sensors, for example. It will be appreciated that the collected inputs may represent a variety of environmental condition measurements that are temporally and/or spatially diverse. In some implementations, at least some of the inputs may be derived from a combination of sensors.
- separate sensors specialized for respective measurements of CO2, CO, dust and/or smoke may be contemplated, and a combination of inputs from the separate sensors may be analyzed (block 2120), e.g., for determination of air-quality control.
- inputs relevant to a determination of occupancy levels in a room collected from separate sensors that measure, respectively, optical and acoustic signals may be analyzed (block 2120).
- inputs may be received, nearly simultaneously, from spatially distributed sensors.
- the sensors may be spatially distributed with respect to a given room or distributed between multiple rooms and/or floors of the building.
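- the input-collection step (block 2110) over spatially distributed sensors can be sketched as follows; the (room, sensor_type, value) tuple layout is an assumption for illustration only:

```python
# Illustrative sketch of the input-collection step (block 2110): readings
# from spatially distributed sensors are grouped per room for joint
# analysis. The (room, sensor_type, value) layout is an assumption.

from collections import defaultdict

def collect_inputs(readings):
    """Group (room, sensor_type, value) tuples into per-room dictionaries."""
    by_room = defaultdict(dict)
    for room, sensor_type, value in readings:
        by_room[room][sensor_type] = value  # latest reading wins
    return dict(by_room)
```

For example, near-simultaneous CO2 and temperature readings from a conference room and a CO2 reading from a lobby would be grouped into two per-room dictionaries ready for analysis at block 2120.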
- analysis of the measured data at block 2120 may take into account certain "context information" not necessarily obtained from the sensors.
- Context information may include: time of day and time of year, local weather and/or climatic information.
- Context information may include information regarding the building layout, and/or usage parameters of various portions of the building.
- the context information may be initially input by a user (e.g., a building manager).
- the context information may be updated from time to time, manually and/or automatically.
- Examples of usage parameters may include a building's operating schedule, and/or an identification of expected and/or permitted/authorized usages of individual rooms or larger portions (e.g., floors) of the building.
- certain portions of the building may be identified as lobby space, restaurant/cafeteria space, conference rooms, open plan areas, private office spaces, etc.
- the context information may be utilized in making a determination as to whether and/or how to modify a building operation parameter, block 2130, and also for calibration and, optionally, adjustment of the sensors.
- certain sensors may, optionally, be disabled in certain portions of the building in order to meet an occupant's privacy expectations.
- sensors for rooms in which a considerable number of persons may be expected to congregate (e.g., an auditorium)
- An objective of the analysis at block 2120 may be to determine that a particular building condition exists or may be predicted to exist.
- the analysis may include comparing a sensor reading such as a light flux or temperature measurement with a threshold.
- the analysis at block 2120 may, first, directly recognize the change as a result of inputs from acoustic and/or optical sensors associated with the room; second, the analysis may predict an environmental parameter that may be expected to change as a result of a change in occupancy load.
- an increase in occupancy load can be expected to lead to increased ambient temperatures and increased levels of CO2.
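- the two-stage analysis described above (direct recognition followed by prediction) might be sketched as below; the threshold and the per-person CO2 coefficient are illustrative assumptions:

```python
# Hedged sketch of the analysis at block 2120: a direct threshold check
# followed by a prediction of a condition expected to follow a change in
# occupancy load. The limit and coefficient are illustrative assumptions.

def predict_co2_delta(occupancy_change, ppm_per_person=25.0):
    """Predicted CO2 change (ppm) expected from a change in occupancy."""
    return occupancy_change * ppm_per_person

def analyze(room, co2_limit=1000):
    """Return alerts from direct recognition and from prediction."""
    alerts = []
    if room["co2_ppm"] > co2_limit:          # first: directly recognized
        alerts.append("co2_high")
    expected = room["co2_ppm"] + predict_co2_delta(room["occupancy_delta"])
    if expected > co2_limit:                 # second: anticipated condition
        alerts.append("co2_rise_expected")
    return alerts
```

A room still below the CO2 limit can thus be flagged on the anticipated rise alone when its occupancy load has just increased.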
- the analysis at block 2120 may be performed automatically on a periodic or continuous basis, using models or other algorithms that may be improved over time using, for example, machine learning techniques.
- the analysis may not explicitly identify a particular building condition (or combination of conditions) in order to determine that a building operation parameter should be adjusted.
- a determination as to whether or how to modify a building operation parameter may be made based on the results of the analysis at block 2120. Depending on the determination, the building condition may or may not be changed. When a determination is made to not modify a building operation parameter, the method may return to block 2110.
- one or more enclosure (e.g., building) conditions may be adjusted, at block 2140, for purposes of improving occupant comfort or safety and/or to reduce operating costs and energy consumption. For example, lights and/or HVAC service may be set to a low-power condition in rooms that are determined to be unoccupied.
- a determination may be made that a fault or issue has arisen that requires attention of the enclosure (e.g., building) administration, maintenance and/or security personnel.
- the determination may be made on a reactive and/or proactive basis. For example, the determination may react to changes in measured parameters, e.g., a determination may be made to increase HVAC flowrates when a rise in ambient CO2 is measured.
- the determination may be made on a proactive basis, i.e., the building operation parameter may be adjusted in anticipation of an environmental change before the change is actually measured. For example, an observed change in occupancy loads may result in a decision to increase HVAC flowrates whether or not a corresponding rise in ambient CO2 or temperature is measured.
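- the reactive and proactive determinations can be combined in a single sketch; the function name and step sizes are illustrative assumptions:

```python
# Sketch of a combined reactive/proactive determination (block 2130),
# assuming airflow is adjusted in fixed fractional steps. Names and step
# sizes are illustrative.

def hvac_flow_adjustment(co2_ppm, occupancy_delta, co2_limit=1000, step=0.1):
    """Fractional airflow increase: react to measured CO2, and act
    proactively on occupancy increases before CO2 actually rises."""
    adjustment = 0.0
    if co2_ppm > co2_limit:       # reactive: condition already measured
        adjustment += step
    if occupancy_delta > 0:       # proactive: condition merely anticipated
        adjustment += step
    return adjustment
```

An occupancy increase alone yields one step of added airflow even before any CO2 rise is measured; a measured CO2 excess adds another.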
- the determination may relate to building operation parameters associated with HVAC (e.g., airflow rates and temperature settings), which may be controlled in one or more locations based on measured temperature, CO2 levels, humidity, and/or local occupancy.
- the determination may relate to building operation parameters associated with building security. For example, in response to an anomalous sensor reading, a security system alarm may be caused to trigger, selected doors and windows may be locked or unlocked, and/or a tint state of all or some windows may be changed. Examples of security-related building conditions include detection of a broken window, detection of an unauthorized person in a controlled area, and detection of unauthorized movement of equipment, tools, electronic devices or other assets from one location to another.
- Other types of security-related building condition information can include information related to the detection of sound outside and/or within the building.
- the detected sound is analyzed for type of sound.
- analysis is initiated via hardware, firmware, or software onboard one or more digital structural elements, elsewhere in a building, or even offsite.
- sound outside or inside of a building causes conductive layers deposited on window glass of an electrochromic window to vibrate, which vibrations cause changes in capacitance between the conductive layers, and which changes of capacitance are converted into a signal indicative of the sound.
- some windows of the present invention can inherently provide the functionality of a sound and/or vibration sensor; however, in other embodiments, sound and/or vibration sensor functionality can be provided by sensors that have been added to windows with or without conductive layers, and/or by one or more sensors implemented in digital structural elements.
- an originating location of sound can be determined by analyzing differences in sound amplitude and/or sound time delays that different ones of the sound and/or vibration sensors experience.
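- as a minimal illustration of locating sound by time delays, consider two sensors a known distance apart with the source on the line between them; real installations would use more sensors and 2-D/3-D multilateration, and the names below are assumptions:

```python
# Minimal 1-D sketch of sound localization by time difference of arrival
# (TDOA) between two sensors a known distance apart. Real installations
# would use more sensors and 2-D/3-D multilateration; 343 m/s assumes
# roughly 20 degrees C air.

SPEED_OF_SOUND = 343.0  # meters per second

def locate_on_line(sensor_spacing_m, delay_s):
    """Distance (m) from sensor A to a source on the line between A and B.

    delay_s > 0 means the sound reached sensor A before sensor B.
    """
    # d_B - d_A = SPEED_OF_SOUND * delay_s and d_A + d_B = sensor_spacing_m.
    return (sensor_spacing_m - SPEED_OF_SOUND * delay_s) / 2.0
```

A zero delay places the source midway between the sensors; a positive delay shifts the estimate toward the sensor that heard the sound first.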
- Types of sound detected and then analyzed include, but are not limited to: broken window sounds, voices (for example, voices of persons authorized or unauthorized to be in certain areas), sounds caused by movement (of persons, machines, air currents), and sounds caused by the discharge of firearms.
- one or more appropriate security or other actions are initiated by one or more systems within the building. For example, upon a determination that a firearm has been discharged at a location outside or inside of a building, a building management system makes an automated 911 call to summon emergency responders to the location.
- a tint of a tintable window closest to the sound of interest is caused to change to a tint that is darker than the tint of windows that are farther away from the sound, or vice versa.
- if responders were unable to quickly locate a particular room on a particular floor of a particular building, they might be able to do so by visually looking for a window that has been distinctively tinted to be darker or lighter than other windows.
- a current location of a person associated with a particular sound may be different from their initial location, in which case, their change in location can be updated via detection of other sounds or changes caused by the person to the environment.
- gas sensors in digital architectural elements or other predetermined locations can be used to monitor changes in air quality caused by the presence of exploded gunpowder, and to thereby provide responders with updates as to the location of the shooter. Sound and other sensors could also be used to obtain the location of persons trying to quietly hide from an active shooter (for example, via infrared detection of their location).
- sounds can be generated by speakers in digital architectural elements or other speakers in the shooter's location to distract the shooter, or to mask noises made by hostages trying to hide from the shooter.
- speakers and/or microphones in digital architectural elements or other devices could be selectively made active to communicate with persons trying to hide from an active shooter.
- the distinctive tint of the windows may need to be changed to some other tint, for example to provide more light to facilitate one or more persons' entry to or egress from a particular location, or to provide less light to hinder visibility in a particular location.
- one or more enclosure (e.g., building) parameters may be modified responsive to the determination made at block 2130.
- the enclosure (e.g., building) parameter modification may be implemented under the control of a building management system in some embodiments, and may be implemented by one or more of the enclosure (e.g., building) systems such as HVAC, lighting, security, and window controller network, for example. It will be appreciated that the enclosure (e.g., building) parameter modification may be selectively made on a global (building-wide) basis or in localized areas (e.g., individual rooms, suites of rooms, floors, etc.).
- an enclosure (e.g., building) system that determines how to modify enclosure (e.g., building) operation parameters may employ machine learning.
- a machine learning model is trained using training data.
- the process begins by training an initial model through supervised or semi-supervised learning.
- the model may be refined through on-going training/learning afforded by use in the field (e.g., while operating in a functioning building).
- Part of the purpose of machine learning is to identify unknown or hidden patterns or relationships, so the learning typically uses a large number of inputs (X) for each possible output (Y).
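- the many-inputs-per-output training data shape can be illustrated with a bare-bones linear model fit by gradient descent; a deployed system would use a proper machine learning library, and the feature meanings here are assumptions:

```python
# Bare-bones illustration of the data shape: many sensor inputs (X) per
# output (Y), fit with stochastic gradient descent on a linear model.
# A deployed system would use a proper machine learning library.

def train(xs, ys, lr=0.01, epochs=500):
    """Fit weights w so that y is approximated by the dot product w . x."""
    w = [0.0] * len(xs[0])
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            err = sum(wi * xi for wi, xi in zip(w, x)) - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w
```

Each row of xs might hold, say, CO2, temperature, and light readings for one time window, with ys holding the corresponding measured occupancy; the fitted weights then expose which inputs actually drive the output.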
- a digital architectural element 2200 includes a power and communications module 2210, an audiovisual (A/V) module 2220, an environmental module 2230, a compute/learning module 2240 and a controller module 2250.
- the power and communications module 2210 may include one or more wired and/or wireless interfaces for transmission and reception of communication signals and/or power. Examples of wireless power transmission techniques suitable for use in connection with the presently disclosed techniques are described in US Provisional Patent Application Serial No. 62/642,478, filed March 13, 2018, titled “WIRELESSLY POWERED AND POWERING ELECTROCHROMIC WINDOWS,” International Patent Application Serial No. PCT/US17/52798, filed September 21, 2017, titled “WIRELESSLY POWERED AND POWERING ELECTROCHROMIC WINDOWS,” and US Patent Application Serial No.
- the power and communications module 2210 may be communicatively coupled with and distribute power to each of the audiovisual (A/V) module 2220, the environmental module 2230, the compute/learning module 2240 and the controller module 2250.
- the power and communications module 2210 may also be communicatively coupled with one or more other digital architectural elements (not illustrated) and/or interface with a power and/or control distribution node of the building.
- the A/V module 2220 may include one or more of the A/V components described hereinabove, including a camera or other visible and/or IR light sensor, a visual display, a touch interface, a microphone or microphone array, and a speaker or speaker array.
- the "touch" interface may additionally include gesture recognition capabilities operable to detect recognize and respond to non-touching motions of a person's appendage or a handheld object.
- the environmental module 2230 may include one or more of the environmental sensing components described hereinabove, including temperature and humidity sensors, acoustic sensors, light sensors, IR sensors, particle sensors (e.g., for detection of dust, smoke, pollen, etc.), and VOC, CO, and/or CO2 sensors.
- the environmental module 2230 may functionally incorporate a suite of audio and/or electromagnetic sensors that may partially or completely overlap the sensors (e.g., microphones, visible and/or IR light sensors) described above in connection with A/V module 2220.
- a "sensor" as the term is used herein may include some processing capability, in order, for example, to make determinations such as occupancy (or number of occupants) in a region. Cameras, particularly those detecting IR radiation can be used to directly identify the number of people in a region.
- a sensor may provide raw (unprocessed) signals to the compute/learning module 2240 and/or to the controller module 2250.
- the compute and/or learning module 2240 may include processing components (including general or special purpose processors and memories) as described hereinabove for the digital architectural element, the digital wall interface, and/or the enhanced functionality window controller.
- the compute and/or learning module may include a specially designed ASIC, digital signal processor, or other type of hardware, including processors designed or optimized to implement models such as machine learning models (e.g., neural networks). Examples include Google’s “tensor processing unit” or TPU.
- Such processors may be designed to efficiently compute activation functions, matrix operations, and/or other mathematical operations required for neural network or other machine learning computation.
- other special purpose processors may be employed such as graphics processing units (GPUs).
- the processors may be provided in a system on a chip architecture.
- the controller module 2250 may be or include a window control module incorporating one more features described in U.S. Patent Application Serial No. 15/882,719, filed January 29, 2018, titled “CONTROLLER FOR OPTICALLY-SWITCHABLE WINDOWS,” U.S. Patent Application Serial No. 13/449,251, filed April 17, 2012, titled “CONTROLLER FOR OPTICALLY-SWITCHABLE WINDOWS,” International Patent Application Serial No. PCT/US17/47664, filed August 18, 2017, titled "ELECTROMAGNETIC-SHIELDING ELECTROCHROMIC WINDOWS," U.S. Patent Application Serial No.
- Fig. 22 presents the digital architectural element 2200 as incorporating separate and distinct modules 2210, 2220, 2230, 2240 and 2250. It should be appreciated, however, that two or more modules may be structurally combined with each other and/or with features of the digital wall interface described hereinabove. Moreover, it is contemplated that, in a building installation including a number of digital architectural elements, not every digital architectural element will necessarily include all the described modules 2210, 2220, 2230, 2240 and 2250. For example, in some embodiments one or more of the described modules 2210, 2220, 2230, 2240 and 2250 may be shared by a plurality of digital architectural elements.
- Fig. 23 illustrates an example of a digital architectural element 2300, according to some implementations.
- the DAE is disposed in a window frame portion 2301 (shown as 2304 in magnification) that borders windows 2302 and 2303.
- the functionality of the described modules 2210, 2220, 2230, 2240 and 2250 may be configured in a physical package having a size and form factor that can be readily accommodated by an architectural feature such as a typical window mullion.
- Fig. 24 shows an example of a portion of a data and power distribution system having a digital architectural element (such as a "smart frame" or similar communications/processing module) 2430 coupled by way of a drop line 2413 with a combination module 2480 that includes a directional coupler 2489 and a bias tee circuit 2484.
- the drop line 2413 may carry both power and data downstream (e.g., using a coaxial cable) to the DAE 2430, and carries data from the DAE 2430 upstream, to a control panel (not shown). Data from a control panel (or other upstream source) may be provided via a coaxial cable input port 2481. This data is provided to the directional coupler 2489 of combination module 2480.
- the directional coupler 2489 can extract some of the data signal and transmit it on a line 2482, which may be a cable, an electrical trace on a circuit board, etc., depending on the design of the combination module 2480. Data from the control panel that is not tapped off by the combination trunk tee exits via a coaxial cable output port 2483.
- Line 2482 connects to the bias tee circuit 2484 in the combination module 2480. Two twisted pair conductors (or other power carrying lines) 2485(1) and 2485(2) are also connected to the bias tee circuit 2484. With these connections, the bias tee circuit couples the power and data onto drop line 2413, which may be a coaxial cable.
- the digital architectural element or other communications/processing element 2430 may, as depicted, include and/or connect to components for cellular communication (e.g., the illustrated antenna) and cellular or CBRS processing logic 2435.
- the processing logic 2435 may be at least fifth generation communication protocol (5G) compatible.
- the digital architectural element or other communications/processing element 2430 provides a CAN bus gateway that provides data and power to one or more CAN bus nodes such as window controllers, which control tint states of associated optically controllable windows.
- modules such as the combination module 2480 illustrated in Fig. 24 may be installed (e.g., liberally) throughout the building, including at some locations where they are not initially connected to digital architectural elements or other processing/communications modules.
- the combination trunk tees may be used, after construction, to install digital processing devices, as needed by the building and/or tenants or other occupants.
- Figs. 25, 26, and 27 present examples of block diagrams of versions of a digital architectural element, a digital wall interface, or similar device.
- Fig. 25 illustrates a DAE 2530 that can support multiple communication types, including, e.g., Wi-Fi communications with its own antenna 2537.
- the DAE 2530 may include or be coupled with cellular communications infrastructure such as, in the illustrated embodiment, a base band radio, an amplifier, and an antenna.
- digital architectural element 2530 may support a citizens broadband radio service (CBRS) employing a similar base band radio.
- the digital architectural element in this figure has the same general architecture as the full-featured digital architectural element, but it does not include a sensor and may omit ancillary components such as a display, microphone, and speakers.
- digital architectural elements support a modular sensor configuration that allows for individual upgrade and replacement of sensors via plug-and-play insertion in a backbone-type circuit board having a set of slots or sockets.
- sensors used in the digital structural elements can be installed normal to the backbone in one of a multitude of slots/sockets standardized for maximum flexibility and functionality.
- the sensors are modular and can be plug and play replaced via removal and insertion through openings in housing of the digital architectural elements. Failed sensors can be replaced or functionality/capabilities can be modified as needed.
- sensors could be installed to track construction assets within the site or monitor for unsafe (OSHA+) noise or air quality levels and/or a night camera could be installed to monitor movement on a construction site when the site would normally be unoccupied by workers.
- these or other sensors could be removed, and quickly and easily replaced or supplemented during an occupancy phase, or at a later phase, when upgraded or sensors with new capabilities were needed or became available.
- Fig. 26 illustrates a system 2600 of components that may be incorporated in or associated with a DAE.
- the system 2600 may be configured to receive and transmit data wirelessly (e.g., Wi-Fi communications, cellular communications, citizens broadband radio service communications, etc.) and/or to transmit data upstream and receive data downstream via, e.g., a coaxial drop line.
- elements of the system 2600 are presented at a relatively high level.
- the embodiment illustrated in Fig. 26 includes circuits that serve a similar function to the combination module 2480 (described in connection with Fig. 24).
- a module 2680 including a bias tee circuit 2684 takes power and data from separate conductors (trunk line) and puts them on one cable (a drop line 2613).
- a coaxial drop line may deliver both power and data to a MoCA interface 2690 of a digital architectural element on the same conductors.
- the system 2600 includes the bias tee circuit 2684 coupled by way of the drop line 2613 to a MoCA interface 2690.
- the MoCA interface 2690 is configured to convert downstream data signals provided in a MoCA format on coaxial cable (the drop line in this case) to data in a conventional format that can be used for processing.
- the MoCA interface 2690 may be configured to format upstream data for transmission on a coaxial cable (drop line 2613).
- packetized Ethernet data may be MoCA formatted for upstream transmission on coaxial cable.
- a DC-DC power supply 2601 receives DC electrical power from the bias tee circuit 2684 and transforms this relatively high voltage power to a lower voltage power suitable for powering the processing components and other components of digital architectural element 2630.
- power supply 2601 includes a Buck converter.
- the power supply may have various outputs, each with a power or voltage level suitable for a component that it powers. For example, one component may require 12 volt power and a different component may require 3.3 volt power.
- the bias tee circuit 2684, the MoCA interface 2690, and the power supply 2601 are provided in a module (or other combined unit) that is used across multiple designs of a digital architectural element or similar network device. Such a module may provide data and power to one or more downstream data processing, communications, and/or sensing devices in the digital architectural element.
- a processing block 2603 provides processing logic for cellular (e.g., 5G) or other wireless communications functionality as enabled by a transmission (Tx) antenna and associated RF power amplifier and by a reception (Rx) antenna and associated analog-to-digital converter.
- processing block 2603 may be implemented as one or more distinct physical processors. While the block is shown with a separate microcontroller and digital signal processor, the two may be combined in a single physical integrated circuit such as an ASIC.
- a digital architectural element supports multiple wireless communications protocols such as one or more cellular formats (e.g., 5G for Sprint, 5G for T-Mobile, 4G/LTE for AT&T, etc.), it may include separate hardware such as antennas, amplifiers, and analog-to-digital converters for each format. Further, if a digital architectural element supports non-cellular wireless communications protocols such as Wi-Fi, citizens broadband radio service, etc., it may require separate antennas and/or other hardware for each of these. However, in some embodiments, a single power amplifier may be shared by antennas and/or other hardware for multiple wireless communications formats.
- the processing block 2603 may implement functionality associated with communications such as, for example, a baseband radio for cellular or citizens broadband radio service communications.
- different physical processors are employed for each supported wireless communications protocol.
- a single physical processor is configured to implement multiple baseband radios, which optionally share certain additional hardware such as power amplifiers and/or antennas.
- the different baseband radios may be definable in software or other configurable logic. Examples of network and control systems can be found in U.S. Provisional Patent Application Serial No. 63/027,452, filed May 20, 2020, titled “DATA AND POWER NETWORK OF AN ENCLOSURE,” which is incorporated herein by reference in its entirety.
- a digital architectural element includes a controller.
- the controller may monitor and/or direct (e.g., physical) alteration of the operating conditions of the apparatuses, software, and/or methods described herein.
- Control may comprise regulating, manipulating, restricting, directing, monitoring, adjusting, modulating, varying, altering, restraining, checking, guiding, or managing.
- the control may comprise controlling a control variable (e.g., temperature, power, voltage, and/or profile).
- the control can comprise real time or off-line control.
- the controller may be a manual or a non-manual controller.
- the controller may be an automatic controller.
- the controller may operate upon request.
- the controller may be a programmable controller.
- the controller may be programmed.
- the controller may comprise a processing unit (e.g., CPU or GPU).
- the controller may receive an input (e.g., from at least one sensor).
- the controller may deliver an output.
- the controller may comprise multiple (e.g., sub-) controllers.
- the controller may be a part of a control system.
- the control system may comprise a master controller, floor controller, local controller (e.g., enclosure controller, or window controller).
- the controller may receive one or more inputs.
- the controller may generate one or more outputs.
- the controller may be a single-input single-output (SISO) controller or a multiple-input multiple-output (MIMO) controller.
- the controller may interpret the input signal received.
- the controller may acquire data from the one or more sensors. Acquire may comprise receive or extract.
- the data may comprise measurement, estimation, determination, generation, or any combination thereof.
- the controller may comprise feedback control.
- the controller may comprise feed-forward control.
- the control may comprise on-off control, proportional control, proportional-integral (PI) control, or proportional-integral-derivative (PID) control.
- the control may comprise open loop control, or closed loop control.
- the controller may comprise closed loop control.
- the controller may comprise open loop control.
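The control modes listed above (on-off, proportional, PI, and PID) can be sketched as follows. This is a minimal illustrative implementation, not the disclosure's own controller: the class name, gains, and the temperature example are assumptions, and a scalar control variable (e.g., temperature) is assumed.

```python
# Minimal sketch of the control laws listed above (P, PI, PID).
# All names and gains are illustrative, not from the disclosure.

class PID:
    """Proportional-integral-derivative controller; set ki=kd=0 for P control."""

    def __init__(self, kp, ki=0.0, kd=0.0, dt=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        # Error between the requested and sensed value of the control variable.
        error = setpoint - measurement
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        # Sum of the P, I, and D terms; dropping ki and/or kd recovers P or PI control.
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# Feedback example: with proportional-only gains the output is kp * error.
ctl = PID(kp=2.0)
print(ctl.update(21.0, 19.0))  # → 4.0
```

The same `update` call, invoked repeatedly with fresh sensor readings, forms the closed (feedback) loop; feeding it a predicted rather than measured value would correspond to feed-forward or open-loop use.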
- the controller may comprise a user interface.
- the user interface may comprise (or be operatively coupled to) a keyboard, keypad, mouse, touch screen, microphone, speech recognition package, camera, imaging system, or any combination thereof.
- the outputs may include a display (e.g., screen), speaker, or printer.
- the methods, systems, and/or the apparatus described herein may comprise a control system.
- the control system can be in communication with any of the apparatuses (e.g., sensors) described herein.
- the sensors may be of the same type or of different types, e.g., as described herein.
- the control system may be in communication with the first sensor and/or with the second sensor.
- the control system may control the one or more sensors.
- the control system may control one or more components of a building management system (e.g., lighting, security, and/or air conditioning systems).
- the controller may regulate at least one (e.g., environmental) characteristic of the enclosure (e.g., sound).
- the control system may regulate the enclosure environment using any component of the building management system.
- the control system may regulate the energy supplied by a heating element and/or by a cooling element.
- the control system may regulate the velocity of air flowing through a vent to and/or from the enclosure.
- the controller may control items (e.g., level angle, and/or surface roughness) and/or sounds (e.g., white noise) affecting the acoustic mapping in the enclosure.
- the control system may comprise a processor.
- the processor may be a processing unit.
- the controller may comprise a processing unit.
- the processing unit may be central.
- the processing unit may comprise a central processing unit (abbreviated herein as “CPU”).
- the processing unit may be a graphic processing unit (abbreviated herein as “GPU”).
- the controller(s) or control mechanisms may be programmed to implement one or more methods of the disclosure.
- the processor may be programmed to implement methods of the disclosure.
- the controller may control at least one component of the forming systems and/or apparatuses disclosed herein.
- the computer system that is programmed or otherwise configured to perform one or more operations of any of the methods provided herein can control (e.g., direct, monitor, and/or regulate) various features of the methods, apparatuses, and systems of the present disclosure, such as, for example, heating, cooling, lighting, and/or venting of an enclosure, or any combination thereof.
- the computer system can be part of, or be in communication with, any sensor or sensor ensemble disclosed herein (e.g., as part of a device ensemble).
- the sensor may be a standalone sensor or be integrated as part of a device ensemble, e.g., having a single housing.
- the computer may be coupled to one or more mechanisms disclosed herein, and/or any parts thereof.
- the computer may be coupled to one or more sensors, valves, switches, lights, windows (e.g., IGUs), motors, pumps, optical components, or any combination thereof.
- Fig. 27 shows a schematic example of a computer system 2700 that is programmed or otherwise configured to perform one or more operations of any of the methods provided herein.
- the computer system can include a processing unit (e.g., 2706) (also referred to herein as “processor,” “computer,” and “computer processor”).
- the computer system may include memory or memory location (e.g., 2702) (e.g., random-access memory, read-only memory, flash memory), electronic storage unit (e.g., 2704) (e.g., hard disk), communication interface (e.g., 2703) (e.g., network adapter) for communicating with one or more other systems, and peripheral devices (e.g., 2705), such as cache, other memory, data storage and/or electronic display adapters.
- memory 2702, storage unit 2704, interface 2703, and peripheral devices 2705 are in communication with the processing unit 2706 through a communication bus (solid lines), such as a motherboard.
- the storage unit can be a data storage unit (or data repository) for storing data.
- the computer system can be operatively coupled to a computer network (“network”) (e.g., 2701) with the aid of the communication interface.
- the network can be the Internet, an internet and/or extranet, or an intranet and/or extranet that is in communication with the Internet.
- the network is a telecommunication and/or data network.
- the network can include one or more computer servers, which can enable distributed computing, such as cloud computing.
- the network in some cases with the aid of the computer system, can implement a peer-to-peer network, which may enable devices coupled to the computer system to behave as a client or a server.
- the processing unit can execute a sequence of machine-readable instructions, which can be embodied in a program or software.
- the instructions may be stored in a memory location, such as the memory 2702.
- the instructions can be directed to the processing unit, which can subsequently program or otherwise configure the processing unit to implement methods of the present disclosure. Examples of operations performed by the processing unit can include fetch, decode, execute, and write back.
- the processing unit may interpret and/or execute instructions.
- the processor may include a microprocessor, a data processor, a central processing unit (CPU), a graphical processing unit (GPU), a system-on-chip (SOC), a co-processor, a network processor, an application specific integrated circuit (ASIC), an application specific instruction-set processor (ASIP), a controller, a programmable logic device (PLD), a chipset, a field programmable gate array (FPGA), or any combination thereof.
- the processing unit can be part of a circuit, such as an integrated circuit.
- One or more other components of the system 2700 can be included in the circuit.
- the storage unit can store files, such as drivers, libraries and saved programs.
- the storage unit can store user data (e.g., user preferences and user programs).
- the computer system can include one or more additional data storage units that are external to the computer system, such as located on a remote server that is in communication with the computer system through an intranet or the Internet.
- the computer system can communicate with one or more remote computer systems through a network.
- the computer system can communicate with a remote computer system of a user (e.g., operator).
- remote computer systems include personal computers (e.g., portable PC), slate or tablet PCs (e.g., Apple® iPad, Samsung® Galaxy Tab), telephones, Smart phones (e.g., Apple® iPhone, Android-enabled device, Blackberry®), or personal digital assistants.
- the user may comprise, e.g., a client.
- Methods as described herein can be implemented by way of machine (e.g., computer processor) executable code stored on an electronic storage location of the computer system, such as, for example, on the memory 2702 or electronic storage unit 2704.
- the machine executable or machine-readable code can be provided in the form of software.
- the processor 2706 can execute the code.
- the code can be retrieved from the storage unit and stored on the memory for ready access by the processor.
- the electronic storage unit can be precluded, and machine-executable instructions stored on the memory.
- the code can be pre-compiled and configured for use with a machine having a processor adapted to execute the code, or can be compiled during runtime.
- the code can be supplied in a programming language that can be selected to enable the code to execute in a pre-compiled or as-compiled fashion.
- the processor comprises code.
- the code can be program instructions.
- the program instructions may cause the at least one processor (e.g., computer) to direct a feed forward and/or feedback control loop.
- the program instructions cause the at least one processor to direct a closed loop and/or open loop control scheme.
- the control may be based at least in part on one or more sensor readings (e.g., sensor data).
- One controller may direct a plurality of operations. At least two operations may be directed by different controllers. In some embodiments, a different controller may direct at least two of operations (a), (b) and (c). In some embodiments, different controllers may direct at least two of operations (a), (b) and (c).
- a non-transitory computer-readable medium may cause a computer to direct at least two of operations (a), (b), and (c). In some embodiments, different non-transitory computer-readable media may each cause a different computer to direct at least two of operations (a), (b), and (c).
- the controller and/or computer readable media may direct any of the apparatuses or components thereof disclosed herein. The controller and/or computer readable media may direct any operations of the methods disclosed herein.
- a tintable window exhibits a (e.g., controllable and/or reversible) change in at least one optical property of the window, e.g., when a stimulus is applied.
- the change may be a continuous change.
- a change may be to discrete tint levels (e.g., to at least about 2, 4, 8, 16, or 32 tint levels).
- the optical property may comprise hue, or transmissivity.
- the hue may comprise color.
- the transmissivity may be of one or more wavelengths.
- the wavelengths may comprise ultraviolet, visible, or infrared wavelengths.
- the stimulus can include an optical, electrical and/or magnetic stimulus.
- the stimulus can include an applied voltage and/or current.
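As an illustration of the discrete tint levels described above (e.g., at least about 2, 4, 8, 16, or 32 levels), the sketch below maps a continuous tint command onto one of N discrete levels. The helper name `quantize_tint` and the 0-to-1 command range are hypothetical assumptions, not part of the disclosure.

```python
# Hypothetical helper: map a continuous tint command in [0, 1] onto one of
# N discrete tint levels, as in the discrete-level behavior described above.

def quantize_tint(command: float, levels: int = 4) -> int:
    """Return the nearest discrete tint level index (0 = clear)."""
    command = min(max(command, 0.0), 1.0)  # clamp to the valid command range
    return round(command * (levels - 1))

print(quantize_tint(0.0))             # → 0 (clear)
print(quantize_tint(0.7))             # → 2 (of levels 0..3)
print(quantize_tint(1.0, levels=32))  # → 31 (fully tinted)
```

A continuous-change device would instead pass the clamped command straight through to the drive voltage without the rounding step.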
- One or more tintable windows can be used to control lighting and/or glare conditions, e.g., by regulating the transmission of solar energy propagating through them.
- One or more tintable windows can be used to control a temperature within a building, e.g., by regulating the transmission of solar energy propagating through the window.
- Control of the solar energy may control heat load imposed on the interior of the facility (e.g., building).
- the control may be manual and/or automatic.
- the control may be used for maintaining one or more requested (e.g., environmental) conditions, e.g., occupant comfort.
- the control may include reducing energy consumption of heating, ventilation, air conditioning, and/or lighting systems. At least two of heating, ventilation, and air conditioning may be induced by separate systems.
- tintable windows may be responsive to (e.g., and communicatively coupled to) one or more environmental sensors and/or user control.
- Tintable windows may comprise (e.g., may be) electrochromic windows. The windows may be located in the range from the interior to the exterior of a structure (e.g., facility, e.g., building). However, this need not be the case.
- Tintable windows may operate using liquid crystal devices, suspended particle devices, microelectromechanical systems (MEMS) devices (such as microshutters), or any technology known now, or later developed, that is configured to control light transmission through a window.
- one or more tintable windows can be located within the interior of a building, e.g., between a conference room and a hallway.
- one or more tintable windows can be used in automobiles, trains, aircraft, and other vehicles, e.g., in lieu of a passive and/or non-tinting window.
- the tintable window comprises an electrochromic device (referred to herein as an “EC device” (abbreviated herein as ECD), or “EC”).
- An EC device may comprise at least one coating that includes at least one layer.
- the at least one layer can comprise an electrochromic material.
- the electrochromic material exhibits a change from one optical state to another, e.g., when an electric potential is applied across the EC device.
- the transition of the electrochromic layer from one optical state to another optical state can be caused, e.g., by reversible, semi-reversible, or irreversible ion insertion into the electrochromic material (e.g., by way of intercalation) and a corresponding injection of charge-balancing electrons.
- the transition of the electrochromic layer from one optical state to another optical state can be caused, e.g., by a reversible ion insertion into the electrochromic material (e.g., by way of intercalation) and a corresponding injection of charge-balancing electrons.
- Reversible may be for the expected lifetime of the ECD.
- Semi-reversible refers to a measurable (e.g., noticeable) degradation in the reversibility of the tint of the window over one or more tinting cycles.
- a fraction of the ions responsible for the optical transition is irreversibly bound up in the electrochromic material (e.g., and thus the induced (altered) tint state of the window is not reversible to its original tinting state).
- at least some (e.g., all) of the irreversibly bound ions can be used to compensate for “blind charge” in the material (e.g., ECD).
- suitable ions include cations.
- the cations may include lithium ions (Li+) and/or hydrogen ions (H+) (e.g., protons).
- other ions can be suitable.
- Intercalation of the cations may be into an (e.g., metal) oxide.
- a change in the intercalation state of the ions (e.g., cations) into the oxide may induce a visible change in a tint (e.g., color) of the oxide.
- the oxide may transition from a colorless to a colored state.
- intercalation of lithium ions into tungsten oxide may cause the tungsten oxide to change from a transparent state to a colored (e.g., blue) state.
- EC device coatings as described herein are located within the viewable portion of the tintable window such that the tinting of the EC device coating can be used to control the optical state of the tintable window.
- Fig. 28 shows an example of a schematic cross-section of an electrochromic device 2800 in accordance with some embodiments.
- the EC device coating is attached to a substrate 2802 and includes a transparent conductive layer (TCL) 2804, an electrochromic layer (EC) 2806 (sometimes also referred to as a cathodically coloring layer or a cathodically tinting layer), an ion conducting layer or region (IC) 2808, a counter electrode layer (CE) 2810 (sometimes also referred to as an anodically coloring layer or anodically tinting layer), and a second TCL 2814.
- Elements 2804, 2806, 2808, 2810, and 2814 are collectively referred to as an electrochromic stack 2820.
- a voltage source 2816 operable to apply an electric potential across the electrochromic stack 2820 effects the transition of the electrochromic coating from, e.g., a clear state to a tinted state.
- the order of layers is reversed with respect to the substrate. That is, the layers are in the following order: substrate, TCL, counter electrode layer, ion conducting layer, electrochromic material layer, TCL.
- the ion conductor region may form from a portion of the EC layer (e.g., 2806) and/or from a portion of the CE layer (e.g., 2810).
- the ion conductor region may form where the EC layer and the CE layer meet, for example through heating and/or other processing steps.
- an EC device coating may include one or more additional layers such as one or more passive layers. Passive layers can be used to improve certain optical properties, to provide moisture resistance, and/or to provide scratch resistance. These and/or other passive layers can serve to hermetically seal the EC stack 2820.
- the electrochromic device is configured to (e.g., substantially) reversibly cycle between a clear state and a tinted state. Reversible may be within an expected lifetime of the ECD.
- the expected lifetime can be at least about 5, 10, 15, 25, 50, 75, or 100 years.
- the expected lifetime can be any value between the aforementioned values (e.g., from about 5 years to about 100 years, from about 5 years to about 50 years, or from about 50 years to about 100 years).
- a potential can be applied to the electrochromic stack (e.g., 2820) such that available ions in the stack that can cause the electrochromic material (e.g., 2806) to be in the tinted state reside primarily in the counter electrode (e.g., 2810) when the window is in a first tint state (e.g., clear).
- the ions can be transported across the ion conducting layer (e.g., 2808) to the electrochromic material and cause the material to enter the second tint state (e.g., tinted state).
- the reference to a transition between a clear state and tinted state is non-limiting and suggests only one example, among many, of an electrochromic transition that may be implemented. Unless otherwise specified herein, whenever reference is made to a clear-tinted transition, the corresponding device or process encompasses other optical state transitions such as non-reflective-reflective, and/or transparent-opaque. In some embodiments, the terms “clear” and “bleached” refer to an optically neutral state, e.g., untinted, transparent and/or translucent. In some embodiments, the “color” or “tint” of an electrochromic transition is not limited to any wavelength or range of wavelengths.
- the choice of appropriate electrochromic material and counter electrode materials may govern the relevant optical transition (e.g., from tinted to untinted state).
- at least a portion (e.g., all of) the materials making up electrochromic stack are inorganic, solid (e.g., in the solid state), or both inorganic and solid. Because various organic materials tend to degrade over time, particularly when exposed to heat and UV light as tinted building windows are, inorganic materials offer an advantage of a reliable electrochromic stack that can function for extended periods of time. In some embodiments, materials in the solid state can offer the advantage of being minimally contaminated and minimizing leakage issues, as materials in the liquid state sometimes do.
- One or more of the layers in the stack may contain some amount of organic material (e.g., that is measurable).
- the ECD or any portion thereof may contain little or no measurable organic matter.
- Solid state material may be deposited (or otherwise formed) using one or more processes employing liquid components, such as certain processes employing sol-gels, physical vapor deposition, and/or chemical vapor deposition.
- Fig. 29 shows an example of a cross-sectional view of a tintable window embodied in an insulated glass unit (“IGU”) 2900, in accordance with some implementations.
- the terms “IGU,” “tintable window,” and “optically switchable window” can be used interchangeably herein. It can be desirable to have IGUs serve as the fundamental constructs for holding electrochromic panes (also referred to herein as “lites”) when provided for installation in a building.
- An IGU lite may be a single substrate or a multi-substrate construct. The lite may comprise a laminate, e.g., of two substrates.
- IGUs can provide a number of advantages over single pane configurations.
- multi-pane configurations can provide enhanced thermal insulation, noise insulation, environmental protection and/or durability, when compared with single-pane configurations.
- a multi-pane configuration can provide increased protection for an ECD.
- the electrochromic films (e.g., as well as associated layers and conductive interconnects) may be protected within the IGU.
- the inert gas fill may provide at least some (heat) insulating function for an IGU.
- Electrochromic IGUs may have heat blocking capability, e.g., by virtue of a tintable coating that absorbs (and/or reflects) heat and light.
- an “IGU” includes two (or more) substantially transparent substrates.
- the IGU may include two panes of glass. At least one substrate of the IGU can include an electrochromic device disposed thereon. The one or more panes of the IGU may have a separator disposed between them.
- An IGU can be a hermetically sealed construct, e.g., having an interior region that is isolated from the ambient environment.
- a “window assembly” may include an IGU.
- a “window assembly” may include a (e.g., standalone) laminate.
- a “window assembly” may include one or more electrical leads, e.g., for connecting the IGUs and/or laminates.
- the electrical leads may operatively couple (e.g., connect) one or more electrochromic devices to a voltage source, switches and the like, and may include a frame that supports the IGU or laminate.
- a window assembly may include a window controller, and/or components of a window controller (e.g., a dock).
- Fig. 29 shows an example implementation of an IGU 2900 that includes a first pane 2904 having a first surface S1 and a second surface S2.
- the first surface S1 of the first pane 2904 faces an exterior environment, such as an outdoors or outside environment.
- the IGU 2900 also includes a second pane 2906 having a first surface S3 and a second surface S4.
- the second surface (e.g., S4) of the second pane (e.g., 2906) faces an interior environment, such as an inside environment of a home, building, vehicle, or compartment thereof (e.g., an enclosure therein such as a room).
- the first and the second panes are transparent or translucent, e.g., at least to light in the visible spectrum.
- each of the panes can be formed of a glass material.
- the glass material may include architectural glass, and/or shatter-resistant glass.
- the glass may comprise a silicon oxide (SOx).
- the glass may comprise a soda-lime glass or float glass.
- the glass may comprise at least about 75% silica (SiO2).
- the glass may comprise oxides such as Na2O or CaO.
- the glass may comprise alkali or alkali-earth oxides.
- the glass may comprise one or more additives.
- the first and/or the second panes can include any material having suitable optical, electrical, thermal, and/or mechanical properties.
- Other materials (e.g., substrates) that can be included in the first and/or the second panes are plastic, semi-plastic and/or thermoplastic materials, for example, poly(methyl methacrylate), polystyrene, polycarbonate, allyl diglycol carbonate, SAN (styrene acrylonitrile copolymer), poly(4-methyl-1-pentene), polyester, and/or polyamide.
- the first and/or second pane may include mirror material (e.g., silver).
- the first and/or the second panes can be strengthened. The strengthening may include tempering, heating, and/or chemically strengthening.
- the device ensemble (e.g., DAE) has one or more holes in its casing (e.g., housing or container).
- the holes may facilitate sensing of attributes by the sensor(s) disposed in the device ensemble casing.
- a hole of the casing may be aligned with a sound sensor disposed in the interior of the device ensemble casing.
- Fig. 30 shows an example of a device ensemble having a casing cover 3051 that comprises a smoother externally exposed surface portion 3057 and a rougher externally exposed surface portion 3056 depicting a pattern that is a hexagonal pattern (e.g., honeycomb pattern).
- the rougher externally exposed surface portion comprises a plurality of holes including 3051, 3052, 3053, 3054, and 3055.
- the casing cover is of a casing that houses a circuit board (e.g., printed circuit board) 3000 that includes devices.
- the devices can comprise sensor(s), emitter(s), processor(s), network interface, memory, transceiver, antenna(s), communication and power port(s), controller(s), and/or any other device disclosed herein.
- the holes 3051-3055 may be disposed such that they align with a sensor or sensor array.
- the sensor(s) may be disposed on a front side of circuit board 3000 facing the viewer, or on a back side of circuit board 3000 away from the viewer.
- hole 3051 aligns with sound sensor 3001 disposed on the front side of circuit board 3000 facing the viewer
- hole 3052 aligns with sensor 3002 disposed on the front side of circuit board 3000 facing the viewer
- hole 3053 aligns with sensor 3003 disposed on a back side of circuit board 3000 away from the viewer
- hole 3054 aligns with sensor 3004 disposed on a back side of circuit board 3000 away from the viewer
- hole 3055 aligns with sensor 3005 disposed on the front side of circuit board 3000 facing the viewer.
- the sensor(s) disposed in the back side of circuit board 3000 may be gas sensor(s) such as carbon dioxide and/or humidity sensors.
- the circuit board may have a plurality of temperature sensors configured to sense the temperature of the device ensemble interior and/or exterior.
- a sensor that may be configured to sense the device ensemble exterior may be aligned with a hole in the device ensemble casing cover 3051. Examples of sensor and/or emitter configuration in a device ensemble are disclosed in International Patent Application Serial No. PCT/US21/30798, filed May 5, 2021, titled “DEVICE ENSEMBLES AND COEXISTENCE MANAGEMENT OF DEVICES,” which is incorporated herein by reference in its entirety.
- the device ensemble may comprise a casing enclosing devices comprising (i) sensors, (ii) a sensor and an emitter, or (iii) a sensor and a transceiver.
- the device ensemble housing may enclose at least 2, 3, 5, 7, 10, 15, 20, or 30 devices.
- the devices of the device ensemble may be operatively coupled to one or more circuit boards enclosed by the casing (e.g., by the housing).
- sensors disposed at different locations of a facility may obtain different measurements of an attribute.
- different sound sensors disposed in different locations in the facility may measure different sounds and/or different sound patterns.
- the sound patterns may have an oscillatory attribute.
- the oscillation may correspond to a frequency of a mechanical device such as an actuator (e.g., motor).
- the oscillation may correspond to behavioral patterns occurring around or in the facility, e.g., of behavioral patterns of the facility occupants.
- the oscillations may have a fine structure that may or may not be oscillating. The fine structure may be superimposed on the oscillations. For example, a building may be noisy during the day when occupants are active, and quieter during the night when occupants are absent or passive.
- the noise pattern may rise during the day and fall during the night.
- the noise level may be especially elevated in the facility, e.g., during a gathering.
- the noise pattern may indicate on what day the gathering occurred, and at which location (e.g., the location having the sensor that measured the abnormally loud sounds).
- a control system may take remedial measures to dampen the sound.
- when a repetitive loud sound is detected at a location (e.g., a conference room or cafeteria in which the sound is consistently uncomfortably loud), persistent remedial measures may be taken in that location.
- the persistent remedial measures may be passive (e.g., installing sound damping wall, ceiling, and/or floor material).
- the persistent remedial measures may be active (e.g., using persistent white noise machine, vibrating windows to dampen the sound, and the like).
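The identification of a consistently loud location described above can be sketched as a simple threshold count over logged sound samples. The function name, sample format, and dB thresholds below are illustrative assumptions, not the disclosure's method.

```python
# Sketch: flag locations whose sound readings are repetitively loud, as a
# trigger for persistent remedial measures. Names/thresholds are illustrative.
from collections import defaultdict

def flag_loud_locations(samples, loud_db=75.0, min_fraction=0.5):
    """Return locations where at least min_fraction of readings exceed loud_db."""
    by_loc = defaultdict(list)
    for loc, db in samples:          # samples: iterable of (location, dB) pairs
        by_loc[loc].append(db)
    return sorted(
        loc for loc, vals in by_loc.items()
        if sum(v > loud_db for v in vals) / len(vals) >= min_fraction
    )

samples = [("cafeteria", 82), ("cafeteria", 79), ("cafeteria", 70),
           ("office", 55), ("office", 60),
           ("conference", 80), ("conference", 81)]
print(flag_loud_locations(samples))  # → ['cafeteria', 'conference']
```

A control system could map the flagged locations to the active or passive remedial measures noted above (e.g., white noise, sound damping material).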
- Fig. 31 shows an example of a graph depicting sound as a function of time for three different sensors, numbered 1, 2, and 3, that are disposed in a facility at different locations.
- the graph delineates the relatively lowest noise level of sensor #1 measuring data 3101, as compared to an increased noise level measured by sensor #3 measuring data 3103, and the highest noise level measured by sensor #2 measuring data 3102. All three sensors measure an oscillatory noise level that oscillates on an approximately 24-hour basis, with some variations.
- measurements of sensor #1 depict data variations such as spike 3104, two daily maxima 3105 and 3106, and one daily maximum 3107.
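The approximately 24-hour oscillation described above can be recovered from a noise-level series with plain lag autocorrelation. The sketch below uses synthetic hourly data as a stand-in for sensor measurements; the function name and all values are illustrative assumptions.

```python
# Sketch: estimate the dominant period of a noise-level series via lag
# autocorrelation, consistent with the ~24 h day/night cycle described above.
import math

def dominant_period(series, max_lag):
    """Return the lag (in samples) with the highest autocorrelation."""
    n = len(series)
    m = sum(series) / n
    var = sum((x - m) ** 2 for x in series) / n
    best_lag, best_r = 1, float("-inf")
    for lag in range(1, max_lag + 1):
        # Mean lagged product, normalized by the series variance.
        cov = sum((series[i] - m) * (series[i + lag] - m)
                  for i in range(n - lag)) / (n - lag)
        if cov / var > best_r:
            best_lag, best_r = lag, cov / var
    return best_lag

# Synthetic stand-in: louder by day, quieter by night, 7 days of hourly samples.
noise = [60 + 10 * math.sin(2 * math.pi * t / 24) for t in range(24 * 7)]
print(dominant_period(noise, max_lag=36))  # → 24 (samples, i.e., hours)
```

Departures of a day's measurements from the recovered periodic baseline (e.g., spikes or extra maxima like those in Fig. 31) could then be treated as candidate abnormal events.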
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/042,712 US20230333434A1 (en) | 2017-04-26 | 2021-08-20 | Mapping acoustic properties in an enclosure |
Applications Claiming Priority (16)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202063069358P | 2020-08-24 | 2020-08-24 | |
US63/069,358 | 2020-08-24 | ||
US202063079851P | 2020-09-17 | 2020-09-17 | |
US63/079,851 | 2020-09-17 | ||
US17/083,128 | 2020-10-28 | ||
US17/083,128 US20210063836A1 (en) | 2017-04-26 | 2020-10-28 | Building network |
PCT/US2021/015378 WO2021154915A1 (en) | 2020-01-29 | 2021-01-28 | Sensor calibration and operation |
USPCT/US2021/015378 | 2021-01-28 | ||
US202163146365P | 2021-02-05 | 2021-02-05 | |
US63/146,365 | 2021-02-05 | ||
PCT/US2021/017946 WO2021163552A1 (en) | 2020-02-14 | 2021-02-12 | Data and power network of a facility |
USPCT/US2021/017946 | 2021-02-12 | ||
PCT/US2021/030798 WO2021226182A1 (en) | 2020-05-06 | 2021-05-05 | Device ensembles and coexistence management of devices |
USPCT/US2021/030798 | 2021-05-05 | ||
US202163233122P | 2021-08-13 | 2021-08-13 | |
US63/233,122 | 2021-08-13 |
Related Parent Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/083,128 Continuation-In-Part US20210063836A1 (en) | 2009-12-22 | 2020-10-28 | Building network |
PCT/US2021/015378 Continuation-In-Part WO2021154915A1 (en) | 2017-04-26 | 2021-01-28 | Sensor calibration and operation |
US17/759,709 Continuation-In-Part US20230065864A1 (en) | 2020-01-29 | 2021-01-28 | Sensor calibration and operation |
PCT/US2021/030798 Continuation-In-Part WO2021226182A1 (en) | 2017-04-26 | 2021-05-05 | Device ensembles and coexistence management of devices |
US17/922,219 Continuation-In-Part US20230176669A1 (en) | 2017-04-26 | 2021-05-05 | Device ensembles and coexistence management of devices |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022046541A1 true WO2022046541A1 (en) | 2022-03-03 |
Family
ID=80353848
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2021/046838 WO2022046541A1 (en) | 2017-04-26 | 2021-08-20 | Mapping acoustic properties in an enclosure |
Country Status (2)
Country | Link |
---|---|
TW (1) | TW202227890A (en) |
WO (1) | WO2022046541A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4357739A1 (en) * | 2022-10-21 | 2024-04-24 | Valeo Telematik Und Akustik GmbH | Method and a system for evaluating an emergency situation in a vehicle |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170086003A1 (en) * | 2002-03-25 | 2017-03-23 | Bose Corporation | Automatic audio system equalizing |
US20170345267A1 (en) * | 2015-10-20 | 2017-11-30 | Vivint, Inc. | System and methods for correlating sound events to security and/or automation system operations |
US9930463B2 (en) * | 2016-03-31 | 2018-03-27 | Sonos, Inc. | Defect detection via audio playback |
KR101853568B1 (en) * | 2016-12-02 | 2018-04-30 | 송원섭 | Smart device, and method for optimizing sound using the smart device |
US20190356508A1 (en) * | 2018-05-02 | 2019-11-21 | View, Inc. | Sensing and communications unit for optically switchable window systems |
Also Published As
Publication number | Publication date |
---|---|
TW202227890A (en) | 2022-07-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230353416A1 (en) | Sensing and communications unit for optically switchable window systems | |
US11294254B2 (en) | Building network | |
US20220179275A1 (en) | Building network | |
US20230176669A1 (en) | Device ensembles and coexistence management of devices | |
CA3066285A1 (en) | Edge network for building services | |
WO2021211798A1 (en) | Interaction between an enclosure and one or more occupants | |
US20230194115A1 (en) | Environmental adjustment using artificial intelligence | |
JP2023113826A (en) | Sensing and communication unit for optically switchable window system | |
WO2022046541A1 (en) | Mapping acoustic properties in an enclosure | |
WO2021226182A1 (en) | Device ensembles and coexistence management of devices | |
WO2021237019A1 (en) | Environmental adjustment using artificial intelligence | |
US20230132451A1 (en) | Interaction between an enclosure and one or more occupants | |
US20230333434A1 (en) | Mapping acoustic properties in an enclosure | |
WO2023010016A1 (en) | Locally initiated wireless emergency alerts | |
US20230288770A1 (en) | Atmospheric adjustment in an enclosure | |
TW202210920A (en) | Device ensembles and coexistence management of devices | |
WO2023034839A1 (en) | Occupant-centered predictive control of devices in facilities | |
WO2022221234A1 (en) | Temperature and thermal comfort mapping of an enclosed environment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21862425; Country of ref document: EP; Kind code of ref document: A1 |
2023-02-22 | WPC | Withdrawal of priority claims after completion of the technical preparations for international publication | Ref document numbers: PCT/US2021/015378, 17/083,128, PCT/US2021/017946, PCT/US2021/030798; Country of ref document: US; Free format text: WITHDRAWN AFTER TECHNICAL PREPARATION FINISHED |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21862425; Country of ref document: EP; Kind code of ref document: A1 |