CN115485614A - Interaction between peripheral structures and one or more occupant-related applications - Google Patents

Interaction between peripheral structures and one or more occupant-related applications

Info

Publication number
CN115485614A
Authority
CN
China
Prior art keywords: user, facility, controller, control, service device
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202180029153.7A
Other languages
Chinese (zh)
Inventor
T·马克尔
N·特里卡
R·P·穆尔普里
M·D·门登霍尔
D·什里瓦斯塔瓦
S·C·布朗
A·古普塔
A·马利克
隋思遥
王楚晴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
View Inc
Original Assignee
View Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Priority claimed from US 16/946,947 (now U.S. Patent No. 11,592,723 B2)
Priority claimed from PCT/US2020/053641 (published as WO 2021/067505 A1)
Priority claimed from US 17/081,809 (now U.S. Patent No. 11,460,749 B2)
Priority claimed from US 17/083,128 (published as US 2021/0063836 A1)
Priority claimed from US 17/249,148 (now U.S. Patent No. 11,735,183 B2)
Application filed by View Inc filed Critical View Inc
Publication of CN115485614A
Current legal status: Withdrawn

Classifications

    • H04K 3/68: Jamming of communication involving special techniques using passive jamming, e.g. by shielding or reflection
    • E06B 9/24: Screens or other constructions affording protection against light, especially against sunshine; similar screens for privacy or appearance; slat blinds
    • G05B 15/02: Systems controlled by a computer, electric
    • G05B 19/042: Programme control other than numerical control, i.e. in sequence controllers or logic controllers, using digital processors
    • H04L 12/282: Controlling appliance services of a home automation network by calling their functionalities, based on user interaction within the home
    • E06B 2009/2464: Screens featuring transparency control by applying voltage, e.g. LCD, electrochromic panels
    • G02F 1/163: Operation of electrochromic cells, e.g. electrodeposition cells; circuit arrangements therefor
    • G02F 2201/50: Protective arrangements
    • G05B 2219/2614: PC applications: HVAC, heating, ventilation, climate control
    • G05B 2219/2642: PC applications: domotique, domestic, home control, automation, smart house
    • H04K 2203/14: Jamming or countermeasure used for a particular application, for the transfer of light or images, e.g. for video surveillance, for television or from a computer screen

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Structural Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Architecture (AREA)
  • Civil Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Selective Calling Equipment (AREA)

Abstract

A network system in a peripheral structure includes one or more interactive targets, such as tintable windows, HVAC components, sensors, computing devices, media display devices, and/or service devices. Various types of local and remote interfaces facilitate remote (e.g., indirect) manipulation of an interactive target, for example, using a digital twin (e.g., a representative virtual model) of a facility and/or a user's mobile circuitry. The environment and/or targets may be controlled according to the preferences and/or requests of users.

Description

Interaction between peripheral structures and one or more occupant-related applications
This application claims priority to U.S. Provisional Patent Application Ser. No. 63/080,899, entitled "INTERACTION BETWEEN AN ENCLOSURE AND ONE OR MORE OCCUPANTS," filed September 21, 2020, to U.S. Provisional Patent Application Ser. No. 63/080,899, entitled "INDIRECT INTERACTION WITH A TARGET IN AN ENCLOSURE," filed July 16, 2020, and to U.S. Provisional Application Ser. No. 63/052,639, entitled "INDIRECT INTERACTION WITH A TARGET IN AN ENCLOSURE," filed April 16, 2020. This application is also a continuation-in-part of U.S. Patent Application Ser. No. 17/249,148, entitled "CONTROLLING OPTICALLY-SWITCHABLE DEVICES," filed February 22, 2021, which is a continuation-in-part of U.S. Patent Application Ser. No. 16/096,557, entitled "CONTROLLING OPTICALLY-SWITCHABLE DEVICES," filed October 25, 2018, which is a national phase entry of International Patent Application Ser. No. PCT/US17/29476, filed April 25, 2017, which claims priority to U.S. Provisional Application Ser. No. 62/327,880, entitled "CONTROLLING OPTICALLY-SWITCHABLE DEVICES," filed April 26, 2016. The international application is, in turn, a continuation-in-part of U.S. Patent Application Ser. No. 14/391,122, entitled "APPLICATIONS FOR CONTROLLING OPTICALLY SWITCHABLE DEVICES," filed October 7, 2014 (issued July 30, 2019), which is a national phase entry of International Patent Application Ser. No. PCT/US13/36456, entitled "APPLICATIONS FOR CONTROLLING OPTICALLY SWITCHABLE DEVICES," filed April 12, 2013, which claims priority to U.S. Provisional Application Ser. No. 61/624,175, entitled "APPLICATIONS FOR CONTROLLING OPTICALLY SWITCHABLE DEVICES," filed April 13, 2012. This application is also a continuation-in-part of U.S. Patent Application Ser. No. 16/946,947, entitled "AUTOMATED COMMISSIONING OF CONTROLLERS IN A WINDOW NETWORK," filed July 13, 2020, which is a continuation-in-part of U.S. Patent Application Ser. No. 16/462,916, entitled "AUTOMATED COMMISSIONING OF CONTROLLERS IN A WINDOW NETWORK," filed May 21, 2019, which is a continuation-in-part of U.S. Patent Application Ser. No. 16/462,793, entitled "METHOD OF COMMISSIONING ELECTROCHROMIC WINDOWS," filed September 6, 2018 (issued March 2, 2021 as U.S. Patent No. 10,935,864). U.S. Patent Application Ser. No. 16/462,916 is also a national phase entry of International Patent Application Ser. No. PCT/US17/62634, entitled "AUTOMATED COMMISSIONING OF CONTROLLERS IN A WINDOW NETWORK," filed November 20, 2017, which claims priority to U.S. Provisional Patent Application Ser. No. 62/551,649, entitled "AUTOMATED COMMISSIONING OF CONTROLLERS IN A WINDOW NETWORK," filed August 29, 2017, and to U.S. Provisional Application Ser. No. 62/426,126, entitled "AUTOMATED COMMISSIONING OF CONTROLLERS IN A WINDOW NETWORK," filed November 23, 2016.
This application is also a continuation-in-part of U.S. Patent Application Ser. No. 16/950,774, entitled "DISPLAY FOR TINTABLE WINDOWS," filed November 17, 2020, which is a continuation of U.S. Patent Application Ser. No. 16/608,157, entitled "DISPLAY FOR TINTABLE WINDOWS," filed October 24, 2019, which is a national phase entry of International Patent Application Ser. No. PCT/US18/29476, entitled "DISPLAY FOR TINTABLE WINDOWS," filed April 25, 2018, which claims priority to (i) U.S. Provisional Patent Application Ser. No. 62/607,618, entitled "ELECTROCHROMIC WINDOWS WITH TRANSPARENT DISPLAY TECHNOLOGY," filed December 19, 2017, (ii) U.S. Provisional Patent Application Ser. No. 62/523,606, entitled "ELECTROCHROMIC WINDOWS WITH TRANSPARENT DISPLAY TECHNOLOGY," filed June 22, 2017, (iii) U.S. Provisional Patent Application Ser. No. 62/507,704, entitled "ELECTROCHROMIC WINDOWS WITH TRANSPARENT DISPLAY TECHNOLOGY," filed May 17, 2017, (iv) U.S. Provisional Patent Application Ser. No. 62/506,514, entitled "ELECTROCHROMIC WINDOWS WITH TRANSPARENT DISPLAY TECHNOLOGY," filed May 15, 2017, and (v) U.S. Provisional Patent Application Ser. No. 62/490,457, entitled "ELECTROCHROMIC WINDOWS WITH TRANSPARENT DISPLAY TECHNOLOGY," filed April 26, 2017. This application is also a continuation-in-part of U.S. Patent Application Ser. No. 17/083,128, entitled "BUILDING NETWORK," filed October 28, 2020, which is a continuation of U.S. Patent Application Ser. No. 16/664,089, entitled "BUILDING NETWORK," filed October 25, 2019, which is a national phase entry of International Patent Application Ser. No. PCT/US19/30467, entitled "EDGE NETWORK FOR BUILDING SERVICES," filed May 2, 2019, which claims priority to U.S. Provisional Patent Application Ser. No. 62/666,033, filed May 2, 2018. U.S. Patent Application Ser. No. 17/083,128 is also a continuation-in-part of International Patent Application Ser. No. PCT/US18/29460, filed April 25, 2018, which claims priority to U.S. Provisional Patent Application Ser. Nos. 62/607,618, 62/523,606, 62/507,704, 62/506,514, and 62/490,457. This application is also a continuation-in-part of U.S. Patent Application Ser. No. 17/081,809, entitled "TINTABLE WINDOW SYSTEM COMPUTING PLATFORM," filed October 27, 2020, which is a continuation-in-part of U.S. Patent Application Ser. No. 16/608,159, entitled "TINTABLE WINDOW SYSTEM COMPUTING PLATFORM," filed October 24, 2019, which is a national phase entry of International Patent Application Ser. No. PCT/US18/29406, entitled "TINTABLE WINDOW SYSTEM COMPUTING PLATFORM," filed April 25, 2018, which claims priority to U.S. Provisional Patent Application Ser. Nos. 62/607,618, 62/523,606, 62/507,704, 62/506,514, and 62/490,457. This application is also a continuation-in-part of International Patent Application Ser. No. PCT/US20/53641, entitled "TANDEM VISION WINDOW AND MEDIA DISPLAY," filed September 30, 2020, which claims priority to U.S. Provisional Patent Application Ser. No. 62/911,271, entitled "TANDEM VISION WINDOW AND TRANSPARENT DISPLAY," filed October 5, 2019, U.S. Provisional Patent Application Ser. No. 62/952,207, entitled "TANDEM VISION WINDOW AND TRANSPARENT DISPLAY," filed December 20, 2019, U.S. Provisional Patent Application Ser. No. 62/975,706, entitled "TANDEM VISION WINDOW AND MEDIA DISPLAY," filed February 12, 2020, and U.S. Provisional Patent Application Ser. No. 63/085,254, entitled "TANDEM VISION WINDOW AND MEDIA DISPLAY," filed September 30, 2020. This application also claims priority to U.S. Provisional Patent Application Ser. No. 63/170,245, entitled "DISPLAY CONSTRUCT FOR MEDIA PROJECTION AND WIRELESS CHARGING," filed April 2, 2021, U.S. Provisional Patent Application Ser. No. 63/154,352, entitled "DISPLAY CONSTRUCT FOR MEDIA PROJECTION AND WIRELESS CHARGING," filed February 26, 2021, and U.S. Provisional Patent Application Ser. No. 63/115,842, entitled "DISPLAY CONSTRUCT FOR MEDIA PROJECTION," filed November 19, 2020. Each of the above-mentioned patent applications is incorporated herein by reference in its entirety.
Background
The present disclosure relates generally to interacting with (e.g., controlling) one or more interactive targets in a peripheral structure. The interactive targets may include optically switchable devices (e.g., tintable windows in buildings), projection media, environmental appliances, sensors, or any other device communicatively coupled to a communication network in a peripheral structure.
The ability to control environmental conditions is becoming increasingly popular, as are the deployment and manipulation of associated equipment such as sensors, transmitters, and/or environment-affecting devices. Controlling the environment may serve to increase occupant comfort and/or reduce power consumption, improving the efficiency of the systems (e.g., heaters, coolers, vents, and/or lighting) that control the environment of the peripheral structure.
Included in these devices are optically switchable windows. With the increasing interest in energy efficiency and system integration, the development and deployment of optically switchable windows for peripheral structures has increased. Electrochromic windows are a promising class of optically switchable windows. Electrochromism is a phenomenon in which a material exhibits a reversible, electrochemically-mediated change in one or more optical properties when stimulated to a different electronic state. Electrochromic materials and devices made from them can be incorporated into windows, for example, for home, commercial, or other uses. The color, hue, transmittance, absorbance or reflectance of the electrochromic window can be changed by inducing a change in the electrochromic material, for example, by applying a voltage across the electrochromic material. This capability may allow for control of the intensity of various wavelengths of light that may pass through the window. One area of interest is control systems for driving optical transitions in optically switchable windows to provide desired lighting conditions, for example, while reducing power consumption of such devices and increasing efficiency of systems integrated therewith.
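As a concrete illustration of the tint control described above, the following minimal Python sketch maps a discrete tint command to an applied drive voltage and a target visible transmittance. The tint states, voltages, and transmittance values are illustrative assumptions, not figures from this disclosure:

```python
# Hypothetical tint lookup: tint state -> (drive voltage in volts, visible transmittance)
TINT_STATES = {
    0: (0.0, 0.60),  # clear
    1: (1.2, 0.40),  # light tint
    2: (2.4, 0.20),  # medium tint
    3: (3.6, 0.05),  # dark tint
}

def drive_command(tint_level: int) -> dict:
    """Return the drive parameters for a requested tint level."""
    if tint_level not in TINT_STATES:
        raise ValueError(f"unsupported tint level: {tint_level}")
    voltage, transmittance = TINT_STATES[tint_level]
    return {"applied_voltage_v": voltage, "target_transmittance": transmittance}

print(drive_command(2))  # {'applied_voltage_v': 2.4, 'target_transmittance': 0.2}
```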
At least one user of the facility (e.g., a building occupant and/or a remotely located person) may want to manipulate (e.g., control) various network-connected devices (e.g., tintable windows) and/or media content in the facility. For convenience, such personal or social interactions should be as intuitive as possible. For example, a user may wish to control various aspects of a facility environment (e.g., using HVAC, sensors, transmitters, and/or tintable windows) via a gaming controller (e.g., a virtual reality or VR controller). The user may also want to control the media content projected in the facility (e.g., using electrochromic window projection or using a projector to display on a wall). The control of any newly added targets (e.g., devices) of the facility should preferably be as seamless as possible and require as little human labor as possible to configure the devices and the remote control system.
Disclosure of Invention
Various aspects disclosed herein at least partially mitigate the disadvantages associated with remote control of interactive targets and/or address related needs. Various embodiments herein relate to methods, systems, software, and networks for manipulating (e.g., controlling) a target (e.g., a device) communicatively coupled to a network, such as by manipulating a digital twin (e.g., a representative virtual model) of a facility. The target may comprise an optically switchable device. The target may be controlled using a remote controller (e.g., a pointing device) and/or a virtual reality (VR) user interface. Various embodiments disclosed herein relate to adjusting peripheral structures and/or target devices according to the preferences and/or desires of one or more users and/or occupants. The adjustment may utilize a prediction of preferences and/or desires by a learning module (e.g., using machine learning). The adjustment can facilitate seamless coupling and/or interaction between peripheral structures (e.g., facilities including a building) and their users and/or occupants. The adjustment may facilitate seamless coupling and/or interaction between the control system of the peripheral structure, its controlled target devices, and its users and/or occupants. Such adjustments may improve the efficiency of activities occurring in the peripheral structure (e.g., work, health, safety, and/or leisure-related activities).
In another aspect, there is provided a method for controlling an interactive target of a facility, the method comprising: (A) Monitoring a position of a mobile circuit relative to a digital twin, the digital twin comprising a virtual three-dimensional representation of a structural feature of a facility having a real interactive target, the mobile circuit (I) being movable by a user, (II) having a known position relative to at least a portion of the structural feature, and (III) being coupled to the virtual representation of the real interactive target in the digital twin; (B) Correlating a gesture imparted on the mobile circuit with the digital twin and generating a result, the gesture (i) being imparted by the user, (ii) intended to remotely cause a change in the real interactive target, and (iii) imparted during coupling with the real interactive target; and (C) using the result to change the current state of the real interactive target in the facility.
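Operations (A) through (C) can be pictured with a short sketch. The following is a minimal, hypothetical Python model (the class and function names are assumptions, not part of this disclosure) in which the digital twin resolves a pointing gesture to the best-aligned target and then updates that target's state:

```python
import math
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    position: tuple       # (x, y, z) in facility coordinates
    state: str = "idle"

class DigitalTwin:
    """Reduced virtual model: interactive targets plus a gesture resolver."""
    def __init__(self, targets):
        self.targets = targets

    def resolve(self, circuit_pos, pointing_dir):
        # (B) Correlate the gesture with the twin: pick the target whose
        # bearing from the mobile circuit best aligns with the pointing
        # direction (cosine similarity against a unit bearing vector).
        def alignment(t):
            v = tuple(t.position[i] - circuit_pos[i] for i in range(3))
            norm = math.sqrt(sum(c * c for c in v)) or 1.0
            return sum((v[i] / norm) * pointing_dir[i] for i in range(3))
        return max(self.targets, key=alignment)

def control_target(twin, circuit_pos, pointing_dir, new_state):
    target = twin.resolve(circuit_pos, pointing_dir)  # (A) + (B)
    target.state = new_state                          # (C) change current state
    return target

twin = DigitalTwin([Target("window-1", (5, 0, 2)), Target("hvac-1", (0, 5, 3))])
print(control_target(twin, (0, 0, 1.5), (1, 0, 0.1), "tint-3").name)  # window-1
```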
In some embodiments, the method further comprises configuring the digital twin according to a building information modeling data file from which the facility was, or is being, constructed.
In some embodiments, the building information modeling data file includes building details, annotations, or information from a model database of the facility. In some embodiments, the building information modeling data file is used to plan and/or track various stages in the life cycle of a facility, including concept, construction, maintenance, and/or demolition of the facility. In some embodiments, the building information modeling data file includes a three-dimensional structure model annotated with two-dimensional drawing elements in a database. In some embodiments, the three-dimensional structural model includes parametric elements that provide geometric parameters of the structural feature. In some embodiments, the facility includes a digital network. In some embodiments, the digital network is used, at least in part, to monitor the location of the mobile circuit. In some embodiments, the facility includes a digital network communicatively coupled to the real interactive target. In some embodiments, the facility includes a control network communicatively coupled to the real interactive target to support (a) monitoring a location of the mobile circuit, and/or (b) changing a current state of the real interactive target. In some embodiments, the control network is a hierarchical network comprising a plurality of controllers. In some embodiments, the mobile circuit is included in a handheld pointing device. In some embodiments, the mobile circuitry is included in a mobile phone. In some embodiments, the movement circuitry is included in a handheld game controller having motion functionality and click/select functionality. In some embodiments, the movement circuit includes or is coupled to a motion sensor. In some embodiments, the mobile circuitry does not use an electromagnetic beam or an acoustic beam. In some embodiments, the movement circuit is included in a Virtual Reality (VR) interface that includes a display headset, a handheld controller with motion or selection functionality. In some embodiments, the mobile circuitry is included in a laptop computer. In some embodiments, the mobile circuit is included in a tablet computer. In some embodiments, the gesture includes a movement. In some embodiments, the position relative to the structural feature of the facility is established at a first time, and the relative position is maintained while the mobile circuit is moving (i) locally in the facility or (ii) away from the facility. In some embodiments, the relative position is maintained in real time. In some embodiments, the digital twin includes a virtual three-dimensional representation of a plurality of structural features of the facility, including a wall, floor, window, door, or table of the facility. In some embodiments, the digital twin includes a virtual three-dimensional representation of a plurality of structural features of the facility, the plurality of structural features including fixed or non-fixed devices of the facility. In some embodiments, the coupling of the mobile circuit to the virtual representation of the real interactive target in the digital twin includes (i) a spatial relationship between the mobile circuit and at least a portion of the structural feature identified in at least two dimensions, and (ii) a relative pointing direction of the mobile circuit to the real interactive target. 
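To make the relationship between a building information modeling data file and the digital twin concrete, the following sketch loads a deliberately simplified, hypothetical BIM-like JSON document and indexes its elements into static and dynamic (interactive) entries; real BIM data (e.g., an IFC model) is far richer, and all field names here are assumptions:

```python
import json

bim_json = """
{
  "facility": "Building-A",
  "elements": [
    {"id": "wall-12",  "type": "wall",   "origin": [0, 0, 0], "size": [10, 0.3, 3]},
    {"id": "window-1", "type": "window", "origin": [5, 0, 1], "size": [2, 0.1, 1.5],
     "interactive": true, "device": "electrochromic"}
  ]
}
"""

def build_twin(doc: str) -> dict:
    """Index static structure and dynamic (interactive) elements separately."""
    data = json.loads(doc)
    twin = {"facility": data["facility"], "static": {}, "dynamic": {}}
    for el in data["elements"]:
        bucket = "dynamic" if el.get("interactive") else "static"
        twin[bucket][el["id"]] = el
    return twin

twin = build_twin(bim_json)
print(list(twin["dynamic"]))  # ['window-1']
```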
In some embodiments, the coupling of the movement circuit to the virtual representation of the real interactive target in the digital twin is made up of a virtual digital ray extending from the location of the movement circuit to the target. In some implementations, the gesture imparted by the user consists of a gesture performed with the movement circuitry to specify a direction of the virtual digital ray. In some embodiments, the gesture includes a pointing, moving, or clicking action. In some embodiments, the real interactive target comprises a media display. In some embodiments, coupling of the movement circuitry to the virtual representation of the real interactive target in the digital twin includes selecting an active media element in the media display. In some implementations, the active media element is a drop-down menu selection. In some implementations, the selection of the active media element includes a pointing motion, a moving motion, or a clicking action. In some embodiments, changing the function of the real interactive target is commensurate with the intent of the gesture. In some implementations, the gesture imparted by the user is coupled to the real interactive target because the movement circuit is pointed at the real interactive target. In some embodiments, the current state change comprises a change in a signal sent to the optically tintable window. In some embodiments, the optically tintable window is an electrochromic window. In some embodiments, the current state change comprises a tint of the optically tintable window. In some embodiments, the optically tintable window is an electrochromic window. In some embodiments, the current state change comprises a menu control parameter of the media content display. In some embodiments, the current state change comprises a parameter of the sensor and/or the transmitter. In some embodiments, the current state change includes a command setting of the environmental control unit. In some embodiments, the environment control unit controls the environment of the facility. In some embodiments, the command settings include (i) a tint density of the tintable window, (ii) a temperature setting of the HVAC unit, (iii) a fan setting of the HVAC unit, or (iv) an on/off setting of the lighting unit. In some embodiments, the digital twin represents a plurality of structural features. In some embodiments, the structural features include static elements and/or dynamic elements. In some embodiments, the dynamic element comprises a virtual representation of the real interactive target. In some embodiments, the dynamic element includes a current state of the real interactive target. In some embodiments, the facility includes a control network communicatively coupled to (i) the real interactive target and (ii) the digital twin. In some embodiments, the method comprises: updating, by the control network, a current state of the virtual representation of the real interactive target in the digital twin as the current state changes. In some embodiments, the facility includes a network of controllers communicatively coupled to the real interactive targets. In some embodiments, the method further includes sending one or more data messages from the controller network to the digital twin to update the digital twin. In some embodiments, the digital twin is updated according to a change in a current state of the real interactive target. 
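The virtual digital ray described above can be implemented as a standard ray-versus-bounding-box test. The sketch below uses the common slab method against an axis-aligned box standing in for a target's geometry in the twin; the geometry values are hypothetical:

```python
def ray_hits_box(origin, direction, box_min, box_max):
    """Return True if a ray from `origin` along `direction` hits the box."""
    t_near, t_far = 0.0, float("inf")
    for i in range(3):
        if abs(direction[i]) < 1e-9:           # ray parallel to this slab
            if not (box_min[i] <= origin[i] <= box_max[i]):
                return False
        else:
            t1 = (box_min[i] - origin[i]) / direction[i]
            t2 = (box_max[i] - origin[i]) / direction[i]
            t_near = max(t_near, min(t1, t2))
            t_far = min(t_far, max(t1, t2))
    return t_near <= t_far

# Ray from the mobile circuit's location toward a window's bounding box:
print(ray_hits_box((0, 0, 1.5), (1, 0, 0), (4, -1, 1), (6, 1, 2.5)))  # True
```

A fuller implementation would test the ray against every interactive target's geometry and select the nearest hit, which then becomes the coupled target.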
In some embodiments, the method further comprises exchanging one or more messages between the digital twin and the mobile circuit to (i) provide the mobile circuit with an analysis corresponding to the initial virtual location, and (ii) virtually navigate in the digital twin to interact with a virtual representation of the real interactive target in the digital twin. In some embodiments, the initial virtual location is a known location. In some embodiments, the initial virtual location is different from the known location. In some embodiments, the moving circuit is disposed remotely from the known location. In some embodiments, the initial virtual location is aligned with a virtual representation of a known location in the digital twin. In some embodiments, a user manipulating the mobile circuit is positioned away from a known location. In some embodiments, the initial virtual location is aligned with a virtual representation of a known location in the digital twin. In some embodiments, the user is outside the facility. In some embodiments, the user is in a facility. In some embodiments, the initial virtual location is a default location. In some embodiments, the method further comprises sending a location message from the mobile circuit to the digital twin to specify an initial virtual location. In some embodiments, the method further comprises (a) sending at least one control action message from the movement circuit to the digital twin in response to at least one gesture performed with the movement circuit; (b) Verifying the at least one gesture against a predetermined control action in the digital twin; and (c) sending at least one command message to the real interactive target when the gesture is verified. In some embodiments, the digital twin includes a virtual three-dimensional representation of a plurality of structural features of the facility, the plurality of structural features including a plurality of real interactive targets. In some embodiments, the virtual three-dimensional representation is modified in response to adding and/or subtracting real interactive targets in the facility.
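The control-action flow of (a) through (c) above, in which a gesture is verified against predetermined control actions before any command is issued, might look like the following sketch. The action table and message shapes are assumptions for illustration:

```python
# (a) the mobile circuit sends a control-action message; (b) the twin verifies
# the gesture against predetermined control actions for the addressed target;
# (c) a command message is sent only if the gesture verifies.
PREDETERMINED_ACTIONS = {
    "window-1":  {"point+click": "toggle_tint", "swipe_up": "tint_darker"},
    "display-3": {"point+click": "select_menu_item"},
}

def handle_control_action(target_id: str, gesture: str) -> dict:
    actions = PREDETERMINED_ACTIONS.get(target_id, {})
    command = actions.get(gesture)                 # (b) verify the gesture
    if command is None:
        return {"status": "rejected", "reason": "gesture not recognized"}
    return {"status": "sent", "target": target_id, "command": command}  # (c)

print(handle_control_action("window-1", "swipe_up"))
# {'status': 'sent', 'target': 'window-1', 'command': 'tint_darker'}
```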
In another aspect, an apparatus for controlling an interactive target of a facility is provided, the apparatus comprising one or more controllers having circuitry, the one or more controllers configured to: (A) communicatively couple to (a) a digital twin comprising a virtual three-dimensional representation of a structural feature of a facility with a real interactive target, and (b) a mobile circuit that (I) is movable by a user, (II) has a known position relative to at least a portion of the structural feature, and (III) is coupled to a virtual representation of the real interactive target in the digital twin; (B) monitor, or direct monitoring of, a position of the mobile circuit relative to the digital twin; (C) correlate, or direct correlation of, a gesture imparted on the mobile circuit with the digital twin and generate a result, the gesture (i) being imparted by the user, (ii) intended to remotely cause a change in the real interactive target, and (iii) imparted during coupling of the user with the real interactive target; and (D) use the result to change, or direct a change to, the current state of the real interactive target in the facility.
In some embodiments, the one or more controllers include or are communicatively coupled to a building management system. In some embodiments, the one or more controllers are part of a hierarchical control system. The control system may include one or more controllers. The at least one controller of the control system may be in a facility. The at least one controller may be remote from the facility (e.g., located outside the facility, e.g., in the cloud). The at least one controller may be located in a peripheral structure in which the target device is located. At least one controller may be located in a room in which the target device is located. At least one controller may be located in the floor where the target device is located. The at least one controller may be located in a facility in which the target device is located. The at least one controller may be located in a different room than the room in which the target device is located. The at least one controller may be located in a peripheral structure different from the peripheral structure in which the target device is located. The at least one controller may be located in a floor different from the floor in which the target device is located. The at least one controller may be located in a building different from the building in which the target device is located. In some embodiments, the one or more controllers include a control scheme comprising a feedback, feed-forward, closed-loop, or open-loop control scheme. In some embodiments, the one or more controllers are interconnected in a network disposed in the facility. In some embodiments, the network includes a cable including stranded cable, coaxial cable, and/or fiber optic cable. In some embodiments, the network is at least partially disposed in an enclosure of the facility, in an electrical shaft, a communications shaft, an elevator shaft, and/or an electrical room. In some embodiments, the one or more controllers are configured to alter or direct the altering of the digital twin in accordance with a building information modeling data file from which the facility was constructed or from which the facility was constructed. In some embodiments, the building information modeling data includes building details, annotations, and/or information from the model for the facility. In some embodiments, the building information modeling data file is used to plan and/or track various stages in the life cycle of a facility, including concept, construction, maintenance, and/or demolition of the facility. In some embodiments, the building information modeling data file includes a three-dimensional structure model annotated with two-dimensional drawing elements in a database. In some embodiments, the three-dimensional structural model includes parametric elements that provide geometric parameters of the structural feature. In some embodiments, the device further comprises a digital network. In some embodiments, the digital network is used to monitor the location of the mobile circuit. In some embodiments, the relative position is maintained in real time. In some embodiments, the device further comprises a digital network communicatively coupled to the real interactive target. In some embodiments, the digital network supports at least a fourth generation (4G) communication interface. In some embodiments, the digital network supports at least a fifth generation (5G) communication interface. 
In some embodiments, the apparatus further includes a control network disposed at least in part in the facility, the control network communicatively coupled to the real interactive target to facilitate (a) monitoring a location of the mobile circuit, and/or (b) changing a current state of the real interactive target. In some embodiments, the control network is a hierarchical network comprising a plurality of controllers. In some embodiments, the mobile circuit is included in a handheld pointing device. In some embodiments, the mobile circuitry is included in a mobile phone. In some embodiments, the movement circuitry is included in a handheld game controller having a motion function and a click/select function. In some embodiments, the movement circuit includes or is coupled to a motion sensor. In some embodiments, the mobile circuitry does not use an electromagnetic beam or an acoustic beam. In some embodiments, the movement circuit is included in a Virtual Reality (VR) interface that includes a display headset, a handheld controller with motion or selection functionality. In some embodiments, the mobile circuitry is included in a laptop computer. In some embodiments, the mobile circuit is included in a tablet computer. In some embodiments, the gesture includes a movement. In some embodiments, the location relative to the structural feature of the facility is established at a first time. In some embodiments, the one or more controllers are configured to maintain or direct maintenance of the relative position as the movement circuitry moves (i) locally in the facility or (ii) away from the facility. In some embodiments, the one or more controllers are configured to maintain or direct the maintenance of relative positions in real time. In some embodiments, the digital twin includes a virtual three-dimensional representation of a plurality of structural features of the facility, including a wall, floor, window, door, or table of the facility. In some embodiments, the digital twin includes a virtual three-dimensional representation of a plurality of structural features of the facility, the plurality of structural features including fixed or non-fixed devices of the facility. In some embodiments, the digital twin is configured to (a) identify a spatial relationship between the known location and the virtual representation of the real interactive target in at least two dimensions, and/or (b) identify a relative pointing direction to the real interactive target. In some embodiments, the initial virtual location is a known location. In some embodiments, the initial virtual location is different from the known location. In some embodiments, the movement circuit is disposed remotely from the known location. In some embodiments, the one or more controllers are configured to align or guide the initial virtual position with a virtual representation of a known position in the digital twin. In some embodiments, a user manipulating the mobile circuit is positioned away from a known location. In some embodiments, the one or more controllers are configured to align or guide the initial virtual location with a virtual representation of a known location in the digital twin. In some embodiments, the user is outside the facility. In some embodiments, the user is in a facility. In some embodiments, the digital twin is configured to identify a digital ray projected from the location of the moving circuit to the real interactive target. 
In some embodiments, the movement circuitry is configured to perform at least one gesture to specify the direction of the digital ray. In some embodiments, the at least one gesture includes a pointing, moving, or clicking action. In some embodiments, the real interactive target comprises a media display. In some embodiments, the digital twin is configured to identify a selection of an active media element in the media display. In some implementations, the active media element is a drop-down menu selection. In some implementations, the selection of the active media element includes a pointing motion, a moving motion, or a clicking action. In some embodiments, the one or more controllers are configured to change the current state of the real interactive target commensurate with the intent of the gesture. In some embodiments, the one or more controllers are configured to use a gesture imparted by a user to couple the movement circuit to the real interactive target by pointing the movement circuit to one side of the real interactive target. In some embodiments, the side is the front of the remote controller where the mobile circuitry is embedded. In some embodiments, the one or more controllers are configured to generate a change in an electrical signal sent to the optically tintable window to change the tint of the optically tintable window. In some embodiments, the optically tintable window is an electrochromic window. In some embodiments, the one or more controllers are configured to change a current state of tint of the optically tintable window. In some embodiments, the optically tintable window is an electrochromic window. In some embodiments, the one or more controllers are configured to change a current state of a menu controller parameter of the media content display. In some embodiments, the one or more controllers are configured to change a current state of a parameter of the sensor and/or the transmitter. In some embodiments, the one or more controllers are configured to change a current state of a command setting of the environmental control unit. In some embodiments, the environment control unit is configured to control the environment of the facility. In some embodiments, the command settings are configured to include (i) a tint density of the tintable window, (ii) a temperature setting of the HVAC unit, (iii) a fan setting of the HVAC unit, and/or (iv) an on/off setting of the lighting unit. In some embodiments, the digital twin is configured to represent a plurality of structural features. In some embodiments, the structural features include static elements and dynamic elements. In some embodiments, the dynamic element is configured to include a virtual representation of the real interactive target. In some embodiments, the dynamic element is configured to include a current state of the real interactive target. In some embodiments, the one or more controllers are configured to: updating, by the control network, a current state of the virtual representation of the real interactive target in the digital twin when the current state of the real interactive target changes. In some embodiments, at least two of operations (a), (B), (C), and (D) are configured to be performed by the same one of the one or more controllers. In some embodiments, at least two of operations (a), (B), (C), and (D) are configured to be performed by different ones of the one or more controllers.
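For illustration, the command settings enumerated as (i) through (iv) can be gathered into a single configuration record. The field names and value ranges below are assumptions:

```python
from dataclasses import dataclass

@dataclass
class CommandSettings:
    tint_density: int          # (i)   e.g., 0 (clear) through 3 (darkest)
    hvac_temperature_c: float  # (ii)  HVAC temperature setpoint
    hvac_fan: str              # (iii) e.g., "auto", "low", "high"
    lighting_on: bool          # (iv)  lighting unit on/off

settings = CommandSettings(tint_density=2, hvac_temperature_c=21.5,
                           hvac_fan="auto", lighting_on=False)
print(settings)
```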
In another aspect, a non-transitory computer program product for controlling an interactive target of a facility is provided, the non-transitory computer program product comprising instructions recorded thereon which, when executed by one or more processors, cause the one or more processors to perform operations comprising: (A) monitoring a position of a mobile circuit relative to a digital twin, the digital twin comprising a virtual three-dimensional representation of a structural feature of a facility having a real interactive target, the mobile circuit (I) being movable by a user, (II) having a known position relative to at least a portion of the structural feature, and (III) being coupled to a virtual representation of the real interactive target in the digital twin; (B) correlating a gesture imparted on the mobile circuit with the digital twin and generating a result, the gesture (i) being imparted by the user, (ii) intended to remotely cause a change in the real interactive target, and (iii) imparted during coupling with the real interactive target; and (C) using the result to change the current state of the real interactive target in the facility.
In some embodiments, the operations include configuring the digital twin according to a building information modeling data file from which the facility was built or from which the facility was built. In some embodiments, the building information modeling data file includes building details, annotations, or information from a model database of the facility. In some embodiments, the building information modeling data file is used to plan and/or track various stages in the life cycle of a facility, including concept, construction, maintenance, and/or demolition of the facility. In some embodiments, the building information modeling data file includes a three-dimensional structure model annotated with two-dimensional drawing elements in a database. In some embodiments, the three-dimensional structure model includes parametric elements that provide geometric parameters of the structural feature. In some embodiments, the operations are applicable to facilities including digital networks. In some implementations, a digital network is used, at least in part, to monitor the location of the mobile circuit and/or gestures imparted on the mobile circuit. In some embodiments, the operations are adapted for a facility including a digital network communicatively coupled to a real interactive target. In some embodiments, the operations are applicable to a facility that includes a control network communicatively coupled to a real interactive target to support (a) monitoring gestures imparted on a mobile circuit, and/or (b) changing a current state of the real interactive target. In some embodiments, the operations are applicable to a control network, which is a hierarchical network comprising a plurality of controllers. In some embodiments, the operations are applicable to a mobile circuit included in a handheld pointing device. In some embodiments, the operations are applicable to mobile circuitry included in a mobile phone. In some embodiments, the operations are applicable to a movement circuit included in a hand-held game controller having a motion function, a click function, and/or a selection function. In some embodiments, the operation is adapted for the mobile circuit to include or be coupled to a motion sensor. In some embodiments, the operation is adapted to exclude the use of electromagnetic or acoustic beams by the mobile circuit. In some embodiments, the operations are suitable for mobile circuits included in a Virtual Reality (VR) interface that includes a display headset, a hand-held controller with motion functionality and/or selection functionality. In some embodiments, the operations are applicable to mobile circuitry included in a laptop computer. In some embodiments, the operations are applicable to mobile circuitry included in a tablet computer. In some embodiments, the location relative to the structural feature of the facility is established at a first time, and the relative location is maintained while the mobile circuit is moving (i) locally in the facility or (ii) away from the facility. In some embodiments, the relative position is maintained in real time. In some embodiments, the digital twin includes a virtual three-dimensional representation of a plurality of structural features of the facility, including a wall, floor, window, door, or table of the facility. 
In some embodiments, the digital twin includes a virtual three-dimensional representation of a plurality of structural features of the facility, the plurality of structural features including fixed or non-fixed devices of the facility. In some embodiments, the coupling of the mobile circuit to the virtual representation of the real interactive target in the digital twin includes (i) a spatial relationship between the mobile circuit and at least a portion of the structural feature identified in at least two dimensions, and (ii) a relative pointing direction of the mobile circuit to the real interactive target. In some embodiments, the coupling of the movement circuit to the virtual representation of the real interactive target in the digital twin consists of a virtual digital ray extending from the location of the movement circuit to the target. In some embodiments, the gesture is performed with movement circuitry to specify a direction of the virtual digital ray. In some embodiments, the gesture includes a pointing, moving, or clicking action. In some embodiments, the real interactive target comprises a media display. In some implementations, coupling of the movement circuitry to the virtual representation of the real interactive target in the digital twin includes selecting an active media element in the media display. In some implementations, the active media element is a drop-down menu selection. In some embodiments, the selection of the active media element includes a pointing motion, a moving motion, or a clicking action. In some embodiments, changing the function of the real interactive target is commensurate with the intent of the gesture. In some implementations, the gesture imparted by the user is coupled to the real interactive target because the movement circuit is pointed at the real interactive target. In some embodiments, the current state change comprises a change in a signal sent to the optically tintable window. In some embodiments, the optically tintable window is an electrochromic window. In some embodiments, the current state change comprises a menu control parameter of the media content display. In some embodiments, the current state change includes a parameter of the sensor and/or the transmitter. In some embodiments, the current state change includes a command setting of the environmental control unit. In some embodiments, the environment control unit controls the environment of the facility. In some embodiments, the command settings include (i) a tint density of the tintable window, (ii) a temperature setting of the HVAC unit, (iii) a fan setting of the HVAC unit, or (iv) an on/off setting of the lighting unit. In some embodiments, the digital twin represents a plurality of structural features. In some embodiments, the structural features include static elements and/or dynamic elements. In some embodiments, the dynamic element comprises a virtual representation of the real interactive target. In some embodiments, the dynamic element includes a current state of the real interactive target. In some embodiments, the facility includes a control network communicatively coupled to (i) the real interactive target and (ii) the digital twin. In some embodiments, the operations comprise: updating, by the control network, a current state of the virtual representation of the real interactive target in the digital twin as the current state of the real interactive target changes. 
In some embodiments, the operations are adapted for a facility including a network of controllers communicatively coupled to a real interactive target. In some embodiments, the operations further comprise sending one or more data messages from the controller network to the digital twin for updating the digital twin. In some embodiments, the digital twin is updated according to a change in the current state of the real interactive target. In some embodiments, the operations further comprise exchanging messages between the digital twin and the mobile circuit to (i) provide the mobile circuit with an analysis corresponding to the initial virtual location, and (ii) virtually navigate in the digital twin to interact with a virtual representation of the real interactive target in the digital twin. In some embodiments, the initial virtual location is a known location. In some embodiments, the initial virtual location is different from the known location. In some embodiments, the movement circuit is disposed remotely from the known location. In some embodiments, the operations include aligning the initial virtual location with a virtual representation of a known location in the digital twin. In some embodiments, a user manipulating the mobile circuit is positioned away from a known location. In some embodiments, the operations include aligning the initial virtual location with a virtual representation of a known location in the digital twin. In some embodiments, the user is outside the facility. In some embodiments, the user is in a facility. In some embodiments, the initial virtual location is a default location. In some embodiments, the operations further comprise sending a location message from the mobile circuit to the digital twin to specify an initial virtual location. In some embodiments, the operations further comprise (a) sending a control action message from the movement circuit to the digital twin in response to a gesture imparted on the movement circuit; (b) Verifying the gesture against a predetermined control action in the digital twin; and (c) sending a command message to the real interactive target when the gesture is verified. In some embodiments, the digital twin includes a virtual three-dimensional representation of a plurality of structural features of the facility, the plurality of structural features including a plurality of real interactive targets. In some embodiments, the virtual three-dimensional representation is modified in response to adding and/or subtracting one or more real interactive targets in the facility. In some embodiments, at least two of operations (a), (B), and (C) are performed by the same processor of the one or more processors. In some embodiments, at least two of operations (a), (B), and (C) are performed by different processors of the one or more processors.
In another aspect, there is provided a method for controlling a service device of a facility, the method comprising: (a) identifying, by a control system configured to control the service device, the service device in proximity to a user disposed in the facility; (b) registering, in the control system, the location of the user in the facility; (c) providing the service device from a plurality of devices, the service device being provided based at least in part on the location of the user; and (d) instructing the service device to execute a service selected by the user.
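A minimal sketch of steps (a) through (d) follows, assuming a hypothetical device registry, a fixed proximity radius, and a user-supplied selection callback; none of these names or values come from the disclosure:

```python
DEVICES = {
    "coffee-2":  {"location": (3.0, 4.0),  "services": ["espresso", "latte"]},
    "printer-1": {"location": (20.0, 2.0), "services": ["print", "scan"]},
}
USER_LOCATIONS = {}  # control-system registry of user locations

def nearby(user_pos, device_pos, radius=15.0):
    return sum((u - d) ** 2 for u, d in zip(user_pos, device_pos)) ** 0.5 <= radius

def offer_and_dispatch(user_id, user_pos, choose):
    USER_LOCATIONS[user_id] = user_pos                    # (b) register location
    candidates = {d: v for d, v in DEVICES.items()
                  if nearby(user_pos, v["location"])}     # (a) identify proximity
    if not candidates:
        return {"error": "no service device in range"}
    device_id = next(iter(candidates))                    # (c) provide a device
    service = choose(candidates[device_id]["services"])   # user selects a service
    return {"device": device_id, "execute": service}      # (d) instruct execution

print(offer_and_dispatch("u7", (1.0, 2.0), choose=lambda s: s[0]))
# {'device': 'coffee-2', 'execute': 'espresso'}
```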
In some embodiments, the method further comprises providing the user with a selection comprising services provided by the service device. In some embodiments, the location of the user is sensed by a sensor communicatively coupled to the control system. In some embodiments, the method further comprises operatively coupling the service device to the control system by utilizing a network authentication protocol. In some embodiments, the method further comprises utilizing a security protocol in controlling the service device. In some embodiments, the method further comprises utilizing building automation and control protocols in controlling the service device. In some embodiments, the user is considered proximate at a distance of at least about fifteen (15) meters, twenty (20) meters, thirty (30) meters, or more from the service device. In some embodiments, the mobile circuitry of the user is indirectly coupled to the service device. In some embodiments, the service device has a range, and the user is proximate when the user is within the range of the service device. In some embodiments, the service device has a first range, the user has a second range, and proximity comprises an intersection between the first range of the service device and the second range of the user. In some embodiments, the first range and/or the second range is adjustable. In some embodiments, the first range is specific to the service device, the service device type, and/or the location of the service device. In some embodiments, the first range differs between service devices, different types of service devices, and/or service devices in different locations. In some embodiments, the control system is configured to control at least one other device attached to the facility. In some embodiments, the method further comprises controlling the at least one other device of the facility using the mobile circuitry and the control system. In some embodiments, the at least one other device comprises an electrochromic device. In some embodiments, the at least one other device comprises a media display, lighting, a sensor, a transmitter, an antenna, or a heating, ventilation, and air conditioning (HVAC) system. In some embodiments, the location of the user is sensed by a sensor communicatively coupled to the control system. In some embodiments, the method further comprises determining a location of the user. In some embodiments, the location of the user is determined by utilizing ultra-wideband (UWB) radio waves. In some implementations, determining the location of the user has an accuracy of at least about twenty (20) meters. In some embodiments, the service device is offered through an application installed in a mobile circuit held by the user. In some embodiments, the selection of services is provided by an application installed in a mobile circuit held by the user. In some embodiments, the user selects the service without contacting the service device. In some implementations, the services of the service device are depicted on the service device. In some embodiments, the user selects the service on the service device, without contacting the service device, by pointing the mobile circuit at a depiction of the service. In some embodiments, the mobile circuit comprises a cellular phone, a tablet computer, or a laptop computer.
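The range-intersection notion of proximity described above reduces to a simple geometric test: a device with a first range and a user with a second range are proximate when the two range circles overlap. A sketch, with hypothetical coordinates and ranges:

```python
import math

def ranges_intersect(device_pos, device_range, user_pos, user_range):
    return math.dist(device_pos, user_pos) <= device_range + user_range

print(ranges_intersect((0, 0), 10.0, (12, 0), 5.0))  # True  (10 + 5 >= 12)
print(ranges_intersect((0, 0), 10.0, (30, 0), 5.0))  # False (10 + 5 <  30)
```

Per-device first ranges could be drawn from a table keyed by device, device type, or location, matching the adjustable, device-specific ranges described above.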
In some embodiments, the method further comprises depicting a virtual representation of at least a portion of the facility in which the service device is disposed, the depicting being performed by the mobile circuitry. In some embodiments, the method further comprises depicting, by the mobile circuitry, a virtual representation of the service device. In some embodiments, the method further comprises depicting a virtual representation of the service provided by the service device, the depicting being performed by the mobile circuitry. In some embodiments, the method further comprises depicting a virtual representation of the service being executed by the service device, the depicting being performed by the mobile circuitry. In some embodiments, the method further comprises updating, using the control system, the virtual representation of the service execution of the service device in real time as the service executes.
In another aspect, a non-transitory computer program product for controlling a service device of a facility is provided, the non-transitory computer program product having instructions recorded thereon which, when executed by one or more processors, cause the one or more processors to perform operations comprising: (a) identifying, by a control system configured to control the service device, the service device being proximate to a user disposed in the facility; (b) registering in the control system the location of the user in the facility; (c) providing the service device from a plurality of devices, the service device being provided based at least in part on the location of the user; and (d) instructing the service device to execute a service selected by the user.
In some embodiments, the operations comprise providing the user with a selection comprising services provided by the service device. In some embodiments, the location of the user is sensed by a sensor communicatively coupled to the control system. In some embodiments, the operations comprise operatively coupling the service device to the control system by utilizing a network authentication protocol. In some embodiments, the operations comprise identifying the service device by a control system configured to control the service device by utilizing a security protocol. In some embodiments, the operations comprise identifying the service device by a control system configured to control the service device through a building automation and control protocol. In some embodiments, the operations further comprise identifying the service device by a control system further configured to control at least one other device attached to the facility. In some embodiments, the operations further comprise controlling the at least one other device of the facility by using the control system and the mobile circuitry. In some embodiments, the at least one other device comprises an electrochromic device. In some embodiments, the at least one other device comprises a media display, lighting, a sensor, a transmitter, an antenna, or a heating, ventilation, and air conditioning (HVAC) system. In some embodiments, the service device has a range, and the user is proximate to the service device when the user is within that range. In some embodiments, the service device has a first range, the user has a second range, and the user is proximate to the service device when the first range of the service device and the second range of the user intersect. In some embodiments, the first range and/or the second range is adjustable. In some embodiments, the first range is specific to the service device, the service device type, and/or the location of the service device. In some embodiments, the first range differs between service devices, between different types of service devices, and/or between service devices in different locations. In some embodiments, the operations further comprise determining the location of the user. In some embodiments, the operations further comprise determining the location of the user by using ultra-wideband radio waves. In some embodiments, the operations further comprise determining the location of the user with an accuracy of at least about twenty (20) centimeters. In some embodiments, the service device is provided via an application installed in mobile circuitry held by the user. In some embodiments, the selection of services is provided via an application installed in mobile circuitry held by the user. In some embodiments, the user selects the service without contacting the service device. In some embodiments, the services of the service device are depicted on the service device. In some embodiments, the operations further comprise facilitating selection of the service by detecting that the user points the mobile circuitry at a depiction of the service on the service device, the user pointing without contacting the service device. In some embodiments, the mobile circuitry comprises a cellular telephone, a tablet computer, or a laptop computer. In some embodiments, the operations further comprise facilitating use of the mobile circuitry to depict a virtual representation of at least a portion of the facility in which the service device is disposed.
In some embodiments, the operations further comprise depicting, on the mobile circuitry, a virtual representation of the service device. In some embodiments, the operations further comprise depicting, on the mobile circuitry, a virtual representation of a service provided by the service device. In some embodiments, the operations further comprise depicting, on the mobile circuitry, a virtual representation of service execution by the service device. In some embodiments, the operations further comprise updating, using the control system, the virtual representation of the service execution of the service device in real time as the service executes. In some embodiments, one processor is configured to perform at least two of operations (a), (b), (c), and (d). In some embodiments, at least two of operations (a), (b), (c), and (d) are performed by different controllers.
In another aspect, there is provided an apparatus for controlling a service device of a facility, the apparatus comprising at least one controller comprising circuitry, the at least one controller configured to: (a) operatively couple to the service device and control, or direct control of, the service device; (b) identify, or direct identification of, the service device disposed proximate to a user disposed in the facility; (c) register, or direct registration of, the location of the user in the facility; (d) provide, or direct provision of, the service device from a plurality of devices, the service device being provided based at least in part on the location of the user; and (e) instruct the service device to execute a service selected by the user.
In some embodiments, the at least one controller is configured to provide, or direct provision of, a selection comprising services provided by the service device. In some embodiments, the at least one controller is configured to be operatively coupled to a sensor configured to sense the location of the user. In some embodiments, the at least one controller is configured to operatively couple, or direct operative coupling of, the service device to the control system by utilizing a network authentication protocol. In some embodiments, the at least one controller is configured to control, or direct control of, the service device by using a security protocol. In some embodiments, the at least one controller is configured to control, or direct control of, the service device by using a building automation and control protocol. In some embodiments, the at least one controller is configured to be operatively coupled to, and to control or direct control of, at least one other device attached to the facility. In some embodiments, the at least one controller is configured to control, or direct control of, the at least one other device of the facility by using the mobile circuitry. In some embodiments, the at least one other device comprises an electrochromic device. In some embodiments, the at least one other device comprises a media display, lighting, a sensor, a transmitter, an antenna, or a heating, ventilation, and air conditioning (HVAC) system. In some embodiments, the service device has a range, and the user is proximate to the service device when the user is within that range. In some embodiments, the service device has a first range, the user has a second range, and the user is proximate to the service device when the first range of the service device and the second range of the user intersect. In some embodiments, the first range and/or the second range is adjustable. In some embodiments, the first range is specific to the service device, the service device type, and/or the location of the service device. In some embodiments, the first range differs between service devices, between different types of service devices, and/or between service devices in different locations. In some embodiments, the at least one controller is configured to determine, or direct determination of, the location of the user. In some embodiments, the at least one controller is configured to determine, or direct determination of, the location of the user by using ultra-wideband radio waves. In some embodiments, the at least one controller is configured to determine, or direct determination of, the location of the user with an accuracy of at least about twenty (20) centimeters. In some embodiments, the at least one controller is configured to provide, or direct provision of, the service device through an application installed in mobile circuitry held by the user. In some embodiments, the at least one controller is configured to provide, or direct provision of, the selection of services through an application installed in mobile circuitry held by the user. In some embodiments, the services of the service device are depicted on the service device. In some embodiments, the at least one controller is configured to facilitate, or direct facilitation of, selection of the service by detecting, or directing detection of, the user pointing the mobile circuitry at a depiction of the service on the service device, the user pointing without contacting the service device.
In some embodiments, the mobile circuitry comprises a cellular telephone, a tablet computer, or a laptop computer. In some embodiments, the at least one controller is configured to facilitate, or direct facilitation of, use of the mobile circuitry to depict a virtual representation of at least a portion of the facility in which the service device is disposed. In some embodiments, the at least one controller is configured to depict, or direct depiction of, a virtual representation of the service device on the mobile circuitry. In some embodiments, the at least one controller is configured to depict, or direct depiction of, a virtual representation of a service provided by the service device on the mobile circuitry. In some embodiments, the at least one controller is configured to depict, or direct depiction of, a virtual representation of a service performed by the service device on the mobile circuitry. In some embodiments, the at least one controller is configured to update, or direct updating of, the virtual representation of the service execution of the service device in real time as the service executes. In some embodiments, one controller is configured to perform at least two of operations (b), (c), (d), and (e). In some embodiments, at least two of operations (b), (c), (d), and (e) are performed by different controllers.
In another aspect, there is provided a method of controlling a facility, the method comprising: (a) identifying, by a control system, an identity of a user; (b) tracking a location of the user in the facility by using one or more sensors disposed in the facility, the one or more sensors being communicatively coupled to the control system; (c) using input relating to the user; and (d) using the control system to automatically control (e.g., change) one or more devices in the facility using the input and the location information of the user, the one or more devices being communicatively coupled to the control system.
In some embodiments, the location is a current location of the user or a past location of the user. In some embodiments, identifying the identity of the user comprises receiving an identification card reading, or performing image recognition on a captured image of the user in the facility. In some embodiments, the one or more sensors comprise a camera or a geolocation sensor. In some embodiments, the geolocation sensor comprises an ultra-wideband sensor. In some embodiments, the geolocation sensor can locate the user with a resolution of at least about twenty (20) centimeters. In some embodiments, the input relating to the user comprises a service request made by, on behalf of, or for the user. In some embodiments, the input relating to the user relates to the user's activity in the peripheral structure in which the user is located. In some embodiments, the input relating to the user comprises an electronic file. In some embodiments, the input relating to the user comprises a gesture and/or a voice command made by the user. In some embodiments, the input relating to the user relates to a preference of the user. In some embodiments, the user's preferences are provided by machine learning that takes into account the user's past activities. In some embodiments, the user's preferences are input by the user. In some embodiments, the one or more devices comprise lighting, a ventilation and air conditioning system, a heating system, a sound system, or an odor conditioning system. In some embodiments, the one or more devices are configured to affect the atmosphere of the peripheral structure in which the user is disposed. In some embodiments, the one or more devices comprise service, office, and/or factory equipment. In some embodiments, the one or more devices are disposed outside of the peripheral structure of the facility in which the user is located. In some embodiments, the one or more devices are disposed in the peripheral structure of the facility in which the user is located. In some embodiments, the one or more devices comprise a media projection device. In some embodiments, the one or more devices comprise a tintable window. In some embodiments, the one or more devices comprise an electrochromic window. Also provided is a non-transitory computer-readable medium for controlling a facility which, when read by one or more processors, causes the one or more processors to perform operations comprising any of the above method operations.
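The method of this aspect can be summarized in a short, hypothetical sketch: identify the user, register a tracked location, and automatically control devices whose coverage includes that location using the user-related input (here, a temperature preference). All names and values below are illustrative assumptions, not the claimed implementation.

```python
import math
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class Device:
    """A controllable device (e.g., an HVAC outlet, lighting, or a tintable window)."""
    name: str
    x: float
    y: float
    coverage_m: float
    setting: Dict[str, float] = field(default_factory=dict)

    def covers(self, location: Optional[Tuple[float, float]]) -> bool:
        if location is None:
            return False
        return math.hypot(self.x - location[0], self.y - location[1]) <= self.coverage_m

    def set_target(self, prefs: Dict[str, float]) -> None:
        self.setting.update(prefs)

@dataclass
class User:
    user_id: str
    location: Optional[Tuple[float, float]] = None
    preferences: Dict[str, float] = field(default_factory=dict)

class FacilityControlSystem:
    """Sketch: identify a user, register the tracked location, and automatically
    adjust the devices whose coverage includes that location."""
    def __init__(self, devices: List[Device]):
        self.devices = devices
        self.users: Dict[str, User] = {}

    def identify_user(self, credential: str) -> User:
        # Stand-in for an ID-card reading or image recognition on a captured image.
        return self.users.setdefault(credential, User(credential))

    def track_location(self, user: User, fix: Tuple[float, float]) -> None:
        # Stand-in for a camera or ultra-wideband geolocation fix.
        user.location = fix

    def apply_user_input(self, user: User) -> None:
        # Automatically control (e.g., change) devices using the input and location.
        for device in self.devices:
            if device.covers(user.location):
                device.set_target(user.preferences)

# Usage: an occupant badges in, is tracked to an office, and nearby HVAC adapts.
hvac = Device("hvac-3F-west", x=4.0, y=2.0, coverage_m=6.0)
system = FacilityControlSystem([hvac])
alice = system.identify_user("badge-1234")
alice.preferences["temperature_c"] = 21.5
system.track_location(alice, (5.0, 3.0))
system.apply_user_input(alice)
print(hvac.setting)  # {'temperature_c': 21.5}
```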
In another aspect, an apparatus for controlling a facility is provided, the apparatus comprising at least one controller having circuitry, the at least one controller configured to: (a) operatively couple to one or more sensors disposed in the facility and to one or more devices disposed in the facility; (b) identify, or direct identification of, the user; (c) track, or direct tracking of, the location of the user in the facility by using the one or more sensors; (d) receive input relating to the user; and (e) automatically control (e.g., change), or direct automatic control (e.g., change) of, the one or more devices in the facility by using the input and the location information of the user.
In some embodiments, the at least one controller is configured to utilize a location of the user, the location being a current location of the user or a past location of the user. In some embodiments, the at least one controller is configured to identify, or direct identification of, the user at least in part by (I) receiving an identification card reading or (II) performing image recognition on a captured image of the user in the facility. In some embodiments, the one or more sensors comprise a camera or a geolocation sensor. In some embodiments, the geolocation sensor comprises an ultra-wideband sensor. In some embodiments, the geolocation sensor can locate the user with a resolution of at least about twenty (20) centimeters. In some embodiments, the input relating to the user comprises a service request made by, on behalf of, or for the user. In some embodiments, the input relating to the user relates to an activity of the user in a peripheral structure of the facility in which the user is located. In some embodiments, the input relating to the user comprises an electronic file. In some embodiments, the input relating to the user comprises a gesture and/or a voice command made by the user. In some embodiments, the input relating to the user relates to a preference of the user. In some embodiments, the user's preferences are provided by a machine learning module that takes into account the user's past activities, the at least one controller being operatively coupled to the machine learning module. In some embodiments, the user's preferences are input by the user. In some embodiments, the one or more devices comprise lighting, a ventilation and air conditioning system, a heating system, a sound system, or an odor conditioning system. In some embodiments, the one or more devices are configured to affect the atmosphere of the peripheral structure of the facility in which the user is disposed. In some embodiments, the one or more devices comprise service, office, and/or factory equipment. In some embodiments, the one or more devices are disposed outside of the peripheral structure of the facility in which the user is located. In some embodiments, the one or more devices are disposed in the peripheral structure of the facility in which the user is located. In some embodiments, the one or more devices comprise a media projection device. In some embodiments, the one or more devices comprise a tintable window. In some embodiments, the one or more devices comprise an electrochromic window. Also provided is a non-transitory computer-readable medium for controlling a facility which, when read by one or more processors, causes the one or more processors to perform operations comprising the operations of any of the above controllers.
In another aspect, the present disclosure provides a system, device (e.g., controller) and/or non-transitory computer-readable medium (e.g., software) that implements any of the methods disclosed herein.
In another aspect, the present disclosure provides methods of using (e.g., for its intended purpose) any of the systems and/or apparatuses disclosed herein.
In another aspect, an apparatus comprises at least one controller programmed to direct a mechanism for carrying out (e.g., implementing) any of the methods disclosed herein, wherein the at least one controller is operatively coupled to the mechanism.
In another aspect, an apparatus includes at least one controller configured (e.g., programmed) to implement (e.g., realize) the methods disclosed herein. The at least one controller may implement any of the methods disclosed herein.
In another aspect, a system comprises: at least one controller programmed to direct operation of at least one other device (or component thereof); and the apparatus (or components thereof), wherein the at least one controller is operatively coupled to the apparatus (or components thereof). The device (or components thereof) may include any of the devices (or components thereof) disclosed herein. The at least one controller may direct any of the devices (or components thereof) disclosed herein.
In another aspect, a computer software product comprises a non-transitory computer-readable medium having program instructions stored therein, which when read by a computer, cause the computer to direct a mechanism disclosed herein to perform (e.g., implement) any method disclosed herein, wherein the non-transitory computer-readable medium is operatively coupled to the mechanism. The mechanism may comprise any of the devices (or any component thereof) disclosed herein.
In another aspect, the present disclosure provides a non-transitory computer-readable medium comprising machine executable code that, when executed by one or more computer processors, implements any of the methods disclosed herein.
In another aspect, the present disclosure provides a non-transitory computer-readable medium comprising machine-executable code that, when executed by one or more computer processors, effectuates direction of a controller (e.g., as disclosed herein).
In another aspect, the present disclosure provides a computer system comprising one or more computer processors and a non-transitory computer-readable medium coupled thereto. The non-transitory computer-readable medium comprises machine-executable code that, when executed by the one or more computer processors, implements any of the methods disclosed herein and/or effectuates direction of a controller as disclosed herein.
The contents of this summary section are provided as a simplified introduction to the disclosure and are not intended to limit the scope of any invention disclosed herein or the scope of the appended claims.
Other aspects and advantages of the present disclosure will become apparent to those skilled in the art from the following detailed description, wherein only illustrative embodiments of the present disclosure are shown and described. As will be realized, the disclosure is capable of other and different embodiments and its several details are capable of modifications in various obvious respects, all without departing from the disclosure. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
These and other features and embodiments will be described in more detail below with reference to the following drawings.
Incorporation by reference
All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.
Drawings
The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings or figures (also referred to herein as "figures"), wherein:
FIG. 1 shows a perspective view of a peripheral structure (e.g., a building) and a control system;
FIG. 2 schematically depicts a processing system;
FIG. 3 shows a block diagram of an exemplary Master Controller (MC);
FIG. 4 illustrates a block diagram of an example Network Controller (NC);
FIG. 5 illustrates an exemplary controller;
FIG. 6 illustrates a device including a sensor ensemble and its components and connectivity options;
FIG. 7 is a block diagram illustrating exemplary modules that may be used to implement speech control;
FIG. 8 is a flow chart of a control method;
FIG. 9 is a flow chart of a control method;
FIG. 10 is a flow chart of a control method;
FIG. 11A illustrates a user interacting with a wall device, and FIG. 11B illustrates a configuration of components that may be used to implement certain control methods described herein;
FIGS. 12A-12C illustrate various configurations of components that may be used to implement certain control methods described herein;
FIGS. 13A and 13B illustrate various window and display configurations;
FIG. 14 schematically illustrates a display construction assembly;
FIG. 15 depicts a peripheral structure communicatively coupled to a digital twin representation thereof;
FIG. 16 is a flow chart of a control method;
FIG. 17 depicts user interaction with a digital twin to control a target;
FIG. 18 is a schematic diagram of a message diagram relating to communications between system components;
FIG. 19 is a flow chart of a control method;
FIG. 20 depicts a peripheral structure communicatively coupled to a digital twin representation thereof;
FIG. 21 schematically illustrates an electrochromic device;
FIG. 22 shows a cross-sectional view of an exemplary electrochromic window;
FIG. 23 shows a voltage curve over time;
FIG. 24 schematically illustrates a building and a network; and
FIG. 25 shows a flowchart of a control method.
The drawings and the components therein may not be drawn to scale.
Detailed Description
While various embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed.
Terms such as "a," "an," and "the" are not intended to refer to only a singular entity, but include the general class of which a particular example may be used for illustration. The terminology herein is used to describe specific embodiments of the invention, but its usage does not limit the invention.
When referring to ranges, the ranges are meant to include the endpoints unless otherwise indicated. For example, a range between a value of 1 and a value of 2 is meant to be inclusive and includes both the value 1 and the value 2. An inclusive range will span any value from about the value 1 to about the value 2. As used herein, the term "adjacent" or "adjacent to" includes "immediately adjacent", "abutting", "contacting", and "proximate".
The terms "operatively coupled" or "operatively connected" refer to a first element (e.g., a mechanism) that is coupled (e.g., connected) to a second element to allow for the intended operation of the second element and/or the first element. Coupling may include physical or non-physical coupling. The non-physical coupling may include signal inductive coupling (e.g., wireless coupling). Coupling may include physical coupling (e.g., a physical connection) or non-physical coupling (e.g., via wireless communication). In addition, in the following description, the phrases "operable," "adapted," "configured," "designed," "programmed," or "capable" may be used interchangeably where appropriate.
An element (e.g., a mechanism) that is "configured to" perform a function includes a structural feature that causes the element to perform the function. The structural features may include electrical features such as circuitry or circuit elements. The structural features may include circuitry (e.g., including electrical or optical circuitry). The electrical circuitry may include one or more wires. The optical circuitry may include at least one optical element (e.g., a beam splitter, a mirror, a lens, and/or an optical fiber). The structural features may include mechanical features. The mechanical features may include latches, springs, closures, hinges, chassis, supports, fasteners, or cantilevers, etc. Performing the function may include utilizing the logic feature. The logic features may include programming instructions. The programming instructions may be executable by at least one processor. The programming instructions may be stored or encoded on a medium accessible by one or more processors.
The following detailed description is directed to specific example embodiments of the disclosed subject matter. Although the disclosed embodiments are described in sufficient detail to enable one of ordinary skill in the art to practice the disclosed subject matter, the disclosure is not limited to the specific features of the specific example embodiments described herein. Rather, the concepts and teachings disclosed herein can be embodied and applied in many different forms and manners without departing from their spirit and scope. For example, while the disclosed embodiments focus on electrochromic windows (also referred to as smart windows), some of the systems, devices, and methods disclosed herein can be made, applied, or used without undue experimentation with other types of actively switched/controlled optically switchable devices, as opposed to passive coatings, such as thermochromic or photochromic coatings, that tint passively in response to sunlight. Some other types of actively controlled optically switchable devices include liquid crystal devices, suspended particle devices, micro-shutters, and the like. For example, some or all of these other optically switchable devices may be powered, driven, or otherwise controlled by, or integrated with, one or more of the disclosed controller embodiments described herein.
In some embodiments, the peripheral structure includes a region defined by at least one structure (e.g., a fixation device). The at least one structure may include at least one wall. The peripheral structure may include and/or surround one or more sub-peripheral structures. The at least one wall may comprise metal (e.g., steel), clay, stone, plastic, glass, stucco (e.g., gypsum), polymer (e.g., polyurethane, styrene, or vinyl), asbestos, fiberglass, concrete (e.g., reinforced concrete), wood, paper, or ceramic. The at least one wall may include wires, bricks, blocks (e.g., cinder blocks), tiles, drywall, or framing (e.g., steel and/or wood frames).
In some embodiments, the peripheral structure includes one or more openings. The one or more openings may be reversibly closable. The one or more openings may be permanently open. A fundamental length scale of the one or more openings may be small relative to a fundamental length scale of the walls defining the peripheral structure. The fundamental length scale may include a diameter of a bounding circle, a length, a width, or a height. The fundamental length scale may be abbreviated herein as "FLS". The surface of the one or more openings may be small relative to the surface of the wall defining the peripheral structure. The opening surface may be a certain percentage of the total surface of the wall. For example, the opening surface may measure about 30%, 20%, 10%, 5%, or 1% of the wall. The wall may comprise a floor, a ceiling, or a side wall. The closable opening may be closed by at least one window or door. The peripheral structure may be at least a portion of a facility. The facility may comprise a building. The peripheral structure may comprise at least a portion of a building. The building may be a private building and/or a commercial building. A building may include one or more floors. The building (e.g., a floor thereof) may include at least one of: a room, a hallway, a rooftop, a basement, a balcony (e.g., an interior or exterior balcony), a stairwell, an aisle, an elevator shaft, a facade, a mezzanine, an attic, a garage, a porch (e.g., an enclosed porch), a terrace (e.g., an enclosed terrace), a cafeteria, and/or a duct. In some embodiments, the peripheral structure may be stationary and/or movable (e.g., a train, an airplane, a ship, a vehicle, or a rocket).
In some embodiments, the peripheral structure surrounds the atmosphere. The atmosphere may include one or more gases. The gas may include an inert gas (e.g., argon or nitrogen) and/or a non-inert gas (e.g., oxygen or carbon dioxide). The peripheral structure atmosphere may be similar to the atmosphere outside the peripheral structure (e.g., ambient atmosphere) in at least one external atmospheric feature, including: temperature, relative gas content, gas type (e.g., humidity and/or oxygen content), debris (e.g., dust and/or pollen), and/or gas velocity. The peripheral structure atmosphere may differ from the atmosphere outside the peripheral structure in at least one external atmospheric feature, the at least one external atmospheric feature comprising: temperature, relative gas content, gas type (e.g., humidity and/or oxygen content), debris (e.g., dust and/or pollen), and/or gas velocity. For example, the peripheral structural atmosphere may be less humid (e.g., drier) than the external (e.g., ambient) atmosphere.
For example, the peripheral structure atmosphere may contain the same (e.g., or substantially similar) oxygen-nitrogen ratio as the atmosphere outside the peripheral structure. The velocity and/or content of the gas in the peripheral structure may be (e.g., substantially) similar throughout the peripheral structure. The velocity and/or content of the gas in the peripheral structure may be different in different portions of the peripheral structure (e.g., by flowing the gas through a vent coupled to the peripheral structure). The gas content may include a relative gas ratio.
In some embodiments, the network infrastructure is provided in a peripheral structure (e.g., a facility such as a building). The network infrastructure may be used for various purposes, such as for providing communication and/or power services. The communication services may include high bandwidth (e.g., wireless and/or wired) communication services. The communication service may be available to occupants of the facility and/or users outside of the facility (e.g., building). The network infrastructure may operate in conjunction with or as a partial replacement for the infrastructure of one or more cellular carriers. The network may include one or more levels of encryption. The network may be communicatively coupled to the cloud and/or one or more servers external to the facility. The network may support at least fourth generation wireless (4G) or fifth generation wireless (5G) communications. The network may support cellular signals external and/or internal to the facility. The downlink communication network speed may have a peak data rate of at least about 5 gigabits per second (Gb/s), 10Gb/s, or 20 Gb/s. The uplink communication network speed may have a peak data rate of at least about 2Gb/s, 5Gb/s, or 10 Gb/s. The network infrastructure may be provided in a facility comprising electrically switchable windows. Examples of components of the network infrastructure include high-speed backhaul. The network infrastructure may include at least one cable, switch, antenna (e.g., physical), transceiver, sensor, transmitter, receiver, radio, processor, and/or controller (which may include a processor). The network infrastructure may be operatively coupled to and/or include a wireless network. The network infrastructure may include cabling (e.g., including fiber optic, stranded cable, or coaxial cable). One or more devices (e.g., sensors and/or transmitters) may be deployed (e.g., installed) in an environment, for example, as part of and/or after installation of a network. These devices may be communicatively coupled to a network. The network may include a power network and/or a communication network. A device may discover itself on a network, e.g., once it is coupled to the network (e.g., when it attempts to couple to the network). The network structure may comprise a peer-to-peer network structure or a client-server network structure. The network may or may not have a central coordination entity (e.g., a server or another stable host).
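As a minimal, hypothetical sketch of the self-discovery behavior described above (a device announcing itself once it is coupled to the network), consider the following registry model. A real deployment might instead use an mDNS/SSDP broadcast or a BACnet I-Am announcement; all identifiers below are assumptions for illustration.

```python
from typing import Dict, List

class FacilityNetwork:
    """Hypothetical sketch: devices discover themselves on the network by
    announcing their identity and capabilities once coupled to it."""

    def __init__(self) -> None:
        self.registry: Dict[str, List[str]] = {}

    def announce(self, device_id: str, capabilities: List[str]) -> None:
        # Stand-in for a discovery broadcast (e.g., mDNS/SSDP or a BACnet I-Am).
        self.registry[device_id] = capabilities

network = FacilityNetwork()
network.announce("sensor-ensemble-12", ["temperature", "co2", "occupancy"])
network.announce("igu-7F-201", ["tint"])
print(sorted(network.registry))  # ['igu-7F-201', 'sensor-ensemble-12']
```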
In some embodiments, a Building Management System (BMS) is a computer-based control system. A BMS can be installed in a facility to monitor and otherwise control (e.g., regulate, manipulate, limit, direct, monitor, adjust, modulate, change, alter, inhibit, check, direct, or manage) the facility. For example, the BMS may control one or more devices communicatively coupled to the network. The one or more devices may include mechanical and/or electrical equipment, such as ventilation, lighting, electrical systems, elevators, fire protection systems, and/or security systems. The controller (e.g., node and/or processor) may be adapted to be integrated with the BMS. The BMS may include hardware. The hardware may include an interconnection with one or more processors (e.g., and associated software) over a communication channel, for example, for maintaining one or more conditions in the facility. The one or more conditions in the facility may be based on preferences set by a user (e.g., an occupant, a facility owner, and/or a building manager). For example, the BMS may be implemented using a local area network such as Ethernet. The software may utilize, for example, an internet protocol and/or an open standard. One example is software from Tridium, Inc. (Richmond, Virginia). One communication protocol commonly used with a BMS is BACnet (building automation and control network). A node may be any addressable circuit. For example, a node may be a circuit having an Internet Protocol (IP) address.
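BACnet models a facility as addressable nodes exposing typed objects. The toy model below, an in-memory stand-in rather than a real BACnet/IP stack, illustrates the addressing style and a simple monitor-and-control interaction of the kind a BMS performs; the IP address, object types, and point instances are assumptions for the example.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass(frozen=True)
class BACnetObjectRef:
    """BACnet-style addressing: a node (an addressable circuit with an IP
    address) exposes typed objects identified by (object_type, instance)."""
    node_ip: str
    object_type: str   # e.g., "analogInput", "binaryOutput"
    instance: int

class BMSNode:
    """Toy in-memory node; a real BMS would speak BACnet/IP on the wire."""
    def __init__(self, ip: str):
        self.ip = ip
        self._points: Dict[Tuple[str, int], float] = {}

    def write(self, ref: BACnetObjectRef, value: float) -> None:
        self._points[(ref.object_type, ref.instance)] = value

    def read(self, ref: BACnetObjectRef) -> float:
        return self._points[(ref.object_type, ref.instance)]

# Usage: the BMS polls a temperature point and commands ventilation.
node = BMSNode("10.0.4.21")
temp = BACnetObjectRef("10.0.4.21", "analogInput", 1)
fan = BACnetObjectRef("10.0.4.21", "binaryOutput", 3)
node.write(temp, 26.5)                                     # sensor reports 26.5 degC
node.write(fan, 1.0 if node.read(temp) > 24.0 else 0.0)    # turn the blower on
print(node.read(fan))  # 1.0
```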
In some embodiments, the BMS may be implemented in a facility (e.g., a multi-story building). The BMS may (also) be used to control one or more characteristics of the environment in the facility. The one or more characteristics may include: temperature, carbon dioxide levels, airflow, various volatile organic compounds, and/or humidity in the building. There may be mechanical devices controlled by the BMS, such as one or more heaters, air conditioners, blowers, and/or vents. The BMS may turn these various devices on and/or off under defined conditions in order to control the facility environment. A core function of the BMS may be to maintain a comfortable environment for the occupants, e.g., while minimizing heating and cooling costs and/or demand. The BMS may be used to control one or more of a variety of systems, and can be used to optimize synergy between the various systems. For example, the BMS may be used to conserve energy and reduce building operation costs.
In some embodiments, the facility comprises a multi-story building. The multi-story building may have at least 2, 8, 10, 25, 50, 80, 100, 120, 140, or 160 floors controlled by the control system and/or including the network infrastructure. The number of floors controlled by the control system and/or including the network infrastructure may be any number between the aforementioned numbers (e.g., 2 to 50, 25 to 100, or 80 to 160). A floor may have an area of at least about 150 m², 250 m², 500 m², 1000 m², 1500 m², or 2000 square meters (m²). A floor may have an area between any of the aforementioned floor area values (e.g., from about 150 m² to about 2000 m², from about 150 m² to about 500 m², from about 250 m² to about 1000 m², or from about 1000 m² to about 2000 m²).
In some embodiments, the window controller is integrated with the BMS. For example, a window controller may be configured to control one or more tintable windows (e.g., electrochromic windows). In one embodiment, the one or more electrochromic windows include at least one all-solid-state and inorganic electrochromic device, but may include more than one electrochromic device, for example where each lite or pane of an IGU is tintable. In one embodiment, the one or more electrochromic windows include only all-solid-state and inorganic electrochromic devices. In one embodiment, the electrochromic windows are multi-state electrochromic windows. Examples of tintable windows can be found in U.S. patent application Ser. No. 12/851,514, filed August 5, 2010, and entitled "Multipane Electrochromic Windows," which is incorporated by reference herein in its entirety.
In some embodiments, one or more devices, such as sensors, emitters, and/or actuators, are operatively coupled to at least one controller and/or processor. Sensor readings may be obtained by one or more processors and/or controllers. A controller may include a processing unit (e.g., a CPU or GPU). A controller may receive input (e.g., from at least one device or projected media). The controller may include circuitry, electrical wiring, optical wiring, an electrical socket, and/or an electrical outlet. The controller may receive input and/or deliver output. The controller may include a plurality of (e.g., sub-) controllers. Operations (e.g., as disclosed herein) may be performed by a single controller or by multiple controllers. At least two operations may each be performed by different controllers. At least two operations may be performed by the same controller. Devices and/or media may be controlled by a single controller or by multiple controllers. At least two devices and/or media may be controlled by different controllers. At least two devices and/or media may be controlled by the same controller. The controller may be part of a control system. The control system may include a master controller, a floor controller (e.g., including a network controller), and/or a local controller. The local controller may be a target controller. For example, the local controller may be a window controller (e.g., controlling an optically switchable window), a peripheral structure controller, or a component controller. The controller may be part of a hierarchical control system. The hierarchical control system may include a master controller that directs one or more controllers, such as floor controllers, local controllers (e.g., window controllers), peripheral structure controllers, and/or component controllers. The target may comprise a device or a medium. The device may include an electrochromic window, a sensor, a transmitter, an antenna, a receiver, a transceiver, or an actuator.
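The hierarchy described above, a master controller directing floor controllers, which in turn direct local controllers such as window controllers, can be sketched as follows. This is an illustrative skeleton under assumed names, not the disclosed control system itself.

```python
from typing import Dict, List

class LocalController:
    """Local (e.g., window) controller: directly drives one target device."""
    def __init__(self, target_id: str):
        self.target_id = target_id
        self.state: Dict[str, int] = {}

    def apply(self, command: Dict[str, int]) -> None:
        self.state = dict(command)   # e.g., drive an IGU to a tint level

class FloorController:
    """Floor (e.g., network) controller: fans commands out to its local controllers."""
    def __init__(self, local_controllers: List[LocalController]):
        self.local_controllers = local_controllers

    def apply(self, command: Dict[str, int]) -> None:
        for lc in self.local_controllers:
            lc.apply(command)

class MasterController:
    """Master controller: issues high-level instructions to the floor controllers."""
    def __init__(self, floor_controllers: List[FloorController]):
        self.floor_controllers = floor_controllers

    def broadcast(self, command: Dict[str, int]) -> None:
        for fc in self.floor_controllers:
            fc.apply(command)

# Usage: one high-level instruction propagates down the hierarchy.
windows = [LocalController(f"igu-3F-{i}") for i in range(3)]
floor3 = FloorController(windows)
mc = MasterController([floor3])
mc.broadcast({"tint_level": 4})
print(windows[0].state)  # {'tint_level': 4}
```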
In some embodiments, the network infrastructure is operatively coupled to one or more controllers. In some embodiments, the physical location of controller types in a hierarchical control system may change. The controller may control one or more devices (e.g., it may be directly coupled to the devices). The controller may be located in proximity to the one or more devices it controls. For example, the controller may control an optically switchable device (e.g., an IGU), an antenna, a sensor, and/or an output device (e.g., a light source, a sound source, an odor source, a gas source, an HVAC outlet, or a heater). In one embodiment, the floor controller may direct one or more window controllers, one or more peripheral structure controllers, one or more component controllers, or any combination thereof. The floor controller may comprise a network controller. For example, a floor (e.g., network) controller may control a plurality of local (e.g., window) controllers. A plurality of local controllers may be disposed in a portion of a facility (e.g., in a portion of a building). A portion of the facility may be a floor of the facility. For example, a floor controller may be assigned to a floor. In some embodiments, a floor may include multiple floor controllers, for example, depending on the size of the floor and/or the number of local controllers coupled to each floor controller. For example, a floor controller may be assigned to a portion of a floor. For example, a floor controller may be assigned to a portion of the local controllers disposed in the facility. For example, a floor controller may be assigned to a portion of a floor of the facility. The master controller may be coupled to one or more floor controllers. The floor controllers may be disposed in the facility. The master controller may be located within the facility or outside the facility. The master controller may be disposed in the cloud. The controller may be part of, or operatively coupled to, a building management system. The controller may receive one or more inputs. The controller may generate one or more outputs. The controller may be a single-input single-output (SISO) controller or a multiple-input multiple-output (MIMO) controller. The controller may interpret received input signals. The controller may acquire data from one or more components (e.g., sensors). Acquiring may comprise receiving or extracting. The data may include measurements, estimates, determinations, generations, or any combination thereof. The controller may include feedback control. The controller may include feed-forward control. The control may include on-off control, proportional-integral (PI) control, or proportional-integral-derivative (PID) control. The control may include open-loop control or closed-loop control. The controller may include closed-loop control. The controller may include open-loop control. The controller may include a user interface. The user interface may include (or be operatively coupled to) a keyboard, a keypad, a mouse, a touch screen, a microphone, a speech recognition package, a camera, an imaging system, or any combination thereof. Outputs may include a display (e.g., a screen), a speaker, or a printer. In some implementations, a local controller controls one or more devices and/or media (e.g., media projection). For example, the local controller may control one or more IGUs, one or more sensors, one or more output devices (e.g., one or more transmitters), one or more media, or any combination thereof.
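As a concrete illustration of the closed-loop control options named above, the following sketch implements a minimal discrete proportional-integral-derivative (PID) loop. The gains, setpoint, and toy plant response are assumptions chosen only to make the example self-contained; a deployed controller would be tuned to the actual device it drives.

```python
class PIDController:
    """Minimal discrete PID loop of the kind named above; gains are illustrative."""
    def __init__(self, kp: float, ki: float, kd: float, setpoint: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self._integral = 0.0
        self._prev_error = None

    def update(self, measurement: float, dt: float) -> float:
        error = self.setpoint - measurement
        self._integral += error * dt
        derivative = 0.0 if self._prev_error is None else (error - self._prev_error) / dt
        self._prev_error = error
        return self.kp * error + self.ki * self._integral + self.kd * derivative

# Usage: closed-loop temperature control toward a 21.5 degC setpoint.
pid = PIDController(kp=2.0, ki=0.1, kd=0.5, setpoint=21.5)
temperature = 18.0
for _ in range(5):
    heat_output = pid.update(temperature, dt=60.0)   # controller output
    temperature += 0.01 * heat_output                # toy plant response
print(round(temperature, 2))  # converges toward the setpoint over iterations
```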
In some embodiments, the BMS includes a multi-purpose controller. By incorporating feedback (e.g., from the controller), the BMS can provide, for example, enhanced: 1) environmental control, 2) energy savings, 3) security, 4) flexibility in control options, 5) reliability and usable life of other systems (e.g., due to reduced reliance on them and/or reduced maintenance), 6) information availability and/or diagnostics, and 7) productivity of personnel (e.g., employees) in the building, as well as various combinations thereof. These enhancements may enable the devices to be controlled automatically. In some embodiments, a BMS may not be present. In some embodiments, a BMS may be present without communicating with a master network controller. In some embodiments, the BMS may communicate with a portion of the levels in the hierarchy of controllers. For example, the BMS may communicate (e.g., at a high level) with the master network controller. In some embodiments, the BMS may not communicate with a portion of the levels in the hierarchy of controllers of the control system. For example, the BMS may not communicate with the local controllers and/or intermediate controllers. In certain embodiments, maintenance of the BMS does not disrupt control of the devices communicatively coupled to the control system. In some embodiments, the BMS includes at least one controller that may or may not be part of a hierarchical control system.
FIG. 1 shows an example of a control system architecture 100 disposed at least partially in a peripheral structure (e.g., a building) 150. The control system architecture 100 includes a master controller 108 that controls floor controllers 106, which in turn control local controllers 104. In the example shown in FIG. 1, the master controller 108 is operatively coupled (e.g., wirelessly and/or by wire) to a building management system (BMS) 124 and to a database 120. The arrows in FIG. 1 represent communication paths. The controllers may be operatively coupled (e.g., directly or indirectly, and/or by wire or wirelessly) to an external source 110. The master controller 108 may control floor controllers, including network controllers 106, which may in turn control local controllers such as window controllers 104. The floor controller 106 may also include a Network Controller (NC). In some embodiments, a local controller (e.g., 104) controls one or more targets, such as an IGU 102, one or more sensors, one or more output devices (e.g., one or more transmitters), media, or any combination thereof. The external source may comprise a network. The external source may include one or more sensors or output devices. The external source may include a cloud-based application and/or a database. The communication may be wired and/or wireless. The external source may be located outside the facility. For example, the external source may include one or more sensors and/or antennas disposed, for example, on a wall or ceiling of the facility. The communication may be unidirectional or bidirectional. In the example shown in FIG. 1, all communication arrows are meant to be bidirectional (e.g., 118, 122, 114, and 112).
The methods, systems, and/or devices described herein may include a control system. The control system may be in communication with any of the devices (e.g., sensors) described herein. The sensors may be of the same type or of different types, e.g., as described herein. For example, the control system may be in communication with the first sensor and/or the second sensor. A plurality of devices (e.g., sensors and/or emitters) may be disposed in a container and may constitute an aggregate (e.g., a digital building element). The aggregate may comprise at least two devices of the same type. The aggregate may comprise at least two devices of different types. The devices in the aggregate may be operatively coupled to the same circuit board. The circuit board may include circuitry. The circuit board may include, or be operatively coupled to, a controller (e.g., a local controller). The control system may control one or more devices (e.g., sensors). The control system may control one or more components of a building management system (e.g., lighting, security, and/or air conditioning systems). The controller may adjust at least one (e.g., environmental) characteristic of the peripheral structure. The control system may use any component of the building management system to regulate the peripheral structure environment. For example, the control system may regulate the energy supplied by a heating element and/or by a cooling element. For example, the control system may regulate the velocity of air flowing into and/or out of the peripheral structure through a vent. The control system may include a processor. The processor may be a processing unit. The controller may comprise a processing unit. The processing unit may be central. The processing unit may include a central processing unit (abbreviated herein as "CPU"). The processing unit may be a graphics processing unit (abbreviated herein as "GPU"). A controller or control mechanism (e.g., comprising a computer system) may be programmed to implement one or more methods of the present disclosure. A processor may be programmed to implement the methods of the present disclosure. The controller may control at least one component of the systems and/or apparatuses disclosed herein. Examples of digital building elements can be found in PCT patent application Ser. No. PCT/US20/70123, which is incorporated herein by reference in its entirety.
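A sensor aggregate (e.g., a digital building element) of the kind described above can be modeled, purely illustratively, as a board identifier plus a list of devices of the same and/or different types; the field names below are assumptions made for the example.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class SensorDevice:
    kind: str                        # e.g., "temperature", "co2", "occupancy"
    reading: Optional[float] = None

@dataclass
class SensorEnsemble:
    """Sketch of an aggregate (digital building element): several devices of the
    same and/or different types sharing one circuit board and local controller."""
    board_id: str
    devices: List[SensorDevice] = field(default_factory=list)

    def readings_by_kind(self) -> Dict[str, List[float]]:
        out: Dict[str, List[float]] = {}
        for d in self.devices:
            if d.reading is not None:
                out.setdefault(d.kind, []).append(d.reading)
        return out

ensemble = SensorEnsemble("board-7", [
    SensorDevice("temperature", 22.1),
    SensorDevice("temperature", 22.4),   # two devices of the same type
    SensorDevice("co2", 640.0),          # and one of a different type
])
print(ensemble.readings_by_kind())
# {'temperature': [22.1, 22.4], 'co2': [640.0]}
```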
Fig. 2 shows an illustrative example of a computer system 200 programmed or otherwise configured to perform one or more operations of any one of the methods provided herein. The computer system may control (e.g., direct, monitor, and/or regulate) various features of the methods, apparatus, and systems of the present disclosure, such as controlling heating, cooling, lighting, and/or ventilation of peripheral structures, or any combination thereof. The computer system may be part of or in communication with any of the sensors or sensor assemblies disclosed herein. A computer may be coupled to one or more of the mechanisms disclosed herein and/or any portion thereof. For example, the computer may be coupled to one or more sensors, valves, switches, lights, windows (e.g., IGU), motors, pumps, optical components, or any combination thereof.
The computer system may include a processing unit (e.g., 206) (also referred to herein as a "processor", a "computer", or a "computer processor"). The computer system may include memory or memory locations (e.g., 202) (e.g., random-access memory, read-only memory, or flash memory), an electronic storage unit (e.g., 204) (e.g., a hard disk), a communication interface (e.g., 203) (e.g., a network adapter) for communicating with one or more other systems, and peripheral devices (e.g., 205), such as a cache, other memory, data storage, and/or an electronic display adapter. In the example shown in FIG. 2, the memory 202, the storage unit 204, the interface 203, and the peripheral devices 205 communicate with the processing unit 206 through a communication bus (solid lines), such as a motherboard. The storage unit may be a data storage unit (or data repository) for storing data. The computer system may be operatively coupled to a computer network ("network") (e.g., 201) with the aid of the communication interface. The network may be the internet, an internet and/or extranet, or an intranet and/or extranet that is in communication with the internet. In some cases, the network is a telecommunications and/or data network. The network may include one or more computer servers, which may enable distributed computing, such as cloud computing. In some cases, with the aid of the computer system, the network may implement a peer-to-peer network, which may enable devices coupled to the computer system to act as clients or servers.
The processing unit may execute a sequence of machine-readable instructions, which may be embodied in a program or software. The instructions may be stored in a memory location, such as the memory 202. The instructions may be directed to the processing unit, which may subsequently program or otherwise configure the processing unit to implement methods of the present disclosure. Examples of operations performed by the processing unit may include fetch, decode, execute, and write-back. The processing unit may interpret and/or execute the instructions. The processor may include a microprocessor, a data processor, a central processing unit (CPU), a graphics processing unit (GPU), a system on a chip (SOC), a coprocessor, a network processor, an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a controller, a programmable logic device (PLD), a chipset, a field-programmable gate array (FPGA), or any combination thereof. The processing unit may be part of a circuit, such as an integrated circuit. One or more other components of the system 200 may be included in the circuit.
The storage unit may store files such as drivers, libraries, and saved programs. The storage unit may store user data (e.g., user preferences and user programs). In some cases, the computer system may include one or more additional data storage units located external to the computer system, such as on a remote server in communication with the computer system via an intranet or the internet.
The computer system may communicate with one or more remote computer systems over a network. For example, the computer system may communicate with a remote computer system of a user (e.g., an operator). Examples of remote computer systems include a personal computer (e.g., a laptop PC), a tablet PC (e.g., an iPad or a Galaxy Tab), a telephone, a smartphone (e.g., an iPhone, an Android-enabled device, or a Blackberry), or a personal digital assistant. A user (e.g., a client) may access the computer system via the network.
The methods described herein may be implemented by machine (e.g., computer processor) executable code stored on an electronic storage location of a computer system, such as memory 202 or electronic storage unit 204. The machine executable or machine readable code may be provided in the form of software. During use, the processor 206 may execute code. In some cases, code may be retrieved from a storage unit and stored on a memory for ready access by a processor. In some cases, the electronic storage unit may be eliminated, and the machine-executable instructions stored on the memory.
The code may be pre-compiled and configured for use with a machine adapted to execute the code, or may be compiled at runtime. The code may be provided in a programming language selected to enable the code to be executed in a pre-compiled or as-compiled form. In some embodiments, the processor includes code. The code may be program instructions. The program instructions may cause at least one processor (e.g., a computer) to direct a feed-forward and/or feedback control loop. In some embodiments, the program instructions cause the at least one processor to direct a closed-loop and/or open-loop control scheme. The control may be based at least in part on one or more sensor readings (e.g., sensor data). One controller may direct a plurality of operations. At least two operations may be directed by different controllers. In some embodiments, different controllers may direct at least two of operations (a), (b), and (c). In some embodiments, a non-transitory computer-readable medium causes each of different computers to direct at least two of operations (a), (b), and (c). In some embodiments, different non-transitory computer-readable media cause each of different computers to direct at least two of operations (a), (b), and (c). The controller and/or computer-readable medium may direct any of the devices disclosed herein, or components thereof. The controller and/or computer-readable medium may direct any of the operations of the methods disclosed herein. The controller may be operatively (e.g., communicatively) coupled to control logic (e.g., code embedded in software) in which its operations are embodied.
In some embodiments, the optically switchable windows form or occupy a substantial portion of the exterior walls of a building. For example, optically switchable windows may form a large portion of the walls, facades, and even roofs of corporate office buildings and other commercial or residential buildings. A distributed network of controllers may be used to control the optically switchable windows. For example, the network system may operate to control multiple IGUs. One of the primary functions of the network system is to control the optical state of an electrochromic device (ECD) or other optically switchable device within an IGU. In some implementations, one or more windows may be multi-zone windows, for example, where each window includes two or more independently controllable ECDs or zones. In some embodiments, network system 300 (Fig. 3) operates to control the electrical characteristics of the power signals provided to the IGUs. For example, the network system may generate and transmit tinting instructions (also referred to herein as "tone commands") to control the voltage applied to the ECDs within the IGUs.
In some embodiments, another function of the network system is to obtain status information from the IGU (hereinafter "information" is used interchangeably with "data"). For example, the state information for a given IGU may include an identification of, or information about, the current hue state of an ECD within the IGU. Network systems are also used to acquire data from various sensors, whether integrated on or within the IGU or located at various other locations in, on or around the building, such as temperature sensors, photoelectric sensors (also referred to herein as light sensors), humidity sensors, airflow sensors, or occupancy sensors.
The network system may include any suitable number of distributed controllers having various capabilities or functions. In some implementations, the hierarchy defines the functionality and arrangement of the various controllers. For example, a network system may include a plurality of distributed Window Controllers (WCs), a plurality of Network Controllers (NCs), and a Master Controller (MC). The network controller may be included in a floor controller. In some implementations, the MC may communicate with and control tens or hundreds of NCs. In various embodiments, the MC issues high-level instructions to the NC over one or more wired and/or wireless links. The instructions may include, for example, tone commands for causing transitions in optical states of IGUs controlled by the respective NCs. Each NC, in turn, may communicate with and control multiple WCs via one or more wired and/or wireless links. For example, each NC may control tens or hundreds of WCs. Each WC, in turn, may communicate with, drive, or otherwise control one or more respective IGUs over one or more wired and/or wireless links.
In some embodiments, the MC issues a communication including a tone command, a status request command, a data (e.g., sensor data) request command, or other instructions. The MC may periodically issue such communications at certain predefined times of day (which may vary based at least in part on the day of the week or the day of the year) or based at least in part on the detection of a particular event, condition, or combination of events or conditions (e.g., as determined by acquired sensor data, or based at least in part on receipt of a request initiated by a user or by an application, or a combination of such sensor data and such a request). In some embodiments, when the MC determines to cause a tint state change in a set of one or more IGUs, the MC generates or selects a tone value corresponding to the desired tint state. In some embodiments, the set of IGUs is associated with a first protocol identifier (ID) (e.g., a BACnet ID). The MC then generates and sends a communication, referred to herein as a "primary tone command," that includes the tone value and the first protocol ID, communicated over the link via a first communication protocol (e.g., a BACnet-compatible protocol). The MC may address the primary tone command to the particular NC that controls the particular one or more WCs that in turn control the set of IGUs to be transitioned.
In some embodiments, the NC receives a primary tone command that includes a tone value and a first protocol ID, and maps the first protocol ID to one or more second protocol IDs. Each second protocol ID may identify a corresponding one of the WCs. The NC then sends a secondary tone command including the tone value over the link to each identified WC via a second communication protocol. For example, each WC receiving a secondary tone command may then select a voltage or current profile from internal memory, based at least in part on the tone value, to drive its respectively connected IGUs to a tint state consistent with the tone value. For example, each WC may then generate and provide voltage or current signals over the link to its respectively connected IGUs to apply the voltage or current profile.
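For illustration only, the command flow just described may be summarized in a Python sketch. The dictionaries stand in for real BACnet and CANopen messages, and all IDs, field names, and function names are assumptions made for this sketch.

    # NC-side lookup: first protocol ID (e.g., a BACnet ID) -> second protocol
    # IDs (e.g., CAN IDs), each identifying one WC. Values are hypothetical.
    PROTOCOL_ID_MAP = {"bacnet-42": ["can-7", "can-8"]}

    def mc_issue_primary_tone_command(tone_value: int, first_protocol_id: str) -> dict:
        # The MC generates a primary tone command carrying the tone value and
        # the first protocol ID of the target group of IGUs.
        return {"tone_value": tone_value, "protocol_id": first_protocol_id}

    def nc_handle_primary_command(cmd: dict) -> list:
        # The NC maps the first protocol ID to one or more second protocol IDs
        # and emits one secondary tone command per identified WC.
        return [{"tone_value": cmd["tone_value"], "wc_id": wc_id}
                for wc_id in PROTOCOL_ID_MAP[cmd["protocol_id"]]]

    def wc_handle_secondary_command(cmd: dict) -> None:
        # Each WC would select a voltage/current profile for the tone value
        # and drive its connected IGUs; here the action is merely reported.
        print(f"WC {cmd['wc_id']}: driving IGUs toward tone value {cmd['tone_value']}")

    primary = mc_issue_primary_tone_command(tone_value=3, first_protocol_id="bacnet-42")
    for secondary in nc_handle_primary_command(primary):
        wc_handle_secondary_command(secondary)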
In some embodiments, various targets (e.g., IGUs) are (e.g., advantageously) grouped into zones of targets (e.g., EC windows). At least one zone (e.g., each zone) may include a subset of the targets (e.g., IGUs). For example, at least one (e.g., each) zone of targets (e.g., IGUs) can be controlled by one or more respective floor controllers (e.g., NCs) and one or more respective local controllers (e.g., WCs) controlled by the floor controllers (e.g., NCs). In some examples, at least one (e.g., each) zone may be controlled by a single floor controller (e.g., NC) and two or more local controllers (e.g., WCs) controlled by that single floor controller (e.g., NC). For example, a zone may represent a logical grouping of targets (e.g., IGUs). Each zone may correspond to a set of targets (e.g., IGUs) in a particular location or area of a building that are driven together based at least in part on their locations. For example, a building may have four sides or faces (north, south, east, and west) and ten floors. In this illustrative example, each zone may correspond to the set of electrochromic windows on a particular floor and one of the four faces. At least one (e.g., each) zone may correspond to a set of targets (e.g., IGUs) sharing one or more physical characteristics (e.g., device parameters such as size or age). In some embodiments, the zones of targets (e.g., IGUs) may be grouped based at least in part on one or more non-physical characteristics (e.g., security designations or traffic levels) (e.g., IGUs bounding manager offices may be grouped in one or more zones, while IGUs bounding non-manager offices may be grouped in one or more different zones).
In some embodiments, at least one (e.g., each) floor controller (e.g., NC) is capable of addressing all targets (e.g., IGUs) in at least one (e.g., each) of one or more respective zones. For example, the MC may issue a primary tone command to the floor controller (e.g., NC) that controls the target zone. The primary tone command may include an abstract identification of the target zone (hereinafter also referred to as a "zone ID"). For example, the zone ID may be a first protocol ID, such as the first protocol ID described in the example above. In such cases, the floor controller (e.g., NC) receives a primary tone command that includes the tone value and the zone ID, and maps the zone ID to the second protocol IDs associated with the local controllers (e.g., WCs) within the zone. In some embodiments, the zone ID is a higher level of abstraction than the first protocol ID. In such cases, the floor controller (e.g., NC) may first map the zone ID to one or more first protocol IDs, and then map the first protocol IDs to second protocol IDs.
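The two-step zone resolution described above can be illustrated with a short, purely hypothetical Python sketch; the zone and protocol IDs shown are placeholders.

    # Hypothetical tables: a zone ID resolves first to first protocol IDs and
    # then to second protocol IDs identifying the local controllers (WCs).
    ZONE_TO_FIRST = {"floor3-south": ["bacnet-42", "bacnet-43"]}
    FIRST_TO_SECOND = {"bacnet-42": ["can-7", "can-8"], "bacnet-43": ["can-9"]}

    def resolve_zone(zone_id: str) -> list:
        wcs = []
        for first_id in ZONE_TO_FIRST[zone_id]:    # zone ID -> first protocol IDs
            wcs.extend(FIRST_TO_SECOND[first_id])  # first ID -> second protocol IDs
        return wcs

    print(resolve_zone("floor3-south"))  # -> ['can-7', 'can-8', 'can-9']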
In some embodiments, the MC is coupled to one or more outbound networks via one or more wired and/or wireless links. For example, the MC may transmit acquired status information or sensor data to a remote computer, mobile device, server, or database in, or accessible via, an outbound network. In some embodiments, various applications (including third-party applications or cloud-based applications) executing within such remote devices are able to access data from, or provide data to, the MC. In some embodiments, an authorized user or application may transmit to the MC, via the network, a request to modify the tint state of various IGUs. For example, the MC may first determine whether to grant the request before issuing a tone command (e.g., based at least in part on power considerations or based at least in part on whether the user has proper authorization). The MC may then calculate, determine, select, or otherwise generate a tone value and send the tone value in a primary tone command to cause a tint state transition in the associated IGUs.
In some embodiments, the user submits such a request from a computing device, such as a desktop computer, a laptop computer, a tablet computer, or a mobile device (e.g., a smartphone). The user's computing device may execute a client application that is capable of communicating with the MC and, in some examples, with a master controller application executing within the MC. In some embodiments, the client application may communicate with a separate application, in the same or a different physical device or system as the MC, which then communicates with the master controller application to effect the desired tint state modification. For example, the master controller application or other separate application may be used to authenticate the user in order to authorize requests submitted by the user. The user may select a target to manipulate (e.g., an IGU to be tinted) and notify the MC of the selection directly or indirectly, such as by entering a peripheral structure ID (e.g., a room number) via the client application.
In some embodiments, the user's mobile circuitry (e.g., a mobile electronic device or other computing device) may communicate with various local controllers (e.g., WCs), e.g., wirelessly. For example, a client application executing within a user's mobile circuitry (e.g., a mobile device) may send wireless communications, including control signals related to a target, to a local controller in order to control the target, the target being communicatively coupled to the local controller (e.g., via a network). For example, a user may direct a tint state control signal to a WC to control the tint state of the respective IGUs connected to that WC. For example, a user may use a client application to control (e.g., maintain or modify) the tint state of IGUs proximate to a room occupied by the user (or to be occupied by the user or others in the future). For example, a user may direct a sensor frequency change control signal to a local controller to control the data sampling rate of a sensor communicatively coupled to the local controller. For example, a user may use a client application to control (e.g., maintain or modify) the data sampling rate of sensors in a room occupied by or proximate to the user (or to be occupied by the user or others in the future). For example, a user may direct a light intensity change control signal to a local controller to control a lamp communicatively coupled to the local controller. For example, a user may use a client application to control (e.g., maintain or modify) the light intensity of lights in a room occupied by or proximate to the user (or to be occupied by the user or others in the future). For example, a user may direct a media projection change control signal to a local controller to control the media projected by a projector communicatively coupled to the local controller. For example, a user may use a client application to control (e.g., maintain or modify) the media projected by a projector in a room occupied by the user (or to be occupied by the user or others in the future). Such wireless communications may be generated, formatted, and/or transmitted using various wireless network topologies and protocols, for example.
In some embodiments, control signals sent from a user's mobile circuitry (e.g., a mobile device or other computing device) to a local controller (e.g., a WC) override previously sent signals (e.g., tone values previously received by the WC from the respective NC). The previously transmitted signal may have been automatically generated, for example, by the control system. In other words, the local controller (e.g., WC) may provide the applied voltage to the target (e.g., IGU) based at least in part on a control signal from the user's mobile circuitry (e.g., the user's computing device), e.g., rather than based at least in part on a predetermined signal (e.g., a tone value). For example, a control algorithm or rule set stored in and executed by the local controller (e.g., WC) may decide that one or more control signals from a user's mobile device (e.g., an authorized user's computing device) will override corresponding signals received from the control system (e.g., tone values received from the NC). In some embodiments, such as in high-demand situations, a control signal from the control system (such as a tone value from the NC) overrides any control signal received by the local controller (e.g., WC) from the user's mobile circuitry (e.g., the user's computing device). A control algorithm or rule set may decide that override control signals (e.g., related to tinting) from only certain users (or groups or categories of users) are prioritized, based at least in part on the rights granted to such users. In some cases, other factors, including the time of day or the location of the target (e.g., IGU), may affect the authority to override the control system's predetermined signals.
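One possible form of such a precedence rule is sketched below in Python; the specific rules shown (an authorized user's signal wins except during high demand) are an illustrative assumption rather than a required rule set.

    from typing import Optional

    def select_tone(system_tone: int, user_tone: Optional[int],
                    user_authorized: bool, high_demand: bool) -> int:
        # Illustrative arbitration between a control-system tone value and a
        # user-supplied override, per the hypothetical rules stated above.
        if high_demand:
            return system_tone    # system signal takes precedence during high demand
        if user_tone is not None and user_authorized:
            return user_tone      # an authorized user's signal overrides the system
        return system_tone

    print(select_tone(system_tone=2, user_tone=0, user_authorized=True, high_demand=False))  # 0
    print(select_tone(system_tone=2, user_tone=0, user_authorized=True, high_demand=True))   # 2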
In some embodiments, based at least in part on receiving control signals from a user's mobile circuitry (e.g., an authorized user's computing device), the MC uses information about a combination of known parameters to calculate, determine, select, and/or otherwise generate command signals (e.g., related to tone values) that provide the (e.g., lighting) conditions requested by the user, in some cases while also using power efficiently. For example, the MC may determine the state of the target based at least in part on preset preferences defined by or for the particular user who requested the change in the state of the target via the mobile circuitry (e.g., via a computing device). For example, the MC may determine the tone value based at least in part on preset preferences defined by or for the particular user who requested the tint state change via the computing device. For example, the user may be required to enter a password or otherwise log in or obtain authorization to request a state change (e.g., a tint state change) of the target. The MC may determine the identity of the user based at least in part on a password, a security token, and/or an identifier of the particular mobile circuitry (e.g., mobile device or other computing device). After determining the identity of the user, the MC may then retrieve the user's preset preferences and use those preferences, alone or in combination with other parameters (e.g., power considerations and/or information from various sensors), to generate and transmit a change in the state of the target (e.g., a tone value for use in tinting the corresponding IGUs).
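A minimal sketch of this identity-and-preference lookup is shown below; the token table, preference store, and power-limiting rule are hypothetical stand-ins for whatever authentication and policy mechanisms a deployment actually uses.

    # Hypothetical stores mapping credentials to users and users to preferences.
    TOKENS = {"token-abc": "user-17"}
    PREFERENCES = {"user-17": {"preferred_tone": 1}}

    def generate_tone_for_request(token: str, power_limited: bool) -> int:
        # Resolve the user's identity from the credential, then combine the
        # preset preference with other parameters (here, a power constraint).
        user = TOKENS.get(token)
        if user is None:
            raise PermissionError("request not authorized")
        tone = PREFERENCES[user]["preferred_tone"]
        return max(tone, 2) if power_limited else tone  # assumed policy adjustment

    print(generate_tone_for_request("token-abc", power_limited=False))  # -> 1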
In some embodiments, the network system includes wall switches, dimmers, or other (e.g., tint state) control devices. A wall switch generally refers to an electromechanical interface connected to a local controller (e.g., a WC). The wall switch may transmit a target state change (e.g., tint) command to the local controller (e.g., WC), which may then transmit the command to a higher-level controller, such as a floor controller (e.g., NC). Such control devices may be collectively referred to as "wall devices," although the devices are not necessarily limited to wall-mounted implementations (e.g., such devices may also be located on a ceiling or floor, or integrated on or within a desk or conference table). For example, some or all offices, conference rooms, or other rooms of a building may include such a wall device for controlling the state of a target (e.g., the tint state of adjacent IGUs, or the lighting state of a light bulb). For example, the IGUs adjoining a particular room may be grouped into a zone. Each wall device may be operated by an end user (e.g., an occupant of the respective room) to control the grouped targets (e.g., to control the tint state or other functions or parameters of the IGUs adjoining the room). For example, at certain times of day, the adjoining IGUs may be tinted to a dark state to reduce the amount of light energy entering the room from outside (e.g., to reduce air-conditioning cooling requirements). For example, at certain times of day, adjacent heaters may be set to warmer temperatures to promote occupant comfort. In some embodiments, when a user requests use of the room, the user may operate the wall device to transmit one or more control signals to cause the target to transition from one state to another (e.g., a transition of the IGUs from a dark state to a lighter tint state).
In some embodiments, each wall device includes one or more switches, buttons, dimmers, dials, or other physical user-interface controls that enable a user to select a particular tint state or to increase or decrease the current tint level of the IGUs of the adjoining room. The wall device may include a display with a touch-screen interface that enables a user to select a particular tint state (e.g., by selecting a virtual button, selecting from a drop-down menu, or entering a tint level or tint percentage) or to modify the tint state (e.g., by selecting a "dim" virtual button or a "brighten" virtual button, or by rotating a virtual dial or sliding a virtual bar). In some embodiments, the wall device may include a docking interface that enables a user to physically and communicatively dock a mobile circuit (e.g., a portable device such as a smartphone, multimedia device, remote control, virtual reality device, tablet computer, or other portable computing device (e.g., an IPHONE, IPOD, or IPAD produced by Apple Inc., Cupertino, California)). The mobile circuit may be embedded in a vehicle (e.g., an automobile, a motorcycle, a drone, or an airplane). The mobile circuit may be embedded in a robot. The circuitry may be embedded in (e.g., as part of) virtual assistant AI technology or a speaker (e.g., a smart speaker such as Google Nest or Amazon Echo Dot). The coupling of the mobile circuit to the network may be initiated by the presence of the user in the peripheral structure or by the coupling of the user to the network (e.g., remote or local coupling). The coupling of the user to the network may be secure (e.g., have one or more security layers, and/or require one or more security tokens (e.g., keys)). The presence of the user in the peripheral structure may be sensed (e.g., automatically) by using a sensor coupled to the network. The minimum distance from the sensor at which the user is coupled to the network may be predetermined and/or adjustable. The user may override his or her coupling to the network. The user may be a manager, supervisor, owner, lessor, or administrator of the network and/or facility. The user may be a user of the mobile circuit. The ability to couple the mobile circuit to the network may or may not be overridable by the user. The ability to alter the minimum coupling distance between the mobile circuit and the network may or may not be overridable by the user. There may be a hierarchy of override rights. The hierarchy may depend on the type of user and/or the type of mobile circuit. For example, a plant-employee user may not be allowed to change the coupling of production machines to the network. For example, an employee may be allowed to change the coupling distance of his or her company laptop to the network. For example, an employee may be allowed to permit or prevent his or her personal cellular telephone and/or automobile from coupling to the network. For example, a visitor may be prevented from having the visitor's mobile circuitry connect to the network. The coupling to the network may be automatic and seamless (e.g., after initial preferences have been set). Seamless coupling may not require input from the user.
In such an example, a user may control the tint level via input to a mobile circuit (e.g., a portable device), which is then received by the wall device through the docking interface and subsequently transferred to the control system (e.g., to the MC, NC, or WC). The mobile circuit (e.g., portable device) may include an application for communicating with an API presented by the wall device.
In some embodiments, the wall device may send a request for a state change (e.g., a tint state change) of the target to the control system (e.g., to the MC). The control system (e.g., MC) may first determine whether to grant the request (e.g., based at least in part on power considerations and/or based at least in part on whether the user has the appropriate authorization or authority). The control system (e.g., MC) may calculate, determine, select, and/or otherwise generate a state change (e.g., tone) value, and send the state change (e.g., tone) value in a primary state change (e.g., tone) command to cause the target to change (e.g., to cause a tint state transition in the adjoining IGUs). For example, each wall device may be connected to the control system (e.g., the MC therein) via one or more wired links (e.g., through communication lines such as CAN- or Ethernet-compatible lines and/or through power lines using power line communication technology). For example, each wall device may be connected to the control system (e.g., the MC therein) via one or more wireless links. The wall device may be connected (via one or more wired and/or wireless connections) to an outbound network, which may communicate with the control system (e.g., the MC therein) via a link.
In some embodiments, the control system identifies a target (e.g., a target device) associated with the wall device based at least in part on previously programmed or discovered information associating the wall device with the target. For example, the MC identifies the IGU associated with the wall device based at least in part on previously programmed or discovered information associating the wall device with the IGU. A control algorithm or rule set may be stored in and executed by the control system (e.g., the MC therein) to determine that one or more control signals from the wall device take precedence over, for example, tonal values generated by the control system (e.g., the MC therein). At times of high demand (e.g., high power demand), a control algorithm or rule set stored in and executed by the control system (e.g., the MC therein) may be used to determine that the tone values previously generated by the control system (e.g., the MC therein) take precedence over any control signals received from the wall device.
In some embodiments, based at least in part on receiving a request or control signal for a change in the state of a target (e.g., a tint state change request or control signal) from a wall device, the control system (e.g., the MC therein) uses information about a combination of known parameters to generate a state change (e.g., tint) value that provides the lighting conditions desired by a typical user. In doing so, the control system (e.g., the MC therein) may use power more efficiently. In some embodiments, the control system (e.g., the MC therein) may generate the state change (e.g., tint) value based at least in part on preset preferences defined by or for the particular user who requested the target (e.g., tint) state change via the wall device. For example, a user may be required to enter a password in the wall device, or to use a security token or key (e.g., an IBUTTON or other 1-Wire device), to gain access to the wall device. The control system (e.g., the MC therein) may then determine the identity of the user based at least in part on the password, security token, and/or security key. The control system (e.g., the MC therein) may retrieve the user's preset preferences. The control system (e.g., the MC therein) may use the preset preferences, alone or in combination with other parameters such as power considerations, information from various sensors, historical data, and/or user preferences, to calculate, determine, select, and/or otherwise generate the tone values for the respective IGUs.
In some embodiments, the wall device sends a tint state change request to the appropriate part of the control system (e.g., the NC therein). A lower level of the control system (e.g., the NC therein) may communicate the request, or a communication based at least in part on the request, to a higher level of the control system (e.g., to the MC). For example, each wall device may be connected with a corresponding NC via one or more wired links. In some embodiments, the wall device sends the request to the appropriate NC, which then itself determines whether to override a primary tone command previously received from the MC or a primary or secondary tone command previously generated by the NC. As described below, the NC may generate tone commands without first receiving a tone command from the MC. In some embodiments, the wall device transmits the request or control signal directly to the WC controlling the adjoining IGUs. For example, each wall device may be connected to a respective WC by one or more wired links, such as those just described for the MC, or by wireless links.
In some embodiments, the NC or MC determines whether the control signal from the wall device should override the tone value previously generated by the NC or MC. As mentioned above, the wall device is able to communicate directly with the NC. However, in some examples, the wall device may transmit the request directly to the MC or directly to the WC, which then transmits the request to the NC. In some embodiments, the wall device can communicate the request to a customer-facing network (such as a network managed by the owner or operator of the building) and then pass the request (or a request based thereon) directly or indirectly through the MC to the NC. For example, a control algorithm or rule set stored in and executed by the NC or MC may indicate that one or more control signals from the wall device take precedence over tone values previously generated by the NC or MC. In some embodiments (e.g., at times of high demand), a control algorithm or rule set stored in and executed by the NC or MC decides that a previously generated tone value of the NC or MC takes precedence over any control signal received from the wall device.
In some embodiments, based at least in part on receiving a hue state change request or control signal from a wall device, the NC may use information regarding a combination of known parameters to generate hue values that provide desired lighting conditions for a typical user. In some embodiments, the NC or MC may generate the hue value based at least in part on preset preferences defined by or for a particular user who requested the hue state change via the wall device. For example, a user may be required to enter a password in a wall device or use a security token or key (e.g., IBUTTON or other 1-Wire device) to gain access to the wall device. In this example, the NC may communicate with the MC to determine the identity of the user, or the MC may determine the identity of the user based at least in part on a password, security token, or security key. The MC may then retrieve the user's preset preferences and use the preset preferences, either alone or in combination with other parameters (such as power considerations or information from various sensors), to calculate, determine, select, or otherwise generate the tonal values for the respective IGU.
In some embodiments, the control system (e.g., the MC therein) is coupled to an external database (or "data store" or "data warehouse"). For example, the database may be a local database coupled to the control system (e.g., the MC therein) via a wired hardware link. In some embodiments, the database is a remote or cloud-based database that can be accessed by the control system (e.g., the MC therein) via an internal private network or an external network. Other computing devices, systems, or servers may also access and read the data stored in the database, for example, over an outbound network. One or more control applications or third-party applications may likewise read the data stored in the database via the outbound network. In some embodiments, the control system (e.g., the MC therein) stores a record of all tone commands, including the corresponding tone values issued by the control system (e.g., the MC therein), in the database. The control system (e.g., the MC therein) may also collect status and sensor data and store it in the database (where it may constitute historical data). The local controllers (e.g., WCs) can collect sensor data and/or status data from the peripheral structure and/or from other devices (e.g., IGUs) or media disposed in the peripheral structure, and transmit the sensor data and/or status data to the corresponding higher-level controller (e.g., NC) over a communication link. This data may move up the control chain, for example, to the MC. For example, the controllers (e.g., NC or MC) themselves may also be communicatively coupled (e.g., connected) to various sensors within the building, such as light, temperature, or occupancy sensors, as well as to sensors (e.g., light and/or temperature sensors) located on, around, or otherwise outside the building (e.g., on the roof of the building). In some embodiments, the control system (e.g., the NC or WC) may also send status and/or sensor data (e.g., directly) to the database for storage.
In some embodiments, the network system is adapted to integrate intelligent thermostat services, alarm services (e.g., fire detection), security services, and/or other appliance automation services. An example of a home automation service is NEST®, manufactured by Nest Labs of Palo Alto, California (NEST® is a registered trademark of Google, Inc., Mountain View, California). As used herein, references to a BMS in some embodiments may also or instead encompass such other automation services. In some embodiments, the control system (e.g., the MC therein) and a separate automation service (such as the BMS) may communicate via an Application Programming Interface (API). For example, the API may execute in conjunction with a (e.g., master) controller application (or platform) within a controller (e.g., MC) and/or with a building management application (or platform) within the BMS. The controller (e.g., MC) and the BMS may communicate over one or more wired links and/or via an outbound network. For example, the BMS may communicate instructions for controlling the IGUs to a controller (e.g., MC), which then generates a primary state (e.g., tint) command for the target and sends the command to the appropriate lower-level controller (e.g., to the NC). The lower-level controllers (e.g., NCs or WCs) may communicate directly with the BMS (e.g., through wired/hardware links and/or through wireless data links). In some embodiments, the BMS also receives data collected by one or more controllers in the control system (e.g., by the MC, NC, and/or WC), such as sensor data, status data, and associated timestamp data. For example, a controller (e.g., MC) may publish such data over a network. In some embodiments in which such data is stored in a database, the BMS may access some or all of the data stored in the database.
In some embodiments, a controller (e.g., "MC") refers collectively to any suitable combination of hardware, firmware, and software for implementing the described functions, operations, processes, or capabilities. For example, an MC may refer to a computer executing a master controller application (also referred to herein as a "program" or "task"). For example, a controller (e.g., MC) may include one or more processors. The processor may be or may include a Central Processing Unit (CPU), such as a single-core or multi-core processor. In some examples, the processor may additionally include a Digital Signal Processor (DSP) or a network processor. The processor may also include one or more Application Specific Integrated Circuits (ASICs). The processor is coupled to the primary memory, the secondary memory, the inward network interface, and the outward network interface. The main memory may include one or more high-speed memory devices, such as one or more Random Access Memory (RAM) devices, including Dynamic RAM (DRAM) devices. Such DRAM devices may include, for example, synchronous DRAM (SDRAM) devices and double data rate SDRAM (DDR SDRAM) devices (including DDR2 SDRAM, DDR3 SDRAM, and DDR4 SDRAM), thyristor RAM (T-RAM), zero-capacitor RAM (Z-RAM®), and other suitable memory devices. In some embodiments, the secondary memory may include one or more Hard Disk Drives (HDDs) or one or more Solid State Drives (SSDs). In some embodiments, the memory may store processor-executable code (or "programming instructions") for executing a multitasking operating system, e.g., an operating system based at least in part on a Linux® kernel. The operating system may be a Linux®-like or Unix operating system, a Microsoft Windows®-based operating system, or another suitable operating system. The memory may also store code executable by the processor to execute the master controller application described above, as well as code for executing other applications or programs. The memory may also store status information, sensor data, or other data collected from the network controllers, the window controllers, and various sensors.
In some embodiments, the controller (e.g., MC) is a "headless" system, i.e., a computer that does not include a display monitor or other user input devices. For example, an administrator or other authorized user may log in to or otherwise access the controller (e.g., MC) from a remote computer or mobile computing device over a network to access and retrieve information stored in the controller (e.g., MC), to write or otherwise store data in the controller (e.g., MC), and/or to control various functions, operations, processes, or parameters implemented or used by the controller (e.g., MC). In other embodiments, the controller (e.g., MC) includes a display monitor and direct user input devices (e.g., one or more of a mouse, a keyboard, and/or a touchscreen).
In some embodiments, the inbound network interface enables one controller (e.g., MC) of the control system to communicate with various distributed controllers and/or various targets (e.g., sensors). An inbound network interface may collectively refer to one or more wired network interfaces and/or one or more wireless network interfaces (including one or more radio transceivers). For example, the inbound network interface may enable communication with a downstream controller (e.g., NC) over a link.
Downstream may refer to a lower level of control in the control hierarchy.
In some embodiments, the outbound network interface enables the controller (e.g., MC) to communicate with various computers, mobile circuits (e.g., mobile devices), servers, databases, and/or cloud-based database systems over one or more networks.
An outbound network interface may collectively refer to one or more wired network interfaces and/or one or more wireless network interfaces (including one or more radio transceivers). In some embodiments, various applications executing within such remote devices, including third-party applications and/or cloud-based applications, may access or provide data from or to the controller (e.g., MC), or to the database via the controller (e.g., MC). For example, a controller (e.g., MC) may include one or more Application Programming Interfaces (APIs) for facilitating communication between the controller (e.g., MC) and various third-party applications. Some examples of APIs that a controller (e.g., MC) may implement can be found in PCT patent application PCT/US15/64555 (attorney docket number VIEWP073WO), filed December 8, 2015 and entitled "MULTIPLE INTERACTING SYSTEMS AT A SITE," which is incorporated by reference herein in its entirety. For example, the third-party applications may include various monitoring services, including: thermostat services, alarm services (e.g., fire detection), security services, and/or other appliance automation services. Other examples of monitoring services and systems can be found in PCT patent application PCT/US2015/019031, filed March 5, 2015 and entitled "MONITORING SITES CONTAINING SWITCHABLE OPTICAL DEVICES AND CONTROLLERS," which is incorporated by reference herein in its entirety.
In some embodiments, one or both of the inbound and outbound network interfaces may comprise a building automation and control network (BACnet) compatible interface. BACnet is a communication protocol, commonly used in building automation and control networks, defined by ASHRAE/ANSI 135 and ISO 16484-5 standards. The BACnet protocol broadly provides a mechanism for computerized building automation systems and devices to exchange information, e.g., regardless of the particular service they perform. For example, BACnet may be used to enable communication between (i) heating, ventilation, and air conditioning control (HVAC) systems, (ii) lighting control systems, (iii) access and/or security control systems, (iv) fire detection systems, or (v) any combination thereof, and their associated equipment. In some examples, one or both of the inbound and outbound network interfaces may comprise an oBIX (open building information exchange) compatible interface or another RESTful Web service based interface.
In some embodiments, the controller (e.g., MC) may calculate, determine, select, and/or otherwise generate a preferred state of the target (e.g., a tone value for one or more IGUs) based at least in part on a combination of parameters. For example, the combination of parameters may include time and/or calendar information, such as the time of day, day of year, or season. The combination of parameters may include solar calendar information, such as the direction of the sun relative to the facility and/or target (e.g., IGU). The direction of the sun relative to the facility and/or target (e.g., IGU) may be determined by the controller (e.g., MC) based at least in part on time and/or calendar information together with known information about the geographic location of the facility (e.g., building) on Earth and the direction the target (e.g., IGU) faces (e.g., in a north-east coordinate system). The combination of parameters may also include external and/or internal environmental conditions, such as the outside temperature (outside the building), the inside temperature (within a room adjoining the target IGU), or the temperature within the interior volume of the IGU. The combination of parameters may include information about the weather (e.g., whether it is sunny, cloudy, rainy, or snowing). Parameters such as the time of day, day of year, and/or direction of the sun may be programmed into and tracked by the control system (e.g., the MC therein). Parameters such as the external temperature, internal temperature, and/or IGU temperature may be obtained from sensors within, on, or around the building, or from sensors integrated with the target (e.g., on or within the IGU). Sometimes the target may include a sensor. Examples of algorithms, routines, modules, or other means for generating IGU tone values are described in U.S. patent application Ser. No. 13/772,969, filed February 21, 2013 and entitled "CONTROL METHOD FOR TINTABLE WINDOWS," and in PCT patent application PCT/US15/029675, filed May 7, 2015 and entitled "CONTROL METHOD FOR TINTABLE WINDOWS," each of which is incorporated herein by reference in its entirety.
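Purely for illustration, and not as a restatement of the algorithms in the applications cited above, the following Python sketch combines a few of the parameters named in this paragraph into a tone decision; every threshold and rule in it is an assumption.

    def choose_tone(sun_on_facade: bool, cloudy: bool,
                    outside_temp_c: float, inside_temp_c: float) -> int:
        # Hypothetical rule set combining solar position, weather, and
        # temperature parameters into one of four tone values (0 = clearest).
        if not sun_on_facade or cloudy:
            return 0               # little direct sun: keep the window clear
        if outside_temp_c > 30.0 or inside_temp_c > 26.0:
            return 3               # hot: darkest state to reduce heat gain
        return 2                   # sunny but mild: moderate tint for glare

    print(choose_tone(sun_on_facade=True, cloudy=False,
                      outside_temp_c=33.0, inside_temp_c=24.0))  # -> 3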
In some embodiments, at least one (e.g., each) device (e.g., ECD) within each IGU can be tinted, for example, in response to a suitable drive voltage applied across the EC stack. The tint may be any tint state within a (e.g., virtually) continuous tint range defined by the material properties of the EC stack. However, the control system (e.g., the MC therein) may be programmed to select tone values (e.g., tone values designated as integer values) from a limited number of discrete tone values. In some such implementations, the number of available discrete tone values may be at least 2, 4, 8, 16, 32, 64, 128, or 256 or more. For example, a 2-bit binary number may be used to specify any of four possible integer tone values, a 3-bit binary number may be used to specify any of eight possible integer tone values, a 4-bit binary number may be used to specify any of sixteen possible integer tone values, a 5-bit binary number may be used to specify any of thirty-two possible integer tone values, and so on. At least one (e.g., each) tone value may be associated with a target tint level (e.g., expressed as a maximum tint, a maximum safe tint, or a maximum desired tint, and/or as a percentage of the available tint). For purposes of illustration, consider an example in which the MC selects from among four available tone values: 0, 5, 10, and 15 (using a binary number of 4 or more bits). The tone values 0, 5, 10, and 15 may be associated with target tint levels of 60%, 40%, 20%, and 4%, or 60%, 30%, 10%, and 1%, respectively, or another desired, advantageous, or appropriate set of target tint levels.
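The discrete-tone-value example above may be expressed directly as a lookup, as in the following sketch, which uses the first of the two percentage mappings given in the text; the encoding comment simply restates the bit arithmetic above.

    # Mapping from the four example tone values to target tint levels, per the
    # 60%/40%/20%/4% example above. A 4-bit field can encode 2**4 = 16 distinct
    # tone values; this example uses only four of them.
    TONE_TO_TINT_PERCENT = {0: 60, 5: 40, 10: 20, 15: 4}

    def tint_percent(tone_value: int) -> int:
        if tone_value not in TONE_TO_TINT_PERCENT:
            raise ValueError(f"unsupported tone value: {tone_value}")
        return TONE_TO_TINT_PERCENT[tone_value]

    print(tint_percent(5))  # -> 40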
Fig. 3 shows a block diagram of an exemplary Master Controller (MC) 300. The MC 300 may be implemented in or as one or more computers, computing devices, or computer systems (used interchangeably herein, where appropriate, unless otherwise indicated). For example, the MC 300 includes one or more processors 302 (hereinafter collectively referred to as "processor 302"). The processor 302 may be or may include a Central Processing Unit (CPU), such as a single-core or multi-core processor. In some examples, the processor 302 may additionally include a Digital Signal Processor (DSP) or a network processor. The processor 302 may also include one or more Application Specific Integrated Circuits (ASICs). The processor 302 is coupled to a main memory 304, a secondary memory 306, an inbound network interface 308, and an outbound network interface 310. The main memory 304 may include one or more high-speed memory devices, such as one or more Random Access Memory (RAM) devices, including Dynamic RAM (DRAM) devices. Such DRAM devices may include, for example, synchronous DRAM (SDRAM) devices and double data rate SDRAM (DDR SDRAM) devices (including DDR2 SDRAM, DDR3 SDRAM, and DDR4 SDRAM), thyristor RAM (T-RAM), zero-capacitor RAM (Z-RAM®), and other suitable memory devices.
In some implementations, the MC and NC are implemented as a master controller application and a network controller application, respectively, executing within respective physical computers or other hardware devices. For example, each of the master controller application and the network controller application may be implemented within the same physical hardware. Each of the master controller application and the network controller application may be implemented as separate tasks executing within a single computer device that includes a multitasking operating system, e.g., an operating system based at least in part on a Linux® kernel, or another suitable operating system.
In some embodiments, the master controller application and the network controller application may communicate through an Application Programming Interface (API). In some embodiments, the master controller application and the network controller application communicate through a loopback interface. For reference, a loopback interface is a virtual network interface, implemented by an operating system, that enables communication between applications executing within the same device. A loopback interface is typically identified by an IP address (often in the 127.0.0.0/8 address block in IPv4, or ::1 in IPv6). For example, the master controller application and the network controller application may each be programmed to send communications targeted to the other to the IP address of the loopback interface. In this manner, when the master controller application sends a communication to the network controller application, or vice versa, the communication need not leave the computer.
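As a self-contained illustration of loopback communication (not the actual MC/NC message formats, which this sketch does not attempt to reproduce), the following Python program has a server thread and a client exchange one message via 127.0.0.1; the port number and payload are arbitrary.

    import socket
    import threading

    ready = threading.Event()

    def network_controller_app():
        # Stand-in for the network controller application: listen on the
        # loopback interface and print whatever the peer application sends.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind(("127.0.0.1", 50007))   # loopback address; port is arbitrary
            srv.listen(1)
            ready.set()                      # signal that the listener is up
            conn, _ = srv.accept()
            with conn:
                print("NC application received:", conn.recv(1024).decode())

    t = threading.Thread(target=network_controller_app)
    t.start()
    ready.wait()

    # Stand-in for the master controller application: the communication is
    # addressed to the loopback IP, so it never leaves the computer.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect(("127.0.0.1", 50007))
        cli.sendall(b"primary tone command: tone=3")
    t.join()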
In some embodiments where the MC and NC are implemented as a master controller application and a network controller application, respectively, there is typically no restriction limiting the protocols available for communication between the two applications. This is generally true whether the master controller application and the network controller application execute as tasks within the same physical computer or within different physical computers. For example, there is no need to use a broadcast communication protocol, such as BACnet, which restricts communication to a network segment as defined by switch or router boundaries. For example, the oBIX communication protocol may be used in some implementations for communication between the MC and the NC.
In some embodiments, each NC is implemented as an instance of a network controller application that executes as a task within a respective physical computer. In some embodiments, at least one of the computers executing an instance of the network controller application also executes an instance of the master controller application to implement the MC. For example, two or more computers executing network controller application instances may have an instance of the master controller application installed, although only one instance of the master controller application may be actively executing in the network system at any given time. In this manner, redundancy is added so that the computer currently executing the master controller application is not a single point of failure for the entire system. For example, if the computer executing the master controller application fails, or a particular instance of the master controller application otherwise ceases to function, another computer on which an instance of the master controller application is installed may begin executing the master controller application to take over for the failed instance. In some embodiments, more than one instance of the master controller application may execute simultaneously. For example, the functions, processes, or operations of the master controller application may be distributed across two (or more) instances of the master controller application.
Fig. 4 illustrates a block diagram of an example Network Controller (NC) 400, which may be implemented in or as one or more network components, network devices, computers, computing devices, or computer systems (used interchangeably herein, where appropriate, unless otherwise indicated). References to "NC 400" collectively refer to any suitable combination of hardware, firmware, and software for implementing the described functions, operations, processes, or capabilities. For example, the NC 400 may refer to a computer executing a network controller application (also referred to herein as a "program" or "task"). The NC 400 typically includes one or more processors 402 (hereinafter collectively referred to as "processor 402"). In some implementations, the processor 402 is implemented as a microcontroller or as one or more logic devices, including one or more Application Specific Integrated Circuits (ASICs) or Programmable Logic Devices (PLDs), such as Field Programmable Gate Arrays (FPGAs) or Complex Programmable Logic Devices (CPLDs). When implemented in a PLD, the processor may be programmed into the PLD as an Intellectual Property (IP) block or permanently formed in the PLD as an embedded processor core. The processor 402 may be or may include a Central Processing Unit (CPU), such as a single-core or multi-core processor. The processor 402 is coupled to a main memory 404, a secondary memory 406, a downstream network interface 408, and an upstream network interface 410. In some embodiments, the main memory 404 may be integrated with the processor 402, e.g., as in a system-on-chip (SOC) package, or in embedded memory within the PLD itself. The main memory 404 may include one or more high-speed memory devices, such as one or more RAM devices. In some embodiments, the secondary memory 406 may include one or more Solid State Drives (SSDs) that store one or more lookup tables or arrays of values. For example, the secondary memory 406 may store a lookup table that maps a first protocol ID (e.g., a BACnet ID) received from the MC to one or more second protocol IDs (e.g., CAN IDs), each second protocol ID identifying a respective one of the WCs, and vice versa. In some embodiments, the secondary memory 406 stores one or more other arrays or tables. The downstream network interface 408 enables the NC 400 to communicate with the distributed WCs and/or various sensors. The upstream network interface 410 enables the NC 400 to communicate with the MC and/or various other computers, servers, or databases.
In some embodiments, when the MC determines to tint one or more IGUs, the MC writes a particular tone value to the AV in the NC associated with the respective one or more WCs that control the target IGUs. For example, the MC may generate a primary tone command communication that includes the BACnet ID associated with the WCs that control the target IGUs. The primary tone command may also include the tone value for the target IGUs. The MC may direct the transmission of the primary tone command to the NC using a network address (e.g., an IP address or a MAC address). In response to receiving such a primary tone command from the MC over the upstream interface, the NC may unpack the communication, map the BACnet ID (or other first protocol ID) in the primary tone command to one or more CAN IDs (or other second protocol IDs), and write the tone value from the primary tone command to the first of the various AVs associated with each CAN ID.
In some embodiments, the NC then generates a secondary tone command for each WC identified by a CAN ID. Each secondary tone command may be addressed to a respective one of the WCs by its respective CAN ID. For example, each secondary tone command may also include the tone value extracted from the primary tone command. The NC may send the secondary tone commands to the target WCs over the downstream interface via a second communication protocol (e.g., via the CANopen protocol). In some embodiments, when a WC receives such a secondary tone command, the WC sends a status value back to the NC indicating the status of the WC. For example, the tint status value may represent a "tinting" or "transitioning" status indicating that the WC is in the process of tinting the target IGUs, an "active" or "completed" status indicating that the target IGUs are in the target tint state or that the transition is complete, or an "error" status indicating an error. After the status values have been stored in the NC, the NC may publish the status information or otherwise make it accessible to the MC or to various other authorized computers or applications. In some embodiments, the MC requests status information for a particular WC from the NC based at least in part on intelligence, a scheduling policy, or a user override. For example, the intelligence may reside within the MC or within the BMS. The scheduling policy may be stored in the MC, in another storage location within the network system, or within a cloud-based system.
In some embodiments, the NC handles some of the functions, processes, or operations described above as being the responsibility of the MC. In some embodiments, the NC may include additional functions or capabilities not described with reference to the MC. For example, the NC may also include a data logging module (or "data logger") for logging data associated with the IGUs controlled by the NC. In some embodiments, the data logger logs the status information included in each of some or all of the responses to the status requests. For example, the status information that a WC transmits to the NC in response to each status request may include: a tint status value (S) for the IGUs, a value indicating a particular stage in a tinting transition (e.g., a particular stage of a voltage control profile), a value indicating whether the WC is in a sleep mode, a tone value (C), a value set by the WC based at least in part on the tone value (e.g., an effective applied voltage V_Eff), an actual voltage level V_Act measured, detected, or otherwise determined at the ECD(s) within the IGU, an actual current level I_Act measured, detected, or otherwise determined through the ECD(s) within the IGU, and various sensor data collected, for example, from a photosensor or temperature sensor integrated on or within the IGU. The NC may collect and queue the status information in a message queue, such as RabbitMQ, ActiveMQ, or Kafka, and stream the status information to the MC for subsequent processing, such as data reduction/compression, event detection, etc., as further described herein.
In some embodiments, a data logger within the NC collects and stores the various information received from the WCs in the form of a log file, such as a Comma Separated Values (CSV) file or another table-structured file format. For example, each row of the CSV file may be associated with a respective status request, and may include the C, S, V_Eff, V_Act, and I_Act values as well as the sensor data (or other data) received in response to the status request. In some implementations, each row is identified by a timestamp corresponding to the respective status request (e.g., when the NC sends the status request, when the WC collects the data, when the response including the data is transmitted by the WC, or when the response is received by the NC). In some embodiments, each row also includes the CAN ID or other ID associated with the respective WC.
In some embodiments, each row of the CSV file includes the requested data for all of the WCs controlled by the NC. The NC may cycle through all of the WCs it controls sequentially during each round of status requests. In some embodiments, each row of the CSV file is identified by a timestamp (e.g., in the first column), but the timestamp may be associated with the beginning of each round of status requests rather than with each individual request. In one particular example, columns 2-6 may include the C, S, V_Eff, V_Act, and I_Act values, respectively, for a first one of the WCs controlled by the NC; columns 7-11 may include the C, S, V_Eff, V_Act, and I_Act values, respectively, for a second one of the WCs; columns 12-16 may include the C, S, V_Eff, V_Act, and I_Act values, respectively, for a third one of the WCs; and so on, up to all of the WCs controlled by the NC. Subsequent rows in the CSV file may include the corresponding values for the next round of status requests. In some embodiments, each row may also include sensor data obtained from a photosensor, temperature sensor, or other sensor integrated with the respective IGUs controlled by each WC. For example, such sensor data values may be entered in corresponding columns between the C, S, V_Eff, V_Act, and I_Act values for one of the WCs and the C, S, V_Eff, V_Act, and I_Act values for the next one of the WCs. Each row may also include sensor data values from one or more external sensors, e.g., positioned on one or more facades or on the roof of the building. The NC may send a status request to the external sensors at the end of each round of status requests.
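The row layout described above can be sketched as follows; the WC IDs, field values, and header convention are placeholders chosen for this illustration only.

    import csv
    import datetime
    import io

    FIELDS = ["C", "S", "V_Eff", "V_Act", "I_Act"]
    # Hypothetical per-WC values gathered during one round of status requests.
    WCS = {"can-7": [3, "active", 2.1, 2.0, 0.05],
           "can-8": [3, "tinting", 2.1, 1.6, 0.12]}

    buf = io.StringIO()
    writer = csv.writer(buf)
    # Header: timestamp, then the five fields repeated per WC.
    writer.writerow(["timestamp"] + [f"{wc}:{f}" for wc in WCS for f in FIELDS])
    # One row per round: the timestamp marks the start of the round.
    row = [datetime.datetime.now().isoformat(timespec="seconds")]
    for values in WCS.values():
        row.extend(values)
    writer.writerow(row)
    print(buf.getvalue())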
In some embodiments, one function of the NC may be to translate between various upstream and downstream protocols, e.g., to enable the distribution of information between the WCs and the MC, or between the WCs and an outbound network. For example, the NC may include a protocol conversion module that is responsible for such conversion or translation services. The protocol conversion module may be programmed to perform conversions between any of a plurality of upstream protocols and any of a plurality of downstream protocols. For example, such upstream protocols may include UDP-based protocols such as BACnet, TCP-based protocols such as oBIX, other protocols built over these protocols, and various wireless protocols. The downstream protocols may include, for example, CANopen, other CAN-compatible protocols, and various wireless protocols, including, for example, protocols based at least in part on the IEEE 802.11 standards (e.g., Wi-Fi), protocols based at least in part on the IEEE 802.15.4 standard (e.g., ZigBee, 6LoWPAN, ISA100.11a, WirelessHART, or MiWi), protocols based at least in part on the Bluetooth standards (including classic Bluetooth, Bluetooth high speed, and Bluetooth Low Energy protocols, and including the Bluetooth v4.0, v4.1, and v4.2 versions), and protocols based at least in part on the EnOcean standard (ISO/IEC 14543-3-10).
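A protocol conversion module of this kind can be organized as a dispatch table, as in the sketch below; the converter functions are trivial stand-ins and do not implement real BACnet, oBIX, or CANopen encodings.

    def bacnet_to_canopen(msg: dict) -> dict:
        # Stand-in conversion: a real module would re-encode the payload.
        return {"can_payload": msg["bacnet_payload"]}

    def obix_to_canopen(msg: dict) -> dict:
        return {"can_payload": msg["obix_payload"]}

    # Dispatch table keyed by (upstream protocol, downstream protocol).
    CONVERTERS = {
        ("bacnet", "canopen"): bacnet_to_canopen,
        ("obix", "canopen"): obix_to_canopen,
    }

    def convert(msg: dict, upstream: str, downstream: str) -> dict:
        try:
            return CONVERTERS[(upstream, downstream)](msg)
        except KeyError:
            raise ValueError(f"no converter for {upstream} -> {downstream}")

    print(convert({"bacnet_payload": b"\x03"}, "bacnet", "canopen"))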
In some embodiments, the NC uploads the information recorded by the data logger (e.g., as a CSV file) to the MC periodically (e.g., every 24 hours). For example, the NC may send the CSV files to the MC via the File Transfer Protocol (FTP) or another suitable protocol over the Ethernet data link 316. The state information may be stored in a database where it can be accessed by applications over an outbound network.
In some embodiments, the NC includes functionality to analyze the information recorded by the data logger. For example, an analysis module may be provided in the NC to receive and/or analyze (e.g., in real time) the raw information recorded by the data logger. Real time may include within at most 1 second (sec), 30 sec, 45 sec, 1 minute (min), 2 min, 3 min, 4 min, 5 min, 10 min, 15 min, or 30 min from receipt of the recorded information by the data logger and/or from the start of the operation (e.g., of the receipt and/or of the analysis). In some embodiments, the analysis module is programmed to make decisions based at least in part on the raw information from the data logger. In some embodiments, the analysis module may communicate with the database to analyze the status information recorded by the data logger after it has been stored in the database. For example, the analysis module may compare raw electrical characteristic values (such as V_Eff, V_Act, and I_Act) to an expected value or range of expected values and flag particular conditions based at least in part on the comparison. Such flagged conditions may include, for example, a power spike indicative of a fault such as a short circuit, an error, or ECD damage. The analysis module may communicate such data to a tint determination module or a power management module in the NC.
In some embodiments, the analysis module may also filter the raw data received from the data logger to more intelligently or efficiently store information in the database. For example, the analysis module may be programmed to pass only "interesting" information to the database manager for storage in the database. For example, the information of interest may include outliers, values that otherwise deviate from expected values (e.g., based at least in part on empirical or historical values), or particular periods of time during which a transition occurred. Examples of data manipulation (e.g., filtering, parsing, temporary storage, and long-term efficient storage in a database) can be found in PCT patent application PCT/US15/029675, filed 5/7/2015 and entitled "CONTROL METHOD FOR TINTABLE WINDOWS" (attorney docket number VIEWP049X1WO), which is hereby incorporated by reference in its entirety.
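A minimal sketch of such an "interesting information" filter is shown below, assuming hypothetical row dictionaries and made-up tolerance thresholds:

```python
# Sketch: keep only rows whose electrical values fall outside an expected
# band, or rows recorded during a tint transition. Thresholds are invented.
def is_interesting(row, expected, tolerance=0.10):
    for key in ("V_eff", "V_act", "I_act"):
        lo = expected[key] * (1 - tolerance)
        hi = expected[key] * (1 + tolerance)
        if not (lo <= row[key] <= hi):
            return True                      # outlier: worth persisting
    return row.get("in_transition", False)   # keep transition periods too

def filter_for_storage(rows, expected):
    return [r for r in rows if is_interesting(r, expected)]
```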
In some embodiments, a database manager module (or "database manager") in the control system (e.g., in the NC) is configured to store the information recorded by the data logger to a database on a regular basis, e.g., at least every hour, every few hours, or every 24 hours. The database may be an external database, such as the database described above. In some embodiments, the database may be internal to the controller (e.g., the NC). For example, the database may be implemented as a time series database, such as a Graphite database, within a secondary memory of the controller (e.g., of the NC) or within another long-term memory within the controller (e.g., of the NC). For example, the database manager may be implemented as a Graphite daemon executing as a background process, task, sub-task, or application within a multi-tasking operating system of the controller (e.g., the NC). A time series database may be preferred over a relational database such as an SQL database because a time series database is more efficient for analyzing data that changes over time.
In some embodiments, a database may collectively refer to two or more databases, each of which may store some or all of the information obtained by some or all NCs in a network system. For example, it may be desirable to store copies of information in multiple databases for redundancy purposes. A database may collectively refer to multiple databases, each within a respective controller (e.g., NC), such as a graph or other time series database. It may be advantageous to store copies of information in multiple databases so that information requests from applications, including third party applications, may be distributed among the databases and more efficiently processed. For example, the databases may be synchronized periodically or otherwise, such as to maintain consistency.
In some embodiments, the database manager filters the data received from the analysis module to more intelligently and/or efficiently store information, such as in internal and/or external databases. For example, a database manager may be programmed to store (e.g., only) information of "interest" to a database. The information of interest may include outliers, values that otherwise deviate from expected values (such as based at least in part on empirical or historical values), and/or specific time periods during which the transition occurred. A more detailed example of how data manipulation (e.g., how raw data is filtered, parsed, temporarily stored, and efficiently stored in a database FOR a long period of time) may be found in PCT patent application PCT/US15/029675 (attorney docket number VIEWP049X1 WO) filed 5/7/2015 and entitled "CONTROL METHOD FOR TINTABLE WINDOWS," which is hereby incorporated by reference in its entirety.
In some embodiments, a state determination module for a target is included in a controller (e.g., NC, MC, or WC), for example, for calculating, determining, selecting, or otherwise generating a state value for the target. For example, a tint determination module may be included in a controller (e.g., NC, MC, or WC) for calculating, determining, selecting, or otherwise generating a tint value for an IGU. For example, the state (e.g., tint) determination module may execute various algorithms, tasks, or subtasks based at least in part on a combination of parameters to generate a tint value. The combination of parameters may include, for example, status information collected and stored by the data logger. The combination of parameters may also include time and calendar information, such as the time of day, day of year, or season. The combination of parameters may include solar calendar information, such as the direction of the sun relative to the target (e.g., IGU). The combination of parameters may include one or more characteristics of the environment of the peripheral structure, including gas concentration (e.g., VOC, humidity, carbon dioxide, or oxygen), debris, gas type, gas flow rate, gas flow direction, gas (e.g., atmospheric) temperature, noise level, or light level (e.g., brightness). The combination of parameters may also include an external parameter (e.g., temperature) outside the peripheral structure (e.g., building), an internal parameter (e.g., temperature) within the peripheral structure (e.g., within a room adjacent to the target IGU), and/or a temperature within the interior volume of the IGU. The combination of parameters may include information about the weather (e.g., whether it is sunny, cloudy, rainy, or snowy). Parameters such as the time of day, day of year, and/or direction of the sun may be programmed into and tracked by the control system (e.g., a control system including the NC). Parameters such as the external temperature, internal temperature, and/or IGU temperature may be obtained from sensors in, on, or around the building, or integrated on or within the IGU. In some embodiments, the various parameters are provided by, or determined based at least in part on, information provided by various applications, including third-party applications that may communicate with the controller (e.g., NC) via an API. For example, the network controller application, or the operating system on which it runs, may be programmed to provide an API.
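For illustration only, a toy Python rule combining a few of these parameters might look as follows; the weights, cutoffs, and function name are invented and do not reflect the actual tint algorithms:

```python
# Toy sketch of a tint determination rule. Inputs mirror a few of the
# parameters listed above (solar position, facade orientation, exterior
# temperature, cloud cover); all thresholds are made up.
def determine_tint(sun_elevation_deg, facing_sun, ext_temp_c, cloudy):
    if cloudy or sun_elevation_deg <= 0:
        return 1                     # light state at night or under overcast
    tint = 2
    if facing_sun:
        tint += 1                    # direct sun on this facade
    if ext_temp_c > 28:
        tint += 1                    # also block solar heat gain
    return min(tint, 4)              # clamp to the darkest tint level
```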
In some embodiments, a target state (e.g., tint) determination module determines a state (e.g., tint) value for a target based at least in part on user overrides, e.g., user overrides received via various mobile circuit (e.g., device) applications, wall devices, and/or other devices. In some embodiments, the state (e.g., tint) determination module determines the state (e.g., tint) value based at least in part on commands or instructions received from various applications, including, for example, third-party applications and/or cloud-based applications. For example, such third-party applications may include various monitoring services, including: thermostat services, alarm services (e.g., fire detection), security services, and/or other appliance automation services. Other examples of monitoring services and systems may be found in PCT/US2015/019031 (attorney docket No. VIEWP061WO), filed 3/5/2015 and entitled "MONITORING SITES CONTAINING SWITCHABLE OPTICAL DEVICES AND CONTROLLERS," which is incorporated by reference herein in its entirety. These applications may communicate with the state (e.g., tint) determination module and/or other modules within the controller (e.g., NC) via one or more APIs. Some examples of APIs that a controller (e.g., NC) may implement are described in PCT patent application PCT/US15/64555 (attorney docket number VIEWP073WO), filed 12/8/2015 and entitled "MULTIPLE INTERACTING SYSTEMS AT A SITE," which is incorporated by reference herein in its entirety.
In some embodiments, the analysis module compares V_Eff, V_Act, and I_Act values, and sensor data, obtained in real time and/or previously stored in a database, to an expected value or range of expected values, and flags particular conditions based at least in part on the comparison. For example, the analysis module may communicate such flagged data, flagged conditions, or related information to the power management module. Such flagged conditions may include, for example, a power spike, an error indicating an ECD short, or a smart window (e.g., ECD) break. In some embodiments, the power management module modifies operation based at least in part on the flagged data or condition. For example, the power management module may delay state (e.g., tint) commands for the target until the power demand has dropped, stop sending commands to failed controllers (e.g., local controllers such as WCs) (and place them in an idle state), begin interleaving commands to the controllers (e.g., lower-level controllers such as WCs), manage peak power, and/or issue a help signal.
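A schematic Python sketch of such a power-management reaction is given below; the Command tuple, the flagged-WC set, and the peak-power threshold are all hypothetical:

```python
# Sketch: split pending tint commands into send-now, deferred, and idled
# sets based on flagged conditions and current aggregate power demand.
from collections import namedtuple

Command = namedtuple("Command", "wc_id tint")
PEAK_POWER_W = 500.0                        # made-up peak demand threshold

def dispatch(commands, flagged_wcs, current_power_w):
    send_now, deferred, idled = [], [], []
    for cmd in commands:
        if cmd.wc_id in flagged_wcs:        # e.g., short or power spike seen
            idled.append(cmd.wc_id)         # stop commanding; place WC idle
        elif current_power_w > PEAK_POWER_W:
            deferred.append(cmd)            # resend once demand drops
        else:
            send_now.append(cmd)
    return send_now, deferred, idled

dispatch([Command(1, 4), Command(2, 0)], flagged_wcs={2}, current_power_w=620.0)
```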
Fig. 5 illustrates an exemplary Network Controller (NC) 500 that includes a plurality of modules. The NC 500 is coupled to the MC 502 and the database 504 through an interface 510, and to the WC 506 through an interface 508. In this example, the internal modules of the NC 500 include a data logger 512, a protocol conversion module 514, an analysis module 516, a database manager 518, a tint determination module 520, a power management module 522, and a debugging module 524.
In some embodiments, a controller (e.g., a WC) or other network device includes a sensor or an ensemble of sensors. For example, multiple sensors or sensor ensembles may be organized into sensor modules. The sensor ensemble may comprise a circuit board (such as a printed circuit board), for example, wherein a plurality of sensors are adhered or attached to the circuit board. A sensor may be removable from the sensor module. For example, a sensor may be inserted into and/or extracted from the circuit board. The sensors may be individually activated and/or deactivated (e.g., using a switch). The circuit board may comprise a polymer. The circuit board may be transparent or non-transparent. The circuit board may include a metal (e.g., an elemental metal and/or a metal alloy). The circuit board may include a conductor. The circuit board may include an insulator. The circuit board may have any geometric shape (e.g., rectangular or oval). The circuit board may be configured (e.g., may have a shape) to allow the ensemble to be disposed in a frame portion such as a mullion (e.g., of a window). The circuit board may be configured (e.g., may have a shape) to allow the ensemble to be disposed in a frame (e.g., a door frame and/or a window frame). The frame may include one or more apertures, for example, to allow the sensors to obtain (e.g., accurate) readings. The circuit board may be enclosed in a wrapper. The wrapper may include a flexible portion or a rigid portion. The wrapper may be flexible. The wrapper may be rigid (e.g., composed of a hardened polymer, of glass, or of a metal (e.g., comprising an elemental metal or metal alloy)). The wrapper may comprise a composite material. The wrapper may include carbon fibers, glass fibers, and/or polymer fibers. The wrapper may have one or more holes, for example, to allow the sensors to obtain (e.g., accurate) readings. The circuit board may include an electrical connection port (e.g., a socket). The circuit board may be connected to a power source (e.g., electrical power). The power source may include a renewable power source and/or a non-renewable power source.
FIG. 6 shows a diagram 600 of an example of a sensor ensemble organized into a sensor module. Sensors 610A, 610B, 610C, and 610D are shown as being included in the sensor ensemble 605. An ensemble of sensors organized into a sensor module may include at least 1, 2, 4, 5, 8, 10, 20, 50, or 500 sensors. The sensor module may include a number of sensors ranging between any of the above values (e.g., from about 1 to about 1000, from about 1 to about 500, or from about 500 to about 1000). The sensors of the sensor module may include sensors configured and/or designed to sense parameters including temperature, humidity, carbon dioxide, particulate matter (e.g., between 2.5 μm and 10 μm), total volatile organic compounds (e.g., changes in voltage potential caused by surface adsorption of volatile organic compounds), ambient light, audio noise level, pressure (e.g., of gases and/or liquids), acceleration, time, radar, lidar, radio signals (e.g., ultra-wideband radio signals), passive infrared, glass breakage, or movement. The sensor ensemble (e.g., 605) may include non-sensor devices such as a buzzer and a light-emitting diode. Examples of sensor ensembles and their uses can be found in U.S. patent application Ser. No. 16/447,169, filed 6/20/2019 and entitled "SENSING AND COMMUNICATIONS UNIT FOR OPTICALLY SWITCHABLE WINDOW SYSTEMS," which is incorporated herein by reference in its entirety.
In some embodiments, an increase in the number and/or types of sensors may be used to increase the probability that one or more measured characteristics are accurate and/or that a particular event measured by one or more sensors has occurred. In some embodiments, the sensors of the sensor ensemble may cooperate with one another. In one example, a radar sensor of a sensor ensemble may determine the presence of multiple individuals in a peripheral structure. A processor (e.g., processor 615) may determine that detection of the presence of the multiple individuals in the peripheral structure is positively correlated with an increase in the concentration of carbon dioxide. In another example, a processor accessing a memory may determine that an increase in detected infrared energy is positively correlated with an increase in the temperature detected by a temperature sensor. In some embodiments, a network interface (e.g., 650) may communicate with other, similar sensor ensembles. The network interface may additionally communicate with a controller.
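As a toy illustration of this cross-sensor corroboration (the thresholds and confidence numbers are invented, not from this disclosure):

```python
# Sketch: occupancy inferred by a radar sensor is checked against CO2
# concentration, which should rise when people are present. Agreement
# between independent sensors raises confidence in the inferred event.
def corroborate_occupancy(radar_count, co2_ppm, baseline_co2_ppm=420):
    radar_says_occupied = radar_count > 0
    co2_says_occupied = co2_ppm > baseline_co2_ppm + 80  # assumed offset
    if radar_says_occupied and co2_says_occupied:
        return "occupied", 0.95      # both sensors agree
    if radar_says_occupied or co2_says_occupied:
        return "occupied", 0.6       # single-sensor evidence only
    return "vacant", 0.9
```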
Individual sensors of the sensor ensemble (e.g., sensor 610A, sensor 610D, etc.) may include and/or utilize at least one dedicated processor. The sensor ensemble may utilize a remote processor (e.g., 654) via a wireless and/or wired communication link. The sensor ensemble may utilize at least one processor (e.g., processor 652), which may be a cloud-based processor coupled to the sensor ensemble via a cloud (e.g., 650). The processors (e.g., 652 and/or 654) may be located in the same building, in a different building, in buildings owned by the same or different entities, in a facility owned by the manufacturer of the window/controller/sensor ensemble, or at any other location. In various embodiments, as indicated by the dashed lines of FIG. 6, the sensor ensemble 605 need not include a separate processor and network interface; these may be separate entities operatively coupled to the ensemble 605. The dashed lines in FIG. 6 indicate optional features. In some embodiments, onboard processing and/or memory of one or more sensor ensembles may be used to support other functions (e.g., by allocating ensemble memory and/or processing power to the network infrastructure of the building).
In some embodiments, sensor data is exchanged between various network devices and controllers. For example, the sensor data can also be accessed by a remote user (e.g., inside or outside the same building) for retrieval using a personal electronic device.
An application executing on the remote device to access the sensor data may also provide commands for controllable functions, such as tint commands for a window controller. Exemplary window controllers are described in PCT patent application PCT/US16/58872, entitled "CONTROLLERS FOR OPTICALLY-SWITCHABLE DEVICES," filed 10/26/2016, and U.S. patent application 15/334,832, filed 10/26/2016 and entitled "CONTROLLERS FOR OPTICALLY-SWITCHABLE DEVICES," both of which are incorporated herein by reference in their entirety.
In some embodiments, a controller (e.g., NC) periodically requests status information from the lower-level controllers it controls (e.g., from the WCs). For example, the controller (e.g., NC) may communicate status requests to at least one (e.g., each) of the lower-level controllers at a frequency of every few seconds, every few tens of seconds, every minute, every few minutes, or after any requested time period. In some embodiments, at least one (e.g., each) status request is directed to a respective one of the lower-level controllers (e.g., WCs) using the CAN ID or other identifier of the respective lower-level controller (e.g., WC). In some embodiments, the controller (e.g., NC) proceeds sequentially through all the lower-level controllers (e.g., WCs) it controls during at least one (e.g., each) round of status acquisition. The controller (e.g., NC) may cycle through at least two (e.g., all) of the lower-level controllers (e.g., WCs) it controls such that status requests are sent sequentially to these lower-level controllers (e.g., WCs) in that round of status acquisition. After a status request has been sent to a given lower-level controller (e.g., WC), the upper-level controller (e.g., NC) may wait to receive the status information from that lower-level controller (e.g., WC) before sending a status request to the next lower-level controller (e.g., WC) in the round of status acquisition.
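A minimal sketch of this sequential polling loop is shown below, assuming a hypothetical request_status call that blocks until the WC replies:

```python
# Sketch: the NC polls each WC it controls in turn, waiting for each
# response before moving on; after a full round it logs the results and
# waits out the polling period. All names are illustrative.
import time

def status_acquisition_round(network_controller, wcs, period_s=60):
    while True:
        round_results = {}
        for wc in wcs:                         # sequential, one WC at a time
            reply = network_controller.request_status(wc.can_id)
            round_results[wc.can_id] = reply   # wait before the next request
        network_controller.log_round(round_results)
        time.sleep(period_s)                   # next round after the period
```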
In some implementations, after status information has been received from all lower-level controllers (e.g., WCs) controlled by an upper-level controller (e.g., NC), the upper-level controller (e.g., NC) performs a round of state change (e.g., tint) command distribution to the targets (e.g., to the IGUs). For example, in some implementations, at least one (e.g., each) round of status acquisition is followed by a round of tint command distribution, followed by the next round of status acquisition and the next round of tint command distribution, and so on. In some embodiments, during a round of state (e.g., tint) command distribution to the target controllers, the controller (e.g., NC) proceeds to send the tint commands controlled by the higher-level controller (e.g., NC) to the lower-level controllers (e.g., WCs). In some embodiments, the higher-level controller (e.g., NC) proceeds sequentially through all the lower-level controllers (e.g., WCs) it controls during the round of tint command distribution. In other words, the higher-level controller (e.g., NC) cycles through the (e.g., all) lower-level controllers (e.g., WCs) it controls such that a state (e.g., tint) command is sent to (e.g., each) lower-level controller (e.g., WC) in the round of state (e.g., tint) command distribution to change the state of the target (e.g., change the tint state of the IGU).
In some embodiments, the status request includes one or more instructions indicating what status information is requested from the respective lower-level controller (e.g., a local controller such as a WC). In some embodiments, in response to receiving such a request, the respective lower-level controller (e.g., WC) responds (e.g., via a communication line in the upstream cable set) by sending the requested status information to the higher-level controller (e.g., NC). In some other embodiments, each status request by default causes the lower-level controller (e.g., WC) to send a predefined set of information for a set of targets (e.g., IGUs, sensors, emitters, or media) that it controls. The status information that a lower-level controller (e.g., WC) transmits to an upper-level controller (e.g., NC) in response to a status request may include a state (e.g., tint) status value (S) of a target (e.g., IGU), e.g., indicating whether the target (e.g., IGU) is undergoing a state change (e.g., a tint transition) or has completed a state change (e.g., a tint transition or a light intensity change). The tint status value S, or another value, may indicate a particular stage in the tinting transition (e.g., a particular stage of the voltage control curve). In some embodiments, the status value S, or another value, indicates whether the lower-level controller (e.g., WC) is in a sleep mode. The status information transmitted in response to the status request may also include a state (e.g., tint) value (C) of the target (e.g., IGU), e.g., the state value set by a controller (e.g., MC or NC). The response may also include the value of the effective voltage V_Eff applied by the lower-level controller (e.g., WC) based at least in part on the state (e.g., tint) value. In some embodiments, the response includes a near real-time actual voltage level V_Act measured, detected, or otherwise determined across an ECD included within the IGU (e.g., via an amplifier and feedback circuit). In some embodiments, the response includes a near real-time actual current level I_Act measured, detected, or otherwise determined through the ECD within the IGU (e.g., via an amplifier and feedback circuit). The response may also include various near real-time sensor data, for example, collected from a photosensor or temperature sensor integrated on or within the IGU.
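For illustration, the quantities named above might be grouped as in the following Python sketch; the class and field names are hypothetical:

```python
# Sketch of the state information a WC might return in a status response,
# mirroring the values described in the text.
from dataclasses import dataclass, field
from typing import List

@dataclass
class WcStatus:
    C: int                 # commanded tint value set by the MC/NC
    S: int                 # tint status (e.g., transitioning vs. complete)
    V_eff: float           # effective applied voltage
    V_act: float           # near real-time actual voltage across the ECD
    I_act: float           # near real-time actual current through the ECD
    sleep_mode: bool = False
    sensors: List[float] = field(default_factory=list)  # photo/temp data
```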
In some implementations, voice and/or gesture control is used to interact with a target (e.g., an optically switchable device). Such a control approach may be more convenient than, for example, more traditional control approaches that may require a user to touch or otherwise physically interact with a particular component (e.g., a switch, knob, keypad, touch screen, etc.). Voice control may be beneficial for users with certain disabilities, for example.
In some implementations, voice and/or gesture control is used to achieve any type of manipulation of a target (e.g., any type of command on an optically switchable device). For example, voice and/or gesture control may be used to implement tint commands for a target or for a group or zone of targets. For example, the command may be for a single optically switchable device (e.g., "change window 1 to tint 4" or "make window 1 darker"), or for a group or zone of optically switchable devices (e.g., "change the windows in zone 1 to tint 4," "make the windows in zone 1 darker," "make the windows in zone 1 much darker," etc.). The command may relate to a discrete optical state (e.g., a discrete tint level or other discrete optical state) to which the associated optically switchable device should change, or to a relative change in the optical state of the optically switchable device (e.g., darker, lighter, more reflective, or less reflective, as in "my office is too dark, please lighten it," or "I want to run a projector" (letting the system know to darken the room), or "it is hot in here" (letting the system know to tint the windows and block heat gain), etc.). Where relative changes are used, the control system may be designed and/or configured to implement incremental (e.g., step) changes (e.g., 10% darker or lighter) in the optical state of the optically switchable device to execute the command. The size of each increment (e.g., step) may be predefined. In some embodiments, the control system is designed and/or configured to effect incremental (e.g., step) changes of a size and/or degree specified by the user. These commands may be modified by any relevant words used in the command (e.g., "very" or "slightly," or "lighter" or "darker," etc.).
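A toy sketch of executing such a relative command as an incremental step follows; the step sizes and modifier words are invented for illustration:

```python
# Sketch: map a relative voice command ("darker", "much darker") onto an
# incremental tint step, clamped to the available tint range.
STEP = {"slightly": 1, None: 1, "much": 2, "very": 2}

def apply_relative_command(current_tint, direction, modifier=None,
                           min_tint=0, max_tint=4):
    delta = STEP.get(modifier, 1)
    if direction == "darker":
        return min(current_tint + delta, max_tint)
    if direction == "lighter":
        return max(current_tint - delta, min_tint)
    return current_tint

new_tint = apply_relative_command(2, "darker", "much")  # -> 4
```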
In some embodiments, voice control may also be used to set a schedule for a target (e.g., an optically switchable device). For example, the user may direct the optically switchable device to tint at a particular time/day (e.g., "make the windows in zone 1 change to tint 4 at 2 pm, Monday through Friday," or "the morning sun warms this place up" (letting the system know to tint the windows in the morning hours when the sun is on that side of the building), or "I cannot see the mountains in the afternoon" (letting the system know that the windows tint too much in the afternoon and to lighten them in the afternoon)). Similarly, voice control may be used to implement tinting rules for the optically switchable device (e.g., "tint the windows in zone 1 to tint 4 when it is sunny outside" or "tint the windows in the room if the temperature within the room is above 70°F"). In some embodiments, any rule that may be implemented on the network of optically switchable devices (including any other networked components such as thermostats, a BMS, electronic devices, etc.) may be initiated through voice control.
In some embodiments, voice control is implemented on various components of a control architecture of a target (e.g., a smart window system), such as an onboard window controller or other window controller, a network controller, a master controller, a wall switch (e.g., an interface with a control component), and/or a separate device that interfaces with any or all of the above devices and/or components.
In some embodiments, gesture control is used to control a target. Gesture control may or may not use a limited set of commands (e.g., due to the smaller number of movements that need to be recognized, as compared to the larger vocabulary that may be recognized when using voice control). Gesture control may be used to execute many types of commands. For example, gesture control may be used to indicate that a particular target (e.g., window) or group of targets (e.g., windows) should change its state (e.g., to a lighter or darker state, or to other optical states if a non-electrochromic optically switchable device is used). The user may indicate the target (e.g., window) to be changed, for example, by standing in front of and/or pointing at the associated target (e.g., window). The indication of the target may trigger the coupling of the gesture to the target. For example, a user may indicate a desired change by raising or lowering their hand or arm, or by opening or closing their palm. A dictionary of recognized gestures may be created to define the types of commands that may be accomplished through gesture control. A larger gesture dictionary may enable finer and more complex control of the optically switchable device.
There may be a degree of trade-off in ease of use, with a smaller gesture dictionary likely to be easier for the user to master.
In some embodiments, at least one sensor is used to detect gestures. The sensor may be communicatively coupled to a network. The sensor may be an optical sensor (e.g., a camera, such as a video camera). The sensor (e.g., camera) may be provided on any available device, for example, as part of a wall unit, as part of a device that interfaces with a wall unit (e.g., a smartphone, tablet, or other electronic device), as part of a handheld device (e.g., a smartphone, tablet, or other electronic device), on an electrochromic window or frame, or as part of any other device configured to control an electrochromic or other optically switchable window. For example, a user may hold, wear, or otherwise move a sensing device configured to sense movement and/or acceleration, among other things. The readings of the sensing device may be used to help determine the gestures made by the user. The motion sensing device may include one or more accelerometers (e.g., 3-axis accelerometers), gyroscopes, magnetometers, cameras, etc., and may be included in a Virtual Reality (VR) interface, such as the Oculus Quest or Oculus Rift available from Facebook Technologies, Inc. Related documentation may be found in the notes on OVRPlayerController. The movement circuitry may be, or may be included in, a user controller, a character controller, and/or a player controller.
In some embodiments, the sensing device is a fitness device (e.g., any of a variety of wearable devices available from Fitbit Inc. or Jawbone, both of San Francisco, California, USA), a watch (e.g., from Apple Inc. of Cupertino, California, USA, or Pebble Technology Corporation of Palo Alto, California, USA), or a similar wearable device. In some embodiments, relative positioning, velocity, acceleration, and/or Doppler measurements are used to determine a gesture change as a command to change the state of a target. In some embodiments, gesture recognition software is used to determine a change in a gesture as a command to change the state of a target. In some embodiments, facial recognition software is used to determine a change in facial expression as a command to change the tint level of a window. The gesture may include a facial or body gesture (e.g., a gesture of a limb or a portion of a limb). The gesture may include a kinesthetic movement. The gesture may include a physical movement of a body part. The gesture may include torso and/or anatomical movement. The movement may comprise muscle movement. The movement may include movement of one or more bones (e.g., by moving a muscle abutting them).
In some embodiments, one command that may be initiated by voice control is turning off the "listening mode." A sound sensor (e.g., a listening device) may be operatively (e.g., communicatively) coupled to a network. When the listening mode is on, the device listening for commands can receive spoken commands. When the listening mode is off, the device listening for commands cannot receive, listen to, and/or record such commands. For example, the device listening for commands may be part of a (e.g., window) controller, an IGU, a wall device, and/or another electronic device (e.g., phone, tablet, etc.). A user may request that the listening mode be turned off to increase privacy and/or to conserve power, etc. For example, in some cases, a user may request that the listening mode be off for a specified period of time (e.g., the duration of a meeting). To turn the listening mode back on, the user may press a button/touchscreen (e.g., on the device listening for commands, on a window controller, IGU, wall device, or other electronic device) or otherwise indicate that the listening mode should be turned back on. The device may indicate when the listening mode is turned on and/or off. In one example, one or more lights (e.g., LEDs) can indicate whether the listening mode is on or off. A light may be turned on to indicate that the listening mode is on and turned off to indicate that the listening mode is off (or vice versa). In another example, a first light or light color may indicate that the listening mode is on, and a second light or light color may indicate that the listening mode is off. In another example, the device may use audio cues, e.g., it may emit a tone, e.g., periodically, as a reminder to the user that the listening mode is inactive (or active). In some implementations, the listening mode can be deactivated for a period of time (e.g., at least about 1 minute, 10 minutes, 30 minutes, 1 hour, 2 hours, 3 hours, 1 day, etc.), after which the listening mode is automatically reactivated. The period of time for which the listening mode remains deactivated may be selected by the user or may be preset. In some embodiments, the listening mode is activated by default; the listening mode is on unless it is turned off (e.g., permanently or for a period of time, as described herein). In some embodiments, the default setting is listening mode off (e.g., the listening mode is not active unless a command to turn on the listening mode is received).
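A minimal sketch of a listening-mode switch with timed automatic reactivation follows; the class and method names are hypothetical:

```python
# Sketch: a listening-mode flag that can be turned off for a chosen period
# and reactivates automatically afterward, as described above.
import threading

class ListeningMode:
    def __init__(self, default_on=True):
        self.on = default_on
        self._timer = None

    def disable(self, duration_s=None):
        self.on = False                    # stop receiving/recording commands
        if self._timer:
            self._timer.cancel()
        if duration_s is not None:         # auto-reactivate later
            self._timer = threading.Timer(duration_s, self.enable)
            self._timer.start()

    def enable(self):
        self.on = True                     # e.g., also light an LED here

mode = ListeningMode()
mode.disable(duration_s=30 * 60)           # off for a 30-minute meeting
```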
In some embodiments, where gesture commands are used, the user may control whether the associated device interpreting the gesture command is in a "view mode". As with the listening mode, the viewing mode may be turned on and off. For example, when the device is in a viewing mode, it can perceive and interpret gesture commands. When the viewing mode is off, the device cannot sense, record, and/or process gesture commands. The details provided herein relating to the listening mode may be similarly applied to the viewing mode. The means to interpret the gestures may or may not be part of the control system. The gesture interpretation means may comprise circuitry (e.g. may comprise a processor). The gesture interpretation apparatus may be communicatively coupled to a network and/or control system. Gestures may be interpreted from a virtual image of a peripheral structure in which a controllable object (e.g., an IGU, sensor, light, or media) is disposed. The gesture may be interpreted according to a target to which the gesture is coupled (e.g., pointed).
In some implementations, one or more voice commands are used to pose a question to a system that controls a target, such as an optically switchable device (or to some component of a network on which the optically switchable device is installed). The question may relate directly to a target (e.g., an actuator or an optically switchable device), or more generally to any target (e.g., optically switchable device) or group of targets (e.g., devices) communicatively coupled to a network (e.g., on a network). For example, a user may ask what the current optical state of a particular optically switchable device is (e.g., "what is the tint level of glazing 1?"). Similarly, the user may ask about the upcoming behavior of a particular optically switchable device (e.g., "when will the window next to my office begin to darken?"). The questions may also relate to any other information that the network may access. For example, a user may ask about weather data (e.g., temperature data, cloud data, precipitation data, forecast data, etc.), location data (e.g., "where am I?"), and the like. The user may ask about any environmental characteristic of the peripheral structure (e.g., as described herein). The user may ask for an explanation of the reason the target (e.g., optically switchable device) is behaving in a certain way. In one example, the user may ask "why is window 1 tinted?" and the system may respond to the query with an explanation such as "clouds are expected in 20 minutes; until then the sunlight is bright, so tint is needed." This feature may be particularly useful where the optically switchable device is programmed to execute rules that may not be immediately observable and/or understandable. The answers may be provided visually (e.g., on a screen), as printed material, or audibly (e.g., through a speaker). In some embodiments, voice commands are used to control the degree of privacy in a peripheral structure (e.g., a room), for example, with respect to (e.g., wireless) communications. In some embodiments, the optically switchable window is patterned to include one or more antennas that can be used to block or allow certain wavelengths passing through the window.
When activated, these patterned antennas may provide increased security/privacy by blocking cellular communications, Wi-Fi communications, and the like. Examples of patterned antennas and related privacy considerations may be found in PCT application PCT/US15/62387, filed 11/24/2015 and entitled "WINDOW ANTENNAS," which is incorporated by reference herein in its entirety.
In some embodiments using voice and/or gesture control, one or more dictionaries may be defined. For voice control, a dictionary may define the set of words and/or phrases that the system is configured to interpret/understand. Similarly, for gesture control, a dictionary may define the set of gestures that the system is configured to interpret/understand. The dictionaries may be tiered: for example, a command may be given in a first-level dictionary, a new second-level dictionary may be launched to receive the next command, and once that command is received, another level of dictionary may be launched. In this way, no single dictionary need be overly complex, and end users can quickly navigate to their desired command structure. In some implementations (e.g., when the target is media), a gesture is interpreted as a cursor movement on a media projection.
Examples of words or phrases that may be defined include the name/identification of each optically switchable device or group of devices (e.g., "Window 1," "group 1," "zone 1," etc.). These names/identifications may also be based at least in part on the location of the optically switchable device. In this regard, a dictionary may be defined to include words that identify optically switchable devices based at least in part on location (e.g., "first floor" or "lounge" or "eastward"), and/or words that provide a relationship between the user (or some other person) and the optically switchable device that is identified (e.g., "my office," "left window," or "deep room").
In some embodiments, the dictionary may also define words that relate to desired commands that may be indicated. For example, the dictionary may include words such as "light", "clear", "clearest", "darker", "darkest", "brighter", "brightest", "more", "less", "very", "tone level", "tone 1", "tone 2", and the like. When using spoken commands, any words that a person may use when instructing the optically switchable device may be included in the dictionary. Where the system is configured to allow the user to set up schedules or rules for the behaviour of the optically switchable device, the dictionary or dictionaries may include any words required for understanding these commands (e.g. "monday", "tuesday to friday", "morning", "afternoon", "bedtime", "sunrise", "if", "then", "when", "don't care", "cloudy", "sunny", "degree", "someone", "nobody", "sports", "only", etc.). Similarly, where the system is configured to allow a user to ask a question, the dictionary or dictionaries may include any words necessary to understand the type of question that the system is intended to answer.
In some embodiments, there are trade-offs between larger and smaller dictionaries. Larger dictionaries may enable finer control, more natural and/or flexible commands, and more complex functionality (e.g., answering any question for which an answer is available on the Internet), while smaller dictionaries may be easier for a person to master and may enable faster and/or more local processing. Smaller dictionaries may be used in a hierarchical format, where the user provides an appropriate voice or gesture command in one dictionary to gain access to the next dictionary in the sequence.
In some embodiments, a single dictionary may be used. In other embodiments, two or more dictionaries may be used, and the dictionary used at a particular time depends on what type of command, or which part of the command the user is attempting to convey. For example, a first dictionary may be used when a user identifies which optically switchable device they wish to control, and a second dictionary may be used when a user identifies what they want an optically switchable device to do. The first dictionary may include any words needed to recognize the associated optically switchable device, while the second dictionary may include any words needed to explain what the user wants the optically switchable device to do. Such a context dictionary may provide a limited subset of words that the system is configured to understand and/or interpret whenever a particular dictionary is used. This may make it easier to interpret the user's commands.
In some embodiments, one or more dictionaries may be customized for a particular user. For example, the dictionary used to define and/or determine which electrochromic window a user wishes to switch may be limited based at least in part on the windows the user is authorized to switch. In one example, user A is allowed to switch windows 1-5, while user B is allowed to switch windows 6-10. The dictionary or dictionaries used for transcribing and/or interpreting commands from user A may be limited to recognizing windows 1-5, and the dictionary or dictionaries used for transcribing and/or interpreting commands from user B may be limited to recognizing windows 6-10.
In some embodiments, each dictionary includes certain keywords that allow the user to more easily navigate through the system. Such keywords may include phrases such as "help," "back," "return," "previous," "undo," "skip," "restart," "stop," "abort," and the like. When the user requests assistance, the system may be configured to communicate to the user the words, phrases, commands, windows, etc. that the system is currently configured to accept/understand based at least in part on the dictionary (e.g., visually and/or audibly) used by the user at a given time. For example, if the user requests help while the system is accessing a dictionary defining different windows available for switching, the system may communicate that the inputs available at the time are, for example, "window 1", "window 2", "window 3", "group 1", etc.
In some embodiments, the system is used to ensure that a user is authorized to issue a particular command before executing the command. This may prevent unauthorized users from making changes to the optically switchable device. One particularly valuable environment is a conference room, where many people may be present at the same time. In this case it may be desirable to ensure that a person without authority to change the optical state of the optically switchable device is prevented from doing so.
This may reduce the risk of the optically switchable device changing based at least in part on (typically irrelevant) comments overheard from people in the room. Another environment in which this feature is valuable may be a business office space, where it may be desirable for individuals to each control a limited number of optically switchable devices, for example, in the vicinity of their workspace. In one example, each person may be authorized to control the targets (e.g., optically switchable windows) in their particular office or on their particular floor. For example, it may be beneficial to ensure that (e.g., only) people who are authorized can initiate a change (e.g., an optical transition) in a target via a voice or gesture command.
In some embodiments, authorization is performed by identifying the user, e.g., by "logging" the user into the system. This may be done by logging into an application on an electronic device (e.g., smartphone, tablet, etc.), by typing in a code, by electronically identifying a code, by fingerprint identification, eye recognition, or facial recognition, or by speaking a password, etc. In another example, voice recognition may be used to confirm the identity of the user. In another example, facial recognition, fingerprint scanning, retina scanning, or another biometric-based method may be used to confirm the identity of the user. Different authorization procedures may be best suited for different applications and/or contexts. In a particular example, the user may be automatically authorized. Such authorization may be based at least in part on a physical authorization token (e.g., an RFID badge, a BLE beacon, or a UWB beacon with appropriate identification information, etc.) and the proximity of the physical authorization token to a sensor that reads the token. The sensor may be disposed on or adjacent to the optically switchable device (e.g., on a frame portion of the IGU, such as in a mullion), on a controller in communication with the optically switchable device, on a wall unit in communication with the optically switchable device, or the like. The verification may occur locally (e.g., on the sensor reading the token, on the optically switchable device, on the controller, on a wall unit, etc.) and/or in the cloud.
In some embodiments, authorization occurs when needed, and may expire after a period of time has elapsed or after a period of user inactivity (e.g., after 24 hours, 1 hour, or 10 minutes). The time period for automatic logoff may depend on the installation setting or the deployment of the target (e.g., window), for example, whether the target (e.g., window) is in a public area or a private area. In some cases, authorization may not expire until the user logs off (e.g., using any available method, including but not limited to verbally requesting logout, pressing a logout button, etc.). In some embodiments, authorization is performed each time a command is issued. In some implementations, authorization occurs in stages even when a single command is interpreted. In a first authorization stage, it may be determined whether the user has authorization to make any changes on the network, and in a second authorization stage, it may be determined whether the user has authorization to make the particular changes the user has requested and/or initiated.
In some embodiments, the authorization process is used to limit the lexicon used to interpret voice and/or gesture commands. For example, the dictionary or dictionaries of a particular user may exclude one or more designated targets (e.g., optically switchable devices (or groups/zones of such devices)) that the user is not authorized to control. In one example, a user may only be authorized to control the optically switchable devices in zones 1 and 2, so the dictionary or dictionaries used to interpret the user's commands may include "zone 1" and "zone 2" while excluding "zone 3". Any other words needed to interpret and/or understand the command may also be included in the dictionary.
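For illustration, such an authorization-limited dictionary might be assembled as in the following sketch; the user-to-zone mapping and phrase lists are invented:

```python
# Sketch: build a per-user command dictionary that only includes the zones
# the user is authorized to control, plus the generic tint phrases.
BASE_PHRASES = ["darker", "lighter", "tint 1", "tint 2", "tint 3", "tint 4"]
USER_ZONES = {"user_a": ["zone 1", "zone 2"], "user_b": ["zone 3"]}

def build_dictionary(user_id):
    zones = USER_ZONES.get(user_id, [])     # authorized targets only
    return zones + BASE_PHRASES

def is_command_allowed(user_id, target_phrase):
    return target_phrase in USER_ZONES.get(user_id, [])

build_dictionary("user_a")              # ['zone 1', 'zone 2', 'darker', ...]
is_command_allowed("user_a", "zone 3")  # False -> notify the user or ignore
```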
In some embodiments, a voice and/or gesture control system includes several modules that may be used in implementing the disclosed voice and/or gesture control embodiments. These modules may be implemented separately or together, as appropriate for a particular application. The modules may be provided in separate hardware and/or may execute on various processors. The modules may be executed concurrently or non-concurrently (e.g., in sequence). The modules may be implemented independently on a controller (e.g., a window controller, a network controller, and/or a master controller), an optically switchable device, a wall device, a router, a remote processor, and/or any other object (e.g., as disclosed herein). In some embodiments, one or more of these modules are implemented on a processor and/or processing unit of a media controller or window controller. Within each module, any relevant processing may be done locally and/or remotely. The processing may be done in a central location and/or on a central device, or it may be distributed across multiple locations and/or devices.
In some embodiments, the voice and/or gesture control system includes a voice recognition module that converts and/or transcribes speech to text. In other words, the input to the module may be speech (spoken by the user and captured/recorded by a microphone) and the output from the module may be a text string or file. The module may be implemented using any number of commercially available speech-to-text products, services, and/or libraries. As one example, Carnegie Mellon University of Pittsburgh, Pennsylvania, offers a number of open-source speech software resources that may be used, such as CMU Sphinx. Additional examples include the various Dragon products available from Nuance Communications, Inc. of Burlington, Massachusetts, and Tazti, available from Voice Tech Group, Inc. of Cincinnati, Ohio. The voice recognition module may also be implemented using custom software specifically designed for the voice control of optically switchable devices.
In some embodiments, the voice and/or gesture control system includes a command processing module that interprets text to determine desired command instructions. In other words, the input to the module may be a text file (which may be generated by a voice recognition module) and the output may be a set of commands and/or instructions that may be interpreted by the window controller (or other controller on the network) to cause the associated target (e.g., sensor, emitter, media, or optically switchable device) to initiate the requested command. This functionality may also be referred to as language processing or natural language processing. Similar to the speech recognition module, the command processing module may be implemented using a number of available products and/or services, or using software developed specifically for a particular application.
In some embodiments, the voice and/or gesture control system includes an authentication module for implementing the authorization and/or security techniques discussed herein. For example, an authorization module may be used to ensure that the person giving the command is authorized to issue the command. The authentication module may include a blockchain program and/or an embedded encryption key. The blockchain procedure may include (e.g., peer-to-peer) voting. The encryption key may be linked to a target (e.g., a device). The authentication module may be designed to ensure that only authorized devices may be connected to a given network, facility, and/or service. The module may compare the optically switchable devices identified in the command with a list of optically switchable devices that the user is authorized to control. In the event that a user attempts to control an optically switchable device they are not authorized to control, the authentication module may be configured to notify the user (e.g., visually, printed, and/or audibly) that they are not authorized to control the associated optically switchable device. In other cases, no action is taken when an unauthorized command is given (e.g., the user is not notified and the target state is not changed (e.g., the optically switchable device is not switched)). Authentication may take into account the user's identification and/or other employee data, such as ratings, seniority, authentication, education, and/or department associations. The identification of the user may be provided to the authentication module, for example, via the user's facility entry tag. An authentication module may be required to limit access to sensitive medical information, hazardous manufacturing machinery, and/or any restricted information. Examples of authentications (e.g., using a blockchain program) can be found in PCT patent application serial No. PCT/US20/70123, which is incorporated by reference herein in its entirety.
In some embodiments, the voice and/or gesture control system includes a command execution module that executes commands on an associated optically switchable device. The command may be executed on the master controller, the network controller, and/or the window controller. In one example, the command may be executed by instructing the master controller to send all windows in a particular group or region to the desired level of tint. In general, the command may be executed on and/or by any of the control devices described herein, or by any of the control methods described herein.
In some embodiments, the voice and/or gesture control system includes a response generation module that generates a response. The response may be communicated to the user by a response delivery module. The response generated by the response generation module may be a textual response (e.g., an optical display, a printed display, and/or an audible sound). The textual response may be displayed to the user using a response delivery module, for example, optically on a screen. For example, the response delivery module may convert the text response into a voice response (e.g., in a sound file) that is played to the user. Any suitable text-to-speech method may be used for this purpose. For example, the response delivery module may convert the text response to a hard print (e.g., on paper). In general, the response generation module and the response delivery module may work together to generate a response and/or deliver the response to a user.
In some embodiments, a response to a query may be provided (e.g., automatically by the control system) and generated via the response generation module. One purpose of the response generation module and/or the response delivery module may be to inform the user of what command the control system has understood. Similarly, any of these modules may be used to notify the user, for example, of any action taken by the optically switchable device in response to a user command. In one example, the response generation module may generate a response that repeats back the basic command the user gave to change the target state (e.g., "window 1 to tint 4" or "tint window 1 to tint 4 when the weather becomes clear"). The response may then be delivered to the user by the response delivery module. The response generation module and/or the response delivery module may also be used to request clarification from the user. For example, if it is unclear whether the user wants to change window 1 or window 2, the response generation module may be used to prompt the user to clarify and/or provide further information.
FIG. 7 illustrates an exemplary voice and/or gesture control system 700 that includes various modules. The functional modules within the control system 700 include a speech recognition module 702, a command processing module 704, an authentication module 706, a command execution module 708, a response generation module 710, and a response delivery module 712.
In operation of some embodiments, a voice and/or gesture control system implements a method for controlling (e.g., changing) the state of a target, e.g., using voice control to control one or more devices. The at least one microphone may be configured and positioned to receive voice commands. The microphones may be located in any part of the facility where the target is located, for example in a peripheral structure where the target is located, for example, on the target (e.g., on the optically switchable device), on a wall device or on another electronic device such as a smartphone, tablet, laptop, PC, etc. One exemplary command includes "transition window 1 to tint 4". For example, if the listening mode is on, the microphone can listen and/or record voice commands from the user. Once recorded, the voice command may be converted and/or transcribed into a text command.
In some embodiments, the speech-to-text conversion is affected by one or more dictionaries, as described above. For example, words or phrases that sound similar to words or phrases stored in the relevant dictionary can be converted to the words/phrases stored in the dictionary, even if not identical. In a particular example, the user gives a command to "switch window 1 to tint 4," but the speech recognition module initially interprets the command as "switch window 1 to tint floor." If the relevant dictionary or dictionaries associated with the speech recognition module define phrases such as "window 1," "window 2," "tint 1," "tint 2," "tint 3," and "tint 4," but do not include any phrase containing the word "floor," the speech recognition module may recognize that the user likely said "tint 4" rather than the initially understood "tint floor," which has no meaning in the relevant dictionary or dictionaries. In other words, the results of the speech-to-text operation may be limited or otherwise affected by the associated dictionary being used.
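A toy sketch of this dictionary-constrained correction follows, using a generic string-similarity match (Python's difflib) rather than any particular speech engine:

```python
# Sketch: snap a recognizer's output ("tint floor") to the closest phrase
# the active dictionary actually contains ("tint four" / "tint 4").
import difflib

DICTIONARY = ["window 1", "window 2", "tint 1", "tint 2", "tint 3",
              "tint 4", "tint four"]

def snap_to_dictionary(heard_phrase, cutoff=0.6):
    match = difflib.get_close_matches(heard_phrase, DICTIONARY,
                                      n=1, cutoff=cutoff)
    return match[0] if match else None   # None -> ask the user to repeat

snap_to_dictionary("tint floor")   # -> 'tint four'
```

A real system would then map the corrected phrase (e.g., "tint four") onto the corresponding tint level before executing the command.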
In some embodiments, the text command is then interpreted. The interpretation may be done by the command processing module. Similar to the speech-to-text conversion, the interpretation of the text command may be affected by the dictionary or dictionaries being used. This operation may involve specifically identifying which target or targets (e.g., one or more optically switchable devices) the user requests to change, and/or identifying the specific change requested.
In some embodiments, it is determined whether the user is authorized to issue the requested command. For example, authorization may be accomplished by an authentication module. If the user is not authorized to issue the requested command, the operation may end (1) without any reaction, or (2) by generating a response to inform the user that they are not authorized to issue the command. The response may be provided visually (e.g., via a visual display on or adjacent to the optically switchable window, or in printed form) and/or audibly (e.g., by playing a sound file via a speaker on the optically switchable device, wall device, or other electronic device).
In some embodiments, a response to the user is generated if the user is authorized to issue the requested command. The response may be generated by a response generation module. The response may confirm that the requested command is being carried out. The response may be communicated to the user by a response delivery module. The response may be presented to the user visually (e.g., on a display), in printed form (e.g., as a hard copy), and/or audibly (e.g., through a speaker). The display and/or speaker may be provided on an optically switchable device, a wall device, or another electronic device (e.g., a smartphone, tablet, laptop, PC, etc.).
Fig. 8 shows a flow diagram of a method 800 of controlling one or more optically switchable devices (e.g., electrochromic windows) using voice control. Method 800 begins at operation 801, when a user provides a voice command. The voice command may be given in various ways, for example depending on the configuration of the voice control system and the robustness of the voice control process.
Next, in operation 803, it is determined whether the listening mode is turned on. When the listening mode is on, the microphone may listen and/or record voice commands from the user.
When the listening mode is off, the microphone may be turned off or may not accept voice commands associated with the optically switchable device. One example of when the microphone may remain "on" while the listening mode is "off" is when the microphone is located in the user's handset and the user is making an unrelated call on their handset. The determination in operation 803 may be made passively. If the listening mode is not turned on (e.g., is "off"), the microphone will not pick up and/or record the voice command issued in operation 801, and nothing will happen, as shown in operation 804. In some embodiments, the user may optionally manually activate the listening mode, as shown in operation 802. In this case, the method may continue at operation 801, where the user repeats the command. If the listening mode is turned on in operation 803, the method continues to operation 805, where the voice command is converted/transcribed to a text command. The speech-to-text conversion may be accomplished by a speech recognition module.
Next, in operation 807, the text command is interpreted. The interpretation may be done by a command processing module. Similar to the speech-to-text conversion discussed with respect to operation 805, the interpretation of the text command in operation 807 may be influenced by the dictionary or dictionaries being used. The operation may involve identifying specifically which optically switchable device or devices the user requests to change, and identifying the particular change requested. For example, if the user provided the command "switch window 1 to tint 4", the interpretation may involve determining that (1) the user is requesting a change to window 1, and (2) the requested change is switching the window to tint state 4.
The text command interpretation at operation 807 (and the speech-to-text conversion at operation 805) may be influenced by user preferences and/or user permissions. For example, if the user issues a "dim my windows" voice command, the system may interpret which windows are desired to be switched based at least in part on the windows the user typically switches and/or based at least in part on the windows the user is permitted to switch.
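As an illustration of preference- and permission-based interpretation, the following Python sketch resolves an underspecified command to concrete windows. The data structures and names (USER_PERMISSIONS, USER_HISTORY) are hypothetical stand-ins for whatever per-user records the control system maintains.

```python
from typing import Optional, Set

# Hypothetical per-user data; the text only states such associations exist.
USER_PERMISSIONS = {"alice": {"window 1", "window 2", "window 3"}}
USER_HISTORY = {"alice": ["window 2", "window 2", "window 1", "window 2"]}

def resolve_windows(user: str, requested: Optional[Set[str]]) -> Set[str]:
    """Resolve an underspecified command to concrete windows.

    Explicitly named windows win; otherwise fall back to the window the
    user most often switches; always limited to permitted windows."""
    allowed = USER_PERMISSIONS.get(user, set())
    if requested:
        return requested & allowed          # drop unauthorized targets
    history = USER_HISTORY.get(user, [])
    if history:
        favorite = max(set(history), key=history.count)
        if favorite in allowed:
            return {favorite}
    return allowed                          # no history: all permitted windows

print(resolve_windows("alice", None))       # -> {'window 2'}
```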
At operation 809, it is determined whether the user is authorized to issue the requested command. For example, authorization may be accomplished by an authentication module. If the user is not authorized to issue the requested command, the method ends at operation 810, where (1) there is no reaction, or (2) a response is generated to inform the user that they are not authorized to issue the command. The response may be provided visually (e.g., through a visual display on the optically switchable window, wall device, or other electronic device) and/or audibly (e.g., playing a sound file through a speaker on the optically switchable device, wall device, or another electronic device). Further details regarding response generation are provided below.
If the user is authorized to issue the requested command, the method may continue at operation 811 where the text command is executed. The command may be executed using any of the methods and systems described herein. The command may be executed using a command execution module. In some embodiments, the command may be executed on a network on which the optically switchable device is installed, and may involve one or more window controllers, network controllers, and/or master controllers. For example, operation 811 involves executing a command requested by the user in operation 801.
In operation 813, a response to the user is generated. The response may be generated by a response generation module. The response may confirm that the requested command is being carried out. The response may specifically repeat the content of the command so that the user knows whether she was correctly understood. An exemplary response may be "switching window 1 to tint 4". A simpler affirmative response, such as "ok", a green light, and/or a tone, may let the user know that she was heard without specifically repeating the content of the command (e.g., using the response generation module and/or the response delivery module). In a particular example, the response may include a request for the user to confirm that the system has correctly understood the desired command. In this case, the command may not be executed until such confirmation is received from the user.
At operation 815, the response is transmitted to the user. The response may be communicated to the user by a response delivery module. The response may be presented to the user visually (e.g., on a display) and/or audibly (e.g., through a speaker). The display and/or speakers may be provided on an optically switchable device, a wall device, or other electronic device (e.g., a smartphone, tablet, laptop, PC, etc.). The display and/or the speaker may be provided in the same unit as the microphone or they may be provided in a separate unit. In some cases where an audible response is provided, the response generation may involve generating the desired response text (e.g., using a response generation module) and then generating and playing a sound file corresponding to the desired text (e.g., using a response delivery module). The method 800 may be practiced in various ways. In some embodiments, certain operations occur in a different order than shown in fig. 8.
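The flow of method 800 can be summarized as a small pipeline. The sketch below strings the FIG. 7 modules together in Python; all function bodies are placeholder assumptions, and a real deployment would back each stage with the facility's controllers and databases.

```python
class VoiceControlPipeline:
    """End-to-end sketch of FIG. 8; module names follow FIG. 7."""
    def __init__(self, recognizer, interpreter, authenticator, executor, responder):
        self.recognizer = recognizer        # speech recognition module
        self.interpreter = interpreter      # command processing module
        self.authenticator = authenticator  # authentication module
        self.executor = executor            # command execution module
        self.responder = responder          # response generation + delivery

    def handle(self, audio, listening_mode, user):
        if not listening_mode:                        # operations 803/804
            return None                               # nothing happens
        text = self.recognizer(audio)                 # operation 805
        command = self.interpreter(text)              # operation 807
        if not self.authenticator(user, command):     # operations 809/810
            return self.responder(f"not authorized: {text}")
        self.executor(command)                        # operation 811
        return self.responder(f"executing: {text}")   # operations 813/815

pipeline = VoiceControlPipeline(
    recognizer=lambda audio: "switch window 1 to tint 4",
    interpreter=lambda text: {"target": "window 1", "state": "tint 4"},
    authenticator=lambda user, cmd: user == "alice",
    executor=lambda cmd: None,   # would relay to the relevant window controller
    responder=lambda message: message,
)
print(pipeline.handle(b"<audio>", listening_mode=True, user="alice"))
# -> "executing: switch window 1 to tint 4"
```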
In some embodiments, the speech control method involves the use of two or more dictionaries. FIG. 9 shows a flowchart of an example of a method 900 of controlling one or more optically switchable devices using two or more dictionaries associated with voice control.
The method 900 of FIG. 9 is similar to the method 800 of FIG. 8, except that the command is interpreted piecewise, with different dictionaries applied to different parts of the command. Many of the operations shown in FIG. 9 are the same as those shown in FIG. 8, and for the sake of brevity their description is not repeated.
In one implementation of the method 900, after determining that the listening mode is on in operation 903, part 1 of the voice command is converted to part 1 of the text command using a first dictionary in operation 925. The particular dictionary used may correspond to the portion of the command being interpreted. Next, it is determined in operation 926 whether there are additional portions of the voice command to interpret/convert to text. If there are additional portions to be interpreted, the method continues at operation 927, where the dictionary is optionally switched to another dictionary. The next dictionary selected may correspond to the next portion of the command to be interpreted. The method then continues at operation 925, where part 2 of the voice command is converted to part 2 of the text command, optionally using a different dictionary than that used for part 1 of the command. The loop of operations 925, 926, and 927 continues until all portions of the command have been converted to text using the appropriate dictionaries.
In one example, the full voice command is "switch window 1 to tint 4". A portion of the voice command (e.g., part 1) may involve identifying which optically switchable device the user desires to switch, in this case "window 1". Another portion of the voice command (e.g., part 2) may involve identifying the desired command/end optical state, in this case switching to "tint 4". Different portions of the command may be structured according to the needs of a particular system. More structured commands may be easier to process and/or interpret, which may make local processing a more attractive option. Less structured commands may be harder to process and/or interpret, which may make remote processing a more attractive option.
In some embodiments, after all portions of the voice command have been converted to text, different portions of the text command are concatenated together to define a full text command, and the method continues at operation 907. The remainder of the method is the same as that described with respect to fig. 8.
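A minimal sketch of this piecewise conversion is shown below, with one illustrative dictionary per command part; the dictionary contents and the space-joined concatenation are assumptions made for demonstration.

```python
import difflib

# One illustrative dictionary per command part (operation 925 uses the
# dictionary corresponding to the part currently being converted).
DICTIONARIES = [
    ["switch", "clear"],                       # part 1: action verb
    ["window 1", "window 2"],                  # part 2: target device
    ["tint 1", "tint 2", "tint 3", "tint 4"],  # part 3: end state
]

def convert_part(spoken, dictionary):
    match = difflib.get_close_matches(spoken.lower(), dictionary, n=1, cutoff=0.5)
    return match[0] if match else spoken

def convert_command(spoken_parts):
    # Loop of operations 925/926/927: convert each part with its dictionary,
    # then concatenate the parts into the full text command (cf. operation 907).
    parts = [convert_part(p, d) for p, d in zip(spoken_parts, DICTIONARIES)]
    return " ".join(parts)

# "tint floor" is snapped to a "tint N" entry; string similarity treats the
# four tints as near-ties, so a real system would rank by phonetic closeness.
print(convert_command(["switch", "window 2", "tint floor"]))
```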
Fig. 10 depicts a flow diagram similar to that shown in FIG. 8, in the context of a particular example in which a user in a facility such as an office building requests that the control system switch a window in the user's office to a particular tint state. Method 1030 begins at operation 1031, where the user requests by voice: "switch my window to tint 4". If the listening mode is not turned on, the system takes no action in response to the user's request, as shown in operation 1034. In some cases, the user may optionally manually activate the listening mode, as shown in operation 1032. In this case, the method may continue with operation 1031, where the user repeats the command. When the listening mode is turned on at operation 1033, the method continues at operation 1035, where the voice command is converted to a text command. At this point, the control system may have an audio recording of the voice command given by the user, and a text file indicating the content of the voice command.
Next, at operation 1037, the text command is interpreted. This may be done by the command processing module.
This operation may involve identifying which windows are to be changed. In this example, the user requests to change "my window". The control system may identify the windows to change by analyzing who gave the command, which windows the user is authorized to change, which windows the user often changes, which windows are associated with the user in a database, which windows the user was near when she made the command, etc. The identification of the user may be accomplished in a number of ways, as described above with respect to authorization. In this particular example, the control system uses speech recognition to identify the user and identifies the windows to be changed by utilizing a database that associates each employee with the windows in that employee's office. At the end of operation 1037, the control system has identified that the user wishes to switch all windows in the user's office to tint 4.
At operation 1039, it is determined whether the user is authorized to issue the command. This may be done by the authentication module. In this example, the authorization process involves speech recognition. The system may analyze the recorded voice command given by the user in operation 1031 and compare it to previous recordings from that user and other users. This allows the system to identify the person who issued the command in operation 1031. The authorization process may also involve ensuring that the identified user is permitted to change the windows she has requested to be changed. In this example, the control system checks whether the user is authorized to change the windows in her office by utilizing a database that associates each user with each window that the user is authorized to change. The user in this example works on floor 10 and is authorized to switch all windows on floor 10. Thus, the method continues with operation 1041, where the command is executed (e.g., via the command execution module) and all windows in the user's office begin switching to tint 4. In the event that the user issues an unauthorized command (e.g., the user is visiting a colleague on floor 9 and requests that the windows in the colleague's office go to tint 4, while the user is only authorized to switch windows on floor 10, where the user's office is located), the method may continue to operation 1040, where there is no reaction or the control system indicates that the user is not authorized to issue the requested command. The system may or may not explain the reason why the user is not authorized to make the requested command, and/or may explain which windows, if any, the user is authorized to change.
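The authorization step in this example reduces to set-membership checks against per-user records. The sketch below assumes hypothetical window identifiers and database contents; only the existence of such user-to-window associations is stated in the example above.

```python
# Hypothetical databases; the example only states that such associations
# (user -> office windows, user -> authorized windows) are maintained.
OFFICE_WINDOWS = {"alice": {"floor10-w1", "floor10-w2"}}
AUTHORIZED_WINDOWS = {"alice": {f"floor10-w{i}" for i in range(1, 21)}}

def authorize(user, windows):
    """A user is authorized only if every requested window is on her list."""
    return set(windows) <= AUTHORIZED_WINDOWS.get(user, set())

requested = OFFICE_WINDOWS["alice"]        # "my window", resolved earlier
if authorize("alice", requested):
    print(f"executing: switch {sorted(requested)} to tint 4")
else:
    print("no reaction, or notify: user not authorized")

# A floor-9 window is rejected, since alice may only switch floor-10 windows:
print(authorize("alice", {"floor9-w3"}))   # -> False
```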
In operation 1043, the control system generates a response indicating "windows in office darkening to tint 4". This may be done by the response generation module. The response may indicate which windows are to be affected, as well as the particular action they are to take (e.g., darken, lighten, the final requested tint state, etc.). In this example, operation 1043 involves generating a text file indicating what the response will be. Next, at operation 1045, the response is communicated to the user. This may be done by the response delivery module. In various instances, the response may be conveyed visually or audibly. In one example, this operation may involve generating a sound file corresponding to the response in the text file. The sound file may then be played to the user so that she knows that her command has been heard and that the system is acting on her request. In another example, the text file (or another file generated based at least in part on the text file) may be displayed to the user so that she can visually confirm that her command has been heard.
The examples in figs. 8-10 are provided for IGU targets, where the state change is a change in the tint of the IGU. Any state change of any target can be achieved in a similar manner.
In some embodiments that use gesture commands instead of voice commands, movement circuitry or a sensor (e.g., a camera) may be used instead of (or in addition to) a microphone in order to sense and record the user's commands. The movement circuitry may be communicatively coupled to a network that is communicatively coupled to a digital twin of the peripheral structure in which the target is located. Instead of a speech recognition module, a gesture recognition module may be used to analyze the movement circuitry and/or sensor (e.g., camera) data. For example, a user may be positioned within the field of view of a camera such that the user's movements can be captured, the movements corresponding to a desired control action to be taken with respect to a controllable target (e.g., device), such as a tintable window. As another example, the user's movement may be captured by a mobile device manipulated (e.g., moved) by the user, the movement corresponding to a desired control action to be taken with respect to a controllable target (e.g., device), such as a tintable window.
FIG. 11A shows an example in which a user interacts with device 1105 to control a target, i.e., the optical states of electrochromic windows 1100a-1100d. In this example, the device 1105 is a wall device as described above. In some embodiments, the wall device 1105 is or includes a smart device, such as an electronic tablet or similar device. The device 1105 may be any device configured to control the electrochromic windows 1100a-1100d, including but not limited to a smartphone, tablet, laptop, PC, etc. The device 1105 may run an application/program configured to control the electrochromic windows. In some embodiments, device 1105 communicates with access point 1110 through a wired connection or a wireless connection (e.g., Wi-Fi, Bluetooth Low Energy, ZigBee, WiMAX, etc.). The wireless connection may allow at least one device (e.g., a target device) to connect to a network and/or the internet, and/or to communicate wirelessly with other devices over a certain area (e.g., within a certain range). The access point 1110 may be a networking hardware device that allows Wi-Fi compatible devices to connect to a wired network. The device 1105 may communicate with a controller (e.g., a window controller, a network controller, and/or a master controller) through a connection scheme.
In some embodiments, the access point is connected to a switch to complete network communications between a user's control device (e.g., mobile circuit) and a control unit of a target (e.g., window, media, or other device) that is to receive the command. For example, the switch may be connected to a router and/or a control unit. The connections between the different elements may be wired and/or wireless, as appropriate for a particular application. For example, the access point may be a wireless access point and the connection between the access point and the device may be a wireless connection. In some embodiments, the device may be any number of electronic devices configured to control the state of a target (e.g., a media or an electrochromic window). The router may include firewall protection to enhance security. The control unit may be a window controller, a network controller or a master controller. For example, if the control unit is not a window controller, it may relay instructions to the relevant window controller over the network.
Fig. 11B shows an example of a user device 1105 connected to an access point 1110, which is further connected to a switch 1115. Switch 1115 may be connected to both router 1120 and controller (i.e., control unit) 1125. Router 1120 may include firewall protection to enhance security. Controller 1125 may be a window controller, a network controller, or a master controller. If controller 1125 is not a window controller, controller 1125 may relay instructions to the relevant window controller over the network.
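The relay behavior described above can be sketched as follows; the class names and in-memory registry are illustrative assumptions standing in for the facility's actual controller network.

```python
class WindowController:
    """Terminal controller that applies a command to its window."""
    def __init__(self, window_id):
        self.window_id = window_id

    def apply(self, command):
        print(f"{self.window_id}: applying {command}")

class NetworkController:
    """A non-window controller relays instructions to the relevant window
    controller; the in-memory registry stands in for the real network."""
    def __init__(self):
        self._window_controllers = {}

    def register(self, controller):
        self._window_controllers[controller.window_id] = controller

    def relay(self, window_id, command):
        self._window_controllers[window_id].apply(command)

# Command path: device -> access point -> switch -> controller (FIG. 11B):
nc = NetworkController()
nc.register(WindowController("window 1"))
nc.relay("window 1", "tint 4")   # -> "window 1: applying tint 4"
```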
Fig. 12A shows an example of the apparatus 1205 connected to an access point 1210 that is connected to a controller 1225. Each of these connections may be wired and/or wireless. Fig. 12B shows an example where the apparatus 1205 is directly connected to the controller 1225. The connection may be wired and/or wireless. Fig. 12C shows an example of the apparatus 1205 connected to the cloud 1230 (e.g., the internet). The cloud 1230 is also connected to a router 1220 that is connected to a switch 1215 that is connected to a controller 1225. The connection may be wired and/or wireless, as appropriate for a particular application. In a particular example, the apparatus 1205 can be a smartphone that is wirelessly connected with the cloud 1230 (e.g., via a communication network capable of transmitting at least a third, fourth, or fifth generation communication (e.g., 3G, 4G, or 5G communication)).
In some embodiments, the interactive system to be controlled by the user includes media (e.g., visual and/or audio content) for display, for example, to occupants of a building. The display may comprise a still-picture or video projection arrangement. The display may include a transparent organic light-emitting device (TOLED). The display may be integrated into a display construction having a frame (e.g., with louvers). An example of a display construction can be found in U.S. Provisional Patent Application Ser. No. 62/975,706, filed February 12, 2020, and entitled "TANDEM VISION WINDOW AND MEDIA DISPLAY", which is incorporated herein by reference in its entirety.
In some embodiments, the display construction is coupled with a viewing (e.g., tintable viewing) window. The viewing window may comprise an insulated glass unit (IGU). The display construction may include one or more glass panes. The display (e.g., display matrix) may include light-emitting diodes (LEDs). The LEDs may comprise organic materials (e.g., organic light-emitting diodes, abbreviated herein as "OLEDs"). The OLEDs may comprise transparent organic light-emitting diodes (abbreviated herein as "TOLEDs"), which are at least partially transparent. The display may have 2000, 3000, 4000, 5000, 6000, 7000, or 8000 pixels along its fundamental length scale. The display may have any number of pixels between the aforementioned values along its fundamental length scale (e.g., about 2000 pixels to about 4000 pixels, about 4000 pixels to about 8000 pixels, or about 2000 pixels to about 8000 pixels). The fundamental length scale may comprise a diameter of a bounding circle, a length, a width, or a height. The fundamental length scale may be abbreviated herein as "FLS". The display construction may comprise a high-resolution display. For example, the display construction may have a resolution of at least about 550, 576, 680, 720, 768, 1024, 1080, 1280, 1920, 2160, 3840, 4096, 4320, or 7680 pixels by at least about 550, 576, 680, 720, 768, 1024, 1080, 1280, 1920, 2160, 3840, 4096, 4320, or 7680 pixels (at 30Hz or at 60Hz). The first number of pixels may specify the height of the display and the second number of pixels may specify its length. For example, the display may be a high-resolution display having a resolution of 1920x1080, 3840x2160, 4096x2160, or 7680x4320. The display may be a standard-definition, enhanced-definition, high-definition, or ultra-high-definition display. The display may be rectangular. The image projected by the display matrix may be refreshed at a frequency (e.g., refresh rate) of at least about 20Hz, 30Hz, 60Hz, 70Hz, 75Hz, 80Hz, 100Hz, or 120 hertz (Hz). The FLS of the display construction can be at least about 20", 25", 30", 35", 40", 45", 50", 55", 60", 65", 80", or 90 inches ("). The FLS of the display construction can be any value between the aforementioned values (e.g., about 20" to about 55", about 55" to about 100", or about 20" to about 100").
In some embodiments, at least a portion of a window surface in a facility is used to display various media using a glass display construction. The display may allow viewing (e.g., at least partial viewing) of the environment outside the window (e.g., an outdoor environment), for example, when the display is not operating. The display may be used to display media (e.g., as disclosed herein) to augment the external view through (e.g., optical) overlay, augmented reality, and/or illumination (e.g., the display may act as a light source). The media may be used for entertainment and non-entertainment purposes. The media may be used for video conferencing. The media may be used for work (e.g., data analysis, charting, and/or video conferencing). The media may be used for educational, health, security, purchasing, monetary, or entertainment purposes. The media may be presented by a person (e.g., a remote employee) who is not at the peripheral structure where the media display is located. The media may be presented to a person at the peripheral structure where the media display is located. For example, the media display may mirror a person (e.g., and his or her actions, e.g., in real time) in the peripheral structure in which the media display and the local person are located. The media may be used as a coaching tool by mirroring local personnel. For example, the mirrored media may be used as a fitness coaching tool, a speech coaching tool, a gesture coaching tool, and/or a behavior coaching tool. The media may be presented to persons at the peripheral structure where the media display is located and to remote persons, e.g., in collaged, stacked, and/or branched displays. The media may be manipulated (e.g., by utilizing the display construction). The utilization of the display construction may be direct or indirect. Indirect utilization of the media may be through an input device such as an electronic mouse or keyboard. The input device may be communicatively coupled (e.g., wired and/or wirelessly) to the media. Direct utilization may be using the display construction as a touch screen, via a body part of the user (e.g., a finger) or a contact device (e.g., an electronic pen or stylus).
In some embodiments, the media may be displayed by a transparent media display construction. The transparent display construction configured to display media may be disposed on, or coupled (e.g., attached) to, a window, door, wall, partition, or any other architectural element of a facility. The architectural element may be a fixed or non-fixed element. The architectural element (e.g., a window, wall, or partition) can be stationary or moving (e.g., a moving window or moving door). The architectural element can include a tintable window. The architectural element can include a tintable substance (e.g., an optically switchable device, such as an electrochromic device). The optically switchable device may change its transparency, absorbance, or color, for example at least in the visible spectrum. The user may control the use of the media and/or the tint state of the architectural element, e.g., individually or in association with each other. A user looking outward through the transparent media display in a peripheral structure can optionally see both the media and the environment external to the peripheral structure through the media display.
Embodiments described herein relate to a vision window having a tandem (e.g., transparent) display construction. In certain embodiments, the vision window is an electrochromic window. Electrochromic windows may include solid-state and/or inorganic electrochromic (EC) devices. The vision window may be in the form of an insulated glass unit ("IGU"). When the IGU includes an electrochromic (abbreviated herein as "EC") device, it may be referred to as an "EC IGU". The EC IGU may tint (e.g., darken) the room in which the IGU is disposed and/or provide a tinted (e.g., darker) background as compared to a non-tinted IGU. The tinted IGU can provide a background that is preferred (e.g., required) for acceptable (e.g., good) contrast on the (e.g., transparent) display construction. In another example, a window having a (e.g., transparent) display construction may replace a television (abbreviated herein as "TV") in commercial and residential applications. The (e.g., transparent) display construction and the EC IGU may together provide a visual privacy glass function, for example, because the display may enhance the privacy provided by the EC glass alone.
Fig. 13A shows an example of a window 1302 framed in a window frame 1303, and a fastener structure 1304 including a first hinge 1305a and a second hinge 1305b that facilitate rotation of the display construction 1301 about a hinge axis, e.g., in the direction of arrow 1311. The window may be a smart window, such as an electrochromic (EC) window. The window may be in the form of an EC IGU. In one embodiment, mounted to the window frame (e.g., 1303) are one or more display constructions (e.g., transparent displays) (e.g., 1301) that are at least partially transparent. In one embodiment, the one or more display constructions (e.g., transparent displays) include TOLED technology, although it should be understood that the invention is not limited to or by such technology. In one embodiment, the one or more display constructions (e.g., transparent displays) are mounted to the frame (e.g., 1303) via fastener structures (e.g., 1304). In one embodiment, the fastener structure (also referred to herein as a "fastener") comprises a bracket. In one embodiment, the fastener structure includes an L-shaped bracket. In one embodiment, the L-shaped bracket has a length approximately or exactly equal to the length of a side of the window (as does fastener 1304 in the example shown in FIG. 13A). In embodiments, the window has a fundamental length scale (e.g., length) of at most about 60 feet ('), 50', 40', 30', 25', 20', 15', 10', 5', or 1'. The FLS of the window can be any value between the foregoing values (e.g., 1' to 60', 1' to about 30', 30' to 60', or 10' to 40'). In embodiments, the fundamental length scale (e.g., length) of the window is at least about 60', 80', or 100'. In one embodiment, the display construction (e.g., transparent display) encompasses an area that (e.g., substantially) matches the surface area of a sheet (e.g., pane) of the window.
Fig. 13B shows examples of various windows in a facade 1320 of a building, including windows 1321, 1322, and 1323, and display constructions 1, 2, and 3. In the example shown in FIG. 13B, display construction 1 is at least partially transparent and is positioned over window 1323 such that the entire window 1323 is covered by the display construction, and a user can view the external environment (e.g., flowers, grass, and trees) through display construction 1 and window 1323. Display construction 1 is coupled to the window with a fastener that facilitates rotation of the display construction about an axis parallel to the bottom horizontal edge of the window, in the direction of arrow 1327. In the example shown in FIG. 13B, display constructions 2 and 3 are at least partially transparent and are disposed over window 1321 such that the entire window 1321 is covered by the two display constructions, each covering (e.g., extending over) half of the surface area of window 1321, and a user can view the external environment (e.g., flowers, grass, and trees) through display constructions 2 and 3 and window 1321. Display construction 2 is coupled to window 1321 with a fastener that facilitates rotation of the display construction about an axis parallel to the left vertical edge of the window, in the direction of arrow 1326. Display construction 3 is coupled to the window with a fastener that facilitates rotation of the display construction about an axis parallel to the right vertical edge of window 1321, in the direction of arrow 1325.
In some embodiments, the display construction comprises a hardened transparent material, such as plastic or glass. The glass may be in the form of one or more glass panes. For example, the display construction may include a display matrix (e.g., an array of lights) disposed between two glass panes. The array of lights may comprise an array of colored lights, for example an array of red, green, and blue lights, or an array of cyan, magenta, and yellow lights. The array of lights may include the colors of light used in electronic screen displays. The array of lights may include an array of LEDs (e.g., OLEDs, e.g., TOLEDs). The matrix display (e.g., the array of lights) may be at least partially transparent (e.g., to the ordinary human eye). A transparent OLED may transmit a majority (e.g., greater than about 30%, 40%, 50%, 60%, 80%, 90%, or 95%) of the light intensity and/or wavelengths sensed by the ordinary human eye. The matrix display may cause as little disturbance as possible to a user looking through the array. The array of lights may cause as little disturbance as possible to a user looking through the window in which the array is disposed. The display matrix (e.g., array of lights) may be maximally transparent. At least one of the glass panes of the display construction may have a conventional glass thickness. Conventional glass may have a thickness of at least about 1 millimeter (mm), 2mm, 3mm, 4mm, 5mm, or 6mm. Conventional glass can have a thickness between any of the foregoing values (e.g., 1mm to 6mm, 1mm to 3mm, 3mm to about 4mm, or 4mm to 6mm). At least one glass pane of the display construction may have a thin glass thickness. The thin glass may have a thickness of at most about 0.4 millimeters (mm), 0.5mm, 0.6mm, 0.7mm, 0.8mm, or 0.9mm. The thin glass may have a thickness between any of the foregoing values (e.g., 0.4mm to 0.9mm, 0.4mm to 0.7mm, or 0.5mm to 0.9mm).
The glass of the display construction may be at least partially transmissive (e.g., in the visible spectrum). For example, the glass can be at least about 80%, 85%, 90%, 95%, or 99% transmissive. The glass can have a percent transmittance between any of the foregoing percentages (e.g., about 80% to about 99%). The display construction may include one or more panes (e.g., glass panes). For example, a display construction may include multiple (e.g., two) panes. The glass panes can have (e.g., substantially) the same thickness, or different thicknesses. The forward pane may be thicker than the rearward pane. The rearward pane may be thicker than the forward pane. The front may be in the direction of the intended viewer (e.g., standing in front of the display construction and viewing it). The rear may be in the direction of the (e.g., tintable) window. One glass pane may be thicker relative to the other glass pane. The thickness of the thicker glass may be at least about 1.25*, 1.5*, 2*, 2.5*, 3*, 3.5*, or 4* the thickness of the thinner glass. The symbol "*" designates the mathematical operation "times". The display construction, including one or more panes and a display matrix (e.g., a light array or LCD), may have a transmittance of at least about 20%, 30%, 35%, 40%, 45%, 50%, 60%, 70%, 80%, or 90%. The display construction can have a percent transmittance between any of the foregoing percentages (e.g., about 20% to about 90%, about 20% to about 50%, about 20% to about 40%, about 30% to about 40%, about 40% to about 80%, or about 50% to about 90%). A higher percent transmittance indicates that a higher intensity and/or broader spectrum passes through the material (e.g., glass). The transmittance may be for visible light. Transmittance may be measured as visible light transmittance (abbreviated herein as "Tvis"), representing the amount of light in the visible portion of the spectrum that passes through a material. The transmittance may be relative to the intensity of the incident light. The display construction may transmit at least about 80%, 85%, 90%, 95%, or 99% of the visible spectrum (e.g., wavelength spectrum) of light passing therethrough. The display construction may transmit a percentage between any of the foregoing percentages (e.g., about 80% to about 99%). In some embodiments, a liquid crystal display is used in place of the light array.
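As a rough worked example of how the component figures above could combine, the sketch below multiplies component transmittances to estimate the overall Tvis of a display construction. This first-order approximation ignores interface reflections and coatings, and the component values are illustrative, not taken from this disclosure.

```python
# A first-order estimate of a display construction's visible transmittance
# (Tvis): multiply the transmittances of the stacked components. This
# ignores interface reflections and coatings, so it is only an
# approximation, and the component values below are illustrative.
def construct_tvis(component_tvis):
    total = 1.0
    for t in component_tvis:
        total *= t
    return total

# Two glass panes at ~90% each and a display matrix at ~50%:
print(f"{construct_tvis([0.90, 0.90, 0.50]):.1%}")  # -> 40.5%
# consistent with the ~20%-90% construct transmittance range cited above
```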
Fig. 14 shows a schematic example of a display construction assembly 1400 prior to lamination thereof, the display construction comprising a thicker glass pane 1405, a first adhesive layer 1404, a display matrix 1403, a second adhesive layer 1402, and a thinner glass pane 1401, wherein the matrix is connected via wiring 1411 to circuitry 1412 that controls at least one aspect of the display construction, which is coupled to a fastener 1413.
In some embodiments, multiple types of interfaces are employed to provide user control of interactive targets (e.g., systems, devices, and/or media). The interactive target may be controlled, for example, using a control interface. The control interface may be local and/or remote. The control interface may communicate over a network. The control system may be communicatively coupled to the network to which the targets are communicatively coupled. Examples of control interfaces include a digital twin (e.g., a representative model) of the facility being manipulated. For example, mobile circuitry may be used to control one or more interactive devices (e.g., optically switchable windows, sensors, emitters, and/or media displays). The mobile circuitry may include a gaming-type controller (e.g., a pointing device) or a virtual reality (VR) user interface. When additional new devices are installed in the facility (e.g., in a room thereof) and coupled to the network, the new targets (e.g., devices) may be detected (e.g., and included in the digital twin). The detection of new targets and/or their inclusion in the digital twin may be done automatically and/or manually. For example, detection and/or inclusion of a new target in the digital twin may not require (e.g., any) manual intervention.
In some embodiments, the digital twin includes a digital model of the facility. The digital twin may comprise a virtual three-dimensional (3D) model of the facility. The facility may include static elements and/or dynamic elements. For example, a static element may include a representation of a structural feature of the facility, and a dynamic element may include a representation of an interactive device having a controllable feature. The 3D model may include visual elements. The visual elements may represent facility fixtures. The fixtures may include a wall, a floor, a door, a shelf, a structural (e.g., walk-in) closet, a fixed light, an electrical panel, an elevator shaft, or a window. The fixtures may be secured to the structure. The visual elements may represent non-fixed items. The non-fixed items may include a person, a chair, a movable light, a table, a sofa, a movable closet, or a media projection. The visual elements may represent facility features including floors, walls, doors, windows, furniture, appliances, people, and/or interactive objects. The digital twin may represent the environment of the real facility, similar to a virtual world for computer games and simulations. The creation of the 3D model may include analysis of a Building Information Modeling (BIM) model (e.g., Autodesk Revit files in the .rvt format), for example, to derive representations of the (e.g., basic) fixed structures and of movable items such as doors, windows, and elevators. The 3D model may include building details related to the design of the facility, such as 3D geometry, details of facades, floor plans, and/or project settings related to the facility. The 3D model may include annotations (e.g., with two-dimensional (2D) drawing elements). The 3D model may facilitate access to information from a model database of the facility. The 3D model may be used to plan and/or track various stages in the life cycle of the facility (e.g., facility conception, construction, maintenance, and/or demolition). The 3D model may be updated during the life cycle of the facility. The update may occur periodically, intermittently, upon occurrence of an event (e.g., related to the structural state of the facility), in real time, upon availability of human resources, and/or over time. The digital twin may include the 3D model and may be updated as the 3D model of the facility is updated. The digital twin may be linked to (e.g., and thus updated with) the 3D model. Real time may include up to 15 seconds (sec), 30 seconds, 45 seconds, 1 minute (min), 2 minutes, 3 minutes, 4 minutes, 5 minutes, 10 minutes, 15 minutes, or 30 minutes from the occurrence of a change in the peripheral structure (e.g., a change initiated by a user).
In some embodiments, a digital twin (e.g., the 3D model of a facility) is defined at least in part by determining the layout of the real facility using one or more sensors (e.g., optical, acoustic, pressure, gas velocity, and/or distance-measuring sensors). The sensor data may be used on its own to model the environment of the peripheral structure. The sensor data may also be used in conjunction with a 3D model of the facility (e.g., a BIM model) to model the environment of the peripheral structure. The BIM model of the facility may be obtained before, during, and/or after the facility is built. The BIM model of the facility may be updated (e.g., manually and/or using sensor data) during operation of the facility (e.g., in real time). Real time may include the period during which a change in the facility occurs. Real time may include up to 2 hours, 4 hours, 6 hours, 8 hours, 12 hours, 24 hours, 36 hours, 48 hours, 60 hours, or 72 hours from the occurrence of a change in the facility.
In some embodiments, the dynamic elements in the digital twin include target (e.g., device) settings. The target settings may include (e.g., existing and/or predetermined): tint values, temperature settings, and/or light switch settings. The target settings may include available actions in a media display. The available actions may include menu items or hotspots displayed in the content. The digital twin may include virtual representations of targets and/or movable objects (e.g., a chair or a door) and/or occupants (e.g., from a camera, or from a stored avatar image of the actual person). In some embodiments, a dynamic element may be a target (e.g., a device) that is newly inserted into the network and/or disappears from the network (e.g., due to a failure or relocation). The digital twin may reside in any circuitry (e.g., processor) operatively coupled to the network. The circuitry on which the digital twin resides may be in the facility, external to the facility, and/or in the cloud. In some implementations, a bidirectional link is maintained between the digital twin and the real circuitry. The real circuitry may be part of the control system. The real circuitry may be included in a master controller, a network controller, a floor controller, a local controller, or any other node in the processing system (e.g., in the facility or outside the facility). For example, the real circuitry may use the bidirectional link to notify the digital twin of changes in the dynamic elements and/or static elements so that the 3D representation of the peripheral structure may be updated, e.g., in real time. Real time may include the period during which a change in the peripheral structure occurs. Real time may include up to 15 seconds (sec), 30 seconds, 45 seconds, 1 minute (min), 2 minutes, 3 minutes, 4 minutes, 5 minutes, 10 minutes, 15 minutes, or 30 minutes from the occurrence of the change in the peripheral structure. The digital twin may use the bidirectional link to inform the real circuitry of a manipulation (e.g., control) action entered by the user on the mobile circuitry. The mobile circuitry may be a remote control (e.g., including a handheld pointing device, manual input buttons, or a touch screen).
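A minimal sketch of a digital twin record holding static and dynamic elements, updated over the bidirectional link, might look as follows in Python; all field and method names are illustrative assumptions, not part of this disclosure.

```python
import time
from dataclasses import dataclass, field

@dataclass
class DynamicElement:
    target_id: str
    kind: str                                  # e.g. "tintable window"
    state: dict = field(default_factory=dict)  # e.g. {"tint": 4}
    last_update: float = 0.0

@dataclass
class DigitalTwin:
    static_elements: list = field(default_factory=list)   # walls, floors, ...
    dynamic_elements: dict = field(default_factory=dict)  # target_id -> element

    def on_device_change(self, target_id, kind, state):
        """Called over the bidirectional link by the real circuitry. A newly
        networked target is added automatically; an existing one is updated."""
        element = self.dynamic_elements.setdefault(
            target_id, DynamicElement(target_id, kind))
        element.state.update(state)
        element.last_update = time.time()  # supports the real-time bound above

twin = DigitalTwin(static_elements=["wall", "floor", "window frame"])
twin.on_device_change("window 1", "tintable window", {"tint": 4})
print(twin.dynamic_elements["window 1"].state)  # -> {'tint': 4}
```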
In some embodiments, the user's one or more mobile circuitry devices are aligned with (e.g., linked to) a virtual 3D "digital twin" model of the facility (or any portion thereof), e.g., via a WiFi or other network connection. The mobile circuitry may include a remote (e.g., mobile) control interface. The mobile circuitry may include a pointing device, a game controller, and/or a virtual reality (VR) controller. The mobile circuitry may not interact with the physical infrastructure directly, e.g., it may only forward network communications to and/or from the digital twin via the aligned communication channel. User interaction with any controlled device in the peripheral structure may thus be neither direct nor physical. The user's interaction with the target may be indirect. The user's interaction with the target may be without tactile touch, light projection, and/or human voice. The control action taken by the user to control the target may be based at least in part on the relative position of the mobile circuitry manipulated by the user with respect to the modeled space in the digital twin (e.g., virtual movement within the modeled peripheral structure). The control action taken by the user to control the target need not be based on (e.g., need not depend on) a spatial relationship between the user and the digital twin. For example, the user may use a remote-control pointing device and point at a portion of a presentation. The presentation may be displayed on a TOLED display construction disposed in a line of sight between the user and a window (e.g., a smart window). The coupling between the mobile circuitry and the target may be time based and/or action based. For example, a user may use a remote control to point at the presentation and thereby couple with the presentation. The coupling may begin when the pointing duration exceeds a duration threshold. The coupling may be initiated by clicking on the remote control while pointing. The user may then point to a location in the presentation that triggers a drop-down menu. The drop-down menu may become visible (i) when the pointing exceeds a time threshold, (ii) when the user presses a button on the remote control, and/or (iii) when the user performs a gesture (e.g., as disclosed herein). The user can then select from the menu. The selection may be initiated (i) when the pointing exceeds a time threshold, (ii) when the user presses a button on the remote control, and/or (iii) when the user performs a gesture (e.g., as disclosed herein). The action performed by the user with the mobile circuitry (e.g., remote control) may be communicated to the network and thereby to the digital twin, and then communicated to the target. Thus, the user may communicate with the target indirectly through the digital twin. The mobile circuitry (e.g., remote control) may be positioned relative to the peripheral structure once, at time intervals, and/or continuously. Once the relative position of the mobile circuitry (e.g., remote control) and the peripheral structure is determined, the user can use the remote control anywhere (e.g., within or outside the peripheral structure). Outside the peripheral structure may be within the facility or outside the facility. For example, a conference room may establish its position relative to a remote control.
Thereafter, a user may use a relatively positioned remote control to manipulate the light intensity of a light bulb disposed in the conference room, while located in the conference room or while located outside the conference room (e.g., at home).
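The time-based coupling described above (pointing beyond a duration threshold) can be sketched as follows; the threshold value and function names are illustrative assumptions.

```python
import time

COUPLE_THRESHOLD_S = 1.5   # illustrative duration threshold for pointing

_pointed_since = {}        # target_id -> time at which pointing began

def update_pointing(target_id, now=None):
    """Call on each tracking tick with the target currently pointed at.

    Returns the target_id once the dwell time exceeds the threshold,
    signalling that coupling should begin; a button click or gesture
    could equally serve as the trigger, as described above."""
    now = time.time() if now is None else now
    started = _pointed_since.setdefault(target_id, now)
    for other in list(_pointed_since):   # forget targets no longer pointed at
        if other != target_id:
            del _pointed_since[other]
    if now - started >= COUPLE_THRESHOLD_S:
        return target_id                 # coupled
    return None

print(update_pointing("bulb 1", now=0.0))  # -> None (just started pointing)
print(update_pointing("bulb 1", now=2.0))  # -> 'bulb 1' (dwell exceeded)
```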
In some embodiments, the mobile circuitry (e.g., a remote controller) may control (e.g., any) interactive and/or controllable target (e.g., device) in the facility, or any portion thereof, so long as (i) the target and (ii) the mobile circuitry (e.g., remote controller) are communicatively coupled to the digital twin (e.g., via a network). For example, the facility may include interactive targets, such as one or more sensors, emitters, tintable windows, or media displays, coupled to a communication network. In some embodiments, the user interacts with the digital twin from a (e.g., arbitrary) location within or outside of the facility. For example, the remote controller device may include a virtual reality (VR) device, e.g., with a head-mounted device (e.g., a binocular display) and/or a handheld controller (e.g., a motion sensor with or without input buttons). The mobile circuitry may include an Oculus virtual reality player controller (OVRPlayerController). In some embodiments, a remote control interface may be used that provides (i) a visual representation of the digital twin within which the user can navigate the virtual facility, and/or (ii) user input actions for moving within the 3D model. The user input actions may include (1) pointing at a desired interactive target to be controlled (e.g., to change the state of the target), (2) a gesture, and/or (3) a button press indicating a selection action to be taken with the mobile circuitry (e.g., remote control). The remote controller may be used to manipulate the interactive target by pointing at it (e.g., for coupling), gesturing in other directions, and/or pressing one or more buttons operatively coupled to the mobile circuitry (e.g., buttons disposed on the enclosure of the mobile circuitry). The interfacing between the mobile circuitry and the digital twin may be performed without depicting the digital twin on a screen. The interfacing between the user and the digital twin may likewise be performed without displaying the digital twin on a screen. The interfacing between the mobile circuitry and the digital model may not require (e.g., any) optical sensor as a facilitator. Such embodiments employ a different mode of input from augmented reality applications that operate through interaction with a screen (e.g., through the use of an optical sensor such as a camera).
In some embodiments, a mobile circuit (e.g., a handheld controller) is used that does not have any display or screen that can depict a digital representation of a peripheral structure and/or object. For example, instead of the user performing virtual navigation within the peripheral structure, the user's actual position may be determined in order to establish the user's position in the digital twin, e.g. for use as a reference in relation to the user's pointing actions. For example, a mobile circuit (e.g., a handheld controller) may include geo-tracking capabilities (e.g., GPS, UWB, BLE, and/or dead reckoning) such that the location coordinates of the mobile circuit may be transmitted to the digital twin using any suitable network connection established by the user between the mobile circuit and the digital twin. For example, the network connection may include, at least in part, a transmission link used by a hierarchical controller network within the facility. The network connection may be separate from the facility's controller network (e.g., using a wireless network such as a cellular network).
In some embodiments, the user may be coupled to the requested target. The coupling may include gestures using a movement circuit. The coupling may include an electronic trigger event in the mobile circuit. The coupling may include a movement, a pointing, a clicking gesture, or any combination thereof. For example, the coupling may begin, at least in part, by pointing at the target for a period of time that is above a (e.g., predetermined) threshold. For example, the coupling can be initiated at least in part by clicking a button (e.g., a target selection button) on a remote control that includes the mobile circuit. For example, the coupling may be initiated at least in part by moving the moving circuit in the direction of the target. For example, the coupling may begin at least in part by pointing at the front portion of the mobile circuit in the direction of the target (e.g., for a time above a first threshold) and clicking a button (e.g., for a time above a second threshold). The first threshold and the second threshold may be (e.g., substantially) the same or different.
Fig. 15 shows an exemplary embodiment of a control system in which a real physical peripheral structure (e.g., room) 1500 includes a network of controllers for managing interactive network devices under the control of a processor 1501 (e.g., of a master controller). As part of a modeling and/or simulation system executing on computing assets, the structure and content of peripheral structure 1500 are represented in a 3D-model digital twin 1502. The computing assets may be co-located with, or remote from, the peripheral structure 1500 and the processor (e.g., master controller) 1501. A network link 1503 in peripheral structure 1500 connects processor 1501 with a plurality of network nodes, including interactive target 1505. Interactive target 1505 is represented as a virtual object 1506 within digital twin 1502. A network link 1504 connects processor 1501 with digital twin 1502.
In the example of fig. 15, a user located in the peripheral structure 1500 carries a handheld controller 1507 with pointing capability (e.g., for coupling with target 1505). The location of handheld controller 1507 may be tracked, for example, via a network link (not shown) with digital twin 1502. The link may include transmission media contained within the network 1503. The handheld controller 1507 is represented as a virtual handheld controller 1508 within the digital twin 1502. Based at least in part on the tracked position and pointing direction of the handheld controller 1507, when the user initiates a pointing event (e.g., aims at a particular target and presses an action button on the handheld controller), the event is sent to the digital twin 1502. The digital twin 1502 then associates the event with a target, e.g., by casting a digital ray 1509 from the tracked location within the digital twin 1502. The digital ray 1509 intersects the virtual device 1506 at intersection 1510. The resulting interpretation of the action made by the user is reported by digital twin 1502 to processor 1501 via network link 1504. In response, processor 1501 relays a control message to interactive device 1505 to initiate a command action in accordance with the gesture (or other input action) made by the user.
Fig. 16 shows an exemplary method corresponding to the embodiment of fig. 15. For example, a user carrying mobile circuitry (e.g., a handheld remote control) in a peripheral structure (e.g., a building) represented by a digital twin may wish to interact with a particular interactive target. In operation 1600, the user couples to the target, for example, by pointing and/or clicking the tracked remote control to indicate a requested control action. The mobile circuitry may be coupled to the target by pointing at the target (e.g., for a period longer than a threshold time). The mobile circuitry may be coupled to the target by a coupling command. The coupling command may include a tactile, verbal, visual, and/or written command. The coupling may include any of the voice and/or gesture commands disclosed herein. The coupling may include pressing a button operatively (e.g., communicatively) coupled to the mobile circuitry, the target, and/or the digital twin.
In some embodiments, the mobile circuitry may be oriented in at least two directions. For example, the mobile circuitry may have a front direction and a back direction. The mobile circuitry may be capable of distinguishing at least two, three, four, five, or six spatial directions. The directions may include up, down, front, back, right, or left. The directions may include north, south, east, and west. A direction may be relative, e.g., relative to a previous position of the mobile circuitry. A direction may be absolute (e.g., within a measurable error range). The directions may be in accordance with a global positioning system (GPS). Coupling of the mobile circuitry (e.g., remote control) with the target (e.g., media projection) may include pointing the front of the mobile circuitry at the target, e.g., for a time above a threshold. Using a network communication route from the remote controller to the digital twin, the intersection between the pointing direction of the mobile circuitry and the target can be mapped digitally in the digital twin. The intersection point may be found along a digital ray cast from the tracked location of the mobile circuitry (e.g., handheld controller) in the pointing direction, for example, to identify the requested interactive target (e.g., a device and/or a control element on the device). In the example shown in fig. 16, a remote controller communicatively coupled to the digital twin (e.g., and tracked via a network communication route) is pointed at a target disposed in the peripheral structure in operation 1601. A virtual digital ray may be envisioned from the pointing remote controller to the target at which it is directed. The network communication route may include a (e.g., separate) network connection. In operation 1602, it is determined whether any predetermined event (e.g., any control event) is associated with the intersection at the interactive target. For example, the intersection may indicate a light switch target; the event associated with pointing at and/or clicking on the light switch may be a change in the on/off state of the light switch. If no associated event is found for the intersection, no action is taken and the method ends at operation 1603. If an associated event is found, the method proceeds to operation 1604 to send an event command from the digital twin to a processor (e.g., a controller) operatively coupled with the light switch in the peripheral structure. In operation 1605, the processor receives the event command and triggers the associated event in the corresponding physical peripheral structure. Triggering the associated event can be done by sending a command to the appropriate controller for the interactive apparatus (e.g., sending a tint command to the corresponding window controller).
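The intersection test of operations 1601-1602 amounts to casting a ray from the tracked controller pose and testing it against the bounding volumes of virtual targets. Below is a standard slab-method ray/axis-aligned-bounding-box test in Python; the geometry and the use of AABBs (rather than, e.g., mesh intersection) are illustrative assumptions.

```python
# Slab-method intersection of a pointing ray with a virtual target's
# axis-aligned bounding box (AABB) in the digital twin's 3D model.
def ray_hits_aabb(origin, direction, box_min, box_max):
    """Return the entry distance t along the ray, or None if it misses."""
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-9:
            if not (lo <= o <= hi):
                return None            # ray parallel to and outside this slab
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    if t_near <= t_far and t_far >= 0:
        return max(t_near, 0.0)
    return None

# Virtual light switch one metre in front of the tracked controller:
print(ray_hits_aabb((0, 0, 0), (0, 0, 1), (-0.1, -0.1, 0.9), (0.1, 0.1, 1.0)))
# -> 0.9; an event command would then be sent to the controller network
```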
In some embodiments, social interaction and/or communication is provided via a digital twin. When the digital twin is coupled to the communication network, it (e.g., essentially) enables a social experience in which remote participants join the facility and interact with a target (e.g., a device or media) in the facility via the digital twin. The digital twin concept may allow multiple users to manipulate interactive targets disposed in a peripheral structure, whether or not a participant is in the peripheral structure, and whether a participant is local or remote. For example, multiple users may simultaneously access (e.g., interact with) the digital twin in a manner that is perceptible to other users. For example, when the users employ VR headsets with visual displays and audio communications, the users may see and/or hear each other in the virtual space of the digital twin representation. For example, when the users employ a video conferencing tool, the users may see and/or hear each other in the virtual space of the digital twin representation. For example, a tracked user may be represented as an avatar placed at a corresponding location in the digital twin and displayed to other users. The avatar may be generic and/or may include photographic data (e.g., captured using a camera or other personal recognition method) that may be pre-stored or captured during user interaction with the digital twin. Personal identification methods may include facial recognition, fingerprint scanning, retina scanning, or other biometric-based methods for confirming the identity of a user.
FIG. 17 illustrates an example in which multiple users interact via a digital twin that provides access to controllable features of an interactive target within a peripheral structure environment. For example, the building network 1700 may include network communication links between master controllers, network controllers, window controllers, and interactive targets (such as sensors, actuators, transmitters, media displays, computing devices, and/or electrochromic windows). Fig. 17 represents a group of individuals in a meeting where a mobile circuit (e.g., laptop) 1702 is connected to a building network 1700 by a communication link (e.g., WiFi) 1701 to provide a media presentation. A projector 1704 that projects a media display 1705 (e.g., on a display construct) to a set of room occupants 1706 is coupled to the building network 1700 by a communication link 1703. Accordingly, media content of a presentation (e.g., a computer application such as a spreadsheet or slide) generated by the apparatus 1702 may be sent to the projector 1704 for display. The media content may also be sent to digital twin 1710 over link 1711 such that the media content may be represented as a visible element in digital twin 1710. The media content may instead be sent over a direct link (e.g., Bluetooth (BLE) or WiFi) between the device 1702 and the projector 1704. The apparatus 1702 may be connected in parallel with the building network 1700 such that the media content may be provided to the digital twin 1710, or the simulation model may be maintained without including the media content in the digital twin.
Digital twin 1710 may be accessible by user 1713 via a communication link 1712 between the digital twin 1710 and user interface equipment. For example, the user interface equipment may include a VR headset 1714 and a VR handheld controller 1715. Another user 1721 may simultaneously access the digital twin 1710 via a communication link 1720. The user 1721 may have a VR headset 1722 and a VR handheld controller 1723. In some embodiments, the digital twin 1710 may include dynamic elements of the room housing the group conference 1706 (e.g., representations of people sitting around a conference table, representations of remote participants at virtual locations to which the remote participants navigate within the VR model, and/or instantaneous views of media content being displayed in the room). The digital twin 1710 may provide an audio signal captured by a microphone (e.g., disposed in the room and/or in VR equipment) for reproduction to other participants.
In some embodiments, network communications between the controller (e.g., MC), the digital twin, the user mobile circuitry (e.g., remote controller), and the local interactive device include one-way or two-way messaging functionality. For example, a combination of local and/or wide area networks with appropriate gateways may be configured to facilitate (i) exchanging messages, (ii) updating the digital twin, and/or (iii) remote interaction of the user with the target (e.g., for remotely controlling an interactive target). The message may be associated with a change in the status of the target and/or a user of the conference (independent of or associated with the target, independent of or associated with the peripheral structure in which the target is located, dependent on or independent of the subject of the conference). The controller may be configured (e.g., by suitable software programming) to interact with the digital twin. The interaction may be used to provide data identifying changes in the static elements and the states of the dynamic elements included in the digital twin. The digital twin may be configured to provide (i) an intuitive ability to remotely manipulate the target; (ii) a virtual reality experience of at least one user navigating in a virtual 3D model of the peripheral structure; (iii) study of various dynamic states in the digital twin; and/or (iv) exchange of interactive (e.g., control) actions (e.g., events) related to the target, the actions initiated by at least one user, e.g., via a virtual reality interface. Remote manipulation may or may not include an electromagnetic and/or sonic beam directed from a remote controller to a target. In some embodiments, the remote manipulation is free of electromagnetic and/or sonic beams directed from the remote controller to the target. In some embodiments, the communicative coupling of the remote controller with the target may be (e.g., only) through a communicative coupling to the network of the digital twin. In some embodiments, the communicative coupling of the remote controller to the target may be (e.g., only) through the digital twin (e.g., using the network as a communication path communicatively coupling the target, the digital twin, and the remote controller (including the mobile circuitry)). The communicative coupling may include wired and/or wireless communication. The digital twin may be configured to process a user input event, e.g., (i) to identify whether the input event corresponds to a valid command related to the target (e.g., from a predetermined list of valid control actions for the target) and/or (ii) to forward the valid command (e.g., to at least one controller or directly to the target) for manipulating the target (e.g., manipulating a state of a manipulable target). In some embodiments, at least one controller monitors its ongoing data and/or command exchange with the local interactive target, e.g., to collect and/or forward update information for the digital twin. The update information may include any dynamic state change, such as an event generated remotely at the initiation of a user.
In some embodiments, the messaging sequence includes one or more data messages and one or more command message exchanges between (i) one or more local targets and the processor, (ii) the processor and the digital twin, and/or (iii) the digital twin and the mobile circuit. For example, a processor (e.g., a controller, such as a master controller) may send a data message to the digital twin when one or more new targets join the network from time to time. The data may represent new static and/or dynamic elements for inclusion in the digital twin 3D model of the facility. The data may represent changes in the (e.g., system) state of the dynamic element of the target.
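To make the message kinds concrete, a hedged Python sketch follows; the class and field names are illustrative assumptions, not identifiers from this disclosure.

    # Sketch of the three message families described above.
    from dataclasses import dataclass, field
    import time

    @dataclass
    class JoinMessage:                 # new target announces itself to the controller
        target_id: str
        target_kind: str               # e.g., "tintable_window", "media_display"

    @dataclass
    class ElementUpdate:               # controller -> digital twin
        target_id: str
        element_kind: str              # "static" (new geometry) or "dynamic" (state)
        state: dict = field(default_factory=dict)    # e.g., {"tint_level": 2}
        timestamp: float = field(default_factory=time.time)

    @dataclass
    class ControlAction:               # mobile circuitry -> digital twin
        user_id: str
        target_id: str
        action: str                    # e.g., "toggle", "set_tint"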
In some embodiments, the mobile circuitry exchanges with the digital twin one or more messages that enable a user to control (including monitor and/or alter) the operation of real objects (e.g., by manipulating virtual twin elements of the real objects in the digital twin). For example, a user may activate their mobile circuitry (e.g., a remote game controller such as a VR headset and a handheld VR controller), e.g., by clicking a button, to create a link with the digital twin. In some implementations, upon initial connection, the digital twin and the mobile circuitry exchange data messages with data for displaying a simulated scene in the digital twin, e.g., according to a default starting position. For example, the virtual simulation may begin at the entrance of the peripheral structure, or at any other point of interest (e.g., selected by the user). In some embodiments, the starting location may correspond to the current location of the user when the user is actually located in the represented peripheral structure (e.g., the starting message may provide geographic coordinates of a GPS-equipped user remote control). The data or commands within the messages between the mobile circuitry and the digital twin may include navigation actions (producing an updated view returned from the digital twin) and/or control actions (e.g., point and click) to indicate a desired change in a changeable state of the target.
In some embodiments, the digital twin validates the received control action, for example, by mapping the control action to an indicated location in the digital twin and/or checking against a list of valid actions. For example, a digital twin may send a message to a processor (e.g., a controller) only when a user's control action event corresponds to a recognizable and authorized interaction. When a valid interaction is found, a command message may be sent from the digital twin to a processor (e.g., a controller) and forwarded to the affected target. After executing the command, one or more acknowledgement messages may be propagated back to the digital twin, and the 3D model of the digital twin may optionally be updated accordingly. For example, after performing a change in tint value of an Insulated Glass Unit (IGU), a digital twin model of the IGU may be adjusted to display a corresponding change in tint level.
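A minimal sketch of that validation step, assuming each target carries a predetermined set of valid actions (the table and callbacks below are illustrative assumptions):

    # Sketch: the digital twin forwards a control action only when it is
    # recognizable and authorized for the indicated target.
    VALID_ACTIONS = {
        "igu_12": {"set_tint"},            # an insulated glass unit (IGU)
        "light_switch_3": {"toggle"},
    }

    def validate_and_forward(action, send_to_controller, update_model):
        allowed = VALID_ACTIONS.get(action.target_id, set())
        if action.action not in allowed:
            return False                   # invalid or unauthorized: no message sent
        send_to_controller(action)         # command message to the processor/controller
        # after an acknowledgement propagates back, mirror the change in the 3D
        # model, e.g., a tint change is displayed on the virtual IGU
        update_model(action.target_id, action.action)
        return True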
Fig. 18 is an exemplary message sequence during operation of a control system in a peripheral structure (e.g., a building for which a digital twin has been constructed) including a controller and/or processor 1800 and one or more interactive interconnected targets (e.g., devices) 1802. One or more new targets may join the network from time to time. For example, a new target sends a join message 1804 to the processor and/or controller 1800 when it is interconnected. The new target may, for example, represent a new static and/or dynamic element for inclusion within the digital twin 3D model. For example, when a new static element has been added, a new static element message 1805 is sent from processor and/or controller 1800 to digital twin 1801. The processor and/or controller 1800 and the target 1802 may exchange data and/or command messages 1806 (e.g., continuously or intermittently), for example, as part of their normal operation. In some embodiments, the controller and/or processor 1800 may identify changes in the exchanged data and/or command messages (e.g., 1806) that manifest as a changed system state of a dynamic element. Accordingly, the processor and/or controller 1800 may send a new dynamic element message 1807 to the digital twin 1801. The digital twin 1801 may then update the digital twin (e.g., the 3D model of the peripheral structure) to reflect the new state (e.g., a tint state of a window, or content of a display screen in a media presentation).
In the example of fig. 18, a user (whether remotely located or in the peripheral structure) undertakes network interactions that are completely independent of the interactions of the processor and/or controller 1800.
For example, the mobile circuitry 1803 (e.g., embedded in a remote controller) and the digital twin 1801 exchange messages that enable a user to monitor and/or change the operation of the real object 1802, e.g., by manipulating its virtual twin element in the digital twin 1801. For example, the user may activate their mobile circuitry (e.g., a remote game controller such as a VR headset and a handheld VR controller), e.g., by clicking a button, to cause a launch message 1808 to be sent to the digital twin 1801. In response, the digital twin 1801 may send a start message 1809 to the mobile circuitry 1803. The start message may comprise, for example, data for displaying the simulated scene in the digital twin, for example according to a default start position. For example, the virtual simulation may begin at the entrance of the peripheral structure, or at any other point of interest (e.g., selected by the user).
In the example of fig. 18, a user may invoke gestures (e.g., movements) and/or button presses on a remote control that includes the mobile circuitry 1803, e.g., to navigate through various locations in the 3D model. A corresponding navigation action message 1810 may be sent from the mobile circuitry 1803 to the digital twin 1801, and data for the updated view returned from the digital twin 1801 to the mobile circuitry 1803 in an updated view message 1811. Once the user approaches the requested interactive target in the simulation, the user may initiate a control action (e.g., point and click), causing a control action message 1812 to be sent to the digital twin 1801.
In some embodiments, the digital twin 1801 validates the control action by mapping the control action to an indicated location in the 3D model and/or checking against a list of valid actions. When a valid control action event is detected, digital twin 1801 may send a command message 1813 to processor and/or controller 1800 to identify a corresponding target and a corresponding state change (e.g., switching of an identified lighting circuit, or selection of a menu item in a projected display of a laptop presentation). A command message 1814 may be sent from the processor and/or controller 1800 to the affected target 1802. After executing the command, the target 1802 may send an acknowledgement message 1815 to the processor and/or controller 1800. If the change affects a dynamic element included in the digital twin, the processor and/or controller 1800 may send an update dynamic element message 1816 to the digital twin 1801. If the current simulation being viewed by the user includes the dynamic element, an update view message 1817 may be sent to the remote controller 1803, e.g., to provide new data adjusted for the new dynamic state.
It may sometimes be requested and/or advantageous to reduce (e.g., eliminate) direct contact between a user and a target device (e.g., a surface of the target device). For example, reducing direct interaction between a user and a target device may reduce the risk of infection by pathogens (e.g., fungi, viruses, and/or bacteria) that reside on the device (e.g., on its surface). The pathogen may be infectious and/or cause disease. The target device may be an interactive target. The target device may be disposed in a peripheral structure. The target device may be a third party device. The target device may be a service apparatus (e.g., an apparatus that provides a service to a user).
In some embodiments, the target device is operatively coupled to a network. The network is operatively coupled to, or includes, a control system (e.g., one or more controllers, such as a hierarchical control system). In some implementations, the user's mobile circuitry is paired with a target device (e.g., a service apparatus). When operatively (e.g., communicatively) coupled to a network (e.g., and to a control system), the target device may receive an identification tag. The target device may be operatively coupled to the mobile circuitry over the network (e.g., using indirect coupling). The coupling between the mobile circuitry and the target device may be through an application of the appliance and/or the target device. Physical proximity between the target device and the mobile circuitry (e.g., and the user) may not be required. The target device may be selected using information about the location of the user and/or of the user's mobile circuitry. The user may be located at a distance of at most 50 meters (m), 25m, 10m, 5m, 2m, or 1.5m from the target device. The user may be located at a distance from the target device between any of the above distances (e.g., about 50m to about 1.5m, about 50m to about 25m, or about 25m to about 1.5m). The distance between the user and the target device may be greater than the distance required for direct pairing (e.g., Bluetooth-type pairing) between the apparatuses. Physical proximity between the user (and/or the user's mobile circuitry) and the target device (e.g., the service apparatus) may not be required. The user may select a target device (e.g., a service apparatus) from a list (e.g., a drop-down menu). The user may need to operatively couple the mobile circuitry to the network to which the target device is coupled. Communication between the mobile circuitry and the service device may be unidirectional (e.g., from the mobile circuitry to the target device, or vice versa) or bidirectional (e.g., over the network) between the target device and the mobile circuitry. One user may control one or more target devices (e.g., service apparatuses). A target device may be controlled by one or more users. Multiple users may send requests to a target device, and the requests may be placed in a queue (e.g., ordered based on a prioritization scheme such as time of receipt, urgency, and/or user seniority).
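The queueing of concurrent user requests might be sketched as follows; the weighting of receipt time, urgency, and seniority is an assumption, since the prioritization scheme is left open above.

    # Sketch of a prioritized request queue for a shared target device.
    import heapq, itertools, time

    class RequestQueue:
        def __init__(self):
            self._heap = []
            self._tie = itertools.count()          # stable FIFO tie-breaker

        def submit(self, user, request, urgency=0, seniority=0):
            # lower tuples pop first: urgent requests, then senior users, then FIFO
            key = (-urgency, -seniority, time.time(), next(self._tie))
            heapq.heappush(self._heap, (key, user, request))

        def next_request(self):
            if not self._heap:
                return None
            _key, user, request = heapq.heappop(self._heap)
            return user, request

    q = RequestQueue()
    q.submit("user_a", "latte", seniority=1)
    q.submit("user_b", "espresso", urgency=1)      # urgent request is served first
    assert q.next_request() == ("user_b", "espresso")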
In some embodiments, the target device is identified by the network when connected to the network (the connection may be wired and/or wireless). The target device may be identified by an identification code (e.g., RFID, QR-ID, barcode). In some embodiments, the identification code is not a visible (e.g., scannable) identification code. The identification code may include a non-contact identification (e.g., electromagnetic and/or optical). The optically recognized indicia may be a machine readable code, for example, comprised of an array of black and white squares or lines (e.g., a bar code or Quick Response (QR) code). Electromagnetic identification methods may include Radio Frequency Identification (RFID). The RFID may be an ultra high frequency RFID. The identification method may include a transponder (e.g., an RF transponder), a receiver, a transmitter, or an antenna. The identification method may be passive or active (e.g. emitting electromagnetic radiation). The identification method may include Near Field Communication (NFC).
In some embodiments, a user may control a target device (e.g., a service apparatus). For example, a user may control mechanical, electrical, electromechanical, and/or electromagnetic (e.g., optical and/or thermal) actions of a target device. For example, a user may control the physical actions of the target device. For example, a user may control whether a target device is turned on or off; whether any of its controllable compartments is open or closed; its direction (e.g., left, right, up, down); enter and/or change settings; enable or deny access; transfer data to memory; reset data in memory; upload and/or download software or executable code to the target device; have executable code executed by a processor associated with and/or incorporated into the target device; change channels; change volume; or return operations to default settings and/or modes. The user may change settings stored in a data set associated with the target device, and/or configure or reconfigure software associated with the target device. The memory may be associated with and/or part of the target device.
In some embodiments, the target device is operatively (e.g., communicatively) coupled to a network of the peripheral structure (e.g., a communications, power, and/or control network). Once the target device is operatively coupled to the network of the peripheral structure, it may become one of the targets controllable via the digital twin. The new target (e.g., a third party target) may provide one or more services to the user. For example, the target (e.g., target device) may be a dispenser. The dispenser may dispense food, beverages, and/or equipment upon command. The service devices may include media players (e.g., the media may include music, video, television, and/or the internet), manufacturing equipment, medical devices, and/or sports equipment. The target device may include a television, a recording apparatus (e.g., a Video Cassette Recorder (VCR), a Digital Video Recorder (DVR), or any non-volatile memory), a Digital Versatile Disc (DVD) player, a digital audio file player (e.g., MP3 player), a cable and/or satellite converter set top box ("STB"), an amplifier, a Compact Disc (CD) player, a game console, home lighting, an electrically controlled window shade (e.g., a window blind), a tintable window (e.g., an electrochromic window), a fan, an HVAC system, a thermostat, a personal computer, a dispenser (e.g., a soap, beverage, food, or equipment dispenser), a washing machine, or a dryer. In some embodiments, the target device does not include an entertainment device (e.g., a television, a recording apparatus (e.g., a VCR, a DVR, or any non-volatile memory), a DVD player, a digital audio file player (e.g., MP3 player), a cable and/or satellite converter set top box ("STB"), an amplifier, a CD player, or a game console). The command may be initiated by contacting the target or by communicating with the target (e.g., remotely). For example, a user may press a button on the target device to dispense an item (e.g., food, beverage, and/or equipment). For example, a user may interact with a target device by using mobile circuitry. The mobile circuitry may include a cellular telephone, a touch pad, or a laptop computer.
In some embodiments, the network may be a low latency network. The low latency network may include edge computing. For example, at least one (e.g., any) controller of the (e.g., hierarchical) control system may be part of a computing system. For example, at least one (e.g., any) circuit coupled to the network may be part of the computing system. A delay (e.g., lag or deferral) may refer to the time interval between a cause and its effect on some physical change in the observed system. For example, the delay may physically be a result of the finite speed at which any physical interaction can propagate. For example, delay may refer to the time interval between a stimulus and the response to the stimulus. For example, latency may refer to the delay before a data transfer begins following an instruction to transfer the data. The network may include optical fibers. The delay may be at least about 3.33 microseconds (µs) or 5.0 µs for each kilometer of fiber path length. The delay of the network may be at most about 100 milliseconds (ms), 75ms, 50ms, 25ms, 10ms, 5ms, 4ms, 3ms, 2ms, 1ms, or 0.5ms. The delay target for the network can be any value between the foregoing values (e.g., about 100ms to about 0.5ms, about 100ms to about 50ms, about 50ms to about 5ms, or about 5ms to about 0.5ms). The network may comprise a packet switched network. The delay may be measured as the time from the source sending a packet to the destination receiving the packet (e.g., one-way delay). The delay may be the measured one-way delay from the source to the destination plus the one-way delay from the destination back to the source (e.g., round-trip delay).
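The per-kilometer fiber figures can be checked with a short computation; the refractive index below is a typical value for silica fiber, not one stated in this disclosure (3.33 µs/km corresponds to propagation at the vacuum speed of light, a lower bound).

    # Worked check of the fiber propagation delays quoted above.
    C_KM_PER_S = 299_792.458        # speed of light in vacuum, km/s
    FIBER_INDEX = 1.468             # typical refractive index of silica fiber (assumed)

    def one_way_delay_us(path_km):
        return path_km * FIBER_INDEX / C_KM_PER_S * 1e6

    print(round(one_way_delay_us(1.0), 2))      # ~4.9 us per km, near the 5.0 us figure
    print(round(2 * one_way_delay_us(100), 0))  # ~979 us round trip over 100 km, ~1 ms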
In some embodiments, the mobile circuitry includes an application related to the target device (e.g., a third party device). The application may depict one or more service options provided by the target device. For example, if the target device is a beverage dispenser, the application may provide for selection among the various beverage options that the target device makes available to the user. For example, if the target device is a food dispenser, the application may provide for selection among the various food options that the target device makes available to the user. For example, if the target device is a mask dispenser, the application may provide for dispensing of the one mask option available to the user.
In some embodiments, the user may be located in a peripheral structure (e.g., a facility such as a building). One or more sensors may be used to locate the user. The user may carry a tag. The tag may include radio frequency identification (e.g., RFID) technology (e.g., a transceiver), Bluetooth technology, and/or Global Positioning System (GPS) technology. The radio frequency may comprise an ultra-wideband radio frequency. The tag may be sensed by one or more sensors disposed in the peripheral structure. These sensors may be provided in a device aggregate. The device aggregate may include a sensor or an emitter. The sensor may be operatively (e.g., communicatively) coupled to a network. The network may have low latency communications, for example, within the peripheral structure. The radio waves (e.g., emitted and/or sensed by the tag) may include broadband or ultra-wideband radio signals. The radio waves may include pulsed radio waves. The radio waves may include radio waves utilized in communications. The radio waves may be at a medium frequency of at least about 300 kilohertz (KHz), 500KHz, 800KHz, 1000KHz, 1500KHz, 2000KHz, or 2500KHz. The radio waves may be at a medium frequency of up to about 500KHz, 800KHz, 1000KHz, 1500KHz, 2000KHz, 2500KHz, or 3000KHz. The radio waves may be at any frequency between the aforementioned frequencies (e.g., from about 300KHz to about 3000KHz). The radio waves may be at a high frequency of at least about 3 megahertz (MHz), 5MHz, 8MHz, 10MHz, 15MHz, 20MHz, or 25MHz. The radio waves may be at a high frequency of up to about 5MHz, 8MHz, 10MHz, 15MHz, 20MHz, 25MHz, or 30MHz. The radio waves may be at any frequency between the aforementioned frequencies (e.g., from about 3MHz to about 30MHz). The radio waves may be at a very high frequency of at least about 30 megahertz (MHz), 50MHz, 80MHz, 100MHz, 150MHz, 200MHz, or 250MHz. The radio waves may be at a very high frequency of up to about 50MHz, 80MHz, 100MHz, 150MHz, 200MHz, 250MHz, or 300MHz. The radio waves may be at any frequency between the aforementioned frequencies (e.g., from about 30MHz to about 300MHz). The radio waves may be at an ultra-high frequency of at least about 300 megahertz (MHz), 500MHz, 800MHz, 1000MHz, 1500MHz, 2000MHz, or 2500MHz. The radio waves may be at an ultra-high frequency of up to about 500MHz, 800MHz, 1000MHz, 1500MHz, 2000MHz, 2500MHz, or 3000MHz. The radio waves may be at any frequency between the aforementioned frequencies (e.g., from about 300MHz to about 3000MHz). The radio waves may be at a super high frequency of at least about 3 gigahertz (GHz), 5GHz, 8GHz, 10GHz, 15GHz, 20GHz, or 25GHz. The radio waves may be at a super high frequency of up to about 5GHz, 8GHz, 10GHz, 15GHz, 20GHz, 25GHz, or 30GHz. The radio waves may be at any frequency between the aforementioned frequencies (e.g., from about 3GHz to about 30GHz).
In some embodiments, the identification tag of the occupant comprises a location device. A location device (also referred to herein as a "positioning device") may comprise a radio transmitter and/or receiver (e.g., a wideband or ultra-wideband radio transmitter and/or receiver). The positioning device may comprise a Global Positioning System (GPS) device. The positioning device may comprise a Bluetooth device. The positioning device may comprise a radio wave transmitter and/or receiver. The radio waves may include broadband or ultra-wideband radio signals. The radio waves may include pulsed radio waves. The radio waves may include radio waves utilized in communications. The radio waves may be at any of the frequencies disclosed herein (e.g., medium frequency from about 300KHz to about 3000KHz, high frequency from about 3MHz to about 30MHz, very high frequency from about 30MHz to about 300MHz, ultra-high frequency from about 300MHz to about 3000MHz, and/or super high frequency from about 3GHz to about 30GHz).
In some embodiments, the positioning device facilitates positioning within a range of error. The error range of the positioning device may be up to about 5 meters (m), 4m, 3m, 2m, 1m, 0.5m, 0.4m, 0.3m, 0.2m, 0.1m, or 0.05m. The error range for the positioning device can be any value between the foregoing values (e.g., about 5m to about 0.05m, about 5m to about 1m, about 1m to about 0.3m, and about 0.3m to about 0.05 m). The error range may represent the accuracy of the positioning device.
In some embodiments, the user seeks a service from a target device that is a service apparatus. A user may approach the service device and open an application, related to the facility (or to a service provided by and/or in the facility), on their mobile circuitry (e.g., handheld processor). The mobile circuitry may be operatively coupled (e.g., wirelessly) to a network. In parallel with, and/or as a result of, the opening of the application, the network can ascertain the location of the user. The location of the user may be ascertained via the mobile circuitry and/or via a tag carried by the user. The tag may transmit (e.g., emit) the user's identification and/or the user's location. The mobile circuitry may be handheld mobile circuitry (e.g., a cellular phone, laptop computer, tablet computer, game controller, virtual reality controller, or any other remote controller). The transmission may be sensed by one or more sensors disposed in the peripheral structure. By ascertaining the location of the user, the application may present qualified targets (e.g., service devices) near the user. The user may select the requested target from the eligible targets presented by the application. Selection of a service device may open its interface (e.g., and thus allow its services to be selected). The user may select the requested service. The user selection may be sent to the service device over the network, and the service device may satisfy the user's request. In this way, service selection can be performed without requiring the user to physically touch the service device. The user may then retrieve the completed service. Alternatively, the user may disable the location service and select a remote service device to satisfy the request. The user may or may not view (e.g., in the application) the digital twin of the peripheral structure in which the service device is located. The user may employ gesture controls to operate the service device. For example, the user may employ their mobile circuitry to point at a service option visible on the service device, which may be translated by the control system into an option selection.
For example, a user seeks a latte beverage from an automatic coffee dispenser that can prepare espresso, macchiato, cappuccino, latte, and mocha. The user approaches the coffee dispenser and opens the facility application on their cellular telephone coupled to the facility network. In parallel with, and/or as a result of, the opening of the application, the network can ascertain the location of the user. The user's location may be ascertained via the user's cellular telephone and/or via an identification tag (e.g., an ID tag) carried by the user (e.g., a tag that allows entry into the facility). The tag may transmit (e.g., emit) the user's identification and/or the user's location. The transmission may be sensed by one or more sensors disposed in the facility. By ascertaining the user's location, the application may present qualified targets (e.g., service devices) near the user. The user may select the coffee dispenser from the qualifying targets presented by the application. In one option, selection of the coffee dispenser may open an interface allowing selection among espresso, macchiato, cappuccino, latte, and mocha beverages. The user may select latte. The user selection may be sent to the coffee dispenser over the network, and the coffee dispenser may satisfy the user's request for a latte coffee beverage. In this way, service selection of a latte coffee beverage can be performed without requiring the user to physically touch the coffee dispenser. The user may then retrieve the latte coffee beverage without touching the coffee dispenser. In another option, selection of the coffee dispenser may allow the room in which the coffee dispenser is positioned to be viewed as a digital twin. The user may point the cellular device at a coffee beverage option displayed on the coffee dispenser. The gesture may be sent to the control system via the network and translated by the control system into an option selection. The user selection may be sent to the coffee dispenser over the network, and the coffee dispenser may satisfy the user's latte coffee beverage request. In this way, service selection of a latte coffee beverage can be performed without requiring the user to physically touch the coffee dispenser. The user may then retrieve the latte coffee beverage without touching the coffee dispenser.
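The contactless flow of this example can be sketched end to end; the device record, the nearby radius, and the network.send interface are all illustrative assumptions.

    # Sketch of contactless service selection: locate the user, present nearby
    # devices, and send the selection over the facility network.
    NEARBY_RADIUS_M = 10.0

    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def eligible_devices(user_pos, devices):
        """Qualified service devices near the user's ascertained location."""
        return [d for d in devices if distance(user_pos, d["pos"]) <= NEARBY_RADIUS_M]

    def order(network, user, device, option):
        if option not in device["options"]:
            raise ValueError(f"{device['id']} does not offer {option}")
        # the selection travels over the network; the user never touches the device
        network.send(device["id"], {"user": user, "request": option})

    dispenser = {"id": "coffee_1", "pos": (3.0, 4.0),
                 "options": {"espresso", "macchiato", "cappuccino", "latte", "mocha"}}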
In some examples, there are various target devices (e.g., machines) of the same type in the facility, for example, several printers, several coffee machines, or several food dispensers. The user may send a request to a target device type; a nearest-device sketch follows this paragraph. The particular target device of that type that performs the request may be the device closest to the user. The user's location may be ascertained via the network (e.g., using facial recognition and/or an ID tag). The control system may use the user's location to identify a particular target device of the requested type for performing the requested task. The user may override such a recommendation by the control system. The user may request that a particular target device perform the task. Certain target devices may be dedicated to certain groups of users (e.g., departments). There may be a hierarchy in granting users rights to use the service device. The hierarchy may depend on the user's location, level, and/or department. The hierarchy may depend on the date and time the request was made, and/or on the requested execution time of the request. A user group may be identified by the control system. The user group may be determined based on the users' activities at work and/or outside work. The members of the group may be notified of the other group members and/or of the existence of the group. Sometimes certain functions (e.g., human resources, management, and/or facilities) may be made aware of the group and/or its members. For example, in the event of a fire in the facility, a group of firefighters in the facility may be notified. For example, in the event of an emergency situation in the facility, a group of medical professionals in the facility may be notified.
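Routing a typed request to the closest matching device, with a user override, might look like the following sketch (reusing the distance helper from the previous sketch; the record fields are assumptions):

    # Sketch: pick the nearest device of the requested type unless the user
    # overrides the control system's recommendation.
    def pick_device(user_pos, devices, requested_type, override_id=None):
        candidates = [d for d in devices if d["type"] == requested_type]
        if override_id is not None:               # explicit user choice wins
            return next(d for d in candidates if d["id"] == override_id)
        return min(candidates, key=lambda d: distance(user_pos, d["pos"]))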
In some embodiments, the user switches between a gesture control mode and a tap control mode. In the gesture control mode, the user may point the mobile circuitry at a target device in space. In the tap control mode, the user need not point the mobile circuitry at a target device in space, but may instead select among options related to the target device that appear on the mobile circuitry (e.g., via a drop-down menu selection).
Selection between options presented on the mobile circuitry may be made by using a touch screen of the mobile circuitry, and/or by scrolling through the options, such as by using a scrolling function (e.g., represented by an arrow) implemented in the mobile circuitry.
In some embodiments, the interactive target is operatively coupled to the network via a computing interface. The computing interface may include an Application Programming Interface (API). The computing interface may define interactions between multiple software and/or hardware intermediaries. The computing interface may identify the requests that can be made, how those requests are made, the data formats that should be used, and/or any particular conventions to follow. The computing interface may provide an extension mechanism that allows a user to extend existing functionality. For example, the API may be target specific, or the API may be designed using industry standards (e.g., to ensure interoperability). When a user requests a service from a service device via mobile circuitry and/or gesture control (e.g., via the computing interface), a message is sent to a server (e.g., as part of a control system); the service device may receive a notification, pick the request from the server's queue, process the service request, and deploy (e.g., provide) the service for pickup by the user. Examples of communication interfaces, messaging, and control can be found in U.S. provisional patent application serial No. 63/000,342, filed March 26, 2020, and entitled "MESSAGING IN A MULTI CLIENT NETWORK," which is incorporated herein by reference in its entirety.
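The request/pickup cycle through such an interface might be sketched as below; the endpoint name, payload shape, and polling loop are assumptions rather than a defined API.

    # Sketch: a user's service request lands in a server-side queue; the service
    # device picks it up, processes it, and deploys the service.
    import json, queue

    server_queue = queue.Queue()

    def api_request_service(payload):
        """Server side of the computing interface (e.g., an API endpoint)."""
        server_queue.put(json.loads(payload))     # enqueue the user's request

    def service_device_poll(prepare):
        """Runs on the service device: pick requests and deploy the service."""
        while not server_queue.empty():
            req = server_queue.get()
            prepare(req["option"])                # e.g., brew the selected beverage

    api_request_service('{"user": "user_a", "option": "latte"}')
    service_device_poll(prepare=lambda option: print("deploying", option))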
Fig. 19 illustrates an exemplary method corresponding to the embodiments described above, where in operation 1900 a service device (e.g., a third party device) connects to a network of a facility, in operation 1901 the service device provides an identification, and in operation 1902 the service device remains alert (e.g., checks the network for any incoming requests). In operation 1903, the location of a user disposed in the peripheral structure is identified. Once the user opens the facility application, in operation 1904 the user is presented with service devices in the user's vicinity. In operation 1905, the user may select a service device and a service provided by that service device. The selection of the service may be made through an application menu or through gesture control. In operation 1906, the selection of the service is sent to the selected service device through the network, and the service device then executes the request in operation 1907.
Fig. 20 shows an exemplary embodiment of a control system in which a real physical peripheral structure (e.g., room) 2000 includes a network of controllers for managing interactive network devices under control of a processor 2001 (e.g., a master controller). The structure and contents of the peripheral structure 2000 are represented in a 3D model digital twin 2002 as part of a modeling and/or simulation system executing on computing assets. The computing assets may be co-located with the peripheral structure 2000 and the processor 2001, or remotely located from both. A network link 2003 in the peripheral structure 2000 connects the processor 2001 with a plurality of network nodes including an interactive target 2005, which is a real service device with various service options 2021, 2022, and 2023 and a service implementation (e.g., dispensing) compartment 2020. The service device 2005 is represented as a virtual object 2006 (e.g., a virtual service device) within the digital twin 2002. A network link 2004 connects the processor 2001 with the digital twin 2002.
In the example of fig. 20, a user located in the peripheral structure 2000 carries a handheld controller 2007 with pointing capabilities (e.g., capable of coupling with the target 2005). The location of the handheld controller 2007 may be tracked, for example, via a network link (not shown) with the digital twin 2002. The link may include transmission media contained within the network 2003. The handheld controller 2007 is represented as a virtual handheld controller 2008 within the digital twin 2002. Based at least in part on the tracked position and pointing capabilities of the handheld controller 2007, when a user initiates a pointing event (e.g., aims at a particular option 2021, 2022, or 2023 and presses an action button on the handheld controller), the event is sent to the digital twin 2002. The digital twin 2002 then associates the event with a target (e.g., represented as a digital ray 2009 cast from the tracked location within the digital twin 2002). The digital ray 2009 intersects the virtual service device 2006 at an intersection 2010 at a virtual service option 2032 provided by the virtual service device 2006. The resulting interpretation of the action made by the user in the digital twin 2002 is reported by the digital twin 2002 to the processor 2001 via the network link 2004. In response, the processor 2001 relays a control message to the interactive device 2005 to initiate a command action in accordance with the gesture (or other input action) made by the user. The real service device 2005 corresponds to the virtual service device 2006. The real service options 2021-2023 correspond to the virtual service options 2031-2033. The real dispensing compartment 2020 corresponds to the virtual dispensing compartment 2030.
In some embodiments, a target device (e.g., a service apparatus) may be discovered within a certain range from the user (e.g., using the network and control system). In some embodiments, a discovery range may likewise extend from the target device (e.g., the service apparatus). The user range and the device range may intersect.
A range may be referred to herein as a "discovery range," e.g., a service device discovery range. When the target device discovery range intersects the user discovery range, the user may discover the target device. For example, the target device may be discovered by the user when the user is within the target device discovery range. Network discovery may be used. The discovered devices may be displayed in the user's mobile circuitry (e.g., cellular telephone). The range may be specific to the target device, the target device type, or a set of target device types. For example, a first range may be used for manufacturing machines, a second range may be used for media displays, and a third range may be used for food service machines. The range may be specific to the peripheral structure or a portion of the peripheral structure. For example, a first discovery range may be for a lobby, a second discovery range may be for a cafeteria, and a third discovery range may be for an office or for a group of offices. The range may be fixed or adjustable (e.g., by a user, manager, facility owner, and/or lessor). A first target device type may have a different discovery range than a second target device type. For example, a larger control range may be assigned to a light switch and a closer control range may be assigned to a beverage service device. A larger control range may be up to about 1 meter (m), 2m, 3m, or 5m. A closer control range may be up to about 0.2m, 0.3m, 0.4m, 0.5m, 0.6m, 0.7m, 0.8m, or 0.9m. A user may detect devices that are within the user's relevant usage range (e.g., visually and/or using a list). "Visually" includes the use of icons, graphics, and/or a digital twin of the peripheral structure (e.g., as disclosed herein). The use of discovery ranges may help focus (e.g., shorten) the list of target devices that are relevant for user control, e.g., and prevent a user from having to select from a long list of (e.g., largely irrelevant) target devices (e.g., service apparatuses). Range-based control may use the user's location (e.g., obtained using a geolocation device, such as one that includes UWB technology), with the target device paired to the network (e.g., Wi-Fi pairing). The discovery range is not limited to a range dictated by a direct device-user pairing technique (e.g., a Bluetooth pairing range). For example, when a user is located remotely from a target device, the user can couple with the target device even if the apparatus is outside the direct device-user pairing range (e.g., the user range). The third party target device selected by the user may or may not incorporate direct device-user pairing technology.
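Discovery-range intersection reduces to a distance test, as in this sketch (distance as defined in the earlier sketch); the per-type radii follow the light-switch/beverage example above, and the user radius is an assumption.

    # Sketch: a device is discoverable when its discovery range intersects
    # the user's range.
    DEVICE_RANGE_M = {"light_switch": 3.0, "beverage_device": 0.5}
    USER_RANGE_M = 1.0

    def discoverable(user_pos, device):
        radius = DEVICE_RANGE_M.get(device["type"], 1.0)
        # circular ranges intersect when the separation is within the radii sum
        return distance(user_pos, device["pos"]) <= radius + USER_RANGE_M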
In some embodiments, pulse-based ultra-wideband (UWB) technology (e.g., ECMA-368 or ECMA-369) is a wireless technology for transmitting large amounts of data at low power (e.g., less than about 1 milliwatt (mW), 0.75mW, 0.5mW, or 0.25mW) over short distances (e.g., up to about 300 feet ('), 250', 230', 200', or 150'). The UWB signal may occupy at least about 750MHz, 500MHz, or 250MHz of bandwidth spectrum and/or at least about 30%, 20%, or 10% of its center frequency. The UWB signal may be transmitted by one or more pulses. The transmitter broadcasts digital signal pulses that are simultaneously (e.g., precisely) timed on a carrier signal across multiple frequency channels. Information may be transmitted, for example, by modulating the timing and/or position of the signals (e.g., pulses). Signal information may be transmitted by encoding the polarity of the signal (e.g., pulses), its amplitude, and/or by using orthogonal signals (e.g., pulses). UWB may be a low power information transfer protocol. UWB technology may be used for (e.g., indoor) positioning applications. The broad UWB spectrum includes low frequencies with long wavelengths, which allow UWB signals to penetrate a variety of materials, including various building fixtures (e.g., walls). For example, a wide range of frequencies including low penetrating frequencies may reduce the likelihood of multipath propagation errors (without wishing to be bound by theory, as some wavelengths may have line-of-sight trajectories). The UWB communication signals (e.g., pulses) may be spatially short (e.g., up to about 70cm, 60cm, or 50cm for pulses about 600MHz, 500MHz, or 400MHz wide; or up to about 20cm, 23cm, 25cm, or 30cm for pulses having a bandwidth of about 1GHz, 1.2GHz, 1.3GHz, or 1.5GHz). Short communication signals (e.g., pulses) may reduce the likelihood that reflected signals (e.g., pulses) will overlap with the original signals (e.g., pulses).
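The quoted pulse lengths follow from dividing the propagation speed by the bandwidth, as this short check shows:

    # Worked check: spatial extent of a UWB pulse is roughly c / bandwidth.
    C = 2.998e8                       # speed of light, m/s

    def pulse_length_cm(bandwidth_hz):
        return C / bandwidth_hz * 100

    print(round(pulse_length_cm(500e6)))   # ~60 cm for a 500 MHz-wide pulse
    print(round(pulse_length_cm(1.3e9)))   # ~23 cm for a 1.3 GHz bandwidth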
In some embodiments, the user's identification (ID) tag may comprise a microchip. The microchip may be a micro positioning chip. The microchip may incorporate automatic positioning technology (such a microchip is also referred to herein as a "micro positioning chip"). The microchip may incorporate technology for automatically reporting high resolution and/or high precision position information. Automatic positioning technology may include GPS, Bluetooth, or radio wave technology. Automatic positioning techniques may include electromagnetic wave (e.g., radio wave) transmission and/or detection. The radio wave technology can be any of the RF technologies disclosed herein (e.g., high frequency, very high frequency, or ultra-high frequency). The radio wave technology may include UWB technology. The microchip may facilitate determination of its position with an accuracy of up to about 25cm, 20cm, 15cm, 10cm, or 5cm. In various embodiments, the control system, sensors, and/or antennas are configured to communicate with the micro positioning chip. In some embodiments, the ID tag may include a micro positioning chip. The micro positioning chip may be configured to broadcast one or more signals. The signal may be an omni-directional signal. One or more components operatively coupled to the network may (e.g., each) include a micro positioning chip. A micro positioning chip (e.g., disposed in a fixed and/or known location) may be used as an anchor. By analyzing the time it takes the broadcast signal to reach the anchors within the transmission range of the ID tag, the location of the ID tag can be determined. One or more processors (e.g., one or more processors of a control system) may perform the analysis of the position-related signals. For example, the relative distance between the microchip and one or more anchors and/or other microchips may be determined (e.g., within the emission range limits). Relative distance, known location, and/or anchor information may be aggregated. At least one of the anchors may be provided in a floor, ceiling, wall, and/or mullion of a building. There may be at least 1, 2, 3, 4, 5, 8, or 10 anchors disposed in the peripheral structure (e.g., in a room, in a building, and/or in a facility). At least two of the anchors may have at least (e.g., substantially) the same X, Y, and Z coordinates (in a Cartesian coordinate system).
In some embodiments, the window control system enables locating and/or tracking of one or more devices (e.g., incorporating automatic positioning technology, such as a micro positioning chip) and/or of at least one user carrying such a device. The relative position between two or more such devices may be determined from information relating to transmissions received, for example, at one or more antennas and/or sensors. The positioning of the device may comprise geolocation. The position of the device may be the result of an analysis of electromagnetic signals emitted from the device and/or the micro positioning chip. Information that may be used to determine location includes, for example, received signal strength, time of arrival, signal frequency, and angle of arrival. When determining the location of the one or more components from such metrics, a location determination module (e.g., using trilateration and/or triangulation) may be implemented. The positioning module may include calculations and/or algorithms. The automatic positioning may include geolocation. An example of a positioning method can be found in PCT patent application serial No. PCT/US17/31106, filed May 4, 2017, and entitled "WINDOW ANTENNAS," which is incorporated herein by reference in its entirety.
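A minimal trilateration sketch follows, assuming three fixed anchors with known coordinates and tag-to-anchor distances estimated from time of flight; this linearized least-squares solve is one common approach, not necessarily the one used by the referenced control system.

    # Sketch: solve for a 2D tag position from anchor positions and ranges.
    import numpy as np

    def trilaterate(anchors, distances):
        """anchors: (n, 2) known positions; distances: (n,) measured ranges."""
        A = 2 * (anchors[1:] - anchors[0])
        b = (distances[0] ** 2 - distances[1:] ** 2
             + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pos

    anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
    tag = np.array([3.0, 4.0])
    ranges = np.linalg.norm(anchors - tag, axis=1)
    print(trilaterate(anchors, ranges))    # ~[3.0, 4.0]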
In some embodiments, the user's location may be determined using one or more positioning sensors. The positioning sensors may be disposed in a peripheral structure (e.g., a facility, building, or room). The positioning sensor may be part of a sensor ensemble or separate from the sensor ensemble (e.g., a stand-alone positioning sensor). The positioning sensor may be operatively (e.g., communicatively) coupled to the network. The network may be a network of the facility (e.g., of the building). The network may be configured to transmit communications and power. The network may be any of the networks disclosed herein. The network may be spread over several rooms, several floors, and/or several buildings of a facility. The network may be operatively coupled (e.g., to facilitate power and/or communications) to a control system, sensors, transmitters, antennas, routers, power sources, and/or a building management system (and/or components thereof) (e.g., as disclosed herein). The network may be coupled to personal computers of users (e.g., occupants) associated with the facility (e.g., employees and/or tenants). At least a portion of the network may be installed as an initial network of the facility and/or disposed in an enclosure of the facility. The user may or may not be present in the facility. The user's personal computer may be located remotely from the facility. The network may be operatively coupled to other devices in the facility (e.g., production machines, communication machines, and/or service machines) that perform operations of, or associated with, the facility. The production machines may include computers, factory related machines, and/or any other machine configured to produce a product (e.g., a printer and/or a dispenser). The service machines may comprise food and/or beverage related machines and hygiene related machines (e.g., a mask dispenser and/or a sanitizer dispenser). The communication machines may include a media projector, a media display, a touch screen, a speaker, and/or a lighting device (e.g., entrance, exit, and/or security lighting).
In some embodiments, at least one device aggregate comprises at least one processor and/or memory. The processor may perform computational tasks (e.g., including machine learning and/or artificial intelligence related tasks). In this manner, the network may allow for low latency (e.g., as disclosed herein) and faster response times for applications and/or commands. In some embodiments, the network and circuitry coupled to the network may form a distributed computing environment (e.g., including CPUs, memory, and storage) for application and/or service hosting, to store and/or process data proximate to a user's mobile circuitry (e.g., a cellular device, tablet, or laptop).
In some embodiments, a network is coupled to a device ensemble. The device ensemble may perform (e.g., real-time) sensing and/or tracking of occupants in the peripheral structure in which the device ensemble is disposed (e.g., in situ), e.g., to (i) enable the user's mobile circuitry to seamlessly connect with the network and/or adjust machines coupled to the network according to the user's requirements and/or preferences, (ii) identify the user (e.g., using facial recognition, voice recognition, and/or an identification tag), and/or (iii) adjust the environment of the peripheral structure according to any preferences of the user. For example, when a meeting organizer enters an assigned meeting room, the organizer can be identified by one or more sensors (e.g., using facial recognition and/or an ID tag), and the organizer's presentation can appear on the meeting room's screen and/or on the screens of the invitees' processors. The screen may be controlled remotely (e.g., by the organizer or an invitee, e.g., as disclosed herein). The invitees may be located in the meeting room or remotely. The organizer may connect to an assistant via the network. The assistant may be real or virtual (e.g., a digital office assistant). The organizer may place one or more requests to the assistant, which may be satisfied by the assistant. These requests may require communication and/or control using the network. For example, the requests may be to retrieve files and/or to manipulate files (e.g., during a meeting). These requests may alter functions controlled by the control system (e.g., turn lights on, cool the room environment, sound an alarm, close a facility door, and/or stop operation of plant machinery). The assistant (e.g., a digital assistant) may take notes during the meeting (e.g., using voice recognition), schedule meetings, and/or update files. The assistant may analyze (e.g., read) emails and/or reply to emails. The occupant may interact with the assistant in a contactless (e.g., remote) manner, e.g., using gestures and/or voice interactions (e.g., as disclosed herein).
Fig. 24 shows an example of a building having device aggregates (e.g., components, also referred to herein as "digital building elements"). As connection points, the building may include a plurality of roof donor antennas 2405a, 2405b and a sky sensor 2407 for transmitting electromagnetic radiation (e.g., infrared, ultraviolet, radio frequency, and/or visible light). These wireless signals may allow the building services network to wirelessly interface with one or more communication service provider systems. The building has a control panel 2413 for connection to a provider's central office 2411 via physical lines 2409 (e.g., optical fibers, such as single mode fibers). The control panel 2413 may include hardware and/or software configured to provide functionality, e.g., of a signal source carrying headend, a fiber distribution headend, and/or a (e.g., bi-directional) amplifier or repeater. The roof donor antennas 2405a and 2405b may allow building occupants and/or devices to access the wireless system communication services of a (e.g., third party) provider. These antennas and/or controllers may provide access to the same service provider's system, different service providers' systems, or some variant, such as two interface elements providing access to a first service provider's system and a different interface element providing access to a second service provider's system.
As shown in the example of fig. 24, the vertical data plane may include (e.g., high capacity or high speed) data carrying lines 2419, such as (sufficiently sized) (e.g., single mode) optical fiber or UTP copper lines. In some embodiments, at least one control panel may be disposed on at least a portion of a floor of a building (e.g., on each floor). In some embodiments, one (e.g., high capacity) communication line may directly connect the control panel in the top floor with the (e.g., primary) control panel 2413 in the bottom floor (or in the basement). Note that in the example shown in fig. 24, the control panel 2417 is directly connected to the rooftop antennas 2405a, 2405b and/or the sky sensor 2407, while the control panel 2413 is directly connected to a (e.g., third party) service provider central office 2411.
Fig. 24 shows an example of a horizontal data plane that may include one or more of a control panel and data-carrying cabling (e.g., wires), including trunk wires 2421. In certain embodiments, the trunk line includes (e.g., is made of) a coaxial cable. The trunk lines may include any of the wiring disclosed herein. The control panel may be configured to provide data on the trunk lines 2421 via a data communication protocol (such as MoCA and/or G.hn). The data communication protocol may include: (i) a next generation home network protocol (abbreviated herein as the "G.hn" protocol); (ii) communication technology that transmits digital information over power lines traditionally used to (e.g., only) deliver power; or (iii) hardware devices designed for communication and data transfer over the electrical wiring of the building (e.g., Ethernet, USB, and Wi-Fi). The data transfer protocol may facilitate a data transmission rate of at least about 1 gigabit per second (Gbit/s), 2 Gbit/s, 3 Gbit/s, 4 Gbit/s, or 5 Gbit/s. The data transfer protocol may operate over telephone wiring, coaxial cable, power lines, and/or (e.g., plastic) optical fiber. A chip (e.g., including a semiconductor device) may be used to facilitate the data transfer protocol. At least one (e.g., each) horizontal data plane may provide high speed network access to one or more device ensembles such as 2423 (e.g., a set of one or more devices in a housing that includes device components) and/or antennas (e.g., 2425), some or all of which are optionally integrated with the device ensemble. The antenna (and associated radio, not shown) may be configured to provide wireless access through any of a variety of protocols including, for example, cellular (e.g., one or more frequency bands at or near 28 GHz), Wi-Fi (e.g., one or more frequency bands at 2.4, 5, and 60 GHz), CBRS, and so forth. A drop line may connect a device ensemble (e.g., 2423) to a trunk line (e.g., 2421). In some embodiments, the horizontal data plane is deployed on a floor of a building. The devices in the device ensemble may include sensors, transmitters, or antennas. The device ensemble may include circuitry. Devices in the device ensemble may be operatively coupled to the circuitry. The circuitry may include a processor. The circuitry may be operatively coupled to a memory and/or a communications hub (e.g., Ethernet and/or cellular communications). One or more donor antennas (e.g., 2405a, 2405b) can be connected to a control panel (e.g., 2413) via high speed lines (e.g., single mode fiber or copper cables). In the depicted example of fig. 24, the control panel 2413 is located in a lower floor of the building. The connection to the donor antenna may be via one or more vRAN radios and wiring (e.g., coaxial cable).
In the example shown in fig. 24, a communications service provider central office 2411 is connected to an underlying control panel 2413 via a high-speed line 2409 (e.g., an optical fiber used as part of a backhaul). This entry point of the service provider into the building is sometimes referred to as the main point of entry (MPOE), and it may be configured to allow the building to distribute both voice and data traffic.
In some cases, a building is provided with a small cell system, at least in part, via one or more antennas. Examples of antennas, sky sensors, and control systems may be found in U.S. patent application 15/287,646, filed 10/6/2016, which is incorporated herein by reference in its entirety.
In some embodiments, the target device is operatively coupled to a network. The network may be operatively (e.g., communicatively) coupled to one or more controllers. The network may be operatively (e.g., communicatively) coupled to one or more processors. The coupling of the target device to the network may allow the user to engage in contactless communication with the target device using the user's mobile circuitry (e.g., through a software application installed on the mobile circuitry). In this way, the user need not be directly communicatively coupled to, and decoupled from, the service device (e.g., using Bluetooth technology). By coupling a target device to a network to which a user is communicatively coupled (e.g., via the user's mobile circuitry), the user may be communicatively coupled to multiple target devices at the same time (e.g., concurrently). The user may sequentially control at least two of the plurality of target devices. The user may control at least two of the plurality of target devices simultaneously (e.g., concurrently). For example, a user may open (e.g., run) two applications for two different target devices on their mobile circuitry, e.g., for control (e.g., manipulation).
In some examples, the user's discovery of the target device is not limited in scope. User discovery of the target device may be limited by at least one security protocol (e.g., only authorized manufacturing personnel may operate a hazardous manufacturing machine). The security protocol may have one or more security levels. The user's discovery of a target device may be limited to the devices in the room, floor, building, or facility in which the user is located. The user may override at least one (e.g., any) range limit and select a target device from all available target devices.
In some embodiments, the target device is communicatively coupled to a network. The target device may utilize a network authentication protocol. The network authentication protocol may open one or more ports for network access. These ports may be opened when an organization and/or facility authenticates (e.g., via network authentication) the identity of a target device operatively coupled (and/or physically coupled) to the network. Operatively coupling may include communicatively coupling. An organization and/or facility may authorize (e.g., using the network) a target device to access the network. Access may or may not be restricted. The restrictions may include one or more security levels. The identity of the target device may be determined based on credentials and/or a certificate. The credentials and/or certificate may be validated by the network (e.g., by a server operatively coupled to the network). The authentication protocol may or may not be specific to physical communications (e.g., Ethernet communications) in a Local Area Network (LAN), e.g., utilizing packets. The standards may be maintained by the Institute of Electrical and Electronics Engineers (IEEE). The standard may specify the physical medium (e.g., target device) and/or operating characteristics of the network (e.g., Ethernet). The network standard may support Virtual LANs (VLANs) over a local area network, such as Ethernet. The standard may support power over a local area network (e.g., Ethernet). The network may provide power line (e.g., coaxial cable) based communications. The power may be Direct Current (DC) power. The power may be at least about 12 watts (W), 15 W, 25 W, 30 W, 40 W, 48 W, 50 W, or 100 W. The standard may facilitate mesh networking. The standards may facilitate Local Area Network (LAN) technology and/or Wide Area Network (WAN) applications. Standards may facilitate physical connections between target devices and/or infrastructure equipment (hubs, switches, routers) through various types of cables (e.g., coaxial, twisted pair, copper, and/or optical). Examples of network authentication protocols include 802.1X and Kerberos. The network authentication protocol may include key cryptography. The network may support (e.g., communicate using) protocols including 802.3, 802.3af (PoE), 802.3at (PoE+), 802.1Q, or 802.11s. The network may support a communication protocol for a Building Automation and Control (BAC) network (e.g., BACnet). The protocol may define services for communicating between building devices. The protocol services may include device and object discovery (e.g., Who-Is, I-Am, Who-Has, and/or I-Have). The protocol services may include read-property and write-property services (e.g., for data sharing). The network protocol may define object types (e.g., acted upon by the services). The protocol may define one or more data link/physical layers (e.g., ARCNET, Ethernet, BACnet/IP, BACnet/IPv6, BACnet/MSTP, RS-232 based point-to-point, RS-485 based master-slave/token passing, ZigBee, and/or LonTalk). The protocol may be specific to a device (e.g., an Internet of Things (IoT) device and/or machine-to-machine (M2M) communication). The protocol may be a messaging protocol. The protocol may be a publish-subscribe protocol. The protocol may be configured for messaging. The protocol may be configured for remote devices. The protocol may be configured for devices with a small code footprint and/or minimal network bandwidth.
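By way of illustration only, the discovery and read/write property services described above can be sketched as a toy, in-memory simulation; the bus abstraction, device IDs, and property names below are hypothetical and do not come from any real BACnet stack:

```python
# Minimal in-memory sketch of the Who-Is / I-Am discovery pattern and the
# read/write property services described above. Illustration only, not a
# real BACnet implementation.
class Bus:
    def __init__(self):
        self.devices = []

    def attach(self, device):
        self.devices.append(device)

    def who_is(self, low=None, high=None):
        """Broadcast Who-Is; collect I-Am replies from devices in range."""
        replies = []
        for d in self.devices:
            if (low is None or low <= d.device_id) and \
               (high is None or d.device_id <= high):
                replies.append(d.i_am())
        return replies

class Device:
    def __init__(self, device_id, properties):
        self.device_id = device_id
        self.properties = properties  # e.g., {"presentValue": 21.5}

    def i_am(self):
        return {"device_id": self.device_id}

    def read_property(self, name):          # read-property service
        return self.properties[name]

    def write_property(self, name, value):  # write-property service
        self.properties[name] = value

bus = Bus()
bus.attach(Device(1001, {"presentValue": 21.5}))  # e.g., a temperature sensor
bus.attach(Device(1002, {"presentValue": 0.40}))  # e.g., a tint level
print(bus.who_is(1000, 1999))  # -> [{'device_id': 1001}, {'device_id': 1002}]
```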
The small code footprint may be configured to be processed by a microcontroller. The protocol may have a plurality of quality of service levels, including (i) at most once, (ii) at least once, and/or (iii) exactly once. The multiple quality of service levels may improve the reliability of message delivery (e.g., delivery to its target) in the network. The protocol may facilitate (i) device-to-cloud and/or (ii) cloud-to-device messaging. The messaging protocol may be configured to broadcast messages to a target group, such as target devices (e.g., appliances), sensors, and/or transmitters. The protocol may comply with the Organization for the Advancement of Structured Information Standards (OASIS). The protocol may support security schemes such as authentication (e.g., using tokens). The protocol may support access delegation standards (e.g., OAuth). The protocol may support granting a first application (and/or website) access to information on a second application (and/or website) without providing the second application with a security code (e.g., token and/or password) related to the first application. The protocol may be the Message Queue Telemetry Transport (MQTT) protocol or the Advanced Message Queuing Protocol (AMQP). The protocol may be configured for a message rate of at least one (1) message per second per publisher. The protocol may be configured to facilitate message payload sizes of up to 64, 86, 96, or 128 bytes. The protocol may be configured to communicate (e.g., from a microcontroller to a server) with any device operating a library of a compliant protocol (e.g., MQTT) and/or connected over a network to a compliant broker (e.g., an MQTT broker). Each device (e.g., target device, sensor, or transmitter) may be a publisher and/or a subscriber. A broker may handle millions of simultaneously connected devices, or fewer. The broker may handle at least about 100, 10,000, 100,000, 1,000,000, or 10,000,000 simultaneously connected devices. In some embodiments, the broker is responsible for receiving (e.g., all) messages, screening messages, determining the devices interested in each message, and/or sending messages to those subscribing devices (e.g., broker clients). The protocol may require an internet connection to the network. The protocol may facilitate bidirectional and/or synchronous peer-to-peer messaging. The protocol may be a binary wire protocol. Examples of such network protocols, control systems, and networks can be found in U.S. Provisional Patent Application Serial No. 63/000,342, filed March 26, 2020 and entitled "MESSAGING IN A MULTI CLIENT NETWORK", which is incorporated herein by reference in its entirety.
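As a hedged illustration of the publish-subscribe messaging just described, the sketch below uses the open-source paho-mqtt client (assuming its 1.x callback API); the broker address, client ID, and topic names are hypothetical:

```python
import paho.mqtt.client as mqtt  # assumes the open-source paho-mqtt 1.x client

def on_connect(client, userdata, flags, rc):
    # Subscribe with QoS 1 ("at least once"), one of the levels noted above.
    client.subscribe("facility/floor3/ensemble2423/sensors/#", qos=1)
    # Publish a small payload (well under the 64-128 byte sizes noted above).
    client.publish("facility/floor3/ensemble2423/sensors/temp", b"22.5", qos=1)

def on_message(client, userdata, msg):
    # The broker forwards each message to all subscribed clients.
    print(f"{msg.topic}: {msg.payload.decode()}")

client = mqtt.Client(client_id="device-ensemble-2423")
client.on_connect = on_connect
client.on_message = on_message
client.connect("broker.facility.example", 1883)  # hypothetical broker address
client.loop_forever()
```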
Examples of NETWORK security, communication standards, communication interfaces, message passing, coupling of devices to NETWORKs, and control can be found in U.S. provisional patent application serial No. 63/000,342, and PCT patent application serial No. PCT/US20/70123 entitled "SECURE BUILDING SERVICES NETWORK," filed on 04/6/2020, each of which is incorporated by reference herein in its entirety.
In some embodiments, the network allows the target device to couple to the network. The network (e.g., using the controller and/or processor) may join the target device to the network, authenticate the target device, monitor activity on the network (e.g., activity related to the target device), facilitate performance of maintenance and/or diagnostics, and secure data communicated over the network. The security level may allow two-way or one-way communication between the user and the target device. For example, the network may allow only one-way communication from the user to the target device. For example, the network may restrict data communicated over the network, and/or data of devices coupled to the network, from being accessed by a third party owner of the target device (e.g., a service device). For example, the network may restrict data communicated over the network, and/or data of devices coupled to the network, from being accessible to organizations and/or facilities interested in data related to third party owners and/or manufacturers of target devices (e.g., service appliances).
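The directional and third-party access restrictions described above might be represented, purely as an illustrative sketch (none of these names come from the disclosure), as a per-device policy object:

```python
# Illustrative per-device policy: direction of allowed communication and
# which parties may read data crossing the network. All names hypothetical.
from dataclasses import dataclass, field

@dataclass
class DevicePolicy:
    device_id: str
    allow_user_to_device: bool = True    # commands toward the target device
    allow_device_to_user: bool = False   # one-way communication by default
    readers: set = field(default_factory=set)  # parties allowed to read data

    def may_send(self, sender, receiver):
        if sender == "user" and receiver == "device":
            return self.allow_user_to_device
        if sender == "device" and receiver == "user":
            return self.allow_device_to_user
        return False

policy = DevicePolicy("coffee-maker-7", readers={"facility"})
print(policy.may_send("user", "device"))      # True: user may command device
print(policy.may_send("device", "user"))      # False: one-way only
print("third-party-owner" in policy.readers)  # False: owner cannot read data
```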
In some embodiments, the control system is operatively coupled to a learning module. The learning module may utilize a learning scheme including, for example, artificial intelligence. The learning module may learn preferences of one or more users associated with the facility. The users associated with the facility may include occupants of the facility and/or users associated with an entity that occupies and/or owns the facility (e.g., employees of a company that occupies the facility). The learning module may analyze the preferences of a user or group of users. The learning module may collect preferences of the user with respect to one or more environmental characteristics. The learning module may use the user's past preferences as a learning set for the user or a group to which the user belongs. The preferences may include environmental preferences or preferences related to the target device (e.g., a work machine and/or a production machine).
In some embodiments, the control system regulates various aspects of the peripheral structure. For example, the control system may regulate the environment of the peripheral structure. The control system may predict future environmental preferences of the user and adjust environmental settings in advance (e.g., at a future time) based on these preferences. The preferred environmental characteristics may be assigned based on (i) a user or group of users, (ii) time, (iii) date, and/or (iv) space. The date preferences may include seasonal preferences. The environmental characteristics may include lighting, ventilation speed, atmospheric pressure, odor, temperature, humidity, carbon dioxide, oxygen, VOCs, particulate matter (e.g., dust), or color. The environmental characteristic may be a preferred color scheme or theme of the peripheral structure. For example, at least a portion of the peripheral structure may be projected with a preferred theme (e.g., a projected color, picture, or video). For example, a user who is a cardiac patient may prefer (e.g., need) an oxygen content higher than the ambient oxygen content (e.g., higher than about 20% oxygen) and/or a certain humidity level (e.g., 70%). When the cardiac patient occupant is located in a certain peripheral structure, the control system can adjust the atmosphere of the environment (e.g., by controlling the BMS) to achieve those oxygen and humidity levels.
In some embodiments, the control system may operate the target device according to the preferences of a user or a group of users. The preferences may be based on past behavior (e.g., settings, service selections, timing-related selections, and/or location-related selections) of the user with respect to the target device. For example, a user may want a latte with one teaspoon of sugar at 9 am from a coffee maker at a first position near his desk. A coffee maker located at the first position may automatically prepare such a cup of coffee at the first position at 9 am. For example, a group of users (such as a workgroup) may prefer to meet in a meeting room with a forest background, a breeze, and a temperature of 22 °C. When the group is in a meeting, in each such meeting room, the control system may project a forest background (e.g., on walls and/or media screens), adjust the ventilation system to produce a breeze, and set the temperature to 22 °C. The control system may facilitate such control by controlling the HVAC system, the projector, and/or the media display.
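A minimal sketch of this kind of preference-driven control follows; the preference table, device interfaces, and group names are hypothetical stand-ins, not part of the disclosure:

```python
# Hedged sketch: apply stored preferences when a user or group is detected.
PREFERENCES = {
    # (user or group, room type) -> requested settings
    ("workgroup-A", "meeting_room"): {
        "projection": "forest",
        "ventilation": "breeze",
        "temperature_c": 22.0,
    },
}

class Stub:
    """Print-only stand-in for a real device interface (HVAC, projector, ...)."""
    def __init__(self, name):
        self.name = name
    def __getattr__(self, op):
        return lambda *args: print(f"{self.name}.{op}{args}")

def apply_preferences(occupant, room_type, hvac, projector, vents):
    prefs = PREFERENCES.get((occupant, room_type), {})
    if "temperature_c" in prefs:
        hvac.set_setpoint(prefs["temperature_c"])  # e.g., HVAC to 22 deg C
    if "projection" in prefs:
        projector.show(prefs["projection"])        # e.g., forest background
    if "ventilation" in prefs:
        vents.set_mode(prefs["ventilation"])       # e.g., a breeze

apply_preferences("workgroup-A", "meeting_room",
                  hvac=Stub("hvac"), projector=Stub("projector"),
                  vents=Stub("vents"))
```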
In some embodiments, the control system may adjust the environment and/or target devices according to a preference hierarchy. When several different users (e.g., from different groups) are gathered together in a peripheral structure and the preferences of those users conflict, the control system may adjust the environment and/or target devices according to a pre-established hierarchy. The hierarchy may consider jurisdictional (e.g., health and/or safety) standards, health, safety, employee rank, the activity occurring in the peripheral structure, the number of occupants in the peripheral structure, the type of peripheral structure, the time of day, the date, the season, and/or activities in the facility.
In some embodiments, the control system considers findings (e.g., scientific and/or research results) regarding environmental conditions that affect the health, safety, and/or performance of occupants of the peripheral structure. The control system may establish thresholds and/or preferred window ranges for one or more environmental characteristics of the peripheral structure (e.g., of the atmosphere of the peripheral structure). A threshold may concern the level of an atmospheric component (e.g., VOCs and/or gases), the temperature, and/or the time spent at a certain level. The certain level may be abnormally high, abnormally low, or average. For example, the controller may allow an abnormally high VOC content for a shorter time, but not allow that VOC content for a longer time. The control system may automatically override the user's preferences if the user's preferences contradict the health and/or safety thresholds. The health and/or safety thresholds may occupy a higher hierarchical level relative to the user's preferences. The hierarchy may apply majority preference. For example, if two occupants of a conference room share one preference and a third occupant has a conflicting preference, then the preference of the first two occupants will take precedence (e.g., unless that preference conflicts with health and/or safety considerations).
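One possible (purely illustrative) way to combine majority preference with a higher-level health/safety override is sketched below; the numeric limits are placeholders, not values from the disclosure:

```python
# Majority vote among occupants, clamped by a health/safety window that
# sits above user preference in the hierarchy. Limits are placeholders.
from collections import Counter

SAFETY_LIMITS = {"temperature_c": (16.0, 28.0)}  # placeholder window

def resolve(characteristic, requested_values):
    # Majority preference among occupants (first mode wins ties).
    value, _ = Counter(requested_values).most_common(1)[0]
    low, high = SAFETY_LIMITS.get(characteristic,
                                  (float("-inf"), float("inf")))
    # Health/safety thresholds override the majority preference.
    return min(max(value, low), high)

print(resolve("temperature_c", [22.0, 22.0, 30.0]))  # -> 22.0 (majority)
print(resolve("temperature_c", [30.0, 30.0, 22.0]))  # -> 28.0 (clamped)
```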
Fig. 25 shows an example of a flow chart depicting operation of a control system operatively coupled to one or more devices in a peripheral structure (e.g., a facility). In block 2500, the identity of the user is recognized by the control system. The identity may be recognized by one or more sensors (e.g., a camera) and/or by an identification tag (e.g., by being scanned or otherwise sensed by one or more sensors). In block 2501, the user's location may optionally be tracked as the user moves about the peripheral structure. The user may provide input regarding any preferences. Preferences may relate to target devices and/or environmental characteristics. In block 2503, the learning module may optionally track such preferences and provide predictions about any future preferences of the user. The user's past preferences may be recorded (e.g., in a database) and may be used as a learning set for the learning module. As the learning process progresses over time and the user provides more and more inputs, the prediction accuracy of the learning module may improve. The learning module can include any of the learning schemes disclosed herein (e.g., including artificial intelligence and/or machine learning). The user may override the recommendations and/or predictions made by the learning module. The user may provide manual input to the control system. In block 2502, user input (whether provided directly by the user or through prediction by the learning module) is provided to the control system. The control system may implement user preferences (e.g., inputs) by using the inputs to change (or direct a change in) one or more devices in the facility. The control system may or may not use the user's location. The location may be a past location or a current location. For example, a user may enter a workplace by scanning a tag. Scanning of the identification tag (ID tag) may inform the control system of the identity of the user, as well as of the user's location at the time of scanning. The user may express a preference for a certain sound level, which constitutes an input. The expression of preferences may be made through manual input, including tactile, voice, and/or gesture commands. Past expressions of preferences may be registered in a database and linked with the user. The user may enter the conference room at a predetermined time. The volume in the meeting room may be adjusted according to user preferences (i) when a predetermined meeting plan begins, and/or (ii) when one or more sensors sense the presence of the user in the meeting room. The volume in the meeting room may return to a default level and/or be adjusted according to another preference (i) when the predetermined meeting schedule ends, and/or (ii) when one or more sensors sense that no user is in the meeting room.
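The flow of fig. 25 can be summarized, as a hedged sketch only, in the following skeleton, where each placeholder function corresponds to one block of the flow chart:

```python
# Compact sketch of the Fig. 25 flow: recognize the user (block 2500),
# optionally track location (2501), let the learning module predict
# preferences (2503), and hand the resulting input to the control system
# (2502). Every function body here is a placeholder.
def recognize_user(sensor_frame):           # block 2500
    return "user-42"                        # e.g., via face/ID-tag recognition

def track_location(user_id):                # block 2501 (optional)
    return "meeting-room-3"

def predict_preferences(user_id, history):  # block 2503 (learning module)
    return history.get(user_id, {"volume": "default"})

def control_step(sensor_frame, history, manual_input=None):
    user = recognize_user(sensor_frame)
    location = track_location(user)
    prefs = manual_input or predict_preferences(user, history)  # user override wins
    return {"user": user, "location": location, "apply": prefs}  # block 2502

history = {"user-42": {"volume": "low"}}
print(control_step(sensor_frame=None, history=history))
```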
In some embodiments, the user expresses a preference for at least one environmental characteristic and/or target device, the preference constituting an input. The input may be made by manual input, including tactile, voice, and/or gesture commands. Past expressions of preferences (e.g., inputs) may be registered in a database and linked with the user. The user may be part of a group of users. The group of users may be any grouping disclosed herein. The user's preferences may be linked to the group to which the user belongs. The user may enter the peripheral structure at a predetermined time. The environmental characteristics of the peripheral structure may be adjusted according to user preferences (i) when a user is scheduled to enter the peripheral structure, and/or (ii) when one or more sensors sense the presence of the user in the peripheral structure. The environmental characteristics of the peripheral structure may return to a default level and/or be adjusted according to another preference (i) when the planned presence of the user in the peripheral structure terminates, and/or (ii) when the one or more sensors sense the absence of the user in the peripheral structure. The target device may be adjusted according to user preferences (i) when the user is scheduled to use the target device, and/or (ii) when one or more sensors sense the presence of the user in proximity to the target device (e.g., within a predetermined distance threshold). The target device may return to a default setting or be adjusted according to another preference (i) when the user's scheduled use of the target device ends, and/or (ii) when one or more sensors sense that no user is in the vicinity of the target device (e.g., within a predetermined distance threshold).
In some embodiments, the data is analyzed by a learning module. The data may be sensor data and/or user input. The user input may pertain to one or more preferred environmental characteristics and/or target devices. The learning module can include at least one rational decision-making process, and/or learning that utilizes the data (e.g., as a learning set). The analysis of the data may be used to adjust the environment, for example, by adjusting one or more components that affect the environment of the peripheral structure. The analysis of the data may be used to control certain target devices, for example, to produce products according to user preferences, and/or to select certain target devices (e.g., based on user preferences and/or user location). The data analysis may be performed by a machine-based system (e.g., including circuitry). The circuitry may be a processor. Sensor data analysis may utilize artificial intelligence. Data analysis may rely on one or more models (e.g., mathematical models). In some embodiments, the data analysis includes linear regression, least squares fitting, Gaussian process regression, kernel regression, nonparametric multiplicative regression (NPMR), regression trees, local regression, semi-parametric regression, isotonic (order-preserving) regression, multivariate adaptive regression splines (MARS), logistic regression, robust regression, polynomial regression, stepwise regression, ridge regression, lasso regression, elastic net regression, principal component analysis (PCA), singular value decomposition, fuzzy measure theory, Borel measures, Haar measures, risk-neutral measures, Lebesgue measures, the group method of data handling (GMDH), naive Bayes classifiers, k-nearest neighbor algorithms (k-NN), support vector machines (SVM), neural networks, classification and regression trees (CART), random forests, gradient boosting, or generalized linear model (GLM) techniques. The data analysis may include deep learning algorithms and/or artificial neural networks (ANNs). Data analysis may include learning schemes that utilize multiple layers in a network (e.g., an ANN). Learning by the learning module can be supervised, semi-supervised, or unsupervised. The deep learning architecture may include a deep neural network, a deep belief network, a recurrent neural network, or a convolutional neural network. The learning scheme may be a learning scheme used in computer vision, machine vision, speech recognition, natural language processing, audio recognition, social network filtering, machine translation, bioinformatics, drug design, medical image analysis, material inspection programs, and/or board game programs.
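As a hedged example of one technique from the list above, the sketch below uses scikit-learn's k-nearest-neighbors regressor as a stand-in learning module; the observations (hour of day, occupancy, chosen setpoint) are invented for illustration:

```python
# k-nearest-neighbors regression as a toy learning module: past
# (hour, occupant count) observations with the user's chosen temperature
# serve as the learning set. Data values are invented for illustration.
from sklearn.neighbors import KNeighborsRegressor

# Features: [hour of day, occupant count]; target: chosen setpoint in deg C.
X = [[9, 1], [9, 4], [14, 1], [14, 6], [18, 2]]
y = [21.0, 22.5, 22.0, 23.0, 21.5]

model = KNeighborsRegressor(n_neighbors=3).fit(X, y)
predicted = model.predict([[10, 3]])[0]  # predicted preference: 10 am, 3 people
print(f"predicted setpoint: {predicted:.1f} C")
```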
In some examples, the target device is a tintable window (e.g., an electrochromic window). In some embodiments, the dynamic state of the electrochromic window is controlled by varying a voltage signal to the electrochromic device (ECD) that provides the coloration or tinting. The electrochromic window may be manufactured, configured, or otherwise provided as an Insulated Glass Unit (IGU). When provided for installation in a building, the IGU may serve as the basic construction for holding electrochromic panes (also referred to as "lites"). An IGU lite or pane can be a single substrate or a multi-substrate construction, such as a laminate of two substrates. IGUs (particularly those having a two-pane or three-pane configuration) can provide a number of advantages over single-pane configurations; for example, a multi-pane configuration may provide enhanced thermal insulation, noise insulation, environmental protection, and/or durability when compared to a single-pane configuration. A multi-pane configuration may also provide enhanced protection for the ECD, because the electrochromic film and associated layers and conductive interconnects may be formed on an interior surface of the multi-pane IGU and protected by an inert gas filling the interior volume of the IGU.
In some embodiments, the tintable window exhibits a (e.g., controllable and/or reversible) change in at least one optical characteristic of the window, e.g., when a stimulus is applied. The stimulus may include an optical, electrical, and/or magnetic stimulus. For example, the stimulus may include an applied voltage. One or more tintable windows may be used to control lighting and/or glare conditions, for example, by regulating the transmission of solar energy propagating through the one or more tintable windows. One or more tintable windows may be used to control the temperature within a building, for example, by regulating the transmission of solar energy propagating through the one or more tintable windows. Controlling solar energy can control the thermal load applied inside a facility (e.g., a building). The control may be manual and/or automatic. The control may be used to maintain one or more requested (e.g., environmental) conditions, such as human comfort. The control may include reducing energy consumption of a heating system, a ventilation system, an air conditioning system, and/or a lighting system. At least two of heating, ventilation, and air conditioning may be implemented by separate systems. At least two of heating, ventilation, and air conditioning may be implemented by one system. Heating, ventilation, and air conditioning may be implemented by a single system (abbreviated herein as "HVAC"). In some cases, the tintable window may be responsive to (e.g., and communicatively coupled to) one or more environmental sensors and/or user controls. The tintable window may comprise (e.g., may be) an electrochromic window. The window may be located in a boundary between the interior and the exterior of a structure (e.g., a facility such as a building); however, this need not be the case. The tintable window may operate using a liquid crystal device, a suspended particle device, a micro-electromechanical systems (MEMS) device such as a micro-shutter, or any technology now known or later developed that is configured to control light transmission through the window. An example of a window (e.g., with MEMS devices for tinting) is described in U.S. Patent Application Serial No. 14/443,353, entitled "MULTI-PANE WINDOWS INCLUDING ELECTROCHROMIC DEVICES AND ELECTROMECHANICAL SYSTEMS DEVICES", filed May 15, 2015, which is incorporated by reference herein in its entirety. In some cases, one or more tintable windows may be located within the interior of a building, for example between a conference room and a hallway. In some cases, one or more tintable windows may be used in automobiles, trains, airplanes, and other vehicles, for example in place of passive and/or non-tintable windows.
In some embodiments, the tintable window includes an electrochromic device (referred to herein as an "EC device" (abbreviated herein as ECD) or "EC"). The EC device may include at least one coating having at least one layer. The at least one layer may comprise an electrochromic material. In some embodiments, the electrochromic material exhibits a change from one optical state to another, e.g., when an electrical potential is applied across the EC device. The transition of the electrochromic layer from one optical state to another may result from, for example, reversible, semi-reversible, or irreversible ion insertion into the electrochromic material (e.g., by intercalation) and corresponding charge-balancing electron injection. For example, the transition of the electrochromic layer from one optical state to another may be caused by reversible ion insertion into the electrochromic material (e.g., by intercalation) and corresponding charge-balanced electron injection. Reversible may mean reversible during the expected lifetime of the ECD. Semi-reversible refers to a measurable (e.g., significant) degradation in the reversibility of the tint of the window during one or more tinting cycles. In some cases, a portion of the ions responsible for the optical transition is irreversibly bound in the electrochromic material (e.g., such that the induced (altered) tint state of the window cannot fully revert to its original tint state). In many EC devices, at least some (e.g., all) of the irreversibly bound ions may be used to compensate for "blind charge" in the material (e.g., the ECD).
In some implementations, suitable ions include cations. The cations may comprise lithium ions (Li+) and/or hydrogen ions (H+) (i.e., protons). In some implementations, other ions may be suitable. The cations may be intercalated into a (e.g., metal) oxide. A change in the state of intercalation of ions (e.g., cations) into the oxide can induce a visible change in the hue (e.g., color) of the oxide. For example, the oxide may transition from a colorless state to a colored state. For example, intercalation of lithium ions into tungsten oxide (WO3−y, 0 ≤ y ≤ 0.3) may change the tungsten oxide from a transparent state to a colored (e.g., blue) state. An EC device coating as described herein may be located within a visible portion of the tintable window, such that the tinting of the EC device coating can be used to control the optical state of the tintable window.
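The cathodic coloration just described is commonly summarized, in standard electrochromics notation (not quoted from this disclosure), as a reversible intercalation reaction with charge-balancing electron injection:

```latex
% Reversible Li+ intercalation into tungsten oxide with charge-balancing
% electron injection (transparent -> blue on tinting; reversed on bleaching):
\mathrm{WO_3}\;(\text{transparent}) \;+\; x\,\mathrm{Li^+} \;+\; x\,e^-
\;\rightleftharpoons\; \mathrm{Li}_x\mathrm{WO_3}\;(\text{blue})
```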
Fig. 21 shows an example of a schematic cross section of an electrochromic device 2100, according to some embodiments. The EC device coating, attached to a substrate 2102, includes a transparent conductive layer (TCL) 2104, an electrochromic layer (EC) 2106 (sometimes also referred to as a cathodically coloring or cathodically tinting layer), an ion conducting layer or region (IC) 2108, a counter electrode layer (CE) 2110 (sometimes also referred to as an anodically coloring or anodically tinting layer), and a second TCL 2114. Elements 2104, 2106, 2108, 2110, and 2114 are collectively referred to as an electrochromic stack 2120. A voltage source 2116 operable to apply a potential across the electrochromic stack 2120 effects a transition of the electrochromic coating from, for example, a clear state to a tinted state. In other embodiments, the order of the layers is reversed relative to the substrate; that is, the layers are in the following order: substrate, TCL, counter electrode layer, ion conducting layer, electrochromic material layer, and TCL. In various embodiments, the ion conductor region (e.g., 2108) may be formed from a portion of the EC layer (e.g., 2106) and/or a portion of the CE layer (e.g., 2110). In such embodiments, the electrochromic stack (e.g., 2120) may be deposited so as to include a cathodically coloring electrochromic material (the EC layer) in direct physical contact with an anodically coloring counter electrode material (the CE layer). The ion conductor region (sometimes referred to as an interface region, or as an ion-conducting, substantially electrically insulating layer or region) may then be formed, for example by heating and/or other processing steps, where the EC and CE layers meet. Examples of electrochromic devices (e.g., including those fabricated without depositing a distinct ion conductor material) may be found in U.S. Patent Application No. 13/462,725, entitled "ELECTROCHROMIC DEVICES", filed May 2, 2012, which is incorporated herein by reference in its entirety. In some embodiments, the EC device coating may comprise one or more additional layers, such as one or more passive layers. Passive layers can be used to improve certain optical properties, provide moisture resistance, and/or provide scratch resistance. These and/or other passive layers may be used to hermetically seal the EC stack 2120. Various layers, including the transparent conductive layers (such as 2104 and 2114), may be treated with antireflective and/or protective layers (e.g., oxide and/or nitride layers).
In some embodiments, the IGU comprises two (or more) substantially transparent substrates. For example, the IGU may include two panes of glass. At least one substrate of the IGU may include an electrochromic device disposed thereon. One or more panes of the IGU may have a separator disposed between them. The IGU may be a hermetically sealed construction, e.g., having an interior region isolated from the surrounding environment. A "window assembly" may include an IGU. A "window assembly" may include a (e.g., free-standing) laminate. A "window assembly" may include one or more electrical leads, e.g., for connecting the IGU and/or laminate. The electrical leads may operatively couple (e.g., connect) one or more electrochromic devices to a voltage source, switches, and the like. The window assembly may include a frame that supports the IGU or laminate. The window assembly may include a window controller and/or a component of a window controller (e.g., a dock).
Fig. 22 shows an exemplary implementation of an IGU 2200 comprising a first pane 2204 having a first surface S1 and a second surface S2. In some implementations, the first surface S1 of the first pane 2204 faces an external environment, such as outdoors or an external environment. The IGU 2200 also includes a second pane 2206 having a first surface S3 and a second surface S4. In some implementations, the second surface S4 of the second pane 2206 faces an interior environment, such as an interior environment of a home, building, or vehicle, or a room or compartment within a home, building, or vehicle.
In some embodiments, the first pane 2204 and/or the second pane 2206 (e.g., each) is transparent and/or translucent to light, e.g., in the visible spectrum. For example, the first pane 2204 and/or the second pane 2206 (e.g., each) can be formed of a glass material (e.g., architectural glass or another shatterproof glass material, such as a silica-based (SiOx) glass material). The first pane 2204 and/or the second pane 2206 (e.g., each) can be a soda lime glass substrate or a float glass substrate. Such glass substrates may be composed of, for example, about 75% silicon dioxide (SiO2) plus Na2O, CaO, and several trace additives. However, the first pane 2204 and/or the second pane 2206 (e.g., each) can be formed of any material having suitable optical, electrical, thermal, and mechanical properties. For example, other suitable substrates that may be used as one or both of the first pane 2204 and the second pane 2206 include other glass materials as well as plastic, semi-plastic, and thermoplastic materials (e.g., poly(methyl methacrylate), polystyrene, polycarbonate, allyl diglycol carbonate, SAN (styrene acrylonitrile copolymer), poly(4-methyl-1-pentene), polyester, and polyamide) and/or mirrored (specular) materials. In some embodiments, the first pane 2204 and/or the second pane 2206 (e.g., each) can be strengthened, for example by tempering, heating, or chemical strengthening.
In fig. 22, the first and second panes 2204, 2206 are spaced apart from one another by a spacer 2218, which is generally a frame structure, to form an interior volume 2208. In some embodiments, the interior volume is filled with argon (Ar) or another gas, such as another inert gas (e.g., krypton (Kr) or xenon (Xe)), another (non-inert) gas, or a gas mixture (e.g., air). Filling the interior volume 2208 with a gas such as Ar, Kr, or Xe may reduce conductive heat transfer through the IGU 2200. Without wishing to be bound by theory, this may be due to the low thermal conductivity of these gases; it may also improve acoustic insulation, for example due to their increased atomic weight. In some embodiments, the interior volume 2208 can be evacuated of air or other gases. The spacer 2218 generally determines the height "C" of the interior volume 2208 (e.g., the spacing between the first pane 2204 and the second pane 2206). In fig. 22, the thicknesses (and/or relative thicknesses) of the ECD, the sealants 2220/2222, and the bus bars 2226/2228 may not be drawn to scale; these components are generally thin and are exaggerated here, for example, for ease of illustration only. In some embodiments, the spacing "C" between the first pane 2204 and the second pane 2206 is in the range of about 6 mm to about 30 mm. The width "D" of the spacer 2218 can be in the range of about 5 mm to about 15 mm (although other widths are possible and may be desired). The spacer 2218 may be a frame structure formed around all sides of the IGU 2200 (e.g., the top, bottom, left, and right sides of the IGU 2200). For example, the spacer 2218 may be formed from a foam or plastic material. In some embodiments, the spacer 2218 may be formed from metal or another conductive material, for example, a metal tube or channel structure having at least 3 sides: two sides for sealing to each substrate, and one side for supporting and separating the panes and serving as a surface on which to apply the sealant 2224. A first primary seal 2220 adheres to and hermetically seals the spacer 2218 and the second surface S2 of the first pane 2204. A second primary seal 2222 adheres to and hermetically seals the spacer 2218 and the first surface S3 of the second pane 2206. In some implementations, each of the primary seals 2220 and 2222 may be formed from a viscous sealant, such as polyisobutylene (PIB). In some implementations, the IGU 2200 also includes a secondary seal 2224 that hermetically seals the boundary of the entire IGU 2200 around the exterior of the spacer 2218. To this end, the spacer 2218 may be inset a distance "E" from the edges of the first and second panes 2204, 2206, which may be in the range of about four (4) millimeters (mm) to about eight (8) mm (although other distances are possible and may be desired). In some implementations, the secondary seal 2224 may be formed from a viscous sealant, e.g., a polymeric material that is water resistant and adds structural support to the assembly, such as silicone, polyurethane, or similar structural sealants that form a water-tight seal.
In the example of fig. 22, the ECD coating on surface S2 of substrate 2204 extends around its entire perimeter to and under the spacer 2218. This configuration is functionally desirable because it protects the edges of the ECD within the primary sealant 2220, and is aesthetically desirable because there is a single piece of ECD within the inner perimeter of the spacer 2218, without any bus bars or scribe lines.
Examples of configurations of IGUs are described in U.S. Patent 8,164,818 (attorney docket No. VIEWP006), issued April 24, 2012 and entitled "ELECTROCHROMIC WINDOW FABRICATION METHODS"; U.S. Patent Application 13/456,056 (attorney docket No. VIEWP006X1), filed April 25, 2012 and entitled "ELECTROCHROMIC WINDOW FABRICATION METHODS"; and PCT Patent Application PCT/US2012/068817 (attorney docket No. VIEWP036), entitled "THIN-FILM DEVICES AND FABRICATION"; each of which is incorporated by reference herein in its entirety.
In the example shown in fig. 22, the ECD 2210 is formed on the second surface S2 of the first pane 2204. The ECD 2210 includes an electrochromic ("EC") stack 2212, which may itself include one or more layers. For example, the EC stack 2212 may include an electrochromic layer, an ion conducting layer, and a counter electrode layer. The electrochromic layer may be formed from one or more inorganic solid materials. The electrochromic layer may include, or be formed from, one or more of a variety of electrochromic materials, including electrochemically cathodic or electrochemically anodic materials. The EC stack 2212 may be interposed between first and second conductive (or "conducting") layers. For example, the ECD 2210 may include a first transparent conductive oxide (TCO) layer 2214 adjacent to a first surface of the EC stack 2212 and a second TCO layer 2216 adjacent to a second surface of the EC stack 2212. Examples of similar EC devices and smart windows may be found in U.S. Patent No. 8,764,950, issued July 1, 2014 to Wang et al. and entitled "ELECTROCHROMIC DEVICES", and U.S. Patent No. 9,261,751, issued February 16, 2016 to Pradhan et al. and entitled "ELECTROCHROMIC DEVICES", both of which are incorporated herein by reference in their entirety. In some implementations, the EC stack 2212 may also include one or more additional layers, such as one or more passive layers. For example, passive layers can be used to improve certain optical properties, provide moisture resistance, or provide scratch resistance. These or other passive layers may also be used to hermetically seal the EC stack 2212.
In some embodiments, the selection or design of the electrochromic and counter electrode materials generally controls the possible optical transitions. During operation, in response to a voltage generated across the thickness of the EC stack (e.g., between the first and second TCO layers), the electrochromic layer transfers or exchanges ions to or from the counter electrode layer to drive the electrochromic layer to the desired optical state. To transition the EC stack to a transparent state, a positive voltage may be applied across the EC stack (e.g., such that the electrochromic layer is more positive than the counter electrode layer). In some implementations, the available ions in the stack reside primarily in the counter electrode layer in response to the application of the positive voltage. When the magnitude of the electrical potential across the EC stack is reduced, or when its polarity is reversed, ions may be transported back through the ion conducting layer to the electrochromic layer, causing the electrochromic material to transition to an opaque state (or a "more tinted", "darker", or "less transparent" state). Conversely, in some embodiments using electrochromic layers with different properties, to transition the EC stack to an opaque state, a negative voltage is applied to the electrochromic layer relative to the counter electrode layer. For example, when the magnitude of the electrical potential across the EC stack is reduced, or its polarity reversed, the ions may be transported back through the ion conducting layer to the electrochromic layer, causing the electrochromic material to transition to a transparent or "bleached" state (or a "less tinted", "lighter", or "more transparent" state).
In some embodiments, the transfer or exchange of ions to or from the counter electrode layer also results in an optical transition in the counter electrode layer. For example, in some embodiments, the electrochromic layer and the counter electrode layer are complementary coloring layers. More specifically, in some such embodiments, the counter electrode layer becomes more transparent when or after ions are transferred into the counter electrode layer, and, similarly, the electrochromic layer becomes more transparent when or after ions are transferred out of the electrochromic layer. Conversely, when the polarity is switched, or the potential is decreased, and ions are transferred from the counter electrode layer into the electrochromic layer, both the counter electrode layer and the electrochromic layer become less transparent.
In some embodiments, the transition of the electrochromic layer from one optical state to another is caused by reversible ion insertion into the electrochromic material (e.g., by intercalation) and corresponding charge-balancing electron injection. In some cases, a portion of the ions responsible for the optical transition may be irreversibly bound in the electrochromic material. In some embodiments, suitable ions include lithium ions (Li+) and hydrogen ions (H+) (i.e., protons). In some other embodiments, other ions may be suitable. For example, lithium ions intercalated into tungsten oxide (WO3−y, 0 ≤ y ≤ 0.3) change the tungsten oxide from a transparent state to a blue state.
In some embodiments, the tinting transition is a transition from a transparent (or "translucent", "bleached", or "least tinted") state to an opaque (or "fully darkened" or "fully tinted") state. Another example of a tinting transition is the reverse, i.e., a transition from an opaque state to a transparent state. Other examples of tinting transitions include transitions to various intermediate tint states, e.g., a transition from a less tinted, lighter, or more transparent state to a more tinted, darker, or less transparent state, and vice versa. Each of these tint states, and the tinting transitions between them, may be characterized or described in terms of a transmission percentage. For example, a tinting transition may be described as a transition from a current percent transmission (% T) to a target % T. Alternatively, in some other cases, each tint state and the tinting transitions between them may be characterized or described in terms of a tint percentage, e.g., a transition from a current tint percentage to a target tint percentage.
In some implementations, the voltage applied to the transparent electrode layers (e.g., across the EC stack) follows a control curve for driving a transition in the optically switchable device. For example, a window controller may be used to generate and apply the control curve to drive an ECD from a first optical state (e.g., a transparent state or a first intermediate state) to a second optical state (e.g., a fully tinted state or a more tinted intermediate state). To drive the ECD in the reverse direction, from a more tinted state to a less tinted state, the window controller may apply a similar but inverted curve. In some embodiments, the control curves for tinting and lightening may be asymmetric. For example, a transition from a first, more tinted state to a second, less tinted state may in some cases require more time than the reverse transition, i.e., from the second, less tinted state to the first, more tinted state. In some embodiments, the opposite may be true: the transition from the second, less tinted state to the first, more tinted state may take more time. Depending on the device structure and materials, bleaching or lightening is not necessarily simply the reverse of tinting or coloring. Indeed, ECDs often behave differently for each transition, due to differences in the driving forces for ion intercalation into, and deintercalation out of, the electrochromic material.
Fig. 23 shows an exemplary control curve 2300, implemented as a voltage control curve, in which the voltage provided to the ECD is varied. The solid line in fig. 23 represents the effective voltage V_Eff applied across the ECD during a tinting transition and during subsequent maintenance; for example, the solid line may represent the difference over time between the voltages V_App1 and V_App2 applied to the two conductive layers of the ECD. The dashed line in fig. 23 represents the corresponding current (I) through the device. In the example shown, the voltage control curve 2300 includes four phases: a ramp-to-drive phase 2302 that initiates the transition, followed by a drive phase that continues to drive the transition, a ramp-to-hold phase, and a hold phase.
In fig. 23, the ramp-to-drive phase 2302 is characterized by the application of a voltage ramp whose magnitude increases from an initial value at time t0 to a maximum drive value V_Drive at time t1. For example, the ramp-to-drive phase 2302 can be defined by three drive parameters known or set by the window controller: the initial voltage (the current voltage across the ECD at the start of the transition), the magnitude of V_Drive (which controls the end optical state), and the duration over which the ramp is applied (which determines the transition speed). The window controller may also set a target ramp rate, a maximum ramp rate, or a ramp type (e.g., a linear ramp, a second degree ramp, or an nth degree ramp). In some embodiments, the ramp rate may be limited to avoid damaging the ECD.
In fig. 23, the drive phase 2304 involves applying a constant voltage V_Drive, beginning at time t1 and ending at time t2, at which point the end optical state is reached (or approximately reached). The ramp-to-hold phase 2306 is characterized by the application of a voltage ramp whose magnitude decreases from the drive value V_Drive at time t2 to a minimum hold value V_Hold at time t3. In some embodiments, the ramp-to-hold phase 2306 may be defined by three drive parameters known or set by the window controller: the drive voltage V_Drive, the holding voltage V_Hold, and the duration over which the ramp is applied. The window controller may also set a ramp rate or ramp type (e.g., a linear ramp, a second degree ramp, or an nth degree ramp).
In fig. 23, the hold phase 2308 is characterized by the application of a constant voltage V_Hold beginning at time t3. The holding voltage V_Hold may be used to maintain the ECD in the end optical state; accordingly, the application of V_Hold may last for as long as the ECD is to be maintained in the end optical state. For example, a leakage current I_Leak resulting from non-idealities associated with the ECD can cause charge to drain slowly from the ECD. This draining of charge can result in a corresponding migration of ions back across the ECD and, thus, a slow reversal of the optical transition. The holding voltage V_Hold may be applied continuously to counteract or prevent the leakage current. In some embodiments, the holding voltage V_Hold is applied periodically to "refresh" the desired optical state or, in other words, to bring the ECD back to the desired optical state.
The voltage control curve 2300 shown and described with reference to fig. 23 is but one example of a voltage control curve suitable for certain implementations. Many other curves may be desirable or appropriate in such implementations, or in various other implementations or applications, and these other curves can also be readily achieved using the controllers and optically switchable devices disclosed herein. For example, a current curve may be applied instead of a voltage curve; in some embodiments, a current control curve similar to the current density shown in fig. 23 may be applied. In some embodiments, the control curve may have more than four phases. For example, the voltage control curve may include one or more overdrive phases: the magnitude of the voltage ramp applied during the first phase 2302 may increase beyond the drive voltage V_Drive to an overdrive voltage V_OD. The first phase 2302 may then be followed by a ramp phase 2303 during which the applied voltage decreases from the overdrive voltage V_OD to the drive voltage V_Drive. In some embodiments, the overdrive voltage V_OD may be applied for a relatively short duration before the voltage is ramped down to the drive voltage V_Drive.
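A hedged sketch of the four-phase voltage curve of fig. 23 follows; all timings and voltages are hypothetical parameters, and the optional overdrive variant described above would add an extra ramp-down segment after the first phase:

```python
# Sketch of the four-phase voltage control curve of Fig. 23. All timings
# and voltages below are hypothetical parameters, not disclosed values.
def control_voltage(t, t1=10.0, t2=40.0, t3=50.0,
                    v0=0.0, v_drive=2.5, v_hold=1.0):
    """Effective voltage across the ECD at time t (seconds)."""
    if t < t1:                                  # ramp-to-drive (phase 2302)
        return v0 + (v_drive - v0) * (t / t1)   # linear ramp type
    if t < t2:                                  # drive (phase 2304)
        return v_drive
    if t < t3:                                  # ramp-to-hold (phase 2306)
        return v_drive + (v_hold - v_drive) * (t - t2) / (t3 - t2)
    return v_hold                               # hold (phase 2308)

for t in (0, 5, 10, 25, 45, 60):
    print(t, round(control_voltage(t), 2))      # 0.0, 1.25, 2.5, 2.5, 1.75, 1.0
```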
In some embodiments, the applied voltage or current profile is interrupted for a relatively short duration to provide an open circuit condition across the device. While this open circuit condition is in effect, the actual voltage or other electrical characteristics may be measured, detected, or otherwise determined in order to monitor how far the optical transition has progressed and, in some cases, to determine whether a change in the curve is required. Such open circuit conditions may also be provided during the hold phase, e.g., to determine whether the holding voltage V_Hold should be applied or whether its magnitude should be changed. Examples relating to controlling optical transitions are provided in PCT Patent Application PCT/US14/43514, filed June 20, 2014 and entitled "CONTROLLING TRANSITIONS IN OPTICALLY SWITCHABLE DEVICES", which is incorporated herein by reference in its entirety.
In one or more aspects, one or more of the functions described herein may be implemented in hardware, digital electronic circuitry, analog electronic circuitry, computer software, firmware (including the structures disclosed in this specification and their structural equivalents), or any combination thereof. Certain embodiments of the subject matter described herein may also be implemented as one or more controllers, computer programs, or physical structures, e.g., one or more modules of computer program instructions encoded on a computer storage medium for execution by, or to control the operation of, a window controller, a network controller, and/or an antenna controller. Any of the disclosed embodiments presented as or for electrochromic windows may be more generally embodied as or for switchable optical devices (including windows, mirrors, etc.).
Various modifications to the embodiments described in this disclosure will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the claims are not intended to be limited to the implementations shown herein but are to be accorded the widest scope consistent with the present disclosure, the principles and novel features disclosed herein. In addition, those of ordinary skill in the art will readily appreciate that the terms "upper" and "lower" are sometimes used to facilitate the description of the figures and indicate relative positions corresponding to the orientation of the graphics on a properly oriented page and may not reflect the proper orientation of the implemented device.
Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment.
Conversely, various features that are described in the context of a single implementation can also be implemented in multiple implementations separately or in any suitable subcombination. Furthermore, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this does not necessarily imply that the operations need be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Further, the figures may schematically depict one or more example processes in the form of a flow diagram. However, other operations not shown may be included in the example process schematically shown. For example, one or more additional operations may be performed before, after, concurrently with, or between any of the illustrated operations. In some cases, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products. Additionally, other implementations are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results.
While preferred embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. The invention is not intended to be limited to the specific examples provided within the specification. While the invention has been described with reference to the foregoing specification, the description and illustrations of the embodiments herein are not meant to be construed in a limiting sense. Numerous variations, changes, and substitutions will occur to those skilled in the art without departing from the invention herein. Further, it is to be understood that all aspects of the present invention are not limited to the specific depictions, configurations or relative proportions set forth herein which depend upon a variety of conditions and variables. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is therefore contemplated that the present invention will also cover any such alternatives, modifications, variations or equivalents. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.

Claims (44)

1. A method for controlling an interactive target of a facility, the method comprising:
(A) Monitoring a position of a mobile circuit relative to a digital twin, the digital twin comprising a virtual three-dimensional representation of a structural feature of the facility having a real interactive target, the mobile circuit (I) being movable by a user, (II) having a known position relative to at least a portion of the structural feature, and (III) being coupled to a virtual representation of the real interactive target in the digital twin;
(B) Correlating a gesture imparted on the mobile circuit with the digital twin and generating a result, the gesture (i) being imparted by the user, (ii) being intended to remotely cause a change in the real interactive target, and (iii) being imparted during coupling with the real interactive target; and
(C) Using the result to change a current state of the real interactive target in the facility.
2. The method of claim 1, wherein the facility includes a control network communicatively coupled to the real interactive target to support (a) monitoring a location of the mobile circuit and/or (b) changing the current state of the real interactive target.
3. The method of claim 1, wherein the mobile circuit is included in a Virtual Reality (VR) interface, the VR interface including a display headset and/or a handheld controller with motion or selection functionality.
4. The method of claim 1, wherein the digital twin comprises a virtual three-dimensional representation of a plurality of structural features of the facility, the plurality of structural features comprising fixed or non-fixed devices of the facility.
5. The method of claim 1, wherein the coupling of the mobile circuit to the virtual representation of the real interactive target in the digital twin includes (i) a spatial relationship between the mobile circuit and the at least a portion of the structural feature identified in at least two dimensions, and (ii) a relative pointing direction of the mobile circuit to the real interactive target.
6. The method of claim 1, wherein the current state change comprises a change in a signal sent to an optically tintable window.
7. The method of claim 1, wherein the current state change comprises a command setting of an environmental control unit.
8. The method of claim 7, wherein the command settings comprise (i) a tint density of a tintable window, (ii) a temperature setting of an HVAC unit, (iii) a fan setting of the HVAC unit, or (iv) an on/off setting of a lighting unit.
9. The method of claim 1, further comprising exchanging one or more messages between the digital twin and the mobile circuit to (i) provide the mobile circuit with an analysis corresponding to an initial virtual location, and (ii) virtually navigate in the digital twin to interact with the virtual representation of the real interactive target in the digital twin.
10. The method of claim 9, wherein the user manipulating the mobile circuit is positioned away from the known location, and wherein the initial virtual location is aligned with a virtual representation of the known location in the digital twin.
11. The method of claim 1, wherein the digital twin comprises a virtual three-dimensional representation of a plurality of structural features of the facility, the plurality of structural features comprising a plurality of real interactive targets, and wherein the virtual three-dimensional representation is modified in response to adding and/or subtracting the real interactive targets in the facility.
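Purely as an illustrative sketch of the gesture correlation in claims 1 through 11 (not part of the claims), the digital twin can be treated as a table of target positions and the gesture resolved by casting a ray from the mobile circuit; all names, coordinates, and the angular tolerance below are hypothetical:

```python
import math

# Hypothetical digital-twin entries: real interactive targets and the
# positions of their virtual representations (metres, facility frame).
TARGETS = {
    "window_3F_east": (12.0, 4.5, 1.8),
    "hvac_zone_3": (10.0, 8.0, 2.6),
}

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def resolve_gesture(circuit_pos, pointing_dir, max_angle_deg=5.0):
    """Correlate a pointing gesture with the digital twin: return the
    target whose virtual representation lies closest to the ray cast
    from the mobile circuit, within an angular tolerance."""
    pointing_dir = normalize(pointing_dir)
    best, best_angle = None, max_angle_deg
    for name, pos in TARGETS.items():
        to_target = normalize(tuple(p - c for p, c in zip(pos, circuit_pos)))
        cos_a = sum(a * b for a, b in zip(pointing_dir, to_target))
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

# A user at (10, 4, 1.5) points toward the east window; the resolved
# target name would then be handed to the control system to change the
# current state of the real interactive target (e.g., its tint).
print(resolve_gesture((10.0, 4.0, 1.5), (1.0, 0.25, 0.15)))
```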
12. An apparatus for controlling an interactive target of a facility, the apparatus comprising one or more controllers having circuitry, wherein the one or more controllers are configured to:
(A) Communicatively coupled to (a) a digital twin comprising a virtual three-dimensional representation of a structural feature of the facility with a real interactive target, and (b) a mobile circuit that (I) is movable by a user, (II) has a known position relative to at least a portion of the structural feature, and (III) is coupled to a virtual representation of the real interactive target in the digital twin;
(B) Monitoring or directing monitoring of a position of the mobile circuit relative to the digital twin;
(C) Correlating or directing correlation of a gesture imparted on the mobile circuit with the digital twin, and generating a result, the gesture (i) being imparted by the user, (ii) being intended to remotely cause a change in the real interactive target, and (iii) being imparted during coupling of the user with the real interactive target; and
(D) Using the result to change or direct a change in a current state of the real interactive target in the facility.
13. A non-transitory computer program product for controlling an interactive target of a facility, the non-transitory computer program product comprising instructions recorded thereon, which when executed by one or more processors, cause the one or more processors to perform operations comprising:
(A) Monitoring a position of a mobile circuit relative to a digital twin, the digital twin comprising a virtual three-dimensional representation of a structural feature of the facility having a real interactive target, the mobile circuit (I) being movable by a user, (II) having a known position relative to at least a portion of the structural feature, and (III) being coupled to a virtual representation of the real interactive target in the digital twin;
(B) Correlating a gesture imparted on the mobile circuit with the digital twin and generating a result, the gesture (i) being imparted by the user, (ii) being intended to remotely cause a change in the real interactive target, and (iii) being imparted during coupling with the real interactive target; and
(C) Using the result to change a current state of the real interactive target in the facility.
14. The non-transitory computer program product of claim 13, wherein the operations include configuring the digital twin in accordance with a building information modeling data file according to which the facility was built or is to be built.
15. The non-transitory computer program product of claim 14, wherein the building information modeling data file is used to plan and/or track various stages in a lifecycle of the facility, including conception, construction, maintenance, and/or demolition of the facility.
16. The non-transitory computer program product of claim 13, wherein the facility includes a digital network, and wherein the digital network is used, at least in part, to monitor the location of the mobile circuit and/or gestures imparted on the mobile circuit.
17. The non-transitory computer program product of claim 13, wherein the mobile circuit includes or is coupled to a motion sensor.
18. The non-transitory computer program product of claim 13, wherein the change in the function of the real interactive target is commensurate with the intent of the gesture.
19. The non-transitory computer program product of claim 13, wherein the digital twin represents a plurality of structural features, wherein the structural features include static elements and/or dynamic elements.
20. A method for controlling a service device of a facility, the method comprising:
(a) Identifying, by a control system configured to control the service device, the service device proximate to a user disposed in the facility;
(b) Registering in the control system a location of the user in the facility;
(c) Optionally selecting the service device from a plurality of devices, the service device being selected based at least in part on the location of the user; and
(d) Directing the service device to perform a service by using (i) the location of the user and/or (ii) the service device selected by the user.
21. The method of claim 20, wherein the service device is a media screen in a peripheral structure in which the user is located.
22. The method of claim 20, wherein the service device has a first range, wherein the user has a second range, and wherein adjacent to the user comprises an intersection between the first range of the service device and the second range of the user.
23. The method of claim 22, wherein the first range is specific to the service device, a service device type, and/or a location of the service device.
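As a non-authoritative sketch of the range-intersection test of claims 22 and 23, assuming circular ranges with per-device-type radii (all values hypothetical):

```python
import math

# Hypothetical first ranges per service device type, and a user range.
DEVICE_RANGE_M = {"media_screen": 6.0, "printer": 3.0}
USER_RANGE_M = 2.0

def is_proximate(device_pos, device_type, user_pos):
    """The service device is proximate to the user when the device's
    first range intersects the user's second range, i.e. the centres
    are closer together than the sum of the two radii."""
    dx = device_pos[0] - user_pos[0]
    dy = device_pos[1] - user_pos[1]
    return math.hypot(dx, dy) < DEVICE_RANGE_M[device_type] + USER_RANGE_M

print(is_proximate((7.0, 0.0), "media_screen", (0.0, 0.0)))  # True: 7 < 6 + 2
print(is_proximate((7.0, 0.0), "printer", (0.0, 0.0)))       # False: 7 > 3 + 2
```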
24. The method of claim 20, wherein the control system is a control system configured to control at least one other device attached to the facility.
25. The method of claim 20, wherein the service of the service device is depicted on the service device, and wherein the user selects the service without contacting the service device by pointing a mobile circuit to the depiction of the service on the service device.
26. The method of claim 25, further comprising depicting a virtual representation of at least a portion of the facility in which the service device is disposed, the depicting by the mobile circuit.
27. The method of claim 24, wherein the at least one other device comprises a media display, lighting, a sensor, a transmitter, an antenna, and/or a heating, ventilation, and air conditioning (HVAC) system.
28. A non-transitory computer program product for controlling service devices of a facility, the non-transitory computer program product comprising instructions recorded thereon, which, when executed by one or more processors, cause the one or more processors to perform operations comprising:
(a) Identifying, by a control system configured to control the service device, the service device proximate to a user disposed in the facility;
(b) Registering in the control system a location of the user in the facility;
(c) Optionally selecting the service device from a plurality of devices, the service device being selected based at least in part on the location of the user; and
(d) Directing the service device to perform a service by utilizing (i) the location of the user and/or (ii) the service device selected by the user.
29. The non-transitory computer program product of claim 28, wherein the selection of the service is provided by an application installed in a mobile circuit held by the user.
30. An apparatus for controlling service devices of a facility, the apparatus comprising at least one controller having circuitry, the at least one controller configured to:
(a) Operatively coupled to the service device and controlling or directing control of the service device;
(b) Identifying or directing identification of the service device disposed proximate to a user disposed in the facility;
(c) Registering or directing registration of the user's location in the facility;
(d) Optionally selecting or directing selection of the service device from a plurality of devices, the service device being selected based at least in part on the location of the user; and
(e) Directing the service device to perform a service based on (i) the location of the user and/or (ii) the service device selected by the user.
31. The apparatus of claim 30, wherein the at least one controller is configured to control or direct control of the service device by using a building automation and control protocol.
32. The apparatus of claim 30, wherein the at least one controller is configured to determine or direct determination of the location of the user.
33. The apparatus of claim 32, wherein the at least one controller is configured to determine or direct determination of the location of the user by using ultra-wideband (UWB) radio waves.
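One common way to obtain the user location from ultra-wideband ranging, offered here only as a sketch (the anchor coordinates and the linearized least-squares approach are illustrative assumptions, not the claimed method):

```python
# 2-D position from UWB ranges to three fixed anchors, obtained by
# subtracting the first anchor's circle equation from the others,
# which yields a 2x2 linear system. Anchor coordinates are hypothetical.
ANCHORS = [(0.0, 0.0), (10.0, 0.0), (0.0, 8.0)]

def locate(distances):
    (x0, y0), d0 = ANCHORS[0], distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(ANCHORS[1:], distances[1:]):
        rows.append((2 * (xi - x0), 2 * (yi - y0)))
        rhs.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # Solve the 2x2 system rows @ [x, y] = rhs by Cramer's rule.
    det = rows[0][0] * rows[1][1] - rows[0][1] * rows[1][0]
    x = (rhs[0] * rows[1][1] - rows[0][1] * rhs[1]) / det
    y = (rows[0][0] * rhs[1] - rhs[0] * rows[1][0]) / det
    return x, y

# A user at (4, 3) measures ranges 5.0, sqrt(45), sqrt(41) to the anchors.
print(locate([5.0, 45 ** 0.5, 41 ** 0.5]))  # ~(4.0, 3.0)
```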
34. The apparatus of claim 30, wherein the at least one controller is configured to provide or direct provision of the service device through an application installed in a mobile circuit held by the user.
35. A method of controlling a facility, the method comprising:
(a) Identifying, by a control system, an identity of a user;
(b) Tracking a location of the user in the facility by using one or more sensors disposed in the facility, the one or more sensors communicatively coupled to the control system;
(c) Receiving input related to the user; and
(d) Using the control system to automatically control one or more devices in the facility using the input and the user's location information, the one or more devices communicatively coupled to the control system.
36. The method of claim 35, wherein the input related to the user comprises a gesture and/or a voice command made by the user.
37. The method of claim 35, wherein the input related to the user relates to a preference of the user.
38. The method of claim 37, wherein the preferences of the user are provided by machine learning that takes into account past activities of the user.
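Claim 38's machine-learned preference could be approximated, purely as a sketch, by nearest-neighbour recall over the user's past activity; the hour-of-day feature and the history data below are hypothetical:

```python
from collections import Counter

# Hypothetical past activity: (hour_of_day, chosen_tint_level) pairs.
HISTORY = [(9, 2), (10, 3), (11, 3), (14, 4), (15, 4), (16, 3), (20, 1)]

def preferred_tint(hour, k=3):
    """k-nearest-neighbour vote over past observations: recall the tint
    level the user most often chose at a similar hour of day."""
    nearest = sorted(HISTORY, key=lambda ht: abs(ht[0] - hour))[:k]
    votes = Counter(tint for _, tint in nearest)
    return votes.most_common(1)[0][0]

print(preferred_tint(10))  # 3: the user historically picked tint 3 mid-morning
print(preferred_tint(15))  # 4: darker tint preferred against afternoon glare
```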
39. The method of claim 35, wherein the one or more devices comprise a lighting system, a ventilation and air conditioning system, a heating system, a sound system, or an odor conditioning system.
40. A non-transitory computer-readable medium for controlling a facility, which, when read by one or more processors, causes the one or more processors to perform operations comprising the method operations of any of claims 35 to 39.
41. An apparatus for controlling a facility, the apparatus comprising at least one controller having circuitry, the at least one controller configured to:
(a) Operatively coupled to one or more sensors disposed in the facility and one or more devices disposed in the facility;
(b) Identifying or directing identification of a user;
(c) Tracking or directing tracking of the user's location in the facility by using the one or more sensors;
(d) Receiving input relating to the user; and
(e) Automatically controlling or directing automatic control of the one or more devices in the facility by using the input and the user's location information.
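A minimal, non-authoritative sketch of the claim-41 control flow follows, with every device and sensor interface stubbed as an assumption:

```python
# Identify the user, track location, receive user-related input, and
# control devices accordingly. All interfaces below are stubs.

def identify_user(badge_or_image):
    return "user_42"  # stub: identification card read or image recognition

def track_location(user_id):
    return (12.0, 4.5)  # stub: fused sensor position (metres)

def control_devices(user_id, location, user_input):
    """Dispatch the input to devices near the user's location."""
    if user_input == "darker":
        print(f"tint the window nearest {location} for {user_id}")
    elif user_input == "warmer":
        print(f"raise the HVAC setpoint in the zone of {location}")

user = identify_user(badge_or_image=None)
where = track_location(user)
control_devices(user, where, user_input="darker")
```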
42. The apparatus of claim 41, wherein the at least one controller is configured to identify or direct identification of the user at least in part by (I) receiving an identification card reading or (II) performing image recognition on a captured image of the user in the facility.
43. The apparatus of claim 41, wherein the one or more devices comprise a tintable window.
44. A non-transitory computer-readable medium for controlling a facility, which, when read by one or more processors, causes the one or more processors to perform operations comprising the operations of the at least one controller of any of claims 41 to 43.
CN202180029153.7A 2020-04-16 2021-04-15 Interaction between peripheral structures and one or more occupant-related applications Withdrawn CN115485614A (en)

Applications Claiming Priority (27)

Application Number Priority Date Filing Date Title
US202063010977P 2020-04-16 2020-04-16
US63/010,977 2020-04-16
US16/946,947 2020-07-13
US16/946,947 US11592723B2 (en) 2009-12-22 2020-07-13 Automated commissioning of controllers in a window network
US202063052639P 2020-07-16 2020-07-16
US63/052,639 2020-07-16
US202063080899P 2020-09-21 2020-09-21
US63/080,899 2020-09-21
US202063085254P 2020-09-30 2020-09-30
US63/085,254 2020-09-30
PCT/US2020/053641 WO2021067505A1 (en) 2019-10-05 2020-09-30 Tandem vision window and media display
USPCT/US2020/053641 2020-09-30
US17/081,809 US11460749B2 (en) 2017-04-26 2020-10-27 Tintable window system computing platform
US17/081,809 2020-10-27
US17/083,128 2020-10-28
US17/083,128 US20210063836A1 (en) 2017-04-26 2020-10-28 Building network
US16/950,774 US20210132458A1 (en) 2017-04-26 2020-11-17 Displays for tintable windows
US16/950,774 2020-11-17
US202063115842P 2020-11-19 2020-11-19
US63/115,842 2020-11-19
US17/249,148 US11735183B2 (en) 2012-04-13 2021-02-22 Controlling optically-switchable devices
US17/249,148 2021-02-22
US202163154352P 2021-02-26 2021-02-26
US63/154,352 2021-02-26
US202163170245P 2021-04-02 2021-04-02
US63/170,245 2021-04-02
PCT/US2021/027418 WO2021211798A1 (en) 2020-04-16 2021-04-15 Interaction between an enclosure and one or more occupants

Publications (1)

Publication Number Publication Date
CN115485614A true CN115485614A (en) 2022-12-16

Family

ID=78084605

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180029153.7A Withdrawn CN115485614A (en) 2020-04-16 2021-04-15 Interaction between peripheral structures and one or more occupant-related applications

Country Status (5)

Country Link
EP (1) EP4136504A4 (en)
CN (1) CN115485614A (en)
CA (1) CA3169817A1 (en)
TW (1) TW202147074A (en)
WO (1) WO2021211798A1 (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10303035B2 (en) 2009-12-22 2019-05-28 View, Inc. Self-contained EC IGU
US11054792B2 (en) 2012-04-13 2021-07-06 View, Inc. Monitoring sites containing switchable optical devices and controllers
US10989977B2 (en) 2011-03-16 2021-04-27 View, Inc. Onboard controller for multistate windows
WO2016004109A1 (en) 2014-06-30 2016-01-07 View, Inc. Control methods and systems for networks of optically switchable windows during reduced power availability
EP4145379A1 (en) 2014-03-05 2023-03-08 View, Inc. Monitoring sites containing switchable optical devices and controllers
US11868103B2 (en) 2014-03-05 2024-01-09 View, Inc. Site monitoring system
US11743071B2 (en) 2018-05-02 2023-08-29 View, Inc. Sensing and communications unit for optically switchable window systems
US11740948B2 (en) 2014-12-08 2023-08-29 View, Inc. Multiple interacting systems at a site
EP3616189A4 (en) 2017-04-26 2020-12-09 View, Inc. Tintable window system for building services
US11747698B2 (en) 2017-04-26 2023-09-05 View, Inc. Tandem vision window and media display
US11892738B2 (en) 2017-04-26 2024-02-06 View, Inc. Tandem vision window and media display
US11747696B2 (en) 2017-04-26 2023-09-05 View, Inc. Tandem vision window and media display
EP3966963A2 (en) 2019-05-09 2022-03-16 View, Inc. Antenna systems for controlled coverage in buildings
TW202206925A (en) 2020-03-26 2022-02-16 美商視野公司 Access and messaging in a multi client network
US11631493B2 (en) 2020-05-27 2023-04-18 View Operating Corporation Systems and methods for managing building wellness
US11809640B2 (en) * 2021-12-09 2023-11-07 Htc Corporation Method for detecting movement of ring controller, ring controller, and computer readable medium
IT202100033131A1 (en) * 2021-12-30 2023-06-30 Celli Spa Method for creating a digital twin of a beverage dispenser
IT202100033128A1 (en) * 2021-12-30 2023-06-30 Celli Spa Control system for controlling one or more tap valves of a beverage dispenser

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7933945B2 (en) * 2002-06-27 2011-04-26 Openpeak Inc. Method, system, and computer program product for managing controlled residential or non-residential environments
US8723467B2 (en) * 2004-05-06 2014-05-13 Mechoshade Systems, Inc. Automated shade control in connection with electrochromic glass
WO2007029215A2 (en) * 2005-09-08 2007-03-15 Spd Control Systems Corporation Intelligent spd control apparatus with scalable networking capabilities for window and multimedia applications
BRPI0918937A2 (en) * 2009-01-07 2016-10-11 Koninkl Philips Electronics Nv lighting management system, method for implementing the lighting management system and executive module for use in a lighting management system
US10613704B2 (en) * 2009-06-03 2020-04-07 Savant Systems, Llc Small screen virtual room-based user interface
US11137659B2 (en) * 2009-12-22 2021-10-05 View, Inc. Automated commissioning of controllers in a window network
WO2018232147A1 (en) * 2017-06-15 2018-12-20 Johnson Controls Technology Company Building management system with artificial intelligence for unified agent based control of building subsystems

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116738764A (en) * 2023-08-08 2023-09-12 中国海洋大学 Ocean platform cabin comfort level assessment method based on singular value threshold algorithm
CN116738764B (en) * 2023-08-08 2023-10-20 中国海洋大学 Ocean platform cabin comfort level assessment method based on singular value threshold algorithm

Also Published As

Publication number Publication date
CA3169817A1 (en) 2021-10-21
WO2021211798A1 (en) 2021-10-21
EP4136504A4 (en) 2024-06-12
TW202147074A (en) 2021-12-16
EP4136504A1 (en) 2023-02-22

Similar Documents

Publication Publication Date Title
CN115485614A (en) Interaction between peripheral structures and one or more occupant-related applications
US20210390953A1 (en) Immersive collaboration of remote participants via media displays
US11735183B2 (en) Controlling optically-switchable devices
JP7078206B2 (en) Control of optically switchable devices
US20210383804A1 (en) Immersive collaboration of remote participants via media displays
US20220413351A1 (en) Tintable window system computing platform
US20230132451A1 (en) Interaction between an enclosure and one or more occupants
TW202333483A (en) Edge network for building services
TW202227890A (en) Mapping acoustic properties in an enclosure
TW202219665A (en) Atmospheric adjustment in an enclosure
US20240233724A9 (en) Behavior recognition in an enclosure
WO2023003877A1 (en) Control of one or more devices in a vehicle
WO2022221532A1 (en) Immersive collaboration of remote participants via media displays
US20230333434A1 (en) Mapping acoustic properties in an enclosure
WO2022178150A1 (en) Behavior recognition in an enclosure
WO2022178156A1 (en) Wearable device coupled to a facility network
TW202314563A (en) Occupant-centered predictive control of devices in facilities
US20240242717A1 (en) Immersive collaboration of remote participants via media displays
US20240046928A1 (en) Controlling optically-switchable devices
TW202246865A (en) Immersive collaboration of remote participants via media displays

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20221216