EP4133695A1 - Local computing cloud that is interactive with a public computing cloud - Google Patents

Local computing cloud that is interactive with a public computing cloud

Info

Publication number
EP4133695A1
Authority
EP
European Patent Office
Prior art keywords
computing system
model
data
home computing
home
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP21785519.6A
Other languages
German (de)
English (en)
Other versions
EP4133695A4 (fr)
Inventor
Hung Bun Choi
Gideon Sui Pang Tsang
Chun Kit CHU
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Computime Ltd
Original Assignee
Computime Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US16/840,648 external-priority patent/US11656966B2/en
Priority claimed from US16/840,708 external-priority patent/US11399069B2/en
Application filed by Computime Ltd filed Critical Computime Ltd
Publication of EP4133695A1
Publication of EP4133695A4
Legal status: Pending

Classifications

    • H - ELECTRICITY
        • H04 - ELECTRIC COMMUNICATION TECHNIQUE
            • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
                • H04L 12/00 - Data switching networks
                    • H04L 12/28 - Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
                        • H04L 12/2803 - Home automation networks
                            • H04L 12/2807 - Exchanging configuration information on appliance services in a home automation network
                            • H04L 12/2814 - Exchanging control software or macros for controlling appliance services in a home automation network
                            • H04L 12/2816 - Controlling appliance services of a home automation network by calling their functionalities
                • H04L 67/00 - Network arrangements or protocols for supporting network services or applications
                    • H04L 67/01 - Protocols
                        • H04L 67/12 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
                            • H04L 67/125 - Protocols specially adapted for proprietary or special-purpose networking environments involving control of end-device applications over a network
                    • H04L 67/50 - Network services
                        • H04L 67/56 - Provisioning of proxy services
                            • H04L 67/565 - Conversion or adaptation of application format or content
                • H04L 69/00 - Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
                    • H04L 69/08 - Protocols for interworking; Protocol conversion
                    • H04L 69/18 - Multiprotocol handlers, e.g. single devices capable of handling multiple protocols
    • G - PHYSICS
        • G06 - COMPUTING; CALCULATING OR COUNTING
            • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
                • G06N 20/00 - Machine learning

Definitions

  • aspects of the disclosure relate to supporting a local computing cloud that is interactive with a public computing cloud.
  • the local computing cloud may be located in a home and may support one or more Internet of Things (IoT) devices.
  • An analytic model may be downloaded from the public computing cloud and locally executed. Reinforcement training may also be locally performed without externally conveying device data and user behavior information, vastly reducing the external data traffic that could otherwise jeopardize data privacy.
  • IoT: Internet of Things
  • Conventional IoT systems rely on remote and centralized servers to collect input data and, based on the current input as well as the historical data, to generate certain actions.
  • IoT devices, such as smart sensors, thermostats, and smart appliances, typically send their data to a remote server such as a public computing cloud.
  • A gateway may be needed to convert data from one connectivity protocol to another (for example, ZigBee to WiFi) in order to send data from the end devices to the server.
  • The huge amount of data transmitted between the end devices and the server results in an expensive service cost.
  • a home computing system (which may be referred to as a “home computing cloud”) integrates a communications gateway, WiFi router, cloud server, and mass storage device to support one or more Internet of Things (IoT) devices in a local environment such as a residential home.
  • Because the home computing cloud locally processes collected device data rather than sending the device data to a public computing cloud (PCC) for processing, the home computing cloud often reduces the amount of data traffic sent to the PCC. This approach improves network latency, reduces data loss during transmission, and helps to maintain a desired quality of service level.
  • a HCC may download an appropriate data analytic model (which may be referred to as a “model” and selected from a plurality of data analytic models) from a PCC based on configuration information (for example, the types of supported IoT devices).
  • the HCC can then locally execute the model by obtaining device data from one or more IoT devices, apply some or all of the device data to the model, and obtain a predictive result from the model.
  • the predictive result may then be applied to one or more of the supported IoT devices to affect the operation of the one or more IoT devices.
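The download-and-execute flow in the bullets above can be sketched in Python. The function names and the trivial averaging model below are illustrative stand-ins, not APIs defined in this disclosure:

```python
# Hypothetical sketch of the HCC flow: a model is selected by the PCC from
# configuration information, downloaded, executed locally on IoT device data,
# and the predictive result is then available to apply to the devices.

def download_model(config):
    """Stand-in for the PCC: select a model based on supported device types."""
    if "thermostat" in config["devices"]:
        # Trivial illustrative model: average the sensor readings.
        return lambda inputs: sum(inputs) / len(inputs)
    raise ValueError("no analytic model available for this configuration")

def run_locally(config, device_inputs):
    model = download_model(config)   # downloaded once from the PCC
    return model(device_inputs)      # executed locally at the HCC

# Predictive result obtained without sending the raw readings off-site.
prediction = run_locally({"devices": ["thermostat", "presence sensor"]},
                         [22.0, 24.0])
```

Only the configuration information crosses to the PCC; the device readings stay inside the home.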
  • a HCC sends a subset of the device data to a PCC for further processing and receives decision information based on the subset of data.
  • the subset of device data may represent one or more signal characteristics of a complex signal (for example, multimedia signals including voice, music, image or video signals) that require intensive processing that the HCC may be unable to support.
  • the HCC may implement the image pre-processing layer and feature extraction layer of an analytic model and send the resultant data to the PCC for analysis and decision making.
  • the HCC applies the received result as well as other device data (corresponding to model inputs) to a downloaded data analytic model.
  • the HCC executes the input processing layers of the predictive model and sends the corresponding outputs to the PCC.
  • the PCC then executes all the remaining hidden layers and sends the corresponding outputs of the final hidden layer back to the HCC.
  • the HCC then executes the output layer.
  • the distribution of a workload for executing the model may be based on: the computing power of the HCC (such as sending raw data to the PCC for the entire process); the amount of data traffic (such as sending only the feature data to the PCC for processing the remaining tasks); data privacy (such as sending mathematically transformed data within a layer of the model to the PCC to continue the analysis); and the consistency of the model parameters (such as the HCC executing layers with parameters that are fixed while the PCC executes layers with parameters that are changing continuously via reinforcement training).
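A minimal sketch of the layer split described above, assuming a toy model of composable layers (the layers, scales, and split point are invented for illustration): only the intermediate output crosses from the HCC to the PCC, yet the end-to-end result is unchanged.

```python
# Toy "analytic model" as a list of layer functions.
def make_layer(scale):
    return lambda xs: [scale * x for x in xs]

layers = [make_layer(s) for s in (0.5, 2.0, 1.5, 0.1)]

def run_layers(layer_seq, xs):
    for layer in layer_seq:
        xs = layer(xs)
    return xs

raw_inputs = [4.0, 8.0]                            # stays inside the home
split = 2                                          # HCC executes layers[:split]
hcc_out = run_layers(layers[:split], raw_inputs)   # only this is sent out
pcc_out = run_layers(layers[split:], hcc_out)      # PCC finishes the analysis
```

Executing the whole model in one place (`run_layers(layers, raw_inputs)`) gives the same result, but the raw inputs were never transmitted.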
  • a HCC may have sufficient computing resources for executing more complex tasks, such as training a deep neural network.
  • the HCC may download an appropriate template of a data analytic model from the PCC, train the model locally, and execute the trained model to obtain prediction information from the collected IoT device data.
  • both a HCC and a PCC may execute and train the same data analytic models (for example, assistant training).
  • the learning rate at the HCC and PCC may be different (for example, because of more computational capability at the PCC).
  • While the HCC is executing and training the local model based on IoT device data, the HCC also sends the device data to the PCC.
  • the PCC executes and trains with the same device data and sends error measures back to the HCC.
  • the HCC compares the error measures from the two clouds and continues the training until the error measures from the HCC are lower than a threshold.
  • a HCC may decide to use the parameters from the PCC to continue the training when the error measures from the PCC are continuously lower than, or substantially lower than, those from the HCC.
  • a HCC may decide to use the model trained by the PCC and stop training if the error measure from the PCC reaches the threshold first.
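The stopping and adoption logic in the preceding bullets might look like the following sketch; the three-sample comparison window and the function name are assumptions, not specified in this disclosure:

```python
def training_decision(hcc_errors, pcc_errors, threshold):
    """Compare error-measure histories (most recent last) from the two clouds."""
    if pcc_errors[-1] <= threshold:
        return "adopt_pcc_model"          # PCC reached the threshold first
    if hcc_errors[-1] <= threshold:
        return "stop_training"            # local model is already good enough
    # PCC continuously lower: continue training from the PCC's parameters.
    window = min(len(hcc_errors), len(pcc_errors), 3)
    if all(p < h for p, h in zip(pcc_errors[-window:], hcc_errors[-window:])):
        return "continue_with_pcc_parameters"
    return "continue_local_training"
```

For example, error histories of [0.5, 0.4, 0.35] at the HCC against [0.4, 0.3, 0.25] at the PCC with a threshold of 0.1 would continue training with the PCC's parameters.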
  • a HCC may upload a trained model to a PCC for archiving, sharing, or optimization.
  • a PCC may analyze all the received models from other HCC’s and optimize a new model.
  • the PCC may distribute the new model to all the HCC’s.
  • a HCC may decide to use the new model completely, use the new model with the parameters from the existing model, or totally ignore the new model.
  • the decision may be based on a comparison to the error measures when executing different models with the empirical data locally stored.
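One way to make that comparison, sketched under the assumption that the HCC can replay locally stored (input, target) pairs against each candidate model; all names and values below are illustrative:

```python
def mean_abs_error(model, samples):
    """Average absolute error of a candidate model over stored empirical data."""
    return sum(abs(model(x) - y) for x, y in samples) / len(samples)

def choose_model(candidates, samples):
    """candidates: name -> callable; return the name with the lowest error."""
    return min(candidates,
               key=lambda name: mean_abs_error(candidates[name], samples))

# Locally stored empirical (input, target) pairs.
samples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
candidates = {
    "existing_model": lambda x: 2.1 * x,
    "new_model": lambda x: 1.5 * x,
    "new_model_with_existing_parameters": lambda x: 2.0 * x,
}
best = choose_model(candidates, samples)
```

The HCC would then adopt whichever candidate minimizes the error on its own data.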
  • a HCC may continue to run or train a local model, and a PCC may train the new model in parallel with new input data.
  • the HCC continuously sends new input data to the PCC for training the new model until the new model is sufficiently accurate.
  • the HCC may then download the new model for use.
  • a HCC may request the PCC to use the parameters in the HCC to continuously train the new model.
  • subjective weightings may be applied when calculating an error, based on an application scenario, in training the model.
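For instance, a weighted squared-error measure could emphasize the outputs that matter most in a given application scenario; the weights below are invented for illustration:

```python
def weighted_error(predicted, target, weights):
    """Subjectively weighted squared error across model outputs."""
    return sum(w * (p - t) ** 2
               for p, t, w in zip(predicted, target, weights))

# Weight the temperature error 10x more heavily than the curtain-level error.
err = weighted_error(predicted=[23.5, 0.4], target=[23.0, 0.5],
                     weights=[10.0, 1.0])
```

Training would then minimize this weighted measure rather than a uniform one.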
  • FIG. 1 shows a home environment in which a home computing cloud (HCC) is interactive with a public computing cloud (PCC) in accordance with an embodiment.
  • Figure 2 shows a HCC without a WiFi router capability in accordance with an embodiment.
  • FIG. 3 shows a HCC with a WiFi router capability in accordance with an embodiment.
  • Figure 4 shows a PCC that is interactive with a plurality of HCC’s in accordance with an embodiment.
  • Figure 5 shows a HCC that is interactive with a PCC and a user application in accordance with an embodiment.
  • Figure 6 shows a HCC in accordance with an embodiment, in which the HCC is executing an analytic model while a PCC is executing reinforcement training.
  • FIG. 7 shows a HCC in accordance with an embodiment, where the HCC allocates all of the data analytic and reinforcement training tasks to a PCC.
  • Figure 8 shows an approach for a HCC partitioning an analytic model into two sub-models in accordance with an embodiment. Part of the original model is executed at the HCC, and the remaining part is executed at a PCC, in order to reduce the computation at the HCC, reduce data traffic, and preserve data privacy when sending data over the network.
  • Figure 9 shows an approach for a HCC executing reinforcement learning in accordance with an embodiment.
  • Figure 10 shows an approach for a HCC interacting with a PCC to perform assistant learning in accordance with an embodiment.
  • a “HCC” (home computing cloud) may not be limited to a residential home and may support other types of entities such as a business or building. Consequently, a “HCC” may be construed as a “local computing cloud.” Also, a “cloud” may refer to a computing system or the like.
  • a HCC integrates a communications gateway, WiFi router, cloud server, and mass storage device to support one or more Internet of Things (IoT) devices in a local environment such as a residential home.
  • Because the HCC locally processes collected device data rather than sending the device data to a public computing cloud (PCC) for processing, the HCC often reduces the amount of data traffic sent to a PCC. This approach improves network latency, reduces data loss during transmission, and helps to maintain a desired quality of service level.
  • a HCC may download an appropriate data analytic model (which may be referred to as a “model”) from a PCC based on configuration information (for example, the types of supported IoT devices).
  • the HCC can then locally execute the model by obtaining device data from one or more IoT devices, apply some or all of the device data to the model, and obtain a predictive result from the model.
  • the predictive result may then be applied to one or more of the supported IoT devices to affect the operation of the one or more IoT devices.
  • the HCC may include one or more IoT devices that are located in a home.
  • IoT devices may include, but are not limited to, smart thermostats, appliances, lighting devices, security devices, and so forth.
  • the HCC may interact with a PCC in order to exchange information that is pertinent to the one or more IoT devices.
  • the information may include data (for example, temperature measurements) provided by the one or more IoT devices and information indicative of actions (for example, a mode of operation) to be performed by the one or more IoT devices.
  • the PCC (which may be referred to as a “public cloud”) may provide computing services offered by third-party providers over the public Internet, making them available to anyone who wants to use or purchase them.
  • the services may be free or sold on-demand, thus allowing customers to pay only per usage for consumed CPU cycles, storage, or bandwidth.
  • algorithms may be available for training data analytic models locally.
  • Reinforcement (machine) learning may also be added to provide machine learning capability to the HCC.
  • privacy of users may be substantially improved by limiting the amount of data and types of data sent via the network and stored in a PCC.
  • a HCC (home computing system) locally executes both a data analytic model and reinforcement learning.
  • a data analytic model is partitioned into two sub-models.
  • the first sub-model includes an input processing layer of a data analytic model and is executed by a home computing system (cloud).
  • the second sub-model includes the hidden layers of the data analytic model and is executed by the public computing cloud.
  • a data analytic model is partitioned into three sub-models.
  • the first sub-model and third sub-model include the input layer and the output layer, respectively, and are executed by the home computing system (cloud).
  • the second sub-model includes only the hidden layers and is executed by the public computing cloud.
  • assistant learning enables training in a public computing cloud while executing a data analytic model at a home computing system.
  • assistant learning enables parallel training in both a public computing cloud as well as a home computing cloud while executing a data analytic model at the home computing system.
  • Figure 1 shows a home environment in which a HCC 101 is interactive with a PCC 102 via data channel 151 in accordance with an embodiment.
  • An IoT device may be an interrelated computing device (for example, a smart thermostat or appliance) within a home that provides sent information and obtains received information via HCC 101.
  • the received information may be indicative of one or more actions that the IoT device should perform.
  • While Figure 1 depicts an operating environment spanning a home, embodiments may span other local environments such as a building or a business location.
  • Data traffic capacity, data security and data privacy are important considerations when implementing an Internet of Things (IoT) system.
  • the exposure of the data to unauthorized access may be reduced.
  • the data traffic may be reduced, and hence the cost of using the service provided by PCC 102.
  • By storing data within HCC 101 and conducting data analytics and machine learning from within HCC 101, services may be maintained when an internet connection is inaccessible. Moreover, the latency introduced by the internet connection may be eliminated.
  • one may not completely circumvent the services provided by PCC 102, as it often provides computational power and software services that HCC 101 may not be able to provide.
  • HCC 101 may continuously send an update of the number of supported IoT devices (for example, devices 203-205 as shown in Figure 2) and the nature of the devices to PCC 102 via data channel 151.
  • HCC 101 may also send information about the trained model (for example, the parameters of the analytic model and the error measure) periodically to public computing cloud 102.
  • PCC 102 may collect data from all available HCCs 101 and 401 (as will be further discussed with Figure 4) and train a new pre-trained model based on the collected data, for example, a model template. The new pre-trained model may then be distributed back to each HCC 101 and 401. Alternatively, PCC 102 may inform HCC 101 and/or 401 that a new pre-trained model is available, where HCC 101 and/or 401 may decide whether to download it via data channel 151 based on a predetermined criterion.
  • locally stored model data may be directly applied to the model template.
  • reinforcement learning may be applied using the model template with model data stored locally or brand new data if no data available.
  • parallel training (machine learning in both the home and PCCs) may be applied when reinforcement learning is being performed.
  • Model parameters may be exchanged during the training.
  • the model to be adopted by the HCC may be chosen based on the error measurement.
  • PCC 102 may consistently update the machine learning algorithm at HCC 101.
  • HCC 101 may stream signal data to PCC 102.
  • When a data analytic model is supported at PCC 102, the result from the model may be returned to HCC 101.
  • the analytic model may be split into two parts (for example, sub-models) and partially executed at the HCC and PCC.
  • the data exchange between the two clouds may be the parameters in one or more layers of the analytic model. This approach may reduce the amount of data to be exchanged between the two clouds.
  • privacy may be better maintained than when sending the raw data stream.
  • the analytic model may be split into three sub-models and partially executed at the HCC and PCC.
  • the input processing layers and output layer(s) of the analytic model are executed at the HCC, and the hidden layers are executed at the PCC.
  • the raw input data and predictive outputs, which likely contain private information about the device owner, are kept locally and not externally exposed.
  • a model may be trained in PCC 102, downloaded from PCC 102 to HCC 101, and locally executed by HCC 101.
  • the decision for retraining a new model may be triggered by the owner (user). Examples include adding a new device in the model for recognition, adding a new rule in the model, and so forth.
  • Figure 2 shows a HCC 201 associated with a separate WiFi router 206 that interacts with PCC 202 in accordance with an embodiment.
  • IoT devices 204-205 may be supported by protocol gateway 210 and IoT message translator 211 executing at HCC 201.
  • IoT devices 204-205 communicate by the corresponding protocols (for example, Zigbee) via a protocol gateway 210.
  • Protocol gateway 210 passes device messages to IoT message translator 211 that comprises IoT protocol message broker 208 (for example, MQTT broker) or COAP server (not explicitly shown) and IoT protocol message bridge 209 (for example, MQTT/Zigbee bridge).
  • Message translator 211 bridges IoT device messages into IoT protocol messages (for example, MQTT messages).
  • the MQTT messages may be directed to other IoT devices connected to HCC 201, to the rule engine, or to PCC 202.
  • a device message from the Zigbee device 205 may be sent to HCC 201 via Zigbee gateway 210.
  • Device data may be extracted from the device message and sent to the analytic model for processing.
  • the device message may be passed to MQTT/Zigbee bridge 209 and MQTT broker 208 to reach PCC 202 via home WiFi router 206.
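A sketch of what IoT message translator 211 might do when bridging a device message into an MQTT-style topic and payload; the field names and topic layout are assumptions for illustration and are not defined in this disclosure:

```python
import json

def bridge_to_mqtt(device_msg):
    """Translate a Zigbee-style device message into an MQTT topic and payload."""
    topic = "home/{}/{}".format(device_msg["device_id"], device_msg["attribute"])
    payload = json.dumps({"value": device_msg["value"],
                          "unit": device_msg.get("unit")})
    return topic, payload

# Example message from a Zigbee thermostat, bridged for the MQTT broker.
topic, payload = bridge_to_mqtt({"device_id": "thermostat-1",
                                 "attribute": "temperature",
                                 "value": 23.0, "unit": "C"})
```

The broker could then route the translated message to other IoT devices, the rule engine, or the PCC.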
  • WiFi devices (for example, device 203) that support the MQTT client may also connect to MQTT broker 208 within HCC 201.
  • Device data collected by HCC 201 may be stored in a mass data storage device (not explicitly shown); thus, the additional cost of sending the collected data back and forth with PCC 202 is circumvented.
  • the communication between WiFi device 203 and HCC 201 may occur through two different paths 251 or 252, depending on which WiFi access point WiFi device 203 is connected to.
  • Through path 251, the MQTT message from WiFi device 203 is routed from home WiFi router 206 to MQTT broker 208, which may further direct it to other IoT devices or to PCC 202 via home WiFi router 206.
  • Through path 252, WiFi device 203 is directly connected to HCC 201, which acts as a WiFi access point (AP) and may also connect through home WiFi router 206 to PCC 202.
  • User application (app) 207 may interact with HCC 201 and/or PCC 202 via WiFi connection 253.
  • FIG. 3 shows HCC 301 with a WiFi router capability in accordance with an embodiment. Because HCC 301 includes WiFi router 306, all WiFi devices (for example, device 303) may be connected to it for accessing internet services, user application 307, and/or PCC 302. Moreover, mobile device 307 in close proximity may also be connected to HCC 301 for accessing internet services.
  • FIG. 4 shows PCC 102 that is interactive with a plurality of HCC’s including HCC’s 101 and 401 in accordance with an embodiment. Consequently, PCC 102 may obtain data about data analytic models executing on the plurality of HCC’s and may train a mirrored model executing on PCC 102. PCC 102 may subsequently distribute the trained model to one or more HCC’s so that the trained model can be locally executed.
  • FIG. 5 shows HCC 501 that is interactive with PCC 514 and user application 512 in accordance with an embodiment.
  • HCC 501 interacts with IoT devices 504-506 via communication server 507, PCC 514 via cloud interface 503, and mobile device 512.
  • HCC 501 comprises processing device 502, cloud interface 503, communication server 507, memory device 509, and storage device 511.
  • HCC 501 may include embedded WiFi router 508 in some embodiments (for example, as shown in Figure 3).
  • Processing device 502 controls operation of HCC 501 by executing computer readable instructions stored on memory device 509.
  • processing device 502 may execute computer readable instructions to perform processes 600-1000 shown in Figures 6-10, respectively.
  • Embodiments may support a variety of computer readable media that may be any available media accessed by processing device 502 and include both volatile and nonvolatile media, removable and non-removable media.
  • computer readable media may comprise a combination of computer storage media and communication media.
  • Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
  • Computer storage media include, but are not limited to, random access memory (RAM), read only memory (ROM), electronically erasable programmable read only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by the computing device.
  • Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
  • A modulated data signal is a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
  • HCC 501 may execute model 510, downloaded from PCC 514, at memory device 509.
  • Machine learning model 510 may comprise a neural network model that processes data from IoT devices 504-506 as inputs, resulting in one or more decision outputs from model 510.
  • Home computing cloud 501 may apply reinforcement learning to train the model if there is any corrective action to the predictive output from the model.
  • FIG. 6 shows logic flow 600 for executing an analytic model locally.
  • When HCC 101 is being set up or a new device is added, HCC 101 sends system configuration information to PCC 102 at block 601 and downloads the corresponding analytic model at block 602.
  • the analytic model is implemented at block 603 and executed at block 604 based on IoT device inputs 651 and model parameters 650.
  • HCC 101 currently supports a thermostat and a presence sensor in a residential home.
  • the thermostat has learned that when there is user presence at home from April to October, the operating mode should be set to cool and the temperature should be set to 23C, while from November to March, the operating mode should be set to heat and the temperature should be set to 25C.
  • When a user adds a smart curtain (a new IoT device) to the ecosystem, HCC 101 sends configuration information (for example, a config file) to PCC 102 about the thermostat, presence sensor, and smart curtain.
  • PCC 102 notifies HCC 101 that a new analytic model is available when the new device (smart curtain) is added. Since the setting “Use of new model template” is set to “Yes”, HCC 101 downloads the new model template. With the new model, the original settings are applied.
  • a new input parameter (IoT device input), “curtain opening level (0% fully opened to 100% fully closed),” is introduced. With the new model provided by PCC 102, the level of curtain opening has no impact on the set temperature.
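The learned behavior in this example can be written out as a plain rule. This is a sketch of what the trained model effectively encodes, not the model itself, and the function name is illustrative:

```python
def thermostat_setting(month, presence, curtain_level=None):
    """Learned rule: (mode, setpoint) from month and user presence.
    curtain_level (0-100) is accepted but, per the new model, has no
    impact on the set temperature."""
    if not presence:
        return None                 # no user at home: no setpoint decision
    if 4 <= month <= 10:            # April to October
        return ("cool", 23)
    return ("heat", 25)             # November to March

setting = thermostat_setting(month=7, presence=True, curtain_level=40)
```

A July query with the user present returns the cooling setpoint regardless of curtain level.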
  • HCC 101 continuously receives inputs from IoT devices 651, which are processed through the analytic model at block 604 to obtain predictive results. The predictive results are applied to corresponding IoT devices at block 605.
  • HCC 101 continuously monitors the IoT ecosystem for any corrections to the predictive results at block 606. If any correction is made, HCC 101 provides corrections T[n] 654 as feedback to PCC 102, together with the corresponding IoT device inputs S[n] 652 and the predictive results R[n] 653, at block 607, where result information may comprise R[n] 653 and T[n] 654. For example, HCC 101 may conditionally initiate reinforcement learning at PCC 102 and consequently receive updated parameters from PCC 102 in response. HCC 101 can then update the downloaded analytic model.
  • reinforcement learning is performed at block 621 to obtain a new set of model parameters (replacing model parameters 650).
  • New parameters 655 are sent to HCC 101 at block 622 and used by the analytic model at block 608 for the next device inputs S[n+1].
  • PCC 102 sends the new model parameters to HCC 101.
  • HCC 101 then applies the new parameters in the model.
  • FIG. 7 shows process 700 for handling multimedia signals, in which the analytic model is executing at PCC 102.
  • HCC 101 continuously streams source data 751 to PCC 102 at block 701.
  • PCC 102 uses a model with Z hidden layers to analyze the data stream at block 721 to obtain predictive results R 752.
  • the results are then sent back to HCC 101 at block 722, where HCC 101 applies the predictive results to the IoT ecosystem at block 702.
  • HCC 101 continuously monitors the IoT ecosystem for any corrections to the predictive results at block 703. If there are any corrections, HCC 101 provides corrections T 753 as feedback to PCC 102 at block 704.
  • PCC 102 performs reinforcement learning at block 723 to obtain a new set of parameters W 754. The new parameters are then applied to the analytic model at block 721 for the subsequent source data stream.
  • In application 800, the source data stream from one or more IoT devices is analyzed at HCC 101.
  • the output from the Xth hidden layer of hidden layers 802 is then sent to PCC 102 to continue the analysis through hidden layers 803.
  • the amount of data sent from HCC 101 to PCC 102 is typically reduced.
  • the privacy of user data may be protected by sending a version of transformed data instead of the source data.
  • the distribution of work load for executing the model may be based on:
  • HCC 101 executes the first layer of the analytic model and then sends the output to PCC 102.
  • PCC 102 then executes the remaining layers of the analytic model.
  • HCC 101 executes layers of the analytic model until reaching a layer with the minimum number of output nodes. HCC 101 then sends the output of this layer to PCC 102. PCC 102 then executes the remaining layers of the analytic model.
  • HCC 101 may execute layers of the analytic model until a layer is reached, where its output is totally unrelated to the source data. HCC 101 then sends the output of this layer to PCC 102. PCC 102 then executes the remaining layers of the analytic model.
  • HCC 101 executes layers of the analytic model that have fixed parameters. HCC 101 then sends the output of the last layer to PCC 102. PCC 102 then executes the remaining layers of the analytic model.
  • HCC’s may vary from those with a basic configuration that only allows an analytic model to be executed to more powerful ones that are equipped with more powerful hardware for training analytic models with multiple hidden layers.
  • an analytic model may be split into three sub-models.
  • the input processing layers and the output layer(s) are executed at HCC 101, and some or all of the hidden layers are executed at PCC 102.
  • the raw input data and the predictive outputs, which are closely related to the users, are locally kept.
  • FIG. 9 shows logic flow 900 for locally executing reinforcement learning at HCC 101. Reinforcement learning is executed at block 901. HCC 101 uses input data S[n] 951, predictive outputs R[n] 952, and corrections T[n] 953 at the nth operation to optimize the set of parameters W for the analytic model by minimizing an error function.
  • the new set of parameters W[n+1] 954 is provided to the analytic model at block 902 so that the analytic model can utilize them at block 903.
  • new predictive results R[n+1] 956 may be applied to the IoT ecosystem at block 904.
  • the reinforcement learning algorithm repeats execution at block 906 using the data (S, R, and T) from the [n+1]th instance. Otherwise, HCC 101 waits for inputs at block 907. [0104] With some applications, training of the analytic model may be too demanding for the computing resources at HCC 101. In such situations, PCC 102 may be used to assist reinforcement learning at HCC 101.
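The local training step of logic flow 900 can be sketched as follows. This is a hedged illustration under simplifying assumptions: a linear analytic model, a squared-error function, and plain gradient descent stand in for whatever model and error function a deployment would actually use. Inputs S, predictions R, and user corrections T play the roles of S[n] 951, R[n] 952, and T[n] 953.

```python
import numpy as np

def train_step(W, S, T, lr=0.1):
    """One reinforcement-learning iteration updating the parameter set W."""
    R = W @ S                    # predictive output R[n]
    error = R - T                # deviation from the user's correction T[n]
    grad = np.outer(error, S)    # gradient of 0.5 * ||R - T||^2 w.r.t. W
    return W - lr * grad         # new parameter set W[n+1]

rng = np.random.default_rng(1)
W_true = rng.standard_normal((2, 3))   # stands in for ideal parameters
W = np.zeros((2, 3))
for _ in range(200):                   # repeat while corrections arrive
    S = rng.standard_normal(3)
    T = W_true @ S                     # corrections act as targets here
    W = train_step(W, S, T)

print(np.max(np.abs(W - W_true)))      # error shrinks toward zero
```

The loop mirrors blocks 901-906: each pass produces new parameters W[n+1] that the model uses on the next inputs.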
  • Figure 10 shows logic flow 1000 for assistant training (parallel training), where HCC 101 performs training during training sequence 1001, and PCC 102 performs parallel training during training sequence 1021.
  • HCC 101 performs reinforcement learning using device inputs S[n] 1051 and an analytic model G at block 1002.
  • the predictive output O[m] 1053 from the analytic model G is compared with the corrections from the user T[n] 1052 to compute an error measure E[m] 1054 at block 1003.
  • Adjustment to the model parameters is determined based on the magnitude of the error value and the rate of change of the error values between iterations at block 1004.
  • the new set of parameters U[m+1] 1055 is then used by the analytic model G at block 1002 to calculate a new output O[m+1] using the same device inputs S[n].
  • a new error measure E[m+1] is then computed by comparing T[n] and O[m+1]. Additional iterations may be performed until a desired error measure is obtained.
  • a copy of device inputs S[n] 1051 and the corrections T[n] 1052 is sent to PCC 102, for example, via data channel 151.
  • PCC 102 performs a similar reinforcement learning process at block 1021 to assist model training at HCC 101.
  • device inputs S[n] 1051 are executed by analytic model P at block 1022.
  • the predictive output Q[k] 1073 from analytic model P at block 1022 is compared with the corrections from the user T[n] 1052 to compute an error measure F[k] 1074 at block 1023. Adjustment to the model parameters is based on the magnitude of the error value and the rate of change of the error values between iterations at block 1024.
  • the new set of parameters V[k+1] 1075 is then used by analytic model P 1022 to calculate a new output Q[k+1] using the same device inputs S[n], and a new error measure F[k+1] is computed by comparing T[n] and Q[k+1], and so forth.
  • PCC 102 may use an identical algorithm (where model P is a copy of model G) to change the model parameters U 1055 and V 1075.
  • different algorithms may be used for adjusting the model parameters U 1055 and V 1075.
  • the error measures from the two learning models may be compared on an ongoing basis. If one model consistently yields a substantially lower error measure, both learning models G and P may switch to the set of model parameters that yields the lower error measure and continue the training.
  • the training may be terminated if either of the two models G and P meets a target error threshold, in which case the set of model parameters that meets the error threshold is used by the analytic model at HCC 101.
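A minimal sketch of the parallel ("assistant") training of logic flow 1000, under illustrative assumptions: both models are linear, model P at PCC 102 simply uses a different (larger) learning rate than model G at HCC 101, the batch size and threshold are arbitrary, and training terminates when either model meets the target error threshold.

```python
import numpy as np

def step(W, S, T, lr):
    """One reinforcement-learning iteration; returns (new W, error measure)."""
    error = W @ S - T                      # deviation from corrections T[n]
    return W - lr * error @ S.T, float(np.sum(error ** 2))

rng = np.random.default_rng(2)
W_true = rng.standard_normal((2, 3))
U = np.zeros((2, 3))                       # parameters U of model G (HCC 101)
V = np.zeros((2, 3))                       # parameters V of model P (PCC 102)
threshold = 1e-6
winner, final = None, U
for n in range(1000):
    S = rng.standard_normal((3, 4))        # batch of device inputs S[n]
    T = W_true @ S                         # user corrections T[n]
    U, E = step(U, S, T, lr=0.02)          # error measure E[m] at HCC 101
    V, F = step(V, S, T, lr=0.08)          # error measure F[k] at PCC 102
    if min(E, F) < threshold:              # target error threshold met
        winner, final = ("G", U) if E <= F else ("P", V)
        break

print(winner, np.max(np.abs(final - W_true)))
```

Whichever model first meets the threshold supplies the parameter set that HCC 101 keeps, which is the termination rule described above.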
  • HCC 101 may upload the trained model to PCC 102 for archiving, sharing, or optimization.
  • PCC 102 may analyze all the models received from other HCCs and optimize a new model from them. With some embodiments, a new model may be trained when a new default IoT device is added. PCC 102 may distribute the new model to all HCCs.
  • HCC 101 may decide to use the new model as provided by PCC 102, use the new model with the parameters from the original model, or ignore the new model entirely. The decision may be based on comparing the error measures obtained when executing the different models on the locally stored empirical data.
  • HCC 101 may decide to execute reinforcement learnings at any time during an operation, for example according to block 621 (as shown in Figure 6), block 723 (as shown in Figure 7), block 906 (as shown in Figure 9), or process 1000 (as shown in Figure 10).
  • legacy IoT device data stored locally at HCC 101 may be used to train the analytic model (for example, a continual improvement to the original model).
  • new IoT device data may be used to train the analytic model (for example, a new analytic model with an additional device type).
  • a mix of both the legacy and new IoT device data may be used to execute the original model and to train the new model in parallel.
  • Weightings may be assigned in computing the error measures during reinforcement learnings. For example, with object recognition, more weight may be allocated to recognition errors than to the errors in confidence level.
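The weighting idea can be illustrated with a short sketch. The (label, confidence) representation of a prediction and the weights 10.0 and 1.0 are assumptions chosen to show how a recognition error can be made to dominate a confidence-level error in the error measure.

```python
def weighted_error(predicted, corrected, w_label=10.0, w_conf=1.0):
    """Weighted error measure over (label, confidence) pairs.

    A recognition (label) error contributes w_label; a deviation in
    confidence level contributes its squared difference times w_conf.
    """
    label_err = 0.0 if predicted[0] == corrected[0] else 1.0
    conf_err = (predicted[1] - corrected[1]) ** 2
    return w_label * label_err + w_conf * conf_err

# A wrong label dominates the measure even when confidence is close...
e_wrong_label = weighted_error(("cat", 0.90), ("dog", 0.90))
# ...while a confidence deviation alone contributes only a small term.
e_conf_only = weighted_error(("dog", 0.60), ("dog", 0.90))
print(e_wrong_label, e_conf_only)
```

With these weights, reinforcement learning would spend most of its corrective effort on misrecognitions rather than on calibrating confidence levels.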
  • aspects described herein may be embodied as a method, an apparatus, or as computer-executable instructions stored on one or more non-transitory and/or tangible computer-readable media. Accordingly, those aspects may take the form of an entirely hardware embodiment, an entirely software embodiment (which may or may not include firmware) stored on one or more non-transitory and/or tangible computer-readable media, or an embodiment combining software and hardware aspects. Any and/or all of the method steps described herein may be embodied in computer-executable instructions stored on a computer-readable medium, such as a non-transitory and/or tangible computer readable medium and/or a computer readable storage medium.
  • any and/or all of the method steps described herein may be embodied in computer-readable instructions stored in the memory and/or other non-transitory and/or tangible storage medium of an apparatus that includes one or more processors, such that the apparatus is caused to perform such method steps when the one or more processors execute the computer-readable instructions.
  • various signals representing data or events as described herein may be transferred between a source and a destination in the form of light and/or electromagnetic waves traveling through signal-conducting media such as metal wires, optical fibers, and/or wireless transmission media (e.g., air and/or space).

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Automation & Control Theory (AREA)
  • Computer Security & Cryptography (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Telephonic Communication Services (AREA)
  • Information Transfer Between Computers (AREA)
  • Mobile Radio Communication Systems (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)

Abstract

A home computing cloud (HCC) supports one or more Internet of Things (IoT) devices, possibly using different connection protocols, in a local environment. The HCC often reduces the amount of data traffic sent to a public computing cloud (PCC) by processing collected device data locally rather than sending the device data to the PCC for processing. This approach reduces the amount of data traffic sent over the network, improves data privacy, and helps maintain a desired quality-of-service level. To do so, the HCC may download an appropriate data analytic model from the PCC, train the model, execute the trained model to obtain predictive information from the collected IoT device data, and upload the trained model to the PCC. Alternatively, the HCC and the PCC may execute sub-models of the analytic model and exchange the sub-model outputs with each other.
EP21785519.6A 2020-04-06 2021-04-06 Local computing cloud that is interactive with a public computing cloud Pending EP4133695A4 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US16/840,648 US11656966B2 (en) 2020-04-06 2020-04-06 Local computing cloud that is interactive with a public computing cloud
US16/840,708 US11399069B2 (en) 2020-04-06 2020-04-06 Method and apparatus to implement a home computing cloud
PCT/US2021/025943 WO2021207179A1 (fr) 2020-04-06 2021-04-06 Nuage informatique local qui est interactif avec un nuage informatique public

Publications (2)

Publication Number Publication Date
EP4133695A1 true EP4133695A1 (fr) 2023-02-15
EP4133695A4 EP4133695A4 (fr) 2024-05-08

Family

ID=78022790

Family Applications (2)

Application Number Title Priority Date Filing Date
EP21783966.1A Pending EP4133694A4 (fr) 2020-04-06 2021-04-06 Procédé et appareil de mise en oeuvre d'un nuage informatique domestique
EP21785519.6A Pending EP4133695A4 (fr) 2020-04-06 2021-04-06 Nuage informatique local qui est interactif avec un nuage informatique public

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP21783966.1A Pending EP4133694A4 (fr) 2020-04-06 2021-04-06 Procédé et appareil de mise en oeuvre d'un nuage informatique domestique

Country Status (3)

Country Link
EP (2) EP4133694A4 (fr)
CN (2) CN115668868A (fr)
WO (2) WO2021207179A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114285890B (zh) * 2021-12-10 2024-03-15 西安广和通无线通信有限公司 Cloud platform connection method, apparatus, device, and storage medium

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10601810B2 (en) * 2011-09-09 2020-03-24 Kingston Digital, Inc. Private cloud routing server connection mechanism for use in a private communication architecture
US20130201316A1 (en) * 2012-01-09 2013-08-08 May Patents Ltd. System and method for server based control
KR101560470B1 (ko) * 2014-01-07 2015-10-16 한국과학기술원 Smart connection device and method for controlling an IoT device using the smart connection device
JP6367465B2 (ja) * 2014-07-21 2018-08-01 コンヴィーダ ワイヤレス, エルエルシー Service layer interworking using the MQTT protocol
US20160205106A1 (en) * 2015-01-12 2016-07-14 Verisign, Inc. Systems and methods for providing iot services
US9977415B2 (en) * 2015-07-03 2018-05-22 Afero, Inc. System and method for virtual internet of things (IOT) devices and hubs
KR102471665B1 (ko) * 2015-08-27 2022-11-25 포그혼 시스템스 인코포레이티드 Edge intelligence platform and Internet of Things sensor stream system
US9866637B2 (en) * 2016-01-11 2018-01-09 Equinix, Inc. Distributed edge processing of internet of things device data in co-location facilities
US10645181B2 (en) * 2016-12-12 2020-05-05 Sap Se Meta broker for publish-subscribe-based messaging
US10671925B2 (en) * 2016-12-28 2020-06-02 Intel Corporation Cloud-assisted perceptual computing analytics
US11057344B2 (en) * 2016-12-30 2021-07-06 Fortinet, Inc. Management of internet of things (IoT) by security fabric
US10878342B2 (en) * 2017-03-30 2020-12-29 Intel Corporation Cloud assisted machine learning
US10476751B2 (en) * 2017-10-19 2019-11-12 Microsoft Technology Licensing, Llc IoT cloud to cloud architecture
US20200372412A1 (en) * 2018-01-03 2020-11-26 Signify Holding B.V. System and methods to share machine learning functionality between cloud and an iot network
US10862971B2 (en) * 2018-04-27 2020-12-08 EMC IP Holding Company LLC Internet of things gateway service for a cloud foundry platform
EP3591938A1 (fr) * 2018-07-03 2020-01-08 Electronics and Telecommunications Research Institute Système et procédé pour commander un flux de travail inter-domaines sur la base d'une structure de moteur hiérarchique
KR102091126B1 (ko) * 2018-10-24 2020-04-23 전자부품연구원 Edge-cloud collaboration system for IoT data analysis and its operating method

Also Published As

Publication number Publication date
EP4133694A4 (fr) 2024-06-12
WO2021207191A1 (fr) 2021-10-14
EP4133695A4 (fr) 2024-05-08
EP4133694A1 (fr) 2023-02-15
CN115668868A (zh) 2023-01-31
WO2021207179A1 (fr) 2021-10-14
CN115918035A (zh) 2023-04-04

Similar Documents

Publication Publication Date Title
JP6457447B2 (ja) データセンターのネットワークトラフィックスケジューリング方法及び装置
Peng et al. Joint optimization of service chain caching and task offloading in mobile edge computing
WO2014119719A1 (fr) Système de contrôle de ressources, dispositif de génération de modèle de contrôle, dispositif de contrôle, procédé et programme de contrôle de ressources
US10247435B2 (en) Real-time control of highly variable thermal loads
US20170286861A1 (en) Structured machine learning framework
Cui et al. TailCutter: Wisely cutting tail latency in cloud CDNs under cost constraints
CN107846371B (zh) 一种多媒体业务QoE资源分配方法
CN105703927A (zh) 一种资源分配方法、网络设备和网络系统
US10922623B2 (en) Capacity planning, management, and engineering automation platform
EP3884437A1 (fr) Procédé et gestionnaire d'apprentissage automatique de gestion de la prédiction des caractéristiques d'un service
EP4133695A1 (fr) Nuage informatique local qui est interactif avec un nuage informatique public
Chen et al. Task offloading in hybrid-decision-based multi-cloud computing network: a cooperative multi-agent deep reinforcement learning
JP2015011365A (ja) プロビジョニング装置、システム、プロビジョニング方法、および、プロビジョニングプログラム
Naresh et al. Sac-abr: Soft actor-critic based deep reinforcement learning for adaptive bitrate streaming
Ayache et al. Walk for learning: A random walk approach for federated learning from heterogeneous data
US11656966B2 (en) Local computing cloud that is interactive with a public computing cloud
Dechouniotis et al. A control‐theoretic approach towards joint admission control and resource allocation of cloud computing services
KR20220042928A (ko) 복수의 액세스 네트워크 장치들에 대한 자동 구성 네트워크를 구현하는 방법 및 이를 수행하는 전자 장치
CN117202265A (zh) 边缘环境下基于dqn的服务迁移方法
Zeng et al. Towards secure and network state aware bitrate adaptation at IoT edge
JP2024521051A (ja) 5g及びエッジコンピューティングアプリケーションのためのアプリケーション中心のデザイン
Hsieh et al. Deep reinforcement learning-based task assignment for cooperative mobile edge computing
EP3788768B1 (fr) Procédés et systèmes pour diffuser des données multimédia sur un réseau de distribution de contenu
Huang et al. Granular vnf-based microservices: Advanced service decomposition and the role of machine learning techniques
Wang et al. Offloading and Quality Control for AI Generated Content Services in Edge Computing Networks

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20221019

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20240409

RIC1 Information provided on ipc code assigned before grant

Ipc: G06N 3/08 20060101ALN20240403BHEP

Ipc: H04L 67/565 20220101ALI20240403BHEP

Ipc: H04L 69/18 20220101ALI20240403BHEP

Ipc: H04L 69/08 20220101ALI20240403BHEP

Ipc: H04L 67/125 20220101ALI20240403BHEP

Ipc: H04L 12/28 20060101AFI20240403BHEP