US20220051073A1 - Integrated Assistance Platform
- Publication number
- US20220051073A1
- Authority
- US
- United States
- Prior art keywords
- event data
- data
- machine learning
- learning model
- autonomous
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/004—Artificial life, i.e. computing arrangements simulating life
- G06N3/006—Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/02—Knowledge representation; Symbolic representation
- G06N5/022—Knowledge engineering; Knowledge acquisition
- G06N5/025—Extracting rules from data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/04—Inference or reasoning models
- G06N5/043—Distributed expert systems; Blackboards
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/088—Non-supervised learning, e.g. competitive learning
Definitions
- This invention relates generally to automatic assistance devices and more specifically to an integrated assistance platform that integrates data from disparate sources and uses machine-learning to provide intelligent assistance.
- Smart devices provide many benefits. Smart devices can receive voice commands and integrate with home appliances and internet services, making daily living easier. For example, smart devices can turn on or off home appliances or provide news updates via a display or text-to-speech.
- Certain aspects and features include a system and a method of operating a first autonomous agent, a second autonomous agent, and a data aggregator.
- the first autonomous agent includes a first sensor module, a first reasoning module, and a first actuator module.
- the first sensor module manages a first sensor and the first reasoning module provides a first machine learning model.
- the first autonomous agent is configured to receive, from the first sensor module, a first set of event data indicating events relating to a subject, provide the first set of event data to a data aggregator and receive, from the data aggregator, correlated event data including events sensed by the first autonomous agent and the second autonomous agent.
- Updating the first machine learning model includes applying the machine learning model to the correlated event data to predict a first pattern of activity, determining a first reward of the first machine learning model, and updating internal parameters of the first machine learning model to maximize the first reward.
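The update loop above (apply the model to correlated event data, compute a reward, and adjust internal parameters so the reward increases) can be illustrated with a minimal sketch. The class, the frequency-count "parameters", the accuracy-based reward, and the event names are all assumptions for illustration, not the claimed implementation:

```python
from collections import Counter

class FirstAgentModel:
    """Hypothetical sketch of a reward-driven model update: predict the next
    event from correlated event data, score the predictions with a reward,
    and update internal parameters (transition counts) to raise the reward."""

    def __init__(self):
        self.transition_counts = {}  # internal parameters: event -> Counter of next events

    def predict_next(self, event):
        counts = self.transition_counts.get(event)
        return counts.most_common(1)[0][0] if counts else None

    def reward(self, correlated_events):
        # Reward: fraction of observed transitions the model predicts correctly.
        pairs = list(zip(correlated_events, correlated_events[1:]))
        if not pairs:
            return 0.0
        hits = sum(1 for a, b in pairs if self.predict_next(a) == b)
        return hits / len(pairs)

    def update(self, correlated_events):
        # Update internal parameters so the reward increases on this data.
        for a, b in zip(correlated_events, correlated_events[1:]):
            self.transition_counts.setdefault(a, Counter())[b] += 1

events = ["sunrise", "light_on", "door_open"] * 3
model = FirstAgentModel()
before = model.reward(events)
model.update(events)
after = model.reward(events)
print(before, after)  # reward rises from 0.0 to 1.0 after the update
```

A real agent would use a richer model (for example, a neural network trained by reinforcement learning), but the apply/score/update cycle is the same.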
- the first autonomous agent is further configured to determine, via the first reasoning module and based on the first pattern of activity, that a first action is to be performed, causing the first actuator module to perform the first action.
- the second autonomous agent includes a second sensor module, a second reasoning module, and a second actuator module.
- the second sensor module manages a second sensor and the second reasoning module provides a second machine learning model.
- the second sensor is different from the first sensor and the second autonomous agent is configured to receive, from the second sensor module, a second set of event data indicating events relating to a subject, provide the second set of event data to the data aggregator, receive, from the data aggregator, the correlated event data, and update the second machine learning model. Updating the second machine learning model includes applying the second machine learning model to the correlated event data to predict a second pattern of activity, determining a second reward of the second machine learning model, and updating internal parameters of the second machine learning model to maximize the second reward.
- the second autonomous agent is configured to determine, via the second reasoning module and based on the second pattern of activity, that a second action is to be performed and cause the second actuator module to perform the second action.
- the data aggregator includes a data store and is configured to store the first set of event data and the second set of event data in the data store, correlate event data from the first set of event data and the second set of event data, and store the correlated event data in the data store.
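One plausible way the data aggregator could correlate the two agents' event streams is by time proximity. This is an illustrative sketch only; the field names, time window, and pairing strategy are assumptions:

```python
from datetime import datetime, timedelta

def correlate(events_a, events_b, window=timedelta(minutes=5)):
    """Pair events from two agents that occur within a time window of each
    other, producing correlated event data both agents can consume."""
    correlated = []
    for ea in events_a:
        for eb in events_b:
            if abs(ea["time"] - eb["time"]) <= window:
                correlated.append((ea["event"], eb["event"]))
    return correlated

t = datetime(2022, 2, 17, 7, 0)
first = [{"event": "light_on", "time": t}]
second = [{"event": "sunrise", "time": t - timedelta(minutes=3)},
          {"event": "mail_delivered", "time": t + timedelta(hours=5)}]
print(correlate(first, second))  # [('light_on', 'sunrise')]
```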
- FIG. 1 illustrates an exemplary integrated assistance computing system, according to certain aspects of the present disclosure.
- FIG. 2 illustrates an exemplary environment in which an autonomous agent can operate, according to certain aspects of the present disclosure.
- FIG. 3 depicts an example of a data aggregator for an integrated assistance computing system, according to aspects of the present disclosure.
- FIG. 4 depicts a flowchart of an exemplary method used to operate an integrated assistance computing system, according to certain aspects of the present disclosure.
- FIG. 5 depicts a flowchart of an exemplary method used for reinforcement learning, according to certain aspects of the present disclosure.
- FIG. 6 depicts an example of a computing system for implementing an integrated assistance computing system, according to certain aspects of the present disclosure.
- the integrated assistance computing platform can address the needs of seniors or other people.
- the integrated assistance platform can be implemented as a smartphone application connecting to a cloud-based service, a hardware device, or a robotic device.
- the integrated assistance computing system includes one or more autonomous agents. Autonomous agents are applications or devices that execute independently from each other, each with separate machine learning capability configured to process sensor data from themselves or other autonomous agents, determine that a specific action should be performed based on correlated event data, and cause the action to be performed.
- Each autonomous agent uses one or more machine learning models that use unsupervised, supervised, or reinforcement learning. In this manner, the agents are self-improving and can therefore become more valuable to a user.
- each autonomous agent gathers data and makes decisions based on input from other agents, sensors, and disparate sources such as external databases and Internet services, including medical systems, financial systems, and social media.
- Exemplary autonomous agents include scheduler, reminder, and responder agents, but other types of autonomous agents are possible.
- a scheduler agent proactively schedules appointments with medical or financial professionals, or coordinates social activity. Appointments can be scheduled based on predictions from machine-learning models based on aggregated data from disparate sources such as medical or financial records.
- a reminder agent reminds the user, by activating an alert such as a sound, light, or phone call, that an appointment is approaching.
- A responder agent monitors or controls household appliances and sensors. The responder agent responds to a user request using a voice assistant. The responder agent accesses Internet sites such as personal financial or medical sites. The responder agent interacts with the user in order to inform the user of daily events, schedule appointments, converse with the user, or provide the user with puzzles or brain-teasers.
- a first autonomous agent including a sensor module, a reasoning module, and an actuator module operates on an integrated assistance computing system.
- the sensor module of the first autonomous agent connects to a door sensor and a light sensor, thereby receiving event data indicating when a door has been opened and closed and when a light has been turned on or off.
- the first autonomous agent provides the event data to a data aggregator operating on the platform.
- a second autonomous agent also operates on the integrated assistance computing platform.
- the second autonomous agent includes a reasoning module that connects to various Internet-based services in order to obtain emails, weather forecasts, and expected sunrise and sunset times.
- the data aggregator receives event data from a second autonomous agent.
- the event data includes the weather, sunrise time, and sunset time.
- the second autonomous agent provides the events to the data aggregator. Over time, the data aggregator correlates door-related events and light-related events.
- the first autonomous agent receives correlated events from the data aggregator and updates the machine learning model of the first autonomous agent.
- the machine learning model analyzes the data, learns patterns, and makes predictions on the correlated data.
- the machine learning model recognizes a daily pattern that includes the light being turned on and a door being opened shortly after sunrise.
- the first autonomous agent creates a rule based on this predicted pattern.
- if the door and light sensors do not detect events consistent with this rule, then the first autonomous agent creates an alert. Deviation from patterns can be detected and exceptions can be made. For example, if an autonomous agent detects that the user stayed up late the previous day, then the rule can be varied slightly to allow for the fact that the user will likely sleep in.
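The rule and its stayed-up-late exception can be sketched as a simple check. The time encoding (minutes since midnight), thresholds, and function name are hypothetical:

```python
def check_morning_rule(sunrise, light_on, door_open, stayed_up_late=False,
                       grace=30):
    """Sketch of the learned rule: the light should turn on and the door
    should open shortly after sunrise; allow extra time when the user
    stayed up late the previous day. Times are minutes since midnight."""
    allowed = grace + (60 if stayed_up_late else 0)
    if light_on is None or door_open is None:
        return "alert"  # expected events never occurred
    if light_on - sunrise > allowed or door_open - sunrise > allowed:
        return "alert"  # events occurred, but far later than the pattern
    return "ok"

print(check_morning_rule(sunrise=420, light_on=435, door_open=445))    # ok
print(check_morning_rule(sunrise=420, light_on=None, door_open=None))  # alert
print(check_morning_rule(420, 495, 500, stayed_up_late=True))          # ok
```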
- a machine learning model detects a pattern of a user watching particular baseball games. Detecting this event data can be performed using audio or video recognition or interfacing with a television or cable box.
- the autonomous agent can also access Internet-based services to obtain event data including wins and losses for the team, delayed game starts, overtimes, etc.
- the data aggregator aggregates this event data from disparate sources and provides the data to the autonomous agent. This aggregated data is useful in performing several different functions. For example, a responder agent can send alerts to the user with game scores or announce that the game will start late. Further, by analyzing this data using the machine learning model, the autonomous agent determines additional trends such as a particular losing streak.
- the autonomous agent inputs the aggregated event data to a machine learning model, which recognizes a pattern. Examples of patterns include that the user typically watches all games for a particular team, all away games, games on a Saturday, or a majority of games. Predictions can also be based on external data such as the weather or other scheduled activities. Based on the determined pattern, the autonomous agent can predict whether a user will watch a particular game. Based on this prediction, the autonomous agent can automatically turn the television on, or remind the user that the game will be on that day.
- the autonomous agent can use the predictions to determine that an abnormal activity has occurred and can then act accordingly. For example, if the autonomous agent predicts that the user will watch a particular game, but does not watch the game, the responder agent takes an action such as reminding the user with an audio or visual alert.
- missing a game can cause an autonomous agent to issue an alert. But the autonomous agent first checks to determine whether another explanation exists for the user missing the game.
- the autonomous agent checks sensor event data to determine whether the user is performing another activity as detected by appliance, light, sound, or other events. In this case, the autonomous agent might determine that the user is healthy and is simply busy doing something else.
- data from Internet sources such as emails or social media may indicate that appointments or visits are scheduled at the same time as the game. For example, the user's daughter could be visiting.
- Having determined an exception to a predicted event, the autonomous agent then learns from the deviation, using reinforcement or other machine learning techniques. For example, the autonomous agent learns that the user does not watch baseball games when the user's daughter is visiting. The autonomous agent provides this feedback to the machine learning model and the data aggregator. The machine learning model is updated accordingly, and the next time the user does not watch a predicted game while the daughter is visiting, the autonomous agent does not identify a deviation. In this manner, by analyzing events from disparate sources, an autonomous agent develops a richer understanding of user behavior and improves over time.
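Learning exceptions from deviations can be sketched as remembering explanations for missed events so the same context is not flagged twice. The class and its set-based memory are an illustrative assumption, far simpler than a trained model:

```python
class DeviationLearner:
    """Sketch of exception learning: when a predicted event is missed, flag
    it once, record the surrounding context, and treat future misses with a
    known explanation as normal rather than as deviations."""

    def __init__(self):
        self.known_exceptions = set()

    def handle_missed_event(self, event, context):
        for reason in context:
            if reason in self.known_exceptions:
                return "no_deviation"  # an already-learned explanation applies
        # Unexplained miss: flag it, then learn from any context present.
        self.known_exceptions.update(context)
        return "deviation"

agent = DeviationLearner()
print(agent.handle_missed_event("baseball_game", {"daughter_visiting"}))  # deviation
print(agent.handle_missed_event("baseball_game", {"daughter_visiting"}))  # no_deviation
```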
- the autonomous agent takes an action.
- a responder agent can first attempt to reach the user by alert, and if the user does not respond, then the responder agent attempts to reach the user again or sends an alert to family or friends.
- Other learned predictions can include predicting a time to get a cane, to move from a cane to a walker, and to move from a walker to a wheelchair.
- the autonomous agents can also schedule activities and services such as scheduling transportation for medical and physical therapy, or physician visits.
- Autonomous agents can also send alerts that include when to take pills and shots or alert a care-giver of possible forgetfulness, or email grocery lists.
- FIG. 1 illustrates an exemplary integrated assistance computing platform 100 , according to certain aspects of the present disclosure.
- Integrated assistance computing platform 100 includes integrated assistance computing system 101 , user device 102 , physical devices 160 , sensors 170 , and Internet-based services 180 .
- Integrated assistance computing system 101 can be implemented as a mobile application on a phone or tablet, on a personal computer such as a laptop or desktop, or implemented in a physical robot.
- Integrated assistance computing system 101 can access physical devices 160 , sensors 170 , or Internet-based services 180 via a network or other connection.
- User device 102 provides a point of interaction for the user.
- User device 102 includes speaker 103 , display 104 , and microphone 105 .
- Speaker 103 can be used by integrated assistance computing system 101 to transmit audio or speech to a user.
- Display 104 can display information such as status updates, news, weather, or alerts. For example, as depicted, display 104 is displaying the weather and the time.
- Microphone 105 can receive voice commands from a user.
- Integrated assistance computing system 101 can process the commands or use an external system to process commands and respond.
- system integrator 140 can reconfigure, activate, or deactivate one or more physical devices 160 .
- Physical devices 160 include wheelchair 161 , doorbell 162 , appliances 163 , and lights 164 .
- system integrator 140 can cause wheelchair 161 to move.
- System integrator 140 can receive a notification from doorbell 162 or access images from a camera installed on doorbell 162 .
- System integrator 140 can turn on or turn off appliances 163 .
- Appliances 163 can include microwaves, stoves, refrigerators, and the like.
- System integrator 140 can turn on, turn off, or dim lights 164 .
- Integrated assistance computing system 101 can connect to physical devices 160 via dedicated point-to-point connection, a wired connection, or a wireless connection.
- physical devices 160 can be operated by a smart home system connected to integrated assistance computing system 101 via a network connection.
- Other physical devices are possible such as medical alert systems, self-driving cars, or robots.
- system integrator 140 can cause a robot to approach a user.
- Sensors 170 include motion sensor 171 , light sensor 172 , sound sensor 173 , and temperature sensor 174 . Other sensors are possible. Sensors 170 provide autonomous agents 110 - 112 data or events on which to base decisions and take actions. For example, sound sensor 173 provides a signal to responder agent 112 that indicates that the user has made a specific sound that identifies that the user is awake. Such sounds and their identification can be provided or learned. Temperature sensor 174 provides a signal to responder agent 112 that a residence is too hot or too cold.
- Internet-based services 180 include public services and also services that store or maintain personally identifiable information.
- Internet-based services include social services 181 such as social media accounts, financial services 182 such as bank accounts or investment accounts, and medical services 183 such as doctors, dentists, wellness centers, or medical alert systems.
- Integrated assistance computing system 101 includes one or more autonomous agents such as a scheduler agent 110 , reminder agent 111 , or responder agent 112 .
- Scheduler agent 110 performs scheduling functions such as making medical appointments, making appointments with financial advisors, or scheduling social activities.
- Reminder agent 111 creates alerts such as reminding the user to take his or her medication, to alter an investment plan, or send emails to friends and family.
- Responder agent 112 can respond to commands such as verbal commands received via microphone 105 or commands received via a user interface.
- Responder agent 112 can also play interactive games such as puzzles with a user.
- Responder agent 112 can also activate or deactivate physical devices 160 .
- While three autonomous agents 110 - 112 are depicted, additional autonomous agents are possible. Autonomous agents can be added to the integrated assistance computing platform via software upgrades, or implemented on remote computing systems and accessed via a data network.
- Integrated assistance computing system 101 also includes system integrator 140 , which can receive information from or control physical devices 160 , sensors 170 , or Internet-based services 180 .
- via system integrator 140 , autonomous agents 110 - 112 can control external devices. For example, if the responder agent 112 decides that the lights 164 should be turned on because light sensor 172 is receiving a signal that a light level is too dim, then responder agent 112 can send a control signal to system integrator 140 , which in turn causes the lights 164 to be activated.
- System integrator 140 can perform speech recognition.
- Data aggregator 150 aggregates data from one or more autonomous agents 110 - 112 .
- Data aggregator 150 includes one or more data stores 151 .
- Data aggregator 150 can perform data functions such as correlation. For example, if data aggregator 150 determines that event data received from scheduler agent 110 , for example, an appointment, is related to data received from reminder agent 111 , then data aggregator can integrate the data into a common data entry or data structure, or link the data together by using a reference. In this manner, autonomous agents 110 - 112 receive the benefit of utilizing correlated data aggregated from multiple autonomous agents.
- each autonomous agent 110 - 112 can access correlated data from data aggregator 150 , which in turn accesses data from physical devices 160 , sensors 170 , or Internet-based services 180 , and take actions based on that data.
- scheduler agent 110 can determine from a medical service 183 that a user has a doctor's appointment.
- Scheduler agent 110 can provide this information to machine learning model 120 , which, over multiple iterations, can learn the frequency with which medical appointments are scheduled.
- Scheduler agent 110 can remind the user of an upcoming appointment, for example, by sending a message to speaker 103 or display 104 , or, if no appointment is scheduled, cause an appointment to be scheduled.
- Each autonomous agent 110 - 112 can also access data gathered from other autonomous agents, provide that data to a machine learning model, and receive predictions thereon.
- an autonomous agent 110 - 112 can access correlated data from data aggregator 150 , provide the data to one or more machine learning models, and receive predictions from the machine learning model.
- Each autonomous agent can receive event data from sensors 170 , provide the event data to data aggregator 150 , receive correlated event data from data aggregator 150 , provide the correlated event data to a machine learning model, receive a prediction from the machine learning model, and take an action based on the prediction.
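The per-agent cycle just described (store events, retrieve correlated data, predict, act) can be sketched end to end. Every name here, including the toy aggregator and the lambda model and actuator, is a hypothetical stand-in, not the platform's actual API:

```python
def agent_cycle(sensor_events, aggregator, model, actuator):
    """Sketch of one agent cycle: send sensor events to the aggregator,
    retrieve correlated event data, obtain a prediction from the machine
    learning model, and pass the prediction to the actuator."""
    aggregator.store(sensor_events)
    correlated = aggregator.correlated_events()
    prediction = model(correlated)
    return actuator(prediction)

class ToyAggregator:
    def __init__(self):
        self.data = []
    def store(self, events):
        self.data.extend(events)
    def correlated_events(self):
        return list(self.data)

agg = ToyAggregator()
action = agent_cycle(
    ["light_on"], agg,
    model=lambda events: "user_awake" if "light_on" in events else "user_asleep",
    actuator=lambda pred: "greet" if pred == "user_awake" else "wait",
)
print(action)  # greet
```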
- scheduler agent 110 includes machine learning model 120
- reminder agent 111 includes machine learning model 121
- responder agent 112 includes machine learning model 122 . The methods used with machine learning models in each agent are discussed further with respect to FIG. 2 .
- FIG. 2 illustrates an exemplary environment in which an autonomous agent can operate, according to certain aspects of the present disclosure.
- Environment 200 includes autonomous agent 201 , sensors 170 , and Internet-based services 180 .
- Autonomous agent 201 includes one or more of reasoning module 220 , actuator module 230 , machine learning model 210 , and data 215 .
- Each autonomous agent can execute as a separate application on integrated assistance computing system 101 , or on a separate computing system.
- Reasoning module 220 , actuator module 230 , and machine learning model 210 can execute on integrated assistance computing system 101 as separate applications or processes, or can execute on different processors.
- Data 215 can be used for local storage or storage of training data used for machine learning model 210 .
- machine learning model 210 can be included within reasoning module 220 .
- Autonomous agents can help combat senior-specific problems such as loneliness and dementia. For example, by analyzing data from multiple agents, integrated assistance computing platform can detect whether anyone has been seen at a residence or whether the user has interacted with anyone. Integrated assistance computing platform can help combat dementia by playing games and puzzles with the user. The games and puzzles can be obtained from Internet-based services 180 .
- the scheduler agent and the reminder agent together can perform more sophisticated analysis than either agent alone, can make more intelligent suggestions, and can be more useful to the user. For example, if an event indicating that the user is watching television more than usual or not moving much is correlated with the user not taking medication on time, the autonomous agent may conclude that the user is not feeling well and take an action.
- Autonomous agents can self-organize and self-register with the integrated assistance computing platform.
- the autonomous agents can use an auto-discovery function provided by integrated assistance computing system 101 .
- the auto-discovery function periodically checks for new autonomous agents, physical devices 160 , sensors 170 , or Internet-based services 180 . If a new agent, device, sensor, or service is detected, then the auto-discovery function updates internal configuration tables, configures interface protocols, refreshes a network topology, and activates the new agent, device, sensor, or service.
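The auto-discovery function's periodic check can be sketched as follows. The registry structure, method names, and protocol strings are assumptions for illustration:

```python
class AutoDiscovery:
    """Sketch of the auto-discovery function: periodically check for new
    agents, devices, sensors, or services; register anything new in the
    configuration table; and activate it."""

    def __init__(self):
        self.config_table = {}  # name -> interface protocol
        self.active = set()

    def poll(self, detected):
        """One periodic check over currently detected components."""
        activated = []
        for name, protocol in detected.items():
            if name not in self.config_table:
                self.config_table[name] = protocol  # update config tables
                self.active.add(name)               # activate the new component
                activated.append(name)
        return activated

disc = AutoDiscovery()
print(disc.poll({"light_sensor": "zigbee", "scheduler_agent": "local"}))
print(disc.poll({"light_sensor": "zigbee"}))  # already registered: nothing new
```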
- a new autonomous agent is added to the integrated assistance computing system 101 .
- the new autonomous agent communicates its capabilities to the computing system. In turn, the computing system informs other autonomous agents of the capabilities.
- the platform starts providing information to the agent, for example, via data aggregator 150 .
- Autonomous agent 201 accesses physical devices 160 , sensors 170 , and Internet-based services 180 .
- a first autonomous agent can analyze sensors in a residence such as light sensor 172 or sound sensor 173 .
- a second autonomous agent can analyze Internet-based services such as social services 181 .
- the two agents can operate in conjunction with one another.
- the second autonomous agent can schedule appointments or gather photos from the user's social media account.
- Data aggregator 150 aggregates events from the first agent and second agent. Each autonomous agent can access the aggregated data and provide the data to the respective machine learning model. Based on the data, a machine learning model can determine patterns or determine or execute rules. For example, the data aggregator 150 can aggregate and organize events that enable a machine learning model to predict when a user will wake up or go to sleep. Subsequently, an autonomous agent can take action based on the rules, such as by turning on the lights, turning up the heat, or turning on the coffee maker.
- Rules can be used to adjust predicted actions.
- An example rule ensures that a determined action does not interfere with the user's sleep. For example, by accessing sensor data the first autonomous agent can determine that the user is asleep. Based on a rule not to disturb the user when he or she is asleep, the first autonomous agent can delay a presentation of photos received via social media until such time that the user is awake.
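The do-not-disturb rule above can be sketched as a filter that defers disturbing actions while the user is asleep. The action names and the deferral queue are illustrative assumptions:

```python
def apply_rules(action, user_asleep, deferred):
    """Sketch of a rule adjusting predicted actions: actions that would
    disturb a sleeping user are queued until the user is awake."""
    disturbing = {"show_photos", "play_greeting"}
    if user_asleep and action in disturbing:
        deferred.append(action)  # delay until the user wakes
        return None
    return action

queue = []
print(apply_rules("show_photos", user_asleep=True, deferred=queue))   # None (deferred)
print(queue)                                                          # ['show_photos']
print(apply_rules("show_photos", user_asleep=False, deferred=queue))  # show_photos
```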
- Reasoning module 220 analyzes data and determines actions, optionally in conjunction with machine learning model 210 .
- reasoning module 220 receives correlated event data from data aggregator 150 and takes decisions thereon.
- Reasoning module 220 includes an inference engine that performs the data analysis or provides the data to machine learning model 210 for analysis.
- Machine learning model 210 receives data from reasoning module 220 , predicts data patterns or events and provides the predictions to reasoning module 220 .
- Reasoning module 220 can include predefined or dynamic rules. Rules specify that when a specific event occurs, a specific action is taken in response.
- Machine learning model 210 learns different data patterns. For example, sensors 170 can be used to determine when a user is waking up, going to bed, or getting ready to leave the home. Similarly, machine learning model 210 can use socialization patterns to determine the kinds of interactions that the user has on a daily basis. For example, by analyzing Internet-based services 180 such as email or text messages, the autonomous agent 201 can determine how often the user is socializing and with whom. Such analysis is useful to prevent loneliness, i.e., ensuring that the user is social enough, or to prevent fraud by detecting scam messages or visits. As discussed further with respect to FIG. 5 , different types of learning are possible.
- Actuator module 230 receives determined actions from reasoning module 220 and causes the integrated assistance computing system 101 to take the action.
- Actions can include issuing an alert, sending an email, turning on the lights, and so on. For example, if the light sensor 172 detects light, then the reasoning module determines a significance of the event and determines that a greeting is appropriate.
- the actuator module 230 outputs a greeting via speaker 103 .
- Actuator module 230 can issue alerts, for example, alerting the user to take medicine or of an upcoming appointment, prompting an update to an investment plan, indicating potential fraud, emailing an invitation, or notifying a care-giver or medical personnel.
- FIG. 3 depicts an example of a data integration platform for an integrated assistance computing system, according to aspects of the present disclosure.
- FIG. 3 depicts an exemplary integrated assistance computing system 300 .
- Integrated assistance computing system 300 includes one or more of data aggregator 350 , sensors 170 , autonomous agent 312 , reminder agent 311 , scheduler agent 310 , Internet-based services 180 , and speaker 103 . While integrated assistance computing system 300 is depicted with three autonomous agents, different numbers are possible.
- Data aggregator 350 includes data 360 , which stores events 361 - 369 .
- Data 360 is common storage accessible to multiple autonomous agents. In this manner, data aggregator 350 enables interaction and cooperation between the agents, including using data shared from external sources such as Internet-based services 180 . Examples of data that can be stored by data aggregator 350 include public information such as addresses and phone numbers, news data, public records, private information such as pictures or social media posts, and learned rules, models, or algorithms.
- Data aggregator 350 can determine similarities or correlations between data received from autonomous agents. Data aggregator 350 receives events 361 - 369 from one or more autonomous agents, aggregates and correlates the events into groups, and provides the groups to scheduler agent 310 , reminder agent 311 , and autonomous agent 312 .
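A minimal sketch of how a data aggregator might correlate events into groups, here by time proximity; the event schema and the 30-minute window are assumptions for illustration, not taken from the disclosure:

```python
def correlate_events(events, window_minutes=30):
    """Group events whose timestamps fall within window_minutes of the
    previous event in the group (hypothetical schema: (minute, label))."""
    groups = []
    for event in sorted(events):
        if groups and event[0] - groups[-1][-1][0] <= window_minutes:
            groups[-1].append(event)  # close in time: same group
        else:
            groups.append([event])    # too far apart: start a new group
    return groups

events = [(400, "light on"), (405, "door open"), (700, "tv on")]
groups = correlate_events(events)
print(len(groups))  # 2: {light on, door open} and {tv on}
```

Correlation by shared person, place, or keyword, as mentioned elsewhere in the disclosure, would replace the time comparison with a key comparison.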
- Events 361 - 369 represent events captured by one or more autonomous agents. For example, events 361 , 363 and 364 could be obtained from autonomous agent 312 , whereas the other events are obtained from another autonomous agent.
- Autonomous agent 312 can access sensors 170 and receive a detection by light sensor 172 that a light was turned on. Similarly, autonomous agent 312 can access sound sensor 173 and receive a detection of noise.
- Other events can include social media posts, emails, or notifications. In the aggregate, events can determine a pattern of activity such as when the user wakes up, what the user typically does during the day (or a particular day of the week), and when the user typically goes to bed.
- Events 361 - 369 are ordered chronologically but need not be. For example, events can be reordered and grouped according to other criteria such as an identified connection with a person, place, or keyword.
- Event 364 is a detection of reduced activity. Reduced activity can be determined based on a deviation from a normal pattern, such as a reduction in movement.
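Reduced activity as a deviation from normal can be sketched with a simple z-score test over historical daily activity counts; the threshold and sample data are illustrative assumptions:

```python
from statistics import mean, stdev

def reduced_activity(daily_counts, today, z_threshold=-2.0):
    """Flag today's activity count as reduced if it falls more than
    |z_threshold| standard deviations below the historical mean."""
    mu, sigma = mean(daily_counts), stdev(daily_counts)
    if sigma == 0:
        return today < mu
    z = (today - mu) / sigma
    return z < z_threshold

history = [50, 52, 48, 51, 49, 50, 53]  # motion events per day
print(reduced_activity(history, today=20))  # True: far below baseline
print(reduced_activity(history, today=49))  # False: within normal range
```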
- Scheduler agent 310 schedules an appointment as indicated by event 366. To do so, scheduler agent 310 accesses Internet-based services 180, for example a healthcare provider website. Once the appointment is scheduled, scheduler agent 310 stores the appointment info in data 360 so that other autonomous agents may use the information to make further predictions.
- Reminder agent 311 sends a reminder about the appointment to the user, for example, via text-to-speech via speaker 103.
- FIG. 4 depicts a flowchart of an exemplary method 400 used to operate an integrated assistance computing system, according to certain aspects of the present disclosure.
- FIG. 4 is discussed with respect to FIGS. 2 and 3 for example purposes, but method 400 can be implemented on other systems.
- Process 400 involves receiving, from the first sensor module, a first set of event data indicating events relating to a subject.
- Autonomous agent 312 receives events 361-365, 367, and 369 from sensors 170.
- Scheduler agent 310 receives events 366 and 368, representing the appointment, from Internet-based services 180.
- Process 400 involves providing the first set of event data to a data aggregator.
- Autonomous agent 312 provides events 361-365, 367, and 369 to data aggregator 350, which stores the events in data 360.
- Scheduler agent 310 provides events 366 and 368 to data aggregator 350 , which stores the events in data 360 .
- Process 400 involves receiving, from the data aggregator, correlated event data comprising events sensed by the first autonomous agent and a second autonomous agent.
- Reminder agent 311 receives event 366 from data aggregator 350.
- Reminder agent 311 is shown as a different agent than autonomous agent 312, but any autonomous agent can both provide data to and receive data from data aggregator 350.
- Process 400 involves updating the first machine learning model.
- Scheduler agent 310 receives events 361-365, 367, and 369 and provides the received events to a machine learning model.
- Process 400 involves determining, via the first reasoning module and based on the first pattern of activity, that a first action should be performed, and causing the first actuator module to perform the first action.
- Scheduler agent 310 determines that an appropriate course of action is to schedule a doctor appointment.
- Scheduler agent 310 connects to Internet-based services 180 and schedules the appointment.
- Reminder agent 311 provides a reminder of the scheduled visit via speaker 103.
- The autonomous agents continue to monitor the user's activity. Responsive to determining the user's behavior has reverted to normal, i.e., consistent with a pattern of events illustrated by events 361-363, an autonomous agent can determine that scheduling the doctor appointment was a valid decision. In this regard, the machine learning model receives a reward indicating that the decision taken was correct. Conversely, if the activity detected 369 indicates continued unusual activity, then the autonomous agent determines a different course of action and provides the machine learning model an indication of a low reward. In either case, the machine learning model can adjust internal parameters such as states or modify previously established rules in order to encourage (for positive feedback) or discourage (for negative feedback) the performed action. In this manner, the autonomous agents learn and improve.
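The reward mechanism described above can be sketched as a simple action-value update; the 0/1 reward, the initial value, and the learning rate are illustrative assumptions rather than details from the disclosure:

```python
def reward_from_followup(pattern_reverted_to_normal):
    # High reward when activity reverts to the learned baseline, low otherwise.
    return 1.0 if pattern_reverted_to_normal else 0.0

def update_action_value(values, action, reward, learning_rate=0.1):
    """Nudge the stored value of an action toward the observed reward,
    encouraging (or discouraging) the action in future decisions."""
    old = values.get(action, 0.5)  # assumed neutral prior
    values[action] = old + learning_rate * (reward - old)
    return values[action]

values = {}
v = update_action_value(values, "schedule_doctor_appointment",
                        reward_from_followup(True))
print(round(v, 3))  # 0.55: value nudged upward after positive feedback
```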
- FIG. 5 depicts a flowchart of an exemplary method 500 used for reinforcement learning, according to certain aspects of the present disclosure.
- Method 500 can be implemented by the integrated assistance computing system or by another device.
- Machine learning models described herein can base predictions on data aggregated from one or more autonomous agents, one or more sensors 170 , one or more physical devices 160 , or one or more Internet-based services 180 .
- The machine learning model is continuously updated based on feedback received from a user or operator. Feedback can be provided via a user interface, via text-to-speech, or by another means.
- Process 500 involves receiving, from a data aggregator, data including correlated events.
- The data can include events that originated from different autonomous agents, such as reminder agent 111 or responder agent 112.
- Data aggregator 350 determines that two or more events are related, groups the events, and makes the correlated events available to the autonomous agents.
- Process 500 involves initiating an action based on a prediction from a machine learning model.
- Integrated assistance computing system 101 schedules an appointment for the user, such as a doctor's appointment.
- Process 500 involves receiving user feedback indicating whether the action is desirable or undesirable.
- Feedback can be provided in different manners.
- Feedback can be user-based, e.g., via a user interface or voice command.
- Integrated assistance computing system 101 can receive feedback from a user indicating whether the user considers the scheduled appointment to be useful.
- Feedback can be inferred from a detection that an unusual pattern of activity is continuing. For example, if an unusual pattern of activity (e.g., shown by event 364 ) does not improve, then a low reward or negative feedback is inferred.
- Integrated assistance computing system 101 provides the feedback to the machine learning model.
- Process 500 involves updating internal parameters of the machine learning model.
- The parameters are updated so as to maximize the reward. More specifically, the machine learning model associates attributes or parameters of the scheduled event with the user's feedback such that the event is more likely to be generated if the event received positive feedback, and less likely to be generated if the event received negative feedback. In this manner, the machine learning model maximizes the reward and improves over iterations.
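One way to realize "update parameters so as to maximize the reward" is an epsilon-greedy action-value scheme. This is a hedged sketch: the action names, simulated rewards, and hyperparameters are invented for illustration and are not from the disclosure:

```python
import random

def choose_action(values, actions, epsilon=0.1, rng=random):
    """Epsilon-greedy: usually pick the highest-valued action, occasionally
    explore, so positively rewarded actions become more likely over time."""
    if rng.random() < epsilon:
        return rng.choice(actions)
    return max(actions, key=lambda a: values.get(a, 0.0))

def update(values, action, reward, lr=0.2):
    # Move the action's value toward the observed reward.
    old = values.get(action, 0.0)
    values[action] = old + lr * (reward - old)

# Simulated feedback: scheduling an appointment earns reward 1, idling 0.
rng = random.Random(0)
values, actions = {}, ["schedule_appointment", "do_nothing"]
for _ in range(200):
    a = choose_action(values, actions, rng=rng)
    update(values, a, 1.0 if a == "schedule_appointment" else 0.0)
print(values["schedule_appointment"] > values.get("do_nothing", 0.0))  # True
```

After a few hundred iterations the positively rewarded action dominates, which is the "improves over iterations" behavior the text describes.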
- Machine learning models described herein can also use supervised or unsupervised learning.
- In supervised learning, a set of training data with known positive and negative cases is provided to the machine learning model 210 in order to train the model to make predictions. Training can be performed before the integrated assistance computing system is provided to a user, for example, at the factory.
- Supervised learning can be used to train the machine learning model 210 to predict when a user will wake up by providing patterns that include the identification of sound, light, or movement.
- In unsupervised learning, machine learning model 210 learns and adapts over time. In this case, training is performed with live user data, for example, generated when the user interacts with the device.
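The supervised wake-up example above can be sketched as fitting a simple threshold rule to labeled sound/light/movement patterns; the 0/1 feature encoding and the tiny training set are hypothetical:

```python
def train_threshold(samples):
    """Fit a simple rule to labeled examples (features: [sound, light,
    motion] as 0/1; label: 1 = user waking up). Predict 'waking up' when
    at least k features are active, choosing the k that best fits the
    training data."""
    best_k, best_acc = 1, -1.0
    for k in (1, 2, 3):
        correct = sum((sum(x) >= k) == bool(y) for x, y in samples)
        acc = correct / len(samples)
        if acc > best_acc:
            best_k, best_acc = k, acc
    return best_k

samples = [
    ([1, 1, 1], 1), ([1, 1, 0], 1), ([0, 1, 1], 1),  # waking up
    ([1, 0, 0], 0), ([0, 0, 0], 0), ([0, 1, 0], 0),  # not waking up
]
k = train_threshold(samples)
print(k)  # 2: at least two of sound/light/motion => predict waking up
```

A production model would of course use a richer learner, but the shape is the same: labeled positive and negative cases in, a predictive rule out.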
- FIG. 6 depicts an example of a computing system 600 for implementing an integrated assistance computing system, according to certain aspects of the present disclosure.
- The implementation of computing system 600 could be used for one or more of scheduler agent 110, reminder agent 111, responder agent 112, another autonomous agent, system integrator 140, or data aggregator 150.
- The depicted example of a computing system 600 includes a processor 602 communicatively coupled to one or more memory devices 604.
- The processor 602 executes computer-executable program code stored in a memory device 604, accesses information stored in the memory device 604, or both.
- Examples of the processor 602 include a microprocessor, an application-specific integrated circuit (“ASIC”), a field-programmable gate array (“FPGA”), or any other suitable processing device.
- The processor 602 can include any number of processing devices, including a single processing device.
- A memory device 604 includes any suitable non-transitory computer-readable medium for storing program code 605, program data 607, or both.
- Program code 605 and program data 607 can be from scheduler agent 110 , reminder agent 111 , responder agent 112 , another autonomous agent, system integrator 140 , or data aggregator 150 , or any other applications or data described herein.
- A computer-readable medium can include any electronic, optical, magnetic, or other storage device capable of providing a processor with computer-readable instructions or other program code.
- Non-limiting examples of a computer-readable medium include a magnetic disk, a memory chip, a ROM, a RAM, an ASIC, optical storage, magnetic tape or other magnetic storage, or any other medium from which a processing device can read instructions.
- The instructions may include processor-specific instructions generated by a compiler or an interpreter from code written in any suitable computer-programming language, including, for example, C, C++, C#, Visual Basic, Java, Python, Perl, JavaScript, and ActionScript.
- The computing system 600 may also include a number of external or internal devices, an input device 620, a presentation device 618, or other input or output devices.
- The computing system 600 is shown with one or more input/output ("I/O") interfaces 608.
- An I/O interface 608 can receive input from input devices or provide output to output devices.
- One or more buses 606 are also included in the computing system 600 .
- The bus 606 communicatively couples one or more components of a respective one of the computing system 600.
- The computing system 600 executes program code 605 that configures the processor 602 to perform one or more of the operations described herein.
- Examples of the program code 605 include, in various aspects, modeling algorithms executed by scheduler agent 110 , reminder agent 111 , responder agent 112 , another autonomous agent, system integrator 140 , or data aggregator 150 , or other suitable applications that perform one or more operations described herein.
- The program code may be resident in the memory device 604 or any suitable computer-readable medium and may be executed by the processor 602 or any other suitable processor.
- One or more memory devices 604 store program data 607 that includes one or more datasets and models described herein. Examples of these datasets include interaction data, environment metrics, training interaction data or historical interaction data, transition importance data, etc.
- One or more of the data sets, models, and functions are stored in the same memory device (e.g., one of the memory devices 604).
- One or more of the programs, data sets, models, and functions described herein are stored in different memory devices 604 accessible via a data network.
- The computing system 600 also includes a network interface device 610.
- The network interface device 610 includes any device or group of devices suitable for establishing a wired or wireless data connection to one or more data networks.
- Non-limiting examples of the network interface device 610 include an Ethernet network adapter, a modem, and/or the like.
- The computing system 600 is able to communicate with one or more other computing devices via a data network using the network interface device 610.
- The computing system 600 also includes the input device 620 and the presentation device 618 depicted in FIG. 6.
- An input device 620 can include any device or group of devices suitable for receiving visual, auditory, or other suitable input that controls or affects the operations of the processor 602 .
- Non-limiting examples of the input device 620 include a touchscreen, a mouse, a keyboard, a microphone, a separate mobile computing device, etc.
- A presentation device 618 can include any device or group of devices suitable for providing visual, auditory, or other suitable sensory output.
- Non-limiting examples of the presentation device 618 include a touchscreen, a monitor, a speaker, a separate mobile computing device, etc.
- Presentation device 618 can implement functionality of display 104 .
- Presentation device 618 can display user interface elements, such as sliders or controls.
- While FIG. 6 depicts the input device 620 and the presentation device 618 as being local to the computing device that executes scheduler agent 110, reminder agent 111, responder agent 112, another autonomous agent, system integrator 140, or data aggregator 150, other implementations are possible.
- One or more of the input device 620 and the presentation device 618 can include a remote client-computing device that communicates with the computing system 600 via the network interface device 610 using one or more data networks described herein.
Abstract
Systems and methods disclosed herein relate to autonomous agents. A first autonomous agent receives, from a first sensor, a first set of event data indicating events relating to a subject. The first autonomous agent provides the first set of event data to a data aggregator. The first autonomous agent receives, from the data aggregator, correlated event data including events sensed by the first autonomous agent and a second autonomous agent. The first autonomous agent applies a machine learning model to the correlated event data to predict a first pattern of activity and determines, based on the first pattern of activity, that a first action is to be performed, causing a first actuator module to perform the first action.
Description
- This application is a continuation of U.S. patent application Ser. No. 16/018,772 filed on Jun. 26, 2018. The contents of the foregoing application are hereby incorporated by reference into this application as if set forth herein in full.
- This invention relates generally to automatic assistance devices and more specifically to an integrated assistance platform that integrates data from disparate sources and uses machine-learning to provide intelligent assistance.
- Smart devices provide many benefits. Smart devices can receive voice commands and integrate with home appliances and internet services, making daily living easier. For example, smart devices can turn on or off home appliances or provide news updates via a display or text-to-speech.
- But such devices do not fully address the specific needs of seniors. For example, some senior-specific devices exist, but such devices fail to fully address senior-specific needs such as mental health issues like dementia, physical mobility issues, or loneliness. More specifically, existing solutions do not fully integrate disparate systems such as local sensors with Internet-based services with artificial intelligence. As such, existing systems are not able to proactively address the needs of seniors. Hence, additional solutions are needed.
- Certain aspects and features include a system and a method of operating a first autonomous agent, a second autonomous agent, and a data aggregator. The first autonomous agent includes a first sensor module, a first reasoning module, and a first actuator module. The first sensor module manages a first sensor and the first reasoning module provides a first machine learning model. The first autonomous agent is configured to receive, from the first sensor module, a first set of event data indicating events relating to a subject, provide the first set of event data to a data aggregator, receive, from the data aggregator, correlated event data including events sensed by the first autonomous agent and the second autonomous agent, and update the first machine learning model. Updating the first machine learning model includes applying the machine learning model to the correlated event data to predict a first pattern of activity, determining a first reward of the first machine learning model, and updating internal parameters of the first machine learning model to maximize the first reward. The first autonomous agent is further configured to determine, via the first reasoning module and based on the first pattern of activity, that a first action is to be performed, causing the first actuator module to perform the first action.
- The second autonomous agent includes a second sensor module, a second reasoning module, and a second actuator module. The second sensor module manages a second sensor and the second reasoning module provides a second machine learning model. The second sensor is different from the first sensor and the second autonomous agent is configured to receive, from the second sensor module, a second set of event data indicating events relating to a subject, provide the second set of event data to the data aggregator, receive, from the data aggregator, the correlated event data, and update the second machine learning model. Updating the second machine learning model includes applying the second machine learning model to the correlated event data to predict a second pattern of activity, determining a second reward of the second machine learning model, and updating internal parameters of the second machine learning model to maximize the second reward. The second autonomous agent is configured to determine, via the second reasoning module and based on the second pattern of activity, that a second action is to be performed and cause the second actuator module to perform the second action.
- The data aggregator includes a data store and is configured to store the first set of event data and the second set of event data in the data store, correlate event data from the first set of event data and the second set of event data, and store the correlated event data in the data store.
- These illustrative examples are mentioned not to limit or define the disclosure, but to provide examples to aid understanding thereof. Additional examples and further description are provided in the Detailed Description.
- These and other features, aspects, and advantages of the present disclosure are better understood when the following Detailed Description is read with reference to the accompanying drawings, where:
-
FIG. 1 illustrates an exemplary integrated assistance computing system, according to certain aspects of the present disclosure. -
FIG. 2 illustrates an exemplary environment in which an autonomous agent can operate, according to certain aspects of the present disclosure. -
FIG. 3 depicts an example of a data aggregator for an integrated assistance computing system, according to aspects of the present disclosure. -
FIG. 4 depicts a flowchart of an exemplary method used to operate an integrated assistance computing system, according to certain aspects of the present disclosure. -
FIG. 5 depicts a flowchart of an exemplary method used for reinforcement learning, according to certain aspects of the present disclosure. -
FIG. 6 depicts an example of a computing system for implementing an integrated assistance computing system, according to certain aspects of the present disclosure. - Aspects of the present invention relate to an integrated assistance computing platform that assists users in a proactive manner with daily living. The integrated assistance computing platform can address the needs of seniors or other people. The integrated assistance platform can be implemented as a smartphone application connecting to a cloud-based service, a hardware device, or a robotic device. The integrated assistance computing system includes one or more autonomous agents. Autonomous agents are applications or devices that execute independently from each other, each with separate machine learning capability configured to process sensor data from themselves or other autonomous agents, determine that a specific action should be performed based on correlated event data, and cause the action to be performed.
- Each autonomous agent uses one or more machine learning models that use unsupervised, supervised, or reinforcement learning. In this manner, the agents are self-improving and can therefore become more valuable to a user. By using an integrated assistance computing platform, each autonomous agent gathers and takes decisions based on data from other agents, sensors, and disparate sources such as external databases or internet services such as medical systems, financial systems, and social media.
- Exemplary autonomous agents include scheduler, reminder, and responder agents, but other types of autonomous agents are possible. For example, a scheduler agent proactively schedules appointments with medical or financial professionals, or coordinates social activity. Appointments can be scheduled based on predictions from machine-learning models based on aggregated data from disparate sources such as medical or financial records. For example, a reminder agent reminds the user, by activating an alert such as a sound, light, or phone call, that an appointment is approaching. A responder agent monitors or controls household appliances and sensors. The responder agent responds to a user request using a voice assistant. The responder agent accesses Internet sites such as personal financial or medical sites. The responder agent interacts with the user in order to inform the user of daily events, schedule appointments, converse with the user, or provide the user with puzzles or brain-teasers.
- The following example is provided to introduce certain aspects. A first autonomous agent including a sensor module, a reasoning module, and an actuator module operates on an integrated assistance computing system. The sensor module of the first autonomous agent connects to a door sensor and a light sensor, thereby receiving event data indicating when a door has been opened and closed and when a light has been turned on or off. The first autonomous agent provides the event data to a data aggregator operating on the platform.
- A second autonomous agent also operates on the integrated assistance computing platform. The second autonomous agent includes a reasoning module that connects to various Internet-based services in order to obtain emails, weather forecasts, expected sunrise and sunset. The data aggregator receives event data from a second autonomous agent. The event data includes the weather, sunrise time, and sunset time. The second autonomous agent provides the events to the data aggregator. Over time, the data aggregator correlates door-related events and light-related events.
- Continuing the example, the first autonomous agent receives correlated events from the data aggregator and updates the machine learning model of the first autonomous agent. The machine learning model analyzes the data, learns patterns, and makes predictions on the correlated data. Continuing the example, the machine learning model recognizes a daily pattern that includes the light being turned on and a door being opened shortly after sunrise. The first autonomous agent creates a rule based on this predicted pattern. At a future time, if the door and light sensors do not detect events consistent with this rule, then the first autonomous agent creates an alert. Deviation from patterns can be detected and exceptions can be made. For example, if an autonomous agent detects that the user stayed up late the previous day, then the rule can be varied slightly to allow for the fact that the user will likely sleep in.
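The sunrise rule and its stayed-up-late exception can be sketched as follows. The minute-of-day encoding, slack values, and event labels are assumptions for illustration:

```python
def expected_wake_window(sunrise_minute, stayed_up_late, slack=60, late_extra=90):
    """Window (minutes after midnight) during which the learned
    light-on/door-open pattern is expected; widened when the user
    stayed up late the previous night."""
    end = sunrise_minute + slack + (late_extra if stayed_up_late else 0)
    return (sunrise_minute, end)

def check_rule(events, sunrise_minute, stayed_up_late=False):
    """Return None if the pattern occurred within the window, else an alert."""
    start, end = expected_wake_window(sunrise_minute, stayed_up_late)
    seen = {label for minute, label in events if start <= minute <= end}
    if {"light on", "door open"} <= seen:
        return None
    return "alert: expected morning pattern not observed"

sunrise = 6 * 60  # 6:00 AM, obtained from an Internet-based service
ok = check_rule([(370, "light on"), (380, "door open")], sunrise)
print(ok)                        # None: pattern observed shortly after sunrise
print(check_rule([], sunrise))   # alert: pattern missing
```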
- In another example, a machine learning model detects a pattern of a user watching particular baseball games. Detecting this event data can be performed using audio or video recognition or interfacing with a television or cable box. The autonomous agent can also access Internet-based services to obtain event data including wins and losses for the team, delayed game starts, overtimes, etc. The data aggregator aggregates this event data from disparate sources and provides the data to the autonomous agent. This aggregated data is useful in performing several different functions. For example, a responder agent can send alerts to the user with game scores or announce that the game will start late. Further, by analyzing this data using the machine learning model, the autonomous agent determines additional trends such as a particular losing streak.
- The autonomous agent inputs the aggregated event data to a machine learning model, which recognizes a pattern. Examples of patterns include that the user typically watches all games for a particular team, all away games, games on a Saturday, or a majority of games. Predictions can also be based on external data such as the weather or other scheduled activities. Based on the determined pattern, the autonomous agent can predict whether a user will watch a particular game. Based on this prediction, the autonomous agent can automatically turn the television on, or remind the user that the game will be on that day.
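A hedged sketch of predicting whether the user will watch a game from per-attribute watch rates; the attribute encoding, the data, and the 0.5 threshold are invented for illustration:

```python
from collections import defaultdict

def fit_watch_pattern(history):
    """Estimate, per game attribute (hypothetical: 'home'/'away' and
    weekday), the fraction of past games the user watched."""
    counts = defaultdict(lambda: [0, 0])  # attr -> [watched, total]
    for game, watched in history:
        for attr in game:
            counts[attr][1] += 1
            counts[attr][0] += watched
    return counts

def predict_watch(counts, game, threshold=0.5):
    # Average the watch rate of each attribute present in the game.
    rates = [counts[a][0] / counts[a][1] for a in game if counts[a][1]]
    return bool(rates) and sum(rates) / len(rates) >= threshold

history = [
    (("away", "Sat"), 1), (("away", "Sun"), 1),
    (("home", "Wed"), 0), (("away", "Sat"), 1), (("home", "Sun"), 0),
]
counts = fit_watch_pattern(history)
print(predict_watch(counts, ("away", "Sat")))  # True: away Saturday games
print(predict_watch(counts, ("home", "Wed")))  # False: home Wednesday games
```

A positive prediction could then drive the actions in the text, such as turning the television on or issuing a reminder.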
- Additionally, the autonomous agent can use the predictions to determine that an abnormal activity has occurred and can then act accordingly. For example, if the autonomous agent predicts that the user will watch a particular game, but does not watch the game, the responder agent takes an action such as reminding the user with an audio or visual alert.
- In some circumstances, missing a game can cause an autonomous agent to issue an alert. But the autonomous agent first checks to determine whether another explanation exists for the user missing the game. The autonomous agent checks sensor event data to determine whether the user is performing another activity as detected by appliance, light, sound, or other events. In this case, the autonomous agent might determine that the user is healthy and is simply busy doing something else. Similarly, data from Internet-sources such as emails or social media may indicate appointments or visits are scheduled at the same time as the game. For example, the user's daughter could be visiting.
- Having determined an exception to a predicted event, the autonomous agent then learns from the deviation, using reinforcement or other machine learning techniques. For example, the autonomous agent learns that the user does not watch baseball games when the user's daughter is visiting. The autonomous agent provides this feedback to the machine learning model and the data aggregator. The machine learning model is updated accordingly and next time the user does not watch a predicted game and the daughter is visiting, the autonomous agent does not identify a deviation. In this manner, by analyzing events from disparate sources, an autonomous agent determines a richer understanding of user behavior and improves over time.
- Alternatively, responsive to determining that no other events indicate a reason that the user missed a game, the autonomous agent takes an action. A responder agent can first attempt to reach the user by alert, and if the user does not respond, the responder agent can send an alert to family or friends.
- Other learned predictions can include predicting a time to get a cane, move from cane to a walker, and move from walker to wheelchair. Via actuator modules, the autonomous agents can also schedule activities and services such as scheduling transportation for medical and physical therapy, or physician visits. Autonomous agents can also send alerts that include when to take pills and shots or alert a care-giver of possible forgetfulness, or email grocery lists.
- Turning now to the Figures,
FIG. 1 illustrates an exemplary integrated assistance computing platform 100, according to certain aspects of the present disclosure. Integrated assistance computing platform 100 includes integrated assistance computing system 101, user device 102, physical devices 160, sensors 170, and Internet-based services 180. Integrated assistance computing system 101 can be implemented as a mobile application on a phone or tablet, on a personal computer such as a laptop or desktop, or implemented in a physical robot. Integrated assistance computing system 101 can access physical devices 160, sensors 170, or Internet-based services 180 via a network or other connection. - User device 102 provides a point of interaction for the user. User device 102 includes
speaker 103, display 104, and microphone 105. Speaker 103 can be used by integrated assistance computing system 101 to transmit audio or speech to a user. Display 104 can display information such as status updates, news, weather, or alerts. For example, as depicted, display 104 is displaying the weather and the time. Microphone 105 can receive voice commands from a user. Integrated assistance computing system 101 can process the commands or use an external system to process commands and respond. - In response to decisions taken by autonomous agents 110-112,
system integrator 140 can reconfigure, activate, or deactivate one or more physical devices 160. Physical devices 160 include wheelchair 161, doorbell 162, appliances 163, and lights 164. For example, system integrator 140 can cause wheelchair 161 to move. System integrator 140 can receive a notification from doorbell 162 or access images from a camera installed on doorbell 162. System integrator 140 can turn on or turn off appliances 163. Appliances 163 can include microwaves, stoves, refrigerators, and the like. System integrator 140 can turn on, turn off, or dim lights 164. Integrated assistance computing system 101 can connect to physical devices 160 via a dedicated point-to-point connection, a wired connection, or a wireless connection. In an aspect, physical devices 160 can be operated by a smart home system connected to integrated assistance computing system 101 via a network connection. Other physical devices are possible such as medical alert systems, self-driving cars, or robots. For example, system integrator 140 can cause a robot to approach a user. -
Sensors 170 include motion sensor 171, light sensor 172, sound sensor 173, and temperature sensor 174. Other sensors are possible. Sensors 170 provide autonomous agents 110-112 data or events on which to base decisions and take actions. For example, sound sensor 173 provides a signal to responder agent 112 that indicates that the user has made a specific sound that identifies that the user is awake. Such sounds and their identification can be provided or learned. Temperature sensor 174 provides a signal to responder agent 112 that a residence is too hot or too cold. - Internet-based
services 180 include public services and also services that store or maintain personally identifiable information. For example, Internet-based services include social services 181 such as social media accounts, financial services 182 such as bank accounts or investment accounts, and medical services 183 such as doctors, dentists, wellness centers, or medical alert systems. By connecting to Internet-based services 180, integrated assistance computing system 101 can remind the user of upcoming appointments or of changes that may need to be made to investment plans or medical accounts, identify potential fraud, etc. - Integrated
assistance computing system 101 includes one or more autonomous agents such as a scheduler agent 110, reminder agent 111, or responder agent 112. Scheduler agent 110 performs scheduling functions such as making medical appointments, making appointments with financial advisors, or scheduling social activities. Reminder agent 111 creates alerts, such as reminding the user to take his or her medication, to alter an investment plan, or to send emails to friends and family. Responder agent 112 can respond to commands such as verbal commands received via microphone 105 or commands received via a user interface. Responder agent 112 can also play interactive games such as puzzles with a user. Responder agent 112 can also activate or deactivate physical devices 160. - While three autonomous agents 110-112 are depicted, additional autonomous agents are possible. Autonomous agents can be added to the integrated assistance computing platform via software upgrades, or implemented on remote computing systems and accessed via a data network.
- Integrated
assistance computing system 101 also includes system integrator 140, which can receive information from or control physical devices 160, sensors 170, or Internet-based services 180. Using system integrator 140, autonomous agents 110-112 can control external devices. For example, if responder agent 112 decides that the lights 164 should be turned on because light sensor 172 is receiving a signal that a light level is too dim, then responder agent 112 can send a control signal to system integrator 140, which in turn causes the lights 164 to be activated. System integrator 140 can also perform speech recognition. -
Data aggregator 150 aggregates data from one or more autonomous agents 110-112. Data aggregator 150 includes one or more data stores 151. Data aggregator 150 can perform data functions such as correlation. For example, if data aggregator 150 determines that event data received from scheduler agent 110, for example an appointment, is related to data received from reminder agent 111, then data aggregator 150 can integrate the data into a common data entry or data structure, or link the data together by using a reference. In this manner, autonomous agents 110-112 receive the benefit of utilizing correlated data aggregated from multiple autonomous agents. - For example, each autonomous agent 110-112 can access correlated data from
data aggregator 150, which in turn accesses data from physical devices 160, sensors 170, or Internet-based services 180, and take actions based on that data. For example, scheduler agent 110 can determine from a medical service 183 that a user has a doctor's appointment. Scheduler agent 110 can provide this information to machine learning model 120, which, over multiple iterations, can learn the frequency with which medical appointments are scheduled. Scheduler agent 110 can remind the user of an upcoming appointment, for example by sending a message to speaker 103 or display 104, or, if no appointment is scheduled, cause an appointment to be scheduled. - Each autonomous agent 110-112 can also access data gathered from other autonomous agents, provide that data to a machine learning model, and receive predictions thereon. For example, an autonomous agent 110-112 can access correlated data from
data aggregator 150, provide the data to one or more machine learning models, and receive predictions from the machine learning models. Each autonomous agent can receive event data from sensors 170, provide the event data to data aggregator 150, receive correlated event data from data aggregator 150, provide the correlated event data to a machine learning model, receive a prediction from the machine learning model, and take action based on the prediction. As depicted, scheduler agent 110 includes machine learning model 120, reminder agent 111 includes machine learning model 121, and responder agent 112 includes machine learning model 122. The methods used with machine learning models in each agent are discussed further with respect to FIG. 2. -
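The aggregate-and-correlate flow that the data aggregator performs can be sketched as follows. This is a minimal illustration only: the `Event` fields, the time-window heuristic for deciding that events are related, and the shared `correlation_id` reference are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class Event:
    source_agent: str                     # e.g. "scheduler", "reminder"
    kind: str                             # e.g. "appointment", "motion"
    timestamp: datetime
    correlation_id: Optional[int] = None  # set when events are linked by reference

def correlate(events: List[Event], window: timedelta) -> List[List[Event]]:
    """Group events (possibly from different agents) that occur close in time,
    then link the members of each group with a shared reference id."""
    groups: List[List[Event]] = []
    for ev in sorted(events, key=lambda e: e.timestamp):
        if groups and ev.timestamp - groups[-1][-1].timestamp <= window:
            groups[-1].append(ev)
        else:
            groups.append([ev])
    # link related events together by reference, as the text describes
    for cid, group in enumerate(groups):
        for ev in group:
            ev.correlation_id = cid
    return groups
```

An agent that later retrieves any one event can follow its `correlation_id` to the related events contributed by other agents.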
FIG. 2 illustrates an exemplary environment in which an autonomous agent can operate, according to certain aspects of the present disclosure. Environment 200 includes autonomous agent 201, sensors 170, and Internet-based services 180. Autonomous agent 201 includes one or more of reasoning module 220, actuator module 230, machine learning model 210, and data 215. Each autonomous agent can execute as a separate application on integrated assistance computing system 101, or on a separate computing system. Reasoning module 220, actuator module 230, and machine learning model 210 can execute on integrated assistance computing system 101 as separate applications or processes, or can execute on different processors. Data 215 can be used for local storage or for storage of training data used for machine learning model 210. In an aspect, machine learning model 210 can be included within reasoning module 220. - Autonomous agents can help combat senior-specific problems such as loneliness and dementia. For example, by analyzing data from multiple agents, the integrated assistance computing platform can detect whether anyone has been seen at a residence or whether the user has interacted with anyone. The integrated assistance computing platform can help combat dementia by playing games and puzzles with the user. The games and puzzles can be obtained from Internet-based
services 180. By integrating data from different autonomous agents and other sources, the scheduler agent and the reminder agent can perform more sophisticated analysis than one agent alone, can make more intelligent suggestions, and can be more useful to the user. For example, if an event indicating that the user is watching television more than usual or not moving much is correlated with the user not taking their medication on time, the autonomous agent may conclude that the user is not feeling well and take an action. - Autonomous agents can self-organize and self-register with the integrated assistance computing platform. In an aspect, the autonomous agents can use an auto-discovery function provided by integrated
assistance computing system 101. The auto-discovery function periodically checks for new autonomous agents, physical devices 160, sensors 170, or Internet-based services 180. If a new agent, device, sensor, or service is detected, then the auto-discovery function updates internal configuration tables, configures interface protocols, refreshes a network topology, and activates the new agent, device, sensor, or service. In an example, a new autonomous agent is added to the integrated assistance computing system 101. The new autonomous agent communicates its capabilities to the computing system. In turn, the computing system informs other autonomous agents of the capabilities. The platform then starts providing information to the agent, for example via data aggregator 150. - Autonomous agent 201 accesses
physical devices 160, sensors 170, and Internet-based services 180. For example, a first autonomous agent can analyze sensors in a residence such as light sensor 172 or sound sensor 173. Simultaneously, a second autonomous agent can analyze Internet-based services such as social services 181. The two agents can operate in conjunction with one another. For example, the second autonomous agent can schedule appointments or gather photos from the user's social media account. -
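The self-registration and capability-announcement sequence described above might look like the following sketch. The registry class, the method names, and the agent names are illustrative assumptions, not part of the disclosure.

```python
class AgentRegistry:
    """Tracks autonomous agents and the capabilities they announce."""

    def __init__(self):
        self.capabilities = {}  # agent name -> set of capability strings

    def register(self, name, capabilities):
        """A newly added agent announces its capabilities; returns the
        existing agents that the platform informs of the newcomer."""
        existing = [other for other in self.capabilities if other != name]
        self.capabilities[name] = set(capabilities)
        return existing

registry = AgentRegistry()
registry.register("scheduler_agent", {"schedule_appointment"})
informed = registry.register("reminder_agent", {"send_reminder"})
# informed == ["scheduler_agent"]: the platform tells the existing
# scheduler agent about the new reminder agent's capabilities
```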
Data aggregator 150 aggregates events from the first agent and the second agent. Each autonomous agent can access the aggregated data and provide the data to its respective machine learning model. Based on the data, a machine learning model can determine patterns or determine or execute rules. For example, the data aggregator 150 can aggregate and organize events that enable a machine learning model to predict when a user will wake up or go to sleep. Subsequently, an autonomous agent can take action based on the rules, such as by turning the lights on, turning the heat up, or turning the coffee maker on. - Rules can be used to adjust predicted actions. An example rule ensures that a determined action does not interfere with the user's sleep. For example, by accessing sensor data, the first autonomous agent can determine that the user is asleep. Based on a rule not to disturb the user when he or she is asleep, the first autonomous agent can delay a presentation of photos received via social media until such time that the user is awake.
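A do-not-disturb rule of this kind can be expressed as a simple guard that defers an action while the user is asleep. This is a sketch under stated assumptions: the fixed sleep window (which wraps past midnight) and the action names are hypothetical; in the system described, the asleep determination would come from sensor data or a learned pattern.

```python
from datetime import time

def apply_sleep_rule(action, now, sleep_start=time(22, 0), sleep_end=time(7, 0)):
    """Defer an action that would interfere with the user's sleep."""
    # the sleep window wraps past midnight, so the check is an OR
    asleep = now >= sleep_start or now < sleep_end
    return ("defer", action) if asleep else ("do", action)
```

For example, `apply_sleep_rule("present_photos", time(23, 30))` defers the photo presentation, while the same call at `time(9, 0)` lets it proceed.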
-
Reasoning module 220 analyzes data and determines actions, optionally in conjunction with machine learning model 210. For example, reasoning module 220 receives correlated event data from data aggregator 150 and makes decisions based on that data. Reasoning module 220 includes an inference engine that performs the data analysis or provides the data to machine learning model 210 for analysis. Machine learning model 210 receives data from reasoning module 220, predicts data patterns or events, and provides the predictions to reasoning module 220. Reasoning module 220 can include predefined or dynamic rules. A rule specifies that when a specific event occurs, a specific action is taken in response. -
Machine learning model 210 learns different data patterns. For example, sensors 170 can be used to determine when a user is waking up, going to bed, or getting ready to leave the home. Similarly, machine learning model 210 can use socialization patterns to determine the kind of interactions that the user has on a daily basis. For example, by analyzing Internet-based services 180 such as email or text messages, the autonomous agent 201 can determine how often the user is socializing and with whom. Such analysis is useful to prevent loneliness, i.e., ensuring that the user is social enough, or to prevent fraud by detecting scam messages or visits. As discussed further with respect to FIG. 5, different types of learning are possible. Actuator module 230 receives determined actions from reasoning module 220 and causes the integrated assistance computing system 101 to take the action. Actions can include issuing an alert, sending an email, turning on the lights, etc. For example, if the light sensor 172 detects light, then the reasoning module determines a significance of the event and determines that a greeting is appropriate. The actuator module 230 outputs a greeting via speaker 103. Actuator module 230 can issue alerts, for example alerting the user to take medicine, of an upcoming appointment, to update an investment plan, or of potential fraud, emailing an invitation, or notifying a caregiver or medical personnel. -
FIG. 3 depicts an example of a data integration platform for an integrated assistance computing system, according to aspects of the present disclosure. FIG. 3 depicts an exemplary integrated assistance computing system 300. Integrated assistance computing system 300 includes one or more of data aggregator 350, sensors 170, autonomous agent 312, reminder agent 311, scheduler agent 310, Internet-based services 180, and speaker 103. While integrated assistance computing system 300 is depicted with three autonomous agents, different numbers are possible. -
Data aggregator 350 includes data 360, which stores events 361-369. Data 360 is common storage accessible to multiple autonomous agents. In this manner, data aggregator 350 enables interaction and cooperation between the agents, including using data shared from external sources such as Internet-based services 180. Examples of data that can be stored by data aggregator 350 include public information such as addresses and phone numbers, news data, public records, private information such as pictures or social media posts, and learned rules, models, or algorithms. -
Data aggregator 350 can determine similarities or correlations between data received from autonomous agents. Data aggregator 350 receives events 361-369 from one or more autonomous agents, aggregates and correlates the events into groups, and provides the groups to scheduler agent 310, reminder agent 311, and autonomous agent 312. - Events 361-369 represent events captured by one or more autonomous agents. For example,
an autonomous agent can access sensors 170 and receive a detection by light sensor 172 that a light was turned on. Similarly, autonomous agent 312 can access sound sensor 173 and receive a detection of noise. Other events can include social media posts, emails, or notifications. In the aggregate, events can reveal a pattern of activity, such as when the user wakes up, what the user typically does during the day (or on a particular day of the week), and when the user typically goes to bed. - Events 361-369 are ordered chronologically but need not be. For example, events can be reordered and grouped according to other criteria, such as an identified connection with a person, place, or keyword.
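As one hedged illustration of deriving such a pattern of activity, a typical wake-up time could be estimated from aggregated morning events. The input shape (one timestamp per day for the first observed light, sound, or motion) is an assumption for the sketch, not part of the disclosure.

```python
from datetime import datetime, time
from statistics import median

def typical_wake_time(first_activity_stamps):
    """Estimate when the user usually wakes up, given the timestamp of the
    first light/sound/motion event observed on each day."""
    # convert each timestamp to minutes past midnight and take the median,
    # which is robust to an occasional unusually early or late morning
    minutes = [dt.hour * 60 + dt.minute for dt in first_activity_stamps]
    m = int(median(minutes))
    return time(m // 60, m % 60)
```

A reminder or responder agent could compare the current time against this estimate, for example to decide when a morning greeting or reminder is appropriate.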
- As depicted,
event 364 is a detection of reduced activity. Reduced activity can be determined based on a deviation from normal, such as a reduction in movement. In response, scheduler agent 310 schedules an appointment, as indicated by event 366. To do so, scheduler agent 310 accesses Internet-based services 180, for example a healthcare provider website. Once the appointment is scheduled, scheduler agent 310 stores the appointment information in data 360 so that other autonomous agents may use the information to make further predictions. At an appropriate time, reminder agent 311 sends a reminder about the appointment to the user, for example via text-to-speech via speaker 103. -
FIG. 4 depicts a flowchart of an exemplary method 400 used to operate an integrated assistance computing system, according to certain aspects of the present disclosure. FIG. 4 is discussed with respect to FIGS. 2 and 3 for example purposes, but method 400 can be implemented on other systems. - At
block 401, process 400 involves receiving, from the first sensor module, a first set of event data indicating events relating to a subject. For example, autonomous agent 312 receives events 361-365, 367, and 369 from sensors 170. Similarly, scheduler agent 310 receives events from Internet-based services 180. - At
block 402, process 400 involves providing the first set of event data to a data aggregator. For example, autonomous agent 312 provides events 361-365, 367, and 369 to data aggregator 350, which stores the events in data 360. Scheduler agent 310 provides its events to data aggregator 350, which stores the events in data 360. - At
block 403, process 400 involves receiving, from the data aggregator, correlated event data comprising events sensed by the first autonomous agent and a second autonomous agent. Continuing the example, reminder agent 311 receives event 366 from data aggregator 350. For example purposes, reminder agent 311 is shown as a different agent than autonomous agent 312, but an autonomous agent can both provide data to data aggregator 350 and receive data from data aggregator 350. - At
block 404, process 400 involves updating the first machine learning model. For example, scheduler agent 310 receives events 361-365, 367, and 369 and provides the received events to a machine learning model. - At
block 405, process 400 involves determining, via the first reasoning module and based on the first pattern of activity, that a first action should be performed, and causing the first actuator module to perform the first action. From the machine learning model, scheduler agent 310 determines that an appropriate course of action is to schedule a doctor appointment. Scheduler agent 310 connects to Internet-based services 180 and schedules the appointment. At a later time, reminder agent 311 provides a reminder of the scheduled visit via speaker 103. - Continuing the example, subsequent to the doctor appointment, indicated by
event 368, the autonomous agents continue to monitor the user's activity. Responsive to determining that the user's behavior has reverted to normal, i.e., is consistent with the pattern of events illustrated by events 361-363, an autonomous agent can determine that scheduling the doctor appointment was a valid decision. In this regard, the machine learning model receives a reward indicating that the decision taken was correct. Conversely, if the activity detected (event 369) indicates continued unusual activity, then the autonomous agent determines a different course of action and provides the machine learning model an indication of a low reward. In either case, the machine learning model can adjust internal parameters such as states, or modify previously established rules, in order to encourage (for positive feedback) or discourage (for negative feedback) the performed action. In this manner, the autonomous agents learn and improve. -
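The encourage/discourage adjustment described above can be sketched as a softmax action-preference update, one common reinforcement-learning formulation. The learning rate, the action names, and the preference representation are assumptions for this illustration, not the disclosed implementation.

```python
import math

class ActionPreferences:
    """Internal parameters that shift toward rewarded actions."""

    def __init__(self, actions, learning_rate=0.5):
        self.prefs = {a: 0.0 for a in actions}  # one preference per action
        self.learning_rate = learning_rate

    def probabilities(self):
        """Softmax over preferences: the agent's current action distribution."""
        z = sum(math.exp(v) for v in self.prefs.values())
        return {a: math.exp(v) / z for a, v in self.prefs.items()}

    def update(self, action, reward):
        """Positive reward encourages the action; negative discourages it."""
        self.prefs[action] += self.learning_rate * reward

prefs = ActionPreferences(["schedule_appointment", "do_nothing"])
# behavior reverted to normal after the appointment: high reward
prefs.update("schedule_appointment", 1.0)
```

After the positive update, `schedule_appointment` becomes more probable than `do_nothing` under `probabilities()`, mirroring how the performed action is encouraged.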
FIG. 5 depicts a flowchart of an exemplary method 500 used for reinforcement learning, according to certain aspects of the present disclosure. Method 500 can be implemented by the integrated assistance computing device or by another device. Machine learning models described herein can base predictions on data aggregated from one or more autonomous agents, one or more sensors 170, one or more physical devices 160, or one or more Internet-based services 180. With reinforcement learning, the machine learning model is continuously updated based on feedback received from a user or operator. Feedback can be provided via a user interface, via text-to-speech, or by another means. - At
block 501, process 500 involves receiving, from a data aggregator, data including correlated events. The data can include events that originated from different autonomous agents, such as reminder agent 111 or responder agent 112. As shown in FIG. 3, data aggregator 350 determines that two or more events are related, groups the events, and makes the correlated events available to the autonomous agents. - At
block 502, process 500 involves initiating an action based on a prediction from a machine learning model. In an example, integrated assistance computing system 101 schedules an appointment for the user, such as a doctor's appointment. - At
block 503, process 500 involves receiving user feedback indicating whether the action is desirable or undesirable. Feedback can be provided in different manners. For example, feedback can be user-based, e.g., via a user interface or voice command. Integrated assistance computing system 101 can receive feedback from a user indicating whether the user considers the scheduled appointment to be useful. Feedback can also be inferred from a detection that an unusual pattern of activity is continuing. For example, if an unusual pattern of activity (e.g., shown by event 364) does not improve, then a low reward or negative feedback is inferred. Integrated assistance computing system 101 provides the feedback to the machine learning model. - At
block 504, process 500 involves updating internal parameters of the machine learning model. The parameters are updated so as to maximize the reward. More specifically, the machine learning model associates attributes or parameters of the scheduled event with the user's feedback such that the event is more likely to be generated if the event received positive feedback, and less likely to be generated if the event received negative feedback. In this manner, the machine learning model maximizes the reward and improves over iterations. - Machine learning models described herein can also use supervised or unsupervised learning. Using supervised learning, a set of training data with known positive and negative cases is provided to the
machine learning model 210 in order to train the model to make predictions. Training can be performed before the integrated assistance computing system is provided to a user, for example at the factory. In an example, supervised learning can be used to train the machine learning model 210 to predict when a user will wake up by providing patterns that include the identification of sound, light, or movement. Using unsupervised learning, machine learning model 210 learns and adapts over time. In this case, training is performed with live user data, for example when the user interacts with the device. -
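A toy supervised example in this spirit: a nearest-centroid classifier trained on labeled sound/light/movement samples to predict whether the user is awake. The feature layout, labels, and sample values are illustrative assumptions; the disclosure does not specify a particular model, and a production model would differ.

```python
def train_centroids(samples):
    """samples: list of (features, label) pairs with known positive and
    negative cases, e.g. collected at the factory."""
    sums, counts = {}, {}
    for feats, label in samples:
        s = sums.setdefault(label, [0.0] * len(feats))
        for i, f in enumerate(feats):
            s[i] += f
        counts[label] = counts.get(label, 0) + 1
    # the centroid of each class is the mean of its feature vectors
    return {label: [v / counts[label] for v in s] for label, s in sums.items()}

def predict(centroids, feats):
    """Classify a new observation by its nearest class centroid."""
    def sq_dist(c):
        return sum((a - b) ** 2 for a, b in zip(c, feats))
    return min(centroids, key=lambda label: sq_dist(centroids[label]))

# features: [sound level, light level, movement], each normalized to 0..1
data = [([0.9, 0.8, 0.7], "awake"), ([0.8, 0.9, 0.6], "awake"),
        ([0.1, 0.0, 0.1], "asleep"), ([0.2, 0.1, 0.0], "asleep")]
centroids = train_centroids(data)
```

Once trained, the model labels a fresh sensor reading such as `[0.85, 0.8, 0.5]` as "awake", which an agent could use before deciding whether to speak a greeting.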
FIG. 6 depicts an example of a computing system 600 for implementing an integrated assistance computing system, according to certain aspects of the present disclosure. The implementation of computing system 600 could be used for one or more of scheduler agent 110, reminder agent 111, responder agent 112, another autonomous agent, system integrator 140, or data aggregator 150. - The depicted example of a
computing system 600 includes a processor 602 communicatively coupled to one or more memory devices 604. The processor 602 executes computer-executable program code stored in a memory device 604, accesses information stored in the memory device 604, or both. Examples of the processor 602 include a microprocessor, an application-specific integrated circuit ("ASIC"), a field-programmable gate array ("FPGA"), or any other suitable processing device. The processor 602 can include any number of processing devices, including a single processing device. - A
memory device 604 includes any suitable non-transitory computer-readable medium for storing program code 605, program data 607, or both. Program code 605 and program data 607 can be from scheduler agent 110, reminder agent 111, responder agent 112, another autonomous agent, system integrator 140, or data aggregator 150, or any other applications or data described herein. A computer-readable medium can include any electronic, optical, magnetic, or other storage device capable of providing a processor with computer-readable instructions or other program code. Non-limiting examples of a computer-readable medium include a magnetic disk, a memory chip, a ROM, a RAM, an ASIC, optical storage, magnetic tape or other magnetic storage, or any other medium from which a processing device can read instructions. The instructions may include processor-specific instructions generated by a compiler or an interpreter from code written in any suitable computer-programming language, including, for example, C, C++, C#, Visual Basic, Java, Python, Perl, JavaScript, and ActionScript. - The
computing system 600 may also include a number of external or internal devices, an input device 620, a presentation device 618, or other input or output devices. For example, the computing system 600 is shown with one or more input/output ("I/O") interfaces 608. An I/O interface 608 can receive input from input devices or provide output to output devices. One or more buses 606 are also included in the computing system 600. The bus 606 communicatively couples one or more components of the computing system 600. - The
computing system 600 executes program code 605 that configures the processor 602 to perform one or more of the operations described herein. Examples of the program code 605 include, in various aspects, modeling algorithms executed by scheduler agent 110, reminder agent 111, responder agent 112, another autonomous agent, system integrator 140, or data aggregator 150, or other suitable applications that perform one or more operations described herein. The program code may be resident in the memory device 604 or any suitable computer-readable medium and may be executed by the processor 602 or any other suitable processor. - In some aspects, one or
more memory devices 604 store program data 607 that includes one or more datasets and models described herein. Examples of these datasets include interaction data, environment metrics, training interaction data or historical interaction data, transition importance data, etc. In some aspects, one or more of the data sets, models, and functions are stored in the same memory device (e.g., one of the memory devices 604). In additional or alternative aspects, one or more of the programs, data sets, models, and functions described herein are stored in different memory devices 604 accessible via a data network. - In some aspects, the
computing system 600 also includes a network interface device 610. The network interface device 610 includes any device or group of devices suitable for establishing a wired or wireless data connection to one or more data networks. Non-limiting examples of the network interface device 610 include an Ethernet network adapter, a modem, and/or the like. The computing system 600 is able to communicate with one or more other computing devices via a data network using the network interface device 610. - In some aspects, the
computing system 600 also includes the input device 620 and the presentation device 618 depicted in FIG. 6. An input device 620 can include any device or group of devices suitable for receiving visual, auditory, or other suitable input that controls or affects the operations of the processor 602. Non-limiting examples of the input device 620 include a touchscreen, a mouse, a keyboard, a microphone, a separate mobile computing device, etc. A presentation device 618 can include any device or group of devices suitable for providing visual, auditory, or other suitable sensory output. Non-limiting examples of the presentation device 618 include a touchscreen, a monitor, a speaker, a separate mobile computing device, etc. Presentation device 618 can implement functionality of display 104. In addition, presentation device 618 can display user interface elements, such as sliders or controls. - Although
FIG. 6 depicts the input device 620 and the presentation device 618 as being local to the computing device that executes scheduler agent 110, reminder agent 111, responder agent 112, another autonomous agent, system integrator 140, or data aggregator 150, other implementations are possible. For instance, in some aspects, one or more of the input device 620 and the presentation device 618 can include a remote client-computing device that communicates with the computing system 600 via the network interface device 610 using one or more data networks described herein. - While the present subject matter has been described in detail with respect to specific aspects thereof, it will be appreciated that those skilled in the art, upon attaining an understanding of the foregoing, may readily produce alterations to, variations of, and equivalents to such aspects. Accordingly, it should be understood that the present disclosure has been presented for purposes of example rather than limitation and does not preclude inclusion of such modifications, variations, and/or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art.
Claims (20)
1. A device comprising:
a processing system including a processor; and
a memory that stores executable instructions that, when executed by the processing system, facilitate performance of operations, the operations comprising:
receiving, at a first autonomous agent of a plurality of autonomous agents, a first set of event data indicating events relating to a subject, wherein each of the plurality of autonomous agents includes a respective machine learning model;
providing the first set of event data to a data aggregator that also receives a second set of event data relating to the subject from a second autonomous agent of the plurality of autonomous agents;
receiving, from the data aggregator, correlated event data comprising the first set of event data correlated with the second set of event data; and
predicting a first pattern of activity of the subject by applying a first machine learning model of the first autonomous agent to the correlated event data.
2. The device of claim 1 , wherein the operations further comprise:
applying the first set of event data to the first machine learning model to update the first machine learning model.
3. The device of claim 1 , wherein the operations further comprise:
applying the correlated event data to the first machine learning model to update the first machine learning model.
4. The device of claim 1 , wherein the operations further comprise:
receiving, at the first autonomous agent, additional event data from an internet-based service; and
providing the additional event data to the data aggregator.
5. The device of claim 4, wherein the correlated event data received from the data aggregator comprises the first set of event data correlated with the second set of event data and correlated with the additional event data.
6. The device of claim 1 , wherein the operations further comprise:
receiving, at the first autonomous agent, a voice command;
applying the voice command to the first machine learning model to identify an action to be taken by the subject; and
providing instructions for the subject to take the action.
7. The device of claim 1 , wherein the providing the first set of event data to the data aggregator comprises providing the first set of event data to a second device that includes the data aggregator.
8. The device of claim 1 , wherein the first autonomous agent comprises a scheduling agent, wherein the first pattern of activity comprises scheduling an appointment.
9. The device of claim 1 , wherein the data aggregator correlates a plurality of events from the first set of event data and the second set of event data from the second autonomous agent.
10. A non-transitory machine-readable medium, comprising executable instructions that, when executed by a processing system including a processor, facilitate performance of operations, the operations comprising:
receiving, at a first autonomous agent of a plurality of autonomous agents, a first set of event data indicating events relating to a subject, wherein each of the plurality of autonomous agents includes a respective machine learning model;
providing the first set of event data to a data aggregator that also receives a second set of event data relating to the subject from a second autonomous agent of the plurality of autonomous agents;
receiving, from the data aggregator, correlated event data comprising the first set of event data correlated with the second set of event data; and
predicting a first pattern of activity of the subject by applying a first machine learning model of the first autonomous agent to the correlated event data.
11. The non-transitory machine-readable medium of claim 10 , further comprising:
applying the first set of event data to the first machine learning model to update the first machine learning model.
12. The non-transitory machine-readable medium of claim 10 , further comprising:
applying the correlated event data to the first machine learning model to update the first machine learning model.
13. The non-transitory machine-readable medium of claim 10 , further comprising:
receiving, at the first autonomous agent, additional event data from an internet-based service; and
providing the additional event data to the data aggregator.
14. The non-transitory machine-readable medium of claim 13, wherein the correlated event data received from the data aggregator comprises the first set of event data correlated with the second set of event data and correlated with the additional event data.
15. The non-transitory machine-readable medium of claim 10 , further comprising:
receiving, at the first autonomous agent, a voice command;
applying the voice command to the first machine learning model to identify an action to be taken by the subject; and
providing instructions for the subject to take the action.
16. The non-transitory machine-readable medium of claim 10 , wherein the first autonomous agent comprises a scheduling agent, wherein the first pattern of activity comprises scheduling an appointment.
17. A method comprising:
receiving, by a processing system including a processor, at a first autonomous agent of a plurality of autonomous agents, a first set of event data indicating events relating to a subject, wherein each of the plurality of autonomous agents includes a respective machine learning model;
providing, by the processing system, the first set of event data to a data aggregator that also receives a second set of event data relating to the subject from a second autonomous agent of the plurality of autonomous agents;
receiving, by the processing system, from the data aggregator, correlated event data comprising the first set of event data correlated with the second set of event data; and
predicting, by the processing system, a first pattern of activity of the subject by applying a first machine learning model of the first autonomous agent to the correlated event data.
18. The method of claim 17, further comprising:
applying, by the processing system, the first set of event data to the first machine learning model to update the first machine learning model.
19. The method of claim 17, further comprising:
applying, by the processing system, the correlated event data to the first machine learning model to update the first machine learning model.
20. The method of claim 17, further comprising:
receiving, by the processing system, at the first autonomous agent, additional event data from an internet-based service; and
providing, by the processing system, the additional event data to the data aggregator.
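The method of claims 17-20 can be sketched in a few lines: each autonomous agent forwards its event data to a shared data aggregator, receives back the correlated event data, and applies its own model to predict a pattern of activity for the subject. This is an illustrative sketch only; the class names, the timestamp-merge "correlation," and the most-frequent-event-type stand-in for the machine learning model are all assumptions, not the claimed implementation.

```python
from dataclasses import dataclass, field

@dataclass
class DataAggregator:
    # agent name -> that agent's set of event data
    events: dict = field(default_factory=dict)

    def submit(self, agent_name, event_data):
        self.events[agent_name] = list(event_data)

    def correlated(self):
        # Stand-in "correlation": merge every agent's events, ordered by time.
        merged = [e for evs in self.events.values() for e in evs]
        return sorted(merged, key=lambda e: e["time"])

@dataclass
class AutonomousAgent:
    name: str
    aggregator: DataAggregator

    def observe(self, event_data):
        # Claim step: provide this agent's set of event data to the aggregator.
        self.aggregator.submit(self.name, event_data)

    def predict_pattern(self):
        # Claim step: apply this agent's model to the correlated event data.
        # Stand-in "model": report the most frequent event type across agents.
        data = self.aggregator.correlated()
        types = [e["type"] for e in data]
        return max(set(types), key=types.count)

agg = DataAggregator()
scheduler = AutonomousAgent("scheduler", agg)
monitor = AutonomousAgent("monitor", agg)
scheduler.observe([{"time": 1, "type": "appointment"},
                   {"time": 4, "type": "appointment"}])
monitor.observe([{"time": 2, "type": "medication"}])
print(scheduler.predict_pattern())  # prints "appointment"
```

Note that each agent keeps its own model while sharing one aggregator, which mirrors the claims' structure in which every autonomous agent of the plurality includes a respective machine learning model yet predicts from the jointly correlated event data.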
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/513,410 US20220051073A1 (en) | 2018-06-26 | 2021-10-28 | Integrated Assistance Platform |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/018,772 US11188810B2 (en) | 2018-06-26 | 2018-06-26 | Integrated assistance platform |
US17/513,410 US20220051073A1 (en) | 2018-06-26 | 2021-10-28 | Integrated Assistance Platform |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/018,772 Continuation US11188810B2 (en) | 2018-06-26 | 2018-06-26 | Integrated assistance platform |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220051073A1 true US20220051073A1 (en) | 2022-02-17 |
Family
ID=68980717
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/018,772 Active 2040-08-03 US11188810B2 (en) | 2018-06-26 | 2018-06-26 | Integrated assistance platform |
US17/513,410 Abandoned US20220051073A1 (en) | 2018-06-26 | 2021-10-28 | Integrated Assistance Platform |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/018,772 Active 2040-08-03 US11188810B2 (en) | 2018-06-26 | 2018-06-26 | Integrated assistance platform |
Country Status (1)
Country | Link |
---|---|
US (2) | US11188810B2 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160232454A1 (en) * | 2015-02-11 | 2016-08-11 | International Business Machines Corporation | Identifying home automation correlated events and creating portable recipes |
US10270668B1 (en) * | 2015-03-23 | 2019-04-23 | Amazon Technologies, Inc. | Identifying correlated events in a distributed system according to operational metrics |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130158368A1 (en) * | 2000-06-16 | 2013-06-20 | Bodymedia, Inc. | System for monitoring and managing body weight and other physiological conditions including iterative and personalized planning, intervention and reporting capability |
US20020128746A1 (en) | 2001-02-27 | 2002-09-12 | International Business Machines Corporation | Apparatus, system and method for a remotely monitored and operated avatar |
EP1741044B1 (en) | 2004-03-27 | 2011-09-14 | Harvey Koselka | Autonomous personal service robot |
WO2007041295A2 (en) | 2005-09-30 | 2007-04-12 | Irobot Corporation | Companion robot for personal interaction |
CN101087395B (en) | 2007-05-31 | 2010-12-29 | 西安电子科技大学 | Built-in human-machine interaction assistance system |
KR20120061688A (en) | 2010-12-03 | 2012-06-13 | 한국기술교육대학교 산학협력단 | Apparatus and control method for robot of old people's welfare support |
US8406926B1 (en) | 2011-05-06 | 2013-03-26 | Google Inc. | Methods and systems for robotic analysis of environmental conditions and response thereto |
KR101330046B1 (en) | 2011-09-19 | 2013-11-18 | 한국산업기술대학교산학협력단 | System for assisting elderly memory using walking assistant robot for life support of elderly, and method of assisting elderly memory using the same |
US20140125678A1 (en) | 2012-07-11 | 2014-05-08 | GeriJoy Inc. | Virtual Companion |
US20150294595A1 (en) | 2012-10-08 | 2015-10-15 | Lark Technologies, Inc. | Method for providing wellness-related communications to a user |
CN103876838A (en) | 2012-12-24 | 2014-06-25 | 北京格瑞图科技有限公司 | Intellisensive elderly assistant care system |
US20150314454A1 (en) | 2013-03-15 | 2015-11-05 | JIBO, Inc. | Apparatus and methods for providing a persistent companion device |
US20170206064A1 (en) | 2013-03-15 | 2017-07-20 | JIBO, Inc. | Persistent companion device configuration and deployment platform |
JP2016522465A (en) | 2013-03-15 | 2016-07-28 | ジボ インコーポレイテッド | Apparatus and method for providing a persistent companion device |
US10417552B2 (en) | 2014-03-25 | 2019-09-17 | Nanyang Technological University | Curiosity-based emotion modeling method and system for virtual companions |
CN103996155A (en) | 2014-04-16 | 2014-08-20 | 深圳市易特科信息技术有限公司 | Intelligent interaction and psychological comfort robot service system |
US10331095B2 (en) * | 2014-04-29 | 2019-06-25 | Cox Communications | Systems and methods for development of an automation control service |
CN103984315A (en) | 2014-05-15 | 2014-08-13 | 成都百威讯科技有限责任公司 | Domestic multifunctional intelligent robot |
US9375845B1 (en) | 2014-09-30 | 2016-06-28 | Sprint Communications Company, L.P. | Synchronizing robot motion with social interaction |
JP6530906B2 (en) | 2014-11-28 | 2019-06-12 | マッスル株式会社 | Partner robot and its remote control system |
CN104618464A (en) | 2015-01-16 | 2015-05-13 | 中国科学院上海微系统与信息技术研究所 | Internet-of-things-based smart home care service system |
US20160314185A1 (en) | 2015-04-27 | 2016-10-27 | Microsoft Technology Licensing, Llc | Identifying events from aggregated device sensed physical data |
CN104965426A (en) | 2015-06-24 | 2015-10-07 | 百度在线网络技术(北京)有限公司 | Intelligent robot control system, method and device based on artificial intelligence |
CN104951077A (en) | 2015-06-24 | 2015-09-30 | 百度在线网络技术(北京)有限公司 | Man-machine interaction method and device based on artificial intelligence and terminal equipment |
US9724824B1 (en) | 2015-07-08 | 2017-08-08 | Sprint Communications Company L.P. | Sensor use and analysis for dynamic update of interaction in a social robot |
CN105137828B (en) | 2015-07-31 | 2018-04-06 | 佛山市父母通智能机器人有限公司 | A kind of old man's Intelligent life self-aid system based on Internet of Things |
US10664572B2 (en) | 2015-08-06 | 2020-05-26 | Microsoft Technology Licensing, Llc | Recommendations for health benefit resources |
KR20170022717A (en) | 2015-08-21 | 2017-03-02 | 최성 | Pet robot that elderly people get around just in time to eat |
US10452816B2 (en) | 2016-02-08 | 2019-10-22 | Catalia Health Inc. | Method and system for patient engagement |
GB201613138D0 (en) | 2016-07-29 | 2016-09-14 | Unifai Holdings Ltd | Computer vision systems |
US20180322413A1 (en) * | 2017-05-08 | 2018-11-08 | T-Mobile Usa, Inc. | Network of autonomous machine learning vehicle sensors |
- 2018-06-26 US US16/018,772 patent/US11188810B2/en active Active
- 2021-10-28 US US17/513,410 patent/US20220051073A1/en not_active Abandoned
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160232454A1 (en) * | 2015-02-11 | 2016-08-11 | International Business Machines Corporation | Identifying home automation correlated events and creating portable recipes |
US10270668B1 (en) * | 2015-03-23 | 2019-04-23 | Amazon Technologies, Inc. | Identifying correlated events in a distributed system according to operational metrics |
Also Published As
Publication number | Publication date |
---|---|
US20190392286A1 (en) | 2019-12-26 |
US11188810B2 (en) | 2021-11-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11869328B2 (en) | Sensing peripheral heuristic evidence, reinforcement, and engagement system | |
US11307892B2 (en) | Machine-learning for state determination and prediction | |
US20210077036A1 (en) | Voice controlled assistance for monitoring adverse events of a user and/or coordinating emergency actions such as caregiver communication | |
Kim et al. | Emergency situation monitoring service using context motion tracking of chronic disease patients | |
US20160180222A1 (en) | Intelligent Personal Agent Platform and System and Methods for Using Same | |
EP3469496A1 (en) | Situation forecast mechanisms for internet of things integration platform | |
US20230237059A1 (en) | Managing engagement methods of a digital assistant while communicating with a user of the digital assistant | |
US20230148909A1 (en) | Daily living monitoring and management system | |
CN107431649A (en) | For the generation and realization of resident family's strategy of intelligent household | |
KR101842963B1 (en) | A system for user-robot interaction, and information processing method for the same | |
Wang et al. | Towards intelligent caring agents for aging-in-place: Issues and challenges | |
WO2018152365A1 (en) | Activity monitoring system | |
Mao et al. | Review of cross-device interaction for facilitating digital transformation in smart home context: a user-centric perspective | |
US11941506B2 (en) | System and method for monitoring via smart devices | |
US20220051073A1 (en) | Integrated Assistance Platform | |
US20220230753A1 (en) | Techniques for executing transient care plans via an input/output device | |
US20210012881A1 (en) | Systems, methods and apparatus for treatment protocols | |
Haigh et al. | Agents for recognizing and responding to the behaviour of an elder | |
US20190362858A1 (en) | Systems and methods for monitoring remotely located individuals | |
KR20180046124A (en) | System, method and program for analyzing user trait | |
US20230245771A1 (en) | Dynamic home themes for assisted early detection and treatment in healthcare | |
US11418358B2 (en) | Smart device active monitoring | |
US20230044000A1 (en) | System and method using ai medication assistant and remote patient monitoring (rpm) devices | |
JP2023059602A (en) | Program, information processing method, and information processing apparatus | |
Awada et al. | An Integrated System for Improved Assisted Living of Elderly People |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
AS | Assignment |
Owner name: AT&T INTELLECTUAL PROPERTY I, L.P., GEORGIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHEN, CHUXIN;DOME, GEORGE;OETTING, JOHN;SIGNING DATES FROM 20180620 TO 20180621;REEL/FRAME:058620/0411 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |