EP3868135A1 - Saving battery life using an inferred location - Google Patents

Saving battery life using an inferred location

Info

Publication number
EP3868135A1
Authority
EP
European Patent Office
Prior art keywords
user
location
visit
prediction
inference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP19836704.7A
Other languages
German (de)
English (en)
Inventor
Ido Priness
Sagi Hilleli
Jonathan Rabin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US16/194,611 (US20190090197A1)
Application filed by Microsoft Technology Licensing LLC
Publication of EP3868135A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 52/00: Power management, e.g. TPC [Transmission Power Control], power saving or power classes
    • H04W 52/02: Power saving arrangements
    • H04W 52/0209: Power saving arrangements in terminal devices
    • H04W 52/0251: Power saving arrangements in terminal devices using monitoring of local events, e.g. events related to user activity
    • H04W 52/0254: Power saving arrangements in terminal devices using monitoring of local events, e.g. events related to user activity, detecting a user operation or a tactile contact or a motion of the device
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02: Services making use of location information
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00: Reducing energy consumption in communication networks
    • Y02D 30/70: Reducing energy consumption in communication networks in wireless communication networks

Definitions

  • A majority of mobile computing devices, such as cell phones, are equipped with a variety of applications that enhance, in some respect, the use of the mobile device.
  • Many of these applications, or even the operating system for the device itself, use location information as part of the application or service.
  • This location information is typically provided using calculations with known cell tower or WiFi locations, or through global positioning system (GPS) receivers inside the device.
  • One of the main issues with the use of location services is power consumption. In other words, conventional location services can drain the battery on a user's mobile device more quickly than desired, frustrating the user.
  • Embodiments described in this disclosure are directed towards systems and methods for improving the operation of a mobile device or the user experience, such as prolonging the battery life, by providing inferred current and/or future location information for a user.
  • embodiments may determine a likely current and/or future location (or likely sequence(s) of locations) for a user. Some embodiments may further predict related contextual or semantic information, such as how long a user will likely stay, or other contextual information.
  • an inference or prediction of one or more current and/or future semantic locations and corresponding confidences may be determined and may be used by or provided to services, applications or operating systems on a user’s mobile device. Further, in some cases, the location inferences may be adjusted or updated based on a context of a current visit (current context). In this way, embodiments of the disclosure are able to provide an inferred location without using conventional location services that would otherwise drain the device’s battery.
  • FIG. 1 is a block diagram of an example operating environment suitable for implementing aspects of the disclosure.
  • FIG. 2 is a diagram depicting an example computing architecture suitable for implementing aspects of the disclosure.
  • FIGS. 3A-3D depict an example process flow for determining inferred or predicted location information for a user, in accordance with an embodiment of the disclosure.
  • FIGS. 4-5 depict flow diagrams of methods for determining inferred or predicted semantic location information for a user, in accordance with an embodiment of the disclosure.
  • FIG. 6 is a block diagram of an exemplary computing environment suitable for use in implementing an embodiment of the disclosure.
  • Various aspects of the technology described herein are directed towards systems, methods, and computer storage media for, among other things, improving the operation of a mobile device or the user experience, such as by prolonging the battery life, by providing inferred current and/or future semantic location information about a user.
  • embodiments may provide a likely current and/or future location for a user, which may include contextual or semantic information.
  • the semantic information may include arrival time, length of stay, or departure time from the current location, other people known by the user (e.g., social network contacts, co-workers, business contacts, or family members) who are likely to also be at the future location at the same time as the user, or other information related to the current and/or future location visit.
  • the inferred location or location prediction information may be provided to an operating system or application or service such as a personal digital assistant service (or virtual assistant) associated with the user, or may be provided as an application programming interface (API) to facilitate consumption of the inferred location or location prediction information by a computing application or service.
  • embodiments of the disclosure are able to provide an inferred current user location or predicted future user location without using the conventional location services of prior approaches.
  • embodiments of the disclosure are adaptive and capable of accounting for external or explicit information, such as information from a camera, microphone, or other sensor regarding a user's location, or information from user communications or user activity (which may include out-of-routine activity of a user), when determining an inferred current or predicted future semantic location.
  • some embodiments leverage the most relevant information about the user, such as user behavior patterns and visit patterns, to determine an inferred current location or predicted future location, without using explicit location signals, such as GPS.
  • some embodiments of the disclosure provide a corresponding context of the inferred current or the future location, such as the semantic location features described herein (e.g., arrival time, length of stay).
  • information about the user’s current visit to a location may be used with historical observations data about the user, and explicit information related to the user, such as expected user events (e.g., an expected flight), and/or other lasting information or ephemeral information (e.g., holidays and traffic, respectively) to determine an inference of a current location or a prediction of one or more future locations that the user will likely visit.
  • a semantic location may comprise a location that the user is expected to visit next, after the currently visited semantic location (or at a future time, following visiting the current inferred location), or also may comprise a series or sequence of future semantic locations expected to be visited by the user.
  • the current and/or future semantic location information may indicate that a user will most likely go to the gym, then the store, and then to her house.
  • the prediction may also include related contextual information, such as when the user is likely to leave the inferred current location for the predicted future location, when the user is likely to arrive at a predicted future location, or how long the user will likely stay.
  • This inferred current or predicted future location information may be provided to operating systems, applications or other services that consume or use user location information.
  • a corresponding confidence value also may be determined for each predicted location or for a sequence, and may correspond to alternative predicted or inferred locations.
  • an embodiment may determine that the user has a sixty percent likelihood of being at a restaurant.
  • an embodiment may determine that the user has an eighty percent likelihood that a next or future location visited will be the gym and a twenty percent likelihood that the next location visited by the user will be the user’s home.
  • the confidence may indicate a likelihood associated with a particular category of venue (or activity, event, purpose, or type of visit) and/or a specific venue (or specific activity, event, purpose, or type of visit).
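  • As a rough, non-authoritative illustration of how such an inference with per-candidate confidences might be represented for a consumer, the following Python sketch uses assumed names (SemanticLocationPrediction, most_likely); nothing here is prescribed by the description above.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class SemanticLocationPrediction:
    """Hypothetical container for an inferred current or predicted future location."""
    kind: str                                    # "current" or "future"
    candidates: Dict[str, float] = field(default_factory=dict)  # semantic location -> confidence
    venue_category: Optional[str] = None         # e.g., "restaurant"
    specific_venue: Optional[str] = None         # e.g., a specific coffee house

    def most_likely(self) -> Optional[str]:
        """Return the candidate with the highest confidence, if any."""
        return max(self.candidates, key=self.candidates.get) if self.candidates else None

# Example mirroring the description: 80% gym vs. 20% home for the next location.
next_location = SemanticLocationPrediction(kind="future", candidates={"gym": 0.8, "home": 0.2})
assert next_location.most_likely() == "gym"
```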
  • Some embodiments of this disclosure may utilize a set of one or more predictor programs or routines (sometimes referred to as "predictors") for determining an inferred current location and/or a predicted future location and then utilize a selection process for determining which predictions (provided by the set of predictor(s)) should be used. Additionally, in some embodiments, explicit information may be determined that can impact the user's inferred current location and/or predicted future location(s), including context (or features) associated with the future semantic location such as arrival time or length of stay.
  • such explicit information may include external signals or events associated with the user (e.g., flights, travel, vacation), user calendar information (e.g., appointments, meetings, out-of-office indications), out-of-routine information for the user, or other external information as described herein, such as lasting information (e.g., holidays) or ephemeral information (e.g., traffic, weather, temporary incidents such as closures).
  • semantic location information is used broadly herein and may include geographical location information as well as contextual information, such as the purpose or activity of a user’s visit to the location, whether the user is visiting a venue at the location or just passing by the location, a specific venue that the user is visiting (e.g., not just a shopping center, but a specific coffee house in the shopping center), the arrival time, and/or length of stay.
  • semantic information may include information about a user's activities or venue rather than merely their geographic location; for instance, it may be determined that a user who has been in the geographical location of a movie theater for several hours has watched a movie.
  • current context is used broadly herein to include information associated with the user’s current semantic location or current visit, as well as other information related to the current visit.
  • current context may include current visit information as well as context features or contextual information associated with the current visit, such as the user's arrival time to the current location; the location(s) of the user (including geographic location or semantic location) prior to the current location (if known) (i.e., where the user came from); date, time, day of the week information; user activity detected in connection to the current visit or previous location(s) visited prior to the current location; or other people the user knows who are determined to be at the current location or previous location(s), for example.
  • a predicted semantic location (or a next location or future location) of a user may comprise a semantic location visited immediately following the current inferred semantic location or a location that may be visited subsequent to the current inferred location, but not necessarily immediately following the current inferred location.
  • the sequence may comprise an ordered set of locations (for example, the user is likely to go to place A followed by place B followed by place C), or a series of future locations that are likely to occur, but in which there may be other locations in the series that are not in the predicted series; for example, the user is next likely to visit place A and then place B and then place C, but the user may visit place X between places A and B or B and C.
  • some embodiments may update a predicted future semantic location (or sequence) upon determining or inferring that the user has visited place X.
  • user data is received from one or more data sources.
  • the user data may be received by collecting user data with one or more sensors or components on user computing device(s) associated with a user. Examples of user data, which is further described in connection to user-data collection component 210 of FIG. 2, may include user-activity information (e.g., app usage, online activity, searches, calls, or other user device interactions), application data, contacts data, calendar and social network data, or nearly any other source of user-related data that may be sensed or determined by a user device or other computing device (except actual location data from a GPS receiver or other location service). As further described herein, the received user data may be used for determining a current context of the current visit.
  • user data first may be used to determine information about a current visit, which may include semantic location information associated with the current visit, and from the current visit information, a current context may be determined.
  • user location history information from previous visits to the inferred current location, as well as received user data also may be used to facilitate determining the inferred current location or predicted future locations, as described herein.
  • Information regarding one or more previous visits may be determined based in part on information from the current context.
  • historic location information (which may be provided from historic location data 243 of FIG. 2) associated with the user may be used to identify a set of historical visits of the user to the possible current location.
  • Information from these historical visits, features of the visits, and information from the current context or of the possible current location then may be provided to one or more predictors for determining a set of history-based predictions about a user's current location(s). From among the set of history-based predictions, one or more likely history-based prediction(s) may be determined. For example, in one embodiment, the history-based prediction(s) may be selected based on confidence values (or a confidence "score") associated with each prediction.
  • Explicit information that can impact the inferred current location and/or predicted future location(s) is also determined, in some embodiments, and reconciled with the set of determined history-based prediction(s).
  • the explicit information is conflated with the history-based prediction(s) in order to determine an inferred user location or prediction (or predictions).
  • Information about the inferred user location may be provided to a computer application or service, such as a virtual assistant service associated with the user, or may be provided in connection with an API to facilitate consumption of the inferred user location or prediction information by a computer application or service. Further, some embodiments may be carried out by a virtual assistant application or service, which may be implemented as one or more computer applications, services, or routines, such as an app running on a mobile device and/or in the cloud, as further described herein.
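  • A minimal publish/subscribe sketch of how inferred-location information might be exposed to consumers such as a virtual assistant instead of having each consumer poll GPS; the function names register_inferred_location_consumer and publish_inferred_location are assumptions, not an API defined by the disclosure.

```python
from typing import Callable, Dict, List

InferredLocation = Dict[str, float]   # e.g., {"gym": 0.8, "home": 0.2}

_consumers: List[Callable[[InferredLocation], None]] = []

def register_inferred_location_consumer(callback: Callable[[InferredLocation], None]) -> None:
    """Register an application, service, or OS component as an inferred-location consumer."""
    _consumers.append(callback)

def publish_inferred_location(inference: InferredLocation) -> None:
    """Push a new inference, with per-candidate confidences, to every registered consumer."""
    for callback in _consumers:
        callback(inference)

# Usage: a virtual assistant subscribes and reacts to inference updates.
register_inferred_location_consumer(lambda inf: print("inferred location:", inf))
publish_inferred_location({"gym": 0.8, "home": 0.2})
```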
  • Referring to FIG. 1, a block diagram is provided showing an example operating environment 100 in which some embodiments of the present disclosure may be employed. It should be understood that this and other arrangements described herein are set forth only as examples. Other arrangements and elements (e.g., machines, interfaces, functions, orders, and groupings of functions) can be used in addition to or instead of those shown, and some elements may be omitted altogether for the sake of clarity. Further, many of the elements described herein are functional entities that may be implemented as discrete or distributed components or in conjunction with other components, and in any suitable combination and location. Various functions described herein as being performed by one or more entities may be carried out by hardware, firmware, and/or software. For instance, some functions may be carried out by a processor executing instructions stored in memory.
  • example operating environment 100 includes a number of user devices, such as user devices 102a and 102b through 102n; a number of data sources, such as data sources 104a and 104b through 104n; server 106; sensors 103a and 107; and network 110.
  • environment 100 shown in FIG. 1 is an example of one suitable operating environment.
  • Each of the components shown in FIG. 1 may be implemented via any type of computing device, such as computing device 600 described in connection to FIG. 6, for example.
  • These components may communicate with each other via network 110, which may include, without limitation, one or more local area networks (LANs) and/or wide area networks (WANs).
  • network 110 comprises the Internet and/or a cellular network, amongst any of a variety of possible public and/or private networks.
  • any number of user devices, servers, and data sources may be employed within operating environment 100 within the scope of the present disclosure. Each may comprise a single device or multiple devices cooperating in a distributed environment. For instance, server 106 may be provided via multiple devices arranged in a distributed environment that collectively provide the functionality described herein. Additionally, other components not shown may also be included within the distributed environment.
  • User devices 102a and 102b through 102n can be client devices on the client- side of operating environment 100, while server 106 can be on the server-side of operating environment 100.
  • Server 106 can comprise server-side software designed to work in conjunction with client-side software on user devices 102a and 102b through 102n so as to implement any combination of the features and functionalities discussed in the present disclosure.
  • This division of operating environment 100 is provided to illustrate one example of a suitable environment, and there is no requirement for each implementation that any combination of server 106 and user devices 102a and 102b through 102n remain as separate entities.
  • User devices 102a and 102b through 102n may comprise any type of computing device capable of use by a user.
  • user devices 102a through 102n may be the type of computing device described in relation to FIG. 6 herein.
  • a user device may be embodied as a personal computer (PC), a laptop computer, a mobile device, a smartphone, a smart speaker, a tablet computer, a smart watch, a wearable computer, a personal digital assistant (PDA), a music player such as an MP3 player or streaming device, a global positioning system (GPS) device, a video player, a handheld communications device, a gaming device or system, an entertainment system, a vehicle computer system, an embedded system controller, a camera, a remote control, a bar code scanner, a computerized measuring device, an appliance, a consumer electronic device, a workstation, or any combination of these delineated devices, or any other suitable computer device.
  • Data sources 104a and 104b through 104n may comprise data sources and/or data systems, which are configured to make data available to any of the various constituents of operating environment 100, or system 200 described in connection to FIG. 2. (For instance, in one embodiment, one or more data sources 104a through 104n provide (or make available for accessing) user data to user-data collection component 210 of FIG. 2.) Data sources 104a and 104b through 104n may be discrete from user devices 102a and 102b through 102n and server 106 or may be incorporated and/or integrated into at least one of those components.
  • one or more of data sources 104a through 104n comprise one or more sensors, which may be integrated into or associated with one or more of the user device(s) 102a, 102b, or 102n or server 106. Examples of sensed user data made available by data sources 104a through 104n are described further in connection to user-data collection component 210 of FIG. 2.
  • Operating environment 100 can be utilized to implement one or more of the components of system 200, described in FIG. 2, including components for collecting user data, monitoring user activity and events, determining inferred current location or future location predictions, and consuming or providing location inference or prediction information to provide an improved user experience, by improving battery life (because traditional location services (such as GPS) are not needed).
  • Operating environment 100 also can be utilized for implementing aspects of process flow 300, described in FIGS. 3A-3D, or methods 400 or 500 in FIGS. 4 and 5, respectively.
  • Referring now to FIG. 2, a block diagram is provided showing aspects of an example computing system architecture suitable for implementing an embodiment of the disclosure and designated generally as system 200.
  • System 200 represents only one example of a suitable computing system architecture.
  • Example system 200 includes network 110, which is described in connection to FIG. 1, and which communicatively couples components of system 200 including user-data collection component 210, presentation component 218, visits monitor 280, location prediction/inference engine 260, user-location inference engine 220, one or more inferred location consumers 270, and storage 225.
  • Visits monitor 280 (including its components 282, 284, 286, and 288), location prediction/inference engine 260 (including its components and subcomponents 262, 263a, 263b, 264, 2642, 2644, 2646, 2648, and 268), user-data collection component 210, presentation component 218, and user-location inference engine 220 may be embodied as a set of compiled computer instructions or functions, program modules, computer software services, or an arrangement of processes carried out on one or more computer systems, such as computing device 600 described in connection to FIG. 6, for example.
  • the inferred current user location or predicted future user location can then be provided to an operating system, service(s), or application(s) (such as inferred user location consumer 270).
  • the functions performed by components of system 200 are associated with one or more personal digital assistant (sometimes referred to as "virtual assistant") applications, services, or routines.
  • these applications, services, or routines may operate on one or more user devices (such as user device 102a) or servers (such as server 106), may be distributed across one or more user devices and servers, or may be implemented in the cloud.
  • these components of system 200 may be distributed across a network, including one or more servers (such as server 106) and client devices (such as user device 102a), in the cloud, or may reside on a user device such as user device 102a.
  • these components, functions performed by these components, or services carried out by these components may be implemented at appropriate abstraction layer(s) such as the operating system layer, application layer, hardware layer, etc., of the computing system(s).
  • the functionality of these components and/or the embodiments of the disclosure described herein can be performed, at least in part, by one or more hardware logic components.
  • illustrative types of hardware logic components include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.
  • user-data collection component 210 is generally responsible for accessing or receiving (and in some cases also identifying) user data from one or more data sources, such as data sources 104a and 104b through 104n of FIG. 1.
  • user-data collection component 210 may be employed to facilitate the accumulation of user data of a particular user (or in some cases, a plurality of users including crowdsourced data) for visits monitor 280, location prediction/inference engine 260, user-location inference engine 220, or other components or subcomponents of system 200.
  • the data may be received (or accessed), and optionally accumulated, reformatted and/or combined, by user-data collection component 210 and stored in one or more data stores such as storage 225, where it may be available to the components or subcomponents of system 200.
  • the user data may be stored in or associated with a user profile 240, as described herein.
  • the technologies described herein include functionality for preserving user privacy or providing the user with control of the data that is collected and used to provide the personalized services to the user.
  • the collected or stored user data may be encrypted to preserve user privacy.
  • the use of personally identifiable data (i.e., user data that specifically identifies particular users) is managed to minimize risk of exposure; for instance, any personally identifying data is either not uploaded from the one or more data sources with the user data, is not permanently stored, and/or is not made available to the components or subcomponents of system 200.
  • certain user data may be de-identified after collection, in some embodiments, in order to further preserve user privacy.
  • a user may opt into services provided by the technologies described herein and/or select which user data about the user and/or which sources of user data are to be utilized by these technologies.
  • one embodiment comprises a graphical user interface dashboard or notebook, which may be presented via presentation component 218 and which presents details about the specific user data utilized, and may enable a user to view, modify, or delete their user data and to select which user-data sources are used.
  • the user may optionally view user-data-related details about the user's inferred current or future location, such as which user data was used for a particular location inference.
  • a user may be presented with near-real-time changes to their inferred location based on the user-data sources that are selected to be utilized by the technologies described herein. For instance, as the user selects specific user-data sources for inclusion (or exclusion), then the corresponding inferred location might change. By comparing the presented inferred location with the user's actual location, which the user presumably knows, the user might be encouraged to select more user-data sources for inclusion, as the accuracy of the inferred location is likely to improve as more user-data sources are included. Moreover, in some aspects, the user may be presented with an option to "correct" or update information to better tune the data.
  • User data may be received from a variety of sources where the data may be available in a variety of formats.
  • user data received via user-data collection component 210 may be determined via one or more sensors (such as sensors 103a and 107 of FIG. 1), which may be on or associated with one or more user devices (such as user device 102a), servers (such as server 106), and/or other computing devices.
  • a sensor may include a function, routine, component, or combination thereof for sensing, detecting, or otherwise obtaining information, such as user data, from a data source 104a, and may be embodied as hardware, software, or both.
  • user data may include data that is sensed or determined from one or more sensors (referred to herein as sensor data), smartphone data (such as phone state, charging data, date/time, or other information derived from a smartphone), user-activity information (for example: app usage; online activity; searches; voice data such as automatic speech recognition; activity logs; communications data including calls, texts, instant messages, and emails; website posts; other user data associated with communication events; other user interactions with a user device, etc.) including user activity that occurs over more than one user device, user history, session logs, application data, contacts data, calendar and schedule data, notification data, social network data, news (including popular or trending items on search engines or social networks), online gaming data, ecommerce activity (including data from online accounts such as Microsoft®, Amazon.com®, Google®, eBay®, PayPal®, video-streaming services, gaming services, or Xbox Live®), and user-account(s) data (which may include data from user preferences or settings associated with a personalization-related application or service, such as a virtual assistant).
  • User data can be received by user-data collection component 210 from one or more sensors and/or computing devices associated with a user. While it is contemplated that the user data is processed, by the sensors or other components not shown, for interpretability by user-data collection component 210, embodiments described herein do not limit the user data to processed data and may include raw data. Moreover, while historic location data may have been received or collected by user-data collection component 210, the system 200 does not need or use the traditional location services (such as GPS) to determine user inferred current location or predicted future user location, as described below. In some respects, user data may be provided in user-data streams or signals.
  • A "user signal" can be a feed or stream of user data from a corresponding data source.
  • a user signal could be from a smartphone, a home-sensor device, a GPS device (e.g., for historic location coordinates), a vehicle-sensor device, a wearable device, a user device, a gyroscope sensor, an accelerometer sensor, a calendar service, an email account, a credit card account, or other data source.
  • user-data collection component 210 receives or accesses user-related data continuously, periodically, or as needed.
  • Visits monitor 280 is generally responsible for monitoring user data for information that may be used for determining historic user visits and, in some instances, features associated with those historic visits, which may be used for determining context associated with the historic visits.
  • information determined by visits monitor 280 may be provided to location prediction/inference engine 260, including information regarding the current context and historical visits (historical observations) and/or stored historic location data 243 or user historic visits 246.
  • embodiments of visits monitor 280 may use user data, including historic location information, to determine or attribute the user’s historic location, which may be carried out by location attribution component 282, described below. Based on the location attribution, a visit for the user may be determined. In some embodiments, a visit may be determined using features identified from the user data (including current or historical user data), such as how long the user is at a particular location. For example, user data indicating that a user was in the same approximate geographical location for a period of time is more likely to imply a visit occurred than user data indicating the user was only at a particular location briefly (such as in the case where a user is driving by a location, but not visiting it).
  • a "visit" may indicate a degree of intention by the user to be at the user's location.
  • a visit may be determined where a user remains approximately at the same geographical location over a time frame. In contrast, merely passing through a location or momentarily being at a location may indicate that a visit has not occurred.
  • a historic visit may be determined by visit identifier 284, described below, and features associated with the visit may be identified by visit/activity feature determiner 288, also described below.
  • visits monitor 280 comprises a location attribution component 282, visit identifier 284, contextual information extractor 286, and visit/activity feature determiner 288.
  • visits monitor 280 and/or one or more of its subcomponents may determine interpretive data from received user data.
  • Interpretive data corresponds to data utilized by the subcomponents of visits monitor 280 (or other components or subcomponents of system 200) to interpret user data.
  • interpretive data can be used to provide context to user data, which can support determinations or inferences made by the subcomponents, such as the disambiguation example described above.
  • embodiments of visits monitor 280, its subcomponents, and other components of system 200 may use user data and/or user data in combination with interpretive data for carrying out the objectives of the subcomponents described herein.
  • Location attribution component 282, in general, is responsible for determining location attribution using user data, as described previously.
  • user data may include any user data (or sensor data) indicating historic location information, such as GPS, wireless communications (e.g., cellular or Wi-Fi Access Point), IP addresses associated with historic user activity, user check-in/social-networking information, or other user data from which location information may be determined.
  • location attribution component 282 attributes the location to a location of interest to the user, such as locations frequented by the user (sometimes called "hubs"). For example, in some embodiments, locations indicated by the location data may be clustered and the dense clusters used for determining those locations wherein a user spends time (e.g., hubs).
  • location attribution component 282 performs filtering, which may remove location information outliers (e.g., a Wi-Fi-derived location data point from 300 yards away suggesting that the user is at that location); clustering; or other means to determine location data for attribution.
  • location attribution component 282 may perform location attribution with historic location data associated with the user (such as logged user data or logged location information, which may be stored in a user profile such as historic location data 243 in user profile 240). The historic location attributions may be used for determining historical visits.
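  • The clustering step is not spelled out above, so the following Python sketch uses a simple grid-density heuristic purely as a stand-in: historic (lat, lon) samples are snapped to coarse cells, sparse cells are treated as outliers, and dense cells become candidate hubs. The function name find_hubs and the cell_deg and min_points parameters are assumptions, not terms from the disclosure.

```python
from collections import Counter
from typing import Iterable, List, Tuple

def find_hubs(points: Iterable[Tuple[float, float]],
              cell_deg: float = 0.001,      # grid cell of roughly 100 m at mid latitudes
              min_points: int = 10) -> List[Tuple[float, float]]:
    """Group logged (lat, lon) samples into coarse grid cells; keep dense cells as hubs."""
    cells = Counter((round(lat / cell_deg), round(lon / cell_deg)) for lat, lon in points)
    # Sparse cells (fewer than min_points samples) are dropped as outliers.
    return [(r * cell_deg, c * cell_deg) for (r, c), count in cells.items() if count >= min_points]
```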
  • Visit identifier 284, in general, is responsible for determining (or identifying) that a visit has occurred.
  • Embodiments of visit identifier 284 may be used for determining one or more historical visits.
  • Some embodiments of visit identifier 284 may use the historic location attributions determined by location attribution component 282 to identify a visit. For example, as described previously, user data indicating that a user was in the same approximate geographical location for a period of time may indicate a visit.
  • visits may be identified by concatenating consecutive (or substantially consecutive) user location data indicating the user is near the same approximate location, and in some cases filtering out outliers.
  • visits monitor 280 may acquire historic location information (which may be obtained from user data provided by a user-data collection component 210), continuously, periodically, or as needed.
  • the historic location information may have corresponding timestamps for when the historic location information was sensed or otherwise determined.
  • a collection of location time data may be determined that includes data points indicating a location (which may be a geographical location or semantic location) and a corresponding time that the location was detected.
  • the location-time data comprises a time series of location information. Accordingly, in some embodiments, a visit may be determined based on concatenating consecutive (or approximately consecutive) data points in the time-series that indicate the same approximate location.
  • the location-time data may be stored in historic location data 243.
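  • A minimal sketch of the visit-segmentation idea described above: consecutive location-time samples attributed to the same location are concatenated into a visit, and stops shorter than a minimum stay are filtered out rather than treated as visits. The Visit dataclass, segment_visits, and the min_stay threshold are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Tuple

@dataclass
class Visit:
    location: str        # attributed location, e.g., a hub such as "work"
    arrival: datetime
    departure: datetime

def segment_visits(samples: List[Tuple[datetime, str]],
                   min_stay: timedelta = timedelta(minutes=10)) -> List[Visit]:
    """Concatenate consecutive samples at the same attributed location into visits."""
    visits: List[Visit] = []
    samples = sorted(samples)
    i = 0
    while i < len(samples):
        start_time, location = samples[i]
        j = i
        while j + 1 < len(samples) and samples[j + 1][1] == location:
            j += 1
        end_time = samples[j][0]
        # Brief appearances (e.g., driving past a place) are not treated as visits.
        if end_time - start_time >= min_stay:
            visits.append(Visit(location, start_time, end_time))
        i = j + 1
    return visits
```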
  • Contextual information extractor 286, in general, is responsible for determining contextual information related to the historic visits (detected by visit identifier 284 or visits monitor 280), such as context features or variables associated with a particular visit or other information related to the visit.
  • contextual information extractor 286 may associate the determined contextual information with the related visit and may also log the contextual information with the visit. Alternatively, the association or logging may be carried out by another service.
  • some embodiments of contextual information extractor 286 provide the determined contextual information to visit/activity feature determiner 288, which determines features or variables associated with the historic visit and/or activity (such as described below) and/or related contextual information.
  • contextual information extractor 286 determines contextual information related to a visit, such as entities related to the visit (e.g., other people present at the location), a venue or venue-related information about the visit, or detected activity performed by the user at the location.
  • this may include context features such as information about the location, such as venue information (e.g., the user's office location, home location, or gym), time (including, for instance, arrival/departure times or duration of stay), day, and/or date, which may be represented as a time stamp associated with the visit; other user activity preceding and/or following the visit; other information about the visit such as entities associated with the visit (e.g., venues, people, objects); information detected by sensor(s) on user devices associated with the user that is concurrent or substantially concurrent to the visit (e.g., user activity detected via a computing device such as watching a streaming movie on an entertainment console, or motion information or physiological information detected on a fitness tracking user device); user interaction on one or more user devices (such as browsing certain types of webpages, listening to music, taking pictures, composing email, or any other type of user device interaction); social media activity; or any other information related to the visit that is detectable and that may be used for determining features or patterns associated with user visits.
  • a particular user device may be identified by detecting and analyzing characteristics of the user device, such as device hardware, software such as operating system (OS), network-related characteristics, user accounts accessed via the device, and similar characteristics. For example, information about a user device may be determined using functionality of many operating systems to provide information about the hardware, OS version, network connection information, installed applications, or the like. In some embodiments, a device name or identification (device ID) may be determined for each device associated with a user. This information about the identified user devices associated with a user may be stored in a user profile associated with the user, such as in user profile 240.
  • the user devices may be polled, interrogated, or otherwise analyzed to determine contextual information about the devices. This information may be used for determining a label or identification of the device (e.g., a device ID) so that contextual information about a particular historic visit captured on one user device may be recognized and distinguished from data captured by another user device.
  • users may declare or register a user device, such as by logging into an account via the device, installing an application on the device, connecting to an online service that interrogates the device, or otherwise providing information about the device to an application or service.
  • devices that sign into an account associated with the user such as a Microsoft® account or Net Passport, email account, social network, or the like, are identified and determined to be associated with the user.
  • contextual information extractor 286 may receive user data from user-data collection component 210, parse the data, in some instances, and identify and extract context features or variables (which also may be carried out by visit/activity feature determiner 288).
  • Context variables may be stored as a related set of contextual information associated with a historic visit, and may be stored in a user profile such as in user profile 240. For instance, contextual information associated with historic visits may be stored in user historic visits component 246.
  • Visit/activity feature determiner 288, in general, is responsible for identifying features and user activity associated with historic visits.
  • Features associated with a visit and user activity features may be used by visits monitor 280 to determine the context of the visit.
  • Features also may include contextual information and other details associated with a visit or user activity during a visit.
  • visit features may include the historic location (such as the geographic and/or semantic location if available), time and date, arrival time, departure time, length of stay, previous location(s) visited, next locations visited, sequences or series of locations, day of the week, user activity during the visit, user activity prior to or subsequent to the visit, information about other users associated with the visit (for example, if the visit is a meeting, then the other invitees/attendees of the meeting), or nearly any measurable or otherwise determinable variable associated with a visit.
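  • As an illustration only, the visit features enumerated above could be captured in a record such as the following; the VisitFeatures name and its fields are assumptions, not a schema from the disclosure.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class VisitFeatures:
    """Hypothetical record of features associated with a historic visit."""
    location: str                          # geographic and/or semantic location
    arrival: datetime
    departure: Optional[datetime] = None
    day_of_week: Optional[int] = None      # 0 = Monday ... 6 = Sunday
    previous_location: Optional[str] = None
    next_location: Optional[str] = None
    activities: List[str] = field(default_factory=list)    # user activity during the visit
    other_people: List[str] = field(default_factory=list)  # e.g., other meeting attendees

    @property
    def length_of_stay_minutes(self) -> Optional[float]:
        """Length of stay in minutes, when the departure time is known."""
        if self.departure is None:
            return None
        return (self.departure - self.arrival).total_seconds() / 60.0
```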
  • visit logic 235 may be utilized for determining a visit, contextual information associated with a visit, and/or features of the visit.
  • Visit logic 235 may include rules, conditions, associations, classification models, or other criteria to identify a visit and contextual information or features associated with the visit.
  • visit logic 235 may include comparing visit criteria with the user data in order to determine that a visit has occurred and/or particular features associated with a determined visit.
  • the visit logic 235 can take many different forms depending on the mechanism used to identify a particular visit or feature of a visit.
  • the visit logic 235 may comprise training data used to train a neural network that is used to evaluate user data to determine when a visit has occurred, or when particular features are present in a determined visit.
  • the visit logic may comprise static rules (which may be predefined or may be set based on settings or preferences in a user profile associated with the user), Boolean logic, fuzzy logic, neural network, finite state machine, support vector machine, logistic regression, clustering, or machine learning techniques, similar statistical classification processes, other rules, conditions, associations, or combinations of these to identify a visit and/or visit feature from user data.
  • visit logic may specify types of user device interaction(s) information that are associated with a visit feature, such as launching a fitness tracking app which may occur at a gym, navigating to a website to read a movie review, or composing an email.
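  • A deliberately simple, rule-based sketch of the kind of visit logic described above, mapping observed device interactions to visit features; real embodiments could instead use the classifiers or other models listed earlier, and the rule table and names here are entirely illustrative.

```python
from typing import Dict, List

# Each rule maps an observed device interaction to a visit feature (illustrative only).
INTERACTION_RULES: Dict[str, str] = {
    "launched_fitness_app": "likely_at_gym",
    "opened_movie_review_site": "likely_planning_movie",
    "composed_email": "likely_working",
}

def features_from_interactions(interactions: List[str]) -> List[str]:
    """Return the visit features suggested by the observed device interactions."""
    return [INTERACTION_RULES[i] for i in interactions if i in INTERACTION_RULES]

# Usage: launching a fitness tracking app during a visit suggests the gym.
assert features_from_interactions(["launched_fitness_app"]) == ["likely_at_gym"]
```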
  • In some embodiments, the features determined by visit/activity feature determiner 288 may be stored as one or more label(s), tag(s), or metadata associated with the visit information, and may be used for indexing visits determined by visit identifier 284.
  • Information about visits determined by visits monitor 280, including, in some embodiments, the determined contextual information or features associated with the visit(s), may be stored in a user profile 240 associated with a user, such as in user historic visits 246.
  • location prediction/inference engine 260 is generally responsible for determining one or more possible current or future locations (or sequence(s) of future locations) for a user.
  • the output of location prediction/inference engine 260 can be stored in user-location/activity patterns 250, in user profile 240, and/or used by user-location inference engine 220.
  • location prediction/inference engine 260 comprises features similarity determiner 262, one or more pattern-based predictors 264, and location prediction/inference selector 268.
  • embodiments of location prediction/inference engine 260 receive user visit information, such as information regarding a historical visit and/or contextual information, which may be determined by visits monitor 280, and utilize this information to generate a pattern-based inference or prediction of a current or future location for the user (or locations, as described previously). In some embodiments, a corresponding confidence is also determined for the inference or prediction(s). Further, the inference or prediction may comprise a single location, a sequence of locations, or probabilities for multiple locations; for example, an eighty percent likelihood that the next location will be the user’s gym and a twenty percent likelihood that the next location will be the user’s home.
  • Features similarity determiner 262, in general, is responsible for determining features or context(s) of historical visits that are similar to the current context.
  • Features similarity determiner 262 may be used in conjunction with one or more features pattern determiners (e.g., determiners 263a and 263b, described below) to determine a set of historical visits or features that are similar to the current context.
  • the set of historical visits similar to the current context then may be used as inputs to a particular pattern-based predictor 264, as described further below.
  • features similarity determiner 262 comprises subcomponents for determining similarity among visits of different types of features or feature-based patterns of visits.
  • features similarity determiner 262 comprises periodic features similarity determiner 263a and behavior features similarity determiner 263b.
  • Periodic features comprise, for example, features of visits that occur approximately periodically; for example, visits occurring at the same particular time(s) of day, day of the week or month, even/odd days (or weeks), monthly, yearly, every other day, every 3rd day, etc.
  • Behavior features comprise user behaviors such as arrival time, length of stay, or user activities (e.g., activities detected in connection with the visit).
  • In some embodiments, features similarity determiner 262 may also determine visitation sequence similarity (e.g., similarity between the sequence of locations visited before the possible current visit and the sequences of locations preceding historical visits).
  • Other feature pattern determiners of features similarity determiner 262 may be utilized, such as features pattern determiners for determining similarity between the presence of other people at the possible current visit and historical visits (such as contacts or social media acquaintances of the user), similarity of activity conducted by other people detected at the possible current visit and historical visits, similarity of events determined to be occurring at the possible current visit and events of historical visits, or similarity of any other aspect, feature, or context associated with a possible current visit and historical visit.
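  • The sketch below shows one way (assumed function names periodic_features and behavior_features) to derive the periodic and behavior features discussed above from a visit's timestamps and activities, so that they can later be compared across visits.

```python
from datetime import datetime
from typing import Dict, Iterable

def periodic_features(arrival: datetime) -> Dict[str, object]:
    """Illustrative periodic features: where in the day/week/month the visit falls."""
    return {
        "hour_of_day": arrival.hour,
        "day_of_week": arrival.weekday(),          # 0 = Monday
        "day_of_month": arrival.day,
        "is_weekend": arrival.weekday() >= 5,
        "even_week": arrival.isocalendar()[1] % 2 == 0,
    }

def behavior_features(arrival: datetime, departure: datetime,
                      activities: Iterable[str]) -> Dict[str, object]:
    """Illustrative behavior features: arrival time, length of stay, and user activities."""
    return {
        "arrival_minutes": arrival.hour * 60 + arrival.minute,
        "length_of_stay_min": (departure - arrival).total_seconds() / 60.0,
        "activities": set(activities),
    }
```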
  • Pattern-based predictors 264 comprise one or more predictor programs or routines (“predictors”) for inferring a current location or predicting a next or future possible location of the user based in part on feature patterns of similarity (e.g., behavior and periodic features) between a possible current visit and a set of historical visits.
  • a pattern-based predictor 264 receives current and historical visit information and associated features (or current and historic contexts), which may be received from a user profile 240 of the user, and determines an inference of a current location or a prediction about the next (or future) user location.
  • a pattern-based predictor 264 uses features similarity determiner 262 to determine patterns or features in common between historical visits and the possible current visit to identify a subset of historical visits similar to the possible current visit.
  • features similarity determiner 262 may be used to determine, from among the set of historical visits, those historical visits having a periodic feature in common with the possible current visit.
  • periodic features similarity determiner 263a might determine those historical visits that have features indicating that the visit happened on a Monday, those historical visits having features corresponding to the first day of the month (any first day, not just Mondays), or an even week, or a weekday.
  • behavior features similarity determiner 263b may be used to determine sets of historical visits having a particular behavior feature in common with the possible current visit.
  • a set of historical visits determined using behavior features similarity determiner 263b might include those previous visits where the user also arrived to work later than normal (or visits where the user arrived at a time close to the arrival time of the current visit).
  • In some embodiments, a pattern-based predictor 264 comprises a visits filter 2642, a visit score determiner 2644, a visit selector 2646, and a pattern-based prediction determiner 2648, each described below.
  • each pattern-based predictor 264 may be designed (or tuned) for determining a prediction based on a prediction model using a particular feature (or features) or feature type; for instance, there might be a predictor 264 used for determining predictions when the feature indicates a workday, or weekend, or Monday, a holiday, or arrival time, or length of stay, etc.
  • Such a predictor 264 may utilize those historical visits having the features (similar to the current visit) corresponding to its particular prediction model. (For instance, in some embodiments, these predictors may utilize specific prediction algorithms or classification models, based on their particular type of pattern prediction (i.e., prediction model). These algorithms or models may be stored as prediction algorithms 230 in storage 225.)
  • visits filter 2642 performs visits filtering to determine a set of historical visits that are relevant to that particular predictor 264. More specifically, visit filtering may be performed such that each predictor 264 may receive a subset of historical visits with features that correspond to its prediction criteria.
  • (For example, a predictor 264 for determining a location prediction based on behavior-features similarity for 'arrival time to work later than normal' receives a set of historical visits where the user arrived late for work; thus, each visit in this particular historical subset includes similar features for the location (here, work) and arrival time.)
  • each predictor 264 may utilize a set of historical visits that are similar to the possible current visit, based on at least one in-common feature between the historical visits and possible current visit (determined by features similarity determiner 262), wherein the in-common feature(s) corresponds to prediction criteria of the predictor 264.
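  • A minimal sketch of the visit-filtering step just described, assuming visits are represented as plain feature dictionaries (for example, the outputs of the feature sketches above); filter_visits and criteria are assumed names, and a 'Monday' predictor would pass something like criteria=['day_of_week'].

```python
from typing import Dict, List

VisitRecord = Dict[str, object]   # a visit represented as a dictionary of features

def filter_visits(historical_visits: List[VisitRecord],
                  current_visit: VisitRecord,
                  criteria: List[str]) -> List[VisitRecord]:
    """Keep only the historical visits whose criteria features match the possible current visit."""
    return [visit for visit in historical_visits
            if all(visit.get(feature) == current_visit.get(feature) for feature in criteria)]
```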
  • Examples of predictors 264 may include without limitation, periodic-feature based predictors, behavior-feature based predictors (which may include behavior sequences or visitation sequence similarity), out-of-routine or uncommon behavior features (such as when a user arrives to the office late or doesn’t go to the office on a workday (if this is uncommon behavior for the user), or when the user visits an uncommonly visited location) or other types of similarity-of-feature based predictors.
  • Visit score determiner 2644 compares similarities of features in the possible current visit and the subset of historical visits (which may be considered as a comparison of contexts, in some embodiments) and scores each possible visit with respect to the similarity of its features. In particular, some embodiments score not only those features used for determining the subset of historical visits (e.g., a weekday or arrival time), but all (or a larger number) of features available in the possible current and historical visits for comparison.
  • a Boolean logic process is used (i.e., the features have to be true or have the same or similar pattern, and if this is satisfied, then a statistical difference between the particular features is determined).
  • the differences may include, for example, differences in the arrival times, length of stay, sequence distances, etc. In an embodiment, these differences are determined and put into a sigmoid.
  • a similarity threshold is used, which may be predetermined, tunable, or adaptive, or may be initially set to a value based on a population of users, may be based on empirical information learned about the particular user, for example, or may be adaptive based on the number of historical observations.
  • In one embodiment, the threshold is just over 0.5 (i.e., just over fifty percent, meaning more similar than dissimilar). In another embodiment, the threshold is initially set to 0.6 or 0.8.
• the threshold may be used to determine whether a particular historical visit is “similar enough” to the possible current visit so as to be considered for determining an inference or prediction. In some cases, it may be necessary to perform some further filtering or a selection of features for the similarity comparison, such as where, for a given day, a user has more than one arrival time feature (e.g., the user arrived at work twice, because they left work for lunch and then returned). Here, it may be determined that the arrival time after lunch should be used for comparison with the possible current visit. In some embodiments, a vector representing the similarity differences (or similarity score) may be determined by visit score determiner 2644.
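• As a rough illustration of the scoring and thresholding described above, the following Python sketch compares a possible current visit with a historical visit, squashing numeric feature differences with a sigmoid and applying a 0.5 similarity threshold. The feature names, the 60-minute scale factor, and the averaging of per-feature scores are illustrative assumptions, not the claimed implementation.

```python
import math

def sigmoid(x):
    """Decreasing sigmoid: maps a non-negative difference onto (0, 0.5]."""
    return 1.0 / (1.0 + math.exp(x))

def visit_similarity(current, historical):
    """Score how similar a historical visit is to the possible current visit.

    Visits are dicts of features. Categorical features (e.g., weekday) are
    compared for equality; numeric differences (e.g., arrival time in minutes)
    are squashed with a sigmoid so each per-feature score lies in [0, 1].
    """
    scores = []
    for feature, current_value in current.items():
        if feature not in historical:
            continue
        hist_value = historical[feature]
        if isinstance(current_value, (bool, str)):
            scores.append(1.0 if current_value == hist_value else 0.0)
        else:
            # Scale the absolute difference before squashing; the 60-unit
            # scale (e.g., minutes) is an assumed tunable.
            diff = abs(current_value - hist_value) / 60.0
            scores.append(2 * sigmoid(diff))  # equals 1.0 when diff == 0
    return sum(scores) / len(scores) if scores else 0.0

SIMILARITY_THRESHOLD = 0.5  # "more similar than dissimilar"

current_visit = {"weekday": "Mon", "arrival_time": 9 * 60 + 15, "length_of_stay": 480}
past_visit = {"weekday": "Mon", "arrival_time": 9 * 60 + 5, "length_of_stay": 470}
score = visit_similarity(current_visit, past_visit)
print(round(score, 3), score >= SIMILARITY_THRESHOLD)
```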
  • Visit selector 2646 is generally responsible for determining or selecting those visits from the subset of historical visits that are most similar (or similar enough, based on a threshold) to a possible current visit. In some embodiments, visit selector 2646 uses a threshold, as described above, to determine those historical visits that satisfy the threshold, and are thus similar enough to be used for determining an inference of a user’s current location or a prediction of the user’s future location. In one embodiment, for each day in the user’s history, the visit (or visits) with the highest score is selected.
  • the selected historical visits determined by visit selector 2646 comprise a set of “example visits.” Because a location prediction/inference engine 260 may comprise multiple pattern-based predictors 264, a given determination about a user’s current location or predicted future location may create multiple sets of example visits, each of which may correspond to a predictor 264.
  • Pattern-based prediction determiner 2648 is generally responsible for determining and outputting user-pattern information, which may be stored, for example, as user-location/activity patterns 250 in user-profile 240.
  • an inference or prediction probability corresponding to the location inference or prediction may be determined.
  • the prediction probability may be based on a ratio of the size of an inference or prediction support set versus the total number of observations (historical visits in the subset determined by the visit filtering); for instance, the number of visits in the prediction support set divided by the number of total observations.
• the prediction also may comprise additional context, such as information about the user’s likely departure time from the inferred current location to go to the next predicted location, arrival time at the next predicted location, length of stay at the next predicted location, or other contextual information as described herein. In some embodiments, this may be determined based at least in part on the times (arrival, departure, length of stay) of the prediction support set observations.
• pattern-based prediction determiner 2648 determines a prediction significance for the inference or prediction, which may be determined based on a confidence interval (e.g., a binomial confidence interval) or other appropriate statistical measure.
  • an inference or prediction confidence for a particular inference or prediction is also determined.
  • the inference or prediction confidence may indicate a degree or likelihood that the inference is correct or that the prediction will occur, or in other words, the chances that the user is at the inferred location or will visit the predicted future location.
  • the confidence is based on the prediction probability and the prediction significance; for example, in one embodiment, the prediction confidence is determined as the product of the prediction probability and the prediction significance.
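• For instance, a minimal sketch of this arithmetic follows, assuming the probability is the ratio of the support-set size to the total filtered observations and using the lower bound of a Wilson binomial confidence interval as one possible significance measure (the description only calls for a binomial confidence interval or other appropriate statistical measure); the function names are illustrative assumptions.

```python
import math

def prediction_probability(support_count, total_observations):
    """Ratio of the prediction support set to all filtered observations."""
    return support_count / total_observations if total_observations else 0.0

def prediction_significance(support_count, total_observations, z=1.96):
    """One possible significance measure: the lower bound of a Wilson
    binomial confidence interval for the support ratio (an assumption)."""
    n = total_observations
    if n == 0:
        return 0.0
    p = support_count / n
    denom = 1 + z * z / n
    centre = p + z * z / (2 * n)
    margin = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return max(0.0, (centre - margin) / denom)

def prediction_confidence(support_count, total_observations):
    """Confidence as the product of probability and significance."""
    return (prediction_probability(support_count, total_observations)
            * prediction_significance(support_count, total_observations))

# e.g., 8 of 10 filtered historical visits support "home thirty minutes after work"
print(round(prediction_confidence(8, 10), 3))
```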
  • the inference or prediction confidence associated with an inference or prediction may be used to select a particular inference or prediction from other possible inferences or predictions determined by the predictors 264 in location prediction/inference engine 260.
• the output of each pattern-based predictor 264 is an inferred user location or predicted next or future location (or locations), and in some cases corresponding contextual information, such as departure time, arrival time, length of stay, etc., and/or a prediction confidence corresponding to the inferred current location or the predicted next or future location (or locations).
  • Location prediction/inference selector 268 determines an inferred current location and/or a prediction of a future location from among the predictions determined by each of the one or more pattern-based predictors 264.
  • an ensemble process is utilized, wherein one or more of the predictors 264 vote or weigh in, and a selection is determined based on at least one of the ensemble member predictors 264. Further, in some embodiments, individual ensemble member predictors may be weighted based on learned information about the user or of the visits.
  • location prediction/inference selector 268 selects the particular inference or prediction that has the highest corresponding confidence as the resultant current user location or next (or subsequent) predicted location (or locations). This selected location may be considered a pattern-based (or history-based) prediction determined by location prediction/inference engine 260.
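• A sketch of such a selection step is shown below: each predictor’s output is assumed to be a record with a location and a confidence, an optional per-predictor weight stands in for the learned ensemble weighting, and the highest (weighted) confidence wins. All names and the data shape are assumptions rather than the claimed selector.

```python
def select_prediction(predictor_outputs, weights=None):
    """Pick the inference/prediction with the highest (optionally weighted)
    confidence from the outputs of several pattern-based predictors."""
    weights = weights or {}
    best, best_score = None, float("-inf")
    for output in predictor_outputs:
        # A learned per-predictor weight (default 1.0) stands in for the
        # ensemble weighting mentioned above.
        score = output["confidence"] * weights.get(output["predictor"], 1.0)
        if score > best_score:
            best, best_score = output, score
    return best

outputs = [
    {"predictor": "weekday", "location": "home", "confidence": 0.42},
    {"predictor": "arrival_time", "location": "gym", "confidence": 0.35},
]
# Unweighted, "home" wins; weighting the arrival-time predictor flips the choice.
print(select_prediction(outputs)["location"])
print(select_prediction(outputs, weights={"arrival_time": 1.5})["location"])
```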
• the output of location prediction/inference selector 268 may be stored in user location/activity patterns 250 in user profile 240, and in some embodiments may be received by user-location inference engine 220. This is only a pattern-based inference or prediction. User-location inference engine 220 can refine, adjust or alter this inference or prediction based on other inputs.
• User-location inference engine 220, in general, is responsible for inferring a user’s location and outputting a user’s inferred current location or a future predicted location by conflating a prediction, such as the pattern-based or history-based prediction (i.e., the prediction determined by location prediction/inference selector 268), and explicit information associated with the user, if available.
• the term “coherent inference or prediction” is sometimes used herein to mean a comprehensive inference or prediction that reconciles the pattern-based inference or prediction and explicit information.
  • a coherent inference or prediction is provided as the output of the conflation performed by user-location inference engine 220.
• a pattern-based prediction (as determined by location prediction/inference engine 260) determines that a user is likely at home thirty minutes after leaving the office. (Thus, the user’s inferred location in this example is the user’s home if it has been thirty minutes since the user left the office.) But suppose the user has a confirmed appointment after work to see a doctor, which may be indicated in the user’s calendar. Then, user-location inference engine 220 may determine that the user is likely to be at the appointment (e.g., the location of the doctor’s office), rather than at home.
  • user-location inference engine 220 may, in some instances, override a historical-based or pattern-based prediction determined by prediction engine 260.
• user-location inference engine 220 may determine that, although explicit information indicates a potential conflict with the pattern-based prediction (the user’s home), the confidence of that explicit information is low, and therefore the resulting inference, or the “coherent” inference, for the location of the user is the user’s home.
  • user-location inference engine 220 comprises explicit signals determiner 222, user activity monitor 223a, current context determiner 223b, conflict level determiner 224, and user location predictor 226.
  • Explicit signals determiner 222 includes user activity monitor 223a and current context determiner 223b.
  • Explicit signals determiner 222 generally determines one or more explicit signals, and, in some embodiments, determines a level of confidence associated with each explicit signal.
• Explicit signals determiner 222 determines an explicit signal representing explicit information associated with the user, which may be related to a pattern-based (or historical-based) prediction determined by location prediction/inference engine 260. For instance, in some embodiments, an explicit signal may be determined based on information determined about the user for a time corresponding with the pattern-based prediction(s). For instance, as explained below, explicit signals determiner 222 (or one of its subcomponents) may use an email about a flight itinerary for the user to determine the user’s likely future location.
• User activity monitor 223a monitors the user’s activity, including sensor-derived information from one or more user devices, communications (e.g., email, calls, texts, instant messages, social media posts or activity), calendar activity (such as a meeting location), applications used and/or launched, online browsing, and accounts accessed (e.g., streaming media or business news feeds, which may be used by a user when in known locations; for example, a user may typically only access Netflix while at home, and only check stock activity at work).
  • User activity monitor 223a monitors this activity across one or multiple user devices to extract or determine features (which may comprise explicit information) that may be used to determine a user’s current or likely future location, and/or also may be used for reconciling this with already-determined location-pattern information (e.g., either to confirm that the user is likely to stay on their pattern or to learn that the user may deviate from their pattern).
  • User communications might include an email with a flight itinerary, hotel reservation information, or car rental. User communications might also include calls, emails or text messages indicating a user location (“Let’s meet for lunch at noon tomorrow at our diner.”).
• user activity monitored by user activity monitor 223a may include user-device interactions, such as online activity like websites browsed (e.g., visiting a restaurant’s website and making a reservation at the restaurant), or app usage, such as purchasing tickets to a concert via an app or similar purchase-transaction data, which may be available via a banking or financial app.
  • User activity monitor 223a may also receive and use sensor-derived information (such as from user data collection component 210).
• This sensor-derived information includes, for example, information from cameras, microphones, or other activity sensed from user devices that have a known location (e.g., the user’s television or smart refrigerator at the user’s home). For example, user voice and sounds from different devices can be used to infer that the user is home if they talk to a smart home device like a speaker.
  • Current context determiner 223b monitors and determines information about a user’s current (or near-future) context. This contextual information helps to resolve the user’s likely current or future location.
• the current context includes information such as the weather, or whether the user had other events transpire that are usually associated with affecting the user’s location; e.g., the user was late to work that day, and on days when the user is late, the user usually stays later or skips going to the gym.
  • Component 223b may receive information from components 210 or the monitored user activity from user activity monitor 223a.
• the contextual information determined by current context determiner 223b can be used to determine whether the user is likely to follow a particular pattern (and in those instances where the user follows a pattern that has more than one possible location associated with it, the contextual information may be used to determine whether one of these locations is more likely). Contextual information also may be used to determine if the user is “out of routine”; in other words, the user is less likely to be at a near-future location that is inferred from a pattern, because the user appears to be not following the pattern, or is not likely to follow the pattern, because some detected context, behavior, or user activity is different than expected.
  • an explicit signal may indicate that the user has a calendar appointment with a doctor at a time in which a pattern-based prediction has predicted the user to be at another location (e.g., the user’s home).
• Explicit signals may be monitored by user activity monitor 223a and current context determiner 223b and determined from user data provided by user-data collection component 210 and, in some instances, may be stored in a user profile associated with the user, such as user explicit signals data component 249 of user profile 240.
  • Explicit signals determiner 222 also may determine a level of confidence associated with each explicit signal.
  • each explicit signal or piece of explicit information is evaluated by explicit signals determiner 222 to determine a confidence associated with the explicit signal.
  • the confidence may indicate a legitimacy or authority (e.g., strength) of the explicit signal; for instance, a higher confidence may indicate that the user’s future activity is more likely to be affected according to the explicit signal.
  • a meeting request received by a user that the user has affirmatively confirmed (accepted as attending) may have a higher confidence than a meeting request received by the user that the user has responded to as tentative or has not responded to at all (an unconfirmed conflict).
• an explicit signal may be determined by extracting and/or parsing location-related information from information determined by components 223a or 223b. For example, where an SMS text message (or other user communication) indicates “I’ll meet you out at the mall at 8pm today,” information about the user’s location (the mall) and a future time (8pm, today) may be extracted and provided as an external, explicit signal.
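• A very simplified sketch of this kind of extraction is shown below; a production system would rely on far richer natural-language processing, and the regular expression, field names, and fixed confidence value are purely illustrative assumptions.

```python
import re
from datetime import datetime, timedelta

def extract_explicit_signal(message, now=None):
    """Pull a place and a time out of a message such as
    "I'll meet you out at the mall at 8pm today" and package them
    as an explicit signal (location, time, confidence)."""
    now = now or datetime.now()
    match = re.search(r"at the (\w+) at (\d{1,2})(am|pm) (today|tomorrow)", message, re.I)
    if not match:
        return None
    place, hour, meridiem, day = match.groups()
    hour = int(hour) % 12 + (12 if meridiem.lower() == "pm" else 0)
    date = now.date() + timedelta(days=1 if day.lower() == "tomorrow" else 0)
    return {
        "location": place,
        "time": datetime.combine(date, datetime.min.time()).replace(hour=hour),
        "confidence": 0.7,  # assumed: user-authored messages get a mid-high confidence
    }

print(extract_explicit_signal("I'll meet you out at the mall at 8pm today"))
```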
• crowd-sourced information may be used to determine an explicit signal, such as information from a user’s close circle of friends or co-workers. For example, if most of the user’s co-workers have the same event on their calendars, such as a “team party” with an address, then an explicit signal may be inferred. (This type of explicit signal may have a lower confidence than an explicit signal determined based on user-data derived directly from the user.)
• Conflict level determiner 224 generally determines a level of conflict between one or more explicit signals and a pattern-based prediction. Where it is determined that a conflict does not occur for a particular pattern-based prediction, then the explicit signal(s) may be ignored with regard to that pattern-based prediction, in an embodiment. But where it is determined that a conflict may occur, then the explicit signal(s) and pattern-based prediction are conflated (or reconciled) to determine a coherent inferred or predicted semantic location. For example, an explicit signal indicating a user-accepted meeting over the exact time as a predicted location of lunch at a restaurant, wherein the meeting is at a different location than the restaurant, may be determined as having a high level of conflict.
  • the corresponding confidence may be high, and it may be determined (as described below in connection to user location predictor 226) that the location is the meeting, and further that the user will not go to lunch at the restaurant.
  • a partial conflict may occur. For example, suppose a user has just sent an SMS text message to a friend indicating that the user will stop by her friend’s house on the way home today to pick up an item to borrow. A pattern-based prediction may have determined that the next location is the user’s home; the user will arrive home at 6 PM, and will stay at home for 13 hours (until tomorrow morning, when the user goes to work). But an explicit signal, based on the text message, indicates that the user’s next semantic location is the friend’s house.
• In this case, conflict level determiner 224 may determine a partial conflict.
• user location predictor 226 may determine that the user’s semantic location is most likely the friend’s house, but that a subsequent semantic location is the user’s home.
• User location predictor 226 (or another component of system 200) may further determine that the user will likely arrive at her friend’s house at 5:45 PM. (This may be determined, for instance, based on a pattern of departure time from work.)
  • User location predictor 226, in general, is responsible for providing a coherent prediction about a user location based on the conflation.
• user location predictor 226 may access or receive one or more of the following data in order to make an inference about the user’s current or future location: the user’s location prediction(s) or inference(s) determined by location prediction/inference engine 260 (or more specifically by location prediction/inference selector 268), or the user’s location pattern(s) determined by location prediction/inference engine 260, which may be stored in user location/activity patterns 250 in user profile 240; and explicit signal information received from explicit signals determiner 222 (if available). As noted above, this explicit information could include: any information about inferred locations derived from the user’s communications or the user’s calendar; information about nearby mobile devices of other users, which may include location information; audio/video information from the user’s mobile device or another device (such as a smart speaker) that may be used to identify the user’s current location; and current contextual information.
  • User location predictor 226 may also determine a subsequent location (or locations) visited after the current location, but not necessarily immediately following the current location, as described previously.
• the term “next location” is used broadly herein.
  • user location predictor 226 may determine the likely semantic location of the user based on a level of conflict between a pattern-based (or history-based) predicted user location (such as determined by location prediction/inference engine 260) and one or more explicit signal(s) representing explicit information associated with the user, and in some instances, a corresponding confidence regarding the accuracy or certainty of the explicit signal(s). For example, explicit signals with high confidence and high conflict may trump the pattern-based prediction, and thus the predicted user semantic location may be determined from the explicit signal(s). In particular, in one embodiment, the explicit signal having the highest confidence is used for determining the predicted user location.
  • the pattern-based prediction is determined to be the user location; for example, where no explicit signal or explicit evidence is identified to contradict the pattern-based prediction, then the pattern-based prediction will be provided as the coherent inference or prediction.
  • a confidence associated with the conflicting explicit signal may be compared to the prediction confidence associated with the pattern-based prediction to determine a coherent prediction regarding the user’s current location or future semantic location.
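• One way such a comparison could be organized is sketched below, assuming each explicit signal carries a conflict level and a confidence; the thresholds, the ordering of a partial-conflict result, and the data shapes are assumptions for illustration rather than the claimed reconciliation logic 237.

```python
def conflate(pattern_prediction, explicit_signals, conflict_threshold=0.5):
    """Reconcile a pattern-based prediction with explicit signals.

    pattern_prediction: {"location": ..., "confidence": ...}
    explicit_signals: list of {"location": ..., "confidence": ..., "conflict": ...}
        where "conflict" is a level of conflict with the pattern-based prediction
        in [0, 1] (same time but different place ~1.0, partial overlap ~0.5).
    Returns a coherent prediction as an ordered list of one or more locations.
    """
    conflicting = [s for s in explicit_signals if s["conflict"] >= conflict_threshold]
    if not conflicting:
        # No (sufficient) conflict: the pattern-based prediction stands.
        return [pattern_prediction["location"]]
    strongest = max(conflicting, key=lambda s: s["confidence"])
    if strongest["confidence"] <= pattern_prediction["confidence"]:
        # Explicit evidence is weaker than the pattern: keep the pattern.
        return [pattern_prediction["location"]]
    if strongest["conflict"] >= 0.9:
        # Full conflict: the explicit signal overrides the pattern.
        return [strongest["location"]]
    # Partial conflict: visit the explicit location first, then the pattern location.
    return [strongest["location"], pattern_prediction["location"]]

print(conflate({"location": "home", "confidence": 0.4},
               [{"location": "friend's house", "confidence": 0.8, "conflict": 0.5}]))
```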
  • Some embodiments of user location predictor 226 (or user-location inference engine 220) utilize reconciliation logic 237 to reconcile conflicts and/or determine the coherent prediction.
  • Reconciliation logic 237 may comprise rules, conditions, associations, classification models, or other criteria, and may take different forms, depending on the explicit signal(s) or pattern-based prediction.
  • reconciliation logic 237 may comprise static rules (which may be predefined or may be set based on settings or preferences in a user profile associated with the user), Boolean logic, fuzzy logic, neural network, finite state machine, support vector machine, logistic regression, clustering, or machine learning techniques, similar statistical classification processes, other rules, conditions, associations, or combinations of these.
• reconciliation logic 237 may instruct user-location inference engine 220 to return null (or in other words, user-location inference engine 220 or other components of system 200 do not provide a coherent prediction in that circumstance).
• example system 200 also includes one or more inferred or predicted-location consumers 270.
  • Inferred or predicted-location consumers 270 comprise computing applications or computing services that consume the inferred current location or predicted future semantic location information regarding the user to provide an improved computing experience for the user. Consumers 270 may elect to consume the inferred user location or predicted future location to save battery life of the user device on which they reside.
  • coherent inference(s) regarding a user’s current semantic location(s) may be provided to computer applications or services (e.g., inferred or predicted-location consumers 270), which may include an aspect of a virtual assistant computer program associated with the user.
  • the coherent inferred user location or prediction(s) may be provided in connection with an API to facilitate their utilization by a predicted-location consumer 270.
  • inferred or predicted-location consumers 270 may include, without limitation, calendar or scheduling applications or services, notification services, personalized content services, automation services, or other computing services that may be tailored to a user based on knowledge of the user’s likely future semantic location.
• Some embodiments of inferred or predicted-location consumers 270 may be carried out by a virtual assistant application or service, which may be implemented as one or more computer programs (which may comprise one or more applications, services, or routines), such as an app running on a mobile device and/or in the cloud, as further described herein.
  • an inferred or predicted location consumer 270 comprises a location service 271.
• Location service 271 provides the inferred location information about the user’s location in place of the actual location information (e.g., typically provided by a location-services component(s), such as a GPS sensor) to the operating system of the mobile device and/or any applications or services on the mobile device that request location information.
  • location service 271 may control (or work with other software routines/services/drivers to control) the conventional location services on the user device; for example, location service 271 may disable, turn-off, or modify operation of the GPS sensor/GPS-related services so that they are not operating or so that they operate less often.
• location service 271 may provide its inferred location information in place of the location information that would otherwise be provided by the conventional location-services component(s).
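• A minimal sketch of such a service is shown below, assuming the inference engine exposes a (location, confidence) pair and that the GPS can be gated behind a confidence cut-off; the class and method names and the 0.6 cut-off are illustrative assumptions, not the claimed location service 271.

```python
class InferredLocationService:
    """Answers location requests from an inferred/predicted semantic location
    instead of powering up the GPS when confidence is high enough."""

    def __init__(self, inference_engine, gps, min_confidence=0.6):
        self.inference_engine = inference_engine  # callable -> (location, confidence)
        self.gps = gps                            # conventional location provider
        self.min_confidence = min_confidence

    def get_location(self):
        location, confidence = self.inference_engine()
        if confidence >= self.min_confidence:
            self.gps.disable()      # skip the expensive GPS fix
            return location
        self.gps.enable()           # fall back to conventional location services
        return self.gps.read_fix()

class FakeGps:
    """Stand-in for a conventional location-services component."""
    def __init__(self):
        self.enabled = True
    def enable(self):
        self.enabled = True
    def disable(self):
        self.enabled = False
    def read_fix(self):
        return "47.64, -122.13"

service = InferredLocationService(lambda: ("home", 0.82), FakeGps())
print(service.get_location())  # "home", returned without a GPS fix
```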
  • system 200 may generate a personalized notification to be presented to a user, which may be provided to presentation component 218.
• the notification may indicate the option for the user to use the inferred or predicted user location (from user-location inference engine 220) instead of traditional location services (such as GPS), and is made available to presentation component 218, which determines when and how (i.e., in what format) to present the notification based on user data.
• by using the inferred or predicted user location, user-device battery life may be better managed or improved.
  • Example system 200 also includes a presentation component 218 that is generally responsible for presenting content and related information to a user, such as the content from inferred or predicted-location consumers 270.
  • Presentation component 218 may comprise one or more applications or services on a user device, across multiple user devices, or in the cloud. For example, in one embodiment, presentation component 218 manages the presentation of content to a user across multiple user devices associated with that user. In some embodiments, presentation component 218 may determine on which user device(s) content is presented, as well as the context of the presentation, such as how (or in what format and how much content, which can be dependent on the user device or context) it is presented, when it is presented, etc. In some embodiments, presentation component 218 generates user interface features associated with the personalized content.
  • Such features can include interface elements (such as graphics buttons, sliders, menus, audio prompts, alerts, alarms, vibrations, pop-up windows, notification-bar or status-bar items, in-app notifications, or other similar features for interfacing with a user), queries, and prompts.
  • Storage 225 generally stores information including data, computer instructions (e.g., software program instructions, routines, or services), logic, profiles and/or models used in embodiments of the disclosure described herein.
  • storage 225 comprises a data store (or computer data memory). Further, although depicted as a single data store component, storage 225 may be embodied as one or more data stores or may be in the cloud.
  • storage 225 stores pattern prediction algorithms or models 230, visits logic 235, and reconciliation logic 237, as described previously.
  • storage 225 stores one or more user profiles 240, an example embodiment of which is illustratively provided in FIG. 2.
  • Example user profile 240 may include information associated with a particular user or, in some instances, a category of users. As shown, user profile 240 includes user’s historic location data 243, user historic visits 246, user account(s) and activity data 248, user explicit signals data 249, and pattern-based predictions 250, some of which have been described previously.
• the information stored in user profiles 240 may be available to the routines or other components of example system 200.
• User’s location/activity pattern data 242 may comprise the semantic locations determined (as a coherent inference of current location or prediction of future semantic location(s)) by user-location inference engine 220, and may include semantic locations frequented by a user, as described previously and referred to herein as “hubs.”
  • User locations data 250 may be provided to one or more inferred or predicted-location consumers 270 or to a virtual assistant associated with the user.
  • User account(s) and activity data 248 generally includes user data determined from user-data collection component 210 (which in some cases may include crowdsourced data that is relevant to the particular user), and may be used for determining historic visit-related information, such as semantic location information, features associated with visits (past or future), and/or explicit signals, for example.
  • User account(s) and activity data 248 also may include information about user devices accessed, used, or otherwise associated with a user, and/or information related to user accounts associated with the user; for example, online or cloud-based accounts (e.g., email, social media) such as a Microsoft® Net passport, other accounts such as entertainment or gaming-related accounts (e.g., Xbox live, Netflix, online game subscription accounts, etc.), user data relating to such accounts as user emails, texts, instant messages, calls, other communications, and other content; social network accounts and data, such as news feeds; online activity; and calendars, appointments, application data, or other user data that may be used for determining current or historic visit features or explicit signals.
• In FIGS. 3A-3D, aspects of an example process flow 300 are illustratively depicted for an embodiment of the disclosure.
• FIG. 3A depicts an overview of process flow 300, and each of FIGS. 3B-3D depicts an aspect of process flow 300.
• the blocks of process flow 300 (shown across FIGS. 3A-3D) that correspond to actions (or steps) to be performed (as opposed to information to be acted on) may be carried out by one or more computer applications or services, in some embodiments including a virtual assistant, that operate on one or more user devices (such as user device 104a) or servers (such as server 106), may be distributed across multiple user devices and/or servers, or may be implemented in the cloud.
  • the functions performed by the steps of process flow 300 are carried out by components of system 200, described in connection to FIG. 2.
  • Visits aggregation 302 determines a set of historical visits 346 that might include one or more features in common with a possible current visit.
• A more detailed perspective of aspects of block 302 is shown in FIG. 3B.
  • visits aggregation 302 receives user historic location signal information 343.
  • Historic location information 343 may be received from a user profile associated with the user; for instance, historic location signal information 343 may be received from historic location data 243 of user profile 240.
  • Historic location signal information 343 may be determined from user data obtained by one or more user-data collection components 210, as described in FIG. 2.
  • Location attribution 382 performs location attribution using the historic location signal 343.
  • location attribution 382 is carried out by location attribution component 282 or visits monitor 280 of system 200, and may further include functionality described in connection to visits monitor 280.
  • visit recognition 384 is performed to identify historical visits to the same location as the possible current visit.
  • visit recognition 384 determines a visit based on user data indicating that the user was in the same approximate location (e.g., a geographic location or semantic location) for a timeframe. Further, in some instances, consecutive (or substantially consecutive) user location data near the same location is concatenated, and in some cases outliers are filtered out, to facilitate identifying a visit and duration of the visit.
  • visit recognition 384 uses visit logic, such as visit logic 235, described in connection to FIG. 2.
  • the operations performed in visit recognition 384 are carried out by visit identifier 284 or visits monitor 280 of system 200, and may further include operations or functionality of embodiments described in connection to visits monitor 280.
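• To make the visit-recognition step above concrete, the following sketch groups consecutive, already-attributed location samples into visits and drops stays that are too brief to count; the 10-minute minimum duration and the data shapes are assumed tunables, not the claimed visit logic 235.

```python
from datetime import datetime, timedelta

def recognize_visits(samples, min_duration=timedelta(minutes=10)):
    """Group consecutive location samples at (approximately) the same place
    into visits and drop short outliers. Each sample is (timestamp, place),
    where the place has already been attributed (e.g., a semantic location)."""
    visits = []
    current_place, start, end = None, None, None
    for timestamp, place in samples:
        if place == current_place:
            end = timestamp                      # extend the ongoing visit
        else:
            if current_place is not None and end - start >= min_duration:
                visits.append({"place": current_place, "start": start, "end": end})
            current_place, start, end = place, timestamp, timestamp
    if current_place is not None and end - start >= min_duration:
        visits.append({"place": current_place, "start": start, "end": end})
    return visits

t0 = datetime(2019, 11, 18, 8, 0)
samples = ([(t0 + timedelta(minutes=m), "work") for m in range(0, 60, 5)]
           + [(t0 + timedelta(minutes=65), "cafe")])   # too brief to count as a visit
print(recognize_visits(samples))
```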
  • the output of visits aggregation 302 includes a set of historical visits 346, which is provided to history-based prediction 304.
• In FIG. 3C, and with continuing reference to FIG. 3A, aspects of history-based prediction 304 are shown.
• Features calculation 362 is performed on the received set of user historic visits 346.
  • the user historic visits may be received directly from the output of visits aggregation 302 or from storage, such as user historic visits component 246 of user profile 240, described in FIG. 2.
• Features calculation 362 determines features in the set of user historic visits 346 that are similar to features of the possible current visit. (Although the term “calculation” is used with regard to this block, it is clear from the description provided herein that the operations performed by features calculation 362 may comprise more than merely mathematical operations.)
• the similar or in-common features determined by features calculation 362 are used by one or more pattern-based predictors 364 to filter the set of user historic visits 346 in order to determine a subset of historic visits that have one or more features in common with the possible current visit (or, put another way, a subset of historic visits that share a context with the possible current visit).
  • features calculation 362 determines similarity among visits of different types of features or based on feature-based patterns of the visits.
  • features calculation 362 comprises a periodic features calculation 363a and a similarity features calculation 363b.
  • periodic features may comprise features of visits that occur approximately periodically.
• Similarity features may comprise features based on similar behavior (such as similar late arrival time, length of stay, previous location visited, etc.), or other similar features that are not periodic features (for instance, similar uncommon or out-of-routine features).
  • periodic features calculation 363a and a similarity features calculation 363b may be performed by a periodic features determiner 263a and a behavior similarity features determiner 263b, respectively, such as described in connection to FIG. 2.
  • the operations performed in features calculation 362 are carried out by features similarity determiner 262 or location prediction/inference engine 260 of system 200, and may include operations or functionality of embodiments described in connection to features similarity determiner 262.
• One or more pattern-based predictors 364 receive user historic visits information 346 and feature-similarity information determined from features calculation 362. Using this information, each of the pattern-based predictors 364 determines a pattern-based prediction regarding a possible location of the user. As shown in FIG. 3C, a pattern-based predictor 364 comprises steps including visit filtering 3642, visit scoring 3644, visit selection 3646, and pattern-based prediction 3648. Some embodiments of a pattern-based predictor 364 are implemented using a pattern-based predictor 264, described in connection to FIG. 2.
• visit filtering 3642, visit scoring 3644, visit selection 3646, and pattern-based prediction 3648 are carried out by visit filter 2642, visit score determiner 2644, visit selector 2646, and pattern-based prediction determiner 2648, respectively, and may include embodiments described in connection to pattern-based predictors 264 or system 200.
• Some embodiments of history-based prediction 304 determine multiple subsets of historic visits, such that each subset is similar to the possible current visit based on at least one in-common feature similarity (determined by features calculation 362). For example, the outputs of periodic features calculation 363a may be utilized by a visit filtering 3642 step to determine a first subset of historic visits that occur on the same day of the week as the possible current visit for a first pattern-based predictor 364, and utilized by another visit filtering 3642 step to determine a second subset of historic visits that begin at approximately the same time of day as the possible current visit for a second pattern-based predictor 364 (a sketch of this per-predictor filtering follows this example).
  • the outputs of similarity features calculation 363b may be utilized in a similar manner to determine a third subset of historic visits that include a similar previous location visited by the user for a third pattern-based predictor 364, and a fourth subset of historic visits that include a feature related to an out-of-routine event for a fourth pattern-based predictor 364.
  • Each of the four pattern-based predictors in this example may then perform visit scoring 3644, visit selection 3646, and pattern-based prediction 3648 to determine a pattern-based prediction regarding a possible location of the user.
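• The per-predictor filtering in the example above might look like the following sketch, in which each pattern-based predictor applies its own predicate (same weekday, similar arrival time) to the same pool of historic visits; the field names and the 30-minute arrival tolerance are assumptions for illustration.

```python
from datetime import datetime

def minutes_of_day(dt):
    return dt.hour * 60 + dt.minute

def filter_visits(historic_visits, current_visit, predicate):
    """The visit-filtering step: keep only historic visits that share the
    predictor's feature with the possible current visit."""
    return [v for v in historic_visits if predicate(v, current_visit)]

# One predicate per pattern-based predictor.
same_weekday = lambda v, c: v["start"].weekday() == c["start"].weekday()
similar_arrival = lambda v, c: abs(minutes_of_day(v["start"]) - minutes_of_day(c["start"])) <= 30

current = {"place": "office", "start": datetime(2019, 11, 18, 9, 10)}   # a Monday
history = [
    {"place": "office", "start": datetime(2019, 11, 11, 9, 0)},         # Monday, similar arrival
    {"place": "office", "start": datetime(2019, 11, 13, 13, 30)},       # Wednesday, after lunch
]
print(len(filter_visits(history, current, same_weekday)))     # 1
print(len(filter_visits(history, current, similar_arrival)))  # 1
```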
• embodiments of blocks 3642, 3644, 3646, and 3648 may comprise an embodiment described in connection to visit filter 2642, visit score determiner 2644, visit selector 2646, and pattern-based prediction determiner 2648 of system 200.
• At prediction selection 368, a pattern-based prediction is selected from among the pattern-based predictions determined by the one or more pattern-based predictors 364.
  • the selected pattern-based prediction comprises a history-based prediction 350 (shown in FIGS. 3A and 3D).
  • the operations performed in prediction selection 368 are carried out by a location prediction/inference selector 268 or location prediction/inference engine 260 of system 200, and may include operations or functionality of embodiments described in connection to location prediction/inference selector 268 or location prediction/inference engine 260.
• the history-based prediction 350 has a corresponding prediction confidence, which may be determined as described in connection to pattern-based prediction determiner 2648 and location prediction/inference selector 268 of system 200.
• the output of history-based prediction 304 includes a history-based prediction 350, which is provided to user-location inference conflation 306.
  • User-location inference conflation 306 also receives one or more user explicit signals 349, such as explicit signals regarding flights (3492), the user’s calendar (3494), external events (3496), or out of routine information (3498), for example.
• In FIG. 3D, and with continuing reference to FIG. 3A, aspects of user-location inference conflation 306 are shown.
• One or more user explicit signals 349 and history-based prediction 350 are received by user-location inference conflation 306.
• At block 322, a confidence is calculated for each of the user explicit signals 349.
• At block 324, a conflict level is calculated for each of the explicit signals 349 versus the history-based prediction 350.
• Embodiments of block 322 may be carried out by an explicit signals determiner 222 or user-location inference engine 220 of system 200, and may include operations or functionality of embodiments described in connection to explicit signals determiner 222 or user-location inference engine 220.
• embodiments of block 324 may be carried out by a conflict level determiner 224 or user-location inference engine 220 of system 200, and may include operations or functionality of embodiments described in connection to conflict level determiner 224 or user-location inference engine 220.
  • a user location inference/prediction 342 is determined.
  • the user location inference/prediction 342 comprises a coherent prediction about the semantic location of the user or a future predicted location of the user.
• Some embodiments of user-location inference conflation 306 may determine the user location inference/prediction 342 as described in connection with user location predictor 226 (or user-location inference engine 220) for determining the coherent prediction.
• user location inference/prediction 342 is determined based on a level of conflict between history-based prediction 350 and one or more explicit signal(s) 349, and in some instances, a corresponding confidence regarding the accuracy or certainty of the explicit signal(s).
  • the explicit signal confidence is compared to a prediction confidence associated with history-based prediction 350 to determine a coherent prediction regarding the user’s semantic location.
• some embodiments of user-location inference conflation 306 also determine features or contextual information associated with the coherent prediction, such as likely arrival time at a future predicted semantic location, likely departure time from the current location (or the preceding location, in the case of a coherent prediction about a series or sequence of future locations), duration of stay at the location, user activities, events, other people, or other related information as described herein.
• Some embodiments of process flow 300 may provide the determined coherent prediction about a user semantic location and related information (if determined) to one or more predicted-location consumers (not shown), such as consumers 270 described in connection to FIG. 2. Again, providing an inferred or predicted user location without using conventional location services (such as GPS) can prolong battery life, resulting in a better user experience.
• In FIGS. 4 and 5, flow diagrams are provided illustrating examples of a method 400 for providing a personalized computing experience to a user based on a predicted next or future semantic location of the user, and a method 500 for determining and utilizing a prediction of a next or future location of a user.
  • Each block or step of method 400, method 500, and other methods or process flows described herein comprises a computing process that may be performed using any combination of hardware, firmware, and/or software. For instance, various functions may be carried out by a processor executing instructions stored in computer memory.
• Methods 400 and 500 may be implemented using the components described in system 200.
  • the methods may be embodied as computer-usable instructions stored on computer storage media.
  • the methods also may be carried out by a computer program such as a virtual assistant computing service, a distributed application, a stand-alone application, a service or hosted service (stand-alone or in combination with another hosted service), or a plug-in to another product, to name a few.
• At step 410, determine a current context for a user.
  • Embodiments of step 410 may determine a context associated with a user’s possible current location.
• the context may comprise, for example, the user’s previous location(s), date, time, day of the week, other users at the current location, user activity detected, or other current contextual information for determining visit features, as described herein.
  • Embodiments of step 410 may receive user data (which may include current and historic user data) for determining a current context, and if a visit is occurring, may determine the current context for the current visit.
  • the user data may be received from a user-data collection component 210 and/or a user profile 240 associated with the user.
  • step 410 further comprises determining a set of features associated with the current visit.
  • contextual information associated with a visit may be extracted from user data related to the visit, and used for determining a current context, which may include features about the current visit, such as described in connection to contextual information extractor 286 in system 200.
  • Embodiments of step 410 may be carried out by visits monitor 280 of system 200, described in connection to FIG. 2. Additional details regarding embodiments of step 410 are described in connection to visits monitor 280 of system 200.
• At step 420, the user’s historic visits to the possible current location are determined.
  • Embodiments of step 420 determine a set of historic visits to the same or approximate location of the user’s possible current location, determined in step 410.
  • the set of similar historic visits may be determined from historic location data associated with the user, which may be accessed in a log of information, such as historic location data 243 in user profile 240, described in FIG. 2.
• Embodiments of step 420 may be carried out by visits monitor 280 of system 200, also described in connection to FIG. 2. Further details regarding embodiments of step 420, for determining the user’s historic visits to the user’s current location, are described in connection to visits monitor 280 of system 200.
• At step 430, a pattern-based prediction for the user’s current or future location is determined.
• Embodiments of step 430 determine a pattern-based prediction (sometimes called a “history-based prediction”) based on the set of historic visits determined in step 420 and the current context determined in step 410.
• a set of one or more candidate pattern-based predictions is determined such that each candidate prediction is determined using a subset of the historic visits set, the subset having at least one feature (or context) in common with the possible current visit.
  • the at least one common feature is based on a periodic feature pattern or behavior feature pattern, such as described in connection to feature similarity determiner 262 in FIG. 2.
  • a particular candidate prediction then may be selected as the pattern-based prediction determined in step 430.
• the particular candidate prediction is selected based on a corresponding prediction confidence determined with each candidate prediction. For example, in one embodiment, each candidate prediction is determined with a corresponding prediction confidence, and the candidate prediction with the highest prediction confidence is selected as the pattern-based prediction of step 430.
  • Some embodiments of step 430 (or method 400) determine multiple pattern-based predictions from the set of candidate predictions.
  • step 430 comprises sub-steps 432, 434, and 436.
• Embodiments of step 432 identify similar or “in-common” features in historic visits and the possible current visit.
  • the similar features may be based on behavior similarity or periodic similarity, such as described in connection to feature similarity determiner 262 in FIG. 2.
• Some embodiments of sub-step 432 determine a visitation-sequence similarity, e.g., based on the Levenshtein distance between the sequence of the last K locations the user visited prior to the possible current visit and the corresponding sequence for a historical (observed) visit.
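• A sketch of such a sequence comparison follows, computing the classic Levenshtein edit distance over sequences of semantic locations and mapping it to a similarity in [0, 1]; the normalization by sequence length is an assumed convention rather than the claimed measure.

```python
def levenshtein(seq_a, seq_b):
    """Classic dynamic-programming edit distance, applied here to sequences
    of semantic locations rather than characters."""
    m, n = len(seq_a), len(seq_b)
    dist = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dist[i][0] = i
    for j in range(n + 1):
        dist[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if seq_a[i - 1] == seq_b[j - 1] else 1
            dist[i][j] = min(dist[i - 1][j] + 1,        # deletion
                             dist[i][j - 1] + 1,        # insertion
                             dist[i - 1][j - 1] + cost) # substitution
    return dist[m][n]

def sequence_similarity(current_last_k, historic_last_k):
    """Turn the edit distance into a similarity in [0, 1] (an assumed mapping)."""
    longest = max(len(current_last_k), len(historic_last_k), 1)
    return 1.0 - levenshtein(current_last_k, historic_last_k) / longest

print(round(sequence_similarity(["home", "gym", "office"],
                                ["home", "cafe", "office"]), 3))  # 0.667
```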
• Some embodiments of sub-step 432 determine a subset of historic visits having at least one feature in common with the possible current visit. For example, one subset may include historic visits on the same day as the possible current visit; another subset may include historic visits having the same preceding location visited by the user as the possible current visit; still another subset may include historic visits having the same approximate arrival time to the location as the user’s arrival time to the possible current visit.
• Each subset of historic visits may be used in sub-step 434 to determine a candidate prediction, based on the feature similarity pattern of the subset (thus the term “pattern-based” prediction).
  • a set of candidate pattern-based predictions is determined.
  • Embodiments of sub-step 434 determine a set of candidate predictions regarding the user’s current (or future) location, and each candidate prediction may be determined based on a subset of historic visits that have a particular feature pattern (or context pattern) in common with the possible current visit.
  • the predicted location is determined based on the subsequent or next location(s) visited by the user in the subset of historic visits, as described herein.
• sub-step 434 comprises sub-steps 4342-4348.
  • sub-step 434 may be performed multiple times; sub-steps 4342-4348 may be performed for each candidate prediction to be determined.
• At sub-step 4342, visit filtering is performed.
  • the set of historic visits determined in step 420 may be filtered to identify a subset of historic visits having one or more particular similar feature pattern(s) in common with the possible current visit, such as determined in sub-step 432.
  • sub-step 4342 may determine a subset of historic visits that comprises those historic visits that occurred on the same day of the week as the possible current visit.
  • sub-step 4342 may be carried out by pattern-based predictors 264 or visit filter 2642 of system 200, described in connection to system 200 (FIG. 2). Additional details regarding embodiments of sub-step 4342 are described in connection to visit filter 2642 of system 200 and visit filtering 3642 of process flow 300.
• At sub-step 4344, the subset of historic visits is scored with respect to similarity to the possible current visit.
  • Embodiments of sub-step 4344 determine a similarity score for each historic visit, in the subset of historic visits determined in sub-step 4342, to the possible current visit (or current context).
  • each historic visit is scored based on the number of features in common with the possible current visit (regardless of the particular feature(s) used to determine the subset in sub-step 4342) and/or based on a statistical similarity of its features to features of the possible current visit.
  • sub-step 4344 may be carried out by pattern-based predictors 264 or visit score determiner 2644 of system 200, described in connection to system 200 (FIG. 2). Additional details regarding embodiments of sub-step 4344 are described in connection to visit score determiner 2644 of system 200 and visit scoring 3644 of process flow 300.
• At sub-step 4346, visit selection is performed.
  • Embodiments of step 4346 determine, from the subset of historic visits determined in sub-step 4344, a set of example visits that are sufficiently similar to the possible current visit, based on their similarity scores.
  • Some embodiments of sub-step 4346 use a similarity threshold, such as described in connection to predictor 264 in system 200 (FIG. 2.), to determine those historic visits that are sufficiently similar to the possible current visit.
• the historic visits that satisfy the threshold are determined to be sufficiently similar and may be utilized in sub-step 4348 for determining a candidate prediction of a user location.
  • the set of historic visits that satisfy the similarity threshold is referred to as the example set.
  • each member of the example set of historic visits includes at least one feature or context in-common with the possible current visit or sufficiently similar to the possible current visit, based on a similarity score.
• the visit (or visits) on a particular day having the highest score among the other visits that day is selected.
  • the similarity threshold may be predetermined, tunable, or adaptive, or may be initially set to a value based on a population of users, may be based on empirical information learned about the particular user, for example, or may be adaptive based on the number of historical observations.
• Some embodiments of sub-step 4346 may be carried out by pattern-based predictors 264 or visit selector 2646 of system 200, described in connection to system 200 (FIG. 2). Additional details regarding embodiments of sub-step 4346 are described in connection to pattern-based predictors 264 and visit selector 2646 of system 200, and visit selection 3646 of process flow 300.
• At sub-step 4348, a candidate prediction is determined.
  • Embodiments of sub-step 4348 determine a candidate pattern-based prediction regarding a user’s current or predicted location (or locations) based on the example set of historic visits determined in sub-step 4346. Further, in some embodiments, sub-step 4348 also determines contextual information related to the candidate prediction, such as arrival time at a future location, departure time from the inferred current location, length of stay at the future location, etc.
  • the candidate prediction is determined as the location(s) occurring in the example visits having the highest observations count, such as described in connection to pattern-based prediction determiner 2648 in system 200 (Fig. 2).
• Those historic visits in the set of example visits that are consistent with the candidate prediction determined in sub-step 4348, such as those particular example visits that have an observation that contributes to the highest observation count, comprise the “prediction support set” of historic visits.
  • a prediction confidence corresponding to the candidate prediction is also determined.
  • the prediction confidence may indicate a degree or likelihood that the inferred or predicted user location is accurate.
• the prediction confidence may be determined as described in connection with pattern-based prediction determiner 2648 in system 200; for instance, in one embodiment, the prediction confidence is determined as the product of a prediction probability and a prediction significance corresponding to the candidate prediction, as described in an embodiment provided in connection with pattern-based prediction determiner 2648.
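• The selection of the candidate location and its support set might be sketched as follows; whether the probability denominator is the example set (used here) or the larger filtered subset is glossed over, and the field names are assumptions for illustration.

```python
from collections import Counter

def candidate_prediction(example_visits):
    """Pick the next location with the highest observation count among the
    example visits, collect its prediction support set, and compute the
    prediction probability as support size over the observations considered
    (the confidence would then combine this with a significance measure,
    as described above)."""
    counts = Counter(v["next_location"] for v in example_visits)
    predicted, support_count = counts.most_common(1)[0]
    support_set = [v for v in example_visits if v["next_location"] == predicted]
    return {
        "location": predicted,
        "support_set": support_set,
        "probability": support_count / len(example_visits),
    }

visits = [{"next_location": "gym"}, {"next_location": "gym"}, {"next_location": "home"}]
result = candidate_prediction(visits)
print(result["location"], round(result["probability"], 2))  # gym 0.67
```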
• Some embodiments of sub-step 4348 may be carried out by pattern-based predictors 264 or pattern-based prediction determiner 2648 of system 200, described in connection to system 200 (FIG. 2). Additional details regarding embodiments of sub-step 4348 are described in connection to pattern-based predictors 264 and pattern-based prediction determiner 2648 of system 200, and pattern-based prediction 3648 of process flow 300.
  • the candidate pattern-based predictions determined in sub-step 434 are used by sub-step 436.
• At sub-step 436, a pattern-based prediction is selected from the set of candidate predictions determined in sub-step 434.
  • Some embodiments of sub-step 436 select the candidate prediction having the highest corresponding prediction confidence.
  • Some embodiments of sub-step 436 utilize an ensemble selection process, whereby each of the candidate predictions vote or weigh in, and a particular candidate prediction is selected based on this. Further, in some embodiments, individual ensemble member predictors may be weighted based on learned information about the user or of the visits.
• the output of sub-step 436 comprises a pattern-based inference for a user location or a prediction for a location (or locations) of the user, and may also include related contextual information (e.g., arrival time at the future location, departure time from the inferred current location, length of stay, user activity at the future location, or examples as described herein).
• Some embodiments of step 430 and sub-steps 432-436 may be carried out by location prediction/inference engine 260, described in connection to system 200 (FIG. 2). Additional details regarding embodiments of step 430 and sub-steps 432-436 are described in connection to location prediction/inference engine 260 of system 200 and history-based prediction 304 of process flow 300.
  • the pattern-based prediction determined in step 430 is provided as a coherent prediction of the user’s current or future location and is utilized in step 460.
• some embodiments of method 400 do not include identifying explicit signals (step 440) or performing conflation (step 450).
  • an explicit signal comprises information that may impact a user’s location(s), including context or features associated with the location(s), such as arrival time or length of stay, for instance.
• an explicit signal may comprise information regarding a scheduled meeting on a user’s calendar or an email received by the user regarding an upcoming flight.
  • Some embodiments of step 440 determine an explicit signal based on user data received from user-data collection component 210, which may include information sensed or otherwise determined from user account(s)/activity data 248, described in connection to system 200.
  • an explicit signal may be determined based on information determined about the user for a time corresponding with the pattern-based prediction(s) determined in step 430.
  • Some embodiments of step 440 also determine a level of confidence associated with each explicit signal.
  • the confidence level may indicate a legitimacy or authority (e.g., strength) of the particular explicit signal; for instance, a higher confidence may indicate that the user’s activity is more likely to be affected according to the explicit signal.
  • the explicit signal confidence may be utilized when conflating an explicit signal with a pattern-based prediction by providing an indication of the likelihood that the inferred location corresponding to the explicit signal is the user’s current location.
  • the determined explicit signals and corresponding confidence(s) may be stored in user explicit signals data 249 of user profile 240.
  • Some embodiments of step 440 are performed by an explicit signals determiner 222 or a user-location inference engine 220, described in connection to system 200 (FIG. 2). Additional details regarding embodiments of step 440 are described in connection to user-location inference engine 220 of system 200 and user-location inference conflation 306 of process flow 300. An illustrative representation of an explicit signal is sketched below.
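  • As a purely hypothetical illustration of how an explicit signal and its confidence might be represented (the calendar fields, confidence values, and helper names below are assumptions, not part of the disclosure):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ExplicitSignal:
    """An explicit signal about a user's (future) location, e.g. a calendar
    meeting or a flight-confirmation email, plus a confidence reflecting how
    strongly it is expected to affect the user's actual location."""
    source: str        # e.g. "calendar", "email"
    location: str      # semantic location, e.g. a meeting room or an airport
    start: datetime
    end: datetime
    confidence: float  # 0.0 - 1.0

def signal_from_calendar_event(event: dict) -> ExplicitSignal:
    # Illustrative scoring: accepted meetings with a concrete location are
    # treated as stronger signals than tentative or location-less ones.
    confidence = 0.9 if event.get("response") == "accepted" else 0.5
    if not event.get("location"):
        confidence *= 0.3
    return ExplicitSignal(
        source="calendar",
        location=event.get("location", "unknown"),
        start=event["start"],
        end=event["end"],
        confidence=confidence,
    )
```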
  • At step 450, the explicit signal and pattern-based prediction are conflated to determine an inferred location of the user.
  • Embodiments of step 450 perform conflation on the explicit signal(s) determined in step 440 (if there are any explicit signals) and the pattern-based prediction determined in step 430 to determine a coherent prediction of the current or future location (or locations) of the user.
  • the coherent prediction of the current or future location (or locations) may comprise a semantic location and include related contextual information about the location.
  • an explicit signal confidence determined in step 440 may be utilized when conflating an explicit signal with a pattern-based prediction by providing an indication of the likelihood that the location corresponding to the explicit signal is the user location rather than the location indicated in the pattern-based prediction.
  • Embodiments of step 450 determine a level of conflict between any explicit signals determined in step 440 and the pattern-based prediction from step 430. If there is no conflict, then the pattern-based prediction may be determined as the user location. On the other hand, if conflict is determined (for instance, where the explicit signal indicates the user is at a first location or context and the pattern-based prediction indicates the user is in a different location or context), then based on the level of conflict (and in some instances the confidence of the explicit signal), prediction conflation may: override the pattern-based prediction with the location information and context derived from the explicit signal; modify the pattern-based prediction according to the explicit signal (for instance, the user may still visit the location predicted by the pattern-based prediction, but may visit that location after the location corresponding to the explicit signal); or determine that the explicit signal will likely not impact the pattern-based prediction and thus provide the pattern-based prediction as a coherent prediction. A simplified sketch of this decision logic appears after this group of paragraphs.
  • Some embodiments of step 450 further determine contextual information related to the coherent inferred or predicted location, such as length of stay, venue, user activities likely to be performed at the location, departure from the current location, other people (such as friends or contacts of the user) who may be present at the location, or other contextual information, as described herein.
  • Some embodiments of step 450 are performed by a user-location inference engine 220, described in connection to system 200 (FIG. 2). Additional details regarding embodiments of step 450 are described in connection to user-location inference engine 220 of system 200 and user-location inference conflation 306 of process flow 300.
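  • One way to picture the conflation decision described above is the sketch below; the conflict threshold and the ordering of the overridden location are illustrative assumptions, and the data shapes follow the earlier sketches rather than the disclosure.

```python
def conflate(pattern_prediction, explicit_signals, conflict_threshold=0.6):
    """Combine a pattern-based prediction with explicit signals into a
    coherent prediction, using the structures from the earlier sketches."""
    if not explicit_signals:
        return pattern_prediction

    # Consider the strongest explicit signal relevant to the prediction.
    strongest = max(explicit_signals, key=lambda s: s.confidence)

    # No conflict: the explicit signal agrees with the pattern-based prediction.
    if strongest.location == pattern_prediction["location"]:
        return pattern_prediction

    if strongest.confidence >= conflict_threshold:
        # Override: the explicit signal wins; the pattern-based location is
        # kept as a likely subsequent visit (a "modify" outcome).
        return {
            "location": strongest.location,
            "confidence": strongest.confidence,
            "then": pattern_prediction["location"],
        }

    # Weak signal: it likely will not impact the pattern-based prediction.
    return pattern_prediction
```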
  • a location consumer uses the inferred user location or predicted future location. Because the inferred location is used rather than conventional location services, such as GPS, the battery life of the user device is prolonged.
  • a location consumer, such as a location service, uses the inferred information about the user’s location in place of location information that would otherwise be determined using conventional location-services components, such as a GPS sensor.
  • the location service provides the inferred location information to the operating system (OS) or any applications or services that use location information.
  • the location service may control (or work in conjunction with other software routines, services, or drivers to control) other location-related functionality or services on the user device.
  • a location service or location consumer may disable, turn off, or modify operation of the GPS sensor/GPS-related services so that these components are not operating or so that they operate less often, thereby preserving device battery charge.
  • the location service may provide the inferred location information in place of the location information that would otherwise be provided by conventional location-services component(s). A minimal sketch of such a service follows.
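  • As a rough, platform-agnostic sketch of a power-aware location consumer (the gps driver object and its enable/disable/read methods are hypothetical stand-ins for whatever location stack a real device exposes):

```python
class PowerAwareLocationService:
    """Serves location to applications, preferring a confident inferred
    location over the GPS sensor so the sensor can stay powered down.

    The gps object is a hypothetical driver interface with enable(),
    disable(), and read() methods; real platforms differ.
    """

    def __init__(self, gps, min_confidence=0.7):
        self.gps = gps
        self.min_confidence = min_confidence

    def get_location(self, inferred):
        """Return an inferred location when confident, else fall back to GPS."""
        if inferred and inferred["confidence"] >= self.min_confidence:
            self.gps.disable()   # keep the GPS radio off to save battery
            return inferred["location"]
        self.gps.enable()        # fall back to conventional sensing
        return self.gps.read()
```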
  • a flow diagram is provided illustrating an example method 500 for determining and utilizing a prediction of a next or future location of a user.
  • a current visit is determined.
  • Embodiments of step 510 determine that a user is currently visiting a location.
  • a visit may be determined from user data indicating location information about the user. For example, user data indicating that a user has been in the same approximate geographical location for a duration of time may indicate a visit.
  • One embodiment of step 510 determines a visit by concatenating consecutive (or substantially consecutive) user location data indicating the user is near the same approximate location, and in some cases filtering out outliers (sketched below).
  • Embodiments of step 510 may be carried out by location attribution 282 and visit identifier 284 or visits monitor 280 of system 200, described in connection to FIG. 2. Additional details regarding embodiments of step 510 are described in connection to visits monitor 280 of system 200.
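  • The visit-detection idea of step 510 (concatenating consecutive nearby readings and tolerating isolated outliers) can be sketched as follows; the distance threshold, minimum duration, and reading format are illustrative assumptions.

```python
import math

def detect_current_visit(readings, max_jump_m=150.0, min_duration_s=600):
    """Return a visit if the trailing readings stay near one place.

    readings is a time-ordered list of (timestamp_s, lat, lon) tuples.
    Consecutive readings within max_jump_m of the most recent fix are
    concatenated; a single isolated outlier is skipped rather than
    ending the visit.
    """
    def dist_m(a, b):
        # Equirectangular approximation; adequate at visit-scale distances.
        lat1, lon1, lat2, lon2 = map(math.radians, (a[0], a[1], b[0], b[1]))
        x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
        y = lat2 - lat1
        return math.hypot(x, y) * 6_371_000

    if not readings:
        return None

    anchor = (readings[-1][1], readings[-1][2])
    start = end = readings[-1][0]
    outliers = 0
    for t, lat, lon in reversed(readings[:-1]):
        if dist_m(anchor, (lat, lon)) <= max_jump_m:
            start = t
            outliers = 0
        elif outliers < 1:
            outliers += 1   # tolerate a single noisy fix
        else:
            break
    if end - start >= min_duration_s:
        return {"center": anchor, "start": start, "end": end}
    return None
```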
  • determining a context comprises determining one or more features associated with the visit.
  • contextual information associated with the visit may be extracted from user data related to the current visit, and used for determining current context, which may include features about the current visit. Additional details regarding embodiments of step 520 are described in connection to visits monitor 280 of system 200. Some embodiments of step 520 may be performed as described in step 410 of method 400.
  • Step 530 determines user historic visits to the current location.
  • Embodiments of step 530 determine a set of historic visits to the same or approximate location of the location of the current visit determined in step 510.
  • Some embodiments of step 530 comprise performing visit aggregation, such as described in connection with process flow 300 (FIGS. 3A and 3B). Some embodiments of step 530 may be performed as described in step 420 of method 400.
  • Step 540 determines a history-based prediction for the next (or future) location of the user.
  • Embodiments of step 540 determine a prediction of the user’s next or future location based on similarity of patterns identified in historic visits (determined in step 530) with respect to the current visit (determined in step 510).
  • a set of one or more candidate predictions is determined such that each candidate prediction is determined using a subset of the historic visits set, the subset having at least one feature (or context) in common with the current visit (an illustrative sketch follows this group of paragraphs).
  • the at least one common feature may be based on a periodic feature similarity or behavior feature similarity, such as described in connection to feature similarity determiner 262 in FIG. 2.
  • a particular candidate prediction may be selected from the set of candidate predictions as the pattern-based prediction determined in step 540.
  • the particular candidate prediction is selected based on a corresponding prediction confidence determined with each candidate prediction, such as described in connection to pattern-based predictors 264 of system 200.
  • Some embodiments of step 540 may be performed as described in step 430 of method 400, including the sub-steps of step 430.
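  • Putting steps 530 and 540 together, one candidate could be formed per feature shared with the current visit, reusing candidate_prediction from the earlier sketch; the feature names (day of week, hour band, activity) are illustrative stand-ins for the periodic and behavior features described above.

```python
def next_location_candidates(historic_visits, current_visit):
    """Build one candidate prediction per feature the historic visits share
    with the current visit. Each historic visit is assumed to carry the
    features of that visit plus the location the user went to next
    ("next_location")."""
    feature_keys = ("day_of_week", "hour_band", "activity")
    candidates = []
    for key in feature_keys:
        if key not in current_visit:
            continue
        subset = [v for v in historic_visits if v.get(key) == current_visit[key]]
        examples = [
            {"location": v["next_location"]} for v in subset if "next_location" in v
        ]
        cand = candidate_prediction(examples)  # from the earlier sketch
        if cand:
            cand["matched_feature"] = key
            candidates.append(cand)
    return candidates
```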
  • Step 550 determines explicit signals.
  • Embodiments of step 550 determine one or more explicit signals of information that may impact a user’s future location(s), which may include the context or features associated with the future location(s).
  • an explicit signal may be determined based on information determined about the user for a future time corresponding with the history-based prediction(s) determined in step 540.
  • Embodiments of step 550 also may determine a level of confidence associated with the explicit signal(s), which may be used for determining a likelihood that the explicit signal will impact the user’s future location in regard to the future location(s) prediction determined in step 540.
  • Some embodiments of step 550 may be performed as described in step 440 of method 400.
  • Step 560 conflates the explicit signals and the history-based prediction.
  • Embodiments of step 560 perform conflation on the one or more explicit signals determined in step 550 with the history-based next location prediction determined in step 540 to determine a coherent prediction of the next or future location (or locations) of the user. Additionally, some embodiments of step 560 further determine related contextual information about the next or future location, such as arrival time, length of stay, venue, user activities likely to be performed at the location, departure from the current location, other people (such as friends or contacts of the user) who may be present at the location, or other contextual information, as described herein.
  • an explicit signal confidence determined in step 550 may be utilized when conflating an explicit signal with a history-based prediction by providing an indication of the likelihood that the future location corresponding to the explicit signal will be visited by the user rather than the future location indicated in the pattern-based prediction.
  • Some embodiments of step 560 may be performed as described in step 450 of method 400.
  • At step 570, the coherent predicted next (or future) location for the user is provided.
  • the predicted next or future location may comprise a semantic location and include related contextual information about the location.
  • Some embodiments of step 570 comprise utilizing the coherent prediction regarding a user’s future location(s), determined in step 560, by one or more computer applications or services, such as a predicted-location consumer, described in connection to system 200.
  • some embodiments of step 570 utilize the determined future location to provide a personalized or tailored computing experience to the user.
  • Some embodiments of step 570 may provide the coherent predicted next (or future) location via an API to facilitate consumption of the predicted future location by a computing application or service, such as a predicted-location consumer (a minimal sketch follows below).
  • Some embodiments of step 570 may be performed as described in step 460 of method 400. Additional details regarding embodiments of step 570 are described in connection to inferred or predicted-location consumers 270 of system 200.
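  • A predicted-location consumer could receive the coherent prediction through a small API surface; the payload shape below is an assumption for illustration only and reuses the prediction structure from the earlier sketches.

```python
import json

def predicted_location_payload(user_id, prediction):
    """Serialize a coherent prediction for consumption by an application or
    service (e.g., a personal assistant) through whatever API surface the
    platform exposes. prediction is the dict produced by the conflation
    sketch above, or None when no prediction is available."""
    if prediction is None:
        return json.dumps({"user": user_id, "prediction": None})
    return json.dumps({
        "user": user_id,
        "prediction": {
            "semantic_location": prediction["location"],
            "confidence": prediction["confidence"],
            # Related contextual information, when available.
            "arrival_time": prediction.get("arrival_time"),
            "expected_stay_minutes": prediction.get("expected_stay_minutes"),
        },
    })
```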
  • With reference to FIG. 6, an exemplary computing device is provided and referred to generally as computing device 600.
  • the computing device 600 is but one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the disclosure. Neither should the computing device 600 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated.
  • Embodiments of the disclosure may be described in the general context of computer code or machine-useable instructions, including computer-useable or computer-executable instructions, such as program modules, being executed by a computer or other machine, such as a personal data assistant, a smartphone, a tablet PC, or other handheld device.
  • program modules including routines, programs, objects, components, data structures, and the like, refer to code that performs particular tasks or implements particular abstract data types.
  • Embodiments of the disclosure may be practiced in a variety of system configurations, including handheld devices, consumer electronics, general- purpose computers, more specialty computing devices, etc.
  • Embodiments of the disclosure may also be practiced in distributed computing environments where tasks are performed by remote-processing devices that are linked through a communications network.
  • program modules may be located in both local and remote computer storage media including memory storage devices.
  • computing device 600 includes a bus 610 that directly or indirectly couples the following devices: memory 612, one or more processors 614, one or more presentation components 616, one or more input/output (I/O) ports 618, one or more I/O components 620, and an illustrative power supply 622.
  • Bus 610 represents what may be one or more busses (such as an address bus, data bus, or combination thereof).
  • FIG. 6 is merely illustrative of an exemplary computing device that can be used in connection with one or more embodiments of the present disclosure. Distinction is not made between such categories as “workstation,” “server,” “laptop,” “handheld device,” etc., as all are contemplated within the scope of FIG. 6 and with reference to “computing device.”
  • Computing device 600 typically includes a variety of computer-readable media.
  • Computer-readable media can be any available media that can be accessed by computing device 600 and includes both volatile and nonvolatile media, removable and non-removable media.
  • Computer-readable media may comprise computer storage media and communication media.
  • Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVDs) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 600.
  • Computer storage media does not comprise signals per se.
  • Communication media typically embodies computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
  • modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • communication media includes wired media, such as a wired network or direct-wired connection, and wireless media, such as acoustic, RF, infrared, and other wireless media. Combinations of any of the above should also be included within the scope of computer- readable media.
  • Memory 612 includes computer storage media in the form of volatile and/or nonvolatile memory.
  • the memory may be removable, non-removable, or a combination thereof.
  • Exemplary hardware devices include solid-state memory, hard drives, optical-disc drives, etc.
  • Computing device 600 includes one or more processors 614 that read data from various entities such as memory 612 or I/O components 620.
  • Presentation component(s) 616 presents data indications to a user or other device.
  • Exemplary presentation components include a display device, speaker, printing component, vibrating component, and the like.
  • the I/O ports 618 allow computing device 600 to be logically coupled to other devices, including I/O components 620, some of which may be built in. Illustrative components include a microphone, joystick, game pad, satellite dish, scanner, printer, wireless device, etc.
  • the I/O components 620 may provide a natural user interface (NUI) that processes air gestures, voice, or other physiological inputs generated by a user. In some instances, inputs may be transmitted to an appropriate network element for further processing.
  • NUI may implement any combination of speech recognition, touch and stylus recognition, facial recognition, biometric recognition, gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, and touch recognition associated with displays on the computing device 600.
  • the computing device 600 may be equipped with depth cameras, such as stereoscopic camera systems, infrared camera systems, RGB camera systems, and combinations of these, for gesture detection and recognition. Additionally, the computing device 600 may be equipped with accelerometers or gyroscopes that enable detection of motion. The output of the accelerometers or gyroscopes may be provided to the display of the computing device 600 to render immersive augmented reality or virtual reality.
  • computing device 600 may include one or more radio(s) 624 (or similar wireless communication components).
  • the radio 624 transmits and receives radio or wireless communications.
  • the computing device 600 may be a wireless terminal adapted to receive communications and media over various wireless networks.
  • Computing device 600 may communicate via wireless protocols, such as code division multiple access (“CDMA”), global system for mobiles (“GSM”), or time division multiple access (“TDMA”), as well as others, to communicate with other devices.
  • the radio communications may be a short-range connection, a long-range connection, or a combination of both a short-range and a long-range wireless telecommunications connection.
  • a short-range connection may include, by way of example and not limitation, a Wi-Fi® connection to a device (e.g., mobile hotspot) that provides access to a wireless communications network, such as a WLAN connection using the 802.11 protocol; a Bluetooth connection to another computing device is a second example of a short-range connection; a near-field communication connection is a third example.
  • a long-range connection may include a connection using, by way of example and not limitation, one or more of CDMA, GPRS, GSM, TDMA, and 802.16 protocols.
  • Embodiment 1. A computerized method for determining future semantic location information for a user, the method comprising: determining a current context associated with a current visit of the user to a current location; determining a set of historical user visits to the current location; based on the set of historical user visits and the current context, determining a history-based prediction for the next location of the user; determining a set of explicit signals associated with the user, the explicit signals comprising information about the user’s future location; conflating information from the set of explicit signals and the history-based prediction to determine a coherent prediction of a future location for the user; and providing the coherent prediction of the future location of the user. (An illustrative end-to-end sketch of this method follows.)
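  • Read end to end, Embodiment 1 corresponds roughly to the pipeline below, which chains the earlier sketches; it is an illustrative composition under the same assumptions, not the claimed implementation.

```python
def predict_future_location(current_visit, historic_visits, explicit_signals):
    """Illustrative end-to-end flow for Embodiment 1, chaining the sketches
    above: history-based prediction, explicit signals, then conflation."""
    candidates = next_location_candidates(historic_visits, current_visit)
    history_based = select_by_confidence(candidates)
    if history_based is None and not explicit_signals:
        return None
    if history_based is None:
        strongest = max(explicit_signals, key=lambda s: s.confidence)
        return {"location": strongest.location, "confidence": strongest.confidence}
    return conflate(history_based, explicit_signals)
```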

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Telephonic Communication Services (AREA)
  • Navigation (AREA)

Abstract

Aspects of the present technology provide improved battery life for a user device based on the use of an inferred location of the user, which eliminates the need for conventional location services such as GPS. In particular, an inferred location for a user may be determined, including contextual information related to the future location. Using information from the user's current context together with historical observations about the user and expected user events, out-of-routine events, or other durable or ephemeral information, an inference of one or more user locations and corresponding confidences may be determined. The inferred user location may be provided to an application or service, such as a personal assistant service associated with the user, or may be provided as an API to enable consumption of the inferred location information by an application or service.
EP19836704.7A 2018-11-19 2019-11-13 Économie de durée de vie de batterie à l'aide d'un emplacement inféré Withdrawn EP3868135A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/194,611 US20190090197A1 (en) 2015-04-29 2018-11-19 Saving battery life with inferred location
PCT/US2019/061047 WO2020106499A1 (fr) 2018-11-19 2019-11-13 Économie de durée de vie de batterie à l'aide d'un emplacement inféré

Publications (1)

Publication Number Publication Date
EP3868135A1 true EP3868135A1 (fr) 2021-08-25

Family

ID=69165555

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19836704.7A Withdrawn EP3868135A1 (fr) 2018-11-19 2019-11-13 Économie de durée de vie de batterie à l'aide d'un emplacement inféré

Country Status (3)

Country Link
EP (1) EP3868135A1 (fr)
CN (1) CN113039818A (fr)
WO (1) WO2020106499A1 (fr)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11127020B2 (en) * 2009-11-20 2021-09-21 Palo Alto Research Center Incorporated Generating an activity inference model from contextual data
US8855901B2 (en) * 2012-06-25 2014-10-07 Google Inc. Providing route recommendations
WO2014074513A1 (fr) * 2012-11-06 2014-05-15 Intertrust Technologies Corporation Systèmes et procédés de reconnaissance d'activité
US11493347B2 (en) * 2013-03-12 2022-11-08 Verizon Patent And Licensing Inc. Using historical location data to improve estimates of location
US9420426B2 (en) * 2013-07-30 2016-08-16 Google Inc. Inferring a current location based on a user location history
US9838848B2 (en) * 2015-06-05 2017-12-05 Apple Inc. Venue data prefetch
US9872150B2 (en) * 2015-07-28 2018-01-16 Microsoft Technology Licensing, Llc Inferring logical user locations
US11429883B2 (en) * 2015-11-13 2022-08-30 Microsoft Technology Licensing, Llc Enhanced computer experience from activity prediction

Also Published As

Publication number Publication date
CN113039818A (zh) 2021-06-25
WO2020106499A1 (fr) 2020-05-28

Similar Documents

Publication Publication Date Title
US10909464B2 (en) Semantic locations prediction
US10567568B2 (en) User event pattern prediction and presentation
CN110476176B (zh) 用户目标辅助技术
US11128979B2 (en) Inferring user availability for a communication
US11656922B2 (en) Personalized notification brokering
US10446009B2 (en) Contextual notification engine
WO2019133264A1 (fr) Expérience informatique améliorée à partir d'un modèle d'activité personnel
US10185973B2 (en) Inferring venue visits using semantic information
US20160292584A1 (en) Inferring User Sleep Patterns
CN107851231A (zh) 基于活动模型的活动检测
US20160321616A1 (en) Unusualness of Events Based On User Routine Models
US11436293B2 (en) Characterizing a place by features of a user visit
US20220078135A1 (en) Signal upload optimization
US20190090197A1 (en) Saving battery life with inferred location
EP3868135A1 (fr) Économie de durée de vie de batterie à l'aide d'un emplacement inféré

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210518

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

RAP3 Party data changed (applicant data changed or rights of an application transferred)

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC

18W Application withdrawn

Effective date: 20211102