WO2016085585A1 - Presentation of information cards for events associated with entities - Google Patents

Presentation of information cards for events associated with entities

Info

Publication number
WO2016085585A1
Authority
WO
WIPO (PCT)
Prior art keywords
time
user
snapshot
entities
snapshots
Prior art date
Application number
PCT/US2015/056019
Other languages
English (en)
Inventor
Matthew Sharifi
David Petrou
Original Assignee
Google Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/555,111 external-priority patent/US20160027044A1/en
Application filed by Google Inc. filed Critical Google Inc.
Priority to DE112015005293.3T priority Critical patent/DE112015005293T5/de
Priority to CN201580035494.XA priority patent/CN106663112A/zh
Publication of WO2016085585A1 publication Critical patent/WO2016085585A1/fr


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/109Time management, e.g. calendars, reminders, meetings or time accounting
    • G06Q10/1093Calendar-based scheduling for persons or groups
    • G06Q10/1095Meeting or appointment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10File systems; File servers
    • G06F16/11File system administration, e.g. details of archiving or snapshots
    • G06F16/128Details of file system snapshots on the file-level, e.g. snapshot creation, administration, deletion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/957Browsing optimisation, e.g. caching or content distillation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/109Time management, e.g. calendars, reminders, meetings or time accounting

Definitions

  • the Internet provides access to a wide variety of resources. For example, video and/or audio files, as well as webpages for particular subjects or particular news articles, are accessible over the Internet. Access to these resources presents opportunities for other content (e.g., advertisements) to be provided with the resources.
  • a webpage can include slots in which content can be presented. These slots can be defined in the webpage or defined for presentation with a webpage, for example, along with search results.
  • Content in these examples can be of various formats, while the devices that consume (e.g., present) the content can be equally varied in terms of their type and capabilities.
  • the method can include receiving, by a server device, a plurality of snapshots associated with use of a computing device by a user, each snapshot from the plurality of snapshots being based on content presented to the user on the computing device.
  • the method can further include evaluating the plurality of snapshots, including, for each respective snapshot: identifying a respective set of entities indicated by the respective snapshot, and storing, to a memory, indications of the respective set of entities and a respective timestamp indicating a respective time that the respective snapshot was captured, wherein the respective set of entities and respective timestamp are associated in the memory.
  • the method can further include determining, based on a first snapshot from the plurality of snapshots, a first time to present one or more information cards to the user.
  • the method can further include, at the first time, locating in memory entities having a timestamp that corresponds to the first time.
  • the method can further include generating an information card based on the one or more of the located entities.
  • the method can further include providing, for presentation to the user, the generated information card.
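The claimed server-side flow (receive snapshots, identify entities per snapshot, store them with capture timestamps, then locate entities by time and generate a card) can be sketched as below. This is an illustrative outline only; every name (`Snapshot`, `EntityStore`, `evaluate_snapshots`, `generate_card`) and the toy phone-number extraction are hypothetical, not taken from the publication.

```python
import re
from dataclasses import dataclass, field

@dataclass
class Snapshot:
    """A capture of content presented to the user, with its capture time."""
    content: str
    timestamp: float  # seconds since epoch

@dataclass
class EntityStore:
    """Associates each capture timestamp with the entities found then."""
    entries: dict = field(default_factory=dict)

    def add(self, timestamp, entities):
        self.entries[timestamp] = entities

    def near(self, target_time, window=3600):
        """Locate entities whose timestamp corresponds to the target time."""
        return [e for t, ents in self.entries.items()
                if abs(t - target_time) <= window for e in ents]

def evaluate_snapshots(snapshots, store):
    """Identify entities in each snapshot and store them with its timestamp."""
    for snap in snapshots:
        # Toy extraction: US-style phone numbers only; a real system would
        # use a knowledge graph and far richer recognizers.
        entities = re.findall(r"\d{3}-\d{3}-\d{4}", snap.content)
        store.add(snap.timestamp, entities)

def generate_card(entities):
    """Generate a simple information card from the located entities."""
    return {"caption": "Reminder", "entities": entities}
```

At a determined presentation time, `store.near(time)` would stand in for "locating in memory entities having a timestamp that corresponds to the first time."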
  • FIG. 1 is a block diagram of an example environment for delivering content.
  • FIG. 2A shows an example system for presenting information cards based on entities associated with snapshots of content presented to users.
  • FIG. 2B shows an example information card associated with a phone number entity.
  • FIG. 2C shows an example information card associated with a location entity.
  • FIG. 2D shows an example information card associated with a subject entity.
  • FIG. 3 is a flowchart of an example process for providing information cards based on snapshots extracted from content presented to a user.
  • FIG. 4 is a block diagram of an example computer system that can be used to implement the methods, systems and processes described in this disclosure.
  • Snapshots can be captured and evaluated on an ongoing basis based on content that is presented to one or more users on their respective user devices.
  • Content may be presented to a user in, for example, a browser, an application (e.g., a mobile app), a web site, an advertisement, a social network page, or other digital content environments.
  • Each snapshot may include at least a portion of one or more of a calendar entry, a map, an email message, a social network page entry, a web page element, an image, or some other content.
  • Evaluating a particular snapshot can include identifying associated entities, e.g., persons, places (e.g., specific locations, addresses, cities, states, countries, room numbers, buildings, or other specific geographic locations), things (such as phone numbers), subjects, scheduled events (e.g., lunch dates, birthdays, meetings), or other identifiable entities.
  • a timestamp associated with receipt of a snapshot can also be stored in association with the snapshot and/or entities upon which the snapshot is based.
  • Target presentation times can be determined based on, for example, a timestamp associated with receipt of the snapshot and/or the times of one or more events identified using the snapshot.
  • one or more information cards that identify one or more of the entities can be provided (e.g., for presentation to the user).
  • Each information card can also indicate, for example, a context that the user can use to understand the rationale for the display of the given information card.
  • At least one call to action can also be included in the information card, to, for example, allow the user to perform an action associated with an entity (such as dialing a phone number, obtaining driving directions, or receiving additional information).
  • An information card can serve as a prompt of sorts (e.g., for the user to remember a concept and/or some other piece(s) of information), or the information card can serve as a reminder of an upcoming event.
  • the users may be provided with an opportunity to enable/disable or control programs or features that may collect and/or use personal information (e.g., information about a user's social network, social actions or activities, a user's preferences or a user's current location).
  • certain data may be treated in one or more ways before it is stored or used, so that personally identifiable information associated with the user is removed.
  • a user's identity may be anonymized so that no personally identifiable information can be determined for the user, or a user's geographic location may be generalized where location information is obtained (such as to a city, ZIP code, or state level), so that a particular location of the user cannot be determined.
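A minimal illustration of the location-generalization step described above, assuming simple coordinate rounding as the coarsening method (the publication does not specify one; the function name and precision are hypothetical):

```python
def generalize_location(lat, lon, precision=1):
    """Coarsen coordinates so that a particular location cannot be
    determined; rounding to 1 decimal place keeps roughly city-level
    granularity (about 11 km of latitude)."""
    return (round(lat, precision), round(lon, precision))
```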
  • Particular implementations may realize none, one, or more of the following advantages. Users can be automatically presented with an information card that is relevant to an event or a subject associated with content that they have received.
  • FIG. 1 is a block diagram of an example environment 100 for delivering content.
  • the example environment 100 includes a content management system 110 for selecting and providing content in response to requests for content.
  • the example environment 100 includes a network 102, such as a local area network (LAN), a wide area network (WAN), the Internet, or a combination thereof.
  • the network 102 connects websites 104, user devices 106, content sponsors 108 (e.g., advertisers), publishers 109, and the content management system 110.
  • the example environment 100 may include many thousands of websites 104, user devices 106, content sponsors 108 and publishers 109.
  • the environment 100 can include plural data stores, which can be stored locally by the content management system 110, stored somewhere else and accessible using the network 102, generated as needed from various data sources, or some combination of these.
  • a data store of entities 131 can include a list of entities that can be used to identify entities in snapshots of content presented to users. Entities can include, for example, phone numbers, locations (e.g., addresses, cities, states, countries, room numbers, buildings, specific geographic locations), subjects (e.g., related to topics), names of people, scheduled events (e.g., lunch dates, birthdays, meetings), email addresses, organization names, products, movies, music, or other subjects that can be represented, e.g., in a knowledge graph or other information representation.
  • a data store of entities 131 can include, for example, plural entries, one for each snapshot evaluated.
  • a snapshot can be evaluated after capture and one or more top ranked or most significant entities that are included or referenced in a snapshot can be stored as a group (e.g., an entry in the data store of entities 131).
  • a data store of timestamps 132 can include timestamps associated with times that respective snapshots were captured.
  • the timestamps can be associated with the entities that are identified from the respective snapshots.
  • a data store of events 133 can include information associated with events that have been identified from a respective snapshot.
  • information for an event can include one or more of a date, a start time, an end time, a duration, names of participants, an associated location, associated phone numbers and/or other contact information (e.g., email addresses), an event type (e.g., meeting, birthday, lunch date), and a description or context (e.g., that was obtained from the respective snapshot).
  • a data store of target presentation times 134 can include one or more times that are established by the content management system 110 for the presentation of a respective information card.
  • a target presentation time established for a lunch date may include a time that is one hour before the lunch date (e.g., as a reminder to leave or prepare for the lunch date) and a designated time on the day or night before the lunch date to inform the user of the next day's lunch date.
  • Some or all of the data stores discussed can be combined in a single data store, such as a data store that includes a combination of identified entities, events, timestamps and target presentation times, all being associated with a single snapshot.
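A combined single-snapshot record of the kind described above might look like the following sketch; the class and field names (`Event`, `SnapshotRecord`, etc.) are hypothetical, chosen only to mirror the data stores 131-134:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Event:
    """An event identified from a snapshot, with the fields listed above."""
    event_type: str                # e.g. "lunch date", "birthday", "meeting"
    start_time: float              # epoch seconds
    participants: list = field(default_factory=list)
    location: Optional[str] = None
    contact: Optional[str] = None  # phone number, email address, etc.

@dataclass
class SnapshotRecord:
    """One combined data-store entry tying identified entities, an optional
    event, the capture timestamp, and target presentation times to a
    single snapshot."""
    entities: list
    timestamp: float
    event: Optional[Event] = None
    target_times: list = field(default_factory=list)

# The combined data store: one record per evaluated snapshot.
combined_store = {}  # snapshot id -> SnapshotRecord
```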
  • the content management system 110 can include plural engines, some or all of which may be combined or separate, and may be co-located or distributed (e.g., connected over the network 102).
  • a snapshot evaluation engine 121 can evaluate snapshots of content presented to a user on a device. For each snapshot, for example, the snapshot evaluation engine 121 can identify entities and/or events included in the snapshot and store the identified entities/events along with a timestamp associated with the time that the respective snapshot was captured or presented.
  • An information card engine 122 can perform functions associated with gathering information for use in information cards, generating the information cards, and determining times for presenting the information cards. For example, after the received snapshots are evaluated, the information card engine 122 can determine content for inclusion in an information card and a time to present one or more information cards to the user, including determining a target time for the presentation. Selection of content and timing of presentation is discussed in greater detail below.
  • a website 104 includes one or more resources 105 associated with a domain name and hosted by one or more servers.
  • An example website is a collection of webpages formatted in hypertext markup language (HTML) that can contain text, images, multimedia content, and programming elements, such as scripts.
  • Each website 104 can be maintained by a content publisher, which is an entity that controls, manages and/or owns the website 104.
  • a resource 105 can be any data that can be provided over the network 102.
  • a resource 105 can be identified by a resource address that is associated with the resource 105.
  • Resources include HTML pages, word processing documents, portable document format (PDF) documents, images, video, and news feed sources, to name only a few.
  • the resources can include content, such as words, phrases, images, video and sounds, that may include embedded information (such as meta-information hyperlinks) and/or embedded instructions.
  • a user device 106 is an electronic device that is under control of a user and is capable of requesting and receiving resources over the network 102.
  • Example user devices 106 include personal computers (PCs), televisions with one or more processors embedded therein or coupled thereto, set-top boxes, gaming consoles, and mobile devices.
  • a user device 106 typically includes one or more user applications, such as a web browser, to facilitate the sending and receiving of data over the network 102.
  • a user device 106 can request resources 105 from a website 104.
  • data representing the resource 105 can be provided to the user device 106 for presentation by the user device 106.
  • the data representing the resource 105 can also include data specifying a portion of the resource or a portion of a user display, such as a presentation location of a pop-up window or a slot of a third-party content site or webpage, in which content can be presented.
  • These specified portions of the resource or user display are referred to as slots (e.g., ad slots).
  • the environment 100 can include a search system 112 that identifies the resources by crawling and indexing the resources provided by the content publishers on the websites 104. Data about the resources can be indexed based on the resource to which the data corresponds. The indexed and, optionally, cached copies of the resources can be stored in an indexed cache 114.
  • User devices 106 can submit search queries 116 to the search system 112 over the network 102.
  • the search system 112 can, for example, access the indexed cache 114 to identify resources that are relevant to the search query 116.
  • the search system 112 identifies the resources in the form of search results 118 and returns the search results 118 to the user devices 106 in search results pages.
  • a search result 118 can be data generated by the search system 112 that identifies a resource that is provided in response to a particular search query, and includes a link to the resource.
  • Search results pages can also include one or more slots in which other content items (e.g., ads) can be presented.
  • the content management system 110 receives a request for content.
  • the request for content can include characteristics of the slots that are defined for the requested resource or search results page, and can be provided to the content management system 110. For example, the characteristics can include a reference (e.g., URL) to the requested resource, a size of the slot, media types that are available for presentation in the slot, and/or keywords associated with the requested resource (e.g., source keywords).
  • a search query 116 for which search results are requested can also be provided to the content management system 110 to facilitate identification of content that is relevant to the resource or search query 116.
  • the content management system 110 can select content that is eligible to be provided in response to the request ("eligible content items").
  • eligible content items can include eligible ads having characteristics matching the characteristics of ad slots and that are identified as relevant to specified resource keywords or search queries 116.
  • other information such as information obtained from one or more snapshots, can be used to respond to the received request.
  • the selection of the eligible content items can further depend on user signals, such as demographic signals, behavioral signals or other signals derived from a user profile.
  • the content management system 110 can select from the eligible content items that are to be provided for presentation in slots of a resource or search results page based at least in part on results of an auction (or by some other selection process). For example, for the eligible content items, the content management system 110 can receive offers from content sponsors 108 and allocate the slots, based at least in part on the received offers (e.g., based on the highest bidders at the conclusion of the auction or based on other criteria, such as those related to satisfying open reservations and a value of learning). The offers represent the amounts that the content sponsors are willing to pay for presentation of (or selection of or other interaction with) their content with a resource or search results page.
  • an offer can specify an amount that a content sponsor is willing to pay for each 1000 impressions (i.e., presentations) of the content item, referred to as a CPM bid.
  • the offer can specify an amount that the content sponsor is willing to pay (e.g., a cost per engagement) for a selection (i.e., a click-through) of the content item or a conversion following selection of the content item.
  • the selected content item can be determined based on the offers alone, or based on the offers of each content sponsor being multiplied by one or more factors, such as quality scores derived from content performance, landing page scores, a value of learning, and/or other factors.
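The offer-times-quality ranking described above can be sketched as a one-line scoring function. This is a simplification, and real selection processes weigh many more factors (open reservations, value of learning, landing-page scores); the function name and tuple layout are hypothetical.

```python
def select_content(candidates):
    """Rank eligible content items by CPM bid multiplied by a quality
    factor, as in a simple scored auction.

    candidates: list of (item_name, cpm_bid_usd, quality_score) tuples.
    Returns the items ordered best-first.
    """
    return sorted(candidates, key=lambda c: c[1] * c[2], reverse=True)
```

Note that a higher raw bid does not guarantee selection: a $1.50 bid with a 0.9 quality score (1.35) beats a $2.00 bid with a 0.5 score (1.00).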
  • a conversion can be said to occur when a user performs a particular transaction or action related to a content item provided with a resource or search results page. What constitutes a conversion may vary from case-to-case and can be determined in a variety of ways. For example, a conversion may occur when a user clicks on a content item (e.g., an ad), is referred to a webpage, and consummates a purchase there before leaving that webpage.
  • a conversion can also be defined by a content provider to be any measurable or observable user action, such as downloading a white paper, navigating to at least a given depth of a website, viewing at least a certain number of webpages, spending at least a predetermined amount of time on a web site or webpage, registering on a website, experiencing media, or performing a social action regarding a content item (e.g., an ad), such as endorsing, republishing or sharing the content item.
  • Other actions that constitute a conversion can also be used.
  • FIG. 2A is a block diagram of a system 200 for presenting information cards 201 based on entities associated with snapshots 202 of content presented to users.
  • snapshots 202 can be captured over time from content 204a, 204b that is presented to a user 206 on a user device 106a.
  • the content 204a, 204b can be all or a portion of content (e.g., only content in active windows) in a display area associated with a user device.
  • the content 204a, 204b may be presented in one or more of a browser, an application, a web site, an advertisement, a social network page, or some other user interface or application.
  • the content 204a, 204b can include one or more of a calendar entry, a map, an email message, a social network page entry, a web page element, an image, or some other content or element.
  • the snapshots 202 of the content 204a, 204b can be evaluated, for example, to identify associated entities 131, such as phone numbers, locations (e.g., addresses, cities, states, countries, room numbers, buildings, specific geographic locations), subjects, names of people, scheduled events (e.g., lunch dates, birthdays, meetings), or other identifiable entities. Timestamps 132 associated with the received snapshots 202 can be used with the identified entities 131, for example, to identify target presentation times 134 of information cards 201 associated with the entities 131. At times corresponding to the target presentation times 134, for example, the content management system 110 can provide information cards 201 for presentation to the user 206.
  • one or more events can be identified based on the entities included in a snapshot (e.g., a calendar entry identifying a person, place and phone number).
  • a first time (e.g., in the future) at which the event is to occur can be determined, and the event can be stored (e.g., in the repository of events 133) along with the first time.
  • a second time that is before the event can be determined, such as a time by which the user 206 needs to be notified to leave in order to arrive at the event on time.
  • the second time can be a time to perform an action relative to the event before the event is to occur, such as ordering flowers for an anniversary or sending a card for a birthday.
  • Determining a time to present an information card can include determining that a current time (e.g., the present time) is equal to the second time (e.g., an hour before the lunch date).
  • the information card can be presented for the event at the second time.
  • the following example stages can be used for providing information cards.
  • the content management system 110 can receive the snapshots 202, e.g., a plurality of snapshots that are associated with a use of the user device 106a by a user 206.
  • the received snapshots 202 can include snapshots of content 204a, 204b presented to the user 206 on the user device 106a.
  • the snapshots 202 can include snapshots taken from an email message (e.g., content 204a) or from an image (e.g., content 204b), and/or from other content presented to the user 206.
  • the snapshot evaluation engine 121 can evaluate the received snapshots 202. For example, for each snapshot, the snapshot evaluation engine 121 can identify entities 131 included in a snapshot 202.
  • the entities that are identified for the snapshot 202 obtained from the content 204a can include Bob, Carol, J's restaurant, and Carol's cell phone number.
  • the snapshot evaluation engine 121 can store the identified entities (or a subset thereof, such as most prominent entities) along with a timestamp associated with a time that a respective snapshot was captured.
  • a determination can be made whether any determined entities are related to each other, such as related to a common event. Relatedness can be based on proximity (e.g., the entities appear in close proximity to each other) or some other relationship in the snapshot.
  • timestamps can be stored in the data store of timestamps 132, e.g., for later use in generating and presenting information cards 201 related to the snapshots 202.
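The proximity-based relatedness check described above could be sketched as follows. The regular expressions and the 80-character threshold are illustrative assumptions only; the publication does not prescribe any particular recognizer or distance.

```python
import re

# Toy recognizers for two entity types found in snapshot text.
PATTERNS = {
    "phone": r"\(?\d{3}\)?[-. ]\d{3}[-. ]\d{4}",
    "time": r"\b\d{1,2}(:\d{2})?\s?(am|pm|noon)\b",
}

def extract_entities(text):
    """Return (entity_type, value, offset) tuples found in snapshot text."""
    found = []
    for etype, pat in PATTERNS.items():
        for m in re.finditer(pat, text, flags=re.IGNORECASE):
            found.append((etype, m.group(0), m.start()))
    return found

def related(a, b, max_gap=80):
    """Treat two entities as related (e.g., part of one event) when they
    appear within max_gap characters of each other in the snapshot."""
    return abs(a[2] - b[2]) <= max_gap
```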
  • one or more entities may be associated with an event. That is, an entity may be a person, and the event may relate to a meeting with the person (as indicated by the content included in an email message that is shown in the snapshot being evaluated).
  • a calendar item can be set up in the user's calendar and optionally in calendars of other users associated with the event (e.g., including users who are not necessarily event attendees).
  • events that are identified can include events that the user will not attend, but from which the user may still benefit by receiving an information card (e.g., a coupon expiration for an on-line sale). Events are discussed in more detail below.
  • Evaluation of the snapshot 202 associated with the content 204a can determine that a lunch date event exists between the user 206 (e.g., Bob) and Carol.
  • Other information identified from the snapshot 202 can include the time and location of the event.
  • a context can be determined that is associated with the snapshot and/or event. For example, based on the entities of Bob, Carol, the restaurant, and Carol's phone number, a context of "lunch date at noon on date X with Carol at J's" can be determined. Context information can be determined, stored, and later accessed, for example, to provide a user with information as to why a particular information card was presented. In some implementations, other information can be included in a context, such as identification of the app or other source from which the snapshot was extracted, the way that the information was evaluated, or a context associated with a screen shot.
  • the context information can be in the form of a snippet of text from which the entity or event was extracted.
  • the snippet on which the context is based can be formatted to highlight the relevant pieces of information.
  • the information card engine 122 can determine a time to present one or more information cards to the user including determining a target time.
  • target times can be stored in the data store of target presentation times 134. For example, for Bob's pending lunch date with Carol, the information card engine 122 can determine a reminder time for Bob that is one hour before the scheduled noon lunch date. In some implementations, multiple times to present information cards can be determined, e.g., to include a reminder, to be sent the night before, that Bob has a lunch date with Carol the following day. In some implementations, target times can be determined using various factors, such as a mode of transportation, a distance, a location and/or other factors. For information cards that serve as prompts to the user, for example, target times can include one or more times since the concept was originally presented to the user, e.g., in the form of content from which a respective snapshot was obtained.
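Such target-time computation, factoring in a mode of transportation and distance as suggested above, might be sketched as follows. The travel speeds, the one-hour buffer, and the 8 pm night-before default are illustrative assumptions, not values from the publication.

```python
from datetime import datetime, timedelta

# Rough travel-speed assumptions (km/h) per transport mode.
SPEED_KMH = {"walking": 5, "driving": 40, "transit": 25}

def reminder_times(event_start, distance_km=0.0, mode="driving"):
    """Compute target presentation times for an event reminder:
    - a designated time the night before (assumed here to be 8 pm), and
    - a departure reminder one hour before the event, moved earlier by
      the estimated travel time.
    """
    travel = timedelta(hours=distance_km / SPEED_KMH[mode])
    leave_reminder = event_start - timedelta(hours=1) - travel
    night_before = (event_start - timedelta(days=1)).replace(
        hour=20, minute=0, second=0, microsecond=0)
    return [night_before, leave_reminder]
```

For Bob's noon lunch date 20 km away by car, this yields a reminder the previous evening and another at 10:30 am (one-hour buffer plus a 30-minute drive).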
  • the information card engine 122 can identify entities from the stored entities 131 based on a comparison of the target time with timestamps associated with a respective entity of the stored entities. For example, for the lunch date that is scheduled for Carol and Bob, the information card engine 122 can identify information to be used in an information card that is associated with the pending lunch date. For example, Carol's phone number can be an entity that can be identified for the generation of the information card, e.g., for a reminder to Bob that is sent at a target time one hour before the lunch date and that also includes Carol's cell phone number.
  • the information card engine 122 can generate the information card 201 based on the one or more identified entities 131.
  • information card 201 can include information associated with the lunch date and Carol's cell phone number.
  • the information card 201 can be stored, e.g., at the content management system 110, for use in multiple subsequent presentations of the same information card.
  • the content management system 110 can provide, for presentation to the user, the information card 201.
  • the information card 201 may be provided to the user device 106a for presentation on a screen 208c, which may be the same as or different from the screens 208a, 208b from which the snapshots 202 were obtained during plural user sessions 210 for the user 206.
  • the screens 208a, 208b, 208c can be screens that are presented on multiple user devices 106a that are associated with the user 206.
  • the time at which the information card is presented, for example, can be a time since the concept associated with the information card was originally presented to the user, e.g., in the form of content from which a respective snapshot was obtained.
  • the information card can be provided to jog the user's memory.
  • the time at which the information card is presented can also be a time relative to an event (e.g., the lunch date) that is associated with the information card.
  • some information cards 201 may be applicable to more than one user.
  • the content management system 110 can provide information cards 201 to all parties associated with an event, such as to both Bob and Carol with regard to their pending lunch date.
  • the user when snapshots are evaluated in anticipation of potentially providing information cards to the user, the user can optionally receive a notification (e.g., along the lines of "You may be receiving information cards based on X.").
  • users can have an option to change when and how information cards are to be presented, either individually, by groups (or by types of information cards), or globally.
  • users can be presented with controls for specifying the type of information that can be used for information cards, such as checkbox controls along the lines of "Don't use information from my email to generate information cards.”
  • users can control the times that information cards are to be presented, e.g., times of day or times for specific snapshots.
  • users can be provided with transparency controls for any particular information card, e.g., to learn how or why an information card was prepared and presented.
  • FIG. 2B shows an example information card 220a associated with a phone number entity.
  • the information card 220a can be presented an hour before Bob and Carol's pending lunch date.
  • the information card 220a can include, for example, a notification caption 222a (e.g., "Dialer") that notifies the user that the information card is a type that is associated with a phone number, e.g., Carol's cell phone number.
  • a context 224a, for example, can identify the context associated with the information card.
  • the context 224a can include (or be determined from) part of the snapshot 202, including a snippet of Bob's email message received from Carol that contains information (e.g., location, phone number, date 226a) associated with the pending lunch date.
  • the information card 220a can also include, for example, a call-to-action 228a, such as a control, displayed with the information card on Bob's smart phone, for dialing Carol's cell phone number.
  • Other calls-to-action 228a are possible in this example, such as a call-to-action to display a map to the restaurant.
  • FIG. 2C shows an example information card 220b associated with a location entity.
  • the information card 220b can include, for example, a notification caption 222b (e.g., "Location") that notifies the user that the information card is associated with a location, e.g., Paris, France.
  • the information card 220b can be generated, for example, from a snapshot 202 associated with the user browsing online information associated with Paris, such as online travel or vacation information.
  • a context 224b can identify the context associated with the information card.
  • the context 224b can include (or be determined from) part of the snapshot 202, including a map that may be included in a snapshot or identified from information in the snapshot.
  • the information card 220b can also include, for example, a call-to-action 228b, such as a control, displayed on Bob's smart phone, for obtaining driving directions to or within Paris.
  • a time associated with the presentation of the information card 220b can be determined based on a present time and the user's current location (e.g., arriving at an airport in Paris).
  • FIG. 2D shows an example information card 220c associated with an informational entity.
  • the information card 220c can include, for example, a notification caption 222c (e.g., "Answer") that notifies the user that the information card is associated with a subject, e.g., the New York Stock Exchange (NYSE).
  • the NYSE can also be a location.
  • "Answer" types of information cards can apply, for example, to informational entities, e.g., from a snippet, a biography, a quote (e.g., a stock quote displayed on the user's screen), or other informational content.
  • the information card 220c can be generated, for example, from a snapshot associated with the user browsing online information associated with the NYSE or information from other sources.
  • a context 224c can identify the context associated with the information card.
  • the context 224c can include (or be determined from) part of the snapshot, including a snippet of text about the NYSE that the user may have been presented as content from a web site.
  • the information card 220c can also include, for example, a call-to-action 228c, such as a control, displayed on Bob's smart phone, for obtaining more information about the NYSE.
  • FIG. 3A is a flowchart of an example process 300 for providing information cards based on snapshots extracted from content presented to a user.
  • the content management system 110 can perform stages of the process 300 using instructions that are executed by one or more processors.
  • FIGS. 1-2C are used to provide example structures for performing the steps of the process 300.
  • a plurality of snapshots associated with use of a computing device by a user is received by a server device (302).
  • Each snapshot from the plurality of snapshots is based on content presented to the user on the computing device.
  • a server device such as the content management system 110, can receive snapshots 202 associated with use of the user device 106a, including snapshots 202 of content 204a, 204b presented to the user 206.
  • the process 300 can further include obtaining the plurality of snapshots by the device.
  • the user device 106a can take the snapshots 202 and provide them to the content management system 110.
  • the snapshots 202 can be obtained by the content management system 110 from the content that the content management system 110 provides to the user device 106a.
  • the snapshots associated with the use of the device by the user can include audio presented to, or experienced by, the user.
  • snapshots 202 can include recordings that have been provided to the user device 106a.
  • obtaining the snapshots 202 can also include using voice recognition or other recognition techniques to obtain a textual translation or identification (e.g., title) of the audio that is presented.
  • obtaining the snapshot 202 can include obtaining an audio fingerprint (e.g., of a particular song) for use in identifying the audio.
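The specification does not say which fingerprinting technique is used; a minimal sketch of the general idea (all names hypothetical) hashes the sign of frame-to-frame energy changes, which is deterministic for identical audio and compact enough to send to a server for matching:

```python
import hashlib
import math

def frame_energies(samples, frame_size=256):
    """Split raw audio samples into frames and compute per-frame energy."""
    return [
        sum(s * s for s in samples[i:i + frame_size])
        for i in range(0, len(samples) - frame_size + 1, frame_size)
    ]

def audio_fingerprint(samples, frame_size=256):
    """Derive a compact, repeatable fingerprint from the sign of
    energy changes between consecutive frames, then hash it."""
    energies = frame_energies(samples, frame_size)
    bits = "".join(
        "1" if b > a else "0" for a, b in zip(energies, energies[1:])
    )
    return hashlib.sha256(bits.encode()).hexdigest()

# A synthetic tone: identical signals yield identical fingerprints,
# which is what lets a server match a snapshot against known audio.
tone = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(4096)]
fp1 = audio_fingerprint(tone)
fp2 = audio_fingerprint(list(tone))
```

Production fingerprinting (e.g., spectral-peak hashing) is more robust to noise and re-encoding; this sketch only illustrates the identify-by-hash idea.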
  • snapshots associated with the use of the device by the user can include content that is not associated with a browser.
  • snapshots 202 can be obtained from non-browser sources such as applications, web sites, social network sites, advertisements, and/or other sources.
  • obtaining the plurality of snapshots by the device can occur periodically or based on an environmental event.
  • snapshots 202 can be obtained periodically, such as at N-second or M-minute intervals, or snapshots 202 can be obtained whenever certain triggers occur, e.g., including user actions or other triggers.
  • the environmental event can be triggered by the device (e.g., the user device 106a), by an application (e.g., when the user starts the app or performs a triggering action), by a service (e.g., map application, calendar, or email) communicating with the device, by the operating system associated with the device, or based on a change of context, change of scene, or change of use of the device by the user.
  • a new snapshot 202 can occur when it is determined that a threshold percentage of the screen on the user device 106a has changed.
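A threshold-based trigger of this kind could be sketched as follows; the 30% threshold and flat pixel lists are illustrative assumptions, not values from the specification:

```python
def changed_fraction(prev_frame, curr_frame):
    """Fraction of pixels that differ between two equally sized frames,
    each given as a flat list of pixel values."""
    assert len(prev_frame) == len(curr_frame)
    changed = sum(1 for a, b in zip(prev_frame, curr_frame) if a != b)
    return changed / len(prev_frame)

def should_snapshot(prev_frame, curr_frame, threshold=0.3):
    """Trigger a new snapshot when at least `threshold` of the screen changed."""
    return changed_fraction(prev_frame, curr_frame) >= threshold

old = [0] * 100
new = [0] * 60 + [1] * 40   # 40% of pixels changed
```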
  • the environmental event can be a change in context of an application that is executing on the device, wherein a time used for detecting the change of context includes at least one of a substantially current time and a previous time.
  • the environmental event can be triggered by the user 206 moving from one level of an application or game to another level, or by reaching a milestone associated with the application or game.
  • the plurality of snapshots are evaluated (304).
  • the snapshot evaluation engine 121 can evaluate the received snapshots 202.
  • the snapshot evaluation engine 121 can identify, for each snapshot, entities 131 included in a snapshot 202.
  • the snapshot evaluation engine 121 can store the identified entities along with a timestamp associated with a time that a respective snapshot was captured.
  • receiving the snapshots associated with use of the device by the user can include receiving a hash that represents the content included in a respective snapshot, and evaluating the received snapshots can include using the hash, instead of the original content, in the evaluation.
  • the snapshot evaluation engine 121 can evaluate hash information associated with the content provided.
  • the information can include, for example, text that corresponds to the content (e.g., "your credit card ending * 1437"), or metadata associated with the content that describes what is contained in the content (e.g., "your address plus ZIP").
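As one hedged illustration of sending a hash plus descriptive metadata rather than the raw screen content (the payload shape is an assumption, not the patent's wire format):

```python
import hashlib

def snapshot_payload(text, metadata=None):
    """Represent snapshot content by a digest plus optional descriptive
    metadata, so the original screen content need not leave the device."""
    digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
    return {"content_hash": digest, "metadata": metadata or {}}

payload = snapshot_payload(
    "your credit card ending *1437",
    metadata={"describes": "payment reminder"},
)
```

The server can then evaluate the metadata, or compare the digest against hashes of content it already knows, without ever seeing the sensitive text itself.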
  • evaluating the received snapshots can further include identifying one or more events based on the entities included in a snapshot, determining a first time that is in the future when the event is to occur, storing the event along with the first time, determining a second time that is before the event, and determining the time to present can include determining that a current time is equal to the second time and presenting an information card includes presenting an information card for the event at the second time.
  • evaluating the snapshot 202 can indicate the existence of a lunch date event between Bob and Carol. The time/place of the lunch date and Carol's cell phone number can also be determined from the snapshot 202. The content management system 110 can use this information to identify the lunch date and to generate one or more information cards at predetermined times before the lunch date is to occur.
  • identifying entities included in the snapshot can further include identifying a natural language description of an event in the text.
  • the snapshot evaluation engine 121 can identify text in the content 204a that describes the event (e.g., a lunch date) or indicates the entities associated with the event (e.g., Bob, Carol, J's restaurant, and Carol's phone number).
  • the event can be an activity of interest to the user that is to occur in the future.
  • the event that is identified by the snapshot evaluation engine 121 can be the lunch date that Bob has with Carol, which is of interest to Bob.
  • a respective set of entities indicated by the respective snapshot is identified (306).
  • the snapshot evaluation engine 121 can identify entities 131 included in the snapshot 202 obtained from the content 204a.
  • the entities that are identified for the snapshot 202 can include Bob, Carol, J's restaurant, and Carol's cell phone number.
  • Indications of the respective set of entities and a respective timestamp indicating a respective time that the respective snapshot was captured are stored to a memory (308).
  • the respective set of entities and respective timestamp are associated in the memory.
  • the snapshot evaluation engine 121 can store the most prominent identified entities along with a timestamp associated with a time that a respective snapshot was captured.
  • the timestamps can be stored, for example, in the data store of timestamps 132 for later use in generating and presenting information cards 201 related to the snapshots 202 and associated entities.
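In the simplest case, the entity-to-timestamp association described above might be kept as an in-memory map from capture time to entity set; this is a sketch, and the actual layout of the data stores 131/132 is not specified:

```python
import time
from collections import defaultdict

class EntityStore:
    """In-memory association of snapshot capture times with the
    entities identified in each snapshot."""

    def __init__(self):
        self._by_time = defaultdict(set)

    def store(self, entities, timestamp=None):
        """Record a set of entities under a capture timestamp."""
        ts = timestamp if timestamp is not None else time.time()
        self._by_time[ts].update(entities)
        return ts

    def entities_at(self, timestamp):
        """Entities captured at exactly this timestamp (empty set if none)."""
        return set(self._by_time.get(timestamp, ()))

store = EntityStore()
ts = store.store({"Bob", "Carol", "J's restaurant", "555-0100"},
                 timestamp=1700000000)  # hypothetical capture time
```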
  • a first time to present one or more information cards to the user is determined (310).
  • the information card engine 122 can determine a time to present one or more information cards to the user including determining a target time. For example, for Bob's pending lunch date with Carol, the information card engine 122 can determine a reminder time for Bob that is one hour before the scheduled noon lunch date. In some implementations, multiple times to present information cards can be determined, e.g., to include a reminder, to be sent the night before, that Bob has a lunch date with Carol the following day.
  • the target time can be relative to the timestamp associated with the snapshot 202, such as to show the user the information card 220c at a later time.
  • target times can be calculated closer to the start time of an event, or can be re-calculated based on a current location of the user who is to receive the information card (e.g., Bob may need 90 minutes to drive to the lunch date, based on Bob's current location).
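The recalculation just described could be sketched as taking the later of a default lead time and the user's estimated travel time; the one-hour default and 90-minute drive are the example values from the text, and the function name is hypothetical:

```python
from datetime import datetime, timedelta

def reminder_time(event_time, default_lead=timedelta(hours=1), travel_time=None):
    """Target time for a reminder card: a fixed lead time before the
    event, pushed earlier if estimated travel time is longer."""
    lead = max(default_lead, travel_time or timedelta(0))
    return event_time - lead

lunch = datetime(2015, 10, 16, 12, 0)  # noon lunch date
t_default = reminder_time(lunch)                                       # 11:00
t_far_away = reminder_time(lunch, travel_time=timedelta(minutes=90))   # 10:30
```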
  • the target time can be a time in the past, e.g., the information card can provide a reminder for an event or entity surfaced to the user in the past.
  • the information card 220c can be based, not on an event, but on a past presentation of content related to the NYSE.
  • determining the time to present one or more information cards can include determining one or more predetermined times in the past and, for each time, determining one or more information cards for presentation to the user.
  • the information card engine 122 can determine multiple times to present the information card 220c, and the times can be based on when the user was first presented with content associated with the NYSE on which the information card 220c is based.
  • the predetermined times can be varied depending on a current context of the user. For example, based on the current actions of the user 206, e.g., being in the middle of an app or casually surfing the Internet, the information card engine 122 can delay or accelerate the generation of the data card (e.g., based on the user's current location).
  • information cards can be surfaced when requested by the user, such as when opening an application or tool that displays and/or manages information cards, and/or by requesting that all or particular information cards be presented. Other signals for surfacing information cards can be used.
  • entities having a timestamp that corresponds to the first time are located in memory (312).
  • the information card engine 122 can identify entities for use in generating an information card from the stored entities 131 based on a comparison of the target time with timestamps associated with a respective entity of the stored entities. For example, for the lunch date that is scheduled for Carol and Bob, the information card engine 122 can identify information to be used in an information card that is associated with the pending lunch date. For example, Carol's phone number can be an entity that is identified for the generation of the information card that includes a reminder to Bob. The information card can be sent at a target time one hour before the lunch date and can include Carol's cell phone number.
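A simple sketch of this timestamp comparison, with an assumed one-hour matching window and an assumed `(timestamp, entity)` record shape:

```python
def entities_for_target(records, target_time, window=3600):
    """Return entities whose snapshot timestamp falls within `window`
    seconds of the target presentation time."""
    return sorted(
        {entity for ts, entity in records if abs(ts - target_time) <= window}
    )

records = [
    (1000, "Carol"),
    (1200, "555-0100"),    # Carol's cell number, hypothetical
    (9999999, "NYSE"),     # unrelated, much later snapshot
]
matches = entities_for_target(records, target_time=1100)
```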
  • identifying entities can further include recognizing text in the snapshot, and parsing the text to identify entities.
  • the snapshot evaluation engine 121 can recognize that the snapshot 202 includes text.
  • the snapshot evaluation engine 121 can extract the text in various ways, such as by using optical character recognition (OCR) or other character recognition techniques, by extracting text from Hyper-Text Markup Language (HTML) or other code used for generating the content (e.g., content 204a or 204b), or by other techniques.
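For HTML-based content, text extraction can be done with the standard-library parser; this is a minimal sketch, not the engine's actual extraction pipeline:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the visible text from an HTML snapshot, skipping markup."""

    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        text = data.strip()
        if text:
            self.chunks.append(text)

def extract_text(html):
    """Return the visible text of an HTML fragment, whitespace-joined."""
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

snippet = extract_text("<p>Lunch at <b>J's restaurant</b>, noon.</p>")
```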
  • recognizing text in a snapshot can include using natural language processing techniques, e.g., that use a grammar associated with words or phrases in the text, or sources of snapshots (e.g., based on email formats, calendar entry formats, or other formats).
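A toy grammar for one date-and-time phrasing gives the flavor of such parsing; the pattern and sample phrasing are hypothetical, and a production system would use a much richer grammar:

```python
import re
from datetime import datetime

# Hypothetical mini-grammar for phrases like
# "lunch on 10/16/2015 at 12:00" found in snapshot text.
EVENT_RE = re.compile(
    r"(?P<what>\w+) on (?P<date>\d{1,2}/\d{1,2}/\d{4}) at (?P<time>\d{1,2}:\d{2})"
)

def parse_event(text):
    """Extract a single event description, or None if no phrase matches."""
    m = EVENT_RE.search(text)
    if not m:
        return None
    when = datetime.strptime(f"{m['date']} {m['time']}", "%m/%d/%Y %H:%M")
    return {"what": m["what"], "when": when}

event = parse_event("Reminder: lunch on 10/16/2015 at 12:00 with Carol")
```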
  • other visual recognition techniques can be applied to the snapshots, e.g., object recognition, landmark recognition, and/or other ways to detect entities from images.
  • An information card is generated based on one or more of the located entities (314).
  • the information card engine 122 can generate the information card 201 including the one or more identified entities 131 (e.g., an information card that includes Carol's cell phone number).
  • the generated information card is provided for presentation to the user (316). For example, once the information card 201 is generated, the information card 201 may be presented multiple times, for example, on the screen 208c of the user device 106a.
  • storing the identified entities can include storing contextual information associated with an identified entity, and presenting the information card can further include presenting the contextual information along with information about the identified entity on the information card.
  • when the snapshot 202 is evaluated by the snapshot evaluation engine 121, contextual information can also be determined and stored, e.g., a context associated with a respective snapshot that includes the entities (e.g., identifying the email message and the pending lunch date).
  • the information card 201 can include the context 224a (e.g., identifying the lunch date email or associated information).
  • Other example contexts are shown in contexts 224b and 224c.
  • FIG. 4 is a block diagram of example computing devices 400, 450 that may be used to implement the systems and methods described in this document, as either a client or as a server or plurality of servers.
  • Computing device 400 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers.
  • Computing device 400 is further intended to represent any other typically non-mobile devices, such as televisions or other electronic devices with one or more processors embedded therein or attached thereto.
  • Computing device 450 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, and other computing devices.
  • the components shown here, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the technologies described and/or claimed in this document.
  • Computing device 400 includes a processor 402, memory 404, a storage device 406, a high-speed controller 408 connecting to memory 404 and high-speed expansion ports 410, and a low-speed controller 412 connecting to low-speed bus 414 and storage device 406.
  • Each of the components 402, 404, 406, 408, 410, and 412, are interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate.
  • the processor 402 can process instructions for execution within the computing device 400, including instructions stored in the memory 404 or on the storage device 406 to display graphical information for a GUI on an external input/output device, such as display 416 coupled to high-speed controller 408.
  • multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory.
  • multiple computing devices 400 may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).
  • the memory 404 stores information within the computing device 400.
  • the memory 404 is a computer-readable medium. In one implementation, the memory 404 is a volatile memory unit or units. In another implementation, the memory 404 is a non-volatile memory unit or units.
  • the storage device 406 is capable of providing mass storage for the computing device 400. In one implementation, the storage device 406 is a computer-readable medium. In various different implementations, the storage device 406 may be a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations.
  • a computer program product is tangibly embodied in an information carrier. The computer program product contains instructions that, when executed, perform one or more methods, such as those described above.
  • the information carrier is a computer- or machine-readable medium, such as the memory 404, the storage device 406, or memory on processor 402.
  • the high-speed controller 408 manages bandwidth-intensive operations for the computing device 400, while the low-speed controller 412 manages lower bandwidth-intensive operations. Such allocation of duties is an example only. In one implementation, the high-speed controller 408 is coupled to memory 404, display 416 (e.g., through a graphics processor or accelerator), and to high-speed expansion ports 410, which may accept various expansion cards (not shown).
  • low-speed controller 412 is coupled to storage device 406 and low-speed bus 414. The low-speed bus 414 (e.g., a low-speed expansion port), which may include various communication ports, may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.
  • the computing device 400 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 420, or multiple times in a group of such servers. It may also be implemented as part of a rack server system 424. In addition, it may be implemented in a personal computer such as a laptop computer 422. Alternatively, components from computing device 400 may be combined with other components in a mobile device (not shown), such as computing device 450. Each of such devices may contain one or more of computing devices 400, 450, and an entire system may be made up of multiple computing devices 400, 450 communicating with each other.
  • Computing device 450 includes a processor 452, memory 464, an input/output device such as a display 454, a communication interface 466, and a transceiver 468, among other components.
  • the computing device 450 may also be provided with a storage device, such as a micro-drive or other device, to provide additional storage.
  • Each of the components 450, 452, 464, 454, 466, and 468 are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate.
  • the processor 452 can process instructions for execution within the computing device 450, including instructions stored in the memory 464.
  • the processor may also include separate analog and digital processors.
  • the processor may provide, for example, for coordination of the other components of the computing device 450, such as control of user interfaces, applications run by computing device 450, and wireless communication by computing device 450.
  • Processor 452 may communicate with a user through control interface 458 and display interface 456 coupled to a display 454.
  • the display 454 may be, for example, a TFT LCD display or an OLED display, or other appropriate display technology.
  • the display interface 456 may comprise appropriate circuitry for driving the display 454 to present graphical and other information to a user.
  • the control interface 458 may receive commands from a user and convert them for submission to the processor 452.
  • an external interface 462 may be provided in communication with processor 452, so as to enable near area communication of computing device 450 with other devices.
  • External interface 462 may provide, for example, for wired communication (e.g., via a docking procedure) or for wireless communication (e.g., via Bluetooth ® or other such technologies).
  • the memory 464 stores information within the computing device 450.
  • the memory 464 is a computer-readable medium. In one implementation, the memory 464 is a volatile memory unit or units. In another implementation, the memory 464 is a non-volatile memory unit or units. Expansion memory 474 may also be provided and connected to computing device 450 through expansion interface 472, which may include, for example, a subscriber identification module (SIM) card interface. Such expansion memory 474 may provide extra storage space for computing device 450, or may also store applications or other information for computing device 450. Specifically, expansion memory 474 may include instructions to carry out or supplement the processes described above, and may include secure information also. Thus, for example, expansion memory 474 may be provided as a security module for computing device 450, and may be programmed with instructions that permit secure use of computing device 450. In addition, secure applications may be provided via the SIM cards, along with additional information, such as placing identifying information on the SIM card in a non-hackable manner.
  • the memory may include for example, flash memory and/or MRAM memory, as discussed below.
  • a computer program product is tangibly embodied in an information carrier.
  • the computer program product contains instructions that, when executed, perform one or more methods, such as those described above.
  • the information carrier is a computer- or machine-readable medium, such as the memory 464, expansion memory 474, or memory on processor 452.
  • Computing device 450 may communicate wirelessly through communication interface 466, which may include digital signal processing circuitry where necessary. Communication interface 466 may provide for communications under various modes or protocols, such as GSM voice calls, SMS, EMS, or MMS messaging, CDMA, TDMA, PDC, WCDMA, CDMA2000, or GPRS, among others. Such communication may occur, for example, through transceiver 468 (e.g., a radio-frequency transceiver). In addition, short-range communication may occur, such as using a Bluetooth ® , WiFi, or other such transceiver (not shown). In addition, GPS receiver module 470 may provide additional wireless data to computing device 450, which may be used as appropriate by applications running on computing device 450.
  • Computing device 450 may also communicate audibly using audio codec 460, which may receive spoken information from a user and convert it to usable digital information. Audio codec 460 may likewise generate audible sound for a user, such as through a speaker, e.g., in a handset of computing device 450. Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, etc.) and may also include sound generated by applications operating on computing device 450.
  • the computing device 450 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a cellular telephone 480. It may also be implemented as part of a smartphone 482, personal digital assistant, or other mobile device.
  • Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof.
  • These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
  • the systems and techniques described here can be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer.
  • Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • the systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components.
  • the components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN”), a wide area network (“WAN”), and the Internet.
  • the computing system can include clients and servers.
  • a client and server are generally remote from each other and typically interact through a communication network.
  • the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • General Engineering & Computer Science (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

Methods, systems, and apparatus, including computer programs encoded on a computer-readable storage medium, for providing content. Snapshots associated with use of a computing device by a user are received. Each snapshot is based on content presented to the user on the computing device. The snapshots are evaluated. For each respective snapshot, a respective set of entities indicated by the respective snapshot is identified. Indications of the respective set of entities and a respective timestamp indicating a respective time that the respective snapshot was captured are associated and stored. Based on a first snapshot of the snapshots, a first time to present one or more information cards to the user is determined. At the first time, entities having a timestamp that corresponds to the first time are located. An information card is generated based on the located entities. The generated information card is provided for presentation to the user.
PCT/US2015/056019 2014-11-26 2015-10-16 Présentation de cartes d'informations pour des évènements associés à des entités WO2016085585A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE112015005293.3T DE112015005293T5 (de) 2014-11-26 2015-10-16 Präsentation von Informationskarten für Ereignisse, die mit Entitäten verbunden sind
CN201580035494.XA CN106663112A (zh) 2014-11-26 2015-10-16 呈现与实体相关联的事件的信息卡

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/555,111 US20160027044A1 (en) 2013-12-19 2014-11-26 Presenting information cards for events associated with entities
US14/555,111 2014-11-26

Publications (1)

Publication Number Publication Date
WO2016085585A1 true WO2016085585A1 (fr) 2016-06-02

Family

ID=54477249

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/056019 WO2016085585A1 (fr) 2014-11-26 2015-10-16 Presenting information cards for events associated with entities

Country Status (3)

Country Link
CN (1) CN106663112A (fr)
DE (1) DE112015005293T5 (fr)
WO (1) WO2016085585A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10762146B2 (en) 2017-07-26 2020-09-01 Google Llc Content selection and presentation of electronic content
EP3776193B1 (fr) * 2018-03-30 2023-10-25 Fullstory, Inc. Capturing and processing interactions with a user interface of an originating application
CN111061530A (zh) * 2019-12-05 2020-04-24 Vivo Mobile Communication Co., Ltd. Image processing method, electronic device, and storage medium

Citations (3)

Publication number Priority date Publication date Assignee Title
US20120323853A1 (en) * 2011-06-17 2012-12-20 Microsoft Corporation Virtual machine snapshotting and analysis
US20130115888A1 (en) * 2011-11-09 2013-05-09 At&T Mobility Ii Llc Received signal strength indicator snapshot analysis
US20140351217A1 (en) * 2013-05-23 2014-11-27 Oracle International Corporation Database snapshot analysis

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
US20030040925A1 (en) * 2001-08-22 2003-02-27 Koninklijke Philips Electronics N.V. Vision-based method and apparatus for detecting fraudulent events in a retail environment
US7707039B2 (en) * 2004-02-15 2010-04-27 Exbiblio B.V. Automatic modification of web pages
US20090110173A1 (en) * 2007-10-31 2009-04-30 Nokia Corporation One touch connect for calendar appointments
CN101968865B (zh) * 2010-11-17 2013-12-11 Intsig Information Co., Ltd. Method for adding reminder events in an electronic calendar
US20120224711A1 (en) * 2011-03-04 2012-09-06 Qualcomm Incorporated Method and apparatus for grouping client devices based on context similarity
US8787407B2 (en) * 2011-10-14 2014-07-22 Alcatel Lucent Processing messages correlated to multiple potential entities
US9460608B2 (en) * 2012-09-13 2016-10-04 Apple Inc. Reminder creation for tasks associated with a user event
US20140155022A1 (en) * 2012-12-05 2014-06-05 Anil Kandregula Methods and apparatus to monitor usage of mobile devices
CN103763435B (zh) * 2014-01-14 2016-03-16 Shenzhen Gionee Communication Equipment Co., Ltd. Event reminding method and mobile terminal

Also Published As

Publication number Publication date
DE112015005293T5 (de) 2017-08-17
CN106663112A (zh) 2017-05-10

Similar Documents

Publication Publication Date Title
KR101769058B1 (ko) Hashtag and content presentation
US11361344B2 (en) Combining content with a search result
US20160027044A1 (en) Presenting information cards for events associated with entities
US11244352B2 (en) Selecting content associated with a collection of entities
US20220051288A1 (en) Presenting options for content delivery
US10862888B1 (en) Linking a forwarded contact on a resource to a user interaction on a requesting source item
US9063972B1 (en) Increasing user retention and re-engagement in social networking services
US20120143701A1 (en) Re-publishing content in an activity stream
WO2013134393A1 (fr) Providing content to a user across multiple devices
US10178189B1 (en) Attributing preferences to locations for serving content
US11449905B2 (en) Third party customized content based on first party identifer
US9882867B2 (en) Providing content to devices in a cluster
US9436946B2 (en) Selecting content based on entities present in search results
WO2016085585A1 (fr) Presenting information cards for events associated with entities
WO2015000176A1 (fr) Providing dynamic content from social network web pages
US10042936B1 (en) Frequency-based content analysis
US20150199718A1 (en) Selecting content items using entities of search results

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15791086

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 112015005293

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15791086

Country of ref document: EP

Kind code of ref document: A1