WO2015126851A1 - Generating summaries of social interactions between users


Info

Publication number: WO2015126851A1
Authority: WIPO (PCT)
Prior art keywords: user, relationship, signal stream, data, engine
Application number: PCT/US2015/016208
Other languages: English (en)
Inventors: Nadav Aharony; Alan Lee GARDNER, III; George Cody SUMTER
Original Assignee: Google Inc.
Priority claimed from US14/622,794 (US9672291B2)
Application filed by Google Inc.
Priority to EP15751343.3A (EP3108441A4)
Priority to CN201580017103.1A (CN106133786B)
Publication of WO2015126851A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/01Social networking

Definitions

  • the specification relates to managing user activities. More specifically, the specification relates to analyzing user activities and summarizing social interactions between users.
  • a system for summarizing social interactions between users includes a processor and a memory storing instructions that, when executed, cause the system to: receive a signal stream from at least one of a hardware sensor and a virtual detector, filter the signal stream and output a filtered signal stream including data defining human-understandable actions, identify activities associated with a first user from the filtered signal stream, generate a summary of the first user's activities, determine that the first user is within proximity to a second user, determine a degree of separation between the first user and the second user in a social network, determine a time elapsed since a last interaction between the first user and the second user, classify the first user's relationship with the second user as being a first type of relationship, a second type of relationship or a third type of relationship, responsive to having the first type of relationship, generate a first summary for the first user that includes a notification that the second user is nearby, a last interaction with the second user and recent interactions with the second user, responsive to having the second type of relationship, generate a second summary for the first user that includes the notification that the second user is nearby, the last interaction with the second user and events that the first user and the second user share in common, and responsive to having the third type of relationship, generate a third summary for the first user that includes the notification that the second user is nearby and events that the first user and the second user share in common.
  • in another embodiment, a method includes: receiving a signal stream from at least one of a hardware sensor and a virtual detector, filtering the signal stream and outputting a filtered signal stream including data defining human-understandable actions, identifying activities associated with a first user from the filtered signal stream, generating a summary of the first user's activities, determining that the first user is within proximity to a second user, determining a degree of separation between the first user and the second user in a social network, determining a time elapsed since a last interaction between the first user and the second user, classifying the first user's relationship with the second user as being a first type of relationship, a second type of relationship or a third type of relationship, responsive to having the first type of relationship, generating a first summary for the first user that includes a notification that the second user is nearby, a last interaction with the second user and recent interactions with the second user, responsive to having the second type of relationship, generating a second summary for the first user that includes the notification that the second user is nearby, the last interaction with the second user and events that the first user and the second user share in common, and responsive to having the third type of relationship, generating a third summary for the first user that includes the notification that the second user is nearby and events that the first user and the second user share in common.
  • the operations include: determining closeness between the first user and the second user based on at least one of the degree of separation and the time elapsed since the last interaction, and wherein classifying the first user's relationship with the second user is based on the closeness; determining what information of the summary to provide to the first user based on privacy settings; and determining that the first user is within proximity to the second user based at least in part on data received from at least one of the hardware sensor and the virtual detector.
  • the features include: the first summary including action items that the first user owes the second user; the second summary including important events that occurred to the first user that the second user might be interested in hearing about; the second summary including a name of a mutual connection and an event that the mutual connection attended; the second summary including a recent post on a social network that was created by the second user; and the third type of relationship being between a first user and a second user who have not met in person.
  • the disclosure may be particularly advantageous in improving social interactions among people because a first user can get different summaries of user activities based upon different connections with other users; the summaries remind the first user who the other users are, where they last met, which topics they might discuss, etc.
  • Figure 1 is a block diagram illustrating an example of a system for generating a summary of a user.
  • Figure 2 is a block diagram illustrating an example of a summary application.
  • Figure 3A is an example graphic representation of a user interface for displaying a summary where the first user and the second user are close friends and/or interact with each other frequently.
  • Figures 3B and 3C are example graphic representations of user interfaces for displaying a summary where the first user and the second user are friends and/or interact with each other infrequently.
  • Figure 3D is an example graphic representation of a user interface for displaying a summary where the first user and the second user are strangers with information shared in common.
  • Figure 4 is a flow diagram of an example of a method for generating a summary for a first user.
  • Figures 5A and 5B are flow diagrams of another example of a method for generating a summary for a first user depending on the type of relationship between the users.
  • the specification discloses a system and method for summarizing social interactions between users.
  • the summary application receives a signal stream from at least one of a hardware sensor and a virtual detector.
  • the summary application filters the signal stream and outputs a filtered signal stream including data for defining one or more human-understandable actions.
  • the summary application identifies one or more activities associated with a first user from the filtered signal stream.
  • the summary application generates a summary of the first user's activities. For example, the first user attended a conference, posted pictures of an important event, and checked-in at a restaurant.
  • the summary application determines that the first user is within proximity to a second user and determines a degree of separation between the first user and the second user in a social network.
  • the summary application determines a time elapsed since a last interaction between a first user and a second user. For example, if more than a month has passed since they last interacted, they are not close friends.
  • the summary application classifies the first user's relationship with the second user as being a first type of relationship, a second type of relationship or a third type of relationship. Responsive to having the first type of relationship, the summary application generates a first summary for the first user that includes a notification that the second user is nearby, a last interaction with the second user and recent interactions with the second user.
  • the first type of relationship includes, for example, friendship.
  • Responsive to having the second type of relationship, the summary application generates a second summary for the first user that includes the notification that the second user is nearby, the last interaction with the second user and events that the first user and the second user share in common. This applies when the users are acquaintances.
  • the summary application also generates a summary of all the important life events that occurred to the first user since he last spoke with the second user. For example, the first user started a new job and had a baby. Responsive to having the third type of relationship, the summary application generates a third summary for the first user that includes the notification that the second user is nearby and events that the first user and the second user share in common. For example, the first user and second user both attended the same conference last week. This is for users that do not know each other very well and gives them things to discuss.
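  • The three-way branching described above can be sketched in a few lines of code. The following Python fragment is an illustration only; the function name, relationship labels, summary fields and interaction-history representation are assumptions chosen to mirror the description, not part of the disclosure.

```python
def generate_summary(relationship_type, second_user, interactions,
                     shared_events):
    """Assemble a summary whose contents depend on the classified
    relationship type, mirroring the three branches described above."""
    summary = {"notification": f"{second_user} is nearby"}
    last = interactions[-1] if interactions else None
    if relationship_type == "first":        # e.g., close friends
        summary["last_interaction"] = last
        summary["recent_interactions"] = interactions[-5:]
    elif relationship_type == "second":     # e.g., acquaintances
        summary["last_interaction"] = last
        summary["shared_events"] = shared_events
    else:                                   # third type: never met in person
        summary["shared_events"] = shared_events
    return summary

print(generate_summary("second", "Joe",
                       ["met at Mary's house three months ago"],
                       ["both attended the same conference last week"]))
```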
  • Figure 1 illustrates a block diagram of a system 100 for summarizing social interactions between users.
  • the illustrated system 100 includes user devices 115a...115n that are accessed by users 125a...125n, one or more social network servers 101 and an event server 107. In the illustrated embodiment, these entities of the system 100 are communicatively coupled via a network 105.
  • a letter after a reference number, for example "115a", is a reference to the element having that particular reference number.
  • a reference number in the text without a following letter, for example "115", is a general reference to any or all instances of the element bearing that reference number.
  • the network 105 can be a conventional type, wired or wireless, and may have any number of configurations, for example a star configuration, a token ring configuration or other configurations known to those skilled in the art. Furthermore, the network 105 may comprise a local area network (LAN), a wide area network (WAN) (e.g., the Internet), and/or any other interconnected data path across which multiple devices may communicate. In some embodiments, the network 105 may be a peer-to-peer network. The network 105 may also be coupled to or include portions of a telecommunications network for sending data in a variety of different communication protocols.
  • the network 105 includes Bluetooth communication networks or a cellular communications network for sending and receiving data for example via SMS/MMS, hypertext transfer protocol (HTTP), direct data connection, WAP, e-mail, etc. While only one network 105 is illustrated, in practice one or more networks 105 may be coupled to the above mentioned entities.
  • the social network server 101 can be a hardware server that includes a processor, a memory and network communication capabilities.
  • the social network server 101 is communicatively coupled to the network 105 via signal line 102.
  • the social network server 101 sends and receives data to and from one or more of the user devices 115a...115n and the event server 107 via the network 105.
  • the social network server 101 includes a social network application 109 and a database 199.
  • a social network can be a type of social structure where the users may be connected by a common feature.
  • the common feature includes relationships/connections, e.g., friendship, family, work, an interest, etc.
  • the common features may be provided by one or more social networking systems including explicitly defined relationships and relationships implied by social connections with other online users.
  • the social network application 109 in the social network server 101 manages the social network by handling registration of users, publication of content (e.g. posts, comments, photos, links, check-ins, etc.), hosting multi-user communication sessions, managing of groups, managing different sharing levels, updating the social graph, etc.
  • the social network application 109 registers a user by receiving information such as a username and password and generates a user profile that is associated with the user and stored as part of the social graph.
  • the user profile includes additional information about the user, including the user's interests.
  • the database 199 in the social network server 101 stores social network data associated with the users.
  • the database 199 stores social network data describing one or more of user profiles, posts, comments, videos, audio files, images, sharings, acknowledgements, etc., published on a social network.
  • the system 100 may include multiple social network servers 101 that include traditional social network servers, email servers, micro-blog servers, blog servers, forum servers, message servers, etc.
  • the social network application 109 may be representative of one social network and there may be multiple social networks coupled to the network 105, each having its own server, application and social graph.
  • for example, a first social network may be more directed to business networking, a second may be more directed to or centered on academics, a third may be more directed to local business, a fourth may be directed to dating and others may be of general interest or a specific focus.
  • the user devices 115a, 115n in Figure 1 are used by way of example.
  • the disclosure applies to a system architecture having any number of user devices 115 available to any number of users 125.
  • the user 125a interacts with the user device 115a.
  • the summary application 103a can be stored on the user device 115a which is communicatively coupled to the network 105 via signal line 108.
  • the user 125n interacts with the user device 115n.
  • the user device 115n is communicatively coupled to the network 105 via signal line 110.
  • the user device 115 can be any computing device that includes a memory and a processor.
  • the user devices 115 can be a laptop computer, a desktop computer, a tablet computer, a mobile telephone, a personal digital assistant, a mobile email device, a portable game player, a portable music player, a television with one or more processors embedded therein or coupled thereto or any other electronic device capable of accessing the network 105.
  • the user device 115 can include a mobile device that is worn by the user 125.
  • the user device 115 is included as part of a clip (e.g., a wristband), as part of jewelry or as part of a pair of glasses.
  • the user device 115 can be a smart watch.
  • the user 125 can view notifications from the summary application 103 on a display of the device worn by the user 125.
  • the user 125 can view the notifications on a display of a smart watch or a smart wristband.
  • the user 125 can view the notifications on an optical head-mounted display of a pair of glasses.
  • the user 125 may also configure what types of notifications to be displayed on the device worn by the user 125.
  • the user 125 may configure the wearable device to flash an LED light for five seconds if a friend's mobile device is detected in proximity to the user 125.
  • the summary application 103 can be split into some components that are stored on the user device 115a and some components that are stored on the event server 107.
  • the summary application 103a on the user device 115a acts in part as a thin-client application and sends an event stream including one or more events associated with a user to the summary application 103b on the event server 107.
  • the summary application 103b on the event server 107 augments the event stream by including new events and sends the updated event stream back to the summary application 103a on the user device 115a for presenting the event stream to the user 125a.
  • the summary application 103b can be stored on an event server 107, which is connected to the network 105 via signal line 104.
  • the event server 107 can be a hardware server that includes a processor, a memory and network communication capabilities. The event server 107 sends and receives data to and from other entities of the system 100 via the network 105. While Figure 1 illustrates one event server 107, the system 100 may include one or more event servers 107.
  • the summary application 103 can be software including routines for generating a summary of user activities.
  • the summary application 103 can be implemented using hardware including a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC).
  • the summary application 103 can be implemented using a combination of hardware and software.
  • the summary application 103 may be stored in a combination of the devices and servers, or in one of the devices or servers. The summary application 103 is described in further detail below with reference to Figure 2.
  • the summary application 103 identifies activities associated with users and generates a summary of the user activities for a user. In some embodiments, the summary application 103 determines that a first user is within proximity of a second user, generates a summary based on the relationship between the first and second users and provides the first user with the summary. The summary includes a notification that the second user is nearby, a last time the first user interacted with the second user and information about the at least one of the first user and the second user.
  • the summary application 103 determines that Joe is an acquaintance of Amy (because they have not connected with each other for the past three months) and generates a summary for Amy to notify her that Joe is nearby and that their last interaction was at Mary's house three months ago.
  • the summary also includes a picture of Joe looking at the Seagull Monument in Salt Lake City, a picture of Amy's new house and a picture of Amy having lunch with Mary.
  • the summary reminds Amy of Joe and provides topics that they can discuss (e.g., the Seagull Monument in Salt Lake City, Amy's new house or their mutual friend Mary). As a result, the connection between Amy and Joe might be improved.
  • the summary application 103 generates different summaries for a user based on different relationships between the user and other users. For example, the summary application 103 determines that Ryan is a close friend of Richard since they talk on a social network every week. When Ryan is attending a conference in the city where Richard lives, the summary application 103 detects Ryan's location and generates a summary to notify Richard that Ryan is nearby and remind Richard that he told Ryan that they would visit a neighboring national park together when Ryan comes to the city.
  • the summary application 103 may generate a different summary that includes the information "both of you attended Murray High School" for Richard responsive to Oscar attending the conference in the city.
  • FIG. 2 is a block diagram of a computing device 200 that includes the summary application 103, a processor 235, a memory 237, a communication unit 241, a storage device 243 and one or more hardware sensors 252a...252n according to some examples.
  • the components of the computing device 200 are communicatively coupled by a bus 220.
  • the computing device 200 can be one of a user device 115 and an event server 107.
  • the processor 235 includes an arithmetic logic unit, a microprocessor, a general-purpose controller or some other processor array to perform computations and provide electronic display signals to a display device.
  • the processor 235 is coupled to the bus 220 via signal line 236 for communication with the other components.
  • Processor 235 may process data signals and may comprise various computing architectures including a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture, or an architecture implementing a combination of instruction sets.
  • although only a single processor 235 is illustrated, multiple processors 235 may be included.
  • the processing capability may be limited to supporting the display of images and the capture and transmission of images.
  • the processing capability might be enough to perform more complex tasks, including various types of feature extraction and sampling.
  • other processors, operating systems, sensors, displays and physical configurations are possible.
  • the memory 237 stores instructions and/or data that may be executed by processor 235.
  • the memory 237 is coupled to the bus 220 via signal line 238 for communication with the other components.
  • the instructions and/or data may include code for performing any and/or all of the techniques described herein.
  • the memory 237 may be a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, flash memory or some other memory device known in the art.
  • the memory 237 also includes a non-volatile memory or similar permanent storage device and media for example a hard disk drive, a CD-ROM device, a DVD-ROM device, a DVD- RAM device, a DVD-RW device, a flash memory device, or some other mass storage device known in the art for storing information on a more permanent basis.
  • the communication unit 241 transmits and receives data to and from at least one of the user device 115, the event server 107 and the social network server 101 depending upon where the summary application 103 is stored.
  • the communication unit 241 is coupled to the bus 220 via signal line 242.
  • the communication unit 241 includes a port for direct physical connection to the network 105 or to another communication channel.
  • the communication unit 241 includes a USB, SD, CAT-5 or similar port for wired communication with the user device 115.
  • the communication unit 241 includes a wireless transceiver for exchanging data with the user device 115 or any other communication channel using one or more wireless communication methods, such as IEEE 802.11, IEEE 802.16, BLUETOOTH® or another suitable wireless communication method.
  • the communication unit 241 includes a cellular communications transceiver for sending and receiving data over a cellular communications network such as via short messaging service (SMS), multimedia messaging service (MMS), hypertext transfer protocol (HTTP), direct data connection, WAP, e-mail or another suitable type of electronic communication.
  • the communication unit 241 includes a wired port and a wireless transceiver.
  • the communication unit 241 also provides other conventional connections to the network for distribution of files and/or media objects using standard network protocols such as TCP/IP, HTTP, HTTPS and SMTP as will be understood to those skilled in the art.
  • the storage device 243 can be a non-transitory memory that temporarily stores data used by the summary application 103, for example, a cache.
  • the storage device 243 may be a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, flash memory or some other memory device known in the art.
  • the storage device 243 also includes a non-volatile memory or similar permanent storage device and media such as a hard disk drive, a CD-ROM device, a DVD- ROM device, a DVD-RAM device, a DVD-RW device, a flash memory device, or some other mass storage device known in the art for storing information on a more permanent basis.
  • the storage device 243 is communicatively coupled by the bus 220 for communication with the other components of the computing device 200 via signal line 240. Although only one storage device 243 is shown in Figure 2, multiple storage devices 243 may be included. In other embodiments, the storage device 243 may not be included in the user device 115 and can be communicatively coupled to the user device 115 via the network 105.
  • the storage device 243 stores one or more of raw data, signal streams, activities performed by one or more users and analytics data associated with the activities.
  • the data stored in the storage device 243 is described below in more detail.
  • the storage device 243 may store other data for providing the functionality described herein.
  • the hardware sensors 252a...252n are physical sensors for detecting data.
  • Example hardware sensors 252 include, but are not limited to, an infrared sensor, an accelerometer, a pedometer, a global positioning system (GPS) sensor, a Bluetooth sensor, a power detector, a battery detector, a camera, a light detection and ranging (LIDAR) sensor, a motion sensor, a capacitive sensor, a thermostat, a microphone, etc.
  • Other example hardware sensors 252 are possible.
  • the hardware sensor 252a is communicatively coupled to the bus 220 via signal line 251, and the hardware sensor 252n is communicatively coupled to the bus 220 via signal line 253.
  • the one or more hardware sensors 252 generate sensor data and send the sensor data to a processing unit 204 of the summary application 103.
  • the sensor data generated by the one or more hardware sensors 252 are referred to as hardware raw data.
  • Example hardware raw data includes, but is not limited to, data describing a number of steps from a pedometer, data describing a geographic location (e.g., a latitude, a longitude and an elevation of a location) and a velocity from a GPS sensor, data describing a presence of other devices in close proximity to the user device 115 from a Bluetooth sensor, data describing a movement from an accelerometer (e.g., the user device 115 being held in a certain orientation while watching a video, playing a video game, etc.), data describing brightness in an environment from a light detector, data describing ambient sounds from a microphone, data describing detected wireless access points from wireless transceivers, etc.
  • Other example hardware raw data is possible.
  • the summary application 103 includes a virtual detector 202, a processing unit 204, a filter engine 206, an activity identifier 208, an aggregator 210, a summarizing engine 212, a user interface engine 214 and a privacy engine 216.
  • the virtual detector 202 can be software including routines for generating raw data.
  • the virtual detector 202 can be a set of instructions executable by the processor 235 to provide the functionality described below for generating raw data.
  • the virtual detector 202 can be stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235.
  • the virtual detector 202 may be adapted for cooperation and communication with the processor 235 and other components of the computing device 200 via signal line 230.
  • the one or more hardware sensors 252 generate hardware raw data, and send the hardware raw data to the processing unit 204.
  • the virtual detector 202 generates other raw data that is not related to hardware sensors 252, and sends the other raw data to the processing unit 204.
  • the other raw data generated by the virtual detector 202 is referred to as virtual raw data.
  • the virtual detector 202 generates the virtual raw data with permission from the user.
  • Example virtual raw data includes, but is not limited to: software raw data related to software stored on the user device 115; mobile network information related to the user device 115's mobile network; file status on the user device 115; data describing interactions between the user and the user device 115 (e.g., the user turning up or turning down volume, brightness or contrast, zooming in or out of content displayed on the user device 115, scrolling down on a touch screen, typing in a user interface, making a phone call using the user device 115, etc.); data describing user interactions on a social network (e.g., the user viewing a social stream on a social network; the user publishing a post, sharing a web page, posting a comment, viewing a video, listening to an audio file, playing an online game, submitting a survey, adding users as his or her connections, etc., on the social network); the user's online search history; the user's browsing history; and the user's communication history (e.g., text messages, emails, etc.). The virtual raw data is retrieved with permission from the user.
  • the virtual raw data includes metadata associated with the user device 115.
  • Example software raw data related to software stored on the user device 115 includes, but is not limited to, operating system information related to the user device 115 (e.g., the user updating the operating system, switching the operating system, etc.), applications stored on the user device 115 (e.g., applications for fitness tracking, counting calories, mobile payment, reading books, listening to music, etc.) and application usage information on the user device 115 (e.g., the user entering his or her gym routine into a fitness tracking application, opening a song playlist in a media library, closing an instant messaging application, deleting an unused application, updating an existing application, installing a new application, configuring an application setting, etc.).
  • Other example software raw data is possible.
  • the virtual detector 202 stores the virtual raw data in the storage device 243.
  • the processing unit 204 can be software including routines for receiving signal streams from the virtual detector 202 and/or one or more hardware sensors 252.
  • the processing unit 204 can be a set of instructions executable by the processor 235 to provide the functionality described below for receiving signal streams from the virtual detector 202 and/or one or more hardware sensors 252.
  • the processing unit 204 can be stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235.
  • the processing unit 204 may be adapted for cooperation and communication with the processor 235 and other components of the computing device 200 via signal line 232.
  • the processing unit 204 receives a signal stream from the virtual detector 202, where the signal stream includes virtual raw data generated by the virtual detector 202. In other embodiments, the processing unit 204 receives a signal stream from one or more hardware sensors 252, where the signal stream includes hardware raw data generated by the one or more hardware sensors 252. In some other embodiments, the processing unit 204 receives a stream of virtual raw data from the virtual detector 202 and a stream of hardware raw data from the one or more hardware sensors 252, where the stream of virtual raw data and the stream of hardware raw data together form a consolidated signal stream. The processing unit 204 sends the signal stream to the filter engine 206. In some embodiments, the processing unit 204 stores the signal stream in the storage device 243.
  • the processing unit 204 validates the data in the signal stream for its usefulness. In some embodiments, the processing unit 204 saves a data block from the signal stream that indicates a change in state when compared to a previous data block. For example, at a first timestamp, the processing unit 204 may receive a first set of location data from a GPS sensor indicating a user has just arrived at a coffee shop after coming out of a subway station, and the processing unit 204 may save the first set of location data.
  • At a second timestamp, if the processing unit 204 receives, from the GPS sensor, a second set of location data which is identical to the first set of location data, indicating the user is at the same location as at the first timestamp, the processing unit 204 does not save the second set of location data. However, at a third timestamp, if the processing unit 204 receives, from the GPS sensor, a third set of location data which is different from the second set of location data, indicating the user has left the coffee shop and is now in the office, the processing unit 204 saves the third set of location data.
  • the processing unit 204 saves data related to the transit moments and ignores data related to the stationary moments.
  • the processing unit 204 saves the data from the signal stream that indicates a change in a frequency of steps (e.g., data from an accelerometer), a change of velocity (e.g., data from a GPS sensor), a change of location (e.g., data from a GPS sensor, a wireless transceiver, etc.), a change in application usage (e.g., an application being opened, used, closed, updated, installed, etc.), a change in actions performed on a social network (e.g., a user logging in, logging out, uploading a photograph, accepting invites, posting a comment, indicating an acknowledgement, adding other users as connections, etc.), a change related to detecting a presence of other user devices 115n in close proximity of the user device 115a or other changes in state.
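  • A minimal sketch of this change-in-state policy is given below. Python is used for illustration; the block representation ((timestamp, payload) pairs) and the function name are assumptions, not part of the disclosure.

```python
def save_state_changes(signal_stream):
    """Keep only data blocks whose payload differs from the previous one.

    Each block is modeled as a (timestamp, payload) pair; stationary
    moments are dropped and transit moments are kept, mirroring the
    coffee-shop example above (hypothetical representation)."""
    saved = []
    previous_payload = None
    for timestamp, payload in signal_stream:
        if payload != previous_payload:  # state changed since the last block
            saved.append((timestamp, payload))
            previous_payload = payload
    return saved

# The identical second reading is ignored; arrival and departure are saved.
stream = [(1, ("GPS", 40.76, -111.89)),  # user arrives at a coffee shop
          (2, ("GPS", 40.76, -111.89)),  # same location: not saved
          (3, ("GPS", 40.77, -111.90))]  # user moves to the office: saved
print(save_state_changes(stream))        # -> blocks 1 and 3 only
```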
  • the filter engine 206 can be software including routines for filtering signal streams.
  • the filter engine 206 can be a set of instructions executable by the processor 235 to provide the functionality described below for filtering signal streams.
  • the filter engine 206 can be stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235.
  • the filter engine 206 may be adapted for cooperation and communication with the processor 235 and other components of the computing device 200 via signal line 234.
  • the filter engine 206 filters the signal stream to define one or more human-understandable actions. For example, the filter engine 206 filters the signal stream to retrieve data describing a number of steps from the accelerometer of the user device 115 and outputs a filtered signal stream including step data. In another example, the filter engine 206 filters the signal stream to retrieve a sequence of location and velocity data from a GPS sensor of the user device 115 and outputs a filtered signal stream including location data. In yet another example, the filter engine 206 filters the signal stream to retrieve data describing detection of a mobile device in close proximity to the user device 115 and outputs a filtered signal stream including detection data. Such a filtered signal stream includes hashed identifiers (i.e., hashed using phone number, email or social network profile identifiers, etc.) associated with the mobile device in close proximity of the user device 115.
  • the filter engine 206 filters the signal stream to combine different types of data in a filtered signal stream to define one or more human-understandable actions. For example, the filter engine 206 outputs a filtered signal stream that combines one or more of the following data: (1) location and velocity data from a GPS sensor, and (2) detection data indicating presence of an automobile (e.g., Bluetooth enabled) and a mobile device in close proximity, etc., to indicate travelling together with another user. In another example, the filter engine 206 outputs a filtered signal stream that combines one or more of the following data: (1) ambient sound data from a microphone, (2) location data from a GPS sensor or Wi-Fi access point, and (3) data describing uploading one or more pictures with GPS tags matching the location data to the social network, etc.
  • the filter engine 206 outputs a filtered signal stream that combines one or more of the following data: (1) motion data from an accelerometer, (2) ambient illumination data from a light sensor, (3) energy usage data from a power detector on the user device 115, and (4) application usage data from an application manager in the user device 115, etc., to indicate sleeping or active daytime activity.
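  • The kind of rule-based combination described in these examples can be sketched as follows. This is an illustrative assumption of how signals might be fused; the threshold values and the function name are invented for the example.

```python
def combine_signals(gps_speed_mps, car_detected, companion_detected):
    """Fuse GPS velocity with Bluetooth detections of a car and of a
    companion device into one human-understandable action (hypothetical
    rule and threshold)."""
    if gps_speed_mps > 5.0 and car_detected and companion_detected:
        return "travelling together with another user"
    if gps_speed_mps <= 0.1 and not car_detected:
        return "stationary"
    return None  # no action recognized from this combination

print(combine_signals(13.4, True, True))  # -> travelling together ...
```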
  • the filter engine 206 filters the signal stream to identify changes in one or more human-understandable actions. For example, assume a Bluetooth sensor on a user's mobile device is detecting a presence of a number of mobile devices in close proximity of the user every five minutes from 1:00 PM to 1:30 PM. The filter engine 206 filters the data generated by the Bluetooth sensor and outputs a filtered signal stream that includes (1) data indicating detection of a first mobile device and a second mobile device in proximity of the user at 1:00 PM, and (2) data indicating detection that the second mobile device is no longer in proximity of the user at 1:25 PM.
  • a GPS sensor on a user's mobile device updates the location of the user every 2 minutes from 8:00 AM to 8:30 AM.
  • the filter engine 206 filters the data generated by the GPS sensor and outputs a filtered signal stream that includes (1) a first set of location and timestamp data describing that the user arrived at a coffee shop at 8:04 AM and (2) a second set of location and timestamp data describing that the user left the coffee shop at 8:30 AM.
  • Other sets of location and timestamp data received from the GPS sensor between 8:00 AM and 8:30 AM are not included in the filtered signal stream because they are identical or too similar.
  • the filtered signal stream includes data describing appearance and disappearance of another user device 115.
  • a Bluetooth sensor detects a presence of a friend's mobile device and generates data describing the presence of the friend's mobile device every five minutes from 1:00 PM to 1:30 PM.
  • the filter engine 206 filters the data generated by the Bluetooth sensor, and outputs a filtered signal stream that only includes (1) data indicating an appearance of the friend's mobile device at 1:00 PM and (2) data indicating the friend's mobile device was last detected at 1:30 PM.
  • the filtered signal stream includes data indicating a change of a frequency of steps, a change of velocity, a change of application usage (e.g., an application being opened or being closed), a change of actions on a social network (e.g., a user logging in or exiting from a social network account) or other changes in actions.
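  • Collapsing periodic presence readings into appearance/disappearance data, as in the Bluetooth example above, might look like the following sketch (the sampling representation and function name are assumptions).

```python
def appearance_events(samples):
    """Emit only transitions from periodic Bluetooth presence samples.

    `samples` is a list of (timestamp, set_of_device_ids) readings taken
    every few minutes (hypothetical representation); readings where the
    set of nearby devices is unchanged produce no output."""
    events, present = [], set()
    for timestamp, devices in samples:
        for device in sorted(devices - present):
            events.append((timestamp, device, "appeared"))
        for device in sorted(present - devices):
            events.append((timestamp, device, "no longer in proximity"))
        present = devices
    return events

samples = [("1:00 PM", {"first", "second"}),
           ("1:05 PM", {"first", "second"}),  # unchanged: nothing emitted
           ("1:25 PM", {"first"})]            # second device disappeared
print(appearance_events(samples))
```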
  • the filter engine 206 filters the signal stream to include data from a Bluetooth sensor associated with a first user device 115 of a first user, where the data can be used to determine a presence of a second user device 115 that also has a Bluetooth sensor.
  • the first user device 115's Bluetooth sensor generates data indicating a presence of the second user device 115.
  • the data indicating presence of the second user device 115 can also indicate a presence of a second user associated with the second user device 115 (e.g., the first user and the second user are in proximity).
  • the filter engine 206 filters the signal stream to additionally include received signal strength indicator (RSSI) data from the Bluetooth sensor for increased granularity.
  • the filter engine 206 filters a first signal stream, and outputs a first filtered signal stream that includes a first set of data from a first Bluetooth sensor associated with a first user device 115 of a first user.
  • the first set of data indicates the first user device 115 detects a presence of a third device at a first timestamp.
  • the filter engine 206 filters a second signal stream, and outputs a second filtered signal stream that includes a second set of data from a second Bluetooth sensor associated with a second user device 115 of a second user.
  • the second set of data indicates the second user device 115 detects a presence of the third device at a second timestamp. If the time difference between the first timestamp and the second timestamp is within a predetermined threshold (e.g., five seconds), the first set of data and the second set of data can be used by the activity identifier 208 to determine that the first user device 115 and the second user device 115 are in proximity since both user devices 115 detect the third device within a short time period.
  • the activity identifier 208 is described below in more detail.
  • the third device is a vehicle. If the vehicle is detected almost simultaneously by two mobile devices of two users, the two users are very likely to be in the same vehicle.
  • the first and second filtered signal streams may additionally include velocity data from the respective GPS sensors. If the velocity data indicates the two users are moving, the activity identifier 208 can estimate the two users are travelling in the same vehicle.
  • the third device is a device at home with a Bluetooth sensor (e.g., a Bluetooth-enabled personal computer). If the device at home is respectively detected by two mobile devices of two users within a predetermined time window (e.g., within 10 seconds), the activity identifier 208 can estimate that the two users are at home. In some examples, the activity identifier 208 estimates two users as being together if the location data from GPS sensors indicates the two users' geo-locations are the same.
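  • A sketch of this third-device co-presence test follows; the five-second threshold comes from the example above, while the function names and the moving/stationary split are illustrative assumptions.

```python
def detected_together(first_ts, second_ts, threshold_seconds=5.0):
    """Two user devices are treated as co-present when both detected the
    same third device within `threshold_seconds` of each other."""
    return abs(first_ts - second_ts) <= threshold_seconds

def estimate_copresence(first_ts, second_ts, both_moving):
    """Refine the estimate: a shared moving detection suggests the same
    vehicle; a shared stationary detection suggests the same place
    (e.g., a Bluetooth-enabled personal computer at home)."""
    if not detected_together(first_ts, second_ts):
        return None
    return "same vehicle" if both_moving else "same place"

print(estimate_copresence(100.0, 103.5, both_moving=True))  # same vehicle
print(estimate_copresence(100.0, 108.0, both_moving=True))  # None
```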
  • the filter engine 206 filters the signal streams to additionally include received signal strength indicator (RSSI) data for increased granularity.
  • the filter engine 206 may poll for specific known devices by filtering available devices based on a social graph of a user and/or the user's location. For example, the filter engine 206 identifies a group of devices used by the user's friends. In another example, the filter engine 206 identifies a group of devices at the same location as the user. In yet another example, the filter engine 206 identifies a group of devices that are used by the user's friends and at the same location as the user.
  • the filter engine 206 provides the filtered signal stream to applications stored on the user device 115.
  • for example, the step data from the filtered signal stream is input to a fitness tracking application.
  • the filter engine 206 stores the filtered signal stream in the storage device 243.
  • the filter engine 206 sends the filtered signal stream to the activity identifier 208.
  • the activity identifier 208 can be software including routines for identifying activities.
  • the activity identifier 208 can be a set of instructions executable by the processor 235 to provide the functionality described below for identifying activities.
  • the activity identifier 208 can be stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235.
  • the activity identifier 208 may be adapted for cooperation and communication with the processor 235 and other components of the computing device 200 via signal line 236.
  • Example activities include, but are not limited to, physical activities (e.g., running, walking, sleeping, driving, talking to someone, biking, talking to a group, hiking, etc.), activities on social networks (e.g., playing online games on a social network, publishing posts and/or comments, acknowledging posts, sharing posts, etc.) and activities on user devices 115 (e.g., opening an application, listening to a playlist, calling a contact, writing emails, viewing photos, watching videos, etc.). Other example activities are possible.
  • the activity identifier 208 receives a filtered signal stream from the filter engine 206, and identifies one or more activities from the filtered signal stream.
  • the filtered signal stream includes step data from a pedometer.
  • the activity identifier 208 identifies that the user is walking if the frequency of steps conforms to the user's walking pace. However, if the frequency of steps conforms to the user's running pace, the activity identifier 208 identifies that the user is running.
  • the filtered signal stream includes (1) acceleration data indicating zero acceleration from an accelerometer, (2) timestamp data indicating the time is midnight from a GPS sensor, (3) brightness data indicating lights are off from a light detector, (4) power usage indicating that the user device 115 is connected to a charger and (5) application usage indicating that the applications are not being used.
  • the activity identifier 208 identifies that the user activity is sleeping based on the filtered signal stream.
  • the activity identifier 208 determines user activities based on data received from multiple virtual detectors 202 and/or hardware sensors 252.
  • the filtered signal stream includes data indicating (1) a game application is running on the user device 115 and (2) the user is swiping fingers on the touch screen of the user device 115.
  • the activity identifier 208 identifies that the user is playing a game on the user device 115.
  • the filtered signal stream includes (1) data describing steps from a pedometer, (2) data describing that a music application is running on the user device 115 from the virtual detector 202, and (3) data describing a friend's mobile device is detected in proximity to the user device 115 from a Bluetooth sensor of the user device 115.
  • the activity identifier 208 identifies that the user is listening to music and jogging with the friend based on the usage of the music application, the frequency of steps and presence of the friend's mobile device in proximity to the user device 115.
  • the filtered signal stream includes (1) location data describing the user is currently in a coffee shop from a GPS sensor of the user device 115 and (2) data describing a friend's mobile device is detected in proximity to the user device 115 from a Bluetooth sensor of the user device 115.
  • the activity identifier 208 identifies that the user is meeting with the friend at the coffee shop.
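  • The examples above amount to rule-based fusion of the filtered signals. A minimal sketch follows; the signal names, pace threshold and rules are assumptions invented for illustration.

```python
def identify_activity(steps_per_minute, music_app_running,
                      friend_nearby, at_coffee_shop):
    """Map a few filtered signals to a human-understandable activity,
    in the spirit of the examples above (hypothetical thresholds)."""
    if at_coffee_shop and friend_nearby:
        return "meeting a friend at a coffee shop"
    if steps_per_minute >= 140:  # assumed running pace
        if music_app_running and friend_nearby:
            return "jogging with a friend while listening to music"
        return "running"
    if steps_per_minute > 0:
        return "walking"
    return "stationary"

print(identify_activity(150, True, True, False))  # jogging with a friend ...
```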
  • the activity identifier 208 retrieves data describing a user profile from the social network server 101 with permission from the user.
  • the user profile includes one or more of the user's age, gender, education background, working experience, interests and other demographic information.
  • the activity identifier 208 identifies one or more activities associated with the user from the filtered signal stream based on the user profile. For example, for a particular frequency of steps determined based on the step data from a pedometer, the activity identifier 208 may determine that the user is running if the user is a senior over 60 years old. However, the activity identifier 208 may determine that the user is walking at a fast pace if the user is a young athlete. In another example, if the user is categorized as a marathon runner, the activity identifier 208 is more likely to identify the user activity as running than other activities such as biking, swimming, etc.
  • the activity identifier 208 identifies a social aspect, an attention aspect and/or a mobility aspect for each activity based on the filtered signal stream.
  • a social aspect indicates who is with the user during the activity. For example, a social aspect of a running activity indicates that a friend runs together with the user. In another example, a social aspect of a meeting indicates whether the user attends a business meeting or meets with friends.
  • An attention aspect indicates what the user focuses on. For example, an attention aspect of a gaming activity indicates the user focuses his or her attention on the game application.
  • a mobility aspect indicates a state of the user. For example, the mobility aspect indicates the user is sitting or moving during the activity. In some embodiments, the mobility aspect describes the user's geo-location. For example, the mobility aspect indicates the user is driving on a highway.
  • the filtered signal stream includes change in actions, and the activity identifier 208 identifies a beginning and/or an ending of an activity from the filtered signal stream. For example, at a first timestamp, the activity identifier 208 identifies a beginning of a running activity if the filtered signal stream includes data indicating that the frequency of the user's steps increases from a walking pace to a running pace. At a second timestamp, the activity identifier 208 identifies an ending of the running activity if the filtered signal stream includes data indicating the frequency of the user's steps decreases from a running pace to a walking pace.
  • the activity identifier 208 identifies a beginning of a dining activity if the filtered signal stream includes (1) location data indicating the user arrives at a restaurant and (2) data indicating presence of a friend's mobile device in proximity to the user's mobile device.
  • the activity identifier 208 identifies an ending of the dining activity if the filtered signal stream includes location data indicating the user leaves the restaurant.
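  • Detecting the beginning and ending of an activity from changes in the filtered signal stream can be sketched as follows (the pace threshold and data layout are assumptions for illustration).

```python
def running_boundaries(pace_samples, running_pace=140):
    """Scan (timestamp, steps_per_minute) samples and report when the
    pace crosses into and out of a running pace, mirroring the
    beginning/ending example above."""
    boundaries, was_running = [], False
    for timestamp, pace in pace_samples:
        is_running = pace >= running_pace
        if is_running and not was_running:
            boundaries.append((timestamp, "running begins"))
        elif was_running and not is_running:
            boundaries.append((timestamp, "running ends"))
        was_running = is_running
    return boundaries

samples = [(1, 100), (2, 150), (3, 155), (4, 105)]
print(running_boundaries(samples))  # begins at 2, ends at 4
```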
  • the aggregator 210 can be software including routines for aggregating activities associated with a user.
  • the aggregator 210 can be a set of instructions executable by the processor 235 to provide the functionality described below for aggregating activities associated with a user.
  • the aggregator 210 can be stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235.
  • the aggregator 210 may be adapted for cooperation and communication with the processor 235 and other components of the computing device 200 via signal line 238.
  • the aggregator 210 aggregates one or more activities associated with a user to define an event related to the user.
  • An event can be data describing a story of a user.
  • an event includes a single activity performed during a particular time period. For example, an exercise event describes that the user ran in a park from 6:00 AM to 6:30 AM.
  • an event includes multiple activities performed by a user during a particular time period. For example, a Saturday social event from 3:00 PM to 10:00 PM includes shopping with friends in a mall from 3:00 PM to 6:00 PM, dining with the friends in a restaurant from 6:00 PM to 8:00 PM and going to a movie with the friends from 8:00 PM to 10:00 PM.
  • an event includes multiple activities related to a particular subject.
  • a gaming event includes playing a video game with a friend, posting a gaming result on a social network, sharing gaming photos online and posting comments on the gaming result.
  • an event includes one or more activities performed at the same location.
  • a sports event includes watching a sports game with friends in a stadium, taking photos of the sports game, shopping for a jersey in the stadium and encountering a colleague in the stadium, etc.
  • Other example events are possible.
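  • One simple way to realize this aggregation is to group consecutive activities whose gaps stay below a threshold; the sketch below assumes a (start, end, label) representation and a one-hour gap, neither of which is specified in the disclosure.

```python
def aggregate_into_events(activities, max_gap_minutes=60):
    """Group time-ordered (start, end, label) activities into events,
    closing an event whenever the gap to the next activity exceeds
    `max_gap_minutes` (hypothetical grouping rule)."""
    if not activities:
        return []
    events, current = [], [activities[0]]
    for activity in activities[1:]:
        if activity[0] - current[-1][1] <= max_gap_minutes:
            current.append(activity)   # the same event continues
        else:
            events.append(current)     # gap too large: close the event
            current = [activity]
    events.append(current)
    return events

# The Saturday social event example: times in minutes from 3:00 PM.
saturday = [(0, 180, "shopping"), (180, 300, "dining"), (300, 420, "movie")]
print(aggregate_into_events(saturday))  # one event with three activities
```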
  • the aggregator 210 stores the events defined from activities of a user in the storage device 243. In other embodiments, the aggregator 210 sends the events to the summarizing engine 212.
  • the summarizing engine 212 can be software including routines for generating detailed summaries of events for a first user depending on a type of relationship between the first user and a second user.
  • the summarizing engine 212 can be a set of instructions executable by the processor 235 to provide the functionality described below for generating summaries.
  • the summarizing engine 212 can be stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235.
  • the summarizing engine 212 may be adapted for cooperation and communication with the processor 235 and other components of the computing device 200 via signal line 240.
  • the summarizing engine 212 receives events including activities of a first user from the aggregator 210 and generates a summary of the first user's activities based on the events.
  • the activities include activities within a certain time period.
  • the activities may include activities in the first user's life during a period or periods of time between interactions of the first user and a second user.
  • the summarizing engine 212 receives events of the first user's physical activities, activities on social networks and activities on user devices 115 from the aggregator 210 and generates a summary of the first user's activities.
  • the summarizing engine 212 generates a summary of the first user's activities during a specified time period (e.g., a day, a week or a month). For example, the summarizing engine 212 generates a summary of applications used by the first user, posts published by the first user, people meeting with the first user, photos shared by the first user, videos viewed by the first user and other physical activities (e.g., biking, walking, etc.) performed by the first user during the specified time period. In some embodiments, the summarizing engine 212 may generate a summary of activities during a user's lifetime. For example, the summarizing engine 212 may summarize or identify important moments in the lifetime of a first user during the period of time between interactions between the first user and another user.
  • the summarizing engine 212 determines that the first user is within proximity to a second user, generates a summary of the first user's activities based on the first user's closeness with the second user and provides the first user with the summary.
  • the proximity is a physical proximity between the first and second users. The summarizing engine 212 determines that the first user is within proximity to the second user based on data received from multiple virtual detectors 202 and/or hardware sensors 252.
  • the summarizing engine 212 determines that the first user is within proximity to the second user based on data describing that the second user's mobile device is detected in proximity to the user device 115 associated with the first user from a Bluetooth sensor of the user device 115, or based on location data describing both the first user and the second user are currently in a coffee shop from GPS sensors of the user devices 115 associated with the first user and the second user.
  • the summarizing engine 212 determines the first user's closeness with the second user based on a connection between the first user and the second user.
  • the connection is a social connection between the first user and the second user on a social network.
  • the summarizing engine 212 receives social data (e.g., profiles, relationships, a social graph, etc.) from one or more social networks and determines if and how users are connected.
  • the summarizing engine 212 determines the first user's closeness with the second user based on a degree of separation of a social connection between the first user and the second user.
  • the summarizing engine 212 identifies that the first and second users follow each other in a social network and determines a degree of separation of one between the first and second users. Based on this first-degree separation, the summarizing engine 212 determines that the first user and second user are close. The lower the degree of separation is, the closer the first and second users are. For example, the summarizing engine 212 determines that there is a first-degree friendship connection between the first and second users since they are directly connected in a social network with a friendship connection. The summarizing engine 212 also determines that there is a second-degree friendship connection between the first user and a third user since they are connected in the social network via a mutual friend. The summarizing engine 212 determines that the first user is closer to the second user than to the third user based on comparing the degrees of separation.
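  • Computing the degree of separation reduces to a shortest-path search over the social graph. A minimal sketch, assuming the graph is available as an adjacency dictionary (a representation not specified in the disclosure):

```python
from collections import deque

def degree_of_separation(graph, first_user, second_user):
    """Breadth-first search returning the smallest number of friendship
    hops between two users, or None if they are not connected."""
    seen, queue = {first_user}, deque([(first_user, 0)])
    while queue:
        user, distance = queue.popleft()
        if user == second_user:
            return distance
        for friend in graph.get(user, ()):
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, distance + 1))
    return None

graph = {"first": ["second", "mutual"], "mutual": ["third"]}
print(degree_of_separation(graph, "first", "second"))  # 1 (first-degree)
print(degree_of_separation(graph, "first", "third"))   # 2 (second-degree)
```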
  • the summarizing engine 212 determines a connection between the first and second users based on other sources.
	• the sources for the connection between the first and second users can also include communications, such as emails, micro-blogs, blogs, forums, user contact lists, corporate employee databases, organizational charts, etc.
  • the sources for connection between the first and second users can also be historical co-presence of the first and second users.
  • the summarizing engine 212 can determine if users are connected by checking users' contact lists or by determining if users have sent or received a certain number of emails (e.g., one email, five emails, 10 emails, 50 emails, etc.) to or from each other in a certain period of time (e.g., in a week, in a month, in a year, etc.).
  • the summarizing engine 212 can determine user connections by analyzing corporate employee databases or school alumni databases, etc. For example, the summarizing engine 212 determines that users are connected if they have worked for the same employer or if they have studied at the same school.
	• the summarizing engine 212 determines the first user's closeness with the second user based on the first and second users' connection from other sources. For example, if the first user frequently exchanges emails with the second user (e.g., ten emails per week) while seldom communicating with a third user via email (e.g., two emails per month), the summarizing engine 212 determines that the first user is closer to the second user than to the third user. If the first user met a fourth user only once, when they attended the same conference one year ago, the summarizing engine 212 determines that the first user is closer to the third user than to the fourth user. A sketch of this rate-based comparison follows.
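For example, a rough Python sketch of this email-rate comparison; the log format and the 30-day collection window are assumptions for illustration.

    def weekly_rate(email_log, a, b, window_days):
        """Messages exchanged between a and b, normalized to messages per week."""
        count = sum(1 for sender, recipient in email_log
                    if {sender, recipient} == {a, b})
        return count * 7.0 / window_days

    # Hypothetical (sender, recipient) pairs collected over 30 days.
    log = [("first", "second")] * 40 + [("third", "first")] * 2
    rates = {peer: weekly_rate(log, "first", peer, 30) for peer in ("second", "third")}
    print(rates)                      # second: ~9.3/week, third: ~0.5/week
    print(max(rates, key=rates.get))  # 'second' is the closer connection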
	• the summarizing engine 212 determines the first user's closeness with the second user based on a time elapsed since a last interaction. In some embodiments, the summarizing engine 212 determines which events to include in the summary based on a time elapsed since a last interaction. The summarizing engine 212 receives activity data from the activity identifier 208 and determines a time elapsed since a last interaction between the first and second users based on the activity data. For example, the time elapsed may be the time since the last face-to-face meeting between the first and second users, since the last communication between the first and second users, or since some other last interaction between the first and second users.
	• the summarizing engine 212 increases the closeness between the first and second users as the elapsed time decreases. For example, the summarizing engine 212 determines that the first user and the second user are close based on an activity in which the first user commented on a post sent by the second user 15 minutes ago. The summarizing engine 212 determines that the first user is distant from a third user since the last interaction between them was the third user replying to an email from the first user two years ago.
	• the summarizing engine 212 determines the first user's closeness with the second user based on at least one of a connection between the first user and the second user and a time elapsed since a last interaction. For example, the summarizing engine 212 determines that the first and second users are close since they are directly connected with a friendship connection on a social network (e.g., a degree of separation of one). However, if the summarizing engine 212 also determines that the last interaction between the first and second users (e.g., an email) occurred one year ago, the summarizing engine 212 decreases the closeness between the first and second users, as in the sketch below.
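A sketch of one way to blend the two factors; the weights and the 90-day decay constant are invented for illustration and are not part of the specification.

    import math

    def closeness(degree_of_separation, days_since_last_interaction):
        """Blend a structural term (lower degree of separation means closer)
        with a recency term that decays as the gap since the last interaction grows."""
        structural = 1.0 / degree_of_separation          # 1.0 for direct friends
        recency = math.exp(-days_since_last_interaction / 90.0)
        return 0.6 * structural + 0.4 * recency

    # Direct friends whose last interaction was a year ago score lower
    # than direct friends who interacted minutes ago.
    print(closeness(1, 365))  # ~0.61
    print(closeness(1, 0))    # 1.0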
	• the summarizing engine 212 determines relationships between the first and second users based on the closeness. In some embodiments, the summarizing engine 212 classifies the first user's relationship with the second user as being a first type of relationship, a second type of relationship or a third type of relationship. If the first user knows the second user very well (e.g., close friends and/or interact with each other frequently), the summarizing engine 212 classifies the relationship between the first and second users as being a first type of relationship. If the first user does not know the second user very well (e.g., acquaintances and/or interact with each other infrequently), the summarizing engine 212 classifies the relationship between the first and second users as being a second type of relationship. If the first user has not met the second user in person (e.g., strangers with information shared in common), the summarizing engine 212 classifies the relationship between the first and second users as being a third type of relationship.
	• the summarizing engine 212 determines relationships between the first and second users based on various contexts. In some embodiments, the summarizing engine 212 classifies the first user's relationship with the second user based on the context of the relationship. For example, the first and second users may have a personal relationship; as another example, they may have a professional relationship.
  • the summarizing engine 212 determines relationships between the first and second users based on historical context. For example, if the first user and the second user were on the same sports team, the relationship may be categorized based on that. As another example, if the first user and the second user were in the same club, the relationship may be categorized based on that.
	• the summarizing engine 212 assigns threshold degrees of separation and determines whether the first user knows the second user well based on the threshold degrees of separation. For example, the summarizing engine 212 assigns a first threshold degree of separation as two and assigns a second threshold degree of separation as seven. If the degree of separation of a connection between the first and second users is less than two, the summarizing engine 212 determines that the first user knows the second user very well and they have a first type of relationship. If the degree of separation of a connection between the first and second users is between two and seven, the summarizing engine 212 determines that the first user generally knows the second user (e.g., they are on the edge of the social graph) and they have a second type of relationship. If the degree of separation of a connection between the first and second users is greater than seven, the summarizing engine 212 considers the first and second users strangers and determines that they have a third type of relationship.
	• the summarizing engine 212 uses other factors (e.g., interaction frequency, a time elapsed since a last interaction, etc.) to determine whether the first user knows the second user very well. For example, if the summarizing engine 212 determines that the first and second users interact with each other on a social network about twice per month or determines that the last interaction between the first and second users was 20 days ago, the summarizing engine 212 determines that the first user does not know the second user very well and classifies the relationship between the first and second users as being a second type of relationship. A sketch combining the thresholds and these factors follows.
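A sketch of this threshold-based classification, folding in the interaction factors above; the thresholds of two and seven come from the example in the text, while the demotion rules for sparse or stale interaction are illustrative assumptions.

    def classify_relationship(degree, days_since_last=None,
                              interactions_per_month=None, low=2, high=7):
        """Map a user pair to the first, second or third type of relationship."""
        if degree is None or degree > high:
            return "third"        # effectively strangers
        if degree < low:
            # Sparse or stale interaction demotes an otherwise close tie.
            if interactions_per_month is not None and interactions_per_month <= 2:
                return "second"
            if days_since_last is not None and days_since_last >= 20:
                return "second"
            return "first"
        return "second"           # on the edge of the social graph

    print(classify_relationship(degree=1, days_since_last=1))   # first
    print(classify_relationship(degree=1, days_since_last=20))  # second
    print(classify_relationship(degree=9))                      # third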
	• the summarizing engine 212 generates a first summary for the first user that includes a notification that the second user is nearby, a last interaction with the second user and recent interactions with the second user.
  • the first summary includes any action items that the first user owes the second user.
  • the summarizing engine 212 includes a notification "dinner with your friend" and restaurant information in the first summary to remind the first user that he/she needs to make dinner plans with the second user.
	• the summarizing engine 212 attaches a starred email in the first summary to remind the first user that he/she needs to give the class notes mentioned in the starred email to the second user.
  • the first summary will be described in detail below with reference to Figure 3A.
	• the summarizing engine 212 generates a second summary for the first user that includes the notification that the second user is nearby, the last interaction with the second user and events that the first user and the second user share in common.
  • Examples of social events that the first user and the second user share in common may be photos of the two users together, events that both users attended, and stories related to both users.
	• Other examples of events that the first user and the second user share in common may be events that happened specifically in the location where the two users are currently located.
	• the second summary includes important events that occurred to the first user that the second user might be interested in hearing about. For example, after the first and second users last met at an information technology (IT) conference, the first user started an IT company.
  • the summarizing engine 212 includes the information of the first user's IT company in the second summary.
  • the second summary also includes a name of a mutual connection and an event that the mutual connection attended.
  • the summarizing engine 212 also includes the IT conference that both the first user and the second user attended last year in the second summary to remind the first user of the second user.
  • the summarizing engine 212 includes a name of a university from which the first and second users graduated or a mutual friend's name in the second summary.
	• the summarizing engine 212 includes a common hobby or common acquaintances between the first and second users.
  • the second summary includes a recent post on a social network that was created by the second user.
	• the second summary includes a picture of a restaurant near the Golden Gate Bridge taken by the second user on the first day the second user arrived in San Francisco.
	• the summarizing engine 212 generates a third summary for the first user that includes the notification that the second user is nearby and events that the first user and the second user share in common.
	• the summarizing engine 212 includes biographical information that the first user and the second user share in common, such as having worked at the same company or attending a New Year's celebration every year at Times Square in New York City. The third summary will be described in detail below with reference to Figure 3D.
	• when the summarizing engine 212 determines that the first user is within proximity to the second user, the summarizing engine 212 generates a summary based on the type of relationship between the first and second users and provides the first user with the summary.
  • the summary includes a notification that the second user is nearby, a last time the first user interacted with the second user and information about at least one of the first user and the second user.
  • the summarizing engine 212 determines that the first user is within proximity to the second user, for example, when the first and second users are face-to-face or close enough in proximity that they are in a conversation with each other.
  • the notification may include that another user is a certain distance away. In other embodiments, the notification could prompt the two users to coordinate online or start heading toward one another in order to meet.
  • the user interface engine 214 can be software including routines for generating graphical or audio data for providing user interfaces to users.
  • the user interface engine 214 can be a set of instructions executable by the processor 235 to provide the functionality described below for generating graphical data for providing user interfaces to users or providing audio data to users.
  • the user interface engine 214 can be stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235.
  • the user interface engine 214 may be adapted for cooperation and communication with the processor 235 and other components of the computing device 200 via signal line 242.
  • the user interface engine 214 generates graphical data for providing a user interface that depicts a summary of a user's activities.
	• the user interface engine 214 sends the graphical data to a user device 115, causing the user device 115 to present the user interface to the user.
	• the user interface engine 214 may help trigger launching of a relevant application and relevant content within the application (for example, an email application might have the right email open or show the results of a query for related emails between the first and second users).
  • Example user interfaces are shown in Figures 3A-3D.
  • the user interface engine 214 generates graphical data for providing a user interface that depicts an event associated with one or more users.
  • a user may modify or update the event notification, add more peers to the event, share the event, add a detailed description of photos, make comments on the event, add or update a title for the event, or perform other actions on the event using the user interface.
  • the user interface engine 214 may generate graphical data for providing other user interfaces to users.
  • the privacy engine 216 can be software including routines for determining what information to provide to a user based on privacy settings.
	• the privacy engine 216 can be a set of instructions executable by the processor 235 to provide the functionality described below for determining what information to provide to a user based on privacy settings. The privacy engine 216 can be stored in the memory 237 of the computing device 200 and can be accessible and executable by the processor 235.
  • the privacy engine 216 may be adapted for cooperation and communication with the processor 235 and other components of the computing device 200 via signal line 217.
	• the privacy engine 216 determines privacy settings from a user profile associated with a user. For example, John manually selects privacy settings such as preferring to share personal photos with a group in the social network that he created called "close friends."
	• the privacy engine 216 communicates with the summarizing engine 212 to determine what information to provide to a user based on privacy settings. For example, in the above example, when John is nearby his best friend Linda and his acquaintance Sara, the summarizing engine 212 generates a first summary for Linda that includes a personal photo of John and generates a second summary for Sara without including the personal photo.
	• the summarizing engine 212 may also generate a third summary of the items common to all three users together (for example, email threads between all three users, documents that all three users collaborated on, or photos shared among the three users). A sketch of such privacy-based filtering follows.
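For illustration, a minimal Python sketch of privacy-based filtering; the circle and audience structures are assumptions, not the privacy engine 216's actual data model.

    def filter_summary(items, viewer, circles):
        """Keep only summary items whose audience includes the viewer."""
        visible = []
        for item in items:
            audience = item.get("audience", "public")
            if audience == "public" or viewer in circles.get(audience, set()):
                visible.append(item)
        return visible

    circles = {"close friends": {"linda"}}
    items = [{"text": "personal photo", "audience": "close friends"},
             {"text": "public post", "audience": "public"}]
    print(filter_summary(items, "linda", circles))  # both items
    print(filter_summary(items, "sara", circles))   # the public post only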
	• Figure 3A is an example graphic representation of a user interface 300 for displaying a summary where the first user and the second user are close friends and/or interact with each other frequently.
  • the user interface 300 displays a summary generated for Lance.
  • the user interface 300 includes a notification 301 notifying Lance that Bob Smith is nearby.
  • Bob is a close friend of Lance.
	• the user interface 300 includes the last interaction between Lance and Bob at 302.
  • the user interface 300 also includes other interactions between Bob and Lance, for example, the indication 303 shows that Bob and Lance had a two-minute call two weeks ago and the indication 304 shows that Bob and Lance exchanged an email about a project three weeks ago.
	• Lance may be reminded that he owes Bob some documents related to the project.
  • Figures 3B and 3C are example graphic representations of user interfaces 320 and 340 for displaying a summary where the first user and the second user are friends and/or interact with each other infrequently.
  • the user interfaces 320 and 340 display a first portion and a second portion of a summary generated for Lance.
	• the user interface 320 includes a notification 321 notifying Lance that Sara Doe is nearby. Sara is a friend of Lance. The last time they met was at Oren's Barbeque one year ago. The user interface 320 includes this last interaction at 322.
  • the user interface 320 also includes events that Sara and Lance share in common, for example, the indication 323 shows that Sara and Lance both attended Noname University, the indication 324 shows that Sara and Lance are both friends of John Doe, the indication 325 shows that Sara and Lance both live in San Francisco, CA and the indication 326 shows that Sara and Lance used to live in Cambridge, MA.
  • the user interface 340 displays a second portion of the summary generated for Lance.
  • the user interface 340 includes a recent post 341 on a social network that was created by Sara.
  • the picture 341 taken by Sara shows how Sara was scared by a bear when the bear was too close to her.
  • the user interface 340 also includes important events that occurred to Lance that Sara might be interested in hearing about, e.g., Lance's life since last communicating with Sara 342.
	• For example, the picture 343 shows that Lance had a baby, the picture 344 shows that Lance started a company, and the picture 345 shows that Lance had lunch with their mutual friend Jenny.
  • Figure 3D is an example graphic representation of a user interface 360 for displaying a summary where the first user and the second user are strangers with information shared in common.
  • the user interface 360 displays a summary generated for Lance.
  • the user interface 360 includes a notification 361 notifying Lance that Mike Jones is nearby. Lance does not really know Mike but shares common information with Mike. For example, the indications 362, 363 and 364 show that both Lance and Mike are friends of Alice Doe, used to work in X company and currently live in San Diego, CA. By providing the common information shared between Lance and Mike, they have some topics to discuss and may eventually know each other well.
  • Figure 4 is a flow diagram of an example of a method for generating a summary for a first user.
  • the summary application 103 comprises a processing unit 204, a filter engine 206, an activity identifier 208 and a summarizing engine 212.
  • the processing unit 204 receives 402 a signal stream from at least one of a hardware sensor 252 and a virtual detector 202.
	• the filter engine 206 filters 404 the signal stream and outputs a filtered signal stream including data for defining one or more human-understandable actions.
  • the activity identifier 208 identifies 406 one or more activities associated with a first user from the filtered signal stream.
  • the summarizing engine 212 generates 408 a summary of the first user's activities.
  • the summarizing engine 212 determines 410 that the first user is within proximity to a second user.
  • the summarizing engine 212 determines 412 the first user's closeness with the second user based on at least one of a connection between the first user and the second user and a time elapsed since a last interaction.
  • the summarizing engine 212 provides 414 the first user with a notification that the second user is nearby, a last time the first user interacted with the second user and information about at least one of the first user and the second user.
  • Figures 5A and 5B are flow diagrams of another example of a method for generating a summary for a first user depending on the type of relationship between the users.
  • the summary application 103 comprises a processing unit 204, a filter engine 206, an activity identifier 208, an aggregator 210 and a summarizing engine 212.
  • the processing unit 204 receives 502 a signal stream from at least one of a hardware sensor 252 and a virtual detector 202.
  • the signal stream includes at least one of hardware raw data generated by the hardware sensor 252 and virtual raw data generated by the virtual detector 202.
	• the filter engine 206 filters 504 the signal stream and outputs a filtered signal stream including data for defining one or more human-understandable actions. For example, the filter engine 206 outputs a filtered signal stream that combines one or more of the following data: (1) location and velocity data from a GPS sensor, and (2) detection data indicating the presence of an automobile (e.g., Bluetooth enabled) and a mobile device in close proximity, to indicate traveling together with another user, as in the sketch below.
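A sketch of how such signals might be combined; the 5 m/s driving-speed cutoff and the identifier names are assumptions for illustration.

    def traveling_together(speed_mps, scanned_bt_ids, car_id, peer_device_id,
                           min_driving_speed_mps=5.0):
        """Read 'traveling together' from driving-speed GPS velocity plus
        Bluetooth detections of both an automobile and the peer's device."""
        return (speed_mps >= min_driving_speed_mps
                and car_id in scanned_bt_ids
                and peer_device_id in scanned_bt_ids)

    print(traveling_together(13.0, {"car-01", "phone-peer"}, "car-01", "phone-peer"))  # True
    print(traveling_together(1.2, {"phone-peer"}, "car-01", "phone-peer"))             # False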
	• the activity identifier 208 identifies 506 one or more activities associated with a first user from the filtered signal stream. For example, for a particular frequency of steps determined based on the step data from a pedometer, the activity identifier 208 may determine that the user is running if the user is a senior over 60 years old, whereas it may determine that the user is walking at a fast pace if the user is a young athlete. In another example, if the user is categorized as a marathon runner, the activity identifier 208 is more likely to identify the user activity as running than as other activities such as biking or swimming. A sketch of this profile-aware identification follows.
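A sketch of such profile-aware activity identification; the cadence cutoffs and category labels are invented for illustration, not values from the specification.

    def identify_activity(steps_per_minute, age=None, categories=()):
        """Interpret the same step cadence differently per user profile."""
        if "marathon runner" in categories and steps_per_minute >= 120:
            return "running"   # prior knowledge biases toward running
        if age is not None and age > 60:
            return "running" if steps_per_minute >= 140 else "walking"
        return "running" if steps_per_minute >= 170 else "walking at a fast pace"

    print(identify_activity(160, age=65))                            # running
    print(identify_activity(160, age=25))                            # walking at a fast pace
    print(identify_activity(150, categories=("marathon runner",)))   # running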
  • the summarizing engine 212 generates 508 a summary of the first user's activities.
  • the summarizing engine 212 receives events including activities of a first user from the aggregator 210 and generates a summary of the first user's activities based on the events.
  • the summarizing engine 212 determines 510 that the first user is within proximity to a second user. For example, the summarizing engine 212 determines that the first user is within proximity to the second user based on data describing that the second user's mobile device is detected in proximity to the user device 115 associated with the first user from a Bluetooth sensor of the user device 115.
	• the summarizing engine 212 determines 512 a degree of separation between the first user and the second user in a social network. For example, the summarizing engine 212 identifies that the first and second users follow each other in a social network and determines a degree of separation of one between the first and second users. In this example, based on this first-degree separation, the summarizing engine 212 determines that the first user and second user are close.
	• the summarizing engine 212 determines 514 a time elapsed since a last interaction between the first user and the second user. For example, the summarizing engine 212 determines that the first user and the second user are close based on an activity in which the first user commented on a post sent by the second user 15 minutes ago.
	• the summarizing engine 212 determines that the first user is distant from a third user since the last interaction between them was the third user replying to an email from the first user two years ago. In some embodiments, the summarizing engine 212 determines 514 a time elapsed since a last interaction between the first user and the second user to determine relevant events to include.
  • the summarizing engine 212 classifies 516 the first user's relationship with the second user as being a first type of relationship, a second type of relationship or a third type of relationship. If the first user knows the second user very well (e.g., close friends and/or interact with each other frequently), the summarizing engine 212 classifies the relationship between the first and second users as being a first type of relationship. If the first user does not know the second user very well (e.g., acquaintances and/or interact with each other infrequently), the summarizing engine 212 classifies the relationship between the first and second users as being a second type of relationship. If the first user has not met the second user in person (e.g., strangers with information shared in common), the summarizing engine 212 classifies the relationship between the first and second users as being a third type of relationship.
  • the summarizing engine 212 generates 518 a first summary for the first user that includes a notification that the second user is nearby, a last interaction with the second user and recent interactions with the second user.
	• the summarizing engine 212 includes a notification "dinner with your friend" and restaurant information in the first summary to remind the first user that he/she owes the second user a dinner.
	• the summarizing engine 212 generates 520 a second summary for the first user that includes the notification that the second user is nearby, the last interaction with the second user and events that the first user and the second user share in common.
  • the second summary includes important events that occurred to the first user that the second user might be interested in hearing about.
  • the second summary also includes a name of a mutual connection and an event that the mutual connection attended.
  • the second summary includes a recent post on a social network that was created by the second user.
  • the summarizing engine 212 generates 522 a third summary for the first user that includes the notification that the second user is nearby and events that the first user and the second user share in common.
	• the summarizing engine 212 includes biographical information that the first user and the second user share in common, such as having worked at the same company or attending a New Year's celebration every year at Times Square in New York City. A sketch assembling the three summary types of steps 518-522 follows.
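The three branches above can be read as one dispatch on the relationship type; the following Python sketch assembles the per-type summaries of steps 518, 520 and 522, with an invented payload structure.

    def generate_summary(relationship_type, peer_name, last_interaction,
                         recent_interactions, shared_events):
        """Build the notification payload for the first, second or third
        type of relationship, mirroring steps 518, 520 and 522."""
        summary = {"notification": f"{peer_name} is nearby"}
        if relationship_type in ("first", "second"):
            summary["last_interaction"] = last_interaction
        if relationship_type == "first":
            summary["recent_interactions"] = recent_interactions
        else:
            summary["shared_events"] = shared_events
        return summary

    print(generate_summary("second", "Sara Doe", "Oren's Barbeque, one year ago",
                           [], ["attended Noname University", "lives in San Francisco, CA"]))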
  • the present embodiment of the specification also relates to an apparatus for performing the operations herein.
  • This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer.
	• a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including optical disks, CD-ROMs, and magnetic disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, flash memories including USB keys with non-volatile memory, or any other type of media suitable for storing electronic instructions, each coupled to a computer system bus.
  • the specification can take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment containing both hardware and software elements.
  • the specification is implemented in software, which includes but is not limited to firmware, resident software, microcode, etc.
  • the description can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system.
  • a computer-usable or computer readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • a data processing system suitable for storing and/or executing program code will include at least one processor coupled directly or indirectly to memory elements through a system bus.
  • the memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
	• Input/output (I/O) devices, including but not limited to keyboards, displays, pointing devices, etc., can be coupled to the system either directly or through intervening I/O controllers.
  • Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks.
	• Modems, cable modems and Ethernet cards are just a few of the currently available types of network adapters.
  • modules, routines, features, attributes, methodologies and other aspects of the disclosure can be implemented as software, hardware, firmware or any combination of the three.
	• where a component, an example of which is a module, of the specification is implemented as software, the component can be implemented as a standalone program, as part of a larger program, as a plurality of separate programs, as a statically or dynamically linked library, as a kernel loadable module, as a device driver, and/or in every and any other way known now or in the future to those of ordinary skill in the art of computer programming.
  • the disclosure is in no way limited to implementation in any specific programming language, or for any specific operating system or environment. Accordingly, the disclosure is intended to be illustrative, but not limiting, of the scope of the specification, which is set forth in the following claims.

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

A system and method for summarizing social interactions between users are disclosed. The system includes a processor and a memory storing instructions that, when executed, cause the system to: receive a signal stream from at least one of a hardware sensor and a virtual detector; filter the signal stream and output the filtered signal stream including data defining human-understandable actions; identify activities associated with a first user from the filtered signal stream; generate a summary of the first user's activities; determine that the first user is within proximity to a second user; determine a degree of separation between the first user and the second user in a social network; determine a time elapsed since a last interaction between the first user and the second user; classify the first user's relationship with the second user as being a first type of relationship, a second type of relationship or a third type of relationship; responsive to having the first type of relationship, generate a first summary for the first user that includes a notification that the second user is nearby, a last interaction with the second user and recent interactions with the second user; responsive to having the second type of relationship, generate a second summary for the first user that includes the notification that the second user is nearby, the last interaction with the second user and events that the first user and the second user share in common; and responsive to having the third type of relationship, generate a third summary for the first user that includes the notification that the second user is nearby and events that the first user and the second user share in common.
PCT/US2015/016208 2014-02-19 2015-02-17 Summarizing social interactions between users WO2015126851A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP15751343.3A EP3108441A4 (fr) 2014-02-19 2015-02-17 Summarizing social interactions between users
CN201580017103.1A CN106133786B (zh) 2014-02-19 2015-02-17 Summarizing social interactions between users

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201461641488P 2014-02-19 2014-02-19
US61/641,488 2014-02-19
US14/622,794 2015-02-13
US14/622,794 US9672291B2 (en) 2014-02-19 2015-02-13 Summarizing social interactions between users

Publications (1)

Publication Number Publication Date
WO2015126851A1 true WO2015126851A1 (fr) 2015-08-27

Family

ID=53878880

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/016208 WO2015126851A1 (fr) 2014-02-19 2015-02-17 Summarizing social interactions between users

Country Status (3)

Country Link
EP (1) EP3108441A4 (fr)
CN (1) CN106133786B (fr)
WO (1) WO2015126851A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111324741B (zh) * 2018-12-17 2023-08-18 中国移动通信集团山西有限公司 User relationship identification method, apparatus, device, and medium
CN111049988A (zh) * 2019-12-23 2020-04-21 随手(北京)信息技术有限公司 Intimacy prediction method, system, device, and storage medium for a mobile device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070015518A1 (en) * 2005-07-15 2007-01-18 Agilis Systems, Inc. Mobile resource location-based customer contact systems
US20070030824A1 (en) * 2005-08-08 2007-02-08 Ribaudo Charles S System and method for providing communication services to mobile device users incorporating proximity determination
US20100311395A1 (en) * 2009-06-08 2010-12-09 Microsoft Corporation Nearby contact alert based on location and context
JP2011217128A (ja) * 2010-03-31 2011-10-27 Jin Yatomi Proximity detection and notification system for mobile terminals, proximity detection and notification server, information terminal, program, and recording medium
US20130046770A1 (en) * 2011-08-19 2013-02-21 Erick Tseng Sending Notifications About Other Users With Whom A User is Likely to Interact

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1825430A4 (fr) * 2004-10-19 2009-08-26 Yahoo Inc System and method for location-based personal networking
CN101968818A (zh) * 2010-11-08 2011-02-09 北京开心人信息技术有限公司 Method and system for recommending friends to users in a social networking site
CN102695121A (zh) * 2011-03-25 2012-09-26 北京千橡网景科技发展有限公司 Method and system for pushing friend information to users in a social network
US20140012918A1 (en) * 2011-03-29 2014-01-09 Nokia Corporation Method and apparatus for creating an ephemeral social network
JP5929501B2 (ja) * 2012-05-21 2016-06-08 ソニー株式会社 Information processing apparatus, information processing method, and program
CN102750335A (zh) * 2012-06-01 2012-10-24 深圳市创梦天地科技有限公司 Location-based information sharing system and implementation method thereof
CN103488641A (zh) * 2012-06-13 2014-01-01 张征程 Social networking site architecture relating to the spatialization of information

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105827265A (zh) * 2016-03-30 2016-08-03 深圳还是威健康科技有限公司 Data transmission method and apparatus based on a wearable device
US11323402B2 (en) 2017-06-26 2022-05-03 International Business Machines Corporation Spatial topic representation of messages
US11329939B2 (en) 2017-06-26 2022-05-10 International Business Machines Corporation Spatial topic representation of messages
US11790753B2 (en) 2020-04-06 2023-10-17 Koninklijke Philips N.V. System and method for determining and managing socially isolated individuals

Also Published As

Publication number Publication date
EP3108441A1 (fr) 2016-12-28
CN106133786B (zh) 2022-02-11
CN106133786A (zh) 2016-11-16
EP3108441A4 (fr) 2017-07-19

Similar Documents

Publication Publication Date Title
US10275420B2 (en) Summarizing social interactions between users
CN106031262B (zh) Proximity detection
US7895049B2 (en) Dynamic representation of group activity through reactive personas
US20170032248A1 (en) Activity Detection Based On Activity Models
US20160191446A1 (en) Techniques for prompting communication among users of a social network
US9191788B2 (en) System and method for contextual social messaging
US10609183B2 (en) Content sharing recommendations
US10162896B1 (en) Event stream architecture for syncing events
US20160092040A1 (en) Communication device with contact information inference
CN106133786B (zh) Summarizing social interactions between users
US20140245180A1 (en) Apparatus and method for providing contact-related information items
US20140108383A1 (en) Method and System for Filtering Search Results for Maps Using Social Graph
WO2013077950A1 (fr) Location-triggered notes
US20190190874A1 (en) People Matching for Social Activities on an Online Social Network
US20220078135A1 (en) Signal upload optimization
US10769548B2 (en) Value model for sending notifications
US20140244616A1 (en) Apparatus and method for providing contact-related information items
US10863354B2 (en) Automated check-ins
US9824112B1 (en) Creating event streams from raw data
US20160147413A1 (en) Check-in Additions
US20160147421A1 (en) Dynamic Status Indicator
US20160150032A1 (en) Prefetching Places
US9391947B1 (en) Automatic delivery channel determination for notifications
US20140244630A1 (en) Apparatus and method for providing contact-related information items
EP2881910A1 (fr) Indicating a user's availability for communication

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15751343

Country of ref document: EP

Kind code of ref document: A1

REEP Request for entry into the european phase

Ref document number: 2015751343

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2015751343

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE