WO2009043020A2 - System and method for injecting sensed presence into social networking applications
- Publication number
- WO2009043020A2 (PCT/US2008/078148)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- sensor
- data
- information
- social networking
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/54—Presence management, e.g. monitoring or registration for receipt of user log-on information, or the connection status of the users
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/2866—Architectures; Arrangements
- H04L67/30—Profiles
- H04L67/306—User profiles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/52—Network services specially adapted for the location of the user terminal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/535—Tracking the activity of the user
Definitions
- the privacy of users registered with system 100 may be protected through a number of mechanisms. Users' raw sensor feeds and inferred information (collectively considered as the user's sensed presence) may be securely stored within storage 210 of presence server 116, but may be shared by the owning user according to group membership policies. For example, the recorded sensor data and inference is available only to users that are already part of the user's buddy list.
- Buddies are determined from the combination of buddy lists imported by registered services (Pidgin, Facebook, etc.), and buddies may be added based on profile matching
- certain embodiments of system 100 inherit and leverage the work already undertaken by a user when creating his buddy lists and sub-lists (e.g., in Pidgin, Skype, Facebook) in defining access policies to the user's presence data
- The past ten years have seen the growth in popularity of online social networks, including chat groups, weblogs, friend networks, and dating websites. However, one hurdle to using such sites is the requirement that users manually input their preferences, characteristics, and the like into the site databases
- Certain embodiments of system 100 provide the means for automatic collection and sharing of this type of profile information
- Such embodiments of system 100 automatically learn and allow users to export information about their favorite locations or "haunts", what recreational activities they enjoy, and what kind of lifestyle they are familiar with, along with near real-time personal presence updates sharable via application (e.g., Skype, MySpace) plug-ins and web portal 218
- Further, as many popular IM clients allow searching for people by name, location, age, etc.,
- certain embodiments of system 100 enable searching of users based upon a data mining process that also involves interests (like preferred music, significant places, preferred sport, etc.)
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Computer And Data Communications (AREA)
- Information Transfer Between Computers (AREA)
- Telephonic Communication Services (AREA)
Abstract
A method for injecting sensed presence into social networking applications includes receiving sensor data associated with a user (102), inferring a presence status of the user based upon analysis of the sensor data, storing the sensor data and presence status within a database, and sending the presence status to a social networking server (126) to update the user's presence information for the social networking applications based upon the user's preferences. A system for injecting sensed presence into social networking applications includes at least one sensor (110) proximate to a user, the at least one sensor being used for collecting sensor data associated with the user, a presence server (116) for receiving and storing the sensor data, an inference engine for analyzing the stored data and to infer a presence status for the user, and a presentation engine for presenting the information to the user and other users.
Description
SYSTEM AND METHOD FOR INJECTING SENSED PRESENCE INTO SOCIAL NETWORKING APPLICATIONS
RELATED APPLICATIONS
[0001] This application claims benefit of priority to United States Provisional Patent Application Serial Number 60/976,371 filed 28 September 2007, which is incorporated herein by reference.
BACKGROUND
[0002] Presence is currently limited to a user's contactable status. For example, a first user's status and availability is provided by presence servers in a network environment. A second user needing to contact the first user may thereby determine the optimal way (and likelihood of success) of contacting the first user. A presence server may determine this status by detecting events made by the first user. For example, if the first user is typing on a keyboard of a computer, these key input events indicate that the first user is sitting at, and using, the computer. The presence server thus displays the status of the first user as 'using the computer'. Using this presence information, the second user may decide to send an instant message to the first user's computer to initiate communication. Alternatively, the second user may decide to call a telephone located near the first user's computer, knowing that the first user will probably answer. [0003] Thus, presence is a useful tool for making informed decisions about other users. However, this presence information is limited to locating and communicating with its users.
[0004] Social networking applications, such as MySpace, Facebook, and Skype, allow users to interactively define their presence in greater detail, thereby allowing other permitted users to view this additional presence information. However, the burden of updating the presence information interactively typically results in infrequent updates, and therefore in outdated (and often incorrect) presence information.
SUMMARY
[0005] A method for injecting sensed presence into social networking applications includes receiving sensor data associated with a user, inferring a presence status of the user based upon analysis of the sensor data, storing the sensor data and presence status within a database, and sending the presence status to a social networking server to update the user's presence information for the social networking applications based upon the user's preferences.
[0006] A software product includes instructions, stored on computer-readable media, where the instructions, when executed by a computer, perform steps for injecting sensed presence into social networking applications. The instructions include instructions for receiving sensor data associated with a user, instructions for inferring a presence status of the user based upon analysis of the sensor data, instructions for storing the sensor data and presence status within a database, and instructions for sending the presence status to a social networking server to update the user's presence information for the social networking applications based upon the user's preferences.
[0007] A system for injecting sensed presence into social networking applications includes at least one sensor proximate to a user, the at least one sensor being used for collecting sensor data associated with the user, a presence server for receiving and storing the sensor data, an inference engine for analyzing the stored data and inferring a presence status for the user, and a presentation engine for presenting the information to the user and other users.
BRIEF DESCRIPTION OF THE FIGURES
[0008] FIG. 1 shows a system for injecting sensed presence into social networking applications, according to an embodiment.
[0009] FIG. 2 shows the presence server of FIG. 1 interacting with applications, a cell phone, a PDA, embedded sensors, and a notebook computer.
[0010] FIG. 3 is a flowchart illustrating one exemplary method for injecting sensed presence into social networking applications.
[0011] FIG. 4 shows an example of a snapshot of a user's data page as accessed from a portal of the system of FIG. 1.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0012] To make a presence system useful for a social network, additional sensor input and activity determination is used. That is, presence outside a work environment is detected to provide a user's status that is useful in the social environment. Once the user's status is determined, it may be shared with other users, as permitted by the user.
[0013] FIG. 1 shows a system 100 for injecting sensed presence into social networking applications. A user 102 has, for example, a cell phone 106, a personal digital assistant (PDA) 108 and an embedded sensor unit 110. Cell phone 106 may include a camera for sensing images around user 102 and a global positioning system (GPS) unit for determining the location of user 102. PDA 108 may include a temperature sensor for sensing temperature near user 102. Embedded sensors 110 may include one or more accelerometers for determining activity of user 102. Embedded sensors 110 may include a transceiver to enable wireless communication with cell phone 106, PDA 108 and/or network 120. Embedded sensors 110 may be attached to a user's body (e.g., a user's running shoes) as illustrated in FIG. 1. However, embedded sensors may also be used in other manners, such as carried by another user, attached to personal property (e.g., a user's bicycle, car, ski boot), or embedded in the ecosystem of a city or town (e.g., a carbon dioxide sensor or a pollen sensor attached to a structure in a city). Network 120 may include the Internet and other wired and/or wireless networks.
[0014] A second user 104 is shown with a notebook computer 112 that may include one or more sensors that collect information about user 104. Such sensors are becoming common amongst items carried by people during everyday activities. These sensors are periodically interrogated and the resulting sensor data sent to a presence server 116 via network 120. Presence server 116 analyzes this sensor data to infer activity of users 102 and 104. For example, sensor data received from sensors associated with user 102 is used to define a presence status 122 of user 102. Similarly, sensor data from sensors associated with user 104 is used to define a presence status 124 of user 104.
[0015] Through analysis of historical server data, presence server 116 may determine behavioral patterns based on movements of users 102 and 104. For example, presence server 116 may infer that user 102 and user 104 frequent the same coffee shop by determining that user 102 and user 104 visit the coffee shop each morning when traveling to their respective places of work.
[0016] Different sensors may be used to collect data associated with a user, such as one or more of the following: cameras, microphones, accelerometers, GPS locators, radio sensors (e.g., Bluetooth device contact logs, 802.15.4 ranging, 802.11 localization), temperature sensors, light sensors, humidity sensors, magnetometers, button click sensors, and device status sensors (e.g., cell phone ringer status sensors). As noted above, such sensors may be included within devices carried by a user.
[0017] In another example, wireless sensors external to cell phone 106 may relay information to cell phone 106, which then sends the information to presence server 116 for processing. For example, embedded sensor unit 110 may include one or more sensor types that periodically provide sensor data to cell phone 106 via a wireless communication link 111, such as Bluetooth.
[0018] Where sensor operation is resource taxing, sensor operation may be done on demand via an explicit query from presence server 116. Sensor data is pushed from consumer devices 106, 108, 110 and 112 to presence server 116, where it is analyzed to infer activity of the associated user.
[0019] Another type of sensor that may be used in system 100 is a software sensor that measures artifacts of other software running on a computing platform to determine a user's behavior, mood, etc. Since no actual sensor is being read, these software sensors may also be termed virtual sensors. Where processing allows, inference may occur within the computing device (e.g., cell phone 106, PDA 108, notebook computer 112); otherwise data is sent to presence server 116 for inference.
[0020] In another example, a sensor may not be immediately associated with a user, but may be indirectly associated with the user by locality. For example, to determine air quality associated with user 102, sensor information from a statically deployed sensor infrastructure 114 that measures air quality may be used if that data is obtained from one or more sensors of the infrastructure that are near to the location of user 102. Such sensor infrastructures may operate independently of user 102 and presence server 116, but may be matched to user 102 through time and location information of user 102. In another example, where users 102 and 104 are near one another and user 104 has one or more different sensor types to those of user 102, while users 102 and 104 are proximate, sensor data from user 104 may be applicable to user 102 and vice versa. Thus, sensors may be shared.
[0021] As more and more consumer devices include sensors, these devices may be used to unobtrusively collect information about a user. By collecting information related to a user, presence server 116 (via data fusion and analysis) may determine characteristics and life patterns (e.g., presence status 122, 124) of the user and/or of particular groups of users. These characteristics and life patterns may be fed back to the users in the form of application services 118, such as on a display of a consumer device (e.g., cell phone 106). These characteristics (e.g., presence status 122, 124) may also be sent to a social networking server 126 that supports one or more social network applications 119, such as Facebook and MySpace, and instant messaging. For example, user 102 may have an account with a social networking application such as Facebook, and therefore configures presence server 116 to send presence information of user 102 to social networking server 126 for use by social networking application 119, thereby alleviating the need for user 102 to continually update presence information within that social networking application. That is, presence information for one or more associated social networking applications may be updated with presence information (e.g., presence status 122, 124) for the associated user (e.g., user 102) by presence server 116. Thus, as user 104 accesses social networking application 119, the shared presence status of user 102 is automatically updated by presence server 116 via social networking server 126.
[0022] Presence server 116 may consist of a set of servers that: (1) hold a database of users and their associated presences 122, 124; (2) implement a web portal that provides access to presence information via per-user accounts; and (3) contain algorithms to draw inferences about many objective and subjective aspects of users based upon received and stored sensor data.
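The fan-out described in paragraph [0021], in which the presence server pushes a user's inferred status to each social networking service the user has linked, can be sketched as follows. All class and field names are hypothetical illustrations; a real deployment would reach services such as Facebook through their own authenticated APIs rather than an in-memory outbox:

```python
# Hedged sketch of the update flow in [0021]: the presence server stores a
# per-user status and fans new statuses out to the services the user opted
# into. Class, method, and field names are assumptions for illustration.

class PresenceServer:
    def __init__(self):
        self.status = {}        # user id -> most recent inferred presence status
        self.preferences = {}   # user id -> set of linked services to update
        self.outbox = []        # (service, user, status) updates "sent" out

    def link_service(self, user, service):
        # The user opts a social networking service into automatic updates.
        self.preferences.setdefault(user, set()).add(service)

    def update_status(self, user, status):
        self.status[user] = status
        # Fan the new status out to every service the user has linked.
        for service in self.preferences.get(user, ()):
            self.outbox.append((service, user, status))

server = PresenceServer()
server.link_service("user102", "Facebook")
server.update_status("user102", "jogging near the coffee shop")
print(server.outbox)  # one queued update for the single linked service
```

The point of the sketch is that the burden of updating presence moves from the user to the server: a single `update_status` call refreshes every linked application at once.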
Presence server 116 may include one or more interfaces (e.g., using one or more application programming interfaces (APIs)) to clients (e.g., thin clients) running on consumer computing and communication devices, such as cell phone 106, PDA 108, and notebook computer 112. The clients
may send, via push operation, information about the user to presence server 116, and receive from presence server 116 inferred information about the user and the user's life patterns based on sensed data. Data sensing clients may make use of sensor sharing and calibration functions to improve the quality of data gathered for each user, thereby enhancing the performance of presence server 116.
[0023] While processed user information is available (both for individual review and group sharing) via the web portal, APIs for the retrieval and presentation of (a subset of) this information are available through one or more plug-ins (i.e., a pull operation) to popular social network applications such as Skype, Pidgin, Facebook, and MySpace.
[0024] FIG. 2 shows (in an embodiment) presence server 116 of FIG. 1 interacting with applications 118, cell phone 106, PDA 108, embedded sensors 110 and notebook computer 112, collectively called consumer devices 220, in further detail. Presence server 116 is shown with storage 210, an inference engine 212 and a presentation engine 214. Storage 210 is used to store received sensor data and users' presence status 122, 124. Applications 118 may include instant messaging 202, social networks 204, multimedia 206, and blogosphere 208.
[0025] Inference engine 212 analyzes received sensor data and determines presence information (e.g., 122 and 124) for each user. This presence information is, for example, presented to designated applications running on one or more user devices (e.g., phone 106, PDA 108 and notebook computer 112). A user's presence information may be pushed to certain devices based upon the user's permission. That is, in certain embodiments, the user must allow others to view their presence information for the information to be made available to, and/or pushed to, other users.
[0026] Certain embodiments of system 100 support opportune peer-to-peer communication between consumer devices using available short range radio technologies, such as Bluetooth and IEEE 802.15.4. Communication between consumer devices and presence server 116 takes place according to the availability of the 802.11 and cellular data channels, which can be impacted both by the device feature set and by radio coverage. For devices that support multiple communication modes, communication can be attempted first using a TCP/IP connection over open 802.11 channels, second using GPRS/EDGE-enabled bulk or stream transfer, and finally SMS/MMS is used as a fallback, in certain embodiments of system 100.
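The transport ordering of paragraph [0026] (802.11 TCP/IP first, then GPRS/EDGE, then SMS/MMS) is a simple priority chain. A minimal sketch follows, with availability flags standing in for real radio and coverage checks, which are device-specific and not detailed in the application:

```python
# Sketch of the transport fallback in [0026]: try channels in priority order
# and use the first one currently available. The availability dictionary is a
# stand-in for actual radio/coverage probing on the device.

TRANSPORTS = ["802.11 TCP/IP", "GPRS/EDGE", "SMS/MMS"]

def choose_transport(available):
    """Return the highest-priority transport the device currently has, or None."""
    for transport in TRANSPORTS:
        if available.get(transport):
            return transport
    return None  # no channel available; a client might queue data and retry

print(choose_transport({"802.11 TCP/IP": False, "GPRS/EDGE": True, "SMS/MMS": True}))
# -> GPRS/EDGE
```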
A. Sensing
[0027] Illustratively, a sensing client (e.g., thin sensing client) is installed on a consumer device (e.g., cell phone 106, PDA 108, and computer 112) to periodically poll on-board sensors (both hardware and software) and to push the collected sensor data, via an available network connection (wired or wireless, e.g., 107, 109, 113), to presence server 116 for analysis and storage. For sensing modalities that are particularly resource taxing (especially for mobile devices), sensor sampling may be done on demand via an explicit query from presence server 116. Sampling rates and durations for each of the sensors are set in accordance with the needs of inference engine 212. Typically, the sensing clients use low rate sampling to save energy and switch to a higher sampling rate upon detection of an interesting event (i.e., set of circumstances), thereby improving sampling resolution for periods of interest while preserving power for periods of less interest. Given the pricing schemes of MMS/SMS and the battery drain implied by 802.11 or cellular radio usage, data may be compressed before sending to presence server 116, using standard generic compression techniques on raw data and/or domain-specific run-length encoding (e.g., for a stand/walk/run classifier, only send updates to presence server 116 when the state changes) to reduce communication cost and power usage. When using SMS, the maximum message size may be used to minimize the price per bit. Furthermore, preliminary data analysis (e.g., filtering, inference) may be migrated to consumer devices 220. Given the computational power of many new cellular phones, significant processing may be done on these mobile devices to reduce communication costs. However, aggregate (trans-user) analysis may be done within presence server 116.
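The duty-cycling and run-length-encoding strategy of paragraph [0027] can be sketched as follows. The sampling rates, event threshold, and toy stand/walk/run classifier are illustrative assumptions, not values disclosed in the application:

```python
# Sketch of the sampling strategy in [0027]: a sensing client polls at a low
# rate, switches to a high rate when an "interesting" event is detected, and
# transmits a classifier state only when it changes (run-length encoding).
# The rates, threshold, and classifier below are assumptions for illustration.

LOW_RATE_HZ = 0.1      # background sampling rate (energy saving)
HIGH_RATE_HZ = 2.0     # rate used while an interesting event is in progress
EVENT_THRESHOLD = 1.5  # accelerometer magnitude that triggers high-rate mode

def classify(magnitude):
    """Toy stand/walk/run classifier over accelerometer magnitude (in g)."""
    if magnitude < 1.1:
        return "stand"
    if magnitude < 2.0:
        return "walk"
    return "run"

class SensingClient:
    def __init__(self):
        self.rate_hz = LOW_RATE_HZ
        self.last_state = None
        self.uplink = []  # messages that would be pushed to the presence server

    def on_sample(self, magnitude):
        # Adapt the sampling rate to whether the event looks interesting.
        self.rate_hz = HIGH_RATE_HZ if magnitude >= EVENT_THRESHOLD else LOW_RATE_HZ
        state = classify(magnitude)
        # Run-length encoding: only report when the inferred state changes.
        if state != self.last_state:
            self.uplink.append(state)
            self.last_state = state

client = SensingClient()
for m in [1.0, 1.0, 1.3, 1.4, 2.5, 2.6, 1.0]:
    client.on_sample(m)
print(client.uplink)  # only the state transitions are sent
```

Seven samples here produce only four uplink messages, which is the communication-cost saving the paragraph describes.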
1) Hardware Sensors:
[0028] Cell phone 106 may represent one or more of the Nokia 5500 Sport, N80, N800 and N95 cell phones. PDA 108 may represent a Nokia N800. Cell phone 106 and PDA 108 may be combined in a phone/PDA hybrid such as the Apple iPhone. Embedded sensors 110 may represent one or more of a Nike+ system, a recreational sensor platform such as a Garmin Edge, a SkiScape, and a BikeNet.
[0029] The SkiScape platform is, for example, used at a ski area to sense information on skiers and/or the ski area environment. Sensing devices may be attached to skiers, and fixed nodes communicate with the skiers' sensing devices. The fixed nodes may also capture data such as images, sound, and radar information including snow depth. Additional information on SkiScape may be found in the following paper, which is incorporated herein by reference: Eisenman et al., SkiScape Sensing (Poster Abstract), in Proc. of Fourth ACM Conf. on Embedded Network Sensor Systems (SenSys 2006), Boulder, Nov. 2006.
[0030] The BikeNet system includes a mobile networked sensing system for bicycles. Sensors collect cyclist and environmental data along a cycling route. Application tasking and sensed data uploading occur when sensors come within radio range of a static sensor access point or via a mobile sensor access point along the cycling route. Additional information on BikeNet may be found in the following paper, which is incorporated herein by reference: Eisenman et al., The BikeNet Mobile Sensing System for Cyclist Experience Mapping, in Proc. of Fifth ACM Conf. on Embedded Networked Sensor Systems (SenSys 2007), Sydney, Australia, Nov. 2007.
[0031] Another example of embedded sensor 110 is a Bluetooth-enabled 3D accelerometer, sometimes referred to as a BlueCel. The BlueCel may be attached to a user or to another entity, such as a bicycle. Notebook computer 112 may represent one or more of Toshiba and Hewlett Packard laptop and/or desktop computers. Through a survey of the commonly available commercial hardware, including the examples listed above, the following hardware sensors are currently available on one or more commercial off-the-shelf (COTS) devices: embedded cameras, laptop/desktop web cameras, microphones, accelerometers, GPS, radio (e.g., Bluetooth device contact logs, 802.15.4 ranging, 802.11 localization), temperature sensors, light sensors, humidity sensors, magnetometers, button click sensors, and device state sensors (e.g., ringer off detectors). System 100 may exploit the availability of these sensors.
2) Virtual Software Sensors:
[0032] Software sensors are, for example, those that measure artifacts of other software that runs on the computing platform in an effort to understand the context of the user's behavior, mood, etc. They are "virtual" in that they do not sense physical phenomena, but rather sense electronic evidence ("breadcrumbs") left as the user goes about his daily routine. Examples of virtual software sensors include the following: a trace of recent/current URLs loaded by the web browser; a trace of recent songs played on the music player to infer mood or activity; and mobile phone call log mining for structure beyond what a cell phone bill commonly provides.
[0033] As an example of how hardware and software sensor samples are combined to infer activity or status, if recent web searches show access to a movie review site (e.g., moviefone.com), and if a call was recently made to a particular friend, and if the time of day and the day of week are consistent with movie theatre times, and if the cell phone ringer is turned off, it is highly probable that the user is at a movie theatre.
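The movie-theatre inference of paragraph [0033] is essentially a conjunction of hardware and software sensor evidence. A minimal rule-based sketch follows, in which the signal names and the evening-showtime window are illustrative assumptions rather than the application's actual inference algorithm:

```python
# Hedged sketch of the inference in [0033]: combine virtual software sensors
# (browser history, call log) with device state sensors (clock, ringer) into
# a single rule. Parameter names and the showtime window are assumptions.

def infer_at_movie(recent_urls, recent_calls, hour_of_day, ringer_on):
    visited_review_site = any("moviefone.com" in url for url in recent_urls)
    called_friend = len(recent_calls) > 0
    plausible_showtime = 18 <= hour_of_day <= 23  # assumed evening window
    # All four pieces of evidence must agree, including the ringer being off.
    return visited_review_site and called_friend and plausible_showtime and not ringer_on

print(infer_at_movie(
    recent_urls=["http://www.moviefone.com/theaters"],
    recent_calls=["Alice"],
    hour_of_day=20,
    ringer_on=False,
))  # -> True
```

In practice such hard conjunctions would likely be replaced by probabilistic scoring, but the sketch shows how independently weak signals become a confident inference when combined.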
B. Sensor Calibration
[0034] Sensor calibration is a fundamental problem in all types of sensor networks. Without proper calibration, sensor-enabled mobile devices (e.g., cell phones, PDAs and embedded sensor platforms, etc.) produce data that may not be useful or may even be misleading. There are many possible sources of error introduced into sensed data, including those caused by the device hardware itself. This hardware error can be broken down into error caused by irregularities of the physical sensor itself due to its manufacturing, sensor drift (sensor characteristics change because of age or damage), and errors resulting from integration of the physical sensor with the consumer device (e.g., cell phone 106). Physical sensor irregularities are compensated for at the factory, where a set of known stimuli is applied to the sensor to produce a map of the sensor output. To correct the device integration error, post-factory calibration of the sensor-enabled mobile device is required.
[0035] One example of a calibration protocol that may be used in system
100 is CaliBree. CaliBree is a distributed, scalable, and lightweight protocol that may
be used to automatically calibrate mobile sensor devices. It is assumed that in mobile people-centric sensor networks, there will be two classes of sensors with respect to calibration: calibrated nodes that are either static or mobile, and un-calibrated mobile nodes. In the following discussion, the nodes belonging to the former class are referred to as ground truth nodes. These ground truth nodes may exist as a result of factory calibration, or end user manual calibration. In CaliBree, un-calibrated nodes opportunistically interact with calibrated nodes to solve a discrete average consensus problem, leveraging cooperative control over their sensor readings. The average consensus algorithm measures the disagreement of sensor samples between the un-calibrated node and a series of calibrated neighbors. The algorithm eventually converges to a consensus among the nodes and leads to the discovery of the actual disagreement between the un-calibrated node's sensor and the calibrated nodes' sensors. The disagreement is used by the un-calibrated node to generate (using a best fit line algorithm) the calibration curve of the newly calibrated sensor. The calibration curve is then used to locally adjust the newly calibrated node's sensor readings.
[0036] In order for the consensus algorithm to succeed, the un-calibrated sensor devices compare data when sensing the same environment as the ground truth nodes. Given the limited amount of time mobile nodes may experience the same sensing environment during a particular rendezvous, and the fact that even close proximity does not guarantee that the un-calibrated sensor and the ground truth sensor experience the same environment, the consensus algorithm is run over time when un-calibrated nodes encounter different ground truth nodes. Additional information on CaliBree may be found in the following technical report, which is incorporated herein by reference: E. Miluzzo et al., CaliBree: A Self-Calibration Experimental System for Wireless Sensor Networks, in 5067/2008 Lecture Notes in Computer Science 314 (Springer Berlin/Heidelberg, 2008).
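The final CaliBree step described above, fitting a best-fit line to the discovered disagreements to produce a calibration curve, can be sketched with ordinary least squares. The sample values below are invented for illustration, and the consensus protocol itself is not reproduced:

```python
# Sketch of the last CaliBree step: given (raw reading, ground-truth reading)
# pairs discovered via consensus with calibrated neighbors, fit a best-fit
# line and use it to adjust subsequent readings. Data values are invented.

def fit_line(points):
    """Ordinary least-squares fit y = a*x + b over (x, y) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Raw readings paired with readings from calibrated ground-truth neighbors;
# the un-calibrated sensor here reads roughly 2 units low.
samples = [(10.0, 12.1), (20.0, 22.0), (30.0, 31.9), (40.0, 42.2)]
gain, offset = fit_line(samples)

def calibrate(raw):
    # Apply the calibration curve locally to each new raw reading.
    return gain * raw + offset

print(round(calibrate(25.0), 2))  # corrected reading, close to 27
```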
C. Sensor Sharing
[0037] In system 100, mobile sensing devices (e.g., cell phone 106, PDA 108, embedded sensors 110) locally manage sensing requests both on behalf of the sensing clients (e.g., thin sensing clients) running on the mobile device and/or services running remotely on presence server 116. These sensing requests may be
multidimensional, with each requested sensor sample comprising a primary sensor type and a metadata or context vector. Context is the set of circumstances or conditions that surround a particular sensor sample. This context vector included in a request may indicate a location or time at which a sample is to be taken, and may include more sophisticated context tags such as sensor orientation (e.g., facing north, mounted on hip) and inferred custodian activity (e.g., walking, running, sitting). This context may also include weights that indicate the relative importance of one or more context elements.
[0038] Whether a given request is injected into the system locally by a sensing client running on a mobile sensor node, or by presence server 116, difficulties may arise in tasking an appropriate mobile sensor node with that given request. Considered strictly, a mobile sensor node is only appropriate to handle a given request if it is equipped with the proper sensor, and if it has a context vector that matches that of the request. Strict handling of a request may lead to excessive delay and/or a loss of data fidelity in successfully servicing the request, due to three connected issues: the uncontrolled mobility of mobile sensing device users; the location (or effective reach) of the request injection point with respect to the sensing target location; and the mismatch between the equipped sensors and context of available mobile sensor nodes and the sensors and context required by a request. Sensor sharing addresses this last difficulty.
[0039] Within system 100, not every consumer device 220 is equipped with every sensor type. Node heterogeneity arises due to a number of reasons, including cost (some sensors are more expensive, or rare), interest (some sensors are of more importance in different interest groups, or to individual users), and hardware evolution (hardware evolves and many platform generations will be in use simultaneously; e.g., many newer mobile phone models are equipped with a camera and/or accelerometer but many older/cheaper models are not). Due to this heterogeneity, mismatches between the sensors required by sensing client requests and the sensors with which a given mobile node is equipped are likely to arise.

[0040] Sensor equipment and context constraints that a mobile sensing device must meet to be tasked for a given application request are loosened by allowing sensor sharing between mobile devices (e.g., a buddy's device). At a high
level, in-situ sensor sharing is a function that allows an under-qualified node to borrow sensor data from a qualified node in an effort to satisfy a sensing request while maintaining or improving on the data fidelity, and in a more timely manner.
[0041] In an example scenario, a student walks from his dorm to school to attend a mid-day class. The student carries a mobile phone equipped with a suite of simple sensors (e.g., accelerometer, temperature, microphone, camera), a GPS (e.g., Nokia N95), and a low power radio such as Bluetooth or IEEE 802.15.4, in his pocket. As he leaves the dorm building, his phone is queried via the cellular data channel to measure outdoor light intensity based upon a request made by the student's mother accessing a portal page of system 100. Recognizing that the phone is in the pocket using data inferencing, as the student walks along, the sensor sharing service on the phone periodically broadcasts to discover a node that is out of the pocket (e.g., a hand-held or hip-mounted device) that can answer the light sensing request. If such a node responds and shares its light sensor reading, the outdoor light intensity query may be answered. Further along, the student receives an Ozone Alert text message on his phone. Curious about the ozone level in his vicinity, he launches an applet on his phone that is programmed to request this information from presence server 116. However, since the phone is not equipped with an ozone sensor, a request for the local ozone information is broadcast by the sensor sharing service. A bus-mounted platform (e.g., UMass DieselNet) with an attached ozone sensor replies, sharing the ozone reading, which is displayed locally on the cell phone screen.
[0042] The sensor sharing primitive comprises a distributed sharing decision algorithm running on the mobile user devices (e.g., cell phone), resource and context discovery protocols, in-situ sensor sharing protocols and algorithms that adapt according to the radio and sensed environment, and a context analysis engine that provides the basis for sharing decision making.
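The borrow-on-broadcast behavior in the scenario above can be sketched as follows. This is a toy model under stated assumptions: nodes are dictionaries, discovery is a simple iteration over already-discovered neighbors, and the field name "readings" is invented for illustration; the actual discovery protocols operate over short range radio:

```python
def borrow_reading(requester, neighbors, sensor_type):
    """Sketch of in-situ sensor sharing: an under-qualified node polls
    nearby nodes and borrows the first qualified reading."""
    if sensor_type in requester.get("readings", {}):
        return requester["readings"][sensor_type]  # no sharing needed
    for peer in neighbors:
        reading = peer.get("readings", {}).get(sensor_type)
        if reading is not None:
            return reading  # borrowed from a qualified peer
    return None  # request stays pending until a qualified node is found

phone = {"readings": {"light": 850}}    # has a light sensor, no ozone
bus = {"readings": {"ozone": 42}}       # bus-mounted ozone platform
```

In the example scenario, the phone answers the light query locally but satisfies the ozone query by borrowing the bus platform's reading.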
D. Analysis
[0043] Sensed data is sent from device clients in consumer devices 220 to presence server 116 and is processed by one or more analysis components (e.g., inference engine 212) resident within presence server 116. This analysis component may combine historical per-user information with inferences derived from
combinations of current data derived from multiple sensors to determine the presence status of the user. This presence status may include objective items such as location and activity, and subjective items such as mood and preference. While a number of data fusion, aggregation, and data processing methods are possible, the following are examples of analysis/inference outputs that are used to generate the sensed presence within system 100.
[0044] Location is a key function in any sensing system for providing geographical context to raw sensor readings. When explicit localization services like GPS are not available, due to hardware limitations or issues with satellite coverage, the location of the client devices may be inferred from observed WiFi (e.g., access point identifiers), the Skyhook service, Bluetooth (e.g., identifying location from static devices), cellular base station neighborhoods, and/or other unique sets of sensed data in a manner similar to ambient beacon localization. Ambient beacon localization may be used to determine a sensor's location by use of supervised learning algorithms that allow the sensor to recognize physical locations that are sufficiently distinguishable, in terms of sensed data, from the other sensors in a field. Additional information on ambient beacon localization may be found in the following paper, which is incorporated herein by reference: Nicholas D. Lane et al., Ambient Beacon Localization: Using Sensed Characteristics of the Physical World to Localize Mobile Sensors, in 2007 Proceedings of the 4th Workshop on Embedded Networked Sensors 38 (2007).

[0045] Human activity-inferring algorithms are incorporated within inference engine 212 to log and predict a user's behavior. A simple classifier to determine whether a user is stationary or mobile may be built from several different data inputs, alone or in combination (e.g., changes in location by any possible means, accelerometer data). Accelerometer data may be analyzed to identify a number of physical activities, including sitting, standing, using a mobile phone, walking, running, stair climbing, and others.
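One way such a simple stationary/mobile classifier could be built from accelerometer data is sketched below. Thresholding the variance of the acceleration magnitude is one plausible realization, not the specific classifier of the disclosure, and the threshold value is an assumption:

```python
import math

def is_mobile(samples, threshold=0.5):
    """Classify stationary vs. mobile from 3-axis accelerometer samples
    by thresholding the variance of the acceleration magnitude."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return var > threshold  # a still device shows near-constant gravity

# A resting phone reads roughly constant gravity on one axis; a carried
# phone shows large magnitude swings from each stride.
resting = [(0.1, 0.0, 9.8)] * 8
carried = [(0.0, 0.0, 9.8), (3.0, 2.0, 12.0)] * 4
```

In practice this decision could be fused with the location-based inputs mentioned above (changes in WiFi or cell neighborhoods) before being reported to inference engine 212.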
[0046] Human behavior is often a product of the environment. To better understand a user's behavior, it is useful to quantify the user's environment. Image and sound data may be collected and analyzed to derive the noisiness/brightness of the environment. Conversation detection and voice detection algorithms may be used
to identify the people in a user's vicinity that may impact behavior and mood of the user.
[0047] Part of a person's daily experience is the environment where the person lives and spends most of the time. For example, health-related issues of interest may include the level of an individual's exposure to particulates (e.g., pollen) and pollution. By incorporating mechanisms that enable air quality monitoring around the individual through opportunistic interaction with mobile sensors and/or static pre-deployed infrastructure (e.g., sensors 114), these health-related issues of interest may be predicted and possibly prevented.
E. Presentation
[0048] Since communication devices, and in particular mobile communication devices, provide varying amounts of application support (e.g., web browser, Skype, and Rhythmbox on a laptop; web browser and Skype on the N800; SMS only on the Motorola L2), a presentation engine 214 provides a variety of means for sending the human sensing presence (e.g., presence status 122, 124), distilled from the sensed data by presence server 116, for display on the end user device (e.g., consumer devices 220).
1) Text only: Email/SMS
[0049] More limited platforms, such as older/low-end cell phones and PDAs, likely do not have the capability to browse the Internet and have a limited application suite. These platforms may still participate as information consumers in the architecture of system 100 by receiving text-based updates from SMS/email generator 216, rather than graphical indicators of status embedded in other applications.
2) Browser: Web Portal
[0050] Platforms that support at least general Internet browsing allow users to access web portal 218, the content of which is served from storage 210. In certain embodiments, the particular visualizations generated by web portal 218 may be customized to a degree in a manner similar to Google Gadget development/configuration on personalized iGoogle pages. Web portal 218 thus may
provide a flexible and complete presentation of a user's collected data (e.g., a data log) and data shared by other users (e.g., via a buddy list). In certain embodiments, the user may also configure all aspects of the associated account, including fine-grained sharing preferences for the user's "buddies", through web portal 218.
3) Application-specific Plug-ins
[0051] Depending on the application support on the user's device, one or more of the following exemplary plug-ins may be used. In certain embodiments, in addition to status information rendered by the plug-in in the applications' GUI, the plug-in provides click-through access to the web portal 218 - both to the user's pages and the shared section of any buddy's pages.
• Instant messaging client buddy lists show an icon with a particular status item for the buddy.
• Facebook and MySpace pages have plug-ins to show your status and that of your friends.

• iGoogle gadgets show various status items from a device user and his buddies. The iGoogle page periodically refreshes itself, so it follows the data pull model from presence server 116.
• Photography applications have plug-ins to allow pictures to be stamped with metadata like location (minimally) and other environmental (light, temperature) and human status elements.
F. Privacy Protection
[0052] The privacy of users registered with system 100 may be protected through a number of mechanisms. Users' raw sensor feeds and inferred information (collectively considered as the user's sensed presence) may be securely stored within storage 210 of presence server 116, but may be shared by the owning user according to group membership policies. For example, the recorded sensor data and presence is available only to users that are already part of the user's buddy list. In certain embodiments, buddies are determined from the combination of buddy lists imported by registered services (Pidgin, Facebook, etc.), and buddies may be added based on profile matching. Thus, certain embodiments of system 100 inherit and leverage the
work already undertaken by a user when creating his buddy lists and sub-lists (e.g., in Pidgin, Skype, Facebook) in defining access policies to the user's presence data.
[0053] In certain embodiments, users may decide whether to be visible to other users using a buddy search service or via a buddy beacon service. In certain embodiments, users are also given the ability to further apply per-buddy policies to determine the level of data disclosure on a per-user, per-group, or global level. Certain embodiments of system 100 follow the Virtual Walls model, which provides different levels of disclosure based on context, enabling access to the complete sensed/inferred data set, a subset of it, or no access at all. For example, user 102 might allow user 104 to take pictures from cell phone 106 while denying camera access to other buddies; user 102 might make the location trace of user 102 available to user 104 and to other buddies. The disclosure policies are, for example, set from the user's account configuration page within web portal 218.
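The three disclosure levels of the Virtual Walls model (complete data set, a subset, or no access) can be sketched as a per-buddy policy lookup. The policy encoding and level names here are assumptions made for illustration; the disclosure would actually be driven by the account configuration page within web portal 218:

```python
def visible_fields(policy, buddy, presence):
    """Apply a per-buddy disclosure policy: full access, a configured
    subset of fields, or nothing (the default for unknown users)."""
    level, subset = policy.get(buddy, ("none", set()))
    if level == "full":
        return dict(presence)
    if level == "partial":
        return {k: v for k, v in presence.items() if k in subset}
    return {}

# Example: user 102 grants user 104 full access, grants another buddy
# only location, and discloses nothing to everyone else.
policy = {"user_104": ("full", set()),
          "other_buddy": ("partial", {"location"})}
presence = {"location": "campus", "camera": "enabled", "mood": "happy"}
```

Such a check would be applied by presence server 116 before any shared data is rendered on a buddy's portal page or plug-in.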
[0054] In addition to user-specific data sharing policies, certain embodiments of system 100 compute and share aggregate statistics across the global user population. For this service, shared information is, for example, made anonymous and averaged, and access to the information is further controlled by a quid pro quo requirement.
Services

[0055] Certain embodiments of system 100 help support the following goals: (i) provide information to individuals about their life patterns, and (ii) provide more texture to interpersonal communication (both direct and indirect) using information derived from hardware and software sensors on user devices. A number of services, built upon the architecture of system 100, that aim to meet these goals are described below.
A. Life Patterns
[0056] Enriching the concept put forward in the MyLifeBits project, certain embodiments of system 100 automatically sense and store location traces, inferred activity history, history of sensed environment (e.g., sound and light levels), rendezvous with friends and enemies, web search history, phone call history, and/or VOIP and text messaging history. In this way, system 100 may provide context in the
form of sensed data to the myriad other digital observations being collected. Such information may be of archival interest to the user as a curiosity, and may also be used to help understand behavior, mood, and health.
B. My Presence

[0057] As indicated by the increasing popularity of social networking sites like Facebook and MySpace, people (especially youth) are interested both in actively updating aspects of their own status (i.e., personal sensing presence), and surfing the online profiles of their friends and acquaintances for status updates. However, it is troublesome to have each user manually update more than one or two aspects of his or her sensed presence on a regular basis.
[0058] Certain embodiments of system 100 add texture and ease of use to online electronic avatars (e.g., the avatars of Facebook and MySpace) by automatically updating each user's social networking profile with inferred and current information (e.g., "on the phone", "drinking coffee", "jogging at the gym", "at the movies") that is gleaned from hardware and software sensors by presence server 116.
C. Friends Feeds
[0059] In the same way people subscribe to news feeds or blog updates, and given the regularity with which users of social networking sites browse their friends' profiles, there is clearly a need for a profile subscription service similar to really simple syndication (RSS) (Facebook has a similar service for the data and web interface it maintains). Under this model, buddies' status updates might be event driven; a user asks to be informed of a particular buddy's state (e.g., walking, biking, lonely, with people at the coffee shop) at, for example, the user's cell phone.
D. Social Interactions

[0060] Using voice detection, known device detection (e.g., cell phone
Bluetooth MAC address), and life patterns, group meetings and other events that involve groupings of people may be detected by embodiments of system 100. In social group internetworking, friends are often interested in who is spending time with whom. Such embodiments of System 100 allow individuals to detect when groups of their buddies are meeting (or even when an illicit rendezvous is happening). A further
level of analysis may determine whether a conversation is ongoing, and further group dynamics (e.g., who is the dominant speaker).
E. Significant Places
[0061] Have you ever found yourself standing in front of a new restaurant, or wandering in an unfamiliar neighborhood, wanting to know more? A call to directory assistance is one option, but what you really want are the opinions of your friends. Phone calls to survey each of them are too much of a hassle. Or alternatively, maybe you just want to analyze your own routine to find out where you spend the most time. To satisfy both aims, certain embodiments of system 100 support the identification and sharing of significant places in each user's life patterns.
[0062] Significant places are derived through a continuously evolving clustering, classification, and labeling approach. In the first step, location traces are collected from available sources (e.g., WiFi association, GPS, etc.) for the given user. Since location traces always have some level of inaccuracy, the sensed locations are clustered according to their geographical proximity. The importance of a cluster is identified by considering time-based inputs such as visitation frequency, dwell time, and regularity. Once significant clusters are identified, a similarity measure is applied to determine how "close" the new cluster is to other significant clusters already identified (across a user's buddies) in the system. If the similarity is greater than a threshold, then the system automatically labels (e.g., "Home", "Coffee shop", etc.) the new cluster with the best match. The user may amend the label, if the automatic label is deemed insufficient. Finally, the user has the option of forcing the system to label places considered "insignificant" by the system (e.g., due to not enough visitations yet).

[0063] As implied above, embodiments of system 100 keep the labels and the cluster information of important clusters for all users, applying them to subsequent cluster learning stages and offering to users a list of possible labels for given clusters. In addition to this "behind the scenes" type of place label sharing, in certain embodiments, users may also explicitly expose their significant places to their buddies or globally to all users, using methods (e.g., portal, plug-ins) previously described. Accordingly, if a user is visiting a location that is currently not a
significant cluster to him based on his own location/time traces, the point can be matched against shared buddies' clusters.
[0064] Once the significant places of users have been automatically identified, and either automatically or manually tagged, users may annotate their significant places. The annotation may include, for example, identifying the cafe that has good coffee or classifying a neighborhood as one of dangerous, safe, hip, or dull.
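The auto-labeling step, where a new cluster inherits the label of the best-matching cluster already known to the system, can be sketched as follows. Using planar Euclidean distance between cluster centers as the similarity measure (and a distance threshold in meters) is a simplifying assumption; the disclosed similarity measure may also incorporate WiFi/Bluetooth neighborhoods and time statistics:

```python
import math

def label_cluster(new_center, labeled_clusters, threshold=100.0):
    """Auto-label a new significant cluster with the label of the
    closest already-labeled cluster, if within `threshold` meters;
    otherwise return None so the user can supply a label."""
    best_label, best_dist = None, float("inf")
    for center, label in labeled_clusters:
        d = math.dist(new_center, center)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label if best_dist <= threshold else None

# Labeled clusters (possibly contributed across a user's buddies).
labeled = [((0.0, 0.0), "Home"), ((500.0, 0.0), "Coffee shop")]
```

A cluster near an existing labeled place inherits its label automatically, while a genuinely new place is left for the user to name or amend.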
F. Health Monitoring
[0065] As many people are becoming more health-conscious in terms of diet and lifestyle, certain embodiments of system 100 also provide individuals with health aspects of their daily routines. Such embodiments of system 100 are able to estimate exposure to ultraviolet light, sunlight (e.g., for Seasonal Affective Disorder (SAD) afflicted users), and noise, along with number of steps taken (distance traveled) and number of calories burned. These estimates are derived by combining inference of location and activity of the users with weather information (e.g., UV index, pollen and particulate levels) captured by presence server 116 from the web, for example.
G. Buddy Search
[0066] The past ten years have seen the growth in popularity of online social networks, including chat groups, weblogs, friend networks, and dating websites. However, one hurdle to using such sites is the requirement that users manually input their preferences, characteristics, and the like into the site databases. Certain embodiments of system 100 provide the means for automatic collection and sharing of this type of profile information. Such embodiments of system 100 automatically learn and allow users to export information about their favorite locations or "haunts", what recreational activities they enjoy, and what kind of lifestyle they are familiar with, along with near real-time personal presence updates sharable via application (e.g., Skype, MySpace) plug-ins and web portal 218. Further, as many popular IM clients allow searching for people by name, location, age, etc., certain embodiments of system 100 enable searching of users based upon a data mining process that also involves interests (like preferred music, significant places, preferred sport, etc.).
H. Buddy Beacon
[0067] The buddy search service of embodiments of system 100 is adapted to facilitate local interaction as well. In this mode, a user configures the buddy search service to provide instant notification to his mobile device if a fellow user has a profile with a certain degree of matching attributes (e.g., significant place for both is "Dirt Cowboy coffee shop", both have primarily nocturnal life patterns, similar music or sports interests). All this information is automatically mined via system 100 sensing clients running on user consumer devices 220; the user does not have to manually configure his profile information. Devices with this service installed periodically broadcast the profile aspects the user is willing to advertise - a Buddy Beacon - via an available short range radio interface (e.g., Bluetooth, 802.15.4, 802.11). When a profile advertisement is received that matches, the user is notified via his mobile device.
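The "certain degree of matching attributes" test on a received Buddy Beacon advertisement can be sketched as a set-overlap check. Encoding profile aspects as tag strings and using a simple minimum-overlap count are assumptions for the sketch, not the disclosed matching algorithm:

```python
def beacon_match(advertised, local_profile, min_overlap=2):
    """Count shared profile attributes in a received Buddy Beacon
    advertisement; notify when the overlap is large enough."""
    shared = set(advertised) & set(local_profile)
    return (len(shared) >= min_overlap, shared)

# Profile aspects the local user and the advertising peer are willing
# to expose, mined automatically by the sensing clients.
mine = {"place:Dirt Cowboy coffee shop", "pattern:nocturnal", "music:jazz"}
received = {"place:Dirt Cowboy coffee shop", "pattern:nocturnal", "sport:ski"}
matched, shared = beacon_match(received, mine)
```

Only the advertised subset of the profile ever leaves the device, so the privacy policies of paragraph [0053] still govern what the beacon may contain.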
I. "Above Average?"

[0068] There is much interest in statistics. For example, people may want to know whether they are popular, how they measure up, or whether they have a comparatively outgoing personality. By analyzing aggregate sensor data collected by its members, certain embodiments of system 100 provide such statistical information on items such as the top ten most common places to visit in a neighborhood, the average time spent at work, and many others. Such embodiments of system 100 make this aggregate information available to users; each user may configure their web portal page to display this information as desired. Comparisons are available both against global averages and group averages (e.g., those of a user's buddies). Tying in with the Life Patterns service, users may also see how their comparative behavior attributes change over time (i.e., with the season, semester, etc.). In certain embodiments of system 100, the normal privacy model of system 100 is based on buddy lists, and therefore, each user must manually opt in to this global sharing of information, even though the data is made anonymous through aggregation and averaging before being made available. However, in such embodiments, access to the global average information is only made available to users on a quid pro quo basis.
[0069] FIG. 3 is a flowchart illustrating one exemplary method 300 for injecting sensed presence into social networking applications. In step 302, method 300 receives sensor data associated with a user. In one example of step 302, a client (e.g., a thin client) within cell phone 106 samples sensors within cell phone 106 and sends this sensor data to presence server 116. In step 304, method 300 infers presence information of the user based upon the sensor data. In one example of step 304, inference engine 212 analyzes sensor data received from cell phone 106, PDA 108 and/or embedded sensors 110 and generates presence status 122. In step 306, method 300 stores sensor data and inferred presence status within a database. In one example of step 306, presence server 116 stores sensor data and presence status within storage 210. In step 308, method 300 sends the presence status to a social networking server based upon the user's preferences. In one example of step 308, user 102 defines preferences within presence server 116 to send presence status 122 to social networking server 126, thereby automatically updating presence for user 102 within Facebook.
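The four steps of method 300 can be sketched as a small pipeline. The function signatures, the callback style, and the preference key name are assumptions chosen for illustration; in the disclosed system these roles are played by the device clients, inference engine 212, storage 210, and social networking server 126:

```python
def method_300(sensor_data, infer, store, prefs, push):
    """Sketch of method 300: receive sensor data (step 302), infer
    presence (step 304), store both (step 306), and conditionally
    push the presence status to a social network (step 308)."""
    presence = infer(sensor_data)             # step 304
    store(sensor_data, presence)              # step 306
    if prefs.get("auto_update_profile"):      # user-defined preference
        push(presence)                        # step 308
    return presence

db, pushed = [], []
presence = method_300(
    {"accelerometer": [0.1, 2.3]},            # step 302: received data
    infer=lambda d: "jogging at the gym",     # stand-in inference engine
    store=lambda d, p: db.append((d, p)),     # stand-in storage 210
    prefs={"auto_update_profile": True},
    push=pushed.append,                       # stand-in social network
)
```

Gating step 308 on the user's preferences mirrors the privacy model of paragraphs [0052]-[0053]: nothing reaches the social networking server unless the owning user has opted in.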
Example of Implementation
[0070] The following is a description of an embodiment of system 100. It should be understood that the following description is provided merely as one example of how system 100 may be implemented. System 100 may be implemented in other manners in accordance with the previous description, and the scope of the invention.
A. Sensing
[0071] In this example, sensing clients are implemented on the following COTS hardware:

• a Nokia 5500 Sport cell phone (including the Symbian operating system, a 3D accelerometer, and BlueTooth capability);

• a Nokia N80 cell phone (including the Symbian operating system, 802.11b/g capability, and BlueTooth capability);

• a Nokia N95 cell phone (including the Symbian operating system, 802.11b/g capability, BlueTooth capability, and GPS capability);

• a Nokia N800 PDA (including the Linux operating system, 802.11b/g capability, and BlueTooth capability); and

• Linux notebook computers.
[0072] Each sensing client in this example is configured to periodically push its sensed data to presence server 116. The following is a description of how sensing clients are implemented on the COTS hardware.
• A Perl plugin to the Rhythmbox audio player on the Linux laptop and the Nokia N800 pushes the current song to presence server 116.
• A Python script samples the 3D accelerometer on the Nokia 5500 Sport at a rate that supports accurate activity inference.
• The BlueTooth and 802.11 neighborhoods (MAC addresses) of the sensing clients are periodically collected using a Python script. In this example of system 100, users have the option to register the BlueTooth and 802.11 MAC addresses of their devices with the system. In this way, presence server 116 can convert MAC addresses into human-friendly neighbor lists.
• A Python script captures camera and microphone samples on the Nokia N80 and Nokia N95 platforms. Additionally, the EXIF image metadata are captured and analyzed.

• A Perl plugin to Pidgin pushes IM buddy lists and status to presence server 116.
• A Perl plugin to Facebook pushes Facebook friend lists to presence server 116.
• A Python script periodically samples the GPS location on the Nokia N95.
• Linux libraries compiled for the notebook computers and the Nokia N800 periodically sample the WiFi-derived location using Skyhook and push the location to presence server 116.
[0073] A BlueTooth enabled accelerometer, or BlueCel, was also used in this example of system 100. The BlueCel extends the capability of BlueTooth enabled devices, and the BlueCel's application is determined by its placement. For example, the BlueCel may be used to analyze a user's weight lifting or bicycling activities by placing the BlueCel on a weight stack or bicycle pedal, respectively. The BlueCel is implemented, for example, from a Sparkfun WiTilt module, a Sparkfun LiPo battery charger, and a LiPo battery.
[0074] In this example of system 100, a Python script reads accelerometer readings from the BlueCel over the BlueTooth interface. A sensing client menu allows the user to tell system 100 what the application is (e.g., weight lifting or bicycling), thereby allowing the client to set the appropriate sampling rate of the BlueCel's accelerometer. The BlueCel's data is tagged with the application so that presence server 116 can properly interpret the data.

[0075] Furthermore, in this example of system 100, existing embedded sensing systems accessible via IEEE 802.15.4 radio are leveraged by integrating the SDIO-compatible Moteiv Tmote Mini into the Nokia N800 device. Such existing embedded sensing systems are examples of embedded sensors 110.
B. Analysis

[0076] Unprocessed or semi-processed data is pushed by sensing clients running on user devices to presence server 116 in this example of system 100. A MySQL database on presence server 116 stores and organizes the incoming data, which is accessible via an API instantiated as a collection of PHP, Perl, and Bash scripts. The Waikato Environment for Knowledge Analysis ("WEKA") workbench is used for clustering and classification.
[0077] The following is a description of how some of the services discussed above are implemented in this example of system 100:
[0078] An activity classifier determines whether a user is standing, walking, or running. The activity classifier makes this determination from features in data from either the Nokia 5500 Sport or the BlueCel accelerometer. Examples of such features include peak and RMS frequency and peak and RMS magnitude. The activity classifier operates on a mobile device (i.e., one of the Nokia devices or the notebook computers of this example) to avoid the cost (energy and monetary) of sending complete raw accelerometer data via SMS to presence server 116.

[0079] An indoor/outdoor classifier determines whether a user is indoors or outdoors. The classifier uses a feature vector including a number of elements to
make the classifier robust to different types of indoor and outdoor environments. Such features include the following: the ability of the mobile device to acquire a GPS estimate; the number of satellites seen by GPS; the number of WiFi access points and BlueTooth devices seen, as well as their signal strengths; the frequency of ambient light (looking for the AC-induced flicker); and the differential between the temperature measured by the device and the temperature read via a weather information feed (to detect indoor climate control).
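A toy voting rule over the indoor/outdoor feature vector just described is sketched below. The specific thresholds, the voting scheme, and the field names are assumptions; the actual classifier is trained with the WEKA workbench mentioned above rather than hand-coded:

```python
def indoor_outdoor(f):
    """Toy majority vote over the indoor/outdoor feature vector."""
    votes = 0
    votes += 1 if f["gps_fix"] else -1          # GPS lock suggests outdoors
    votes += 1 if f["gps_satellites"] >= 4 else -1
    votes += -1 if f["wifi_ap_count"] > 5 else 1  # dense WiFi: indoors
    votes += -1 if f["light_flicker_hz"] in (50, 60) else 1  # AC flicker
    votes += -1 if abs(f["temp_delta_c"]) > 5 else 1  # climate control
    return "outdoor" if votes > 0 else "indoor"

outdoors = {"gps_fix": True, "gps_satellites": 7, "wifi_ap_count": 1,
            "light_flicker_hz": 0, "temp_delta_c": 1}
indoors = {"gps_fix": False, "gps_satellites": 0, "wifi_ap_count": 12,
           "light_flicker_hz": 60, "temp_delta_c": 8}
```

Each feature contributes one vote, so a missing or noisy signal (e.g., no GPS fix under tree cover) cannot flip the decision on its own.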
[0080] A mobility classifier determines whether a user is stationary, walking, or driving. The mobility classifier considers changes to a radio neighbor set. Additionally, the mobility classifier considers the relative signal strengths, both for individual neighbors and the aggregate across all neighbors, for BlueTooth, WiFi, and GSM radios of the mobile devices. The mobility classifier maps changes in the radio environment (i.e., neighbors, received signal strength) to speed of movement. The result of the aforementioned indoor/outdoor classifier is also included in the feature vector. Location traces are omitted due to their relatively high error with respect to the speed of human motion.
[0081] Using Matlab processing on presence server 116, a noise index (expressed in decibels) is generated from audio samples captured from the Nokia N80's and N95's microphones. Similarly, a brightness index (ranging from 0 to 1) is generated on presence server 116 using Matlab from images captured from the Nokia N80's and N95's cameras. The sound and brightness indices help presence server 116 infer information about a person's surroundings. For example, the noise index is combined over time to estimate the cumulative effect of the sound environment on a user's hearing. As another example, the brightness index helps determine the positive effect of sunlight (when combined with the indoor/outdoor classifier) on those afflicted with seasonal affective disorder. Additionally, a classifier based on a voice detection algorithm determines whether a user is engaged in conversation.
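A noise index in decibels can be computed from raw microphone samples via their RMS amplitude; the sketch below (in Python rather than the Matlab used in this example, with an assumed full-scale reference level of 1.0) shows the calculation:

```python
import math

def noise_index_db(samples, reference=1.0):
    """Noise index in decibels: 20*log10 of the RMS sample amplitude
    relative to a reference level (assumed full scale = 1.0)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20.0 * math.log10(max(rms, 1e-12) / reference)

# A constant-amplitude signal at 10% of full scale sits at -20 dBFS.
quiet = noise_index_db([0.1] * 100)
```

Accumulating these per-sample-window indices over time yields the cumulative sound-exposure estimate described above.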
[0082] A user's significant places are derived by analyzing location traces, mobility time statistics, and other data inputs. Raw location data is first clustered using the EM algorithm. Clusters are subsequently mapped against time statistics (viz., visitation frequency, dwell time, regularity, time of day, weekday/weekend, AM/PM) and other information (viz., indoor/outdoor, current and previous mobility class, number and composition of people groups visible in a location) to determine importance. Additionally, WiFi and Bluetooth MAC addresses of neighbors are used to differentiate between overlapping clusters. Finally, a similarity measure is computed between the new cluster and existing clusters known by the system.

[0083] In this example, system 100 maintains generic labels for significant clusters. However, users may alias the clusters as well to give more personally meaningful or group-oriented names. The clustering in this example is adaptive in that the model changes over time depending on how the mobility trace of the user (and other system users) evolves - that is, the significance of a place may evolve over time. While it is often advantageous to relate recognized significant clusters to physical locations (i.e., coordinates), in this example, system 100 also enables the recognition of significant places for devices that do not have access to absolute localization capabilities by using local region recognition based on what features are available to the device, such as WiFi, Bluetooth, and GSM capabilities. Accordingly, a location cluster need not be an aggregated set of true coordinate estimates, but can alternately comprise a set of location recognition estimates.
[0084] In this example, the number of calories burned is estimated by combining the inference of walking from the activity classifier, the time spent walking, and an average factor of calories burned per unit time when walking at a moderate pace. Further, exposure to ultraviolet light is estimated by combining the inference of walking or running or standing, the inference of being outdoors, the time spent, and a feed to a web-based weather service to learn a current UV dose rate. A similar technique is applied to estimate pollen exposure (tree, grass, weed) and particulate exposure.

[0085] As discussed above, the BlueCel facilitates many application-specific data collection possibilities. Application-specific data analysis tools may be developed for the BlueCel to support applications including bicycle riding analysis (BlueCel placed on the pedal), golf swing analysis (BlueCel affixed to the club head), weight lifting analysis (e.g., to determine exercise motion correctness to avoid injury), and workout logging (BlueCel affixed to the wrist).
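The calorie and UV estimates of paragraph [0084] reduce to time-times-rate products, as sketched below. The default burn rate of 4.0 kcal per minute is an illustrative assumption, not a value from the disclosure, and the UV dose rate would in practice come from the web-based weather feed:

```python
def calories_walking(minutes_walking, kcal_per_minute=4.0):
    """Calories burned: inferred walking time times an average burn
    factor for a moderate pace (the default rate is an assumption)."""
    return minutes_walking * kcal_per_minute

def uv_exposure(minutes_outdoors, uv_dose_rate):
    """UV exposure: inferred outdoor time times the current dose rate
    learned from a web-based weather service feed."""
    return minutes_outdoors * uv_dose_rate
```

The same pattern (inferred activity/outdoor time multiplied by an externally supplied rate) covers the pollen and particulate exposure estimates as well.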
C. Presentation
[0086] In this example, the user's processed sensor data can be viewed via a web browser by logging into the user's account on presence server 116. Additionally, a subset of the user's status information is made available via data push and data pull mechanisms to the user's buddies through their system portal pages, and through plugins to social networking applications.
[0087] In this example, the data a user shares with his buddies may be rendered via a number of simple icons that distill the current sensing presence of the user. FIG. 4 shows a snapshot 400, which is an example of a user's data page on web portal 218. Right pane 402 includes Buddy lists loaded from registered Pidgin and Facebook accounts, and the Buddy lists are annotated with icons representing the shared data. The icons offer click-through access to a fuller representation of the shared user data. In the example of FIG. 4, buddies Patty and Selma are inferred to be standing and in a conversation, while buddy Lenny is inferred to be at the coffee shop, as indicated by the color of the activity icons next to each buddy's name.
[0088] On login to system 100 in this example, left pane 404 shows the logged-in user's data. Additionally, left pane 404 shows a buddy's data if an icon next to that buddy's name is clicked. In the example of FIG. 4, the logged-in user Homer Simpson has clicked on the icon for his buddy Patty. Patty has a sharing policy that allows the display of the following data as shown in left pane 404: Patty's buddies in her vicinity (determined via BlueTooth and WiFi MAC address recognition), Patty's trace of her last significant places visited, etc. In this example, the link 406 at the bottom of the page to take a picture ("peek") from Patty's cell phone is disabled; Patty has disabled access for Homer Simpson to image data in her privacy profile. Instead, Homer has clicked the link 408 to view the sound level history plot 410 for Patty, ostensibly to see how noisy Selma is. Icons 412 denote that a buddy imported from the user's Facebook account (using a Facebook developer API) or Pidgin account (using a Pidgin developer API) is also a registered user of system 100. [0089] Changes may be made in the above methods and systems without departing from the scope hereof. It should thus be noted that the matter contained in the above description or shown in the accompanying drawings should be interpreted
as illustrative and not in a limiting sense. The following claims are intended to cover generic and specific features described herein, as well as all statements of the scope of the present method and system, which, as a matter of language, might be said to fall therebetween.
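The per-buddy sharing policy described in paragraph [0088] can be sketched as a lookup from data owner and requesting buddy to a set of permitted data types. This is an illustrative assumption about one possible representation, not the patent's implementation; the names and the sample profile are hypothetical.

```python
# Hypothetical sketch of per-buddy privacy filtering: Patty's profile
# permits Homer to see some data types, while image data (the camera
# "peek") is disabled for him.

privacy_profile = {
    # owner -> {buddy -> set of permitted data types}   (example data)
    "Patty": {
        "Homer Simpson": {"nearby_buddies", "significant_places", "sound_level"},
    },
}


def can_view(owner, buddy, data_type):
    """Return True only if the owner's policy grants this buddy the type."""
    allowed = privacy_profile.get(owner, {}).get(buddy, set())
    return data_type in allowed


print(can_view("Patty", "Homer Simpson", "image"))        # False: peek link disabled
print(can_view("Patty", "Homer Simpson", "sound_level"))  # True: history plot shown
```

The portal would consult such a check before rendering each link or icon.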
Claims
1. A method for injecting sensed presence into social networking applications, comprising: receiving sensor data associated with a user; inferring a presence status of the user based upon analysis of the sensor data; storing the sensor data and presence status within a database; and sending the presence status to a social networking server to update the user's presence information for the social networking applications based upon the user's preferences.
2. The method of claim 1, the presence status including one or more of location information, activity information, preference information, and mood information.
3. The method of claim 1, the sensor data including one or more of location information, accelerometer information, temperature information, light intensity information, audio information, and software artifacts.
4. The method of claim 1, the sensor data being received from one or more sensors located proximate to the user.
5. The method of claim 1, the sensor data being received from one or more shared sensors located proximate to the user.
6. The method of claim 1, the step of inferring comprising inferring the presence status based upon analysis of combined sensor data from a plurality of sensors associated with the user.
7. The method of claim 1, the presence status including statistical information based upon other users' sensor data and inferred presence status.
8. The method of claim 1, further comprising interacting with the user via a web portal, the web portal providing the user with statistical information based upon other users' sensor data and inferred presence status.
9. The method of claim 8, the statistical information including a ranking of the user against statistical averages.
10. The method of claim 1, the step of receiving sensor data comprising wirelessly transmitting sensor data from an embedded sensor to a consumer device associated with the user.
11. The method of claim 1, the sensor data comprising measured artifacts of software running on a computing platform associated with the user.
12. The method of claim 1, the step of inferring being executed on a computer platform associated with the user.
13. The method of claim 1, the step of inferring being executed on a presence server in communication with a computer platform associated with the user.
14. A software product comprising instructions, stored on computer-readable media, wherein the instructions, when executed by a computer, perform steps for injecting sensed presence into social networking applications, comprising: instructions for receiving sensor data associated with a user; instructions for inferring a presence status of the user based upon analysis of the sensor data; instructions for storing the sensor data and presence status within a database; and instructions for sending the presence status to a social networking server to update the user's presence information for the social networking applications based upon the user's preferences.
15. A system for injecting sensed presence into social networking applications, comprising: at least one sensor proximate to a user, the at least one sensor being used for collecting sensor data associated with the user; a presence server for receiving and storing the sensor data; an inference engine for analyzing the stored data to infer a presence status for the user; and a presentation engine for presenting the information to the user and other users.
16. The system of claim 15, the presentation engine including a web portal for interacting with one or more users.
17. The system of claim 15, the presentation engine including an email generator for sending email messages to one or more users, the email message containing the presence status.
18. The system of claim 15, the presentation engine including an SMS generator for sending SMS messages to one or more users, the SMS message containing the presence status.
19. The system of claim 15, the presentation engine sending the presence status to one or more social networking applications, the social networking application being configured to automatically update the user's presence within the social networking application.
20. The system of claim 15, the at least one sensor being embedded in a device selected from the group consisting of a cell phone, a personal digital assistant, and a notebook computer.
21. The system of claim 15, the at least one sensor being an embedded sensor that wirelessly communicates with a consumer device selected from the group consisting of a cell phone, a personal digital assistant, and a notebook computer.
22. The system of claim 15, the at least one sensor comprising a BlueTooth enabled accelerometer operable to communicate with a BlueTooth enabled consumer device.
23. The system of claim 22, the BlueTooth enabled consumer device being selected from the group consisting of a BlueTooth enabled cell phone, a BlueTooth enabled personal digital assistant, and a BlueTooth enabled notebook computer.
24. The system of claim 15, the system being configured and arranged for the at least one sensor to send sensor data to the presence server upon a query from the presence server.
25. The system of claim 15, the at least one sensor being a virtual sensor that determines the sensor data by measuring artifacts of software running on a computing platform associated with the user.
26. The system of claim 15, the at least one sensor comprising a first and a second sensor, the first sensor being calibrated from the second sensor.
27. The system of claim 15, the at least one sensor comprising a sensor associated with a skier.
28. The system of claim 15, the at least one sensor comprising a sensor associated with a bicyclist.
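The four steps recited in claim 1 (receive, infer, store, send) can be illustrated as a minimal end-to-end pipeline. This is a toy sketch under stated assumptions, not the claimed implementation: the variance-based classifier, the in-memory "database," and the `social_feed` stand-in for a social networking server are all hypothetical.

```python
# Hypothetical sketch of the method of claim 1: receive sensor data,
# infer a presence status, store both, and push the status to a social
# networking server subject to the user's sharing preference.
import statistics


def infer_presence(accel_samples):
    """Toy activity inference: high accelerometer variance => walking."""
    return "walking" if statistics.pvariance(accel_samples) > 0.5 else "sitting"


database = []     # stands in for the presence server's database
social_feed = {}  # stands in for the social networking server


def handle_sensor_data(user, accel_samples, share=True):
    status = infer_presence(accel_samples)
    database.append({"user": user, "data": accel_samples, "status": status})
    if share:  # "based upon the user's preferences"
        social_feed[user] = status
    return status


handle_sensor_data("homer", [0.1, 2.0, -1.5, 1.8])
print(social_feed["homer"])  # walking
```

In the claimed system the inference may run on the user's own platform (claim 12) or on the presence server (claim 13); the pipeline shape is the same either way.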
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/680,492 US20100299615A1 (en) | 2007-09-28 | 2008-09-29 | System And Method For Injecting Sensed Presence Into Social Networking Applications |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US97637107P | 2007-09-28 | 2007-09-28 | |
US60/976,371 | 2007-09-28 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2009043020A2 true WO2009043020A2 (en) | 2009-04-02 |
WO2009043020A3 WO2009043020A3 (en) | 2009-05-14 |
Family
ID=40467085
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2008/078148 WO2009043020A2 (en) | 2007-09-28 | 2008-09-29 | System and method for injecting sensed presence into social networking applications |
Country Status (2)
Country | Link |
---|---|
US (1) | US20100299615A1 (en) |
WO (1) | WO2009043020A2 (en) |
EP2640097B1 (en) * | 2012-03-12 | 2018-01-10 | BlackBerry Limited | System and Method for Updating Status Information |
CN104272110A (en) * | 2012-03-15 | 2015-01-07 | 伊利诺伊大学评议会 | Liquid sampling device for use with mobile device and methods |
US20130243189A1 (en) * | 2012-03-19 | 2013-09-19 | Nokia Corporation | Method and apparatus for providing information authentication from external sensors to secure environments |
CN102724656B (en) * | 2012-05-21 | 2015-01-28 | 中兴通讯股份有限公司 | Device and method for updating state of user of mobile social network and mobile terminal |
US9729649B1 (en) * | 2012-08-15 | 2017-08-08 | Amazon Technologies, Inc. | Systems and methods for controlling the availability of communication applications |
US8965759B2 (en) * | 2012-09-01 | 2015-02-24 | Sarah Hershenhorn | Digital voice memo transfer and processing |
WO2014047118A2 (en) * | 2012-09-24 | 2014-03-27 | Qualcomm Incorporated | Integrated display and management of data objects based on social, temporal and spatial parameters |
GB201219091D0 (en) | 2012-10-24 | 2012-12-05 | Imagination Tech Ltd | Method, system and device for connecting similar users |
US9740773B2 (en) * | 2012-11-02 | 2017-08-22 | Qualcomm Incorporated | Context labels for data clusters |
US9336295B2 (en) | 2012-12-03 | 2016-05-10 | Qualcomm Incorporated | Fusing contextual inferences semantically |
US20140177494A1 (en) * | 2012-12-21 | 2014-06-26 | Alexander W. Min | Cloud-aware collaborative mobile platform power management using mobile sensors |
US9031573B2 (en) | 2012-12-31 | 2015-05-12 | Qualcomm Incorporated | Context-based parameter maps for position determination |
US9692839B2 (en) | 2013-03-13 | 2017-06-27 | Arris Enterprises, Inc. | Context emotion determination system |
US9135248B2 (en) | 2013-03-13 | 2015-09-15 | Arris Technology, Inc. | Context demographic determination system |
US10304325B2 (en) | 2013-03-13 | 2019-05-28 | Arris Enterprises Llc | Context health determination system |
US8844050B1 (en) * | 2013-03-15 | 2014-09-23 | Athoc, Inc. | Personnel crisis communications management and personnel status tracking system |
US9549042B2 (en) | 2013-04-04 | 2017-01-17 | Samsung Electronics Co., Ltd. | Context recognition and social profiling using mobile devices |
US9342737B2 (en) * | 2013-05-31 | 2016-05-17 | Nike, Inc. | Dynamic sampling in sports equipment |
US9438687B2 (en) | 2013-12-17 | 2016-09-06 | Microsoft Technology Licensing, Llc | Employing presence information in notebook application |
US9571595B2 (en) * | 2013-12-17 | 2017-02-14 | Microsoft Technology Licensing, Llc | Employment of presence-based history information in notebook application |
US10567444B2 (en) * | 2014-02-03 | 2020-02-18 | Cogito Corporation | Tele-communication system and methods |
US9672291B2 (en) | 2014-02-19 | 2017-06-06 | Google Inc. | Summarizing social interactions between users |
US10306000B1 (en) * | 2014-03-31 | 2019-05-28 | Ribbon Communications Operating Company, Inc. | Methods and apparatus for generating, aggregating and/or distributing presence information |
US10044774B1 (en) | 2014-03-31 | 2018-08-07 | Sonus Networks, Inc. | Methods and apparatus for aggregating and distributing presence information |
US9398107B1 (en) | 2014-03-31 | 2016-07-19 | Sonus Networks, Inc. | Methods and apparatus for aggregating and distributing contact and presence information |
WO2015173769A2 (en) * | 2014-05-15 | 2015-11-19 | Ittah Roy | System and methods for sensory controlled satisfaction monitoring |
US9271121B1 (en) | 2014-08-12 | 2016-02-23 | Google Inc. | Associating requests for content with a confirmed location |
US9686663B2 (en) * | 2014-09-11 | 2017-06-20 | Facebook, Inc. | Systems and methods for acquiring and providing information associated with a crisis |
EP3109818A1 (en) * | 2015-06-25 | 2016-12-28 | Mastercard International Incorporated | Methods, devices, and systems for automatically detecting, tracking, and validating transit journeys |
US10448204B2 (en) | 2017-03-28 | 2019-10-15 | Microsoft Technology Licensing, Llc | Individualized presence context publishing |
US11108709B2 (en) * | 2017-05-25 | 2021-08-31 | Lenovo (Singapore) Pte. Ltd. | Provide status message associated with work status |
US10841255B2 (en) * | 2018-03-09 | 2020-11-17 | International Business Machines Corporation | Determination of an online collaboration status of a user based upon biometric and user activity data |
CN111353001B (en) * | 2018-12-24 | 2023-08-18 | 杭州海康威视数字技术股份有限公司 | Method and device for classifying users |
WO2022000256A1 (en) * | 2020-06-30 | 2022-01-06 | Ringcentral, Inc. | Methods and systems for directing communications |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020120687A1 (en) * | 2001-02-05 | 2002-08-29 | Athanassios Diacakis | System and method for filtering unavailable devices in a presence and availability management system |
GB2377783A (en) * | 2001-07-20 | 2003-01-22 | Ibm | Controlling access by software agents to a distributed processing system |
US20050270157A1 (en) * | 2004-06-05 | 2005-12-08 | Alcatel | System and method for importing location information and policies as part of a rich presence environment |
US20060061468A1 (en) * | 2004-09-17 | 2006-03-23 | Antti Ruha | Sensor data sharing |
US20070143433A1 (en) * | 2005-12-15 | 2007-06-21 | Daigle Brian K | Using statistical tracking information of instant messaging users |
US20070167170A1 (en) * | 2006-01-18 | 2007-07-19 | Nortel Networks Limited | Method and device for determining location-enhanced presence information for entities subscribed to a communications system |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070288416A1 (en) * | 1996-06-04 | 2007-12-13 | Informative, Inc. | Asynchronous Network Collaboration Method and Apparatus |
US6968179B1 (en) * | 2000-07-27 | 2005-11-22 | Microsoft Corporation | Place specific buddy list services |
US7620902B2 (en) * | 2005-04-20 | 2009-11-17 | Microsoft Corporation | Collaboration spaces |
WO2007090133A2 (en) * | 2006-01-30 | 2007-08-09 | Kramer Jame F | System for providing a service to venues where people aggregate |
US8122491B2 (en) * | 2006-05-18 | 2012-02-21 | Microsoft Corporation | Techniques for physical presence detection for a communications device |
US8514066B2 (en) * | 2006-11-25 | 2013-08-20 | Trimble Navigation Limited | Accelerometer based extended display |
US8157730B2 (en) * | 2006-12-19 | 2012-04-17 | Valencell, Inc. | Physiological and environmental monitoring systems and methods |
US20080184170A1 (en) * | 2007-01-16 | 2008-07-31 | Shape Innovations Inc | Systems and methods for customized instant messaging application for displaying status of measurements from sensors |
WO2008143841A1 (en) * | 2007-05-14 | 2008-11-27 | The Ohio State University | Assessment device |
- 2008
- 2008-09-29 WO PCT/US2008/078148 patent/WO2009043020A2/en active Application Filing
- 2008-09-29 US US12/680,492 patent/US20100299615A1/en not_active Abandoned
Non-Patent Citations (2)
Title |
---|
Alex Varshavsky et al.: "Calibree: Calibration-Free Localization Using Relative Distance Estimations", Pervasive Computing [Lecture Notes in Computer Science], Springer Berlin Heidelberg, Berlin, Heidelberg, vol. 5013, 13 May 2007 (2007-05-13), pages 146-161, XP019088939, ISBN: 978-3-540-79575-9 * |
Mayr H., ed. Rozenblit J. et al.: "Using software sensors for migrating from classical simulation systems towards virtual worlds", Engineering of Computer-Based Systems, 1997, Proceedings, International Conference and Workshop, Monterey, CA, USA, 24-28 March 1997, Los Alamitos, CA, USA, IEEE Computer Soc., 24 March 1997 (1997-03-24), pages 105-112, XP010218849, ISBN: 978-0-8186-7889-9 * |
Cited By (318)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11588770B2 (en) | 2007-01-05 | 2023-02-21 | Snap Inc. | Real-time display of multiple images |
US10862951B1 (en) | 2007-01-05 | 2020-12-08 | Snap Inc. | Real-time display of multiple images |
CN101998235A (en) * | 2009-08-20 | 2011-03-30 | 福特全球技术公司 | Method and system for updating social networking system based on vehicle events |
US8171076B2 (en) | 2009-08-25 | 2012-05-01 | Oki Electric Industry Co., Ltd. | System and method for providing presence information |
EP2290908A1 (en) * | 2009-08-25 | 2011-03-02 | Oki Electric Industry Co., Ltd. | System and method for providing presence information |
WO2011035998A1 (en) * | 2009-09-23 | 2011-03-31 | Bayerische Motoren Werke Aktiengesellschaft | Method and device for providing information about a user of a social network in the social network |
US8571724B2 (en) | 2009-09-23 | 2013-10-29 | Bayerische Motoren Werke Aktiengesellschaft | Method and device for providing information about a user of a social network in the social network |
KR102009912B1 (en) | 2009-10-29 | 2019-08-12 | 유니파이 게엠베하 운트 코. 카게 | Method and system to automatically change or update the configuration or setting of a communication system |
JP2014038632A (en) * | 2009-10-29 | 2014-02-27 | Siemens Enterprise Communications Gmbh & Co Kg | Method and system for automatically changing or updating configuration or setting of communication system |
KR20170008317A (en) * | 2009-10-29 | 2017-01-23 | 유니파이 게엠베하 운트 코. 카게 | Method and system to automatically change or update the configuration or setting of a communication system |
CN102055793B (en) * | 2009-10-29 | 2016-06-15 | 西门子企业通讯有限责任两合公司 | The method and system configured or set up of automatically change or more new communication system |
EP2317730A1 (en) * | 2009-10-29 | 2011-05-04 | Siemens Enterprise Communications GmbH & Co. KG | Method and system to automatically change or update the configuration or setting of a communication system |
US10303774B2 (en) | 2009-10-29 | 2019-05-28 | Unify Gmbh & Co. Kg | Method and system to automatically change or update the configuration or setting of a communication system |
JP2011096259A (en) * | 2009-10-29 | 2011-05-12 | Siemens Enterprise Communications Gmbh & Co Kg | Method and system for automatically changing or update configuration or setting of communication system |
US10650194B2 (en) | 2009-10-29 | 2020-05-12 | Unify Gmbh & Co. Kg | Method and system to automatically change or update the configuration or setting of a communication system |
CN102055793A (en) * | 2009-10-29 | 2011-05-11 | 西门子企业通讯有限责任两合公司 | Method and system to automatically change or update the configuration or setting of a communication system |
US8666672B2 (en) * | 2009-11-21 | 2014-03-04 | Radial Comm Research L.L.C. | System and method for interpreting a user's psychological state from sensed biometric information and communicating that state to a social networking site |
US20110124977A1 (en) * | 2009-11-21 | 2011-05-26 | Tyson York Winarski | System and method for interpreting a users pyschological state from sensed biometric information and communicating that state to a social networking site |
EP2372973A3 (en) * | 2010-04-01 | 2012-05-30 | Sony Ericsson Mobile Communications AB | Updates with context information |
US20110320981A1 (en) * | 2010-06-23 | 2011-12-29 | Microsoft Corporation | Status-oriented mobile device |
US9622055B2 (en) | 2010-09-28 | 2017-04-11 | E.Digital Corporation | System and method for managing mobile communications |
US9178983B2 (en) | 2010-09-28 | 2015-11-03 | E.Digital Corporation | System and method for managing mobile communications |
US9641664B2 (en) | 2010-09-28 | 2017-05-02 | E.Digital Corporation | System, apparatus, and method for utilizing sensor data |
EP2439909A1 (en) * | 2010-10-06 | 2012-04-11 | Alcatel Lucent | Method for establishing a communication session |
US8548855B2 (en) | 2010-11-11 | 2013-10-01 | Teaneck Enterprises, Llc | User generated ADS based on check-ins |
US9886727B2 (en) | 2010-11-11 | 2018-02-06 | Ikorongo Technology, LLC | Automatic check-ins and status updates |
US8543460B2 (en) | 2010-11-11 | 2013-09-24 | Teaneck Enterprises, Llc | Serving ad requests using user generated photo ads |
US8554627B2 (en) | 2010-11-11 | 2013-10-08 | Teaneck Enterprises, Llc | User generated photo ads used as status updates |
US8560013B2 (en) | 2010-12-14 | 2013-10-15 | Toyota Motor Engineering & Manufacturing North America, Inc. | Automatic status update for social networking |
US9131343B2 (en) | 2011-03-31 | 2015-09-08 | Teaneck Enterprises, Llc | System and method for automated proximity-based social check-ins |
US11451856B2 (en) | 2011-07-12 | 2022-09-20 | Snap Inc. | Providing visual content editing functions |
US10334307B2 (en) | 2011-07-12 | 2019-06-25 | Snap Inc. | Methods and systems of providing visual content editing functions |
US11750875B2 (en) | 2011-07-12 | 2023-09-05 | Snap Inc. | Providing visual content editing functions |
US10999623B2 (en) | 2011-07-12 | 2021-05-04 | Snap Inc. | Providing visual content editing functions |
US11734712B2 (en) | 2012-02-24 | 2023-08-22 | Foursquare Labs, Inc. | Attributing in-store visits to media consumption based on data collected from user devices |
US11182383B1 (en) | 2012-02-24 | 2021-11-23 | Placed, Llc | System and method for data collection to validate location data |
US11925869B2 (en) | 2012-05-08 | 2024-03-12 | Snap Inc. | System and method for generating and displaying avatars |
US10819811B2 (en) | 2013-01-17 | 2020-10-27 | Microsoft Technology Licensing, Llc | Accumulation of real-time crowd sourced data for inferring metadata about entities |
US10567533B2 (en) | 2013-09-11 | 2020-02-18 | Unify Gmbh & Co. Kg | System and method to determine the presence status of a registered user on a network |
US9961153B2 (en) | 2013-09-11 | 2018-05-01 | Unify Gmbh & Co. Kg | System and method to determine the presence status of a registered user on a network |
US10080102B1 (en) | 2014-01-12 | 2018-09-18 | Investment Asset Holdings Llc | Location-based messaging |
US10349209B1 (en) | 2014-01-12 | 2019-07-09 | Investment Asset Holdings Llc | Location-based messaging |
CN104935875A (en) * | 2014-03-21 | 2015-09-23 | 福特全球技术公司 | Vehicle-based media content capture and remote service integration |
US10990697B2 (en) | 2014-05-28 | 2021-04-27 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US10572681B1 (en) | 2014-05-28 | 2020-02-25 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US11972014B2 (en) | 2014-05-28 | 2024-04-30 | Snap Inc. | Apparatus and method for automated privacy protection in distributed images |
US11921805B2 (en) | 2014-06-05 | 2024-03-05 | Snap Inc. | Web document enhancement |
US11625443B2 (en) | 2014-06-05 | 2023-04-11 | Snap Inc. | Web document enhancement |
US10659914B1 (en) | 2014-06-13 | 2020-05-19 | Snap Inc. | Geo-location based event gallery |
US11166121B2 (en) | 2014-06-13 | 2021-11-02 | Snap Inc. | Prioritization of messages within a message collection |
US10623891B2 (en) | 2014-06-13 | 2020-04-14 | Snap Inc. | Prioritization of messages within a message collection |
US10200813B1 (en) | 2014-06-13 | 2019-02-05 | Snap Inc. | Geo-location based event gallery |
US10182311B2 (en) | 2014-06-13 | 2019-01-15 | Snap Inc. | Prioritization of messages within a message collection |
US10779113B2 (en) | 2014-06-13 | 2020-09-15 | Snap Inc. | Prioritization of messages within a message collection |
US10448201B1 (en) | 2014-06-13 | 2019-10-15 | Snap Inc. | Prioritization of messages within a message collection |
US10524087B1 (en) | 2014-06-13 | 2019-12-31 | Snap Inc. | Message destination list mechanism |
US11317240B2 (en) | 2014-06-13 | 2022-04-26 | Snap Inc. | Geo-location based event gallery |
US9825898B2 (en) | 2014-06-13 | 2017-11-21 | Snap Inc. | Prioritization of messages within a message collection |
US11122200B2 (en) | 2014-07-07 | 2021-09-14 | Snap Inc. | Supplying content aware photo filters |
US10432850B1 (en) | 2014-07-07 | 2019-10-01 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US11849214B2 (en) | 2014-07-07 | 2023-12-19 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US10154192B1 (en) | 2014-07-07 | 2018-12-11 | Snap Inc. | Apparatus and method for supplying content aware photo filters |
US11595569B2 (en) | 2014-07-07 | 2023-02-28 | Snap Inc. | Supplying content aware photo filters |
US10602057B1 (en) | 2014-07-07 | 2020-03-24 | Snap Inc. | Supplying content aware photo filters |
US10423983B2 (en) | 2014-09-16 | 2019-09-24 | Snap Inc. | Determining targeting information based on a predictive targeting model |
US11625755B1 (en) | 2014-09-16 | 2023-04-11 | Foursquare Labs, Inc. | Determining targeting information based on a predictive targeting model |
US11741136B2 (en) | 2014-09-18 | 2023-08-29 | Snap Inc. | Geolocation-based pictographs |
US11281701B2 (en) | 2014-09-18 | 2022-03-22 | Snap Inc. | Geolocation-based pictographs |
US10824654B2 (en) | 2014-09-18 | 2020-11-03 | Snap Inc. | Geolocation-based pictographs |
US11216869B2 (en) | 2014-09-23 | 2022-01-04 | Snap Inc. | User interface to augment an image using geolocation |
US11038829B1 (en) | 2014-10-02 | 2021-06-15 | Snap Inc. | Ephemeral gallery of ephemeral messages with opt-in permanence |
US20170374003A1 (en) | 2014-10-02 | 2017-12-28 | Snapchat, Inc. | Ephemeral gallery of ephemeral messages |
US11522822B1 (en) | 2014-10-02 | 2022-12-06 | Snap Inc. | Ephemeral gallery elimination based on gallery and message timers |
US11411908B1 (en) | 2014-10-02 | 2022-08-09 | Snap Inc. | Ephemeral message gallery user interface with online viewing history indicia |
US10476830B2 (en) | 2014-10-02 | 2019-11-12 | Snap Inc. | Ephemeral gallery of ephemeral messages |
WO2016077116A1 (en) * | 2014-11-10 | 2016-05-19 | Thomson Licensing | Movie night |
US11190679B2 (en) | 2014-11-12 | 2021-11-30 | Snap Inc. | Accessing media at a geographic location |
US9843720B1 (en) | 2014-11-12 | 2017-12-12 | Snap Inc. | User interface for accessing media at a geographic location |
US11956533B2 (en) | 2014-11-12 | 2024-04-09 | Snap Inc. | Accessing media at a geographic location |
US10616476B1 (en) | 2014-11-12 | 2020-04-07 | Snap Inc. | User interface for accessing media at a geographic location |
US11372608B2 (en) | 2014-12-19 | 2022-06-28 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US10811053B2 (en) | 2014-12-19 | 2020-10-20 | Snap Inc. | Routing messages by message parameter |
US11783862B2 (en) | 2014-12-19 | 2023-10-10 | Snap Inc. | Routing messages by message parameter |
US11250887B2 (en) | 2014-12-19 | 2022-02-15 | Snap Inc. | Routing messages by message parameter |
US11803345B2 (en) | 2014-12-19 | 2023-10-31 | Snap Inc. | Gallery of messages from individuals with a shared interest |
US10580458B2 (en) | 2014-12-19 | 2020-03-03 | Snap Inc. | Gallery of videos set to an audio time line |
US10157449B1 (en) | 2015-01-09 | 2018-12-18 | Snap Inc. | Geo-location-based image filters |
US11301960B2 (en) | 2015-01-09 | 2022-04-12 | Snap Inc. | Object recognition based image filters |
US11734342B2 (en) | 2015-01-09 | 2023-08-22 | Snap Inc. | Object recognition based image overlays |
US10380720B1 (en) | 2015-01-09 | 2019-08-13 | Snap Inc. | Location-based image filters |
US11388226B1 (en) | 2015-01-13 | 2022-07-12 | Snap Inc. | Guided personal identity based actions |
US11962645B2 (en) | 2015-01-13 | 2024-04-16 | Snap Inc. | Guided personal identity based actions |
US11249617B1 (en) | 2015-01-19 | 2022-02-15 | Snap Inc. | Multichannel system |
US11910267B2 (en) | 2015-01-26 | 2024-02-20 | Snap Inc. | Content request by location |
US10123166B2 (en) | 2015-01-26 | 2018-11-06 | Snap Inc. | Content request by location |
US10536800B1 (en) | 2015-01-26 | 2020-01-14 | Snap Inc. | Content request by location |
US11528579B2 (en) | 2015-01-26 | 2022-12-13 | Snap Inc. | Content request by location |
US10932085B1 (en) | 2015-01-26 | 2021-02-23 | Snap Inc. | Content request by location |
US10223397B1 (en) | 2015-03-13 | 2019-03-05 | Snap Inc. | Social graph based co-location of network users |
US11902287B2 (en) | 2015-03-18 | 2024-02-13 | Snap Inc. | Geo-fence authorization provisioning |
US10616239B2 (en) | 2015-03-18 | 2020-04-07 | Snap Inc. | Geo-fence authorization provisioning |
US10893055B2 (en) | 2015-03-18 | 2021-01-12 | Snap Inc. | Geo-fence authorization provisioning |
US11320651B2 (en) | 2015-03-23 | 2022-05-03 | Snap Inc. | Reducing boot time and power consumption in displaying data content |
US10948717B1 (en) | 2015-03-23 | 2021-03-16 | Snap Inc. | Reducing boot time and power consumption in wearable display systems |
US11662576B2 (en) | 2015-03-23 | 2023-05-30 | Snap Inc. | Reducing boot time and power consumption in displaying data content |
US11449539B2 (en) | 2015-05-05 | 2022-09-20 | Snap Inc. | Automated local story generation and curation |
US10592574B2 (en) | 2015-05-05 | 2020-03-17 | Snap Inc. | Systems and methods for automated local story generation and curation |
US10911575B1 (en) | 2015-05-05 | 2021-02-02 | Snap Inc. | Systems and methods for story and sub-story navigation |
US11392633B2 (en) | 2015-05-05 | 2022-07-19 | Snap Inc. | Systems and methods for automated local story generation and curation |
US11496544B2 (en) | 2015-05-05 | 2022-11-08 | Snap Inc. | Story and sub-story navigation |
US9881094B2 (en) | 2015-05-05 | 2018-01-30 | Snap Inc. | Systems and methods for automated local story generation and curation |
US10993069B2 (en) | 2015-07-16 | 2021-04-27 | Snap Inc. | Dynamically adaptive media content delivery |
US11961116B2 (en) | 2015-08-13 | 2024-04-16 | Foursquare Labs, Inc. | Determining exposures to content presented by physical objects |
US10817898B2 (en) | 2015-08-13 | 2020-10-27 | Placed, Llc | Determining exposures to content presented by physical objects |
US10102680B2 (en) | 2015-10-30 | 2018-10-16 | Snap Inc. | Image based tracking in augmented reality systems |
US11315331B2 (en) | 2015-10-30 | 2022-04-26 | Snap Inc. | Image based tracking in augmented reality systems |
US10366543B1 (en) | 2015-10-30 | 2019-07-30 | Snap Inc. | Image based tracking in augmented reality systems |
US10733802B2 (en) | 2015-10-30 | 2020-08-04 | Snap Inc. | Image based tracking in augmented reality systems |
US11769307B2 (en) | 2015-10-30 | 2023-09-26 | Snap Inc. | Image based tracking in augmented reality systems |
US10474321B2 (en) | 2015-11-30 | 2019-11-12 | Snap Inc. | Network resource location linking and visual content sharing |
US11599241B2 (en) | 2015-11-30 | 2023-03-07 | Snap Inc. | Network resource location linking and visual content sharing |
US10997783B2 (en) | 2015-11-30 | 2021-05-04 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US11380051B2 (en) | 2015-11-30 | 2022-07-05 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US10657708B1 (en) | 2015-11-30 | 2020-05-19 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US11830117B2 (en) | 2015-12-18 | 2023-11-28 | Snap Inc | Media overlay publication system |
US10354425B2 (en) | 2015-12-18 | 2019-07-16 | Snap Inc. | Method and system for providing context relevant media augmentation |
US11468615B2 (en) | 2015-12-18 | 2022-10-11 | Snap Inc. | Media overlay publication system |
US11023514B2 (en) | 2016-02-26 | 2021-06-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US11611846B2 (en) | 2016-02-26 | 2023-03-21 | Snap Inc. | Generation, curation, and presentation of media collections |
US11889381B2 (en) | 2016-02-26 | 2024-01-30 | Snap Inc. | Generation, curation, and presentation of media collections |
US11197123B2 (en) | 2016-02-26 | 2021-12-07 | Snap Inc. | Generation, curation, and presentation of media collections |
US10679389B2 (en) | 2016-02-26 | 2020-06-09 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections |
US10834525B2 (en) | 2016-02-26 | 2020-11-10 | Snap Inc. | Generation, curation, and presentation of media collections |
US11631276B2 (en) | 2016-03-31 | 2023-04-18 | Snap Inc. | Automated avatar generation |
US11044393B1 (en) | 2016-06-20 | 2021-06-22 | Pipbin, Inc. | System for curation and display of location-dependent augmented reality content in an augmented estate system |
US10805696B1 (en) | 2016-06-20 | 2020-10-13 | Pipbin, Inc. | System for recording and targeting tagged content of user interest |
US11201981B1 (en) | 2016-06-20 | 2021-12-14 | Pipbin, Inc. | System for notification of user accessibility of curated location-dependent content in an augmented estate |
US10638256B1 (en) | 2016-06-20 | 2020-04-28 | Pipbin, Inc. | System for distribution and display of mobile targeted augmented reality content |
US10839219B1 (en) | 2016-06-20 | 2020-11-17 | Pipbin, Inc. | System for curation, distribution and display of location-dependent augmented reality content |
US10992836B2 (en) | 2016-06-20 | 2021-04-27 | Pipbin, Inc. | Augmented property system of curated augmented reality media elements |
US11876941B1 (en) | 2016-06-20 | 2024-01-16 | Pipbin, Inc. | Clickable augmented reality content manager, system, and network |
US11785161B1 (en) | 2016-06-20 | 2023-10-10 | Pipbin, Inc. | System for user accessibility of tagged curated augmented reality content |
US10735892B2 (en) | 2016-06-28 | 2020-08-04 | Snap Inc. | System to track engagement of media items |
US11640625B2 (en) | 2016-06-28 | 2023-05-02 | Snap Inc. | Generation, curation, and presentation of media collections with automated advertising |
US10165402B1 (en) | 2016-06-28 | 2018-12-25 | Snap Inc. | System to track engagement of media items |
US10785597B2 (en) | 2016-06-28 | 2020-09-22 | Snap Inc. | System to track engagement of media items |
US10219110B2 (en) | 2016-06-28 | 2019-02-26 | Snap Inc. | System to track engagement of media items |
US10885559B1 (en) | 2016-06-28 | 2021-01-05 | Snap Inc. | Generation, curation, and presentation of media collections with automated advertising |
US10327100B1 (en) | 2016-06-28 | 2019-06-18 | Snap Inc. | System to track engagement of media items |
US11445326B2 (en) | 2016-06-28 | 2022-09-13 | Snap Inc. | Track engagement of media items |
US10506371B2 (en) | 2016-06-28 | 2019-12-10 | Snap Inc. | System to track engagement of media items |
US10430838B1 (en) | 2016-06-28 | 2019-10-01 | Snap Inc. | Methods and systems for generation, curation, and presentation of media collections with automated advertising |
US11080351B1 (en) | 2016-06-30 | 2021-08-03 | Snap Inc. | Automated content curation and communication |
US10387514B1 (en) | 2016-06-30 | 2019-08-20 | Snap Inc. | Automated content curation and communication |
US11895068B2 (en) | 2016-06-30 | 2024-02-06 | Snap Inc. | Automated content curation and communication |
US11509615B2 (en) | 2016-07-19 | 2022-11-22 | Snap Inc. | Generating customized electronic messaging graphics |
US10348662B2 (en) | 2016-07-19 | 2019-07-09 | Snap Inc. | Generating customized electronic messaging graphics |
US11816853B2 (en) | 2016-08-30 | 2023-11-14 | Snap Inc. | Systems and methods for simultaneous localization and mapping |
US12002232B2 (en) | 2016-08-30 | 2024-06-04 | Snap Inc. | Systems and methods for simultaneous localization and mapping |
US11843456B2 (en) | 2016-10-24 | 2023-12-12 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US11876762B1 (en) | 2016-10-24 | 2024-01-16 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US10623666B2 (en) | 2016-11-07 | 2020-04-14 | Snap Inc. | Selective identification and order of image modifiers |
US11750767B2 (en) | 2016-11-07 | 2023-09-05 | Snap Inc. | Selective identification and order of image modifiers |
US11233952B2 (en) | 2016-11-07 | 2022-01-25 | Snap Inc. | Selective identification and order of image modifiers |
US10754525B1 (en) | 2016-12-09 | 2020-08-25 | Snap Inc. | Customized media overlays |
US10203855B2 (en) | 2016-12-09 | 2019-02-12 | Snap Inc. | Customized user-controlled media overlays |
US11397517B2 (en) | 2016-12-09 | 2022-07-26 | Snap Inc. | Customized media overlays |
US11616745B2 (en) | 2017-01-09 | 2023-03-28 | Snap Inc. | Contextual generation and selection of customized media content |
US11870743B1 (en) | 2017-01-23 | 2024-01-09 | Snap Inc. | Customized digital avatar accessories |
US10915911B2 (en) | 2017-02-03 | 2021-02-09 | Snap Inc. | System to determine a price-schedule to distribute media content |
US10319149B1 (en) | 2017-02-17 | 2019-06-11 | Snap Inc. | Augmented reality anamorphosis system |
US11861795B1 (en) | 2017-02-17 | 2024-01-02 | Snap Inc. | Augmented reality anamorphosis system |
US11250075B1 (en) | 2017-02-17 | 2022-02-15 | Snap Inc. | Searching social media content |
US11720640B2 (en) | 2017-02-17 | 2023-08-08 | Snap Inc. | Searching social media content |
US10614828B1 (en) | 2017-02-20 | 2020-04-07 | Snap Inc. | Augmented reality speech balloon system |
US11189299B1 (en) | 2017-02-20 | 2021-11-30 | Snap Inc. | Augmented reality speech balloon system |
US11748579B2 (en) | 2017-02-20 | 2023-09-05 | Snap Inc. | Augmented reality speech balloon system |
US11961196B2 (en) | 2017-03-06 | 2024-04-16 | Snap Inc. | Virtual vision system |
US11670057B2 (en) | 2017-03-06 | 2023-06-06 | Snap Inc. | Virtual vision system |
US11037372B2 (en) | 2017-03-06 | 2021-06-15 | Snap Inc. | Virtual vision system |
US10887269B1 (en) | 2017-03-09 | 2021-01-05 | Snap Inc. | Restricted group content collection |
US11258749B2 (en) | 2017-03-09 | 2022-02-22 | Snap Inc. | Restricted group content collection |
US10523625B1 (en) | 2017-03-09 | 2019-12-31 | Snap Inc. | Restricted group content collection |
US11297399B1 (en) | 2017-03-27 | 2022-04-05 | Snap Inc. | Generating a stitched data stream |
US11558678B2 (en) | 2017-03-27 | 2023-01-17 | Snap Inc. | Generating a stitched data stream |
US11349796B2 (en) | 2017-03-27 | 2022-05-31 | Snap Inc. | Generating a stitched data stream |
US11170393B1 (en) | 2017-04-11 | 2021-11-09 | Snap Inc. | System to calculate an engagement score of location based media content |
US11195018B1 (en) | 2017-04-20 | 2021-12-07 | Snap Inc. | Augmented reality typography personalization system |
US10387730B1 (en) | 2017-04-20 | 2019-08-20 | Snap Inc. | Augmented reality typography personalization system |
US11995288B2 (en) | 2017-04-27 | 2024-05-28 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US11782574B2 (en) | 2017-04-27 | 2023-10-10 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11392264B1 (en) | 2017-04-27 | 2022-07-19 | Snap Inc. | Map-based graphical user interface for multi-type social media galleries |
US11474663B2 (en) | 2017-04-27 | 2022-10-18 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US10952013B1 (en) | 2017-04-27 | 2021-03-16 | Snap Inc. | Selective location-based identity communication |
US11418906B2 (en) | 2017-04-27 | 2022-08-16 | Snap Inc. | Selective location-based identity communication |
US11842411B2 (en) | 2017-04-27 | 2023-12-12 | Snap Inc. | Location-based virtual avatars |
US11409407B2 (en) | 2017-04-27 | 2022-08-09 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11385763B2 (en) | 2017-04-27 | 2022-07-12 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11893647B2 (en) | 2017-04-27 | 2024-02-06 | Snap Inc. | Location-based virtual avatars |
US11451956B1 (en) | 2017-04-27 | 2022-09-20 | Snap Inc. | Location privacy management on map-based social media platforms |
US10963529B1 (en) | 2017-04-27 | 2021-03-30 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US11556221B2 (en) | 2017-04-27 | 2023-01-17 | Snap Inc. | Friend location sharing mechanism for social media platforms |
US11232040B1 (en) | 2017-04-28 | 2022-01-25 | Snap Inc. | Precaching unlockable data elements |
US11675831B2 (en) | 2017-05-31 | 2023-06-13 | Snap Inc. | Geolocation based playlists |
US11475254B1 (en) | 2017-09-08 | 2022-10-18 | Snap Inc. | Multimodal entity identification |
US11721080B2 (en) | 2017-09-15 | 2023-08-08 | Snap Inc. | Augmented reality system |
US10740974B1 (en) | 2017-09-15 | 2020-08-11 | Snap Inc. | Augmented reality system |
US11335067B2 (en) | 2017-09-15 | 2022-05-17 | Snap Inc. | Augmented reality system |
US11617056B2 (en) | 2017-10-09 | 2023-03-28 | Snap Inc. | Context sensitive presentation of content |
US11006242B1 (en) | 2017-10-09 | 2021-05-11 | Snap Inc. | Context sensitive presentation of content |
US10499191B1 (en) | 2017-10-09 | 2019-12-03 | Snap Inc. | Context sensitive presentation of content |
US12010582B2 (en) | 2017-10-09 | 2024-06-11 | Snap Inc. | Context sensitive presentation of content |
US11030787B2 (en) | 2017-10-30 | 2021-06-08 | Snap Inc. | Mobile-based cartographic control of display content |
US11670025B2 (en) | 2017-10-30 | 2023-06-06 | Snap Inc. | Mobile-based cartographic control of display content |
US11558327B2 (en) | 2017-12-01 | 2023-01-17 | Snap Inc. | Dynamic media overlay with smart widget |
US11943185B2 (en) | 2017-12-01 | 2024-03-26 | Snap Inc. | Dynamic media overlay with smart widget |
US11265273B1 (en) | 2017-12-01 | 2022-03-01 | Snap Inc. | Dynamic media overlay with smart widget |
US11687720B2 (en) | 2017-12-22 | 2023-06-27 | Snap Inc. | Named entity recognition visual context and caption data |
US11017173B1 (en) | 2017-12-22 | 2021-05-25 | Snap Inc. | Named entity recognition visual context and caption data |
US11983215B2 (en) | 2018-01-03 | 2024-05-14 | Snap Inc. | Tag distribution visualization system |
US11487794B2 (en) | 2018-01-03 | 2022-11-01 | Snap Inc. | Tag distribution visualization system |
US10678818B2 (en) | 2018-01-03 | 2020-06-09 | Snap Inc. | Tag distribution visualization system |
US11841896B2 (en) | 2018-02-13 | 2023-12-12 | Snap Inc. | Icon based tagging |
US11507614B1 (en) | 2018-02-13 | 2022-11-22 | Snap Inc. | Icon based tagging |
US10979752B1 (en) | 2018-02-28 | 2021-04-13 | Snap Inc. | Generating media content items based on location information |
US11523159B2 (en) | 2018-02-28 | 2022-12-06 | Snap Inc. | Generating media content items based on location information |
US10885136B1 (en) | 2018-02-28 | 2021-01-05 | Snap Inc. | Audience filtering system |
US11044574B2 (en) | 2018-03-06 | 2021-06-22 | Snap Inc. | Geo-fence selection system |
US10524088B2 (en) | 2018-03-06 | 2019-12-31 | Snap Inc. | Geo-fence selection system |
US11570572B2 (en) | 2018-03-06 | 2023-01-31 | Snap Inc. | Geo-fence selection system |
US10327096B1 (en) | 2018-03-06 | 2019-06-18 | Snap Inc. | Geo-fence selection system |
US11722837B2 (en) | 2018-03-06 | 2023-08-08 | Snap Inc. | Geo-fence selection system |
US10933311B2 (en) | 2018-03-14 | 2021-03-02 | Snap Inc. | Generating collectible items based on location information |
US11491393B2 (en) | 2018-03-14 | 2022-11-08 | Snap Inc. | Generating collectible items based on location information |
US11998833B2 (en) | 2018-03-14 | 2024-06-04 | Snap Inc. | Generating collectible items based on location information |
US11163941B1 (en) | 2018-03-30 | 2021-11-02 | Snap Inc. | Annotating a collection of media content items |
US10779114B2 (en) | 2018-04-18 | 2020-09-15 | Snap Inc. | Visitation tracking system |
US10924886B2 (en) | 2018-04-18 | 2021-02-16 | Snap Inc. | Visitation tracking system |
US10219111B1 (en) | 2018-04-18 | 2019-02-26 | Snap Inc. | Visitation tracking system |
US11297463B2 (en) | 2018-04-18 | 2022-04-05 | Snap Inc. | Visitation tracking system |
US11683657B2 (en) | 2018-04-18 | 2023-06-20 | Snap Inc. | Visitation tracking system |
US10681491B1 (en) | 2018-04-18 | 2020-06-09 | Snap Inc. | Visitation tracking system |
US10448199B1 (en) | 2018-04-18 | 2019-10-15 | Snap Inc. | Visitation tracking system |
US11860888B2 (en) | 2018-05-22 | 2024-01-02 | Snap Inc. | Event detection system |
US11670026B2 (en) | 2018-07-24 | 2023-06-06 | Snap Inc. | Conditional modification of augmented reality object |
US10789749B2 (en) | 2018-07-24 | 2020-09-29 | Snap Inc. | Conditional modification of augmented reality object |
US10943381B2 (en) | 2018-07-24 | 2021-03-09 | Snap Inc. | Conditional modification of augmented reality object |
US10679393B2 (en) | 2018-07-24 | 2020-06-09 | Snap Inc. | Conditional modification of augmented reality object |
US11367234B2 (en) | 2018-07-24 | 2022-06-21 | Snap Inc. | Conditional modification of augmented reality object |
US10997760B2 (en) | 2018-08-31 | 2021-05-04 | Snap Inc. | Augmented reality anthropomorphization system |
US11450050B2 (en) | 2018-08-31 | 2022-09-20 | Snap Inc. | Augmented reality anthropomorphization system |
US11676319B2 (en) | 2018-08-31 | 2023-06-13 | Snap Inc. | Augmented reality anthropomorphization system |
US11704005B2 (en) | 2018-09-28 | 2023-07-18 | Snap Inc. | Collaborative achievement interface |
US11455082B2 (en) | 2018-09-28 | 2022-09-27 | Snap Inc. | Collaborative achievement interface |
US11799811B2 (en) | 2018-10-31 | 2023-10-24 | Snap Inc. | Messaging and gaming applications communication platform |
US11558709B2 (en) | 2018-11-30 | 2023-01-17 | Snap Inc. | Position service to determine relative position to map features |
US11698722B2 (en) | 2018-11-30 | 2023-07-11 | Snap Inc. | Generating customized avatars based on location information |
US11812335B2 (en) | 2018-11-30 | 2023-11-07 | Snap Inc. | Position service to determine relative position to map features |
US11199957B1 (en) | 2018-11-30 | 2021-12-14 | Snap Inc. | Generating customized avatars based on location information |
US11877211B2 (en) | 2019-01-14 | 2024-01-16 | Snap Inc. | Destination sharing in location sharing system |
US11751015B2 (en) | 2019-01-16 | 2023-09-05 | Snap Inc. | Location-based context information sharing in a messaging system |
US11294936B1 (en) | 2019-01-30 | 2022-04-05 | Snap Inc. | Adaptive spatial density based clustering |
US11693887B2 (en) | 2019-01-30 | 2023-07-04 | Snap Inc. | Adaptive spatial density based clustering |
US11972529B2 (en) | 2019-02-01 | 2024-04-30 | Snap Inc. | Augmented reality system |
US11809624B2 (en) | 2019-02-13 | 2023-11-07 | Snap Inc. | Sleep detection in a location sharing system |
US11954314B2 (en) | 2019-02-25 | 2024-04-09 | Snap Inc. | Custom media overlay system |
US11500525B2 (en) | 2019-02-25 | 2022-11-15 | Snap Inc. | Custom media overlay system |
US11574431B2 (en) | 2019-02-26 | 2023-02-07 | Snap Inc. | Avatar based on weather |
US11301117B2 (en) | 2019-03-08 | 2022-04-12 | Snap Inc. | Contextual information in chat |
US11868414B1 (en) | 2019-03-14 | 2024-01-09 | Snap Inc. | Graph-based prediction for contact suggestion in a location sharing system |
US11852554B1 (en) | 2019-03-21 | 2023-12-26 | Snap Inc. | Barometer calibration in a location sharing system |
US11249614B2 (en) | 2019-03-28 | 2022-02-15 | Snap Inc. | Generating personalized map interface with enhanced icons |
US11740760B2 (en) | 2019-03-28 | 2023-08-29 | Snap Inc. | Generating personalized map interface with enhanced icons |
US11361493B2 (en) | 2019-04-01 | 2022-06-14 | Snap Inc. | Semantic texture mapping system |
US11785549B2 (en) | 2019-05-30 | 2023-10-10 | Snap Inc. | Wearable device location systems |
US11963105B2 (en) | 2019-05-30 | 2024-04-16 | Snap Inc. | Wearable device location systems architecture |
US11606755B2 (en) | 2019-05-30 | 2023-03-14 | Snap Inc. | Wearable device location systems architecture |
US11206615B2 (en) | 2019-05-30 | 2021-12-21 | Snap Inc. | Wearable device location systems |
US11601783B2 (en) | 2019-06-07 | 2023-03-07 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11917495B2 (en) | 2019-06-07 | 2024-02-27 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11714535B2 (en) | 2019-07-11 | 2023-08-01 | Snap Inc. | Edge gesture interface with smart interactions |
US11821742B2 (en) | 2019-09-26 | 2023-11-21 | Snap Inc. | Travel based notifications |
US11218838B2 (en) | 2019-10-31 | 2022-01-04 | Snap Inc. | Focused map-based context information surfacing |
US11977553B2 (en) | 2019-12-30 | 2024-05-07 | Snap Inc. | Surfacing augmented reality objects |
US11128715B1 (en) | 2019-12-30 | 2021-09-21 | Snap Inc. | Physical friend proximity in chat |
US11429618B2 (en) | 2019-12-30 | 2022-08-30 | Snap Inc. | Surfacing augmented reality objects |
US11943303B2 (en) | 2019-12-31 | 2024-03-26 | Snap Inc. | Augmented reality objects registry |
US11343323B2 (en) | 2019-12-31 | 2022-05-24 | Snap Inc. | Augmented reality objects registry |
US11893208B2 (en) | 2019-12-31 | 2024-02-06 | Snap Inc. | Combined map icon with action indicator |
US11228551B1 (en) | 2020-02-12 | 2022-01-18 | Snap Inc. | Multiple gateway message exchange |
US11888803B2 (en) | 2020-02-12 | 2024-01-30 | Snap Inc. | Multiple gateway message exchange |
US11516167B2 (en) | 2020-03-05 | 2022-11-29 | Snap Inc. | Storing data based on device location |
US11765117B2 (en) | 2020-03-05 | 2023-09-19 | Snap Inc. | Storing data based on device location |
US11619501B2 (en) | 2020-03-11 | 2023-04-04 | Snap Inc. | Avatar based on trip |
US11776256B2 (en) | 2020-03-27 | 2023-10-03 | Snap Inc. | Shared augmented reality system |
US11915400B2 (en) | 2020-03-27 | 2024-02-27 | Snap Inc. | Location mapping for large scale augmented-reality |
US11430091B2 (en) | 2020-03-27 | 2022-08-30 | Snap Inc. | Location mapping for large scale augmented-reality |
US11483267B2 (en) | 2020-06-15 | 2022-10-25 | Snap Inc. | Location sharing using different rate-limited links |
US11503432B2 (en) | 2020-06-15 | 2022-11-15 | Snap Inc. | Scalable real-time location sharing framework |
US11314776B2 (en) | 2020-06-15 | 2022-04-26 | Snap Inc. | Location sharing using friend list versions |
US11290851B2 (en) | 2020-06-15 | 2022-03-29 | Snap Inc. | Location sharing using offline and online objects |
US11676378B2 (en) | 2020-06-29 | 2023-06-13 | Snap Inc. | Providing travel-based augmented reality content with a captured image |
US11943192B2 (en) | 2020-08-31 | 2024-03-26 | Snap Inc. | Co-location connection service |
US11483262B2 (en) | 2020-11-12 | 2022-10-25 | International Business Machines Corporation | Contextually-aware personalized chatbot |
US11606756B2 (en) | 2021-03-29 | 2023-03-14 | Snap Inc. | Scheduling requests for location data |
US11902902B2 (en) | 2021-03-29 | 2024-02-13 | Snap Inc. | Scheduling requests for location data |
US11601888B2 (en) | 2021-03-29 | 2023-03-07 | Snap Inc. | Determining location using multi-source geolocation data |
US11645324B2 (en) | 2021-03-31 | 2023-05-09 | Snap Inc. | Location-based timeline media content system |
US11829834B2 (en) | 2021-10-29 | 2023-11-28 | Snap Inc. | Extended QR code |
US12033253B2 (en) | 2021-11-17 | 2024-07-09 | Snap Inc. | Augmented reality typography personalization system |
US12001750B2 (en) | 2022-04-20 | 2024-06-04 | Snap Inc. | Location-based shared augmented reality experience system |
US12026362B2 (en) | 2022-05-19 | 2024-07-02 | Snap Inc. | Video editing application for mobile devices |
US12020384B2 (en) | 2022-06-21 | 2024-06-25 | Snap Inc. | Integrating augmented reality experiences with other components |
US12020386B2 (en) | 2022-06-23 | 2024-06-25 | Snap Inc. | Applying pregenerated virtual experiences in new location |
US12028301B2 (en) | 2023-01-31 | 2024-07-02 | Snap Inc. | Contextual generation and selection of customized media content |
US11902129B1 (en) | 2023-03-24 | 2024-02-13 | T-Mobile Usa, Inc. | Vendor-agnostic real-time monitoring of telecommunications networks |
US12033191B2 (en) | 2023-03-29 | 2024-07-09 | Snap Inc. | Generation, curation, and presentation of media collections with automated advertising |
US12035198B2 (en) | 2023-05-22 | 2024-07-09 | Snap Inc. | Visitation tracking system |
Also Published As
Publication number | Publication date |
---|---|
WO2009043020A3 (en) | 2009-05-14 |
US20100299615A1 (en) | 2010-11-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2009043020A2 (en) | System and method for injecting sensed presence into social networking applications | |
Miluzzo et al. | CenceMe – injecting sensing presence into social networking applications | |
CN107111652B (en) | System and method for selecting device content based on probability of device being linked | |
TWI636416B (en) | Method and system for multi-phase ranking for content personalization | |
CN106605418B (en) | Power management for mobile clients using location-based services | |
CA2832557C (en) | Recommending digital content based on implicit user identification | |
TWI533246B (en) | Method and system for discovery of user unknown interests | |
US20150178282A1 (en) | Fast and dynamic targeting of users with engaging content | |
CN110431585A (en) | A kind of generation method and device of user's portrait | |
KR101573993B1 (en) | Method and apparatus for segmenting context information | |
US9454234B2 (en) | Instruction triggering method and device, user information acquisition method and system, terminal, and server | |
US20100077020A1 (en) | Method, apparatus and computer program product for providing intelligent updates of emission values | |
US20130132566A1 (en) | Method and apparatus for determining user context | |
WO2017019650A1 (en) | Activity detection based on activity models | |
US20170031996A1 (en) | Virtual Tiles For Service Content Recommendation | |
US9871876B2 (en) | Sequential behavior-based content delivery | |
US20210337010A1 (en) | Computerized system and method for automatically providing networked devices non-native functionality | |
CN110782289B (en) | Service recommendation method and system based on user portrait | |
CN111247782B (en) | Method and system for automatically creating instant AD-HOC calendar events | |
US9826366B2 (en) | Low key point of interest notification | |
US11422996B1 (en) | Joint embedding content neural networks | |
US20210248173A1 (en) | Systems and methods for providing media recommendations using contextual and sequential user embeddings | |
CN113454669A (en) | Characterizing a place by user visited features | |
CN108307039B (en) | Application information display method and mobile terminal | |
CN110799946A (en) | Multi-application user interest memory management |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 08833029; Country of ref document: EP; Kind code of ref document: A2 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 12680492; Country of ref document: US |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 08833029; Country of ref document: EP; Kind code of ref document: A2 |