US20150058123A1 - Contextually aware interactive advertisements - Google Patents
- Publication number
- US20150058123A1 (U.S. application Ser. No. 14/465,786)
- Authority
- US
- United States
- Prior art keywords
- advertisement
- user
- presentation
- real
- location
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0251—Targeted advertisements
- G06Q30/0267—Wireless devices
-
- E—FIXED CONSTRUCTIONS
- E05—LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
- E05B—LOCKS; ACCESSORIES THEREFOR; HANDCUFFS
- E05B47/00—Operating or controlling locks or other fastening devices by electric or magnetic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/953—Querying, e.g. by the use of web search engines
- G06F16/9535—Search customisation based on user profiles and personalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/30—Payment architectures, schemes or protocols characterised by the use of specific devices or networks
- G06Q20/32—Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
- G06Q20/321—Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices using wearable devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/30—Payment architectures, schemes or protocols characterised by the use of specific devices or networks
- G06Q20/32—Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
- G06Q20/322—Aspects of commerce using mobile devices [M-devices]
- G06Q20/3224—Transactions dependent on location of M-devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/30—Payment architectures, schemes or protocols characterised by the use of specific devices or networks
- G06Q20/32—Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
- G06Q20/326—Payment applications installed on the mobile devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/38—Payment protocols; Details thereof
- G06Q20/384—Payment protocols; Details thereof using social networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0251—Targeted advertisements
- G06Q30/0255—Targeted advertisements based on user history
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0251—Targeted advertisements
- G06Q30/0261—Targeted advertisements based on user location
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0282—Rating or review of business operators or products
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0623—Item investigation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0631—Item recommendations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0633—Lists, e.g. purchase orders, compilation or processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/01—Social networking
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/30—Individual registration on entry or exit not involving the use of a pass
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07F—COIN-FREED OR LIKE APPARATUS
- G07F7/00—Mechanisms actuated by objects other than coins to free or to actuate vending, hiring, coin or paper currency dispensing or refunding apparatus
- G07F7/02—Mechanisms actuated by objects other than coins to free or to actuate vending, hiring, coin or paper currency dispensing or refunding apparatus by keys or other credit registering devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/02—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail using automatic reactions or user delegation, e.g. automatic replies or chatbot-generated messages
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/10—Protocols in which an application is distributed across nodes in the network
-
- E—FIXED CONSTRUCTIONS
- E05—LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
- E05B—LOCKS; ACCESSORIES THEREFOR; HANDCUFFS
- E05B47/00—Operating or controlling locks or other fastening devices by electric or magnetic means
- E05B2047/0048—Circuits, feeding, monitoring
- E05B2047/005—Opening, closing of the circuit
- E05B2047/0054—Opening, closing of the circuit using microprocessor, printed circuits, or the like
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/52—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail for supporting social networking services
Definitions
- Embodiments of the present disclosure relate generally to mobile computing technology and, more particularly, but not by way of limitation, to contextual interactive advertisements.
- FIG. 1A is a block diagram illustrating a networked system, according to some example embodiments.
- FIG. 1B illustrates a block diagram showing components provided within the system of FIG. 1A , according to some example embodiments.
- FIG. 2 is a block diagram illustrating an example embodiment of an advertisement system, according to some example embodiments.
- FIG. 3 is a depiction of an interactive advertisement, according to some example embodiments.
- FIG. 4 is a flow diagram illustrating an example method for identifying an advertisement and presenting item listings, according to some example embodiments.
- FIG. 5 is an illustration showing example types of sensors that provide various sensor data, according to some example embodiments.
- FIG. 6 is a flow diagram illustrating communication between various entities, according to some example embodiments.
- FIG. 7 is a flow diagram illustrating further example operations for presenting item listings based on real-time contextual data, according to some example embodiments.
- FIGS. 8 and 9 are flow diagrams illustrating further operations for determining contextual conditions, according to some example embodiments.
- FIGS. 10-13 illustrate example user interfaces, according to some example embodiments.
- FIG. 14 depicts an example mobile device and mobile operating system interface, according to some example embodiments.
- FIG. 15 is a block diagram illustrating an example of a software architecture that may be installed on a machine, according to some example embodiments.
- FIG. 16 illustrates a diagrammatic representation of a machine in the form of a computer system within which a set of instructions may be executed for causing the machine to perform any one or more of the methodologies discussed herein, according to an example embodiment.
- Mobile devices provide a variety of data that is measured, captured, or otherwise obtained via sensors such as position sensors to capture position data (e.g., using a Global Positioning System (GPS) component), detection sensors to detect identifiers (e.g., an optical sensor to read QR codes), and the like.
- contextually aware interactive advertisements may be realized using such data.
- exclusivity associated with a particular advertisement can be implemented using contextual data.
- a particular (in some cases exclusive) offer or deal corresponding to a particular advertisement may be available to users who are physically being presented the particular advertisement at a particular time and location.
- the element of exclusivity is intended, in some scenarios, to have the effect of generating excitement or a “buzz” regarding an advertising campaign.
- an advertisement indication that indicates a presentation of an advertisement or promotion to a user is received.
- the user may scan or otherwise obtain an advertisement identifier (e.g., by scanning a QR code that includes the advertisement identifier) corresponding to an advertisement using a user device (e.g., a smart phone equipped with an optical sensor to scan QR codes).
- the advertisement indication comprises the advertisement identifier scanned by the user, although the advertisement indication can include other information such as location, time, or other contextual data.
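As a purely illustrative sketch, the advertisement indication described above might be modeled as a small record carrying the scanned identifier plus optional contextual fields. The field names below are assumptions for illustration, not taken from the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AdIndication:
    """Hypothetical payload sent when a user scans an advertisement.

    Only the advertisement identifier is required; location and time are
    optional contextual data, mirroring the description above.
    """
    ad_id: str                          # identifier decoded from the QR code
    latitude: Optional[float] = None    # where the advertisement was scanned
    longitude: Optional[float] = None
    scanned_at: Optional[float] = None  # Unix timestamp of the scan

# Example indication produced by a scan.
indication = AdIndication(ad_id="AD-1234", latitude=37.77, longitude=-122.42,
                          scanned_at=1408982400.0)
```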
- the advertisement corresponding to the advertisement indication is identified. For example, if the advertisement indication includes the advertisement identifier, the advertisement can be identified via a lookup of the advertisement based on the advertisement identifier.
- one or more item listings are determined based, at least in part, on the identified advertisement.
- a predefined set of items is accessed based on the advertisement (e.g., a lookup of the predefined set of items corresponding to the advertisement using the advertisement identifier) and one or more item listings corresponding to items included in the predefined set of items may be identified.
- the advertisement corresponds to an item type, and item listings associated with the item type are determined (e.g., identifying item listings on an e-commerce website that match the item type). Many other schemes and techniques may be employed to determine the item listing.
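The identification and listing-determination steps above can be sketched as two lookups: first resolve the advertisement from its identifier, then prefer the advertisement's predefined item set and fall back to matching by item type. The tables and keys here are illustrative stand-ins for whatever datastore an implementation would actually use:

```python
# Hypothetical lookup tables standing in for a real datastore.
ADS = {
    "AD-1234": {"title": "Summer Sneaker Sale", "item_type": "sneakers"},
    "AD-5678": {"title": "Sneaker Launch", "item_type": "sneakers"},
}
PREDEFINED_ITEMS = {
    "AD-1234": ["listing-1", "listing-2"],   # predefined set for this ad only
}
LISTINGS_BY_TYPE = {
    "sneakers": ["listing-1", "listing-2", "listing-9"],
}

def identify_advertisement(ad_id):
    """Identify the advertisement via a lookup based on its identifier."""
    return ADS[ad_id]

def determine_item_listings(ad_id):
    """Prefer the predefined item set for the advertisement; otherwise
    fall back to listings matching the advertisement's item type."""
    if ad_id in PREDEFINED_ITEMS:
        return PREDEFINED_ITEMS[ad_id]
    item_type = ADS[ad_id]["item_type"]
    return LISTINGS_BY_TYPE.get(item_type, [])
```

As the text notes, many other determination schemes are possible; this sketch only shows the two named here.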
- contextual data corresponding to the advertisement indication is received.
- the contextual data comprises real-time contextual data.
- the contextual data corresponds to a physical context or physical environment of the presentation of the advertisement.
- the contextual data comprises location data (e.g., as determined by a GPS component of a mobile device of the user) that corresponds to a presentation location of the presentation of the advertisement to the user.
- a location of where the user is viewing a particular advertisement may be ascertained in real-time.
- Presentation of the at least one item listing is caused based on the real-time contextual data. For example, if the user is viewing a particular advertisement at a particular location and time, the item listing is presented to the user.
- the item listings are exclusively available to users who meet contextual conditions such as a location condition (e.g., a distance condition) and a temporal condition. For example, if a user's location is not within a given distance of an advertisement location, the item listings may not be available to that user. In this manner, the user may interact with a contextually aware interactive advertisement.
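One way to gate exclusivity on the location and temporal conditions just described is a great-circle distance check combined with a time window. This is a sketch under assumed thresholds and parameter names, not the disclosed method:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def meets_contextual_conditions(user_lat, user_lon, scanned_at,
                                ad_lat, ad_lon, ad_start, ad_end,
                                max_distance_m=100.0):
    """Return True only when the user satisfies both the distance
    condition (near the advertisement location) and the temporal
    condition (scan time inside the campaign window)."""
    close_enough = haversine_m(user_lat, user_lon, ad_lat, ad_lon) <= max_distance_m
    in_window = ad_start <= scanned_at <= ad_end
    return close_enough and in_window
```

A server could run this check before releasing the exclusive item listings; the 100-meter default is an arbitrary example value.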
- With reference to FIG. 1A , an example embodiment of a high-level client-server-based network architecture 100 is shown.
- a networked system 102 provides server-side functionality via a network 104 (e.g., the Internet or wide area network (WAN)) to a client device 110 .
- a user (e.g., user 106 ) interacts with the networked system 102 using the client device 110 .
- FIG. 1A illustrates, for example, a web client 112 (e.g., a browser, such as the Internet Explorer® browser developed by Microsoft® Corporation of Redmond, Washington), client application(s) 114 , and a programmatic client 116 executing on the client device 110 .
- the client device 110 may include the web client 112 , the client application(s) 114 , and the programmatic client 116 alone, together, or in any suitable combination.
- FIG. 1A shows one client device 110 , in other implementations, the network architecture 100 comprises multiple client devices.
- the client device 110 comprises a computing device that includes at least a display and communication capabilities that provide access to the networked system 102 via the network 104 .
- the client device 110 may comprise, but is not limited to, a remote device, work station, computer, general purpose computer, Internet appliance, hand-held device, wireless device, portable device, wearable computer, cellular or mobile phone, Personal Digital Assistant (PDA), smart phone, tablet, ultrabook, netbook, laptop, desktop, multi-processor system, microprocessor-based or programmable consumer electronics device, game console, set-top box, network Personal Computer (PC), mini-computer, and so forth.
- the client device 110 comprises one or more of a touch screen, accelerometer, gyroscope, biometric sensor, camera, microphone, Global Positioning System (GPS) device, and the like.
- the client device 110 communicates with the network 104 via a wired or wireless connection.
- the network 104 comprises an ad hoc network, an intranet, an extranet, a Virtual Private Network (VPN), a Local Area Network (LAN), a wireless LAN (WLAN), a Wide Area Network (WAN), a wireless WAN (WWAN), a Metropolitan Area Network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a Wireless Fidelity (Wi-Fi®) network, a Worldwide Interoperability for Microwave Access (WiMax) network, another type of network, or any suitable combination thereof.
- the client device 110 includes one or more of the applications (also referred to as “apps”) such as, but not limited to, web browsers, book reader apps (operable to read e-books), media apps (operable to present various media forms including audio and video), fitness apps, biometric monitoring apps, messaging apps, electronic mail (email) apps, and e-commerce site apps (also referred to as “marketplace apps”).
- the client application(s) 114 include various components operable to present information to the user and to communicate with the networked system 102 .
- the e-commerce site application may be configured to locally provide the user interface and at least some of the functionalities with the application configured to communicate with the networked system 102 , on an as needed basis, for data or processing capabilities not locally available (e.g., access to a database of items available for sale, to authenticate a user, to verify a method of payment).
- the client device 110 can use its web browser to access the e-commerce site (or a variant thereof) hosted on the networked system 102 .
- the user (e.g., the user 106 ) comprises a person, a machine, or other means of interacting with the client device 110 .
- the user is not part of the network architecture 100 , but interacts with the network architecture 100 via the client device 110 or another means.
- the user provides input (e.g., touch screen input or alphanumeric input) to the client device 110 and the input is communicated to the networked system 102 via the network 104 .
- in response to receiving the input from the user, the networked system 102 communicates information to the client device 110 via the network 104 to be presented to the user. In this way, the user can interact with the networked system 102 using the client device 110 .
- An Application Program Interface (API) server 120 and a web server 122 may be coupled to, and provide programmatic and web interfaces respectively to, one or more application server(s) 140 .
- the application server(s) 140 may host one or more publication system(s) 142 , payment system(s) 144 , and an advertisement system 150 , each of which may comprise one or more modules or applications and each of which may be embodied as hardware, software, firmware, or any combination thereof.
- the application server(s) 140 are, in turn, shown to be coupled to one or more database server(s) 124 that facilitate access to one or more information storage repositories or database(s) 126 .
- the database(s) 126 are storage devices that store information to be posted (e.g., publications or listings) to the publication system(s) 142 .
- the database(s) 126 may also store digital goods information in accordance with some example embodiments.
- a third party application 132 executing on a third party server 130 , is shown as having programmatic access to the networked system 102 via the programmatic interface provided by the API server 120 .
- the third party application 132 utilizing information retrieved from the networked system 102 , may support one or more features or functions on a website hosted by the third party.
- the third party website may, for example, provide one or more promotional, marketplace, or payment functions that are supported by the relevant applications of the networked system 102 .
- the publication system(s) 142 may provide a number of publication functions and services to the users that access the networked system 102 .
- the payment system(s) 144 may likewise provide a number of functions to perform or facilitate payments and transactions. While the publication system(s) 142 and payment system(s) 144 are shown in FIG. 1A to both form part of the networked system 102 , it will be appreciated that, in alternative embodiments, each system 142 and 144 may form part of a payment service that is separate and distinct from the networked system 102 . In some example embodiments, the payment system(s) 144 may form part of the publication system(s) 142 .
- the advertisement system 150 provides functionality to implement contextually aware interactive advertisements. As such, the advertisement system 150 receives an advertisement indication, identifies the advertisement corresponding to the advertisement indication, determines the item listings based on the advertisement, receives the contextual data, and causes the presentation of the item listings to the user based on the contextual data. In some example embodiments, the system 150 communicates with the client device 110 , the third party server(s) 130 , the publication system(s) 142 (e.g., retrieving item listings), and the payment system(s) 144 (e.g., purchasing a listing). In an alternative example embodiment, the advertisement system 150 is a part of the publication system(s) 142 . The advertisement system 150 will be discussed further in connection with FIG. 2 below.
- although the client-server-based network architecture 100 shown in FIG. 1A employs a client-server architecture, the present inventive subject matter is, of course, not limited to such an architecture, and may equally well find application in a distributed, or peer-to-peer, architecture system, for example.
- the various systems of the application server(s) 140 (e.g., the publication system(s) 142 and the payment system(s) 144 ) may also be implemented as standalone software programs, which do not necessarily have networking capabilities.
- the web client 112 may access the various systems of the networked system 102 (e.g., the publication system(s) 142 ) via the web interface supported by the web server 122 .
- the programmatic client 116 and client application(s) 114 may access the various services and functions provided by the networked system 102 via the programmatic interface provided by the API server 120 .
- the programmatic client 116 may, for example, be a seller application (e.g., the Turbo Lister application developed by eBay® Inc., of San Jose, Calif.) to enable sellers to author and manage listings on the networked system 102 in an off-line manner, and to perform batch-mode communications between the programmatic client 116 and the networked system 102 .
- FIG. 1B illustrates a block diagram showing components provided within the publication system(s) 142 , according to some embodiments.
- the publication system(s) 142 may comprise a marketplace system to provide marketplace functionality (e.g., facilitating the purchase of items associated with item listings on an e-commerce website).
- the networked system 102 may be hosted on dedicated or shared server machines that are communicatively coupled to enable communications between server machines.
- the components themselves are communicatively coupled (e.g., via appropriate interfaces) to each other and to various data sources, so as to allow information to be passed between the applications or so as to allow the applications to share and access common data.
- the components may access one or more database(s) 126 via the database server(s) 124 .
- the networked system 102 may provide a number of publishing, listing, and price-setting mechanisms whereby a seller (also referred to as a “first user”) may list (or publish information concerning) goods or services for sale or barter, a buyer (also referred to as a “second user”) can express interest in or indicate a desire to purchase or barter such goods or services, and a transaction (such as a trade) may be completed pertaining to the goods or services.
- the networked system 102 may comprise a publication engine 160 and a selling engine 162 .
- the publication engine 160 may publish information, such as item listings or product description pages, on the networked system 102 .
- the selling engine 162 may comprise one or more fixed-price engines that support fixed-price listing and price setting mechanisms and one or more auction engines that support auction-format listing and price setting mechanisms (e.g., English, Dutch, Chinese, Double, Reverse auctions, etc.).
- the various auction engines may also provide a number of features in support of these auction-format listings, such as a reserve price feature whereby a seller may specify a reserve price in connection with a listing and a proxy-bidding feature whereby a bidder may invoke automated proxy bidding.
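- The proxy-bidding feature described above can be sketched as a simple rule: bid the minimum amount needed to outbid a rival, never exceeding the bidder's stated maximum. A minimal illustration follows (the function and parameter names are hypothetical, not part of the disclosed system):

```python
def proxy_bid(rival_bid, increment, max_bid):
    """Automated proxy bidding: return the next bid needed to lead,
    or None once the bidder's maximum would be exceeded."""
    needed = rival_bid + increment
    return needed if needed <= max_bid else None
```

- In this sketch the proxy raises the bid only as far as required, which is the behavior a bidder invokes when delegating to automated proxy bidding.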
- the selling engine 162 may further comprise one or more deal engines that support merchant-generated offers for products and services.
- a listing engine 164 allows sellers to conveniently author listings of items or authors to author publications.
- the listings pertain to goods or services that a user (e.g., a seller) wishes to transact via the networked system 102 .
- the listings may be an offer, deal, coupon, or discount for the good or service.
- Each good or service is associated with a particular category.
- the listing engine 164 may receive listing data such as title, description, and aspect name/value pairs.
- each listing for a good or service may be assigned an item identifier.
- a user may create a listing that is an advertisement or other form of information publication.
- Listings may then be stored to one or more storage devices coupled to the networked system 102 (e.g., database(s) 126 ).
- Listings also may comprise product description pages that display a product and information (e.g., product title, specifications, and reviews) associated with the product.
- the product description page may include an aggregation of item listings that correspond to the product described on the product description page.
- the listing engine 164 also may allow buyers to conveniently author listings or requests for items desired to be purchased.
- the listings may pertain to goods or services that a user (e.g., a buyer) wishes to transact via the networked system 102 .
- Each good or service is associated with a particular category.
- the listing engine 164 may receive as much or as little listing data, such as title, description, and aspect name/value pairs, as the buyer knows about the requested item.
- the listing engine 164 may parse the buyer's submitted item information and may complete incomplete portions of the listing.
- the listing engine 164 may parse the description, extract key terms, and use those terms to make a determination of the identity of the item. Using the determined item identity, the listing engine 164 may retrieve additional item details for inclusion in the buyer item request. In some embodiments, the listing engine 164 may assign an item identifier to each listing for a good or service.
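- The key-term approach described above — parse the buyer's description, extract terms, and determine the item identity — might be sketched as a naive term-overlap lookup (the catalog and all names here are illustrative assumptions, not the disclosed implementation):

```python
import re

# Hypothetical catalog mapping item identities to characteristic terms.
CATALOG = {
    "acme-blender-3000": {"blender", "acme", "3000"},
    "acme-toaster":      {"toaster", "acme"},
}

def identify_item(description):
    """Extract key terms from free-text and pick the catalog entry
    sharing the most terms; return None if nothing matches."""
    terms = set(re.findall(r"[a-z0-9]+", description.lower()))
    best = max(CATALOG, key=lambda item: len(CATALOG[item] & terms))
    return best if CATALOG[best] & terms else None
```

- Once an identity is determined this way, additional item details could be retrieved for inclusion in the buyer item request, as the passage above describes.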
- the listing engine 164 allows sellers to generate offers for discounts on products or services.
- the listing engine 164 may receive listing data, such as the product or service being offered, a price or discount for the product or service, a time period for which the offer is valid, and so forth.
- the listing engine 164 permits sellers to generate offers from sellers' mobile devices. The generated offers may be uploaded to the networked system 102 for storage and tracking.
- Searching the networked system 102 is facilitated by a searching engine 166 .
- the searching engine 166 enables keyword queries of listings published via the networked system 102 .
- the searching engine 166 receives the keyword queries from a device of a user and conducts a review of the storage device storing the listing information. The review enables compilation of a result set of listings that may be sorted and returned to the client device 110 of the user.
- the searching engine 166 may record the query (e.g., keywords) and any subsequent user actions and behaviors (e.g., navigations, selections, or click-throughs).
- the searching engine 166 also may perform a search based on a location of the user.
- a user may access the searching engine 166 via a mobile device and generate a search query. Using the search query and the user's location, the searching engine 166 may return relevant search results for products, services, offers, auctions, and so forth to the user.
- the searching engine 166 may identify relevant search results both in a list form and graphically on a map. Selection of a graphical indicator on the map may provide additional details regarding the selected search result.
- the user may specify, as part of the search query, a radius or distance from the user's current location to limit search results.
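- A radius-limited, location-aware search of this kind can be illustrated with a great-circle distance filter over the result set (a sketch; the field names are assumptions):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def search_nearby(results, user_lat, user_lon, radius_km):
    """Limit a result set to listings within the user-specified radius."""
    return [r for r in results
            if haversine_km(user_lat, user_lon, r["lat"], r["lon"]) <= radius_km]
```

- The filtered results could then be rendered both as a list and as graphical indicators on a map, per the passage above.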
- a navigation engine 168 allows users to navigate through various categories, catalogs, or inventory data structures according to which listings may be classified within the networked system 102 .
- the navigation engine 168 allows a user to successively navigate down a category tree comprising a hierarchy of categories (e.g., the category tree structure) until a particular set of listings is reached.
- Various other navigation applications within the navigation engine 168 may be provided to supplement the searching and browsing applications.
- the navigation engine 168 may record the various user actions (e.g., clicks) performed by the user in order to navigate down the category tree.
- a personalization engine 170 provides functionality to personalize various aspects of user interactions with the networked system 102 .
- the user can define, provide, or otherwise communicate personalization settings used by the personalization engine 170 to determine interactions with the networked system 102 .
- the personalization engine 170 determines personalization settings automatically and personalizes interactions based on the automatically determined settings. For example, the personalization engine 170 determines a native language of the user and automatically presents information in the native language.
- FIG. 2 is a block diagram of the advertisement system 150 that provides functionality to implement contextually aware interactive advertisements, according to some example embodiments.
- the advertisement system 150 includes a presentation module 210 , a communication module 220 , an analysis module 230 , an item module 240 , a condition module 250 , and an offer module 260 . All, or some, of the modules 210 - 260 of FIG. 2 communicate with each other either directly or indirectly, for example, via a network coupling, shared memory, and the like. It will be appreciated that each module of modules 210 - 260 may be implemented as a single module, combined into other modules, further subdivided into multiple modules, or any suitable combination thereof. Other modules not pertinent to example embodiments may also be included, but are not shown.
- the presentation module 210 provides various presentation and user interface functionality operable to interactively present and receive information from the user. For instance, the presentation module 210 can cause presentation of the determined item listings to the user. In various implementations, the presentation module 210 presents or causes presentation of information (e.g., visually displaying information on a screen, acoustic output, haptic feedback). Interactively presenting is intended to include the exchange of information between a particular device and the user.
- the user may provide input to interact with the user interface in many possible manners such as alphanumeric, point based (e.g., cursor), tactile, or other input (e.g., touch screen, tactile sensor, light sensor, infrared sensor, biometric sensor, microphone, gyroscope, accelerometer, or other sensors), and the like.
- the presentation module 210 provides many other user interfaces to facilitate functionality described herein.
- “presenting” as used herein is intended to include communicating information or instructions to a particular device that is operable to perform presentation based on the communicated information or instructions.
- the communication module 220 provides various communications functionality and web services.
- the communication module 220 provides network communication such as communicating with the networked system 102 , the client device 110 , and the third party server(s) 130 .
- the communication module 220 receives the advertisement indication from a user device (e.g., a smart phone) of the user.
- the communication module 220 receives contextual data corresponding to the advertisement indication.
- the contextual data is real-time contextual data.
- the network communication may operate over wired or wireless modalities.
- Web services are intended to include retrieving information from the third party server(s) 130 , the database(s) 126 , and the application server(s) 140 .
- information retrieved by the communication module 220 comprises data associated with the user (e.g., user profile information from an online account, social network service data associated with the user), data associated with one or more items listed on an e-commerce website (e.g., images of the item, reviews of the item, item price), or other data to facilitate the functionality described herein.
- the analysis module 230 provides a variety of analysis functions to facilitate the functionality described herein. For example, the analysis module 230 identifies the advertisement corresponding to the advertisement indication. More specifically, the analysis module 230 performs a lookup of the advertisement identifier included in the advertisement indication to identify the advertisement, according to some implementations. In some implementations, the analysis module 230 extracts information from the contextual data such as a user location, a presentation time, a user identity, and so on.
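- The identifier lookup and contextual extraction performed by the analysis module 230 might be sketched as follows (the advertisement table and field names are hypothetical):

```python
# Hypothetical advertisement table, indexed by advertisement identifier
# (the disclosure describes a lookup table in a database such as database(s) 126).
ADS = {"ad-123": {"brand": "Acme", "location": (40.7128, -74.0060)}}

def identify_advertisement(indication):
    """Look up the advertisement named by an advertisement indication and
    pull out the contextual fields the analysis would use."""
    ad = ADS.get(indication["advertisement_id"])
    context = indication.get("context", {})
    return ad, context.get("user_location"), context.get("presentation_time")
```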
- the analysis module 230 accesses user data corresponding to the user.
- user data may include calendars (e.g., user calendar events such as birthdays, trips, exams), user profiles (e.g., demographic information such as age, gender, income level), purchase histories, browse histories (e.g., search terms), social media content (e.g., check-ins, posts, connections), other user data (e.g., bookmarked websites, preferences or settings for various applications, application usage data such as time spent using a particular application), and the like.
- the analysis module 230 accesses, retrieves, or otherwise obtains the user data from the database(s) 126 , the third party server(s) 130 , the publication system(s) 142 , or elsewhere.
- the item module 240 provides functionality to determine the item listings, according to some implementations.
- the item listings correspond to items available for purchase such as a listing on an e-commerce website.
- the item module 240 may employ a variety of schemes and techniques to determine the item listing based on various data.
- the item module 240 accesses a predefined set of item listings corresponding to the advertisement and determines one or more item listings among the set of item listings.
- the predefined set of item listings may be configured by an operator, advertiser, or another party associated with the advertisement.
- the item module 240 determines the item listings from an e-commerce website (e.g., eBay®) based on the advertisement (e.g., an item type or brand corresponding to the advertisement).
- the condition module 250 provides functionality to implement contextual conditions in conjunction with the advertisement, according to some embodiments. For instance, the condition module 250 accesses contextual conditions associated with the advertisement and evaluates satisfaction of the contextual conditions. For example, if the contextual conditions include a distance condition, the condition module 250 determines satisfaction of the distance condition based on the user location and the advertisement location.
- the contextual conditions are intended to create exclusivity in association with the advertisement. In other words, the interactive features of the advertisement may be available under specified conditions associated with the contextual data and otherwise unavailable to the user.
- the offer module 260 provides functionality to generate advertisement offers associated with the item listing, according to some embodiments.
- the offer comprises a discount for a purchase associated with the item listing.
- the offer module 260 provides advertisement features that are specific to the advertisement.
- the advertisement features comprise, for example, free shipping, faster shipping, otherwise unavailable item features (e.g., a color or style not widely available), exclusive items, and so forth.
- the offer module 260 may provide a variety of offers to the user, and in some cases, the offers may be based on various data such as the contextual data, user data, and so forth.
- Scene 310 depicts an advertisement 320 that includes a tag 330 (e.g., a QR code embedded on or near the advertisement 320 ).
- the tag 330 is embedded in the advertisement 320 , and in other implementations, the tag 330 is merely in the vicinity of the advertisement 320 .
- user device 350 detects the advertisement identifier encoded in the tag 330 via a signal 340 .
- the signal 340 may be an optical signal captured, detected, or otherwise obtained by the user device 350 , with the user device 350 being operable to decode the signal to extract the advertisement identifier.
- the tag 330 comprises a QR code that is readable by an app executing on a mobile device of the user that includes a camera sensor.
- the tag 330 comprises a Radio Frequency Identification (RFID) tag, Near Field Communication (NFC) tag, smart tag, or another storage device operable to store the advertisement identifier and communicate the advertisement identifier to the user device 350 (see FIG. 5 for additional sensors operable to detect identifiers).
- a tagless identification of the advertisement may be implemented by comparing a user location to respective advertisement locations corresponding to a plurality of advertisements and identifying a match between a particular advertisement location and the user location.
- the user device 350 is communicatively coupled, via coupling 360 , to the network 104 , which is in turn communicatively coupled to the networked system 102 including the advertisement system 150 (discussed above in connection with FIG. 1A ).
- User 370 may initiate the identification of the advertisement 320 by operating the user device 350 .
- the user device 350 executes an app operable to obtain the advertisement identifier and presents a user interface that includes the item listings to the user.
- the user 370 is carrying the user device 350 (e.g., a smart phone or smart watch) and may be interested in the advertisement 320 .
- the user 370 initiates the identification of the advertisement 320 by detecting the tag 330 with the user device 350 .
- the user device 350 may extract the advertisement identifier from the tag 330 (e.g., scanning a QR code).
- the advertisement indication that includes the advertisement identifier corresponding to the tag 330 and the advertisement 320 is communicated from the user device 350 to the communication module 220 via the network 104 .
- the analysis module 230 identifies the advertisement corresponding to the advertisement indication (e.g., a lookup of the advertisement based on the advertisement identifier).
- the item module 240 determines the item listings, based, at least in part, on the identified advertisement. For instance, the item module 240 may access a predefined set of item listings or dynamically determine item listings corresponding to the advertisement. In some implementations, the item module 240 retrieves the item listing and associated item data from the publication system(s) 142 .
- the item data may include item images, price, description, brand, and so forth.
- the communication module 220 receives the contextual data corresponding to the advertisement indication from the user device 350 .
- the contextual data includes location data (e.g., as determined by a GPS component of the user device 350 ).
- the presentation module 210 based on the contextual data, causes presentation of the item listings (e.g., by communicating the item listings and instructions to present the item listing to the user device 350 ).
- the condition module 250 determines satisfaction of the contextual conditions and, based on the satisfaction of the contextual conditions, the presentation module 210 causes presentation of the item listings to the user.
- the contextual conditions include the distance condition.
- the analysis module 230 identifies the advertisement location corresponding to the advertisement (e.g., the advertisement location may be predefined and accessed by the analysis module 230 ) and extracts the user location from the contextual data (e.g., the contextual data includes GPS data from the user device 350 ).
- the condition module 250 determines satisfaction of the distance condition by determining that the user location is within a distance of the advertisement location. The distance can be predefined or dynamically determined based on the contextual data.
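- One way to evaluate such a distance condition is sketched below, using an equirectangular approximation that is adequate over the short ranges a distance condition typically targets (the names and the approximation are illustrative assumptions, not the disclosed implementation):

```python
from math import radians, cos, hypot

EARTH_RADIUS_M = 6371000

def within_distance(user_loc, ad_loc, max_m):
    """Distance condition: is the user within max_m metres of the
    advertisement location? Equirectangular approximation — fine for
    the short, 'in the vicinity' distances this condition gates."""
    lat1, lon1 = map(radians, user_loc)
    lat2, lon2 = map(radians, ad_loc)
    x = (lon2 - lon1) * cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return hypot(x, y) * EARTH_RADIUS_M <= max_m
```

- When this predicate holds, presentation of the item listings would proceed; otherwise the listings remain unavailable, preserving the exclusivity described above.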
- the presentation module 210 causes presentation of the item listing. Conversely, if the contextual conditions are not satisfied, the presentation module 210 does not cause presentation of the item listing.
- the user may not be able to view the item listings.
- the presentation of the item listings may be exclusive to users (e.g., the user 370 ) that are physically in the vicinity of the advertisement 320 .
- the user 370 may interact with the advertisement 320 in a contextually aware manner.
- FIG. 4 is a flow diagram illustrating an example method 400 for identifying the advertisement and presenting the item listings, according to some example embodiments.
- the operations of the method 400 may be performed by components of the advertisement system 150 .
- the communication module 220 may receive, from the user device, the advertisement indication that indicates the presentation of the advertisement or a promotion to the user.
- the advertisement indication can comprise the advertisement identifier extracted from a QR code, an RFID tag, an NFC tag, a smart tag, an audio signal (e.g., audio tagging), or the contextual data (e.g., a location mapping).
- the analysis module 230 extracts the advertisement identifier from various suitable combinations of tags and the contextual data.
- the advertisement indication includes the contextual data corresponding to the physical context or physical environment of the presentation of the advertisement to the user.
- the contextual data is received, retrieved, or otherwise obtained as a separate operation as discussed below in connection with operation 440 .
- the user initiates the advertisement identification (e.g., the user activates a user interface element on the user device to begin the advertisement identification).
- the advertisement identification initiates automatically via the analysis module 230 monitoring, tracking, or otherwise automatically detecting the advertisement indication.
- the analysis module 230 monitors the contextual data, received by the communication module 220 , for the advertisement indication.
- the contextual data may include location data pertaining to the user.
- the analysis module 230 automatically detects the advertisement indication based on the location data (e.g., mapping the user location with a plurality of advertisement locations).
- a QR code, an RFID tag, an NFC tag, a smart tag, or the like is embedded in the advertisement or in the vicinity of the advertisement.
- the user may initiate the advertisement identification by physically detecting a particular tag corresponding to the advertisement (e.g., physically moving a mobile device operable to detect RFID tags within a detection range of the RFID tag corresponding to the advertisement).
- the advertisement indication includes an advertisement identifier extracted from the tag (e.g., extracted by the user device).
- the advertisement may include an audio component (e.g., a television advertisement, a radio advertisement, a loud speaker announcement).
- the user device 350 or the analysis module 230 extracts the advertisement identifier using audio tag identification software.
- an app executing on the user device 350 , operable to detect and extract the advertisement identifier from an audio signal detected by the user device 350 , communicates the advertisement indication including the advertisement identifier to the communication module 220 .
- the user device 350 communicates the advertisement indication including the audio signal to the communication module 220 , and the analysis module 230 subsequently extracts the advertisement identifier from the audio signal.
- the analysis module 230 extracts the advertisement indication from the contextual data. For example, the analysis module 230 extracts the user location from the contextual data, accesses location data corresponding to a plurality of advertisements, and identifies a match between the user location and the location data of a particular advertisement among the plurality of advertisements. In this way, the analysis module 230 identifies a particular advertisement being presented to the user based on the contextual data.
- the analysis module 230 identifies the advertisement corresponding to the advertisement indication.
- the advertisement indication includes the advertisement identifier, and the analysis module 230 identifies the advertisement based on the advertisement identifier. For instance, the analysis module 230 performs a lookup of the advertisement using the advertisement identifier (e.g., a lookup table in a database, such as database(s) 126 , indexed with advertisement identifiers).
- the advertisement indication includes contextual data that the analysis module 230 uses to identify the advertisement. For instance, if the contextual data includes the user location, the analysis module 230 compares the user location to the advertisement locations (e.g., stored in a database such as database(s) 126 ).
- the item module 240 determines one or more item listings based, at least in part, on the advertisement.
- an operator, advertiser, or another party associated with the advertisement may specify a predefined set of item listings for the advertisement.
- the item module 240 accesses the predefined set of item listings for the advertisement and determines a portion of the predefined set of item listings.
- the advertisement may depict a particular celebrity, and the predefined set of item listings may include item listings endorsed by the particular celebrity.
- the item module 240 dynamically determines the item listings based on the advertisement. For instance, the item module 240 identifies item listings from an e-commerce website that are associated with the advertisement (e.g., same or similar type of item or brand as promoted by the advertisement).
- the item module 240 determines the item listings based on the contextual data, user data, or other data. For example, the item module 240 determines the item listings by first identifying item listings on an e-commerce website associated with the advertisement and then refining the identified item listings based on the user data. In a specific example, if the user data indicates a gender of the user, various ones of the identified item listings may be excluded based on the gender (e.g., gender specific apparel items). In a further example, the item module 240 may employ the contextual data to determine item listings pertinent to the user.
- the item module 240 may use the user location included in the contextual data as a basis for determining the item listings (e.g., based on the user being near a store that sells a particular item, include an item listing corresponding to the particular item).
- the contextual data may indicate weather conditions such as a cold day, and the item module 240 may identify item listings based on the weather conditions (e.g., item listings associated with cold weather such as hot coffee).
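- The refinement steps above — filtering advertisement-matched listings by user data and real-time context — can be sketched as follows (the field names, such as "gender" and "weather", are assumptions for illustration):

```python
def refine_listings(listings, user_data, context):
    """Refine advertisement-matched listings using user data and
    real-time contextual data; listings without a constraint pass through."""
    out = []
    for listing in listings:
        gender = listing.get("gender")
        if gender and user_data.get("gender") and gender != user_data["gender"]:
            continue  # exclude gender-specific items for the other gender
        weather = listing.get("weather")
        if weather and context.get("weather") and weather != context["weather"]:
            continue  # e.g., keep hot-coffee listings only on cold days
        out.append(listing)
    return out
```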
- the communication module 220 receives contextual data corresponding to the advertisement indication.
- the contextual data includes real-time contextual data.
- the term “real-time data,” as used herein, is intended to include data associated with an event currently happening.
- the real-time data may include user input data or sensor data communicated to the communication module 220 after a delay interval (e.g., due to transmission delay or other delays such as being temporarily stored at an intermediate device) between capturing the data and the communication module 220 receiving the data.
- the communication module 220 stores the contextual data (e.g., in a storage device such as database(s) 126 ) in association with the user and the advertisement (e.g., in a database indexed by a user identifier or the advertisement identifier).
- the item module 240 determines the item listings based on the stored contextual data.
- the stored contextual data may indicate that the user has previously initiated identification of a particular advertisement. Based on the indication of the previous presentation of the advertisement to the user, the item module 240 may determine different item listings than those previously presented to the user.
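- Serving different listings on a repeat scan could be sketched as follows, with the stored contextual data modeled as a simple in-memory map keyed by user and advertisement (an assumption for illustration; the disclosure stores this data in database(s) 126 ):

```python
def listings_for_user(candidates, stored_context, user_id, ad_id):
    """Detect a repeat scan of the same advertisement via stored
    contextual data and prefer listings not previously presented."""
    key = (user_id, ad_id)
    seen = stored_context.get(key, set())
    fresh = [c for c in candidates if c not in seen]
    stored_context[key] = seen | set(candidates)
    return fresh or candidates  # fall back if everything was shown before
```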
- the real-time contextual data corresponds to the physical context of the presentation of the advertisement.
- the physical context includes the presentation location (e.g., where the advertisement is being presented to the user), a presentation time (e.g., the time the advertisement is being presented to the user), an ambient noise level (e.g., a decibel level corresponding to a noise level of the advertisement presentation), an ambient temperature, an ambient illumination level, biometric data associated with the user, and so on.
- the communication module 220 receives the contextual data from sensors of the user device as further discussed in connection with FIG. 5 , below.
- example diagram 500 depicts non-limiting example sensor components 510 that may provide attribute data, according to some example embodiments.
- the sensor components 510 include motion components 520 , position components 530 , environmental components 540 , biometric components 550 , detection components 560 , and a wide gamut of other sensors, gauges, and measurement components not shown in FIG. 5 .
- the sensor components 510 or a suitable combination of the sensor components 510 may be included in any suitable device or machine of FIG. 1 , such as the client device 110 , to facilitate the functionality described herein.
- the sensor components 510 may receive, detect, measure, capture, or otherwise obtain sensor data associated with physical properties, attributes, or characteristics.
- the sensor components 510 may provide, produce, transmit, or otherwise communicate the sensor data or other indications associated with the physical properties, attributes, or characteristics (e.g., a sensor included in a device operable to communicate the sensor data to the networked system 102 ).
- a combination of devices may be employed to provide the sensor data (e.g., a first device that includes a sensor and is communicatively coupled to a second device that communicates sensor data received from the first device to the networked system 102 ).
- the sensor data provided by the sensor components 510 may be accessible to all, or some, of the modules described above on a real-time or near real-time basis.
- the sensor components 510 are grouped according to functionality merely for simplifying the following discussion and the grouping is in no way limiting.
- the motion components 520 include acceleration sensors (e.g., accelerometer), gravitation sensors, rotation sensors (e.g., gyroscope), and so forth.
- the motion components 520 may provide motion data such as velocity, acceleration, or other force measurements along the x, y, and z axes.
- the motion data is provided at a regular update rate or sampling rate (e.g., 10 updates per second) that may be configurable.
- the position components 530 include location sensors (e.g., a Global Positioning System (GPS) receiver component), altitude sensors (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensors (e.g., magnetometers that provide magnetic field strength along the x, y, and z axes), and the like.
- the position components 530 may provide position data such as latitude, longitude, altitude, and a time stamp. Similar to the motion components 520 , the position components 530 may provide the position data at a regular update rate that may be configurable.
- the environmental components 540 include illumination sensors (e.g., photometer), temperature sensors (e.g., one or more thermometers that detect ambient temperature), humidity sensors, pressure sensors (e.g., barometer), acoustic sensors (e.g., one or more microphones that detect background noise), proximity sensors (e.g., an infrared sensor that detects nearby objects), gas sensors (e.g., machine olfaction detection sensors, gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), and so on.
- the environmental components 540 may measure various physical parameters to provide an indication or signal corresponding to the physical environment surrounding the environmental components 540 .
- the biometric components 550 include components to detect expressions, measure biosignals, or identify people, among other functions.
- the biometric components 550 include expression components to detect expressions (also referred to as “kinesics”) such as hand gestures (e.g., an optical component to detect a hand gesture or a Doppler component to detect hand motions), vocal expressions (e.g., a microphone to detect changes in voice pitch that may indicate tension), facial expressions (e.g., a camera to detect expressions or micro-expressions of a person such as a smile), body gestures, and eye tracking (e.g., detecting the focal point of a person's eyes or patterns in eye movement).
- the biometric components 550 may also include, for example, biosignal components to measure biosignals such as blood pressure, heart rate, body temperature, perspiration, and brain waves (e.g., as determined by an electroencephalogram).
- the biometric components 550 include identification components to identify people such as retinal scanners (e.g., a camera component), vocal detectors (e.g., a microphone to receive audio data for voice identification), facial detectors, fingerprint detectors, and electroencephalogram sensors (e.g., to identify a person via unique brain wave patterns).
- the detection components 560 provide functionality to detect a variety of identifiers.
- the detection components 560 include Radio Frequency Identification (RFID) tag reader components, Near Field Communication (NFC) smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as Universal Product Code (UPC) bar codes, multi-dimensional bar codes such as a Quick Response (QR) code, Aztec code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, Uniform Commercial Code Reduced Space Symbology (UCC RSS)-2D bar code, and other optical codes), or acoustic detection components (e.g., microphones to identify tagged audio signals).
- the presentation module 210 causes presentation of the item listings based on the real-time contextual data. For instance, the presentation module 210 communicates the item listing to the user device with instructions to cause presentation of the item listing to the user. In response to receiving the instructions to cause the presentation, the user device generates a user interface including the item listings and displays the user interface to the user, according to some implementations. In alternative implementations, the presentation module 210 generates the user interface including the item listings and communicates the generated user interface to the user device for presentation to the user.
- the offer module 260 generates an advertisement offer associated with the item listings. Subsequently, the presentation module 210 provides the advertisement offer to the user (e.g., via a user interface configured to present the offer and receive an indication of a selection of the offer). In some instances, the offer module 260 generates the offer based on the contextual data. For example, if the contextual data indicates the user location is within a distance of a store that sells a particular item associated with the item listings, the offer module 260 may generate the offer for the particular item on that basis (e.g., a discount to entice the user to stop by the store). In various implementations, the offer comprises discounts, coupons, free shipping, or exclusive item features (e.g., an item style otherwise unavailable) associated with the item listings.
- the presentation module 210 or the offer module 260 may provide exclusive features (e.g., the exclusive features may be included in the presentation) associated with the item listings to the user.
- the exclusive features may include various promotional techniques such as offering items that are otherwise unavailable or of limited availability (e.g., a book including an author autograph).
- the exclusive features are intended to incentivize the user to interact with the advertisement.
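The proximity-based offer generation described above can be sketched as follows. This is a hypothetical illustration only: the function name, the 10% discount, and the 500-meter threshold are assumptions, not details taken from the specification.

```python
def generate_offer(item_listing, user_to_store_distance_m, max_distance_m=500):
    """Generate an offer for an item when the user is near a store that
    sells it; return None when no contextual basis for an offer exists."""
    if user_to_store_distance_m <= max_distance_m:
        # a discount to entice the user to stop by the store
        return {"item_id": item_listing["item_id"], "discount_pct": 10}
    return None
```

In a fuller sketch, the returned offer could also carry coupons, free shipping, or exclusive item features, per the implementations described above.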
- FIG. 6 is a flow diagram 600 illustrating communication between various entities, according to some example embodiments.
- user 602 initiates advertisement indication capture.
- the user 602 activates an app executing on a mobile device of the user to capture the advertisement indication.
- user device 604 captures the advertisement indication.
- the user device 604 captures the advertisement indication using a variety of techniques such as QR code scanning, RFID tag detection, NFC tag detection, smart tag detection, audio tag detection, user location mapping, and so on.
- the advertisement system 150 receives the advertisement indication at the operation 410 , identifies the advertisement at the operation 420 , determines the item listings at the operation 430 , receives the contextual data at the operation 440 , and causes presentation of the item listings at the operation 450 .
- the advertisement system 150 communicates the item listings to the user device 604 for presentation to the user, according to some implementations.
- the user device 604 presents the item listings to the user (e.g., a user interface that includes the item listings).
- the user 602 may receive (e.g., view) the presentation at the operation 612 and may select an option to make a purchase associated with the item listings at the operation 614 .
- the user interface that includes the item listings may be configured to receive a selection to make a purchase associated with the item listings.
- the user device 604 receives the selection to make a purchase associated with the item listings.
- the user device 604 communicates the selection to make the purchase to the advertisement system 150 (e.g., received at the communication module 220 ).
- the advertisement system 150 may facilitate the purchase associated with the item listings.
- the offer module 260 may perform the transaction for the purchase.
- FIG. 7 is a flow diagram illustrating further example operations for presenting item listings based on real-time contextual data, according to some example embodiments.
- the presentation module 210 causes presentation of the item listings based on the contextual data at the operation 450 .
- the condition module 250 accesses contextual conditions associated with the advertisement.
- the contextual conditions may include a location condition, a temporal condition, or other conditions.
- the condition module 250 determines satisfaction of the contextual conditions based on the contextual data.
- the condition module 250 evaluates some, or all, of the conditions included in the contextual conditions (e.g., the condition module 250 iterates through and evaluates each condition included in the contextual conditions). For example, if the contextual conditions include a location condition and a temporal condition, the condition module 250 may determine satisfaction of the contextual conditions if either or both the location condition and the temporal condition are satisfied.
- the satisfaction of the contextual conditions is determined based on a weighting of the satisfaction of the conditions included in the contextual conditions (e.g., the location condition may be associated with a higher weight and given more influence in the condition module 250 determining satisfaction of the contextual conditions).
- the weighting may be predetermined or dynamically determined (e.g., weighting based on feedback data or other engagement data such as the user or similar user showing interest in a particular item listing via tapping or clicking on the particular item listing).
- the condition module 250 may calculate a contextual condition metric based on the satisfaction of respective conditions included in the contextual conditions. In this implementation, the condition module 250 determines satisfaction of the contextual conditions when the contextual condition metric exceeds a threshold.
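The weighted evaluation and threshold comparison described above can be sketched as follows. This is a minimal illustration under stated assumptions: the representation of a condition as a (predicate, weight) pair, the 0.5 threshold, and all names are hypothetical, not drawn from the specification.

```python
def contextual_conditions_satisfied(conditions, contextual_data, threshold=0.5):
    """Compute a contextual condition metric as the weighted fraction of
    satisfied conditions; the contextual conditions are deemed satisfied
    when the metric exceeds the threshold."""
    total_weight = sum(weight for _, weight in conditions)
    if total_weight == 0:
        return False
    metric = sum(
        weight for predicate, weight in conditions if predicate(contextual_data)
    ) / total_weight
    return metric > threshold

# Example: a location condition weighted more heavily than a temporal one,
# so satisfying the location condition alone clears the threshold.
conditions = [
    (lambda ctx: ctx["distance_m"] < 100, 0.7),   # location condition
    (lambda ctx: ctx["minutes_ago"] < 15, 0.3),   # temporal condition
]
```

Predetermined weights are shown here; dynamically determined weights (e.g., adjusted from engagement data) would simply update the second element of each pair.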
- the presentation module 210 causes presentation of the item listings based on the satisfaction of the contextual conditions. For example, if the condition module 250 determines that the contextual conditions are satisfied, the presentation module 210 may then cause presentation of the item listings. Conversely if the condition module 250 determines the contextual conditions are not satisfied, the presentation module 210 does not cause presentation of the item listings.
- the condition module 250 monitors the contextual data to determine that the contextual conditions are satisfied after the presentation of the item listings. For instance, if the contextual conditions include the temporal condition, the condition module 250 may determine, after the item listings are presented to the user, that the temporal condition is not satisfied and restrict the presentation of the item listings or restrict features associated with the item listings (e.g., remove or disable an option to purchase the item listings). In this way, the condition module 250 may create an exclusive experience associated with the advertisement available to users who physically interact with the advertisement. The local or ephemeral nature of the presentation of the item listings created by employing the contextual conditions may have the effect of generating demand, interest, excitement, or a “buzz” regarding an advertising campaign and associated item listings.
- FIG. 8 is a flow diagram illustrating further operations for determining contextual conditions, according to some example embodiments.
- the condition module 250 may determine satisfaction of the contextual conditions at the operation 720 .
- the operation 720 includes additional operations as shown in FIG. 8 , such as the contextual conditions including the distance condition.
- the analysis module 230 identifies the advertisement location corresponding to the advertisement.
- the advertisement may have a fixed location (e.g., a poster affixed to a wall).
- An operator, advertiser, or another party may assign a predefined advertisement location to the advertisement (e.g., longitude, latitude, altitude coordinates for the poster location) and store the advertisement location in a storage device such as database(s) 126 .
- the analysis module 230 identifies the advertisement location by accessing the advertisement location based on the advertisement identifier (e.g., the advertisement location stored in a database according to the advertisement identifier).
- the analysis module 230 automatically determines the location of the advertisement based on the contextual data.
- the contextual data may indicate a location of a mobile device of the user (e.g., via a GPS component of the mobile device).
- the mobile device may further detect the advertisement using any of the short range communication techniques described above (e.g., QR code scanning, RFID detection).
- the analysis module 230 may infer that the location of the advertisement is in the vicinity of the mobile device when the mobile device detects the advertisement.
- the analysis module 230 stores (e.g., in a storage device such as database(s) 126 ) the automatically determined advertisement location to be used in subsequent analysis.
- the analysis module 230 may access a stored advertisement location corresponding to the advertisement for the user.
- the analysis module 230 stores the automatically determined advertisement location from many users and identifies the true advertisement location using statistical analysis (e.g., an average or standard deviation based analysis). Many other schemes and techniques may be employed by the analysis module 230 to automatically determine the advertisement location.
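One statistical technique of the kind mentioned above can be sketched as follows: average the advertisement locations reported across many users, discarding reports more than one standard deviation from the mean on either axis. The function name and the one-sigma cutoff are illustrative assumptions, not details from the specification.

```python
from statistics import mean, stdev

def estimate_ad_location(reported_lats, reported_lons):
    """Estimate the true advertisement location from per-user reports,
    dropping outliers beyond one standard deviation of the mean."""
    def filtered(values):
        if len(values) < 2:
            return values
        m, s = mean(values), stdev(values)
        # keep values within one standard deviation; fall back to all
        # values if everything would be discarded
        return [v for v in values if abs(v - m) <= s] or values
    return mean(filtered(reported_lats)), mean(filtered(reported_lons))
```

A production system would likely use more robust estimators (e.g., a median), but the structure of the analysis is the same.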
- the analysis module 230 extracts the user location from the contextual data.
- the contextual data may include location data received from a mobile device of the user operable to provide location as determined by a GPS component of the mobile device.
- the analysis module 230 may infer the location of the user based on the detection of the advertisement using a short range communication technique, similar to that discussed above in connection with the operation 810 .
- where the advertisement location is known (e.g., predefined by an operator, or the automatically determined advertisement location is stored from a different user), the analysis module 230 may infer that the user location is in the vicinity of the advertisement location based on the user detecting the advertisement using a short range communication technique.
- the condition module 250 determines satisfaction of the distance condition by determining that the user location is within a distance of the advertisement location.
- the distance may be predefined (e.g., specified by the advertiser) or dynamically determined.
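The distance condition above can be sketched as a great-circle distance check between the user location and the advertisement location. The haversine formula is one common way to compute that distance; the formula choice, the function names, and the 100-meter default radius are assumptions for illustration, not requirements of the specification.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    EARTH_RADIUS_M = 6_371_000
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def distance_condition_satisfied(user_loc, ad_loc, radius_m=100):
    """The distance condition holds when the user is within the radius
    (predefined or dynamically determined) of the advertisement."""
    return haversine_m(*user_loc, *ad_loc) <= radius_m
```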
- FIG. 9 is a flow diagram illustrating further operations for determining contextual conditions, according to some example embodiments.
- the condition module 250 may determine satisfaction of the contextual conditions at the operation 720 .
- the operation 720 includes additional operations as shown in FIG. 9 , such as the contextual conditions including the temporal condition.
- the analysis module 230 extracts a presentation time from the real-time contextual data or the advertisement indication.
- the presentation time is intended to include a time when the user is being presented the advertisement.
- the advertisement indication may include a time stamp of when the user device detected the advertisement (e.g., when the QR code embedded in the advertisement was scanned).
- the condition module 250 determines satisfaction of the temporal condition by determining that the presentation time is within a time period (e.g., 15 minutes or one week).
- the time period may be predefined or dynamically determined (e.g., a time period based on the length of time the user viewed the advertisement as determined by the contextual data).
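The temporal condition above can be sketched as a window check on the extracted presentation time stamp. The 15-minute default window and the use of UTC-aware datetimes are illustrative assumptions.

```python
from datetime import datetime, timedelta, timezone

def temporal_condition_satisfied(presentation_time,
                                 window=timedelta(minutes=15),
                                 now=None):
    """The temporal condition holds when the presentation time falls
    within the (predefined or dynamically determined) time period."""
    now = now or datetime.now(timezone.utc)
    # reject time stamps from the future as well as stale ones
    return timedelta(0) <= now - presentation_time <= window
```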
- FIGS. 8 and 9 illustrate the contextual conditions including the distance condition and the temporal condition
- an environmental condition may be implemented based on environmental data included in the contextual data.
- the condition module 250 may implement conditions based on ambient noise data, ambient illumination data, or other environmental data corresponding to the physical context of the presentation of the advertisement. For example, if the ambient noise data indicates the physical context of the presentation of the advertisement is noisy (e.g., audio decibel level exceeding a threshold), the presentation module 210 may omit an audio component of the presentation of the item listings as the user is unlikely to receive an audio presentation under such conditions.
- the condition module 250 may implement conditions based on biometric data corresponding to the user being presented the advertisement. For instance, heart rate data included in the contextual data may indicate the user is jogging or performing some other vigorous physical activity. The condition module 250 may target physically active users by implementing a biometric condition based on, for example, the heart rate exceeding a threshold.
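The two examples above can be combined into a single sketch: suppress the audio component of the presentation when the environment is noisy, and flag physically active users via a heart-rate threshold. All field names and threshold values are assumptions chosen for illustration.

```python
NOISE_THRESHOLD_DB = 70       # assumed ambient-noise cutoff
ACTIVE_HEART_RATE_BPM = 120   # assumed "vigorous activity" cutoff

def presentation_options(contextual_data):
    """Derive presentation options from environmental and biometric data."""
    opts = {"audio": True, "target_active_user": False}
    if contextual_data.get("ambient_noise_db", 0) > NOISE_THRESHOLD_DB:
        # user unlikely to receive an audio presentation in a noisy context
        opts["audio"] = False
    if contextual_data.get("heart_rate_bpm", 0) > ACTIVE_HEART_RATE_BPM:
        # biometric condition satisfied: target physically active users
        opts["target_active_user"] = True
    return opts
```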
- FIGS. 10-13 depict example user interfaces for interactively presenting the item listings to the user.
- FIGS. 10-13 depict specific example user interfaces and user interface elements, these are merely non-limiting examples and many other alternate user interfaces and user interface elements may be generated by the presentation module 210 and presented to the user. It will be noted that alternate presentations of the displays of FIGS. 10-13 may include additional information, graphics, options, and so forth; other presentations may include less information, or may provide abridged information for easy use by the user.
- FIG. 10 depicts an example device 1000 (e.g., a smart phone) displaying an example user interface 1010 that includes user interface element 1020 and item listings 1030 , according to some example embodiments.
- the user interface element 1020 provides an option to sort the item listings 1030 , or otherwise navigate the item listings, according to various schemes such as sorting based on recentness (e.g., based on temporal information corresponding to respective item listings), item price, distance from the user, relevance, or other metrics.
- the item listings 1030 include various portions of item information such as an item image, price, merchant, brand, other information retrieved from the publication system(s) 142 , and the like.
- activating a particular item listing presents additional information corresponding to the particular item listing.
- the item listings 1030 may include item listings associated with a celebrity depicted in the advertisement (e.g., a celebrity endorsement for a basket of items).
- FIG. 11 depicts an example device 1100 (e.g., a smart phone) displaying an example user interface 1110 that includes an item listing 1120 , user interface element 1130 , and user interface element 1140 , according to some example embodiments.
- the item listing 1120 includes various portions of item information such as an item image, price, merchant, brand, other information retrieved from the publication system(s) 142 , and the like.
- activating the item listing 1120 presents additional information corresponding to the item listing 1120 .
- activating the user interface element 1130 provides the user the option to purchase the item corresponding to the item listing (e.g., activating the user interface element 1130 may facilitate a transaction for the item, for example, using the payment system(s) 144 ).
- user interface element 1140 includes a map with locations of the item listings or merchants that sell the item corresponding to the item listing 1120 .
- a current user location 1150 is determined (e.g., via a GPS component of a mobile device of the user) and used to determine nearby merchants, such as merchant 1160 , that sell the item corresponding to the item listing.
- FIG. 12 depicts an example device 1200 (e.g., a smart watch) displaying an example user interface 1210 .
- the example user interface 1210 includes user interface element 1220 that may be associated with the identified advertisement (see the advertisement in connection with FIG. 3 ).
- the user interface 1210 includes user interface element 1230 that provides the user an option to interact with the user interface 1210 . For instance, activating the user interface element 1230 provides additional information associated with the advertisement.
- the item listings may include a group of particular item listings associated with a celebrity (e.g., a celebrity endorsement).
- FIG. 13 depicts an example device 1300 (e.g., smart phone) displaying an example user interface 1310 that includes a notification 1320 , according to some example embodiments.
- the presentation module 210 causes presentation of the notification 1320 to the user.
- the notification may be presented to the user in response to automatic detection of the advertisement (e.g., location mapping of the user indicates the user is near the advertisement or automatic detection of an NFC tag embedded in an advertisement).
- the presentation module 210 communicates, to the device 1300 , instructions to present the notification 1320 .
- the instructions include notification content, generated by the presentation module 210 , such as a message (e.g., pertinent information) to be presented to the user.
- the notification 1320 comprises a text message, such as a Short Message Service (SMS) message, a Multimedia Messaging Service (MMS) message, an Enhanced Messaging Service (EMS) message, and so forth.
- the notification 1320 comprises a push notification or another similar type of notification.
- the notification 1320 comprises interactive user interface elements such as user interface elements 1330 .
- the user interface elements 1330 provide the user an option to make a selection (e.g., through an SMS system, mobile application).
- Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules.
- a “hardware module” is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner.
- in various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
- a hardware module may be implemented mechanically, electronically, or any suitable combination thereof.
- a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations.
- a hardware module may be a special-purpose processor, such as a Field-Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC).
- a hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations.
- a hardware module may include software encompassed within a general-purpose processor or other programmable processor. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
- hardware module should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein.
- “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times. Software may accordingly configure a particular processor or processors, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
- Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
- processors may be temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein.
- processor-implemented module refers to a hardware module implemented using one or more processors.
- the methods described herein may be at least partially processor-implemented, with a particular processor or processors being an example of hardware.
- the operations of a method may be performed by one or more processors or processor-implemented modules.
- the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS).
- at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an Application Program Interface (API)).
- processors may be distributed among the processors, not only residing within a single machine, but deployed across a number of machines.
- the processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the processors or processor-implemented modules may be distributed across a number of geographic locations.
- FIG. 14 illustrates an example mobile device 1400 executing a mobile operating system (e.g., iOS™, Android™, Windows® Phone, or other mobile operating systems), according to example embodiments.
- the mobile device 1400 includes a touch screen operable to receive tactile data from a user 1402 .
- the user 1402 may physically touch 1404 the mobile device 1400 , and in response to the touch 1404 , the mobile device 1400 determines tactile data such as touch location, touch force, or gesture motion.
- the mobile device 1400 displays a home screen 1406 (e.g., Springboard on iOS™) operable to launch applications or otherwise manage various aspects of the mobile device 1400 .
- the home screen 1406 provides status information such as battery life, connectivity, or other hardware statuses.
- the user 1402 activates user interface elements by touching an area occupied by a respective user interface element. In this manner, the user 1402 may interact with the applications. For example, touching the area occupied by a particular icon included in the home screen 1406 causes launching of an application corresponding to the particular icon.
- apps may be executing on the mobile device 1400 such as native applications (e.g., applications programmed in Objective-C, Swift, or another suitable language running on iOS™ or applications programmed in Java running on Android™), mobile web applications (e.g., Hyper Text Markup Language-5 (HTML5)), or hybrid applications (e.g., a native shell application that launches an HTML5 session).
- the mobile device 1400 includes a messaging app 1420 , an audio recording app 1422 , a camera app 1424 , a book reader app 1426 , a media app 1428 , a fitness app 1430 , a file management app 1432 , a location app 1434 , a browser app 1436 , a settings app 1438 , a contacts app 1440 , a telephone call app 1442 , other apps (e.g., gaming apps, social networking apps, biometric monitoring apps), or a third party app 1444 .
- FIG. 15 is a block diagram 1500 illustrating an architecture of software 1502 , which may be installed on any one or more of the devices described above.
- FIG. 15 is merely a non-limiting example of a software architecture, and it will be appreciated that many other architectures may be implemented to facilitate the functionality described herein.
- the software 1502 may be implemented by hardware such as machine 1600 of FIG. 16 that includes processors 1610 , memory 1630 , and I/O components 1650 .
- the software 1502 may be conceptualized as a stack of layers where each layer may provide a particular functionality.
- the software 1502 includes layers such as an operating system 1504 , libraries 1506 , frameworks 1508 , and applications 1510 .
- the applications 1510 invoke application programming interface (API) calls 1512 through the software stack and receive messages 1514 in response to the API calls 1512 , according to some implementations.
- the operating system 1504 manages hardware resources and provides common services.
- the operating system 1504 includes, for example, a kernel 1520 , services 1522 , and drivers 1524 .
- the kernel 1520 acts as an abstraction layer between the hardware and the other software layers in some implementations.
- the kernel 1520 provides memory management, processor management (e.g., scheduling), component management, networking, security settings, among other functionality.
- the services 1522 may provide other common services for the other software layers.
- the drivers 1524 may be responsible for controlling or interfacing with the underlying hardware.
- the drivers 1524 may include display drivers, camera drivers, Bluetooth® drivers, flash memory drivers, serial communication drivers (e.g., Universal Serial Bus (USB) drivers), Wi-Fi® drivers, audio drivers, power management drivers, and so forth.
- the libraries 1506 provide a low-level common infrastructure that may be utilized by the applications 1510 .
- the libraries 1506 may include system 1530 libraries (e.g., C standard library) that may provide functions such as memory allocation functions, string manipulation functions, mathematic functions, and the like.
- the libraries 1506 may include API libraries 1532 such as media libraries (e.g., libraries to support presentation and manipulation of various media formats such as Moving Picture Experts Group-4 (MPEG4), Advanced Video Coding (H.264 or AVC), Moving Picture Experts Group Layer-3 (MP3), Advanced Audio Coding (AAC), Adaptive Multi-Rate (AMR) audio codec, Joint Photographic Experts Group (JPEG or JPG), Portable Network Graphics (PNG)), graphics libraries (e.g., an OpenGL framework used to render in two dimensions (2D) and three dimensions (3D) in a graphic content on a display), database libraries (e.g., SQLite to provide various relational database functions), web libraries (e.g., WebKit to provide web browsing functionality), and the like.
- the libraries 1506 may also include a wide variety of other libraries 1534 to provide many other APIs to the applications 1510 .
- the frameworks 1508 provide a high-level common infrastructure that may be utilized by the applications 1510 , according to some implementations.
- the frameworks 1508 provide various graphic user interface (GUI) functions, high-level resource management, high-level location services, and so forth.
- the frameworks 1508 may provide a broad spectrum of other APIs that may be utilized by the applications 1510 , some of which may be specific to a particular operating system or platform.
- the applications 1510 include a home application 1550 , a contacts application 1552 , a browser application 1554 , a book reader application 1556 , a location application 1558 , a media application 1560 , a messaging application 1562 , a game application 1564 , and a broad assortment of other applications such as third party application 1566 .
- the applications 1510 are programs that execute functions defined in the programs.
- Various programming languages may be employed to create one or more of the applications 1510 , structured in a variety of manners, such as object-oriented programming languages (e.g., Objective-C, Java, or C++) or procedural programming languages (e.g., C or assembly language).
- the third party application 1566 may be mobile software running on a mobile operating system such as iOS™, Android™, Windows® Phone, or other mobile operating systems.
- the third party application 1566 may invoke the API calls 1512 provided by the mobile operating system 1504 to facilitate functionality described herein.
- FIG. 16 is a block diagram illustrating components of a machine 1600 , according to some example embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein.
- FIG. 16 shows a diagrammatic representation of the machine 1600 in the example form of a computer system, within which instructions 1616 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 1600 to perform any one or more of the methodologies discussed herein may be executed.
- the machine 1600 operates as a standalone device or may be coupled (e.g., networked) to other machines.
- the machine 1600 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
- the machine 1600 may comprise, but not be limited to, a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a smart home device (e.g., a smart appliance), other smart devices, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 1616 , sequentially or otherwise, that specify actions to be taken by machine 1600 .
- the term “machine” shall also be taken to include a collection of machines that individually or jointly execute the instructions 1616 to perform any one or more of the methodologies discussed herein.
- the machine 1600 may include processors 1610 , memory 1630 , and I/O components 1650 , which may be configured to communicate with each other via a bus 1602 .
- the processors 1610 (e.g., a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Radio-Frequency Integrated Circuit (RFIC), another processor, or any suitable combination thereof) may include, for example, a processor 1612 and a processor 1614 that may execute the instructions 1616.
- processor is intended to include multi-core processors that may comprise two or more independent processors (also referred to as “cores”) that may execute instructions contemporaneously.
- although FIG. 16 shows multiple processors, the machine 1600 may include a single processor with a single core, a single processor with multiple cores (e.g., a multi-core processor), multiple processors with a single core, multiple processors with multiple cores, or any combination thereof.
- the memory 1630 may include a main memory 1632 , a static memory 1634 , and a storage unit 1636 accessible to the processors 1610 via the bus 1602 .
- the storage unit 1636 may include a machine-readable medium 1638 on which is stored the instructions 1616 embodying any one or more of the methodologies or functions described herein.
- the instructions 1616 may also reside, completely or at least partially, within the main memory 1632 , within the static memory 1634 , within at least one of the processors 1610 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by the machine 1600 . Accordingly, in various implementations, the main memory 1632 , static memory 1634 , and the processors 1610 are considered as machine-readable media 1638 .
- the term “memory” refers to a machine-readable medium 1638 able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 1638 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions 1616 .
- machine-readable medium shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 1616 ) for execution by a machine (e.g., machine 1600 ), such that the instructions, when executed by one or more processors of the machine 1600 (e.g., processors 1610 ), cause the machine 1600 to perform any one or more of the methodologies described herein.
- a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices.
- machine-readable medium shall accordingly be taken to include, but not be limited to, one or more data repositories in the form of a solid-state memory (e.g., flash memory), an optical medium, a magnetic medium, other non-volatile memory (e.g., Erasable Programmable Read-Only Memory (EPROM)), or any suitable combination thereof.
- the term “machine-readable medium” specifically excludes non-statutory signals per se.
- the I/O components 1650 include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. In general, it will be appreciated that the I/O components 1650 may include many other components that are not shown in FIG. 16 .
- the I/O components 1650 are grouped according to functionality merely for simplifying the following discussion and the grouping is in no way limiting. In various example embodiments, the I/O components 1650 include output components 1652 and input components 1654 .
- the output components 1652 include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor), other signal generators, and so forth.
- the input components 1654 include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), tactile input components (e.g., a physical button, a touch screen that provides location and force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.
- the I/O components 1650 include biometric components 1656 , motion components 1658 , environmental components 1660 , or position components 1662 among a wide array of other components.
- the biometric components 1656 include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like.
- the motion components 1658 include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth.
- the environmental components 1660 include, for example, illumination sensor components (e.g., photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensor components (e.g., machine olfaction detection sensors, gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment.
- the position components 1662 include location sensor components (e.g., a Global Positioning System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.
- the I/O components 1650 may include communication components 1664 operable to couple the machine 1600 to a network 1680 or devices 1670 via coupling 1682 and coupling 1672 , respectively.
- the communication components 1664 include a network interface component or another suitable device to interface with the network 1680 .
- communication components 1664 include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components to provide communication via other modalities.
- the devices 1670 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).
- the communication components 1664 detect identifiers or include components operable to detect identifiers.
- the communication components 1664 include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as the Universal Product Code (UPC) bar code; multi-dimensional bar codes such as the Quick Response (QR) code, Aztec code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, and Uniform Commercial Code Reduced Space Symbology (UCC RSS)-2D bar code; and other optical codes), acoustic detection components (e.g., microphones to identify tagged audio signals), or any suitable combination thereof.
- one or more portions of the network 1680 may be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), the Internet, a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a plain old telephone service (POTS) network, a cellular telephone network, a wireless network, a Wi-Fi® network, another type of network, or a combination of two or more such networks.
- the network 1680 or a portion of the network 1680 may include a wireless or cellular network and the coupling 1682 may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or other type of cellular or wireless coupling.
- the coupling 1682 may implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1xRTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, third Generation Partnership Project (3GPP) technology including 3G and fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), and other data transfer technologies.
- the instructions 1616 are transmitted or received over the network 1680 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 1664 ) and utilizing any one of a number of well-known transfer protocols (e.g., Hypertext Transfer Protocol (HTTP)).
- the instructions 1616 are transmitted or received using a transmission medium via the coupling 1672 (e.g., a peer-to-peer coupling) to devices 1670 .
- the term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions 1616 for execution by the machine 1600 , and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
- the machine-readable medium 1638 is non-transitory (in other words, not having any transitory signals) in that it does not embody a propagating signal.
- labeling the machine-readable medium 1638 as “non-transitory” should not be construed to mean that the medium is incapable of movement; the medium should be considered as being transportable from one physical location to another.
- since the machine-readable medium 1638 is tangible, the medium may be considered to be a machine-readable device.
- although the inventive subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to these embodiments without departing from the broader scope of embodiments of the present disclosure.
- the inventive subject matter may be referred to herein, individually or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single disclosure or inventive concept if more than one is, in fact, disclosed.
- the term “or” may be construed in either an inclusive or exclusive sense. Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance. Additionally, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within a scope of various embodiments of the present disclosure. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within a scope of embodiments of the present disclosure as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
Abstract
A system and method for contextually aware interactive advertisements are provided. In example embodiments, an advertisement indication that indicates a presentation of an advertisement to a user is received. The advertisement corresponding to the advertisement indication is identified. At least one item listing is determined based, at least in part, on the advertisement. Real-time contextual data corresponding to the advertisement indication is received. The real-time contextual data corresponds to a physical context of the presentation of the advertisement. Contextual conditions associated with the advertisement are accessed, and satisfaction of the contextual conditions is determined based on the real-time contextual data. Based on the determined satisfaction of the contextual conditions, presentation of the at least one item listing to the user is caused.
Description
- This application claims the priority benefit of U.S. Provisional Application No. 61/869,557, entitled “IMPROVED RETAIL EXPERIENCE,” filed Aug. 23, 2013, which is hereby incorporated by reference in its entirety.
- Embodiments of the present disclosure relate generally to mobile computing technology and, more particularly, but not by way of limitation, to contextual interactive advertisements.
- In recent years mobile devices, wearable devices, smart devices, and the like have pervaded nearly every aspect of modern life. Such devices often include sensors operable to physically detect identifiers such as Quick Response (QR) codes. In addition, the near ubiquity of wireless networks provides users with access to information virtually anywhere.
- Various ones of the appended drawings merely illustrate example embodiments of the present disclosure and cannot be considered as limiting its scope.
- FIG. 1A is a block diagram illustrating a networked system, according to some example embodiments.
- FIG. 1B illustrates a block diagram showing components provided within the system of FIG. 1A, according to some example embodiments.
- FIG. 2 is a block diagram illustrating an example embodiment of an advertisement system, according to some example embodiments.
- FIG. 3 is a depiction of an interactive advertisement, according to some example embodiments.
- FIG. 4 is a flow diagram illustrating an example method for identifying an advertisement and presenting item listings, according to some example embodiments.
- FIG. 5 is an illustration showing example types of sensors that provide various sensor data, according to some example embodiments.
- FIG. 6 is a flow diagram illustrating communication between various entities, according to some example embodiments.
- FIG. 7 is a flow diagram illustrating further example operations for presenting item listings based on real-time contextual data, according to some example embodiments.
- FIGS. 8 and 9 are flow diagrams illustrating further operations for determining contextual conditions, according to some example embodiments.
- FIGS. 10-13 illustrate example user interfaces, according to some example embodiments.
- FIG. 14 depicts an example mobile device and mobile operating system interface, according to some example embodiments.
- FIG. 15 is a block diagram illustrating an example of a software architecture that may be installed on a machine, according to some example embodiments.
- FIG. 16 illustrates a diagrammatic representation of a machine in the form of a computer system within which a set of instructions may be executed for causing the machine to perform any one or more of the methodologies discussed herein, according to an example embodiment.
- The headings provided herein are merely for convenience and do not necessarily affect the scope or meaning of the terms used.
- The description that follows includes systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments of the disclosure. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art, that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques are not necessarily shown in detail.
- Mobile devices provide a variety of data that is measured, captured, or otherwise obtained via sensors such as position sensors to capture position data (e.g., using a Global Positioning System (GPS) component), detection sensors to detect identifiers (e.g., an optical sensor to read QR codes), and the like. Such data may be utilized to augment, supplement, or otherwise enhance advertisements (also referred to as “ads”) or promotions. For example, contextually aware interactive advertisements may be realized using such data. In various implementations, exclusivity associated with a particular advertisement can be implemented using contextual data. A particular (in some cases exclusive) offer or deal corresponding to a particular advertisement may be available to users who are physically being presented the particular advertisement at a particular time and location. The element of exclusivity is intended, in some scenarios, to have the effect of generating excitement or a “buzz” regarding an advertising campaign.
- In various embodiments, an advertisement indication that indicates a presentation of an advertisement or promotion to a user is received. For instance, the user may scan or otherwise obtain an advertisement identifier (e.g., scanning a QR code that includes the advertisement identifier) corresponding to an advertisement using a user device (e.g., a smart phone equipped with an optical sensor to scan QR codes). In this instance, the advertisement indication comprises the advertisement identifier scanned by the user, although the advertisement indication can include other information such as location, time, or other contextual data. Subsequent to receiving the advertisement indication, the advertisement corresponding to the advertisement indication is identified. For example, if the advertisement indication includes the advertisement identifier, the advertisement can be identified via a lookup of the advertisement based on the advertisement identifier.
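The indication-and-lookup flow above can be sketched in Python. This is a minimal illustration only: the `AdIndication` and `Advertisement` structures, their field names, and the in-memory registry are hypothetical stand-ins for the system's advertisement storage, not details taken from the disclosure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Advertisement:
    ad_id: str
    name: str
    item_type: str

@dataclass
class AdIndication:
    # Identifier obtained by the user device, e.g., decoded from a scanned QR code.
    ad_id: str
    # Optional real-time contextual data captured alongside the scan.
    latitude: Optional[float] = None
    longitude: Optional[float] = None
    timestamp: Optional[float] = None

# Hypothetical in-memory registry standing in for the advertisement store.
AD_REGISTRY = {
    "AD-001": Advertisement("AD-001", "Summer Shoe Campaign", "running-shoes"),
}

def identify_advertisement(indication: AdIndication) -> Optional[Advertisement]:
    """Look up the advertisement referenced by an advertisement indication."""
    return AD_REGISTRY.get(indication.ad_id)
```

A scan of an unknown identifier simply yields no advertisement, which a real system would surface as an error response.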
- In embodiments, one or more item listings are determined based, at least in part, on the identified advertisement. In a specific example, a predefined set of items is accessed based on the advertisement (e.g., a lookup of the predefined set of items corresponding to the advertisement using the advertisement identifier) and one or more item listings corresponding to items included in the predefined set of items may be identified. In another example, the advertisement corresponds to an item type, and item listings associated with the item type are determined (e.g., identifying item listings on an e-commerce website that match the item type). Many other schemes and techniques may be employed to determine the item listing.
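Both listing-determination schemes described above (a predefined item set keyed by the advertisement, or a match on the advertisement's item type) can be sketched as follows; the catalog contents and identifiers are hypothetical examples.

```python
# Hypothetical listing catalog: (listing_id, item_type) pairs.
LISTINGS = [
    ("L1", "running-shoes"),
    ("L2", "running-shoes"),
    ("L3", "sunglasses"),
]

# Scheme 1: predefined item sets keyed by advertisement identifier.
PREDEFINED_SETS = {"AD-001": ["L1"]}

def listings_for_ad(ad_id: str, item_type: str) -> list:
    """Return listing ids for an advertisement: the predefined set when one
    exists, otherwise (scheme 2) all catalog listings matching the item type."""
    if ad_id in PREDEFINED_SETS:
        return PREDEFINED_SETS[ad_id]
    return [lid for lid, ltype in LISTINGS if ltype == item_type]
```

In practice the type match would be a query against the e-commerce site's listing index rather than a scan of an in-memory list.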
- In further embodiments, contextual data corresponding to the advertisement indication is received. In some cases, the contextual data comprises real-time contextual data. In various implementations, the contextual data corresponds to a physical context or physical environment of the presentation of the advertisement. For instance, the contextual data comprises location data (e.g., as determined by a GPS component of a mobile device of the user) that corresponds to a presentation location of the presentation of the advertisement to the user. Thus, based on the real-time contextual data, a location of where the user is viewing a particular advertisement may be ascertained in real-time.
- Presentation of the at least one item listing is caused based on the real-time contextual data. For example, if the user is viewing a particular advertisement at a particular location and time, the item listing is presented to the user. In some examples, the item listings are exclusively available to users that meet contextual conditions such as a location condition (e.g., a distance condition) and temporal condition. For example, if a user location is not within a distance of an advertisement location, the item listings may not be available to the user. In this manner, the user may interact with a contextually aware interactive advertisement.
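The location and temporal conditions above can be sketched with a standard haversine great-circle distance check on the GPS coordinates. The 50-meter default threshold and the flat time window are illustrative assumptions, not values taken from the disclosure.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def conditions_satisfied(user_lat, user_lon, user_time,
                         ad_lat, ad_lon, start, end,
                         max_distance_m=50.0):
    """True when the user is within max_distance_m of the advertisement
    location (location condition) and the scan time falls inside the
    campaign's window (temporal condition)."""
    close_enough = haversine_m(user_lat, user_lon, ad_lat, ad_lon) <= max_distance_m
    in_window = start <= user_time <= end
    return close_enough and in_window
```

Only when both conditions hold would the item listings be presented, which is what makes the offer exclusive to users physically in front of the advertisement at the right time.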
- With reference to
FIG. 1A , an example embodiment of a high-level client-server-basednetwork architecture 100 is shown. A networkedsystem 102 provides server-side functionality via a network 104 (e.g., the Internet or wide area network (WAN)) to aclient device 110. In some implementations, a user (e.g., user 106) interacts with thenetworked system 102 using theclient device 110.FIG. 1A illustrates, for example, a web client 112 (e.g., a browser, such as the Internet Explorer®) browser developed by Microsoft® Corporation of Redmond, Wash. State), client application(s) 114, and aprogrammatic client 116 executing on theclient device 110. Theclient device 110 may include theweb client 112, the client application(s) 114, and theprogrammatic client 116 alone, together, or in any suitable combination. AlthoughFIG. 1A shows oneclient device 110, in other implementations, thenetwork architecture 100 comprises multiple client devices. - In various implementations, the
client device 110 comprises a computing device that includes at least a display and communication capabilities that provide access to thenetworked system 102 via thenetwork 104. Theclient device 110 comprises, but is not limited to, a remote device, work station, computer, general purpose computer, Internet appliance, hand-held device, wireless device, portable device, wearable computer, cellular or mobile phone, Personal Digital Assistant (PDA), smart phone, tablet, ultrabook, netbook, laptop, desktop, multi-processor system, microprocessor-based or programmable consumer electronic, game consoles, set-top box, network Personal Computer (PC), mini-computer, and so forth. In an example embodiment, theclient device 110 comprises one or more of a touch screen, accelerometer, gyroscope, biometric sensor, camera, microphone, Global Positioning System (GPS) device, and the like. - The
client device 110 communicates with thenetwork 104 via a wired or wireless connection. For example, one or more portions of thenetwork 104 comprises an ad hoc network, an intranet, an extranet, a Virtual Private Network (VPN), a Local Area Network (LAN), a wireless LAN (WLAN), a Wide Area Network (WAN), a wireless WAN (WWAN), a Metropolitan Area Network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a Wireless Fidelity (Wi-Fi®) network, a Worldwide Interoperability for Microwave Access (WiMax) network, another type of network, or any suitable combination thereof. - In some example embodiments, the
client device 110 includes one or more of the applications (also referred to as “apps”) such as, but not limited to, web browsers, book reader apps (operable to read e-books), media apps (operable to present various media forms including audio and video), fitness apps, biometric monitoring apps, messaging apps, electronic mail (email) apps, and e-commerce site apps (also referred to as “marketplace apps”). In some implementations, the client application(s) 114 include various components operable to present information to the user and communicate withnetworked system 102. In some embodiments, if the e-commerce site application is included in theclient device 110, then this application may be configured to locally provide the user interface and at least some of the functionalities with the application configured to communicate with thenetworked system 102, on an as needed basis, for data or processing capabilities not locally available (e.g., access to a database of items available for sale, to authenticate a user, to verify a method of payment). Conversely, if the e-commerce site application is not included in theclient device 110, theclient device 110 can use its web browser to access the e-commerce site (or a variant thereof) hosted on thenetworked system 102. - In various example embodiments, the user (e.g., the user 106) comprises a person, a machine, or other means of interacting with the
client device 110. In some example embodiments, the user is not be part of thenetwork architecture 100, but interacts with thenetwork architecture 100 via theclient device 110 or another means. For instance, the user provides input (e.g., touch screen input or alphanumeric input) to theclient device 110 and the input is communicated to thenetworked system 102 via thenetwork 104. In this instance, thenetworked system 102, in response to receiving the input from the user, communicates information to theclient device 110 via thenetwork 104 to be presented to the users. In this way, the user can interact with thenetworked system 102 using theclient device 110. - An Application Program Interface (API)
server 120 and aweb server 122 may be coupled to, and provide programmatic and web interfaces respectively to, one or more application server(s) 140. The application server(s) 140 may host one or more publication system(s) 142, payment system(s) 144, and anadvertisement system 150, each of which may comprise one or more modules or applications and each of which may be embodied as hardware, software, firmware, or any combination thereof. The application server(s) 140 are, in turn, shown to be coupled to one or more database server(s) 124 that facilitate access to one or more information storage repositories or database(s) 126. In an example embodiment, the database(s) 126 are storage devices that store information to be posted (e.g., publications or listings) to the publication system(s) 142. The database(s) 126 may also store digital goods information in accordance with some example embodiments. - Additionally, a
third party application 132, executing on athird party server 130, is shown as having programmatic access to thenetworked system 102 via the programmatic interface provided by theAPI server 120. For example, thethird party application 132, utilizing information retrieved from thenetworked system 102, may support one or more features or functions on a website hosted by the third party. The third party website may, for example, provide one or more promotional, marketplace, or payment functions that are supported by the relevant applications of thenetworked system 102. - The publication system(s) 142 may provide a number of publication functions and services to the users that access the
networked system 102. The payment system(s) 144 may likewise provide a number of functions to perform or facilitate payments and transactions. While the publication system(s) 142 and payment system(s) 144 are shown in FIG. 1A to both form part of the networked system 102, it will be appreciated that, in alternative embodiments, each of the systems 142 and 144 may form part of a service that is separate and distinct from the networked system 102. In some example embodiments, the payment system(s) 144 may form part of the publication system(s) 142. - In some implementations, the
advertisement system 150 provides functionality to implement contextually aware interactive advertisements. As such, the advertisement system 150 receives an advertisement indication, identifies the advertisement corresponding to the advertisement indication, determines the item listings based on the advertisement, receives the contextual data, and causes the presentation of the item listings to the user based on the contextual data. In some example embodiments, the system 150 communicates with the client device 110, the third party server(s) 130, the publication system(s) 142 (e.g., retrieving item listings), and the payment system(s) 144 (e.g., purchasing a listing). In an alternative example embodiment, the advertisement system 150 is a part of the publication system(s) 142. The advertisement system 150 will be discussed further in connection with FIG. 2 below. - Further, while the client-server-based
network architecture 100 shown in FIG. 1A employs a client-server architecture, the present inventive subject matter is, of course, not limited to such an architecture, and may equally well find application in a distributed, or peer-to-peer, architecture system, for example. The various systems of the application server(s) 140 (e.g., the publication system(s) 142 and the payment system(s) 144) may also be implemented as standalone software programs, which do not necessarily have networking capabilities. - The
web client 112 may access the various systems of the networked system 102 (e.g., the publication system(s) 142) via the web interface supported by the web server 122. Similarly, the programmatic client 116 and client application(s) 114 may access the various services and functions provided by the networked system 102 via the programmatic interface provided by the API server 120. The programmatic client 116 may, for example, be a seller application (e.g., the Turbo Lister application developed by eBay® Inc., of San Jose, Calif.) to enable sellers to author and manage listings on the networked system 102 in an off-line manner, and to perform batch-mode communications between the programmatic client 116 and the networked system 102. -
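As a concrete illustration of the flow described for the advertisement system 150 above (receive an advertisement indication, identify the advertisement, determine the item listings, and cause presentation only when the contextual conditions hold), consider the following minimal Python sketch. The table, field names, and function are hypothetical stand-ins introduced for illustration; they are not part of the disclosed system.

```python
# Hypothetical in-memory advertisement table standing in for database(s) 126.
ADS = {
    "ad-123": {"brand": "AcmeCoffee",
               "listings": ["espresso machine", "travel mug"]},
}

def handle_indication(indication, contextual_data, conditions_met):
    """Sketch of the advertisement system 150 pipeline: identify the
    advertisement from its identifier, determine the associated item
    listings, and present them only if the contextual conditions are
    satisfied for the supplied contextual data."""
    ad = ADS.get(indication.get("advertisement_id"))
    if ad is None:
        return None            # unknown advertisement indication
    if not conditions_met(contextual_data):
        return []              # conditions unsatisfied: present nothing
    return ad["listings"]
```

Passing the contextual-condition test as a callable keeps the sketch agnostic about which conditions (distance, temporal, or otherwise) a given advertisement imposes. -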
FIG. 1B illustrates a block diagram showing components provided within the publication system(s) 142, according to some embodiments. In various example embodiments, the publication system(s) 142 may comprise a marketplace system to provide marketplace functionality (e.g., facilitating the purchase of items associated with item listings on an e-commerce website). The networked system 102 may be hosted on dedicated or shared server machines that are communicatively coupled to enable communications between server machines. The components themselves are communicatively coupled (e.g., via appropriate interfaces) to each other and to various data sources, so as to allow information to be passed between the applications or so as to allow the applications to share and access common data. Furthermore, the components may access one or more database(s) 126 via the database server(s) 124. - The
networked system 102 may provide a number of publishing, listing, and price-setting mechanisms whereby a seller (also referred to as a “first user”) may list (or publish information concerning) goods or services for sale or barter, a buyer (also referred to as a “second user”) can express interest in or indicate a desire to purchase or barter such goods or services, and a transaction (such as a trade) may be completed pertaining to the goods or services. To this end, the networked system 102 may comprise a publication engine 160 and a selling engine 162. The publication engine 160 may publish information, such as item listings or product description pages, on the networked system 102. In some embodiments, the selling engine 162 may comprise one or more fixed-price engines that support fixed-price listing and price-setting mechanisms and one or more auction engines that support auction-format listing and price-setting mechanisms (e.g., English, Dutch, Chinese, Double, Reverse auctions, etc.). The various auction engines may also provide a number of features in support of these auction-format listings, such as a reserve price feature whereby a seller may specify a reserve price in connection with a listing and a proxy-bidding feature whereby a bidder may invoke automated proxy bidding. The selling engine 162 may further comprise one or more deal engines that support merchant-generated offers for products and services. - A
listing engine 164 allows sellers to conveniently author listings of items, or allows authors to author publications. In one embodiment, the listings pertain to goods or services that a user (e.g., a seller) wishes to transact via the networked system 102. In some embodiments, the listings may be an offer, deal, coupon, or discount for the good or service. Each good or service is associated with a particular category. The listing engine 164 may receive listing data such as title, description, and aspect name/value pairs. Furthermore, each listing for a good or service may be assigned an item identifier. In other embodiments, a user may create a listing that is an advertisement or other form of information publication. The listing information may then be stored to one or more storage devices coupled to the networked system 102 (e.g., database(s) 126). Listings also may comprise product description pages that display a product and information (e.g., product title, specifications, and reviews) associated with the product. In some embodiments, the product description page may include an aggregation of item listings that correspond to the product described on the product description page. - The
listing engine 164 also may allow buyers to conveniently author listings of, or requests for, items desired to be purchased. In some embodiments, the listings may pertain to goods or services that a user (e.g., a buyer) wishes to transact via the networked system 102. Each good or service is associated with a particular category. The listing engine 164 may receive as much or as little listing data, such as title, description, and aspect name/value pairs, as the buyer is aware of about the requested item. In some embodiments, the listing engine 164 may parse the buyer's submitted item information and may complete incomplete portions of the listing. For example, if the buyer provides a brief description of a requested item, the listing engine 164 may parse the description, extract key terms, and use those terms to make a determination of the identity of the item. Using the determined item identity, the listing engine 164 may retrieve additional item details for inclusion in the buyer item request. In some embodiments, the listing engine 164 may assign an item identifier to each listing for a good or service. - In some embodiments, the
listing engine 164 allows sellers to generate offers for discounts on products or services. The listing engine 164 may receive listing data, such as the product or service being offered, a price or discount for the product or service, a time period for which the offer is valid, and so forth. In some embodiments, the listing engine 164 permits sellers to generate offers from the sellers' mobile devices. The generated offers may be uploaded to the networked system 102 for storage and tracking. - Searching the
networked system 102 is facilitated by a searching engine 166. For example, the searching engine 166 enables keyword queries of listings published via the networked system 102. In example embodiments, the searching engine 166 receives the keyword queries from a device of a user and conducts a review of the storage device storing the listing information. The review enables compilation of a result set of listings that may be sorted and returned to the client device 110 of the user. The searching engine 166 may record the query (e.g., keywords) and any subsequent user actions and behaviors (e.g., navigations, selections, or click-throughs). - The searching
engine 166 also may perform a search based on a location of the user. A user may access the searching engine 166 via a mobile device and generate a search query. Using the search query and the user's location, the searching engine 166 may return relevant search results for products, services, offers, auctions, and so forth to the user. The searching engine 166 may present relevant search results both in list form and graphically on a map. Selection of a graphical indicator on the map may provide additional details regarding the selected search result. In some embodiments, the user may specify, as part of the search query, a radius or distance from the user's current location to limit search results. - In a further example, a
navigation engine 168 allows users to navigate through various categories, catalogs, or inventory data structures according to which listings may be classified within the networked system 102. For example, the navigation engine 168 allows a user to successively navigate down a category tree comprising a hierarchy of categories (e.g., the category tree structure) until a particular set of listings is reached. Various other navigation applications within the navigation engine 168 may be provided to supplement the searching and browsing applications. The navigation engine 168 may record the various user actions (e.g., clicks) performed by the user in order to navigate down the category tree. - In some example embodiments, a
personalization engine 170 provides functionality to personalize various aspects of user interactions with the networked system 102. For instance, the user can define, provide, or otherwise communicate personalization settings used by the personalization engine 170 to determine interactions with the networked system 102. In further example embodiments, the personalization engine 170 determines personalization settings automatically and personalizes interactions based on the automatically determined settings. For example, the personalization engine 170 determines a native language of the user and automatically presents information in the native language. -
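The proxy-bidding feature mentioned above in connection with the auction engines can be illustrated with a short sketch. The one-unit increment rule, the function name, and the return shape are assumptions made for illustration, not the patent's algorithm.

```python
def proxy_bid(leader_max, challenger_max, increment=1.00):
    """Automated proxy bidding: the engine bids on a bidder's behalf
    only as much as needed to lead, never exceeding that bidder's
    stated maximum. Returns (new price, winning side)."""
    if challenger_max <= leader_max:
        # The current leader keeps the lead at one increment above the
        # challenge, capped at the leader's own maximum.
        return min(leader_max, challenger_max + increment), "leader"
    # The challenger takes the lead at one increment above the old maximum.
    return min(challenger_max, leader_max + increment), "challenger"
```

For example, `proxy_bid(10.0, 8.0)` leaves the leader winning at 9.0, while `proxy_bid(10.0, 20.0)` hands the lead to the challenger at 11.0. -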
FIG. 2 is a block diagram of the advertisement system 150 that provides functionality to implement contextually aware interactive advertisements, according to some example embodiments. In an example embodiment, the advertisement system 150 includes a presentation module 210, a communication module 220, an analysis module 230, an item module 240, a condition module 250, and an offer module 260. All, or some, of the modules 210-260 of FIG. 2 communicate with each other either directly or indirectly, for example, via a network coupling, shared memory, and the like. It will be appreciated that each module of the modules 210-260 may be implemented as a single module, combined into other modules, further subdivided into multiple modules, or any suitable combination thereof. Other modules not pertinent to example embodiments may also be included, but are not shown. - The
presentation module 210 provides various presentation and user interface functionality operable to interactively present information to, and receive information from, the user. For instance, the presentation module 210 can cause presentation of the determined item listings to the user. In various implementations, the presentation module 210 presents or causes presentation of information (e.g., visually displaying information on a screen, acoustic output, haptic feedback). Interactively presenting is intended to include the exchange of information between a particular device and the user. The user may provide input to interact with the user interface in many possible manners, such as alphanumeric input, point-based input (e.g., cursor), tactile input, or other input (e.g., touch screen, tactile sensor, light sensor, infrared sensor, biometric sensor, microphone, gyroscope, accelerometer, or other sensors), and the like. It will be appreciated that the presentation module 210 provides many other user interfaces to facilitate functionality described herein. Further, it will be appreciated that “presenting” as used herein is intended to include communicating information or instructions to a particular device that is operable to perform presentation based on the communicated information or instructions. - The
communication module 220 provides various communications functionality and web services. For example, the communication module 220 provides network communication such as communicating with the networked system 102, the client device 110, and the third party server(s) 130. In a specific example, the communication module 220 receives the advertisement indication from a user device (e.g., a smart phone) of the user. In another specific example, the communication module 220 receives contextual data corresponding to the advertisement indication. In some instances, the contextual data is real-time contextual data. In various example embodiments, the network communication may operate over wired or wireless modalities. Web services are intended to include retrieving information from the third party server(s) 130, the database(s) 126, and the application server(s) 140. In some implementations, information retrieved by the communication module 220 comprises data associated with the user (e.g., user profile information from an online account, social network service data associated with the user), data associated with one or more items listed on an e-commerce website (e.g., images of the item, reviews of the item, item price), or other data to facilitate the functionality described herein. - The
analysis module 230 provides a variety of analysis functions to facilitate the functionality described herein. For example, the analysis module 230 identifies the advertisement corresponding to the advertisement indication. More specifically, the analysis module 230 performs a lookup of the advertisement identifier included in the advertisement indication to identify the advertisement, according to some implementations. In some implementations, the analysis module 230 extracts information from the contextual data such as a user location, a presentation time, a user identity, and so on. - In some further implementations, the
analysis module 230 accesses user data corresponding to the user. For instance, the user data may include calendars (e.g., user calendar events such as birthdays, trips, exams), user profiles (e.g., demographic information such as age, gender, income level), purchase histories, browse histories (e.g., search terms), social media content (e.g., check-ins, posts, connections), other user data (e.g., bookmarked websites, preferences or settings for various applications, application usage data such as time spent using a particular application), and the like. In various example embodiments, the analysis module 230 accesses, retrieves, or otherwise obtains the user data from the database(s) 126, the third party server(s) 130, the publication system(s) 142, or elsewhere. - The
item module 240 provides functionality to determine the item listings, according to some implementations. In various implementations, the item listings correspond to items available for purchase, such as listings on an e-commerce website. The item module 240 may employ a variety of schemes and techniques to determine the item listings based on various data. In an embodiment, the item module 240 accesses a predefined set of item listings corresponding to the advertisement and determines one or more item listings among the set of item listings. The predefined set of item listings may be configured by an operator, advertiser, or another party associated with the advertisement. In another example, the item module 240 determines the item listings from an e-commerce website (e.g., eBay®) based on the advertisement (e.g., an item type or brand corresponding to the advertisement). - The
condition module 250 provides functionality to implement contextual conditions in conjunction with the advertisement, according to some embodiments. For instance, the condition module 250 accesses contextual conditions associated with the advertisement and evaluates satisfaction of the contextual conditions. For example, if the contextual conditions include a distance condition, the condition module 250 determines satisfaction of the distance condition based on the user location and the advertisement location. The contextual conditions are intended to create exclusivity in association with the advertisement. In other words, the interactive features of the advertisement may be available under specified conditions associated with the contextual data and otherwise unavailable to the user. - The
offer module 260 provides functionality to generate advertisement offers associated with the item listings, according to some embodiments. In an embodiment, the offer comprises a discount for a purchase associated with the item listing. In other embodiments, the offer module 260 provides advertisement features that are specific to the advertisement. The advertisement features comprise, for example, free shipping, faster shipping, otherwise unavailable item features (e.g., a color or style not widely available), exclusive items, and so forth. The offer module 260 may provide a variety of offers to the user, and in some cases, the offers may be based on various data such as the contextual data, user data, and so forth. - Referring now to
FIG. 3, a depiction 300 of an interactive advertisement is shown, according to some example embodiments. Scene 310 depicts an advertisement 320 that includes a tag 330 (e.g., a QR code embedded on or near the advertisement 320). In some implementations, the tag 330 is embedded in the advertisement 320, and in other implementations, the tag 330 is merely in the vicinity of the advertisement 320. In some implementations, a user device 350 detects the advertisement identifier encoded in the tag 330 via a signal 340. For instance, the signal 340 may be an optical signal captured, detected, or otherwise obtained by the user device 350, with the user device 350 being operable to decode the signal to extract the advertisement identifier. - In an embodiment, the
tag 330 comprises a QR code that is readable by an app executing on a mobile device of the user that includes a camera sensor. In other embodiments, the tag 330 comprises a Radio Frequency Identification (RFID) tag, a Near Field Communication (NFC) tag, a smart tag, or another storage device operable to store the advertisement identifier and communicate the advertisement identifier to the user device 350 (see FIG. 5 for additional sensors to detect identifiers). In still other embodiments, a tagless identification of the advertisement may be implemented by comparing a user location to respective advertisement locations corresponding to a plurality of advertisements and identifying a match between a particular advertisement location and the user location. - In some implementations, the
user device 350 is communicatively coupled, via coupling 360, to the network 104, which is in turn communicatively coupled to the networked system 102 including the advertisement system 150 (discussed above in connection with FIG. 1A). User 370 may initiate the identification of the advertisement 320 by operating the user device 350. For example, the user device 350 executes an app operable to obtain the advertisement identifier and presents a user interface that includes the item listings to the user. - In the
example depiction 300, the user 370 is carrying the user device 350 (e.g., a smart phone or smart watch) and may be interested in the advertisement 320. The user 370 initiates the identification of the advertisement 320 by detecting the tag 330 with the user device 350. The user device 350 may extract the advertisement identifier from the tag 330 (e.g., by scanning a QR code). Subsequently, the advertisement indication that includes the advertisement identifier corresponding to the tag 330 and the advertisement 320 is communicated from the user device 350 to the communication module 220 via the network 104. The analysis module 230 identifies the advertisement corresponding to the advertisement indication (e.g., a lookup of the advertisement based on the advertisement identifier). Once the analysis module 230 identifies the advertisement, the item module 240 determines the item listings based, at least in part, on the identified advertisement. For instance, the item module 240 may access a predefined set of item listings or dynamically determine item listings corresponding to the advertisement. In some implementations, the item module 240 retrieves the item listings and associated item data from the publication system(s) 142. The item data may include item images, price, description, brand, and so forth. - In various embodiments, the
communication module 220 receives the contextual data corresponding to the advertisement indication from the user device 350. In some embodiments, the contextual data includes location data (e.g., as determined by a GPS component of the user device 350). In an embodiment, based on the contextual data, the presentation module 210 causes presentation of the item listings (e.g., by communicating the item listings and instructions to present the item listings to the user device 350). For example, the condition module 250 determines satisfaction of the contextual conditions and, based on the satisfaction of the contextual conditions, the presentation module 210 causes presentation of the item listings to the user. - In a specific example embodiment, the contextual conditions include the distance condition. In this example embodiment, the
analysis module 230 identifies the advertisement location corresponding to the advertisement (e.g., the advertisement location may be predefined and accessed by the analysis module 230) and extracts the user location from the contextual data (e.g., the contextual data includes GPS data from the user device 350). The condition module 250 determines satisfaction of the distance condition by determining that the user location is within a distance of the advertisement location. The distance can be predefined or dynamically determined based on the contextual data. In this example, if the contextual conditions are satisfied, the presentation module 210 causes presentation of the item listings. Conversely, if the contextual conditions are not satisfied, the presentation module 210 does not cause presentation of the item listings. Put another way, in the context of this example embodiment, if the user interacts with the advertisement from another location that is outside of the distance of the distance condition (e.g., scanning the same or a similar QR code from a remote location), the user may not be able to view the item listings. In this way, the presentation of the item listings may be exclusive to users (e.g., the user 370) that are physically in the vicinity of the advertisement 320. Thus, in the depiction 300, the user 370 may interact with the advertisement 320 in a contextually aware manner. -
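The distance condition walked through above reduces to a gate on the distance between the user location and the advertisement location. Below is a minimal sketch assuming a haversine great-circle distance and an illustrative 0.1 km radius; both choices are assumptions, since the distance may be predefined or dynamically determined.

```python
import math

def haversine_km(p, q):
    """Great-circle distance in kilometres between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * math.asin(math.sqrt(a))

def listings_if_nearby(user_loc, ad_loc, listings, max_km=0.1):
    """Cause presentation of the item listings only when the user
    location satisfies the distance condition; otherwise present
    nothing, keeping the interactive features exclusive to users
    physically near the advertisement."""
    return listings if haversine_km(user_loc, ad_loc) <= max_km else []
```

A user scanning the same QR code from a remote location falls outside `max_km` and receives an empty result, matching the exclusivity behavior described above. -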
FIG. 4 is a flow diagram illustrating an example method 400 for identifying the advertisement and presenting the item listings, according to some example embodiments. The operations of the method 400 may be performed by components of the advertisement system 150. At operation 410, the communication module 220 may receive, from the user device, the advertisement indication that indicates the presentation of the advertisement or a promotion to the user. For example, the advertisement indication can comprise the advertisement identifier extracted from a QR code, an RFID tag, an NFC tag, a smart tag, an audio signal (e.g., audio tagging), or the contextual data (e.g., a location mapping). In some implementations, the analysis module 230 extracts the advertisement identifier from various suitable combinations of tags and the contextual data. In some example embodiments, the advertisement indication includes the contextual data corresponding to the physical context or physical environment of the presentation of the advertisement to the user. In other example embodiments, the contextual data is received, retrieved, or otherwise obtained as a separate operation, as discussed below in connection with operation 440. - In some implementations, the user initiates the advertisement identification (e.g., the user activates a user interface element on the user device to begin the advertisement identification). In other implementations, the advertisement identification initiates automatically via the
analysis module 230 monitoring, tracking, or otherwise automatically detecting the advertisement indication. In these implementations, the analysis module 230 monitors the contextual data, received by the communication module 220, for the advertisement indication. For instance, the contextual data may include location data pertaining to the user. In this instance, the analysis module 230 automatically detects the advertisement indication based on the location data (e.g., by mapping the user location against a plurality of advertisement locations). - In some embodiments, a QR code, an RFID tag, an NFC tag, a smart tag, or the like is embedded in the advertisement or in the vicinity of the advertisement. The user may initiate the advertisement identification by physically detecting a particular tag corresponding to the advertisement (e.g., physically moving a mobile device operable to detect RFID tags within a detection range of the RFID tag corresponding to the advertisement). In some implementations, the advertisement indication includes an advertisement identifier extracted from the tag (e.g., extracted by the user device).
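- The automatic, tagless detection just described (mapping the user location against a plurality of advertisement locations) might look like the following sketch. The equirectangular distance approximation, the 50 m threshold, and the dictionary schema are all assumptions chosen for illustration.

```python
import math

def approx_km(p, q):
    """Equirectangular distance approximation between (lat, lon)
    pairs; adequate over the short distances separating a viewer
    from an advertisement."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    return 6371.0 * math.hypot(x, lat2 - lat1)

def detect_advertisement(user_loc, ad_locations, threshold_km=0.05):
    """Return the identifier of the nearest advertisement within
    threshold_km of the user location, or None when no advertisement
    location matches."""
    if not ad_locations:
        return None
    dist, ad_id = min((approx_km(user_loc, loc), ad_id)
                      for ad_id, loc in ad_locations.items())
    return ad_id if dist <= threshold_km else None
```

Returning `None` on a miss leaves room for the tag-based identification paths described above to run instead.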
- In other embodiments, the advertisement may include an audio component (e.g., a television advertisement, a radio advertisement, a loudspeaker announcement). In this example, the
user device 350 or the analysis module 230 extracts the advertisement identifier using audio tag identification software. For example, an app executing on the user device 350, operable to detect and extract the advertisement identifier from an audio signal detected by the user device 350, communicates the advertisement indication including the advertisement identifier to the communication module 220. In an alternative example, the user device 350 communicates the advertisement indication including the audio signal to the communication module 220, and the analysis module 230 subsequently extracts the advertisement identifier from the audio signal. - In alternative embodiments, the
analysis module 230 extracts the advertisement indication from the contextual data. For example, the analysis module 230 extracts the user location from the contextual data, accesses location data corresponding to a plurality of advertisements, and identifies a match between the user location and the location data of a particular advertisement among the plurality of advertisements. In this way, the analysis module 230 identifies a particular advertisement being presented to the user based on the contextual data. - At
operation 420, the analysis module 230 identifies the advertisement corresponding to the advertisement indication. In an embodiment, the advertisement indication includes the advertisement identifier, and the analysis module 230 identifies the advertisement based on the advertisement identifier. For instance, the analysis module 230 performs a lookup of the advertisement using the advertisement identifier (e.g., a lookup table in a database, such as database(s) 126, indexed with advertisement identifiers). In another embodiment, the advertisement indication includes contextual data that the analysis module 230 uses to identify the advertisement. For instance, if the contextual data includes the user location, the analysis module 230 compares the user location to the advertisement locations (e.g., stored in a database such as database(s) 126). - At
operation 430, the item module 240 determines one or more item listings based, at least in part, on the advertisement. For example, an operator, advertiser, or another party associated with the advertisement may specify a predefined set of item listings for the advertisement. In this example, the item module 240 accesses the predefined set of item listings for the advertisement and determines a portion of the predefined set of item listings. For instance, the advertisement may depict a particular celebrity, and the predefined set of item listings may include item listings endorsed by the particular celebrity. In another implementation, the item module 240 dynamically determines the item listings based on the advertisement. For instance, the item module 240 identifies item listings from an e-commerce website that are associated with the advertisement (e.g., the same or a similar type of item or brand as promoted by the advertisement). - In further implementations, the
item module 240 determines the item listings based on the contextual data, user data, or other data. For example, the item module 240 determines the item listings by first identifying item listings on an e-commerce website associated with the advertisement and then refining the identified item listings based on the user data. In a specific example, if the user data indicates a gender of the user, various ones of the identified item listings may be excluded based on the gender (e.g., gender-specific apparel items). In a further example, the item module 240 may employ the contextual data to determine item listings pertinent to the user. For instance, the item module 240 may use the user location included in the contextual data as a basis for determining the item listings (e.g., based on the user being near a store that sells a particular item, include an item listing corresponding to the particular item). In another example, the contextual data may indicate weather conditions, such as a cold day, and the item module 240 may identify item listings based on the weather conditions (e.g., item listings associated with cold weather, such as hot coffee). - At
operation 440, the communication module 220 receives contextual data corresponding to the advertisement indication. In various implementations, the contextual data includes real-time contextual data. The term “real-time data,” as used herein, is intended to include data associated with an event currently happening. For example, the real-time data may include user input data or sensor data communicated to the communication module 220 after a delay interval (e.g., due to transmission delay or other delays, such as being temporarily stored at an intermediate device) between capturing the data and the communication module 220 receiving the data. - In further embodiments, the
communication module 220 stores the contextual data (e.g., in a storage device such as database(s) 126) in association with the user and the advertisement (e.g., in a database indexed by a user identifier or the advertisement identifier). In some implementations, the item module 240 determines the item listings based on the stored contextual data. For example, the stored contextual data may indicate that the user has previously initiated identification of a particular advertisement. Based on the indication of the previous presentation of the advertisement to the user, the item module 240 may determine different item listings than those previously presented to the user. - In some implementations, the real-time contextual data corresponds to the physical context of the presentation of the advertisement. The physical context includes the presentation location (e.g., where the advertisement is being presented to the user), a presentation time (e.g., the time the advertisement is being presented to the user), an ambient noise level (e.g., a decibel level corresponding to a noise level of the advertisement presentation), an ambient temperature, an ambient illumination level, biometric data associated with the user, and so on. In various implementations, the
communication module 220 receives the contextual data from sensors of the user device, as further discussed in connection with FIG. 5, below. - Referring now to
FIG. 5, example diagram 500 depicts non-limiting example sensor components 510 that may provide attribute data, according to some example embodiments. In example embodiments, the sensor components 510 include motion components 520, position components 530, environmental components 540, biometric components 550, detection components 560, and a wide gamut of other sensors, gauges, and measurement components not shown in FIG. 5. The sensor components 510, or a suitable combination thereof, may be included in any suitable device or machine of FIG. 1, such as the client device 110, to facilitate the functionality described herein. - The
sensor components 510 may receive, detect, measure, capture, or otherwise obtain sensor data associated with physical properties, attributes, or characteristics. The sensor components 510 may provide, produce, transmit, or otherwise communicate the sensor data or other indications associated with the physical properties, attributes, or characteristics (e.g., a sensor included in a device operable to communicate the sensor data to the networked system 102). In some implementations, a combination of devices may be employed to provide the sensor data (e.g., a first device that includes a sensor and is communicatively coupled to a second device that communicates sensor data received from the first device to the networked system 102). As a result, the sensor data provided by the sensor components 510 may be accessible to all, or some, of the modules described above on a real-time or near real-time basis. The sensor components 510 are grouped according to functionality merely to simplify the following discussion, and the grouping is in no way limiting. - The
motion components 520 include acceleration sensors (e.g., an accelerometer), gravitation sensors, rotation sensors (e.g., a gyroscope), and so forth. The motion components 520 may provide motion data such as velocity, acceleration, or other force measurements along the x, y, and z axes. In some implementations, the motion data is provided at a regular update rate or sampling rate (e.g., 10 updates per second) that may be configurable. - The
position components 530 include location sensors (e.g., a Global Positioning System (GPS) receiver component), altitude sensors (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensors (e.g., magnetometers that provide magnetic field strength along the x, y, and z axes), and the like. In an example embodiment, the position components 530 may provide position data such as latitude, longitude, altitude, and a time stamp. Similar to the motion components 520, the position components 530 may provide the position data at a regular update rate that may be configurable. - The
environmental components 540 include illumination sensors (e.g., a photometer), temperature sensors (e.g., one or more thermometers that detect ambient temperature), humidity sensors, pressure sensors (e.g., a barometer), acoustic sensors (e.g., one or more microphones that detect background noise), proximity sensors (e.g., an infrared sensor that detects nearby objects), gas sensors (e.g., machine olfaction detection sensors, or gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), and so on. The environmental components 540 may measure various physical parameters to provide an indication or signal corresponding to the physical environment surrounding the environmental components 540. - The
biometric components 550 include components to detect expressions, measure biosignals, or identify people, among other functions. For example, the biometric components 550 include expression components to detect expressions (also referred to as “kinesics”) such as hand gestures (e.g., an optical component to detect a hand gesture or a Doppler component to detect hand motions), vocal expressions (e.g., a microphone to detect changes in voice pitch that may indicate tension), facial expressions (e.g., a camera to detect expressions or micro-expressions of a person such as a smile), body gestures, and eye tracking (e.g., detecting the focal point of a person's eyes or patterns in eye movement). The biometric components 550 may also include, for example, biosignal components to measure biosignals such as blood pressure, heart rate, body temperature, perspiration, and brain waves (e.g., as determined by an electroencephalogram). In further examples, the biometric components 550 include identification components to identify people, such as retinal scanners (e.g., a camera component), vocal detectors (e.g., a microphone to receive audio data for voice identification), facial detectors, fingerprint detectors, and electroencephalogram sensors (e.g., to identify a person via unique brain wave patterns). - The
detection components 560 provide functionality to detect a variety of identifiers. For example, the detection components 560 include Radio Frequency Identification (RFID) tag reader components, Near Field Communication (NFC) smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as Universal Product Code (UPC) bar codes; multi-dimensional bar codes such as a Quick Response (QR) code, Aztec code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, or Uniform Commercial Code Reduced Space Symbology (UCC RSS)-2D bar code; and other optical codes), or acoustic detection components (e.g., microphones to identify tagged audio signals). In addition, a variety of information may be derived via various communication components, such as location via Internet Protocol (IP) geo-location, location via Wi-Fi® signal triangulation, location via detecting an NFC beacon signal that may indicate a particular location, and so forth. - Referring back to
FIG. 4, at operation 450, the presentation module 210 causes presentation of the item listings based on the real-time contextual data. For instance, the presentation module 210 communicates the item listings to the user device with instructions to cause presentation of the item listings to the user. In response to receiving the instructions to cause the presentation, the user device generates a user interface including the item listings and displays the user interface to the user, according to some implementations. In alternative implementations, the presentation module 210 generates the user interface including the item listings and communicates the generated user interface to the user device for presentation to the user. - In further embodiments, the
offer module 260 generates an advertisement offer associated with the item listings. Subsequently, the presentation module 210 provides the advertisement offer to the user (e.g., via a user interface configured to present the offer and receive an indication of a selection of the offer). In some instances, the offer module 260 generates the offer based on the contextual data. For example, if the contextual data indicates the user location is within a distance of a store that sells a particular item associated with the item listings, the offer module 260 may generate the offer for the particular item on that basis (e.g., a discount to entice the user to stop by the store). In various implementations, the offer comprises discounts, coupons, free shipping, or exclusive item features (e.g., an item style otherwise unavailable) associated with the item listings. - In still further embodiments, the
presentation module 210 or the offer module 260 may provide exclusive features (e.g., the exclusive features may be included in the presentation) associated with the item listings to the user. For example, the exclusive features may include various promotional techniques such as offering items otherwise not available or of limited availability (e.g., a book including an author autograph). In this embodiment, the exclusive features are intended to incentivize the user to interact with the advertisement. -
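The proximity-based offer generation described above can be sketched as follows. This is a minimal illustration under stated assumptions: the function name, the dictionary-based offer format, and the specific incentive values are hypothetical and are not taken from the disclosed embodiments.

```python
def generate_offer(listing, user_store_distance_km, nearby_km=1.0):
    """Generate a promotional offer for `listing` based on contextual data.

    When the user is within `nearby_km` of a store that sells the item,
    offer an in-store discount to entice the user to stop by the store;
    otherwise fall back to an online incentive such as free shipping."""
    if user_store_distance_km <= nearby_km:
        # In-person incentive for a nearby user.
        return {"listing": listing, "type": "discount", "value": "15% off in store"}
    # Remote incentive for the associated e-commerce listing.
    return {"listing": listing, "type": "free_shipping", "value": "free shipping"}
```

A caller holding the user location from the contextual data would compute the store distance first and then pass it in; exclusive features (e.g., a limited-availability item style) could be added as a further offer type.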
FIG. 6 is a flow diagram 600 illustrating communication between various entities, according to some example embodiments. At operation 606, user 602 initiates advertisement indication capture. For example, the user 602 activates an app executing on a mobile device of the user to capture the advertisement indication. At operation 608, user device 604 captures the advertisement indication. As discussed above, the user device 604 captures the advertisement indication using a variety of techniques such as QR code scanning, RFID tag detection, NFC tag detection, smart tag detection, audio tag detection, user location mapping, and so on. - As discussed above, the
advertisement system 150 receives the advertisement indication at the operation 410, identifies the advertisement at the operation 420, determines the item listings at the operation 430, receives the contextual data at the operation 440, and causes presentation of the item listings at the operation 450. At the operation 450, the advertisement system 150 communicates the item listings to the user device 604 for presentation to the user, according to some implementations. - At operation 610, the user device 604 presents the item listings to the user (e.g., via a user interface that includes the item listings). The user 602 may receive (e.g., view) the presentation at
operation 612 and may select an option to make a purchase associated with the item listings at the operation 614. For instance, the user interface that includes the item listings may be configured to receive a selection to make a purchase associated with the item listings. At operation 616, the user device 604 receives the selection to make a purchase associated with the item listings. The user device 604 communicates the selection to make the purchase to the advertisement system 150 (e.g., received at the communication module 220). At operation 618, the advertisement system 150 may facilitate the purchase associated with the item listings. For instance, the offer module 260 may perform the transaction for the purchase. -
FIG. 7 is a flow diagram illustrating further example operations for presenting item listings based on real-time contextual data, according to some example embodiments. Subsequent to the operation 440, the presentation module 210 causes presentation of the item listings based on the contextual data at the operation 450. In addition, at operation 710, the condition module 250 accesses contextual conditions associated with the advertisement. For example, the contextual conditions may include a location condition, a temporal condition, or other conditions. - At
operation 720, the condition module 250 determines satisfaction of the contextual conditions based on the contextual data. The condition module 250 evaluates some, or all, of the conditions included in the contextual conditions (e.g., the condition module 250 iterates through and evaluates each condition included in the contextual conditions). For example, if the contextual conditions include a location condition and a temporal condition, the condition module 250 may determine satisfaction of the contextual conditions if either or both of the location condition and the temporal condition are satisfied. - In some implementations, the satisfaction of the contextual conditions is determined based on a weighting of the satisfaction of the conditions included in the contextual conditions (e.g., the location condition may be associated with a higher weight and given more influence in the
condition module 250 determining satisfaction of the contextual conditions). The weighting may be predetermined or dynamically determined (e.g., weighting based on feedback data or other engagement data, such as the user or similar users showing interest in a particular item listing via tapping or clicking on the particular item listing). In an example implementation, the condition module 250 may calculate a contextual condition metric based on the satisfaction of respective conditions included in the contextual conditions. In this implementation, the condition module 250 determines satisfaction of the contextual conditions when the contextual condition metric exceeds a threshold. - At
operation 730, the presentation module 210 causes presentation of the item listings based on the satisfaction of the contextual conditions. For example, if the condition module 250 determines that the contextual conditions are satisfied, the presentation module 210 may then cause presentation of the item listings. Conversely, if the condition module 250 determines the contextual conditions are not satisfied, the presentation module 210 does not cause presentation of the item listings. - In further embodiments, the
condition module 250 monitors the contextual data to determine whether the contextual conditions remain satisfied after the presentation of the item listings. For instance, if the contextual conditions include the temporal condition, the condition module 250 may determine, after the item listings are presented to the user, that the temporal condition is no longer satisfied and restrict the presentation of the item listings or restrict features associated with the item listings (e.g., remove or disable an option to purchase the item listings). In this way, the condition module 250 may create an exclusive experience associated with the advertisement available to users who physically interact with the advertisement. The local or ephemeral nature of the presentation of the item listings created by employing the contextual conditions may have the effect of generating demand, interest, excitement, or a “buzz” regarding an advertising campaign and associated item listings. -
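The weighted contextual-condition metric described in connection with operation 720 might be sketched as follows; the function names, the default weight of 1.0 for unlisted conditions, and the example threshold are illustrative assumptions rather than the disclosed implementation.

```python
def contextual_condition_metric(results, weights=None):
    """Weighted fraction of satisfied conditions.

    results: mapping of condition name to boolean satisfaction,
             e.g. {"location": True, "temporal": False}
    weights: optional mapping of condition name to weight; a condition
             without a listed weight defaults to 1.0."""
    weights = weights or {}
    total = sum(weights.get(name, 1.0) for name in results)
    met = sum(weights.get(name, 1.0) for name, ok in results.items() if ok)
    return met / total if total else 0.0

def conditions_satisfied(results, weights=None, threshold=0.5):
    """Overall satisfaction when the weighted metric exceeds a threshold."""
    return contextual_condition_metric(results, weights) > threshold
```

Weighting the location condition more heavily than the temporal condition, as in the example above, lets a satisfied location condition alone push the metric over the threshold.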
FIG. 8 is a flow diagram illustrating further operations for determining contextual conditions, according to some example embodiments. Subsequent to the operation 710, the condition module 250 may determine satisfaction of the contextual conditions at the operation 720. In some implementations, the operation 720 includes additional operations as shown in FIG. 8, such as the contextual conditions including the distance condition. At operation 810, the analysis module 230 identifies the advertisement location corresponding to the advertisement. For instance, the advertisement may have a fixed location (e.g., a poster affixed to a wall). An operator, advertiser, or another party may assign a predefined advertisement location to the advertisement (e.g., longitude, latitude, and altitude coordinates for the poster location) and store the advertisement location in a storage device such as database(s) 126. For instance, the analysis module 230 identifies the advertisement location by accessing the advertisement location based on the advertisement identifier (e.g., the advertisement location stored in a database according to the advertisement identifier). - In some implementations, the
analysis module 230 automatically determines the location of the advertisement based on the contextual data. For example, the contextual data may indicate a location of a mobile device of the user (e.g., via a GPS component of the mobile device). The mobile device may further detect the advertisement using any of the short range communication techniques described above (e.g., QR code scanning, RFID detection). In this implementation, based on the location of the mobile device and the use of a short range detection technique, the analysis module 230 may infer that the location of the advertisement is in the vicinity of the mobile device when the mobile device detects the advertisement. In further implementations, the analysis module 230 stores (e.g., in a storage device such as database(s) 126) the automatically determined advertisement location to be used in subsequent analysis. Thus, if the user detects the advertisement using a device without location services, the analysis module 230 may access a stored advertisement location corresponding to the advertisement for the user. In some instances, the analysis module 230 stores the automatically determined advertisement location from many users and identifies the true advertisement location using statistical analysis (e.g., an average or standard deviation based analysis). Many other schemes and techniques may be employed by the analysis module 230 to automatically determine the advertisement location. - At
operation 820, the analysis module 230 extracts the user location from the contextual data. For example, the contextual data may include location data received from a mobile device of the user operable to provide location as determined by a GPS component of the mobile device. In some implementations, the analysis module 230 may infer the location of the user based on the detection of the advertisement using a short range communication technique, similar to that discussed above in connection with the operation 810. In this implementation, if the advertisement location is known (e.g., predefined by an operator, or the automatically determined advertisement location is stored from a different user), the analysis module 230 may infer the user location is in the vicinity of the advertisement location based on the user detecting the advertisement using a short range communication technique. - At
operation 830, the condition module 250 determines satisfaction of the distance condition by determining that the user location is within a distance of the advertisement location. The distance may be predefined (e.g., specified by the advertiser) or dynamically determined. -
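The distance condition of operation 830 can be sketched with a great-circle distance check between the user location and the advertisement location. The haversine formula, the 0.5 km default distance, and the function names here are illustrative assumptions, not details of the disclosed embodiments.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def distance_condition_met(user_loc, ad_loc, max_km=0.5):
    """True when the user location is within `max_km` of the advertisement location."""
    return haversine_km(user_loc[0], user_loc[1], ad_loc[0], ad_loc[1]) <= max_km
```

The `ad_loc` argument could come from a predefined advertisement location or from the crowd-sourced, statistically averaged location described in connection with operation 810.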
FIG. 9 is a flow diagram illustrating further operations for determining contextual conditions, according to some example embodiments. Subsequent to the operation 710, the condition module 250 may determine satisfaction of the contextual conditions at the operation 720. In some implementations, the operation 720 includes additional operations as shown in FIG. 9, such as the contextual conditions including the temporal condition. At operation 910, the analysis module 230 extracts a presentation time from the real-time contextual data or the advertisement indication. The presentation time is intended to include a time when the user is being presented the advertisement. For example, the advertisement indication may include a time stamp of when the user device detected the advertisement (e.g., when the QR code embedded in the advertisement was scanned). - At
operation 920, the condition module 250 determines satisfaction of the temporal condition by determining that the presentation time is within a time period (e.g., 15 minutes or one week). The time period may be predefined or dynamically determined (e.g., a time period based on the length of time the user viewed the advertisement as determined by the contextual data). - Although
FIGS. 8 and 9 illustrate the contextual conditions including the distance condition and the temporal condition, it will be appreciated that many other conditions may be included in the contextual conditions. For example, an environmental condition may be implemented based on environmental data included in the contextual data. For instance, the condition module 250 may implement conditions based on ambient noise data, ambient illumination data, or other environmental data corresponding to the physical context of the presentation of the advertisement. For example, if the ambient noise data indicates the physical context of the presentation of the advertisement is noisy (e.g., an audio decibel level exceeding a threshold), the presentation module 210 may omit an audio component of the presentation of the item listings, as the user is unlikely to receive an audio presentation under such conditions. In another instance, the condition module 250 may implement conditions based on biometric data corresponding to the user being presented the advertisement. For instance, heart rate data included in the contextual data may indicate the user is jogging or performing some other vigorous physical activity. The condition module 250 may target physically active users by implementing a biometric condition based on, for example, the heart rate exceeding a threshold. -
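The temporal condition of FIG. 9 and the environmental and biometric conditions described above can be sketched together. The 15-minute default window, the 70 dB noise threshold, and the 120 bpm heart-rate cutoff are illustrative assumptions, not values from the disclosed embodiments.

```python
from datetime import datetime, timedelta, timezone

def temporal_condition_met(presentation_time, window=timedelta(minutes=15), now=None):
    """True when the presentation time falls within `window` of the current
    time (operation 920). `now` is injectable for testing and defaults to
    the current UTC time."""
    now = now or datetime.now(timezone.utc)
    return timedelta(0) <= now - presentation_time <= window

def presentation_options(ambient_db, heart_rate_bpm=None,
                         noise_threshold_db=70.0, active_bpm=120):
    """Adapt the item-listing presentation to environmental and biometric context."""
    return {
        # Omit the audio component when the environment is too noisy to hear it.
        "include_audio": ambient_db < noise_threshold_db,
        # Flag physically active users (e.g., joggers) via a heart-rate condition.
        "target_active_user": heart_rate_bpm is not None and heart_rate_bpm >= active_bpm,
    }
```

Passing `window=timedelta(weeks=1)` would implement the one-week variant mentioned in connection with operation 920.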
FIGS. 10-13 depict example user interfaces for interactively presenting the item listings to the user. Although FIGS. 10-13 depict specific example user interfaces and user interface elements, these are merely non-limiting examples, and many other alternate user interfaces and user interface elements may be generated by the presentation module 210 and presented to the user. It will be noted that alternate presentations of the displays of FIGS. 10-13 may include additional information, graphics, options, and so forth; other presentations may include less information, or may provide abridged information for easy use by the user. -
FIG. 10 depicts an example device 1000 (e.g., a smart phone) displaying an example user interface 1010 that includes user interface element 1020 and item listings 1030, according to some example embodiments. In some implementations, the user interface element 1020 provides an option to sort the item listings 1030, or otherwise navigate the item listings, according to various schemes such as sorting based on recentness (e.g., based on temporal information corresponding to respective item listings), item price, distance from the user, relevance, or other metrics. In an example embodiment, the item listings 1030 include various portions of item information such as an item image, price, merchant, brand, other information retrieved from the publication system(s) 142, and the like. In some implementations, activating a particular item listing presents additional information corresponding to the particular item listing. In a specific example, the item listings 1030 may include item listings associated with a celebrity depicted in the advertisement (e.g., a celebrity endorsement for a basket of items). -
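The sorting schemes offered by user interface element 1020 could be sketched as follows; the listing field names and the planar distance proxy for “distance from the user” are illustrative assumptions, not part of the disclosed interface.

```python
def sort_listings(listings, scheme="recentness", user_loc=None):
    """Sort item listings according to a user-selected scheme.

    Each listing is assumed to be a dict with a sortable `listed_at`
    timestamp, a `price`, and store coordinates `store_lat`/`store_lon`."""
    keys = {
        "recentness": lambda l: l["listed_at"],
        "price": lambda l: l["price"],
        # Crude planar proxy for distance; a real system would use geodesic distance.
        "distance": lambda l: abs(l["store_lat"] - user_loc[0])
                              + abs(l["store_lon"] - user_loc[1]),
    }
    # Most recent listings first; cheapest or nearest listings first otherwise.
    return sorted(listings, key=keys[scheme], reverse=(scheme == "recentness"))
```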
FIG. 11 depicts an example device 1100 (e.g., a smart phone) displaying an example user interface 1110 that includes an item listing 1120, user interface element 1130, and user interface element 1140, according to some example embodiments. In an example embodiment, the item listing 1120 includes various portions of item information such as an item image, price, merchant, brand, other information retrieved from the publication system(s) 142, and the like. In some implementations, activating the item listing 1120 presents additional information corresponding to the item listing 1120. In an example embodiment, activating the user interface element 1130 provides the user the option to purchase the item corresponding to the item listing (e.g., activating the user interface element 1130 may facilitate a transaction for the item, for example, using the payment system(s) 144). In some example embodiments, user interface element 1140 includes a map with locations of the item listings or merchants that sell the item corresponding to the item listing 1120. In further example embodiments, a current user location 1150 is determined (e.g., via a GPS component of a mobile device of the user) and used to determine nearby merchants, such as merchant 1160, that sell the item corresponding to the item listing. -
FIG. 12 depicts an example device 1200 (e.g., a smart watch) displaying an example user interface 1210. The example user interface 1210 includes user interface element 1220 that may be associated with the identified advertisement (see the advertisement in connection with FIG. 3). The user interface 1210 also includes user interface element 1230 that provides the user an option to interact with the user interface 1210. For instance, activating the user interface element 1230 provides additional information associated with the advertisement. For example, the item listings may include a group of particular item listings associated with a celebrity (e.g., a celebrity endorsement). -
FIG. 13 depicts an example device 1300 (e.g., a smart phone) displaying an example user interface 1310 that includes a notification 1320, according to some example embodiments. In various example embodiments, the presentation module 210 causes presentation of the notification 1320 to the user. The notification may be presented to the user in response to automatic detection of the advertisement (e.g., location mapping of the user indicates the user is near the advertisement, or automatic detection of an NFC tag embedded in an advertisement). For instance, the presentation module 210 communicates, to the device 1300, instructions to present the notification 1320. In some instances, the instructions include notification content, generated by the presentation module 210, such as a message (e.g., pertinent information) to be presented to the user. In example embodiments, the notification 1320 comprises a text message, such as a Short Message Service (SMS) message, a Multimedia Messaging Service (MMS) message, an Enhanced Messaging Service (EMS) message, and so forth. In other example embodiments, the notification 1320 comprises a push notification or another similar type of notification. In further example embodiments, the notification 1320 comprises interactive user interface elements such as user interface elements 1330. In these example embodiments, the user interface elements 1330 provide the user an option to make a selection (e.g., through an SMS system or mobile application). - Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A “hardware module” is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner.
In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
- In some embodiments, a hardware module may be implemented mechanically, electronically, or any suitable combination thereof. For example, a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware module may be a special-purpose processor, such as a Field-Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC). A hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware module may include software encompassed within a general-purpose processor or other programmable processor. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
- Accordingly, the phrase “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times. Software may accordingly configure a particular processor or processors, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
- Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
- The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein. As used herein, “processor-implemented module” refers to a hardware module implemented using one or more processors.
- Similarly, the methods described herein may be at least partially processor-implemented, with a particular processor or processors being an example of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. Moreover, the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an Application Program Interface (API)).
- The performance of certain of the operations may be distributed among the processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the processors or processor-implemented modules may be distributed across a number of geographic locations.
-
FIG. 14 illustrates an example mobile device 1400 executing a mobile operating system (e.g., iOS™, Android™, Windows® Phone, or other mobile operating systems), according to example embodiments. In one embodiment, the mobile device 1400 includes a touch screen operable to receive tactile data from a user 1402. For instance, the user 1402 may physically touch 1404 the mobile device 1400, and in response to the touch 1404, the mobile device 1400 determines tactile data such as touch location, touch force, or gesture motion. In various example embodiments, the mobile device 1400 displays a home screen 1406 (e.g., Springboard on iOS™) operable to launch applications or otherwise manage various aspects of the mobile device 1400. In some example embodiments, the home screen 1406 provides status information such as battery life, connectivity, or other hardware statuses. In some implementations, the user 1402 activates user interface elements by touching an area occupied by a respective user interface element. In this manner, the user 1402 may interact with the applications. For example, touching the area occupied by a particular icon included in the home screen 1406 causes launching of an application corresponding to the particular icon. - Many varieties of applications (also referred to as “apps”) may be executing on the
mobile device 1400 such as native applications (e.g., applications programmed in Objective-C, Swift, or another suitable language running on iOS™ or applications programmed in Java running on Android™), mobile web applications (e.g., Hyper Text Markup Language-5 (HTML5)), or hybrid applications (e.g., a native shell application that launches an HTML5 session). For example, themobile device 1400 includes amessaging app 1420,audio recording app 1422, acamera app 1424, abook reader app 1426, amedia app 1428, afitness app 1430, afile management app 1432, alocation app 1434, abrowser app 1436, asettings app 1438, acontacts app 1440, atelephone call app 1442, or other apps (e.g., gaming apps, social networking apps, biometric monitoring apps), athird party app 1444. -
FIG. 15 is a block diagram 1500 illustrating an architecture of software 1502, which may be installed on any one or more of the devices described above. FIG. 15 is merely a non-limiting example of a software architecture, and it will be appreciated that many other architectures may be implemented to facilitate the functionality described herein. The software 1502 may be implemented by hardware such as machine 1600 of FIG. 16 that includes processors 1610, memory 1630, and I/O components 1650. In this example architecture, the software 1502 may be conceptualized as a stack of layers where each layer may provide a particular functionality. For example, the software 1502 includes layers such as an operating system 1504, libraries 1506, frameworks 1508, and applications 1510. Operationally, the applications 1510 invoke application programming interface (API) calls 1512 through the software stack and receive messages 1514 in response to the API calls 1512, according to some implementations. - In various implementations, the
operating system 1504 manages hardware resources and provides common services. The operating system 1504 includes, for example, a kernel 1520, services 1522, and drivers 1524. The kernel 1520 acts as an abstraction layer between the hardware and the other software layers in some implementations. For example, the kernel 1520 provides memory management, processor management (e.g., scheduling), component management, networking, and security settings, among other functionality. The services 1522 may provide other common services for the other software layers. The drivers 1524 may be responsible for controlling or interfacing with the underlying hardware. For instance, the drivers 1524 may include display drivers, camera drivers, Bluetooth® drivers, flash memory drivers, serial communication drivers (e.g., Universal Serial Bus (USB) drivers), Wi-Fi® drivers, audio drivers, power management drivers, and so forth. - In some implementations, the
libraries 1506 provide a low-level common infrastructure that may be utilized by the applications 1510. The libraries 1506 may include system libraries 1530 (e.g., C standard library) that may provide functions such as memory allocation functions, string manipulation functions, mathematical functions, and the like. In addition, the libraries 1506 may include API libraries 1532 such as media libraries (e.g., libraries to support presentation and manipulation of various media formats such as Moving Picture Experts Group-4 (MPEG4), Advanced Video Coding (H.264 or AVC), Moving Picture Experts Group Layer-3 (MP3), Advanced Audio Coding (AAC), Adaptive Multi-Rate (AMR) audio codec, Joint Photographic Experts Group (JPEG or JPG), Portable Network Graphics (PNG)), graphics libraries (e.g., an OpenGL framework used to render two-dimensional (2D) and three-dimensional (3D) graphic content on a display), database libraries (e.g., SQLite to provide various relational database functions), web libraries (e.g., WebKit to provide web browsing functionality), and the like. The libraries 1506 may also include a wide variety of other libraries 1534 to provide many other APIs to the applications 1510. - The
frameworks 1508 provide a high-level common infrastructure that may be utilized by the applications 1510, according to some implementations. For example, the frameworks 1508 provide various graphical user interface (GUI) functions, high-level resource management, high-level location services, and so forth. The frameworks 1508 may provide a broad spectrum of other APIs that may be utilized by the applications 1510, some of which may be specific to a particular operating system or platform. - In an example embodiment, the
applications 1510 include a home application 1550, a contacts application 1552, a browser application 1554, a book reader application 1556, a location application 1558, a media application 1560, a messaging application 1562, a game application 1564, and a broad assortment of other applications such as third party application 1566. According to some embodiments, the applications 1510 are programs that execute functions defined in the programs. Various programming languages may be employed to create one or more of the applications 1510, structured in a variety of manners, such as object-oriented programming languages (e.g., Objective-C, Java, or C++) or procedural programming languages (e.g., C or assembly language). In a specific example, the third party application 1566 (e.g., an application developed using the Android™ or iOS™ software development kit (SDK) by an entity other than the vendor of the particular platform) may be mobile software running on a mobile operating system such as iOS™, Android™, Windows® Phone, or other mobile operating systems. In this example, the third party application 1566 may invoke the API calls 1512 provided by the mobile operating system 1504 to facilitate functionality described herein. - Example Machine Architecture and Machine-Readable Medium
-
FIG. 16 is a block diagram illustrating components of a machine 1600, according to some example embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein. Specifically, FIG. 16 shows a diagrammatic representation of the machine 1600 in the example form of a computer system, within which instructions 1616 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 1600 to perform any one or more of the methodologies discussed herein may be executed. In alternative embodiments, the machine 1600 operates as a standalone device or may be coupled (e.g., networked) to other machines. In a networked deployment, the machine 1600 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine 1600 may comprise, but not be limited to, a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a smart home device (e.g., a smart appliance), other smart devices, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 1616, sequentially or otherwise, that specify actions to be taken by machine 1600. Further, while only a single machine 1600 is illustrated, the term "machine" shall also be taken to include a collection of machines 1600 that individually or jointly execute the instructions 1616 to perform any one or more of the methodologies discussed herein. - The
machine 1600 may include processors 1610, memory 1630, and I/O components 1650, which may be configured to communicate with each other via a bus 1602. In an example embodiment, the processors 1610 (e.g., a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Radio-Frequency Integrated Circuit (RFIC), another processor, or any suitable combination thereof) may include, for example, processor 1612 and processor 1614 that may execute instructions 1616. The term "processor" is intended to include multi-core processors that may comprise two or more independent processors (also referred to as "cores") that may execute instructions contemporaneously. Although FIG. 16 shows multiple processors, the machine 1600 may include a single processor with a single core, a single processor with multiple cores (e.g., a multi-core processor), multiple processors with a single core, multiple processors with multiple cores, or any combination thereof. - The
memory 1630 may include a main memory 1632, a static memory 1634, and a storage unit 1636 accessible to the processors 1610 via the bus 1602. The storage unit 1636 may include a machine-readable medium 1638 on which is stored the instructions 1616 embodying any one or more of the methodologies or functions described herein. The instructions 1616 may also reside, completely or at least partially, within the main memory 1632, within the static memory 1634, within at least one of the processors 1610 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by the machine 1600. Accordingly, in various implementations, the main memory 1632, the static memory 1634, and the processors 1610 are considered as machine-readable media 1638. - As used herein, the term "memory" refers to a machine-
readable medium 1638 able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 1638 is shown in an example embodiment to be a single medium, the term "machine-readable medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions 1616. The term "machine-readable medium" shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 1616) for execution by a machine (e.g., machine 1600), such that the instructions, when executed by one or more processors of the machine 1600 (e.g., processors 1610), cause the machine 1600 to perform any one or more of the methodologies described herein. Accordingly, a "machine-readable medium" refers to a single storage apparatus or device, as well as "cloud-based" storage systems or storage networks that include multiple storage apparatus or devices. The term "machine-readable medium" shall accordingly be taken to include, but not be limited to, one or more data repositories in the form of a solid-state memory (e.g., flash memory), an optical medium, a magnetic medium, other non-volatile memory (e.g., Erasable Programmable Read-Only Memory (EPROM)), or any suitable combination thereof. The term "machine-readable medium" specifically excludes non-statutory signals per se. - The I/
O components 1650 include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. In general, it will be appreciated that the I/O components 1650 may include many other components that are not shown in FIG. 16. The I/O components 1650 are grouped according to functionality merely to simplify the following discussion, and the grouping is in no way limiting. In various example embodiments, the I/O components 1650 include output components 1652 and input components 1654. The output components 1652 include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor), other signal generators, and so forth. The input components 1654 include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), tactile input components (e.g., a physical button, a touch screen that provides location and force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like. - In some further example embodiments, the I/
O components 1650 include biometric components 1656, motion components 1658, environmental components 1660, or position components 1662, among a wide array of other components. For example, the biometric components 1656 include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like. The motion components 1658 include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth. The environmental components 1660 include, for example, illumination sensor components (e.g., photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensors (e.g., machine olfaction detection sensors, gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment. The position components 1662 include location sensor components (e.g., a Global Positioning System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like. - Communication may be implemented using a wide variety of technologies. The I/
O components 1650 may include communication components 1664 operable to couple the machine 1600 to a network 1680 or devices 1670 via coupling 1682 and coupling 1672, respectively. For example, the communication components 1664 include a network interface component or another suitable device to interface with the network 1680. In further examples, the communication components 1664 include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components to provide communication via other modalities. The devices 1670 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)). - Moreover, in some implementations, the
communication components 1664 detect identifiers or include components operable to detect identifiers. For example, the communication components 1664 include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as the Universal Product Code (UPC) bar code, multi-dimensional bar codes such as a Quick Response (QR) code, Aztec code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, Uniform Commercial Code Reduced Space Symbology (UCC RSS)-2D bar code, and other optical codes), acoustic detection components (e.g., microphones to identify tagged audio signals), or any suitable combination thereof. In addition, a variety of information can be derived via the communication components 1664, such as location via Internet Protocol (IP) geo-location, location via Wi-Fi® signal triangulation, location via detecting an NFC beacon signal that may indicate a particular location, and so forth. - In various example embodiments, one or more portions of the
network 1680 may be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), the Internet, a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a plain old telephone service (POTS) network, a cellular telephone network, a wireless network, a Wi-Fi® network, another type of network, or a combination of two or more such networks. For example, the network 1680 or a portion of the network 1680 may include a wireless or cellular network, and the coupling 1682 may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or another type of cellular or wireless coupling. In this example, the coupling 1682 may implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1xRTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, Third Generation Partnership Project (3GPP) technologies including 3G and fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), the Long Term Evolution (LTE) standard, others defined by various standard-setting organizations, other long range protocols, or other data transfer technology. - In example embodiments, the
instructions 1616 are transmitted or received over the network 1680 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 1664) and utilizing any one of a number of well-known transfer protocols (e.g., Hypertext Transfer Protocol (HTTP)). Similarly, in other example embodiments, the instructions 1616 are transmitted or received using a transmission medium via the coupling 1672 (e.g., a peer-to-peer coupling) to devices 1670. The term "transmission medium" shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions 1616 for execution by the machine 1600, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software. - Furthermore, the machine-
readable medium 1638 is non-transitory (in other words, not having any transitory signals) in that it does not embody a propagating signal. However, labeling the machine-readable medium 1638 as “non-transitory” should not be construed to mean that the medium is incapable of movement; the medium should be considered as being transportable from one physical location to another. Additionally, since the machine-readable medium 1638 is tangible, the medium may be considered to be a machine-readable device. - Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
- Although an overview of the inventive subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to these embodiments without departing from the broader scope of embodiments of the present disclosure. Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single disclosure or inventive concept if more than one is, in fact, disclosed.
- The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
- As used herein, the term “or” may be construed in either an inclusive or exclusive sense. Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance. Additionally, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within a scope of various embodiments of the present disclosure. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within a scope of embodiments of the present disclosure as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
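Before turning to the claims, the overall flow the specification describes — receive an indication that an advertisement was presented, identify the advertisement, check its contextual conditions against real-time contextual data, and only then present item listings — can be sketched as follows. All identifiers, stores, and thresholds here are hypothetical illustrations, not the actual implementation:

```python
# Hypothetical in-memory advertisement store; a real system would use a database.
ADVERTISEMENTS = {
    "ad-123": {
        "item_listings": ["tablet case", "stylus"],
        # Temporal condition: the presentation hour must fall within this window.
        "conditions": {"min_hour": 8, "max_hour": 22},
    },
}

def handle_indication(ad_id, context):
    """Identify the advertisement and return its listings if the conditions hold."""
    ad = ADVERTISEMENTS.get(ad_id)
    if ad is None:
        return None  # unknown advertisement indication
    cond = ad["conditions"]
    if not (cond["min_hour"] <= context["hour"] <= cond["max_hour"]):
        return None  # contextual conditions not satisfied
    return ad["item_listings"]

print(handle_indication("ad-123", {"hour": 12}))  # → ['tablet case', 'stylus']
print(handle_indication("ad-123", {"hour": 3}))   # → None
```

In this sketch the presentation is "restricted based on the real-time contextual data" simply by returning no listings when a condition fails.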
Claims (20)
1. A system comprising:
a communication module to receive, from a portable device of a user, a promotion indication that indicates a presentation of a promotion to the user;
an analysis module to identify the promotion that corresponds to the promotion indication;
an item module, implemented by at least one processor of a machine, to determine at least one item listing based, at least in part, on the promotion;
the communication module further to receive, from the portable device of the user, real-time contextual data that corresponds to the promotion indication, the real-time contextual data corresponding to a physical environment of the presentation of the promotion;
a condition module to access contextual conditions associated with the promotion and determine satisfaction of the contextual conditions based on the real-time contextual data; and
based on the determined satisfaction of the contextual conditions, a presentation module to cause presentation of a user interface including the at least one item listing to the user.
2. The system of claim 1 , wherein:
the analysis module further to identify a promotion location that corresponds to the promotion and further to extract a user location from the real-time contextual data; and
the condition module further to determine satisfaction of a distance condition, included in the contextual conditions, by determining that the user location is within a distance of the promotion location, the distance being specified by the distance condition.
3. A method comprising:
receiving, from a user device, an advertisement indication that indicates a presentation of an advertisement to a user;
identifying the advertisement corresponding to the advertisement indication;
determining, using a hardware processor of a machine, at least one item listing based, at least in part, on the advertisement;
receiving, from the user device, real-time contextual data corresponding to the advertisement indication, the real-time contextual data corresponding to a physical context of the presentation of the advertisement;
accessing contextual conditions associated with the advertisement;
determining satisfaction of the contextual conditions based on the real-time contextual data; and
based on the determined satisfaction of the contextual conditions, causing presentation of the at least one item listing to the user, the presentation being exclusive based on the real-time contextual data.
4. The method of claim 3 , further comprising:
identifying an advertisement location corresponding to the advertisement;
extracting a user location from the real-time contextual data; and
determining satisfaction of a distance condition, included in the contextual conditions, by determining that the user location is within a distance of the advertisement location, the distance being specified by the distance condition.
5. The method of claim 3 , further comprising:
extracting a presentation time from the real-time contextual data; and
determining satisfaction of a temporal condition, included in the contextual conditions, by determining that the presentation time is within a time period specified by the temporal condition.
6. The method of claim 3 , further comprising:
extracting a user identity from the real-time contextual data;
accessing user data corresponding to the user based on the user identity; and
determining the at least one item listing based on the advertisement and the user data.
7. The method of claim 3 , further comprising:
generating an advertisement offer associated with the at least one item listing; and
providing the advertisement offer to the user.
8. The method of claim 7, wherein the generating of the advertisement offer is based, at least in part, on the real-time contextual data.
9. The method of claim 7 , wherein the advertisement offer comprises a discounted purchase associated with the at least one item listing.
10. The method of claim 3 , further comprising:
extracting the advertisement indication from the real-time contextual data by:
extracting a user location from the real-time contextual data;
accessing location data corresponding to a plurality of advertisements; and
identifying a match between the user location and the location data of a particular advertisement among the plurality of advertisements.
11. The method of claim 3 , further comprising:
storing the real-time contextual data and the advertisement indication in association with the user;
receiving a subsequent advertisement indication that indicates a presentation of the advertisement to the user;
identifying the advertisement corresponding to the subsequent advertisement indication; and
determining the at least one item listing based on the stored real-time contextual data and the advertisement.
12. The method of claim 3 , wherein the advertisement indication results from the user device physically detecting an advertisement identifier corresponding to the advertisement.
13. The method of claim 12, wherein the advertisement identifier is detected from at least one of a QR code, an RFID tag, an audio tag, or a smart tag.
14. A machine-readable medium having no transitory signals and storing instructions that, when executed by at least one processor of a machine, cause the machine to perform operations comprising:
receiving, from a user device, an advertisement indication that indicates a presentation of an advertisement to a user;
identifying the advertisement corresponding to the advertisement indication;
determining at least one item listing based, at least in part, on the advertisement;
receiving, from the user device, real-time contextual data corresponding to the advertisement indication, the real-time contextual data corresponding to a physical context of the presentation of the advertisement;
accessing contextual conditions associated with the advertisement;
determining satisfaction of the contextual conditions based on the real-time contextual data; and
based on the determined satisfaction of the contextual conditions, causing presentation of the at least one item listing to the user, the presentation being restricted based on the real-time contextual data.
15. The machine-readable medium of claim 14, wherein the operations further comprise:
identifying an advertisement location corresponding to the advertisement;
extracting a user location from the real-time contextual data; and
determining satisfaction of a distance condition, included in the contextual conditions, by determining that the user location is within a distance of the advertisement location, the distance being specified by the distance condition.
16. The machine-readable medium of claim 14, wherein the operations further comprise:
extracting a presentation time from the real-time contextual data; and
determining satisfaction of a temporal condition, included in the contextual conditions, by determining that the presentation time is within a time period specified by the temporal condition.
17. The machine-readable medium of claim 14, wherein the operations further comprise:
extracting a user identity from the real-time contextual data;
accessing user data corresponding to the user based on the user identity; and
determining the at least one item listing based on the advertisement and the user data.
18. The machine-readable medium of claim 14, wherein the operations further comprise:
generating an advertisement offer associated with the at least one item listing based on the advertisement; and
providing the advertisement offer to the user.
19. The machine-readable medium of claim 18, wherein the generating of the advertisement offer is based, at least in part, on the real-time contextual data.
20. The machine-readable medium of claim 18, wherein the advertisement offer comprises a discounted purchase associated with the at least one item listing.
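The distance and temporal conditions recited in claims 4-5 and 15-16 above can be sketched as follows (an illustration only: the haversine formula and the sample coordinates and thresholds are assumptions, not taken from the claims):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (latitude, longitude) points."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def distance_condition_met(user_loc, ad_loc, max_distance_m):
    # The user location must be within the distance specified by the condition.
    return haversine_m(*user_loc, *ad_loc) <= max_distance_m

def temporal_condition_met(presentation_hour, start_hour, end_hour):
    # The presentation time must fall within the specified time period.
    return start_hour <= presentation_hour <= end_hour

# A user roughly 15 m from the advertisement location, mid-afternoon.
user = (37.7750, -122.4195)
ad = (37.7749, -122.4194)
print(distance_condition_met(user, ad, 500))  # → True
print(temporal_condition_met(14, 9, 21))      # → True
```

A real implementation might instead use a geofencing service or coarser checks (e.g., IP geo-location, as the specification mentions), but the shape of the condition test is the same.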
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/465,786 US20150058123A1 (en) | 2013-08-23 | 2014-08-21 | Contextually aware interactive advertisements |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361869557P | 2013-08-23 | 2013-08-23 | |
US14/465,786 US20150058123A1 (en) | 2013-08-23 | 2014-08-21 | Contextually aware interactive advertisements |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150058123A1 (en) | 2015-02-26 |
Family
ID=52481220
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/465,710 Abandoned US20150058142A1 (en) | 2013-08-23 | 2014-08-21 | Store-integrated tablet |
US14/465,786 Abandoned US20150058123A1 (en) | 2013-08-23 | 2014-08-21 | Contextually aware interactive advertisements |
US14/466,857 Active 2035-12-06 US9842351B2 (en) | 2013-08-23 | 2014-08-22 | Generating product listings using locker sensors |
US14/466,801 Abandoned US20150058239A1 (en) | 2013-08-23 | 2014-08-22 | Item-based social discovery |
US15/837,675 Active 2035-06-15 US11188948B2 (en) | 2013-08-23 | 2017-12-11 | Generating product listings using locker sensor data and reconfiguring lockers based on product size |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/465,710 Abandoned US20150058142A1 (en) | 2013-08-23 | 2014-08-21 | Store-integrated tablet |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/466,857 Active 2035-12-06 US9842351B2 (en) | 2013-08-23 | 2014-08-22 | Generating product listings using locker sensors |
US14/466,801 Abandoned US20150058239A1 (en) | 2013-08-23 | 2014-08-22 | Item-based social discovery |
US15/837,675 Active 2035-06-15 US11188948B2 (en) | 2013-08-23 | 2017-12-11 | Generating product listings using locker sensor data and reconfiguring lockers based on product size |
Country Status (1)
Country | Link |
---|---|
US (5) | US20150058142A1 (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106021586A (en) * | 2016-06-06 | 2016-10-12 | 腾讯科技(北京)有限公司 | Information processing method and server |
WO2017074828A1 (en) * | 2015-10-30 | 2017-05-04 | Microsoft Technology Licensing, Llc | Communication interface for wearable devices |
KR20170089480A (en) * | 2016-01-27 | 2017-08-04 | 삼성전자주식회사 | Electronic apparatus and operating method thereof |
US9842351B2 (en) | 2013-08-23 | 2017-12-12 | Ebay Inc. | Generating product listings using locker sensors |
WO2018009550A1 (en) * | 2015-12-01 | 2018-01-11 | Ebay Inc. | Sensor based product recommendations |
US20180020963A1 (en) * | 2016-07-21 | 2018-01-25 | Comcast Cable Communications, Llc | Recommendations Based On Biometric Feedback From Wearable Device |
WO2019194964A1 (en) * | 2018-04-04 | 2019-10-10 | Ebay Inc. | User authentication in hybrid environments |
US10861061B2 (en) * | 2015-04-21 | 2020-12-08 | Facebook, Inc. | Messenger application plug-in for providing tailored advertisements within a conversation thread |
US10939143B2 (en) | 2019-03-26 | 2021-03-02 | Wipro Limited | System and method for dynamically creating and inserting immersive promotional content in a multimedia |
US10948988B1 (en) * | 2019-09-09 | 2021-03-16 | Tectus Corporation | Contextual awareness based on eye motion tracking by an eye-mounted system |
US20220141528A1 (en) * | 2020-11-04 | 2022-05-05 | Digital Turbine, Inc. | Cross-device interaction |
Families Citing this family (57)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9443298B2 (en) | 2012-03-02 | 2016-09-13 | Authentect, Inc. | Digital fingerprinting object authentication and anti-counterfeiting system |
US8774455B2 (en) | 2011-03-02 | 2014-07-08 | Raf Technology, Inc. | Document fingerprinting |
US10346852B2 (en) | 2016-02-19 | 2019-07-09 | Alitheon, Inc. | Preserving authentication under item change |
US10445682B2 (en) | 2013-02-01 | 2019-10-15 | United Parcel Service Of America, Inc. | Systems and methods for parcel delivery to alternate delivery locations |
US10521761B2 (en) | 2013-03-12 | 2019-12-31 | United Parcel Service Of America, Inc. | Systems and methods of delivering parcels using attended delivery/pickup locations |
EP3058530A4 (en) | 2013-10-14 | 2017-04-12 | United Parcel Service Of America, Inc. | Systems and methods for facilitating delivery of a parcel to a suitably sized locker |
US9639889B2 (en) * | 2013-12-23 | 2017-05-02 | Ebay Inc. | Discovery engine storefront |
US10528908B2 (en) | 2014-03-12 | 2020-01-07 | Ebay Inc. | Automatic location based discovery of extended inventory |
US11195230B2 (en) * | 2014-07-25 | 2021-12-07 | Clearingbid, Inc. | Systems including a hub platform, communication network and memory configured for processing data involving time-stamped/time-sensitive aspects and/or other features |
US9532188B1 (en) * | 2014-08-10 | 2016-12-27 | Google Inc. | Creating a group based on proximate detection |
CA2967064C (en) | 2014-11-14 | 2020-08-25 | United Parcel Service Of America, Inc. | Systems and methods for facilitating shipping of parcels for returning items |
US10410164B2 (en) | 2014-11-14 | 2019-09-10 | United Parcel Service Of America, Inc | Systems and methods for facilitating shipping of parcels |
US10210544B2 (en) * | 2014-12-17 | 2019-02-19 | Paypal, Inc. | Displaying merchandise with avatars |
US20160196584A1 (en) * | 2015-01-06 | 2016-07-07 | Facebook, Inc. | Techniques for context sensitive overlays |
US9743041B1 (en) * | 2015-01-22 | 2017-08-22 | Lawrence J. Owen | AskMe now system and method |
US20160292247A1 (en) * | 2015-03-31 | 2016-10-06 | Kenneth Scott Kaufman | Method of retrieving categorical data entries through an interactive graphical abstraction |
US10055707B2 (en) | 2015-04-07 | 2018-08-21 | Paypal, Inc. | Location detection devices for use in a courier services network |
US10083479B2 (en) * | 2015-06-04 | 2018-09-25 | Verizon Patent And Licensing Inc. | Systems and methods for product user interface development |
US20160371273A1 (en) * | 2015-06-18 | 2016-12-22 | WYMP, Inc. | System and method for searching for specific types of items based on peer ranking of quality |
GB2544871A (en) * | 2015-10-08 | 2017-05-31 | Rfid Innovation Res Ltd | Intelligent display system and method |
EP3174001A1 (en) * | 2015-11-27 | 2017-05-31 | Fujitsu Limited | Gaze tracking system and gaze tracking method |
US10867301B2 (en) | 2016-04-18 | 2020-12-15 | Alitheon, Inc. | Authentication-triggered processes |
US10740767B2 (en) | 2016-06-28 | 2020-08-11 | Alitheon, Inc. | Centralized databases storing digital fingerprints of objects for collaborative authentication |
US10915612B2 (en) | 2016-07-05 | 2021-02-09 | Alitheon, Inc. | Authenticated production |
EP3491534A4 (en) | 2016-07-29 | 2020-01-15 | Cardex Group Pty Ltd | Contact information exchanging and content system and method for networking and marketing |
US10902540B2 (en) | 2016-08-12 | 2021-01-26 | Alitheon, Inc. | Event-driven authentication of physical objects |
US10839528B2 (en) | 2016-08-19 | 2020-11-17 | Alitheon, Inc. | Authentication-based tracking |
US10600022B2 (en) | 2016-08-31 | 2020-03-24 | United Parcel Service Of America, Inc. | Systems and methods for synchronizing delivery of related parcels via a computerized locker bank |
MX2019003023A (en) * | 2016-09-16 | 2019-07-01 | Walmart Apollo Llc | Returned product detection. |
US11157886B2 (en) | 2017-02-03 | 2021-10-26 | Viatouch Media Inc. | Cantilevered weight sensitive shelf, rail, and mounting system |
US20190378088A1 (en) * | 2017-02-03 | 2019-12-12 | Viatouch Media Inc. | System and method of individualized merchandising in an automatic retail device |
US10521825B2 (en) * | 2017-03-15 | 2019-12-31 | Facebook, Inc. | Systems and methods for providing interactive user interface elements for obtaining feedback within a media content item |
US11062118B2 (en) | 2017-07-25 | 2021-07-13 | Alitheon, Inc. | Model-based digital fingerprinting |
US10635801B2 (en) | 2017-10-30 | 2020-04-28 | Walmart Apollo, Llc | Systems and methods for securing access to storage and retrieval systems |
CN107909741A (en) * | 2017-11-13 | 2018-04-13 | 北京小米移动软件有限公司 | The method and shared cabinet system of the shared cabinet of management |
US10242263B1 (en) | 2017-11-14 | 2019-03-26 | Wells Fargo Bank, N.A. | Virtual assistant of safe locker |
US10970549B1 (en) | 2017-11-14 | 2021-04-06 | Wells Fargo Bank, N.A. | Virtual assistant of safe locker |
EP3514715A1 (en) | 2018-01-22 | 2019-07-24 | Alitheon, Inc. | Secure digital fingerprint key object database |
USD905083S1 (en) * | 2018-10-23 | 2020-12-15 | Yoox Net-A-Porter Group Spa | Display screen with animated graphical user interface |
US11907894B2 (en) * | 2019-01-11 | 2024-02-20 | Fff Enterprises, Inc. | Storage devices and operation methods thereof |
US11263581B2 (en) * | 2019-01-11 | 2022-03-01 | Fff Enterprises, Inc. | Storage devices and operation methods thereof |
US10963670B2 (en) | 2019-02-06 | 2021-03-30 | Alitheon, Inc. | Object change detection and measurement using digital fingerprints |
CN110009828A (en) * | 2019-04-12 | 2019-07-12 | 武汉找学网科技有限公司 | A kind of locker based on Internet of Things, storing method and system |
EP3734506A1 (en) | 2019-05-02 | 2020-11-04 | Alitheon, Inc. | Automated authentication region localization and capture |
EP3736717A1 (en) | 2019-05-10 | 2020-11-11 | Alitheon, Inc. | Loop chain digital fingerprint method and system |
US11602232B2 (en) * | 2019-10-02 | 2023-03-14 | ACI Holdings Ltd. | System and method for generating clothing recommendations via a smart mirror |
US11238146B2 (en) | 2019-10-17 | 2022-02-01 | Alitheon, Inc. | Securing composite objects using digital fingerprints |
EP3859603A1 (en) | 2020-01-28 | 2021-08-04 | Alitheon, Inc. | Depth-based digital fingerprinting |
EP3885982A3 (en) | 2020-03-23 | 2021-12-22 | Alitheon, Inc. | Hand biometrics system and method using digital fingerprints |
US11568683B2 (en) | 2020-03-23 | 2023-01-31 | Alitheon, Inc. | Facial biometrics system and method using digital fingerprints |
EP3929806A3 (en) | 2020-04-06 | 2022-03-09 | Alitheon, Inc. | Local encoding of intrinsic authentication data |
US11663849B1 (en) | 2020-04-23 | 2023-05-30 | Alitheon, Inc. | Transform pyramiding for fingerprint matching system and method |
US11983957B2 (en) | 2020-05-28 | 2024-05-14 | Alitheon, Inc. | Irreversible digital fingerprints for preserving object security |
US11700123B2 (en) | 2020-06-17 | 2023-07-11 | Alitheon, Inc. | Asset-backed digital security tokens |
US11908262B2 (en) | 2021-11-18 | 2024-02-20 | Capital One Services, Llc | Token based secure access to a locker system |
US20250053995A1 (en) * | 2021-12-09 | 2025-02-13 | Synq Access + Security Technology Ltd. | Item collection, return, exchange without human interaction |
US20240087386A1 (en) * | 2022-09-13 | 2024-03-14 | Grubbrr Spv Llc | Systems and methods for management of dual sided delivery locker array |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070286358A1 (en) * | 2006-04-29 | 2007-12-13 | Msystems Ltd. | Digital audio recorder |
US20080266128A1 (en) * | 2007-04-27 | 2008-10-30 | Sensormatic Electronics Corporation | Handheld data capture system with power and safety monitor and method therefore |
US20090012865A1 (en) * | 2005-10-31 | 2009-01-08 | Yahoo! Inc. | Clickable map interface for product inventory |
US20090164132A1 (en) * | 2007-12-13 | 2009-06-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for comparing media content |
US20100169175A1 (en) * | 2006-10-30 | 2010-07-01 | Koran Joshua M | Optimization of Targeted Advertisements Based on User Profile Information |
US20120142322A1 (en) * | 2010-12-06 | 2012-06-07 | Echostar Technologies L.L.C. | Providing Location Information Using Matrix Code |
US20120223131A1 (en) * | 2011-03-03 | 2012-09-06 | Lim John W | Method and apparatus for dynamically presenting content in response to successive scans of a static code |
US20130041761A1 (en) * | 2011-04-07 | 2013-02-14 | Jeffrey Allen Voda | Location based advertising asset tracking system and method |
US20130124361A1 (en) * | 2010-07-08 | 2013-05-16 | Christopher Bryson | Consumer, retailer and supplier computing systems and methods |
US20130151343A1 (en) * | 2011-12-09 | 2013-06-13 | Samsung Electronics Co., Ltd. | Displaying mobile advertising based on determining user's physical activity from mobile device sensor data |
US20130229261A1 (en) * | 2012-03-01 | 2013-09-05 | Elwha Llc | Systems and methods for scanning a user environment and evaluating data of interest |
US20130325594A1 (en) * | 2012-06-05 | 2013-12-05 | Yahoo! Inc. | Sponsored applications |
US8782691B1 (en) * | 2002-01-15 | 2014-07-15 | The Directv Group, Inc. | Time shifted targeted advertisements based upon user profiles |
Family Cites Families (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5158155A (en) * | 1990-10-11 | 1992-10-27 | Vendorsgroup, Inc. | Vendors' structural complex |
US8346626B2 (en) | 1999-05-07 | 2013-01-01 | Robertson Steven C | System and method for providing electronic multi-merchant gift registry services over a distributed network |
TW446878B (en) | 1999-09-15 | 2001-07-21 | First Cube Pte Ltd | A method and system for facilitating delivery and pickup of goods |
US7257552B1 (en) * | 2000-03-27 | 2007-08-14 | Hector Franco | Consumer products distribution system |
US7698240B1 (en) * | 2000-05-15 | 2010-04-13 | I2 Technologies Us, Inc. | System and method for providing electronic financial transaction services |
US6886895B2 (en) * | 2000-05-24 | 2005-05-03 | Globix Corporation | Security locker for computer equipment |
US6546309B1 (en) * | 2000-06-29 | 2003-04-08 | Kinney & Lange, P.A. | Virtual fitting room |
US7653457B2 (en) * | 2001-03-16 | 2010-01-26 | Breakthrough Logistics Corporation | Method and system for efficient package delivery and storage |
US6694217B2 (en) * | 2001-05-24 | 2004-02-17 | Breakthrough Logistics Corporation | Automated system for efficient article storage and self-service retrieval |
US20020178074A1 (en) | 2001-05-24 | 2002-11-28 | Gregg Bloom | Method and apparatus for efficient package delivery and storage |
US7890375B2 (en) * | 2001-07-31 | 2011-02-15 | Half.Com, Inc. | Method and system to facilitate pre-ordering via an electronic commerce facility, and to automatically facilitate satisfying of a pre-order upon listing of an appropriate offer via the electronic commerce facility |
US7885901B2 (en) * | 2004-01-29 | 2011-02-08 | Yahoo! Inc. | Method and system for seeding online social network contacts |
US9805395B2 (en) * | 2012-01-19 | 2017-10-31 | Dizpersion Corporation | Online marketing system and method |
US7873549B1 (en) * | 2006-03-27 | 2011-01-18 | Amazon Technologies, Inc. | Product dimension correction |
US7853480B2 (en) | 2007-05-21 | 2010-12-14 | Amazon Technologies, Inc. | System and method for providing export services to merchants |
US20090024402A1 (en) * | 2007-07-20 | 2009-01-22 | Ebay Inc. | Search using multi-faceted reputation information |
US8542906B1 (en) * | 2008-05-21 | 2013-09-24 | Sprint Communications Company L.P. | Augmented reality image offset and overlay |
US8494909B2 (en) * | 2009-02-09 | 2013-07-23 | Datalogic ADC, Inc. | Automatic learning in a merchandise checkout system with visual recognition |
US20110238512A1 (en) * | 2010-03-26 | 2011-09-29 | Eric Hapaki Doty | Method and Apparatus for Showroom Sales |
WO2012048057A2 (en) | 2010-10-05 | 2012-04-12 | Centric Software, Inc. | Interactive collection book for mobile devices |
US20130046594A1 (en) | 2011-06-04 | 2013-02-21 | Box Office Live Television, LLC | Interactive advertising displays |
US10043219B2 (en) * | 2012-02-21 | 2018-08-07 | Neil Shivraj DAVEY | Robotically assisted banking automation and insurance system |
US8849721B2 (en) * | 2011-09-21 | 2014-09-30 | Facebook, Inc. | Structured objects and actions on a social networking system |
US20130103758A1 (en) * | 2011-10-19 | 2013-04-25 | c/o Facebook, Inc. | Filtering and ranking recommended users on a social networking system |
US20130110678A1 (en) * | 2011-11-02 | 2013-05-02 | Apple Inc. | Purchasing a product in a store using a mobile device |
US8666836B2 (en) | 2011-12-15 | 2014-03-04 | Facebook, Inc. | Targeting items to a user of a social networking system based on a predicted event for the user |
US8606645B1 (en) | 2012-02-02 | 2013-12-10 | SeeMore Interactive, Inc. | Method, medium, and system for an augmented reality retail application |
US20130212173A1 (en) * | 2012-02-13 | 2013-08-15 | Robert William Carthcart | Suggesting relationship modifications to users of a social networking system |
US20130262252A1 (en) * | 2012-03-29 | 2013-10-03 | Girish Lakshman | Pickup locations as a transfer point |
US9406084B2 (en) * | 2012-05-23 | 2016-08-02 | Specialty's Café & Bakery, Inc. | Methods for submitting a food order remotely |
US9898742B2 (en) | 2012-08-03 | 2018-02-20 | Ebay Inc. | Virtual dressing room |
US9462066B2 (en) | 2012-08-21 | 2016-10-04 | Facebook, Inc. | Social action by quick response (QR) code |
US20140114875A1 (en) * | 2012-10-23 | 2014-04-24 | Swapbox Inc. | Methods and systems for the secure sale of tangible goods |
US8972189B2 (en) | 2012-10-29 | 2015-03-03 | Ebay Inc. | Social mobile shopping system |
US20140122201A1 (en) | 2012-10-30 | 2014-05-01 | Aaron Johnson | Apparatus, method, and computer program product for online social marketing |
KR20170116224A (en) * | 2013-03-15 | 2017-10-18 | 로케이터 아이피, 엘피 | Shelf-level marketing and point of sales enrichment |
US20140330407A1 (en) * | 2013-05-01 | 2014-11-06 | Thomas Corder | Intelligent Reconfigurable Locker System |
US8827095B1 (en) * | 2013-07-24 | 2014-09-09 | Locker Storage Solutions, LLC | Expandable-collapsible safe |
US20150058142A1 (en) | 2013-08-23 | 2015-02-26 | Michael George Lenahan | Store-integrated tablet |
US20150332362A1 (en) * | 2014-05-16 | 2015-11-19 | Reverb.com LLC | System and method for facilitating sale of goods |
2014
- 2014-08-21 US US14/465,710 patent/US20150058142A1/en not_active Abandoned
- 2014-08-21 US US14/465,786 patent/US20150058123A1/en not_active Abandoned
- 2014-08-22 US US14/466,857 patent/US9842351B2/en active Active
- 2014-08-22 US US14/466,801 patent/US20150058239A1/en not_active Abandoned
2017
- 2017-12-11 US US15/837,675 patent/US11188948B2/en active Active
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11188948B2 (en) | 2013-08-23 | 2021-11-30 | Ebay Inc. | Generating product listings using locker sensor data and reconfiguring lockers based on product size |
US9842351B2 (en) | 2013-08-23 | 2017-12-12 | Ebay Inc. | Generating product listings using locker sensors |
US10861061B2 (en) * | 2015-04-21 | 2020-12-08 | Facebook, Inc. | Messenger application plug-in for providing tailored advertisements within a conversation thread |
WO2017074828A1 (en) * | 2015-10-30 | 2017-05-04 | Microsoft Technology Licensing, Llc | Communication interface for wearable devices |
US11121999B2 (en) | 2015-10-30 | 2021-09-14 | Microsoft Technology Licensing, Llc | Communication interface for wearable devices |
CN108352005A (en) * | 2015-10-30 | 2018-07-31 | 微软技术许可有限责任公司 | For the communication interface of wearable device |
WO2018009550A1 (en) * | 2015-12-01 | 2018-01-11 | Ebay Inc. | Sensor based product recommendations |
US10952076B2 (en) | 2016-01-27 | 2021-03-16 | Samsung Electronics Co., Ltd. | Electronic device and operating method therefor |
KR20170089480A (en) * | 2016-01-27 | 2017-08-04 | 삼성전자주식회사 | Electronic apparatus and operating method thereof |
EP3393176A4 (en) * | 2016-01-27 | 2018-11-14 | Samsung Electronics Co., Ltd. | Electronic device and operating method therefor |
KR102377002B1 (en) * | 2016-01-27 | 2022-03-21 | 삼성전자주식회사 | Electronic apparatus and operating method thereof |
CN106021586A (en) * | 2016-06-06 | 2016-10-12 | 腾讯科技(北京)有限公司 | Information processing method and server |
US20180020963A1 (en) * | 2016-07-21 | 2018-01-25 | Comcast Cable Communications, Llc | Recommendations Based On Biometric Feedback From Wearable Device |
US11707216B2 (en) * | 2016-07-21 | 2023-07-25 | Comcast Cable Communications, Llc | Recommendations based on biometric feedback from wearable device |
US20240148295A1 (en) * | 2016-07-21 | 2024-05-09 | Comcast Cable Communications, Llc | Recommendations Based On Biometric Feedback From Wearable Device |
WO2019194964A1 (en) * | 2018-04-04 | 2019-10-10 | Ebay Inc. | User authentication in hybrid environments |
US11055763B2 (en) | 2018-04-04 | 2021-07-06 | Ebay Inc. | User authentication in hybrid online and real-world environments |
EP4435645A3 (en) * | 2018-04-04 | 2024-12-11 | eBay Inc. | User authentication in hybrid environments |
US10939143B2 (en) | 2019-03-26 | 2021-03-02 | Wipro Limited | System and method for dynamically creating and inserting immersive promotional content in a multimedia |
US10948988B1 (en) * | 2019-09-09 | 2021-03-16 | Tectus Corporation | Contextual awareness based on eye motion tracking by an eye-mounted system |
US20220141528A1 (en) * | 2020-11-04 | 2022-05-05 | Digital Turbine, Inc. | Cross-device interaction |
US11540007B2 (en) * | 2020-11-04 | 2022-12-27 | Digital Turbine, Inc. | Cross-device interaction |
US20230128319A1 (en) * | 2020-11-04 | 2023-04-27 | Digital Turbine, Inc. | Cross-device interaction |
US12028572B2 (en) * | 2020-11-04 | 2024-07-02 | Digital Turbine, Inc. | Cross-device interaction |
Also Published As
Publication number | Publication date |
---|---|
US9842351B2 (en) | 2017-12-12 |
US20150058239A1 (en) | 2015-02-26 |
US20180232771A1 (en) | 2018-08-16 |
US20150058163A1 (en) | 2015-02-26 |
US11188948B2 (en) | 2021-11-30 |
US20150058142A1 (en) | 2015-02-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20150058123A1 (en) | Contextually aware interactive advertisements | |
US20210082023A9 (en) | Data mesh visualization | |
US12217302B2 (en) | Interactive product review interface | |
US11563817B2 (en) | Passive social media contact engagement | |
US11132722B2 (en) | Dynamic predefined product reviews | |
US20210263918A1 (en) | Comparison and Visualization System | |
US11792733B2 (en) | Battery charge aware communications | |
KR20210068156A (en) | System and method for personalized actionable notifications | |
CN112825180A (en) | Validated video commentary | |
US11928727B2 (en) | Managing products at a physical marketplace | |
US12299713B2 (en) | Advertising cannibalization management | |
CN113196328B (en) | Draft Completion System | |
US20160189262A1 (en) | System and method for buy through transactions | |
US11188988B2 (en) | Image generation for social media contact engagement | |
CN112785365A (en) | Compatible model determination for efficient list creation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: EBAY INC., CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LENAHAN, MICHAEL GEORGE;CHUNG, CHAHN;SANDOVAL, MYRA;AND OTHERS;REEL/FRAME:033586/0519; Effective date: 20140821 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |