US20090112713A1 - Opportunity advertising in a mobile device - Google Patents

Opportunity advertising in a mobile device

Info

Publication number
US20090112713A1
Authority
US
Grant status
Application
Prior art keywords
content
operation
circuit
display
person
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US12006793
Inventor
Edward K.Y. Jung
Royce A. Levien
Robert W. Lord
Mark A. Malamud
John D. Rinaldo, Jr.
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Searete LLC
Original Assignee
Searete LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06Q: DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce, e.g. shopping or e-commerce
    • G06Q30/02: Marketing, e.g. market research and analysis, surveying, promotions, advertising, buyer profiling, customer management or rewards; Price estimation or determination
    • G06Q30/0207: Discounts or incentives, e.g. coupons, rebates, offers or upsales
    • G06Q30/0212: Chance discounts or incentives

Abstract

Provided embodiments include a device, apparatus, system, computer program product, and method. One provided method is implemented in a mobile device having a core communication function and operable to present human perceivable content using a display. The method includes detecting an attention of a person with respect to the display. The method also includes determining that space is available on the display for presentation of advertising content. The method further includes sending to a third-party an indication of the detected attention of the person and an indication of the determined availability of the display to present advertising content.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is related to and claims the benefit of the earliest available effective filing date(s) from the following listed application(s) (the “Related Applications”) (e.g., claims earliest available priority dates for other than provisional patent applications or claims benefits under 35 USC § 119(e) for provisional patent applications, for any and all parent, grandparent, great-grandparent, etc. applications of the Related Application(s)).
  • RELATED APPLICATIONS
  • For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 11/977,752, entitled METHOD OF SELECTING A SECOND CONTENT BASED ON A USER'S REACTION TO A FIRST CONTENT, naming EDWARD K. Y. JUNG; ROYCE A. LEVIEN; ROBERT W. LORD; MARK A. MALAMUD; JOHN D. RINALDO, JR. as inventors, filed 24 OCT. 2007, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
  • For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 11/977,748, entitled REQUESTING A SECOND CONTENT BASED ON A USER'S REACTION TO A FIRST CONTENT, naming EDWARD K. Y. JUNG; ROYCE A. LEVIEN; ROBERT W. LORD; MARK A. MALAMUD; JOHN D. RINALDO, JR. as inventors, filed 25 OCT. 2007, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
  • For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 11/978,206, entitled SELECTING A SECOND CONTENT BASED ON A USER'S REACTION TO A FIRST CONTENT, naming EDWARD K. Y. JUNG; ROYCE A. LEVIEN; ROBERT W. LORD; MARK A. MALAMUD; JOHN D. RINALDO, JR. as inventors, filed 26 OCT. 2007, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
  • For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 11/978,534, entitled RETURNING A SECOND CONTENT BASED ON A USER'S REACTION TO A FIRST CONTENT, naming EDWARD K. Y. JUNG; ROYCE A. LEVIEN; ROBERT W. LORD; MARK A. MALAMUD; JOHN D. RINALDO, JR. as inventors, filed 27 OCT. 2007, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
  • For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 11/980,321, entitled METHOD OF SELECTING A SECOND CONTENT BASED ON A USER'S REACTION TO A FIRST CONTENT OF AT LEAST TWO INSTANCES OF DISPLAYED CONTENT, naming EDWARD K. Y. JUNG; ROYCE A. LEVIEN; ROBERT W. LORD; MARK A. MALAMUD; JOHN D. RINALDO, JR. as inventors, filed 29 OCT. 2007, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
  • For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 11/981,573, entitled SELECTING A SECOND CONTENT BASED ON A USER'S REACTION TO A FIRST CONTENT OF AT LEAST TWO INSTANCES OF DISPLAYED CONTENT, naming EDWARD K. Y. JUNG; ROYCE A. LEVIEN; ROBERT W. LORD; MARK A. MALAMUD; JOHN D. RINALDO, JR. as inventors, filed 30 OCT. 2007, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
  • For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 11/983,406, entitled RETURNING A NEW CONTENT BASED ON A PERSON'S REACTION TO AT LEAST TWO INSTANCES OF PREVIOUSLY DISPLAYED CONTENT, naming EDWARD K. Y. JUNG; ROYCE A. LEVIEN; ROBERT W. LORD; MARK A. MALAMUD; JOHN D. RINALDO, JR. as inventors, filed 7 NOV. 2007, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
  • For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 11/998,820, entitled TARGETED-ADVERTISING BASED ON A SENSED PHYSIOLOGICAL RESPONSE BY A PERSON TO A GENERAL ADVERTISEMENT, naming EDWARD K. Y. JUNG; ROYCE A. LEVIEN; ROBERT W. LORD; MARK A. MALAMUD; JOHN D. RINALDO, JR. as inventors, filed 30 NOV. 2007, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
  • For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 11/998,826, entitled PHYSIOLOGICAL RESPONSE BASED TARGETED ADVERTISING, naming EDWARD K. Y. JUNG; ROYCE A. LEVIEN; ROBERT W. LORD; MARK A. MALAMUD; JOHN D. RINALDO, JR. as inventors, filed 30 NOV. 2007, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
  • For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 11/998,779, entitled PROVIDING PERSONALIZED ADVERTISING, naming EDWARD K. Y. JUNG; ROYCE A. LEVIEN; ROBERT W. LORD; MARK A. MALAMUD; JOHN D. RINALDO, JR. as inventors, filed 30 NOV. 2007, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
  • For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. 12/001,759, entitled RETURNING A PERSONALIZED ADVERTISEMENT, naming EDWARD K. Y. JUNG; ROYCE A. LEVIEN; ROBERT W. LORD; MARK A. MALAMUD; JOHN D. RINALDO, JR. as inventors, filed 11 DEC. 2007, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
  • For purposes of the USPTO extra-statutory requirements, the present application constitutes a continuation-in-part of U.S. patent application Ser. No. TO BE ASSIGNED, entitled METHOD OF SPACE-AVAILABLE ADVERTISING IN A MOBILE DEVICE, naming EDWARD K. Y. JUNG; ROYCE A. LEVIEN; ROBERT W. LORD; MARK A. MALAMUD; JOHN D. RINALDO, JR. as inventors, filed 4 JAN. 2008, which is currently co-pending, or is an application of which a currently co-pending application is entitled to the benefit of the filing date.
  • The United States Patent Office (USPTO) has published a notice to the effect that the USPTO's computer programs require that patent applicants reference both a serial number and indicate whether an application is a continuation or continuation-in-part. Stephen G. Kunin, Benefit of Prior-Filed Application, USPTO Official Gazette Mar. 18, 2003, available at http://www.uspto.gov/web/offices/com/sol/og/2003/week11/patbene.htm. The present Applicant Entity (hereinafter “Applicant”) has provided above a specific reference to the application(s) from which priority is being claimed as recited by statute. Applicant understands that the statute is unambiguous in its specific reference language and does not require either a serial number or any characterization, such as “continuation” or “continuation-in-part,” for claiming priority to U.S. patent applications. Notwithstanding the foregoing, Applicant understands that the USPTO's computer programs have certain data entry requirements, and hence Applicant is designating the present application as a continuation-in-part of its parent applications as set forth above, but expressly points out that such designations are not to be construed in any way as any type of commentary and/or admission as to whether or not the present application contains any new matter in addition to the matter of its parent application(s).
  • All subject matter of the Related Applications and of any and all parent, grandparent, great-grandparent, etc. applications of the Related Applications is incorporated herein by reference to the extent such subject matter is not inconsistent herewith.
  • SUMMARY
  • An embodiment provides a method. The method is implemented in a mobile device having a core communication function and operable to present human perceivable content using a display surface. The method includes identifying an attention of a person with respect to the display surface. The method also includes determining that the display surface is available to present advertising content. The method further includes presenting an advertising content using the display surface.
  • In an alternative embodiment, the method is implemented in at least one of a portable, a handheld, a cellular, or a wireless mobile device having a core communication function and operable to present human perceivable content using a display surface. In another alternative embodiment, the method is implemented in at least one of a human borne, a vehicle borne, an aircraft borne, a train borne, or a vessel borne mobile device having a core communication function and operable to present human perceivable content using a display surface. In a further embodiment, the method is implemented in a mobile device having at least one of a voice, telephone, email, message, global positioning, navigation, picture, video, browsing, or Internet core communication function, and operable to present human perceivable content using a display surface. In an embodiment, the method is implemented in a mobile device having a core communication function and operable to present human perceivable content using a display surface that includes at least one of a visually reflective surface, a flat surface, a screen, an audio speaker, or a scent emitter.
  • In an alternative embodiment, the method may include facilitating a selection of the advertising content. In another embodiment, the method may include receiving the advertising content from a remote advertising server. In a further embodiment, the method may include facilitating a selection of a follow-up advertising content at least partially based on a response by the person to the presented advertising content. In another embodiment, the method may include notifying an advertising selector of the determined availability of the display surface to present advertising content. In a further embodiment, the method may include saving an indication of having presented the advertising content. In another embodiment, the method may include saving an indication of a response by the person with respect to the presented advertising content. In addition to the foregoing, other method embodiments are described in the claims, drawings, and text that form a part of the present application.
  • Another embodiment provides a method. The method is implemented in a mobile device having a core communication function and operable to present human perceivable content using a display. The method includes detecting an attention of a person with respect to the display. The method also includes determining that space is available on the display for presentation of advertising content. The method further includes sending to a third-party an indication of the detected attention of the person and an indication of the determined availability of the display to present advertising content. In an alternative embodiment, the method may include receiving an indication of an advertising content selected for presentation by a remotely located application. In another alternative embodiment, the method may include presenting the selected advertising content using the display. In addition to the foregoing, other method embodiments are described in the claims, drawings, and text that form a part of the present application.
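As an illustrative sketch only (the application describes this operational flow functionally and prescribes no implementation), the sequence of detecting attention, determining available display space, and sending an opportunity indication to a third party might be modeled as follows; every name and threshold here (the gaze-duration cutoff, the minimum free rows, the indication fields) is a hypothetical assumption, not part of the application:

```python
# Hypothetical sketch of the opportunity-advertising flow described above.
# The gaze threshold, minimum-space rule, and indication format are all
# illustrative assumptions; the application does not specify them.

from dataclasses import dataclass
from typing import Optional

@dataclass
class DisplayState:
    width: int
    height: int
    used_height: int   # rows currently occupied by core-communication content

def attention_detected(gaze_on_display_ms: int, threshold_ms: int = 500) -> bool:
    """Treat a sustained gaze toward the display as detected attention."""
    return gaze_on_display_ms >= threshold_ms

def available_ad_space(state: DisplayState, min_rows: int = 50) -> bool:
    """Space is 'available' if enough unused rows remain for an ad banner."""
    return (state.height - state.used_height) >= min_rows

def opportunity_indication(state: DisplayState, gaze_ms: int) -> Optional[dict]:
    """Build the indication sent to a third-party ad selector, or None
    when either the attention or the space condition is not met."""
    if attention_detected(gaze_ms) and available_ad_space(state):
        return {
            "attention": True,
            "free_rows": state.height - state.used_height,
        }
    return None

state = DisplayState(width=320, height=480, used_height=400)
print(opportunity_indication(state, gaze_ms=800))  # 80 free rows, attention held
```

A third-party advertising selector receiving such an indication could then return selected advertising content for presentation using the display, as in the alternative embodiments described above.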
  • A further embodiment provides a mobile communications device. The mobile communications device includes a display circuit operable to facilitate presentation of human perceivable content on a display surface. The mobile communications device also includes a core communication system operable to exchange data with another computing device and to provide core communication related information to the display circuit. The mobile communications device further includes a tracking system operable to determine a physical orientation of an element of a person's sensory system with respect to the display surface. The mobile communications device also includes a display status circuit operable to determine an availability of the display surface to present advertising content. The mobile communications device further includes an advertisement insertion circuit operable to provide advertising content to the display circuit for presentation. In an alternative embodiment, the mobile communications device may include an advertisement acquisition circuit operable to initiate a selection of the advertising content. In addition to the foregoing, other device embodiments are described in the claims, drawings, and text that form a part of the present application.
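The circuit-level decomposition described for the mobile communications device (display circuit, tracking system, display status circuit, advertisement insertion circuit) can be sketched, purely as a hypothetical illustration, as a set of cooperating components. None of these class names, the orientation tolerance, or the space test appear in the application; they are assumptions for illustration:

```python
# Hypothetical component sketch of the circuit composition described above.
# The application describes functional blocks, not code; all names and
# thresholds below are illustrative assumptions.

class DisplayCircuit:
    """Facilitates presentation of human perceivable content."""
    def present(self, content: str) -> str:
        return f"displaying: {content}"

class TrackingSystem:
    """Determines the orientation of an element of a person's sensory
    system (e.g. eyes or head) with respect to the display surface."""
    def oriented_toward_display(self, angle_deg: float) -> bool:
        # Assume attention when orientation is within 30 degrees of the
        # display normal.
        return abs(angle_deg) <= 30.0

class DisplayStatusCircuit:
    """Determines availability of the display surface for advertising."""
    def space_available(self, free_pixels: int) -> bool:
        return free_pixels > 0

class AdInsertionCircuit:
    """Provides advertising content to the display circuit."""
    def __init__(self, display: DisplayCircuit):
        self.display = display
    def insert(self, ad: str) -> str:
        return self.display.present(ad)

display = DisplayCircuit()
tracker = TrackingSystem()
status = DisplayStatusCircuit()
inserter = AdInsertionCircuit(display)

if tracker.oriented_toward_display(12.0) and status.space_available(6400):
    result = inserter.insert("space-available advertisement")
```

The point of the sketch is the division of labor: tracking and display-status determination gate the insertion circuit, which only ever reaches the display circuit when both conditions hold.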
  • An embodiment provides a computer program product. The computer program product includes a computer-readable storage medium bearing program instructions. The program instructions are operable to perform a process in a mobile computing device having a core communication function and operable to present human perceivable content using a display surface. The process includes identifying an attention of a person with respect to the display surface. The process further includes determining that the display surface is available to present advertising content. The process also includes presenting an advertising content using the display surface. In an alternative embodiment, the process may include facilitating a selection of the advertising content. In another embodiment, the process may include receiving the advertising content from a remote advertising server. In a further embodiment, the process may include notifying an advertising selector of the determined availability of the display surface to present advertising content. In another embodiment, the process may include saving an indication of an action by the person. In a further embodiment, the process may include saving an indication of a physiological response by the person with respect to the advertising content. In addition to the foregoing, other computer program product embodiments are described in the claims, drawings, and text that form a part of the present application.
  • Another embodiment provides a mobile device having a core communication function and operable to present human perceivable content using a display surface. The mobile device includes means for identifying an attention of a person with respect to the display surface. The mobile device also includes means for determining that the display surface is available to present advertising content. The mobile device further includes means for presenting an advertising content using the display surface. The mobile device may include means for facilitating a selection of the advertising content. The mobile device may include means for receiving the advertising content from a remote advertising server. The mobile device may include means for notifying an advertising selector of the determined availability of the display surface to present advertising content. In addition to the foregoing, other device embodiments are described in the claims, drawings, and text that form a part of the present application.
  • A further embodiment includes a method implemented in a mobile device operable to present human perceivable content using a display surface. The method includes detecting an attention of a person with respect to the display surface. The method also includes determining that space is available on the display surface for presenting a space-available advertisement. The method further includes sending an indication to a third-party of an opportunity for presentation of a space-available advertisement. The method may include receiving an indication of a space-available advertisement. The method may include presenting the indicated space-available advertisement. In addition to the foregoing, other method embodiments are described in the claims, drawings, and text that form a part of the present application.
  • The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates an exemplary embodiment of a thin computing device in which embodiments may be implemented;
  • FIG. 2 illustrates an exemplary embodiment of a general-purpose computing system in which embodiments may be implemented;
  • FIG. 3 illustrates an example environment in which embodiments may be implemented;
  • FIG. 4 illustrates an example of an operational flow implemented in an environment that includes a person interacting with an electronic device using a user direct-input device;
  • FIG. 5 illustrates an alternative embodiment of the operational flow of FIG. 4;
  • FIG. 6 illustrates another alternative embodiment of the operational flow of FIG. 4;
  • FIG. 7 illustrates a further alternative embodiment of the operational flow of FIG. 4;
  • FIG. 8 illustrates an alternative embodiment of the operational flow of FIG. 4;
  • FIG. 9 illustrates another alternative embodiment of the operational flow of FIG. 4;
  • FIG. 10 illustrates an example environment;
  • FIG. 11 illustrates an example computer program product;
  • FIG. 12 illustrates an example environment that includes an electronic device;
  • FIG. 13 illustrates an example operational flow implemented in an environment that includes a person viewing content displayed by an electronic device;
  • FIG. 14 illustrates an example computer program product;
  • FIG. 15 illustrates an example environment in which embodiments may be implemented;
  • FIG. 16 illustrates an example operational flow;
  • FIG. 17 illustrates another alternative embodiment of the operational flow of FIG. 16;
  • FIG. 18 illustrates an example system;
  • FIG. 19 illustrates an example computer program product;
  • FIG. 20 illustrates an example system that includes an electronic device;
  • FIG. 21 illustrates an example environment in which embodiments may be implemented;
  • FIG. 22 illustrates an example operational flow implemented in an environment that includes a person viewing at least two instances of content having a common contextual attribute and displayed by an electronic device;
  • FIGS. 23 and 24 illustrate an alternative embodiment of the operational flow of FIG. 22;
  • FIG. 25 illustrates a further alternative embodiment of the operational flow of FIG. 22;
  • FIG. 26 illustrates a further alternative embodiment of the operational flow of FIG. 22;
  • FIG. 27 illustrates an alternative embodiment of the operational flow of FIG. 22;
  • FIG. 28 illustrates another alternative embodiment of the operational flow of FIG. 22;
  • FIG. 29 illustrates an example environment;
  • FIG. 30 illustrates an example computer program product;
  • FIG. 31 illustrates an example electronic device;
  • FIG. 32 illustrates an example environment;
  • FIG. 33 illustrates an example operational flow;
  • FIG. 34 illustrates an alternative embodiment of the operational flow of FIG. 33;
  • FIG. 35 illustrates another alternative embodiment of the operational flow of FIG. 33;
  • FIG. 36 illustrates a further alternative embodiment of the operational flow of FIG. 33;
  • FIG. 37 illustrates an alternative embodiment of the operational flow of FIG. 33;
  • FIG. 38 illustrates an example environment;
  • FIG. 39 illustrates an example computer program product;
  • FIG. 40 illustrates an example electronic device;
  • FIG. 41 illustrates an example environment;
  • FIG. 42 illustrates an example operational flow;
  • FIG. 43 illustrates an alternative embodiment of the operational flow of FIG. 42;
  • FIG. 44 illustrates another alternative embodiment of the operational flow of FIG. 42;
  • FIG. 45 illustrates another alternative embodiment of the operational flow of FIG. 42;
  • FIG. 46 illustrates an alternative embodiment of the operational flow of FIG. 42;
  • FIG. 47 illustrates a further alternative embodiment of the operational flow of FIG. 42;
  • FIG. 48 illustrates an alternative embodiment of the operational flow of FIG. 42;
  • FIG. 49 illustrates an example electronic system;
  • FIG. 50 illustrates an example computer program product;
  • FIG. 51 illustrates an example system that includes an electronic device;
  • FIG. 52 illustrates an example environment;
  • FIG. 53 illustrates an example operational flow;
  • FIG. 54 illustrates an alternative embodiment of the operational flow of FIG. 53;
  • FIG. 55 illustrates another alternative embodiment of the operational flow of FIG. 53;
  • FIG. 56 illustrates a further alternative embodiment of the operational flow of FIG. 53;
  • FIG. 57 illustrates an alternative embodiment of the operational flow of FIG. 53;
  • FIG. 58 illustrates an example system;
  • FIG. 59 illustrates an example computer program product;
  • FIG. 60 illustrates an example electronic device;
  • FIG. 61 illustrates an example environment;
  • FIG. 62 illustrates an example operational flow;
  • FIG. 63 illustrates an alternative embodiment of the operational flow of FIG. 62;
  • FIG. 64 illustrates another alternative embodiment of the operational flow of FIG. 62;
  • FIG. 65 illustrates a further alternative embodiment of the operational flow of FIG. 62;
  • FIG. 66 illustrates an alternative embodiment of the operational flow of FIG. 62;
  • FIG. 67 illustrates an example environment that includes an electronic device;
  • FIG. 68 illustrates an example computer program product;
  • FIG. 69 illustrates an example electronic device;
  • FIG. 70 illustrates an example environment;
  • FIG. 71 illustrates an example operational flow;
  • FIG. 72 illustrates an alternative embodiment of the example operational flow of FIG. 71;
  • FIG. 73 illustrates an alternative embodiment of the example operational flow of FIG. 71;
  • FIG. 74 illustrates a further embodiment of the example operational flow of FIG. 71;
  • FIG. 75 illustrates an alternative embodiment of the example operational flow of FIG. 71;
  • FIG. 76 illustrates another alternative embodiment of the example operational flow of FIG. 71;
  • FIG. 77 illustrates a further alternative embodiment of the example operational flow of FIG. 71;
  • FIG. 78 illustrates an example operational flow;
  • FIG. 79 illustrates an alternative embodiment of the example operational flow of FIG. 78;
  • FIG. 80 illustrates an alternative embodiment of the example operational flow of FIG. 78;
  • FIG. 81 illustrates an example environment;
  • FIG. 82 illustrates an example computer program product;
  • FIG. 83 illustrates an example system;
  • FIG. 84 illustrates an example operational flow; and
  • FIG. 85 illustrates an alternative embodiment of the operational flow of FIG. 84.
  • DETAILED DESCRIPTION
  • In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. The illustrated embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented here.
  • FIG. 1 and the following discussion are intended to provide a brief, general description of an environment in which embodiments may be implemented. FIG. 1 illustrates an exemplary system that includes a thin computing device 20, which may be included in an electronic device that also includes a device functional element 50. For example, the electronic device may include any item having electrical and/or electronic components playing a role in a functionality of the item, such as a limited resource computing device, a wireless communication device, a mobile wireless communication device, an electronic pen, a handheld electronic writing device, a digital camera, a scanner, an ultrasound device, an x-ray machine, a non-invasive imaging device, a cell phone, a PDA, a Blackberry® device, a printer, a refrigerator, a car, and an airplane. The thin computing device 20 includes a processing unit 21, a system memory 22, and a system bus 23 that couples various system components including the system memory 22 to the processing unit 21. The system bus 23 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The system memory includes read-only memory (ROM) 24 and random access memory (RAM) 25. A basic input/output system (BIOS) 26, containing the basic routines that help to transfer information between sub-components within the thin computing device 20, such as during start-up, is stored in the ROM 24. A number of program modules may be stored in the ROM 24 and/or RAM 25, including an operating system 28, one or more application programs 29, other program modules 30 and program data 31.
  • A user may enter commands and information into the computing device 20 through input devices, such as a number of switches and buttons, illustrated as hardware buttons 44, connected to the system via a suitable interface 45. Input devices may further include a touch-sensitive display with suitable input detection circuitry, illustrated as a display 32 and screen input detector 33. The output circuitry of the touch-sensitive display 32 is connected to the system bus 23 via a video driver 37. Other input devices may include a microphone 34 connected through a suitable audio interface 35, and a physical hardware keyboard (not shown). Output devices may include at least one of the display 32 or a projector display 36.
  • In addition to the display 32, the computing device 20 may include other peripheral output devices, such as at least one speaker 38. Other external input or output devices 39, such as a joystick, game pad, satellite dish, scanner, or the like, may be connected to the processing unit 21 through a USB port 40 and USB port interface 41, to the system bus 23. Alternatively, the other external input and output devices 39 may be connected by other interfaces, such as a parallel port, game port, or other port. The computing device 20 may further include or be capable of connecting to a flash card memory (not shown) through an appropriate connection port (not shown). The computing device 20 may further include or be capable of connecting with a network through a network port 42 and network interface 43; a wireless port 46 and corresponding wireless interface 47 may be provided to facilitate communication with other peripheral devices, including other computers, printers, and so on (not shown). It will be appreciated that the various components and connections shown are exemplary and other components and means of establishing communications links may be used.
  • The computing device 20 may be primarily designed to include a user interface. The user interface may include a character, a key-based, and/or another user data input via the touch sensitive display 32. The user interface may include using a stylus (not shown). Moreover, the user interface is not limited to an actual touch-sensitive panel arranged for directly receiving input, but may alternatively or in addition respond to another input device such as the microphone 34. For example, spoken words may be received at the microphone 34 and recognized. Alternatively, the computing device 20 may be designed to include a user interface having a physical keyboard (not shown).
  • The device functional elements 50 are typically application specific and related to a function of the electronic device, and are coupled with the system bus 23 through an interface (not shown). The functional elements may typically perform a single well-defined task with little or no user configuration or setup, such as a refrigerator keeping food cold, a cell phone connecting with an appropriate tower and transceiving voice or data information, and a camera capturing and saving an image.
  • FIG. 2 illustrates an exemplary embodiment of a general-purpose computing system in which embodiments may be implemented, shown as a computing system environment 100. Components of the computing system environment 100 may include, but are not limited to, a computing device 110 having a processing unit 120, a system memory 130, and a system bus 121 that couples various system components including the system memory to the processing unit 120. The system bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus, also known as Mezzanine bus.
  • The computing system environment 100 typically includes a variety of computer-readable media products. Computer-readable media may include any media that can be accessed by the computing device 110 and include both volatile and nonvolatile media, removable and non-removable media. By way of example, and not of limitation, computer-readable media may include computer storage media and communications media.
  • Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media includes, but is not limited to, random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory, or other memory technology, CD-ROM, digital versatile disks (DVD), or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computing device 110. In a further embodiment, a computer storage media may include a group of computer storage media devices. In another embodiment, a computer storage media may include an information store. In another embodiment, an information store may include a quantum memory, a photonic quantum memory, and/or atomic quantum memory. Combinations of any of the above may also be included within the scope of computer-readable media.
  • Communications media may typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communications media include wired media, such as a wired network and a direct-wired connection, and wireless media such as acoustic, RF, optical, and infrared media.
  • The system memory 130 includes computer storage media in the form of volatile and nonvolatile memory such as ROM 131 and RAM 132. A RAM may include at least one of a DRAM, an EDO DRAM, a SDRAM, a RDRAM, a VRAM, and/or a DDR DRAM. A basic input/output system (BIOS) 133, containing the basic routines that help to transfer information between elements within the computing device 110, such as during start-up, is typically stored in ROM 131. RAM 132 typically contains data and program modules that are immediately accessible to or presently being operated on by processing unit 120. By way of example, and not limitation, FIG. 2 illustrates an operating system 134, application programs 135, other program modules 136, and program data 137. Often, the operating system 134 offers services to applications programs 135 by way of one or more application programming interfaces (APIs) (not shown). Because the operating system 134 incorporates these services, developers of applications programs 135 need not redevelop code to use the services. Examples of APIs provided by operating systems such as Microsoft's “WINDOWS” are well known in the art.
  • The computing device 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media products. By way of example only, FIG. 2 illustrates a non-removable non-volatile memory interface (hard disk interface) 140 that, for example, reads from and writes to non-removable, non-volatile magnetic media, such as a hard disk drive 141. FIG. 2 also illustrates a removable non-volatile memory interface 150 that, for example, is coupled to a magnetic disk drive 151 that reads from and writes to a removable, non-volatile magnetic disk 152, and/or is coupled to an optical disk drive 155 that reads from and writes to a removable, non-volatile optical disk 156, such as a CD ROM. Other removable/nonremovable, volatile/non-volatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, memory cards, flash memory cards, DVDs, digital video tape, solid state RAM, and solid state ROM. The hard disk drive 141 is typically connected to the system bus 121 through a non-removable memory interface, such as the interface 140, and magnetic disk drive 151 and optical disk drive 155 are typically connected to the system bus 121 by a removable non-volatile memory interface, such as interface 150.
  • The drives and their associated computer storage media discussed above and illustrated in FIG. 2 provide storage of computer-readable instructions, data structures, program modules, and other data for the computing device 110. In FIG. 2, for example, hard disk drive 141 is illustrated as storing an operating system 144, application programs 145, other program modules 146, and program data 147. Note that these components can either be the same as or different from the operating system 134, application programs 135, other program modules 136, and program data 137. The operating system 144, application programs 145, other program modules 146, and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies.
  • A user may enter commands and information into the computing device 110 through input devices such as a microphone 163, keyboard 162, and pointing device 161, commonly referred to as a mouse, trackball, or touch pad. Other input devices (not shown) may include at least one of a touch sensitive display, joystick, game pad, satellite dish, and scanner. These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port, or a universal serial bus (USB).
  • A display 191, such as a monitor or other type of display device or surface may be connected to the system bus 121 via an interface, such as a video interface 190. A projector display engine 192 that includes a projecting element may be coupled to the system bus. In addition to the display, the computing device 110 may also include other peripheral output devices such as speakers 197 and printer 196, which may be connected through an output peripheral interface 195.
  • The computing system environment 100 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device, or other common network node, and typically includes many or all of the elements described above relative to the computing device 110, although only a memory storage device 181 has been illustrated in FIG. 2. The network logical connections depicted in FIG. 2 include a local area network (LAN) and a wide area network (WAN), and may also include other networks such as a personal area network (PAN) (not shown). Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet.
  • When used in a networking environment, the computing system environment 100 is connected to the network 171 through a network interface, such as the network interface 170, the modem 172, and/or the wireless interface 193. The network may include a LAN network environment, and/or a WAN network environment, such as the Internet. In a networked environment, program modules depicted relative to the computing device 110, or portions thereof, may be stored in a remote memory storage device. By way of example, and not limitation, FIG. 2 illustrates remote application programs 185 as residing on computer storage medium 181. It will be appreciated that the network connections shown are exemplary and other means of establishing communications link between the computers may be used.
  • FIG. 3 illustrates an example environment 200 in which embodiments may be implemented. The example environment includes an electronic device 204, a response sensing apparatus 206, a transceiver circuit 207, a user direct-input device 208, and a display surface 209. In some embodiments, one or more of the response sensing apparatus, the transceiver circuit, the user direct-input device, and the display surface may be structurally distinct from the remaining circuits or the electronic device. The response sensing apparatus includes at least one user sensor operable to acquire data indicative of a response by a person 205 to a content displayed by or on the display surface. The at least one user sensor is illustrated as a sensor 206A, a sensor 206B, and a wearable/mountable sensor 206C. The at least one user sensor may be physically incorporated with the electronic device, or may be physically separate from the electronic device and electronically coupled with the device. The user direct-input device 208 includes at least one device that may be used by the person to directly interact with the electronic device, such as the mouse 161, keyboard 162, microphone 163, and/or speakers 197 described in conjunction with FIG. 2, or a touch screen, such as the display 32 combined with the screen input detector 33 described in conjunction with FIG. 1. The display surface may include any surface suitable for displaying a content to the person. The display surface may include the monitor 191 described in conjunction with FIG. 2, or a surface such as a wall or another planar surface (not shown) onto which a content may be projected for display to the person. The display surface may be physically incorporated with the electronic device, or may be physically separate from the electronic device and electronically coupled with the device.
  • The electronic device 204 may include wired or wireless access to digital content using the transceiver circuit 207, such as via a network 299. In an alternative embodiment, the electronic device may be coupled to the network via a wireless link, a satellite link, and/or a wired link.
  • In an embodiment, the electronic device 204 includes a reaction detector circuit 210, an analytic circuit 250, a query circuit 260, and a display circuit 280. In some embodiments, one or more of the reaction detector circuit, the analytic circuit, the query circuit, and/or the display circuit may be structurally distinct from the remaining circuits. In an embodiment, the electronic device or a portion of the electronic device may be implemented in whole or in part using the thin computing device 20 described in conjunction with FIG. 1, and/or the computing device 110 described in conjunction with FIG. 2. In another embodiment, the electronic device or a portion of the electronic device may be implemented using Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. In a further embodiment, one or more of the circuits and/or the machine may be implemented in hardware, software, and/or firmware. The person 205 may input commands and information to the electronic device 204 using the user direct-input device 208.
  • The electronic device 204 may include at least one additional circuit. The at least one additional circuit may include additional circuits 290. In addition, the electronic device may include a processor (not illustrated), such as the processing unit 21 described in conjunction with FIG. 1, and/or the processor 120 described in conjunction with FIG. 2. In further addition, the electronic device may include a computer storage media illustrated as a data store. In an embodiment, the electronic device 204 may include a mobile electronic device.
  • In an embodiment, the reaction detector circuit 210 may include at least one additional circuit. The at least one additional circuit may include at least one of a reaction circuit 212, reaction evaluation circuit 214, a gaze reaction circuit 216, a response sensor circuit 218, a physical reaction circuit 222, an emotional reaction circuit 224, a direct sensor circuit 226, a reaction state circuit 228, a content characteristic circuit 232, and/or a device type detector circuit 236.
  • In another embodiment, the analytic circuit 250 may include at least one additional circuit. The at least one additional circuit may include at least one of a multiple attribute determining circuit 252 and/or an attribute determining circuit 254.
  • In a further embodiment, the query circuit 260 may include at least one additional circuit. The at least one additional circuit may include at least one of a local data store search circuit 262, a search engine facilitating circuit 264, a mitigation instruction circuit 274, a Web search facilitating circuit 266, an algorithm search facilitating circuit 268, and/or a multiple target search facilitating circuit 272.
  • FIG. 4 illustrates an example of an operational flow 400 implemented in an environment that includes a person interacting with an electronic device using a user direct-input device. In an alternative embodiment, the environment that includes a person interacting with an electronic device using a user direct-input device further includes an environment that includes a person viewing content displayed by an electronic device and directly interacting with the electronic device using a user direct-input device. FIG. 4 and several following figures may include various examples of operational flows, discussions, and explanations with respect to the above-described environment 200 of FIG. 3, and/or with respect to other examples and contexts. However, it should be understood that the operational flows may be executed in a number of other environments and contexts, and/or in modified versions of FIG. 3. Also, although the various operational flows are illustrated in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, and/or may be performed concurrently.
  • After a start operation implemented in the environment that includes a person viewing content displayed by an electronic device using a user direct-input device, the operational flow 400 includes an observation operation 410. The observation operation detects a reaction by the person to a displayed first content. The observation operation may be implemented using the reaction detector circuit 210, and/or the response sensing apparatus 206. An analytical operation 450 determines a content attribute of the displayed first content. The analytical operation may be implemented using the analytic circuit 250. A query operation 460 facilitates a search for a second content based on the reaction detected by the observation operation and on the content attribute determined by the analytical operation. The query operation may be implemented using the query circuit 260. A broadcast operation 480 displays the second content in a manner perceivable by the person. The broadcast operation may be implemented using the display circuit 280. The operational flow 400 then proceeds to an end operation.
  • In an embodiment, the observation operation 410 may be implemented using the reaction detector circuit 210 of FIG. 3. For example, optically based observation data of the person 205 may be acquired by the sensor 206A, and/or sensor 206B. Physiological based data of the person may be acquired by the wearable/mountable sensor 206C. A circuit in the response sensing apparatus 206 may transform data acquired by the sensors 206A-206C into data indicative of a response by the person to the displayed first content. For example, a response may include at least one of a change in breathing rate, a change in heart rate, eye movements, facial movements, gaze direction and/or time, or a brain wave pattern. Another circuit in the response sensing circuit may detect a reaction by the person to a displayed first content based on the data indicative of a response by the person to the displayed first content. For example, a facial response that includes the person moving the ends of their lips above the center portion of the lips may be detected as a “smile” reaction or a “positive” reaction. A facial response that includes the person moving the ends of the lips below the center portion of the lips may be detected as a “frown” reaction or a “negative” reaction. The observation operation does not include data directly inputted by the person 205 using the user direct-input device 208, such as keyboard, mouse, and voice commands entered by the user through the user direct-input device. However, in an alternative embodiment, the observation operation may include at least one of a quality, or a manner of the person's input of data using the direct-input device. For example, the observation operation may acquire data indicative of the person shouting a voice command without regard to a nature of the voice command, or the person striking keys of the keyboard particularly hard without regard to the keyed command or text. 
In a further embodiment, the observation operation may acquire sensor data indicative of the person shouting a voice command and associate a reaction with the nature of the voice command. For example, data indicative of a loud voice response may be associated with a spoken command “Delete this Web page” as a negative reaction to the content of the Web page.
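The smile/frown and shouted-command heuristics described for the observation operation can be sketched in a few lines. This is a minimal illustration only, not part of the specification; the function names, the threshold, and the coordinate convention are all assumptions.

```python
def classify_facial_reaction(lip_corner_y, lip_center_y):
    """Classify a facial response: lip corners above the center of the
    lips are detected as a "smile"/"positive" reaction, below it as a
    "frown"/"negative" reaction (y increases upward by assumption)."""
    if lip_corner_y > lip_center_y:
        return "positive"
    if lip_corner_y < lip_center_y:
        return "negative"
    return "neutral"


def classify_voice_reaction(loudness_db, command_text, shout_threshold_db=80.0):
    """Associate a loud voice response with the nature of the spoken
    command, e.g. shouting "Delete this Web page" as a negative reaction
    to the page. The dB threshold is an arbitrary illustrative value."""
    if loudness_db >= shout_threshold_db and "delete" in command_text.lower():
        return "negative"
    return "neutral"
```

In an embodiment that ignores the nature of the command, the loudness test alone would suffice; the version above reflects the further embodiment in which the reaction is associated with the command's content.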
  • In an embodiment, data indicative of a response may include data indicative of at least one of a person's gaze, attention, gaze dwell time, facial movements, eye movements, pupil dilation, physiological parameters (heart rate, respiration rate, etc.), stance, sub-vocalization (and other non-word audio), P-300 response, brain waves, brain patterns, or other detectable aspects. In another embodiment, data indicative of a response may include data indicative of at least one of a person's physiological, behavioral, emotional, voluntary, or involuntary response. In a further embodiment, data indicative of a response may include data acquired by functional near-infrared spectroscopy (fNIRS) indicative of a response. fNIRS data may be acquired by a fNIRS device, an embodiment of which is illustrated as the wearable/mountable sensor 206C.
  • In another embodiment, the observation operation 410 detects a reaction by the person to a displayed first content by applying pattern recognition to the data indicative of a response by the person to the displayed first content. For example, a reaction may include a response that reveals the person 205's feelings or attitude toward the displayed first content. In a further embodiment, the observation operation detects a reaction by the person to a displayed first content by applying pattern matching to the data indicative of a response by the person to the displayed first content.
  • In use, an embodiment of the operational flow 400 may be illustrated by reference to FIG. 3. For example, a first content may be displayed to the person 205 on a portion of the display surface 209, such as a screen of a BlackBerry® or other PDA electronic device. In this example, the displayed first content may be a picture of a new car from a brother of the person 205. Data indicative of a response by the person 205 to the displayed new car is acquired using at least one of sensors 206A-206C. The observation operation 410 determines a reaction by the person to the displayed new car based on the data indicative of a response. If, for example, the data indicates an upward movement of the ends of the person's lips and an opening of their eyes, a positive reaction may be detected. The analytical operation 450 determines a content attribute of the displayed picture of the brother's new car. A content attribute may include at least one of a manufacturer of the new car, a color of the new car, or a body style of the new car, such as a convertible, coupe, four-door, or SUV. The query operation 460 facilitates a search for a second content based on the detected reaction (positive) and on the determined content attribute (convertible sports car). The search may be facilitated by communicating with an Internet based search service, such as Google, Yahoo, and/or Live Search. The broadcast operation 480 displays a second content in a manner perceivable by the person by receiving a result of the facilitated search that includes an indication of the second content, and displaying the second content using the display surface 209. For example, the second content may include a picture of next year's model of the same car as the brother's new car.
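The new-car example can be strung together end to end as a sketch of operational flow 400. Every function name and decision rule below is a hypothetical stand-in; the specification describes the operations but prescribes no particular implementation.

```python
def detect_reaction(sensor_data):
    """Observation operation 410 stand-in: upward lip-corner movement
    and opened eyes are read as a positive reaction."""
    if sensor_data.get("lip_corners") == "up" and sensor_data.get("eyes") == "open":
        return "positive"
    return "negative"


def determine_attribute(content):
    """Analytical operation 450 stand-in: pick a content attribute,
    e.g. the body style of a pictured car."""
    return content.get("body_style", "unknown")


def operational_flow_400(sensor_data, content, search_service):
    """Query operation 460 hands the detected reaction and determined
    attribute to a search service; the first result stands in for the
    second content that the broadcast operation 480 would display."""
    reaction = detect_reaction(sensor_data)
    attribute = determine_attribute(content)
    results = search_service(reaction, attribute)
    return results[0] if results else None
```

Here a positive reaction to a picture tagged as a convertible would lead the search service, such as an Internet search engine, to return something like a picture of next year's model.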
  • FIG. 5 illustrates an alternative embodiment of the operational flow 400 of FIG. 4. The operational flow may include an additional operation 490. The additional operation may include at least one of an operation 492, an operation 494, an operation 496, or an operation 498. The operation 492 displays the first content in a manner perceivable by the person. The operation 492 may include at least one additional operation, such as the operation 494. The operation 494 displays the first content in a manner perceivable by the person and in a manner designed to facilitate a detectable reaction from the person. The operation 496 selects the second content from a result of the facilitated search. In an embodiment, the second content may be selected in response to an algorithm that includes a machine learning aspect. In an alternative embodiment, the selection algorithm may include a pattern recognition algorithm. The operation 498 maintains informational data corresponding to the second content. The operation 490 may be implemented using at least one circuit of the additional circuits 290 of FIG. 3.
  • FIG. 6 illustrates another alternative embodiment of the operational flow 400 of FIG. 4. The observation operation 410 may include at least one additional operation. The at least one additional operation may include an operation 412, an operation 414, an operation 416, an operation 418, an operation 422, an operation 424, an operation 426, or an operation 428. The operation 412 includes at least one of sensing, identifying, or recognizing a reaction by the person to a displayed first content. The operation 412 may be implemented using the reaction circuit 212 of FIG. 3. The operation 414 includes detecting at least one of a positive or negative reaction by the person to a displayed first content. The operation 414 may be implemented using the reaction evaluation circuit 214. The operation 416 includes sensing a gaze by the person at a displayed first content and detecting a reaction by the person to the displayed first content. The operation 416 may be implemented using the gaze reaction circuit 216. The operation 418 includes detecting a response by the person to a displayed first content. The operation 418 may be implemented using the response sensor circuit 218. The operation 422 includes detecting a physical reaction by the person to a displayed first content. The operation 422 may be implemented using the physical reaction circuit 222. The operation 424 includes detecting an emotional reaction by the person to a displayed first content. The operation 424 may be implemented using the emotional reaction circuit 224. The operation 426 includes directly detecting from the person a response of the person to a displayed first content. The operation 426 may be implemented using the direct sensor circuit 226. The operation 428 includes detecting a reaction state of the person to a displayed first content. The operation 428 may be implemented using the reaction state sensor circuit 228.
  • FIG. 7 illustrates a further alternative embodiment of the operational flow 400 of FIG. 4. The observation operation 410 may include at least one additional operation. The at least one additional operation may include an operation 432, an operation 434, an operation 436, an operation 438, or an operation 442. The operation 432 includes detecting a reaction by the person to a displayed first content. The displayed first content includes at least one of a displayed search result or Internet search results, such as from a search provider such as Google, Yahoo, or Live Search. Alternatively, the displayed first content may include sports scores, or news. For example, the displayed search results may include a displayed result of a restaurant search, a movie search, or a car repair shop search. In a further alternative, the displayed first content may include a program list, a music list, a file list, or a directory search result of locally stored files. The operation 434 includes detecting a reaction by the person to a displayed first content. The displayed first content includes at least one of a displayed image, avatar, icon, name, title, descriptor, or broadcasted sound. For example, a title may include a song title, a book title, or a movie title. The operation 436 includes detecting a reaction by the person to a displayed first content. The displayed first content includes at least one of a visual-based, image-based, text-based, or sound-based content. The operations 432, 434, and/or 436 may be implemented using the content characteristic circuit 232.
The operation 438 includes detecting a reaction by the person to a displayed first content. The displayed first content includes a content displayed on a surface coupled with a computing device, such as a built-in screen of the computing device or a screen physically coupled with the computing device, or displayed on a surface separate from the computing device, such as projected onto a separate screen or a wall surface. The operation 442 includes detecting a reaction by the person to a displayed first content. The displayed first content includes a content displayed by at least one of a mobile communications device, handheld communications device, desktop computing device, limited resources computing device, thin computing device, or portable computing device. The operations 438 and/or 442 may be implemented using the device type detector circuit 236.
  • FIG. 8 illustrates an alternative embodiment of the operational flow 400 of FIG. 4. The analytical operation 450 may include at least one additional operation. The at least one additional operation may include an operation 452, or an operation 454. The operation 452 includes determining at least two content attributes of the displayed first content. The operation 452 may be implemented using the multiple attribute determining circuit 252. The operation 454 includes determining a content attribute of the displayed first content. The determined content attribute may include at least one of a category, tag, subject, color, texture, or theme attribute of the displayed first content. For example, a theme attribute may include a sunset, famous athlete, convict, dog, cat, horse, car, airplane, flower, people, inventor, or entertainer attribute. The operation 454 may be implemented using the attribute determining circuit 254.
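Operations 452 and 454 can be sketched as simple attribute extraction from a displayed content's metadata. The dictionary keys and function name below are hypothetical; the specification names the attribute kinds (category, tag, subject, color, texture, theme) but not any representation.

```python
def determine_content_attributes(metadata,
                                 keys=("category", "tag", "subject",
                                       "color", "texture", "theme")):
    """Return the content attributes present in the displayed first
    content's metadata, e.g. a theme attribute of "sunset" or "car".
    Returning more than one entry corresponds to operation 452."""
    return {k: metadata[k] for k in keys if k in metadata}
```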
  • FIG. 9 illustrates another alternative embodiment of the operational flow 400 of FIG. 4. The query operation 460 may include at least one additional operation. The at least one additional operation may include an operation 462, an operation 464, an operation 466, an operation 468, an operation 472, or an operation 474. The operation 462 includes searching a local data store for a second content based on the detected reaction and on the determined content attribute. In an embodiment, the local data store may include a hard drive having at least one of stored music, or stored video files. The operation 462 may be implemented using the local data store search circuit 262. The operation 464 includes facilitating a search by a search engine for a second content based on the detected reaction and on the determined content attribute. The operation 464 may be implemented using the search engine facilitating circuit 264. The operation 466 includes facilitating a search by a Web search engine for a second content based on the detected reaction and on the determined content attribute. For example, a Web search engine provides the person 205 with tools to search through Web sites, images, videos, news, and a number of other categories. In an embodiment, a Web search engine includes at least one of Google, Yahoo, or Live Search. The operation 466 may be implemented using the Web search facilitating circuit 266. The operation 468 includes facilitating a search for a second content by a search algorithm responsive to the detected reaction and on the determined content attribute. The operation 468 may be implemented using the algorithm search facilitating circuit 268. The operation 472 includes facilitating a search for at least two instances of a second content based on the detected reaction and on the determined content attribute. The operation 472 may be implemented using the multiple target search facilitating circuit 272. 
The operation 474 includes facilitating a search for a second content based on at least one of a positive correlation, or a negative correlation between the detected reaction and on the determined content attribute. For example, the search may be facilitated based upon a detected positive reaction by the person and on the determined content attribute to locate a second content that is more of the same as the first content. In another example, the search may be facilitated based upon a detected negative reaction by the person and on the determined content attribute to locate a second content that is different from the first content.
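The positive/negative correlation of operation 474 can be sketched as query construction: a positive reaction keeps the determined attribute as a required search term (more of the same), while a negative reaction excludes it (something different). This is an illustrative assumption about one possible realization, not the claimed method.

```python
def build_search_terms(reaction, attribute):
    """Build include/exclude terms for the second-content search from
    the detected reaction and the determined content attribute."""
    if reaction == "positive":
        # Positive correlation: locate a second content that is more
        # of the same as the first content.
        return {"include": [attribute], "exclude": []}
    # Negative correlation: locate a second content that is different
    # from the first content.
    return {"include": [], "exclude": [attribute]}
```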
  • FIG. 10 illustrates an example environment 500. The environment includes an electronic device 501 that is coupleable to a network 299, and which may be used by a person 205. The electronic device may be coupled to the network via a wired link, illustrated as a cable link, and/or a wireless link illustrated as a satellite link. The electronic device includes the user direct-input device 208, the display surface 209, a response sensor apparatus 520, an analytic circuit 530, a characterization circuit 540, a query circuit 550, and a chooser circuit 560. In an alternative embodiment, the electronic device includes at least one of a portable electronic device, or a mobile electronic device.
  • The display surface 209 includes a display surface operable to display electronic content in a manner perceivable by a person. In an embodiment, the electronic content includes electronically stored information. In another embodiment, electronically stored content may include electronically stored content as described in Federal Rule of Civil Procedure 26(f). In a further embodiment, electronic content may include at least one of electronically stored text, Web content, picture, image, or streaming image. The response sensor apparatus 520 includes the sensor 206A, the sensor 206B, the wearable/mountable sensor 206C, and a sensor data acquisition module 524. The response sensor apparatus includes a sensor apparatus operable to acquire data indicative of a response by the person 205 to a first electronic content displayed on the surface 209.
  • The analytic circuit 530 includes an analytic circuit operable to determine an indication of an expression by the person corresponding with the displayed first electronic content, the determination based on the data indicative of a response. In an embodiment, the expression by the person may include at least one of an expression by the person of interest, disinterest, like, dislike, happiness, or anger. The characterization circuit 540 includes a characterization circuit operable to determine an attribute of the displayed first electronic content. The query circuit 550 includes a query circuit operable to cause a search for a second electronic content corresponding to the indication of expression and to the attribute of the first electronic content. The chooser circuit 560 includes a chooser circuit operable to select the second electronic content from a result of the search.
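The four-stage arrangement of FIG. 10 (analytic, characterization, query, and chooser circuits) can be sketched as a simple pipeline. The functions below are hypothetical stand-ins for the named circuits; the sensor-data fields, the gaze threshold, and the catalog layout are assumptions for illustration.

```python
# Illustrative sketch of the FIG. 10 pipeline; all names are assumptions.

def analytic_circuit(sensor_data):
    """Reduce raw sensor data to an expression indication (assumed threshold)."""
    return "interest" if sensor_data.get("gaze_seconds", 0) > 2 else "disinterest"

def characterization_circuit(first_content):
    """Determine an attribute of the displayed first electronic content."""
    return first_content["attribute"]

def query_circuit(expression, attribute, catalog):
    """Search for second content corresponding to the expression and attribute."""
    if expression == "interest":
        return [c for c in catalog if c["attribute"] == attribute]
    return [c for c in catalog if c["attribute"] != attribute]

def chooser_circuit(results):
    """Select the second electronic content from a result of the search."""
    return results[0] if results else None
```

In this sketch an "interest" indication narrows the search to content sharing the first content's attribute, while "disinterest" steers it toward other attributes.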
  • In an alternative embodiment, the electronic device 501 may include a digital storage device 590 operable to save the selected second electronic content. In another embodiment, the electronic device may include a broadcast circuit 575 operable to facilitate a display of at least the first electronic content and the selected second electronic content. In a further embodiment, the electronic device may include a receiver circuit, illustrated as a transceiver circuit 580, operable to receive a result of the initiated search.
  • In an alternative embodiment, the display surface 209 may include a display surface operable to display electronic content in a manner perceivable by a person 205 and in a manner designed to facilitate sensing a response by the person. In another embodiment, the response sensor apparatus 520 may include a sensor apparatus operable to acquire data indicative of a physically manifested response by the person to a first electronic content displayed on the surface. In a further embodiment, the analytic circuit 530 may include an analytic circuit operable to determine an indication of an emotional expression by the person corresponding with the displayed first electronic content, the determination based on the data indicative of a response.
  • FIG. 11 illustrates an example computer program product 600. The computer program product includes a computer-readable storage medium 610 bearing program instructions 620. The program instructions are operable to perform a process in a computing device. The process includes detect a reaction by a person to a displayed first content. The process also includes determine a content attribute of the displayed first content. The process further includes facilitate a search for a second content based on the detected reaction and on the determined content attribute. The process also includes select the second content from a result of the facilitated search, and save data indicative of the selected second content. In an alternative embodiment 622, the process may include facilitating a display of the selected second content.
  • FIG. 12 illustrates an example environment 700 that includes an electronic device 705. The electronic device includes means 710 for detecting a reaction by a person to a displayed first content. The electronic device also includes means 720 for determining a content attribute of the displayed first content. The electronic device further includes means 730 for facilitating a search for a second content based on the detected reaction and on the determined content attribute. The electronic device includes means 740 for displaying the second content in a manner perceivable by the person.
  • FIG. 13 illustrates an example operational flow 800 implemented in an environment that includes a person viewing content displayed by an electronic device. In an alternative embodiment, the operational flow 800 is implemented in an environment that includes a person viewing content displayed by an electronic device and directly interacting with the electronic device via a user interface. After a start operation, a discovery operation 810 includes detecting a reaction by the person to a displayed first content. In an embodiment, the detected reaction includes at least one of a detected gesture, movement, physiological, or physical reaction. A call operation 820 includes transmitting a search request for a second content corresponding to the detected reaction and to an attribute of the displayed first content. A reception operation 830 includes receiving a response to the search request that includes at least an indication of the second content. A broadcast operation 840 includes displaying the second content. The operational flow then proceeds to an end operation.
  • In an alternative embodiment, the operational flow may include at least one additional operation 850. The at least one additional operation may include an operation 852, and/or an operation 854. The operation 852 includes determining a content attribute of the displayed first content. The operation 854 includes selecting the second content from the response to the search request.
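The device-side operational flow 800 (discovery, call, reception, broadcast) can be sketched as follows. A plain function call stands in for the network round trip to a search service, and every name and response shape here is an illustrative assumption, not the claimed method.

```python
# Hypothetical sketch of operational flow 800 of FIG. 13.

def build_search_request(reaction, attribute):
    """Call operation 820: package the detected reaction and content attribute."""
    return {"reaction": reaction, "attribute": attribute}

def remote_search_service(request):
    """Stand-in for a remote server answering with an indication of second content."""
    if request["reaction"] == "positive":
        return {"second_content": "more-" + request["attribute"]}
    return {"second_content": "not-" + request["attribute"]}

def operational_flow_800(reaction, attribute):
    """Discovery -> call -> reception -> broadcast; returns what is displayed."""
    request = build_search_request(reaction, attribute)   # call operation 820
    response = remote_search_service(request)             # reception operation 830
    return response["second_content"]                     # broadcast operation 840
```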
  • Returning to FIG. 10, an alternative embodiment of the example environment 500 includes the electronic device 501 coupleable to a network 299. The display surface 209 includes a display surface operable to display electronic content in a manner perceivable by the person 205. The response sensor apparatus 520 includes a sensor apparatus operable to acquire data indicative of a response by the person to a first electronic content displayed on the surface. The analytic circuit 530 includes an analytic circuit operable to detect a reaction by a person to a displayed first content in response to the acquired data. The query circuit 550 includes a query circuit operable to transmit a search request for a second electronic content that corresponds to the detected reaction and to an attribute of the displayed first content. For example, the search request may be addressed to the server 298 and transmitted over the network 299. The transceiver circuit 580 includes a receiver circuit operable to receive a response to the search request that includes at least an indication of the second content.
  • In another alternative embodiment, the chooser circuit 560 may include a chooser circuit operable to select the second electronic content from the received response to the search request. In a further embodiment, the broadcast circuit 575 may include a broadcast circuit operable to facilitate a display of the first electronic content and the second electronic content. In another embodiment, the transceiver circuit 580 may include a receiver circuit operable to receive a result of the initiated search. In a further embodiment, the digital storage device 590 may include a digital storage device operable to save the received response to the search request. In another embodiment, the display surface 209 may include a display surface operable to display electronic content in a manner perceivable by the person and in a manner designed to facilitate sensing a response by the person. In a further embodiment, the sensor apparatus 520 may include a sensor apparatus operable to acquire data indicative of a physically manifested response by the person to a first electronic content displayed on the surface.
  • FIG. 14 illustrates an example computer program product 860. The computer program product includes a computer-readable computer storage medium 862 bearing program instructions 864. The program instructions are operable to perform a process in a computing device. The process includes detect a reaction by a person to a displayed first content. The process also includes transmit a search request for a second content corresponding to the detected reaction and to an attribute of the displayed first content. The process further includes receive a response to the search request that includes at least an indication of the second content. The process also includes save data indicative of the received response to the search request. The process further includes display the second content. In an alternative embodiment, the process may include select the second content from the received response to the search request 866.
  • FIG. 15 illustrates an example environment 900 in which embodiments may be implemented. The example environment includes an electronic device 904 that includes a request receiver circuit 910, an analytic circuit 950, a search facilitation circuit 960, and a reply transmission circuit 980. In some embodiments, one or more of the request receiver circuit, the analytic circuit, the search facilitation circuit, and the reply transmission circuit may be structurally distinct from the remaining circuits or the electronic device. The electronic device 904 may include a wired or wireless access to a requestor electronic device 901 via the network 299 using the communications circuit 970. In an alternative embodiment, the electronic device may be coupled to the network via a wireless link, a satellite link, and/or a wired link. In an embodiment, the electronic device or a portion of the electronic device may be implemented in whole or in part using the thin computing device 20 described in conjunction with FIG. 1, and/or the computing device 110 described in conjunction with FIG. 2. In another embodiment, the electronic device or a portion of the electronic device may be implemented using Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. In a further embodiment, one or more of the circuits and/or the machine may be implemented in hardware, software, and/or firmware.
  • The electronic device 904 may include at least one additional circuit. The at least one additional circuit may include additional circuit(s) 995. In addition, the electronic device may include a processor 972, such as the processing unit 21 described in conjunction with FIG. 1, and/or the processor 120 described in conjunction with FIG. 2. In further addition, the electronic device may include a digital storage media 920, a communications circuit 970, and/or a broadcast circuit 975. In an embodiment, the electronic device 904 may include a network server electronic device, or a group of network server electronic devices.
  • In an embodiment, the request receiver circuit 910 may include at least one additional circuit. The at least one additional circuit may include at least one of a sensor data receiving circuit 912, and/or a content data receiving circuit 914. In another embodiment, the analytic circuit 950 may include at least one additional circuit, such as an expression indication analytic circuit 952.
  • FIG. 16 illustrates an example operational flow 1000. FIG. 16 and several following figures may include various examples of operational flows, discussions, and explanations with respect to the above-described environment 900 of FIG. 15, and/or with respect to other examples and contexts. However, it should be understood that the operational flows may be executed in a number of other environments and contexts, and/or in modified versions of FIG. 15. Also, although the various operational flows are illustrated in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, and/or may be performed concurrently.
  • After a start operation, the operational flow 1000 includes a reception operation 1010. The reception operation includes receiving, from a requester, sensor data indicative of a response by a person to a first content displayed to the person. In an alternative embodiment, the reception operation includes receiving the sensor data from a requestor via at least one of a network, or the Internet. The reception operation may be implemented using the request receiver circuit 910 of FIG. 15. An analysis operation 1050 includes analyzing the received sensor data for an indication of an expression by the person corresponding to the first content. The analysis operation may be implemented using the analytic circuit 950. A query operation 1060 includes facilitating a search for a second content using a search parameter corresponding to the indication of an expression by the person and to a content attribute of the displayed first content. In an alternative embodiment, the search may include at least one of a search of a local data store, a search by a search engine, or a search by a Web search engine. The query operation may be implemented using the search facilitation circuit 960. A reply operation 1080 includes returning to the requester an indication of the second content. The indication of the second content may be returned to the requester via at least one of a network, or the Internet. The reply operation may be implemented using the reply transmission circuit 980. The operational flow 1000 includes an end operation.
  • In an alternative embodiment, the operational flow 1000 may include at least one additional operation, such as an operation 1090. The operation 1090 includes determining a content attribute of the displayed first content. The operation 1090 may be implemented using the attribute determining circuit 990.
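The server-side operational flow 1000 (reception, analysis, query, reply) can be sketched in code. The sensor-data fields, the smile heuristic, and the local data store layout below are assumptions made for the example; a deployed analytic circuit would use real sensor features.

```python
# Hypothetical sketch of operational flow 1000 of FIG. 16.

def analyze_expression(sensor_data):
    """Analysis operation 1050: reduce received sensor data to an expression."""
    return "interest" if sensor_data.get("smile", False) else "disinterest"

def facilitate_search(expression, attribute, data_store):
    """Query operation 1060: search a local data store with the parameter."""
    wanted = (expression == "interest")
    return [item for item, attrs in data_store.items()
            if (attribute in attrs) == wanted]

def operational_flow_1000(sensor_data, attribute, data_store):
    """Reception -> analysis -> query -> reply, returning the reply payload."""
    expression = analyze_expression(sensor_data)
    results = facilitate_search(expression, attribute, data_store)
    return {"second_content": results[0] if results else None}
```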
  • FIG. 17 illustrates another alternative embodiment of the operational flow 1000 of FIG. 16. The reception operation 1010 may include at least one additional operation. The at least one additional operation may include an operation 1012, or an operation 1014. The operation 1012 includes receiving from a requestor at least one of raw sensor data, partially processed sensor data, or processed sensor data indicative of a response by the person to a first content displayed to the person. The operation 1012 may be implemented using the sensor data receiving circuit 912. The operation 1014 includes receiving data indicative of a content attribute of the displayed first content. The operation 1014 may be implemented using the content data receiving circuit 914.
  • The analysis operation 1050 may include at least one additional operation, such as an operation 1052. The operation 1052 includes analyzing the received sensor data for an indication of an expression of at least one of interest, disinterest, like, dislike, excitement, boredom, happiness, or anger by the person corresponding to the first content. The operation 1052 may be implemented using the expression indication analytic circuit 952.
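The expression analysis of operation 1052 can be sketched as a classifier over simple sensor features. The feature names and thresholds below are assumptions for illustration, and only a few of the expression labels named in the text are covered.

```python
# Illustrative sketch of operation 1052 (assumed features and thresholds).

def classify_expression(features):
    """Map simple sensor features to one expression label."""
    if features.get("brow_furrowed") and features.get("heart_rate", 0) > 100:
        return "anger"
    if features.get("smile"):
        # A smile with an elevated heart rate reads as excitement here.
        return "happiness" if features.get("heart_rate", 0) <= 100 else "excitement"
    if features.get("gaze_seconds", 0) > 3:
        return "interest"
    if features.get("gaze_seconds", 0) < 1:
        return "disinterest"
    return "boredom"
```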
  • FIG. 18 illustrates an example system 1100. The example system includes an electronic device 1104 operable to exchange communications with a requestor device 1101 using the network 299, via for example, a wireless link, a satellite link, and/or a wired link. The electronic device includes a processing circuit 1120, a query circuit 1130, a chooser circuit 1140, and a digital storage device 1150. In an embodiment, the electronic device or a portion of the electronic device may be implemented in whole or in part using the thin computing device 20 described in conjunction with FIG. 1, and/or the computing device 110 described in conjunction with FIG. 2. In another embodiment, the electronic device or a portion of the electronic device may be implemented using Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. In a further embodiment, one or more of the circuits and/or the machinery of the electronic device may be implemented in hardware, software, and/or firmware.
  • The processing circuit 1120 includes a processing circuit operable to analyze received sensor data for an indication of an expression by a person corresponding to a first displayed electronic content. For example, the received sensor data may include sensor data acquired by the response sensing apparatus 206 described in conjunction with FIG. 3. The query circuit 1130 includes a query circuit operable to cause a search for a second electronic content based on the indication of expression and on an attribute of the displayed first electronic content. In an alternative embodiment, the query circuit may include a query circuit operable to cause a search of an index for a second electronic content based on the indication of expression and on an attribute of the displayed first electronic content. In an embodiment, the search for a second electronic content may include a search of an index 1154 saved on the digital storage device. In another embodiment, the search for a second electronic content may include a search of at least one of a Web database, Web index, directory index, file index, content of a directory, or content of a file.
  • The chooser circuit 1140 includes a chooser circuit operable to select the second electronic content from a result of the search for a second electronic content. The digital storage device 1150 includes a storage device operable to save an indication of the selected second electronic content. For example, the indication of the selected second electronic content may be saved in a storage media 1152.
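The index search performed by the query circuit 1130 and the selection by the chooser circuit 1140 can be sketched against a small inverted index. The index layout (attribute mapped to a set of content identifiers) is an assumption made for the example.

```python
# Minimal sketch, assuming an inverted index keyed by content attribute.

def search_index(index, expression, attribute):
    """Return candidate second-content identifiers for the expression/attribute."""
    if expression == "interest":
        return sorted(index.get(attribute, set()))
    # For a negative expression, return items indexed only under other attributes.
    others = set()
    for attr, items in index.items():
        if attr != attribute:
            others |= items
    return sorted(others - index.get(attribute, set()))

def chooser(results):
    """Chooser circuit 1140: select the second content from the search result."""
    return results[0] if results else None
```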
  • In an alternative embodiment, the electronic device 1104 may include a characterization circuit 1160 operable to determine the attribute of the displayed first electronic content. In another embodiment, the electronic device may include a transmitter circuit 1170 operable to send an indication of the selected second electronic content addressed to the requestor. In a further embodiment, the electronic device may include a receiver circuit 1110 operable to receive from a requestor sensor data acquired from a person and indicative of a response by the person to a display of a first electronic content. The receiver circuit may further include a receiver circuit operable to receive from a requestor sensor data acquired from the person and indicative of a response by the person to a display of a first electronic content, and to receive data that is indicative of the displayed first electronic content.
  • FIG. 19 illustrates an example computer program product. The computer program product includes a computer-readable medium 1210 bearing program instructions 1220. The program instructions are operable to perform a process in a computing device. The process includes receive sensor data from a requestor indicative of a response by a person to a viewed first content. The process also includes analyze the received sensor data for an indication of an expression by the person corresponding to the viewed first content. The process further includes facilitate a search of an index for a second content using a search parameter corresponding to the indicated expression and to a content attribute of the viewed first content. The process also includes return to the requester an indication of the second content.
  • In an alternative embodiment, the process further includes select the second content from a result of the search for a second content 1222. In another embodiment, the process further includes save data indicative of the selected second content 1224. In another embodiment, the computer-readable medium includes a computer storage medium.
  • FIG. 20 illustrates an example system 1300 that includes an electronic device 1305. The electronic device includes means 1310 for receiving data from a requestor indicative of a sensed response by a person to a first content displayed to the person. The electronic device also includes means 1320 for analyzing the received data for an indication of an expression by the person corresponding to the first content. The electronic device further includes means 1330 for facilitating a search for a second content using a search parameter corresponding to the indication of an expression by the person and to a content attribute of the displayed first content. The electronic device also includes means 1340 for returning to the requestor an indication of the second content. In alternative embodiments, the electronic device may include means 1350 for receiving an indication of a content attribute of the displayed first content. The electronic device may include means 1360 for determining a content attribute of the displayed first content.
  • FIG. 21 illustrates an example environment 1400 in which embodiments may be implemented. The example environment includes an electronic device 1401, a response sensing apparatus 206, a transceiver circuit 1407, a user direct-input device 208, and a display surface 209. In some embodiments, one or more of the response sensing apparatus, the transceiver circuit, the user direct-input device, and the display surface may be structurally distinct from the remaining circuits or the electronic device. The display surface may be physically incorporated with the electronic device, or may be physically separate from the electronic device and electronically coupled with the device.
  • The electronic device 1401 may include a wired or wireless access to digital content using the transceiver 1407, such as via a network 299. In an alternative embodiment, the electronic device may be coupled to the network via a wireless link, a satellite link, and/or a wired link.
  • In an embodiment, the electronic device 1401 includes a reaction detector circuit 1410, an analytic circuit 1450, a query circuit 1470, and a display circuit 1480. In some embodiments, one or more of the reaction detector circuit, the analytic circuit, the query circuit, and/or the display circuit may be structurally distinct from the remaining circuits. In an embodiment, the electronic device or a portion of the electronic device may be implemented in whole or in part using the thin computing device 20 described in conjunction with FIG. 1, and/or the computing device 110 described in conjunction with FIG. 2. In another embodiment, the electronic device or a portion of the electronic device may be implemented using Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. In a further embodiment, one or more of the circuits and/or the machine may be implemented in hardware, software, and/or firmware. The person 205 may input commands and information to the electronic device 1401 using the user direct-input device 208.
  • The electronic device 1401 may include at least one additional circuit. The at least one additional circuit may include additional circuits 1490. In addition, the electronic device may include a processor (not illustrated), such as the processing unit 21 described in conjunction with FIG. 1, and/or the processor 120 described in conjunction with FIG. 2. In further addition, the electronic device may include a computer storage media illustrated as a data store. In an embodiment, the electronic device may include a mobile electronic device.
  • In an embodiment, the reaction detector circuit 1410 may include at least one additional circuit. The at least one additional circuit may include at least one of a reaction acquisition circuit 1412, a positive/negative reaction circuit 1414, a gaze reaction circuit 1416, a physiological reaction circuit 1418, a physical reaction circuit 1422, a common attribute circuit 1424, a search results attribute circuit 1426, a contextual attribute circuit 1428, a content characteristic circuit 1432, a device type circuit 1434, a display coupling circuit 1436, and/or a serial/parallel display reaction detector circuit 1438.
  • In another embodiment, the analytic circuit 1450 may include at least one additional circuit. The at least one additional circuit may include at least one of a content attributes determining circuit 1452, a style analytic circuit 1454, a sub-hierarchy analytic circuit 1456, or an attribute comparator circuit 1458.
  • In a further embodiment, the query circuit 1470 may include at least one additional circuit. The at least one additional circuit may include at least one of a multiple element search parameter circuit 1472, a local data store query circuit 1474, a search engine query circuit 1476, a third party search engine query circuit 1478, an algorithm search facilitating circuit 1482, a multiple second content search facilitating circuit 1484, a positive/negative correlation search facilitating circuit 1486, or a search parameter scope circuit 1488.
  • FIG. 22 illustrates an example operational flow 1500 implemented in an environment that includes a person viewing at least two instances of content having a common contextual attribute and displayed by an electronic device. In an alternative embodiment, the environment further includes an environment that includes a person directly interacting with the electronic device using a user direct-input device and viewing at least two instances of content having a common contextual attribute and displayed by an electronic device. FIG. 22 and several following figures may include various examples of operational flows, discussions, and explanations with respect to the above-described environment 1400 of FIG. 21, and/or with respect to other examples and contexts. However, it should be understood that the operational flows may be executed in a number of other environments and contexts, and/or in modified versions of FIG. 21. Also, although the various operational flows are illustrated in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, and/or may be performed concurrently.
  • After a start operation, the operational flow 1500 includes an observation operation 1510. The observation operation includes detecting a reaction by a person to a displayed first content of the at least two instances of displayed content having a common contextual attribute. The observation operation may be implemented using the reaction detector circuit 1410 of FIG. 21, and/or the response sensing apparatus 206 of FIG. 3. In an alternative embodiment, the reaction detector circuit 1410 is at least substantially similar to the reaction detector circuit 210 of FIG. 3.
  • An analytical operation 1550 includes determining a content attribute of the displayed first content. The analytical operation may be implemented using the analytic circuit 1450. A query operation 1570 includes initiating a search for a second content using a search parameter corresponding to the detected reaction and to the determined content attribute. The query operation may be implemented using the query circuit 1470. A broadcast operation 1590 includes facilitating a display of the second content in a manner perceivable by the person. The broadcast operation may be implemented using the display circuit 1480. The operational flow 1500 then proceeds to an end operation.
  • FIGS. 23 and 24 illustrate an alternative embodiment of the operational flow 1500 of FIG. 22. The operational flow may include at least one additional operation, illustrated as an operation 1610. The operation 1610 may include at least one of an operation 1612, an operation 1614, an operation 1616, or an operation 1618. The operation 1612 includes displaying the at least two instances of displayed content in a manner perceivable by the person. In an alternative embodiment, the operation 1612 may include at least one additional embodiment such as the operation 1613. The operation 1613 includes displaying the at least two instances of displayed content in a manner perceivable by the person and in a manner designed to facilitate a detectable response from the person. The operation 1612 and/or operation 1613 may be implemented using the display circuit 1480 and/or the display device 1409 of FIG. 21. The operation 1614 includes sensing a reaction by the person to the displayed first content of the at least two instances of displayed content having a common contextual attribute. The operation 1614 may be implemented using the response sensing apparatus 206 and its associated sensors 206A, 206B, and/or 206C. The operation 1616 includes selecting the second content from a result of the initiated search. The operation 1616 may be implemented using a circuit of the additional circuits 1490. The operation 1618 includes providing an access to the selected second content. The operation 1618 may be implemented using a circuit of the additional circuits 1490.
  • FIG. 25 illustrates an alternative embodiment of the operational flow 1400 of FIG. 22. The observation operation 1510 may include at least one additional operation. The at least one additional operation may include an operation 1512, an operation 1514, an operation 1516, an operation 1518, an operation 1522, or an operation 1524. The operation 1512 includes at least one of sensing, identifying, or recognizing a reaction by a person to a displayed first content of the at least two instances of displayed content having a common contextual attribute. The operation 1512 may be implemented using the reaction acquisition circuit 1412. The operation 1514 includes detecting at least one of a positive or negative reaction by a person to a displayed first content of the at least two instances of displayed content having a common contextual attribute. The operation 1514 may be implemented using the positive/negative reaction circuit 1414. The operation 1516 includes sensing a gaze by a person at a displayed first content of the at least two instances of displayed content and detecting a reaction by a person to the displayed first content. In an embodiment for example, the person 205 may gaze across a result displayed on the display surface 1409 from a search of the Internet, the displayed result including at least two instances of search results. Each of the at least two instances of displayed search results may be displayed textually on separate lines, or the at least two instances of displayed search results may be representatively and pictorially displayed by figures or pictures. For example, a result of a search of the word “Caesar” may be pictorially displayed by a picture of a statue of the Emperor Caesar, a picture of a Caesar salad, and a picture of Caesar's Italian restaurant. 
The operation 1516 in this example would include sensing the person's gaze across a first displayed picture of these pictorially represented search results, and detecting a reaction by the person to the first picture of the three displayed pictures. For example, the operation 1516 may sense the person's gaze on the picture of Caesar's Italian restaurant and detect a reaction. A positive reaction may be detected from the person to the picture of Caesar's Italian restaurant because the person is hungry and looking for a nearby Italian restaurant. The operation 1516 may be implemented using the gaze reaction circuit 1416.
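The gaze sensing of operation 1516 can be sketched as mapping a gaze coordinate onto the bounding box of one displayed result and treating a long dwell as a positive reaction. The box layout, coordinate scheme, and dwell threshold are assumptions made for the example.

```python
# Hypothetical sketch of operation 1516: gaze location + dwell time -> reaction.

def locate_gazed_result(gaze_x, gaze_y, layout):
    """Return the result whose bounding box (x0, y0, x1, y1) contains the gaze."""
    for name, (x0, y0, x1, y1) in layout.items():
        if x0 <= gaze_x <= x1 and y0 <= gaze_y <= y1:
            return name
    return None

def detect_gaze_reaction(gaze_x, gaze_y, dwell_seconds, layout):
    """Combine gaze location with dwell time to produce a detected reaction."""
    target = locate_gazed_result(gaze_x, gaze_y, layout)
    if target is None:
        return None, None
    # Assumed heuristic: a dwell beyond two seconds indicates a positive reaction.
    return target, ("positive" if dwell_seconds > 2.0 else "neutral")
```

In the "Caesar" example above, a lingering gaze on the restaurant picture would yield a positive reaction to that result.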
  • The operation 1518 includes detecting a physiological reaction by a person to a displayed first content of the at least two instances of displayed content having a common contextual attribute. The operation 1518 may be implemented using the physiological reaction circuit 1418. The operation 1522 includes detecting a physical response by a person to a displayed first content of the at least two instances of displayed content having a common contextual attribute. The operation 1522 may be implemented using the physical reaction circuit 1422. The operation 1524 includes detecting a reaction by a person to a displayed first content of the at least two instances of displayed content having a common contextual attribute. The common contextual attribute including the at least two instances of displayed content having been returned in response to a search request. The operation 1524 may be implemented using the common attribute circuit 1424.
  • FIG. 26 illustrates a further alternative embodiment of the operational flow 1500 of FIG. 22. The observation operation 1510 may include at least one additional operation. The at least one additional operation may include an operation 1526, an operation 1528, an operation 1532, an operation 1534, an operation 1536, or an operation 1538. The operation 1526 includes detecting a reaction by a person to a displayed first content of the at least two instances of displayed content having a common contextual attribute. The common contextual attribute including at least one of a displayed search result, an Internet search result, a sports result, a query result, a program list, a music list, a file list, or a directory search result. The operation 1526 may be implemented using the search results attribute circuit 1426. The operation 1528 includes detecting a reaction by a person to a displayed first content of the at least two instances of displayed content having a common contextual attribute. The at least two instances of displayed content including at least one of displayed images, avatars, icons, names, titles, or descriptors. The operation 1528 may be implemented using the contextual attribute circuit 1428. The operation 1532 includes detecting a reaction by a person to a displayed first content of the at least two instances of displayed content having a common contextual attribute. The common contextual attribute of the at least two instances of displayed content includes at least one of a displayed visual-based, image-based, text-based, or sound-based contextual attribute. The operation 1532 may be implemented using the content characteristic circuit 1432. The operation 1534 includes detecting a reaction by a person to a displayed first content of the at least two instances of displayed content having a common contextual attribute.
The at least two instances of displayed content including content displayed by at least one of a mobile communications device, handheld communications device, desktop computing device, limited resources computing device, thin computing device, or portable computing device. The operation 1534 may be implemented using the device type circuit 1434. The operation 1536 includes detecting a reaction by a person to a displayed first content of the at least two instances of displayed content having a common contextual attribute. The displayed at least two instances of content including at least two instances of content displayed on a surface coupled with a computing device, or displayed on a surface separate from the computing device. The operation 1536 may be implemented using the display coupling circuit 1436. The operation 1538 includes detecting a reaction by a person to a displayed first content of the at least two instances of displayed content. The at least two instances of displayed content including content displayed in at least one of a consecutive manner or a simultaneous manner. The operation 1538 may be implemented using the serial/parallel display reaction detector circuit 1438.
  • FIG. 27 illustrates an alternative embodiment of the operational flow 1500 of FIG. 22. The analytical operation 1550 may include at least one additional operation. The at least one additional operation may include an operation 1552, an operation 1554, an operation 1556, an operation 1558, or an operation 1562. The operation 1552 includes determining at least two content attributes of the displayed first content. The operation 1552 may be implemented using the content attributes determining circuit 1452. The operation 1554 includes determining a content attribute of the displayed first content. The determined content attribute including at least one of a category, tag, subject, color, texture, or theme of the displayed first content. For example, a theme may include sunsets, famous athletes, convicts, dogs, cats, horses, cars, airplanes, flowers, people, inventors, or entertainers. The operation 1554 may be implemented using the style analytic circuit 1454. The operation 1556 includes determining a content attribute of the displayed first content, the determined content attribute including at least one of a subset, drilldown, or a step down a hierarchy. The operation 1556 may be implemented using the sub-hierarchy analytic circuit 1456. The operation 1558 includes determining a content attribute of the displayed first content that is at least substantially absent from the other instances of the at least two instances of displayed content. The operation 1562 includes determining a content attribute of the displayed first content that is a sub-category of the common contextual attribute of the at least two instances of displayed content. The operations 1558 and/or 1562 may be implemented using the attribute comparator circuit 1458.
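Operations 1558 and 1562 amount to a set comparison over content attributes: finding an attribute of the displayed first content that is substantially absent from the other displayed instances. The following is a minimal illustrative sketch, not the specification's implementation; the tag-set model of a "content attribute" and all names are assumptions.

```python
def distinguishing_attributes(first_content, other_contents):
    """Return attributes of first_content absent from all other instances.

    Each content is modeled as a dict with an 'attributes' set of tags,
    e.g. {'sunset', 'beach'}. This models an operation like 1558: an
    attribute of the first content that the other displayed instances lack.
    """
    first = set(first_content["attributes"])
    others = set()
    for content in other_contents:
        others |= set(content["attributes"])
    # Attributes present in the first content but in none of the others.
    return first - others


first = {"attributes": {"dog", "outdoors", "sunset"}}
others = [{"attributes": {"cat", "outdoors"}},
          {"attributes": {"horse", "outdoors"}}]
print(sorted(distinguishing_attributes(first, others)))  # ['dog', 'sunset']
```

Here "outdoors" plays the role of the common contextual attribute shared by all instances, while the returned attributes are candidates for the narrower, content-specific attribute used to focus the subsequent search.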
  • FIG. 28 illustrates another alternative embodiment of the operational flow 1500 of FIG. 22. The query operation 1570 may include at least one additional operation. The at least one additional operation may include an operation 1572, an operation 1574, an operation 1576, an operation 1578, an operation 1582, an operation 1584, an operation 1586, or an operation 1588. The operation 1572 includes initiating a search for a second content using a search parameter corresponding to the detected reaction, to the determined content attribute, and to the common contextual attribute. The operation 1572 may be implemented using the multiple element search parameter circuit 1472. The operation 1574 includes initiating a search of a local data store using a search parameter corresponding to the detected reaction and to the determined content attribute. The operation 1574 may be implemented using the local data store query circuit 1474. The operation 1576 includes initiating a search by a search engine for a second content using a search parameter corresponding to the detected reaction and to the determined content attribute. The operation 1576 may be implemented using the search engine query circuit 1476. The operation 1578 includes initiating a search by a third-party search engine for a second content using a search parameter corresponding to the detected reaction and to the determined content attribute. The operation 1578 may be implemented using the third-party search engine query circuit 1478. The operation 1582 includes initiating a search for a second content using a search algorithm responsive to the detected reaction and to the determined content attribute. The operation 1582 may be implemented using the algorithm search facilitating circuit 1482. The operation 1584 includes initiating a search for at least two instances of a second content using a search parameter corresponding to the detected reaction and to the determined content attribute.
The operation 1584 may be implemented using the multiple second content search facilitating circuit 1484. The operation 1586 includes initiating a search for a second content based on at least one of a positive correlation or a negative correlation between the detected reaction and the determined content attribute. The operation 1586 may be implemented using the positive/negative correlation search facilitating circuit 1486. The operation 1588 includes initiating a search for a second content using a search parameter corresponding to the detected reaction and to a determined content attribute of the at least two instances of displayed content. The operation 1588 may be implemented using the search parameter scope circuit 1488.
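Operation 1586's positive/negative correlation can be sketched as a query builder in which a positive reaction turns the determined content attribute into an inclusion term and a negative reaction into an exclusion term. This is only one plausible reading; the function, the numeric reaction score, and the leading-minus exclusion syntax are assumptions borrowed from common web-search conventions, not from the specification.

```python
def build_query(attribute, reaction_score, base_terms=()):
    """Compose a search query from a content attribute and a detected reaction.

    A positive reaction_score treats the attribute as an inclusion term;
    a negative score treats it as an exclusion term (modeled with a
    leading '-'). A zero score leaves the base terms unchanged.
    """
    terms = list(base_terms)
    if reaction_score > 0:
        terms.append(attribute)        # positive correlation: seek more like this
    elif reaction_score < 0:
        terms.append("-" + attribute)  # negative correlation: steer away from this
    return " ".join(terms)


print(build_query("sunset", 0.8, base_terms=("photos",)))   # photos sunset
print(build_query("sunset", -0.5, base_terms=("photos",)))  # photos -sunset
```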
  • FIG. 29 illustrates an example environment 1700. The environment includes an electronic apparatus 1701 that is coupleable to the network 299, and which may be used by the person 205. The electronic apparatus may be coupled to the network via a wired link, illustrated as a cable link, and/or a wireless link, illustrated as a satellite link or a cellular network link. In an embodiment, the electronic apparatus may include a portable electronic apparatus, or a mobile electronic apparatus. In another embodiment, the electronic apparatus may include a wireless electronic apparatus. The electronic apparatus includes the user direct-input device 208, the display surface 209, a response sensor apparatus 1720, a target-content selector circuit 1730, a characterization circuit 1740, a query circuit 1750, and a chooser circuit 1760.
  • The display surface 209 includes a display surface operable to display at least two instances of electronic content in a manner perceivable by a person, such as by the person 205. The response sensor apparatus 1720 includes a sensor data acquisition module 1724, and at least one of the sensor 206A, the sensor 206B, or the wearable/mountable sensor 206C. The response sensor apparatus includes a sensor apparatus operable to acquire data indicative of a response by the person 205 to at least two instances of electronic content displayed by the surface 209.
  • The response sensor apparatus 1720 includes a response sensor apparatus operable to acquire data respectively indicative of respective responses by the person 205 to a first electronic content and a response to a second electronic content of at least two instances of electronic content displayed on the display surface 209 and having a common contextual attribute. In an alternative embodiment, the response sensor apparatus further includes a response sensor apparatus operable to acquire data respectively indicative of respective responses by the person to a first electronic content and a response to a second electronic content of at least two instances of electronic content having a common contextual attribute and concurrently displayed on the surface. In another alternative embodiment, the response sensor apparatus further includes a response sensor apparatus operable to acquire data respectively indicative of respective responses by the person to a first electronic content and a response to a second electronic content of at least two instances of electronic content having a common contextual attribute and serially displayed on the surface.
  • The target-content selector circuit 1730 includes a target-content selector circuit operable to select the first electronic content as an electronic content of interest over the second electronic content based at least in part on the data indicative of the response by the person to the first electronic content and to the second electronic content. In an alternative embodiment, the target-content selector circuit includes a target-content selector circuit operable to select the first electronic content as an electronic content of interest over the second electronic content by application of a target-selection algorithm that is responsive to the data indicative of the response by the person to the first electronic content and to the second electronic content. For example, the target-selection algorithm may be structured to select a target electronic content in response to a longest duration of the person's gaze with respect to the first electronic content and to the second electronic content. In another example, the target-selection algorithm may be structured to select a target electronic content in response to a plurality of parameters, such as duration of the person's gaze and the P-300 electrical brain wave response of the person 205 with respect to the first electronic content and with respect to the second electronic content.
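The multi-parameter example of a target-selection algorithm can be sketched as a scoring function over the response data. The weighted linear combination below, and every name and weight in it, is an assumption for illustration only; the specification requires only that the algorithm be responsive to the response data.

```python
def select_target(candidates, w_gaze=1.0, w_p300=1.0):
    """Select the electronic content of interest from response data.

    Each candidate is (content_id, gaze_seconds, p300_amplitude). The
    sketch scores each candidate as a weighted sum of gaze duration and
    P-300 amplitude and selects the highest-scoring content; with
    w_p300=0 it reduces to the longest-gaze example.
    """
    def score(candidate):
        _, gaze_seconds, p300_amplitude = candidate
        return w_gaze * gaze_seconds + w_p300 * p300_amplitude

    return max(candidates, key=score)[0]


responses = [("content_a", 2.5, 0.1), ("content_b", 4.0, 0.7)]
print(select_target(responses))  # content_b
```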
  • The characterization circuit 1740 includes a characterization circuit operable to determine an attribute of the displayed first electronic content. The query circuit 1750 includes a query circuit operable to cause a search for a third electronic content based on the determined attribute of the first electronic content. The chooser circuit 1760 includes a chooser circuit operable to select the third electronic content from a result of the initiated search.
  • In an alternative embodiment, the electronic apparatus 1701 may include a digital storage device 1790 operable to save the selected third electronic content. In another alternative embodiment, the electronic apparatus may include a receiver circuit operable to receive a result of the initiated search, illustrated as the transceiver circuit 1780. In an alternative embodiment, the electronic apparatus may include a broadcast circuit 1775 operable to facilitate a display of electronic content using the display surface 209, and/or another display surface, such as a wall or tabletop. In another alternative embodiment, the electronic apparatus may include a broadcast circuit 1775 operable to facilitate a display of electronic content.
  • In an alternative embodiment, the electronic apparatus 1701 may include a manifestation-analyzer circuit 1785. The manifestation-analyzer circuit includes a circuit operable to determine an indication of an expression by the person related to the first electronic content, the determination in response to the data indicative of a response to the first electronic content. In this alternative embodiment, the query circuit 1750 includes a query circuit operable to cause a search for the third electronic content based on the indication of expression and the attribute of the first electronic content.
  • FIG. 30 illustrates an example computer program product 1800. The computer program product includes a computer-readable medium 1810 bearing program instructions 1820. The program instructions are operable to perform a process in a computing device. The process includes receiving data indicative of respective responses by the person to at least two instances of electronic content being displayed on a surface and having a common contextual attribute. The process also includes selecting a first electronic content as an electronic content of interest over the remaining instances of electronic content based at least in part on the received data. The process further includes determining a reaction by the person to the first electronic content and a content attribute of the first electronic content. The process also includes initiating a search for a second electronic content based on the determined reaction by the person to the first electronic content and the determined content attribute of the first electronic content. The process includes selecting the second electronic content from a result of the initiated search. The process also includes facilitating a display of the selected second electronic content in a manner perceivable by the person. In an alternative embodiment, the process may include saving data indicative of the selected second electronic content 1822.
  • In another alternative embodiment, the computer-readable medium includes a computer readable storage medium 1812. In a further alternative embodiment, the computer-readable medium includes a computer readable communication medium 1814.
  • FIG. 31 illustrates an example electronic device 1905. The electronic device includes means 1910 for detecting a reaction by a person to a displayed first content of at least two instances of displayed content having a common contextual attribute. The electronic device also includes means 1920 for determining a content attribute of the displayed first content. The electronic device further includes means 1930 for initiating a search for a second content based on the reaction by a person and the content attribute. The electronic device also includes means 1950 for facilitating a display of the selected second content in a manner perceivable by the person. In an alternative embodiment, the electronic device may include means 1940 for selecting the second content from a result of the search. In another alternative embodiment, the electronic device may include means 1960 for saving data indicative of the selected second content.
  • FIG. 32 illustrates an example environment 2000. The example environment includes an electronic device 2004. The electronic device 2004 may include wired or wireless access to other electronic devices, such as, for example, a computing device, a requestor device 2001, or a server, using a communications circuit 2070, via the network 299. In an alternative embodiment, the electronic device may be coupled to the network via a wireless link, a satellite link, and/or a cellular network link.
  • In an embodiment, the electronic device 2004 includes a request receiver circuit 2010, a content of interest selector circuit 2020, a search facilitating circuit 2030, and a reply sending circuit 2050. In an alternative embodiment, the electronic device may include at least one of the communications circuit 2070, a broadcast circuit 2075, a content attribute receiver circuit 2092, a content attribute determining circuit 2094, a reaction analyzer circuit 2096, a context attribute receiver circuit 2098, a processor 2084, a digital storage device 2080, or additional circuit(s) 2095. In some embodiments, one or more of the request receiver circuit, the content of interest selector circuit, the search facilitating circuit, the reply sending circuit, the communications circuit, the broadcast circuit, the content attribute receiver circuit, the content attribute determining circuit, the reaction analyzer circuit, the context attribute receiver circuit, the processor, the digital storage device, or the additional circuit(s) may be structurally distinct from the remaining circuits. In an embodiment, the electronic device or a portion of the electronic device may be implemented in whole or in part using the thin computing device 20 described in conjunction with FIG. 1, and/or the computing device 110 described in conjunction with FIG. 2. In another embodiment, the electronic device or a portion of the electronic device may be implemented using Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. In a further embodiment, one or more of the circuits and/or the machine may be implemented in hardware, software, and/or firmware. The processor may be implemented using a processor such as the processing unit 21 described in conjunction with FIG. 1, and/or the processor 120 described in conjunction with FIG. 2.
In an embodiment, the electronic device may include a mobile electronic device.
  • FIG. 33 illustrates an example operational flow 2100. FIG. 33 and several following figures may include various examples of operational flows, discussions, and explanations with respect to the above-described environment 2000 of FIG. 32, and/or with respect to other examples and contexts. However, it should be understood that the operational flows may be executed in a number of other environments and contexts, and/or in modified versions of FIG. 32. Also, although the various operational flows are illustrated in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, and/or may be performed concurrently.
  • After a start operation, the operational flow 2100 includes a reception operation 2110. The reception operation includes receiving information that is indicative of respective responses by a person to each of at least two instances of electronically displayed content. The received information is derived from data acquired by a sensor coupled to the person and sent by a requestor electronic device. For example, in an embodiment, the data may include data acquired by at least one of the sensors 206A-206C and the response sensing apparatus 206 of a response by the person 205 viewing the at least two instances of content electronically displayed by the display surface 209 of FIG. 3, or as described in conjunction with FIG. 10. The reception operation may be implemented using the request receiver circuit 2010 of FIG. 32. A choosing operation 2120 includes selecting a particular content from the at least two instances of electronically displayed content. The selecting is based at least in part on the received information. The choosing operation may be implemented using the content of interest selector circuit 2020. A focusing operation 2130 includes facilitating a search for a new content using a search parameter corresponding to a content attribute of the particular content. The focusing operation may be implemented using the search facilitating circuit 2030. A reply operation 2150 includes returning an indication of the new content to the requestor electronic device. The reply operation may be implemented using the reply sending circuit 2050. The operational flow then moves to an end operation. In an alternative embodiment, the operational flow may include at least one additional operation, illustrated as an additional operation 2190.
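The reception, choosing, focusing, and reply operations of flow 2100 can be sketched end to end as a small server-side handler. Everything below is an assumed data shape for illustration: the received information is modeled as a mapping from content id to a response strength and a content attribute, and the facilitated search is a stand-in lookup rather than a real index or engine.

```python
def search(parameter):
    # Stand-in for the facilitated search (focusing operation 2130); a
    # real device would query an index, a search engine, or a local data
    # store as in the alternatives of FIG. 37.
    catalog = {"sunset": "sunset_gallery", "dogs": "dog_videos"}
    return catalog.get(parameter)


def handle_request(received):
    """Minimal sketch of operational flow 2100 (all structures assumed).

    'received' maps a content id to {'response': float, 'attribute': str}.
    The function call itself stands in for the reception operation 2110;
    the returned dict is the indication sent by the reply operation 2150.
    """
    # Choosing operation 2120: select the particular content.
    particular = max(received, key=lambda cid: received[cid]["response"])
    # Focusing operation 2130: search using the content attribute.
    search_parameter = received[particular]["attribute"]
    new_content = search(search_parameter)
    # Reply operation 2150: return an indication of the new content.
    return {"particular": particular, "new_content": new_content}


reply = handle_request({
    "item1": {"response": 0.2, "attribute": "dogs"},
    "item2": {"response": 0.9, "attribute": "sunset"},
})
print(reply)  # {'particular': 'item2', 'new_content': 'sunset_gallery'}
```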
  • FIG. 34 illustrates an alternative embodiment of the operational flow 2100 of FIG. 33. The additional operation 2190 may include at least one of an operation 2192, an operation 2194, an operation 2196, or an operation 2198. The operation 2192 includes receiving information indicative of a content attribute of the particular content. The operation 2192 may be implemented using the content attribute receiver circuit 2092. In an alternative embodiment, the operation 2192 may include at least one operation, such as the operation 2193. The operation 2193 includes determining a content attribute of the particular content. The operation 2194 includes determining a content attribute of the particular content. The operation 2194 may be implemented using the content attribute determining circuit 2094. The operation 2196 includes analyzing the received information for an indication of a reaction of the person to the particular content. The operation 2196 may be implemented using the reaction analyzer circuit 2096. The operation 2198 includes receiving data indicative of a common contextual attribute of the at least two instances of electronically displayed content. The operation 2198 may be implemented using the context attribute receiver circuit 2098.
  • FIG. 35 illustrates another alternative embodiment of the operational flow 2100 of FIG. 33. The reception operation 2110 may include at least one additional operation. The at least one additional operation may include an operation 2112, an operation 2114, an operation 2116, or an operation 2118. The operation 2112 includes receiving at least one of raw, partially transformed, or transformed information that is indicative of respective responses by a person to each of at least two instances of electronically displayed content. The operation 2114 includes receiving information that is indicative of respective responses by a person to each of at least two instances of electronically displayed content having a common contextual attribute. The operation 2116 includes receiving information that is indicative of respective responses by a person to each of at least two instances of electronically displayed content, and indicative of a common contextual attribute of the at least two instances of electronically displayed content. The operation 2118 includes receiving information that is indicative of respective responses by a person to each of at least two instances of electronically displayed content. The received information is derived from data acquired by a sensor that is at least one of electrically, optically, or mechanically coupled to the person and sent by a requestor electronic device.
  • FIG. 36 illustrates a further alternative embodiment of the operational flow 2100 of FIG. 33. The choosing operation 2120 may include at least one additional operation, such as the operation 2122. The operation 2122 includes selecting from the at least two instances of electronically displayed content a particular content corresponding to at least one of a positive or a negative indication of interest by the person.
  • FIG. 37 illustrates an alternative embodiment of the operational flow 2100 of FIG. 33. The focusing operation 2130 may include at least one additional operation. The at least one additional operation may include an operation 2132, an operation 2134, an operation 2136, an operation 2138, an operation 2142, an operation 2144, or an operation 2146. The operation 2132 includes facilitating a search of an index for a new content using a search parameter corresponding to a content attribute of the particular content. The operation 2134 includes facilitating a search for a focused content using a search parameter corresponding to a content attribute of the particular content. The operation 2136 includes facilitating a search for a new content using a search parameter corresponding both to a content attribute of the particular content and to an indication of a reaction by the person to the selected content of interest. The operation 2138 includes facilitating a search for a new content using a search parameter corresponding to at least one of a content attribute of the particular content, a common contextual attribute of the at least two instances of electronically displayed content, or a reaction by the person to the selected content of interest. The operation 2142 includes facilitating a search of at least one of a Web database, a Web index, a directory index, a file index, content of a directory, or content of a file for a new content using a search parameter corresponding to a content attribute of the particular content. The operation 2144 includes searching for a new content using a search parameter corresponding to a content attribute of the particular content. The operation 2146 includes facilitating a search for a new content predicted to be of interest to the person by a computer implemented algorithm, the algorithm responsive to a content attribute of the particular content.
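Operation 2146's "search for a new content predicted to be of interest to the person by a computer implemented algorithm, the algorithm responsive to a content attribute of the particular content" can be sketched as an attribute-overlap ranking. The Jaccard similarity used here, and all data shapes, are assumptions chosen for illustration; any algorithm responsive to the content attribute would satisfy the operation.

```python
def predict_interest(particular_attributes, candidates):
    """Rank candidate new content by attribute overlap with the particular content.

    Candidates are dicts with an 'id' and an 'attributes' set. The score
    is the Jaccard similarity between the particular content's attribute
    set and each candidate's attribute set; higher means more similar.
    """
    def jaccard(a, b):
        a, b = set(a), set(b)
        return len(a & b) / len(a | b) if a | b else 0.0

    return sorted(candidates,
                  key=lambda c: jaccard(particular_attributes, c["attributes"]),
                  reverse=True)


ranked = predict_interest(
    {"sunset", "beach"},
    [{"id": "c1", "attributes": {"sunset", "beach", "palm"}},
     {"id": "c2", "attributes": {"city", "night"}}],
)
print([c["id"] for c in ranked])  # ['c1', 'c2']
```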
  • FIG. 38 illustrates an example environment 2200. The example environment includes an electronic device 2204 operable to communicate with a requestor device 2201 using the network 299, via for example, a wireless link, a satellite link, and/or a wired link. In an embodiment, the requestor device may include the electronic device 1401 described in conjunction with FIG. 21, and/or the electronic apparatus 1701 described in conjunction with FIG. 29. The electronic device includes an information receiver circuit 2210, a selector circuit 2220, a query circuit 2230, and a content of possible interest (CPI) transmitter circuit 2240. In an alternative embodiment, the electronic device may include at least one of a characterization circuit 2250, a chooser circuit 2260, an analytic circuit 2270, a storage device 2280, a results receiver 2285, a processor 2285, or other circuit(s) 2290. In an embodiment, the electronic device or a portion of the electronic device may be implemented in whole or in part using the computing device 110 described in conjunction with FIG. 2. In another embodiment, the electronic device or a portion of the electronic device may be implemented using Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. In a further embodiment, one or more of the circuits and/or the machinery of the electronic device may be implemented in hardware, software, and/or firmware.
  • The information receiver circuit 2210 includes an information receiver circuit operable to receive information that is indicative of respective responses by a person to each of at least two instances of electronically displayed content. The received information is derived from data acquired by a sensor coupled to the person and sent by a requestor electronic device. For example, in an embodiment, the data may include data acquired by at least one of the sensors 206A-206C and the response sensing apparatus 206 sensing respective responses by the person 205 viewing the at least two instances of content electronically displayed by the display surface 209 of FIG. 3, or as described in conjunction with FIG. 10. The selector circuit 2220 includes a selector circuit operable to choose a particular content from the at least two instances of electronically displayed content, the selecting based at least in part on the received information. The query circuit 2230 includes a query circuit operable to facilitate a search for a new content using a search parameter corresponding to a content attribute of the particular content. The CPI transmitter circuit 2240 includes a transmitter circuit operable to send information that is indicative of the new content to the requestor electronic device.
  • The characterization circuit 2250 includes a characterization circuit operable to determine the content attribute of the particular content. The chooser circuit 2260 includes a chooser circuit operable to select the new content from a result of the facilitated search. The analytic circuit 2270 includes an analytic circuit operable to analyze the received information for a respective indication of a reaction by the person corresponding to each of the at least two instances of electronically displayed content. The storage device 2280 includes a storage device operable to save data indicative of the new content. The results receiver circuit 2285 includes a receiver circuit operable to receive a result of the search for a new content.
  • FIG. 39 illustrates an example computer program product 2300. The computer program product includes a computer-readable medium 2310 bearing program instructions 2320 operable to perform a process in a computing device. For example, the computing device may be implemented in whole or in part using the thin computing device 20 described in conjunction with FIG. 1, and/or the computing device 110 described in conjunction with FIG. 2. The process includes receiving information that is indicative of respective responses by a person to each of at least two instances of electronically displayed content. The received information is derived from data acquired by a sensor coupled to the person and sent by a requestor electronic device. The process also includes selecting a particular content from the at least two instances of electronically displayed content, the selecting based at least in part on the received information. The process further includes facilitating a search for a new content using a search parameter corresponding to a content attribute of the particular content. The process also includes returning an indication of the new content to the requestor electronic device.
  • In an alternative embodiment, the process further includes receiving information indicative of a content attribute of the particular content 2322. In another alternative embodiment, the process further includes determining a content attribute of the particular content 2324. In a further embodiment, the process further includes analyzing the received information for an indication of a reaction of the person to the particular content 2326. In another alternative embodiment, the process further includes receiving information indicative of a common contextual attribute of the at least two instances of electronically displayed content 2328. In a further alternative embodiment, the process further includes saving data indicative of the new content 2332.
  • In another alternative embodiment, the computer-readable medium includes computer storage medium 2312.
  • FIG. 40 illustrates an example electronic device 2405. The electronic device includes means 2410 for receiving information that is indicative of respective responses by a person to each of at least two instances of electronically displayed content. The received information is derived from data acquired by a sensor coupled to the person and sent by a requestor electronic device. The electronic device also includes means 2420 for selecting a particular content from the at least two instances of electronically displayed content, the selecting based at least in part on the received information. The electronic device further includes means 2430 for facilitating a search for a new content using a search parameter corresponding to a content attribute of the particular content. The electronic device includes means 2440 for returning an indication of the new content to the requestor electronic device.
  • In an alternative embodiment, the electronic device may include means 2450 for receiving information indicative of a content attribute of the particular content. In another embodiment, the electronic device may include means 2460 for determining a content attribute of the particular content. In a further alternative embodiment, the electronic device may include means 2470 for receiving data indicative of a common contextual attribute of the at least two instances of electronically displayed content.
  • FIG. 41 illustrates an example environment 2600. The example environment includes an electronic device 2601. The electronic device includes a response sensing apparatus 206, a transceiver circuit 2607, and the electronic display surface 209. In an alternative embodiment, the electronic device may include a user direct-input device 208. In some embodiments, one or more of the response sensing apparatus, the transceiver circuit, the user direct-input device, and the display surface may be structurally distinct from the remaining circuits of the electronic device. The display surface may be physically incorporated with the electronic device, or may be physically separate from the electronic device and electronically coupled with the device. In another embodiment, the display surface is structurally and electrically distinct from the electronic device, and displays a content projected by a projector display engine (not shown) of the electronic device. The electronic device 2601 may include wired or wireless access to digital content using the transceiver 2607, such as via a network 299. In an alternative embodiment, the electronic device may be coupled to the network via a wireless link, a satellite link, and/or a wired link.
  • In an embodiment, the electronic device 2601 includes a synthesizer circuit 2610, an analytic circuit 2660, a query circuit 2680, and a display circuit 2670. In some embodiments, one or more of the synthesizer circuit, the analytic circuit, the query circuit, and the display circuit may be structurally distinct from the remaining circuits. In an embodiment, the electronic device or a portion of the electronic device may be implemented in whole or in part using the thin computing device 20 described in conjunction with FIG. 1, and/or the computing device 110 described in conjunction with FIG. 2. In another embodiment, the electronic device or a portion of the electronic device may be implemented using Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. In a further embodiment, one or more of the circuits and/or the machine may be implemented in hardware, software, and/or firmware. The person 205 may input commands and information to the electronic device 2601 using the user direct-input device 208.
  • The electronic device 2601 may include at least one additional circuit. The at least one additional circuit may include additional circuit(s) 2690. In addition, the electronic device may include a processor, such as the processing unit 21 described in conjunction with FIG. 1, and/or the processor 120 described in conjunction with FIG. 2. In further addition, the electronic device may include a computer storage media illustrated as a data store. In an embodiment, the electronic device may include a mobile electronic device.
  • FIG. 42 illustrates an example operational flow 2700. In an embodiment, the operational flow may be implemented in an environment that includes a person viewing a general advertisement displayed by an electronic device and having a characteristic. In an alternative embodiment, the operational flow may be implemented in an environment that includes a person directly interacting with the electronic device using a user direct-input device and viewing a general advertisement having a characteristic. In another alternative embodiment, the operational flow may be implemented in an environment that includes a person viewing a general advertisement having a characteristic. FIG. 42 and several following figures may include various examples of operational flows, discussions, and explanations with respect to the above-described environment 2600 of FIG. 41, and/or with respect to other examples and contexts. However, it should be understood that the operational flows may be executed in a number of other environments and contexts, and/or in modified versions of FIG. 41. Also, although the various operational flows are illustrated in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, and/or may be performed concurrently.
  • After a start operation, the operational flow 2700 moves to a synthesize operation 2710. The synthesize operation includes generating a marketing impact information indicative of a physiological response by a person to an electronically displayed general advertisement. In an embodiment, a general advertisement includes an advertisement that is not specific or personalized to the person. In another embodiment, a general advertisement includes an advertisement that is not specialized for the person. For example, a general advertisement may include an advertisement selected using revealed or entered profile information of the person, or an advertisement selected based upon a content displayed to the person, such as by using Google AdSense. In a further embodiment, a general advertisement includes an advertisement for which the person's physiological response is not previously known. In another embodiment, a general advertisement includes a targeted-advertisement for which the person's physiological response is not previously known. In a further embodiment, a general advertisement includes a previously selected targeted-advertisement for which the person's physiological response is known when a more focused targeted-advertisement is sought. The synthesize operation may be implemented using the synthesizer circuit 2610 of FIG. 41. Data corresponding to the physiological response by the person to the electronically displayed general advertisement may be acquired using the response sensing apparatus 206 of FIG. 3. In an alternative embodiment, data corresponding to the physiological response by the person to the electronically displayed general advertisement may be acquired using a reaction detector circuit (not shown), such as by the reaction detector circuit 1410 of FIG. 21, and/or the reaction detector circuit 210 of FIG. 3, in conjunction with the response sensing apparatus 206 of FIG. 3.
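The synthesize operation can be illustrated as a reduction of sensor readings to a single value. The sensor channels, weights, and normalization in this Python sketch are assumptions chosen for illustration, not values from the application:

```python
# Illustrative sketch of the synthesize operation 2710: reduce raw
# physiological readings (each assumed normalized to 0..1) to one
# marketing impact value. Channel names and weights are assumptions.

def marketing_impact(readings):
    """Weighted combination of normalized physiological readings."""
    weights = {"gaze_dwell": 0.5, "pupil_dilation": 0.3, "heart_rate_delta": 0.2}
    return sum(w * readings.get(channel, 0.0) for channel, w in weights.items())

impact = marketing_impact(
    {"gaze_dwell": 0.8, "pupil_dilation": 0.5, "heart_rate_delta": 0.2}
)  # 0.5*0.8 + 0.3*0.5 + 0.2*0.2 = 0.59
```

A missing channel simply contributes zero here; a real synthesizer circuit would likely handle absent or noisy sensor data more carefully.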
  • An attribute operation 2760 includes acquiring an indication of a characteristic of the electronically displayed general advertisement. In an embodiment, a characteristic of the electronically displayed general advertisement may include at least one of a category, tag, subject, color, texture, or theme. For example, a subject characteristic may include a nature, athletic, criminal, animal, car, airplane, boat, flower, people, or entertainer subject. The attribute operation may be implemented using the analytic circuit 2660 of FIG. 41. A choice operation 2780 includes initiating a selection of a targeted-advertisement using an advertising rule responsive to at least the characteristic of the electronically displayed general advertisement and the marketing impact information. The choice operation may be implemented using the query circuit 2680 of FIG. 41. The operational flow then moves to an end operation.
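One possible advertising rule responsive to both inputs of the choice operation can be sketched as follows; the threshold, inventory format, and function names are hypothetical, not recited in the application:

```python
# Hypothetical advertising rule for the choice operation 2780: narrow to a
# targeted-advertisement that shares the general advertisement's subject
# characteristic only when the marketing impact is strong enough.
# Threshold value and inventory shape are illustrative assumptions.

def select_targeted_advertisement(characteristic, impact, inventory, threshold=0.5):
    """`inventory` maps ad ids to sets of subject characteristics.
    Returns a matching ad id, or None for a weak or unmatched response."""
    if impact < threshold:
        return None  # weak response: do not commit to this subject
    for ad_id, subjects in inventory.items():
        if characteristic in subjects:
            return ad_id
    return None

inventory = {"ad-7": {"boat", "sailing"}, "ad-8": {"car"}}
chosen = select_targeted_advertisement("boat", 0.59, inventory)  # "ad-7"
```

Returning None on a weak response is one design choice; a rule could equally fall back to another general advertisement instead.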
  • FIG. 43 illustrates an alternative embodiment of the operational flow 2700 of FIG. 42. The operational flow may include at least one additional operation 2810. The at least one additional operation may include an operation 2812, an operation 2816, an operation 2818, or an operation 2822. The operation 2812 includes electronically displaying the general advertisement in a manner perceivable by the person. The operation 2812 may be implemented using the display circuit 2670 of FIG. 41. In an embodiment, the display circuit may facilitate a display of the general advertisement using the electronic display surface 209. In another embodiment, the display circuit may facilitate a display of the general advertisement using a projector circuit operable to display the general advertisement on another surface, such as a wall, screen, or article of clothing. In an embodiment, the operation 2812 may include at least one additional operation, such as an operation 2814. The operation 2814 includes electronically displaying the general advertisement in a manner perceivable by at least one of the person's visual, audio, tactile, or olfactory senses. The operation 2816 includes receiving the targeted-advertisement. For example, the targeted-advertisement may be received from an adserver 2604. The operation 2816 may be implemented using a receiver circuit, such as a receiver element of the transceiver circuit 2607. The operation 2818 includes saving an indication of the targeted-advertisement. The operation 2818 may be implemented using the data store of FIG. 41. The operation 2822 includes electronically displaying the selected targeted-advertisement. The operation 2822 may be implemented using the display circuit 2670 of FIG. 41. In an embodiment, the display circuit may facilitate a display of the selected targeted-advertisement using the electronic display surface 209. 
In another embodiment, the display circuit may facilitate a display of the selected targeted-advertisement using a projector circuit operable to display the selected targeted-advertisement on another surface, such as a wall, screen, or article of clothing. In an alternative embodiment, the operation 2822 includes electronically displaying the selected targeted-advertisement in a manner perceivable by the person. The operation 2822 may be implemented using a display, such as the electronic display 209.
  • FIG. 44 illustrates another alternative embodiment of the operational flow 2700 of FIG. 42. The synthesize operation 2710 may include at least one additional operation. The at least one additional operation may include an operation 2712, an operation 2714, an operation 2716, an operation 2718, an operation 2722, or an operation 2724. The operation 2712 includes generating a marketing impact information indicative of a physiological response by a person to an electronically displayed general advertisement. In an embodiment, the physiological response includes at least one of an eye gaze direction, an eye movement, eye dwell time, a movement of an eyelid, an eye blink, a pupil dilation, a lip movement, a brain wave, a heart rate, a respiration rate, or a voice quality response. The operation 2714 includes generating a marketing impact information indicative of a physiological response by a person to an electronically displayed general advertisement and based at least in part on data produced by a sensor coupled to the person. The operation 2716 includes generating a marketing impact information that is indicative of a physiological response by a person to an electronically displayed general advertisement and that is based at least in part on data produced by at least one of an eye gaze, pulse, brain wave, or P-300 sensor coupled to a person. The operation 2718 includes generating a marketing impact information indicative of a physiological response by a person to an electronically displayed general advertisement, and based at least in part on data produced by a sensor that is at least one of visually, optically, physically, or mechanically coupled to the person. The operation 2722 includes generating a marketing impact information indicative of a physiological response by a person to an electronically displayed general advertisement, and based at least in part on data produced by a sensor that is worn by the person. 
The operation 2724 includes generating a marketing impact information indicative of a psychophysiological response by a person to an electronically displayed general advertisement.
  • FIG. 45 illustrates another alternative embodiment of the operational flow 2700 of FIG. 42. The synthesize operation 2710 may include at least one additional operation. The at least one additional operation may include an operation 2726, an operation 2728, an operation 2732, an operation 2734, an operation 2736, or an operation 2738. The operation 2726 includes generating a marketing impact information indicative of a response by a person to an electronically displayed general advertisement. The general advertisement is electronically displayed using at least one of a screen, a display surface, a projector, or a sound. The operation 2728 includes generating a marketing impact information indicative of a response by a person to an electronically displayed general advertisement. The general advertisement includes at least one of a promotional content, an offer of a product and/or service, a public service announcement, or a product placement. The operation 2732 includes generating a marketing impact information indicative of a response by a person to an electronically displayed general advertisement. The general advertisement includes a form of communication designed to persuade the person to take some action, now or in the future. The operation 2734 includes generating a marketing impact information indicative of a response by a person to an electronically displayed general advertisement. The general advertisement includes a communication designed to encourage or stimulate patronization of a specific seller or purchase of a particular product. The operation 2736 includes generating a marketing impact information indicative of a response by a person to an electronically displayed general advertisement. The general advertisement is received from an advertising server via a network. The operation 2738 includes generating a marketing impact information indicative of a response by a person to an electronically displayed general advertisement. 
The general advertisement is received from an advertising server via a network in conjunction with another content configurable for electronic display. In an embodiment, the advertising server may include the Adserver 2604 illustrated in conjunction with FIG. 41.
  • FIG. 46 illustrates an alternative embodiment of the operational flow 2700 of FIG. 42. The synthesize operation 2710 may include at least one additional operation. The at least one additional operation may include an operation 2742, an operation 2744, an operation 2746, or an operation 2748. The operation 2742 includes generating a marketing impact evaluation indicative of a response by a person to an electronically displayed general advertisement. The operation 2744 includes generating a marketing impact information indicative of a determined reaction by a person to the electronically displayed general advertisement. The determined reaction is based at least in part on sensor-acquired data indicative of a physiological response. The operation 2746 includes generating a marketing impact information indicative of a physiological response by a person to an electronically displayed paid content. The operation 2748 includes generating a marketing impact information indicative of a physiological response by a person to an electronically displayed general advertisement. In an embodiment, the electronically displayed general advertisement includes at least one of an electronically displayed text, logo, photograph, picture, classified ad, graphic information, static image, dynamic image, streaming ad, interactive, audio, video, banner, rich media banner, placement ad, search advertising, contextual advertising, commercial message, interactive ad, interstitial ad, floating ad, wallpaper ad, pop-up, pop-under, or map ad.
  • FIG. 47 illustrates a further alternative embodiment of the operational flow 2700 of FIG. 42. The attribute operation 2760 may include at least one additional operation. The at least one additional operation may include an operation 2762, an operation 2764, or an operation 2766. The operation 2762 includes at least one of determining, or ascertaining an indication of a characteristic of the electronically displayed general advertisement. The operation 2764 includes at least one of receiving, or receiving in conjunction with receiving the general advertisement, an indication of a characteristic of the electronically displayed general advertisement. The operation 2766 includes acquiring an indication of a characteristic of the electronically displayed general advertisement. In an embodiment, the characteristic of the electronically displayed general advertisement includes at least one of content, subject matter, dialog, cultural, ethnic, linguistic, visual, verbal, sexual, price range, local, global, or brand characteristic.
  • FIG. 48 illustrates an alternative embodiment of the operational flow 2700 of FIG. 42. The choice operation 2780 may include at least one additional operation. The at least one additional operation may include an operation 2782, an operation 2784, an operation 2786, an operation 2788, an operation 2792, or an operation 2794. The operation 2782 includes initiating at least one of a local, a remote, or remote from an advertising server, selection of a targeted-advertisement using an advertising rule responsive at least to both the characteristic of the electronically displayed general advertisement and the marketing impact information. The operation 2784 includes initiating a selection of a targeted-advertisement from at least two instances of available advertising using an advertising rule responsive to at least the characteristic of the electronically displayed general advertisement and the marketing impact information. The operation 2786 includes initiating a selection of a targeted-advertisement using an advertising rule responsive to at least the characteristic of the electronically displayed general advertisement, the marketing impact information, and a historical behavior by the person. In an embodiment, the historical behavior by the person may include a historical Internet related behavior. In another embodiment, the historical behavior by the person may include a profile of the person. The operation 2788 includes initiating a selection of a targeted-advertisement using an advertising rule responsive at least to both the characteristic of the electronically displayed general advertisement and the marketing impact information. The targeted-advertisement includes a content that is particularly relevant to the person. 
The operation 2792 includes initiating a selection of a personalized advertisement using an advertising rule responsive at least to both the characteristic of the electronically displayed general advertisement and the marketing impact information. The operation 2794 includes initiating a selection of a targeted-advertisement using an advertising rule responsive at least to both the characteristic of the electronically displayed general advertisement and the marketing impact information. The targeted-advertisement includes a content that is of potential interest to the person.
  • FIG. 49 illustrates an example electronic system 2900. The system includes an electronic apparatus 2901 that is coupleable to a network 299, and which may be used by a person 205. The electronic apparatus may be coupled to the network via a wired link, illustrated as a cable link, and/or a wireless link illustrated as a satellite link. The system includes a response sensor apparatus 2920, a recognition circuit 2930, a query circuit 2940, and an electronic display circuit 2950. In an embodiment, the response sensor apparatus, the recognition circuit, the query circuit, and the electronic display circuit are included in the electronic apparatus. In another embodiment, the electronic apparatus may include the user direct-input device 208. In some embodiments, one or more of the response sensor apparatus, the recognition circuit, the query circuit, and the electronic display circuit may be structurally distinct from the remaining circuits or the electronic apparatus. In an alternative embodiment, the electronic apparatus includes at least one of a portable electronic device, or a mobile electronic device.
  • The response sensor apparatus 2920 includes a response sensor apparatus operable to acquire data indicative of a physiological response by the person 205 to a general advertisement. The response sensor apparatus includes a sensor data acquisition module 2924 and at least one user sensor operable to acquire data indicative of the response by the person to the general advertisement displayed by or on a display surface. The at least one user sensor is illustrated as a sensor 206A, a sensor 206B, and/or a wearable/mountable sensor 206C. The at least one user sensor may be physically incorporated with the electronic device, or may be physically separate from the electronic device and electronically coupled with the device. The general advertisement may include at least one of a static advertisement, such as a billboard or a poster, an advertisement displayed on the electronic display surface 209, or an advertisement projected on the display surface 2959. In an embodiment, the response sensor apparatus is at least substantially similar to the response sensor apparatus 520 described in conjunction with FIG. 10.
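The sensor data acquisition module's role of gathering readings from several user sensors can be sketched minimally; the sensor interfaces, names, and units below are illustrative assumptions:

```python
# Sketch of the sensor data acquisition module 2924 collecting one reading
# from each attached user sensor (206A, 206B, wearable/mountable 206C).
# Sensor names, callables, and units are illustrative assumptions.

def acquire_sensor_data(sensors):
    """Poll each named sensor callable once; return readings keyed by name."""
    return {name: read() for name, read in sensors.items()}

record = acquire_sensor_data({
    "gaze": lambda: (0.4, 0.6),   # normalized display coordinates
    "pulse": lambda: 72,          # beats per minute
    "wearable_gsr": lambda: 1.3,  # skin conductance, microsiemens
})
```

Because each sensor may be physically incorporated with or separate from the device, a real module would also handle sensors that are absent or fail to respond.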
  • The recognition circuit 2930 includes a recognition circuit operable to generate a marketing impact information about the general advertisement based at least in part on the acquired data indicative of a physiological response. The query circuit 2940 includes a query circuit operable to initiate a selection of a targeted-advertisement by a targeted-advertising selection engine responsive to at least a characteristic of the general advertisement and the marketing impact information. In an embodiment, the targeted-advertising selection engine includes a local targeted-advertising selection engine 2975. In another embodiment, the targeted-advertising selection engine includes a targeted-advertising selection engine included in an adserver 2904. In an alternative embodiment, the query circuit is operable to initiate at least one of a local, or a remote selection of a targeted-advertisement. For example, the remote selection of a targeted-advertisement may be performed by a remote advertising server, illustrated as an adserver 2904. In another alternative embodiment, the targeted-advertisement may be selected from at least two instances of available marketing content.
  • The electronic display circuit 2950 includes an electronic display circuit operable to present the targeted-advertisement in a manner perceivable by a person. In an embodiment, the electronic display circuit is operable to drive an electronic display surface. For example, the electronic display circuit may drive the electronic display surface 209. In another embodiment, the electronic display circuit includes projector display engine 2952 operable to project the general advertisement on a display surface 2959.
  • In an alternative embodiment, the recognition circuit 2930 includes a feature extraction circuit operable to compute a response information based at least in part on the acquired data indicative of a physiological response. In another embodiment, the recognition circuit includes a classification circuit operable to classify a reaction by the person to the general advertisement based at least in part on the response information.
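The two stages named here, feature extraction followed by classification, can be sketched as a small pipeline. The gaze-sample format, the single feature, and the decision thresholds are illustrative assumptions:

```python
# Sketch of the recognition circuit 2930 as two stages: a feature
# extraction step over raw sensor data, then a classification step that
# labels the person's reaction. Thresholds and feature are assumptions.

def extract_features(gaze_samples):
    """Compute the fraction of (x, y, on_ad) gaze samples on the ad."""
    if not gaze_samples:
        return {"on_ad_fraction": 0.0}
    hits = sum(1 for _, _, on_ad in gaze_samples if on_ad)
    return {"on_ad_fraction": hits / len(gaze_samples)}

def classify_reaction(features):
    """Map the extracted feature to a coarse reaction label."""
    f = features["on_ad_fraction"]
    if f >= 0.6:
        return "interested"
    return "indifferent" if f >= 0.2 else "averse"

reaction = classify_reaction(extract_features(
    [(0.1, 0.2, True), (0.3, 0.4, True), (0.5, 0.5, True), (0.9, 0.9, False)]
))  # 3 of 4 samples land on the ad
```

A production classifier would combine many such features; the point here is only the separation into extraction and classification stages.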
  • In a further embodiment, the electronic display circuit 2950 includes an electronic display circuit operable to present the general advertisement and the targeted-advertisement in a manner perceivable by a person. In an embodiment, the electronic display circuit includes an electronic display operable to present electronic content in a manner perceivable by the person and in a manner designed to facilitate sensing a response by the person. In another embodiment, the electronic display circuit further includes an electronic display operable to present the targeted-advertisement in a manner perceivable by at least one of the person's visual, audio, tactile, or olfactory senses.
  • In an embodiment, the electronic system 2900 includes a characterization circuit 2935. The characterization circuit includes a characterization circuit operable to acquire an indication of a characteristic of the general advertisement. The characterization circuit may further include a characterization circuit operable to at least one of receive, or determine an indication of a characteristic of the general advertisement. In another embodiment, the electronic system includes a digital storage device 2990 operable to save an indication of the selected targeted-advertisement. In a further embodiment, the electronic system includes an advertising broadcast circuit 2970 operable to facilitate a display of the selected targeted-advertisement by the electronic display. In a further embodiment, the electronic system includes a receiver circuit 2998 operable to receive an indication of the selected targeted-advertisement.
  • FIG. 50 illustrates an example computer program product 3000. The computer program product includes a computer-readable medium 3010 bearing program instructions 3020. The program instructions are operable to perform a process in a computing device. The process includes instructions generating a marketing impact information indicative of a physiological response by a person to an electronically displayed general advertisement 3030. The process also includes instructions acquiring an indication of a characteristic of the electronically displayed general advertisement 3040. The process further includes instructions initiating a selection of a targeted-advertisement using an advertising rule responsive to at least the characteristic of the electronically displayed general advertisement and the marketing impact information 3050.
  • In an alternative embodiment, the process of the program instructions 3020 may include additional instruction(s) 3090. The additional instructions may include instructions initiating an electronic display of the general advertisement 3092. The additional instructions may include instructions receiving an indication of the targeted-advertisement 3094. The additional instructions may include instructions saving an indication of the targeted-advertisement 3096.
  • In another alternative embodiment, the instructions initiating 3050 further include instructions sending a request to an advertising server for a selection of a targeted-advertisement using an advertising rule responsive to at least the characteristic of the electronically displayed general advertisement and the marketing impact information 3052. In a further alternative embodiment, the instructions initiating further include instructions receiving a selected targeted-advertisement from an advertising server. The targeted-advertisement is selected using an advertising rule responsive to at least the characteristic of the electronically displayed general advertisement and the marketing impact information 3054.
  • In a further embodiment, the computer-readable medium 3010 includes a computer-readable storage medium 3012. In another embodiment, the computer-readable medium includes a computer-readable communications medium 3014.
  • FIG. 51 illustrates an example system 3100 that includes an electronic device 3105. The electronic device includes means 3110 for generating a marketing impact information indicative of a physiological response by a person to an electronically displayed general advertisement. The electronic device also includes means 3120 for acquiring an indication of a characteristic of the electronically displayed general advertisement. The electronic device further includes means 3130 for initiating a selection of a targeted-advertisement using an advertising rule responsive to at least the characteristic of the electronically displayed general advertisement and the marketing impact information.
  • In an alternative embodiment, the electronic device may include at least one additional means. The at least one additional means may include means 3140, means 3150, or means 3160. Means 3140 includes means for electronically displaying the general advertisement in a manner perceivable by the person. Means 3150 includes means for receiving the targeted-advertisement. Means 3160 includes means for saving an indication of the targeted-advertisement.
  • FIG. 52 illustrates an example environment 3300. The example environment includes an electronic device 3301. The electronic device includes a response sensing apparatus 206, a transceiver circuit 3307, and a display surface 209. In an alternative embodiment, the electronic device may include a user direct-input device 208. In some embodiments, one or more of the response sensing apparatus, the transceiver circuit, the user direct-input device, and the display surface may be structurally distinct from the remaining circuits or the electronic device. The environment also includes a marketing evaluation circuit 3310, a targeted-advertising request circuit 3330, a targeted-advertising receiver circuit 3350, and a targeted-advertising broadcast circuit 3370. In an embodiment, one or more of the marketing evaluation circuit, the targeted-advertising request circuit, the targeted-advertising receiver circuit, and the targeted-advertising broadcast circuit are included in the electronic device. The display surface may be physically incorporated with the electronic device, or may be physically separate from the electronic device and electronically coupled with the device. In another embodiment, the display surface is structurally and electrically distinct from the electronic device, and is operable to display a content projected by a projector display engine (not shown) of the electronic device.
  • The electronic device 3301 may include a wired or wireless access via a network 299 to advertising, other digital content, and/or to servers using the transceiver 3307. In an alternative embodiment, the electronic device may be coupled to the network via a wireless link, a satellite link, and/or a wired link. In another embodiment, the circuits, or the electronic device, or a portion of the electronic device may be implemented using Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. In a further embodiment, one or more of the circuits and/or the machine may be implemented in hardware, software, and/or firmware. The person 205 may input commands and information to the electronic device 3301 using the user direct-input device 208.
  • The electronic device 3301 may include at least one additional circuit. The at least one additional circuit may include additional circuit(s) 3390. In addition, the electronic device may include a processor, such as the processing unit 21 described in conjunction with FIG. 1, and/or the processor 120 described in conjunction with FIG. 2. In further addition, the electronic device may include a computer storage media illustrated as a data store. In an embodiment, the electronic device may include a mobile electronic device.
  • FIG. 53 illustrates an example operational flow 3400. The operational flow may be implemented in an environment that includes a person viewing a general advertisement displayed by an electronic device and having a characteristic. In an alternative embodiment, the operational flow may be implemented in an environment that includes a person directly interacting with the electronic device using a user direct-input device and viewing a general advertisement having a characteristic. In another alternative embodiment, the operational flow may be implemented in an environment that includes a person viewing a general advertisement having a characteristic. FIG. 53 and several following figures may include various examples of operational flows, discussions, and explanations with respect to the above-described environment 3300 of FIG. 52, and/or with respect to other examples and contexts. However, it should be understood that the operational flows may be executed in a number of other environments and contexts, and/or in modified versions of FIG. 52. Also, although the various operational flows are illustrated in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, and/or may be performed concurrently.
  • After a start operation, the operational flow 3400 moves to an assessment operation 3410. The assessment operation includes generating a marketing impact information indicative of a physiological response by a person to an electronically displayed general advertisement. The assessment operation may be implemented using the marketing evaluation circuit 3310 of FIG. 52. A call operation 3430 includes sending the marketing impact information to an advertising-selector server. The call operation may be implemented using the targeted-advertising request circuit 3330. A return operation 3450 includes receiving an indication of a targeted-advertisement chosen using an advertising rule responsive to at least a characteristic of the electronically displayed general advertisement and the marketing impact information. The return operation may be implemented using the targeted-advertising receiver circuit 3350. A broadcast operation 3470 includes facilitating an electronic display of the targeted-advertisement. The broadcast operation may be implemented using the targeted-advertising broadcast circuit 3370. The operational flow then moves to an end operation.
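The assess/call/return/broadcast sequence can be sketched end to end with the advertising-selector server stubbed as a local function. The message format and the stub's selection rule are illustrative assumptions, not details from the application:

```python
# Sketch of operational flow 3400: assessment (3410), call (3430),
# return (3450), and broadcast (3470). Transport, message fields, and
# the server's rule are all illustrative assumptions.

def advertising_selector_server(request):
    """Stand-in for the remote advertising-selector server's rule."""
    if request["impact"] >= 0.5:
        return {"ad": "targeted-" + request["characteristic"]}
    return {"ad": "general"}

def run_flow_3400(impact, characteristic, display):
    request = {"impact": impact, "characteristic": characteristic}  # call 3430
    reply = advertising_selector_server(request)                    # return 3450
    display(reply["ad"])                                            # broadcast 3470
    return reply["ad"]

shown = []
run_flow_3400(0.7, "sailing", shown.append)  # displays "targeted-sailing"
```

In the application the call would go over the network 299 via the transceiver circuit 3307 rather than a direct function call.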
  • FIG. 54 illustrates an alternative embodiment of the operational flow 3400 of FIG. 53. The operational flow may include at least one additional operation 3490. The at least one additional operation may include an operation 3492, an operation 3494, an operation 3496, or an operation 3498. The operation 3492 includes electronically displaying the targeted-advertisement. The operation 3492 may be implemented using the display circuit 3380 of FIG. 52. The operation 3494 includes saving the indication of a targeted-advertisement. The operation 3494 may be implemented using the data store of FIG. 52. The operation 3496 includes determining a characteristic of the electronically displayed general advertisement. The operation 3496 may be implemented using a recognition circuit 3385 of FIG. 52. The operation 3498 includes receiving the selected targeted-advertisement. The operation 3498 may be implemented using a receiver portion of the transceiver circuit 3307. The operation 3498 may include at least one additional operation, such as an operation 3499. The operation 3499 includes receiving the selected targeted-advertisement from an advertising-content server.
  • FIG. 55 illustrates another alternative embodiment of the operational flow 3400 of FIG. 53. The assessment operation 3410 may include at least one additional operation. The at least one additional operation may include an operation 3412, or an operation 3414. The operation 3412 includes generating a marketing impact information indicative of a physiological response by a person to an electronically displayed general advertisement and based at least in part on data produced by a sensor coupled to a person. The operation 3414 includes generating a marketing impact information indicative of a determined reaction by a person to the electronically displayed general advertisement. The determined reaction is based at least in part on sensor-acquired data indicative of a physiological response.
  • FIG. 56 illustrates a further alternative embodiment of the operational flow 3400 of FIG. 53. The call operation 3430 may include at least one additional operation. The at least one additional operation may include an operation 3432, an operation 3434, or an operation 3436. The operation 3432 includes sending the marketing impact information and a determined characteristic of the electronically displayed general advertisement to an advertising-selector server. The advertising-selector server may include the adserver 3398 of FIG. 52. The operation 3434 includes sending the marketing impact information and a received characteristic of the electronically displayed general advertisement to an advertising-selector server. The operation 3436 includes sending the marketing impact information and a request for a targeted-advertisement to an advertising-selector server.
  • FIG. 57 illustrates an alternative embodiment of the operational flow 3400 of FIG. 53. The return operation 3450 may include at least one additional operation. The at least one additional operation may include an operation 3452, an operation 3454, an operation 3456, or an operation 3458. The operation 3452 includes receiving from the advertising-selector server an indication of a targeted-advertisement chosen using an advertising rule responsive to at least a characteristic of the electronically displayed general advertisement and the marketing impact information. The operation 3454 includes receiving an indication of a targeted-advertisement chosen from at least two instances of available marketing content using an advertising rule responsive to at least a characteristic of the electronically displayed general advertisement and the marketing impact information. The operation 3456 includes receiving an indication of a targeted-advertisement chosen using an advertising rule responsive to at least a determined characteristic of the electronically displayed general advertisement and the marketing impact information. The operation 3458 includes receiving the targeted-advertisement chosen using an advertising rule responsive to at least a characteristic of the electronically displayed general advertisement and the marketing impact information. In an alternative embodiment, the operation 3458 may include at least one additional operation, such as an operation 3459 (not shown). The operation 3459 includes electronically displaying the targeted-advertisement.
  • FIG. 58 illustrates an example system 3500. The system includes an electronic apparatus 3501 that is coupleable to a network 299, and which may be used by a person 205. The electronic apparatus may be coupled to the network via a wired link, illustrated as a cable link, and/or a wireless link illustrated as a satellite link. The system includes a sensor apparatus 3520, a recognition circuit 3530, a caller circuit 3540, an electronic display circuit 3550, and a receiver circuit 3560. In an embodiment, at least one of the sensor apparatus, the recognition circuit, the caller circuit, the electronic display circuit, and the receiver circuit are included in the electronic apparatus. In an embodiment, the electronic apparatus may include the user direct-input device 208. In an alternative embodiment, the electronic apparatus includes at least one of a portable electronic device, or a mobile electronic device.
  • The sensor apparatus 3520 includes a sensor apparatus operable to acquire data indicative of a physiological response by the person 205 to a general advertisement. The sensor apparatus includes a sensor data acquisition module 3524 and at least one user sensor operable to acquire data indicative of a response by a person to the general advertisement displayed by or on a display surface. The at least one user sensor is illustrated as a sensor 206A, a sensor 206B, and/or a wearable/mountable sensor 206C. The at least one user sensor may be physically incorporated with the electronic device, or may be physically separate from the electronic device and electronically coupled with the device. The general advertisement may include at least one of a static advertisement, such as a billboard, an advertisement displayed on the electronic display surface 209, or an advertisement projected on a display surface 3559. In an embodiment, the sensor apparatus is at least substantially similar to the response sensor apparatus 520 described in conjunction with FIG. 10.
  • The recognition circuit 3530 includes a recognition circuit operable to generate a marketing impact information about the general advertisement based at least in part on the acquired data indicative of a physiological response by the person to the general advertisement. The caller circuit 3540 includes a caller circuit operable to send a request for a targeted-advertisement. The receiver circuit 3560 includes a receiver circuit operable to receive an indication of a targeted-advertisement selected from at least two instances of available marketing content by a selection engine responsive at least to a characteristic of the general advertisement and the marketing impact information. The electronic display circuit 3550 includes an electronic display circuit operable to present the targeted-advertisement in a manner perceivable by the person.
  • In an embodiment, the recognition circuit 3530 further includes a feature extraction circuit operable to compute response information based at least in part on the acquired data indicative of a physiological response by the person to the general advertisement. In another embodiment, the recognition circuit further includes a classification circuit operable to classify a reaction by the person to the general advertisement based at least in part on the response information.
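The two-stage structure above (feature extraction, then classification) can be sketched as follows. The chosen feature (mean heart-rate change from a baseline) and the classification thresholds are illustrative assumptions only; the specification does not commit to any particular sensor modality or decision rule.

```python
# Hypothetical sketch of the feature extraction and classification stages
# of the recognition circuit 3530. Feature and thresholds are assumptions.
from statistics import mean

def extract_response_features(heart_rate_samples: list[float],
                              baseline: float) -> float:
    """Feature extraction circuit: compute response information from the
    acquired sensor data indicative of a physiological response."""
    return mean(heart_rate_samples) - baseline

def classify_reaction(response_feature: float) -> str:
    """Classification circuit: classify the person's reaction to the
    general advertisement based on the response information."""
    if response_feature > 5.0:
        return "positive"
    if response_feature < -5.0:
        return "negative"
    return "neutral"

# Elevated heart rate relative to baseline classifies as a positive reaction.
feature = extract_response_features([78.0, 82.0, 80.0], baseline=70.0)
reaction = classify_reaction(feature)
```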
  • In a further embodiment, the caller circuit 3540 includes a caller circuit operable to send a request for a targeted-advertisement to an advertisement server. In another embodiment, the caller circuit includes a caller circuit operable to send a request for a targeted-advertisement and an indication of the characteristic of the general advertisement. In another embodiment, the receiver circuit 3560 includes a receiver circuit operable to receive from an advertising server an indication of a targeted-advertisement selected from at least two instances of available marketing content by a selection engine responsive at least to a characteristic of the general advertisement and the marketing impact information.
  • In an embodiment, the electronic display circuit 3550 includes an electronic display circuit operable to present the general advertisement and the targeted-advertisement in a manner perceivable by a person. In another embodiment, the electronic display circuit includes an electronic display circuit operable to present electronic content in a manner perceivable by the person and in a manner facilitating sensing a response by the person. In a further embodiment, the electronic display circuit includes an electronic display circuit operable to present the targeted-advertisement in a manner perceivable by at least one of the person's visual, audio, tactile, or olfactory senses.
  • In another embodiment, the system 3500 includes a characterization circuit 3535 operable to acquire an indication of a characteristic of the general advertisement. The characterization circuit may further include a characterization circuit operable to at least one of receive, or determine an indication of a characteristic of the general advertisement. In a further embodiment, the system includes a digital storage device 3590 operable to save an indication of the selected targeted-advertisement. In an alternative embodiment, the system includes an advertising broadcast circuit 3570 operable to facilitate a display of the selected targeted-advertisement by the electronic display circuit.
  • FIG. 59 illustrates an example computer program product 3600. The computer program product includes a computer-readable medium 3610 bearing program instructions 3620. The program instructions are operable to perform a process in a computing device. The process includes generating a marketing impact information indicative of a physiological response by a person to an electronically displayed general advertisement and based at least in part on data produced by a sensor coupled to a person 3630. The process also includes sending the marketing impact information to an advertising-selector server 3640. The process further includes receiving an indication of a targeted-advertisement chosen by an advertising rule responsive to at least a characteristic of the electronically displayed general advertisement and the marketing impact information 3650. The process also includes facilitating an electronic display of the targeted-advertisement 3660.
  • In an alternative embodiment, the process of the program instructions includes facilitating an electronic display of the general advertisement 3692. In another alternative embodiment, the process of the program instructions includes saving an indication of the targeted-advertisement 3694.
  • In an embodiment, the indication receiving process 3650 of the program instructions includes receiving a targeted-advertisement chosen by an advertising rule responsive to at least a characteristic of the electronically displayed general advertisement and the marketing impact information 3652. In another embodiment, the indication receiving process of the program instructions includes receiving from the advertising-selector server an indication of a targeted-advertisement chosen by an advertising rule responsive to at least a characteristic of the electronically displayed general advertisement and the marketing impact information.
  • In another embodiment, the computer-readable medium 3610 includes a computer storage medium 3612. In a further embodiment, the computer-readable medium includes a communications medium 3614.
  • FIG. 60 illustrates an example electronic device 3701. The electronic device includes means 3710 for generating a marketing impact information indicative of a physiological response by a person to an electronically displayed general advertisement and based at least in part on data produced by a sensor coupled to a person. The electronic device also includes means 3720 for sending the marketing impact information to an advertising-selector server. The electronic device further includes means 3730 for receiving an indication of a targeted-advertisement chosen by an advertising rule responsive to at least a characteristic of the electronically displayed general advertisement and the marketing impact information. The electronic device also includes means 3740 for facilitating an electronic display of the targeted-advertisement.
  • In an alternative embodiment, the electronic device may include at least one additional means. The at least one additional means may include means 3750, means 3760, means 3770, or means 3780. The means 3750 includes means for electronically displaying the targeted-advertisement. The means 3760 includes means for saving the indication of a targeted-advertisement. The means 3770 includes means for determining a characteristic of the electronically displayed general advertisement. The means 3780 includes means for receiving the indication of a targeted-advertisement.
  • FIG. 61 illustrates an example environment 3900. The example environment includes an electronic device 3904. In an embodiment, the electronic device 3904 may include a network server electronic device, or a group of network server electronic devices. In another embodiment, the electronic device may include an advertising server, such as the adserver 3398 of FIG. 52, and/or the adserver 3504 of FIG. 58. The electronic device includes a request receiver circuit 3910, an advertisement characterizer circuit 3940, a targeted-advertising picker circuit 3960, and a response sender circuit 3970.
  • In an alternative embodiment, the electronic device 3904 may include at least one additional circuit. The at least one additional circuit may include a communications circuit 3975, a recognition circuit 3992, and/or an additional circuit(s) 3995. In addition, the electronic device may include a processor 3984, which may be at least substantially similar to the processing unit 21 described in conjunction with FIG. 1, and/or the processor 120 described in conjunction with FIG. 2. In addition, the electronic device may include a computer storage medium, illustrated as a digital storage device 3980.
  • In another embodiment, one or more of the circuits, or the electronic device, or a portion of the electronic device may be implemented using Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. In a further embodiment, one or more of the circuits and/or the electronic device may be implemented in hardware, software, and/or firmware.
  • In an alternative embodiment, the electronic device 3904 may be coupled to the network 299 via a wireless link, a satellite link, and/or a wired link. In another embodiment, the electronic device is operable using the communications circuit 3975 to communicate with other networked electronic devices, such as a server operable to serve advertising and/or other digital content. In a further embodiment, the electronic device is operable to communicate with a requestor electronic device 3901. In an embodiment, the requestor electronic device is operable to acquire information indicative of a physiological response by a person to a general advertisement, such as the electronic device 3301 of FIG. 52, and/or the electronic apparatus 3501 of FIG. 58.
  • FIG. 62 illustrates an example operational flow 4000. FIG. 62 and several following figures may include various examples of operational flows, discussions, and explanations with respect to the above-described environment 3900 of FIG. 61, and/or with respect to other examples and contexts. However, it should be understood that the operational flows may be executed in a number of other environments and contexts, and/or in modified versions of FIG. 61. Also, although the various operational flows are illustrated in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, and/or may be performed concurrently.
  • After a start operation, the operational flow 4000 includes an incoming operation 4010. The incoming operation includes receiving from a requestor device a marketing impact information indicative of a physiological response by a person to a general advertisement. The incoming operation may be implemented using the request receiver circuit 3910 of FIG. 61. In an embodiment, the marketing impact information is originated by a requestor electronic device 3901. For example, the requestor electronic device may include the electronic device 3301 of FIG. 52, and/or the electronic apparatus 3501 of FIG. 58. A classification operation 4040 includes acquiring an indication of a characteristic of the general advertisement. The classification operation may be implemented using the advertisement characterizer circuit 3940. A choosing operation 4060 includes initiating a selection of a targeted-advertisement by an advertising rule responsive to at least the characteristic of the general advertisement and the marketing impact information. The choosing operation may be implemented using the targeted-advertising picker circuit 3960. A reply operation 4070 includes returning an indication of the targeted-advertisement. The reply operation may be implemented using the response sender circuit 3970. The operational flow then includes an end operation.
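The server-side flow above (incoming, classification, choosing, reply) can be sketched as follows. The rule table, ad inventory, and dictionary-shaped request are invented placeholders; the specification does not prescribe a concrete advertising rule or message format.

```python
# Illustrative server-side sketch of operational flow 4000 (incoming ->
# classification -> choosing -> reply). All data shapes are hypothetical.
def incoming_operation(request: dict) -> dict:
    """Incoming operation 4010: receive marketing impact information
    from a requestor device."""
    return request["marketing_impact"]

def classification_operation(request: dict) -> str:
    """Classification operation 4040: acquire an indication of a
    characteristic of the general advertisement."""
    return request.get("characteristic", "unknown")

def choosing_operation(characteristic: str, impact: dict,
                       inventory: dict) -> str:
    """Choosing operation 4060: apply an advertising rule responsive to
    the characteristic and the marketing impact information."""
    key = (characteristic, impact["reaction"])
    return inventory.get(key, "default-ad")

def reply_operation(ad_id: str) -> dict:
    """Reply operation 4070: return an indication of the targeted ad."""
    return {"targeted_advertisement": ad_id}

# A positive reaction to a luxury-car general advertisement selects a
# related targeted advertisement from the inventory.
inventory = {("luxury-car", "positive"): "sports-car-ad",
             ("luxury-car", "negative"): "economy-car-ad"}
request = {"marketing_impact": {"reaction": "positive"},
           "characteristic": "luxury-car"}
impact = incoming_operation(request)
characteristic = classification_operation(request)
chosen = choosing_operation(characteristic, impact, inventory)
reply = reply_operation(chosen)
```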
  • FIG. 63 illustrates an alternative embodiment of the operational flow 4000 of FIG. 62. The incoming operation 4010 may include at least one additional operation. The at least one additional operation may include an operation 4012, an operation 4014, an operation 4016, an operation 4018, an operation 4022, an operation 4024, or an operation 4026. The operation 4012 includes receiving from a requestor device a marketing impact information indicative of a physiological response by a person to an electronically displayed general advertisement. The operation 4014 includes receiving from a requestor device a marketing impact information indicative of a physiological response by a person to a general advertisement and based at least in part on data produced by a sensor coupled to the person. The operation 4016 includes receiving from a requestor device a marketing impact information indicative of a physiological response by a person to a general advertisement and a request for a targeted-advertisement. The operation 4018 includes receiving from a requestor device a marketing impact information indicative of a physiological response by a person to a general advertisement and a request for a targeted-advertisement responsive to the marketing impact information. The operation 4022 includes receiving from a requestor device at least one of a raw, partially transformed, or transformed marketing impact information indicative of a physiological response by a person to a general advertisement. The operation 4024 includes receiving from a requestor device a marketing impact information indicative of a reaction by a person to a general advertisement. The operation 4026 includes receiving from a requestor device a marketing impact information indicative of at least one of a positive or a negative reaction by a person to a general advertisement.
  • FIG. 64 illustrates another alternative embodiment of the operational flow 4000 of FIG. 62. The classification operation 4040 may include at least one additional operation. The at least one additional operation may include an operation 4042, or an operation 4048. The operation 4042 includes receiving an indication of a characteristic of the general advertisement. The operation 4048 includes determining an indication of a characteristic of the general advertisement.
  • The operation 4042 may include at least one additional operation. The at least one additional operation may include an operation 4044, or an operation 4046. The operation 4044 includes receiving an indication of a characteristic of the general advertisement in conjunction with receiving the marketing impact information. The operation 4046 includes receiving an indication of a characteristic of the general advertisement from a third-party device.
  • FIG. 65 illustrates a further alternative embodiment of the operational flow 4000 of FIG. 62. The choosing operation 4060 may include at least one additional operation, such as the operation 4062. The operation 4062 includes initiating a selection of a targeted-advertisement predicted to be of interest to the person by a computer-implemented algorithm responsive to at least the characteristic of the general advertisement and the marketing impact information. The reply operation 4070 may include at least one additional operation, such as the operation 4072. The operation 4072 includes returning an indication of the targeted-advertisement to at least one of the requestor device, an advertisement server, or to another device.
  • FIG. 66 illustrates an alternative embodiment of the operational flow 4000 of FIG. 62. The operational flow may include at least one additional operation, such as a recognition operation 4092. The recognition operation includes analyzing the marketing impact information for an indication of a reaction by the person to the general advertisement. The recognition operation may be implemented using the recognition circuit 3992 of FIG. 61. In an alternative embodiment, the choosing operation 4060 may include at least one additional operation, such as an operation 4064. The operation 4064 includes initiating a selection of a targeted-advertisement by an advertising rule responsive to at least the characteristic of the general advertisement and the indication of a reaction by the person to the general advertisement.
  • FIG. 67 illustrates an example environment 4100 that includes an electronic device 4104. In an embodiment, the electronic device 4104 may include a network server electronic device, or a group of network server electronic devices. In another embodiment, the electronic device may include an advertising server, such as the adserver 3398 of FIG. 52, and/or the adserver 3504 of FIG. 58. The electronic device includes an advertising services circuit 4110, an attribute circuit 4120, a targeted-advertising selection engine 4130, and a reply circuit 4140.
  • In another embodiment, the electronic device 4104 includes at least one additional circuit. The at least one additional circuit may include a characterization circuit 4150, a chooser circuit 4160, an analytic circuit 4170, a digital storage device 4180, a results receiver circuit 4185, a processor 4188, and/or an additional circuit(s) 4190. The processor 4188 may be at least substantially similar to the processing unit 21 described in conjunction with FIG. 1, and/or the processor 120 described in conjunction with FIG. 2. In an embodiment, at least one circuit may be implemented using Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. In a further embodiment, one or more of the circuits and/or the electronic device may be implemented in hardware, software, and/or firmware.
  • In an embodiment, the electronic device 4104 may be coupled to the network 299 via a wireless link, a satellite link, and/or a wired link. In another embodiment, the electronic device is operable to communicate with other networked electronic devices, such as a server operable to serve advertising and/or other digital content. In a further embodiment, the electronic device is operable to communicate with a requestor device 4101. In an embodiment, the requestor electronic device is operable to acquire information indicative of a physiological response by a person to a general advertisement, such as the electronic device 3301 of FIG. 52, and/or the electronic apparatus 3501 of FIG. 58.
  • The advertising services circuit 4110 includes an advertising services circuit operable to receive from a requestor device a marketing impact information indicative of a physiological response by a person to a general advertisement. In an embodiment, a receipt of the marketing impact information may be considered a request for a targeted-advertisement. In an alternative embodiment, the advertising services circuit may further include an advertising services circuit operable to receive from a requestor device a marketing impact information based at least in part on data produced by a sensor coupled to the person and indicative of a physiological response by a person to a general advertisement. In another alternative embodiment, the advertising services circuit further includes an advertising services circuit operable to receive a marketing impact information based at least in part on data produced by a sensor coupled to the person and indicative of a physiological response by a person to a general advertisement. In a further embodiment, the advertising services circuit further includes an advertising services circuit operable to receive a marketing impact information indicative of a physiological response by a person to a general advertisement, and to transform the marketing impact information to include an indication of a reaction by the person to the general advertisement. In an alternative embodiment, the advertising services circuit further includes an advertising services circuit operable to receive a marketing impact information based at least in part on data produced by a sensor coupled to the person and indicative of a physiological response by a person to a general advertisement, and a request for the targeted-advertisement.
  • The attribute circuit 4120 includes an attribute circuit operable to obtain an indication of a characteristic of the general advertisement. In an alternative embodiment, the attribute circuit further includes an attribute circuit operable to acquire an indication of a characteristic of the general advertisement. In another embodiment, the attribute circuit further includes an attribute circuit operable to determine an indication of a characteristic of the general advertisement.
  • The targeted-advertising selection engine 4130 includes a targeted-advertising selection engine operable to pick a targeted-advertisement based on at least the indication of a characteristic of the general advertisement and the marketing impact information. The reply circuit 4140 includes a reply circuit operable to return an indication of the targeted-advertisement. In an alternative embodiment, the reply circuit may further include a reply circuit operable to return an indication of the targeted-advertisement to the requestor device. In another alternative embodiment, the reply circuit further includes a reply circuit operable to return an indication of the targeted-advertisement to an advertising server.
  • FIG. 68 illustrates an example computer program product 4200. The computer program product includes a computer-readable medium 4210 bearing program instructions 4220. The program instructions are operable to perform a process in a computing device. The process includes receiving from a requestor device a marketing impact information indicative of a physiological response by a person to a general advertisement. The process also includes acquiring an indication of a characteristic of the general advertisement. The process further includes initiating a selection of a targeted-advertisement by an advertising rule responsive to at least the characteristic of the general advertisement and the marketing impact information. The process also includes returning an indication of the targeted-advertisement. In an alternative embodiment, the process further comprises transforming the marketing impact information to include an indication of a reaction by the person to the general advertisement. In another embodiment, the computer-readable medium includes a computer storage medium 4212. In a further embodiment, the computer-readable medium includes a communications medium 4214.
  • FIG. 69 illustrates an example system 4300 that includes an electronic device 4305. The electronic device includes means 4310 for receiving from a requestor device a marketing impact information indicative of a physiological response by a person to a general advertisement. The electronic device also includes means 4320 for acquiring an indication of a characteristic of the general advertisement. The electronic device further includes means 4330 for initiating a selection of a targeted-advertisement by an advertising rule responsive to at least the characteristic of the general advertisement and the marketing impact information. The electronic device also includes means 4340 for returning an indication of the targeted-advertisement.
  • FIG. 70 illustrates an example environment 4500. The example environment includes a mobile device 4510 having a core communication function and operable to present human perceivable content using a display. In an embodiment, the mobile device may include at least one of a portable, a handheld, a cellular, or a wireless mobile device. In another embodiment, the mobile device may include at least one of a human carried, a vehicle borne, an aircraft borne, a train borne, or a vessel borne mobile device. In an embodiment, the mobile device may include a mobile electronic device. A core communication function circuit 4585 may implement the core communication function. The core communication function may include at least one of a voice, telephone, email, message, global positioning, navigation, picture, video, browsing, or Internet core communication function. The display may include a display surface. The display surface may include at least one of an electronically driven display surface, illustrated as the electronic display surface 209, or the display surface 4509. The electronic display surface 209 may be operated by an electronic display surface controller circuit 4554. The electronic display surface may be physically incorporated with the mobile device, or may be physically separate from the mobile device and electronically coupled with the mobile device. In another embodiment, the display surface 4509, such as a screen or a reflective wall, is structurally and physically distinct from the mobile device, and is able to display content projected by a projector display engine 4552 of the mobile device. The display surface may include at least one of a visually reflective surface, a flat surface, a screen, an audio speaker, or a scent emitter. The mobile device may include an electronic display circuit 4550, which may include the projector display engine 4552, and/or the electronic display surface controller circuit 4554.
  • The mobile device 4510 includes one or more sensors, illustrated as a sensor 206X, coupled to a sensor apparatus 4506. In an embodiment, the sensor 206X may include at least one of an image sensor, such as a camera, a thermal sensor, or a brain wave sensor, such as a P-300 sensor. For example, the sensor may include a device operable to output data indicative of a gaze 207 of the person 205 with respect to the display 209. In another embodiment, the sensor 206X may include at least one of the sensor 206A, 206B, or 206C described in conjunction with FIG. 3.
  • Where the back of the person's 205 head is toward a portion of the display, complicating identification of the gaze 207 direction, the sensor apparatus 4506 may aim an attention-attracting content at several portions of a broad display surface, or at multiple display surfaces that might be good candidates. This might simply involve dividing a room into quadrants or similar and eliminating one or more of the quadrants. For example, an attention-attracting content may include a guiding structure, such as faint running arrows that would draw the person's gaze toward the display surface 4509 and/or the electronic display surface 209.
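The quadrant-elimination heuristic above can be sketched as follows, assuming a simple four-quadrant room model in which the quadrant behind the person's head is eliminated and the remaining quadrants stay as candidates for attention-attracting content. The quadrant naming and the single-elimination rule are assumptions for illustration only.

```python
# A minimal sketch of quadrant elimination for candidate display surfaces.
# The four-quadrant model and naming are hypothetical.
QUADRANTS = {"north", "east", "south", "west"}
OPPOSITE = {"north": "south", "south": "north",
            "east": "west", "west": "east"}

def candidate_quadrants(facing: str) -> set[str]:
    """Eliminate the quadrant behind the person's head; the remaining
    quadrants are candidates for attention-attracting content."""
    return QUADRANTS - {OPPOSITE[facing]}

# Person faces "north", so the back of the head is toward "south",
# which is eliminated as a candidate.
targets = candidate_quadrants("north")
```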
  • Where the mobile device 4510 is worn on a belt of the person 205, a heuristic may determine whether it is on the right side, left side, facing front, or facing back, for example by matching to recent gait patterns, or by determining a rotational change combined with an altitude change to determine that the person has sat down and the relative orientation of the waist band to the mobile device position. Then, a gimbaled structure or other attitude detector may give an orientation relative to the ground. If a line of sight is somewhat clear for projection to the display surface 4509, such as a wall, and the motion pattern may be approximated to a known motion pattern, the mobile device may project content onto the display surface. The mobile device 4510 may determine if a line of sight is clear by emitting a sampling beam and detecting its features. For example, two or three slightly converging beams provide a good distance indicator based upon a separation of the pixels (with some intelligence to determine whether or not the distance is beyond the convergence point). In an embodiment where the display is occluded, at least with respect to the gaze 207 direction, the mobile device may implement gaze direction finding using a more diffuse illumination pattern and a time sequenced pattern, for example, something like a disco ball.
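The distance indication from slightly converging sampling beams may be sketched as follows. With straight beams, the spot separation shrinks linearly to zero at the convergence point and grows linearly beyond it, so a given observed separation has two candidate distances; the boolean flag stands in for the "intelligence" the text says is needed to resolve that ambiguity. The function name and the idealized geometry are illustrative assumptions.

```python
def distance_from_beam_separation(s_obs, s0, d_conv, beyond_convergence=False):
    """Estimate the distance to a surface from the observed separation of
    two slightly converging sampling beams.

    s_obs  -- observed separation of the two beam spots on the surface
    s0     -- separation of the beams at the emitter
    d_conv -- distance at which the beams converge to a point
    beyond_convergence -- resolves the two-solution ambiguity: whether
                          the surface lies past the convergence point

    With straight beams, s(d) = s0 * |1 - d/d_conv|, so inverting gives
    one candidate distance on each side of the convergence point.
    """
    ratio = s_obs / s0
    if beyond_convergence:
        return d_conv * (1.0 + ratio)
    return d_conv * (1.0 - ratio)
```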
  • The mobile device 4510 also includes a recognition circuit 4520, a display space availability determination circuit 4530, and an advertising content output circuit 4560. In some embodiments, one or more of the sensor apparatus 4506, the recognition circuit, the display space availability determination circuit, the advertising content output circuit, the user interface 4565, the core communication function circuit 4585, and the electronic display surface 209 may be structurally distinct from the remaining circuits of the mobile device.
  • In an alternative embodiment, the mobile device 4510 may include at least one additional circuit. The at least one additional circuit may include a user direct-input device 208, such as at least one of a keyboard, a mouse, or a trackball. In another alternative embodiment, the mobile device may include at least one of a transceiver circuit 4580, a third-party notification circuit 4583, a processor, such as a processor 4512, other circuit(s) 4515, or a data store or computer storage media, such as a digital storage device 4590. The processor 4512 may be at least substantially similar to the processing unit 21 described in conjunction with FIG. 1, and/or the processor 120 described in conjunction with FIG. 2.
  • The mobile device 4510 may include a wired or wireless access via a network 299 to servers for cellular communications, Internet access, advertising content, and/or digital content, using the transceiver circuit 4580. In an alternative embodiment, the mobile device may be coupled to the network via a wireless link, a satellite link, and/or a wired link. In another embodiment, one or more of the circuits, or the mobile device, or a portion of the mobile device may be implemented using Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. In a further embodiment, one or more of the circuits and/or the machine may be implemented in hardware, software, and/or firmware. The person 205 may input commands and information to the mobile device 4510 using the user direct-input device 208.
  • FIG. 71 illustrates an example operational flow 4600. FIG. 71 and several following figures may include various examples of operational flows, discussions, and explanations with respect to the above-described environment 4500 of FIG. 70, and/or with respect to other examples and contexts. However, it should be understood that the operational flows may be executed in a number of other environments and contexts, and/or in modified versions of FIG. 70. Also, although the various operational flows are illustrated in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, and/or may be performed concurrently.
  • A start operation 4610 starts the operational flow 4600 in a mobile device having a core communication function and that is operable to present human perceivable content using a display surface. The start operation may be implemented using the mobile device 4510 of FIG. 70. A discovery operation 4620 identifies an attention of a person with respect to the display surface. The discovery operation may be implemented using the recognition circuit 4520. An opportunity operation 4640 determines that the display surface is available to present advertising content. In an alternative embodiment, the opportunity operation determines that at least a portion of the display surface is available. In another embodiment, the opportunity operation determines that at least substantially the entire display surface is available. In a further embodiment, the opportunity operation determines that at least two display surfaces are available. The opportunity operation may be implemented using the display space availability determination circuit 4530. A broadcast operation 4660 presents an advertising content using the display surface. The broadcast operation may be implemented using the advertising content output circuit 4560, which may cooperate with the electronic display circuit 4550 to present the advertising content using at least one of the electronic display surface 209, or the display surface 4509. The operational flow then moves to an end operation.
  • For example, in use, an embodiment of the mobile device 4510 may be used by the person 205 to place a cellular telephone call. Upon conclusion of the call, the person may place the idle mobile device on a table of a coffee shop with the electronic display surface 209 facing upward. The discovery operation 4620 identifies a gaze 207 of the person 205 or of another person at the table toward or in a vicinity of the electronic display surface. The opportunity operation 4640 determines that the display surface is not presenting content related to cellular telephone functions and is thus available to present advertising content. The broadcast operation 4660 presents an advertising content using the display surface, which may be seen or heard by the person 205 or the another person.
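The discovery/opportunity/broadcast sequence of operational flow 4600 may be sketched, under simplifying assumptions, as a single pass in which the recognition circuit 4520, the display space availability determination circuit 4530, and the advertising content output circuit 4560 are stood in for by plain values and a callable; the function signature is an illustrative assumption.

```python
# Minimal sketch of operational flow 4600: identify attention, determine
# that the display surface is available, then present advertising content.

def operational_flow_4600(attention_detected, surface_presenting_core_content,
                          present_ad):
    """Run one pass of the discovery/opportunity/broadcast sequence.
    Returns True if an advertisement was presented."""
    if not attention_detected:           # discovery operation 4620
        return False
    if surface_presenting_core_content:  # opportunity operation 4640
        return False
    present_ad()                         # broadcast operation 4660
    return True
```

In the coffee-shop example above, the idle device on the table would correspond to `attention_detected=True` and `surface_presenting_core_content=False`, so the advertisement is presented.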
  • FIG. 72 illustrates an alternative embodiment of the example operational flow 4600 of FIG. 71. The start operation 4610 may include at least one additional operation. The at least one additional operation may include an operation 4612, an operation 4614, an operation 4616, or an operation 4618. The operation 4612 includes starting in at least one of a portable, handheld, a cellular, or a wireless mobile device having a core communication function and that is operable to present human perceivable content using a display surface. The operation 4614 includes starting in at least one of a human borne, vehicle borne, an aircraft borne, a train borne, or a vessel borne mobile device having a core communication function and that is operable to present human perceivable content using a display surface. The operation 4616 includes starting in a mobile device having at least one of a voice, telephone, email, message, global positioning, navigation, picture, video, browsing, or Internet core communication function, and that is operable to present human perceivable content using a display surface. The operation 4618 includes starting in a mobile device having a core communication function and operable to present human perceivable content using a display surface. The display surface includes at least one of a visually reflective surface, a flat surface, a screen, an audio speaker, or a scent emitter.
  • FIG. 73 illustrates an alternative embodiment of the example operational flow 4600 of FIG. 71. The discovery operation 4620 may include at least one additional operation. The at least one additional operation may include an operation 4622, an operation 4624, or an operation 4626. The operation 4622 identifies at least one of a gaze, or a hearing attention of a person with respect to the display surface. For example, in an embodiment, the gaze may include the gaze 207 of the person 205 with respect to the electronic display surface 209 or the display surface 4509. The operation 4624 identifies a neurological indicator of attention of a person with respect to the display surface. For example, a neurological indicator of attention may include a brain state, a brainwave pattern, and/or a measure of attention, motivation, interest, mood, or the like. The operation 4626 detects a physical orientation of an element of a person's sensory system with respect to the display surface. For example, a physical orientation may include a physical orientation of an eye for a visual orientation to the display surface, a physical orientation of an ear for an auditory orientation to the display surface, a physical orientation of a hand for a touch orientation to the display surface, or a physical orientation of a nose for a smell orientation to the display surface. The operation 4626 may include at least one additional operation. The at least one additional operation may include an operation 4628, an operation 4632, or an operation 4634. The operation 4628 detects at least one of a gaze, or a hearing orientation of a person toward the display surface. The operation 4632 detects at least one of an eye, or an ear orientation of a person toward the display surface. 
The operation 4634 detects a physical orientation of an element of a person's sensory system, wherein the detected physical orientation is at least one of directed toward, directed at, in a direction of, directed on, or directed in a vicinity of the display surface.
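A minimal sketch of the orientation classification of operation 4634 follows, assuming the gaze direction and the direction from the person to the display surface are available as vectors; the function name and the angular thresholds are illustrative assumptions.

```python
import math

def gaze_directed_at_surface(gaze_vector, to_surface_vector,
                             toward_deg=10.0, vicinity_deg=30.0):
    """Classify a gaze direction relative to a display surface:
    'directed at' within a tight angular threshold, 'in a vicinity of'
    within a looser one, otherwise 'away'. Thresholds are illustrative."""
    dot = sum(g * t for g, t in zip(gaze_vector, to_surface_vector))
    norm = math.hypot(*gaze_vector) * math.hypot(*to_surface_vector)
    # Angle between the gaze and the direction to the surface, in degrees.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    if angle <= toward_deg:
        return "directed at"
    if angle <= vicinity_deg:
        return "in a vicinity of"
    return "away"
```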
  • FIG. 74 illustrates a further embodiment of the example operational flow 4600 of FIG. 71. The opportunity operation 4640 may include at least one additional operation. The at least one additional operation may include an operation 4642, an operation 4644, an operation 4646, an operation 4648, an operation 4652, an operation 4654, or an operation 4656. The operation 4642 determines that the display surface is not presenting a human perceivable content related to the core communication function. The operation 4644 determines that the display surface is presenting an overridable human perceivable content. The operation 4646 determines using a content prioritization algorithm that the display surface is available to present advertising content. In an alternative embodiment, the operation 4646 determines using a content prioritization algorithm that the display surface is available to present advertising content over another content. For example, the content prioritization algorithm may determine that the display surface is available to present an advertising content over displaying a phone number called while the person is on the call. The operation 4648 determines that an advantageous display surface ambient environmental state exists for presenting advertising content using the display surface. For example, an advantageous display surface ambient environmental state may include an advantageous ambient light level proximate to the display surface 4509 for presentation of at least one of advertising content, a particular advertising content, or a class of advertising content. In a further example, an advantageous display surface ambient environmental state may include at least one of an advantageous ambient sound level, other persons proximate to the mobile device, or an advantageous orientation of the mobile device with respect to the display surface. 
The operation 4652 determines that an advantageous state of the mobile device exists for presenting advertising content using the display surface. For example, a determined advantageous state of the mobile device may include a determined advantageous remaining battery power state of the mobile device, or a determined advantageous display brightness state of the mobile device. The operation 4654 determines that a sufficient power is available to present advertising content using the display surface. The operation 4656 predicts that a sufficient time will be available to present the advertising content using the display surface. In an embodiment, the operation 4656 predicts that sufficient time will be available to present advertising content without substantially interfering with a subsequent presentation of a core communication function related content. In another embodiment, the operation 4656 predicts that sufficient time will be available to present advertising content without substantially interfering with a possible subsequent presentation of a core communication function related content.
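The determinations of operations 4642 through 4656 may be sketched as a single availability predicate; the particular thresholds, parameter names, and the order of the checks are illustrative assumptions, not part of the specification.

```python
# Sketch combining operations 4642-4656: the display surface is treated
# as available for advertising only when it is not showing non-overridable
# core-function content, the device and ambient states are favorable, and
# enough power and time remain.

def display_available_for_ad(showing_core_content, content_overridable,
                             battery_fraction, ambient_light_ok,
                             predicted_idle_seconds, ad_duration_seconds,
                             min_battery=0.2):
    if showing_core_content and not content_overridable:
        return False                  # operations 4642 / 4644
    if battery_fraction < min_battery:
        return False                  # operations 4652 / 4654
    if not ambient_light_ok:
        return False                  # operation 4648
    # Operation 4656: predict sufficient time to present the advertisement
    # without interfering with a subsequent core-function presentation.
    return predicted_idle_seconds >= ad_duration_seconds
```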
  • FIG. 75 illustrates an alternative embodiment of the example operational flow 4600 of FIG. 71. The broadcast operation 4660 may include at least one additional operation. The at least one additional operation may include an operation 4662, an operation 4664, an operation 4666, or an operation 4668. The operation 4662 presents a selected advertising content using the display surface. The operation 4664 ceases presenting an advertising content using the display surface when use of the display surface is requested by the core communications function. The operation 4666 presents an advertising content using an aspect of the display surface. The operation 4668 presents an advertising content using at least a portion of the display surface.
  • FIG. 76 illustrates another alternative embodiment of the example operational flow 4600 of FIG. 71. The operational flow may include at least one additional operation, such as a choosing operation 4670. The choosing operation facilitates a selection of the advertising content. The choosing operation may include at least one additional operation. The at least one additional operation may include an operation 4672, an operation 4674, or an operation 4676. The operation 4672 facilitates a selection of the advertising content. The advertising content is selected from at least one of a locally stored advertising content, or a remotely stored advertising content. The operation 4674 facilitates a selection of the advertising content. The advertising content is selected from at least one of a locally generated advertising content, or a remotely generated advertising content. The operation 4676 initiates a selection of the advertising content.
  • FIG. 77 illustrates a further alternative embodiment of the example operational flow 4600 of FIG. 71. The operational flow may include an additional operation 4680. The additional operation 4680 may include at least one of an operation 4682, an operation 4684, an operation 4688, an operation 4692, or an operation 4694. The operation 4682 receives the advertising content from a remote advertising server. The operation 4684 facilitates a selection of a follow-up advertising content at least partially based on a response by the person to the presented advertising content. The operation 4684 may include at least one additional operation, such as an operation 4686. The operation 4686 facilitates a selection of follow-up advertising content at least partially based on a physiological response by the person to the presented advertising content. The operation 4688 notifies an advertising selector of the determined availability of the display surface to present advertising content. The operation 4692 saves an indication of having presented the advertising content. In an embodiment, the saved indication includes a saved indication of a date the advertising content was presented, a saved identification of the advertising content, and/or a saved indication of another advertising metric. The operation 4694 saves an indication of a response by the person with respect to the presented advertising content. For example, the response by the person may include at least one of a physiological, or an active response by the person, for example, such as a keyed or spoken entry. In another embodiment, the operation 4694 saves an indication of a response by the person with respect to the presented advertising content in a format useable for a later analysis of the response by the person.
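Operations 4684, 4692, and 4694 may be sketched as follows; the record fields, the response vocabulary, and the follow-up selection rule are illustrative assumptions.

```python
import datetime

# Sketch of operations 4692/4694 (save an indication of the presentation
# and of the person's response) and 4684 (select a follow-up advertisement
# at least partially based on that response).

presentation_log = []

def save_presentation(ad_id, response):
    """Save date, advertisement identifier, and the person's response in a
    format useable for later analysis (operation 4694)."""
    record = {
        "ad_id": ad_id,
        "date": datetime.date.today().isoformat(),
        "response": response,  # e.g., a keyed/spoken entry or a physiological cue
    }
    presentation_log.append(record)
    return record

def select_follow_up(ad_id, response, follow_ups):
    """Return a follow-up advertisement for a favorable response, else None
    (operation 4684). The response labels are illustrative."""
    if response in ("positive", "engaged"):
        return follow_ups.get(ad_id)
    return None
```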
  • FIG. 78 illustrates an example operational flow 4700. FIG. 78 and several following figures may include various examples of operational flows, discussions, and explanations with respect to the above-described environment 4500 of FIG. 70, and/or with respect to other examples and contexts. However, it should be understood that the operational flows may be executed in a number of other environments and contexts, and/or in modified versions of FIG. 70. Also, although the various operational flows are illustrated in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, and/or may be performed concurrently.
  • A start operation 4710 starts the operational flow 4700 in a mobile device. The mobile device has a core communication function and is operable to present human perceivable content using a display. The start operation may be implemented using the mobile device 4510 of FIG. 70. A discovery operation 4720 detects an attention of a person with respect to the display. The discovery operation may be implemented using the recognition circuit 4520. An opportunity operation 4640 determines that space is available on the display for presentation of advertising content. The opportunity operation may be implemented using the display space availability determination circuit 4530. An availability operation 4730 sends to a third-party an indication of the detected attention of the person and an indication of the determined availability of the display to present advertising content. For example, the availability operation may send to the adserver 4504 an indication that a person, such as the person 205, is attentive to the display, such as a gaze 207 attention directed toward the display surface 4509, and that the display is not presenting a core communication function content. The availability operation may be implemented using the third-party notification circuit 4583. The operational flow then moves to an end operation.
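The availability operation 4730 may be sketched as building a small notification payload for a third party such as the adserver; the payload field names are illustrative assumptions and the network transport is omitted.

```python
import json

# Sketch of availability operation 4730: package an indication of the
# detected attention and of the determined display availability for
# transmission to a third party (e.g., an advertising server).

def build_availability_notice(device_id, attention_kind, surface_id,
                              presenting_core_content):
    """Return a JSON notification; field names are illustrative."""
    return json.dumps({
        "device": device_id,
        "attention": attention_kind,   # e.g., "gaze"
        "surface": surface_id,         # e.g., "display_surface_4509"
        "available": not presenting_core_content,
    })
```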
  • FIG. 79 illustrates an alternative embodiment of the example operational flow 4700 of FIG. 78. The discovery operation 4720 may include at least one additional operation. The at least one additional operation may include an operation 4722, or an operation 4724. The operation 4722 detects at least one of an eye gaze, or an ear position of a person with respect to the display. The operation 4724 detects a neurological indicator of attention of a person with respect to the display.
  • FIG. 80 illustrates an alternative embodiment of the example operational flow 4700 of FIG. 78. The operational flow may include at least one additional operation, such as an operation 4780. The operation 4780 receives an indication of an advertising content selected by a remotely located application for presentation. The operation 4780 may include at least one additional operation, such as the operation 4782. The operation 4782 presents the selected advertising content using the display.
  • FIG. 81 illustrates an example environment 4800. The environment includes a mobile communications device 4801. The mobile communications device includes a display circuit 4850, a core communication system 4820, a tracking system 4830, a display status circuit 4840, and an advertisement insertion circuit 4860. The display circuit 4850 includes a display circuit operable to facilitate presentation of human perceivable content on a display surface. The display surface may include at least one of the electronic display surface 209, or the display surface 4809. In some embodiments, one or more of the display circuit, the core communication system, the tracking system, the display status circuit, the advertisement insertion circuit, and the electronic display surface may be structurally distinct from the remaining circuits of the mobile device.
  • The mobile communications device 4801 may include a wired or wireless access via a network 299 to servers for cellular communications, Internet access, advertising content, and/or digital content, using the transceiver circuit 4880. In an alternative embodiment, the mobile device may be coupled to the network via a wireless link, a satellite link, and/or a wired link. In another embodiment, one or more of the circuits, or the mobile device, or a portion of the mobile device may be implemented using Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. In a further embodiment, one or more of the circuits and/or the machine may be implemented in hardware, software, and/or firmware. The person 205 may input commands and information to the mobile device using the user direct-input device 208.
  • The core communication system 4820 includes a core communication system operable to exchange data with another computing device and to provide core communication related information to the display circuit. The another computing device may include the adserver 4804, and the data exchange may use the network 299. The tracking system 4830 includes a tracking system operable to determine a physical orientation of an element of a person's 205 sensory system with respect to the display surface, such as the electronic display surface 209 and/or the display surface 4809. The display status circuit 4840 includes a display status circuit operable to determine an availability of the display surface to present advertising content. The availability of the display surface may include at least one of an opportunity, or an opening to present advertising content. The advertisement insertion circuit 4860 includes an advertisement insertion circuit operable to provide advertising content to the display circuit for presentation.
  • In an alternative embodiment, the mobile communications device 4801 may include at least one of a handheld, cellular device, wireless, or portable mobile communications device. In another alternative embodiment, the mobile communications device may include a vehicle mountable mobile communications device. In a further embodiment, the mobile communications device may include a mobile communications device having at least one of a display surface, or a projector engine operable to project an image onto a physically separate display surface.
  • In an alternative embodiment, the display circuit 4850 may include a display circuit operable to facilitate presentation of at least one of human perceivable content related to the core communication function, or to an advertisement. In another embodiment, the display circuit may include a display circuit operable to facilitate presentation of a human perceivable content by at least one of controlling a display surface electronically coupled to the mobile communications device, or controlling a projector engine electronically coupled to the mobile communications device and operable to project an image onto a physically separate display surface. In a further embodiment, the display circuit may include a display circuit operable to facilitate presentation of at least one of a visual, audio, or haptic human perceivable content. An audio display surface may include a speaker cone display surface, and/or a flat panel speaker display surface.
  • In an alternative embodiment, the core communications system 4820 may include at least one of a voice, telephone, cell phone, email, message, global positioning, navigation, picture, video, browsing, or Internet core communication system. In another embodiment, the core communications system may include a core communications system operable to exchange data with at least one of a local, or a remote another computing device. For example, the another computing device may include at least one of a cellular phone site, cellular server, network server, networked computing device, or another mobile communications device. In a further embodiment, the core communications system may be operable to provide core communication related information to the display circuit for presentation to a user of the mobile communications device.
  • In an alternative embodiment, the tracking system 4830 may include a tracking system operable to determine at least one of an eye gaze, or an ear orientation of a person with respect to the display surface. In another embodiment, the tracking system may include a tracking system operable to determine a physical orientation of at least one of an eye, ear, hand, mouth, or nose element of a person's 205 sensory system with respect to the display surface 209, 4809. The person's sensory system respectively includes at least one of a visual, auditory, touch, taste, or smell sensory system. In a further embodiment, the tracking system may include a tracking system operable to determine a physical orientation of an element of a person's sensory system with respect to the display surface, and to acquire data indicative of the physical orientation.
  • In an alternative embodiment, the display status circuit 4840 may include a display status circuit operable to determine an availability of the display surface to advantageously present an advertisement. For example, an availability of the display surface to advantageously present an advertisement may include at least one of an absence of core communication function content being displayed, or an overridable content being displayed. In another embodiment, the display status circuit may include a display status circuit operable to determine at least one of an ambient environmental state that may impact a presentation by the display surface of human perceivable content, or a state of the mobile device that may impact a presentation by the display surface of human perceivable content. In a further embodiment, the display status circuit may include a display status circuit operable to determine an availability of the display surface to present at least one of a priority or a space-available advertising content.
  • In an alternative embodiment, the advertisement insertion circuit 4860 may include an advertisement insertion circuit operable to provide at least one of a priority or a space-available advertising content to the display circuit 4850 for presentation. In another embodiment, the advertisement insertion circuit may include an advertisement insertion circuit operable to provide a selected advertising content to the display circuit for presentation.
  • In an alternative embodiment, the mobile communications device 4801 may include an advertisement acquirement circuit 4862 operable to initiate a selection of the advertising content. In another alternative embodiment, the advertisement acquirement circuit may include an advertisement acquirement circuit operable to select the advertising content. In a further embodiment, the advertisement acquirement circuit may include an advertisement acquirement circuit operable to receive an indication of the selected advertising content. For example, the indication of the selected advertising content may be received from a remote advertising server, such as the adserver 4804, or a local advertising content selector circuit. In another embodiment, the advertisement acquirement circuit may include an advertisement acquirement circuit operable to receive the selected advertising content from a remote advertising server.
  • FIG. 82 illustrates an example computer program product 4900. The computer program product includes a computer-readable storage medium 4910 bearing program instructions 4920. The program instructions are operable to perform a process in a mobile computing device having a core communication function and operable to present human perceivable content using a display surface. The process includes identifying an attention of a person with respect to the display surface. The process also includes determining that the display surface is available to present advertising content. The process further includes presenting an advertising content using the display surface.
  • In an alternative embodiment, the process of the program instructions 4920 may include facilitating a selection of the advertising content 4922. In another embodiment, the process may include receiving the advertising content from a remote advertising server 4924. In a further embodiment, the process may include notifying an advertising selector of the determined availability of the display surface to present advertising content 4926. In another embodiment, the process may include saving an indication of an action by the person 4928. In a further embodiment, the process may include saving an indication of a physiological response by the person with respect to the presented advertising content.
  • FIG. 83 illustrates an example system 5000. The system includes a mobile device 5010 having a core communication function and operable to present human perceivable content using a display surface. The mobile device includes means 5020 for identifying an attention of a person with respect to the display surface. The mobile device also includes means 5030 for determining that the display surface is available to present advertising content. The mobile device further includes means 5040 for presenting an advertising content using the display surface.
  • In an alternative embodiment, the mobile device may include means 5050 for facilitating a selection of the advertising content. In another embodiment, the mobile device may include means 5060 for receiving the advertising content from a remote advertising server. In a further embodiment, the mobile device may include means 5070 for notifying an advertising selector of the determined availability of the display surface to present advertising content.
  • FIG. 84 illustrates an example operational flow 5100. FIG. 84 and several following figures may include various examples of operational flows, discussions, and explanations with respect to the above-described environment 4500 of FIG. 70, and/or with respect to other examples and contexts. However, it should be understood that the operational flows may be executed in a number of other environments and contexts, and/or in modified versions of FIG. 70. Also, although the various operational flows are illustrated in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated, and/or may be performed concurrently.
  • A start operation 5110 starts the operational flow 5100 in a mobile device. The mobile device is operable to present human perceivable content using a display surface. The start operation may be implemented using the mobile device 4510 of FIG. 70. A discovery operation 5120 detects an attention of a person with respect to the display surface. The discovery operation may be implemented using the recognition circuit 4520. An opening operation 5130 determines that space is available on the display surface for presenting a space-available advertisement. The opening operation may be implemented using the display space availability determination circuit 4530. An announcement operation 5140 sends an indication to a third-party of an opportunity for presentation of a space-available advertisement. The announcement operation may be implemented using the third-party notification circuit 4583. The operational flow then moves to an end operation.
  • In an alternative embodiment, the opening operation 5130 may include at least one additional operation. The at least one additional operation may include an operation 5132, an operation 5134, or an operation 5136. The operation 5132 determines that space is available on at least a portion on the display surface for presenting a space-available advertisement. The operation 5134 determines that space is available for presenting a space-available advertisement on one portion of the display surface having at least two portions. The operation 5136 determines that space is now, and/or is predicted to be available on the display surface for presenting a space-available advertisement.
  • FIG. 85 illustrates an alternative embodiment of the operational flow 5100 of FIG. 84. The operational flow may include at least one additional operation. The at least one additional operation may include an operation 5150. The operation 5150 receives an indication of a space-available advertisement. In an embodiment, the indication of a space-available advertisement may be received in response to the sent indication of an opportunity for presentation of a space-available advertisement. The operation 5150 may be implemented using the transceiver circuit 4580. The operation 5150 may include at least one additional operation, such as an operation 5152. The operation 5152 presents the space-available advertisement. The operation 5152 may be implemented using the advertising content output circuit 4560.
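Operations 5150 and 5152 — receiving an indication of a space-available advertisement and then presenting it — can be sketched as follows. The response dictionary key, the function names, and the sample advertisement string are all hypothetical illustration choices, not details from the disclosure.

```python
# Illustrative sketch of operations 5150 and 5152: receive an indication of
# a space-available advertisement (e.g., in the third party's response to a
# sent opportunity indication), then present it. Names are hypothetical.
from typing import Optional


def receive_indication(response: dict) -> Optional[str]:
    # Operation 5150: extract the advertisement, if any, from the response.
    return response.get("space_available_ad")


def present_advertisement(ad: str, display: list) -> None:
    # Operation 5152: hand the advertising content to the display output.
    display.append(ad)


display_surface: list = []
ad = receive_indication({"space_available_ad": "coffee-promo"})
if ad is not None:
    present_advertisement(ad, display_surface)
```

Note that presentation is conditional: if the third party's response carries no advertisement, nothing is handed to the display.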
  • The foregoing detailed description has set forth various embodiments of the systems, apparatus, devices, computer program products, and/or processes using block diagrams, flow diagrams, operation diagrams, flowcharts, illustrations, and/or examples. A particular block diagram, operation diagram, flowchart, illustration, environment, and/or example should not be interpreted as having any dependency or requirement relating to any one or combination of components illustrated therein. For example, in certain instances, one or more elements of an environment may be deemed not necessary and omitted. In other instances, one or more other elements may be deemed necessary and added.
  • Insofar as such block diagrams, operation diagrams, flowcharts, illustrations, and/or examples contain one or more functions and/or operations, it will be understood that each function and/or operation within such block diagrams, operation diagrams, flowcharts, illustrations, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof unless otherwise indicated. In an embodiment, several portions of the subject matter described herein may be implemented via Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. However, those skilled in the art will recognize that some aspects of the embodiments disclosed herein, in whole or in part, can be equivalently implemented in circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as a program product in a variety of forms, and that an illustrative embodiment of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution. 
Examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a floppy disk, a hard disk drive, a Compact Disc (CD), a Digital Video Disk (DVD), a digital tape, a computer memory, etc.; and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link, etc.).
  • Those having skill in the art will recognize that the state of the art has progressed to the point where there is little distinction left between hardware and software implementations of aspects of systems; the use of hardware or software is generally (but not always, in that in certain contexts the choice between hardware and software can become significant) a design choice representing cost vs. efficiency tradeoffs. Those having skill in the art will appreciate that there are various vehicles by which processes and/or systems and/or other technologies described herein can be effected (e.g., hardware, software, and/or firmware), and that the preferred vehicle will vary with the context in which the processes and/or systems and/or other technologies are deployed. For example, if an implementer determines that speed and accuracy are paramount, the implementer may opt for a mainly hardware and/or firmware vehicle; alternatively, if flexibility is paramount, the implementer may opt for a mainly software implementation; or, yet again alternatively, the implementer may opt for some combination of hardware, software, and/or firmware. Hence, there are several possible vehicles by which the processes and/or devices and/or other technologies described herein may be effected, none of which is inherently superior to the other in that any vehicle to be utilized is a choice dependent upon the context in which the vehicle will be deployed and the specific concerns (e.g., speed, flexibility, or predictability) of the implementer, any of which may vary. Those skilled in the art will recognize that optical aspects of implementations will typically employ optically-oriented hardware, software, and/or firmware.
  • In a general sense, those skilled in the art will recognize that the various aspects described herein which can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or any combination thereof can be viewed as being composed of various types of “electrical circuitry.” Consequently, as used herein “electrical circuitry” includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of random access memory), and/or electrical circuitry forming a communications device (e.g., a modem, communications switch, or optical-electrical equipment). Those having skill in the art will recognize that the subject matter described herein may be implemented in an analog or digital fashion or some combination thereof.
  • It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to inventions containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations).
  • Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase “A or B” will be understood to include the possibilities of “A” or “B” or “A and B.”
  • The herein described aspects depict different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely exemplary, and that in fact many other architectures can be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively “associated” such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated can also be viewed as being “operably connected,” or “operably coupled,” to each other to achieve the desired functionality. Any two components capable of being so associated can also be viewed as being “operably couplable” to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components.
  • While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.

Claims (43)

  1. A method implemented in a mobile device having a core communication function and operable to present human perceivable content using a display, the method comprising:
    detecting an attention of a person with respect to the display;
    determining that space is available on the display for presentation of advertising content; and
    sending to a third-party an indication of the detected attention of the person and an indication of the determined availability of the display to present advertising content.
  2. The method of claim 1, wherein the detecting an attention of a person with respect to the display further includes:
    detecting at least one of an eye gaze, or an ear position of a person with respect to the display.
  3. The method of claim 1, wherein the detecting an attention of a person with respect to the display further includes:
    detecting a neurological indicator of attention of a person with respect to the display.
  4. The method of claim 1, further comprising:
    receiving an indication of an advertising content selected by a remotely located application for presentation.
  5. The method of claim 4, further comprising:
    presenting the selected advertising content using the display.
  6. A mobile communications device comprising:
    a display circuit operable to facilitate presentation of human perceivable content on a display surface;
    a core communication system operable to exchange data with another computing device and to provide core communication related information to the display circuit;
    a tracking system operable to determine a physical orientation of an element of a person's sensory system with respect to the display surface;
    a display status circuit operable to determine an availability of the display surface to present advertising content; and
    an advertisement insertion circuit operable to provide advertising content to the display circuit for presentation.
  7. The mobile communications device of claim 6, wherein the mobile communications device further includes:
    at least one of a handheld, cellular device, wireless, or portable mobile communications device.
  8. The mobile communications device of claim 6, wherein the mobile communications device further includes:
    at least one of a vehicle mountable mobile communications device.
  9. The mobile communications device of claim 6, wherein the mobile communications device further includes:
    a mobile communications device having at least one of a display surface, or a projector engine operable to project an image onto a physically separate display surface.
  10. The mobile communications device of claim 6, wherein the display circuit operable to facilitate presentation of human perceivable content further includes:
    a display circuit operable to facilitate presentation of at least one of human perceivable content related to the core communication function, or to an advertising content.
  11. The mobile communications device of claim 6, wherein the display circuit operable to facilitate presentation of human perceivable content further includes:
    a display circuit operable to facilitate presentation of a human perceivable content by at least one of controlling a display surface electronically coupled to the mobile communications device, or controlling a projector engine electronically coupled to the mobile communications device and operable to project an image onto a physically separate display surface.
  12. The mobile communications device of claim 6, wherein the display circuit operable to facilitate presentation of human perceivable content further includes:
    a display circuit operable to facilitate presentation of at least one of a visual, audio, or haptic human perceivable content.
  13. The mobile communications device of claim 6, wherein the core communications system further includes:
    at least one of a voice, telephone, cell phone, email, message, global positioning, navigation, picture, video, browsing, or Internet core communication system.
  14. The mobile communications device of claim 6, wherein the core communications system further includes:
    a core communications system operable to exchange data with at least one of a local, or a remote another computing device.
  15. The mobile communications device of claim 6, wherein the core communications system further includes:
    a core communications system operable to provide core communication related information to the display circuit for presentation to a user of the mobile communications device.
  16. The mobile communications device of claim 6, wherein the tracking system further includes:
    a tracking system operable to determine at least one of an eye gaze, or an ear orientation of a person with respect to the display surface.
  17. The mobile communications device of claim 6, wherein the tracking system further includes:
    a tracking system operable to determine a physical orientation of at least one of an eye, ear, hand, mouth, or nose element of a person's sensory system with respect to the display surface.
  18. The mobile communications device of claim 6, wherein the tracking system further includes:
    a tracking system operable to determine a physical orientation of an element of a person's sensory system with respect to the display surface, and to acquire data indicative of the physical orientation.
  19. The mobile communications device of claim 6, wherein the display status circuit further includes:
    a display status circuit operable to determine an availability of the display surface to advantageously present advertising content.
  20. The mobile communications device of claim 6, wherein the display status circuit further includes:
    a display status circuit operable to determine at least one of an ambient environmental state that may impact a presentation by the display surface of human perceivable content, or a state of the mobile device that may impact a presentation by the display surface of human perceivable content.
  21. The mobile communications device of claim 6, wherein the display status circuit further includes:
    a display status circuit operable to determine an availability of the display surface to present at least one of a priority or a space-available advertising content.
  22. The mobile communications device of claim 6, wherein the advertisement insertion circuit further includes:
    an advertisement insertion circuit operable to provide at least one of a priority or a space-available advertising content to the display circuit for presentation.
  23. The mobile communications device of claim 6, wherein the advertisement insertion circuit further includes:
    an advertisement insertion circuit operable to provide a selected advertising content to the display circuit for presentation.
  24. The mobile communications device of claim 6, further comprising:
    an advertisement acquirement circuit operable to initiate a selection of the advertising content.
  25. The mobile communications device of claim 24, wherein the advertisement acquirement circuit further includes:
    an advertisement acquirement circuit operable to select the advertising content.
  26. The mobile communications device of claim 24, wherein the advertisement acquirement circuit further includes:
    an advertisement acquirement circuit operable to receive an indication of the selected advertising content.
  27. The mobile communications device of claim 24, wherein the advertisement acquirement circuit further includes:
    an advertisement acquirement circuit operable to receive the selected advertising content from a remote advertising server.
  28. A computer program product comprising:
    (a) program instructions operable to perform a process in a mobile computing device having a core communication function and operable to present human perceivable content using a display surface, the process comprising:
    identifying an attention of a person with respect to the display surface;
    determining that the display surface is available to present advertising content; and
    presenting an advertising content using the display surface; and
    (b) a computer-readable storage medium bearing the program instructions.
  29. The computer program product of claim 28, wherein the process further comprises:
    facilitating a selection of the advertising content.
  30. The computer program product of claim 28, wherein the process further comprises:
    receiving the advertising content from a remote advertising server.
  31. The computer program product of claim 28, wherein the process further comprises:
    notifying an advertising selector of the determined availability of the display surface to present advertising content.
  32. The computer program product of claim 28, wherein the process further comprises:
    saving an indication of an action by the person.
  33. The computer program product of claim 28, wherein the process further comprises:
    saving an indication of a physiological response by the person with respect to the presented advertising content.
  34. A mobile device having a core communication function and operable to present human perceivable content using a display surface, the mobile device comprising:
    means for identifying an attention of a person with respect to the display surface;
    means for determining that the display surface is available to present advertising content; and
    means for presenting an advertising content using the display surface.
  35. The mobile device of claim 34, further comprising:
    means for facilitating a selection of the advertising content.
  36. The mobile device of claim 34, further comprising:
    means for receiving the advertising content from a remote advertising server.
  37. The mobile device of claim 34, further comprising:
    means for notifying an advertising selector of the determined availability of the display surface to present advertising content.
  38. A method implemented in a mobile device operable to present human perceivable content using a display surface, the method comprising:
    detecting an attention of a person with respect to the display surface;
    determining that space is available on the display surface for presenting a space-available advertisement; and
    sending an indication to a third-party of an opportunity for presentation of a space-available advertisement.
  39. The method of claim 38, wherein the determining that space is available on the display surface for presenting a space-available advertisement further includes:
    determining that space is available on at least a portion of the display surface for presenting a space-available advertisement.
  40. The method of claim 38, wherein the determining that space is available on the display surface for presenting a space-available advertisement further includes:
    determining that space is available for presenting a space-available advertisement on one portion of the display surface having at least two portions.
  41. The method of claim 38, wherein the determining that space is available on the display surface for presenting a space-available advertisement further includes:
    determining that space is now or is predicted to be available on the display surface for presenting a space-available advertisement.
  42. The method of claim 38, further comprising:
    receiving an indication of a space-available advertisement.
  43. The method of claim 42, further comprising:
    presenting the indicated space-available advertisement.
US12006793 2007-10-24 2008-01-03 Opportunity advertising in a mobile device Pending US20090112713A1 (en)

Priority Applications (12)

Application Number Priority Date Filing Date Title
US11977752 US9513699B2 (en) 2007-10-24 2007-10-24 Method of selecting a second content based on a user's reaction to a first content
US11977748 US20090113297A1 (en) 2007-10-24 2007-10-25 Requesting a second content based on a user's reaction to a first content
US11978206 US8112407B2 (en) 2007-10-24 2007-10-26 Selecting a second content based on a user's reaction to a first content
US11978534 US8126867B2 (en) 2007-10-24 2007-10-27 Returning a second content based on a user's reaction to a first content
US11980321 US8234262B2 (en) 2007-10-24 2007-10-29 Method of selecting a second content based on a user's reaction to a first content of at least two instances of displayed content
US11981573 US20090112849A1 (en) 2007-10-24 2007-10-30 Selecting a second content based on a user's reaction to a first content of at least two instances of displayed content
US11983406 US8001108B2 (en) 2007-10-24 2007-11-07 Returning a new content based on a person's reaction to at least two instances of previously displayed content
US11998826 US20090112695A1 (en) 2007-10-24 2007-11-30 Physiological response based targeted advertising
US11998820 US20090112694A1 (en) 2007-10-24 2007-11-30 Targeted-advertising based on a sensed physiological response by a person to a general advertisement
US11998779 US20090112693A1 (en) 2007-10-24 2007-11-30 Providing personalized advertising
US12001759 US9582805B2 (en) 2007-10-24 2007-12-11 Returning a personalized advertisement
US12006793 US20090112713A1 (en) 2007-10-24 2008-01-03 Opportunity advertising in a mobile device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US12006792 US20090112696A1 (en) 2007-10-24 2008-01-03 Method of space-available advertising in a mobile device
US12006793 US20090112713A1 (en) 2007-10-24 2008-01-03 Opportunity advertising in a mobile device
US12011031 US20090112697A1 (en) 2007-10-30 2008-01-22 Providing personalized advertising

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US11977752 Continuation US9513699B2 (en) 2007-10-24 2007-10-24 Method of selecting a second content based on a user's reaction to a first content
US11998779 Continuation-In-Part US20090112693A1 (en) 2007-10-24 2007-11-30 Providing personalized advertising

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US12001759 Continuation-In-Part US9582805B2 (en) 2007-10-24 2007-12-11 Returning a personalized advertisement
US12006792 Continuation-In-Part US20090112696A1 (en) 2007-10-24 2008-01-03 Method of space-available advertising in a mobile device

Publications (1)

Publication Number Publication Date
US20090112713A1 (en) 2009-04-30

Family

ID=40584494

Family Applications (2)

Application Number Title Priority Date Filing Date
US11977752 Active 2029-02-22 US9513699B2 (en) 2007-10-24 2007-10-24 Method of selecting a second content based on a user's reaction to a first content
US12006793 Pending US20090112713A1 (en) 2007-10-24 2008-01-03 Opportunity advertising in a mobile device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11977752 Active 2029-02-22 US9513699B2 (en) 2007-10-24 2007-10-24 Method of selecting a second content based on a user's reaction to a first content

Country Status (1)

Country Link
US (2) US9513699B2 (en)

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080077020A1 (en) * 2006-09-22 2008-03-27 Bam Labs, Inc. Method and apparatus for monitoring vital signs remotely
US20090112656A1 (en) * 2007-10-24 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Returning a personalized advertisement
US20090112810A1 (en) * 2007-10-24 2009-04-30 Searete Llc Selecting a second content based on a user's reaction to a first content
US20090112696A1 (en) * 2007-10-24 2009-04-30 Jung Edward K Y Method of space-available advertising in a mobile device
US20090112813A1 (en) * 2007-10-24 2009-04-30 Searete Llc Method of selecting a second content based on a user's reaction to a first content of at least two instances of displayed content
US20090112694A1 (en) * 2007-10-24 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Targeted-advertising based on a sensed physiological response by a person to a general advertisement
US20090112849A1 (en) * 2007-10-24 2009-04-30 Searete Llc Selecting a second content based on a user's reaction to a first content of at least two instances of displayed content
US20090112697A1 (en) * 2007-10-30 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Providing personalized advertising
US20090112914A1 (en) * 2007-10-24 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Returning a second content based on a user's reaction to a first content
US20090219168A1 (en) * 2008-02-29 2009-09-03 Sony Corporation Living posters
US20090292713A1 (en) * 2008-05-23 2009-11-26 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Acquisition and particular association of data indicative of an inferred mental state of an authoring user
US20090290767A1 (en) * 2008-05-23 2009-11-26 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Determination of extent of congruity between observation of authoring user and observation of receiving user
US20100109868A1 (en) * 2008-11-05 2010-05-06 Berger Adam L Notifying A User Of An Available Media Object
US20100169157A1 (en) * 2008-12-30 2010-07-01 Nokia Corporation Methods, apparatuses, and computer program products for providing targeted advertising
US20100205541A1 (en) * 2009-02-11 2010-08-12 Jeffrey A. Rapaport social network driven indexing system for instantly clustering people with concurrent focus on same topic into on-topic chat rooms and/or for generating on-topic search results tailored to user preferences regarding topic
US20110213664A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US20110221657A1 (en) * 2010-02-28 2011-09-15 Osterhout Group, Inc. Optical stabilization of displayed content with a variable lens
US8155706B1 (en) * 2009-12-21 2012-04-10 David Hurst Scent notification system for a portable communication device
US20120280826A1 (en) * 2008-06-17 2012-11-08 Canon Kabushiki Kaisha Management apparatus for managing a content display change time on a display apparatus and content information to be transmitted to a terminal
US9091851B2 (en) 2010-02-28 2015-07-28 Microsoft Technology Licensing, Llc Light control in head mounted displays
US9097890B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc Grating in a light transmissive illumination system for see-through near-eye display glasses
US9097891B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US9128281B2 (en) 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US9134534B2 (en) 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US9182596B2 (en) 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US9223134B2 (en) 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9285589B2 (en) 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
US9341843B2 (en) 2010-02-28 2016-05-17 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a small scale image source
US9366862B2 (en) 2010-02-28 2016-06-14 Microsoft Technology Licensing, Llc System and method for delivering content to a group of see-through near eye display eyepieces
US9513699B2 (en) 2007-10-24 2016-12-06 Invention Science Fund I, LLC Method of selecting a second content based on a user's reaction to a first content
US9759917B2 (en) 2010-02-28 2017-09-12 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered AR eyepiece interface to external devices

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130036197A1 (en) * 2011-08-05 2013-02-07 Xtreme Labs Inc. Method and system for a mobile local server
US9053493B2 (en) 2012-08-13 2015-06-09 Google Inc. Affecting display of content based on negative reactions
US20140316881A1 (en) * 2013-02-13 2014-10-23 Emotient Estimation of affective valence and arousal with automatic facial expression measurement
US9383819B2 (en) 2013-06-03 2016-07-05 Daqri, Llc Manipulation of virtual object in augmented reality via intent
US9354702B2 (en) 2013-06-03 2016-05-31 Daqri, Llc Manipulation of virtual object in augmented reality via thought
US10026095B2 (en) * 2013-09-10 2018-07-17 Chian Chiu Li Systems and methods for obtaining and utilizing user reaction and feedback
CN105745613A (en) * 2013-12-26 2016-07-06 英特尔公司 Mechanism for facilitating dynamic change orientation for edit modes at computing devices
US9880711B2 (en) * 2014-01-22 2018-01-30 Google Llc Adaptive alert duration
US9639231B2 (en) * 2014-03-17 2017-05-02 Google Inc. Adjusting information depth based on user's attention
US9575560B2 (en) 2014-06-03 2017-02-21 Google Inc. Radar-based gesture-recognition through a wearable device
US9811164B2 (en) 2014-08-07 2017-11-07 Google Inc. Radar-based gesture sensing and data transmission
US9921660B2 (en) 2014-08-07 2018-03-20 Google Llc Radar-based gesture recognition
US9588625B2 (en) 2014-08-15 2017-03-07 Google Inc. Interactive textiles
US20160055201A1 (en) * 2014-08-22 2016-02-25 Google Inc. Radar Recognition-Aided Searches
US9778749B2 (en) 2014-08-22 2017-10-03 Google Inc. Occluded gesture recognition
US20160063015A1 (en) * 2014-08-28 2016-03-03 Kent Andrew Edmonds Systems and methods for providing complimentary content on linked machines
US20160072756A1 (en) * 2014-09-10 2016-03-10 International Business Machines Corporation Updating a Sender of an Electronic Communication on a Disposition of a Recipient Toward Content of the Electronic Communication
US10016162B1 (en) 2015-03-23 2018-07-10 Google Llc In-ear health monitoring
US9983747B2 (en) 2015-03-26 2018-05-29 Google Llc Two-layer interactive textiles
US9848780B1 (en) 2015-04-08 2017-12-26 Google Inc. Assessing cardiovascular function using an optical sensor
US10080528B2 (en) 2015-05-19 2018-09-25 Google Llc Optical central venous pressure measurement
US10088908B1 (en) 2015-05-27 2018-10-02 Google Llc Gesture detection and interactions
US9837760B2 (en) 2015-11-04 2017-12-05 Google Inc. Connectors for connecting electronics embedded in garments to external devices

Citations (104)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4670798A (en) * 1983-10-28 1987-06-02 Max L. Campbell Point of purchase advertising system
US4931865A (en) * 1988-08-24 1990-06-05 Sebastiano Scarampi Apparatus and methods for monitoring television viewers
US4984098A (en) * 1986-08-01 1991-01-08 Popad, Inc. Point of purchase automatically-actuated audio advertising device and method
US5117407A (en) * 1988-02-11 1992-05-26 Vogel Peter S Vending machine with synthesized description messages
US5227874A (en) * 1986-03-10 1993-07-13 Kohorn H Von Method for measuring the effectiveness of stimuli on decisions of shoppers
US5485139A (en) * 1993-09-30 1996-01-16 Tarnovsky; George V. Talking display signage
US5657004A (en) * 1995-04-11 1997-08-12 Felknor International, Inc. Electronically controlled point of purchase display
US5731805A (en) * 1996-06-25 1998-03-24 Sun Microsystems, Inc. Method and apparatus for eyetrack-driven text enlargement
US5796343A (en) * 1994-09-09 1998-08-18 Stauder; Gary D. Device and method for preventing damage to goods during handling by material handling equipment
US5923252A (en) * 1995-04-06 1999-07-13 Marvel Corporation Pty Limited Audio/visual marketing device and marketing system
US6064383A (en) * 1996-10-04 2000-05-16 Microsoft Corporation Method and system for selecting an emotional appearance and prosody for a graphical character
US6190314B1 (en) * 1998-07-15 2001-02-20 International Business Machines Corporation Computer input device with biosensors for sensing user emotions
US6219657B1 (en) * 1997-03-13 2001-04-17 Nec Corporation Device and method for creation of emotions
US20020030163A1 (en) * 2000-08-09 2002-03-14 Zhang Evan Y.W. Image intensifier and LWIR fusion/combination system
US6393136B1 (en) * 1999-01-04 2002-05-21 International Business Machines Corporation Method and apparatus for determining eye contact
US20020072952A1 (en) * 2000-12-07 2002-06-13 International Business Machines Corporation Visual and audible consumer reaction collection
US20020112035A1 (en) * 2000-10-30 2002-08-15 Carey Brian M. System and method for performing content experience management
US6520905B1 (en) * 1998-02-26 2003-02-18 Eastman Kodak Company Management of physiological and psychological state of an individual using images portable biosensor device
US20030060897A1 (en) * 2000-03-24 2003-03-27 Keisuke Matsuyama Commercial effect measuring system, commercial system, and appealing power sensor
US20030060728A1 (en) * 2001-09-25 2003-03-27 Mandigo Lonnie D. Biofeedback based personal entertainment system
US20030078840A1 (en) * 2001-10-19 2003-04-24 Strunk David D. System and method for interactive advertising
US20030088463A1 (en) * 1999-10-21 2003-05-08 Steven Fischman System and method for group advertisement optimization
US6577329B1 (en) * 1999-02-25 2003-06-10 International Business Machines Corporation Method and system for relevance feedback through gaze tracking and ticker interfaces
US6585521B1 (en) * 2001-12-21 2003-07-01 Hewlett-Packard Development Company, L.P. Video indexing based on viewers' behavior and emotion feedback
US6601021B2 (en) * 2000-12-08 2003-07-29 Xerox Corporation System and method for analyzing eyetracker data
US6606605B1 (en) * 1998-07-20 2003-08-12 Usa Technologies, Inc. Method to obtain customer specific data for public access electronic commerce services
US6687608B2 (en) * 2000-12-27 2004-02-03 Fuji Photo Film Co., Ltd. Information notification system and method, and navigation system and method
US6704034B1 (en) * 2000-09-28 2004-03-09 International Business Machines Corporation Method and apparatus for providing accessibility through a context sensitive magnifying glass
US6731307B1 (en) * 2000-10-30 2004-05-04 Koninklijke Philips Electronics N.V. User interface/entertainment device that simulates personal interaction and responds to user's mental state and/or personality
US20040101178A1 (en) * 2002-11-25 2004-05-27 Eastman Kodak Company Imaging method and system for health monitoring and personal security
US20040152957A1 (en) * 2000-06-16 2004-08-05 John Stivoric Apparatus for detecting, receiving, deriving and displaying human physiological and contextual information
US20050010637A1 (en) * 2003-06-19 2005-01-13 Accenture Global Services Gmbh Intelligent collaborative media
US20050017870A1 (en) * 2003-06-05 2005-01-27 Allison Brendan Z. Communication methods based on brain computer interfaces
US20050021677A1 (en) * 2003-05-20 2005-01-27 Hitachi, Ltd. Information providing method, server, and program
US6873314B1 (en) * 2000-08-29 2005-03-29 International Business Machines Corporation Method and system for the recognition of reading skimming and scanning from eye-gaze patterns
US20050071865A1 (en) * 2003-09-30 2005-03-31 Martins Fernando C. M. Annotating meta-data with user responses to digital content
US20050107718A1 (en) * 2002-01-04 2005-05-19 Dan Hashimshony Method and system for examining tissue according to the dielectric properties thereof
US20050157377A1 (en) * 2004-01-20 2005-07-21 Ron Goldman Portable electronic device with a laser projection display
US7010497B1 (en) * 1999-07-08 2006-03-07 Dynamiclogic, Inc. System and method for evaluating and/or monitoring effectiveness of on-line advertising
US20060064411A1 (en) * 2004-09-22 2006-03-23 William Gross Search engine using user intent
US20060074883A1 (en) * 2004-10-05 2006-04-06 Microsoft Corporation Systems, methods, and interfaces for providing personalized search and information access
US20060122834A1 (en) * 2004-12-03 2006-06-08 Bennett Ian M Emotion detection device & method for use in distributed systems
US20060143647A1 (en) * 2003-05-30 2006-06-29 Bill David S Personalizing content based on mood
US20060146281A1 (en) * 2004-12-03 2006-07-06 Goodall Eleanor V Method and system for vision enhancement
US20060170945A1 (en) * 2004-12-30 2006-08-03 Bill David S Mood-based organization and display of instant messenger buddy lists
US7169113B1 (en) * 1997-09-05 2007-01-30 Hello-Hello, Inc. Portrayal of human information visualization
US7181693B1 (en) * 2000-03-17 2007-02-20 Gateway Inc. Affective control of information systems
US20070052672A1 (en) * 2005-09-08 2007-03-08 Swisscom Mobile Ag Communication device, system and method
US20070066916A1 (en) * 2005-09-16 2007-03-22 Imotions Emotion Technology Aps System and method for determining human emotion by analyzing eye properties
US7197472B2 (en) * 2000-01-13 2007-03-27 Erinmedia, Llc Market data acquisition system
US20070085759A1 (en) * 2005-09-15 2007-04-19 Lg Electronics Inc. Method for displaying multimedia contents and mobile communications terminal capable of implementing the same
US20070104369A1 (en) * 2005-11-04 2007-05-10 Eyetracking, Inc. Characterizing dynamic regions of digital media data
US20070105071A1 (en) * 2005-11-04 2007-05-10 Eye Tracking, Inc. Generation of test stimuli in visual media
US7225142B1 (en) * 1996-08-01 2007-05-29 At&T Corp. Interactive multimedia advertising and electronic commerce on a hypertext network
US20070150916A1 (en) * 2005-12-28 2007-06-28 James Begole Using sensors to provide feedback on the access of digital content
US20070162505A1 (en) * 2006-01-10 2007-07-12 International Business Machines Corporation Method for using psychological states to index databases
US20070167689A1 (en) * 2005-04-01 2007-07-19 Motorola, Inc. Method and system for enhancing a user experience using a user's physiological state
US20070184420A1 (en) * 2006-02-08 2007-08-09 Honeywell International Inc. Augmented tutoring
US20070191691A1 (en) * 2005-05-19 2007-08-16 Martin Polanco Identification of guilty knowledge and malicious intent
US20080065468A1 (en) * 2006-09-07 2008-03-13 Charles John Berg Methods for Measuring Emotive Response and Selection Preference
US7356547B2 (en) * 2001-11-21 2008-04-08 Microsoft Corporation Methods and systems for selectively displaying advertisements
US20080091515A1 (en) * 2006-10-17 2008-04-17 Patentvc Ltd. Methods for utilizing user emotional state in a business process
US7363282B2 (en) * 2003-12-03 2008-04-22 Microsoft Corporation Search system using user behavior data
US20080104045A1 (en) * 2006-11-01 2008-05-01 Cohen Alain J Collectively enhanced semantic search
US20080114756A1 (en) * 1999-12-28 2008-05-15 Levino Automatic, personalized online information and product services
US20080146892A1 (en) * 2006-12-19 2008-06-19 Valencell, Inc. Physiological and environmental monitoring systems and methods
US20080147488A1 (en) * 2006-10-20 2008-06-19 Tunick James A System and method for monitoring viewer attention with respect to a display and determining associated charges
US20080162142A1 (en) * 2006-12-29 2008-07-03 Industrial Technology Research Institute Emotion abreaction device and using method of emotion abreaction device
US7418405B1 (en) * 2003-05-23 2008-08-26 Amazon.Com, Inc. Interactive time-limited merchandising program and method for improved online cross-selling
US20080209321A1 (en) * 2007-02-15 2008-08-28 Fujitsu Limited Mobile terminal apparatus, and display control method therefor
US20080281661A1 (en) * 2007-05-08 2008-11-13 Jenn-Shoou Young Real-time Advertisement Displaying System and Method thereof
US20090062679A1 (en) * 2007-08-27 2009-03-05 Microsoft Corporation Categorizing perceptual stimuli by detecting subconscious responses
US7503653B2 (en) * 2004-11-22 2009-03-17 Carestream Health, Inc. Diagnostic system having gaze tracking
US20090100015A1 (en) * 2007-10-11 2009-04-16 Alon Golan Web-based workspace for enhancing internet search experience
US20090112656A1 (en) * 2007-10-24 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Returning a personalized advertisement
US20090112693A1 (en) * 2007-10-24 2009-04-30 Jung Edward K Y Providing personalized advertising
US20090112696A1 (en) * 2007-10-24 2009-04-30 Jung Edward K Y Method of space-available advertising in a mobile device
US20090112695A1 (en) * 2007-10-24 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Physiological response based targeted advertising
US20090150363A1 (en) * 2002-09-03 2009-06-11 William Gross Apparatus and methods for locating data
US7547279B2 (en) * 2002-01-23 2009-06-16 Samsung Electronics Co., Ltd. System and method for recognizing user's emotional state using short-time monitoring of physiological signals
US7555287B1 (en) * 2001-11-01 2009-06-30 Nokia Corporation Customized messaging between wireless access point and services
US7562064B1 (en) * 1999-07-03 2009-07-14 Microsoft Corporation Automated web-based targeted advertising with quotas
US7679579B2 (en) * 2004-12-24 2010-03-16 Fujifilm Corporation Projection type image display apparatus
US7689672B2 (en) * 2000-02-25 2010-03-30 Microsoft Corporation Collecting user attributes and device attributes to target users with promotions
US7702318B2 (en) * 2005-09-14 2010-04-20 Jumptap, Inc. Presentation of sponsored content based on mobile transaction event
US7703114B2 (en) * 2005-02-25 2010-04-20 Microsoft Corporation Television system targeted advertising
US7703611B1 (en) * 2000-09-29 2010-04-27 Aol Inc. Targeted geographical condition notification of users based on a geographic location and device types or software of the users
US7720784B1 (en) * 2005-08-30 2010-05-18 Walt Froloff Emotive intelligence applied in electronic devices and internet using emotion displacement quantification in pain and pleasure space
US7742037B2 (en) * 2003-12-10 2010-06-22 Sony Corporation Input device, input method and electronic equipment
US7760910B2 (en) * 2005-12-12 2010-07-20 Eyetools, Inc. Evaluation of visual stimuli using existing viewing data
US7762665B2 (en) * 2003-03-21 2010-07-27 Queen's University At Kingston Method and apparatus for communication between humans and devices
US7769632B2 (en) * 1999-12-17 2010-08-03 Promovu, Inc. System for selectively communicating promotional information to a person
US7769764B2 (en) * 2005-09-14 2010-08-03 Jumptap, Inc. Mobile advertisement syndication
US7865404B2 (en) * 1996-01-17 2011-01-04 Paradox Technical Solutions Llc Intelligent agents for electronic commerce
US7874983B2 (en) * 2003-01-27 2011-01-25 Motorola Mobility, Inc. Determination of emotional and physiological states of a recipient of a communication
US7881493B1 (en) * 2003-04-11 2011-02-01 Eyetools, Inc. Methods and apparatuses for use of eye interpretation information
US7904439B2 (en) * 2002-04-04 2011-03-08 Microsoft Corporation System and methods for constructing personalized context-sensitive portal pages or views by analyzing patterns of users' information access activities
US7908150B2 (en) * 2001-05-07 2011-03-15 Jean-Luc Rochet System and a method for performing personalised interactive automated electronic marketing of the marketing service provider
US7930199B1 (en) * 2006-07-21 2011-04-19 Sensory Logic, Inc. Method and report assessing consumer reaction to a stimulus by matching eye position with facial coding
US7931602B2 (en) * 2004-03-24 2011-04-26 Seiko Epson Corporation Gaze guidance degree calculation system, gaze guidance degree calculation program, storage medium, and gaze guidance degree calculation method
US8126220B2 (en) * 2007-05-03 2012-02-28 Hewlett-Packard Development Company L.P. Annotating stimulus based on determined emotional response
US8407055B2 (en) * 2005-08-05 2013-03-26 Sony Corporation Information processing apparatus and method for recognizing a user's emotion
US8473044B2 (en) * 2007-03-07 2013-06-25 The Nielsen Company (Us), Llc Method and system for measuring and ranking a positive or negative response to audiovisual or interactive media, products or activities using physiological signals
US8712713B2 (en) * 2006-03-20 2014-04-29 Qualcomm Incorporated Method and apparatus for determining the altitude of a mobile device

Family Cites Families (118)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4075657A (en) 1977-03-03 1978-02-21 Weinblatt Lee S Eye movement monitoring apparatus
DE69334106T2 (en) 1992-12-09 2007-09-06 Sedna Patent Services, Llc Menu driven television program access system and method
US5471542A (en) 1993-09-27 1995-11-28 Ragland; Richard R. Point-of-gaze tracker
US5676138A (en) 1996-03-15 1997-10-14 Zawilinski; Kenneth Michael Emotional response analyzer system with multimedia display
US5933811A (en) 1996-08-20 1999-08-03 Paul D. Angles System and method for delivering customized advertisements within interactive communication systems
US5948061A (en) 1996-10-29 1999-09-07 Double Click, Inc. Method of delivery, targeting, and measuring advertising over networks
US6118888A (en) * 1997-02-28 2000-09-12 Kabushiki Kaisha Toshiba Multi-modal interface apparatus and method
US5982357A (en) 1997-03-12 1999-11-09 Key Tronic Corporation Computer keyboard systems and methods for determining excessive key stroke force
GB9800590D0 (en) 1998-01-13 1998-03-11 Bae Sema Ltd Intelligent human computer interface system
EP0963115A1 (en) 1998-06-05 1999-12-08 THOMSON multimedia Apparatus and method for selecting viewers' profile in interactive TV
US6807532B1 (en) 1998-07-20 2004-10-19 Usa Technologies, Inc. Method of soliciting a user to input survey data at an electronic commerce terminal
US6466232B1 (en) * 1998-12-18 2002-10-15 Tangis Corporation Method and system for controlling presentation of information to a user based on the user's condition
US7120880B1 (en) 1999-02-25 2006-10-10 International Business Machines Corporation Method and system for real-time determination of a subject's interest level to media content
US6401050B1 (en) * 1999-05-21 2002-06-04 The United States Of America As Represented By The Secretary Of The Navy Non-command, visual interaction system for watchstations
US7451177B1 (en) * 1999-08-12 2008-11-11 Avintaquin Capital, Llc System for and method of implementing a closed loop response architecture for electronic commerce
US6847992B1 (en) * 1999-10-19 2005-01-25 Netzero, Inc. Data pass-through to sponsors
US7630986B1 (en) * 1999-10-27 2009-12-08 Pinpoint, Incorporated Secure data interchange
US7472102B1 (en) 1999-10-29 2008-12-30 Microsoft Corporation Cluster-based and rule-based approach for automated web-based targeted advertising with quotas
US7779436B1 (en) 1999-11-24 2010-08-17 Jlb Ventures Llc Method for using banner advertisements during commercial breaks
US20040193488A1 (en) 2000-01-19 2004-09-30 Denis Khoo Method and system for advertising over a data network
EP1215662A4 (en) 2000-02-28 2005-09-21 Sony Corp Speech recognition device and speech recognition method, and recording medium
US6453194B1 (en) * 2000-03-29 2002-09-17 Daniel A. Hill Method of measuring consumer reaction while participating in a consumer activity
US7096185B2 (en) * 2000-03-31 2006-08-22 United Video Properties, Inc. User speech interfaces for interactive media guidance applications
US7228327B2 (en) * 2000-05-08 2007-06-05 Hoshiko Llc Method and apparatus for delivering content via information retrieval devices
US6456262B1 (en) * 2000-05-09 2002-09-24 Intel Corporation Microdisplay with eye gaze detection
EP1299835A4 (en) 2000-05-26 2007-07-11 Hello Hello Inc Audience attention and response evaluation
WO2002010358A3 (en) * 2000-07-31 2003-08-21 Maxygen Inc Nucleotide incorporating enzymes
JP2002112969A (en) * 2000-09-02 2002-04-16 Samsung Electronics Co Ltd Device and method for recognizing physical and emotional conditions
US7590723B2 (en) 2000-09-25 2009-09-15 Short Path Inc. System and method for targeting advertisements to tenants in a building
US6904408B1 (en) 2000-10-19 2005-06-07 Mccarthy John Bionet method, system and personalized web content manager responsive to browser viewers' psychological preferences, behavioral responses and physiological stress indicators
US20020174425A1 (en) 2000-10-26 2002-11-21 Markel Steven O. Collection of affinity data from television, video, or similar transmissions
US6622140B1 (en) 2000-11-15 2003-09-16 Justsystem Corporation Method and apparatus for analyzing affect and emotion in text
EP1241588A3 (en) 2001-01-23 2006-01-04 Matsushita Electric Industrial Co., Ltd. Audio information provision system
US20020107718A1 (en) * 2001-02-06 2002-08-08 Morrill Mark N. "Host vendor driven multi-vendor search system for dynamic market preference tracking"
US20020194501A1 (en) * 2001-02-25 2002-12-19 Storymail, Inc. System and method for conducting a secure interactive communication session
JP2004287471A (en) * 2001-03-02 2004-10-14 Ccp:Kk Automatic editing system
GB0107689D0 (en) 2001-03-28 2001-05-16 Ncr Int Inc Self service terminal
US6968334B2 (en) 2001-05-15 2005-11-22 Nokia Corporation Method and business process to maintain privacy in distributed recommendation systems
KR100580617B1 (en) 2001-11-05 2006-05-16 삼성전자주식회사 Object growth control system and method
US20030135582A1 (en) * 2001-12-21 2003-07-17 Docomo Communications Laboratories Usa, Inc. Context aware search service
US20030128389A1 (en) * 2001-12-26 2003-07-10 Eastman Kodak Company Method for creating and using affective information in a digital imaging system cross reference to related applications
US6901411B2 (en) * 2002-02-11 2005-05-31 Microsoft Corporation Statistical bigram correlation model for image retrieval
US7905832B1 (en) * 2002-04-24 2011-03-15 Ipventure, Inc. Method and system for personalized medical monitoring and notifications therefor
US8611919B2 (en) 2002-05-23 2013-12-17 Wounder Gmbh., Llc System, method, and computer program product for providing location based services and mobile e-commerce
US20030236582A1 (en) 2002-06-25 2003-12-25 Lee Zamir Selection of items based on user reactions
JP3867627B2 (en) * 2002-06-26 2007-01-10 ソニー株式会社 The audience state estimation device and the audience state estimation method and audience state estimation program
US20040001616A1 (en) * 2002-06-27 2004-01-01 Srinivas Gutta Measurement of content ratings through vision and speech recognition
US20040092809A1 (en) * 2002-07-26 2004-05-13 Neurion Inc. Methods for measurement and analysis of brain activity
WO2004047448A3 (en) * 2002-11-15 2004-09-16 Maarten P Bodlaender Introducing new content items in a community-based recommendation system
US6637883B1 (en) * 2003-01-23 2003-10-28 Vishwas V. Tengshe Gaze tracking system and method
US8292433B2 (en) * 2003-03-21 2012-10-23 Queen's University At Kingston Method and apparatus for communication between humans and devices
US20040242216A1 (en) 2003-06-02 2004-12-02 Nokia Corporation Systems and methods for transferring data between mobile stations
WO2005008899A1 (en) * 2003-07-17 2005-01-27 Xrgomics Pte Ltd Letter and word choice text input method for keyboards and reduced keyboard systems
US7245483B2 (en) * 2003-07-18 2007-07-17 Satori Labs, Inc. Integrated personal information management system
US7836010B2 (en) 2003-07-30 2010-11-16 Northwestern University Method and system for assessing relevant properties of work contexts for use by information services
US6959147B2 (en) 2003-10-17 2005-10-25 Eastman Kodak Company Digital one-time-use camera system
US7930206B2 (en) 2003-11-03 2011-04-19 Google Inc. System and method for enabling an advertisement to follow the user to additional web pages
US7495659B2 (en) * 2003-11-25 2009-02-24 Apple Inc. Touch pad for handheld device
EP1538536A1 (en) * 2003-12-05 2005-06-08 Sony International (Europe) GmbH Visualization and control techniques for multimedia digital content
US20050131744A1 (en) 2003-12-10 2005-06-16 International Business Machines Corporation Apparatus, system and method of automatically identifying participants at a videoconference who exhibit a particular expression
US20050132405A1 (en) 2003-12-15 2005-06-16 Microsoft Corporation Home network media server with a jukebox for enhanced user experience
US20050210008A1 (en) * 2004-03-18 2005-09-22 Bao Tran Systems and methods for analyzing documents over a network
US7590619B2 (en) * 2004-03-22 2009-09-15 Microsoft Corporation Search system using user behavior data
EP1582965A1 (en) * 2004-04-01 2005-10-05 Sony Deutschland Gmbh Emotion controlled system for processing multimedia data
KR20060131981A (en) 2004-04-15 2006-12-20 코닌클리케 필립스 일렉트로닉스 엔.브이. Method of generating a content item having a specific emotional influence on a user
US7403815B2 (en) 2004-06-04 2008-07-22 Drexel University Brain state recognition system
US7839423B2 (en) 2004-06-18 2010-11-23 Nec Corporation Image display system with gaze directed zooming
US20050289582A1 (en) 2004-06-24 2005-12-29 Hitachi, Ltd. System and method for capturing and using biometrics to review a product, service, creative work or thing
US20060075108A1 (en) 2004-09-15 2006-04-06 Nortel Networks Limited Network media gateway
US20060133586A1 (en) * 2004-12-08 2006-06-22 Ntt Docomo, Inc. Information notification system and information notification method
US8880677B2 (en) * 2005-01-03 2014-11-04 Qualcomm Connected Experiences, Inc. System and method for delivering content to users on a network
US8235725B1 (en) * 2005-02-20 2012-08-07 Sensory Logic, Inc. Computerized method of assessing consumer reaction to a business stimulus employing facial coding
US7460150B1 (en) 2005-03-14 2008-12-02 Avaya Inc. Using gaze detection to determine an area of interest within a scene
US20070220010A1 (en) 2006-03-15 2007-09-20 Kent Thomas Ertugrul Targeted content delivery for networks
US8787706B2 (en) * 2005-03-18 2014-07-22 The Invention Science Fund I, Llc Acquisition of a user expression and an environment of the expression
US20070214471A1 (en) 2005-03-23 2007-09-13 Outland Research, L.L.C. System, method and computer program product for providing collective interactive television experiences
US20080052219A1 (en) * 2006-03-31 2008-02-28 Combinenet, Inc. System for and method of expressive auctions of user events
US8548853B2 (en) 2005-06-08 2013-10-01 Microsoft Corporation Peer-to-peer advertisement platform
EP1911263A4 (en) 2005-07-22 2011-03-30 Kangaroo Media Inc System and methods for enhancing the experience of spectators attending a live sporting event
US7716199B2 (en) 2005-08-10 2010-05-11 Google Inc. Aggregating context data for programmable search engines
WO2007030275A3 (en) * 2005-09-02 2007-05-18 Emsense Corp A device and method for sensing electrical activity in tissue
US9471925B2 (en) * 2005-09-14 2016-10-18 Millennial Media Llc Increasing mobile interactivity
KR100746995B1 (en) * 2005-09-22 2007-08-08 한국과학기술원 Method for communicating with and pointing to a device by intuitive real spatial aiming using indoor location-based-service and electronic compass
EP1984803A2 (en) 2005-09-26 2008-10-29 Philips Electronics N.V. Method and apparatus for analysing an emotional state of a user being provided with content information
US20070112758A1 (en) * 2005-11-14 2007-05-17 Aol Llc Displaying User Feedback for Search Results From People Related to a User
US9740794B2 (en) 2005-12-23 2017-08-22 Yahoo Holdings, Inc. Methods and systems for enhancing internet experiences
US7599918B2 (en) * 2005-12-29 2009-10-06 Microsoft Corporation Dynamic search with implicit user intention mining
US20070205963A1 (en) 2006-03-03 2007-09-06 Piccionelli Gregory A Heads-up billboard
CA2639125A1 (en) 2006-03-13 2007-09-13 Imotions-Emotion Technology A/S Visual attention and emotional response detection and display system
KR100792698B1 (en) 2006-03-14 2008-01-08 엔에이치엔(주) Method and system for matching advertisement using seed
US7610255B2 (en) * 2006-03-31 2009-10-27 Imagini Holdings Limited Method and system for computerized searching and matching multimedia objects using emotional preference
JP2007280486A (en) * 2006-04-05 2007-10-25 Sony Corp Recording device, reproduction device, recording and reproducing device, recording method, reproducing method, recording and reproducing method, and recording medium
JP4682903B2 (en) 2006-04-06 2011-05-11 株式会社デンソー Remote service system for a vehicle
US7901288B2 (en) 2006-04-20 2011-03-08 International Business Machines Corporation Embedded advertising enhancements in interactive computer game environments
US8032425B2 (en) * 2006-06-16 2011-10-04 Amazon Technologies, Inc. Extrapolation of behavior-based associations to behavior-deficient items
US7621871B2 (en) 2006-06-16 2009-11-24 Archinoetics, Llc Systems and methods for monitoring and evaluating individual performance
GB0614458D0 (en) 2006-07-20 2006-08-30 Clare Jon Computerised hypnosis therapy device and method
WO2008077177A1 (en) 2006-12-22 2008-07-03 Neuro-Insight Pty. Ltd. A method to evaluate psychological responses to visual objects
US8269834B2 (en) 2007-01-12 2012-09-18 International Business Machines Corporation Warning a user about adverse behaviors of others within an environment based on a 3D captured image stream
US20080295126A1 (en) 2007-03-06 2008-11-27 Lee Hans C Method And System For Creating An Aggregated View Of User Response Over Time-Variant Media Using Physiological Data
US8782681B2 (en) 2007-03-08 2014-07-15 The Nielsen Company (Us), Llc Method and system for rating media and events in media based on physiological data
US9361623B2 (en) 2007-04-03 2016-06-07 International Business Machines Corporation Preferred customer marketing delivery based on biometric data for a customer
US20080255949A1 (en) 2007-04-13 2008-10-16 Lucid Systems, Inc. Method and System for Measuring Non-Verbal and Pre-Conscious Responses to External Stimuli
US7917840B2 (en) * 2007-06-05 2011-03-29 Aol Inc. Dynamic aggregation and display of contextually relevant content
US8249922B2 (en) 2007-06-15 2012-08-21 Alcatel Lucent Method and apparatus for advertisement delivery in wireless networks
US20090006188A1 (en) 2007-06-26 2009-01-01 Microsoft Corporation Associating an activity with an online advertisement
US8538814B2 (en) * 2007-07-11 2013-09-17 Hewlett-Packard Development Company, L.P. Systems and methods of providing advertising content
US20090076887A1 (en) * 2007-09-16 2009-03-19 Nova Spivack System And Method Of Collecting Market-Related Data Via A Web-Based Networking Environment
US8862690B2 (en) * 2007-09-28 2014-10-14 Ebay Inc. System and method for creating topic neighborhood visualizations in a networked system
US20090132368A1 (en) * 2007-10-19 2009-05-21 Paul Cotter Systems and Methods for Providing Personalized Advertisement
US9513699B2 (en) 2007-10-24 2016-12-06 Invention Science Fund I, LLC Method of selecting a second content based on a user's reaction to a first content
US20090138565A1 (en) * 2007-11-26 2009-05-28 Gil Shiff Method and System for Facilitating Content Analysis and Insertion
US8069125B2 (en) 2007-12-13 2011-11-29 The Invention Science Fund I Methods and systems for comparing media content
US8356004B2 (en) 2007-12-13 2013-01-15 Searete Llc Methods and systems for comparing media content
US20090171164A1 (en) 2007-12-17 2009-07-02 Jung Edward K Y Methods and systems for identifying an avatar-linked population cohort
US20090157813A1 (en) 2007-12-17 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for identifying an avatar-linked population cohort
US20090164131A1 (en) 2007-12-20 2009-06-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for specifying a media content-linked population cohort
US7958156B2 (en) * 2008-02-25 2011-06-07 Yahoo!, Inc. Graphical/rich media ads in search results

Patent Citations (111)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4670798A (en) * 1983-10-28 1987-06-02 Max L. Campbell Point of purchase advertising system
US5227874A (en) * 1986-03-10 1993-07-13 Kohorn H Von Method for measuring the effectiveness of stimuli on decisions of shoppers
US4984098A (en) * 1986-08-01 1991-01-08 Popad, Inc. Point of purchase automatically-actuated audio advertising device and method
US5117407A (en) * 1988-02-11 1992-05-26 Vogel Peter S Vending machine with synthesized description messages
US4931865A (en) * 1988-08-24 1990-06-05 Sebastiano Scarampi Apparatus and methods for monitoring television viewers
US5485139A (en) * 1993-09-30 1996-01-16 Tarnovsky; George V. Talking display signage
US5796343A (en) * 1994-09-09 1998-08-18 Stauder; Gary D. Device and method for preventing damage to goods during handling by material handling equipment
US5923252A (en) * 1995-04-06 1999-07-13 Marvel Corporation Pty Limited Audio/visual marketing device and marketing system
US5657004A (en) * 1995-04-11 1997-08-12 Felknor International, Inc. Electronically controlled point of purchase display
US7865404B2 (en) * 1996-01-17 2011-01-04 Paradox Technical Solutions Llc Intelligent agents for electronic commerce
US5731805A (en) * 1996-06-25 1998-03-24 Sun Microsystems, Inc. Method and apparatus for eyetrack-driven text enlargement
US7225142B1 (en) * 1996-08-01 2007-05-29 At&T Corp. Interactive multimedia advertising and electronic commerce on a hypertext network
US6064383A (en) * 1996-10-04 2000-05-16 Microsoft Corporation Method and system for selecting an emotional appearance and prosody for a graphical character
US6219657B1 (en) * 1997-03-13 2001-04-17 Nec Corporation Device and method for creation of emotions
US7169113B1 (en) * 1997-09-05 2007-01-30 Hello-Hello, Inc. Portrayal of human information visualization
US6520905B1 (en) * 1998-02-26 2003-02-18 Eastman Kodak Company Management of physiological and psychological state of an individual using images portable biosensor device
US6190314B1 (en) * 1998-07-15 2001-02-20 International Business Machines Corporation Computer input device with biosensors for sensing user emotions
US6606605B1 (en) * 1998-07-20 2003-08-12 Usa Technologies, Inc. Method to obtain customer specific data for public access electronic commerce services
US6393136B1 (en) * 1999-01-04 2002-05-21 International Business Machines Corporation Method and apparatus for determining eye contact
US6577329B1 (en) * 1999-02-25 2003-06-10 International Business Machines Corporation Method and system for relevance feedback through gaze tracking and ticker interfaces
US7562064B1 (en) * 1999-07-03 2009-07-14 Microsoft Corporation Automated web-based targeted advertising with quotas
US20060129457A1 (en) * 1999-07-08 2006-06-15 Dynamiclogic, Inc. System and method for evaluating and/or monitoring effectiveness of on-line advertising
US7010497B1 (en) * 1999-07-08 2006-03-07 Dynamiclogic, Inc. System and method for evaluating and/or monitoring effectiveness of on-line advertising
US20030088463A1 (en) * 1999-10-21 2003-05-08 Steven Fischman System and method for group advertisement optimization
US7769632B2 (en) * 1999-12-17 2010-08-03 Promovu, Inc. System for selectively communicating promotional information to a person
US20080114756A1 (en) * 1999-12-28 2008-05-15 Levino Automatic, personalized online information and product services
US20100122178A1 (en) * 1999-12-28 2010-05-13 Personalized User Model Automatic, personalized online information and product services
US7197472B2 (en) * 2000-01-13 2007-03-27 Erinmedia, Llc Market data acquisition system
US7689672B2 (en) * 2000-02-25 2010-03-30 Microsoft Corporation Collecting user attributes and device attributes to target users with promotions
US7181693B1 (en) * 2000-03-17 2007-02-20 Gateway Inc. Affective control of information systems
US20030060897A1 (en) * 2000-03-24 2003-03-27 Keisuke Matsuyama Commercial effect measuring system, commercial system, and appealing power sensor
US20040152957A1 (en) * 2000-06-16 2004-08-05 John Stivoric Apparatus for detecting, receiving, deriving and displaying human physiological and contextual information
US20020030163A1 (en) * 2000-08-09 2002-03-14 Zhang Evan Y.W. Image intensifier and LWIR fusion/combination system
US6873314B1 (en) * 2000-08-29 2005-03-29 International Business Machines Corporation Method and system for the recognition of reading skimming and scanning from eye-gaze patterns
US20050108092A1 (en) * 2000-08-29 2005-05-19 International Business Machines Corporation A Method of Rewarding the Viewing of Advertisements Based on Eye-Gaze Patterns
US6704034B1 (en) * 2000-09-28 2004-03-09 International Business Machines Corporation Method and apparatus for providing accessibility through a context sensitive magnifying glass
US7703611B1 (en) * 2000-09-29 2010-04-27 Aol Inc. Targeted geographical condition notification of users based on a geographic location and device types or software of the users
US20020112035A1 (en) * 2000-10-30 2002-08-15 Carey Brian M. System and method for performing content experience management
US6731307B1 (en) * 2000-10-30 2004-05-04 Koninklijke Philips Electronics N.V. User interface/entertainment device that simulates personal interaction and responds to user's mental state and/or personality
US20020072952A1 (en) * 2000-12-07 2002-06-13 International Business Machines Corporation Visual and audible consumer reaction collection
US6601021B2 (en) * 2000-12-08 2003-07-29 Xerox Corporation System and method for analyzing eyetracker data
US6687608B2 (en) * 2000-12-27 2004-02-03 Fuji Photo Film Co., Ltd. Information notification system and method, and navigation system and method
US7908150B2 (en) * 2001-05-07 2011-03-15 Jean-Luc Rochet System and a method for performing personalised interactive automated electronic marketing of the marketing service provider
US20030060728A1 (en) * 2001-09-25 2003-03-27 Mandigo Lonnie D. Biofeedback based personal entertainment system
US20030078840A1 (en) * 2001-10-19 2003-04-24 Strunk David D. System and method for interactive advertising
US6708176B2 (en) * 2001-10-19 2004-03-16 Bank Of America Corporation System and method for interactive advertising
US7555287B1 (en) * 2001-11-01 2009-06-30 Nokia Corporation Customized messaging between wireless access point and services
US7356547B2 (en) * 2001-11-21 2008-04-08 Microsoft Corporation Methods and systems for selectively displaying advertisements
US6585521B1 (en) * 2001-12-21 2003-07-01 Hewlett-Packard Development Company, L.P. Video indexing based on viewers' behavior and emotion feedback
US20050107718A1 (en) * 2002-01-04 2005-05-19 Dan Hashimshony Method and system for examining tissue according to the dielectric properties thereof
US7547279B2 (en) * 2002-01-23 2009-06-16 Samsung Electronics Co., Ltd. System and method for recognizing user's emotional state using short-time monitoring of physiological signals
US7904439B2 (en) * 2002-04-04 2011-03-08 Microsoft Corporation System and methods for constructing personalized context-sensitive portal pages or views by analyzing patterns of users' information access activities
US20090150363A1 (en) * 2002-09-03 2009-06-11 William Gross Apparatus and methods for locating data
US20040101178A1 (en) * 2002-11-25 2004-05-27 Eastman Kodak Company Imaging method and system for health monitoring and personal security
US7874983B2 (en) * 2003-01-27 2011-01-25 Motorola Mobility, Inc. Determination of emotional and physiological states of a recipient of a communication
US7762665B2 (en) * 2003-03-21 2010-07-27 Queen's University At Kingston Method and apparatus for communication between humans and devices
US7881493B1 (en) * 2003-04-11 2011-02-01 Eyetools, Inc. Methods and apparatuses for use of eye interpretation information
US20050021677A1 (en) * 2003-05-20 2005-01-27 Hitachi, Ltd. Information providing method, server, and program
US7418405B1 (en) * 2003-05-23 2008-08-26 Amazon.Com, Inc. Interactive time-limited merchandising program and method for improved online cross-selling
US7764311B2 (en) * 2003-05-30 2010-07-27 Aol Inc. Personalizing content based on mood
US20060143647A1 (en) * 2003-05-30 2006-06-29 Bill David S Personalizing content based on mood
US20050017870A1 (en) * 2003-06-05 2005-01-27 Allison Brendan Z. Communication methods based on brain computer interfaces
US20050010637A1 (en) * 2003-06-19 2005-01-13 Accenture Global Services Gmbh Intelligent collaborative media
US20050071865A1 (en) * 2003-09-30 2005-03-31 Martins Fernando C. M. Annotating meta-data with user responses to digital content
US7363282B2 (en) * 2003-12-03 2008-04-22 Microsoft Corporation Search system using user behavior data
US7742037B2 (en) * 2003-12-10 2010-06-22 Sony Corporation Input device, input method and electronic equipment
US20050157377A1 (en) * 2004-01-20 2005-07-21 Ron Goldman Portable electronic device with a laser projection display
US7931602B2 (en) * 2004-03-24 2011-04-26 Seiko Epson Corporation Gaze guidance degree calculation system, gaze guidance degree calculation program, storage medium, and gaze guidance degree calculation method
US20060064411A1 (en) * 2004-09-22 2006-03-23 William Gross Search engine using user intent
US20060074883A1 (en) * 2004-10-05 2006-04-06 Microsoft Corporation Systems, methods, and interfaces for providing personalized search and information access
US7503653B2 (en) * 2004-11-22 2009-03-17 Carestream Health, Inc. Diagnostic system having gaze tracking
US20060122834A1 (en) * 2004-12-03 2006-06-08 Bennett Ian M Emotion detection device & method for use in distributed systems
US20060146281A1 (en) * 2004-12-03 2006-07-06 Goodall Eleanor V Method and system for vision enhancement
US7679579B2 (en) * 2004-12-24 2010-03-16 Fujifilm Corporation Projection type image display apparatus
US20060170945A1 (en) * 2004-12-30 2006-08-03 Bill David S Mood-based organization and display of instant messenger buddy lists
US7703114B2 (en) * 2005-02-25 2010-04-20 Microsoft Corporation Television system targeted advertising
US20070167689A1 (en) * 2005-04-01 2007-07-19 Motorola, Inc. Method and system for enhancing a user experience using a user's physiological state
US20070191691A1 (en) * 2005-05-19 2007-08-16 Martin Polanco Identification of guilty knowledge and malicious intent
US8407055B2 (en) * 2005-08-05 2013-03-26 Sony Corporation Information processing apparatus and method for recognizing a user's emotion
US7720784B1 (en) * 2005-08-30 2010-05-18 Walt Froloff Emotive intelligence applied in electronic devices and internet using emotion displacement quantification in pain and pleasure space
US20070052672A1 (en) * 2005-09-08 2007-03-08 Swisscom Mobile Ag Communication device, system and method
US7907940B2 (en) * 2005-09-14 2011-03-15 Jumptap, Inc. Presentation of sponsored content based on mobile transaction event
US7702318B2 (en) * 2005-09-14 2010-04-20 Jumptap, Inc. Presentation of sponsored content based on mobile transaction event
US7769764B2 (en) * 2005-09-14 2010-08-03 Jumptap, Inc. Mobile advertisement syndication
US20070085759A1 (en) * 2005-09-15 2007-04-19 Lg Electronics Inc. Method for displaying multimedia contents and mobile communications terminal capable of implementing the same
US20070066916A1 (en) * 2005-09-16 2007-03-22 Imotions Emotion Technology Aps System and method for determining human emotion by analyzing eye properties
US20070105071A1 (en) * 2005-11-04 2007-05-10 Eye Tracking, Inc. Generation of test stimuli in visual media
US20070104369A1 (en) * 2005-11-04 2007-05-10 Eyetracking, Inc. Characterizing dynamic regions of digital media data
US7760910B2 (en) * 2005-12-12 2010-07-20 Eyetools, Inc. Evaluation of visual stimuli using existing viewing data
US20070150916A1 (en) * 2005-12-28 2007-06-28 James Begole Using sensors to provide feedback on the access of digital content
US20070162505A1 (en) * 2006-01-10 2007-07-12 International Business Machines Corporation Method for using psychological states to index databases
US20070184420A1 (en) * 2006-02-08 2007-08-09 Honeywell International Inc. Augmented tutoring
US8712713B2 (en) * 2006-03-20 2014-04-29 Qualcomm Incorporated Method and apparatus for determining the altitude of a mobile device
US7930199B1 (en) * 2006-07-21 2011-04-19 Sensory Logic, Inc. Method and report assessing consumer reaction to a stimulus by matching eye position with facial coding
US20080065468A1 (en) * 2006-09-07 2008-03-13 Charles John Berg Methods for Measuring Emotive Response and Selection Preference
US20100174586A1 (en) * 2006-09-07 2010-07-08 Berg Jr Charles John Methods for Measuring Emotive Response and Selection Preference
US20080091515A1 (en) * 2006-10-17 2008-04-17 Patentvc Ltd. Methods for utilizing user emotional state in a business process
US20080147488A1 (en) * 2006-10-20 2008-06-19 Tunick James A System and method for monitoring viewer attention with respect to a display and determining associated charges
US20080104045A1 (en) * 2006-11-01 2008-05-01 Cohen Alain J Collectively enhanced semantic search
US20080146892A1 (en) * 2006-12-19 2008-06-19 Valencell, Inc. Physiological and environmental monitoring systems and methods
US20080162142A1 (en) * 2006-12-29 2008-07-03 Industrial Technology Research Institute Emotion abreaction device and using method of emotion abreaction device
US20080209321A1 (en) * 2007-02-15 2008-08-28 Fujitsu Limited Mobile terminal apparatus, and display control method therefor
US8473044B2 (en) * 2007-03-07 2013-06-25 The Nielsen Company (Us), Llc Method and system for measuring and ranking a positive or negative response to audiovisual or interactive media, products or activities using physiological signals
US8126220B2 (en) * 2007-05-03 2012-02-28 Hewlett-Packard Development Company L.P. Annotating stimulus based on determined emotional response
US20080281661A1 (en) * 2007-05-08 2008-11-13 Jenn-Shoou Young Real-time Advertisement Displaying System and Method thereof
US20090062679A1 (en) * 2007-08-27 2009-03-05 Microsoft Corporation Categorizing perceptual stimuli by detecting subconscious responses
US20090100015A1 (en) * 2007-10-11 2009-04-16 Alon Golan Web-based workspace for enhancing internet search experience
US20090112695A1 (en) * 2007-10-24 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Physiological response based targeted advertising
US20090112696A1 (en) * 2007-10-24 2009-04-30 Jung Edward K Y Method of space-available advertising in a mobile device
US20090112693A1 (en) * 2007-10-24 2009-04-30 Jung Edward K Y Providing personalized advertising
US20090112656A1 (en) * 2007-10-24 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Returning a personalized advertisement

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080077020A1 (en) * 2006-09-22 2008-03-27 Bam Labs, Inc. Method and apparatus for monitoring vital signs remotely
US9582805B2 (en) 2007-10-24 2017-02-28 Invention Science Fund I, Llc Returning a personalized advertisement
US20090112810A1 (en) * 2007-10-24 2009-04-30 Searete Llc Selecting a second content based on a user's reaction to a first content
US20090112696A1 (en) * 2007-10-24 2009-04-30 Jung Edward K Y Method of space-available advertising in a mobile device
US20090112813A1 (en) * 2007-10-24 2009-04-30 Searete Llc Method of selecting a second content based on a user's reaction to a first content of at least two instances of displayed content
US20090112694A1 (en) * 2007-10-24 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Targeted-advertising based on a sensed physiological response by a person to a general advertisement
US20090112656A1 (en) * 2007-10-24 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Returning a personalized advertisement
US9513699B2 (en) 2007-10-24 2016-12-06 Invention Science Fund I, Llc Method of selecting a second content based on a user's reaction to a first content
US20090112914A1 (en) * 2007-10-24 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Returning a second content based on a user's reaction to a first content
US8112407B2 (en) 2007-10-24 2012-02-07 The Invention Science Fund I, Llc Selecting a second content based on a user's reaction to a first content
US8234262B2 (en) 2007-10-24 2012-07-31 The Invention Science Fund I, Llc Method of selecting a second content based on a user's reaction to a first content of at least two instances of displayed content
US8126867B2 (en) 2007-10-24 2012-02-28 The Invention Science Fund I, Llc Returning a second content based on a user's reaction to a first content
US20090112849A1 (en) * 2007-10-24 2009-04-30 Searete Llc Selecting a second content based on a user's reaction to a first content of at least two instances of displayed content
US20090112697A1 (en) * 2007-10-30 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Providing personalized advertising
US20090219168A1 (en) * 2008-02-29 2009-09-03 Sony Corporation Living posters
US20090292713A1 (en) * 2008-05-23 2009-11-26 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Acquisition and particular association of data indicative of an inferred mental state of an authoring user
US9192300B2 (en) * 2008-05-23 2015-11-24 Invention Science Fund I, Llc Acquisition and particular association of data indicative of an inferred mental state of an authoring user
US9161715B2 (en) * 2008-05-23 2015-10-20 Invention Science Fund I, Llc Determination of extent of congruity between observation of authoring user and observation of receiving user
US20090290767A1 (en) * 2008-05-23 2009-11-26 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Determination of extent of congruity between observation of authoring user and observation of receiving user
US8681013B2 (en) * 2008-06-17 2014-03-25 Canon Kabushiki Kaisha Management apparatus for managing a content display change time on a display apparatus and content information to be transmitted to a terminal
US20120280826A1 (en) * 2008-06-17 2012-11-08 Canon Kabushiki Kaisha Management apparatus for managing a content display change time on a display apparatus and content information to be transmitted to a terminal
US20100109868A1 (en) * 2008-11-05 2010-05-06 Berger Adam L Notifying A User Of An Available Media Object
US8937543B2 (en) * 2008-11-05 2015-01-20 Penthera Partners, Inc. Notifying a user of an available media object
US9857956B2 (en) 2008-11-05 2018-01-02 Penthera Partners, Inc. Notifying a user of an available media object
US20140250383A1 (en) * 2008-11-05 2014-09-04 Penthera Partners, Inc. Notifying A User Of An Available Media Object
US8754765B2 (en) * 2008-11-05 2014-06-17 Penthera Partners, Inc. Notifying a user of an available media object
US20100169157A1 (en) * 2008-12-30 2010-07-01 Nokia Corporation Methods, apparatuses, and computer program products for providing targeted advertising
US8539359B2 (en) * 2009-02-11 2013-09-17 Jeffrey A. Rapaport Social network driven indexing system for instantly clustering people with concurrent focus on same topic into on-topic chat rooms and/or for generating on-topic search results tailored to user preferences regarding topic
US20100205541A1 (en) * 2009-02-11 2010-08-12 Jeffrey A. Rapaport social network driven indexing system for instantly clustering people with concurrent focus on same topic into on-topic chat rooms and/or for generating on-topic search results tailored to user preferences regarding topic
US8155706B1 (en) * 2009-12-21 2012-04-10 David Hurst Scent notification system for a portable communication device
US8814691B2 (en) 2010-02-28 2014-08-26 Microsoft Corporation System and method for social networking gaming with an augmented reality
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US9097891B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US9134534B2 (en) 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US9097890B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc Grating in a light transmissive illumination system for see-through near-eye display glasses
US9182596B2 (en) 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US20110221657A1 (en) * 2010-02-28 2011-09-15 Osterhout Group, Inc. Optical stabilization of displayed content with a variable lens
US9223134B2 (en) 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9285589B2 (en) 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
US9329689B2 (en) 2010-02-28 2016-05-03 Microsoft Technology Licensing, Llc Method and apparatus for biometric data capture
US9341843B2 (en) 2010-02-28 2016-05-17 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a small scale image source
US9366862B2 (en) 2010-02-28 2016-06-14 Microsoft Technology Licensing, Llc System and method for delivering content to a group of see-through near eye display eyepieces
US20110213664A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US9091851B2 (en) 2010-02-28 2015-07-28 Microsoft Technology Licensing, Llc Light control in head mounted displays
US9759917B2 (en) 2010-02-28 2017-09-12 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered AR eyepiece interface to external devices
US9875406B2 (en) 2010-02-28 2018-01-23 Microsoft Technology Licensing, Llc Adjustable extension for temple arm
US9128281B2 (en) 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display

Also Published As

Publication number Publication date Type
US9513699B2 (en) 2016-12-06 grant
US20090113298A1 (en) 2009-04-30 application

Similar Documents

Publication Publication Date Title
Chandler et al. A dictionary of media and communication
Jaimes et al. Multimodal human–computer interaction: A survey
US20100153146A1 (en) Generating Generalized Risk Cohorts
US20090171559A1 (en) Method, Apparatus and Computer Program Product for Providing Instructions to a Destination that is Revealed Upon Arrival
US20100153353A1 (en) Generating Predilection Cohorts
US20130159127A1 (en) Method of and system for rating sources for fact checking
US8521818B2 (en) Methods and apparatus for recognizing and acting upon user intentions expressed in on-line conversations and similar environments
US20140164994A1 (en) Fact checking graphical user interface including fact checking icons
US20090030800A1 (en) Method and System for Searching a Data Network by Using a Virtual Assistant and for Advertising by using the same
Starner Wearable computing and contextual awareness
US20100153390A1 (en) Scoring Deportment and Comportment Cohorts
US20140195328A1 (en) Adaptive embedded advertisement via contextual analysis and perceptual computing
US20090289956A1 (en) Virtual billboards
US20120230539A1 (en) Providing location identification of associated individuals based on identifying the individuals in conjunction with a live video stream
US20080004953A1 (en) Public Display Network For Online Advertising
US20100030648A1 (en) Social media driven advertisement targeting
US20110126119A1 (en) Contextual presentation of information
US20130177296A1 (en) Generating metadata for user experiences
US20120232966A1 (en) Identifying predetermined objects in a video stream captured by a mobile device
US20090232357A1 (en) Detecting behavioral deviations by measuring eye movements
US8295542B2 (en) Adjusting a consumer experience based on a 3D captured image stream of a consumer response
US20080002892A1 (en) Method and system for image and video analysis, enhancement and display for communication
US20090028434A1 (en) System and method for displaying contextual supplemental content based on image content
US20060209175A1 (en) Electronic association of a user expression and a context of the expression
US20130046823A1 (en) Computer-Vision Content Detection for Connecting Objects in Media to Users

Legal Events

Date Code Title Description
AS Assignment

Owner name: SEARETE LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JUNG, EDWARD K.Y.;LEVIEN, ROYCE A.;LORD, ROBERT W.;AND OTHERS;REEL/FRAME:020754/0944;SIGNING DATES FROM 20080120 TO 20080327