EP2038736A2 - Monitoring usage of a portable user appliance - Google Patents

Monitoring usage of a portable user appliance

Info

Publication number
EP2038736A2
Authority
EP
European Patent Office
Prior art keywords
pua
data
user
content
message
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP07812865A
Other languages
German (de)
French (fr)
Other versions
EP2038736A4 (en)
Inventor
Roberta M. Mcconochie
Alan R. Neuhauser
Jack C. Crystal
Jack K. Zhang
Eugene L. Flanagan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nielsen Audio Inc
Original Assignee
Arbitron Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Arbitron Inc filed Critical Arbitron Inc
Publication of EP2038736A2
Publication of EP2038736A4

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/12Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
    • A61B3/1216Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes for diagnostics of the iris
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14Arrangements specially adapted for eye photography
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/117Identification of persons
    • A61B5/1171Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • A61B5/1172Identification of persons based on the shapes or appearances of their bodies or parts thereof using fingerprinting
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/117Identification of persons
    • A61B5/1171Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • A61B5/1176Recognition of faces
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4833Assessment of subject's compliance to treatment
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4887Locating particular structures in or on the body
    • A61B5/489Blood vessels
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6898Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B7/00Instruments for auscultation
    • A61B7/003Detecting lung or respiration noise
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B7/00Instruments for auscultation
    • A61B7/008Detecting noise of gastric tract, e.g. caused by voiding
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B7/00Instruments for auscultation
    • A61B7/02Stethoscopes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3409Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3466Performance evaluation by tracing or monitoring
    • G06F11/3485Performance evaluation by tracing or monitoring for I/O devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/101Collaborative creation, e.g. joint development of products or services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/30Payment architectures, schemes or protocols characterised by the use of specific devices or networks
    • G06Q20/32Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
    • G06Q20/322Aspects of commerce using mobile devices [M-devices]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/018Certifying business or products
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201Market modelling; Market analysis; Collecting market data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0207Discounts or incentives, e.g. coupons or rebates
    • G06Q30/0226Incentive systems for frequent usage, e.g. frequent flyer miles programs or point systems
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/60ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/20ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/20ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00Evaluating a particular growth phase or type of persons or animals
    • A61B2503/12Healthy persons not otherwise provided for, e.g. subjects of a marketing survey
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02438Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/08Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/0816Measuring devices for examining respiratory frequency
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201Market modelling; Market analysis; Collecting market data
    • G06Q30/0203Market surveys; Market polls

Definitions

  • data means any indicia, signals, marks, symbols, domains, symbol sets, representations, and any other physical form or forms representing information, whether permanent or temporary, whether visible, audible, acoustic, electric, magnetic, electromagnetic or otherwise manifested.
  • data as used to represent predetermined information in one physical form shall be deemed to encompass any and all representations of corresponding information in a different physical form or forms.
  • media data and “media” as used herein mean data which is widely accessible, whether over-the-air, or via cable, satellite, network, internetwork (including the Internet), print, displayed, distributed on storage media, or by any other means or technique that is humanly perceptible, without regard to the form or content of such data, and including but not limited to audio, video, audio/video, text, images, animations, databases, broadcasts, displays (including but not limited to video displays, posters and billboards), signs, signals, web pages, print media and streaming media data.
  • research data means data comprising (1) data concerning usage of media, (2) data concerning exposure to media, and/or (3) market research data.
  • presentation data shall mean media data, content other than media data or a message to be presented to a user.
  • database means an organized body of related data, regardless of the manner in which the data or the organized body thereof is represented.
  • the organized body of related data may be in the form of a table, a map, a grid, a packet, a datagram, a frame, a file, an e-mail, a message, a document, a list or in any other form.
  • correlate means a process of ascertaining a relationship between or among data, including but not limited to an identity relationship, a correspondence or other relationship of such data to further data, inclusion in a dataset, exclusion from a dataset, a predefined mathematical relationship between or among the data and/or to further data, and the existence of a common aspect between or among the data.
  • purchase and purchasing as used herein mean a process of obtaining title, a license, possession or other right in or to goods or services in exchange for consideration, whether payment of money, barter or other legally sufficient consideration, or as promotional samples.
  • goods and services include, but are not limited to, data and rights in or to data.
  • network includes both networks and internetworks of all kinds, including the Internet, and is not limited to any particular network or inter-network.
  • “first,” “second,” “primary,” and “secondary” are used herein to distinguish one element, set, data, object, step, process, function, activity or thing from another, and are not used to designate relative position, arrangement in time or relative importance, unless otherwise stated explicitly.
  • Coupled means a relationship between or among two or more devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, and/or means, constituting any one or more of (a) a connection, whether direct or through one or more other devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, or means, (b) a communications relationship, whether direct or through one or more other devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, or means, and/or (c) a functional relationship in which the operation of any one or more devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, or means depends, in whole or in part, on the operation of any one or more others thereof.
  • the terms "communicate” and “communicating” as used herein include both conveying data from a source to a destination, and delivering data to a communications medium, system, channel, network, device, wire, cable, fiber, circuit, and/or link to be conveyed to a destination.
  • the term "communications” as used herein includes one or more of a communications medium, system, channel, network, device, wire, cable, fiber, circuit and link.
  • messages includes data to be communicated, in communication or which has been communicated.
  • processor means processing devices, apparatus, programs, circuits, components, systems and subsystems, whether implemented in hardware, software or both, and whether or not programmable.
  • processor includes, but is not limited to one or more computers, hardwired circuits, signal modifying devices and systems, devices and machines for controlling systems, central processing units, programmable devices and systems, field programmable gate arrays, application specific integrated circuits, systems on a chip, systems comprised of discrete elements and/or circuits, state machines, virtual machines, data processors, processing facilities and combinations of any of the foregoing.
  • storage and “data storage” as used herein mean data storage devices, apparatus, programs, circuits, components, systems, subsystems and storage media serving to retain data, whether on a temporary or permanent basis, and to provide such retained data.
  • the terms “panelist,” “panel member” and “participant” are interchangeably used herein to refer to a person who is, knowingly or unknowingly, participating in a study to gather information, whether by electronic, survey or other means, about that person's activity.
  • the term "household” as used herein is to be broadly construed to include family members, a family living at the same residence, a group of persons related or unrelated to one another living at the same residence, and a group of persons (of which the total number of unrelated persons does not exceed a predetermined number) living within a common facility, such as a fraternity house, an apartment or other similar structure or arrangement.
  • activity includes, but is not limited to, purchasing conduct, shopping habits, viewing habits, computer, Internet usage, exposure to media, personal attitudes, awareness, opinions and beliefs, as well as other forms of activity discussed herein.
  • the term "portable user appliance” (also referred to herein, for convenience, by the abbreviation "PUA”) as used herein means an electrical or nonelectrical device capable of being carried by or on the person of a user or capable of being disposed on or in, or held by, a physical object (e.g., attache, purse) capable of being carried by or on the user, and having at least one function of primary benefit to such user, including without limitation, a cellular telephone, a personal digital assistant (“PDA”), a Blackberry ® device, a radio, a television, a game system (e.g., a Gameboy ® device), a notebook computer, a laptop computer, a GPS device, a personal audio device (e.g., an MP3 player), a DVD player, a two-way radio, a personal communications device, a telematics device, a remote control device, a wireless headset, a wristwatch, a portable data storage device (e.g., Thumb TM drive),
  • research device shall mean (1) a portable user appliance configured or otherwise enabled to gather, store and/or communicate research data, or to cooperate with other devices to gather, store and/or communicate research data, and/or (2) a research data gathering, storing and/or communicating device.
  • user-beneficial function shall mean a function initiated or carried out by a person with the use of a PUA, which function is of primary benefit to that person.
  • a method of gathering data concerning usage of a PUA comprises: monitoring content created in the use of the PUA to produce content related data; and communicating the content related data to a usage data processing facility.
  • a system for gathering data concerning usage of a PUA comprises a monitor in or on the PUA and operative to monitor content created in the use of the PUA to produce content related data; and communications coupled with the monitor to receive the content related data and operative to communicate the content related data from the PUA to a usage data processing facility.
  • a method of monitoring use of a PUA by a user, the PUA including a communication interface for communicating with at least another PUA comprises detecting communication by the communication interface of the PUA; providing communication content data relating to content of the communication of the PUA; and providing trend data representing at least one trend of usage of the PUA by the user based on the communication data.
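The patent leaves the form of the trend data open. Purely as an illustration, the Python sketch below (all names hypothetical) derives one possible usage trend from timestamped communication events detected on the PUA: messages are counted per calendar day and smoothed with a moving average.

```python
from collections import Counter
from datetime import date, datetime
from typing import Dict, List

def daily_message_counts(events: List[datetime]) -> Dict[date, int]:
    """Count communications detected on the PUA, grouped by calendar day."""
    return dict(Counter(e.date() for e in events))

def usage_trend(counts: Dict[date, int], window: int = 7) -> List[float]:
    """Simple moving average of the daily counts, as one possible 'trend'."""
    days = sorted(counts)
    series = [counts[d] for d in days]
    return [sum(series[max(0, i - window + 1): i + 1]) / min(i + 1, window)
            for i in range(len(series))]
```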
  • Figure 1A illustrates various monitoring systems that include a portable user appliance (“PUA”) used by a user and configured to operate as a research device;
  • Figure 1B is a block diagram showing certain details of the monitoring systems of Figure 1A;
  • Figure 1C is a block diagram showing the monitoring systems of Figure 1A including a PUA coupled with a docking station;
  • Figures 2A and 2B are flow diagrams illustrating actions by the monitoring systems of Figures 1A-1C which actively monitor use of the PUA;
  • Figure 3 is a flow diagram illustrating actions by the monitoring systems of Figures 1A-1C which monitor usage of the PUA;
  • Figure 4 is a flow diagram illustrating actions by the monitoring systems of Figures 1A-1C which provide trend data representing one or more PUA usage trends.
  • Figures 1A and 1B are schematic illustrations of a monitoring system 1 that includes a PUA 2, which is used by a user 3, and a processor 5.
  • the PUA 2 is replaced by a research device that does not comprise a PUA.
  • the processor 5 may include one or a plurality of processors which are located together or separate from one another disposed within or controlled by one or more organizations.
  • the PUA 2 may be coupled to the processor 5 via communications 7 which allows data to be exchanged between the PUA 2 and the processor 5.
  • the PUA 2 is wirelessly coupled via communications 7 to the processor 5.
  • the monitoring system 1 also includes storage 6 for storing data including, but not limited to, data received and/or processed by the central processor 5.
  • storage 6 includes one or more storage units located together or separate from one another at the same or different locations.
  • storage 6 is included with processor 5.
  • Figure 1B is a more detailed illustration of an embodiment of the monitoring system 1 in which the PUA 2 is adapted to communicate wirelessly with the processor 5 using wireless communications 8.
  • the PUA 2 includes a communication interface 9 for communicating and receiving data through communications 8.
  • the PUA 2 also includes a message input 11 to allow the user of the PUA 2 to input a message into the PUA 2.
  • the message input 11 is coupled with the communication interface 9 of the PUA 2, so that a message inputted using the message input 11 can be communicated from the PUA 2 via communications 8. It is understood that messages inputted using the message input 11 may be communicated to the processor 5, or to another PUA 2, or to another location or device coupled with communications 8.
  • the message input 11 comprises a plurality of keys 11a in the form of a keypad.
  • the configuration of the message input 11 may vary, such that, for example, the message input 11 may comprise one or more of a key, a button, a switch, a keyboard, a microphone, a video camera, a touch pad, an accelerometer, a motion detector, a touch screen, a tablet, a scroll-and-click wheel or the like.
  • the PUA 2 also comprises a sensor or a detector 13 for detecting one or more parameters.
  • the parameter or parameters detected by the sensor/detector 13 include, but are not limited to, the remaining power capacity of the PUA 2, one or more of a user's biometric functions or parameters, a location of the PUA 2, a change in location of the PUA 2, data input to the PUA by the user, sounds external to the PUA 2, motion of the PUA 2, pressure being applied to the PUA 2, or an impact of the PUA 2 with another object.
  • sensor/detector 13 detects a presence indication signal or a personal identification signal emitted by a signal emitter 14 carried in or on the person of the user.
  • the signal emitter 14 comprises a device worn or carried by the user, such as a ring, a necklace, or other article of jewelry, a wristwatch, a key fob, or article of clothing that emits a predetermined signal indicating a user's presence or the identity of the user wearing or carrying the device.
  • the signal may be emitted as an acoustic signal, an RF or other electromagnetic signal, or a chemical signal that sensor/detector 13 is operative to receive, or an electrical signal.
  • the signal emitter 14 comprises a device implanted in the user, such as under the user's skin.
  • the sensor/detector 13 includes a plurality of sensors or detectors each for detecting one or more of a plurality of parameters.
  • the sensor/detector 13 is coupled with the communications interface 9 of the PUA 2 so that data produced as a result of the sensing or detecting performed by the sensor/detector 13 can be communicated from the PUA 2 to the processor 5.
  • While the PUA 2 shown in Figure 1B includes both the message input 11 and the sensor/detector 13, it is understood that in other embodiments, one of these elements may be omitted depending on the design of the PUA 2 and the requirements of the monitoring system 1.
  • the illustrative configuration of the monitoring system 1 shown in Figure 1B includes storage 6 coupled or included with the processor 5 to store data, including data received and/or processed by the processor 5. Data stored in storage 6 can also be retrieved by the processor 5 when needed.
  • the PUA 2 shown in Figures 1 A and 1 B may be supplied with power from an A/C power source or other power supply, or using one or more batteries or other on-board power source (not shown for purposes of simplicity and clarity). It is understood that batteries used to supply power to the PUA 2 may include any type of batteries, whether rechargeable or not, that are suitable for use with the particular PUA 2. In certain embodiments, the PUA 2 receives power from rechargeable batteries or another kind of rechargeable power supply, such as a capacitor, and/or from a radiant energy converter, such as a photoelectric power converter, or a mechanical energy converter, such as a microelectric generator.
  • the PUA 2 is connected with a docking station from time to time, which is used for charging the PUA 2 and/or transmitting data stored in the PUA 2 to the processor 5.
  • Figure 1C shows an embodiment of the PUA 2 used with the docking station 15.
  • the coupling 16 can be a direct connection between the PUA 2 and the docking station 15 to allow recharging of the PUA 2 and/or communication of data between the PUA 2 and the docking station 15.
  • data is communicated from the PUA to the docking station by a wireless infra-red, RF, capacitive or inductive link.
  • data is communicated from the PUA 2 to the processor 5 by cellular telephone link or other wired or wireless network or device coupling.
  • the docking station is connected to a power supply 17 to provide power for charging the PUA 2 when the PUA 2 is coupled with the docking station 15.
  • the docking station 15 includes a communication interface 19 adapted to communicate with the processor 5 through communications 7.
  • data stored in the PUA 2 such as data collected by the PUA 2 when it was carried by the user, is transferred to the docking station 15 using the coupling 16 and thereafter communicated using the communication interface 19 to the processor 5 through communications 7.
  • the use of the docking station 15, rather than the PUA 2, to communicate to the processor 5 data collected by the PUA 2 enables conservation of power by the PUA 2 or the use of an internal power supply having a relatively low power capacity.
  • the docking station 15 is also used to receive data from the processor 5 via communications 7, and to transfer the received data from the docking station 15 to the PUA 2 via the coupling 16 when the PUA 2 is coupled with the docking station 15.
  • the configuration of the docking station 15 is not limited to the configuration shown in Figure 1C and may vary from one embodiment to another.
  • the docking station is used only for charging the PUA 2 and does not include a communication interface 19.
  • the docking station 15 is implemented variously as a cradle receiving the PUA 2 or as a standard AC-to-DC converter, like a cellular telephone charger.
  • the docking station 15 is used only for communication of data between the PUA 2 and the processor 5 and does not charge the PUA 2.
  • the PUA 2 may be connected to a power supply, separate from the docking station 15, for charging, or charged using an internal power converter, or by replacing one or more batteries.
  • the PUA 2 shown in Figures 1A-1C optionally includes an output (not shown for purposes of simplicity and clarity) for outputting a message to the user.
  • the output can be in the form of a display for displaying text, or one or more symbols and/or images, a speaker or earphone for outputting a voicemail or a voice message, or one or more LED's or lamps for indicating a message to the user. It is understood that the output or outputs are not limited to the examples provided herein and can comprise any suitable output or outputs adapted to provide a message to the user.
  • the monitoring system 1 shown in Figures 1A and 1B is used in certain embodiments for monitoring use by a user of the PUA 2 in accordance with at least one predetermined use criterion.
  • the at least one predetermined use criterion comprises one or more of the following criteria: that the PUA 2 is being carried and/or used, that the PUA 2 is being carried and/or used by a specific user, that the PUA 2 is turned "on," that the PUA 2 is charged, that the PUA 2 maintains a minimum power capacity, that the PUA 2 is, or has been, docked at, or connected with, the docking station 15 for a predetermined length of time, at certain times or during a predetermined time period, that the PUA is functioning properly to provide a benefit to the user, and that the PUA 2 is capable of collecting, storing and/or communicating research data, or of cooperating with one or more other devices to do so.
  • Other predetermined use criteria not mentioned above may also be employed in monitoring the PUA's use.
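To make the evaluation of predetermined use criteria concrete, here is a minimal, hypothetical sketch in Python; the criterion names, thresholds and the PuaState fields are assumptions chosen only for illustration and do not appear in the patent.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class PuaState:
    """Hypothetical snapshot of a PUA's reported state."""
    powered_on: bool
    battery_level: float        # 0.0 .. 1.0
    carried_by_user_id: str     # empty string if no user presence detected
    docked_minutes_today: int
    research_module_ok: bool

# Each predetermined use criterion is modeled as a named predicate over the state.
UseCriterion = Callable[[PuaState], bool]

CRITERIA: Dict[str, UseCriterion] = {
    "powered_on":              lambda s: s.powered_on,
    "min_power":               lambda s: s.battery_level >= 0.20,       # assumed threshold
    "carried_by_panelist":     lambda s: s.carried_by_user_id == "panelist-007",  # assumed id
    "docked_long_enough":      lambda s: s.docked_minutes_today >= 60,  # assumed duration
    "can_collect_research_data": lambda s: s.research_module_ok,
}

def evaluate(state: PuaState) -> Dict[str, bool]:
    """Return per-criterion compliance for one state snapshot."""
    return {name: rule(state) for name, rule in CRITERIA.items()}
```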
  • the method of monitoring use by a user of a research device such as PUA 2 in accordance with at least one predetermined use criterion comprises communicating a request message to the research device, requesting a response from the user of the PUA, receiving a response message communicated from the research device in response to the request message, and storing data indicating whether the use is in compliance with the at least one predetermined use criterion and/or a level of the user's compliance therewith.
  • Figure 2A shows a block diagram of the actions performed by the monitoring systems shown in Figures 1A-1C.
  • a request message is first communicated 100 to a PUA having a two-way communication capability with a remotely-located processor, such as processor 5 of Figures 1A-1C, requesting a response from a user of the PUA.
  • the request message comprises a text message, a telephone call, a voice mail, an e-mail, a voice message, a sound, a plurality of sounds, a web page, an image, a light alert, or a combination thereof, or any other data presented to the user via the PUA which indicates to the user that a response is being requested.
  • the request message is presented to the user using an appropriate output (for example, a sound reproducing device, such as a speaker or earphone) if the message is a telephone call, a voice mail, a voice message, a sound or a plurality of sounds; a visual display, if the message is a text message, an e-mail, a web page or another image; and/or one or more light emitting devices (for example, LED's or lamps) if the message is a light alert.
  • the request message requests a predetermined response from the PUA user, or a more general response such as a response that acknowledges receipt of the request message.
  • the request is accompanied by data of interest to the user, such as access to certain web sites or content, such as music, video, news, or electronic coupons.
  • access to such data is conditioned on providing the requested response according to parameters expressed in the request message or otherwise predetermined.
  • the processor is implemented as one or more programmable processors running a communications management program module serving to control communications with the PUA and/or its user, along with other PUA's, to request a response including data from which compliance can be assessed.
  • such communications are scheduled in advance by the programming module with or without reference to a database storing schedule data representing a schedule of such communications, and carried out thereby automatically by means of communications 7.
  • such communications are scheduled in advance and notified to human operators who initiate calls to the PUA's and/or the PUA's users according to the schedule, to solicit data from which compliance can be assessed.
  • both automatic communications and human-initiated communications as described above are carried out.
  • a response message is generated 102 in the PUA.
  • the response message is generated by inputting the response message by an action of the user using the message input of the PUA.
  • the response message comprises a code, including letter characters, number characters or symbols, or a combination thereof
  • the response message is generated using the message input of the PUA.
  • the response message comprises data stored in the PUA, in which case, the response message is generated by selecting the stored data using the message input.
  • the response message is a response signal generated by activating the message input, such as, for example, by switching one or more switches or by pressing one or more buttons of the message input.
  • the response message comprises one or more audible sounds, and the response message is generated by inputting the sounds using the message input.
  • the message input comprises an audio input device, such as an acoustic transducer.
  • the response message can be generated in response to a request for a pre-determined response, or in response to a request for a more general response.
  • the response message is communicated from the PUA through communications thereof and is received 104 in the remotely-located processor, such as processor 5.
  • the communications comprises cellular telephone communications, PCS communications, wireless networking communications, satellite communications, or a Bluetooth, ZigBee, electro-optical or other wireless link.
  • such communications comprise an Ethernet interface, a telephone modem, a USB port, a Firewire connection, a cable modem, an audio or video connection, or other network or device interface.
  • when the response message from the PUA is received, or a predetermined time period passes without receiving the response message, the processor provides data indicating whether the use of the PUA is in compliance with at least one predetermined criterion and/or the level of the user's compliance. The data provided by the processor is then stored 106 by the processor. In certain embodiments, the processor provides data indicating a user's compliance and/or the level of a user's compliance based on whether or not the response message from the PUA was received. In other embodiments, the processor provides compliance and/or level of compliance data based on the content of the response message, and/or the length of time passed before the response message from the PUA is received, and/or other factors discussed in more detail herein below.
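A minimal sketch of that receipt-and-timing check follows; the ten-minute window and the record fields are illustrative assumptions only, since the patent leaves the predetermined time period open.

```python
from datetime import datetime, timedelta
from typing import Optional

def assess_compliance(request_sent: datetime,
                      response_received: Optional[datetime],
                      compliant_window: timedelta = timedelta(minutes=10)) -> dict:
    """Hypothetical compliance record based only on receipt and timing of a response."""
    if response_received is None:
        return {"compliant": False, "reason": "no response received"}
    elapsed = response_received - request_sent
    return {
        "compliant": elapsed <= compliant_window,
        "elapsed_seconds": elapsed.total_seconds(),
        "reason": "responded within window" if elapsed <= compliant_window
                  else "responded after window",
    }
```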
  • the processor is implemented as one or more programmable processors running a compliance analysis program module which receives the data returned by the PUA and/or the user of the PUA to the communications management program module and serves to analyze the compliance of the user based on such data and in accordance with compliance rules stored in a storage, such as storage 6 of Figures 1A-1C. Based on such analysis, the compliance analysis program module produces compliance data indicating whether the user complied with the predetermined use criteria and/or a level of such compliance. In certain embodiments, a reward may be provided to a user when the user's use of the PUA is in compliance with the predetermined use criteria or when the user's level of compliance is above a pre-selected compliance level.
  • the reward may be in the form of cash, credit, a prize or a benefit, such as a free service or points usable to make purchases or receive prizes, either by means of the PUA or through a different means or service.
  • the reward comprises data of interest to the user, such as access to certain web sites or content, such as music, video, news, or electronic coupons.
  • a reward to the user is determined 108.
  • the reward to the user, including the type of the reward and/or an amount or quality of the reward, is determined by the processor of the monitoring system based on the stored data indicating the user's compliance or the level of the user's compliance.
  • the reward is provided to the user if the user's level of compliance is higher than a predetermined level, and/or the type and/or the amount of the reward determined in 108 is varied as the level of the user's compliance increases or decreases. For instance, in certain embodiments the number of points awarded to the user, which may be used to purchase goods or services, is greater where the user responds to a larger percentage of request messages, or is increased as the number of request messages that the user responds to increases.
  • Providing rewards to PUA users for use of the PUA in compliance with the predetermined use criteria provides an incentive for the users to comply with the use requirements so as to earn a reward or to earn a higher reward. Therefore, providing a reward to the PUA user for the correct use of the PUA also promotes correct use of the PUA in the future in accordance with the predetermined usage criterion or criteria.
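As a purely illustrative reading of the point-based incentive described above, the sketch below awards points in proportion to the number of request messages answered, with a bonus for full compliance; the specific point values are invented and not specified in the patent.

```python
def reward_points(requests_sent: int, responses_received: int,
                  points_per_response: int = 5, full_compliance_bonus: int = 50) -> int:
    """Illustrative reward rule: points grow with the number (and share) of
    request messages the panelist answered; a bonus rewards full compliance."""
    if requests_sent == 0:
        return 0
    points = responses_received * points_per_response
    if responses_received == requests_sent:
        points += full_compliance_bonus
    return points
```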
  • the monitoring system also communicates a message to the PUA user indicating compliance and/or the level of compliance with the predetermined use criteria for the PUA and/or the reward earned by the user 110.
  • the message communicated to the user can be in the form of a text message, a telephone call, a voice mail, a voice message, an e-mail, an image or a combination thereof communicated via the PUA or otherwise.
  • the message can be in form of a light indication, such as by lighting up an LED or lamp to indicate whether the use of the PUA is in compliance or whether a reward has been earned by the user.
  • the determination of the reward to the user 108 and the communication of the message to the user 110 are optional actions by the monitoring system in monitoring the user's use of the PUA.
  • the determination of the reward is omitted and the monitoring system proceeds to communicating the message to the user indicating the user's compliance and/or level of compliance.
  • the monitoring system determines the reward to the user and automatically provides the reward to the user, such as by sending the reward directly to the user or applying the reward to the user's account, without communicating any messages to the user indicating the user's compliance, level of compliance or reward earned.
  • where the monitoring system has determined that a user has failed to comply, it sends one or more messages to the user and/or to the user's PUA noting such failure, with or without further message content encouraging compliance in the future.
  • the message noting failure to comply is sent in a plurality of different forms, such as both a text message and a voice call, which can be generated either automatically or by human intervention.
  • the determination of a reward is made by one or more programmable processors running a reward determination program module that receives the compliance data produced by the compliance analysis program module and serves to produce reward data based on stored rules, such as rules stored in storage 6, specifying what rewards (including kind and amount), if any, to accord to the user for whom the compliance data was produced.
  • the communications management program module communicates a reward notification to the PUA and/or its user, and/or communicates an order to a service (such as a supplier of goods or services, which can include content and other data) to provide the determined rewards to the user or credit an account of the user with such rewards.
  • the use of a research device is monitored by communicating a request message to the research device, the request message requesting a response from the user of the research device, receiving a response message communicated from the research device in response to the request message, and determining whether the use of the research device by the user is in compliance with the at least one predetermined use criterion.
  • Figure 2B illustrates this embodiment of monitoring use of a research device, namely, a user's PUA, by the monitoring system.
  • the user's PUA is replaced by a research device that does not comprise a PUA.
  • a request message is sent to a PUA from a monitoring system
  • a response message is generated 202 in the PUA and communicated thereby to the monitoring system, in response to the request message and the response message is received 204 by the monitoring system from the PUA (or its non-receipt is recorded).
  • These actions performed by the monitoring system are similar to those, i.e. 100, 102 and 104, described above with respect to Figure 2A, and therefore a detailed description thereof is omitted for purposes of clarity and simplicity.
  • the monitoring system determines 205 whether the user's use of the PUA complies with at least one predetermined use criterion.
  • This determination 205 is performed by a processor of the monitoring system.
  • the predetermined criteria includes, but is not limited to, the PUA being carried, the PUA being carried by a specific user, the PUA being turned “on,” the PUA being charged, the PUA maintaining a minimum charge or power capacity, the PUA being docked at, or connected with, the docking station for a predetermined length and/or period of time, or at certain times, the PUA functioning properly and the PUA being capable of collecting, storing and/or communicating research data, or of cooperating with one or more other devices to do so.
  • the determination 205 whether the use of the PUA is in compliance with the predetermined criteria is based on at least one of the receipt or non-receipt 204 of the response message from the PUA, the time of receipt of the response message and the content of the response message. For example, when the determination 205 is based on the receipt or non-receipt of the response message from the PUA, the processor determines that the use of the PUA is not in compliance with the predetermined criteria if the response message is not received within a predetermined period of time from the sending of the request message to the PUA in 200.
  • a request message requesting a response from the user (such as a text message or voice prompt) is sent to the PUA at regular intervals during the day, at intervals determined according to dayparts or according to a pseudorandom schedule, and the promptness of the user's response, if any, is used to determine an amount or quality of a reward to the user.
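One hypothetical way to generate such a schedule is sketched below; the daypart boundaries and the use of a seeded pseudorandom generator are assumptions chosen only to illustrate "dayparts or a pseudorandom schedule".

```python
import random
from datetime import datetime, time, timedelta
from typing import List

# Hypothetical dayparts used to spread request messages across the day.
DAYPARTS = [(time(6, 0), time(10, 0)),
            (time(10, 0), time(15, 0)),
            (time(15, 0), time(19, 0)),
            (time(19, 0), time(23, 0))]

def schedule_requests(day: datetime, seed: int = 0) -> List[datetime]:
    """Pick one pseudorandom request time inside each daypart of the given day."""
    rng = random.Random(seed)
    out = []
    for start, end in DAYPARTS:
        start_dt = day.replace(hour=start.hour, minute=start.minute,
                               second=0, microsecond=0)
        end_dt = day.replace(hour=end.hour, minute=end.minute,
                             second=0, microsecond=0)
        offset = rng.uniform(0, (end_dt - start_dt).total_seconds())
        out.append(start_dt + timedelta(seconds=offset))
    return out
```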
  • the processor determines how much time had elapsed between the time of sending of the request message to the PUA and the time of receipt of the response message from the PUA and compares it to a selected compliant response time.
  • the compliant response time in certain embodiments is a constant duration for all users, all PUA's, all types of request messages, all places and all times. In certain other embodiments, the compliant response time is selected based on user demographics or an individual profile. In certain embodiments, the compliant response time is based on the type of request message and/or its contents.
  • the compliant response time is specified in the message, for example, "Please respond within ten minutes.”
  • the compliant response time is selected based on the type of PUA that receives it, for example, a cellular telephone or Blackberry device for which a relatively short response time can be expected, as compared to a personal audio or DVD player, for which a longer response time may be appropriate.
  • the compliant response time is selected depending on the manner in which the request message is to be presented to the user. For example, if receipt of the message is indicated to the user by an audible alert or device vibration, a shorter response time can be expected than in the case of a message presented only visually.
  • the compliant response time is selected based on the time of day. For example, during morning or afternoon drive time, the response time may be lengthened since the user may not be able to respond as quickly as during the evening when the user is at home. In certain embodiments, the compliant response time is selected based on the user's location. For example, in certain places it may be customary to respond to messages more quickly than in others. In certain embodiments, the compliant response time is selected based on a combination of two or more of the foregoing factors.
  • if the time elapsed between the sending of the request message and the receipt of the response is less than the selected response time, it is determined that the user's use of the PUA is in compliance with the predetermined criteria. However, if the elapsed time is greater than the selected response time, it is determined that the use of the PUA is not in compliance with the predetermined criteria.
  • the amount of time elapsed between the sending 200 of the request message and the receiving 204 of the response message is used to determine a level of the user's compliance with the predetermined use criteria. In particular, the level of compliance determined by the processor will depend on how quickly the response message is received by the processor, such that the level of compliance is greater as the amount of time elapsed between the sending 200 of the request message and the receipt 204 of the response message is less.
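The mapping from elapsed time to a compliance level could, for example, be a simple step function; the thresholds below are invented, and the only property taken from the text is that the level rises as the response arrives sooner.

```python
from datetime import timedelta

def compliance_level_from_delay(elapsed: timedelta) -> int:
    """Map response delay to a discrete compliance level (3 = best, 0 = none).
    Thresholds are assumed for illustration only."""
    if elapsed <= timedelta(minutes=5):
        return 3
    if elapsed <= timedelta(minutes=30):
        return 2
    if elapsed <= timedelta(hours=4):
        return 1
    return 0
```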
  • the processor determines whether the content of the response message complies with predetermined parameters.
  • a selected response message, complying with predetermined parameters is requested 200 by the request message communicated to the PUA, and in determining compliance and/or the level of compliance, the processor compares the response message received 204 from the PUA with the requested response.
  • the request message communicated 200 to the PUA comprises a request for the user's password or for a particular code, such as a user's screen name or real name.
  • the response message received 204 in response to the request message is compared by the processor to pre-stored data, such as a password, code, screen name or real name stored in a database, to determine 205 whether the use of the PUA is in compliance with the predetermined criteria. If the received response message matches the stored message, i.e., a password, a name (such as a screen name selected by the user or the user's real name) or a code stored in the database, then the processor determines that the user is in compliance with the predetermined criteria.
  • the monitoring system is capable not only of confirming that the PUA is being carried and/or used, but also of confirming that the PUA is being carried and/or used by a specific user.
  • the requested response comprises information from the user, such as what the user is doing when the message is received or at other times, the user's location or locations at various times, media or products to which the user has been exposed, has purchased or used, or plans to purchase or use, the user's beliefs and/or the user's opinions.
  • the requested response comprises information concerning an operational state of the PUA (for example, as indicated thereby or as determined by the user), whether and/or when the user performed some action (such as docking or recharging the PUA), and/or whether and/or how the user is carrying the PUA.
  • the processor determines 205 the level of the PUA user's compliance based on the content of the message.
  • the response message received 204 is compared with stored data, such as a password, name or code stored in the database, and the level of compliance is determined based on how closely the response message matches the stored data.
  • a first, or highest, level of compliance is determined if the response message matches the stored message
  • a second level of compliance which is lower than the first level, is determined if the response message does not match the stored message
  • a third, or lowest, level of compliance is determined if no response message is received 204 from the PUA.
  • a plurality of different intermediate levels of compliance may be determined instead of the second level of compliance, if a response message is received but does not match the stored message.
  • the level determined is based on the extent of similarity between the response message and the pre-stored data.
  • the intermediate level of compliance will be higher in a case where the response message received 204 from the PUA differs from the stored message by only one character than in a case where the response message received from the PUA is completely different from the stored message.
  • the user's compliance and/or level of compliance is determined not only based on the content of the response message but also on the time of receipt of the response message. In certain ones of such embodiments, the user's compliance will depend on whether the response message matches with the stored data, as well as on how quickly the response message is received from the PUA. In certain ones of such embodiments, the highest level of compliance is determined if the response message received from the PUA matches the stored data, and if the time elapsed between the sending of the request message to the PUA and the receipt of the response message is less than a selected time.
  • the level of compliance determined 205 is selected at a level intermediate a highest level of compliance and a lowest level. If no response message is received from the PUA, then the lowest level of compliance, or non- compliance is determined by the monitoring system.
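Combining the content-matching and timing embodiments above, a hypothetical scoring routine might look like the following; the similarity measure (difflib's SequenceMatcher) and the numeric thresholds are illustrative choices, not anything specified in the patent.

```python
from datetime import timedelta
from difflib import SequenceMatcher
from typing import Optional

def score_response(response: Optional[str], expected: str,
                   elapsed: Optional[timedelta],
                   prompt_window: timedelta = timedelta(minutes=10)) -> int:
    """Illustrative scoring: 3 = exact match within the window, 2 = exact match
    but late, 1 = near match (e.g., one character off), 0 = no or unrelated response."""
    if response is None or elapsed is None:
        return 0
    if response == expected:
        return 3 if elapsed <= prompt_window else 2
    similarity = SequenceMatcher(None, response, expected).ratio()
    return 1 if similarity >= 0.8 else 0   # 0.8 is an assumed cutoff
```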
  • the monitoring system also determines and/or provides 206 a reward to the user for complying with the predetermined criteria and/or sends a message to the user indicating at least one of the user's compliance, the level of compliance and the reward to the user 208.
  • after the monitoring system determines whether the PUA use complies with the predetermined use criteria and/or the level of the user's compliance, the monitoring system proceeds to determine and/or provide 206 a reward to the user of the PUA.
  • the system then communicates 208 a message to the user indicating the user's compliance, level of compliance and/or the reward earned by the user.
  • the determination and/or provision 206 of the reward and the communication 208 of the message indicating compliance, level of compliance and/or the reward are optional.
  • the determination and/or provision of the reward is performed without communicating the message to the user, while in other embodiments, the communication 208 of the message is performed without determining and/or providing 206 the reward.
  • the monitoring system monitors one or more parameters, such as biometric parameters, sounds external to a research device, an impact of the research device with another object, motion of the research device, proximity of the research device to the person of a user, proximity of the research device to a presence indicator or personal identification device in or on the person of a user, pressure applied to the research device, recharging of the research device, its power capacity, docking of the research device, data input (e.g., messages) to the research device, location of the research device and/or changes in the research device's location, to determine whether the use of the research device is in compliance with at least one predetermined criterion.
  • the monitoring system produces monitored data by monitoring at least one of a user's heart activity, a user's brain activity, a user's breathing activity, a user's pulse, a user's blood oxygenation, a user's borborygmus (gastrointestinal noise), a user's gait, a user's voice, a user's key, keypad or keyboard usage characteristics (e.g., keystroke recognition), a user's vascular pattern, a user's facial or ear patterns, a user's signature, a user's fingerprint, a user's handprint or hand geometry, a user's retinal or iris patterns, a user's airborne biochemical indicators (sometimes referred to as a user's "smellprint"), a user's muscular activity, a user's body temperature, sounds external to the research device, motion of the research device, pressure applied to the research device, recharging of the research device, docking of the research device, and its power capacity.
  • the monitoring of the biometric parameters 222, external sounds, presence indication signal, personal identification signal 224, PUA location, PUA location changes 226, data input 228 and/or impact of the PUA with another object, pressure applied to the PUA, motion of the PUA, recharging, power capacity, docking 230 is performed in the PUA 2 by the sensor/detector 13 in cooperation with a processor of the PUA (not shown for purposes of simplicity and clarity).
  • the sensor/detector 13 in certain embodiments includes a plurality of sensors and/or detectors which monitor a plurality of parameters.
  • the sensor/detector 13 comprises one or more of a heart monitor for monitoring heart activity of the user, an EEG monitor for monitoring the user's brain activity, a breathing monitor for monitoring the user's breathing activity including, but not limited to, the user's breathing rate, a pulse rate monitor, a pulse oximeter, a sound detector for monitoring the user's borborygmus and/or the user's voice, a gait sensor and/or a gait analyzer for detecting data representing the user's gait, such as a motion sensor or accelerometer (which may also be used to monitor muscle activity), a video camera for use in detecting motion based on changes to its output image signal over time, a temperature sensor for monitoring the user's temperature, an electrode or electrodes for picking up EKG and/or EEG signals, and a fingerprint or handprint scanner for detecting the user's fingerprint or handprint.
  • sensor/detector 13 comprises a low-intensity light source, for scanning, detecting or otherwise sensing the retinal or iris patterns of the user.
  • sensor/detector 13 comprises a device configured with an optical sensor or other imaging device to capture predetermined parameters of the user's hand, such as hand shape, finger length, finger thickness, finger curvature and/or any portion thereof.
  • sensor/detector 13 comprises an electronic sensor, a chemical sensor, and/or an electronic or chemical sensor configured as an array of chemical sensors, wherein each chemical sensor may detect a specific odorant or other biochemical indicator.
  • sensor/detector 13 comprises an optical or other radiant energy scanning or imaging device for detecting a vascular pattern or other tissue structure, or blood flow or pressure characteristic of the user's hand or other body part.
  • the sensor/detector 13 comprises a video camera, optical scanner or other device sufficient to recognize one or more facial features or one or more features of the user's ear or other body part.
  • the sensor/detector 13 is mounted in or on the PUA 2, while in others the sensor/detector 13 is arranged separately from the PUA 2 and communicates therewith via a cable or via an RF, inductive, acoustic, infrared or other wireless link.
  • the sensor/detector 13 of the PUA 2 monitors sounds external to the PUA 224
  • the sensor/detector 13 comprises an acoustic sensor such as a microphone or any other suitable sound detector for detecting external sounds.
  • the sensor/detector 13, which monitors external sounds, cooperates with the processor for analyzing the detected external sounds.
  • the external sounds detected by the sensor/detector 13 include, but are not limited to, environmental noise, rubbing of the PUA 2 against the user's clothing or other external objects, vehicle sounds (such as engine noise and sounds characteristic of opening and closing car doors), the user's voice print, dropping of the PUA, average ambient noise level, and the like.
  • sensor/detector 13 receives a presence indication signal or personal identification signal from signal emitter 14
  • sensor/detector 13 comprises a device operative to receive the signal, such as an RF receiver, a microphone, an optical sensor, an inductive pickup, a capacitive pickup, a chemical sensor or a conductive connection.
  • the sensor/detector 13 monitors the user's data input 228 (e.g., messages, or inputs to control various operations of the PUA, such as making use of an application running thereon, like a game)
  • the sensor/detector 13 comprises a pressure sensor for sensing pressure applied to the message input by the user.
  • the sensor/detector 13 comprises a utility, such as a key logger, running on the processor of the PUA to determine and record its usage.
  • the sensor/detector 13 directly or indirectly detects the change in the PUA's location. Direct detection of the PUA's location is accomplished by detecting the location of the PUA and the change in PUA's location over time.
  • the sensor/detector 13 comprises a satellite location system (such as a GPS receiver), an ultra wideband location detector, a cellular telephone location detector, an angle of arrival location detector, a time difference of arrival location detector, an enhanced signal strength location detector, a location fingerprinting location detector, an inertial location monitor, a short range location signal receiver or any other suitable location detector. The same means can also be employed to determine the PUA's location.
  • Indirect detection of the PUA's location change is accomplished by detecting a predetermined parameter which is directly or indirectly related to the location of the PUA and determining from variations in the predetermined parameter whether a change in the location of the PUA has occurred.
  • a predetermined parameter detected by the sensor/detector 13 can be variations in the strength of an RF signal received by the PUA, in which case the sensor/detector 13 comprises an RF signal receiver. Where location change data is available, such data is used in certain embodiments to determine whether and when the PUA was or is being carried. (See the RF-signal-strength sketch following this list.)
  • the sensor/detector 13 monitors the impact of the PUA 2 with another object 230
  • the sensor/detector 13 comprises an impact detector for measuring pre-determined levels of impact of the PUA 2 with other objects.
  • the sensor/detector 13 comprises an accelerometer for detecting a relatively large acceleration upon impact of the PUA 2 with another object.
  • a pressure sensor is placed on an enclosure of the PUA or mechanically coupled therewith to receive force applied to such enclosure.
  • the magnitude of the pressure, as it varies over time and/or with location on the enclosure, is analyzed to determine whether the PUA is being or was carried, the manner in which it was used and/or an event of non-use.
  • a video camera of the PUA is used as a motion sensor.
  • changes in the image data provided at the output of the video camera are processed to determine movement, or an extent of movement, of the image over time to detect that the PUA is being moved about, whether by translation or rotation (see the frame-difference sketch following this list).
  • Techniques for producing motion vectors indicating motion of an image or an extent of such motion are well known in the art, and are used in certain embodiments herein to evaluate whether the PUA is moving and/or the extent of such movement.
  • changes in the light intensity or color composition of the image data output by the video camera (either the entire image or one or more portions thereof) over time are used to detect motion of the PUA.
  • a light sensitive device such as a light sensitive diode of the PUA, is used as a motion sensor. Changes in the output of the light sensitive device over time that characterize movement serve to indicate that the PUA is being carried.
  • the one or more parameters also include power remaining in the PUA, recharging of the PUA and/or the event of docking of the PUA by coupling the PUA with the docking station, for example, as illustrated in Figure 1 C.
  • the monitoring system produces monitored data by monitoring the power remaining in the PUA and/or by monitoring the docking of the PUA at the docking station.
  • the monitoring system monitors the length of time the PUA was coupled with the docking station, the time period during which the PUA was coupled with the docking station, a time at which the PUA is docked, a time at which the PUA was undocked, whether or not the PUA is coupled with the docking station and/or the length of time passed since the PUA was last docked at the docking station.
  • the monitoring of one or more parameters 222-230 by the monitoring system produces monitored data which indicates at least whether or not the PUA was being carried and/or used in one or more of various ways. For example, if monitoring includes monitoring one or more biometric parameters of the user, then the monitored data indicates at least whether or not the biometric parameters being monitored have been detected. Similarly, in the case of monitoring PUA location changes, external sounds, data input, pressure, motion, light changes and/or impact of the PUA with other objects, the monitored data includes data indicating at least whether or not any of these parameters have been detected.
  • Monitored data that indicates that one or more of these parameters have been detected in the PUA indicates that the PUA was being carried and/or used, while monitored data indicating a lack of any detection of one or more of the monitored parameters indicates that the PUA was not being carried or used.
  • the monitored data produced indicates at least whether or not the PUA was charged and/or whether or not the PUA was docked at the docking station according to a predetermined time parameter.
  • the monitored data includes data indicating at least whether or not the PUA was charged, and in certain embodiments, the monitored data indicates whether the power capacity remaining in the PUA was greater than a predetermined minimum.
  • monitoring includes monitoring of the docking of the PUA at the docking station
  • the monitored data indicates at least whether or not the PUA was docked at the docking station at any time
  • the monitored data indicates one or more of whether or not the PUA was docked at the docking station for a predetermined length of time, how frequently the PUA was docked, when the PUA was docked, when the PUA was undocked and/or the time periods during which the PUA was docked.
  • the monitored data produced in these embodiments can be used to determine whether the use of the PUA was in compliance with the criteria for recharging of the PUA and/or docking of the PUA.
  • monitored data comprises data which can be used to confirm the identity of the PUA user. For example, if one or more biometric parameters of the user are monitored by the sensor/detector, the monitored data includes data indicating or relating to one or more of the user's heart rate or other heart activity or parameter, EEG, blood oxygenation, breathing rate or other breathing activity or parameter, borborygmus, gait, voice, voice analysis, key, keypad or keyboard usage characteristics, fingerprints, handprints, hand geometry, pulse, retinal or iris patterns, olfactory characteristics or other biochemical indicators, patterns of muscular activity, vascular patterns, facial or ear patterns, signature, and/or body temperature detected once or a plurality of times over a predetermined period of time.
  • monitored data can include data relating to the specific locations or changes in location of the PUA and/or relating to the specific RF signal strengths of the PUA detected one or a plurality of times over a predetermined period of time.
  • the sensor/detector 13 of the PUA 2 comprises a digital writing tablet that is used to input a digital handwritten signature from the user to assess who is using the PUA.
  • a storage of the PUA stores signature recognition software to control a processor of the PUA to compare the current user's signature input by means of the digital writing tablet against stored templates of one or more users' handwritten signatures to determine if there is a match. (The storage and the processor are not shown for purposes of simplicity and clarity.) Based on the results of the matching process, data is produced indicating whether the current user's signature matches any of the stored templates to assess the identity of the current user of the PUA.
  • the templates of the users' signatures are produced in a training mode of the signature recognition software, in which each potential user inputs one or more signatures using the digital writing tablet from which a corresponding template is produced by the PUA's processor and then stored in its storage.
  • the PUA includes a digital writing tablet to enable a user-beneficial function, such as note taking, and it is then unnecessary to provide a dedicated digital writing tablet.
  • the sensor/detector 13 comprises a microphone and a voiceprint recognition technique is used to assess the identity of the user of the PUA 2.
  • the PUA's storage stores voice recognition software to control its processor to compare the current user's voice input by means of the microphone against stored voiceprints of one or more possible users to determine if there is a match. Based on the results of the matching process, data is produced indicating whether the current user's voice matches the voice represented by any of the stored voiceprints to assess the identity of the current user of the PUA.
  • the voiceprints of one or more potential users are produced in a training mode of the voice recognition software, in which each potential user speaks into the microphone of the PUA to produce data from which the voiceprint is produced by its processor and then stored in its storage.
  • Various ones of such embodiments extract the user's voiceprint under different conditions.
  • the user's voiceprint is extracted when the user places a voice call using the PUA as a cellular telephone in response to a request message from a monitoring system.
  • the PUA's processor extracts voiceprints continuously from the output of its microphone, or at predetermined times or intervals, or when a telephone call is made using the PUA as a cellular telephone, or when the output from the PUA's microphone indicates that someone may be speaking into it (indicated, for example, by the magnitude of the output and/or its time and/or frequency characteristics).
  • the extracted voiceprints are compared to the stored voiceprint to assess the identity of the person using the PUA.
  • the sensor/detector 13 comprises an imaging device, such as a video camera, or other radiant energy detector, such as a line scanner implemented by means of a CCD or an array of photodiodes, that is used to input data representing an image or line scan of a physical feature of the user, such as an iris, a retina, or an image of all or a portion of the user's face, finger, palm, hand or ear, to assess the identity of the user of the PUA 2.
  • the input data is processed to extract an iris or retinal pattern code.
  • a facial image is processed to extract data unique to the user such as a signature or feature set representing facial bone structure.
  • the PUA's storage stores pattern recognition software to control its processor to compare the current user's iris or retinal pattern code, facial signature or feature set or other characteristic data input by means of the imaging device against one or more stored pattern codes, signatures, feature sets or other characteristic data of one or more potential users, as the case may be, to determine if there is a match.
  • characteristic data may be stored in storage 50 or in a storage of a separate device, system or processing facility.
  • data is produced by the PUA's processor operating under control of the pattern recognition software to assess the identity of the current user of the PUA.
  • the pattern code, signature, feature set or other characteristic data of each potential user is produced in a training mode of the pattern recognition software, in which the appropriate physical feature of the potential user is imaged or scanned one or more times using the imaging device from which the desired data is produced by the PUA's processor and then stored in its storage.
  • the physical feature concerned is scanned or imaged at a plurality of different orientations to produce the desired data.
  • the PUA (such as a cellular telephone) includes a digital camera to enable a user-beneficial function, such as digital photography or video imaging, and it is then unnecessary to provide a dedicated imaging device or scanner.
  • a keyboard dynamics technique is used to assess the identity of the user.
  • the PUA's storage stores keystroke monitoring software to control its processor to collect characteristic keystroke parameters, such as data indicating how long the user holds down the keys 1 1 a of PUA 2, the delay between one keystroke and the next (known as "latency"), and frequency of using of special keys, such as a delete key.
  • Still other parameters, such as typing speed and the manner in which the user employs key combinations (such as keyboard shortcuts), may be monitored by the processor. These parameters are processed in a known manner to produce a feature set characterizing the user's key usage style, which is then compared against stored feature sets representing the styles of one or more potential users (see the keystroke-dynamics sketch following this list).
  • data is produced indicating whether the current user's key usage style matches that of one of the potential users as represented by a matching stored feature set to assess the identity of the current user of the PUA.
  • the feature sets representing the usage styles of the potential users are produced in a training mode of the software, in which each potential user makes use of the key or keys of the PUA to produce data from which the feature set is produced by the PUA's processor and then stored in its storage.
  • the sensor/detector 13 comprises a motion sensitive device, such as an accelerometer, that produces data related to motion of the PUA 2. This data is used to produce a feature set characterizing motion of the PUA, and thus the gait of a person carrying it.
  • the PUA's storage stores pattern recognition software to control its processor to compare the current user's gait feature set against one or more stored reference feature sets representing the individual gaits of potential users to determine if there is a match. Based on the results of the matching process, data is produced indicating whether the current user's gait matches that represented by a stored feature set to assess the identity of the current user of the PUA.
  • the various feature sets each representing the gait of a potential user are produced in a training mode of the pattern recognition software, in which each potential user walks about carrying the PUA while the motion sensitive device thereof produces data from which its processor produces a respective reference feature set which it stores in the PUA's storage.
  • the PUA includes an accelerometer as an input device to enable a user-beneficial function, such as a gaming input or scrolling command input, and it is then unnecessary to provide a dedicated accelerometer as the motion sensitive device.
  • multiple devices and pattern recognition techniques are employed to produce a more accurate and reliable identification of the user than is possible using only one such pattern recognition technique.
  • one or more of such pattern recognition techniques or other passive data gathering techniques are employed to assess the user's identity. Such assessment may be based on an amount by which a monitored feature set differs from a stored feature set representing a characteristic of each potential user, as determined by the PUA's processor.
  • the processor When the processor produces data indicating an identification of the user, in certain embodiments either the processor controls a speaker, earphone or visual display of the PUA to present a message to the user requesting a response from which the user's identity may be positively determined, or the processor sends a message to a monitoring system (not shown for purposes of simplicity and clarity) indicating that such a message should be presented to the user. In the latter case, the monitoring system responds to such message from the processor to send a message to the PUA for presentation to the user to request an appropriate response from the user from which the user's identity may be determined, either by the processor or by the monitoring system. The user's response to such message is used to determine the user's identity.
  • data concerning usage of a PUA to perform a user-beneficial function is gathered by the monitoring system.
  • the gathering of data concerning such usage of the PUA comprises monitoring usage of the PUA to produce usage data within the PUA, and communicating the usage data from the PUA to a usage data processing facility.
  • This embodiment is illustratively shown in Figure 3. This is especially useful for gathering marketing data concerning how users employ PUA's with an ability to communicate, such as cellular telephones, PDA's, notebook and laptop computers, Blackberry devices, PCS devices, two-way radios, as well as other kinds of PUA's having device-to-device communicating ability or wireless networking ability.
  • the monitoring system monitors the user's use of the PUA 280 and produces usage data within the PUA 282 based on such monitoring. If the monitoring system shown in Figure 1 B is employed, certain monitoring of PUA usage is performed by the sensor/detector 13, which detects the use of one or more functions performed by the PUA. For example, if the PUA includes a function of generating and communicating a text message to another PUA, the sensor/detector 13 in the PUA 2 detects when the user generates and/or communicates a text message, and usage data relating to the generation and communication of the text message is produced in the PUA 2.
  • the operations of sensor/detector 13 are implemented by a processor of the PUA that may carry out additional operations beyond those of sensor/detector 13.
  • the usage data produced in the PUA 2 includes at least data relating to content generated by the performance of the PUA function.
  • the usage data also comprises one or more of data indicating the type of PUA function used, data indicating the time of use of the PUA function, data indicating the length of time of the use of the PUA function, and data relating to the use of communications, if any, to send or receive messages with the use of the PUA.
  • Data relating to the use of communications by the PUA includes data relating to the time a message is communicated, the size of the message and/or the destination of the message, such as the recipient's telephone number, email address and/or IP address.
  • Data relating to the content generated by the use of the PUA function includes data relating to the subject of the generated content and/or data relating to words, phrases, names or concepts included in the content, such as "buzz words". Buzz words comprise words, terms or phrases that advertisers and other businesses would find of value as descriptive of consumers' experiences and reactions to media and advertising content. Some examples include contrasting word-pair choices such as "boring" vs. "exciting", or "essential" vs. its opposite.
  • the usage data produced in the PUA is thereafter communicated 284 to a usage data processing facility.
  • the usage data processing facility includes a processor, such as the processor 5 shown in Figure 1 B.
  • the processing facility is adapted to receive and process usage data to generate trend data relating to a variety of trends.
  • the trend data generated by the processing facility includes, but is not limited to, data relating to the time, frequency and/or manner of usage of the PUA function, the preference of one PUA function over others, the use of a particular "buzz word," name, brand and/or concept by users, the communications to a particular area code, IP address and/or email service, and other trends relating to the usage of the PUA (see the buzz-word counting sketch following this list).
  • the PUA includes communications for communicating with at least another PUA
  • the methods and systems for monitoring use of a PUA comprise detecting the communicating of a message by the communications of the PUA, providing monitored data relating to content of the message, and providing trend data representing at least one trend of usage of the PUA by the user based on the monitored data.
  • Figure 4 shows a flow diagram of actions performed by the monitoring system.
  • the PUA is adapted to communicate with other PUA's using a communication interface.
  • the PUA 2 includes communications in the form of an interface 9 which can communicate using the communications 7.
  • each of the other PUA's also includes a corresponding interface which is coupled with the communications 7, such that each such PUA can communicate with other PUA's via the communications 7.
  • the communicating of the message is detected 290 in the PUA.
  • the sensor/detector 13 is used to detect the communicating of the message by the PUA 2.
  • the operation of sensor/detector 13 is provided by a processor that may carry out operations in addition to those of sensor/detector 13.
  • the communicating by the PUA is detected by detecting a connection between the interface of the PUA and another PUA or device.
  • the communicating by the PUA is detected by detecting data sent from or received by the interface.
  • monitored data relating thereto is gathered 292 comprising at least data representing content of the message, such as the subject of the communication and/or the use of pre-selected words, names, concepts or images in the communication.
  • the monitored data includes data related to one or more of the time of communicating, the duration of communicating, the length or size of the message, the type of message (e.g., e-mail, voice, text message, etc.), and the source and/or the recipient of the message.
  • the monitored data is then processed 294 to determine at least one trend of usage of the PUA by the user and to provide trend data relating to at least one trend of usage.
  • the monitored data is processed either in the PUA 2, or is first communicated to the processor 5 via the communications 7 and thereafter processed by the processor 5 to provide trend data.
  • Trend data provided based on the monitored data comprises data relating to at least one of the PUA functions used by the user, the type of messages sent or received by the user, the frequency of messages sent or received by the user, the time of communicating the messages, the duration of the communicating, the source and recipient of the messages and the content of the messages.
  • the trend data provided by the monitoring system is then stored 296 either in the PUA or in an external storage.
  • the trend data is stored in at least one of the PUA 2 or in the storage 6. If the trend data is stored in the PUA 2, this data can thereafter be communicated to an external storage device such as the storage 6 of the monitoring system 1.
  • Trend data provided in the embodiments shown in Figures 3 and 4, and described above, can be used as market research data to determine user preferences, including the user's preferences relating to the PUA functions.
  • trend data can be used to determine which functions of the PUA are most frequently used and by which users, and hence which functions could be removed or added in future versions of the PUA products.
  • trend data can be used to determine the popularity or success of a particular product, brand, person or concept and to ascertain how well a particular product, service or brand may do in the market.
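
The compliance-scoring sketch referenced above is a minimal illustration of determining a level of compliance from the content of the response message and the time of its receipt. It is not the patent's prescribed algorithm: the use of a character-similarity ratio, the timeout value and the function name are assumptions made only for illustration.

    # Illustrative sketch only; similarity measure, threshold and names are assumptions.
    from difflib import SequenceMatcher

    def compliance_level(response, stored, elapsed_s, max_elapsed_s=300.0):
        """Return a score from 0.0 (non-compliance) to 1.0 (full compliance).

        response  -- text of the response message, or None if none was received
        stored    -- the pre-stored password, name or code to compare against
        elapsed_s -- seconds between sending the request and receiving the response
        """
        if response is None:
            return 0.0  # lowest level: no response message received

        # Similarity of the response to the stored data (1.0 = exact match).
        similarity = SequenceMatcher(None, response, stored).ratio()

        # Full credit only if the reply arrived within the selected time.
        timeliness = 1.0 if elapsed_s <= max_elapsed_s else max_elapsed_s / elapsed_s

        return similarity * timeliness

    # A one-character error yields an intermediate level, higher than a
    # completely different reply; no reply at all yields the lowest level.
    print(compliance_level("panel1st", "panelist", elapsed_s=42.0))
    print(compliance_level("zzzz", "panelist", elapsed_s=42.0))
    print(compliance_level(None, "panelist", elapsed_s=9999.0))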
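
The RF-signal-strength sketch referenced above shows one simple way to infer a change in the PUA's location indirectly from variation in received signal strength. The window size, threshold and sample values are arbitrary assumptions, not values taken from the text.

    # Illustrative sketch only: infer movement from RSSI variation.
    from statistics import pstdev

    def location_changed(rssi_samples_dbm, threshold_db=6.0):
        """Return True if RSSI variation over the window suggests the PUA moved."""
        if len(rssi_samples_dbm) < 2:
            return False
        return pstdev(rssi_samples_dbm) > threshold_db

    stationary = [-61, -62, -61, -63, -62, -61]
    carried = [-61, -55, -72, -80, -58, -90]
    print(location_changed(stationary))  # False: little variation, PUA likely at rest
    print(location_changed(carried))     # True: large swings suggest the PUA is being carried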
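
The frame-difference sketch referenced above illustrates detecting motion from changes in successive camera frames using a mean absolute pixel difference. Frames are modelled as plain lists of grey-level values and the threshold is an arbitrary assumption; a real implementation would operate on actual image buffers or motion vectors.

    # Illustrative sketch only: motion detection by mean absolute frame difference.
    def frames_show_motion(frame_a, frame_b, threshold=10.0):
        """Return True if the mean absolute difference between two frames exceeds the threshold."""
        if len(frame_a) != len(frame_b) or not frame_a:
            raise ValueError("frames must be non-empty and the same size")
        mad = sum(abs(a - b) for a, b in zip(frame_a, frame_b)) / len(frame_a)
        return mad > threshold

    still = ([120] * 64, [121] * 64)    # almost identical frames: PUA at rest
    moving = ([120] * 64, [180] * 64)   # large change: PUA (or scene) in motion
    print(frames_show_motion(*still))   # False
    print(frames_show_motion(*moving))  # True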
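
The keystroke-dynamics sketch referenced above builds a simple feature set (mean key hold time, mean latency between keystrokes, delete-key frequency) and matches it against stored per-user templates. The feature choice, distance measure, threshold and names are assumptions for illustration only.

    # Illustrative sketch only: keystroke feature extraction and template matching.
    def keystroke_features(events):
        """events: list of (key, press_time_s, release_time_s), in typing order."""
        holds = [rel - prs for _, prs, rel in events]
        gaps = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
        deletes = sum(1 for key, _, _ in events if key == "DEL")
        return (
            sum(holds) / len(holds),
            (sum(gaps) / len(gaps)) if gaps else 0.0,
            deletes / len(events),
        )

    def closest_user(features, templates, max_distance=0.15):
        """templates: {user_name: feature_tuple}; returns the best match or None."""
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
        user, d = min(((u, dist(features, t)) for u, t in templates.items()), key=lambda p: p[1])
        return user if d <= max_distance else None

    events = [("h", 0.00, 0.08), ("i", 0.20, 0.27), ("DEL", 0.45, 0.55)]
    templates = {"alice": (0.09, 0.11, 0.30), "bob": (0.20, 0.40, 0.05)}
    print(closest_user(keystroke_features(events), templates))  # -> "alice"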
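
The buzz-word counting sketch referenced above shows one simple way a processing facility might aggregate trend data from monitored message content. The word list and message samples are hypothetical; nothing here is prescribed by the text.

    # Illustrative sketch only: count pre-selected "buzz words" across monitored messages.
    from collections import Counter
    import re

    BUZZ_WORDS = {"boring", "exciting", "essential"}

    def buzz_word_counts(messages):
        """Return a Counter of buzz-word occurrences across the monitored messages."""
        counts = Counter()
        for text in messages:
            for word in re.findall(r"[a-z']+", text.lower()):
                if word in BUZZ_WORDS:
                    counts[word] += 1
        return counts

    monitored = [
        "That new trailer was exciting!",
        "The show last night was boring, honestly boring.",
        "This app is essential for my commute.",
    ]
    print(buzz_word_counts(monitored))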

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Strategic Management (AREA)
  • Biomedical Technology (AREA)
  • General Business, Economics & Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Human Resources & Organizations (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Biophysics (AREA)
  • Quality & Reliability (AREA)
  • Pathology (AREA)
  • Tourism & Hospitality (AREA)
  • Data Mining & Analysis (AREA)
  • Operations Research (AREA)
  • Game Theory and Decision Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Ophthalmology & Optometry (AREA)
  • Cardiology (AREA)
  • Physiology (AREA)
  • Pulmonology (AREA)
  • Radiology & Medical Imaging (AREA)

Abstract

Methods and systems for gathering data concerning usage of a portable user appliance are disclosed. Content created in the use of the portable user appliance is monitored to produce content related data. The content related data is communicated to a usage data processing facility for producing reports of interest to advertisers, media organizations, marketers and the like.

Description

MONITORING USAGE OF A PORTABLE USER APPLIANCE
[0001] Methods and systems for monitoring use of a portable user appliance (PUA) are disclosed.
BACKGROUND
[0002] Media organizations, marketers, advertisers, and others are interested in learning consumers' opinions, preferences, beliefs and feelings about media, advertisements and products. Various ways of soliciting consumers' views on these matters have been employed, but they are relatively expensive to carry out, and the views so obtained may be biased, since many consumers can be reluctant to express their real views when they respond to an explicit request for them. Accordingly, a relatively inexpensive way of obtaining consumers' views on such matters, which also tends to elicit such views candidly and in an unbiased manner, would be advantageous.
DISCLOSURE
[0003] For this application, the following terms and definitions shall apply:
[0004] The term "data" as used herein means any indicia, signals, marks, symbols, domains, symbol sets, representations, and any other physical form or forms representing information, whether permanent or temporary, whether visible, audible, acoustic, electric, magnetic, electromagnetic or otherwise manifested. The term "data" as used to represent predetermined information in one physical form shall be deemed to encompass any and all representations of corresponding information in a different physical form or forms.
[0005] The terms "media data" and "media" as used herein mean data which is widely accessible, whether over-the-air, or via cable, satellite, network, internetwork (including the Internet), print, displayed, distributed on storage media, or by any other means or technique that is humanly perceptible, without regard to the form or content of such data, and including but not limited to audio, video, audio/video, text, images, animations, databases, broadcasts, displays (including but not limited to video displays, posters and billboards), signs, signals, web pages, print media and streaming media data.
[0006] The term "research data" as used herein means data comprising (1 ) data concerning usage of media, (2) data concerning exposure to media, and/or (3) market research data.
[0007] The term "presentation data" as used herein shall mean media data, content other than media data or a message to be presented to a user.
[0008] The term "database" as used herein means an organized body of related data, regardless of the manner in which the data or the organized body thereof is represented. For example, the organized body of related data may be in the form of a table, a map, a grid, a packet, a datagram, a frame, a file, an e-mail, a message, a document, a list or in any other form.
[0009] The term "correlate" as used herein means a process of ascertaining a relationship between or among data, including but not limited to an identity relationship, a correspondence or other relationship of such data to further data, inclusion in a dataset, exclusion from a dataset, a predefined mathematical relationship between or among the data and/or to further data, and the existence of a common aspect between or among the data.
[00010] The terms "purchase" and "purchasing" as used herein mean a process of obtaining title, a license, possession or other right in or to goods or services in exchange for consideration, whether payment of money, barter or other legally sufficient consideration, or as promotional samples. As used herein, the term "goods" and "services" include, but are not limited to, data and rights in or to data.
[00011] The term "network" as used herein includes both networks and internetworks of all kinds, including the Internet, and is not limited to any particular network or inter-network. [00012] The terms "first," "second," "primary," and "secondary" are used herein to distinguish one element, set, data, object, step, process, function, activity or thing from another, and are not used to designate relative position, arrangement in time or relative importance, unless otherwise stated explicitly.
[00013] The terms "coupled", "coupled to", and "coupled with" as used herein each mean a relationship between or among two or more devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, and/or means, constituting any one or more of (a) a connection, whether direct or through one or more other devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, or means, (b) a communications relationship, whether direct or through one or more other devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, or means, and/or (c) a functional relationship in which the operation of any one or more devices, apparatus, files, circuits, elements, functions, operations, processes, programs, media, components, networks, systems, subsystems, or means depends, in whole or in part, on the operation of any one or more others thereof.
[00014] The terms "communicate" and "communicating" as used herein include both conveying data from a source to a destination, and delivering data to a communications medium, system, channel, network, device, wire, cable, fiber, circuit, and/or link to be conveyed to a destination. The term "communications" as used herein includes one or more of a communications medium, system, channel, network, device, wire, cable, fiber, circuit and link.
[00015] The term "message" as used herein includes data to be communicated, in communication or which has been communicated.
[00016] The term "processor" as used herein means processing devices, apparatus, programs, circuits, components, systems and subsystems, whether implemented in hardware, software or both, and whether or not programmable. The term "processor" as used herein includes, but is not limited to one or more computers, hardwired circuits, signal modifying devices and systems, devices and machines for controlling systems, central processing units, programmable devices and systems, field programmable gate arrays, application specific integrated circuits, systems on a chip, systems comprised of discrete elements and/or circuits, state machines, virtual machines, data processors, processing facilities and combinations of any of the foregoing.
[00017] The terms "storage" and "data storage" as used herein mean data storage devices, apparatus, programs, circuits, components, systems, subsystems and storage media serving to retain data, whether on a temporary or permanent basis, and to provide such retained data.
[00018] The terms "panelist," "panel member" and "participant" are interchangeably used herein to refer to a person who is, knowingly or unknowingly, participating in a study to gather information, whether by electronic, survey or other means, about that person's activity.
[00019] The term "household" as used herein is to be broadly construed to include family members, a family living at the same residence, a group of persons related or unrelated to one another living at the same residence, and a group of persons (of which the total number of unrelated persons does not exceed a predetermined number) living within a common facility, such as a fraternity house, an apartment or other similar structure or arrangement.
[00020] The term "activity" as used herein includes, but is not limited to, purchasing conduct, shopping habits, viewing habits, computer, Internet usage, exposure to media, personal attitudes, awareness, opinions and beliefs, as well as other forms of activity discussed herein.
[00021] The term "portable user appliance" (also referred to herein, for convenience, by the abbreviation "PUA") as used herein means an electrical or nonelectrical device capable of being carried by or on the person of a user or capable of being disposed on or in, or held by, a physical object (e.g., attache, purse) capable of being carried by or on the user, and having at least one function of primary benefit to such user, including without limitation, a cellular telephone, a personal digital assistant ("PDA"), a Blackberry ® device, a radio, a television, a game system (e.g., a Gameboy ® device), a notebook computer, a laptop computer, a GPS device, a personal audio device (e.g., an MP3 player), a DVD player, a two-way radio, a personal communications device, a telematics device, a remote control device, a wireless headset, a wristwatch, a portable data storage device (e.g., Thumb ™ drive), a camera, a recorder, a keyless entry device, a ring, a comb, a pen, a pencil, a notebook, a wallet, a tool, a flashlight, an implement, a pair of glasses, an article of clothing, a belt, a belt buckle, a fob, an article of jewelry, an ornamental article, a pair of shoes or other foot garment (e.g., sandals), a jacket, and a hat, as well as any devices combining any of the foregoing or their functions.
[00022] The term "research device" as used herein shall mean (1 ) a portable user appliance configured or otherwise enabled to gather, store and/or communicate research data, or to cooperate with other devices to gather, store and/or communicate research data, and/or (2) a research data gathering, storing and/or communicating device.
[00023] The term "user-beneficial function" as used herein shall mean a function initiated or carried out by a person with the use of a PUA, which function is of primary benefit to that person.
[00024] A method of gathering data concerning usage of a PUA comprises: monitoring content created in the use of the PUA to produce content related data; and communicating the content related data to a usage data processing facility.
[00025] A system for gathering data concerning usage of a PUA comprises a monitor in or on the PUA and operative to monitor content created in the use of the PUA to produce content related data; and communications coupled with the monitor to receive the content related data and operative to communicate the content related data from the PUA to a usage data processing facility.
[00026] A method of monitoring use of a PUA by a user, the PUA including a communication interface for communicating with at least another PUA, comprises detecting communication by the communication interface of the PUA; providing communication content data relating to content of the communication of the PUA; and providing trend data representing at least one trend of usage of the PUA by the user based on the communication content data.
[00027] A system for monitoring use of a PUA by a user, the PUA including a communication interface for communicating with at least another PUA comprises a monitor operative to detect data communicated by the communication interface of the PUA and to produce communication content data relating to content of the communicated data; and a processor coupled with the monitor to receive the communication content data and operative to provide trend data representing at least one trend of usage of the PUA by the user based on the communication content data.
[00028] Certain embodiments of the methods and systems are presented in the following disclosure in conjunction with the accompanying drawings, in which:
[00029] Figure 1 A illustrates various monitoring systems that include a portable user appliance ("PUA") used by a user and configured to operate as a research device;
[00030] Figure 1 B is a block diagram showing certain details of the monitoring systems of Figure 1A;
[00031] Figure 1 C is a block diagram showing the monitoring systems of Figure 1 A including a PUA coupled with a docking station;
[00032] Figures 2A and 2B are flow diagrams illustrating actions by the monitoring systems of Figures 1 A-1 C which actively monitor use of the PUA;
[00033] Figure 3 is a flow diagram illustrating actions by the monitoring systems of Figures 1A-1 C which monitor usage of the PUA; and
[00034] Figure 4 is a flow diagram illustrating actions by the monitoring systems of Figures 1A-1 C which provide trend data representing one or more PUA usage trends.
[00035] Various embodiments of methods and systems for monitoring use of a PUA by one or more users are described herein below. Referring to the drawings, Figures 1 A and 1 B are schematic illustrations of a monitoring system 1 that includes a PUA 2, which is used by a user 3, and a processor 5. In certain embodiments otherwise corresponding to the embodiment of Figures 1 A and 1 B, the PUA 2 is replaced by a research device that does not comprise a PUA. The processor 5 may include one or a plurality of processors, located together or separately from one another, disposed within or controlled by one or more organizations. As shown, the PUA 2 may be coupled to the processor 5 via communications 7, which allows data to be exchanged between the PUA 2 and the processor 5. In certain embodiments, the PUA 2 is wirelessly coupled via communications 7 to the processor 5.
[00036] In some embodiments, the monitoring system 1 also includes storage 6 for storing data including, but not limited to, data received and/or processed by the central processor 5. In certain embodiments storage 6 includes one or more storage units located together or separate from one another at the same or different locations. In certain embodiments storage 6 is included with processor 5.
[00037] Figure 1 B is a more detailed illustration of an embodiment of the monitoring system 1 in which the PUA 2 is adapted to communicate wirelessly with the processor 5 using wireless communications 8. The PUA 2 includes a communication interface 9 for communicating and receiving data through communications 8. As shown, the PUA 2 also includes a message input 1 1 to allow the user of the PUA 2 to input a message into the PUA 2. The message input 1 1 is coupled with the communication interface 9 of the PUA 2, so that a message inputted using the message input 1 1 can be communicated from the PUA 2 via communications 8. It is understood that messages inputted using the message input 1 1 may be communicated to the processor 5, or to another PUA 2, or to another location or device coupled with communications 8. In the illustrative embodiment shown in Figure 1 B, the message input 1 1 comprises a plurality of keys 1 1 a in the form of a keypad. However, the configuration of the message input 1 1 may vary, such that, for example, the message input 1 1 may comprise one or more of a key, a button, a switch, a keyboard, a microphone, a video camera, a touch pad, an accelerometer, a motion detector, a touch screen, a tablet, a scroll-and-click wheel or the like.
[00038] In the illustrative configuration shown in Figure 1 B, the PUA 2 also comprises a sensor or a detector 13 for detecting one or more parameters. The parameter or parameters detected by the sensor/detector 13 include, but are not limited to, the remaining power capacity of the PUA 2, one or more of a user's biometric functions or parameters, a location of the PUA 2, a change in location of the PUA 2, data input to the PUA by the user, sounds external to the PUA 2, motion of the PUA 2, pressure being applied to the PUA 2, or an impact of the PUA 2 with another object. In certain embodiments, sensor/detector 13 detects a presence indication signal or a personal identification signal emitted by a signal emitter 14 carried in or on the person of the user. In certain ones of these embodiments, the signal emitter 14 comprises a device worn or carried by the user, such as a ring, a necklace, or other article of jewelry, a wristwatch, a key fob, or article of clothing that emits a predetermined signal indicating a user's presence or the identity of the user wearing or carrying the device. The signal may be emitted as an acoustic signal, an RF or other electromagnetic signal, or a chemical signal that sensor/detector 13 is operative to receive, or an electrical signal. In certain embodiments, the signal emitter 14 comprises a device implanted in the user, such as under the user's skin. In certain embodiments, the sensor/detector 13 includes a plurality of sensors or detectors each for detecting one or more of a plurality of parameters.
[00039] As shown in Figure 1 B, the sensor/detector 13 is coupled with the communications interface 9 of the PUA 2 so that data produced as a result of the sensing or detecting performed by the sensor/detector 13 can be communicated from the PUA 2 to the processor 5. Although the PUA 2 shown in Figure 1 B includes both the message input 11 and the sensor/detector 13, it is understood that in other embodiments, one of these elements may be omitted depending on the design of the PUA 2 and the requirements of the monitoring system 1.
[00040] As in Figure 1 A, the illustrative configuration of the monitoring system 1 shown in Figure 1 B includes storage 6 coupled or included with the processor 5 to store data, including data received and/or processed by the processor 5. Data stored in storage 6 can also be retrieved by the processor 5 when needed.
[00041] The PUA 2 shown in Figures 1 A and 1 B may be supplied with power from an A/C power source or other power supply, or using one or more batteries or other on-board power source (not shown for purposes of simplicity and clarity). It is understood that batteries used to supply power to the PUA 2 may include any type of batteries, whether rechargeable or not, that are suitable for use with the particular PUA 2. In certain embodiments, the PUA 2 receives power from rechargeable batteries or another kind of rechargeable power supply, such as a capacitor, and/or from a radiant energy converter, such as a photoelectric power converter, or a mechanical energy converter, such as a microelectric generator. In certain embodiments, the PUA 2 is connected with a docking station from time to time, which is used for charging the PUA 2 and/or transmitting data stored in the PUA 2 to the processor 5. Figure 1 C shows an embodiment of the PUA 2 used with the docking station 15. The docking station 15, which is typically not carried by the user and not coupled with the PUA 2 while the PUA is being carried by the user, is adapted to couple with the PUA 2 via a coupling 16. The coupling 16 can be a direct connection between the PUA 2 and the docking station 15 to allow recharging of the PUA 2 and/or communication of data between the PUA 2 and the docking station 15. In certain embodiments, data is communicated from the PUA to the docking station by a wireless infra-red, RF, capacitive or inductive link. In certain embodiments, data is communicated from the PUA 2 to the processor 5 by cellular telephone link or other wired or wireless network or device coupling.
[00042] As shown in Figure 1 C, in certain embodiments the docking station is connected to a power supply 17 to provide power for charging the PUA 2 when the PUA 2 is coupled with the docking station 15. In addition, in certain embodiments the docking station 15 includes a communication interface 19 adapted to communicate with the processor 5 through communications 7. When the PUA 2 is coupled with the docking station 15 via the coupling 16, data stored in the PUA 2, such as data collected by the PUA 2 when it was carried by the user, is transferred to the docking station 15 using the coupling 16 and thereafter communicated using the communication interface 19 to the processor 5 through communications 7. In these embodiments, the use of the docking station 15, rather than the PUA 2, to communicate to the processor 5 data collected by the PUA 2 enables conservation of power by the PUA 2 or the use of an internal power supply having a relatively low power capacity. In certain embodiments, the docking station 15 is also used to receive data from the processor 5 via communications 7, and to transfer the received data from the docking station 15 to the PUA 2 via the coupling 16 when the PUA 2 is coupled with the docking station 15.
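A minimal sketch of the dock-and-relay flow described in the preceding paragraph follows. The PUA, DockingStation and Processor classes and their methods are hypothetical stand-ins for the devices and interfaces named above, which are not specified at this level of detail in the text.

    # Illustrative sketch only: on docking, the PUA's stored data is pulled by the
    # docking station and relayed upstream, sparing the PUA's own power supply.
    class Processor:
        def __init__(self):
            self.received = []
        def ingest(self, records):
            self.received.extend(records)

    class PUA:
        def __init__(self):
            self.stored = []            # data collected while the PUA was carried
        def collect(self, record):
            self.stored.append(record)
        def drain(self):
            records, self.stored = self.stored, []
            return records

    class DockingStation:
        def __init__(self, processor):
            self.processor = processor
        def dock(self, pua):
            # Relay the PUA's stored data via the dock's communication interface.
            self.processor.ingest(pua.drain())

    processor = Processor()
    dock = DockingStation(processor)
    pua = PUA()
    pua.collect({"parameter": "external_sound", "detected": True})
    dock.dock(pua)
    print(processor.received)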
[00043] As can be appreciated, the configuration of the docking station 15 is not limited to the configuration shown in Figure 1 C and may vary from one embodiment to another. For example, in certain embodiments, the docking station is used only for charging the PUA 2 and does not include a communication interface 19. In such embodiments, the docking station 15 is implemented variously as a cradle receiving the PUA 2 or as a standard AC-to-DC converter, like a cellular telephone charger. In other embodiments, the docking station 15 is used only for communication of data between the PUA 2 and the processor 5 and does not charge the PUA 2. In such embodiments, the PUA 2 may be connected to a power supply, separate from the docking station 15, for charging, or charged using an internal power converter, or by replacing one or more batteries.
[00044] In certain embodiments, the PUA 2 shown in Figures 1 A-1 C optionally includes an output (not shown for purposes of simplicity and clarity) for outputting a message to the user. The output can be in the form of a display for displaying text, or one or more symbols and/or images, a speaker or earphone for outputting a voicemail or a voice message, or one or more LED's or lamps for indicating a message to the user. It is understood that the output or outputs are not limited to the examples provided herein and can comprise any suitable output or outputs adapted to provide a message to the user.
[00045] The monitoring system 1 shown in Figures 1 A and 1 B is used in certain embodiments for monitoring use by a user of the PUA 2 in accordance with at least one predetermined use criterion. The at least one predetermined use criterion comprises one or more of the following criteria: that the PUA 2 is being carried and/or used, that the PUA 2 is being carried and/or used by a specific user, that the PUA 2 is turned "on," that the PUA 2 is charged, that the PUA 2 maintains a minimum power capacity, that the PUA 2 is, or has been, docked at, or connected with, the docking station 15 for a predetermined length of time, at certain times or during a predetermined time period, that the PUA is functioning properly to provide a benefit to the user, and that the PUA 2 is capable of collecting, storing and/or communicating research data, or of cooperating with one or more other devices to do so. Other predetermined use criteria not mentioned above may also be employed in monitoring the PUA's use.
[00046] In certain embodiments, the method of monitoring use by a user of a research device such as PUA 2 in accordance with at least one predetermined use criterion comprises communicating a request message to the research device, requesting a response from the user of the PUA, receiving a response message communicated from the research device in response to the request message, and storing data indicating whether the use is in compliance with the at least one predetermined use criterion and/or a level of the user's compliance therewith. This monitoring method is illustrated in more detail in Figure 2A, which shows a block diagram of the actions performed by the monitoring systems shown in Figures 1 A-1 C.
[00047] As shown in Figure 2A, a request message is first communicated 100 to a PUA having a two-way communication capability with a remotely-located processor, such as processor 5 of Figures 1 A-1 C, requesting a response from a user of the PUA. In certain embodiments, the request message comprises a text message, a telephone call, a voice mail, an e-mail, a voice message, a sound, a plurality of sounds, a web page, an image, a light alert, or a combination thereof, or any other data presented to the user via the PUA which indicates to the user that a response is being requested. The request message is presented to the user using an appropriate output (for example, a sound reproducing device, such as a speaker or earphone) if the message is a telephone call, a voice mail, a voice message, a sound or a plurality of sounds; a visual display, if the message is a text message, an e-mail, a web page or another image; and/or one or more light emitting devices (for example, LED's or lamps) if the message is a light alert. In certain embodiments, the request message requests a pre-determined response from the PUA user, or a more general response such as a response that acknowledges receipt of the request message. In certain embodiments, the request is accompanied by data of interest to the user, such as access to certain web sites or content, such as music, video, news, or electronic coupons. In certain ones of such embodiments, access to such data is conditioned on providing the requested response according to parameters expressed in the request message or otherwise predetermined. In certain embodiments, the processor is implemented as one or more programmable processors running a communications management program module serving to control communications with the PUA and/or its user, along with other PUA's, to request a response including data from which compliance can be assessed. In certain ones of such embodiments, such communications are scheduled in advance by the program module with or without reference to a database storing schedule data representing a schedule of such communications, and carried out thereby automatically by means of communications 7. In certain ones of such embodiments, such communications are scheduled in advance and notified to human operators who initiate calls to the PUA's and/or the PUA's users according to the schedule, to solicit data from which compliance can be assessed. In certain ones of such embodiments, both automatic communications and human-initiated communications as described above are carried out.
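A minimal sketch of such a communications-management loop, dispatching scheduled request messages to each PUA, is shown below. The ScheduledRequest structure, the send_request() transport placeholder and the in-memory schedule are assumptions for illustration; the text does not prescribe any particular scheduling mechanism.

    # Illustrative sketch only: dispatch scheduled request messages to PUAs.
    import time
    from dataclasses import dataclass

    @dataclass
    class ScheduledRequest:
        pua_id: str
        send_at: float          # epoch seconds
        text: str

    def send_request(pua_id, text):
        # Placeholder for the real transport (SMS, e-mail, voice call, etc.).
        print(f"request to {pua_id}: {text}")

    def run_schedule(schedule):
        """Dispatch each request once its scheduled send time has passed."""
        for req in sorted(schedule, key=lambda r: r.send_at):
            delay = req.send_at - time.time()
            if delay > 0:
                time.sleep(delay)
            send_request(req.pua_id, req.text)

    run_schedule([
        ScheduledRequest("PUA-2", time.time(), "Please reply with your panelist code."),
        ScheduledRequest("PUA-7", time.time() + 1, "Are you carrying the device right now?"),
    ])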
[00048] In response to the request message, a response message is generated 102 in the PUA. In certain embodiments, the response message is generated by inputting the response message by an action of the user using the message input of the PUA. In particular, in certain embodiments in which the response message comprises a code, including letter characters, number characters or symbols, or a combination thereof, the response message is generated using the message input of the PUA. Alternatively, the response message comprises data stored in the PUA, in which case, the response message is generated by selecting the stored data using the message input. In other embodiments, the response message is a response signal generated by activating the message input, such as, for example, by switching one or more switches or by pressing one or more buttons of the message input. Where the response message comprises one or more audible sounds, the response message is generated by inputting the sounds using the message input. In such embodiments, the message input comprises an audio input device, such as an acoustic transducer. As mentioned above, the response message can be generated in response to a request for a pre-determined response, or in response to a request for a more general response.
[00049] After the response message is generated in the PUA, the response message is communicated from the PUA through communications thereof and is received 104 in the remotely-located processor, such as processor 5. In certain embodiments, such communications comprises cellular telephone communications, PCS communications, wireless networking communications, satellite communications, or a Bluetooth, ZigBee, electro-optical or other wireless link. In certain embodiments, such communications comprises an Ethernet interface, a telephone modem, a USB port, a Firewire connection, a cable modem, an audio or video connection, or other network or device interface. When the response message from the PUA is received, or a predetermined time period passes without receiving the response message, the processor provides data indicating whether the use of the PUA is in compliance with at least one predetermined criterion and/or the level of the user's compliance. The data provided by the processor is then stored 106 by the processor. In certain embodiments, the processor provides data indicating a user's compliance and/or the level of a user's compliance based on whether or not the response message from the PUA was received. In other embodiments, the processor provides compliance and/or level of compliance data based on the content of the response message, and/or the length of time passed before the response message from the PUA is received, and/or other factors discussed in more detail herein below. In certain embodiments the processor is implemented as one or more programmable processors running a compliance analysis program module which receives the data returned by the PUA and/or the user of the PUA to the communications management program module and serves to analyze the compliance of the user based on such data and in accordance with compliance rules stored in a storage, such as storage 6 of Figures 1A-1C. Based on such analysis, the compliance analysis program module produces compliance data indicating whether the user complied with the predetermined use criteria and/or a level of such compliance.

[00050] In certain embodiments, a reward may be provided to a user when the user's use of the PUA is in compliance with the predetermined use criteria or when the user's level of compliance is above a pre-selected compliance level. The reward may be in the form of cash, credit, a prize or a benefit, such as a free service or points usable to make purchases or receive prizes, either by means of the PUA or through a different means or service. In certain ones of such embodiments, the reward comprises data of interest to the user, such as access to certain web sites or content, such as music, video, news, or electronic coupons. As shown in Figure 2A, when data indicating compliance or a level of compliance above a pre-selected compliance level is produced and/or stored, a reward to the user is determined 108. The reward to the user, including the type of the reward and/or an amount or quality of the reward, is determined by the processor of the monitoring system based on the stored data indicating the user's compliance or the level of the user's compliance. Where the reward is determined based on the level of the user's compliance, in certain embodiments the reward is provided to the user if the user's level of compliance is higher than a predetermined level and/or the type and/or the amount of the reward determined in 108 is varied as the level of the user's compliance increases or decreases.
For instance, in certain embodiments the number of points awarded to the user, which may be used to purchase goods or services, is greater where the user responds to a larger percentage of request messages, or is increased as the number of request messages that the user responds to increases.
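By way of illustration only (not part of the original disclosure), the following minimal Python sketch shows one way the points-for-responses scheme described above could be computed; the function name, maximum point value and rounding rule are assumptions.

```python
# Illustrative only (not from the specification): points awarded in proportion
# to the share of request messages the user answered.
def award_points(requests_sent: int, responses_received: int,
                 max_points: int = 100) -> int:
    """Return a point award that grows with the user's response rate."""
    if requests_sent <= 0:
        return 0
    response_rate = min(responses_received, requests_sent) / requests_sent
    return round(max_points * response_rate)

# Answering 8 of 10 request messages earns 80 of a possible 100 points.
assert award_points(10, 8) == 80
```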
[00051] Providing rewards to PUA users for use of the PUA in compliance with the predetermined use criteria provides an incentive for the users to comply with the use requirements so as to earn a reward or to earn a higher reward. Therefore, providing a reward to the PUA user for the correct use of the PUA also promotes correct use of the PUA in the future in accordance with the predetermined usage criterion or criteria.
[00052] In certain embodiments, the monitoring system also communicates a message to the PUA user indicating compliance and/or the level of compliance with the predetermined use criteria for the PUA and/or the reward earned by the user 110. The message communicated to the user can be in the form of a text message, a telephone call, a voice mail, a voice message, an e-mail, an image or a combination thereof communicated via the PUA or otherwise. In some embodiments, the message can be in the form of a light indication, such as by lighting up an LED or lamp to indicate whether the use of the PUA is in compliance or whether a reward has been earned by the user. As shown in Figure 2A, the determination of the reward to the user 108 and the communication of the message to the user 110 are optional actions by the monitoring system in monitoring the user's use of the PUA. In some configurations, for example, the determination of the reward is omitted and the monitoring system proceeds to communicate the message to the user indicating the user's compliance and/or level of compliance. In other configurations, however, the monitoring system determines the reward to the user and automatically provides the reward to the user, such as by sending the reward directly to the user or applying the reward to the user's account, without communicating any messages to the user indicating the user's compliance, level of compliance or reward earned. In certain embodiments, where the monitoring system has determined that a user has failed to comply, it sends one or more messages to the user and/or to the user's PUA noting such failure, with or without further message content encouraging compliance in the future. In certain ones of such embodiments, the message noting failure to comply is sent in a plurality of different forms, such as both a text message and a voice call, which can be generated either automatically or by human intervention. In certain embodiments, the determination of a reward is made by one or more programmable processors running a reward determination program module that receives the compliance data produced by the compliance analysis program module and serves to produce reward data based on stored rules, such as rules stored in storage 6, specifying what rewards (including kind and amount), if any, to accord to the user for whom the compliance data was produced. Based on the reward data, the communications management program module communicates a reward notification to the PUA and/or its user, and/or communicates an order to a service (such as a supplier of goods or services, which can include content and other data) to provide the determined rewards to the user or credit an account of the user with such rewards.
[00053] In certain embodiments, the use of a research device is monitored by communicating a request message to the research device, the request message requesting a response from the user of the research device, receiving a response message communicated from the research device in response to the request message, and determining whether the use of the research device by the user is in compliance with the at least one predetermined use criterion. Figure 2B illustrates this embodiment of monitoring use of a research device, namely, a user's PUA, by the monitoring system. In certain other embodiments otherwise corresponding to the embodiment of Figure 2B, the user's PUA is replaced by a research device that does not comprise a PUA.
[00054] As shown in Figure 2B at 200, a request message is sent to a PUA from a monitoring system, a response message is generated 202 in the PUA in response to the request message and communicated thereby to the monitoring system, and the response message is received 204 by the monitoring system from the PUA (or its non-receipt is recorded). These actions performed by the monitoring system are similar to those, i.e. 100, 102 and 104, described above with respect to Figure 2A, and therefore a detailed description thereof is omitted for purposes of clarity and simplicity. As further shown in Figure 2B, when the response message is received from the PUA, the monitoring system determines 205 whether the user's use of the PUA complies with at least one predetermined use criterion. This determination 205 is performed by a processor of the monitoring system. As mentioned herein above, the predetermined criteria include, but are not limited to, the PUA being carried, the PUA being carried by a specific user, the PUA being turned "on," the PUA being charged, the PUA maintaining a minimum charge or power capacity, the PUA being docked at, or connected with, the docking station for a predetermined length and/or period of time, or at certain times, the PUA functioning properly and the PUA being capable of collecting, storing and/or communicating research data, or of cooperating with one or more other devices to do so.
[00055] In certain embodiments, the determination 205 whether the use of the PUA is in compliance with the predetermined criteria is based on at least one of the receipt or non-receipt 204 of the response message from the PUA, the time of receipt of the response message and the content of the response message. For example, when the determination 205 is based on the receipt or non-receipt of the response message from the PUA, the processor determines that the use of the PUA is not in compliance with the predetermined criteria if the response message is not received within a predetermined period of time from the sending of the request message to the PUA in 200. In certain ones of such embodiments, a request message requesting a response from the user (such as a text message or voice prompt) is sent to the PUA at regular intervals during the day, at intervals determined according to dayparts or according to a pseudorandom schedule, and the promptness of the user's response, if any, is used to determine an amount or quality of a reward to the user.
[00056] When the determination of compliance with predetermined use criteria is based on the time of receipt of the response message, the processor determines how much time had elapsed between the time of sending of the request message to the PUA and the time of receipt of the response message from the PUA and compares it to a selected compliant response time. The compliant response time in certain embodiments is a constant duration for all users, all PUA's, all types of request messages, all places and all times. In certain other embodiments, the compliant response time is selected based on user demographics or an individual profile. In certain embodiments, the compliant response time is based on the type of request message and/or its contents. In certain ones of such embodiments, the compliant response time is specified in the message, for example, "Please respond within ten minutes." In certain embodiments the compliant response time is selected based on the type of PUA that receives it, for example, a cellular telephone or Blackberry device for which a relatively short response time can be expected, as compared to a personal audio or DVD player, for which a longer response time may be appropriate. In certain embodiments, the compliant response time is selected depending on the manner in which the request message is to be presented to the user. For example, if receipt of the message is indicated to the user by an audible alert or device vibration, a shorter response time can be expected than in the case of a message presented only visually. In certain embodiments, the compliant response time is selected based on the time of day. For example, during morning or afternoon drive time, the response time may be lengthened since the user may not be able to respond as quickly as during the evening when the user is at home. In certain embodiments, the compliant response time is selected based on the user's location. For example, in certain places it may be customary to respond to messages more quickly than in others. In certain embodiments, the compliant response time is selected based on a combination of two or more of the foregoing factors.
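A hedged sketch of how a compliant response time might be selected from the factors listed above; the device types, base durations, daypart labels and doubling adjustments are illustrative assumptions, not values taken from the specification.

```python
from datetime import timedelta
from typing import Optional

# Hypothetical base times; none of these values come from the specification.
BASE_RESPONSE_TIME = {
    "cellular_phone": timedelta(minutes=10),  # a relatively short response time can be expected
    "media_player": timedelta(minutes=60),    # a longer response time may be appropriate
}

def compliant_response_time(device_type: str,
                            audible_alert: bool,
                            daypart: str,
                            explicit_limit: Optional[timedelta] = None) -> timedelta:
    """Select a compliant response time from device type, alert mode and daypart."""
    if explicit_limit is not None:            # e.g. "Please respond within ten minutes."
        return explicit_limit
    limit = BASE_RESPONSE_TIME.get(device_type, timedelta(minutes=30))
    if not audible_alert:                     # visually presented messages allow more time
        limit *= 2
    if daypart in ("morning_drive", "afternoon_drive"):
        limit *= 2                            # the user may not be able to respond as quickly
    return limit
```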
[00057] If the time elapsed between the sending of the request message and the receipt of the response is less than the selected response time, it is determined that the user's use of the PUA is in compliance with the pre-determined criteria. However, if the elapsed time is greater than the selected response time, it is determined that the use of the PUA is not in compliance with the predetermined criteria. In certain embodiments, the amount of time elapsed between the sending 200 of the request message and the receiving 204 of the response message is used to determine a level of the user's compliance with the predetermined use criteria. In particular, the level of compliance determined by the processor will depend on how quickly the response message is received by the processor, such that the level of compliance is higher the less time elapses between the sending 200 of the request message and the receipt 204 of the response message.
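A minimal sketch of the timeliness test described above, assuming times are expressed in seconds; the graded level that rises as the response arrives sooner is one possible mapping, not a formula prescribed by the specification.

```python
from typing import Tuple

def timeliness_compliance(elapsed_seconds: float,
                          selected_response_seconds: float) -> Tuple[bool, float]:
    """Return (compliant, level); the level is higher the sooner the response arrives."""
    compliant = elapsed_seconds <= selected_response_seconds
    level = 1.0 - elapsed_seconds / selected_response_seconds if compliant else 0.0
    return compliant, level

# A response after 5 minutes against a 20-minute limit is compliant at level 0.75;
# a response after 25 minutes is not compliant.
assert timeliness_compliance(300, 1200) == (True, 0.75)
assert timeliness_compliance(1500, 1200) == (False, 0.0)
```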
[00058] When the determination whether the user's PUA use is in compliance with one or more predetermined criteria is based on the content of the response message, the processor determines whether the content of the response message complies with predetermined parameters. In such embodiments, a selected response message, complying with predetermined parameters, is requested 200 by the request message communicated to the PUA, and in determining compliance and/or the level of compliance, the processor compares the response message received 204 from the PUA with the requested response. In one illustrative embodiment, the request message communicated 200 to the PUA comprises a request for the user's password or for a particular code, such as a user's screen name or real name, and the response message received 204 in response to the request message is compared by the processor to pre-stored data, such as a password, code, screen name or real name stored in a database, to determine 205 whether the use of the PUA is in compliance with the predetermined criteria. If the received response message matches the stored message, i.e. a password, a name (such as a screen name selected by the user or the user's real name) or a code, stored in the database, then the processor determines that the user is in compliance with the predetermined criteria. By requesting a selected response message, such as a password, name or code, the monitoring system is capable not only of confirming that the PUA is being carried and/or used, but also of confirming that the PUA is being carried and/or used by a specific user.
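The password or code comparison described above could be sketched as follows; the stored-response table, the PUA identifier and the normalisation step are hypothetical, not part of the disclosure.

```python
import hmac

# Hypothetical table mapping a PUA identifier to the expected password, screen
# name or code; the identifier and value are made up for illustration.
EXPECTED_RESPONSE = {"pua-0001": "blue falcon 42"}

def content_compliance(pua_id: str, response: str) -> bool:
    """Compliant only if the response matches the pre-stored password, name or code."""
    expected = EXPECTED_RESPONSE.get(pua_id)
    if expected is None:
        return False
    # Trivial normalisation, then a constant-time comparison of the two strings.
    return hmac.compare_digest(expected.strip().lower().encode(),
                               response.strip().lower().encode())
```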
[00059] In certain embodiments, in addition to or instead of other requested information, the requested response comprises information from the user, such as what the user is doing when the message is received or at other times, the user's location or locations at various times, media or products to which the user has been exposed, has purchased or used, or plans to purchase or use, the user's beliefs and/or the user's opinions. In certain embodiments, in addition to or instead of other requested information, the requested response comprises information concerning an operational state of the PUA (for example, as indicated thereby or as determined by the user), whether and/or when the user performed some action (such as docking or recharging the PUA), and/or whether and/or how the user is carrying the PUA.
[00060] In certain embodiments, the processor determines 205 the level of the PUA user's compliance based on the content of the message. In this illustrative embodiment, the processor compares the response message received 204 with stored data, such as a password, name or code stored in the database, and determines the level of compliance based on how closely the response message matches the stored data. In certain ones of such embodiments, a first, or highest, level of compliance is determined if the response message matches the stored message, a second level of compliance, which is lower than the first level, is determined if the response message does not match the stored message, and a third, or lowest, level of compliance is determined if no response message is received 204 from the PUA. In some embodiments, a plurality of different intermediate levels of compliance may be determined instead of the second level of compliance, if a response message is received but does not match the stored message. In such embodiments, the level determined is based on the extent of similarity between the response message and the pre-stored data. Thus, for example, the intermediate level of compliance will be higher in a case where the response message received 204 from the PUA differs from the stored message by only one character than in a case where the response message received from the PUA is completely different from the stored message.
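One possible way to grade intermediate compliance levels by similarity between the received response and the pre-stored data, here using Python's difflib ratio as the similarity measure; the 0.1-0.9 band for intermediate levels is an assumption, not the specification's scale.

```python
from difflib import SequenceMatcher
from typing import Optional

def graded_compliance(response: Optional[str], stored: str) -> float:
    """Map a response onto levels: 1.0 for an exact match, 0.0 for no response,
    and an intermediate level that grows with similarity to the stored data."""
    if response is None:
        return 0.0                        # lowest level: no response received
    if response == stored:
        return 1.0                        # highest level: exact match
    similarity = SequenceMatcher(None, response, stored).ratio()
    return 0.1 + 0.8 * similarity         # intermediate levels between 0.1 and 0.9

# A response off by one character ranks above a completely different response.
assert graded_compliance("blue falcon 43", "blue falcon 42") > graded_compliance("xyz", "blue falcon 42")
```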
[00061] In certain embodiments, the user's compliance and/or level of compliance is determined not only based on the content of the response message but also on the time of receipt of the response message. In certain ones of such embodiments, the user's compliance will depend on whether the response message matches the stored data, as well as on how quickly the response message is received from the PUA. In certain ones of such embodiments, the highest level of compliance is determined if the response message received from the PUA matches the stored data, and if the time elapsed between the sending of the request message to the PUA and the receipt of the response message is less than a selected time. If the response message does not match the stored data and/or the time elapsed between the sending of the request message and the receipt of the response message is greater than the selected time, then the level of compliance determined 205 is selected at a level intermediate between the highest and lowest levels of compliance. If no response message is received from the PUA, then the lowest level of compliance, or non-compliance, is determined by the monitoring system.
[00062] In some embodiments, the monitoring system also determines and/or provides 206 a reward to the user for complying with predetermined criteria and/or sends 208 a message to the user indicating at least one of the user's compliance, the level of compliance and the reward to the user. In particular, after the monitoring system determines whether the PUA use complies with the predetermined use criteria and/or the level of the user's compliance, the monitoring system proceeds to determine and/or provide 206 a reward to the user of the PUA. The system then communicates 208 a message to the user indicating the user's compliance, level of compliance and/or the reward earned by the user. These actions performed by the monitoring system are similar to those (108 and 110) described above with respect to Figure 2A, and thus a detailed description thereof is omitted. As in the embodiments described with respect to Figure 2A, the determination and/or provision 206 of the reward and the communication 208 of the message indicating compliance, level of compliance and/or the reward are optional. Moreover, as in the embodiments described with respect to Figure 2A, in certain embodiments, the determination and/or provision of the reward is performed without communicating the message to the user, while in other embodiments, the communication 208 of the message is performed without determining and/or providing 206 the reward.
[00063] In certain embodiments of monitoring methods and systems, the monitoring system monitors one or more parameters, such as biometric parameters, sounds external to a research device, an impact of the research device with another object, motion of the research device, proximity of the research device to the person of a user, proximity of the research device to a presence indicator or personal identification device in or on the person of a user, pressure applied to the research device, recharging of the research device, its power capacity, docking of the research device, data input (e.g., messages) to the research device, location of the research device and/or changes in the research device's location, to determine whether the use of the research device is in compliance with at least one predetermined criterion. In one illustrative embodiment, the monitoring system produces monitored data by monitoring at least one of a user's heart activity, a user's brain activity, a user's breathing activity, a user's pulse, a user's blood oxygenation, a user's borborygmus (gastrointestinal noise), a user's gait, a user's voice, a user's key, keypad or keyboard usage characteristics (e.g., keystroke recognition), a user's vascular pattern, a user's facial or ear patterns, a user's signature, a user's fingerprint, a user's handprint or hand geometry, a user's retinal or iris patterns, a user's airborne biochemical indicators (sometimes referred to as a user's "smellprint"), a user's muscular activity, a user's body temperature, sounds external to the research device, motion of the research device, pressure applied to the research device, recharging of the research device, docking of the research device, its power capacity, an impact of the research device with another object, data input to the research device by a user, location of the research device and a change in a location of the research device, and determines whether use of the research device by the user is in accordance with at least one predetermined criterion based on the monitored data.
[00064] Referring again to Figure 1B, the monitoring of the biometric parameters 222, external sounds, presence indication signal, personal identification signal 224, PUA location, PUA location changes 226, data input 228 and/or impact of the PUA with another object, pressure applied to the PUA, motion of the PUA, recharging, power capacity, docking 230 is performed in the PUA 2 by the sensor/detector 13 in cooperation with a processor of the PUA (not shown for purposes of simplicity and clarity). As mentioned above, the sensor/detector 13 in certain embodiments includes a plurality of sensors and/or detectors which monitor a plurality of parameters. In the embodiments in which the sensor/detector 13 monitors one or more biometric parameters of the PUA user 222, the sensor/detector 13 comprises one or more of a heart monitor for monitoring heart activity of the user, an EEG monitor for monitoring the user's brain activity, a breathing monitor for monitoring the user's breathing activity including, but not limited to, the user's breathing rate, a pulse rate monitor, a pulse oximeter, a sound detector for monitoring the user's borborygmus and/or the user's voice, a gait sensor and/or a gait analyzer for detecting data representing the user's gait, such as a motion sensor or accelerometer (which may also be used to monitor muscle activity), a video camera for use in detecting motion based on changes to its output image signal over time, a temperature sensor for monitoring the user's temperature, an electrode or electrodes for picking up EKG and/or EEG signals, and a fingerprint or handprint scanner for detecting the user's fingerprint or handprint. Where the user's retinal or iris patterns are monitored, sensor/detector 13 comprises a low-intensity light source, for scanning, detecting or otherwise sensing the retinal or iris patterns of the user. Where the user's hand geometry is detected, sensor/detector 13 comprises a device configured with an optical sensor or other imaging device to capture predetermined parameters of the user's hand, such as hand shape, finger length, finger thickness, finger curvature and/or any portion thereof. Where the user's smellprint is detected, sensor/detector 13 comprises an electronic sensor, a chemical sensor, and/or an electronic or chemical sensor configured as an array of chemical sensors, wherein each chemical sensor may detect a specific odorant or other biochemical indicator. Where a vascular pattern of the user is detected, sensor/detector 13 comprises an optical or other radiant energy scanning or imaging device for detecting a vascular pattern or other tissue structure, or blood flow or pressure characteristic of the user's hand or other body part. Where the user's facial or ear patterns are detected, the sensor/detector 13 comprises a video camera, optical scanner or other device sufficient to recognize one or more facial features or one or more features of the user's ear or other body part. In certain ones of these embodiments, the sensor/detector 13 is mounted in or on the PUA 2, while in others the sensor/detector 13 is arranged separately from the PUA 2 and communicates therewith via a cable or via an RF, inductive, acoustic, infrared or other wireless link.
[00065] In the embodiments in which the sensor/detector 13 of the PUA 2 monitors sounds external to the PUA 224, the sensor/detector 13 comprises an acoustic sensor such as a microphone or any other suitable sound detector for detecting external sounds. In certain embodiments, the sensor/detector 13, which monitors external sounds, cooperates with the processor for analyzing the detected external sounds. The external sounds detected by the sensor/detector 13 include, but are not limited to, environmental noise, rubbing of the PUA 2 against the user's clothing or other external objects, vehicle sounds (such as engine noise and sounds characteristic of opening and closing car doors), the user's voice print, dropping of the PUA, average ambient noise level, and the like. In the embodiments in which sensor/detector 13 receives a presence indication signal or personal identification signal from signal emitter 14, sensor/detector 13 comprises a device operative to receive the signal, such as an RF receiver, a microphone, an optical sensor, an inductive pickup, a capacitive pickup, a chemical sensor or a conductive connection.
[00066] In certain ones of the embodiments in which the sensor/detector 13 monitors the user's data input 228 (e.g., messages or inputs to control a diverse operation of the PUA, such as to make use of an application running thereon, like a game), the sensor/detector 13 comprises a pressure sensor for sensing pressure applied to the message input by the user. Alternatively or in addition, the sensor/detector 13 comprises a utility, such as a key logger, running on the processor of the PUA to determine and record its usage.
[00067] In the embodiments in which location change is being monitored 226, the sensor/detector 13 directly or indirectly detects the change in the PUA's location. Direct detection of the PUA's location is accomplished by detecting the location of the PUA and the change in the PUA's location over time. In this case, the sensor/detector 13 comprises a satellite location system, such as a GPS receiver, an ultra wideband location detector, a cellular telephone location detector, an angle of arrival location detector, a time difference of arrival location detector, an enhanced signal strength location detector, a location fingerprinting location detector, an inertial location monitor, a short range location signal receiver or any other suitable location detector. The same means can also be employed to determine the PUA's location. Indirect detection of the PUA's location change is accomplished by detecting a predetermined parameter which is directly or indirectly related to the location of the PUA and determining from variations in the predetermined parameter whether a change in the location of the PUA has occurred. One such predetermined parameter detected by the sensor/detector 13 can be variations in the strength of an RF signal received by the PUA, and in such case, the sensor/detector 13 comprises an RF signal receiver. Where location change data is available, such data is used in certain embodiments to determine whether and when the PUA was or is being carried.
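A sketch of indirect location-change detection from variations in received RF signal strength, as described above; the standard-deviation test and the 4 dB threshold are illustrative assumptions and would depend on the radio environment and the PUA's receiver.

```python
from statistics import pstdev
from typing import Sequence

# Threshold in dB is an illustrative assumption.
RSSI_VARIATION_THRESHOLD_DB = 4.0

def location_changed(rssi_samples_dbm: Sequence[float]) -> bool:
    """Infer a change of location indirectly from variation in received signal strength."""
    if len(rssi_samples_dbm) < 2:
        return False
    return pstdev(rssi_samples_dbm) > RSSI_VARIATION_THRESHOLD_DB

# A stationary PUA shows little variation; a carried one drifts across many dB.
assert not location_changed([-61.0, -60.5, -61.2, -60.8])
assert location_changed([-60.0, -72.0, -55.0, -80.0])
```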
[00068] In embodiments in which the sensor/detector 13 monitors the impact of the PUA 2 with another object 230, the sensor/detector 13 comprises an impact detector for measuring pre-determined levels of impact of the PUA 2 with other objects. In certain embodiments, the sensor/detector 13 comprises an accelerometer for detecting a relatively large acceleration upon impact of the PUA 2 with another object.
[00069] In embodiments where pressure applied to the PUA is monitored, a pressure sensor is placed on an enclosure of the PUA or mechanically coupled therewith to receive force applied to such enclosure. In certain ones of such embodiments, the magnitude of the pressure as it varies over time and/or with location on the enclosure is analyzed to determine if the PUA is being or was carried and/or the manner in which it was used and/or the event of non-use.
[00070] In certain embodiments where motion of the PUA is monitored, a video camera of the PUA is used as a motion sensor. In certain ones of such embodiments, changes in the image data provided at the output of the video camera (either the entire image or one or more portions thereof) are processed to determine movement or an extent of movement of the image over time to detect that the PUA is being moved about, either by translation or rotation. Techniques for producing motion vectors indicating motion of an image or an extent of such motion are well known in the art, and are used in certain embodiments herein to evaluate whether the PUA is moving and/or the extent of such movement. In certain ones of such embodiments, changes in the light intensity or color composition of the image data output by the video camera (either the entire image or one or more portions thereof) over time are used to detect motion of the PUA. In certain embodiments where motion of the PUA is monitored, a light sensitive device, such as a light sensitive diode of the PUA, is used as a motion sensor. Changes in the output of the light sensitive device over time that characterize movement serve to indicate that the PUA is being carried.
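A minimal frame-differencing sketch of the camera-based motion detection described above, assuming frames are available as equal-length sequences of grayscale pixel values; the per-pixel delta and changed-fraction thresholds are illustrative assumptions.

```python
from typing import Sequence

PIXEL_DELTA = 16         # minimum per-pixel intensity change (0-255 grayscale); assumed
CHANGED_FRACTION = 0.05  # fraction of pixels that must change to report motion; assumed

def frame_motion(prev_frame: Sequence[int], curr_frame: Sequence[int]) -> bool:
    """Very simple frame differencing over two equal-length grayscale pixel sequences."""
    if not prev_frame or len(prev_frame) != len(curr_frame):
        return False
    changed = sum(1 for a, b in zip(prev_frame, curr_frame) if abs(a - b) >= PIXEL_DELTA)
    return changed / len(prev_frame) >= CHANGED_FRACTION
```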
[00071] In certain embodiments, the one or more parameters also include power remaining in the PUA, recharging of the PUA and/or the event of docking of the PUA by coupling the PUA with the docking station, for example, as illustrated in Figure 1C. In such embodiments, the monitoring system produces monitored data by monitoring the power remaining in the PUA and/or by monitoring the docking of the PUA at the docking station. In the embodiments in which the docking of the PUA is monitored, the monitoring system monitors the length of time the PUA was coupled with the docking station, the time period during which the PUA was coupled with the docking station, a time at which the PUA is docked, a time at which the PUA was undocked, whether or not the PUA is coupled with the docking station and/or the length of time passed since the PUA was last docked at the docking station.
[00072] The monitoring of one or more parameters 222-230 by the monitoring system, as described above, produces monitored data which indicates at least whether or not the PUA was being carried and/or used in one or more of various ways. For example, if monitoring includes monitoring one or more biometric parameters of the user, then the monitored data indicates at least whether or not the biometric parameters being monitored have been detected. Similarly, in the case of monitoring PUA location changes, external sounds, data input, pressure, motion, light changes and/or impact of the PUA with other objects, the monitored data includes data indicating at least whether or not any of these parameters have been detected. Monitored data that indicates that one or more of these parameters have been detected in the PUA, in turn, indicates that the PUA was being carried and/or used, while monitored data indicating a lack of any detection of one or more of the monitored parameters indicates that the PUA was not being carried or used.
[00073] In certain embodiments, the monitored data produced indicates at least whether or not the PUA was charged and/or whether or not the PUA was docked at the docking station according to a predetermined time parameter. In the case of monitoring the power charge in the PUA, the monitored data includes data indicating at least whether or not the PUA was charged, and in certain embodiments, the monitored data indicates whether the power capacity remaining in the PUA was greater than a predetermined minimum. Where monitoring includes monitoring of the docking of the PUA at the docking station, the monitored data indicates at least whether or not the PUA was docked at the docking station at any time, and in some embodiments, the monitoring data indicates one or more of whether or not the PUA was docked at the docking station for a predetermined length of time, how frequently the PUA was docked, when the PUA was docked, when the PUA was undocked and/or the time periods during which the PUA was docked. The monitored data produced in these embodiments can be used to determine whether the use of the PUA was in compliance with the criteria for recharging of the PUA and/or docking of the PUA.
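An illustrative check of the recharging and docking criteria described above, assuming dock/undock timestamps and a remaining-charge fraction are available; the 30-minute and 20% thresholds are assumptions, not values from the disclosure.

```python
from datetime import datetime, timedelta
from typing import Sequence, Tuple

# Illustrative criteria: at least 30 minutes docked and at least 20% charge remaining.
REQUIRED_DOCK_TIME = timedelta(minutes=30)
MIN_CHARGE_FRACTION = 0.20

def docking_compliant(dock_intervals: Sequence[Tuple[datetime, datetime]],
                      charge_fraction: float) -> bool:
    """Check total docked time and remaining power against the assumed criteria."""
    total_docked = sum((undock - dock for dock, undock in dock_intervals), timedelta())
    return total_docked >= REQUIRED_DOCK_TIME and charge_fraction >= MIN_CHARGE_FRACTION
```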
[00074] In certain embodiments, monitored data comprises data which can be used to confirm the identity of the PUA user. For example, if one or more biometric parameters of the user are monitored by the sensor/detector, the monitored data includes data indicating or relating to one or more of the user's heart rate or other heart activity or parameter, EEG, blood oxygenation, breathing rate or other breathing activity or parameter, borborygmus, gait, voice, voice analysis, key, keypad or keyboard usage characteristics, fingerprints, handprints, hand geometry, pulse, retinal or iris patterns, olfactory characteristics or other biochemical indicators, patterns of muscular activity, vascular patterns, facial or ear patterns, signature, and/or body temperature detected once or a plurality of times over a predetermined period of time. In another example, if the PUA location change is being monitored, then monitored data can include data relating to the specific locations or changes in location of the PUA and/or relating to the specific RF signal strengths of the PUA detected one or a plurality of times over a predetermined period of time.
[00075] In certain embodiments the sensor/detector 13 of the PUA 2 comprises a digital writing tablet that is used to input a digital handwritten signature from the user to assess who is using the PUA. In accordance with known handwriting identification techniques, a storage of the PUA stores signature recognition software to control a processor of the PUA to compare the current user's signature input by means of the digital writing tablet against stored templates of one or more users' handwritten signatures to determine if there is a match. (The storage and the processor are not shown for purposes of simplicity and clarity.) Based on the results of the matching process, data is produced indicating whether the current user's signature matches any of the stored templates to assess the identity of the current user of the PUA. The templates of the users' signatures are produced in a training mode of the signature recognition software, in which each potential user inputs one or more signatures using the digital writing tablet from which a corresponding template is produced by the PUA's processor and then stored in its storage. In certain ones of such embodiments, the PUA includes a digital writing tablet to enable a user-beneficial function, such as note taking and it is then unnecessary to provide a dedicated digital writing tablet.
[00076] In certain embodiments, the sensor/detector 13 comprises a microphone and a voiceprint recognition technique is used to assess the identity of the user of the PUA 2. In accordance with known voiceprint recognition techniques, the PUA's storage stores voice recognition software to control its processor to compare the current user's voice input by means of the microphone against stored voiceprints of one or more possible users to determine if there is a match. Based on the results of the matching process, data is produced indicating whether the current user's voice matches the voice represented by any of the stored voiceprints to assess the identity of the current user of the PUA. The voiceprints of one or more potential users are produced in a training mode of the voice recognition software, in which each potential user speaks into the microphone of the PUA to produce data from which the voiceprint is produced by its processor and then stored in its storage. Various ones of such embodiments extract the user's voiceprint under different conditions. In one such embodiment, the user's voiceprint is extracted when the user places a voice call using the PUA as a cellular telephone in response to a request message from a monitoring system. In other such embodiments, the PUA's processor extracts voiceprints continuously from the output of its microphone, or at predetermined times or intervals, or when a telephone call is made using the PUA as a cellular telephone or when the output from the PUA's microphone indicates that someone may be speaking into it (indicated, for example by the magnitude of the output, and/or its time and/or frequency characteristics). The extracted voiceprints are compared to the stored voiceprint to assess the identity of the person using the PUA.
[00077] In certain embodiments, the sensor/detector 13 comprises an imaging device, such as a video camera, or other radiant energy detector, such as a line scanner implemented by means of a CCD or an array of photodiodes, that is used to input data representing an image or line scan of a physical feature of the user, such as an iris, a retina, an image of all or a portion of the user's face, finger, palm, hand or ear to assess the identity of the user of the PUA 2. In the case of an iris or retinal image, the input data is processed to extract an iris or retinal pattern code. A facial image is processed to extract data unique to the user such as a signature or feature set representing facial bone structure. An image of a finger, palm or hand is processed to extract a fingerprint or palm print, or other characteristic data such as hand geometry or tissue vascular structure. In accordance with known pattern recognition techniques, the PUA's storage stores pattern recognition software to control its processor to compare the current user's iris or retinal pattern code, facial signature or feature set or other characteristic data input by means of the imaging device against one or more stored pattern codes, signatures, feature sets or other characteristic data of one or more potential users, as the case may be, to determine if there is a match. Such characteristic data may be stored in storage 50 or in a storage of a separate device, system or processing facility. Based on the results of the matching process, data is produced by the PUA's processor operating under control of the pattern recognition software to assess the identity of the current user of the PUA. The pattern code, signature, feature set or other characteristic data of each potential user is produced in a training mode of the pattern recognition software, in which the appropriate physical feature of the potential user is imaged or scanned one or more times using the imaging device from which the desired data is produced by the PUA's processor and then stored in its storage. In certain embodiments the physical feature concerned is scanned or imaged at a plurality of different orientations to produce the desired data. In certain ones of the foregoing embodiments, the PUA (such as a cellular telephone) includes a digital camera to enable a user-beneficial function, such as digital photography or video imaging, and it is then unnecessary to provide a dedicated imaging device or scanner.
[00078] In certain embodiments, a keyboard dynamics technique is used to assess the identity of the user. In accordance with known keyboard dynamics techniques, the PUA's storage stores keystroke monitoring software to control its processor to collect characteristic keystroke parameters, such as data indicating how long the user holds down the keys 11a of PUA 2, the delay between one keystroke and the next (known as "latency"), and the frequency of use of special keys, such as a delete key. Still other parameters, such as typing speed and the manner in which the user employs key combinations (such as keyboard shortcuts), may be monitored by the processor. These parameters are processed in a known manner to produce a feature set characterizing the user's key usage style which is then compared against stored feature sets representing the styles of one or more potential users. Based on the results of this comparison, data is produced indicating whether the current user's key usage style matches that of one of the potential users as represented by a matching stored feature set to assess the identity of the current user of the PUA. The feature sets representing the usage styles of the potential users are produced in a training mode of the software, in which each potential user makes use of the key or keys of the PUA to produce data from which the feature set is produced by the PUA's processor and then stored in its storage.
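A sketch of a keyboard-dynamics feature set of the kind described above (mean dwell time and mean inter-keystroke latency) and a simple distance for matching against stored templates; the field names and the distance measure are assumptions, not the patent's method.

```python
from statistics import mean
from typing import Dict, List, Tuple

def keystroke_features(events: List[Tuple[float, float]]) -> Dict[str, float]:
    """Build a small feature set from (key_down_time, key_up_time) pairs in seconds:
    mean dwell time (how long keys are held) and mean latency between keystrokes."""
    if not events:
        return {"dwell": 0.0, "latency": 0.0}
    dwells = [up - down for down, up in events]
    latencies = [events[i + 1][0] - events[i][0] for i in range(len(events) - 1)]
    return {"dwell": mean(dwells), "latency": mean(latencies) if latencies else 0.0}

def style_distance(current: Dict[str, float], template: Dict[str, float]) -> float:
    """Compare a current feature set against a stored template; smaller is closer."""
    return sum(abs(current[k] - template[k]) for k in current)

# The current user is matched to the template whose distance falls below a chosen threshold.
```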
[00079] In certain embodiments, the sensor/detector 13 comprises a motion sensitive device, such as an accelerometer, that produces data related to motion of the PUA 2. This data is used to produce a feature set characterizing motion of the PUA, and thus the gait of a person carrying it. In accordance with known gait identification techniques, the PUA's storage stores pattern recognition software to control its processor to compare the current user's gait feature set against one or more stored reference feature sets representing the individual gaits of potential users to determine if there is a match. Based on the results of the matching process, data is produced indicating whether the current user's gait matches that represented by a stored feature set to assess the identity of the current user of the PUA. The various feature sets each representing the gait of a potential user are produced in a training mode of the pattern recognition software, in which each potential user walks about carrying the PUA while the motion sensitive device thereof produces data from which its processor produces a respective reference feature set which it stores in the PUA's storage. In certain ones of such embodiments, the PUA includes an accelerometer as an input device to enable a user-beneficial function, such as a gaming input or scrolling command input, and it is then unnecessary to provide a dedicated accelerometer as the motion sensitive device.
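A rough sketch of gait matching from accelerometer data in the spirit of the paragraph above; the features (average magnitude, spread, a mean-crossing step-rate proxy) and the nearest-template rule are illustrative assumptions rather than the patent's prescribed technique.

```python
from statistics import mean, pstdev
from typing import Dict, Sequence

def gait_features(accel_magnitudes: Sequence[float]) -> Dict[str, float]:
    """Crude gait feature set from accelerometer magnitude samples: average level,
    spread, and a step-rate proxy counted from mean crossings."""
    if len(accel_magnitudes) < 2:
        return {"avg": 0.0, "spread": 0.0, "step_rate": 0.0}
    avg = mean(accel_magnitudes)
    spread = pstdev(accel_magnitudes)
    crossings = sum(
        1 for a, b in zip(accel_magnitudes, accel_magnitudes[1:])
        if (a - avg) * (b - avg) < 0
    )
    return {"avg": avg, "spread": spread, "step_rate": crossings / len(accel_magnitudes)}

def best_match(current: Dict[str, float],
               templates: Dict[str, Dict[str, float]]) -> str:
    """Return the identifier of the stored reference feature set closest to the current gait."""
    return min(templates,
               key=lambda uid: sum(abs(current[k] - templates[uid][k]) for k in current))
```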
[00080] In certain ones of such embodiments, multiple devices and pattern recognition techniques are employed to produce a more accurate and reliable identification of the user than is possible using only one such pattern recognition technique. In certain embodiments, one or more of such pattern recognition techniques or other passive data gathering technique is employed to assess the user's identity. Such detection may be based on an amount by which a monitored feature set differs from a stored feature set representing a characteristic of each potential user as determined by the PUA's processor. When the processor produces data indicating an identification of the user, in certain embodiments either the processor controls a speaker, earphone or visual display of the PUA to present a message to the user requesting a response from which the user's identity may be positively determined, or the processor sends a message to a monitoring system (not shown for purposes of simplicity and clarity) indicating that such a message should be presented to the user. In the latter case, the monitoring system responds to such message from the processor to send a message to the PUA for presentation to the user to request an appropriate response from the user from which the user's identity may be determined, either by the processor or by the monitoring system. The user's response to such message is used to determine the user's identity.
[00081] In certain embodiments, data concerning usage of a PUA to perform a user-beneficial function is gathered by the monitoring system. In particular, the gathering of data concerning such usage of the PUA comprises monitoring usage of the PUA to produce usage data within the PUA, and communicating the usage data from the PUA to a usage data processing facility. This embodiment is illustratively shown in Figure 3. This is especially useful for gathering marketing data concerning how users employ PUA's with an ability to communicate, such as cellular telephones, PDA's, notebook and laptop computers, Blackberry devices, PCS devices, two-way radios, as well as other kinds of PUA's having device-to-device communicating ability or wireless networking ability.
[00082] As shown in Figure 3, the monitoring system monitors the user's use of the PUA 280 and produces usage data within the PUA 282 based on such monitoring. If the monitoring system shown in Figure 1B is employed, certain monitoring of PUA usage is performed by the sensor/detector 13, which detects the use of one or more functions performed by the PUA. For example, if the PUA includes a function of generating and communicating a text message to another PUA, the sensor/detector 13 in the PUA 2 detects when the user generates and/or communicates a text message, and usage data relating to the generation and communication of the text message is produced in the PUA 2. In certain ones of these embodiments, the operations of sensor/detector 13 are implemented by a processor of the PUA that may carry out additional operations beyond those of sensor/detector 13.

[00083] The usage data produced in the PUA 2 includes at least data relating to content generated by the performance of the PUA function. In certain embodiments, the usage data also comprises one or more of data indicating the type of PUA function used, data indicating the time of use of the PUA function, data indicating the length of time of the use of the PUA function, and data relating to the use of communications, if any, to send or receive messages with the use of the PUA. Data relating to the use of communications by the PUA includes data relating to the time a message is communicated, the size of the message and/or the destination of the message, such as the recipient's telephone number, email address and/or IP address. Data relating to the content generated by the use of the PUA function includes data relating to the subject of the generated content and/or data relating to words, phrases, names or concepts included in the content, such as "buzz words". Buzz words comprise words, terms or phrases that advertisers and other businesses would find of value as descriptive of consumers' experiences and reactions to media and advertising content. Some examples include word pair choices such as "boring" vs. "exciting"; "essential" vs. "unnecessary." Further examples include words and phrases that convey a rank-order (ordinal) scale such as "superior quality" vs. "good quality" vs. "acceptable" vs. "poor quality" vs. "unacceptable," "not interested at all" vs. "slightly interested" vs. "might consider purchasing" vs. "interested in purchasing" vs. "plan to purchase" vs. "will definitely purchase."
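A small sketch of how generated content could be scanned for pre-selected "buzz words" to produce usage data; the word list and the whole-word regular-expression match are illustrative assumptions, and a real deployment would load the terms of interest from the monitoring system.

```python
import re
from collections import Counter
from typing import Dict

# Illustrative buzz-word list; terms of interest would normally be supplied
# by the monitoring system (brand names, rating phrases, and so on).
BUZZ_WORDS = {"boring", "exciting", "essential", "unnecessary", "superior quality"}

def buzz_word_counts(message_text: str) -> Dict[str, int]:
    """Count occurrences of pre-selected words and phrases in generated content."""
    text = message_text.lower()
    counts = Counter()
    for term in BUZZ_WORDS:
        counts[term] = len(re.findall(r"\b" + re.escape(term) + r"\b", text))
    return {term: n for term, n in counts.items() if n}

# Example usage against a text message drafted on the PUA:
# buzz_word_counts("That trailer was exciting, the sequel looks essential")
#   -> {"exciting": 1, "essential": 1}
```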
[00084] The usage data produced in the PUA is thereafter communicated 284 to a usage data processing facility. The usage data processing facility includes a processor, such as the processor 5 shown in Figure 1B. The processing facility is adapted to receive and process usage data to generate trend data relating to a variety of trends. The trend data generated by the processing facility includes, but is not limited to, data relating to the time, frequency and/or manner of usage of the PUA function, the preference of one PUA function over others, the use of a particular "buzz word," name, brand and/or concept by users, the communications to a particular area code, IP address and/or email service, and other trends relating to the usage of the PUA.

[00085] In certain embodiments, the PUA includes communications for communicating with at least another PUA, and the methods and systems for monitoring use of a PUA comprise detecting the communicating of a message by the communications of the PUA, providing monitored data relating to content of the message, and providing trend data representing at least one trend of usage of the PUA by the user based on the monitored data. These embodiments are illustrated in Figure 4, which shows a flow diagram of actions performed by the monitoring system.
[00086] In this embodiment, the PUA is adapted to communicate with other PUA's using a communication interface. As shown in Figure 1B, the PUA 2 includes communications in the form of an interface 9 which can communicate using the communications 7. In this case, each of the other PUA's also includes a corresponding interface which is coupled with the communications 7, such that each such PUA can communicate with other PUA's via the communications 7.
[00087] Referring now to Figure 4, when the interface of the PUA communicates a message with another PUA or with any other device, the communicating of the message is detected 290 in the PUA. If the PUA 2 shown in Figure 1B is employed, the sensor/detector 13 is used to detect the communicating of the message by the PUA 2. In certain ones of such embodiments, the operation of sensor/detector 13 is provided by a processor that may carry out operations in addition to those of sensor/detector 13. In certain embodiments, the communicating by the PUA is detected by detecting a connection between the interface of the PUA and another PUA or device. In other embodiments, the communicating by the PUA is detected by detecting data sent from or received by the interface.
[00088] When the communicating of the message by the PUA is detected, monitored data relating thereto is gathered 292 comprising at least data representing content of the message, such as the subject of the communication and/or the use of pre-selected words, names, concepts or images in the communication. In certain embodiments, the monitored data includes data related to one or more of the time of communicating, the duration of communicating, the length or size of the message, the type of message (e.g., e-mail, voice, text message, etc.), and the source and/or the recipient of the message. The monitored data is then processed 294 to determine at least one trend of usage of the PUA by the user and to provide trend data relating to at least one trend of usage. If the monitoring system 1 of Figure 1B is used, the monitored data is processed either in the PUA 2, or is first communicated to the processor 5 via the communications 7 and thereafter processed by the processor 5 to provide trend data. Trend data provided based on the monitored data comprises data relating to at least one of the PUA functions used by the user, the type of messages sent or received by the user, the frequency of messages sent or received by the user, the time of communicating the messages, the duration of the communicating, the source and recipient of the messages and the content of the messages.
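A sketch of turning monitored message records into simple trend tallies of the kind listed above; the record fields ("type", "hour", "recipient_domain") are hypothetical names for the attributes mentioned in the paragraph, not identifiers from the disclosure.

```python
from collections import Counter
from typing import Dict, Iterable, Mapping

def trend_data(records: Iterable[Mapping[str, str]]) -> Dict[str, Counter]:
    """Aggregate monitored message records (each with hypothetical 'type', 'hour' and
    'recipient_domain' fields) into simple usage-trend tallies."""
    trends = {"by_type": Counter(), "by_hour": Counter(), "by_recipient_domain": Counter()}
    for rec in records:
        trends["by_type"][rec.get("type", "unknown")] += 1
        trends["by_hour"][rec.get("hour", "unknown")] += 1
        trends["by_recipient_domain"][rec.get("recipient_domain", "unknown")] += 1
    return trends

# Example: a batch of monitored messages yields counts per message type, per hour
# of day and per recipient domain that can be stored as trend data.
```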
[00089] The trend data provided by the monitoring system is then stored 296 either in the PUA or in an external storage. In the monitoring system 1 of Figure 1B, the trend data is stored in at least one of the PUA 2 or in the storage 6. If the trend data is stored in the PUA 2, this data can thereafter be communicated to an external storage device such as the storage 6 of the monitoring system 1.
[00090] Trend data provided in the embodiments shown in Figures 3 and 4, and described above, can be used as market research data to determine user preferences, including the user's preferences relating to the PUA functions. Thus, for example, trend data can be used to determine which functions of the PUA are most frequently used by which users, and which functions could be removed or added in future versions of the PUA products. In the embodiments in which trend data includes data related to the content of PUA users' communications, trend data can be used to determine the popularity or success of a particular product, brand, person or concept and to ascertain how well a particular product, service or brand may do in the market.
[00091] Although various embodiments of the present invention have been described with reference to a particular arrangement of parts, features and the like, these are not intended to exhaust all possible arrangements or features, and indeed many other embodiments, modifications and variations will be ascertainable to those of skill in the art.

Claims

What is claimed is:
1. A method of gathering data concerning usage of a PUA, comprising: monitoring content created in the use of the PUA to produce content related data; and communicating the content related data to a usage data processing facility.
2. The method of claim 1, wherein the PUA comprises communications operative to provide device-to-device communicating ability and/or wireless networking ability.
3. The method of claim 2, wherein the PUA comprises one of a cellular telephone, a PDA, a notebook computer, a laptop computer, a PCS device, and a two-way radio.
4. The method of claim 3, wherein a processor of the PUA monitors content created in the use of the PUA and carries out additional operations.
5. The method of claim 1, wherein the content related data comprises data indicating the type of PUA function used, data indicating the time of use of the PUA function, data indicating the length of time of the use of the PUA function, data relating to the use of communications to send or receive messages with the use of the PUA and data relating to content generated by the use of the PUA function.
6. The method of claim 5, wherein the data relating to the use of communications by the PUA comprises data relating to the time a message is communicated, the size of the message and the destination of the message.
7. The method of claim 1, wherein the data relating to the content generated by the use of the PUA comprises at least one of data relating to the subject of the generated content and data relating to words, phrases, names or concepts included in the content.
8. The method of claim 1, wherein the usage data processing facility comprises a processor.
9. The method of claim 1, comprising gathering data with the use of the PUA from which an identification of a user thereof may be assessed.
10. A system for gathering data concerning usage of a PUA comprises a monitor in or on the PUA and operative to monitor content created in the use of the PUA to produce content related data; and communications coupled with the monitor to receive the content related data and operative to communicate the content related data from the PUA to a usage data processing facility.
11. The system of claim 10, wherein the PUA comprises communications operative to provide device-to-device communicating ability and/or wireless networking ability.
12. The system of claim 10, wherein the PUA comprises one of a cellular telephone, a PDA, a notebook computer, a laptop computer, a PCS device, and a two-way radio.
13. The system of claim 12, wherein operations of the sensor/detector are implemented by a processor of the PUA that is operative to carry out additional operations beyond those of the sensor/detector.
14. The system of claim 10, wherein the usage data comprises data indicating a type of PUA function used, data indicating a time of use of the PUA function, data indicating a length of time of the use of the PUA function, data relating to a use of communications to send or receive messages with the use of the PUA and data relating to content generated by the use of the PUA function.
15. The system of claim 14, wherein the data relating to the use of communications by the PUA comprises data relating to the time a message is communicated, the size of the message and the destination of the message.
16. The system of claim 10, wherein the data relating to the content generated by the use of the PUA comprises data relating to the subject of the generated content and data relating to words, phrases, names or concepts included in the content.
17. The system of claim 10, wherein the monitor is operative to gather data from which an identification of a user of the PUA can be assessed.
18. A method of monitoring use of a PUA by a user, the PUA including a communication interface for communicating with at least another PUA, comprises:
detecting communication by the communication interface of the PUA;
providing communication content data relating to content of the communication of the PUA; and
providing trend data representing at least one trend of usage of the PUA by the user based on the communication data.
19. The method of claim 18, wherein the communication content data comprises at least one of data relating to the subject of the generated content and data relating to words, phrases, names or concepts included in the content.
20. The method of claim 18, wherein the trend data is stored in one of the PUA or an external storage.
21. The method of claim 18, comprising processing the trend data to produce market research data reflecting user preferences.
22. A system for monitoring use of a PUA by a user, the PUA including a communication interface for communicating with at least another PUA, comprises:
a monitor operative to detect data communicated by the communication interface of the PUA and to produce communication content data relating to content of the communicated data; and
a processor coupled with the monitor to receive the communication content data and operative to provide trend data representing at least one trend of usage of the PUA by the user based on the communication content data.
23. The system of claim 22, wherein the communication content data comprises at least one of data relating to the subject of the generated content and data relating to words, phrases, names or concepts included in the content.
24. The system of claim 22, wherein the trend data is stored in one of the PUA or an external storage.
25. The system of claim 22, wherein the trend data is processed to produce market research data reflecting user preferences.
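For illustration only, and without limiting the claims, the following minimal Python sketch reflects one possible reading of the method recited in claims 1, 5 and 7: content created in the use of the PUA is monitored to produce content related data, which is then communicated to a usage data processing facility. The field names, host, port and transport used below are hypothetical.

import json
import socket
import time

def monitor_content(function_name, content):
    # Produce content related data for one use of a PUA function.
    return {
        "function": function_name,        # type of PUA function used
        "timestamp": time.time(),         # time of use of the PUA function
        "content_size": len(content),     # stands in for size/length-of-use data
        "words": content.split()[:10],    # words or phrases included in the content
    }

def communicate_to_facility(record, host="facility.example.invalid", port=9000):
    # Communicate the content related data to a usage data processing facility.
    payload = json.dumps(record).encode("utf-8")
    with socket.create_connection((host, port), timeout=5) as connection:
        connection.sendall(payload)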
EP07812865A 2006-07-12 2007-07-12 Monitoring usage of a portable user appliance Withdrawn EP2038736A4 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US83174406P 2006-07-12 2006-07-12
PCT/US2007/073393 WO2008008913A2 (en) 2006-07-12 2007-07-12 Monitoring usage of a portable user appliance

Publications (2)

Publication Number Publication Date
EP2038736A2 true EP2038736A2 (en) 2009-03-25
EP2038736A4 EP2038736A4 (en) 2009-08-19

Family

ID=38924192

Family Applications (5)

Application Number Title Priority Date Filing Date
EP07840400A Ceased EP2038743A4 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives
EP07812866A Withdrawn EP2037799A4 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives
EP07812862A Ceased EP2038766A4 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives
EP07812860A Withdrawn EP2038823A4 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives
EP07812865A Withdrawn EP2038736A4 (en) 2006-07-12 2007-07-12 Monitoring usage of a portable user appliance

Family Applications Before (4)

Application Number Title Priority Date Filing Date
EP07840400A Ceased EP2038743A4 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives
EP07812866A Withdrawn EP2037799A4 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives
EP07812862A Ceased EP2038766A4 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives
EP07812860A Withdrawn EP2038823A4 (en) 2006-07-12 2007-07-12 Methods and systems for compliance confirmation and incentives

Country Status (13)

Country Link
US (8) US20080109295A1 (en)
EP (5) EP2038743A4 (en)
JP (3) JP5519278B2 (en)
KR (3) KR20090031772A (en)
CN (6) CN101512575A (en)
AU (5) AU2007272428A1 (en)
BR (3) BRPI0714294A2 (en)
CA (5) CA2658977A1 (en)
HK (1) HK1155234A1 (en)
IL (3) IL196434A0 (en)
MX (3) MX2009000468A (en)
NO (3) NO20090655L (en)
WO (5) WO2008008913A2 (en)

Families Citing this family (174)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1685735A (en) 2002-04-22 2005-10-19 尼尔逊媒介研究股份有限公司 Methods and apparatus to collect audience information associated with a media presentation
MXPA05008287A (en) 2003-02-10 2005-09-20 Nielsen Media Res Inc Methods and apparatus to adaptively gather audience information data.
US7464155B2 (en) * 2003-03-24 2008-12-09 Siemens Canada Ltd. Demographic information acquisition system
US8023882B2 (en) 2004-01-14 2011-09-20 The Nielsen Company (Us), Llc. Portable audience measurement architectures and methods for portable audience measurement
US8738763B2 (en) 2004-03-26 2014-05-27 The Nielsen Company (Us), Llc Research data gathering with a portable monitor and a stationary device
WO2006037014A2 (en) 2004-09-27 2006-04-06 Nielsen Media Research, Inc. Methods and apparatus for using location information to manage spillover in an audience monitoring system
WO2006099612A2 (en) 2005-03-17 2006-09-21 Nielsen Media Research, Inc. Methods and apparatus for using audience member behavior information to determine compliance with audience measurement system usage requirements
JP2009507301A (en) * 2005-09-02 2009-02-19 ニールセン メディア リサーチ インコーポレイテッド Method and apparatus for measuring print media
US20070135690A1 (en) * 2005-12-08 2007-06-14 Nicholl Richard V Mobile communication device that provides health feedback
EP3010167B1 (en) 2006-03-27 2017-07-05 Nielsen Media Research, Inc. Methods and systems to meter media content presented on a wireless communication device
MX2007015979A (en) 2006-03-31 2009-04-07 Nielsen Media Res Inc Methods, systems, and apparatus for multi-purpose metering.
KR20090031772A (en) * 2006-07-12 2009-03-27 아비트론 인코포레이티드 Monitoring usage of a portable user appliance
US20120278377A1 (en) * 2006-07-12 2012-11-01 Arbitron, Inc. System and method for determining device compliance and recruitment
US8260252B2 (en) 2006-10-02 2012-09-04 The Nielsen Company (Us), Llc Method and apparatus for collecting information about portable device usage
US8014726B1 (en) 2006-10-02 2011-09-06 The Nielsen Company (U.S.), Llc Method and system for collecting wireless information transparently and non-intrusively
US8157730B2 (en) 2006-12-19 2012-04-17 Valencell, Inc. Physiological and environmental monitoring systems and methods
US8652040B2 (en) 2006-12-19 2014-02-18 Valencell, Inc. Telemetric apparatus for health and environmental monitoring
US8234366B2 (en) 2007-03-29 2012-07-31 At&T Intellectual Property I, Lp Methods and apparatus to provide presence information
US10489795B2 (en) 2007-04-23 2019-11-26 The Nielsen Company (Us), Llc Determining relative effectiveness of media content items
US20090171767A1 (en) * 2007-06-29 2009-07-02 Arbitron, Inc. Resource efficient research data gathering using portable monitoring devices
US8321556B1 (en) 2007-07-09 2012-11-27 The Nielsen Company (Us), Llc Method and system for collecting data on a wireless device
US20090023429A1 (en) * 2007-07-17 2009-01-22 Yahoo! Inc. Asynchronous search platform for mobile device users
US20090037386A1 (en) * 2007-08-03 2009-02-05 Dietmar Theobald Computer file processing
US8764653B2 (en) * 2007-08-22 2014-07-01 Bozena Kaminska Apparatus for signal detection, processing and communication
US8438619B2 (en) * 2007-09-21 2013-05-07 Netmotion Wireless Holdings, Inc. Network access control
US9124378B2 (en) 2007-10-06 2015-09-01 The Nielsen Company (Us), Llc Gathering research data
US8206325B1 (en) 2007-10-12 2012-06-26 Biosensics, L.L.C. Ambulatory system for measuring and monitoring physical activity and risk of falling and for automatic fall detection
US8251903B2 (en) 2007-10-25 2012-08-28 Valencell, Inc. Noninvasive physiological analysis using excitation-sensor modules and related devices and methods
US20090150217A1 (en) 2007-11-02 2009-06-11 Luff Robert A Methods and apparatus to perform consumer surveys
US10867133B2 (en) * 2008-05-01 2020-12-15 Primal Fusion Inc. System and method for using a knowledge representation to provide information based on environmental inputs
US8843948B2 (en) 2008-09-19 2014-09-23 The Nielsen Company (Us), Llc Methods and apparatus to detect carrying of a portable audience measurement device
US8359205B2 (en) 2008-10-24 2013-01-22 The Nielsen Company (Us), Llc Methods and apparatus to perform audio watermarking and watermark detection and extraction
US9667365B2 (en) 2008-10-24 2017-05-30 The Nielsen Company (Us), Llc Methods and apparatus to perform audio watermarking and watermark detection and extraction
US8040237B2 (en) * 2008-10-29 2011-10-18 The Nielsen Company (Us), Llc Methods and apparatus to detect carrying of a portable audience measurement device
US9750462B2 (en) 2009-02-25 2017-09-05 Valencell, Inc. Monitoring apparatus and methods for measuring physiological and/or environmental conditions
US8788002B2 (en) 2009-02-25 2014-07-22 Valencell, Inc. Light-guiding devices and monitoring devices incorporating same
EP3127476A1 (en) 2009-02-25 2017-02-08 Valencell, Inc. Light-guiding devices and monitoring devices incorporating same
EP2410910A4 (en) * 2009-03-27 2014-10-15 Dexcom Inc Methods and systems for promoting glucose management
JP2012526314A (en) 2009-05-08 2012-10-25 ゾケム オーワイ System and method for analyzing behavioral and contextual data
AU2010256401B2 (en) 2009-06-05 2014-08-14 Advanced Brain Monitoring, Inc. Systems and methods for controlling position
KR101608339B1 (en) * 2009-06-08 2016-04-11 삼성전자주식회사 Method and device for measuring location, and moving object
GB2471902A (en) * 2009-07-17 2011-01-19 Sharp Kk Sleep management system which correlates sleep and performance data
US9061109B2 (en) * 2009-07-22 2015-06-23 Accuvein, Inc. Vein scanner with user interface
JP5413033B2 (en) * 2009-08-03 2014-02-12 株式会社リコー Information processing apparatus, information leakage prevention method and program
US9357921B2 (en) * 2009-10-16 2016-06-07 At&T Intellectual Property I, Lp Wearable health monitoring system
US9293060B2 (en) 2010-05-06 2016-03-22 Ai Cure Technologies Llc Apparatus and method for recognition of patient activities when obtaining protocol adherence data
US8335715B2 (en) * 2009-11-19 2012-12-18 The Nielsen Company (Us), Llc. Advertisement exchange using neuro-response data
US8335716B2 (en) * 2009-11-19 2012-12-18 The Nielsen Company (Us), Llc. Multimedia advertisement exchange
WO2011070677A1 (en) * 2009-12-11 2011-06-16 富士通株式会社 Information processing device and control method
US8872663B2 (en) * 2010-01-19 2014-10-28 Avery Dennison Corporation Medication regimen compliance monitoring systems and methods
US8782556B2 (en) 2010-02-12 2014-07-15 Microsoft Corporation User-centric soft keyboard predictive technologies
US8855101B2 (en) 2010-03-09 2014-10-07 The Nielsen Company (Us), Llc Methods, systems, and apparatus to synchronize actions of audio source monitors
US8979665B1 (en) 2010-03-22 2015-03-17 Bijan Najafi Providing motion feedback based on user center of mass
US8732605B1 (en) 2010-03-23 2014-05-20 VoteBlast, Inc. Various methods and apparatuses for enhancing public opinion gathering and dissemination
US9134875B2 (en) 2010-03-23 2015-09-15 VoteBlast, Inc. Enhancing public opinion gathering and dissemination
US9883786B2 (en) * 2010-05-06 2018-02-06 Aic Innovations Group, Inc. Method and apparatus for recognition of inhaler actuation
US10116903B2 (en) * 2010-05-06 2018-10-30 Aic Innovations Group, Inc. Apparatus and method for recognition of suspicious activities
US9875666B2 (en) 2010-05-06 2018-01-23 Aic Innovations Group, Inc. Apparatus and method for recognition of patient activities
WO2011161303A1 (en) 2010-06-24 2011-12-29 Zokem Oy Network server arrangement for processing non-parametric, multi-dimensional, spatial and temporal human behavior or technical observations measured pervasively, and related method for the same
US8340685B2 (en) 2010-08-25 2012-12-25 The Nielsen Company (Us), Llc Methods, systems and apparatus to generate market segmentation data with anonymous location data
US8677385B2 (en) 2010-09-21 2014-03-18 The Nielsen Company (Us), Llc Methods, apparatus, and systems to collect audience measurement data
US8607295B2 (en) * 2011-07-06 2013-12-10 Symphony Advanced Media Media content synchronized advertising platform methods
US8412857B2 (en) * 2010-11-22 2013-04-02 Motorola Mobility Llc Authenticating, tracking, and using a peripheral
US8667303B2 (en) 2010-11-22 2014-03-04 Motorola Mobility Llc Peripheral authentication
US20140317744A1 (en) * 2010-11-29 2014-10-23 Biocatch Ltd. Device, system, and method of user segmentation
US11064910B2 (en) 2010-12-08 2021-07-20 Activbody, Inc. Physical activity monitoring system
US8885842B2 (en) 2010-12-14 2014-11-11 The Nielsen Company (Us), Llc Methods and apparatus to determine locations of audience members
US8753275B2 (en) * 2011-01-13 2014-06-17 BioSensics LLC Intelligent device to monitor and remind patients with footwear, walking aids, braces, or orthotics
US8888701B2 (en) 2011-01-27 2014-11-18 Valencell, Inc. Apparatus and methods for monitoring physiological data during environmental interference
US8635291B2 (en) * 2011-02-18 2014-01-21 Blackberry Limited Communication device and method for overriding a message filter
US8918802B2 (en) 2011-02-28 2014-12-23 The Nielsen Company (Us), Llc Methods and apparatus to monitor media exposure
US20120245951A1 (en) * 2011-03-23 2012-09-27 Jonathan Peter Gips System and method for compliance reward
WO2013016007A2 (en) 2011-07-25 2013-01-31 Valencell, Inc. Apparatus and methods for estimating time-state physiological parameters
EP2739207B1 (en) 2011-08-02 2017-07-19 Valencell, Inc. Systems and methods for variable filter adjustment by heart rate metric feedback
US9224359B2 (en) 2011-09-26 2015-12-29 Google Technology Holdings LLC In-band peripheral authentication
JP5822651B2 (en) * 2011-10-26 2015-11-24 株式会社ソニー・コンピュータエンタテインメント Individual discrimination device and individual discrimination method
US9696336B2 (en) 2011-11-30 2017-07-04 The Nielsen Company (Us), Llc Multiple meter detection and processing using motion data
US20130138386A1 (en) * 2011-11-30 2013-05-30 Arbitron Inc. Movement/position monitoring and linking to media consumption
US8977194B2 (en) 2011-12-16 2015-03-10 The Nielsen Company (Us), Llc Media exposure and verification utilizing inductive coupling
US8538333B2 (en) 2011-12-16 2013-09-17 Arbitron Inc. Media exposure linking utilizing bluetooth signal characteristics
US9332363B2 (en) 2011-12-30 2016-05-03 The Nielsen Company (Us), Llc System and method for determining meter presence utilizing ambient fingerprints
US9339691B2 (en) 2012-01-05 2016-05-17 Icon Health & Fitness, Inc. System and method for controlling an exercise device
US20130186405A1 (en) * 2012-01-25 2013-07-25 Openpeak Inc. System and method for monitoring medical equipment
US8797139B2 (en) * 2012-02-23 2014-08-05 Infineon Technologies Ag System-level chip identify verification (locking) method with authentication chip
JP6146760B2 (en) * 2012-02-28 2017-06-14 国立研究開発法人産業技術総合研究所 ORDERING DEVICE, ORDERING METHOD, AND PROGRAM
US9519909B2 (en) 2012-03-01 2016-12-13 The Nielsen Company (Us), Llc Methods and apparatus to identify users of handheld computing devices
US20130231596A1 (en) * 2012-03-02 2013-09-05 David W. Hornbach Sequential compression therapy compliance monitoring systems & methods
US20130262184A1 (en) * 2012-03-30 2013-10-03 Arbitron Inc. Systems and Methods for Presence Detection and Linking to Media Exposure Data
US8473975B1 (en) 2012-04-16 2013-06-25 The Nielsen Company (Us), Llc Methods and apparatus to detect user attentiveness to handheld computing devices
DE13784699T1 (en) * 2012-04-30 2015-07-30 Webtrends, Inc. Method and system for streaming processed real-time data from devices controlled by a remote processor
US10102345B2 (en) 2012-06-19 2018-10-16 Activbody, Inc. Personal wellness management platform
US9230064B2 (en) * 2012-06-19 2016-01-05 EZ as a Drink Productions, Inc. Personal wellness device
US10133849B2 (en) 2012-06-19 2018-11-20 Activbody, Inc. Merchandizing, socializing, and/or gaming via a personal wellness device and/or a personal wellness platform
CN102830901A (en) * 2012-06-29 2012-12-19 鸿富锦精密工业(深圳)有限公司 Office device
US9052896B2 (en) * 2012-07-20 2015-06-09 Facebook, Inc. Adjusting mobile device state based on user intentions and/or identity
US9282366B2 (en) 2012-08-13 2016-03-08 The Nielsen Company (Us), Llc Methods and apparatus to communicate audience measurement information
US20160232536A1 (en) * 2012-08-28 2016-08-11 NextLOGik Auditing, compliance, monitoring, and compliance management
US9992729B2 (en) 2012-10-22 2018-06-05 The Nielsen Company (Us), Llc Systems and methods for wirelessly modifying detection characteristics of portable devices
US9453863B2 (en) * 2012-11-16 2016-09-27 International Business Machines Corporation Implementing frequency spectrum analysis using causality Hilbert Transform results of VNA-generated S-parameter model information
EP2926148B1 (en) * 2012-11-30 2019-07-31 The Nielsen Company (US), LLC Multiple meter detection and processing using motion data
US20140187268A1 (en) * 2012-12-28 2014-07-03 Arbitron Inc. Apparatus, System and Method for Location Detection and User Identification for Media Exposure Data
EP2928364A4 (en) 2013-01-28 2015-11-11 Valencell Inc Physiological monitoring devices having sensing elements decoupled from body motion
US9026053B2 (en) * 2013-02-17 2015-05-05 Fitbit, Inc. System and method for wireless device pairing
US9223297B2 (en) 2013-02-28 2015-12-29 The Nielsen Company (Us), Llc Systems and methods for identifying a user of an electronic device
US9021516B2 (en) 2013-03-01 2015-04-28 The Nielsen Company (Us), Llc Methods and systems for reducing spillover by measuring a crest factor
US9118960B2 (en) 2013-03-08 2015-08-25 The Nielsen Company (Us), Llc Methods and systems for reducing spillover by detecting signal distortion
EP2969058B1 (en) 2013-03-14 2020-05-13 Icon Health & Fitness, Inc. Strength training apparatus with flywheel and related methods
US9311789B1 (en) 2013-04-09 2016-04-12 BioSensics LLC Systems and methods for sensorimotor rehabilitation
US9697533B2 (en) 2013-04-17 2017-07-04 The Nielsen Company (Us), Llc Methods and apparatus to monitor media presentations
US20140312834A1 (en) * 2013-04-20 2014-10-23 Yuji Tanabe Wearable impact measurement device with wireless power and data communication
US9229476B2 (en) 2013-05-08 2016-01-05 EZ as a Drink Productions, Inc. Personal handheld electronic device with a touchscreen on a peripheral surface
US9262064B2 (en) 2013-07-09 2016-02-16 EZ as a Drink Productions, Inc. Handheld computing platform with integrated pressure sensor and associated methods of use
US10021169B2 (en) * 2013-09-20 2018-07-10 Nuance Communications, Inc. Mobile application daily user engagement scores and user profiles
US9403047B2 (en) 2013-12-26 2016-08-02 Icon Health & Fitness, Inc. Magnetic resistance mechanism in a cable machine
US9426525B2 (en) 2013-12-31 2016-08-23 The Nielsen Company (Us), Llc. Methods and apparatus to count people in an audience
US10083459B2 (en) 2014-02-11 2018-09-25 The Nielsen Company (Us), Llc Methods and apparatus to generate a media rank
US10433612B2 (en) 2014-03-10 2019-10-08 Icon Health & Fitness, Inc. Pressure sensor to quantify work
US9953330B2 (en) * 2014-03-13 2018-04-24 The Nielsen Company (Us), Llc Methods, apparatus and computer readable media to generate electronic mobile measurement census data
CN103916725B (en) * 2014-03-27 2018-01-19 上海华博信息服务有限公司 A kind of bluetooth earphone
US10124246B2 (en) 2014-04-21 2018-11-13 Activbody, Inc. Pressure sensitive peripheral devices, and associated methods of use
US9699499B2 (en) 2014-04-30 2017-07-04 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
WO2015191445A1 (en) 2014-06-09 2015-12-17 Icon Health & Fitness, Inc. Cable system incorporated into a treadmill
WO2015195965A1 (en) 2014-06-20 2015-12-23 Icon Health & Fitness, Inc. Post workout massage device
US9538921B2 (en) 2014-07-30 2017-01-10 Valencell, Inc. Physiological monitoring devices with adjustable signal analysis and interrogation power and monitoring methods using same
EP3199100A1 (en) 2014-08-06 2017-08-02 Valencell, Inc. Earbud with a physiological information sensor module
CN104217351A (en) * 2014-08-12 2014-12-17 苏州佳世达电通有限公司 Product use bonus point method
US9794653B2 (en) 2014-09-27 2017-10-17 Valencell, Inc. Methods and apparatus for improving signal quality in wearable biometric monitoring devices
CN104469411B (en) * 2014-12-01 2018-01-02 北京正奇联讯科技有限公司 The monitoring method and system of streaming media playing troubleshooting
US10391361B2 (en) 2015-02-27 2019-08-27 Icon Health & Fitness, Inc. Simulating real-world terrain on an exercise device
US9924224B2 (en) 2015-04-03 2018-03-20 The Nielsen Company (Us), Llc Methods and apparatus to determine a state of a media presentation device
US9848222B2 (en) 2015-07-15 2017-12-19 The Nielsen Company (Us), Llc Methods and apparatus to detect spillover
US10521731B2 (en) * 2015-09-14 2019-12-31 Adobe Inc. Unique user detection for non-computer products
US10945618B2 (en) 2015-10-23 2021-03-16 Valencell, Inc. Physiological monitoring devices and methods for noise reduction in physiological signals based on subject activity type
US10610158B2 (en) 2015-10-23 2020-04-07 Valencell, Inc. Physiological monitoring devices and methods that identify subject activity type
CN108348195B (en) * 2015-11-19 2022-07-05 松下知识产权经营株式会社 Walking movement display system and program
US11276030B2 (en) 2015-11-20 2022-03-15 Ocado Innovation Limited Automated delivery device and handling method
CA2958003C (en) 2016-02-19 2022-04-05 Paul Stanley Addison System and methods for video-based monitoring of vital signs
US10625137B2 (en) 2016-03-18 2020-04-21 Icon Health & Fitness, Inc. Coordinated displays in an exercise device
US10272317B2 (en) 2016-03-18 2019-04-30 Icon Health & Fitness, Inc. Lighted pace feature in a treadmill
US10493349B2 (en) 2016-03-18 2019-12-03 Icon Health & Fitness, Inc. Display on exercise device
WO2018009736A1 (en) 2016-07-08 2018-01-11 Valencell, Inc. Motion-dependent averaging for physiological metric estimating systems and methods
US10671705B2 (en) 2016-09-28 2020-06-02 Icon Health & Fitness, Inc. Customizing recipe recommendations
US10084612B2 (en) * 2016-10-05 2018-09-25 International Business Machines Corporation Remote control with muscle sensor and alerting sensor
US10140440B1 (en) * 2016-12-13 2018-11-27 Symantec Corporation Systems and methods for securing computing devices that are not in users' physical possessions
US10685131B1 (en) 2017-02-03 2020-06-16 Rockloans Marketplace Llc User authentication
CN106691498A (en) * 2017-02-06 2017-05-24 宁波江丰生物信息技术有限公司 Borborygmus processing system
FR3064572B1 (en) * 2017-04-04 2019-03-22 Continental Automotive France METHOD FOR TEMPORARILY INHIBITING REMOTE ACTIVATION OF A FUNCTION PRESENT IN A MOTOR VEHICLE
CN107609461A (en) * 2017-07-19 2018-01-19 阿里巴巴集团控股有限公司 The training method of model, the determination method, apparatus of data similarity and equipment
WO2019060367A1 (en) 2017-09-19 2019-03-28 Adam Hanina Apparatus and method for recognition of suspicious activities
JP2019067055A (en) * 2017-09-29 2019-04-25 日本電気株式会社 Terminal device, retrieval device, analyzer, estimation device, system, and operation method and program of terminal device
WO2019094893A1 (en) 2017-11-13 2019-05-16 Covidien Lp Systems and methods for video-based monitoring of a patient
AU2018400475B2 (en) 2018-01-08 2024-03-07 Covidien Lp Systems and methods for video-based non-contact tidal volume monitoring
CN108371816B (en) * 2018-02-12 2021-06-25 青岛未来移动医疗科技有限公司 Breathing diagnosis and treatment guide game engine and operation method
WO2019238230A1 (en) 2018-06-14 2019-12-19 Brainlab Ag Registration of an anatomical body part by detecting a finger pose
WO2019240991A1 (en) 2018-06-15 2019-12-19 Covidien Lp Systems and methods for video-based patient monitoring during surgery
EP3813653A4 (en) * 2018-06-28 2022-04-13 Board of Trustees of Michigan State University Mobile device applications to measure blood pressure
US11311252B2 (en) 2018-08-09 2022-04-26 Covidien Lp Video-based patient monitoring systems and associated methods for detecting and monitoring breathing
US11617520B2 (en) 2018-12-14 2023-04-04 Covidien Lp Depth sensing visualization modes for non-contact monitoring
KR102620073B1 (en) 2019-01-04 2024-01-03 삼성전자주식회사 Home appliance and control method thereof
US11315275B2 (en) 2019-01-28 2022-04-26 Covidien Lp Edge handling methods for associated depth sensing camera devices, systems, and methods
US11122134B2 (en) * 2019-02-12 2021-09-14 The Nielsen Company (Us), Llc Methods and apparatus to collect media metrics on computing devices
US11532396B2 (en) 2019-06-12 2022-12-20 Mind Medicine, Inc. System and method for patient monitoring of gastrointestinal function using automated stool classifications
KR102091986B1 (en) * 2019-12-26 2020-03-20 한국생산성본부 Ai marketting system based on customer journey analytics
CN111096830B (en) * 2019-12-28 2021-11-30 杭州电子科技大学 Exoskeleton gait prediction method based on LightGBM
US11341525B1 (en) * 2020-01-24 2022-05-24 BlueOwl, LLC Systems and methods for telematics data marketplace
US11484208B2 (en) 2020-01-31 2022-11-01 Covidien Lp Attached sensor activation of additionally-streamed physiological parameters from non-contact monitoring systems and associated devices, systems, and methods
CN111356006B (en) * 2020-03-13 2023-03-17 北京奇艺世纪科技有限公司 Video playing method, device, server and storage medium
US11590427B1 (en) * 2020-03-19 2023-02-28 BlueOwl, LLC Systems and methods for tournament-based telematics insurance pricing
US11373425B2 (en) * 2020-06-02 2022-06-28 The Nielsen Company (U.S.), Llc Methods and apparatus for monitoring an audience of media based on thermal imaging
US11595723B2 (en) 2020-08-20 2023-02-28 The Nielsen Company (Us), Llc Methods and apparatus to determine an audience composition based on voice recognition
US11553247B2 (en) 2020-08-20 2023-01-10 The Nielsen Company (Us), Llc Methods and apparatus to determine an audience composition based on thermal imaging and facial recognition
US11763591B2 (en) 2020-08-20 2023-09-19 The Nielsen Company (Us), Llc Methods and apparatus to determine an audience composition based on voice recognition, thermal imaging, and facial recognition
US12026729B1 (en) 2021-10-04 2024-07-02 BlueOwl, LLC Systems and methods for match evaluation based on change in telematics inferences via a telematics marketplace
US12056722B1 (en) 2021-10-04 2024-08-06 Quanata, Llc Systems and methods for managing vehicle operator profiles based on relative telematics inferences via a telematics marketplace

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020143577A1 (en) * 2001-04-02 2002-10-03 Saul Shiffman Apparatus and method for prediction and management of subject compliance in clinical research
WO2004006110A1 (en) * 2002-07-10 2004-01-15 Eclipse Integrated Systems, Inc. Method and system for increasing the efficacy of a clinical trial
US20050172021A1 (en) * 1997-03-28 2005-08-04 Brown Stephen J. Remotely monitoring an individual using scripted communications

Family Cites Families (327)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2662168A (en) 1946-11-09 1953-12-08 Serge A Scherbatskoy System of determining the listening habits of wave signal receiver users
JPS512419B1 (en) 1966-11-16 1976-01-26
US3919479A (en) 1972-09-21 1975-11-11 First National Bank Of Boston Broadcast signal identification system
JPS512419A (en) 1974-06-25 1976-01-10 Canon Kk Shatsuta asochi
JPS5137050A (en) 1974-09-18 1976-03-29 Mitsubishi Electric Corp KOSHUHAPARUSUCHOKURYUAAKUYOSETSUSOCHI
JPS5327638U (en) 1976-08-16 1978-03-09
US4107735A (en) * 1977-04-19 1978-08-15 R. D. Percy & Company Television audience survey system providing feedback of cumulative survey results to individual television viewers
US4107734A (en) * 1977-01-31 1978-08-15 R. D. Percy & Company Television viewer reaction determining system
US4308554A (en) 1977-04-19 1981-12-29 R. D. Percy & Company Television viewer reaction determining system
JPS5327638A (en) 1977-04-30 1978-03-15 Kyodo Printing Co Ltd Method of making colapsible tube
DE2727268A1 (en) 1977-06-16 1979-01-04 Bayer Ag METHOD OF MANUFACTURING AZO DYES
GB2027298A (en) * 1978-07-31 1980-02-13 Shiu Hung Cheung Method of and apparatus for television audience analysis
US4230990C1 (en) 1979-03-16 2002-04-09 John G Lert Jr Broadcast program identification method and system
US4646145A (en) * 1980-04-07 1987-02-24 R. D. Percy & Company Television viewer reaction determining systems
US4450551A (en) 1981-06-19 1984-05-22 Sanyo Electric Co., Ltd. Keel-tipped stylus, and method and device for making keel-tipped stylus
US4584602A (en) * 1982-11-08 1986-04-22 Pioneer Ansafone Manufacturing Corporation Polling system and method using nondedicated telephone lines
GB8314468D0 (en) 1983-05-25 1983-06-29 Agb Research Plc Television monitoring
US4658290A (en) * 1983-12-08 1987-04-14 Ctba Associates Television and market research data collection system and method
US4697209A (en) 1984-04-26 1987-09-29 A. C. Nielsen Company Methods and apparatus for automatically identifying programs viewed or recorded
US4677466A (en) 1985-07-29 1987-06-30 A. C. Nielsen Company Broadcast program identification method and apparatus
US4626904A (en) * 1985-11-12 1986-12-02 Control Data Corporation Meter for passively logging the presence and identity of TV viewers
US4652915A (en) * 1985-11-12 1987-03-24 Control Data Corporation Method for polling headphones of a passive TV audience meter system
US4695879A (en) * 1986-02-07 1987-09-22 Weinblatt Lee S Television viewer meter
US4739398A (en) 1986-05-02 1988-04-19 Control Data Corporation Method, apparatus and system for recognizing broadcast segments
US4718106A (en) * 1986-05-12 1988-01-05 Weinblatt Lee S Survey of radio audience
US4803625A (en) 1986-06-30 1989-02-07 Buddy Systems, Inc. Personal health monitor
US4779198A (en) * 1986-08-26 1988-10-18 Control Data Corporation Audience monitoring system
US4843562A (en) 1987-06-24 1989-06-27 Broadcast Data Systems Limited Partnership Broadcast information classification system and method
DE3720882A1 (en) 1987-06-24 1989-01-05 Media Control Musik Medien METHOD AND CIRCUIT ARRANGEMENT FOR THE AUTOMATIC RECOGNITION OF SIGNAL SEQUENCES
US4973952A (en) 1987-09-21 1990-11-27 Information Resources, Inc. Shopping cart display system
US4907079A (en) 1987-09-28 1990-03-06 Teleview Rating Corporation, Inc. System for monitoring and control of home entertainment electronic devices
FR2628588A1 (en) 1988-03-14 1989-09-15 Croquet Cie METHOD AND SYSTEM FOR ACQUIRING AND TRANSMITTING INFORMATION ON THE AUDIENCE OF TELEVISION PROGRAMS
US4912552A (en) * 1988-04-19 1990-03-27 Control Data Corporation Distributed monitoring system
US4955070A (en) 1988-06-29 1990-09-04 Viewfacts, Inc. Apparatus and method for automatically monitoring broadcast band listening habits
US4858000A (en) 1988-09-14 1989-08-15 A. C. Nielsen Company Image recognition audience measurement system and method
US5023929A (en) * 1988-09-15 1991-06-11 Npd Research, Inc. Audio frequency based market survey method
DE3901790A1 (en) 1989-01-21 1990-07-26 Gfk Gmbh METHOD FOR THE REMOTE CONTROLLED REPLACEMENT OF A PARTICULAR PROGRAM PART OF A TELEVISION PROGRAM BY A SEPARATELY SENT PROGRAM PART FOR SPECIFIC SELECTED RECEIVER, HOUSEHOLD TERMINAL DEVICE AND THROUGH THE DRIVE DRIVE
US4972503A (en) 1989-08-08 1990-11-20 A. C. Nielsen Company Method and apparatus for determining audience viewing habits by jamming a control signal and identifying the viewers command
WO1991011062A1 (en) 1990-01-18 1991-07-25 Young Alan M Method and apparatus for broadcast media audience measurement
CA2033558C (en) 1990-03-27 1996-11-26 Rand B. Nickerson Real-time wireless audience response system
JPH0666738B2 (en) 1990-04-06 1994-08-24 株式会社ビデオ・リサーチ CM automatic confirmation device
US5382970A (en) * 1991-07-19 1995-01-17 Kiefl; John B. Television viewer monitoring system including portable data meter for each viewer
KR100205403B1 (en) 1991-09-18 1999-07-01 구자홍 Structure of magneto-optical recording medium
KR940001238B1 (en) 1991-09-25 1994-02-18 주식회사 금성사 Optical recording material
FR2681997A1 (en) 1991-09-30 1993-04-02 Arbitron Cy METHOD AND DEVICE FOR AUTOMATICALLY IDENTIFYING A PROGRAM COMPRISING A SOUND SIGNAL
US5319735A (en) 1991-12-17 1994-06-07 Bolt Beranek And Newman Inc. Embedded signalling
US5331544A (en) * 1992-04-23 1994-07-19 A. C. Nielsen Company Market research method and system for collecting retail store and shopper market research data
US5436653A (en) 1992-04-30 1995-07-25 The Arbitron Company Method and system for recognition of broadcast segments
JP3035407B2 (en) 1992-05-26 2000-04-24 株式会社ビデオリサーチ Viewing source detection device
GB9221678D0 (en) 1992-10-15 1992-11-25 Taylor Nelson Group Limited Identifying a received programme stream
NZ259776A (en) 1992-11-16 1997-06-24 Ceridian Corp Identifying recorded or broadcast audio signals by mixing with encoded signal derived from code signal modulated by narrower bandwidth identification signal
JP3447333B2 (en) 1993-06-18 2003-09-16 株式会社ビデオリサーチ CM automatic identification system
US5483276A (en) * 1993-08-02 1996-01-09 The Arbitron Company Compliance incentives for audience monitoring/recording devices
US5481294A (en) 1993-10-27 1996-01-02 A. C. Nielsen Company Audience measurement system utilizing ancillary codes and passive signatures
US5488408A (en) 1994-03-22 1996-01-30 A.C. Nielsen Company Serial data channel metering attachment for metering channels to which a receiver is tuned
US5450490A (en) 1994-03-31 1995-09-12 The Arbitron Company Apparatus and methods for including codes in audio signals and decoding
US5704029A (en) * 1994-05-23 1997-12-30 Wright Strategies, Inc. System and method for completing an electronic form
JP3611880B2 (en) 1994-07-26 2005-01-19 株式会社ビデオリサーチ Push button PM device
JP3607725B2 (en) 1994-07-26 2005-01-05 株式会社ビデオリサーチ Push button PM device
US5594934A (en) 1994-09-21 1997-01-14 A.C. Nielsen Company Real time correlation meter
US5737026A (en) 1995-02-28 1998-04-07 Nielsen Media Research, Inc. Video and data co-channel communication system
JP3974953B2 (en) 1995-07-21 2007-09-12 株式会社ビデオリサーチ Television viewer identification method and apparatus
JP3688764B2 (en) 1995-07-21 2005-08-31 株式会社ビデオリサーチ Television viewer identification method and apparatus
JP2939429B2 (en) 1995-07-21 1999-08-25 株式会社ビデオリサーチ External electronic device playback state detection device
US6001065A (en) * 1995-08-02 1999-12-14 Ibva Technologies, Inc. Method and apparatus for measuring and analyzing physiological signals for active or passive control of physical and virtual spaces and the contents therein
US6154484A (en) 1995-09-06 2000-11-28 Solana Technology Development Corporation Method and apparatus for embedding auxiliary data in a primary data signal using frequency and time domain processing
JP3574241B2 (en) 1995-10-19 2004-10-06 池上通信機株式会社 Counting people by thermal image
JP3643157B2 (en) 1995-11-29 2005-04-27 池上通信機株式会社 Object height measurement method using stereo images
JP3631541B2 (en) 1995-11-29 2005-03-23 池上通信機株式会社 Object tracking method using stereo images
US6035177A (en) 1996-02-26 2000-03-07 Donald W. Moses Simultaneous transmission of ancillary and audio signals by means of perceptual coding
JP3117075B2 (en) 1996-03-12 2000-12-11 富士電機株式会社 Circuit breaker
US5828325A (en) 1996-04-03 1998-10-27 Aris Technologies, Inc. Apparatus and method for encoding and decoding information in analog signals
JP3625344B2 (en) 1996-11-05 2005-03-02 株式会社ビデオリサーチ Viewing channel detector
US5864708A (en) 1996-05-20 1999-01-26 Croft; Daniel I. Docking station for docking a portable computer with a wireless interface
US5889548A (en) 1996-05-28 1999-03-30 Nielsen Media Research, Inc. Television receiver use metering with separate program and sync detectors
US5822744A (en) 1996-07-15 1998-10-13 Kesel; Brad Consumer comment reporting apparatus and method
US6026387A (en) * 1996-07-15 2000-02-15 Kesel; Brad Consumer comment reporting apparatus and method
JP3035408U (en) 1996-07-17 1997-03-18 祐二 上田 A leveler that makes it easy to measure the verticality of your feet in tight spaces.
US6647548B1 (en) 1996-09-06 2003-11-11 Nielsen Media Research, Inc. Coded/non-coded program audience measurement system
JP3688833B2 (en) 1996-12-02 2005-08-31 株式会社ビデオリサーチ Car radio listening situation investigation device
US6958710B2 (en) 2002-12-24 2005-10-25 Arbitron Inc. Universal display media exposure measurement
US7607147B1 (en) 1996-12-11 2009-10-20 The Nielsen Company (Us), Llc Interactive service device metering systems
US7587323B2 (en) * 2001-12-14 2009-09-08 At&T Intellectual Property I, L.P. System and method for developing tailored content
US6675383B1 (en) 1997-01-22 2004-01-06 Nielsen Media Research, Inc. Source detection apparatus and method for audience measurement
US5940135A (en) 1997-05-19 1999-08-17 Aris Technologies, Inc. Apparatus and method for encoding and decoding information in analog signals
US6278453B1 (en) * 1997-06-13 2001-08-21 Starfish Software, Inc. Graphical password methodology for a microprocessor device accepting non-alphanumeric user input
DK0887958T3 (en) 1997-06-23 2003-05-05 Liechti Ag Method of compressing recordings of ambient sound, method of detecting program elements therein, devices and computer program thereto
US6016476A (en) * 1997-08-11 2000-01-18 International Business Machines Corporation Portable information and transaction processing system and method utilizing biometric authorization and digital certificate security
JP3737614B2 (en) 1997-10-09 2006-01-18 株式会社ビデオリサーチ Broadcast confirmation system using audio signal, and audio material production apparatus and broadcast confirmation apparatus used in this system
JPH11122203A (en) 1997-10-09 1999-04-30 Video Research:Kk Broadcast confirmation system, video source production device used for the system and broadcast confirmation device
US5945932A (en) 1997-10-30 1999-08-31 Audiotrack Corporation Technique for embedding a code in an audio signal and for detecting the embedded code
CN1282470A (en) 1997-11-20 2001-01-31 尼尔逊媒介研究股份有限公司 Voice recognition unit for audience measurement system
US6467089B1 (en) 1997-12-23 2002-10-15 Nielsen Media Research, Inc. Audience measurement system incorporating a mobile handset
WO1999034274A2 (en) * 1997-12-31 1999-07-08 Todd Kenneth J Dynamically configurable electronic comment card
JP3964979B2 (en) 1998-03-18 2007-08-22 株式会社ビデオリサーチ Music identification method and music identification system
JP3749787B2 (en) 1998-03-23 2006-03-01 株式会社ビデオリサーチ Car radio listening situation survey system and car radio listening situation measuring machine
JP3964041B2 (en) 1998-03-23 2007-08-22 株式会社ビデオリサーチ Viewing channel determination device
JP4287053B2 (en) 1998-05-12 2009-07-01 ニールセン メディア リサーチ インコーポレイテッド Audience rating system for digital TV
JP4034879B2 (en) 1998-06-08 2008-01-16 株式会社ビデオリサーチ Viewing measuring apparatus and viewing measuring method
US6272176B1 (en) 1998-07-16 2001-08-07 Nielsen Media Research, Inc. Broadcast encoding system and method
US7006555B1 (en) 1998-07-16 2006-02-28 Nielsen Media Research, Inc. Spectral audio encoding
JP3688903B2 (en) 1998-09-07 2005-08-31 株式会社ビデオリサーチ Portable radio listening status recording device
JP2000113334A (en) * 1998-09-30 2000-04-21 Ncr Internatl Inc Method and device for displaying advertisement message for customer by using sales management terminal equipment
US6271631B1 (en) * 1998-10-15 2001-08-07 E.L. Specialists, Inc. Alerting system using elastomeric EL lamp structure
CN1329783A (en) 1998-12-08 2002-01-02 尼尔逊媒介研究股份有限公司 Metering viewing of video displayed in windows
US6720990B1 (en) * 1998-12-28 2004-04-13 Walker Digital, Llc Internet surveillance system and method
US20020056043A1 (en) 1999-01-18 2002-05-09 Sensar, Inc. Method and apparatus for securely transmitting and authenticating biometric data over a network
CA2683191A1 (en) 1999-03-02 2000-09-08 Amway Corp. Electronic commerce transactions within a marketing system
US20030011048A1 (en) * 1999-03-19 2003-01-16 Abbott Donald C. Semiconductor circuit assembly having a plated leadframe including gold selectively covering areas to be soldered
US7555470B2 (en) 1999-03-22 2009-06-30 Health Hero Network, Inc. Research data collection and analysis
CN1182700C (en) 1999-04-30 2004-12-29 汤姆森特许公司 Status monitoring and data processing system suitable for use in a bi-directional communication device
EP1186127A1 (en) 1999-05-20 2002-03-13 Nielsen Media Research, Inc. Viewer identification apparatus for use in a broadcast audience measurement
US6871180B1 (en) 1999-05-25 2005-03-22 Arbitron Inc. Decoding of information in audio signals
US7166064B2 (en) * 1999-07-08 2007-01-23 Icon Ip, Inc. Systems and methods for enabling two-way communication between one or more exercise devices and computer devices and for enabling users of the one or more exercise devices to competitively exercise
DE19934978A1 (en) 1999-07-26 2001-02-22 Siemens Ag Method and circuit arrangement for monitoring and possibly for controlling the transmission capacity of a data transmission link
EP1217942A1 (en) * 1999-09-24 2002-07-03 Healthetech, Inc. Physiological monitor and associated computation, display and communication unit
US6572560B1 (en) * 1999-09-29 2003-06-03 Zargis Medical Corp. Multi-modal cardiac diagnostic decision support system and method
KR100330253B1 (en) 1999-10-30 2002-03-27 박영달 Question search apparatus for digital media and method thereof
US6524239B1 (en) * 1999-11-05 2003-02-25 Wcr Company Apparatus for non-instrusively measuring health parameters of a subject and method of use thereof
US7284033B2 (en) * 1999-12-14 2007-10-16 Imahima Inc. Systems for communicating current and future activity information among mobile internet users and methods therefor
AU2592701A (en) 1999-12-23 2001-07-03 My-E-Surveys.Com, Llc System and methods for internet commerce and communication based on customer interaction and preferences
US6564104B2 (en) * 1999-12-24 2003-05-13 Medtronic, Inc. Dynamic bandwidth monitor and adjuster for remote communications with a medical device
US6294999B1 (en) 1999-12-29 2001-09-25 Becton, Dickinson And Company Systems and methods for monitoring patient compliance with medication regimens
JP2001188703A (en) 2000-01-05 2001-07-10 Video Research:Kk Method for obtaining page information
US6661438B1 (en) * 2000-01-18 2003-12-09 Seiko Epson Corporation Display apparatus and portable information processing apparatus
US6757719B1 (en) * 2000-02-25 2004-06-29 Charmed.Com, Inc. Method and system for data transmission between wearable devices or from wearable devices to portal
US6893396B2 (en) 2000-03-01 2005-05-17 I-Medik, Inc. Wireless internet bio-telemetry monitoring system and interface
US20010037206A1 (en) * 2000-03-02 2001-11-01 Vivonet, Inc. Method and system for automatically generating questions and receiving customer feedback for each transaction
US6963848B1 (en) * 2000-03-02 2005-11-08 Amazon.Com, Inc. Methods and system of obtaining consumer reviews
US6934684B2 (en) * 2000-03-24 2005-08-23 Dialsurf, Inc. Voice-interactive marketplace providing promotion and promotion tracking, loyalty reward and redemption, and other features
US20040103139A1 (en) * 2000-03-30 2004-05-27 United Devices, Inc. Distributed processing system having sensor based data collection and associated method
US6968564B1 (en) 2000-04-06 2005-11-22 Nielsen Media Research, Inc. Multi-band spectral audio encoding
US20030036683A1 (en) 2000-05-01 2003-02-20 Kehr Bruce A. Method, system and computer program product for internet-enabled, patient monitoring system
WO2001086877A2 (en) 2000-05-05 2001-11-15 Nomadix, Inc. Network usage monitoring device and associated method
JP3489537B2 (en) 2000-05-16 2004-01-19 日本電気株式会社 Function calling method and terminal device by keyword detection
JP2001324988A (en) 2000-05-17 2001-11-22 Video Research:Kk Audio signal recording and reproducing device and audio signal reproducing device
US7689437B1 (en) * 2000-06-16 2010-03-30 Bodymedia, Inc. System for monitoring health, wellness and fitness
US6699188B2 (en) * 2000-06-22 2004-03-02 Guidance Interactive Technologies Interactive reward devices and methods
US6879652B1 (en) 2000-07-14 2005-04-12 Nielsen Media Research, Inc. Method for encoding an input signal
JP2002044689A (en) 2000-07-25 2002-02-08 Toshiba Corp Commercial broadcasting confirmation system and slip issuing system
JP2002041578A (en) 2000-07-25 2002-02-08 Video Research:Kk Examination method, recording medium with examination program recorded thereof and examination system
AU8741101A (en) * 2000-08-22 2002-03-04 Ernex Marketing Technologies I Marketing systems and methods
JP2002175387A (en) 2000-09-01 2002-06-21 Sony Computer Entertainment Inc Utilization condition monitoring method and system for contents, computer program and recording medium
US6754470B2 (en) 2000-09-01 2004-06-22 Telephia, Inc. System and method for measuring wireless device and network usage and performance metrics
JP2002092253A (en) * 2000-09-12 2002-03-29 Mitsubishi Electric Corp Behavior pattern gathering system and behavior pattern gathering method
JP2002092504A (en) 2000-09-13 2002-03-29 Video Research:Kk Order receiving method and storage medium with order receiving program stored therein
KR100421739B1 (en) 2000-09-16 2004-03-12 (주)모바일타운 A target marketing method based on the transfer and response of the goods/services informations using wireless mobile terminals
JP4236801B2 (en) * 2000-09-19 2009-03-11 日本電気株式会社 MARKET RESEARCH SERVER AND SERVER GROUP, MARKET RESEARCH SYSTEM HAVING THEM, AND MARKET RESEARCH METHOD
US6700482B2 (en) 2000-09-29 2004-03-02 Honeywell International Inc. Alerting and notification system
JP2002117217A (en) 2000-10-12 2002-04-19 Video Research:Kk Method and device for collecting record and recording medium with record collection program recorded thereon
US6819219B1 (en) * 2000-10-13 2004-11-16 International Business Machines Corporation Method for biometric-based authentication in wireless communication for access control
JP2002135757A (en) 2000-10-27 2002-05-10 Intage Inc Advertisement viewing effect evaluation system
JP2002133283A (en) 2000-10-27 2002-05-10 Intage Inc System for providing commodity information based on environmental properties
US7031980B2 (en) * 2000-11-02 2006-04-18 Hewlett-Packard Development Company, L.P. Music similarity function based on signal analysis
JP2002163281A (en) * 2000-11-27 2002-06-07 Indigo Corp Information retrieving method and system
US6484033B2 (en) 2000-12-04 2002-11-19 Motorola, Inc. Wireless communication system for location based schedule management and method therefor
JP4224201B2 (en) 2000-12-15 2009-02-12 株式会社ビデオリサーチ Media contact rate survey system
US20030006911A1 (en) * 2000-12-22 2003-01-09 The Cadre Group Inc. Interactive advertising system and method
US6622087B2 (en) 2000-12-26 2003-09-16 Intel Corporation Method and apparatus for deriving travel profiles
US20020114299A1 (en) 2000-12-27 2002-08-22 Daozheng Lu Apparatus and method for measuring tuning of a digital broadcast receiver
EP1223757B1 (en) * 2001-01-09 2006-03-22 Metabyte Networks, Inc. System, method, and software application for targeted advertising via behavioral model clustering, and preference programming based on behavioral model clusters
US7305697B2 (en) * 2001-02-02 2007-12-04 Opentv, Inc. Service gateway for interactive television
JP2002236776A (en) 2001-02-09 2002-08-23 Video Research:Kk Investigation program and investigation method
JP3546021B2 (en) 2001-02-15 2004-07-21 株式会社ビデオリサーチ Video processing apparatus, video processing method, and video processing program
JP2002245192A (en) 2001-02-19 2002-08-30 Intage Inc Digital contents distribution device and digital contents distribution system using it
US20040162035A1 (en) * 2001-03-08 2004-08-19 Hannes Petersen On line health monitoring
WO2002076077A1 (en) * 2001-03-16 2002-09-26 Leap Wireless International, Inc. Method and system for distributing content over a wireless communications system
US7856377B2 (en) 2001-03-29 2010-12-21 American Express Travel Related Services Company, Inc. Geographic loyalty system and method
US7415447B2 (en) 2001-04-02 2008-08-19 Invivodata, Inc. Apparatus and method for prediction and management of participant compliance in clinical research
US8065180B2 (en) 2001-04-02 2011-11-22 invivodata®, Inc. System for clinical trial subject compliance
JP2002304185A (en) 2001-04-04 2002-10-18 Video Research:Kk Method and system for copyright management, and program
JP4649053B2 (en) 2001-04-23 2011-03-09 株式会社ビデオリサーチ Copyrighted content monitoring system and copyrighted content monitoring program
US7319863B2 (en) * 2001-05-11 2008-01-15 Wildseed, Ltd. Method and system for providing an opinion and aggregating opinions with mobile telecommunication device
US7072931B2 (en) * 2001-05-16 2006-07-04 David Goldhaber Accreditation maintenance through remote site monitoring
DE10124752B4 (en) * 2001-05-21 2006-01-12 Infineon Technologies Ag Circuit arrangement for reading and storing binary memory cell signals
US7992161B2 (en) * 2001-05-22 2011-08-02 At&T Intellectual Property I, L.P. Method and apparatus for providing incentives for viewers to watch commercial advertisements
JP4527903B2 (en) 2001-05-28 2010-08-18 株式会社ビデオリサーチ Viewing situation survey device
US8091100B2 (en) 2001-06-18 2012-01-03 The Nielsen Company (Us), Llc Prompting of audience member identification
US20020198990A1 (en) 2001-06-25 2002-12-26 Bradfield William T. System and method for remotely monitoring and controlling devices
US8572640B2 (en) 2001-06-29 2013-10-29 Arbitron Inc. Media data use measurement with remote decoding/pattern matching
AU2002346116A1 (en) 2001-07-20 2003-03-03 Gracenote, Inc. Automatic identification of sound recordings
JP2003058688A (en) 2001-08-14 2003-02-28 Video Research:Kk Purchase research method and purchase research processing program
US7100181B2 (en) 2001-08-22 2006-08-29 Nielsen Media Research, Inc. Television proximity sensor
US6862355B2 (en) 2001-09-07 2005-03-01 Arbitron Inc. Message reconstruction from partial detection
JP2003085326A (en) * 2001-09-10 2003-03-20 Toshiba Corp Questionnaire collection system, questionnaire collection method, and questionnaire collection program
US20030054866A1 (en) 2001-09-20 2003-03-20 Byers Charles Calvin Method for automatically selecting the alert type for a mobile electronic device
WO2003027391A1 (en) 2001-09-24 2003-04-03 The Procter & Gamble Company A soft absorbent web material
US6623427B2 (en) * 2001-09-25 2003-09-23 Hewlett-Packard Development Company, L.P. Biofeedback based personal entertainment system
US6623428B2 (en) * 2001-10-11 2003-09-23 Eastman Kodak Company Digital image sequence display system and method
IL161635A0 (en) 2001-11-08 2004-09-27 Behavioral Informatics Inc Monitoring a daily living activity and analyzing data related thereto
US7117513B2 (en) 2001-11-09 2006-10-03 Nielsen Media Research, Inc. Apparatus and method for detecting and correcting a corrupted broadcast time code
US6912386B1 (en) 2001-11-13 2005-06-28 Nokia Corporation Method for controlling operation of a mobile device by detecting usage situations
US20030131350A1 (en) 2002-01-08 2003-07-10 Peiffer John C. Method and apparatus for identifying a digital audio signal
KR100580618B1 (en) 2002-01-23 2006-05-16 삼성전자주식회사 Apparatus and method for recognizing user emotional status using short-time monitoring of physiological signals
JP4119130B2 (en) 2002-01-25 2008-07-16 株式会社ビデオリサーチ External input terminal detection method and apparatus
JP3669965B2 (en) 2002-02-19 2005-07-13 株式会社ビデオリサーチ Viewing channel determination method and apparatus
US7181159B2 (en) * 2002-03-07 2007-02-20 Breen Julian H Method and apparatus for monitoring audio listening
US7471987B2 (en) * 2002-03-08 2008-12-30 Arbitron, Inc. Determining location of an audience member having a portable media monitor
US20040203630A1 (en) 2002-03-15 2004-10-14 Wang Charles Chuanming Method and apparatus for targeting service delivery to mobile devices
JP2003316923A (en) * 2002-04-22 2003-11-07 Ntt Docomo Tokai Inc Questionnaire system and questionnaire method
CN1685735A (en) 2002-04-22 2005-10-19 尼尔逊媒介研究股份有限公司 Methods and apparatus to collect audience information associated with a media presentation
JP2003331106A (en) 2002-05-17 2003-11-21 Ics:Kk System for campaign information data processing based upon identification information
JP2004013472A (en) 2002-06-06 2004-01-15 Video Research:Kk Customer database merge method and merge processing program, and computer-readable recording medium recorded with merge relational data
US7236799B2 (en) * 2002-06-14 2007-06-26 Cingular Wireless Ii, Llc Apparatus and systems for providing location-based services within a wireless network
JP2004021778A (en) 2002-06-19 2004-01-22 Nec Infrontia Corp Data collection system
GB2391135B (en) * 2002-06-28 2006-01-11 Nokia Corp User group creation
JP4490029B2 (en) * 2002-06-28 2010-06-23 キヤノン電子株式会社 Information analysis apparatus, control method therefor, information analysis system, and program
US7139916B2 (en) 2002-06-28 2006-11-21 Ebay, Inc. Method and system for monitoring user interaction with a computer
US20040005900A1 (en) * 2002-07-05 2004-01-08 Martin Zilliacus Mobile terminal interactivity with multimedia programming
JP2004102651A (en) 2002-09-10 2004-04-02 Intage Nagano:Kk Card type information storage medium
GB2393356B (en) 2002-09-18 2006-02-01 E San Ltd Telemedicine system
US20040209595A1 (en) * 2002-09-25 2004-10-21 Joseph Bekanich Apparatus and method for monitoring the time usage of a wireless communication device
US7222071B2 (en) * 2002-09-27 2007-05-22 Arbitron Inc. Audio data receipt/exposure measurement with code monitoring and signature extraction
US20050125240A9 (en) * 2002-10-21 2005-06-09 Speiser Leonard R. Product recommendation in a network-based commerce system
CN1774922A (en) 2002-10-23 2006-05-17 尼尔逊媒介研究股份有限公司 Digital data insertion apparatus and methods for use with compressed audio/video data
JP3699953B2 (en) 2002-10-24 2005-09-28 株式会社ビデオリサーチ TV viewing situation survey device
US7263086B2 (en) * 2002-11-12 2007-08-28 Nokia Corporation Method and system for providing location-based services in multiple coverage area environments
US7035257B2 (en) * 2002-11-14 2006-04-25 Digi International, Inc. System and method to discover and configure remotely located network devices
JP3720037B2 (en) * 2002-11-22 2005-11-24 松下電器産業株式会社 Operation history utilization system and method
US6845360B2 (en) 2002-11-22 2005-01-18 Arbitron Inc. Encoding multiple messages in audio data and detecting same
CN1717694A (en) 2002-11-28 2006-01-04 皇家飞利浦电子股份有限公司 Bio-linking a user and authorization means
KR20050109919A (en) * 2002-12-10 2005-11-22 텔어바웃 인크 Content creation, distribution, interaction, and monitoring system
WO2004059547A1 (en) * 2002-12-26 2004-07-15 Japan Tobacco Inc. Analyzing system, analyzing method in that system, and system for collecting examination results used for analyzing
JP2004206529A (en) 2002-12-26 2004-07-22 Nippon Telegraph & Telephone East Corp Automatic adjustment system and key holder for use in the same
EP1586045A1 (en) 2002-12-27 2005-10-19 Nielsen Media Research, Inc. Methods and apparatus for transcoding metadata
US20050203800A1 (en) * 2003-01-22 2005-09-15 Duane Sweeney System and method for compounded marketing
JP4474831B2 (en) 2003-01-28 2010-06-09 日本電気株式会社 Mobile station location system, control device and mobile station in mobile communication network
JP4776170B2 (en) * 2003-01-29 2011-09-21 技研商事インターナショナル株式会社 Location certification system
US7065351B2 (en) 2003-01-30 2006-06-20 Qualcomm Incorporated Event-triggered data collection
US7158011B2 (en) 2003-02-14 2007-01-02 Brue Vesta L Medication compliance device
JP2004246725A (en) * 2003-02-14 2004-09-02 Sharp Corp Display device, display control device, display control program, and computer-readable recording medium recording the same
KR20040104195A (en) 2003-06-03 2004-12-10 엘지전자 주식회사 Method for receiving location information of mobile communication terminal
US20040252816A1 (en) * 2003-06-13 2004-12-16 Christophe Nicolas Mobile phone sample survey method
WO2005006768A1 (en) 2003-06-20 2005-01-20 Nielsen Media Research, Inc Signature-based program identification apparatus and methods for use with digital broadcast systems
US7363214B2 (en) * 2003-08-08 2008-04-22 Cnet Networks, Inc. System and method for determining quality of written product reviews in an automated manner
US7592908B2 (en) 2003-08-13 2009-09-22 Arbitron, Inc. Universal display exposure monitor using personal locator service
JP4338486B2 (en) 2003-09-11 2009-10-07 株式会社電通 Database fusion device and advertising media planning support device
WO2005046201A2 (en) 2003-10-16 2005-05-19 Nielsen Media Research, Inc. Audio signature apparatus and methods
ES2388923T3 (en) 2003-10-17 2012-10-19 Nielsen Media Research, Inc. Portable multi-purpose audience measurement system
JP4351514B2 (en) 2003-10-27 2009-10-28 株式会社ビデオリサーチ Viewing channel determination method and apparatus
US7081823B2 (en) * 2003-10-31 2006-07-25 International Business Machines Corporation System and method of predicting future behavior of a battery of end-to-end probes to anticipate and prevent computer network performance degradation
JP4544847B2 (en) * 2003-11-20 2010-09-15 ソニー・エリクソン・モバイルコミュニケーションズ株式会社 Electronic equipment and system
US7784069B2 (en) * 2003-12-01 2010-08-24 International Business Machines Corporation Selecting divergent storylines using branching techniques
JP4628691B2 (en) * 2003-12-26 2011-02-09 テクマトリックス株式会社 E-mail processing program, method and apparatus thereof
US20050234309A1 (en) * 2004-01-07 2005-10-20 David Klapper Method and apparatus for classification of movement states in Parkinson's disease
JP2005208822A (en) * 2004-01-21 2005-08-04 Seiko Epson Corp Authentication device, portable terminal, electronic settlement system, and authentication program
KR100619827B1 (en) 2004-01-30 2006-09-13 엘지전자 주식회사 Methods and apparatus of confirmation message sender for mobile communication system
US20050197988A1 (en) * 2004-02-17 2005-09-08 Bublitz Scott T. Adaptive survey and assessment administration using Bayesian belief networks
US20060154642A1 (en) * 2004-02-20 2006-07-13 Scannell Robert F Jr Medication & health, environmental, and security monitoring, alert, intervention, information and network system with associated and supporting apparatuses
US7463143B2 (en) 2004-03-15 2008-12-09 Arbitron, Inc. Methods and systems for gathering market research data within commercial establishments
US20050203798A1 (en) 2004-03-15 2005-09-15 Jensen James M. Methods and systems for gathering market research data
US7420464B2 (en) 2004-03-15 2008-09-02 Arbitron, Inc. Methods and systems for gathering market research data inside and outside commercial establishments
US7463144B2 (en) 2004-03-19 2008-12-09 Arbitron, Inc. Gathering data concerning publication usage
US7483975B2 (en) * 2004-03-26 2009-01-27 Arbitron, Inc. Systems and methods for gathering data concerning usage of media data
JP4435612B2 (en) * 2004-03-26 2010-03-24 株式会社吉田製作所 Transaction authentication system using wireless communication media installed in dental structures
US20050213511A1 (en) * 2004-03-29 2005-09-29 Merlin Mobile Media System and method to track wireless device and communications usage
CA2562137C (en) 2004-04-07 2012-11-27 Nielsen Media Research, Inc. Data insertion apparatus and methods for use with compressed audio/video data
US20050228718A1 (en) * 2004-04-13 2005-10-13 Pop Insights Inc. Point of purchase research device
US8135606B2 (en) * 2004-04-15 2012-03-13 Arbitron, Inc. Gathering data concerning publication usage and exposure to products and/or presence in commercial establishment
JP4429786B2 (en) 2004-04-22 2010-03-10 株式会社ビデオリサーチ TV viewing situation survey device
JP2005309911A (en) * 2004-04-23 2005-11-04 Matsushita Electric Ind Co Ltd Evaluation system
US7409444B2 (en) * 2004-05-10 2008-08-05 Bdna Corporation Method and apparatus for managing business cell phone usage
US8232862B2 (en) * 2004-05-17 2012-07-31 Assa Abloy Ab Biometrically authenticated portable access device
JP4432628B2 (en) 2004-06-07 2010-03-17 株式会社デンソー Vehicle remote monitoring system, vehicle information communication device, communication terminal, and operating device
JP4210241B2 (en) 2004-06-11 2009-01-14 株式会社ビデオリサーチ Investigation method and program
JP2006011681A (en) * 2004-06-24 2006-01-12 Dainippon Printing Co Ltd Identification system
US20050289582A1 (en) * 2004-06-24 2005-12-29 Hitachi, Ltd. System and method for capturing and using biometrics to review a product, service, creative work or thing
MX2007000076A (en) 2004-07-02 2007-03-28 Nielsen Media Res Inc Methods and apparatus for mixing compressed digital bit streams.
AU2005267913A1 (en) 2004-07-30 2006-02-09 Nielsen Media Research, Inc. Methods and apparatus for improving the accuracy and reach of electronic media exposure measurement systems
MX2007001251A (en) 2004-07-30 2008-02-14 Nielsen Media Res Inc Methods and apparatus for improving the accuracy and reach of electronic media exposure measurement systems.
CN102523063A (en) 2004-08-09 2012-06-27 尼尔森(美国)有限公司 Methods and apparatus to monitor audio/visual content from various sources
US20060041657A1 (en) * 2004-08-17 2006-02-23 Chih-Po Wen Method and apparatus for managing business cell phone usage
WO2006023770A2 (en) 2004-08-18 2006-03-02 Nielsen Media Research, Inc. Methods and apparatus for generating signatures
US7493388B2 (en) 2004-08-20 2009-02-17 Bdna Corporation Method and/or system for identifying information appliances
WO2006037014A2 (en) * 2004-09-27 2006-04-06 Nielsen Media Research, Inc. Methods and apparatus for using location information to manage spillover in an audience monitoring system
US20060101116A1 (en) 2004-10-28 2006-05-11 Danny Rittman Multifunctional telephone, walkie talkie, instant messenger, video-phone computer, based on WiFi (Wireless Fidelity) and WiMax technology, for establishing global wireless communication, network and video conferencing via the internet
JP4516408B2 (en) 2004-11-10 2010-08-04 株式会社ビデオリサーチ Data reading and collecting apparatus, portable data reader and data collecting machine
JP2006139591A (en) * 2004-11-12 2006-06-01 Fujitsu Ltd Process synchronous certification system and process synchronous certification method
JP4608290B2 (en) 2004-11-17 2011-01-12 セイコーエプソン株式会社 Information collection system, information collection device, terminal device management program, information collection management program, information collection management method, terminal device management method
JP4509750B2 (en) * 2004-11-25 2010-07-21 株式会社エヌ・ティ・ティ・ドコモ Portable terminal, server device, electronic value distribution system, and electronic value distribution method
US20060168613A1 (en) * 2004-11-29 2006-07-27 Wood Leslie A Systems and processes for use in media and/or market research
JP2006178602A (en) * 2004-12-21 2006-07-06 Net Base:Kk Personal information protection law assessment survey system
JP4008929B2 (en) 2005-02-23 2007-11-14 株式会社ビデオリサーチ Automatic TV commercial identification device
US8060753B2 (en) * 2005-03-07 2011-11-15 The Boeing Company Biometric platform radio identification anti-theft system
US7616110B2 (en) * 2005-03-11 2009-11-10 Aframe Digital, Inc. Mobile wireless customizable health and condition monitor
US7817983B2 (en) 2005-03-14 2010-10-19 Qualcomm Incorporated Method and apparatus for monitoring usage patterns of a wireless device
WO2006099612A2 (en) * 2005-03-17 2006-09-21 Nielsen Media Research, Inc. Methods and apparatus for using audience member behavior information to determine compliance with audience measurement system usage requirements
US20060218034A1 (en) 2005-03-23 2006-09-28 Kelly Laird R System and method for monitoring and recording research activity
US20060294108A1 (en) * 2005-04-14 2006-12-28 Adelson Alex M System for and method of managing schedule compliance and bidirectionally communicating in real time between a user and a manager
US20060240877A1 (en) * 2005-04-22 2006-10-26 Viktor Filiba System and method for providing in-coming call alerts
JP2009507301A (en) 2005-09-02 2009-02-19 ニールセン メディア リサーチ インコーポレイテッド Method and apparatus for measuring print media
JP4621572B2 (en) 2005-09-22 2011-01-26 株式会社ビデオリサーチ Viewing channel determination method and apparatus
WO2007038470A2 (en) 2005-09-26 2007-04-05 Nielsen Media Research, Inc. Methods and apparatus for metering computer-based media presentation
US8983551B2 (en) * 2005-10-18 2015-03-17 Lovina Worick Wearable notification device for processing alert signals generated from a user's wireless device
AU2006304933B2 (en) 2005-10-21 2011-07-21 The Nielsen Company (Us), Llc Methods and apparatus for metering portable media players
US20070136129A1 (en) * 2005-12-13 2007-06-14 Xerox Corporation Customer data collection system
US7740179B2 (en) 2005-12-15 2010-06-22 Mediamark Research, Inc. System and method for RFID-based printed media reading activity data acquisition and analysis
US8527320B2 (en) 2005-12-20 2013-09-03 Arbitron, Inc. Methods and systems for initiating a research panel of persons operating under a group agreement
US7872574B2 (en) 2006-02-01 2011-01-18 Innovation Specialists, Llc Sensory enhancement systems and methods in personal electronic devices
US20070208232A1 (en) * 2006-03-03 2007-09-06 Physiowave Inc. Physiologic monitoring initialization systems and methods
US8200320B2 (en) * 2006-03-03 2012-06-12 PhysioWave, Inc. Integrated physiologic monitoring systems and methods
JP4275679B2 (en) 2006-04-28 2009-06-10 株式会社インテージ Loading plan creation method and program thereof
US20120278377A1 (en) * 2006-07-12 2012-11-01 Arbitron, Inc. System and method for determining device compliance and recruitment
KR20090031772A (en) * 2006-07-12 2009-03-27 아비트론 인코포레이티드 Monitoring usage of a portable user appliance
US20120245978A1 (en) * 2006-07-12 2012-09-27 Arbitron, Inc. System and method for determining contextual characteristics of media exposure data
US8433726B2 (en) * 2006-09-01 2013-04-30 At&T Mobility Ii Llc Personal profile data repository
JP4728197B2 (en) 2006-09-28 2011-07-20 株式会社ビデオリサーチ Viewing channel determination method and system, terminal device, and center device
US20080204273A1 (en) * 2006-12-20 2008-08-28 Arbitron,Inc. Survey data acquisition
JP4963260B2 (en) 2007-04-25 2012-06-27 株式会社ビデオリサーチ Investigation system and investigation method
KR101370318B1 (en) 2007-06-11 2014-03-06 에스케이플래닛 주식회사 Method and Server for Collecting Contents Usage Information
JP4909190B2 (en) 2007-06-22 2012-04-04 株式会社ビデオリサーチ Questionnaire survey system, questionnaire survey terminal and questionnaire survey method
US20090171767A1 (en) * 2007-06-29 2009-07-02 Arbitron, Inc. Resource efficient research data gathering using portable monitoring devices
JP2008009442A (en) 2007-07-23 2008-01-17 Video Research:Kk Voice data processing method
US9124378B2 (en) * 2007-10-06 2015-09-01 The Nielsen Company (Us), Llc Gathering research data
KR20080034048A (en) 2008-04-07 2008-04-17 비해비어럴 인포매틱스, 인크. Monitoring a daily living activity and analyzing data related thereto
US8448105B2 (en) 2008-04-24 2013-05-21 University Of Southern California Clustering and fanout optimizations of asynchronous circuits
US8843948B2 (en) 2008-09-19 2014-09-23 The Nielsen Company (Us), Llc Methods and apparatus to detect carrying of a portable audience measurement device
US8040237B2 (en) 2008-10-29 2011-10-18 The Nielsen Company (Us), Llc Methods and apparatus to detect carrying of a portable audience measurement device
US8826317B2 (en) * 2009-04-17 2014-09-02 The Nielsen Company (Us), Llc System and method for determining broadcast dimensionality
JP5327638B2 (en) 2009-12-16 2013-10-30 株式会社セガ GAME DEVICE AND GAME PROGRAM
JP5627440B2 (en) 2010-12-15 2014-11-19 キヤノン株式会社 Acoustic apparatus, control method therefor, and program
US20120173701A1 (en) * 2010-12-30 2012-07-05 Arbitron Inc. Matching techniques for cross-platform monitoring and information
US8830792B2 (en) 2011-04-18 2014-09-09 Microsoft Corporation Mobile device localization using audio signals
US20130035979A1 (en) * 2011-08-01 2013-02-07 Arbitron, Inc. Cross-platform audience measurement with privacy protection
US9332363B2 (en) * 2011-12-30 2016-05-03 The Nielsen Company (Us), Llc System and method for determining meter presence utilizing ambient fingerprints
US9293023B2 (en) * 2014-03-18 2016-03-22 Jack Ke Zhang Techniques for emergency detection and emergency alert messaging
US10763901B2 (en) 2016-12-14 2020-09-01 Sony Semiconductor Solutions Corporation Transmission device, transmission method, and communication system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050172021A1 (en) * 1997-03-28 2005-08-04 Brown Stephen J. Remotely monitoring an individual using scripted communications
US20020143577A1 (en) * 2001-04-02 2002-10-03 Saul Shiffman Apparatus and method for prediction and management of subject compliance in clinical research
WO2004006110A1 (en) * 2002-07-10 2004-01-15 Eclipse Integrated Systems, Inc. Method and system for increasing the efficacy of a clinical trial

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2008008913A2 *

Also Published As

Publication number Publication date
CA2658979A1 (en) 2008-01-17
WO2008008911A3 (en) 2008-11-20
MX2009000467A (en) 2009-04-14
WO2008008915A3 (en) 2008-10-09
WO2008008915A2 (en) 2008-01-17
CN101512484A (en) 2009-08-19
KR20090031772A (en) 2009-03-27
WO2008008899A2 (en) 2008-01-17
CN101512484B (en) 2013-12-11
CA2658977A1 (en) 2008-01-17
US9489640B2 (en) 2016-11-08
US20080091087A1 (en) 2008-04-17
EP2038743A2 (en) 2009-03-25
WO2008008899A3 (en) 2008-11-06
NO20090655L (en) 2009-04-02
JP5519278B2 (en) 2014-06-11
US20080091762A1 (en) 2008-04-17
IL196434A0 (en) 2009-09-22
CN103593562A (en) 2014-02-19
EP2038823A4 (en) 2009-08-05
MX2009000469A (en) 2009-05-12
US20190371462A1 (en) 2019-12-05
US20080091451A1 (en) 2008-04-17
KR20090031460A (en) 2009-03-25
US20080109295A1 (en) 2008-05-08
WO2008008905A2 (en) 2008-01-17
JP2009544082A (en) 2009-12-10
BRPI0714294A2 (en) 2013-03-12
JP2009544081A (en) 2009-12-10
KR20090031771A (en) 2009-03-27
CN102855378A (en) 2013-01-02
AU2007272440A1 (en) 2008-01-17
MX2009000468A (en) 2009-05-12
AU2007272434B2 (en) 2014-05-22
WO2008008905A3 (en) 2008-10-30
CN101512472A (en) 2009-08-19
US20080086533A1 (en) 2008-04-10
CA2659277A1 (en) 2008-01-17
IL196433A0 (en) 2009-09-22
EP2037799A4 (en) 2009-08-26
AU2007272444A1 (en) 2008-01-17
JP2009544080A (en) 2009-12-10
AU2007272428A1 (en) 2008-01-17
CA2659244A1 (en) 2008-01-17
HK1155234A1 (en) 2012-05-11
AU2007272442A1 (en) 2008-01-17
US20170039337A1 (en) 2017-02-09
CN103400280A (en) 2013-11-20
EP2038766A4 (en) 2009-08-05
US20230376901A1 (en) 2023-11-23
WO2008008913A2 (en) 2008-01-17
EP2038823A2 (en) 2009-03-25
EP2037799A2 (en) 2009-03-25
NO20090633L (en) 2009-04-14
EP2038743A4 (en) 2009-08-05
CA2659240A1 (en) 2008-01-17
JP5319526B2 (en) 2013-10-16
WO2008008913A3 (en) 2008-05-22
US10387618B2 (en) 2019-08-20
US11741431B2 (en) 2023-08-29
EP2038766A2 (en) 2009-03-25
IL196435A0 (en) 2009-09-22
BRPI0714293A2 (en) 2013-03-12
NO20090634L (en) 2009-04-14
JP5319527B2 (en) 2013-10-16
WO2008008911A2 (en) 2008-01-17
EP2038736A4 (en) 2009-08-19
CN101512575A (en) 2009-08-19
BRPI0714296A2 (en) 2013-03-12
AU2007272434A1 (en) 2008-01-17

Similar Documents

Publication Publication Date Title
US11741431B2 (en) Methods and systems for compliance confirmation and incentives
AU2014202095A1 (en) Methods and systems for compliance confirmation and incentives

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20080331

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA HR MK RS

A4 Supplementary search report drawn up and despatched

Effective date: 20090716

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 19/00 20060101ALI20090710BHEP

Ipc: G06F 3/048 20060101AFI20090212BHEP

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20121018

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20130301