US20120184305A1 - Sensory Enhancement Systems and Methods in Personal Electronic Devices - Google Patents


Info

Publication number
US20120184305A1
Authority
US
Grant status
Application
Prior art keywords
ped, event, remote, peds, associated
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13433861
Inventor
William L. Betts
Carol Betts
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Innovation Specialists LLC
Original Assignee
Innovation Specialists LLC

Classifications

    • G06K7/10366: Sensing record carriers by electromagnetic radiation using wavelengths larger than 0.1 mm, e.g. radio-waves or microwaves, the interrogation device being adapted for miscellaneous applications
    • G01N33/0075: Gas analysers; control unit for multiple spatially distributed sensors, e.g. for environmental monitoring
    • G08B13/1427: Mechanical actuation by lifting or attempted removal of hand-portable articles, with transmitter-receiver for distance detection
    • G08B17/00: Fire alarms; alarms responsive to explosion
    • G08B21/0263: Child monitoring systems; detecting the direction in which the child or item is located
    • G08B21/0275: Child monitoring systems; Electronic Article Surveillance (EAS) tag technology used for parent or child unit, e.g. same transmission technology, magnetic tag, RF tag, RFID
    • G08B21/0286: Child monitoring systems; tampering or removal detection of the child unit from child or article
    • G08B21/0415: Safety alarms based on behaviour analysis; detecting absence of activity per se
    • G08B21/043: Safety alarms based on behaviour analysis; detecting an emergency event, e.g. a fall
    • G08B21/0446: Sensor means worn on the body to detect changes of posture, e.g. a fall, inclination, acceleration, gait
    • G08B21/088: Alarms responsive to the presence of persons in a body of water, by monitoring a device worn by the person, e.g. a bracelet attached to the swimmer
    • G08B21/12: Alarms responsive to undesired emission of substances, e.g. pollution alarms
    • G08B25/009: Relaying alarm signals in order to extend communication range
    • G08B25/016: Personal emergency signalling and security systems
    • G08B29/181: Prevention or correction of operating errors due to failing power supply
    • G10L17/06: Speaker identification or verification; decision making techniques, pattern matching strategies
    • G10L17/26: Recognition of special voice characteristics, e.g. for use in lie detectors; recognition of animal voices
    • H04M1/026: Portable telephone sets; details of the structure or mounting of specific components
    • H04M1/72569: Adapting the functionality or the communication capability of the terminal according to context or environment related information
    • H04M1/7253: Supporting locally a plurality of applications by interfacing with an external accessory using a two-way short-range wireless interface
    • H04M2250/10: Telephonic subscriber devices including a GPS signal receiver
    • H04M2250/12: Telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • H04R3/005: Circuits for combining the signals of two or more microphones
    • H04R2430/20: Processing of the output signals of an acoustic transducer array for obtaining a desired directivity characteristic

Abstract

Disclosed are personal electronic devices (PEDs) having a sensory enhancement (SE) system for monitoring environmental conditions and detecting environmental events, for example but not limited to, changes in acoustic, thermal, optical, electromagnetic, chemical, dynamic, wireless, atmospheric, or biometric conditions. The detection of such events can be used to invoke a notification, an alert, a corrective action, or some other action directed to the PED user or another party, depending upon the implementation.

Description

    CLAIM OF PRIORITY
  • This application is a divisional of application Ser. No. 13/409,220, filed Mar. 1, 2012, which is a divisional of application Ser. No. 13/371,769, filed Feb. 13, 2012, which is a divisional of application Ser. No. 13/005,683, filed Jan. 13, 2011, which is a divisional application of application Ser. No. 11/345,058, filed Feb. 1, 2006, now U.S. Pat. No. 7,872,574, issued Jan. 18, 2011, all of which applications are incorporated herein by reference in their entirety.
  • FIELD OF THE INVENTION
  • The present invention generally relates to sensory enhancement (SE) systems and methods implemented in personal electronic devices (PEDs) for monitoring environmental conditions and detecting environmental events, for example but not limited to, changes in acoustic, thermal, optical, electromagnetic, chemical, dynamic, wireless, atmospheric, or biometric conditions. The detection of such events can be used to invoke a notification, an alert, a corrective action, communication to another device, or some other action, depending upon the implementation.
  • BACKGROUND OF THE INVENTION
  • Humans today live in a complex and rapidly changing environment. Frequently, they utilize and carry or otherwise transport with them one or more personal electronic devices (PEDs) that demand their attention and further increase the complexity of their environment. Personal digital assistants (PDAs), global positioning system (GPS) navigators, portable computers, calculators, digital cameras, hearing aids, radios, tape, CD, DVD, and/or MP3 players, video games, and wireless (e.g., cellular) telephones are good examples of PEDs.
  • The inventors have discovered that the functionality of such PEDs can be expanded to provide very beneficial sensory enhancement to the user with respect to the environment in which the PED is situated, as will be described in detail hereinafter.
  • SUMMARY OF INVENTION
  • The present invention provides various embodiments for sensory enhancement (SE) in a personal electronic device (PED). The present invention provides systems and methods that can sense acoustic, thermal, optical, electromagnetic, chemical, dynamic, wireless, atmospheric, or biometric signals in an environment to which the PED is exposed and generate appropriate notification signals. This sensory enhancement functionality may be implemented in a dedicated PED of its own or may be implemented in virtually any type of PED that performs other functions, for example but not limited to, a personal digital assistant (PDA); GPS navigator; portable computer; calculator; digital camera; hearing aid; radio; tape, CD, DVD, and/or MP3 player; video game; and wireless (e.g., cellular) telephone. The conventional functions of these aforementioned PEDs are called herein “electronic based intelligence functions.” In the preferred embodiments, sensory enhancement functionality can proceed concurrently with the electronic based intelligence functions of the PED.
  • One embodiment of a device for sensory enhancement, among others that are described herein, can be summarized as follows. The device is essentially a PED that can be transported with a user. It comprises a first means for performing a first electronic based intelligence function; and a second means for performing a second electronic based intelligence function. The second means comprises a transducer (or sensor), means for detecting an event in an environment to which the PED is exposed via the transducer, and means for producing a notification upon detection of the event.
  • Another embodiment of a device for sensory enhancement, among others that are described herein, can be summarized as follows. The device is essentially a PED that can be transported with a user. It comprises a means for storing a reference signature, a means for detecting an event in an environment associated with the PED, and a means for producing a notification upon the detecting of the event. In this embodiment, the means for detecting includes a means for sensing a signal in the environment, a means for correlating the signal with the reference signature, and a means for indicating the detecting of the event based upon the correlating.
  • Another embodiment of a device for sensory enhancement, among others that are described herein, can be summarized as follows. In essence, this device includes functionality to permit it to cooperate with and exchange information with other PEDs so that measurement and detection functions can be enhanced. In a sense, a distributed system for sensory enhancement is thereby implemented.
  • Such an embodiment of the distributed system, among others that are described herein, can be summarized as follows: a plurality of PEDs; means for communicating among the plurality of PEDs a selection of a reference signature corresponding to an event to be detected; means for permitting one or more of the PEDs to measure a characteristic of an environment with a transducer associated therewith; means for detecting the event in one or more of the PEDs; and means for generating a notification signal in the one or more PEDs indicating detection of the event. Furthermore, although not necessary for implementation, in the preferred embodiment, the PEDs further include a means for permitting the users to define whether or not their respective PEDs will cooperate and exchange information with others.
  • An embodiment of a method for sensory enhancement, among others that are described herein, can be summarized as follows. The method comprises the steps of: communicating to a PED a selection of a reference signature corresponding to an event to be detected; transporting the PED into an environment; permitting the PED to measure a characteristic of the environment with the transducer associated with the PED; and receiving a signal from the PED indicating detection of the event.
  • Another embodiment of a method for sensory enhancement, among others that are described herein, can be summarized as follows. The method comprises the steps of: providing a plurality of PEDs; communicating among the plurality of PEDs a selection of a reference signature corresponding to an event to be detected; permitting one or more of the PEDs to measure a characteristic of an environment with a transducer associated therewith; detecting the event in one or more of the PEDs; and generating a notification signal in the one or more PEDs indicating detection of the event.
  • Other systems, methods, features, and advantages of the present invention will become apparent to one of skill in the art upon examination of the drawings and detailed description.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present invention. Like reference numerals designate corresponding parts throughout the several views.
  • FIG. 1 is a block diagram of one example implementation of a sensory enhancement system.
  • FIG. 2 is a block diagram of an example implementation of a personal electronic device (PED) having the sensory enhancement system of FIG. 1.
  • FIG. 3 is a block diagram of an example implementation of a control menu for the PED of FIG. 2.
  • FIG. 4 is an example spectrogram graph illustrating measurement of acoustic data in three dimensions (time, frequency and magnitude) that can be analyzed in order to detect an acoustic event.
  • FIG. 5 is an example spectrogram graph illustrating Doppler calculations in connection with measured acoustic data.
  • FIG. 6 is an example spectrogram graph constructed by zero crossing analysis of sub-bands.
  • FIG. 7 is a diagram illustrating cooperative operation of multiple PEDs.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • FIG. 1 is a block diagram of an example implementation of the sensory enhancement (SE) system in accordance with the present invention and is generally denoted by reference numeral 100. As is shown in FIG. 1, the SE system 100 includes one or more input devices 105, such as but not limited to, a computer 120 as shown that can be communicatively coupled to the Internet 110, an audio microphone 130 as shown, etc., for receiving one or more reference signatures that are used to identify environmental events. The input devices 105 can be any transducer for sensing acoustic, thermal, optical, electromagnetic, chemical, dynamic, wireless, atmospheric, or biometric conditions (e.g., a body function, such as blood pressure, body temperature, heart rate, sugar level, heart beat, oxygen level, etc.), for example but not limited to, an audio microphone, video camera, Hall Effect magnetic field detector, flux gate compass, electromagnetic field detector, accelerometer, barometric pressure sensor, thermometer, ionization detector, smoke detector, gaseous detector, radiation detector, biometric sensor, etc.
  • The system 100 further comprises a detection engine 215 that stores the one or more reference signatures that are used to identify environmental events, that correlates sensed environmental signals with the reference signatures, and that detects occurrences of the environmental events. The detection engine 215 can be implemented in hardware, software, or a combination thereof, but is preferably implemented in software executed by a computer based architecture. When designed via software, it can be stored and transported in a computer readable medium. The system 100 further comprises one or more outputs 225, such as but not limited to, as shown, an audio speaker 250, a visual display device 260, a mechanical vibrator 270, etc., for advising of detection of environmental or physiological events.
  • The SE system 100 is designed to be operated in several modes. The architecture of the SE system 100 will be described as each of these modes is described in detail hereafter.
  • In a first mode, a computer 120 is connected to a reference memory array 160 by a switch 150. One or more reference signatures are collected by the computer 120 and loaded into the reference memory array 160.
  • Reference signatures, such as bird calls, a human voice, registered emergency signals (e.g., a police car siren or fire truck siren), etc. can be collected from the Internet 110 or another source by the computer 120.
  • As an example, bird songs can be acquired via download from the U.S. Geological Survey web site at http://www.mbr-pwrc.usgs.gov (Gough, G. A., Sauer, J. R., Miff, M. Patuxent Bird Identification Infocenter. 1998. Version 97.1. Patuxent Wildlife Research Center, Laurel, Md.). For instance, the bird song associated with the Eastern bluebird (Sialia sialis) can be downloaded from this site and is a 4 second, 32 Kbps MPEG Audio Layer-3 recording. Another site that includes .mp3 audio recordings and sonograms of bird songs is http://askabiologist.asu.edu/expstuff/experiments/birdsongs/birds_az.html (Kazilek, C. J. Ask A Biologist web site, Arizona State University, 1997-2004). Sonograms are graphs of frequency versus time and can include a measure of intensity or amplitude by gray scale or color variation. The SE system 100 is designed to transform the audio recordings into suitable numerical arrays for recognition. The frequency range of 0.2 Hz to 20 kHz is sufficient for bird calls and speech recognition applications. Furthermore, a time interval of several seconds is normally sufficient.
  • The preprocessor 170 extracts the reference signals from the reference memory array 160 and reformats them to facilitate rapid correlation. The frequency domain is a preferred format for sonograms. The preprocessor 170 analyzes each signature by a sequence of Fourier transforms taken repeatedly over a period of time corresponding to the duration of the signature. The Fourier transform is preferably a two-dimensional vector, but a single measure of amplitude versus frequency is sufficient. In the preferred embodiment, the SE system 100 processes a 3-dimensional array of amplitude, frequency, and time. The transformed signature arrays are stored back into a reference memory array 160 for subsequent rapid correlation. Preferably, each reference signature array includes an identifier field associated with the signature. As an example, for a bird song identification, this may be the name and picture/image of the bird associated with the signature. Or, in the case of emergency signals, the identifier can simply be an indication of the type of emergency. Furthermore, the emergency identifier can also indicate an appropriate evasive or corrective action.
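A minimal Python sketch of this preprocessing step follows; the frame length, hop size, and sample rate are illustrative choices, not values specified in the patent:

```python
import numpy as np

def spectrogram(signal, frame_len=256, hop=128):
    """Repeated Fourier transforms over the signal's duration:
    each row is one windowed FFT frame, so the array as a whole
    carries the amplitude/frequency/time information that the
    preprocessor stores for correlation."""
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len] * np.hanning(frame_len)
        frames.append(np.abs(np.fft.rfft(frame)))
    return np.array(frames)

# A pure 1 kHz tone should concentrate energy in a single frequency bin.
fs = 8000.0
t = np.arange(0, 1.0, 1.0 / fs)
tone = np.sin(2 * np.pi * 1000.0 * t)
spec = spectrogram(tone)
peak_bin = int(np.argmax(spec[0]))   # bin index -> Hz: bin * fs / frame_len
```

With these assumed parameters, the 1 kHz tone lands in bin 32 (1000 × 256 / 8000), illustrating how a signature's spectral shape becomes a numerical array suitable for rapid correlation.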
  • In a second mode of operation, system 100 can acquire the reference signature signal directly from the local environment via the audio microphone 130. Audio signals from the microphone 130 are amplified and converted to digital signals by amplifier and analog-to-digital converter (ADC) 140. The digital signal from amplifier and ADC 140 is selected by the user via the switch 150 and loaded directly into the reference memory array 160. Preferably, several seconds of signal are collected in this particular application. Then, the preprocessor 170 reformats the reference signal for rapid correlation, preferably by Fourier transform.
  • A gain control 141 associated with the ADC 140 can be controlled by the user to control the range of the microphone 130 (or another input device, if applicable, and depending upon the application).
  • In a third mode of operation, the SE system 100 monitors the environment continuously (at discrete successive short time intervals due to the computer-based architecture) for signals that match those stored in the reference memory array 160. To reduce computational burden, the preprocessor 170 is designed to monitor the microphone 130 for a preset threshold level of signal before beginning the correlation process. When the signal exceeds the preset threshold level, the preprocessor 170 begins executing a Fourier transform. After several seconds or a period equal to the period of the reference signatures, the transformed active signal is stored at the output of the preprocessor 170. Then, array addressing logic 180 begins selecting one reference signature at a time for correlation. Each reference signature is correlated by a correlator 190 with the active signal to determine if the reference signature matches the active signal from the environment.
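The level gate that spares the correlator can be sketched as a simple RMS check; the -30 dB default below is an assumed figure, not a value from the patent:

```python
import numpy as np

def exceeds_threshold(frame, threshold_db=-30.0):
    """Cheap RMS level gate: only when this returns True does the
    preprocessor go on to compute the comparatively expensive
    Fourier transform. threshold_db is relative to full scale."""
    rms = np.sqrt(np.mean(frame ** 2))
    return 20 * np.log10(max(rms, 1e-12)) > threshold_db

quiet = np.zeros(256)          # ambient silence: gate stays closed
loud = 0.5 * np.ones(256)      # strong signal: gate opens
```

Gating on raw sample energy this way keeps the monitoring loop inexpensive during long stretches of silence.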
  • The comparator 200 compares the magnitude of the output of the correlator 190 with a threshold to determine a match. When searching for events in the active signal, such as emergency signals, the correlator 190 is compared with a fixed threshold. In this case, the switch 210 selects a fixed threshold 211 for comparison. If the correlation magnitude exceeds the fixed threshold 211, then the comparator 200 has detected a match. The comparator 200 then activates the correlation identifier register 220 and the correlation magnitude register 230. The magnitude of the comparison result is stored in the correlation magnitude register 230, and the identity of the source is stored in the correlation identifier register 220. For emergency events, an immediate alert signal may be given. This may be an audible signal via a speaker 250, a visual signal via a display 260, a vibration signal via vibrator 270, or some other signal that can be communicated to a user of the SE system 100.
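The correlator/comparator/register interplay described above can be sketched as follows; the normalized-correlation measure, the 0.8 threshold, and the reference names are illustrative assumptions:

```python
import numpy as np

def correlate(active, reference):
    """Normalized correlation of two equally shaped spectrogram
    arrays; 1.0 indicates a perfect match."""
    a = active.ravel() - active.mean()
    r = reference.ravel() - reference.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(r)
    return float(np.dot(a, r) / denom) if denom else 0.0

def detect(active, references, fixed_threshold=0.8):
    """Sequential search over stored references: the first correlation
    magnitude exceeding the fixed threshold fills the identifier and
    magnitude 'registers' and stops the search."""
    for identifier, reference in references.items():
        magnitude = correlate(active, reference)
        if magnitude > fixed_threshold:
            return identifier, magnitude
    return None, 0.0

rng = np.random.default_rng(0)
refs = {"police siren": rng.standard_normal((10, 64)),
        "eastern bluebird": rng.standard_normal((10, 64))}
# A noisy observation of the siren signature should still be detected.
observed = refs["police siren"] + 0.05 * rng.standard_normal((10, 64))
label, magnitude = detect(observed, refs)
```

The returned pair plays the role of the correlation identifier register 220 and correlation magnitude register 230.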
  • The fixed threshold 211 can be predefined by a programmer or the user of the system 100.
  • Noise canceling technology is available to improve resolution. Noise canceling microphones or microphone arrays can be used to cancel ambient noise and better detect events. The noise canceling technology can be implemented in software in the detection engine 215, such as in or in association with the preprocessor 170.
  • Speaker 250 may be a conventional audio speaker or a more sophisticated audio device. For example, a pair of stereo headphones can be used instead of the speaker 250 so that the location of the detected event can be projected by way of the dual stereo channels associated with the stereo headphones. More specifically, assume that two input microphones 130 are employed so that the direction of an event can be determined via different event signal intensities at the two microphones 130. If an emergency signal is detected from the left, then a notification signal could be played on the left stereo channel so that the user knows that the event occurred on the left. This technique can be used within a noisy or sound suppressing vehicle to relay sounds detected by external microphones to internal stereo speakers. Moreover, a map and/or directional arrow can be used in display 260 to present the location or direction of the detected event.
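At its simplest, the left/right determination reduces to comparing the two channel levels; a sketch follows, where the 3 dB margin is an assumption (a real implementation would also exploit inter-channel time delay):

```python
import numpy as np

def direction_from_levels(left, right, margin_db=3.0):
    """Crude bearing estimate from the level difference between two
    microphone channels: report the louder side, or 'center' when the
    channels are within margin_db of each other."""
    def rms_db(x):
        return 20 * np.log10(max(np.sqrt(np.mean(np.square(x))), 1e-12))
    diff = rms_db(left) - rms_db(right)
    if diff > margin_db:
        return "left"
    if diff < -margin_db:
        return "right"
    return "center"

event = np.sin(np.linspace(0, 100, 1000))  # synthetic detected event
```

The returned label would then select the stereo channel (or directional arrow) used for the notification.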
  • After event detection by the SE system 100, the process is stopped and the array addressing logic 180 is reset. A search for new active signals then resumes.
  • In some embodiments of the SE system 100, the SE system 100 may be designed to communicate a notification to a remote communications device in order to advise a remote party of detection of an event. Examples include a text message, an email, a voice message, etc.
  • In a fourth mode of operation, the SE system 100 searches for the best match for the active signal. In this case, the correlation magnitude register 230 is first cleared. Then, the switch 210 selects the output 212 of the correlation magnitude register 230 as the threshold input to the comparator 200. The array addressing logic 180 then sequentially selects all stored references of a set for correlation. After each reference in the set is correlated, the comparator 200 compares the result with previous correlations stored in the correlation magnitude register 230. If the new correlation magnitude is higher, then the new correlation magnitude is loaded into the correlation magnitude register 230, and the respective identifier is loaded into the correlation identifier register 220.
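The fourth mode's register-based search can be sketched as below. The normalized dot-product correlation and the example signatures are illustrative assumptions standing in for the correlator 190 and the reference memory array 160:

```python
def best_match(active, references):
    """Sequentially correlate an active signature against a set of stored
    references, keeping the running best in two 'registers' (magnitude
    and identifier), as in the fourth mode of operation.  Signatures are
    plain sample lists; correlation here is a normalized dot product
    (an assumption for illustration)."""
    def correlate(a, b):
        num = sum(x * y for x, y in zip(a, b))
        den = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
        return abs(num / den) if den else 0.0

    magnitude_register = 0.0      # cleared before the search begins
    identifier_register = None
    for identifier, reference in references.items():
        m = correlate(active, reference)
        if m > magnitude_register:   # comparator uses the register as its threshold
            magnitude_register = m
            identifier_register = identifier
    return identifier_register, magnitude_register

refs = {"siren": [1.0, -1.0, 1.0, -1.0], "horn": [1.0, 1.0, 1.0, 1.0]}
ident, mag = best_match([0.9, -1.1, 1.0, -0.9], refs)
print(ident)  # → siren
```

When the loop finishes, the identifier register holds the best match, mirroring the state of registers 220 and 230 after the sequential search.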
  • In an alternative embodiment, the correlation process can be performed by an associative process, where the active reference is associated directly with the stored references in a parallel operation that is faster than the sequential operation. New device technologies may enable associative processing. For example, reference memory array 160 can utilize content addressable memory devices for associative processing. ASIC devices, such as the Texas Instruments TNETX3151 Ethernet switch, incorporate content addressable memory. U.S. Pat. No. 5,216,541, titled “Optical Associative Identifier with Real Time Joint Transform Correlator,” which is incorporated herein by reference, describes optical associative correlation.
  • In a second alternative embodiment, multiple correlators can be used to simultaneously correlate multiple reference signatures. Each stored reference can have a dedicated correlator, or several correlators can each process their own sets of stored references. Multiple SE systems 100 can perform correlations with their individual sets of stored references and communicate shared results. Dispersed portable PEDs having the SE systems 100 can sense over a wider geographical range and increase effective processing speed.
  • This correlation process continues until all stored reference signatures in the set under analysis have been correlated. When the correlation process is completed, the correlation identifier register 220 holds the best match of the identity of the source of the active signal. The identity can be displayed as a photo or text description in display 260 or as a verbal announcement via amplifier 240 and speaker 250. If the final correlation magnitude is lower than a predetermined threshold, then the active signature can be loaded into the reference memory array 160 as a new unknown source.
  • In a fifth mode of operation, the SE system 100 can attempt to identify unknown sources. Switch 150 is connected to the computer 120 for access to the Internet 110. The computer 120 then searches the Internet 110 for additional references using, for example, a Web browser associated with the computer 120. The references are downloaded and stored in the reference memory array 160. The unknown source is correlated with the new additional references until a match is found.
  • The computer 120 can be configured to browse for reference signatures at known World Wide Web (WWW) sites that have such signatures. Furthermore, in accordance with another aspect of the present invention, a server having a database of reference signatures can be constructed, deployed, and consulted by the computer 120. Such a configuration is desirable because the format of the reference signatures stored in the server database would be known by the computer 120, making access and analysis of the signatures straightforward. Moreover, as a novel business method, the user of the system 100 could be charged for access to the reference signatures in the database by the system owner/operator.
  • FIG. 2 is a block diagram of an example implementation of a portable PED 300 having the SE system 100. The PED 300 can be designed to implement only one electronic based intelligence function, i.e., the SE system 100. However, in the preferred embodiment, the PED 300 is designed with the SE system 100 and at least one other electronic based intelligence function. In general, the PED 300 of the preferred embodiment is implemented by storing suitable SE software (that implements the SE system 100) in a conventional computer-architecture-based PED, such as a wireless (e.g., cellular) telephone or PDA with wireless telephone capability.
  • A wireless telephone implementation is particularly convenient for acoustic SE, because wireless telephones incorporate a microphone for detection and a speaker for output. Many contemporary wireless telephones incorporate speech recognition software for dialing by voice command. This recognition software can be augmented to provide additional SE capabilities. The speech recognition capability typically includes a learning function whereby the user first enunciates the command while in a special learning mode. This learned command is then stored for later reference, typically with respect to a telephone number. All potential commands are recorded in this manner and stored for reference. Then, in normal operation, when the user enunciates a command, that command is compared with all stored reference commands. The reference that most closely matches the command is used to select and dial the respective phone number.
  • The acoustic SE system 100 recognizes a much broader set of signals beyond the speech recognized as dial commands. The acoustic SE system 100 stores additional reference signals for recognition. These additional reference signals can be recorded directly by the SE system 100. Or, preferably, these signals can be obtained as files downloaded from a central repository. Examples include a set of bird songs or a set of registered emergency signals.
  • Other signals may be computationally derived, such as the Doppler shift of passing vehicles or projectiles. The magnitude of Doppler shift gives the relative speed, and the rate of change of the Doppler shift gives the proximity or closest approach of the vehicle or projectile. Note that only one sensor, or transducer, is needed for determining proximity and speed of an object, whereas the determination of direction would typically require the use of two or more sensors.
  • Personal Equipment
  • In architecture, as illustrated in FIG. 2, the PED 300 generally comprises an operator interface 320, a baseband subsystem 340, and an RF subsystem 370.
  • The operator interface 320 allows the operator to communicate with the baseband subsystem 340. The operator interface 320 incorporates an audio speaker 250, a vibrator 270, a display 260, a keyboard (or dialpad) 328, and an audio microphone 130. The keyboard 328 is used by the operator to generally control the PED 300. Commands or telephone numbers can be entered on the keyboard 328. A display 260 presents the status of the PED 300 to the operator. Speech signals for communications and alert signals for SE are generated digitally in the baseband subsystem 340 and sent to digital-to-analog converter (DAC) 322. The DAC 322 converts digital signals from the baseband subsystem 340 into analog signals to drive the speaker 250. The speaker 250 presents alarm and alert signals as well as received speech signals. Microphone 130 converts acoustic signals into analog input signals for detection by the SE system 100 or for transmission as speech by the PED 300. Analog signals from microphone 130 are converted to digital signals by an ADC 332 for input to the baseband subsystem 340. Speaker 250 and microphone 130 can be stereo devices for the detection and indication of the relative bearing of detected events. Multi-dimensional devices will provide better 3-dimensional position information and improved rejection of ambient noise. Speaker 250 may include a very low frequency mode. A mechanical vibrator 270 can give a mechanical alert signal, if desired.
  • Baseband subsystem 340 implements control and baseband signal processing functions of the PED 300. For communications functions, the baseband signal processing includes speech recognition, speech compression/decompression, error detection/correction, filtering, and baseband modulation/demodulation. For SE, the baseband signal processing functions include preprocessing, signature array computations, correlation, and detection. Advantageously, when not actively serving for communications, the entire baseband subsystem 340 can be devoted to SE. At least one exception is concurrent emergency signal detection that may be necessary to alert the operator whose attention has been diverted by conversations facilitated by the PED 300.
  • Baseband subsystem 340 comprises a general purpose microprocessor 368, a memory 350, a digital signal processor (DSP) 360, and other components interconnected by a local interface, which in the preferred embodiment, is a digital communications bus 342. Digital communications bus 342 may be a single bidirectional bus or multiple busses. The operator interface 320 connects directly to the digital communication bus 342. Those skilled in the art will recognize that components of the operator interface 320 and RF subsystem 370 may alternatively be connected to specific interface circuitry that connects to the digital communications bus 342 or that connects to other components, such as the microprocessor 368. An interface alternative is direct memory access (DMA) to transfer data directly into memory 350 or into memory arrays internal to microprocessor 368 or DSP 360.
  • Examples of dual core processors that can be used in the PED 300 to implement the DSP 360 include, but are not limited to, the IBM Power5 multi-chipped processor and the Texas Instruments TMS320C6416 family of digital signal processors. The Texas Instruments TCS1110 chipset is typically used for GSM cell phone handsets. It includes the TBB1110, a dual-core digital baseband processor with both VCP Viterbi decoder and TCP Turbo decoder coprocessors for error correction. Moreover, the Texas Instruments TRF6150 tri-band direct-conversion RF transceiver can implement the RF subsystem 370. GSM is a digital cellular telecommunications system standard as specified in technical specifications such as ETSI TS 101 855.
  • Microprocessor 368 controls the PED 300 in response to execution of software program instructions stored in memory 350. Software program instructions can be executed directly from memory 350 via bus 342 or batch transferred to memory that is internal to microprocessor 368 or DSP 360 for execution. Microprocessor 368 and DSP 360 may be a single device comprising multiple microprocessors, DSPs, and memory devices. DSP devices typically contain multiple functional units including memory, a generalized DSP, and multiple specialized pre-programmed DSPs or logic units for implementing features, such as Fourier transformation and Reed Solomon error correction. System-on-a-chip (SOC) and system-in-a-package (SIP) technologies provide for multiple processors and multiple technologies. Multiple technologies allow for very sensitive environmental detectors and communications receivers as well as high power technology for communications transmitters. Examples include the IBM Power5 multi-chipped processor and the TI C6X family of digital signal processors.
  • As mentioned, for SE, the baseband signal processing functions include preprocessing, signature array computations, correlation, and detection. These functions can be implemented by the detection engine 215, which in this embodiment, is in the form of software stored in the memory 350 and executed by the microprocessor 368 and/or the DSP 360.
  • In the preferred embodiment, the microprocessor 368 implements low duty cycle control functions, such as accessing a local list of telephone numbers, call setup, implementation of communications protocols, and general initialization and control of the operator interface 320 and RF subsystem 370. Control commands are transferred from microprocessor 368 to control signals block 366 via bus 342. Control signals block 366 generates signals to the RF subsystem 370 to control frequency synthesis, radiated power, receiver sensitivity, antenna array pointing, initialization, and other communications parameters. Control signals block 366 can be used to pre-program coefficients of multiple input multiple output (MIMO) processors within the RF subsystem 370. Coefficients can be generated at a low duty cycle in the baseband subsystem to offload processing in the RF subsystem 370.
  • Microprocessor 368 can also access the Internet 110 by wireless connections through the RF subsystem 370. Direct internet access facilitates collection of reference signatures for SE.
  • DSP 360 performs the complex baseband signal processing operations. These typically involve complex array processing and very high speed arithmetic operations.
  • DSP 360 can also perform the control functions of the microprocessor 368. However, it is generally more economical to utilize the independent microprocessor 368 for control functions.
  • In addition to the microphone 130, one or more additional environmental sensors 348 (or transducers) may be implemented to monitor the environment and transfer digital replicas of detected events to bus 342 for analysis and action by DSP 360 and microprocessor 368. Sensors 348 may include, for example but not limited to, a microphone, video camera, Hall Effect magnetic field detector, flux gate compass, electromagnetic field detector, accelerometer, barometric pressure sensor, thermometer, ionization detector, smoke detector, gaseous detector, radiation detector, biometric sensor, etc. The set of sensors 348 is optionally provisioned, as needed, to minimize cost. For example, accelerometers in the device can warn of impending falls. Web site http://link.abpi.net/l.php?20050822A7 discusses a balance device that utilizes a stereo warning of sway.
  • The RF subsystem 370 handles signals that are at radio frequencies, which are those that cannot be economically processed by the baseband subsystem 340. Techniques, such as heterodyning, can be used to shift the economical threshold for specific implementations.
  • In an alternative embodiment, the RF subsystem 370 can be designed to utilize additional frequency bands to detect and access wireless data being transmitted in the environment, for example, signals communicated pursuant to the Bluetooth IEEE 802.15.1 communication protocol, the 802.11 communication protocol, etc. External equipment can provide an alert or other information to the system 300.
  • In the preferred embodiment of the system 300, the system 300 wirelessly accesses the Internet 110 via the RF subsystem 370 for updating an address book, for obtaining updates of software, and for acquiring reference signatures for the SE functions.
  • In another alternative embodiment, the RF subsystem 370 can be augmented to interrogate radio frequency identification (RFID) tags. As RFID becomes more common, the ability to interrogate and read these devices will become essential and provide significant SE. RFID business cards can be read directly to load the address book of the PED 300, thereby avoiding spelling and transposition errors.
  • As further illustrated in FIG. 2, a DAC 346 converts digital signals from the bus 342 to analog signals for modulation by a modulator 378. The modulated signals are coupled via a diplexer 382 to an antenna 380. Received signals are coupled from the antenna 380 to the diplexer 382, then to demodulator 372 for demodulation. Analog demodulated signals are converted to digital signals by an ADC 344 and transferred to the bus 342 for final decoding in the baseband subsystem 340. Those skilled in the art will recognize that DAC 346 and ADC 344 can be located at various points within modulator 378 and demodulator 372. As shown, modulation and demodulation are predominantly analog functions, but contemporary designs implement these functions in the digital domain. A significant portion of the modulation and demodulation functions can be implemented in DSP 360 or other DSP elements within the modulator 378 or demodulator 372.
  • In an alternative embodiment, the antenna 380 may be implemented as a single antenna, multiple antennas, or an antenna array. Diplexer 382 may not be required if independent antennas are used to transmit and receive. Fractal antennas may cover a much wider frequency range allowing operation in multiple frequency bands. Antenna arrays are beneficial for beam forming to enhance signals or to reject interfering signals. Antenna beams offer additional directional information that may be useful in locating the signal source. Display 260 can present a directional arrow indicating the direction to a signal source located by automatic beam steering.
  • The GPS receiver 374 is another optional element. GPS receiver 374 receives position information from global positioning system satellites via an antenna 376. The position information is transferred directly to the baseband subsystem 340 for processing. The GPS receiver 374 can use the independent antenna 376 or share the common antenna 380. U.S. Pat. No. 6,975,277, titled “Wireless communications device pseudo-fractal antenna,” which is incorporated herein by reference, describes an antenna for operating in the GPS and cellular telephone bands, and such antenna can be implemented in the PED 300. Many of the GPS functions, such as coordinate transformation, can be implemented in GPS receiver 374 or DSP 360. Position information from the GPS receiver 374 can be used to alert the operator of proximity to various locations, including those that are hazardous or dangerous. GPS receiver 374 can provide dynamic inputs of speed, direction, and distance traveled to the SE system 100.
  • Operator Interface
  • FIG. 3 illustrates an example of a set of control screen menus 400 that can be used to control the SE system 100 (FIG. 1) associated with the PED 300 (FIG. 2). The screens represent one possible implementation that could be realized in a typical cell phone communications device, such as the commercially available Motorola V60t cell phone. These menus are accessed and displayed through keyboard 328 (FIG. 2) and display 260 (FIG. 2).
  • The menu access begins by activating the PED 300 and depressing the MENU key 410 or enunciating a voice command into microphone 130 (FIG. 2). This activates a new MENU screen 420 which lists a number of optional commands. To place a conventional phone call, the DIAL command is selected to open the DIAL menu 432.
  • This selection causes display of an alphabetical list of names associated with phone numbers stored in the phone memory 350 (FIG. 2). After selecting the desired name, a call is placed to the respective phone number.
  • Voice commands are implemented by pressing a voice command key, then enunciating the command, such as “name dial” or “number dial” into microphone 130 (FIG. 2). Speaker 250 (FIG. 2) is then used to issue guidance instructions, such as “say the name”. The operator then enunciates the name into microphone 130, the name is repeated via speaker 250, and if confirmed by the operator, then the call is placed. These voice commands can be used to step through the entire control menu 400. The main menu is accessed by selecting the SENSORY command in the MENU screen 420. This opens the SENSORY screen 430. SENSORY screen 430 allows selection of one or more sensory modes (preferably multiple, in this example implementation) that can be active simultaneously. In this implementation, the sensory modes include acoustic, optical, thermal, chemical, electromagnetic, atmospheric, biometric, dynamic, and wireless (corresponding to the types of sensors that are associated with the PED 300). By way of example, a few of these are discussed to clarify the operation of the PED 300.
  • Selection of the ACOUSTIC command in the SENSORY screen 430 activates the ACOUSTIC screen 440. ACOUSTIC screen 440 may have a large number of choices; only four are shown for exemplary purposes. The RECORD selection of ACOUSTIC screen 440 will activate the RECORD screen 450. This screen enables at least three commands: (1) to start recording an acoustic signature, (2) to stop recording the signature, and (3) to label the signature. The label could be typed on keyboard 328 (FIG. 2) or spoken into microphone 130 (FIG. 2). Camera phones can use a photograph of the source for a label. The label is an identifier that can be used by the correlation identifier register 220 of FIG. 1. A number of sub menus (not shown) can be used to enhance recording. The sensitivity of the microphone 130 can be adjusted. An indicator lamp or sound level meter can be displayed in display 260 (FIG. 2) to provide an indication to the operator when an acoustic signal has been detected with suitable quality for recording. The operator can initiate recording when suitable quality is indicated.
  • A second choice in the ACOUSTIC screen 440 opens the IDENTIFY screen 452. The IDENTIFY screen 452 enables a number of choices for identification of acoustic signals. The IMMEDIATE command initiates a search to identify the audio signals currently detected by the microphone 130. All signatures within reference memory array 160 of FIG. 1 are searched. If a match is found, then the identity of the matching reference will be loaded into correlation identifier register 220 (FIG. 1) and displayed on display 260 (FIG. 2) or announced via speaker 250 (FIG. 2). The third command in the IDENTIFY screen 452 is for warnings. This opens the acoustic WARNINGS screen 460. Two of several possible warning commands are shown in WARNINGS screen 460. The DECIBELS command will enable a warning if the sound pressure in the vicinity exceeds a safe threshold as measured in decibels. The threshold can be set by the operator. This warning offers protection when the user enters an area of dangerous sound pressure levels. The PROXIMITY command in WARNINGS screen 460 activates the proximity detection system to monitor Doppler shifted acoustic signals and warn of objects passing nearby. Speed and distance are measured and displayed with selectable warning thresholds. The IDENTIFY screen 452 also offers a PROXIMITY command that will issue a warning when the GPS measured position approaches within a selectable range of locations, such as but not limited to, dangerous locations, stored in memory 350.
  • The LOCATE command of the ACOUSTIC screen 440 activates the LOCATE screen 454, which is used to locate the position of the source of detected acoustic signals. The DIRECTIONAL MICROPHONE command of LOCATE screen 454 will activate directional microphones 130 (two or more are needed to determine direction) that can identify the direction to the source of the acoustic signals by measuring the relative phase of the acoustic wave front as it passes over the device. Optionally, additional microphones 130 can be placed at some distance away from the PED 300 to give better resolution of range. These can be wired to the device or communicate via wireless signals, such as those specified in IEEE wireless standard 802.11. This DIRECTIONAL MICROPHONE command can also be used to initialize the sensors. Initialization may require leveling the device and rotating it to align a Hall Effect magnetic compass within the device. The COOPERATIVE DEVICES command of the LOCATE screen 454 is used to coordinate multiple PEDs 300 to determine location. This command opens the COOPERATIVE DEVICES screen 462 which is used to control cooperative operation. The VOLUNTEER command allows the operator to volunteer the PED 300 for cooperative operation with other PEDs 300 in the area. A volunteer signal will be sent to other PEDs 300 identifying the PED 300, its location, and the sensors that are available. The volunteer signal will be sent when first selected and again whenever queried by another PED 300 that is searching for cooperative partners. The MEMBERS command opens the MEMBERS screen 472, which lists the names or phone numbers of nearby devices to be selected as members of the coordination team. The REFERENCE command selects one or more reference signatures that are used to identify the selected environmental event. The reference signatures are transmitted to all of the PEDs 300 participating in the coordination team.
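The relative-phase direction finding described above reduces to a time-difference-of-arrival calculation under a far-field assumption: a wavefront arriving at angle B from broadside reaches one microphone earlier by d·sin(B)/c seconds, where d is the microphone spacing and c the speed of sound. A minimal sketch (spacing, speed of sound, and function names are illustrative assumptions):

```python
import math

def bearing_from_tdoa(delay_s, mic_spacing_m, speed_of_sound=343.0):
    """Estimate the bearing (degrees from broadside) to an acoustic
    source from the arrival-time difference at two microphones,
    assuming a plane wavefront (far-field source)."""
    x = speed_of_sound * delay_s / mic_spacing_m
    x = max(-1.0, min(1.0, x))          # clamp numerical overshoot
    return math.degrees(math.asin(x))

# 0.292 ms delay across a 0.2 m baseline -> roughly 30 degrees off broadside
print(round(bearing_from_tdoa(0.000292, 0.2)))  # → 30
```

Wider baselines (the additional microphones placed away from the PED 300) improve angular resolution because the same bearing produces a larger, easier-to-measure delay.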
  • The BIOMETRIC command of SENSORY screen 430 activates the biometric screen 442. The BIOMETRIC screen 442 has check boxes that are selected to activate various biometric monitors for pulse rate, oxygen level, blood pressure, temperature, intoxication, pedometer, and sway. Functions such as the pedometer and sway can be measured directly by internal accelerometers. The GPS receiver 374 (FIG. 2) can be used to calibrate the walking gait automatically or to directly measure the distance traveled. Oxygen level can be measured by folding an appendage sensor of the PED 300 around a finger, allowing an internal illuminator and detector to measure blood oxygen levels. The other biometric parameters have corresponding biometric sensors communicatively coupled to the PED 300 by physical wires or wireless signals. Selection of any biometric parameters in BIOMETRIC screen 442 will open the MONITOR screen 456 where independent thresholds can be set for warnings on each parameter. The parameter values can be continuously displayed on display 260 or announced on speaker 250. Audio announcements can be issued when values change, when limits are exceeded, or periodically. A running chronological record of the parameters can be maintained in memory 350 (FIG. 2). Parameters can be recorded in files with respect to a real time reference derived from GPS receiver 374. Recorded parameter files can be recalled later for display as a graph on display 260 or communicated to a central repository or other device by physical wires or wireless signals.
  • The WIRELESS command of SENSORY screen 430 activates the WIRELESS screen 444. The SEARCH command of WIRELESS screen 444 initiates a search for wireless signals. Wireless signals may be long range, such as weather warnings. Others may be issued by nearby equipment. Dangerous heavy equipment can be modified to generate wireless signals, such as for example but not limited to, those specified in IEEE wireless standard 802.11b or 802.11g. These signals can warn of the nearby equipment and issue detailed instructions to be followed when in close proximity to the equipment.
  • A wireless signal could be used to warn against cell phone use and shut down the cell phone after an adequate warning period for the conversation to be politely terminated. U.S. Pat. No. 6,943,667, which is incorporated herein by reference, describes a method for waking a device in response to wireless network activity and presents a method for determining if a wireless signal is from a known source. The foregoing methods can be implemented in the PED 300 so that the PED 300 can detect and identify wireless network activity. Furthermore, U.S. Pat. No. 6,222,458, which is incorporated herein by reference, describes an automatic cell phone detection system/method at a combustible delivery station that provides for turning off a pump when a cell phone is detected. Such a system/method can be implemented in the PED 300 so that the PED 300 can turn off its corresponding transmitter when in close proximity to a combustible or explosive environment. The CHEMICAL command of SENSORY screen 430 can be used to detect combustible, explosive, or toxic environments as well as combustion products of smoke and carbon monoxide.
  • The menu screens preferably include redundancy, allowing the user to activate specific detectors from several different screens to fit the preferences of the user. The ATMOSPHERIC command of SENSORY screen 430 can be used to detect a range of atmospheric conditions including but not limited to temperature, barometric pressure, humidity, precipitation, lightning, tornadoes, wind speed, wind direction, dew point, fog, smoke, gaseous vapors, airborne particulates, airborne pathogens, sound pressure, solar intensity, radiation, etc. A different set of these parameters can be selected by the user for outdoor activity or in confined, possibly contaminated areas. U.S. Pat. No. 6,232,882, titled “Warning System and Method for Detection of Tornadoes,” which is incorporated herein by reference, describes a system and method for detecting and differentiating between lightning strikes and tornado generated electromagnetic signals. Such system and method can be implemented in the PED 300 of the present invention.
  • Spectrogram Example
  • FIG. 4 is one nonlimiting example of a spectrogram as may be presented in a printed document. In this example, the abscissa (x-axis) is frequency in Hertz (Hz) and the ordinate (y-axis) is time in seconds. This plane of the graph depicts changes in frequency with respect to time. Any acoustic source will generate multiple frequencies, and all are shown in the spectrogram. A third dimension, the magnitude of each frequency, is displayed by variations in the intensity or darkness of each plotted point. For calculations and correlation, this same information is stored in reference memory array 160 as a three dimensional array representing time, frequency, and magnitude.
  • U.S. Pat. No. 6,173,074, titled “Acoustic Signature Recognition and Identification,” which is incorporated herein by reference, describes a system and process for performing such calculations and correlation that can be implemented in the SE system 100. In essence, the system and process use a Fast Fourier Transform (FFT) to compute the spectrogram image of frequency versus time, which is then used to identify machinery.
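A minimal version of such an FFT-based spectrogram might look like the following. The frame length, hop size, and Hann window are illustrative choices, not details taken from the referenced patent:

```python
import numpy as np

def spectrogram(signal, frame_len=256, hop=128):
    """Compute a magnitude spectrogram (time x frequency x magnitude)
    by taking an FFT over successive windowed frames.  Returns a 2-D
    array: rows are time frames, columns are frequency bins."""
    window = np.hanning(frame_len)
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len] * window
        frames.append(np.abs(np.fft.rfft(frame)))
    return np.array(frames)

# 200 Hz tone sampled at 8 kHz: energy peaks near bin 200 / (8000/256) ≈ 6
fs = 8000
t = np.arange(fs) / fs
spec = spectrogram(np.sin(2 * np.pi * 200 * t))
print(int(spec[0].argmax()))  # → 6
```

Stacking the per-frame magnitude rows yields exactly the time/frequency/magnitude array that the reference memory array 160 is described as storing.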
  • Doppler Calculations
  • Doppler frequency calculations are well known in the art. Doppler frequency shift of acoustic or electromagnetic waves occurs when the source of a signal is in motion with respect to the observer. The frequencies of signals emanating from an approaching object are shifted up to higher frequency in direct proportion to the relative speed. When the object passes its point of closest approach and begins to recede, then the signal will be shifted to lower frequency as shown in FIG. 5. The frequency at the point of closest approach is the true frequency of the signal. This true frequency, f_t, can be computed as the average of the original approach frequency, f_a, and the final departure frequency, f_r. One half of the difference between the original approach frequency and the final departure frequency indicates the Doppler frequency shift, f_d, which is used to estimate the speed of the object, s_a, from the known propagation speed of the wave, s_p.

  • f_t = (f_a + f_r) / 2

  • f_d = (f_a − f_r) / 2

  • s_a = s_p · f_d / f_t
  • FIG. 5 is an example spectrogram of an object traveling at 110 ft/s and passing at two different ranges of 500 ft and 100 ft. For this example, the audio noise emanating from the object is 200 Hz corresponding to reciprocating equipment running at 12,000 rotations per minute (rpm). For illustrative purposes, the actual Doppler frequency shift is derived from the spectrogram for passage at 500 ft and plotted at the bottom of FIG. 5. The equations above yield an estimated true frequency of 200 Hz, an estimated Doppler frequency shift of 19.96 Hz, and an estimated speed of 109.8 ft/s. If a known frequency is emanating from the object, then the Doppler shift and speed can be computed on first approach. If the frequency is unknown, then it is best to wait for departure and estimate the true frequency as outlined above using the broadest possible frequency spread. The Doppler frequency shift and corresponding range can be underestimated for objects that pass far away.
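The worked example above can be reproduced directly from the three equations. The propagation speed of 1100 ft/s for sound is an assumption consistent with the patent's numbers (1100 × 19.96 / 200 ≈ 109.8 ft/s):

```python
def doppler_speed(f_approach, f_depart, wave_speed):
    """Estimate an object's speed from the approach and departure
    frequencies of a signal it emits, using the relations
    f_t = (f_a + f_r)/2, f_d = (f_a - f_r)/2, s_a = s_p * f_d / f_t."""
    f_true = (f_approach + f_depart) / 2
    f_shift = (f_approach - f_depart) / 2
    return f_true, f_shift, wave_speed * f_shift / f_true

# Approach/departure frequencies chosen to match the text's example:
# a 200 Hz source shifted by +/-19.96 Hz, sound speed taken as 1100 ft/s.
f_t, f_d, s_a = doppler_speed(219.96, 180.04, 1100.0)
print(round(f_t, 2), round(f_d, 2), round(s_a, 2))  # → 200.0 19.96 109.78
```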
  • The rate of change in frequency indicates the distance of closest approach, D. The apparent frequency will change as a sinusoidal function of the bearing to the passing object. The bearing B relative to a zero degree angle at closest approach can be computed as a function of this apparent frequency f.

  • B = arcsine((f − f_t)/f_d)
  • The rate of change is computed by measuring the time T required for a predetermined frequency shift. The distance run, D_r, is then computed from the estimated speed s_a as D_r = T*s_a. Knowledge of the distance run and the bearing between two points establishes a triangle and enables calculation of the distance of closest approach.
  • A number of solutions are available, but one of the simplest is to time the passage through a 60 degree cone from +30 degrees to −30 degrees, where the frequency will change from f_t + f_d/2 to f_t − f_d/2. Within this 60 degree cone, the target is in close proximity for the final measurement, and the distance of closest approach is D = D_r/(2*tangent(B)), where B = 30 degrees. This calculation can be used for any symmetric measurements across the point of closest approach.
  • In general, a closed solution can be computed from any two points.
  • Computational accuracy improves at close range where the bearing is less than 45 degrees. At times T_1 and T_2, respective frequencies of f_1 and f_2 are measured. The time of transit between the two points is T = T_2 − T_1, the distance run between these two points is D_r = T*s_a, and the distance of closest approach is computed from the bearings to each point, B_1 and B_2, to yield

  • D = D_r*cosine(B_1)*cosine(B_2)/sine(B_1 − B_2)
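As a check on the geometry above, the general two-bearing relation and the symmetric 60 degree cone shortcut should agree when B_1 = +30 degrees and B_2 = −30 degrees. The sketch below verifies this, and also recovers a bearing from an apparent frequency via B = arcsine((f − f_t)/f_d); the 100 ft distance run and the frequencies are illustrative values.

```python
import math

# Closest-approach distance D = Dr*cosine(B1)*cosine(B2)/sine(B1 - B2),
# compared against the symmetric cone shortcut D = Dr/(2*tangent(30)).

def closest_approach(distance_run, b1_deg, b2_deg):
    b1, b2 = math.radians(b1_deg), math.radians(b2_deg)
    return distance_run * math.cos(b1) * math.cos(b2) / math.sin(b1 - b2)

# Bearing from an apparent frequency: B = arcsine((f - f_t)/f_d).
f_t, f_d = 200.0, 20.0
bearing = math.degrees(math.asin((210.0 - f_t) / f_d))  # 30 degrees

dr = 100.0  # ft run between the +30 and -30 degree measurements
general = closest_approach(dr, 30.0, -30.0)
cone = dr / (2.0 * math.tan(math.radians(30.0)))
print(round(bearing, 1), round(general, 1), round(cone, 1))  # 30.0 86.6 86.6
```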
  • Most objects generate a packet of multiple frequencies. The centroid of the packet can be used to simplify the calculations. U.S. Pat. No. 6,853,695, titled “System and Method for Deriving Symbol Timing,” which is incorporated herein by reference, describes a centroid calculation process for timing estimates that can be used for a packet of frequencies. The foregoing process can be implemented in the SE system 100. U.S. Pat. No. 4,640,134, titled “Apparatus and Method for Analyzing Acoustical Signals,” which is incorporated herein by reference, describes a process for zero crossing analysis of sub-bands to construct acoustical spectrograms, as shown in FIG. 6. The aforementioned process can also be implemented in the SE system 100.
  • Magnitude or intensity of the sound waves can be expected to increase on approach and decrease on departure. However, the magnitude or volume of sound can vary for many reasons and may not be sufficiently reliable for range estimates when used alone. Nevertheless, a steady increase in sound power magnitude with no change in frequency indicates a potential collision.
  • A second method for computing range is the use of comb filters to detect only Doppler shifted frequencies. This method is used, for example, in Doppler weather radar, which detects moving weather phenomena. It relies on knowledge of the frequency of the original signal which is transmitted locally, reflects off of the target and returns with Doppler shifted frequency proportional to the speed of the target.
  • Another method for calculating the range to moving objects is to compute the range from differences in the relative speed of propagation of different signals. It is well known that the 186,300 miles per second speed of light is much faster than the 1100 ft/s speed of sound in air. Many people estimate the distance to dangerous lightning storms by counting the seconds between the flash of lightning and the arrival of the sound of thunder. For most purposes, the speed of light is effectively instantaneous, so each second of delay equates to 1100 feet of distance from the lightning strike. An SE system 100 with optical and audio capability can use this same or a similar method to estimate distance. The RF subsystem 370 can detect radio frequency signals generated by the electrostatic discharge of lightning when indoors or beyond the visual range of the lightning. For greater accuracy, air pressure and temperature can be measured to accurately predict the local speed of sound.
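The flash-to-bang estimate above can be sketched as follows. The temperature correction used here is a standard engineering approximation (speed of sound in air ≈ 49.02*sqrt(T in degrees Rankine) ft/s), not a figure from the text, and the 5 second delay is illustrative.

```python
# Flash-to-bang ranging: light arrives effectively instantly, so range
# is the flash-to-thunder delay times the local speed of sound.

def lightning_range_ft(delay_s, temp_f=59.0):
    # Standard approximation; ~1116 ft/s at 59 F, close to the nominal
    # 1100 ft/s figure used in the text.
    speed_of_sound = 49.02 * (temp_f + 459.67) ** 0.5
    return delay_s * speed_of_sound

print(round(lightning_range_ft(5.0)))  # ~5600 ft, just over a mile
```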
  • A differential acoustic method can be applied to moving vehicles. An acoustic sensor can be placed in the ground or water near the SE system 100. The speed of sound in water is 4856 ft/s. Acoustic waves propagating through the ground or water will be detected earlier than acoustic waves propagating through the air. The difference in propagation speed can be used to compute the range to the object directly. This technique can be implemented in the canes used by visually impaired individuals. An acoustic sensor in the tip of the cane will detect approaching objects before an acoustic sensor placed higher up to monitor airborne acoustic signals. The difference in time of arrival at the two sensors can be used to compute range.
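The differential-speed relation implied above can be made explicit: if the same sound travels distance R along both paths, the arrival-time difference is dt = R/v_slow − R/v_fast, which solves directly for R. A minimal sketch using the propagation speeds quoted in the text (the 1.0 s time difference is illustrative):

```python
# Range from the arrival-time difference between a fast path (water or
# ground) and a slow path (air), using the speeds given in the text.

def range_from_dual_sensors(dt_s, v_fast=4856.0, v_slow=1100.0):
    # dt = R/v_slow - R/v_fast  =>  R = dt*v_fast*v_slow/(v_fast - v_slow)
    return dt_s * v_fast * v_slow / (v_fast - v_slow)

# A 1.0 s head start via water corresponds to roughly 1422 ft of range.
print(round(range_from_dual_sensors(1.0)))  # 1422
```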
  • Sound Power Level Warnings
  • The National Institutes of Health (www.nih.gov) and the National Institute for Occupational Safety and Health (http://www.cdc.gov/niosh/98-126.html) recommend no more than 15 minutes of exposure to high sound power levels above 100 dBA and no more than 8 hours of exposure above 85 dBA. The SE system 100 can be designed to give an immediate warning of high sound pressure levels or give a weighted measure over time so that the 100 dBA warning will be given after 15 minutes of exposure. Cumulative exposure can be accurately computed by the SE system 100 for all sound level exposure throughout the day. For each 3 dB increase in sound power level above 85 dBA, the recommended exposure time limit is cut in half. For a sound power level of P_i in dBA, the maximum exposure time is

  • T_i = 8/log10^(−1)((P_i − 85)/10) hours

  • or

  • T_i = 8/antilog10((P_i − 85)/10) hours.
  • The SE system 100 measures the cumulative exposure at all levels above 85 dBA by recording the total time t_i that the sound power level is in each range P_i. Then, the cumulative exposure dose D, relative to a maximum exposure limit of 100%, is given by

  • D = (t_1/T_1 + t_2/T_2 + . . . + t_n/T_n)*100%.
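The exposure-time and dose formulas above can be sketched directly; note that 10^((P−85)/10) closely approximates the stated 3 dB halving rule, since log10(2)/3 ≈ 1/10. The example exposure levels and durations are illustrative.

```python
# T_i = 8/antilog10((P_i - 85)/10) hours, and
# D = (t_1/T_1 + ... + t_n/T_n)*100%.

def max_exposure_hours(level_dba):
    """Recommended maximum exposure time at a given sound power level."""
    return 8.0 / 10.0 ** ((level_dba - 85.0) / 10.0)

def dose_percent(exposures):
    """Cumulative dose from (level_dBA, hours) pairs; 100% is the limit."""
    return sum(t / max_exposure_hours(p) for p, t in exposures) * 100.0

print(round(max_exposure_hours(100.0) * 60.0, 1))  # ~15 minutes at 100 dBA
print(round(dose_percent([(85.0, 4.0), (91.0, 1.0)]), 1))
```

The first line reproduces the 15-minute limit at 100 dBA stated in the text.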
  • Audio devices that use earplugs or earphones could be modified to implement the SE system 100 in order to provide a back-pressure measurement such that the device can compute the sound pressure within the ear. Alternatively, the earplug sound power level can be calibrated with respect to the volume setting on the audio device so that the sound power level can be computed from the volume setting. This calculation can be used to alert the operator to dangerous volume levels. For safety, the device could automatically reduce volume levels to maintain safe sound levels.
  • Physical Conditioning Assistance
  • The sensors associated with the SE system 100 can be used to assist athletes in physical conditioning. A pulse rate monitor can alert the user when the pulse rate has achieved the desired level and warn of excess exertion or an irregular pulse rate. For example, the PED 300 can be strapped to the arm of the athlete, where the SE system 100 pressure sensor or microphone can sense the pulse rate. Performance measures can be augmented by measurement of the blood oxygen level, hydration, and other physiological parameters. Ambient air monitoring by the SE system 100 can warn of dangerous pollution in the local environment, where overexertion may be dangerous. The GPS receiver 374 (FIG. 2) in each PED 300 yields position information that can give the athlete real time speed and distance run in the field. Casual conditioning attributed to walking and other motion throughout the day can be recorded by the PED 300. The PED 300 can provide audio entertainment, music or exercise instructions while exercising. A brief audio announcement by the PED 300 can serve to periodically alert the athlete to progress or dangerous conditions.
  • Near Field Communications
  • The RF subsystem 370 can include a near field communications (NFC) wireless transceiver. This enables the user to communicate with a station by holding the PED 300 within four inches of the station. This method is commonly used to make purchases, similar to credit card transactions, by simply holding the device near a point-of-sale reader. As a result, the PED 300 can be used to make point-of-sale transactions. This secure technology can also be used by the PED 300 to exchange confidential information such as medical records, reference signatures, biometric parameter monitoring instructions, and recorded results.
  • Personal Tracking Tags
  • The RF subsystem 370 can be augmented to interrogate tracking tags, such as radio frequency identification (RFID) tags or other transponders. The tag can be placed on a child or in a briefcase, portable computer, purse, or any other item that may become lost, forgotten, or stolen. The tag will be queried periodically by the SE system 100 in the PED 300 to determine that the tag is in close proximity. If the tag is not in close proximity, then an alarm can be issued by the output devices 225 of the PED 300. Signal power and the time delay between query and response give an indication of range. Automatic beam-steering antenna arrays can provide a directional indication to the lost item. At some frequencies, multipath reflections of the signal may degrade the directional information.
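The round-trip-time range indication mentioned above can be sketched as follows. The tag's fixed turnaround delay and the timing values are illustrative assumptions; a real transponder's turnaround would be known from its specification.

```python
# Rough range from tag query/response timing: one-way range is half the
# flight time (round trip minus the tag's turnaround delay) times the
# radio propagation speed.

C_FT_PER_S = 9.836e8  # speed of light, feet per second

def tag_range_ft(round_trip_s, turnaround_s):
    return C_FT_PER_S * (round_trip_s - turnaround_s) / 2.0

# 20 ns of flight time beyond a 1 microsecond tag turnaround -> ~10 ft
print(round(tag_range_ft(1.02e-6, 1.0e-6), 1))  # 9.8
```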
  • The tag should be a bracelet or other interlocking mechanism that has a positive indication of attachment. The bracelet can be placed on a child's arm, briefcase handle or purse strap. Removal of the bracelet should cause an immediate alarm. The tracking tag can be a label that can be placed on any item to be tracked. The label can be inconspicuously placed to deter removal. Alternatively, the tag can be placed where its removal would be immediately obvious to other individuals, such as in a child's shoe.
  • The tracking tag can issue an alarm in response to additional environmental information such as excessive heat or humidity in the vicinity of the tracked item. For example, if a child falls into a swimming pool, even the simplest tracking tag will fail to respond, resulting in an immediate alarm. Transponder tracking tags of higher complexity can incorporate their own SE system 100 that communicates selected environmental information back to the PED 300.
  • Chemical Detectors/Transducers
  • The SE system 100 can be designed to detect chemical changes in the environment. A portable PED 300 having the SE system 100 that can detect dangerous chemical changes, such as smoke, would be beneficial. In this configuration, the PED 300 is essentially a mobile smoke and carbon monoxide alarm.
  • The SE system 100 can be designed to detect potential impairment of an operator's senses by judgment of motion and dexterity in operation of the PED 300.
  • One or more chemical sensors can be utilized to detect intoxication as demonstrated by pending U.S. Patent Application No. 20040081582, titled “Cell Phone/Breath Analyzer,” filed Apr. 29, 2004, which is incorporated herein by reference.
  • One or more chemical sensors for continuous monitoring for toxic fumes can also be implemented in the SE system 100. CO and NO can be detected by the system and process described in pending U.S. Patent Application No. 20040016104, titled “Electrodes for Solid State Gas Sensor,” filed Jan. 29, 2004, which is incorporated herein by reference. U.S. Pat. No. 6,638,407, titled “Electrochemical Gas Sensor with Gas Communication Means,” which is incorporated herein by reference, describes a detector that can be used to detect CO. Such detectors could be included in the SE system 100 for continuous protection.
  • U.S. Pat. No. 6,830,668, titled “Small Volume Electrochemical Sensor,” which is incorporated herein by reference, describes a sensor that can be implemented in the SE system 100 for the purpose of conducting field analysis of liquid samples.
  • Cooperative Operation of Multiple PEDs
  • Two or more PEDs 300 can function cooperatively to provide sensory enhancement over a wider range than that covered by a single PED 300. Multiple cooperating PEDs 300 can simultaneously monitor for selected environmental events as illustrated in FIG. 7.
  • The COOPERATIVE DEVICES screen 462 (FIG. 3) is used to coordinate two or more PEDs 300. The VOLUNTEER command allows the operator to volunteer the PED 300 for cooperative operation with other PEDs 300 in the area. A volunteer signal will be sent to other PEDs 300 identifying the PED 300, its location, and the sensors that are available. The volunteer signal is sent when first selected and again whenever queried by another PED 300 that is searching for cooperative partners. The MEMBERS command opens the MEMBERS screen 472, which lists the names or phone numbers of PED 300 devices to be selected as members of the coordination team. The REFERENCE command selects one or more reference signatures that are used to identify the selected environmental event. The reference signatures are transmitted to all of the PEDs 300 participating in the coordination team. The PEDs 300 should be dispersed across the area of interest to cover the widest possible range. The locations of the PEDs 300 can be predetermined or they can travel randomly. Each of the PEDs 300 then commences simultaneous monitoring for the selected event.
  • As illustrated in FIG. 7, events occurring at target location 740 are easily detected within range circle 730 of nearby cooperating PED 300 a. Cooperating PED 300 a can communicate the detected events to other cooperating PEDs. In some cases the only position information is the location of the single detecting cooperating PED 300 a and possibly the range from the detecting cooperating PED 300 a. In other cases multiple cooperating PEDs 300 a, 300 b and 300 c may detect the event and triangulation between the multiple cooperating PEDs 300 a, 300 b and 300 c can determine the target location 740 with greater accuracy. Some cooperating PED devices, such as cooperating PED 300 d may be blocked from detecting the event by range, terrain or buildings such as condos 710 and 712. Beneficially, all of the cooperating PEDs can be notified of the detected event by wireless signals communicated from the detecting PED or PEDs.
  • Upon detection of the selected event in one or more PEDs 300, the GPS receiver 374 (FIG. 2) in the detecting PED 300 can accurately identify the location and time of detection at the detecting PED 300. The position and time of detection at each detecting PED 300 can be communicated to all participating PEDs 300 by wireless signals, such as Bluetooth, IEEE 802.11 or ordinary text messaging between cell phones. Correlation of three or more detecting PEDs 300 will allow an accurate position determination of the source of the event. If the source is moving, then the direction of travel can be determined by computing the vector between successive positions. Each PED 300 can calculate and display the location of the event. This process can be used to locate the source and motion of any signals such as a toxic cloud, alarm signal, wireless signal, weapons discharge, lightning strike, tornado, or person talking. A team of individuals can locate a missing person or child by coordinating their PEDs 300 in a search for the voice print of the missing person or child. U.S. Pat. No. 6,232,882, titled “Warning System and Method for Detection of Tornadoes,” which is incorporated herein by reference, describes a method for detecting, differentiating, and locating lightning strikes and tornado generated electromagnetic signals. U.S. Pat. No. 6,944,466, titled “Mobile Location Estimation in a Wireless Communication System,” which is incorporated herein by reference, describes a method for locating the source of a wireless signal based on signals received at multiple receiver stations. Such systems/methods can be implemented in the portable cooperating PEDs 300.
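The cooperative location step described above can be illustrated with a minimal sketch: three PEDs at known positions record arrival times of the same acoustic event, and a coarse grid search finds the source most consistent with the time differences. The grid search stands in for a proper least-squares TDOA solver, and the positions, times, and 1100 ft/s acoustic speed are all simulated, illustrative values.

```python
import math

SPEED = 1100.0  # ft/s, nominal acoustic propagation speed

def locate(peds, step=5.0, extent=1000.0):
    """peds: list of ((x, y), arrival_time). Returns best (x, y) guess."""
    best, best_err = None, float("inf")
    x = 0.0
    while x <= extent:
        y = 0.0
        while y <= extent:
            # Implied emission times; at the true source they coincide.
            emits = [t - math.hypot(x - px, y - py) / SPEED
                     for (px, py), t in peds]
            err = max(emits) - min(emits)
            if err < best_err:
                best, best_err = (x, y), err
            y += step
        x += step
    return best

# Simulate an event at (400, 250) heard by three dispersed PEDs.
src = (400.0, 250.0)
positions = [(0.0, 0.0), (800.0, 100.0), (300.0, 900.0)]
peds = [(p, math.hypot(src[0] - p[0], src[1] - p[1]) / SPEED)
        for p in positions]
print(locate(peds))  # close to (400.0, 250.0)
```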
  • Other Examples of Applications
  • The present invention has many applications, a few nonlimiting examples of which have been described. A few more are set out hereafter.
  • The SE system 100 can be incorporated in a wireless telephone to monitor its microphone for emergency warnings, such as the siren of an emergency vehicle, bell of a railroad crossing, drawbridge bell, etc. Upon detection of an emergency signal, the telephone can be designed to immediately cease its current operation and give an immediate unmistakable audible warning. If equipped with a display device, the telephone can also produce a visual alert. If equipped with a mechanical vibrator, the telephone can produce a vibration alert through one of its normal ring signaling modes.
  • The SE system 100 can be used for detecting a siren or alert signal from a smoke detector. Conventional smoke detectors suffer from common failures, such as a run-down battery. Weak siren signals or low-battery signals can be detected by the PED 300, and the user can be alerted with a visual, audio, and/or mechanical cue. The SE system 100 can provide redundancy by directly detecting smoke, carbon monoxide, or other toxic vapors. The portable PED 300 with the SE system 100 is used frequently, assuring that a weak battery or degraded power will be quickly detected and corrected.
  • The SE system 100 can be designed to detect bird songs. Naturalists may wish to better hear or identify sounds of nature, such as bird songs. The PED 300 can be designed to store reference signatures of bird songs, to detect bird songs, and to alert the user of such detection. The identity of the bird can be displayed and, in some implementations, the direction can be indicated via an arrow on the display or via an audible indication. A PED 300 with mapping GPS navigator capability can superimpose the directional vector on the GPS map display.
  • The SE system 100 can be used for monitoring biometric sensors. Conventional biometric heart or respiratory monitors may be inconvenient. By implementing these features in the PED 300, the features will be always available. Low battery conditions will be immediately apparent.
  • The SE system 100 can be designed to sense temperature and monitor it in connection with a threshold. As an example, a temperature warning system can be implemented. A user can be alerted when the environmental temperature exceeds a predefined threshold.
  • The SE system 100 can be designed to monitor for wireless signals, such as IEEE 802.15.1, 802.11, or other wireless communications protocols. Equipment in the environment could be designed to transmit a signal to indicate any abnormal condition in the nearby equipment, and the SE system 100 can detect the abnormal condition and advise the user of same.
  • The SE system 100 can be designed to identify individuals participating in nearby conversations. Individuals can be detected by voice print analysis. This could be useful in detecting terrorist suspects.
  • The SE system 100 can be designed to detect the discharge of a firearm. Law enforcement officers may wish to locate the source of sounds, such as weapons discharge.
  • The SE system 100 can be designed to assist in military applications. For example, military applications may require the rapid detection of the sonic report of a passing projectile which may arrive seconds before the report of the weapon that discharged the projectile.
  • U.S. Pat. No. 5,703,321, which is incorporated herein by reference, describes a device for locating artillery and sniper positions. It basically describes a pyrotechnic device which is deployed in large numbers to signal when the acoustic signature of a munitions discharge is detected in the immediate vicinity. The PED 300 can be designed to provide the same or similar functionality. Multiple cooperating PEDs 300 in audible range of the discharge can record the time of detection at each PED 300. The GPS receiver 374 (FIG. 2) in each PED 300 can accurately identify the time of arrival of the wave front at the known GPS position of the PED 300. The time of arrival and position at each PED 300 can be communicated to the others by wireless signals, such as the ordinary text messaging used in cell phones. Correlation of three or more PEDs 300 will allow an accurate position determination of the source of the discharge. Each PED 300 can calculate and display the position of the discharge. This same process can be used to locate the source of any acoustic signals such as an alarm signal or person talking.
  • U.S. Pat. No. 5,703,835, which is incorporated herein by reference, describes a system for effective control of urban environment security. It describes an urban security gun shot detection system that uses sensors mounted in fixed positions throughout the urban area. The PED 300 can be designed to implement the same or a similar technique. The PEDs 300 could be the radios carried by law enforcement personnel or could be cell phones associated with citizen volunteers. The GPS receiver 374 (FIG. 2) in each PED 300 provides the position of mobile PEDs 300 allowing accurate triangulation to determine the location of the gun shot.
  • The SE system 100 can be designed to detect emergency sirens or approaching vehicles. Those with hearing impairments would benefit by a visual or vibration alert to dangerous situations, such as emergency signals or approaching vehicles.
  • The SE system 100 can be designed to include a GPS receiver 374 (FIG. 2). In one embodiment, among others, the SE system 100 can detect and provide an alert when the PED 300 is within a certain region of the earth or at a particular location.
  • The SE system 100 can be designed with an accelerometer that warns of impending falls. See web site http://link.abpi.net/l.php?20050822A7 that discusses a balance device that utilizes a stereo warning of sway.
  • Variations and Modifications
  • In concluding the detailed description, it should be noted that the terminology “preferred embodiment” herein means the one embodiment currently believed by the inventor(s) to be the best embodiment of a plurality of possible embodiments. Moreover, it will be obvious to those skilled in the art that many variations and modifications may be made to the preferred embodiment(s) without substantially departing from the principles of the present invention. All such variations and modifications are intended to be included herein within the teachings of the present invention in this document and to be protected by the scope of the following claims.

Claims (20)

  1. A method for a personal electronic device (PED) that can be transported with a user, comprising the steps of:
    determining local event information associated with an event in an environment associated with the PED by:
    detecting occurrence of the event in an environment associated with the PED by analyzing data produced from a transducer;
    identifying a local event time pertaining to the event;
    determining a local PED position pertaining to the PED;
    determining remote event information associated with the event in the environment associated with one or more remote PEDs by:
    communicating with the one or more remote PEDs in the environment associated with the PED;
    receiving a remote event time pertaining to the event and a remote PED position from each of the one or more remote PEDs; and
    determining information indicative of the event based at least upon the following:
    the local event time, the local PED position, and the remote event time and remote PED position pertaining to each of the one or more remote PEDs.
  2. The method of claim 1, wherein the information indicative of the event is information indicative of a location for the event.
  3. The method of claim 1, further comprising the steps of:
    storing reference data corresponding to the event;
    producing sensed data by sensing the environment with a transducer associated with the PED;
    comparing the sensed data with the reference data; and
    detecting the occurrence of the event based upon the comparison.
  4. The method of claim 2, wherein the location information comprises at least one of the following: a geographical location, a range, or a direction.
  5. The method of claim 1, wherein local and remote event times correspond to arrival times of event signals.
  6. The method of claim 2, wherein the location information is determined by a process of triangulation of the local event time, the local PED position, a first remote event time, a first remote PED position, a second remote event time, and a second remote PED position.
  7. The method of claim 2, further comprising the step of displaying the location information on a display screen associated with the PED.
  8. The method of claim 1, further comprising the step of providing a user interface that enables a user to select whether or not to communicate volunteer data into the environment, the volunteer data indicative of whether or not the PED will cooperate with the one or more remote PEDs.
  9. The method of claim 8, wherein the volunteer data comprises a PED identification, a PED location, and transducer availability.
  10. The method of claim 8, further comprising the step of receiving a query from a remote PED that is searching for PED cooperation and communicating the volunteer data in response to the query.
  11. The method of claim 8, wherein the volunteer data comprises reference data to be used to identify the event.
  12. The method of claim 1, further comprising the steps of:
    providing a user interface that enables a user to select remote PEDs from a list; and
    cooperating with selected remote PEDs.
  13. The method of claim 12, wherein the list contains at least one of the following: a telephone number associated with each PED or a name associated with each PED.
  14. A method for a personal electronic device (PED) that can be transported with a user, comprising the steps of:
    providing a first electronic based intelligence function to the user with the PED; and
    providing a second electronic based intelligence function to the user with the PED, the second electronic based intelligence function being different than the first electronic based intelligence function, the second electronic based intelligence function comprising at least the following steps:
    storing reference data corresponding to an event;
    producing sensed data by sensing the environment with a transducer associated with the PED;
    comparing the sensed data with the stored reference data;
    receiving event information from one or more remote PEDs in the environment, the event information indicative of event detection; and
    concluding that the event is detected based upon the comparison and the received event information.
  15. The method of claim 12, further comprising the steps of:
    broadcasting a cooperation request to the one or more remote PEDs;
    receiving a response from at least one of the one or more remote PEDs indicating a willingness to cooperate in connection with the event detection; and
    communicating the reference data to the one or more remote PEDs.
  16. A non-transitory computer readable medium comprising computer program code instructions for a personal electronic device (PED) that can be transported with a user, the computer program code comprising:
    instructions designed to determine local event information associated with an event in an environment associated with the PED by:
    detecting occurrence of the event in an environment associated with the PED by analyzing data produced from a transducer;
    identifying a local event time pertaining to the event;
    determining a local PED position pertaining to the PED;
    instructions designed to determine remote event information associated with the event in the environment associated with one or more remote PEDs by:
    communicating with the one or more remote PEDs in the environment associated with the PED;
    receiving a remote event time pertaining to the event and a remote PED position from each of the one or more remote PEDs; and
    instructions designed to determine information indicative of the event based at least upon the following: the local event time, the local PED position, and the remote event time and remote PED position pertaining to each of the one or more remote PEDs.
  17. The medium of claim 16, wherein the computer program code instructions include instructions designed to determine information indicative of a location for the event.
  18. A non-transitory computer readable medium comprising computer program code instructions for a personal electronic device (PED) that can be transported with a user, the PED designed to perform a first electronic based intelligence function, the computer program code instructions designed to perform a second electronic based intelligence function that is different than the first electronic based intelligence function, the computer program code instructions comprising:
    instructions designed to store reference data corresponding to an event;
    instructions designed to produce sensed data by sensing the environment with a transducer associated with the PED;
    instructions designed to compare the sensed data with the reference data;
    instructions designed to receive event information from one or more remote PEDs in the environment, the event information indicative of event detection; and
    instructions designed to conclude that the event is detected based upon the comparison and the received event information.
  19. The medium of claim 18, further comprising:
    instructions designed to broadcast a cooperation request to the one or more remote PEDs;
    instructions designed to receive a response from at least one of the one or more remote PEDs indicating a willingness to cooperate in connection with the event detection; and
    instructions designed to communicate the reference data to the one or more remote PEDs.
  20. The medium of claim 18, further comprising:
    instructions designed to display a list of the one or more remote PEDs that are available for cooperation in detecting the event; and
    instructions designed to enable the user to select the one or more remote PEDs to engage in the cooperation.
US13433861 2006-02-01 2012-03-29 Sensory Enhancement Systems and Methods in Personal Electronic Devices Abandoned US20120184305A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US11345058 US7872574B2 (en) 2006-02-01 2006-02-01 Sensory enhancement systems and methods in personal electronic devices
US13005683 US20110121965A1 (en) 2006-02-01 2011-01-13 Sensory Enhancement Systems and Methods in Personal Electronic Devices
US13371769 US20120139721A1 (en) 2006-02-01 2012-02-13 Sensory Enhancement Systems and Methods in Personal Electronic Devices
US13409220 US8390445B2 (en) 2006-02-01 2012-03-01 Sensory enhancement systems and methods in personal electronic devices
US13433861 US20120184305A1 (en) 2006-02-01 2012-03-29 Sensory Enhancement Systems and Methods in Personal Electronic Devices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13433861 US20120184305A1 (en) 2006-02-01 2012-03-29 Sensory Enhancement Systems and Methods in Personal Electronic Devices

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13409220 Division US8390445B2 (en) 2006-02-01 2012-03-01 Sensory enhancement systems and methods in personal electronic devices

Publications (1)

Publication Number Publication Date
US20120184305A1 (en) 2012-07-19

Family

ID=40507666

Family Applications (7)

Application Number Title Priority Date Filing Date
US11345058 Active 2029-11-20 US7872574B2 (en) 2006-02-01 2006-02-01 Sensory enhancement systems and methods in personal electronic devices
US13005683 Abandoned US20110121965A1 (en) 2006-02-01 2011-01-13 Sensory Enhancement Systems and Methods in Personal Electronic Devices
US13371769 Abandoned US20120139721A1 (en) 2006-02-01 2012-02-13 Sensory Enhancement Systems and Methods in Personal Electronic Devices
US13409220 Active US8390445B2 (en) 2006-02-01 2012-03-01 Sensory enhancement systems and methods in personal electronic devices
US13433861 Abandoned US20120184305A1 (en) 2006-02-01 2012-03-29 Sensory Enhancement Systems and Methods in Personal Electronic Devices
US14992355 Abandoned US20160125212A1 (en) 2006-02-01 2016-01-11 Sensory Enhancement Systems and Methods in Personal Electronic Devices
US14992378 Abandoned US20160125885A1 (en) 2006-02-01 2016-01-11 Sensory Enhancement Systems and Methods in Personal Electronic Devices

Family Applications Before (4)

Application Number Title Priority Date Filing Date
US11345058 Active 2029-11-20 US7872574B2 (en) 2006-02-01 2006-02-01 Sensory enhancement systems and methods in personal electronic devices
US13005683 Abandoned US20110121965A1 (en) 2006-02-01 2011-01-13 Sensory Enhancement Systems and Methods in Personal Electronic Devices
US13371769 Abandoned US20120139721A1 (en) 2006-02-01 2012-02-13 Sensory Enhancement Systems and Methods in Personal Electronic Devices
US13409220 Active US8390445B2 (en) 2006-02-01 2012-03-01 Sensory enhancement systems and methods in personal electronic devices

Family Applications After (2)

Application Number Title Priority Date Filing Date
US14992355 Abandoned US20160125212A1 (en) 2006-02-01 2016-01-11 Sensory Enhancement Systems and Methods in Personal Electronic Devices
US14992378 Abandoned US20160125885A1 (en) 2006-02-01 2016-01-11 Sensory Enhancement Systems and Methods in Personal Electronic Devices

Country Status (1)

Country Link
US (7) US7872574B2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110285915A1 (en) * 2010-03-23 2011-11-24 Thales Method and system for cooperative transmission of a video sequence
US20180047415A1 (en) * 2015-05-15 2018-02-15 Google Llc Sound event detection

Families Citing this family (175)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8468414B2 (en) * 2009-08-27 2013-06-18 Icomm Technologies Inc. Method and apparatus for a wireless mobile system implementing beam steering phase array antenna
US7811231B2 (en) 2002-12-31 2010-10-12 Abbott Diabetes Care Inc. Continuous glucose monitoring system and methods of use
US8066639B2 (en) 2003-06-10 2011-11-29 Abbott Diabetes Care Inc. Glucose measuring device for use in personal area network
US7722536B2 (en) * 2003-07-15 2010-05-25 Abbott Diabetes Care Inc. Glucose measuring device integrated into a holster for a personal area network device
US8771183B2 (en) 2004-02-17 2014-07-08 Abbott Diabetes Care Inc. Method and system for providing data communication in continuous glucose monitoring and management system
US8880138B2 (en) 2005-09-30 2014-11-04 Abbott Diabetes Care Inc. Device for channeling fluid and methods of use
WO2007052726A1 (en) * 2005-11-02 2007-05-10 Yamaha Corporation Teleconference device
US7766829B2 (en) 2005-11-04 2010-08-03 Abbott Diabetes Care Inc. Method and system for providing basal profile modification in analyte monitoring and management systems
WO2007056873A2 (en) * 2005-11-15 2007-05-24 Swiss Reinsurance Company Automated trigger system with regenerative time-controlled trigger indices for monitoring devices in multi-stage coverage of damages systems for nascent and/or occurring cyclones and corresponding method
US7697967B2 (en) 2005-12-28 2010-04-13 Abbott Diabetes Care Inc. Method and apparatus for providing analyte sensor insertion
US7736310B2 (en) 2006-01-30 2010-06-15 Abbott Diabetes Care Inc. On-body medical device securement
WO2007088236A1 (en) * 2006-02-03 2007-08-09 Nokia Corporation A hearing agent and a related method
US7981034B2 (en) 2006-02-28 2011-07-19 Abbott Diabetes Care Inc. Smart messages and alerts for an infusion delivery and management system
US7826879B2 (en) 2006-02-28 2010-11-02 Abbott Diabetes Care Inc. Analyte sensors and methods of use
US8226891B2 (en) 2006-03-31 2012-07-24 Abbott Diabetes Care Inc. Analyte monitoring devices and methods therefor
US7620438B2 (en) 2006-03-31 2009-11-17 Abbott Diabetes Care Inc. Method and system for powering an electronic device
EP2038743A4 (en) 2006-07-12 2009-08-05 Arbitron Inc Methods and systems for compliance confirmation and incentives
US9332363B2 (en) * 2011-12-30 2016-05-03 The Nielsen Company (Us), Llc System and method for determining meter presence utilizing ambient fingerprints
US8932216B2 (en) 2006-08-07 2015-01-13 Abbott Diabetes Care Inc. Method and system for providing data management in integrated analyte monitoring and infusion system
US7653425B2 (en) 2006-08-09 2010-01-26 Abbott Diabetes Care Inc. Method and system for providing calibration of an analyte sensor in an analyte monitoring system
US7618369B2 (en) * 2006-10-02 2009-11-17 Abbott Diabetes Care Inc. Method and system for dynamically updating calibration parameters for an analyte sensor
US8684923B2 (en) * 2006-10-17 2014-04-01 At&T Intellectual Property I, Lp Methods, systems, and computer program products for aggregating medical information
US20080199894A1 (en) 2007-02-15 2008-08-21 Abbott Diabetes Care, Inc. Device and method for automatic data acquisition and/or detection
US9636450B2 (en) 2007-02-19 2017-05-02 Udo Hoss Pump system modular components for delivering medication and analyte sensing at separate insertion sites
US8123686B2 (en) 2007-03-01 2012-02-28 Abbott Diabetes Care Inc. Method and apparatus for providing rolling data in communication systems
US7768387B2 (en) 2007-04-14 2010-08-03 Abbott Diabetes Care Inc. Method and apparatus for providing dynamic multi-stage signal amplification in a medical device
US9204827B2 (en) 2007-04-14 2015-12-08 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in medical communication system
US9008743B2 (en) 2007-04-14 2015-04-14 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in medical communication system
CA2683959C (en) 2007-04-14 2017-08-29 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in medical communication system
US10057676B2 (en) * 2007-04-20 2018-08-21 Lloyd Douglas Manning Wearable wirelessly controlled enigma system
US7928850B2 (en) 2007-05-08 2011-04-19 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US8665091B2 (en) 2007-05-08 2014-03-04 Abbott Diabetes Care Inc. Method and device for determining elapsed sensor life
US8456301B2 (en) 2007-05-08 2013-06-04 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US8461985B2 (en) 2007-05-08 2013-06-11 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US8600681B2 (en) 2007-05-14 2013-12-03 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US8103471B2 (en) 2007-05-14 2012-01-24 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US8560038B2 (en) 2007-05-14 2013-10-15 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US8140312B2 (en) 2007-05-14 2012-03-20 Abbott Diabetes Care Inc. Method and system for determining analyte levels
US8260558B2 (en) 2007-05-14 2012-09-04 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US9125548B2 (en) 2007-05-14 2015-09-08 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US8444560B2 (en) 2007-05-14 2013-05-21 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US10002233B2 (en) 2007-05-14 2018-06-19 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US8239166B2 (en) 2007-05-14 2012-08-07 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US8597188B2 (en) 2007-06-21 2013-12-03 Abbott Diabetes Care Inc. Health management devices and methods
US8160900B2 (en) 2007-06-29 2012-04-17 Abbott Diabetes Care Inc. Analyte monitoring and management device and method to analyze the frequency of user interaction with the device
US8834366B2 (en) 2007-07-31 2014-09-16 Abbott Diabetes Care Inc. Method and apparatus for providing analyte sensor calibration
US20090063402A1 (en) * 2007-08-31 2009-03-05 Abbott Diabetes Care, Inc. Method and System for Providing Medication Level Determination
US8374668B1 (en) 2007-10-23 2013-02-12 Abbott Diabetes Care Inc. Analyte sensor with lag compensation
US8409093B2 (en) 2007-10-23 2013-04-02 Abbott Diabetes Care Inc. Assessing measures of glycemic variability
CN103561154B (en) 2007-11-09 2015-11-18 Google Inc. Automatic activation method and system for applications in mobile computing devices
US8473022B2 (en) 2008-01-31 2013-06-25 Abbott Diabetes Care Inc. Analyte sensor with time lag compensation
US8346335B2 (en) 2008-03-28 2013-01-01 Abbott Diabetes Care Inc. Analyte sensor calibration management
WO2009125577A1 (en) * 2008-04-10 2009-10-15 Panasonic Corporation Imaging device, imaging system, and imaging method
US8219025B2 (en) * 2008-04-14 2012-07-10 Honeywell International Inc. Stand alone sensor apparatus for continuous web machines
US8924159B2 (en) 2008-05-30 2014-12-30 Abbott Diabetes Care Inc. Method and apparatus for providing glycemic control
US7826382B2 (en) 2008-05-30 2010-11-02 Abbott Diabetes Care Inc. Close proximity communication device and methods
US8591410B2 (en) 2008-05-30 2013-11-26 Abbott Diabetes Care Inc. Method and apparatus for providing glycemic control
US20100231378A1 (en) * 2008-06-16 2010-09-16 Linda Rosita Ward Personal Security System
US20120256945A1 (en) * 2008-06-17 2012-10-11 Digigage Ltd. System for altering virtual views
US20090315708A1 (en) * 2008-06-19 2009-12-24 John Walley Method and system for limiting audio output in audio headsets
US8560010B2 (en) * 2008-06-25 2013-10-15 Wade Koehn Cell phone with breath analyzer
US20100057040A1 (en) 2008-08-31 2010-03-04 Abbott Diabetes Care, Inc. Robust Closed Loop Control And Methods
US8622988B2 (en) * 2008-08-31 2014-01-07 Abbott Diabetes Care Inc. Variable rate closed loop control and methods
US9943644B2 (en) 2008-08-31 2018-04-17 Abbott Diabetes Care Inc. Closed loop control with reference measurement and methods thereof
US8734422B2 (en) 2008-08-31 2014-05-27 Abbott Diabetes Care Inc. Closed loop control with improved alarm functions
US9392969B2 (en) 2008-08-31 2016-07-19 Abbott Diabetes Care Inc. Closed loop control and signal attenuation detection
US8986208B2 (en) 2008-09-30 2015-03-24 Abbott Diabetes Care Inc. Analyte sensor sensitivity attenuation mitigation
US20100152620A1 (en) * 2008-12-12 2010-06-17 Immersion Corporation Method and Apparatus for Providing A Haptic Monitoring System Using Multiple Sensors
US9727139B2 (en) 2008-12-12 2017-08-08 Immersion Corporation Method and apparatus for providing a haptic monitoring system using multiple sensors
US20100177080A1 (en) * 2009-01-13 2010-07-15 Metrologic Instruments, Inc. Electronic-ink signage device employing thermal packaging for outdoor weather applications
US9517679B2 (en) 2009-03-02 2016-12-13 Flir Systems, Inc. Systems and methods for monitoring vehicle occupants
US9235876B2 (en) 2009-03-02 2016-01-12 Flir Systems, Inc. Row and column noise reduction in thermal images
US9292909B2 (en) 2009-06-03 2016-03-22 Flir Systems, Inc. Selective image correction for infrared imaging devices
US9756264B2 (en) 2009-03-02 2017-09-05 Flir Systems, Inc. Anomalous pixel detection
US9900526B2 (en) 2011-06-10 2018-02-20 Flir Systems, Inc. Techniques to compensate for calibration drifts in infrared imaging devices
US9208542B2 (en) 2009-03-02 2015-12-08 Flir Systems, Inc. Pixel-wise noise reduction in thermal images
US9948872B2 (en) 2009-03-02 2018-04-17 Flir Systems, Inc. Monitor and control systems and methods for occupant safety and energy efficiency of structures
US9674458B2 (en) 2009-06-03 2017-06-06 Flir Systems, Inc. Smart surveillance camera systems and methods
US9843742B2 (en) 2009-03-02 2017-12-12 Flir Systems, Inc. Thermal image frame capture using de-aligned sensor array
US9986175B2 (en) 2009-03-02 2018-05-29 Flir Systems, Inc. Device attachment with infrared imaging sensor
US9451183B2 (en) 2009-03-02 2016-09-20 Flir Systems, Inc. Time spaced infrared image enhancement
US9961277B2 (en) 2011-06-10 2018-05-01 Flir Systems, Inc. Infrared focal plane array heat spreaders
US9998697B2 (en) 2009-03-02 2018-06-12 Flir Systems, Inc. Systems and methods for monitoring vehicle occupants
US9635285B2 (en) 2009-03-02 2017-04-25 Flir Systems, Inc. Infrared imaging enhancement with fusion
US10051210B2 (en) 2011-06-10 2018-08-14 Flir Systems, Inc. Infrared detector array with selectable pixel binning systems and methods
US8280436B2 (en) * 2009-03-13 2012-10-02 Harris Jr Patrick G Wireless telephony device with breath analysis sensor and methods for use therewith
US8085145B2 (en) * 2009-04-03 2011-12-27 Sharp Laboratories Of America, Inc. Personal environmental monitoring method and system and portable monitor for use therein
WO2010121084A1 (en) 2009-04-15 2010-10-21 Abbott Diabetes Care Inc. Analyte monitoring system having an alert
WO2010127050A1 (en) 2009-04-28 2010-11-04 Abbott Diabetes Care Inc. Error detection in critical repeating data in a wireless sensor system
US8368556B2 (en) 2009-04-29 2013-02-05 Abbott Diabetes Care Inc. Method and system for providing data communication in continuous glucose monitoring and management system
US10091439B2 (en) 2009-06-03 2018-10-02 Flir Systems, Inc. Imager with array of multiple infrared imaging modules
US9756262B2 (en) 2009-06-03 2017-09-05 Flir Systems, Inc. Systems and methods for monitoring power systems
US9819880B2 (en) 2009-06-03 2017-11-14 Flir Systems, Inc. Systems and methods of suppressing sky regions in images
US9973692B2 (en) 2013-10-03 2018-05-15 Flir Systems, Inc. Situational awareness by compressed display of panoramic views
US9843743B2 (en) 2009-06-03 2017-12-12 Flir Systems, Inc. Infant monitoring systems and methods using thermal imaging
CN104799866A (en) 2009-07-23 2015-07-29 Abbott Diabetes Care Inc. Analyte monitoring device
EP2456351B1 (en) 2009-07-23 2016-10-12 Abbott Diabetes Care, Inc. Real time management of data relating to physiological control of glucose levels
US8478557B2 (en) 2009-07-31 2013-07-02 Abbott Diabetes Care Inc. Method and apparatus for providing analyte monitoring system calibration accuracy
EP2473422A4 (en) 2009-08-31 2014-09-17 Abbott Diabetes Care Inc Displays for a medical device
EP2473098A4 (en) 2009-08-31 2014-04-09 Abbott Diabetes Care Inc Analyte signal processing device and methods
EP2473099A4 (en) 2009-08-31 2015-01-14 Abbott Diabetes Care Inc Analyte monitoring system and methods for managing power and noise
US9351669B2 (en) 2009-09-30 2016-05-31 Abbott Diabetes Care Inc. Interconnect for on-body analyte monitoring device
US8290434B2 (en) * 2009-10-21 2012-10-16 Apple Inc. Method and apparatus for triggering network device discovery
US8688443B2 (en) * 2009-12-23 2014-04-01 At&T Intellectual Property I, L.P. Multimodal augmented reality for location mobile information service
JP5582803B2 (en) * 2010-01-27 2014-09-03 Kyocera Corporation Portable electronic devices
WO2011099969A1 (en) * 2010-02-11 2011-08-18 Hewlett-Packard Development Company, L.P. Input command
US9848134B2 (en) 2010-04-23 2017-12-19 Flir Systems, Inc. Infrared imager with integrated metal layers
US9207708B2 (en) 2010-04-23 2015-12-08 Flir Systems, Inc. Abnormal clock rate detection in imaging sensor arrays
US9706138B2 (en) 2010-04-23 2017-07-11 Flir Systems, Inc. Hybrid infrared sensor array having heterogeneous infrared sensors
US20110298887A1 (en) * 2010-06-02 2011-12-08 Maglaque Chad L Apparatus Using an Accelerometer to Capture Photographic Images
US8446318B2 (en) 2010-06-22 2013-05-21 Shirook Ali Controlling a beamforming antenna using reconfigurable parasitic elements
US8680989B2 (en) * 2010-12-21 2014-03-25 Qualcomm Incorporated Sensor to detect an emergency event
US9135466B2 (en) 2010-12-30 2015-09-15 Telefonaktiebolaget L M Ericsson (Publ) Biometric user equipment GUI trigger
CN103688245A (en) * 2010-12-30 2014-03-26 Ambientz Inc. Information processing using a population of data acquisition devices
CN107019515A (en) 2011-02-28 2017-08-08 雅培糖尿病护理公司 Devices, systems, and methods associated with analyte monitoring devices and devices incorporating the same
US8610567B2 (en) 2011-05-04 2013-12-17 Continental Automotive Systems, Inc. System and method for airbag deployment detection
KR101808375B1 (en) 2011-06-10 2017-12-12 플리어 시스템즈, 인크. Low power and small form factor infrared imaging
US9706137B2 (en) 2011-06-10 2017-07-11 Flir Systems, Inc. Electrical cabinet infrared monitor
WO2012170949A3 (en) 2011-06-10 2013-04-11 Flir Systems, Inc. Non-uniformity correction techniques for infrared imaging devices
US10079982B2 (en) 2011-06-10 2018-09-18 Flir Systems, Inc. Determination of an absolute radiometric value using blocked infrared sensors
US9473681B2 (en) 2011-06-10 2016-10-18 Flir Systems, Inc. Infrared camera system housing with metalized surface
US9509924B2 (en) 2011-06-10 2016-11-29 Flir Systems, Inc. Wearable apparatus with integrated infrared imaging module
US8627723B2 (en) * 2011-08-10 2014-01-14 Wildlife Acoustics, Inc. Digital sampling and zero crossing of acoustic signals of animals
US8868039B2 (en) 2011-10-12 2014-10-21 Digimarc Corporation Context-related arrangements
WO2013066873A1 (en) 2011-10-31 2013-05-10 Abbott Diabetes Care Inc. Electronic devices having integrated reset systems and methods thereof
US9389431B2 (en) 2011-11-04 2016-07-12 Massachusetts Eye & Ear Infirmary Contextual image stabilization
JP2015505251A (en) 2011-11-07 2015-02-19 Abbott Diabetes Care Inc. Analyte monitoring device and method
US9317656B2 (en) 2011-11-23 2016-04-19 Abbott Diabetes Care Inc. Compatibility mechanisms for devices in a continuous analyte monitoring system and methods thereof
US8710993B2 (en) 2011-11-23 2014-04-29 Abbott Diabetes Care Inc. Mitigating single point failure of devices in an analyte monitoring system and methods thereof
US9339217B2 (en) 2011-11-25 2016-05-17 Abbott Diabetes Care Inc. Analyte monitoring system and methods of use
US9069648B2 (en) * 2012-01-25 2015-06-30 Martin Kelly Jones Systems and methods for delivering activity based suggestive (ABS) messages
US9218728B2 (en) * 2012-02-02 2015-12-22 Raytheon Company Methods and apparatus for acoustic event detection
US10013677B2 (en) 2012-02-07 2018-07-03 Whirlpool Corporation Appliance monitoring systems and methods
CA2805226A1 (en) 2012-02-07 2013-08-07 Scott Andrew Horstemeyer Appliance monitoring systems and methods
WO2013124878A1 (en) * 2012-02-20 2013-08-29 Fujitsu Limited Communication apparatus, system, control program, and control method
US20130222137A1 (en) * 2012-02-29 2013-08-29 Motorola Mobility, Inc. Method for adapting a mobile communication device's function to monitored activity and a user's profile
US20130268101A1 (en) * 2012-04-09 2013-10-10 Icon Health & Fitness, Inc. Exercise Device Audio Cue System
US8774368B2 (en) * 2012-06-08 2014-07-08 Avaya Inc. System and method to use enterprise communication systems to measure and control workplace noise
KR20140009894A (en) * 2012-07-13 2014-01-23 LG Electronics Inc. Mobile terminal and controlling method thereof
US9811884B2 (en) 2012-07-16 2017-11-07 Flir Systems, Inc. Methods and systems for suppressing atmospheric turbulence in images
US8995230B2 (en) 2012-07-26 2015-03-31 Wildlife Acoustics, Inc. Method of extracting zero crossing data from full spectrum signals
US20140074704A1 (en) * 2012-09-11 2014-03-13 Cashstar, Inc. Systems, methods and devices for conducting transactions with electronic passbooks
US9968306B2 (en) 2012-09-17 2018-05-15 Abbott Diabetes Care Inc. Methods and apparatuses for providing adverse condition notification with enhanced wireless communication range in analyte monitoring systems
US9082413B2 (en) 2012-11-02 2015-07-14 International Business Machines Corporation Electronic transaction authentication based on sound proximity
US20140185830A1 (en) * 2012-12-27 2014-07-03 Daniel Avrahami Methods, systems, and apparatus for audio backtracking control
US9391580B2 (en) * 2012-12-31 2016-07-12 Cellco Partnership Ambient audio injection
US9830588B2 (en) * 2013-02-26 2017-11-28 Digimarc Corporation Methods and arrangements for smartphone payments
US9965756B2 (en) * 2013-02-26 2018-05-08 Digimarc Corporation Methods and arrangements for smartphone payments
US9171450B2 (en) * 2013-03-08 2015-10-27 Qualcomm Incorporated Emergency handling system using informative alarm sound
US9354090B2 (en) 2013-05-22 2016-05-31 Honeywell Limited Scanning sensor arrangement for paper machines or other systems
US9796274B2 (en) 2013-05-22 2017-10-24 Honeywell Limited Power delivery system for providing power to sensor head of paper machine or other system
US9264162B2 (en) 2013-05-23 2016-02-16 Honeywell Limited Wireless position-time synchronization for scanning sensor devices
US9264803B1 (en) * 2013-06-05 2016-02-16 Google Inc. Using sounds for determining a worn state of a wearable computing device
US20140362999A1 (en) * 2013-06-06 2014-12-11 Robert Scheper Sound detection and visual alert system for a workspace
CN103325214B (en) * 2013-06-28 2015-09-16 Huizhou Desay SV Automotive Co., Ltd. Automatic help-call system and method for a two-wheeled vehicle
WO2015060872A1 (en) * 2013-10-25 2015-04-30 Intel Corporation Apparatus and methods for capturing and generating user experiences
US9870697B2 (en) * 2013-12-17 2018-01-16 At&T Mobility Ii Llc Method, computer-readable storage device and apparatus for providing a collaborative standalone area monitor
US9288572B2 (en) 2014-01-09 2016-03-15 International Business Machines Corporation Haptic microphone
WO2015159101A1 (en) * 2014-04-17 2015-10-22 Airbase Systems Ltd A method and system for analysing environmental data
US9373345B1 (en) 2014-12-11 2016-06-21 International Business Machines Corporation Pro-active protection of communication devices that are sensitive to vibration or shock
US9829577B2 (en) 2014-12-19 2017-11-28 The Regents Of The University Of Michigan Active indoor location sensing for mobile devices
US9647719B2 (en) * 2015-02-16 2017-05-09 Federated Wireless, Inc. Method, system, and apparatus for spectrum sensing of radar signals
GB2538043B (en) * 2015-03-09 2017-12-13 Buddi Ltd Activity monitor
US9432430B1 (en) * 2015-04-02 2016-08-30 Sas Institute Inc. Event stream processing device providing real-time incident identification
US9547970B2 (en) * 2015-04-14 2017-01-17 General Electric Company Context-aware wearable safety system
CN107690664A (en) * 2015-04-16 2018-02-13 霍尼韦尔国际公司 Multi-sensor input analysis for improved safety
CN107548505A (en) * 2015-05-08 2018-01-05 惠普发展公司有限责任合伙企业 Alarm event determinations via microphone arrays
US20170026860A1 (en) * 2015-07-02 2017-01-26 Carrier Corporation Device and method for detecting high wind weather events using radio emissions
US9609449B1 (en) * 2015-10-26 2017-03-28 Microsoft Technology Licensing, Llc Continuous sound pressure level monitoring
US20170280222A1 (en) * 2016-03-24 2017-09-28 Bragi GmbH Real-Time Multivariable Biometric Analysis and Display System and Method
BE1024732B1 (en) * 2016-11-10 2018-06-14 Technical Hands On Management Construction, Afgekort T.H.O.M. Besloten Vennootschap Met Beperkte Aansprakelijkheid A method for the prevention of accidents at work
US9911301B1 (en) * 2017-02-07 2018-03-06 Luisa Foley Lost child notification system
US10104484B1 (en) 2017-03-02 2018-10-16 Steven Kenneth Bradford System and method for geolocating emitted acoustic signals from a source entity
US20180277135A1 (en) * 2017-03-24 2018-09-27 Hyundai Motor Company Audio signal quality enhancement based on quantitative snr analysis and adaptive wiener filtering
US10045143B1 (en) 2017-06-27 2018-08-07 International Business Machines Corporation Sound detection and identification

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6684176B2 (en) * 2001-09-25 2004-01-27 Symbol Technologies, Inc. Three dimensional (3-D) object locator system for items or sites using an intuitive sound beacon: system and method of operation
US20040157648A1 (en) * 2000-02-25 2004-08-12 Charmed Technology, Inc. Wearable computing device capable of responding intelligently to surroundings
US20050148346A1 (en) * 2003-12-30 2005-07-07 Maloney John E. TDOA/GPS hybrid wireless location system
US20050191963A1 (en) * 2004-02-28 2005-09-01 Hymes Charles M. Wireless communications with visually-identified targets
US20060109083A1 (en) * 2004-11-24 2006-05-25 Rathus Spencer A Method and apparatus for accessing electronic data about at least one person of interest
US8085145B2 (en) * 2009-04-03 2011-12-27 Sharp Laboratories Of America, Inc. Personal environmental monitoring method and system and portable monitor for use therein

Family Cites Families (72)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4538296A (en) * 1983-07-22 1985-08-27 Short Robert S Sound inhibitor for audio transducers
US4640134A (en) * 1984-04-04 1987-02-03 Bio-Dynamics Research & Development Corporation Apparatus and method for analyzing acoustical signals
US4537296A (en) * 1984-07-23 1985-08-27 Alma Piston Company Clutch driven plate assembly
JP2527807B2 (en) * 1989-05-09 1996-08-28 Seiko Instruments Inc. Optical association identification device
US5046101A (en) * 1989-11-14 1991-09-03 Lovejoy Controls Corp. Audio dosage control system
US5544661A (en) * 1994-01-13 1996-08-13 Charles L. Davis Real time ambulatory patient monitor
US5504717A (en) * 1994-05-27 1996-04-02 Alliant Techsystems Inc. System for effective control of urban environment security
US5461365A (en) * 1994-10-27 1995-10-24 Schlager; Dan Multi-hazard alarm system using selectable power-level transmission and localization
DE4439850C1 (en) * 1994-11-08 1996-03-14 Daimler Benz Aerospace Ag Artillery and protective screening localisation device
US5481266A (en) * 1994-11-17 1996-01-02 Davis; Warren F. Autodyne motion sensor
US5559497A (en) * 1994-11-28 1996-09-24 Hong; Chia-Ping Body temperature sensing and alarming device
US6072396A (en) * 1994-12-30 2000-06-06 Advanced Business Sciences Apparatus and method for continuous electronic monitoring and tracking of individuals
US5731757A (en) * 1996-08-19 1998-03-24 Pro Tech Monitoring, Inc. Portable tracking apparatus for continuous position determination of criminal offenders and victims
US5978738A (en) * 1997-02-13 1999-11-02 Anthony Brown Severe weather detector and alarm
US5959529A (en) * 1997-03-07 1999-09-28 Kail, Iv; Karl A. Reprogrammable remote sensor monitoring system
US6732064B1 (en) * 1997-07-02 2004-05-04 Nonlinear Solutions, Inc. Detection and classification system for analyzing deterministic properties of data using correlation parameters
US6009320A (en) * 1997-08-07 1999-12-28 Dudley; Sandra L. Vehicle alarm system triggerable cell phone activation circuit
US6173074B1 (en) * 1997-09-30 2001-01-09 Lucent Technologies, Inc. Acoustic signature recognition and identification
GB2324632B (en) * 1997-10-20 1999-08-11 Steven Derek Pike Microphone unit
US5970446A (en) * 1997-11-25 1999-10-19 At&T Corp Selective noise/channel/coding models and recognizers for automatic speech recognition
US6593845B1 (en) * 1998-01-09 2003-07-15 Intermec IP Corp. Active RF tag with wake-up circuit to prolong battery life
US6233045B1 (en) * 1998-05-18 2001-05-15 Light Works Llc Self-mixing sensor apparatus and method
US5923258A (en) * 1998-06-09 1999-07-13 K Jump Health Co., Ltd. Electronic thermometer with high intensity fever alarm
CA2255472C (en) * 1998-12-10 2005-02-22 Senco Sensors Inc. Electrochemical gas sensor with gas communication means
US6333694B2 (en) * 2000-03-09 2001-12-25 Advanced Marketing Systems Corporation Personal emergency response system
US6000932A (en) * 1999-04-12 1999-12-14 Willet; David Lighter that is convertible to a fishing lure
US6094141A (en) * 1999-05-03 2000-07-25 Tsai; Ching-Tien Low power-consumption luminous decorative/warning means
US6408187B1 (en) * 1999-05-14 2002-06-18 Sun Microsystems, Inc. Method and apparatus for determining the behavior of a communications device based upon environmental conditions
US6222458B1 (en) * 1999-11-15 2001-04-24 Scott C. Harris Automatic cell phone detection at a combustible delivery station
US6446958B1 (en) * 1999-11-18 2002-09-10 Pitney Bowes Inc. Method and system for directing an item through the feed path of a folding apparatus
US20040252867A1 (en) * 2000-01-05 2004-12-16 Je-Hsiung Lan Biometric sensor
US6587824B1 (en) * 2000-05-04 2003-07-01 Visteon Global Technologies, Inc. Selective speaker adaptation for an in-vehicle speech recognition system
US6232882B1 (en) * 2000-07-19 2001-05-15 Spectrum Electronics, Inc. Warning system and method for detection of tornadoes
US6466958B1 (en) * 2000-09-12 2002-10-15 Interstate Electronics Corporation, A Division Of L3 Communications Corporation Parallel frequency searching in an acquisition correlator
US7457750B2 (en) * 2000-10-13 2008-11-25 At&T Corp. Systems and methods for dynamic re-configurable speech recognition
US6549756B1 (en) * 2000-10-16 2003-04-15 Xoucin, Inc. Mobile digital communication/computing device including heart rate monitor
US6785550B1 (en) * 2000-11-28 2004-08-31 Lucent Technologies Inc. Mobile location estimation in a wireless communication system
US6434372B1 (en) * 2001-01-12 2002-08-13 The Regents Of The University Of California Long-range, full-duplex, modulated-reflector cell phone for voice/data transmission
US6826762B2 (en) * 2001-02-16 2004-11-30 Microsoft Corporation Radio interface layer in a cell phone with a set of APIs having a hardware-independent proxy layer and a hardware-specific driver layer
US6876968B2 (en) * 2001-03-08 2005-04-05 Matsushita Electric Industrial Co., Ltd. Run time synthesizer adaptation to improve intelligibility of synthesized speech
CA2438172C (en) * 2001-03-12 2007-01-16 Eureka Technology Partners, Llc Article locator system
US6690618B2 (en) * 2001-04-03 2004-02-10 Canesta, Inc. Method and apparatus for approximating a source position of a sound-causing event for determining an input used in operating an electronic device
US7003123B2 (en) * 2001-06-27 2006-02-21 International Business Machines Corp. Volume regulating and monitoring system
US20030044002A1 (en) * 2001-08-28 2003-03-06 Yeager David M. Three dimensional audio telephony
US6840904B2 (en) * 2001-10-11 2005-01-11 Jason Goldberg Medical monitoring device and system
US7399277B2 (en) * 2001-12-27 2008-07-15 Medtronic Minimed, Inc. System for monitoring physiological characteristics
JP3826032B2 (en) * 2001-12-28 2006-09-27 Toshiba Corporation Speech recognition device, speech recognition method, and speech recognition program
US6882837B2 (en) * 2002-01-23 2005-04-19 Dennis Sunga Fernandez Local emergency alert for cell-phone users
GB0202386D0 (en) * 2002-02-01 2002-03-20 Cedar Audio Ltd Method and apparatus for audio signal processing
US6943667B1 (en) * 2002-02-25 2005-09-13 Palm, Inc. Method for waking a device in response to a wireless network activity
US7202795B2 (en) * 2002-04-22 2007-04-10 Strategic Design Federation W, Inc. Weather warning system and method
US6830668B2 (en) * 2002-04-30 2004-12-14 Conductive Technologies, Inc. Small volume electrochemical sensor
US6992580B2 (en) * 2002-07-25 2006-01-31 Motorola, Inc. Portable communication device and corresponding method of operation
US20040081582A1 (en) 2002-09-10 2004-04-29 Oxyfresh Worldwide, Inc. Cell phone/breath analyzer
US6793448B2 (en) * 2002-10-25 2004-09-21 Itl Technologies, Inc. Adaptive rails for stacking/securing different sized shipping containers
US20040100376A1 (en) * 2002-11-26 2004-05-27 Kimberly-Clark Worldwide, Inc. Healthcare monitoring system
US7109859B2 (en) * 2002-12-23 2006-09-19 Gentag, Inc. Method and apparatus for wide area surveillance of a terrorist or personal threat
US8073689B2 (en) * 2003-02-21 2011-12-06 Qnx Software Systems Co. Repetitive transient noise removal
US7877105B2 (en) * 2003-07-02 2011-01-25 ST-Ericsson SA Method and arrangement for frequency synchronization of a mobile station with a base station in a mobile communication system
US6856253B1 (en) * 2003-08-14 2005-02-15 Gary W. Crook Personal hydrogen sulfide gas alarm system
JP4548646B2 (en) * 2003-09-12 2010-09-22 Sadaoki Furui Noise adaptation system for speech models, noise adaptation method, and noise adaptation program for speech recognition
US7774833B1 (en) * 2003-09-23 2010-08-10 Foundry Networks, Inc. System and method for protecting CPU against remote access attacks
US7042361B2 (en) * 2003-10-02 2006-05-09 Kazdin Ronald S Child monitoring, communication and locating system
US6975277B2 (en) * 2003-11-21 2005-12-13 Kyocera Wireless Corp. Wireless communications device pseudo-fractal antenna
US8271200B2 (en) * 2003-12-31 2012-09-18 Sieracki Jeffrey M System and method for acoustic signature extraction, detection, discrimination, and localization
US7126467B2 (en) * 2004-07-23 2006-10-24 Innovalarm Corporation Enhanced fire, safety, security, and health monitoring and alarm response method, system and device
US7261691B1 (en) * 2004-08-02 2007-08-28 Kwabena Asomani Personalized emergency medical monitoring and transmission system
US7574451B2 (en) * 2004-11-02 2009-08-11 Microsoft Corporation System and method for speeding up database lookups for multiple synchronized data streams
US8378811B2 (en) * 2005-03-11 2013-02-19 Aframe Digital, Inc. Mobile wireless customizable health and condition monitor
US8027833B2 (en) * 2005-05-09 2011-09-27 Qnx Software Systems Co. System for suppressing passing tire hiss
US7844307B2 (en) * 2006-01-26 2010-11-30 Sigmatel, Inc. Wireless handset having selective communication control and corresponding methods
US8078120B2 (en) * 2008-02-11 2011-12-13 Cobra Electronics Corporation Citizens band radio with wireless cellular telephone connectivity

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040157648A1 (en) * 2000-02-25 2004-08-12 Charmed Technology, Inc. Wearable computing device capable of responding intelligently to surroundings
US6684176B2 (en) * 2001-09-25 2004-01-27 Symbol Technologies, Inc. Three dimensional (3-D) object locator system for items or sites using an intuitive sound beacon: system and method of operation
US20050148346A1 (en) * 2003-12-30 2005-07-07 Maloney John E. TDOA/GPS hybrid wireless location system
US20050191963A1 (en) * 2004-02-28 2005-09-01 Hymes Charles M. Wireless communications with visually-identified targets
US20060109083A1 (en) * 2004-11-24 2006-05-25 Rathus Spencer A Method and apparatus for accessing electronic data about at least one person of interest
US8085145B2 (en) * 2009-04-03 2011-12-27 Sharp Laboratories Of America, Inc. Personal environmental monitoring method and system and portable monitor for use therein

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110285915A1 (en) * 2010-03-23 2011-11-24 Thales Method and system for cooperative transmission of a video sequence
US8705449B2 (en) * 2010-03-23 2014-04-22 Thales Method and system for cooperative transmission of a video sequence
US20180047415A1 (en) * 2015-05-15 2018-02-15 Google Llc Sound event detection
US10074383B2 (en) * 2015-05-15 2018-09-11 Google Llc Sound event detection

Also Published As

Publication number Publication date Type
US20110121965A1 (en) 2011-05-26 application
US20160125885A1 (en) 2016-05-05 application
US8390445B2 (en) 2013-03-05 grant
US20160125212A1 (en) 2016-05-05 application
US20090085873A1 (en) 2009-04-02 application
US7872574B2 (en) 2011-01-18 grant
US20120139721A1 (en) 2012-06-07 application
US20120154144A1 (en) 2012-06-21 application

Similar Documents

Publication Publication Date Title
Shin et al. Adaptive step length estimation algorithm using low-cost MEMS inertial sensors
Teixeira et al. A survey of human-sensing: Methods for detecting presence, count, location, track, and identity
Volgyesi et al. Shooter localization and weapon classification with soldier-wearable networked sensors
US7683929B2 (en) System and method for video content analysis-based detection, surveillance and alarm management
US6965312B2 (en) Firearm shot helmet detection system and method of use
US6961001B1 (en) Perimeter monitoring alarm method and system
US7486185B2 (en) Method and system for providing tracking services to locate an asset
US20060120568A1 (en) System and method for tracking individuals
US7592908B2 (en) Universal display exposure monitor using personal locator service
US20110092249A1 (en) Portable blind aid device
US20050128074A1 (en) Method and system for providing tracking services to locate an asset
US20070135690A1 (en) Mobile communication device that provides health feedback
US20010026240A1 (en) Personal location detection system
US20140089243A1 (en) System and Method For Item Self-Assessment As Being Extant or Displaced
US20130150004A1 (en) Method and apparatus for reducing mobile phone usage while driving
US7787857B2 (en) Method and apparatus for providing an alert utilizing geographic locations
US20070270122A1 (en) Apparatus, system, and method for disabling a mobile communicator
US20060071783A1 (en) Method and system for providing tracking services to locate an asset
US20150054639A1 (en) Method and apparatus for detecting mobile phone usage
US6178141B1 (en) Acoustic counter-sniper system
US7855935B1 (en) Weapon fire location systems and methods involving mobile device and/or other features
US20050017900A1 (en) Tracking unit
US20130040600A1 (en) Notification and Tracking System for Mobile Devices
US8630820B2 (en) Methods and systems for threat assessment, safety management, and monitoring of individuals and groups
US20050083195A1 (en) Disguised personal security system in a mobile communications device

Legal Events

Date Code Title Description
AS Assignment

Owner name: INNOVATION SPECIALISTS, LLC, FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BETTS, WILLIAM L.;BETTS, CAROL;REEL/FRAME:028935/0979

Effective date: 20060919