US20020167862A1 - Method and apparatus for approximating a source position of a sound-causing event for determining an input used in operating an electronic device - Google Patents

Method and apparatus for approximating a source position of a sound-causing event for determining an input used in operating an electronic device

Info

Publication number
US20020167862A1
US20020167862A1 (application US10/115,357)
Authority
US
United States
Prior art keywords
sound
source position
detection points
amplitude
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US10/115,357
Other versions
US6690618B2 (en)
Inventor
Carlo Tomasi
Fahri Surucu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HEILONGJIANG GOLDEN JUMPING GROUP Co Ltd
Original Assignee
Canesta Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Assigned to CANESTA, INC.: Assignment of Assignors Interest (see document for details). Assignors: SURUCU, FAHRI; TOMASI, CARLO
Priority to US10/115,357 (patent US6690618B2)
Application filed by Canesta Inc
Priority to PCT/US2002/010661 (patent WO2002082249A2)
Publication of US20020167862A1
Priority to US10/313,939 (patent US20030132921A1)
Priority to AU2002359625A (patent AU2002359625A1)
Priority to AU2003213068A (patent AU2003213068A1)
Priority to PCT/US2003/004530 (patent WO2003071411A1)
Publication of US6690618B2
Application granted
Assigned to HEILONGJIANG GOLDEN JUMPING GROUP CO., LTD.: Assignment of Assignors Interest (see document for details). Assignors: CANESTA, INC.
Anticipated expiration
Legal status: Expired - Fee Related

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/18Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/18Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
    • G01S5/22Position of source determined by co-ordinating a plurality of position lines defined by path-difference measurements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/043Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/043Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
    • G06F3/0433Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves in which the acoustic waves are either generated by a movable member and propagated within a surface layer or propagated within a surface layer and captured by a movable member
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/043Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
    • G06F3/0436Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves in which generating transducers and detecting transducers are attached to a single acoustic waves transmission substrate
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19695Arrangements wherein non-video detectors start video recording or forwarding but do not generate an alarm themselves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/80Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using ultrasonic, sonic or infrasonic waves
    • G01S3/802Systems for determining direction or deviation from predetermined direction
    • G01S3/808Systems for determining direction or deviation from predetermined direction using transducers spaced apart and measuring phase or time difference between signals therefrom, i.e. path-difference systems
    • G01S3/8083Systems for determining direction or deviation from predetermined direction using transducers spaced apart and measuring phase or time difference between signals therefrom, i.e. path-difference systems determining direction of source
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S367/00Communications, electrical: acoustic wave systems and devices
    • Y10S367/907Coordinate determination

Definitions

  • one type of sonar technology generates sound waves that can be reflected off of objects.
  • An acoustic wave may be generated and sent out in a particular direction. If the acoustic wave encounters an object, the acoustic wave is reflected back to the source position. The time-of-travel for the acoustic wave to be sent out and reflected back is used to determine a distance of the object from the source position.
  • Another type of acoustic application includes devices that use an array of microphones to record sounds.
  • a typical use for a microphone array is to detect the direction of a voice in a room and to perform signal processing so as to enhance the voice recording from that direction.
  • Another type of acoustic localization technology is passive sonar.
  • Passive sonar devices listen for sounds emitted by designated objects.
  • passive sonar technology is often used to detect underwater vessels and marine life.
  • Typical passive sonar applications look for sound over large underwater domains.
  • the sounds that are detected and processed by such applications are continuous signals, such as emitted by submarines. These sounds may last seconds or even minutes.
  • passive sonar applications transform detected signals into the frequency domain, and apply complex frequency-based processing methods to determine signal delays.
  • Triangulation is one common technique used to identify position information of objects. Triangulation typically uses radio-frequency (RF) waves.
  • a device can be located by having an RF transmitter that emits signals detectable to a matching receiver. The receiver triangulates signals received from the transmitter over a duration of time in order to determine the transmitter's position.
  • sound caused by an event of interest is detected at a plurality of detection points.
  • Information about the sound is recorded. This information is dependent on a distance between the source position and the location of the detection point where the sound is received.
  • the source position is approximated using the recorded information and the positions of the individual detection points relative to one another. The approximated source position is used as input for operating the electronic device.
  • FIG. 1 is a block diagram of a localization apparatus, under an embodiment of the invention.
  • FIG. 2 illustrates a method for approximating a source position of where a sound-causing event of interest occurs.
  • FIG. 3 illustrates a method for approximating a source position of where an event of interest occurs using a time value of when the sound emitted by the event is received at different detection points.
  • FIG. 5 illustrates a method for approximating a source position of where an event of interest occurs using amplitude information derived from the sound emitted by the event.
  • FIG. 6 illustrates a technique for determining the source position of the sound based on the determined amplitudes of the sound at the individual detection points.
  • FIG. 7 illustrates characteristics of a waveform that can be used to determine when a sound caused by an event of interest is determined as being received by microphones in a microphone set.
  • FIG. 8 illustrates a system using an imaged keyboard, under an embodiment of the invention.
  • FIG. 9 is a block diagram that illustrates a computer system upon which an embodiment of the invention may be implemented.
  • Embodiments of the invention describe a method and apparatus for approximating a source position of a sound-causing event.
  • numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the present invention.
  • Embodiments of the invention provide a method and apparatus for approximating a source position of a sound-causing event for determining an input used to operate an electronic device.
  • sound caused by an event of interest is detected at a plurality of detection points.
  • Information about the sound is recorded. This information is dependent on a distance between the source position and the location of the detection point where the sound is received.
  • the source position is approximated using the recorded information and the positions of the individual detection points relative to one another.
  • in another embodiment, a system includes sound-detection devices and a processor operatively coupled to the sound-detection devices.
  • the processor can determine when sounds received by the sound-detection devices correspond to the arrival of sound from an event of interest.
  • the processor can approximate the source position of the sound, and use the source position as input for executing an operation.
  • the term “approximate” may refer to a determination or measurement that is 80% accurate.
  • An “electronic device” includes devices having processing resources. Examples of electronic devices that can be used with embodiments of the invention include mobile phones, personal-digital assistants such as those operating a PALM OS (manufactured by PALM INC.) or POCKET PC (manufactured by MICROSOFT CORP.), digital cameras, and personal computer systems such as laptop or desktop computers.
  • a “sound-causing event” is an event that causes a short, sharp sound to occur.
  • the event may be the result of an activity, action, or movement.
  • the event corresponds to a user initiating contact with a finger to a surface such as a table.
  • the sound-causing event is short and discrete.
  • the duration for the sound-causing event is less than a second, or even half a second, and preferably less than 250 milliseconds.
  • a sound emitted by the sound-causing event, originating from the position where the event occurred, may have a similarly short duration. Accordingly, embodiments of the invention provide that sounds from the sound-causing events may be utilized as transient signals for processing purposes, in that the sounds have sharp peaks and relatively short durations.
  • Sound-detection devices include microphones. Specific examples of microphones for use with embodiments of the invention include electromagnetic, electrostatic, and piezo-electric devices.
  • the sound-detection devices may process sounds in analog form. The sounds may be converted into digital format for the processor using an analog-digital converter.
  • embodiments of the invention allow for a diverse range of applications in which events of interest can be located through sounds that are emitted when those events occur. The sounds from the events of interest are used to determine the range of the source position from a designated point. In turn, the source position of the event of interest may be determined. The determination of the source position may be interpreted as input for operating an electronic device.
  • Some of the applications discussed by embodiments of the invention include use in detecting a key selected by a user on a keyboard, where the keyboard is presented as an image.
  • Another application discussed is the ability to automate focusing and tracking of an image capturing device, such as a video recorder.
  • a video recorder may be configured to detect the approximate position of a speaker in a room. The speaker's position may be input for focusing and tracking the speaker with the video recorder or other image capturing device.
  • embodiments of the invention have particular applications in which the source position of a sound emitted from an event of interest corresponds to input for operating an electronic device.
  • a system or apparatus incorporated under an embodiment may be portable.
  • portable microphones may attach to a mobile computer system, such as a personal-digital assistant or cell phone.
  • a processor of the computer system may execute operations using a source position of sound arising from an event of interest as input.
  • a system or apparatus such as provided under an embodiment of the invention may be in the vicinity of the event that causes the sound.
  • the system or apparatus may be within a room or building in which the event takes place.
  • the system or apparatus may be used on a table in which the event of interest occurs.
  • Specific dimensions of regions that may contain both the event of interest and the system or apparatus include, for example, dimensions of 10 meters by 10 meters.
  • sounds can be differentiated from regions that are square inches or less in area, where the regions combined occupy an area of less than two feet in either direction.
  • embodiments of the invention determine the approximate positions of events of interest from the sounds they emit, using relatively simple computational methods. Since the durations of sounds detected from events of interest are relatively short, all computations needed to approximate the source position of an event of interest may be performed in the time domain, without transforming any of the signals into the frequency domain.
  • the use of transient time signals by some embodiments may preclude the use of more complex frequency-domain processing, such as used in many passive sonar applications.
  • embodiments of the invention enable relatively simple computational methods to be employed for approximating the source position of an event of interest.
  • the use of relatively simple computational methods conserves processing and computing resources, enabling embodiments of the invention to be employed with relatively small electronic devices. As such, embodiments of the invention compare favorably to prior art applications, such as current passive sonar technology, which uses more extensive computational methods to determine position information based on measured sound delays.
  • FIG. 1 is a block diagram of a localization apparatus, under an embodiment of the invention.
  • the localization apparatus 100 includes a set of microphones (collectively referred to as “microphone set” 110 ) employed as sound-detecting devices.
  • microphone set 110 includes a first microphone 112 , a second microphone 114 and a third microphone 116 .
  • the microphones in microphone set 110 are operatively coupled to an electronic device 120.
  • the position of each microphone in the microphone set 110 is known relative to the other microphones in the set.
  • each microphone in the microphone set 110 is coupled to the electronic device via an analog-digital converter 108 .
  • the electronic device 120 may be formed by a combination of processing and memory resources.
  • electronic device 120 may correspond to a desktop or laptop computer equipped with a sound card for receiving information from microphones in microphone set 110.
  • electronic device 120 may be a computer system, such as described with FIG. 9.
  • the microphones in the microphone set 110 are fixed relative to one another.
  • each microphone 112 , 114 and 116 may be positioned in one housing.
  • the microphones in the microphone set 110 may be independently positionable relative to the other microphones in the microphone set.
  • microphones in the microphone set 110 may each be operatively coupled to the electronic device 120 via a wire link.
  • the microphones in the microphone set 110 may be operatively coupled to the electronic device 120 using a wireless link, such as through infrared or RF communications.
  • the relative positions of the microphones in microphone set 110 have to be determined once the individual microphones are positioned.
  • Embodiments of the invention employ microphones on at least three detection points.
  • FIG. 1 illustrates three microphones in microphone set 110 , each positioned at a detection point.
  • the microphones in the microphone set 110 may be arranged in a curve or non-linear fashion with respect to a reference plane.
  • the reference plane may coincide with the sheet of paper.
  • the microphones in the microphone set 110 may be arranged so as to enclose or at least partially surround a space where the event occurs. For instance, in a room, microphones could be placed at the corners. If the microphones are used to determine the location of taps on a surface, then the best performance is achieved when they are placed around the area where the taps occur. However, placement for optimal localization may not always be desirable. For instance, if the localization apparatus is to be used to enter text into a portable device, then it is desirable to arrange the microphones as close as possible to each other, in order to reduce the size of the package. In cases like this, compromises must be made between localization accuracy and other constraints on microphone placement.
  • the electronic device 120 records information from sound received by each microphone in the microphone set 110 .
  • the information recorded from the sound may be dependent on the distance between the source position 130 and the microphone receiving the sound. Since the relative positions of the microphones in the microphone set are known relative to one another, the information recorded from the sound received by each microphone provides information about the distance between the source position 130 and the designated point in the localization apparatus.
  • information that can be recorded from microphones receiving the sound from source position 130 may include the amplitude of an incoming sound emitted by an event occurring at the source position, and/or the time in which each microphone is determined to receive the sound caused by the same event. This information may be used to determine range values and a direction of the source point 130 from the designated point, so that the source position 130 may be approximated relative to the designated point. According to one embodiment, at least three microphones are needed in microphone set 110 in order to determine two or three dimensions that locate the source position 130 from the designated point.
  • the electronic device 120 may execute one or more algorithms, such as described in FIG. 2, FIG. 3 and FIG. 6, to approximate the position of the source position 130 .
  • the electronic device corresponds to a camera, video recorder, or other image capturing device having processing abilities.
  • the event may correspond to a voice created by a speaker in a room.
  • a localization apparatus such as described with FIG. 1 may be used to determine the position of the speaker. This input may be used to adjust the image capturing device to reflect range and direction of the speaker relative to the image capturing device. This allows the image capturing device to track the speaker while keeping the speaker's image in focus.
  • a remote control system for controlling an electronic device may be configured to interpret a user's gestures as commands.
  • the sound-detection system may identify a position of the user, including identifying a range of the user from the electronic device being controlled. Once the user's position is approximated, the optical-detection system can differentiate the user from other individuals who may be moving or making motions that could otherwise mistakenly be interpreted as commands.
  • a security system device may be automatically controlled through sounds made by movement or activity.
  • a video recorder may monitor a designated region for security purposes.
  • the video recorder is configured to approximate the source position of an activity or event by detecting and analyzing emitted sounds from the activity or event.
  • the video recorder may be integrated into a localization apparatus such as described by FIG. 1 so as to be automatically focused and trained onto the approximate source position of the emitted sounds. In this way, an event of interest to the security system may be tracked and recorded in focus by the video recorder.
  • a sound caused by an event of interest occurring at source position 130 is detected at a plurality of detection points.
  • each detection point may coincide with the placement of a microphone in microphone set 110 .
  • a sound-causing event of interest is a “tap” on a surface corresponding to a user's selection of an input (see FIG. 8).
  • Another example is the detection of a spoken word by a speaker in a room.
  • in step 220, information about the sound caused by the event at source position 130 may be recorded when the sound is determined as being received by each detection point.
  • the information may be recorded by electronic device 120 .
  • the information is dependent on the distance between the source position 130 and the microphones in microphone set 110 that receive the sound. For example, if the microphones are arranged in a non-linear fashion with respect to source position 130, the information recorded at each microphone in the microphone set 110 may be distinguishable from information recorded by another microphone in the microphone set. The effect of distance on the information recorded as a result of each microphone in microphone set 110 receiving the sound is indicative of the position of source position 130.
  • information recorded by the electronic device 120 may correspond to an amplitude of the sound emitted by the event of interest, as recorded at each of the microphones in the microphone set 110.
  • the information recorded by the electronic device 120 may correspond to the time at which the electronic device records the arrival of the sound emitted by the event of interest at each microphone in microphone set 110 .
  • the information recorded is a combination or average of the amplitude of the sound received by each microphone, and the recorded time at which the electronic device 120 detects the sound as arriving at each microphone.
  • in step 230, information recorded at each microphone in microphone set 110 is compared with information recorded by at least one other microphone in the microphone set.
  • the information compared may correspond to the amplitude of the sound received at each microphone in the microphone set 110 , and/or the time at which the sound is determined as being received by each of the microphones.
  • in step 240, dimensions defining a space between the source position 130 and the designated point are approximated using the information recorded at each of the detection points. In one embodiment, the approximation is made based on a comparison of the information recorded for each microphone. For example, a first dimension for defining source position 130 relative to the designated point may be determined by comparing the information recorded from first microphone 112 receiving the sound to the information recorded by the second microphone 114 receiving the same sound. A second dimension may be determined by comparing the information recorded from first microphone 112 receiving the sound to the information recorded by the third microphone 116 receiving the same sound. A third dimension may be determined by comparing the information recorded from second microphone 114 receiving the sound to the information recorded by the third microphone 116 receiving the sound. Additional microphones may be employed to make additional comparisons between microphones for purposes of redundancy and added dimensions.
  • the range(s) and direction of the source position may be determined in relation to the detection points.
  • some prior art techniques that utilize microphone arrays that record sound information can only tell the direction in which the sound is being received. Such techniques cannot tell the range of the source position for where the sound is or was created.
  • FIG. 3 illustrates a method for approximating a source position of where an event of interest occurs using a time value of when the sound emitted by the event is received at different detection points.
  • a method such as described with FIG. 3 may be implemented using components described with localization apparatus such as described by FIG. 1. Any reference to numerals of FIG. 1 is intended to illustrate exemplary components for practicing an embodiment such as described by FIG. 3.
  • the positions of individual detection points are identified relative to other detection points in the plurality of detection points.
  • microphones in microphone set 110 may be employed in a fixed arrangement so that their relative positions with respect to one another are known.
  • the microphones 112 , 114 and 116 may be constructed in a housing that keeps the microphones in a fixed position relative to one another.
  • microphones 112 , 114 and 116 are each independently positionable.
  • the microphones 112 , 114 and 116 are calibrated once the microphones are positioned.
  • a calibration process may be performed by first positioning the microphones in an operative relationship with an electronic device. Then a sound is emitted from a known reference point. Information from the sound is used to determine positions of the microphones relative to one another, based on the position of the reference point being known.
  • sound is used to detect a key selected by a user from a displayed image of a set of keys.
  • the microphones may be positionable so as to detect taps corresponding to the user touching a space on which the image of the set of keys is being displayed. Once the microphones are positioned, the user may be required to touch a surface at a point corresponding to a designated calibration key.
  • the position of each key in the set of keys may be known relative to the key designated for calibration. Since the spatial relationship between the set of keys being displayed is known, the selection of the calibration key enables the relative distances between the microphones to be determined.
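  • One simplified reading of this calibration step is sketched below: the arrival-time offsets recorded when the user taps the designated calibration key serve as reference data from which the microphone geometry can later be solved. The solver itself is not shown, and all names and values are illustrative assumptions rather than part of the patent:
      def record_calibration_tap(arrival_times, reference_mic="mic_112"):
          """Store the arrival-time offsets observed for a tap on the designated
          calibration key. Together with the known key layout, these offsets are
          the inputs a later step would use to recover the relative microphone
          positions."""
          t0 = arrival_times[reference_mic]
          return {name: t - t0 for name, t in arrival_times.items()}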
  • the distance between each microphone in the microphone set 110 may be measured after the microphones in the array are placed in the desired position.
  • in step 320, sound emitted by the event at the source position 130 is detected at each of the plurality of detection points.
  • the sound from the event may need to be distinguished from background sounds, or sounds from events that are not of interest.
  • electronic device 120 has to be configured to make a determination as to when sound from the event has arrived.
  • the arrival of the sound from the source position 130 is detected and differentiated from other sounds, such as ambient noise and noise from events that are not of interest. Additional description of determining when sound from an event of interest arrives is described with FIG. 7.
  • Step 330 provides that the time value for when sound is detected as arriving at each detection point is determined.
  • the time value may correspond to a time stamp of when each of the microphones in microphone set 110 is determined to receive the sound from source position 130 .
  • the time value marked at each detection point is dependent on the distance between that detection point and the source position 130 .
  • Step 340 provides that the time values determined at individual detection points are compared with time values determined at other detection points. For example, a time stamp marked when first microphone 112 receives the sound from source position 130 is compared with one or both time stamps marked when each of second microphone 114 and third microphone 116 receive the same sound.
  • in step 350, the comparisons of the time values are used to approximate the source position. For example, a difference between the time stamp at first microphone 112 and second microphone 114 may be used to determine one dimension of the source position relative to the designated position. A difference between the time stamp at first microphone 112 and third microphone 116 may be used to determine a second dimension of the source position relative to the designated position. A similar difference may be calculated by comparing the second microphone 114 and third microphone 116 in order to determine a third dimension of the source position relative to the designated position.
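  • As a minimal sketch (not from the patent; microphone names and time values are illustrative), the per-microphone time stamps determined in step 330 could be turned into the pairwise delays compared in steps 340 and 350 as follows:
      import itertools

      # Illustrative arrival time stamps (in seconds) for the three microphones.
      arrival_times = {"mic_112": 0.001021, "mic_114": 0.001275, "mic_116": 0.000987}

      def pairwise_delays(times):
          """Return the time difference of arrival for every microphone pair."""
          delays = {}
          for (name_a, t_a), (name_b, t_b) in itertools.combinations(times.items(), 2):
              delays[(name_a, name_b)] = t_b - t_a  # positive when the sound reaches a first
          return delays

      print(pairwise_delays(arrival_times))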
  • FIG. 4 illustrates a technique for determining the source position of a sound using the detected time values of when the sound is received by individual detection points.
  • FIG. 4 illustrates use of an iso-delay map for approximating a source position of sound emitted by an event of interest.
  • the iso-delay map consists of iso-delay curves that have been previously determined for sounds emitted at known source positions using known locations of detection points. Measured time values may be mapped onto the iso-delay map for purposes of finding intersecting iso-delay curves that approximate the source position of the event of interest.
  • the iso-delay curves can be used to determine the magnitude and direction of vector p. If a sound propagates from p at velocity v, it reaches the microphone at b with a time delay d_ab after reaching the microphone at a. This delay satisfies the equation ‖p − b‖ − ‖p − a‖ = v·d_ab (1).
  • Equation (1) is a quadratic equation in the two unknown variables in p. Equation (1) can be used to determine an iso-delay curve on the plane, on which the sound source must lie. This curve (i.e., 412) is the iso-delay curve for delay d_ab between a pair of microphones a and b.
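  • As a minimal numerical sketch (assuming a nominal speed of sound in air and planar microphone coordinates; none of these values come from the patent), the residual of equation (1) can be evaluated on a grid to trace an iso-delay curve such as curve 412:
      import numpy as np

      SPEED_OF_SOUND = 343.0  # m/s, an assumed nominal value for air

      def iso_delay_residual(p, a, b, d_ab, v=SPEED_OF_SOUND):
          """Zero when point p lies on the iso-delay curve for the microphone
          pair at a and b with measured delay d_ab (equation (1))."""
          p, a, b = np.asarray(p, float), np.asarray(a, float), np.asarray(b, float)
          return (np.linalg.norm(p - b) - np.linalg.norm(p - a)) - v * d_ab

      def trace_iso_delay_curve(a, b, d_ab, extent=0.5, step=0.002, tol=0.003):
          """Collect grid points (in metres) whose residual is near zero; the
          result approximates the curve on which the sound source must lie."""
          pts = []
          for x in np.arange(-extent, extent, step):
              for y in np.arange(-extent, extent, step):
                  if abs(iso_delay_residual((x, y), a, b, d_ab)) < tol:
                      pts.append((x, y))
          return pts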
  • iso-delay curves for two pairs of microphones are shown for two pairs of microphones arranged on the vertices of a diamond.
  • the solid curves 412 are for one pair of microphones, and the dotted curves 414 are for the other.
  • the two pairs of microphones may be formed using three microphones, as shown with FIG. 1.
  • A first pair of microphones may be formed by first microphone 112 and second microphone 114, and a second pair by first microphone 112 and third microphone 116.
  • the time values measured between each microphone pair are mapped onto iso-delay curves 412 , 414 .
  • the point at which the iso-delay curves intersect is the approximate source position.
  • the proximity of the source point to one microphone in a pair affects the position of the source point on the iso-delay map along the axis defining that pair of microphones.
  • Point E in FIG. 4 represents a source point that is centrally positioned between the first pair of microphones on axis Z, but more proximate to one microphone of the second pair on axis Y than to the other microphone in that pair.
  • Point F represents a source position that is central to both microphone pairs.
  • Point G represents a source position that is skewed towards one of the microphones in both pairs.
  • Each iso-delay curve in FIG. 4 represents one increment of time differential between the time one microphone in a pair of microphones receives a sound from the source point, as compared to the other microphone in the pair. For example, if the source position corresponds to Point A, the iso-delay curve indicates that the first microphone 112 receives the sound from the event two time increments (e.g., 10 microseconds) before the third microphone 116. If the source position corresponds to Point B, one of the microphones in each pair receives the sound from the source position before the other microphone in that pair. For the pair of microphones defined by axis Z, one microphone in that pair receives the sound from the source position three time increments before the other microphone in that pair. For axis Y, one microphone receives the sound from the source position two time increments before the other microphone in that pair. If the source position corresponds to Point C, then each microphone in both pairs receives the sound from the source position at approximately the same time.
  • two curves for the two microphone pairs intersect at two different points. Which of these two points corresponds to the sound source can be determined by either adding a third pair of microphones, or by using external knowledge. For instance, if the system is used to determine the location of a finger tap on a desktop, the area in which taps are allowed may be known, and microphones can be arranged so that only one intersection for each pair of curves lies within this area.
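  • A coarse sketch of this idea, restricting the search to the known tap area so that only one intersection is considered (function and variable names are illustrative assumptions, not taken from the patent):
      import numpy as np

      SPEED_OF_SOUND = 343.0  # m/s, assumed

      def localize_by_delays(mics, delays, area, grid_step=0.005, v=SPEED_OF_SOUND):
          """Return the point inside the allowed tap area that best satisfies
          equation (1) for every measured microphone-pair delay.

          mics   -- dict name -> (x, y) microphone position in metres
          delays -- dict (name_a, name_b) -> measured delay d_ab in seconds
          area   -- (xmin, xmax, ymin, ymax) region where taps are allowed
          """
          xmin, xmax, ymin, ymax = area
          best_point, best_error = None, float("inf")
          for x in np.arange(xmin, xmax, grid_step):
              for y in np.arange(ymin, ymax, grid_step):
                  p = np.array([x, y])
                  err = 0.0
                  for (name_a, name_b), d_ab in delays.items():
                      a, b = np.asarray(mics[name_a]), np.asarray(mics[name_b])
                      # residual of equation (1) for this microphone pair
                      err += ((np.linalg.norm(p - b) - np.linalg.norm(p - a)) - v * d_ab) ** 2
                  if err < best_error:
                      best_point, best_error = p, err
          return best_point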
  • For three-dimensional applications, additional information is needed from a third pair of microphones.
  • Each of the three pairs of microphones is said to yield a quadratic iso-delay surface (not shown) in space, rather than a curve.
  • the three microphone pairs are necessary to localize the sound source because at least three dimensions are needed to locate the source point.
  • the three dimensions may correspond to three ranges relative to an origin or a designated position, or a combination of direction and ranges.
  • the minimum number of microphones is three, since microphones a, b, c yield three pairs (a, b), (a, c), and (b, c).
  • Step 520 provides that the sound from an event occurring at the source point 130 is detected as being received. This step may be performed in a manner such as described with FIG. 3 and FIG. 7.
  • in step 540, the amplitudes of the sound determined at the individual detection points are compared to one another.
  • the amplitudes may be compared in a manner similar to the comparison of time values, as described in FIG. 3.
  • Step 550 provides that the comparisons made of the amplitudes at the individual detection points are used to approximate the source position of the sound. For example, a difference between the amplitude of the sound measured at a first detection point and at a second detection point may be used to calculate a first range value for locating the source position of the sound from a designated point. A difference between the amplitude of the sound measured between a second pair of detection points may be used to calculate a second range value for locating the source position of the sound from the designated point. A third range value may similarly be calculated for locating the source position relative to the designated point.
  • FIG. 6 illustrates a technique for determining the source position of the sound based on the determined amplitudes of the sound at the individual detection points.
  • a geometric locus map is used to approximate the source position of sound emitted by an event of interest.
  • the map contains a series of curves, individually referred to as a locus, which are each determined from differences in the measured amplitudes of the sound detected by a designated pair of microphones. At least two loci exist for when sound is measured by a set of three microphones. The intersection of loci determined by pairs of microphones, in a set of three or more microphones, corresponds to the approximate source position.
  • the geometric locus map may be used with a method such as described with FIG. 5.
  • the loci for one pair of microphones along an axis Z are represented by solid lines 612, and the loci for another pair of microphones along an axis Y are represented by dashed lines 614.
  • An individual locus may be determined as follows. Given a pair of omni-directional microphones located at points a and b, and a sound source at p with an arbitrary amplitude, the technique may be developed in the following manner. The microphones at a and b will receive sounds with amplitudes α and β, respectively. If a sound propagates through a three-dimensional medium to reach a microphone, then the amplitude of the sound decays at a rate that is inversely proportional to the distance traveled. The law of power conservation explains this easily: since sound waves propagate spherically in the air, the total power will traverse increasingly large spheres.
  • the sphere in question has a surface area of 4πr².
  • power decays inversely with the square of the traveled distance. Since power is related to the square of the sound amplitude, the amplitude decays inversely with the distance.
  • the ratio α/β of the amplitudes will be the reciprocal of the ratio ‖a − p‖/‖b − p‖ between the traveled distances. That is, α/β = ‖b − p‖/‖a − p‖ (2).
  • each such equation determines one of the geometric loci in FIG. 6 on which the sound source must lie.
  • each locus is a curve.
  • each equation determines a surface (not shown).
  • Two pairs of microphones (with the pairs optionally sharing one microphone) determine the location of the sound source in two dimensions.
  • in three dimensions, at least three pairs (any two of which can share at most one microphone) are needed. In either case, additional microphones can be used to provide redundant measurements, and therefore yield better accuracy in the result.
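  • Under the idealized 1/r decay described above, the locus in equation (2) for one microphone pair is a circle whose centre and radius follow in closed form. The sketch below assumes planar coordinates and the symbol names used in the reconstruction of equation (2); intersecting the circles from two microphone pairs then gives the candidate source positions in two dimensions:
      import numpy as np

      def amplitude_locus_circle(a, b, alpha, beta):
          """Centre and radius of the circle of possible source points for the
          microphone pair at a and b, given measured amplitudes alpha and beta.
          From equation (2), |a - p| / |b - p| = beta / alpha = k; for k != 1 the
          locus is a circle (for k == 1 it is the perpendicular bisector of ab)."""
          a, b = np.asarray(a, float), np.asarray(b, float)
          k = beta / alpha
          center = (a - k**2 * b) / (1.0 - k**2)
          radius = k * np.linalg.norm(a - b) / abs(1.0 - k**2)
          return center, radius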
  • FIG. 7 illustrates characteristics of a waveform that can be used to determine when a sound caused by an event of interest is determined as being received by microphones in a microphone set (i.e., 110 in FIG. 1). Without identifiable characteristics in the waveform that can be detected, microphones in microphone set 110 may not be able to distinguish sounds caused by events of interest from sounds caused by other events, or by noise.
  • the peak 712 may be detected if the value of the peak is greater than or within a range of peak values that correspond to a known sound emitting from the event of interest.
  • finger taps on a table may coincide with events of interest.
  • the electronic device 120 analyzing the sounds received by the microphones in microphone set 110 may seek a certain limit or range of peak values known to coincide with finger taps on a given surface. When the peak 712 is detected during a designated interval, the electronic device 120 knows that the sound received by the microphone is from an event of interest.
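  • A simplified sketch of such peak detection on a block of digitized samples (the threshold values and names are assumptions; a real device would tune them to the surface and microphones in use):
      import numpy as np

      def detect_tap_peak(samples, sample_rate, low, high):
          """Return (time_stamp, magnitude) of the first sample whose absolute
          amplitude lies within the peak range known to coincide with a tap,
          or None if no such peak occurs in the block."""
          magnitudes = np.abs(np.asarray(samples, float))
          candidates = np.where((magnitudes >= low) & (magnitudes <= high))[0]
          if candidates.size == 0:
              return None
          i = int(candidates[0])
          return i / sample_rate, magnitudes[i]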
  • a time value is associated with the peak 712 .
  • a time stamp 715 of when the peak 712 occurs may be used to subsequently approximate the source position of the event, as described in FIG. 3.
  • a magnitude 718 of the peak 712 may be used to subsequently determine the approximate source position of the event, as described in FIG. 5.
  • the waveform characteristic used to detect when microphones receive the sound from the event of interest is a waveform shape.
  • This technique may be referred to as matched filtering.
  • the shape of the waveform in the interval defined by 722 may match the shape of a known waveform corresponding to the event of interest.
  • a beginning point or amplitude of the waveform may be used to record the time stamp for when the waveform is received by one of the microphones.
  • the magnitude 718 of the peak detected for the duration 722 may be used to subsequently determine the approximate source position of the event, as described in FIG. 5.
  • the time stamp assigned to the waveform as detected by its shape may be used to approximate the source position, as described with FIG. 3.
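  • A minimal matched-filtering sketch using cross-correlation against a stored tap template (the template, detection threshold, and sample rate are assumptions for illustration, not values from the patent):
      import numpy as np

      def matched_filter_arrival(samples, template, sample_rate, threshold):
          """Slide a known tap template over the incoming waveform and return the
          time stamp of the best match together with its score, or None if the
          match score stays below the detection threshold."""
          samples = np.asarray(samples, float)
          template = np.asarray(template, float)
          scores = np.correlate(samples, template, mode="valid")
          i = int(np.argmax(scores))
          if scores[i] < threshold:
              return None
          return i / sample_rate, scores[i]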
  • the source position of a sound caused by an event of interest may be determined using a combination of techniques described herein.
  • the localization methods described in FIG. 3 and FIG. 5 used detected time values and amplitudes separately.
  • each localization method has essentially independent noise statistics.
  • the source position approximated by each localization method is affected in different ways by an inhomogeneous medium, or by electronic noise in the microphone amplifiers, and other deviations from an ideal situation. It therefore may be advantageous to use both methods, and combine their results, especially when a higher accuracy is required in the localization results than what could be obtained with either method used alone.
  • the weights given to the two estimates (two numbers or functions) could be proportional to the reciprocals of the standard deviations of the localization errors, as determined by a calibration procedure.
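  • As a sketch of such a combination, assuming each method already yields a position estimate and a calibrated standard deviation of its localization error (names are illustrative):
      import numpy as np

      def combine_estimates(p_time, sigma_time, p_amp, sigma_amp):
          """Weighted average of the time-delay and amplitude position estimates,
          with weights proportional to the reciprocals of the calibrated
          standard deviations of each method's localization error."""
          w_time, w_amp = 1.0 / sigma_time, 1.0 / sigma_amp
          p_time, p_amp = np.asarray(p_time, float), np.asarray(p_amp, float)
          return (w_time * p_time + w_amp * p_amp) / (w_time + w_amp)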
  • Embodiments of the invention may be used to enable a user to select an input for operating an electronic device.
  • the user may specify the input by selecting a position for where a sound-emitting event is to occur.
  • the user may make a key entry by tapping a region defined for a key.
  • the region may be defined using, for example, a displayed image of a keyboard.
  • FIG. 8 illustrates a system using an imaged keyboard, under an embodiment of the invention.
  • the system includes a projector 810, an electronic device 820 and a set of microphones 830.
  • the projector 810 is equipped to display an image 850 of a set of keys onto a surface 836 .
  • the microphone set 830 at least partially surrounds the area of the surface 836 where the image 850 is being displayed.
  • image 850 may correspond to a QWERTY keyboard.
  • a user can select one of the regions 852 differentiated in the image 850 as input for electronic device 820 .
  • the user may make the selection by touching surface 836 in one of the regions 852 corresponding to the key that the user wishes to select.
  • the act of touching the surface 836 causes a sound to be emitted (i.e. a tap) from the region of the surface 836 where that key is displayed.
  • the microphones 830 detect that sound.
  • the electronic device 820 is configured to recognize when that sound is detected.
  • the electronic device 820 may be configured to detect one of the characteristics of the sound received from the user's action, as described with FIG. 4.
  • electronic device 820 processes that sound as it was received by each of the microphones 830 for information that is dependent on the distance between the contacted region and the microphone detecting the sound.
  • electronic device 820 will detect a time stamp for when the sound of the tap was received by each microphone 830 .
  • the time stamp values may be used in a manner such as described by FIG. 3 to detect the contacted region 852 corresponding to the selected key.
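  • A sketch of the final mapping from an approximated tap position to a key of the displayed image (the rectangular layout description is an assumption; the patent does not prescribe a particular data structure):
      def key_for_position(x, y, layout):
          """Return the key whose displayed region 852 contains the approximated
          tap position (x, y), or None if the tap falls outside the image 850.
          'layout' is a list of (key, xmin, xmax, ymin, ymax) rectangles."""
          for key, xmin, xmax, ymin, ymax in layout:
              if xmin <= x <= xmax and ymin <= y <= ymax:
                  return key
          return None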
  • Computer system 900 further includes a read only memory (ROM) 908 or other static storage device coupled to bus 902 for storing static information and instructions for processor 904 .
  • a storage device 910 such as a magnetic disk or optical disk, is provided and coupled to bus 902 for storing information and instructions.
  • Computer system 900 may be coupled via bus 902 to a display 912 , such as a cathode ray tube (CRT), for displaying information to a computer user.
  • An input device 914 is coupled to bus 902 for communicating information and command selections to processor 904 .
  • Another type of user input device is cursor control 916, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 904 and for controlling cursor movement on display 912.
  • This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
  • the invention is related to the use of computer system 900 for approximating a source position of a sound-causing event for determining an input used in operating a computer system.
  • a method for approximating a source position of a sound-causing event for determining an input used in operating a computer system is provided by computer system 900 in response to processor 904 executing one or more sequences of one or more instructions contained in main memory 906 .
  • Such instructions may be read into main memory 906 from another computer-readable medium, such as storage device 910 . Execution of the sequences of instructions contained in main memory 906 causes processor 904 to perform the process steps described herein.
  • Computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read.
  • Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to processor 904 for execution.
  • the instructions may initially be carried on a magnetic disk of a remote computer.
  • the remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem.
  • a modem local to computer system 900 can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal.
  • An infrared detector coupled to bus 902 can receive the data carried in the infrared signal and place the data on bus 902 .
  • Bus 902 carries the data to main memory 906 , from which processor 904 retrieves and executes the instructions.
  • the instructions received by main memory 906 may optionally be stored on storage device 910 either before or after execution by processor 904 .
  • Computer system 900 also includes a communication interface 918 coupled to bus 902 .
  • Communication interface 918 provides a two-way data communication coupling to a network link 920 that is connected to a local network 922 .
  • communication interface 918 may be an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of telephone line.
  • communication interface 918 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN.
  • Wireless links may also be implemented.
  • communication interface 918 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
  • Network link 920 typically provides data communication through one or more networks to other data devices.
  • network link 920 may provide a connection through local network 922 to a host computer 924 or to data equipment operated by an Internet Service Provider (ISP) 926 .
  • ISP 926 in turn provides data communication services through the worldwide packet data communication network now commonly referred to as the “Internet” 928 .
  • Internet 928 uses electrical, electromagnetic or optical signals that carry digital data streams.
  • the signals through the various networks and the signals on network link 920 and through communication interface 918 which carry the digital data to and from computer system 900 , are exemplary forms of carrier waves transporting the information.
  • Computer system 900 can send messages and receive data, including program code, through the network(s), network link 920 and communication interface 918 .
  • a server 930 might transmit a requested code for an application program through Internet 928 , ISP 926 , local network 922 and communication interface 918 .
  • one such downloaded application provides for approximating a source position of a sound-causing event for determining an input used in operating a computer system as described herein.
  • the received code may be executed by processor 904 as it is received, and/or stored in storage device 910 , or other non-volatile storage for later execution. In this manner, computer system 900 may obtain application code in the form of a carrier wave.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Acoustics & Sound (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

A plurality of sound-detecting devices are configured to detect sound from an event of interest. Information about the sound is recorded that is dependent on a distance between the source position and the location of the sound-detection devices. The source position is approximated using the recorded information and the positions of the individual sound-detection devices relative to one another. The approximated source position is used as input for operating an electronic device.

Description

    RELATED APPLICATIONS
  • This application claims benefit of priority to Provisional U.S. Patent Application No. 60/281,314, filed Apr. 3, 2001, entitled “A Localization System Based On Sound Delays,” and naming Carlo Tomasi as an inventor. The aforementioned priority application is hereby incorporated by reference in its entirety for all purposes.[0001]
  • FIELD OF THE INVENTION
  • The present invention relates to position detection methods and apparatuses. In particular, the present invention relates to a method and apparatus for approximating a source position of a sound-causing event for determining an input used in operating an electronic device. [0002]
  • BACKGROUND OF THE INVENTION
  • There are several known techniques for approximating the position of objects using acoustics, light and radio-frequency waves. These types of localization technologies have a wide array of applications, including, for example, use in navigation and medical applications. [0003]
  • In the field of acoustics, one type of sonar technology generates sound waves that can be reflected off of objects. An acoustic wave may be generated and sent out in a particular direction. If the acoustic wave encounters an object, the acoustic wave is reflected back to the source position. The time-of-travel for the acoustic wave to be sent out and reflected back is used to determine a distance of the object from the source position. [0004]
  • More advanced acoustic wave reflection and detection techniques exist. For example, ultrasounds can map a three-dimensional object by bouncing acoustic waves off the object. [0005]
  • Another type of acoustic application includes devices that use an array of microphones to record sounds. A typical use for a microphone array is to detect the direction of a voice in a room and to perform signal processing so as to enhance the voice recording from that direction. [0006]
  • Another type of acoustic localization technology is passive sonar. Passive sonar devices listen for sounds emitted by designated objects. For example, passive sonar technology is often used to detect underwater vessels and marine life. Typical passive sonar applications look for sound over large underwater domains. The sounds that are detected and processed by such applications are continuous signals, such as those emitted by submarines. These sounds may last seconds or even minutes. In order to process continuous sounds, passive sonar applications transform detected signals into the frequency domain, and apply complex frequency-based processing methods to determine signal delays. [0007]
  • Triangulation is one common technique used to identify position information of objects. Triangulation typically uses radio-frequency (RF) waves. A device can be located by having an RF transmitter that emits signals detectable to a matching receiver. The receiver triangulates signals received from the transmitter over a duration of time in order to determine the transmitter's position. [0008]
  • SUMMARY OF THE INVENTION
  • The foregoing needs, and other needs and objects that will become apparent from the following description, are achieved in the present invention, which comprises, in some aspects, a method, apparatus, system and device that approximate a source position of a sound-causing event for determining an input used to operate an electronic device. [0009]
  • In an embodiment, sound caused by an event of interest is detected at a plurality of detection points. Information about the sound is recorded. This information is dependent on a distance between the source position and the location of the detection point where the sound is received. The source position is approximated using the recorded information and the positions of the individual detection points relative to one another. The approximated source position is used as input for operating the electronic device. [0010]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings. Like reference numerals are intended to refer to similar elements among different figures. [0011]
  • FIG. 1 is a block diagram of a localization apparatus, under an embodiment of the invention. [0012]
  • FIG. 2 illustrates a method for approximating a source position of where a sound-causing event of interest occurs. [0013]
  • FIG. 3 illustrates a method for approximating a source position of where an event of interest occurs using a time value of when the sound emitted by the event is received at different detection points. [0014]
  • FIG. 4 illustrates a technique for determining the source position of a sound using the detected time values of when the sound is received by individual detection points. [0015]
  • FIG. 5 illustrates a method for approximating a source position of where an event of interest occurs using amplitude information derived from the sound emitting from the event. [0016]
  • FIG. 6 illustrates a technique for determining the source position of the sound based on the determined amplitudes of the sound at the individual detection points. [0017]
  • FIG. 7 illustrates characteristics of a waveform that can be used to determine when a sound caused by an event of interest is determined as being received by microphones in a microphone set. [0018]
  • FIG. 8 illustrates a system using an imaged keyboard, under an embodiment of the invention. [0019]
  • FIG. 9 is a block diagram that illustrates a computer system upon which an embodiment of the invention may be implemented. [0020]
  • DETAILED DESCRIPTION OF THE INVENTION
  • Embodiments of the invention describe a method and apparatus for approximating a source position of a sound-causing event. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the present invention. [0021]
  • A. Overview [0022]
  • Embodiments of the invention provide a method and apparatus for approximating a source position of a sound-causing event for determining an input used to operate an electronic device. According to one embodiment, sound caused by an event of interest is detected at a plurality of detection points. Information about the sound is recorded. This information is dependent on a distance between the source position and the location of the detection point where the sound is received. The source position is approximated using the recorded information and the positions of the individual detection points relative to one another. [0023]
  • In another embodiment, a system is provided that includes sound-detection devices and a processor operatively coupled to the sound detection devices. The processor can determine when sounds received by the sound-detection devices correspond to the arrival of sound from an event of interest. The processor can approximate the source position of the sound, and use the source position as input for executing an operation. [0024]
  • As used herein, the term “approximate” may refer to a determination or measurement that is 80% accurate. [0025]
  • An “electronic device” includes devices having processing resources. Examples of electronic devices that can be used with embodiments of the invention include mobile phones, personal-digital assistants such as those operating a PALM OS (manufactured by PALM INC.) or POCKET PC (manufactured by MICROSOFT CORP.), digital cameras, and personal computer systems such as laptop or desktop computers. [0026]
  • A “sound-causing event” is an event that causes a short, sharp sound to occur. The event may be the result of an activity, action, or movement. For example, in an application discussed with FIG. 8, the event corresponds to a user initiating contact with a finger to a surface such as a table. In many cases, the sound-causing event is short and discrete. In some embodiments, the duration for the sound-causing event is less than a second, or even half a second, and preferably less than 250 milliseconds. The duration of a sound emitted by the sound-causing event originating from the position where the event occurred may have a similarly short duration. Accordingly, embodiments of the invention provide that sounds from the sound-causing events may be utilized as transient signals for processing purposes, in that the sounds have sharp peaks and relatively short durations. [0027]
  • “Sound-detection devices” include microphones. Specific examples of microphones for use with embodiments of the invention include electromagnetic, electrostatic, and piezo-electric devices. The sound-detection devices may process sounds in analog form. The sounds may be converted into digital format for the processor using an analog-digital converter. Among other advantages, embodiments of the invention allow for a diverse range of applications in which events of interest can be located through sounds that are emitted when those events occur. The sounds from the events of interest are used to determine the range of the source position from a designated point. In turn, the source position of the event of interest may be determined. The determination of the source position may be interpreted as input for operating an electronic device. [0028]
  • Some of the applications discussed by embodiments of the invention include use in detecting a key selected by a user on a keyboard, where the keyboard is presented as an image. Another application discussed is the ability to automate focusing and tracking of an image capturing device, such as a video recorder. For example, a video recorder may be configured to detect the approximate position of a speaker in a room. The speaker's position may be input for focusing and tracking the speaker with the video recorder or other image capturing device. [0029]
  • Accordingly, embodiments of the invention have particular applications in which the source position of a sound emitted from an event of interest corresponds to input for operating an electronic device. In particular, a system or apparatus incorporated under an embodiment may be portable. For example, portable microphones may attach to a mobile computer system, such as a personal-digital assistant or cell phone. A processor of the computer system may execute operations using a source position of sound arising from an event of interest as input. [0030]
  • In this way, a system or apparatus such as provided under an embodiment of the invention may be in the vicinity of the event that causes the sound. For example, the system or apparatus may be within a room or building in which the event takes place. The system or apparatus may be used on a table on which the event of interest occurs. Specific dimensions of regions that may contain both the event of interest and the system or apparatus include, for example, dimensions of 10 meters by 10 meters. In some applications, sounds can be differentiated from regions that are a few square inches or less in area, where the regions combined occupy an area of less than two feet in either direction. [0031]
  • Furthermore, embodiments of the invention determine the approximate positions of events of interest using relatively simple computational methods. Since the duration of sounds detected from events of interest is relatively short, all computations needed to approximate the source position of an event of interest may be performed in the time domain, without transforming any of the signals into the frequency domain. The use of transient time signals by some embodiments may preclude the use of more complex frequency-domain processing, such as is used in many passive sonar applications. As a result, embodiments of the invention enable relatively simple computational methods to be employed for approximating the source position of an event of interest. The use of relatively simple computational methods conserves processing and computing resources, enabling embodiments of the invention to be employed with relatively small electronic devices. As such, embodiments of the invention compare favorably to prior art applications, such as current passive sonar technology, which uses more extensive computational methods to determine position information based on measured sound delays. [0032]
  • B. System Description [0033]
  • FIG. 1 is a block diagram of a localization apparatus, under an embodiment of the invention. The [0034] localization apparatus 100 includes a set of microphones (collectively referred to as “microphone set” 110) employed as sound-detecting devices. Given an example such as shown by FIG. 1, microphone set 110 includes a first microphone 112, a second microphone 114 and a third microphone 116. The microphones in microphone set 110 are operatively coupled to an electronic device 120. The positions of individual microphones in the microphone set 110 are known relative to the other microphones in the microphone set. In one embodiment, each microphone in the microphone set 110 is coupled to the electronic device via an analog-digital converter 108.
  • The [0035] electronic device 120 may be formed by a combination of processing and memory resources. For example, electronic device 120 may correspond to a desktop or laptop computer equipped with a sound card for receiving information from microphones in microphone set 110. In one embodiment, electronic device 120 may be a computer system, such as described with FIG. 9.
  • In one embodiment, the microphones in the microphone set [0036] 110 are fixed relative to one another. For example, each microphone 112, 114 and 116 may be positioned in one housing. In another embodiment, the microphones in the microphone set 110 may be independently positionable relative to the other microphones in the microphone set. For example, microphones in the microphone set 110 may each be operatively coupled to the electronic device 120 via a wire link. Alternatively, the microphones in the microphone set 110 may be operatively coupled to the electronic device 120 using a wireless link, such as through infrared or RF communications. In such embodiments, the relative positions of the microphones in microphone set 110 have to be determined once the individual microphones are positioned.
  • If the localization apparatus is to be used on a surface, such as the top of a desk or other surface, it may be beneficial to couple the sensitive element of the [0037] microphones 110 intimately with the surface itself, so that sounds are recorded with the highest possible strength. Coupling can be achieved by pushing the microphones onto the surface. The weight of a package containing the microphones 110 may be sufficient to exert this push. In one embodiment, one or more of the microphones 110 may protrude slightly from the bottom surface of the package, pushed by a spring from the back. When the package is placed on the surface, its weight compresses the spring, and helps maintain contact between the microphone and the surface.
  • In this situation it may also be desirable to shield the [0038] microphones 110 from spurious sounds propagating through air. This can be achieved with foam mufflers placed around the microphones themselves.
  • Embodiments of the invention employ microphones at three or more detection points. Thus, FIG. 1 illustrates three microphones in microphone set [0039] 110, each positioned at a detection point. The microphones in the microphone set 110 may be arranged in a curve or non-linear fashion with respect to a reference plane. The reference plane may coincide with the plane of the drawing sheet.
  • To improve performance, the microphones in the microphone set [0040] 110 may be arranged so as to enclose or at least partially surround a space where the event occurs. For instance, in a room, microphones could be placed at the corners. If the microphones are used to determine the location of taps on a surface, then the best performance is achieved when they are placed around the area where the taps occur. However, placement for optimal localization may not always be desirable. For instance, if the localization apparatus is to be used to enter text into a portable device, then it is desirable to arrange the microphones as close as possible to each other, in order to reduce the size of the package. In cases like this, compromises must be made between localization accuracy and other constraints on microphone placement.
  • For purpose of illustration, a sound-causing event may occur at a [0041] source position 130. The source position 130 has coordinates (i, j) relative to a known designated point. The designated point may correspond to a position of one of the microphones, or to electronic device 120. The source position 130 may be a first distance D1 away from a first microphone 112, a second distance D2 away from second microphone 114, and a third distance D3 away from third microphone 116. The position of each microphone in the microphone set 110 relative to another microphone in the microphone set is known or determinable. The individual microphones each receive the sound emitted by the event at source 130.
  • The [0042] electronic device 120 records information from sound received by each microphone in the microphone set 110. The information recorded from the sound may be dependent on the distance between the source position 130 and the microphone receiving the sound. Since the relative positions of the microphones in the microphone set are known relative to one another, the information recorded from the sound received by each microphone provides information about the distance between the source position 130 and the designated point in the localization apparatus.
  • As will be described, information that can be recorded from microphones receiving the sound from [0043] source position 130 may include the amplitude of an incoming sound emitted by an event occurring at the source position, and/or the time in which each microphone is determined to receive the sound caused by the same event. This information may be used to determine range values and a direction of the source point 130 from the designated point, so that the source position 130 may be approximated relative to the designated point. According to one embodiment, at least three microphones are needed in microphone set 110 in order to determine two or three dimensions that locate the source position 130 from the designated point. The electronic device 120 may execute one or more algorithms, such as described in FIG. 2, FIG. 3 and FIG. 6, to approximate the position of the source position 130.
  • In one embodiment, the approximated source position is used as input for operating the electronic device. In one application, the electronic device corresponds to a projector connected to a portable computer (see e.g. FIG. 8). The projector is configured to display a set of inputs on a surface for a user. The user's selection of one of the inputs is determined by the location where the user contacts the surface. The user's contact may cause a “tap” or similar sound to be created. This sound is detected by microphones that surround the surface where the set of inputs is being displayed. The sound locates the position where the user made the tap. This position is interpreted as input for operating the electronic device. The input may correspond to a key selection, such as provided by a QWERTY keyboard. [0044]
  • In another embodiment, the electronic device corresponds to a camera, video recorder, or other image capturing device having processing abilities. The event may correspond to a voice created by a speaker in a room. A localization apparatus such as described with FIG. 1 may be used to determine the position of the speaker. This input may be used to adjust the image capturing device to reflect range and direction of the speaker relative to the image capturing device. This allows the image capturing device to track the speaker while keeping the speaker's image in focus. Several other applications may employ features described by embodiments of the invention. For example, a remote control system for controlling an electronic device may be configured to interpret a user's gestures as commands. The remote-control system may be employed to control a wide variety of electronic devices, such as televisions, audio systems, and home entertainment centers. In one application, an optical detection system is used to interpret gestures, such as made with an arm, hand or finger, as a command. A sound-detection system can be used to initiate the optical detection system by directing the optical detection system to a user that has selected to make the commands. For example, a user can choose to enter a gesture command by first making a sound, such as a voice-command or a finger snap. The sound-detection system may approximate the position of the user in the room, so that the optical detection system can be trained onto that user to detect the user's gestures. The sound-detection system may identify a position of the user, including identifying a range of the user from the electronic device being controlled. Once the user's position is approximated, the optical-detection system can differentiate the user from other individuals who may be moving or making motions that could otherwise mistakenly be interpreted as commands. [0045]
  • In another application, a security system device may be automatically controlled through sounds made by movement or activity. For example, a video recorder may monitor a designated region for security purposes. According to an embodiment, the video recorder is configured to approximate the source position of an activity or event by detecting and analyzing emitted sounds from the activity or event. The video recorder may be integrated into a localization apparatus such as described by FIG. 1 so as to be automatically focused and trained onto the approximate source position of the emitted sounds. In this way, an event of interest to the security system may be tracked and recorded in focus by the video recorder. [0046]
  • FIG. 2 illustrates a method for approximating a source position of where a sound-causing event of interest occurs. A method such as described with FIG. 2 may be implemented using a localization apparatus such as described with FIG. 1. Any reference to numerals of FIG. 1 is intended to illustrate exemplary components for practicing an embodiment of the invention, as described by FIG. 2. [0047]
  • In [0048] step 210, a sound caused by an event of interest occurring at source position 130 is detected at a plurality of detection points. In one embodiment, each detection point may coincide with the placement of a microphone in microphone set 110. As will be described, one example of a sound-causing event of interest is a “tap” on a surface corresponding to a user's selection of an input (see FIG. 8). Another example is the detection of a spoken word by a speaker in a room.
  • In [0049] step 220, information about the sound caused by the event at source position 130 may be recorded when the sound is determined as being received by each detection point. The information may be recorded by electronic device 120. The information is dependent on the distance between the source position 130 and the microphones in microphone set 110 that receive the sound. For example, if the microphones are arranged in a non-linear fashion with respect to source position 130, the information recorded at each microphone in the microphone set 110 may be distinguishable from information recorded by another microphone in the microphone set. The effect of distance on the information recorded as a result of each microphone in microphone set 110 receiving the sound is indicative of the location of source position 130.
  • According to one embodiment of the invention, information recorded by the [0050] electronic device 120 may correspond to an amplitude of the sound emitted from the event of interest, as recorded at each of the microphones in the microphone set 110. According to another embodiment, the information recorded by the electronic device 120 may correspond to the time at which the electronic device records the arrival of the sound emitted by the event of interest at each microphone in microphone set 110. In still another embodiment, the information recorded is a combination or average of the amplitude of the sound received by each microphone, and the recorded time at which the electronic device 120 detects the sound as arriving at each microphone.
  • In [0051] step 230, information recorded at each microphone in microphone set 110 is compared with information recorded by at least one other microphone in the microphone set. The information compared may correspond to the amplitude of the sound received at each microphone in the microphone set 110, and/or the time at which the sound is determined as being received by each of the microphones.
  • In [0052] step 240, dimensions defining a space between the source position 130 and the designated point are approximated using the information recorded at each of the detection points. In one embodiment, approximation is made based on a comparison of the information recorded for each microphone. For example, a first dimension for defining source position 130 relative to the designated point may be determined by comparing the information recorded from first microphone 112 receiving the sound to the information recorded by the second microphone 114 receiving the same sound. A second dimension may be determined by comparing the information recorded from first microphone 112 receiving the sound to the information recorded by the third microphone 116 receiving the same sound. A third dimension may be determined by comparing the information recorded from second microphone 114 receiving the sound to the information recorded by the third microphone 116 receiving the sound. Additional microphones may be employed to make additional comparisons between microphones for purpose of redundancy and added dimensions.
  • As a result of a method such as described with FIG. 2, the range(s) and direction of the source position may be determined in relation to the detection points. In contrast, some prior art techniques that utilize microphone arrays to record sound information can only determine the direction from which the sound is received. Such techniques cannot determine the range to the position where the sound was created. [0053]
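  • The comparison of steps 230 and 240 can be illustrated with a short sketch. The following Python fragment is a hypothetical illustration only; the detection-point names and recorded values are invented for the example. It takes one distance-dependent value recorded per detection point for the same sound and forms the pairwise differences that the techniques of the following sections convert into dimensions of the source position. The absolute values by themselves are not sufficient, because neither the instant at which the event occurred nor its loudness is known in advance; only the differences between detection points carry location information.

    # Hypothetical sketch of the pairwise comparison in FIG. 2.
    # Each detection point records one distance-dependent value for the same sound
    # (an arrival time stamp here, but a peak amplitude works the same way);
    # the localization step consumes the pairwise differences of those values.
    from itertools import combinations

    recorded = {"mic_112": 0.001000, "mic_114": 0.001450, "mic_116": 0.001300}

    def pairwise_differences(values):
        """Difference of the recorded value for every pair of detection points."""
        return {(i, j): values[j] - values[i]
                for i, j in combinations(sorted(values), 2)}

    print(pairwise_differences(recorded))
    # e.g. the sound reached mic_112 roughly 450 microseconds before mic_114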
  • C. Approximating The Source Position Using Detected Time Values Of The Sound Received From The Source Position [0054]
  • FIG. 3 illustrates a method for approximating a source position of where an event of interest occurs using a time value of when the sound emitted by the event is received at different detection points. A method such as described with FIG. 3 may be implemented using components described with localization apparatus such as described by FIG. 1. Any reference to numerals of FIG. 1 is intended to illustrate exemplary components for practicing an embodiment such as described by FIG. 3. [0055]
  • In [0056] step 310, the positions of individual detection points are identified relative to other detection points in the plurality of detection points. For example, microphones in microphone set 110 may be employed in a fixed arrangement so that their relative positions with respect to one another are known. In one application, the microphones 112, 114 and 116 may be constructed in a housing that keeps the microphones in a fixed position relative to one another.
  • Alternatively, [0057] microphones 112, 114 and 116 are each independently positionable. In such embodiments, the microphones 112, 114 and 116 are calibrated once the microphones are positioned. A calibration process may be performed by first positioning the microphones in an operative relationship with an electronic device. Then a sound is emitted from a known reference point. Information from the sound is used to determine positions of the microphones relative to one another, based on the position of the reference point being known.
  • For example, in one application described in FIG. 8, sound is used to detect a key selected by a user from a displayed image of a set of keys. The microphones may be positionable so as to detect taps corresponding to the user touching a space on which the image of the set of keys is being displayed. Once the microphones are positioned, the user may be required to touch a surface at a point corresponding to a designated calibration key. The position of each key in the set of keys may be known relative to the key designated for calibration. Since the spatial relationship between the set of keys being displayed is known, the selection of the calibration key enables the relative distances between the microphones to be determined. [0058]
  • In another calibration technique, the distance between each microphone in the microphone set [0059] 110 may be measured after the microphones in the array are placed in the desired position.
  • In [0060] step 320, sound emitting from the event at the source position 130 is detected at each of the plurality of detection points. The sound from the event may need to be distinguished from background sounds, or sounds from events that are not of interest. Thus, in FIG. 1, for example, electronic device 120 has to be configured to make a determination as to when sound from the event has arrived. The arrival of the sound from the source position 130 is detected and differentiated from other sounds, such as ambient noise and noise from events that are not of interest. Additional description of determining when sound from an event of interest arrives is described with FIG. 7.
  • [0061] Step 330 provides that the time value for when sound is detected as arriving at each detection point is determined. The time value may correspond to a time stamp of when each of the microphones in microphone set 110 is determined to receive the sound from source position 130. The time value marked at each detection point is dependent on the distance between that detection point and the source position 130.
  • [0062] Step 340 provides that the time values determined at individual detection points are compared with time values determined at other detection points. For example, a time stamp marked when first microphone 112 receives the sound from source position 130 is compared with one or both time stamps marked when each of second microphone 114 and third microphone 116 receive the same sound.
  • In [0063] step 350, the comparisons of the time values are used to approximate the source position. For example, a difference between the time stamp at first microphone 112 and second microphone 114 may be used to determine one dimension of the source position relative to the designated position. A difference between the time stamp at first microphone 112 and third microphone 116 may be used to determine a second dimension of the source position relative to the designated position. A similar difference may be calculated by comparing the second microphone 114 and third microphone 116 in order to determine a third dimension of the source position relative to the designated position.
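  • As a concrete illustration of steps 330 through 350, the short Python sketch below (with invented microphone names, time stamps and an assumed propagation speed) converts per-microphone arrival times into pairwise delays and, through the speed of sound, into differences of travel distance. These distance differences are the quantities that the iso-delay construction of FIG. 4, described next, turns into an approximate source position.

    # Sketch: turning arrival time stamps into travel-distance differences.
    # Microphone names, time stamps and the propagation speed are assumptions;
    # sound travelling through a tabletop would propagate at a different speed
    # than sound travelling through air.
    SPEED_OF_SOUND = 343.0  # metres per second in air at roughly room temperature

    def distance_differences(timestamps, v=SPEED_OF_SOUND):
        """For each pair (i, j), return v * (t_j - t_i): how much farther the
        sound travelled to reach microphone j than to reach microphone i."""
        mics = sorted(timestamps)
        return {(a, b): v * (timestamps[b] - timestamps[a])
                for a in mics for b in mics if a < b}

    stamps = {"mic_112": 0.002000, "mic_114": 0.002600, "mic_116": 0.002350}
    for pair, dd in distance_differences(stamps).items():
        print(pair, round(dd * 100, 1), "cm")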
  • FIG. 4 illustrates a technique for determining the source position of a sound using the detected time values of when the sound is received by individual detection points. Specifically, FIG. 4 illustrates use of an iso-delay map for approximating a source position of sound emitted by an event of interest. The iso-delay map consists of iso-delay curves that have been previously determined for sounds emitted at known source positions using known locations of detection points. Measured time values may be mapped onto the iso-delay map for purposes of finding intersecting iso-delay curves that approximate the source position of the event of interest. [0064]
  • Given two microphones (i.e. [0065] first microphone 112 and second microphone 114) located at detection points a and b, and a source position of a sound at p, the iso-delay curves can be used to determine the magnitude and direction of vector p. If a sound propagates from p at velocity v, it reaches the microphone at b with a time delay dab after reaching the microphone at a. This delay satisfies the equation
  • v d_ab = ∥a−p∥ − ∥b−p∥  (1)
  • where the vertical bars denote the Euclidean norm. In this equation, a, b and v are known, and d_ab is a value measured between a pair of microphones. [0066]
  • How this equation is used depends on whether or not the point p is on a known plane that also contains points a and b. For example, in FIG. 8, all sounds from events of interest may be assumed to be on [0067] surface 836, which may be a table. Thus, in this example, the positions of two microphones may be assumed to be on a known plane—the surface 836. If all points are on a known plane, equation (1) is a quadratic equation in the two unknown variables in p. Equation (1) can be used to determine an iso-delay curve on the plane, on which the sound source must lie. This curve (i.e., 412) is the iso-delay curve for delay d_ab between a pair of microphones a and b.
  • For example, suppose that a second pair of microphones c and d similarly measures a second delay d_cd. One of these two microphones (i.e. second microphone 114 and third microphone 116) could coincide with the microphone at a, or with the one at b. With equation (1), the new pair of microphones yields a new iso-delay curve (e.g. 414). The intersection of the two curves determines the location of the source position. [0068]
  • In FIG. 4, iso-delay curves are shown for two pairs of microphones arranged on the vertices of a diamond. The solid curves [0069] 412 are for one pair of microphones, and the dotted curves 414 are for the other. The two pairs of microphones may be formed using three microphones, as shown with FIG. 1. To illustrate by way of example, a first pair of microphones (e.g. first microphone 112 and second microphone 114) are aligned along an axis Z, and a second pair of microphones (first microphone 112 and third microphone 116) are aligned along an axis Y. The time values measured between each microphone pair are mapped onto iso-delay curves 412, 414. The point at which the iso-delay curves intersect is the approximate source position.
  • The proximity of the source point to one microphone in a pair affects the position of the source point on the iso-delay map along the axis defining that pair of microphones. For example, E in FIG. 4 represents a source point that is centrally positioned between the first pair of microphones on axis Z, but more proximate to one of the second pair of microphones on axis Y than to the other microphone in that pair. Point F represents a source position that is central to both microphone pairs. Point G represents a source position that is skewed towards one of the microphones in both pairs. [0070]
  • Each iso-delay curve in FIG. 4 represents one increment of time differential between the time one microphone in a pair of microphones receives a sound from the source point, as compared to the other microphone in the pair. For example, if the source position corresponds to Point A, the iso-delay curve indicates that the first microphone 112 receives the sound from the event two time increments (e.g. 10 microseconds) before the third microphone 116. If the source position corresponds to Point B, one of the microphones in each pair receives the sound from the source position before the other microphone in that pair. For the pair of microphones defined by axis Z, one microphone in that pair receives the sound from the source position three time increments before the other microphone in that pair. For axis Y, one microphone receives the sound from the source position two time increments before the other microphone in that pair. If the source position corresponds to Point C, then each microphone in both pairs receives the sound from the source position at approximately the same time. [0071]
  • For some arrangements of the microphones, it is possible that two curves for the two microphone pairs intersect at two different points. Which of these two points corresponds to the sound source can be determined by either adding a third pair of microphones, or by using external knowledge. For instance, if the system is used to determine the location of a finger tap on a desktop, the area in which taps are allowed may be known, and microphones can be arranged so that only one intersection for each pair of curves lies within this area. [0072]
  • For three-dimensional applications, additional information is needed from a third pair of microphones. Each of the three pairs of microphones is said to yield a quadratic iso-delay surface (not shown) in space, rather than a curve. The three microphone pairs are necessary to localize the sound source because at least three dimensions are needed to locate the source point. The three dimensions may correspond to three ranges relative to an origin or a designated position, or a combination of direction and ranges. The minimum number of microphones is three, since microphones a, b, c yield three pairs (a, b), (a, c), and (b, c). [0073]
  • Graphical representations consisting of iso-delay curves or surfaces may be used to determine source positions of sounds in a confined space. Such graphical representations may be stored in the electronic device and be applicable for the confined space, so that when separate microphones record information about the sound, data may be matched to processes that use iso-delay curves or quadratic iso-delay surfaces to determine the source position. [0074]
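  • Instead of storing a precomputed iso-delay map, the intersection can also be computed numerically. The Python sketch below is a simplified, hypothetical implementation for the planar case: the microphone positions, the propagation speed, the measured delays and the allowed region are all assumptions made for the example. It returns the grid point that best satisfies equation (1) for two microphone pairs, and restricting the search to the allowed region also resolves the ambiguity, noted above, that two iso-delay curves may intersect at two points.

    # Sketch: approximate the source position from two measured delays by
    # minimizing the residual of equation (1) over a bounded planar region.
    # Microphone layout, delays and the search region are illustrative assumptions.
    import math

    V = 343.0  # assumed propagation speed, metres per second

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def locate(mics, delays, region, step=0.005):
        """mics: name -> (x, y) in metres; delays: (a, b) -> measured d_ab in
        seconds; region: (xmin, xmax, ymin, ymax) where the event may occur."""
        xmin, xmax, ymin, ymax = region
        nx = int(round((xmax - xmin) / step)) + 1
        ny = int(round((ymax - ymin) / step)) + 1
        best, best_err = None, float("inf")
        for iy in range(ny):
            for ix in range(nx):
                p = (xmin + ix * step, ymin + iy * step)
                # Residual of  v * d_ab = ||a - p|| - ||b - p||  for every pair.
                err = sum((V * d - (dist(mics[a], p) - dist(mics[b], p))) ** 2
                          for (a, b), d in delays.items())
                if err < best_err:
                    best, best_err = p, err
        return best

    mics = {"a": (0.0, 0.0), "b": (0.3, 0.0), "c": (0.15, 0.25)}
    true_p = (0.10, 0.12)  # used only to synthesize consistent example delays
    delays = {("a", "b"): (dist(mics["a"], true_p) - dist(mics["b"], true_p)) / V,
              ("a", "c"): (dist(mics["a"], true_p) - dist(mics["c"], true_p)) / V}
    print(locate(mics, delays, region=(0.0, 0.3, 0.0, 0.25)))  # near (0.10, 0.12)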
  • D. Approximating The Source Position Using The Determined Amplitude Of The Sound Caused By The Event [0075]
  • FIG. 5 illustrates a method for approximating a source position of where an event of interest occurs using amplitude information derived from the sound emitting from the event. A method such as described with FIG. 5 may be implemented using localization apparatus (FIG. 1). Any reference to numerals of FIG. 1 is intended to illustrate exemplary components for practicing an embodiment of the invention, as described by FIG. 5. [0076]
  • In [0077] step 510, the relative positions of a plurality of detection points are identified. This step may be performed in a manner described with FIG. 3.
  • [0078] Step 520 provides that the sound from an event occurring at the source point 130 is detected as being received. This step may be performed in a manner such as described with FIG. 3 and FIG. 7.
  • In [0079] step 530, the amplitude of the detected sound at individual detection points is determined. It is possible for this step to be performed as part of step 520. The amplitude of the detected sound is another example of information about sound that is dependent on the distance between the source of the sound and its detection point. The reason for this is described with FIG. 6.
  • In [0080] step 540, the amplitudes of the sound determined at the individual detection points are compared to one another. The amplitudes may be compared in a manner similar to the comparison of time values, as described in FIG. 3.
  • [0081] Step 550 provides that the comparisons made of the amplitudes at the individual detection points are used to approximate the source position of the sound. For example, a difference between the amplitude of the sound measured at a first detection point and at a second detection point may be used to calculate a first range value for locating the source position of the sound from a designated point. A difference between the amplitude of the sound measured between a second pair of detection points may be used to calculate a second range value for locating the source position of the sound from the designated point. A third range value may similarly be calculated for locating the source position relative to the designated point.
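  • A brief sketch of steps 530 and 540 follows, under the assumption that the amplitude recorded at each detection point is taken as the peak absolute sample value inside the detection window; other measures, such as the peak of a matched-filter output, would be handled the same way. The sample windows below are invented; in practice they would come from the digitized microphone signals.

    # Sketch: determine the amplitude of the detected sound at each detection
    # point (step 530) and compare the amplitudes pairwise (step 540).
    # The sample windows are synthetic, illustrative values.
    import math
    from itertools import combinations

    def peak_amplitude(samples):
        return max(abs(s) for s in samples)

    windows = {
        "mic_112": [0.02, 0.31, 0.64, 0.22, -0.40, -0.11],
        "mic_114": [0.01, 0.12, 0.27, 0.09, -0.18, -0.05],
        "mic_116": [0.01, 0.18, 0.39, 0.15, -0.25, -0.07],
    }

    amplitudes = {m: peak_amplitude(w) for m, w in windows.items()}
    for a, b in combinations(sorted(amplitudes), 2):
        ratio = amplitudes[b] / amplitudes[a]
        print(a, b, "ratio", round(ratio, 2), "log difference", round(math.log(ratio), 2))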
  • FIG. 6 illustrates a technique for determining the source position of the sound based on the determined amplitudes of the sound at the individual detection points. In FIG. 6, a geometric locus map is used to approximate the source position of sound emitted by an event of interest. The map contains a series of curves, individually referred to as a locus, which are each determined from differences in the measured amplitudes of the sound detected by a designated pair of microphones. At least two loci can be determined when sound is measured by a set of three microphones. The intersection of loci determined by pairs of microphones, in a set of three or more microphones, corresponds to the approximate source position. The geometric locus map may be used with a method such as described with FIG. 5. The loci for one pair of microphones along an axis Z are represented by [0082] solid lines 612, and the loci for another pair of microphones along an axis Y are represented by dashed lines 614.
  • Given known spatial relationships between the microphones, a locus map such as shown by FIG. 6 may be predetermined. Then, differences in amplitudes measured by pairs of microphones may be mapped onto the locus map. The approximate source position may correspond to where two loci curves located by amplitude differences between two pairs of microphones intersect. [0083]
  • An individual locus may be determined as follows. Given a pair of omni-directional microphones located at points a and b, and a sound source at p with an arbitrary amplitude, the technique may be developed in the following manner. The microphones at a and b will receive sounds with amplitudes of α and β, respectively. If a sound propagates through a three-dimensional medium to reach a microphone, then the amplitude of the sound decays at a rate that is inversely proportional to the distance traveled. The law of power conservation explains this easily: since sound waves propagate spherically in the air, the total power will traverse increasingly large spheres. At a distance r from the sound source, the sphere in question has a surface area 4πr². Hence, power decays inversely with the square of the traveled distance. Since the power is related to the square of the sound amplitude, the amplitude decays inversely with the distance. Hence, the ratio α/β of the amplitudes will be the reciprocal of the ratio ∥a−p∥/∥b−p∥ between the traveled distances. That is, [0084]
  • ∥a−p∥/∥b−p∥ = β/α.  (2)
  • If the sound propagates in a two-dimensional medium, such as the surface of a table or other essentially two-dimensional object, then the sound amplitude decays inversely with the square root of the traveled distance. A very similar explanation to the air-propagation case follows from the power conservation law. In this case, total power will be distributed evenly on circles centered at the source point p. Since the circumference of a circle is equal to 2πr, the power (amplitude) decays inversely with the (square root of the) distance or, conversely, the ratio of distances is the reciprocal of the squared ratio of amplitudes. For example, two microphones can be attached to a flat surface, and the sound emitted by any physical impact on this surface will propagate through the surface to reach the microphones. In this case the relationship will be [0085]
  • ∥a−p∥/∥b−p∥ = (β/α)².  (3)
  • In any case, regardless of whether a two-dimensional or a three-dimensional medium is considered, the following relationship exists between amplitudes and distances: [0086]
  • ∥a−p∥/∥b−p∥ = ƒ(β/α)  (4)
  • where the function ƒ(·) is known. For the two-dimensional case, we have ƒ(x) = x². For the three-dimensional case, we have ƒ(x) = x. [0087]
  • The function ƒ(·) is in both cases a power function. Taking logarithms in equation (4) then yields a linear equation: [0088]
  • δ−ε=k(B−A)  (5)
  • where δ = ln∥a−p∥, ε = ln∥b−p∥, A = lnα, and B = lnβ. The known constant k is equal to 1 in three dimensions, and to 2 in two dimensions. The form (5) of the equation is often preferable from a numerical analysis point of view. If n ≥ 3 microphones measure amplitudes α_1, . . . , α_n at positions a_1, . . . , a_n, then several equations of the form (3) or (4) are available. [0089]
  • Just as for the time-measurement case, each such equation determines one of the geometric loci in FIG. 6 on which the sound source must lie. For the two-dimensional case, each locus is a curve. In the three-dimensional case, each equation determines a surface (not shown). Two pairs of microphones (with the pairs optionally sharing one microphone) determine the location of the sound source in two dimensions. In the three-dimensional case, at least three pairs (any two of which can share at most one microphone) are needed. In either case, additional microphones can be used to provide redundant measurements, and therefore yield better accuracy in the result. [0090]
  • For simplicity, explanation of this technique will be limited to the two-dimensional case, shown in FIG. 6. Amplitude differences determined from a first pair and a second pair of microphones are mapped onto the set of loci shown in FIG. 6. Each amplitude difference locates a locus, and the point where the two identified loci meet coincides with the approximate source position of the sound. The set of loci may be determined beforehand if the locations of the microphones in the pairs are known relative to one another. [0091]
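  • The two-dimensional case can also be solved numerically rather than graphically. The Python sketch below is a hypothetical implementation under stated assumptions: sound propagating through a surface (so k = 2 in equation (5)), three microphones at assumed positions, synthetic amplitudes, and a bounded region for the event. It searches for the point whose pairwise log-distance differences best match the measured log-amplitude differences.

    # Sketch: amplitude-based localization on a surface using the log form of
    # the amplitude/distance relationship, equation (5), with k = 2.
    # Microphone layout, amplitudes and the search region are assumptions.
    import math

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def locate_from_amplitudes(mics, amps, region, k=2.0, step=0.005):
        """Find the grid point p minimizing the residual of
        ln||a - p|| - ln||b - p|| = k * (ln(beta) - ln(alpha)) over all pairs."""
        names = sorted(mics)
        pairs = [(a, b) for i, a in enumerate(names) for b in names[i + 1:]]
        xmin, xmax, ymin, ymax = region
        nx = int(round((xmax - xmin) / step)) + 1
        ny = int(round((ymax - ymin) / step)) + 1
        best, best_err = None, float("inf")
        for iy in range(ny):
            for ix in range(nx):
                p = (xmin + ix * step, ymin + iy * step)
                if any(dist(mics[m], p) < 1e-6 for m in names):
                    continue  # skip points that coincide with a microphone
                err = sum((math.log(dist(mics[a], p)) - math.log(dist(mics[b], p))
                           - k * (math.log(amps[b]) - math.log(amps[a]))) ** 2
                          for a, b in pairs)
                if err < best_err:
                    best, best_err = p, err
        return best

    mics = {"a": (0.0, 0.0), "b": (0.3, 0.0), "c": (0.15, 0.25)}
    true_p = (0.20, 0.10)  # used only to synthesize consistent amplitudes
    amps = {m: 1.0 / math.sqrt(dist(pos, true_p)) for m, pos in mics.items()}
    print(locate_from_amplitudes(mics, amps, region=(0.02, 0.28, 0.02, 0.23)))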
  • E. Detecting The Arrival Of Sound From An Event Of Interest [0092]
  • FIG. 7 illustrates characteristics of a waveform that can be used to determine when a sound caused by an event of interest is determined as being received by microphones in a microphone set (i.e., [0093] 110 in FIG. 1). Without identifiable characteristics in the waveform that can be detected, microphones in microphone set 110 may not be able to distinguish sounds caused by events of interest from sounds caused by other events, or by noise.
  • A microphone in an arrangement such as shown by FIG. 1 may detect thousands of sound measurements per second. Furthermore, the sound emitting from an event of interest may last for more than a second, so thousands of acoustic measurements may be made for the duration that sound from an event of interest is received by one microphone. [0094]
  • Furthermore, some events of interest may cause secondary sounds which provide false information. For example, a tap sound on a table may generate echoes when the sound bounces back from the ends of the table, so that the sound detected by the microphones includes both the tap and its echoes. Embodiments of the invention differentiate sounds caused by events of interest from other sounds, including secondary sounds that may be caused by the same event, but carry false information. [0095]
  • In order to accurately determine which sounds provide location information from events of interests, waveforms of the sounds received by microphones at each detection point may be detected and analyzed for characteristics that indicate the nature of the sound. In one embodiment, the characteristic of a [0096] waveform 700 that is detected and analyzed is a peak 712. This waveform characteristic is well-suited for when the sound emitted by the event is sharp.
  • For example, the [0097] peak 712 may be detected if the value of the peak is greater than or within a range of peak values that correspond to a known sound emitted from the event of interest. In a more specific example, finger taps on a table may coincide with events of interest. In FIG. 1, the electronic device 120 analyzing the sounds received by the microphones in microphone set 110 may seek a certain limit or range of peak values known to coincide with finger taps on a given surface. When the peak 712 is detected during a designated interval, the electronic device 120 determines that the sound received by the microphone is from an event of interest.
  • According to one embodiment, a time value is associated with the [0098] peak 712. For example, a time stamp 715 of when the peak 712 occurs may be used to subsequently approximate the source position of the event, as described in FIG. 3. According to another embodiment, a magnitude 718 of the peak 712 may be used to subsequently determine the approximate source position of the event, as described in FIG. 5.
  • In another embodiment, the waveform characteristic used to detect when microphones receive the sound from the event of interest is a waveform shape. This technique may be referred to as matched filtering. In one application, the shape of the waveform in the interval defined by [0099] 722 may match the shape of a known waveform corresponding to the event of interest. Once the waveform 700 is detected as being received, a beginning point or amplitude of the waveform may be used to record the time stamp for when the waveform is received by one of the microphones. The magnitude 718 of the peak detected for the duration 722 may be used to subsequently determine the approximate source position of the event, as described in FIG. 5. In another embodiment, the time stamp assigned to the waveform as detected by its shape may be used to approximate the source position, as described with FIG. 3.
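  • The two detection strategies described above can be summarized in a few lines of signal processing. The Python sketch below is a simplified, hypothetical illustration: the sample values, the peak-value range and the template waveform are all invented, and a real implementation would apply these tests to each digitized microphone stream within a designated interval.

    # Sketch: two ways to decide that the sound of an event of interest arrived.
    # Sample values, the peak-value range and the template are assumptions.
    def detect_peak(samples, low, high):
        """Index and value of the first sample whose magnitude lies within the
        range of peak values known for the event of interest, or None."""
        for i, s in enumerate(samples):
            if low <= abs(s) <= high:
                return i, s
        return None

    def matched_filter(samples, template):
        """Index at which a sliding correlation against a known waveform shape
        is largest (a crude matched filter)."""
        best_i, best_score = 0, float("-inf")
        for i in range(len(samples) - len(template) + 1):
            score = sum(samples[i + j] * template[j] for j in range(len(template)))
            if score > best_score:
                best_i, best_score = i, score
        return best_i, best_score

    signal = [0.01, -0.02, 0.01, 0.05, 0.62, 0.40, -0.35, -0.20, 0.06, 0.01]
    template = [0.1, 1.0, 0.6, -0.5, -0.3]  # assumed shape of a known "tap"
    print(detect_peak(signal, low=0.5, high=1.0))  # first sample in the peak range
    print(matched_filter(signal, template))        # index where the shape fits best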
  • F. Combining Results From Two Techniques To Approximate Source Position [0100]
  • According to another embodiment, the source position of a sound caused by an event of interest may be determined using a combination of techniques described herein. The localization methods described in FIG. 3 and FIG. 5 use detected time values and amplitudes separately. However, since each of the localization methods is based on an independent source of information, each localization method has essentially independent noise statistics. In other words, the source position approximated by each localization method is affected in different ways by an inhomogeneous medium, by electronic noise in the microphone amplifiers, and by other deviations from an ideal situation. It therefore may be advantageous to use both methods and combine their results, especially when a higher accuracy is required in the localization results than what could be obtained with either method used alone. [0101]
  • One way to combine results is to average them. If the localization method based on detecting time stamps yields a position estimate p_t for the location of the source position, and the amplitude-based method yields an estimate p_a, then the combination could be computed using this relationship: [0102]
  • p = k_t p_t + k_a p_a  (6)
  • In this equation, the quantities k_t and k_a are based on estimates of the relative accuracy of the two methods, and could be functions of position: [0103]
  • k_t = k_t(p_t) and k_a = k_a(p_a)  (7)
  • For instance, these two numbers or functions could be proportional to the reciprocals of the standard deviations of localization errors, as determined by a calibration procedure. [0104]
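  • A minimal Python sketch of the combination in equations (6) and (7) follows; the two input estimates and the weights are invented for illustration, with the weights taken, as suggested above, proportional to the reciprocals of each method's calibrated error standard deviation and normalized so that the result is a weighted average.

    # Sketch: combining the time-based estimate p_t and the amplitude-based
    # estimate p_a as in equation (6). Estimates and weights are illustrative.
    def combine(p_t, p_a, k_t, k_a):
        """p = k_t * p_t + k_a * p_a, with the weights normalized to sum to one."""
        total = k_t + k_a
        k_t, k_a = k_t / total, k_a / total
        return tuple(k_t * t + k_a * a for t, a in zip(p_t, p_a))

    p_time = (0.102, 0.118)  # metres, from the time-delay method
    p_amp = (0.096, 0.124)   # metres, from the amplitude method
    # Weights proportional to the reciprocal of each method's error std. deviation.
    print(combine(p_time, p_amp, k_t=1 / 0.004, k_a=1 / 0.007))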
  • G. User-Interface Device For Using Sound To Detect User-Input [0105]
  • Embodiments of the invention may be used to enable a user to select an input for operating an electronic device. The user may specify the input by selecting a position for where a sound-emitting event is to occur. In one application, the user may make a key entry by tapping a region defined for a key. The region may be defined using, for example, a displayed image of a keyboard. [0106]
  • FIG. 8 illustrates a system using an imaged keyboard, under an embodiment of the invention. The system includes a [0107] projector 810, an electronic device 820 and a set of microphones 830. The projector 810 is equipped to display an image 850 of a set of keys onto a surface 836. The microphone set 830 at least partially surrounds the area of the surface 836 where the image 850 is being displayed. In one application, image 850 may correspond to a QWERTY keyboard.
  • A user can select one of the [0108] regions 852 differentiated in the image 850 as input for electronic device 820. The user may make the selection by touching surface 836 in one of the regions 852 corresponding to the key that the user wishes to select. The act of touching the surface 836 causes a sound to be emitted (i.e. a tap) from the region of the surface 836 where that key is displayed. The microphones 830 detect that sound.
  • The [0109] electronic device 820 is configured to recognize when that sound is detected. For example, the electronic device 820 may be configured to detect one of the characteristics of the sound received from the user's action, as described with FIG. 7. Upon the sound from the user selection being detected, electronic device 820 processes that sound as it was received by each of the microphones 830 for information that is dependent on the distance between the contacted region and the microphone detecting the sound.
  • In one embodiment such as described with FIG. 3, [0110] electronic device 820 will detect a time stamp for when the sound of the tap was received by each microphone 830. The time stamp values may be used in a manner such as described by FIG. 3 to detect the contacted region 852 corresponding to the selected key.
  • In another embodiment, [0111] electronic device 820 will detect an amplitude of the sound received by each microphone 830. Similarly, the amplitudes may be used in a manner such as described by FIG. 5 to detect the contacted key 852 of image 850 being selected by the user. Still further, a combination of methods such as described with FIG. 3 and FIG. 5 may be used to detect which key the user selected. For example, the source position of the user's selection may be approximated using two or more methods, and then averaged.
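  • After the source position of the tap has been approximated by either technique, or by their combination, the remaining step is to translate the surface coordinates into the selected key. The Python sketch below assumes an invented layout for a few rows of a projected QWERTY image; the actual key geometry would be whatever the projector 810 displays on surface 836.

    # Sketch: map an approximated tap position on surface 836 to a key of the
    # displayed image 850. Key pitch, row stagger and origin are assumptions.
    KEY_W, KEY_H = 0.019, 0.019          # displayed key pitch in metres
    ROWS = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"]
    ROW_OFFSETS = [0.0, 0.005, 0.014]    # horizontal stagger of each row

    def key_at(x, y):
        """Return the key whose displayed region 852 contains (x, y), with the
        origin at the top-left corner of the projected image."""
        row = int(y // KEY_H)
        if not 0 <= row < len(ROWS):
            return None
        col = int((x - ROW_OFFSETS[row]) // KEY_W)
        if not 0 <= col < len(ROWS[row]):
            return None
        return ROWS[row][col]

    # A tap localized about 6 cm from the left edge, in the middle row.
    print(key_at(0.060, 0.028))  # -> 'D' under this assumed layout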
  • H. Hardware Diagram [0112]
  • FIG. 9 is a block diagram that illustrates a [0113] computer system 900 upon which an embodiment of the invention may be implemented. The computer system 900 may be any electronic device having a processor that is configured or programmed to implement steps for performing embodiments of the invention, as described herein. Computer system 900 includes a bus 902 or other communication mechanism for communicating information, and a processor 904 coupled with bus 902 for processing information. Computer system 900 also includes a main memory 906, such as a random access memory (RAM) or other dynamic storage device, coupled to bus 902 for storing information and instructions to be executed by processor 904. Main memory 906 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 904. Computer system 900 further includes a read only memory (ROM) 908 or other static storage device coupled to bus 902 for storing static information and instructions for processor 904. A storage device 910, such as a magnetic disk or optical disk, is provided and coupled to bus 902 for storing information and instructions.
  • [0114] Computer system 900 may be coupled via bus 902 to a display 912, such as a cathode ray tube (CRT), for displaying information to a computer user. An input device 914, including alphanumeric and other keys, is coupled to bus 902 for communicating information and command selections to processor 904. Another type of user input device is cursor control 916, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 904 and for controlling cursor movement on display 912. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
  • The invention is related to the use of [0115] computer system 900 for approximating a source position of a sound-causing event for determining an input used in operating a computer system. According to one embodiment of the invention, a method for approximating a source position of a sound-causing event for determining an input used in operating a computer system is provided by computer system 900 in response to processor 904 executing one or more sequences of one or more instructions contained in main memory 906. Such instructions may be read into main memory 906 from another computer-readable medium, such as storage device 910. Execution of the sequences of instructions contained in main memory 906 causes processor 904 to perform the process steps described herein. One or more processors in a multi-processing arrangement may also be employed to execute the sequences of instructions contained in main memory 906. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions to implement the invention. Thus, embodiments of the invention are not limited to any specific combination of hardware circuitry and software.
  • The term “computer-readable medium” as used herein refers to any medium that participates in providing instructions to [0116] processor 904 for execution. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 910. Volatile media includes dynamic memory, such as main memory 906. Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 902. Transmission media can also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
  • Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read. [0117]
  • Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to [0118] processor 904 for execution. For example, the instructions may initially be carried on a magnetic disk of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 900 can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal. An infrared detector coupled to bus 902 can receive the data carried in the infrared signal and place the data on bus 902. Bus 902 carries the data to main memory 906, from which processor 904 retrieves and executes the instructions. The instructions received by main memory 906 may optionally be stored on storage device 910 either before or after execution by processor 904.
  • [0119] Computer system 900 also includes a communication interface 918 coupled to bus 902. Communication interface 918 provides a two-way data communication coupling to a network link 920 that is connected to a local network 922. For example, communication interface 918 may be an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 918 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such implementation, communication interface 918 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
  • Network link [0120] 920 typically provides data communication through one or more networks to other data devices. For example, network link 920 may provide a connection through local network 922 to a host computer 924 or to data equipment operated by an Internet Service Provider (ISP) 926. ISP 926 in turn provides data communication services through the worldwide packet data communication network now commonly referred to as the “Internet” 928. Local network 922 and Internet 928 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 920 and through communication interface 918, which carry the digital data to and from computer system 900, are exemplary forms of carrier waves transporting the information.
  • [0121] Computer system 900 can send messages and receive data, including program code, through the network(s), network link 920 and communication interface 918. In the Internet example, a server 930 might transmit a requested code for an application program through Internet 928, ISP 926, local network 922 and communication interface 918. In accordance with the invention, one such downloaded application provides for approximating a source position of a sound-causing event for determining an input used in operating a computer system as described herein.
  • The received code may be executed by [0122] processor 904 as it is received, and/or stored in storage device 910, or other non-volatile storage for later execution. In this manner, computer system 900 may obtain application code in the form of a carrier wave.
  • I. Conclusion [0123]
  • In the foregoing specification, the invention has been described with reference to specific embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. [0124]

Claims (54)

What is claimed is:
1. A method for approximating a source position of a sound-causing event for determining an input used in operating an electronic device, the method comprising:
detecting a sound caused by the event at a plurality of detection points;
recording information about the sound at individual detection points in the plurality of detection points, the information being dependent on a distance between the source position and each of the individual detection points;
approximating the source position based on the recorded information and on a relative position of individual detection points in the plurality of detection points to other detection points in the plurality of detection points;
determining the input from the source position; and
causing an operation to be automatically performed by the electronic device based on the input.
2. The method of claim 1, wherein detecting a sound includes detecting the sound emitted by a user contacting a region selected from a plurality of regions, each of the plurality of regions being associated with a corresponding input, and wherein determining the input includes determining the corresponding input for the contacted region.
3. The method of claim 1, wherein detecting a sound includes detecting the sound emitted by a user contacting a region selected from a plurality of regions, each of the plurality of regions being associated by the electronic device as a corresponding key in a set of keys, and wherein determining the input includes determining the corresponding key for the contacted region.
4. The method of claim 1, wherein detecting a sound includes detecting the sound emitted by a voice at the source position, and wherein determining the input includes determining at least a first range value for focusing an image capturing device on the source position.
5. The method of claim 4, wherein determining at least a first range value includes determining range values in at least two dimensions.
6. The method of claim 1, wherein recording information about the sound at individual detection points includes:
identifying an amplitude of the sound at each of the plurality of detection points; and
wherein approximating the source position includes:
comparing the amplitude identified at each of the plurality of detection points to the amplitude of the sound identified at one or more other detection points.
7. The method of claim 1, wherein recording information about the sound at individual detection points includes:
determining a time value corresponding to a determination of an arrival of the sound at each of the plurality of detection points; and
wherein approximating the source position includes:
comparing the time value determined at each of the plurality of detection points to the time value determined at one or more other detection points in the plurality of detection points.
8. The method of claim 6, wherein comparing the amplitude identified at each of the plurality of detection points includes:
determining a first comparison value representing a difference between the amplitude at each of a first pair of detection points in the plurality of detection points; and
using the first comparison value to determine a first range value of the source position from a designated position along an axis defining a first dimension between the source position and the designated position.
9. The method of claim 8, wherein comparing the amplitude identified at each of the plurality of detection points includes:
determining a second comparison value representing a difference between the amplitude at each of a second pair of detection points in the plurality of detection points; and
using the second comparison value to determine a second range value of the source position from the designated position along an axis defining a second dimension between the source position and the designated position.
10. The method of claim 9, wherein comparing the amplitude identified at each of the plurality of detection points includes:
determining a third comparison value representing a difference between the amplitude of each of a third pair of detection points in the plurality of detection points; and
using the third comparison value to approximate a third range value of the source position from the designated position along an axis defining a third dimension between the source position and the designated position.
11. The method of claim 10, wherein comparing the amplitude identified at each of the plurality of detection points includes determining the amplitude of the sound detected at each detection point in a set of three detection points.
12. The method of claim 1, wherein recording information about the sound at individual detection points in the plurality of detection points includes determining a first time value corresponding to a determination of an arrival of the sound at a first detection point in the plurality of detection points, determining a second time value corresponding to a determination of an arrival of the sound at a second detection point in the plurality of detection points, and determining a third time value corresponding to a determination of an arrival of the sound at a third detection point in the plurality of detection points.
13. The method of claim 12, wherein approximating the source position includes determining a first comparison value corresponding to a difference of the first time value and the second time value, and a second comparison value corresponding to a difference of the first time value and the third time value.
14. The method of claim 13, wherein approximating the source position includes using the first comparison value to determine a first range value of the source position from a designated position along an axis defining a first dimension between the source position and the designated position, and using the second comparison value to determine a second range value of the source position from the designated position along an axis defining a second dimension between the source position and the designated position.
15. The method of claim 14, wherein approximating the source position includes determining a third comparison value corresponding to a difference of the first time value and the third time value, and using the third comparison value to determine a third range value of the source position from the designated position along an axis defining a third dimension between the source position and the designated position.
16. The method of claim 1, wherein detecting a sound caused by the event includes marking an approximate peak amplitude of the sound at each detection point in the plurality of detection points as the arrival of the sound at that detection point.
17. The method of claim 1, wherein detecting a sound caused by the event includes detecting an approximate waveform shape of the sound at each detection point in the plurality of detection points, and comparing the approximate waveform shape to a known waveform shape to determine that the sound from the event arrived at that detection point.
18. The method of claim 1, wherein recording information about the sound at individual detection points in the plurality of detection points includes:
identifying an amplitude of the sound at each of the plurality of detection points;
determining a time value corresponding to detection of the sound at each of the plurality of detection points; and
wherein approximating the source position includes:
comparing the amplitude identified at each of the plurality of detection points to the amplitude of the sound identified at one or more other detection points to determine at least a first amplitude-determined range value and a second amplitude-determined range value; and
comparing the time value determined at each of the plurality of detection points to the time value determined at one or more other detection points in the plurality of detection points to determine at least a first time-determined range value and a second time-determined range value.
19. The method of claim 18, wherein approximating the source position includes determining a first range value by determining an average of the first amplitude-determined range value and the first time-determined range value, the first range value being along an axis defining a first dimension between the source position and the designated position.
20. The method of claim 19, wherein approximating the source position includes weighting the average of the first amplitude-determined range value and the first time-determined range value.
21. The method of claim 19, wherein approximating the source position includes determining a second range value by determining an average of the amplitude-determined second range value with the time-determined second range value, the second range value being along an axis defining a second dimension between the source position and the designated position.
22. The method of claim 21, wherein approximating the source position includes weighting the average of the amplitude-determined range value and the time-determined range value.
23. The method of claim 21, wherein approximating the source position includes:
comparing the amplitude identified at each of the plurality of detection points to the amplitude of the sound identified at one or more other detection points to determine a third amplitude-determined range value; and
comparing the time value determined at each of the plurality of detection points to the time value determined at one or more other detection points in the plurality of detection points to determine a third time-determined range value; and
determining an average of the third amplitude-determined range value and the third time-determined range value.
24. The method of claim 23, further comprising weighting the average of the third amplitude-determined range value and the third time-determined range value.
25. An apparatus for approximating a source position of a sound-causing event for determining an input used in operating an electronic device, the apparatus comprising:
a plurality of sound-detection devices arranged to receive a sound caused by the event; and
a processor operatively coupled to the plurality of sound-detection devices, the processor being configured to:
detect that the sound-detection devices receive the sound;
record information about the sound detected at individual sound-detection devices, the information being dependent on a distance between the source position and the individual sound-detection devices;
approximate the source position based on the recorded information and on a relative position of individual sound-detection devices in the plurality of sound-detection devices to other sound-detection devices in the plurality of sound-detection devices;
determine the input from the source position; and
execute an operation based on the input.
26. The apparatus of claim 25, wherein the processor is configured to detect the sound emitted by a user contacting a region selected from a plurality of regions, each of the plurality of regions being associated with a corresponding input, and to determine the corresponding input for the contacted region.
27. The apparatus of claim 26, wherein the processor is configured to detect the sound emitted by a user contacting a region selected from a plurality of regions, each of the plurality of regions being associated by the electronic device as a corresponding key in a set of keys, and to determine the corresponding key for the contacted region.
28. The apparatus of claim 27, wherein the processor is configured to detect the sound emitted by a voice at the source position, and to determine at least a first range value for focusing an image capturing device on the source position.
29. The apparatus of claim 25, wherein the processor is configured to detect that the sound-detection devices receive the sound caused by the event by detecting an amplitude of a waveform corresponding to the sound.
30. The apparatus of claim 25, wherein the processor is configured to detect that the sound-detection devices receive the sound caused by the event by matching a waveform corresponding to the sound to a waveform of the sound-causing event.
31. The apparatus of claim 25, wherein the sound-detection devices are arranged non-linearly relative to a reference plane.
32. The apparatus of claim 31, wherein the processor is configured to approximate the source position based on a relative position of individual sound-detection devices by:
determining a time value corresponding to each of the sound-detection devices being determined to receive the sound from the source position;
comparing the time value determined by each of the plurality of sound-detection devices to the time values determined by one or more other sound-detection devices in the plurality of sound-detection devices.
33. The apparatus of claim 25, wherein the processor is configured to approximate the source position based on a relative position of individual sound-detection devices by:
identifying an amplitude of the sound at each of the plurality of sound-detection devices;
comparing the amplitude identified at each of the plurality of sound-detection devices to the amplitude of the sound identified at one or more other sound-detection devices.
34. A method for detecting a user-input, the method comprising:
displaying an image of a set of keys on a surface;
detecting a sound corresponding to a user's selection of one of the keys in the set of keys;
determining a source position of the sound; and
identifying a key in the set of keys using the source position of the sound.
35. The method of claim 34, wherein identifying a key in the set of keys includes identifying an image of the key that contains the source position of the sound.
36. The method of claim 34, wherein detecting a sound corresponding to a user's selection includes detecting a contact sound caused by the user at a region of the surface on which the key selected by the user is displayed.
37. The method of claim 34, wherein displaying an image of a set of keys includes displaying a QWERTY keyboard.
38. An apparatus for detecting a user-input, the apparatus comprising:
a projector to display an image corresponding to a set of keys;
a plurality of sound-detection devices arranged to receive a sound corresponding to a user's selection of one of the keys in the set of keys; and
a processor configured to determine the selection of one of the keys by determining an approximate source position of the sound.
39. The apparatus of claim 38, wherein the processor is configured to determine the approximate source position of the sound by identifying an image of the key that occupies the source position of the sound.
40. The apparatus of claim 38, wherein the processor is configured to detect a contact sound received by the sound-detection devices, the contact sound being caused by the user at a region of the surface on which the key selected by the user is displayed.
41. A computer-readable medium for approximating a source position of a sound-causing event for determining an input used to operate an electronic device, the computer-readable medium carrying instructions for performing the steps of:
detecting a sound caused by the event at a plurality of detection points;
recording information about the sound at individual detection points in the plurality of detection points, the information being dependent on a distance between the source position and each of the individual detection points;
approximating the source position based on the recorded information and on a relative position of individual detection points in the plurality of detection points to other detection points in the plurality of detection points;
determining the input from the source position; and
causing an operation to be automatically performed by the electronic device based on the input.
42. The computer-readable medium of claim 41, wherein instructions for detecting a sound include instructions for detecting the sound emitted by a user contacting a region selected from a plurality of regions, each of the plurality of regions being associated with a corresponding input, and wherein instructions for determining the input include instructions for determining the corresponding input for the contacted region.
43. The computer-readable medium of claim 41, wherein instructions for detecting a sound include instructions for detecting the sound emitted by a user contacting a region selected from a plurality of regions, each of the plurality of regions being associated by the electronic device as a corresponding key in a set of keys, and wherein instructions for determining the input include instructions for determining the corresponding key for the contacted region.
44. The computer-readable medium of claim 41, wherein instructions for detecting a sound include instructions for detecting the sound emitted by a voice at the source position, and wherein instructions for determining the input include instructions for determining at least a first range value for focusing an image capturing device on the source position.
45. The computer-readable medium of claim 44, wherein instructions for determining at least a first range value include instructions for determining range values in at least two dimensions.
46. The computer-readable medium of claim 41, wherein instructions for recording information about the sound at individual detection points include instructions for:
identifying an amplitude of the sound at each of the plurality of detection points; and
wherein instructions for approximating the source position include instructions for:
comparing the amplitude identified at each of the plurality of detection points to the amplitude of the sound identified at one or more other detection points.
47. The computer-readable medium of claim 41, wherein instructions for recording information about the sound at individual detection points include instructions for:
determining a time value corresponding to a determination of an arrival of the sound at each of the plurality of detection points; and
wherein instructions for approximating the source position include instructions for:
comparing the time value determined at each of the plurality of detection points to the time value determined at one or more other detection points in the plurality of detection points.
48. The computer-readable medium of claim 46, wherein instructions for comparing the amplitude identified at each of the plurality of detection points include instructions for:
determining a first comparison value representing a difference between the amplitude at each of a first pair of detection points in the plurality of detection points; and
using the first comparison value to determine a first range value of the source position from a designated position along an axis defining a first dimension between the source position and the designated position.
49. The computer-readable medium of claim 48, wherein instructions for comparing the amplitude identified at each of the plurality of detection points include instructions for:
determining a second comparison value representing a difference between the amplitude at each of a second pair of detection points in the plurality of detection points; and
using the second comparison value to determine a second range value of the source position from the designated position along an axis defining a second dimension between the source position and the designated position.
50. The computer-readable medium of claim 49, wherein instructions for comparing the amplitude identified at each of the plurality of detection points include instructions for:
determining a third comparison value representing a difference between the amplitude of each of a third pair of detection points in the plurality of detection points; and
using the third comparison value to approximate a third range value of the source position from the designated position along an axis defining a third dimension between the source position and the designated position.
51. The computer-readable medium of claim 41, wherein instructions for recording information about the sound at individual detection points in the plurality of detection points include instructions for:
identifying an amplitude of the sound at each of the plurality of detection points;
determining a time value corresponding to detection of the sound at each of the plurality of detection points; and
wherein instructions for approximating the source position include instructions for:
comparing the amplitude identified at each of the plurality of detection points to the amplitude of the sound identified at one or more other detection points to determine at least a first amplitude-determined range value and a second amplitude-determined range value; and
comparing the time value determined at each of the plurality of detection points to the time value determined at one or more other detection points in the plurality of detection points to determine at least a first time-determined range value and a second time-determined range value.
52. The computer-readable medium of claim 51, wherein instructions for approximating the source position include instructions for determining a first range value by determining an average of the first amplitude-determined range value and the first time-determined range value, the first range value being along an axis defining a first dimension between the source position and the designated position.
53. The computer-readable medium of claim 52, wherein instructions for approximating the source position include instructions for weighting the average of the first amplitude-determined range value and the first time-determined range value.
54. The computer-readable medium of claim 53, wherein instructions for approximating the source position include instructions for determining a second range value by determining an average of the amplitude-determined second range value with the time-determined second range value, the second range value being along an axis defining a second dimension between the source position and the designated position.
US10/115,357 1999-11-04 2002-04-02 Method and apparatus for approximating a source position of a sound-causing event for determining an input used in operating an electronic device Expired - Fee Related US6690618B2 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
US10/115,357 US6690618B2 (en) 2001-04-03 2002-04-02 Method and apparatus for approximating a source position of a sound-causing event for determining an input used in operating an electronic device
PCT/US2002/010661 WO2002082249A2 (en) 2001-04-03 2002-04-03 Method and apparatus for approximating a source position of a sound-causing event
US10/313,939 US20030132921A1 (en) 1999-11-04 2002-12-05 Portable sensory input device
AU2002359625A AU2002359625A1 (en) 2001-12-07 2002-12-06 Portable sensory input device
AU2003213068A AU2003213068A1 (en) 2002-02-15 2003-02-14 Multiple input modes in overlapping physical space
PCT/US2003/004530 WO2003071411A1 (en) 2002-02-15 2003-02-14 Multiple input modes in overlapping physical space

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US28131401P 2001-04-03 2001-04-03
US10/115,357 US6690618B2 (en) 2001-04-03 2002-04-02 Method and apparatus for approximating a source position of a sound-causing event for determining an input used in operating an electronic device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US10/313,939 Continuation-In-Part US20030132921A1 (en) 1999-11-04 2002-12-05 Portable sensory input device

Publications (2)

Publication Number Publication Date
US20020167862A1 true US20020167862A1 (en) 2002-11-14
US6690618B2 US6690618B2 (en) 2004-02-10

Family

ID=26813108

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/115,357 Expired - Fee Related US6690618B2 (en) 1999-11-04 2002-04-02 Method and apparatus for approximating a source position of a sound-causing event for determining an input used in operating an electronic device

Country Status (2)

Country Link
US (1) US6690618B2 (en)
WO (1) WO2002082249A2 (en)

Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040128127A1 (en) * 2002-12-13 2004-07-01 Thomas Kemp Method for processing speech using absolute loudness
US20050088915A1 (en) * 2002-12-24 2005-04-28 Lapin Brett D. Gun shot digital imaging system
WO2006070044A1 (en) * 2004-12-29 2006-07-06 Nokia Corporation A method and a device for localizing a sound source and performing a related action
US20070078018A1 (en) * 2005-09-30 2007-04-05 Norman Kellogg Golf range with automated ranging system
EP1806952A2 (en) * 2006-01-06 2007-07-11 Agilent Technologies, Inc. Acoustic location and acoustic signal enhancement
WO2007144014A1 (en) * 2006-06-15 2007-12-21 Nokia Corporation Mobile device with virtual keypad
US20080048988A1 (en) * 2006-08-24 2008-02-28 Yingyong Qi Mobile device with acoustically-driven text input and method thereof
EP1982253A2 (en) * 2006-02-10 2008-10-22 AWQ Consulting Touch detection
US20090128483A1 (en) * 2004-03-02 2009-05-21 Microsoft Corporation Advanced navigation techniques for portable devices
US20100110834A1 (en) * 2008-10-30 2010-05-06 Kim Kyu-Hong Apparatus and method of detecting target sound
US20100259620A1 (en) * 2007-10-22 2010-10-14 Bae Systems Plc Cctv incident location system
US20110157315A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Interpolation of three-dimensional video content
US20110159929A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Multiple remote controllers that each simultaneously controls a different visual presentation of a 2d/3d display
US20110164188A1 (en) * 2009-12-31 2011-07-07 Broadcom Corporation Remote control with integrated position, viewer identification and optical and audio test
US20110169919A1 (en) * 2009-12-31 2011-07-14 Broadcom Corporation Frame formatting supporting mixed two and three dimensional video data communication
US20120067201A1 (en) * 2010-09-20 2012-03-22 Raytheon Bbn Technologies Corp. Systems and methods for an indicator for a weapon sight
WO2012061149A1 (en) * 2010-10-25 2012-05-10 Qualcomm Incorporated Three-dimensional sound capturing and reproducing with multi-microphones
US8218902B1 (en) * 2011-12-12 2012-07-10 Google Inc. Portable electronic device position sensing circuit
WO2012058465A3 (en) * 2010-10-29 2012-08-23 Qualcomm Incorporated Transitioning multiple microphones from a first mode to a second mode
WO2012104767A3 (en) * 2011-02-04 2013-01-03 Technogym S.P.A. Apparatus and method for locating the point of impact of a body on a surface
US20130222230A1 (en) * 2012-02-29 2013-08-29 Pantech Co., Ltd. Mobile device and method for recognizing external input
WO2013151789A1 (en) * 2012-04-02 2013-10-10 Qualcomm Incorporated Systems, methods, apparatus, and computer-readable media for gestural manipulation of a sound field
WO2014024009A1 (en) * 2012-08-10 2014-02-13 Nokia Corporation Spatial audio user interface apparatus
WO2014057188A1 (en) * 2012-10-11 2014-04-17 Stoltz Vincent Method and module for controlling equipment by sound pulse
US20140136096A1 (en) * 2011-04-15 2014-05-15 Meijo University Approaching vehicle detecting system and approaching vehicle detecting method
US8855341B2 (en) 2010-10-25 2014-10-07 Qualcomm Incorporated Systems, methods, apparatus, and computer-readable media for head tracking based on recorded sound signals
US9031256B2 (en) 2010-10-25 2015-05-12 Qualcomm Incorporated Systems, methods, apparatus, and computer-readable media for orientation-sensitive recording control
US20150222948A1 (en) * 2012-09-29 2015-08-06 Shenzhen Prtek Co. Ltd. Multimedia Device Voice Control System and Method, and Computer Storage Medium
JP2015152316A (en) * 2014-02-10 2015-08-24 株式会社小野測器 Sound visualizing apparatus
EP2809085A4 (en) * 2012-07-31 2015-12-09 Japan Science & Tech Agency Device for estimating placement of physical objects
US20160029141A1 (en) * 2013-03-19 2016-01-28 Koninklijke Philips N.V. Method and apparatus for determining a position of a microphone
US9307335B2 (en) 2012-07-31 2016-04-05 Japan Science And Technology Agency Device for estimating placement of physical objects
US20160125885A1 (en) * 2006-02-01 2016-05-05 Innovation Specialists, Llc Sensory Enhancement Systems and Methods in Personal Electronic Devices
US20160232884A1 (en) * 2015-02-10 2016-08-11 Navico Holding As Transducer Array Having a Transceiver
US20170033752A1 (en) * 2010-10-21 2017-02-02 Nokia Technologies Oy Recording Level Adjustment Using A Distance To A Sound Source
US20170115827A1 (en) * 2013-03-15 2017-04-27 Elo Touch Solutions, Inc. Acoustic Touch Apparatus And Method Using Touch Sensitive Lamb Waves
EP3200050A4 (en) * 2014-09-24 2018-05-16 Boe Technology Group Co. Ltd. Touch screen and touch point positioning method
US10024957B2 (en) 2015-09-17 2018-07-17 Navico Holding As Adaptive beamformer for sonar imaging
US20180306890A1 (en) * 2015-10-30 2018-10-25 Hornet Industries, Llc System and method to locate and identify sound sources in a noisy environment
US10114119B2 (en) 2015-05-20 2018-10-30 Navico Holding As Sonar systems and methods using interferometry and/or beamforming for 3D imaging
US20190199850A1 (en) * 2015-07-14 2019-06-27 Driving Management Systems, Inc. Detecting the location of a phone using rf wireless and ultrasonic signals
US20200176015A1 (en) * 2017-02-21 2020-06-04 Onfuture Ltd. Sound source detecting method and detecting device
CN111857642A (en) * 2020-07-20 2020-10-30 歌尔科技有限公司 Control method of electronic device, and readable storage medium
US11143758B2 (en) 2017-10-13 2021-10-12 Navico Holding As Sonar transducer performance optimization
US11392290B2 (en) * 2020-06-26 2022-07-19 Intel Corporation Touch control surfaces for electronic user devices and related methods
WO2023099355A1 (en) * 2021-12-02 2023-06-08 Universite De Lille Coincidence detector for locating a source
WO2024137044A1 (en) * 2022-12-23 2024-06-27 Qualcomm Incorporated User equipment inputs based on sounds and/or vibrations

Families Citing this family (105)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7831358B2 (en) * 1992-05-05 2010-11-09 Automotive Technologies International, Inc. Arrangement and method for obtaining information using phase difference of modulated illumination
US7782256B2 (en) * 1999-03-05 2010-08-24 Era Systems Corporation Enhanced passive coherent location techniques to track and identify UAVs, UCAVs, MAVs, and other objects
US7667647B2 (en) * 1999-03-05 2010-02-23 Era Systems Corporation Extension of aircraft tracking and positive identification from movement areas into non-movement areas
US7570214B2 (en) 1999-03-05 2009-08-04 Era Systems, Inc. Method and apparatus for ADS-B validation, active and passive multilateration, and elliptical surviellance
US7777675B2 (en) * 1999-03-05 2010-08-17 Era Systems Corporation Deployable passive broadband aircraft tracking
US7739167B2 (en) 1999-03-05 2010-06-15 Era Systems Corporation Automated management of airport revenues
US20100079342A1 (en) * 1999-03-05 2010-04-01 Smith Alexander E Multilateration enhancements for noise and operations management
US8446321B2 (en) 1999-03-05 2013-05-21 Omnipol A.S. Deployable intelligence and tracking system for homeland security and search and rescue
US7889133B2 (en) 1999-03-05 2011-02-15 Itt Manufacturing Enterprises, Inc. Multilateration enhancements for noise and operations management
US8203486B1 (en) 1999-03-05 2012-06-19 Omnipol A.S. Transmitter independent techniques to extend the performance of passive coherent location
US7908077B2 (en) * 2003-06-10 2011-03-15 Itt Manufacturing Enterprises, Inc. Land use compatibility planning software
US20090251996A1 (en) * 2004-03-09 2009-10-08 Koninklijke Philips Electronics, N.V. Object position estimation
US20060139339A1 (en) * 2004-12-29 2006-06-29 Pechman Robert J Touch location determination using vibration wave packet dispersion
US9760214B2 (en) 2005-02-23 2017-09-12 Zienon, Llc Method and apparatus for data entry input
US9274551B2 (en) * 2005-02-23 2016-03-01 Zienon, Llc Method and apparatus for data entry input
US7683890B2 (en) * 2005-04-28 2010-03-23 3M Innovative Properties Company Touch location determination using bending mode sensors and multiple detection techniques
US20110034210A1 (en) * 2005-05-06 2011-02-10 Seagate Technology Llc Communication device and storage device protocol
US7471592B2 (en) * 2005-05-10 2008-12-30 The United States Of America As Represented By The Secretary Of The Navy Method and apparatus for passive acoustic ranging
US7333394B2 (en) * 2005-06-22 2008-02-19 Basilico Albert R Navigational aid for diver
US7969822B2 (en) * 2005-07-15 2011-06-28 Estate Of Albert R. Basilico System and method for extending GPS to divers and underwater vehicles
US7272074B2 (en) * 2005-07-15 2007-09-18 Basilico Albert R System and method for extending GPS to divers and underwater vehicles
US9152241B2 (en) * 2006-04-28 2015-10-06 Zienon, Llc Method and apparatus for efficient data input
US7965227B2 (en) * 2006-05-08 2011-06-21 Era Systems, Inc. Aircraft tracking using low cost tagging as a discriminator
US8133119B2 (en) 2008-10-01 2012-03-13 Microsoft Corporation Adaptation for alternate gaming input devices
US8294767B2 (en) 2009-01-30 2012-10-23 Microsoft Corporation Body scan
US9652030B2 (en) 2009-01-30 2017-05-16 Microsoft Technology Licensing, Llc Navigation of a virtual plane using a zone of restriction for canceling noise
US8295546B2 (en) 2009-01-30 2012-10-23 Microsoft Corporation Pose tracking pipeline
US8866821B2 (en) 2009-01-30 2014-10-21 Microsoft Corporation Depth map movement tracking via optical flow and velocity prediction
US8773355B2 (en) 2009-03-16 2014-07-08 Microsoft Corporation Adaptive cursor sizing
US9256282B2 (en) 2009-03-20 2016-02-09 Microsoft Technology Licensing, Llc Virtual object manipulation
US8988437B2 (en) 2009-03-20 2015-03-24 Microsoft Technology Licensing, Llc Chaining animations
US8638985B2 (en) 2009-05-01 2014-01-28 Microsoft Corporation Human body pose estimation
US9015638B2 (en) 2009-05-01 2015-04-21 Microsoft Technology Licensing, Llc Binding users to a gesture based system and providing feedback to the users
US8942428B2 (en) 2009-05-01 2015-01-27 Microsoft Corporation Isolate extraneous motions
US9898675B2 (en) 2009-05-01 2018-02-20 Microsoft Technology Licensing, Llc User movement tracking feedback to improve tracking
US8253746B2 (en) 2009-05-01 2012-08-28 Microsoft Corporation Determine intended motions
US8181123B2 (en) 2009-05-01 2012-05-15 Microsoft Corporation Managing virtual port associations to users in a gesture-based computing environment
US9498718B2 (en) 2009-05-01 2016-11-22 Microsoft Technology Licensing, Llc Altering a view perspective within a display environment
US8340432B2 (en) 2009-05-01 2012-12-25 Microsoft Corporation Systems and methods for detecting a tilt angle from a depth image
US8503720B2 (en) 2009-05-01 2013-08-06 Microsoft Corporation Human body pose estimation
US9377857B2 (en) 2009-05-01 2016-06-28 Microsoft Technology Licensing, Llc Show body position
US8649554B2 (en) 2009-05-01 2014-02-11 Microsoft Corporation Method to control perspective for a camera-controlled computer
CN104808821A (en) * 2009-05-26 2015-07-29 美国智能科技有限公司 Method and apparatus for data entry input
US9182814B2 (en) 2009-05-29 2015-11-10 Microsoft Technology Licensing, Llc Systems and methods for estimating a non-visible or occluded body part
US8744121B2 (en) 2009-05-29 2014-06-03 Microsoft Corporation Device for identifying and tracking multiple humans over time
US8320619B2 (en) 2009-05-29 2012-11-27 Microsoft Corporation Systems and methods for tracking a model
US8625837B2 (en) 2009-05-29 2014-01-07 Microsoft Corporation Protocol and format for communicating an image from a camera to a computing environment
US8418085B2 (en) 2009-05-29 2013-04-09 Microsoft Corporation Gesture coach
US8542252B2 (en) 2009-05-29 2013-09-24 Microsoft Corporation Target digitization, extraction, and tracking
US9383823B2 (en) 2009-05-29 2016-07-05 Microsoft Technology Licensing, Llc Combining gestures beyond skeletal
US8856691B2 (en) 2009-05-29 2014-10-07 Microsoft Corporation Gesture tool
US8176442B2 (en) 2009-05-29 2012-05-08 Microsoft Corporation Living cursor control mechanics
US8379101B2 (en) 2009-05-29 2013-02-19 Microsoft Corporation Environment and/or target segmentation
US8509479B2 (en) 2009-05-29 2013-08-13 Microsoft Corporation Virtual object
US8803889B2 (en) 2009-05-29 2014-08-12 Microsoft Corporation Systems and methods for applying animations or motions to a character
US8145594B2 (en) 2009-05-29 2012-03-27 Microsoft Corporation Localized gesture aggregation
US9400559B2 (en) 2009-05-29 2016-07-26 Microsoft Technology Licensing, Llc Gesture shortcuts
US7914344B2 (en) 2009-06-03 2011-03-29 Microsoft Corporation Dual-barrel, connector jack and plug assemblies
US8390680B2 (en) 2009-07-09 2013-03-05 Microsoft Corporation Visual representation expression based on player expression
US9159151B2 (en) 2009-07-13 2015-10-13 Microsoft Technology Licensing, Llc Bringing a visual representation to life via learned input from the user
US9141193B2 (en) 2009-08-31 2015-09-22 Microsoft Technology Licensing, Llc Techniques for using human gestures to control gesture unaware programs
US8490002B2 (en) * 2010-02-11 2013-07-16 Apple Inc. Projected display shared workspaces
US8174934B2 (en) * 2010-07-28 2012-05-08 Empire Technology Development Llc Sound direction detection
US9111326B1 (en) * 2010-12-21 2015-08-18 Rawles Llc Designation of zones of interest within an augmented reality environment
US8845110B1 (en) 2010-12-23 2014-09-30 Rawles Llc Powered augmented reality projection accessory display device
US9134593B1 (en) 2010-12-23 2015-09-15 Amazon Technologies, Inc. Generation and modulation of non-visible structured light for augmented reality projection system
US8905551B1 (en) 2010-12-23 2014-12-09 Rawles Llc Unpowered augmented reality projection accessory display device
US8845107B1 (en) 2010-12-23 2014-09-30 Rawles Llc Characterization of a scene with structured light
US9721386B1 (en) 2010-12-27 2017-08-01 Amazon Technologies, Inc. Integrated augmented reality environment
US9508194B1 (en) 2010-12-30 2016-11-29 Amazon Technologies, Inc. Utilizing content output devices in an augmented reality environment
US9607315B1 (en) 2010-12-30 2017-03-28 Amazon Technologies, Inc. Complementing operation of display devices in an augmented reality environment
US8942917B2 (en) 2011-02-14 2015-01-27 Microsoft Corporation Change invariant scene recognition by an agent
US9857868B2 (en) 2011-03-19 2018-01-02 The Board Of Trustees Of The Leland Stanford Junior University Method and system for ergonomic touch-free interface
US8840466B2 (en) 2011-04-25 2014-09-23 Aquifi, Inc. Method and system to create three-dimensional mapping in a two-dimensional game
US8760395B2 (en) 2011-05-31 2014-06-24 Microsoft Corporation Gesture recognition techniques
US9118782B1 (en) 2011-09-19 2015-08-25 Amazon Technologies, Inc. Optical interference mitigation
US8635637B2 (en) 2011-12-02 2014-01-21 Microsoft Corporation User interface presenting an animated avatar performing a media reaction
US9100685B2 (en) 2011-12-09 2015-08-04 Microsoft Technology Licensing, Llc Determining audience state or interest using passive sensor data
US8854433B1 (en) 2012-02-03 2014-10-07 Aquifi, Inc. Method and system enabling natural user interface gestures with an electronic system
US8898687B2 (en) 2012-04-04 2014-11-25 Microsoft Corporation Controlling a media program based on a media reaction
CA2775700C (en) 2012-05-04 2013-07-23 Microsoft Corporation Determining a future portion of a currently presented media program
US9111135B2 (en) 2012-06-25 2015-08-18 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera
US8934675B2 (en) 2012-06-25 2015-01-13 Aquifi, Inc. Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
US8836768B1 (en) 2012-09-04 2014-09-16 Aquifi, Inc. Method and system enabling natural user interface gestures with user wearable glasses
US9281727B1 (en) * 2012-11-01 2016-03-08 Amazon Technologies, Inc. User device-based control of system functionality
EP2920610A2 (en) * 2012-11-13 2015-09-23 Aqwary AB Method and system for monitoring the status of divers
US9857470B2 (en) 2012-12-28 2018-01-02 Microsoft Technology Licensing, Llc Using photometric stereo for 3D environment modeling
US9129155B2 (en) 2013-01-30 2015-09-08 Aquifi, Inc. Systems and methods for initializing motion tracking of human hands using template matching within bounded regions determined using a depth map
US9092665B2 (en) 2013-01-30 2015-07-28 Aquifi, Inc Systems and methods for initializing motion tracking of human hands
US9940553B2 (en) 2013-02-22 2018-04-10 Microsoft Technology Licensing, Llc Camera/object pose from predicted coordinates
US9129515B2 (en) * 2013-03-15 2015-09-08 Qualcomm Incorporated Ultrasound mesh localization for interactive systems
US20140269194A1 (en) * 2013-03-18 2014-09-18 Inputek Inc. Three Dimensional Touch by Acoustic Waves
US9298266B2 (en) 2013-04-02 2016-03-29 Aquifi, Inc. Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
KR101411650B1 (en) * 2013-06-21 2014-06-24 김남규 Key input apparatus and key input recognizing apparatus and key input system using them
US9113036B2 (en) 2013-07-17 2015-08-18 Ebay Inc. Methods, systems, and apparatus for providing video communications
US9798388B1 (en) 2013-07-31 2017-10-24 Aquifi, Inc. Vibrotactile system to augment 3D input systems
US9507417B2 (en) 2014-01-07 2016-11-29 Aquifi, Inc. Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9619105B1 (en) 2014-01-30 2017-04-11 Aquifi, Inc. Systems and methods for gesture based interaction with viewpoint dependent user interfaces
US9618618B2 (en) * 2014-03-10 2017-04-11 Elwha Llc Systems and methods for ultrasonic position and motion detection
US9739883B2 (en) 2014-05-16 2017-08-22 Elwha Llc Systems and methods for ultrasonic velocity and acceleration detection
WO2015180796A1 (en) * 2014-05-30 2015-12-03 Sonova Ag A method for controlling a hearing device via touch gestures, a touch gesture controllable hearing device and a method for fitting a touch gesture controllable hearing device
US9995823B2 (en) 2015-07-31 2018-06-12 Elwha Llc Systems and methods for utilizing compressed sensing in an entertainment system
US10334360B2 (en) * 2017-06-12 2019-06-25 Revolabs, Inc Method for accurately calculating the direction of arrival of sound at a microphone array
US10291999B1 (en) 2018-03-29 2019-05-14 Cae Inc. Method and system for validating a position of a microphone
CA3000122C (en) 2018-03-29 2019-02-26 Cae Inc. Method and system for determining a position of a microphone

Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3647962A (en) * 1970-07-30 1972-03-07 Science Accessories Corp Keyboard encoding
US4686655A (en) 1970-12-28 1987-08-11 Hyatt Gilbert P Filtering system for processing signature signals
US4312053A (en) 1971-12-03 1982-01-19 Subcom, Inc. Range and depth detection system
US3857022A (en) 1973-11-15 1974-12-24 Integrated Sciences Corp Graphic input device
NO147618L (en) 1976-11-18
US4333170A (en) 1977-11-21 1982-06-01 Northrop Corporation Acoustical detection and tracking system
US4376301A (en) 1980-12-10 1983-03-08 Chevron Research Company Seismic streamer locator
WO1984000427A1 (en) 1982-07-10 1984-02-02 Syrinx Precision Instr Data input device
US4506354A (en) * 1982-09-30 1985-03-19 Position Orientation Systems, Ltd. Ultrasonic position detecting system
US4599607A (en) * 1983-10-31 1986-07-08 General Instrument Corporation Acoustic keyboard
JPS61138420A (en) * 1984-12-11 1986-06-25 アルプス電気株式会社 Keyboard unit
US4716542A (en) 1985-09-26 1987-12-29 Timberline Software Corporation Method and apparatus for single source entry of analog and digital data into a computer
DE3601658A1 (en) 1986-01-21 1987-07-23 Siemens Ag CIRCUIT FOR READING AN OPTOELECTRONIC IMAGE SENSOR
US4980870A (en) 1988-06-10 1990-12-25 Spivey Brett A Array compensating beamformer
US5174759A (en) 1988-08-04 1992-12-29 Preston Frank S TV animation interactively controlled by the viewer through input above a book page
US4956824A (en) 1989-09-12 1990-09-11 Science Accessories Corp. Position determination apparatus
US4991148A (en) 1989-09-26 1991-02-05 Gilchrist Ian R Acoustic digitizing system
US5062641A (en) 1989-09-28 1991-11-05 Nannette Poillon Projectile trajectory determination system
US5099456A (en) 1990-06-13 1992-03-24 Hughes Aircraft Company Passive locating system
US5573077A (en) 1990-11-16 1996-11-12 Knowles; Terence J. Acoustic touch position sensor
US5166905A (en) 1991-10-21 1992-11-24 Texaco Inc. Means and method for dynamically locating positions on a marine seismic streamer cable
JP3182015B2 (en) 1993-01-27 2001-07-03 テキサス インスツルメンツ インコーポレイテツド Optical image synthesis method
US5617371A (en) 1995-02-08 1997-04-01 Diagnostic/Retrieval Systems, Inc. Method and apparatus for accurately determing the location of signal transducers in a passive sonar or other transducer array system
JP3817294B2 (en) 1996-04-01 2006-09-06 浜松ホトニクス株式会社 Solid-state imaging device
US5825033A (en) 1996-10-31 1998-10-20 The Arizona Board Of Regents On Behalf Of The University Of Arizona Signal processing method for gamma-ray semiconductor sensor
US6266048B1 (en) 1998-08-27 2001-07-24 Hewlett-Packard Company Method and apparatus for a virtual display/keyboard for a PDA
JP2002526989A (en) 1998-09-28 2002-08-20 スリーディーヴィー システムズ リミテッド Distance measurement using camera
FI990676A (en) 1999-03-26 2000-09-27 Nokia Mobile Phones Ltd Hand-held entry system for data entry and mobile phone
GB9908545D0 (en) 1999-04-14 1999-06-09 Canon Kk Image processing apparatus

Cited By (115)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8200488B2 (en) * 2002-12-13 2012-06-12 Sony Deutschland Gmbh Method for processing speech using absolute loudness
US20040128127A1 (en) * 2002-12-13 2004-07-01 Thomas Kemp Method for processing speech using absolute loudness
US20050088915A1 (en) * 2002-12-24 2005-04-28 Lapin Brett D. Gun shot digital imaging system
US6965541B2 (en) * 2002-12-24 2005-11-15 The Johns Hopkins University Gun shot digital imaging system
US20090128483A1 (en) * 2004-03-02 2009-05-21 Microsoft Corporation Advanced navigation techniques for portable devices
WO2006070044A1 (en) * 2004-12-29 2006-07-06 Nokia Corporation A method and a device for localizing a sound source and performing a related action
US20070078018A1 (en) * 2005-09-30 2007-04-05 Norman Kellogg Golf range with automated ranging system
EP1806952A2 (en) * 2006-01-06 2007-07-11 Agilent Technologies, Inc. Acoustic location and acoustic signal enhancement
EP1806952A3 (en) * 2006-01-06 2009-03-11 Agilent Technologies, Inc. Acoustic location and acoustic signal enhancement
US20160125885A1 (en) * 2006-02-01 2016-05-05 Innovation Specialists, Llc Sensory Enhancement Systems and Methods in Personal Electronic Devices
EP1982253A2 (en) * 2006-02-10 2008-10-22 AWQ Consulting Touch detection
EP1982253A4 (en) * 2006-02-10 2014-05-14 Tpk Holding Co Ltd Touch detection
US20100214267A1 (en) * 2006-06-15 2010-08-26 Nokia Corporation Mobile device with virtual keypad
WO2007144014A1 (en) * 2006-06-15 2007-12-21 Nokia Corporation Mobile device with virtual keypad
US20080048988A1 (en) * 2006-08-24 2008-02-28 Yingyong Qi Mobile device with acoustically-driven text input and method thereof
JP2010501945A (en) * 2006-08-24 2010-01-21 クゥアルコム・インコーポレイテッド Mobile device with acoustically driven text input and acoustically driven text input method
US8077163B2 (en) * 2006-08-24 2011-12-13 Qualcomm Incorporated Mobile device with acoustically-driven text input and method thereof
US20100259620A1 (en) * 2007-10-22 2010-10-14 Bae Systems Plc Cctv incident location system
US20100110834A1 (en) * 2008-10-30 2010-05-06 Kim Kyu-Hong Apparatus and method of detecting target sound
US8213263B2 (en) * 2008-10-30 2012-07-03 Samsung Electronics Co., Ltd. Apparatus and method of detecting target sound
US20110157170A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Programming architecture supporting mixed two and three dimensional displays
US9204138B2 (en) 2009-12-31 2015-12-01 Broadcom Corporation User controlled regional display of mixed two and three dimensional content
US20110157326A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Multi-path and multi-source 3d content storage, retrieval, and delivery
US20110157336A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Display with elastic light manipulator
US20110157471A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Independent viewer tailoring of same media source content via a common 2d-3d display
US20110157172A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation User controlled regional display of mixed two and three dimensional content
US20110157697A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Adaptable parallax barrier supporting mixed 2d and stereoscopic 3d display regions
US20110157167A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Coordinated driving of adaptable light manipulator, backlighting and pixel array in support of adaptable 2d and 3d displays
US20110159929A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Multiple remote controllers that each simultaneously controls a different visual presentation of a 2d/3d display
US20110157257A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Backlighting array supporting adaptable parallax barrier
US20110161843A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Internet browser and associated content definition supporting mixed two and three dimensional displays
US20110157696A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Display with adaptable parallax barrier
US20110157264A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Communication infrastructure including simultaneous video pathways for multi-viewer support
US20110157168A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Three-dimensional display system with adaptation based on viewing reference of viewer(s)
US20110157309A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Hierarchical video compression supporting selective delivery of two-dimensional and three-dimensional video content
US20110164188A1 (en) * 2009-12-31 2011-07-07 Broadcom Corporation Remote control with integrated position, viewer identification and optical and audio test
US20110164034A1 (en) * 2009-12-31 2011-07-07 Broadcom Corporation Application programming interface supporting mixed two and three dimensional displays
US20110164115A1 (en) * 2009-12-31 2011-07-07 Broadcom Corporation Transcoder supporting selective delivery of 2d, stereoscopic 3d, and multi-view 3d content from source video
US20110164111A1 (en) * 2009-12-31 2011-07-07 Broadcom Corporation Adaptable media stream servicing two and three dimensional content
US20110169919A1 (en) * 2009-12-31 2011-07-14 Broadcom Corporation Frame formatting supporting mixed two and three dimensional video data communication
US20110169913A1 (en) * 2009-12-31 2011-07-14 Broadcom Corporation Set-top box circuitry supporting 2d and 3d content reductions to accommodate viewing environment constraints
US20110169930A1 (en) * 2009-12-31 2011-07-14 Broadcom Corporation Eyewear with time shared viewing supporting delivery of differing content to multiple viewers
US20110157339A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Display supporting multiple simultaneous 3d views
US9013546B2 (en) 2009-12-31 2015-04-21 Broadcom Corporation Adaptable media stream servicing two and three dimensional content
US8988506B2 (en) 2009-12-31 2015-03-24 Broadcom Corporation Transcoder supporting selective delivery of 2D, stereoscopic 3D, and multi-view 3D content from source video
US20110157322A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Controlling a pixel array to support an adaptable light manipulator
US20110157169A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Operating system supporting mixed 2d, stereoscopic 3d and multi-view 3d displays
US9979954B2 (en) 2009-12-31 2018-05-22 Avago Technologies General Ip (Singapore) Pte. Ltd. Eyewear with time shared viewing supporting delivery of differing content to multiple viewers
US9654767B2 (en) 2009-12-31 2017-05-16 Avago Technologies General Ip (Singapore) Pte. Ltd. Programming architecture supporting mixed two and three dimensional displays
US8964013B2 (en) 2009-12-31 2015-02-24 Broadcom Corporation Display with elastic light manipulator
US8922545B2 (en) 2009-12-31 2014-12-30 Broadcom Corporation Three-dimensional display system with adaptation based on viewing reference of viewer(s)
US9019263B2 (en) 2009-12-31 2015-04-28 Broadcom Corporation Coordinated driving of adaptable light manipulator, backlighting and pixel array in support of adaptable 2D and 3D displays
US20110157327A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation 3d audio delivery accompanying 3d display supported by viewer/listener position and orientation tracking
US9247286B2 (en) 2009-12-31 2016-01-26 Broadcom Corporation Frame formatting supporting mixed two and three dimensional video data communication
US20110157330A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation 2d/3d projection system
US8687042B2 (en) 2009-12-31 2014-04-01 Broadcom Corporation Set-top box circuitry supporting 2D and 3D content reductions to accommodate viewing environment constraints
US9143770B2 (en) 2009-12-31 2015-09-22 Broadcom Corporation Application programming interface supporting mixed two and three dimensional displays
US9124885B2 (en) 2009-12-31 2015-09-01 Broadcom Corporation Operating system supporting mixed 2D, stereoscopic 3D and multi-view 3D displays
US20110157315A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Interpolation of three-dimensional video content
US9066092B2 (en) 2009-12-31 2015-06-23 Broadcom Corporation Communication infrastructure including simultaneous video pathways for multi-viewer support
US8767050B2 (en) 2009-12-31 2014-07-01 Broadcom Corporation Display supporting multiple simultaneous 3D views
US8823782B2 (en) * 2009-12-31 2014-09-02 Broadcom Corporation Remote control with integrated position, viewer identification and optical and audio test
US9049440B2 (en) 2009-12-31 2015-06-02 Broadcom Corporation Independent viewer tailoring of same media source content via a common 2D-3D display
US8854531B2 (en) 2009-12-31 2014-10-07 Broadcom Corporation Multiple remote controllers that each simultaneously controls a different visual presentation of a 2D/3D display
US8408115B2 (en) * 2010-09-20 2013-04-02 Raytheon Bbn Technologies Corp. Systems and methods for an indicator for a weapon sight
US20120067201A1 (en) * 2010-09-20 2012-03-22 Raytheon Bbn Technologies Corp. Systems and methods for an indicator for a weapon sight
US20170033752A1 (en) * 2010-10-21 2017-02-02 Nokia Technologies Oy Recording Level Adjustment Using A Distance To A Sound Source
US10601385B2 (en) * 2010-10-21 2020-03-24 Nokia Technologies Oy Recording level adjustment using a distance to a sound source
US9552840B2 (en) 2010-10-25 2017-01-24 Qualcomm Incorporated Three-dimensional sound capturing and reproducing with multi-microphones
US9031256B2 (en) 2010-10-25 2015-05-12 Qualcomm Incorporated Systems, methods, apparatus, and computer-readable media for orientation-sensitive recording control
US8855341B2 (en) 2010-10-25 2014-10-07 Qualcomm Incorporated Systems, methods, apparatus, and computer-readable media for head tracking based on recorded sound signals
WO2012061149A1 (en) * 2010-10-25 2012-05-10 Qualcomm Incorporated Three-dimensional sound capturing and reproducing with multi-microphones
CN103181192A (en) * 2010-10-25 2013-06-26 Qualcomm Incorporated Three-dimensional sound capturing and reproducing with multi-microphones
WO2012058465A3 (en) * 2010-10-29 2012-08-23 Qualcomm Incorporated Transitioning multiple microphones from a first mode to a second mode
US9226069B2 (en) 2010-10-29 2015-12-29 Qualcomm Incorporated Transitioning multiple microphones from a first mode to a second mode
WO2012104767A3 (en) * 2011-02-04 2013-01-03 Technogym S.P.A. Apparatus and method for locating the point of impact of a body on a surface
US20140136096A1 (en) * 2011-04-15 2014-05-15 Meijo University Approaching vehicle detecting system and approaching vehicle detecting method
US9103903B2 (en) * 2011-04-15 2015-08-11 Toyota Jidosha Kabushiki Kaisha Approaching vehicle detecting system and approaching vehicle detecting method
US8218902B1 (en) * 2011-12-12 2012-07-10 Google Inc. Portable electronic device position sensing circuit
US20130222230A1 (en) * 2012-02-29 2013-08-29 Pantech Co., Ltd. Mobile device and method for recognizing external input
WO2013151789A1 (en) * 2012-04-02 2013-10-10 Qualcomm Incorporated Systems, methods, apparatus, and computer-readable media for gestural manipulation of a sound field
US10448161B2 (en) 2012-04-02 2019-10-15 Qualcomm Incorporated Systems, methods, apparatus, and computer-readable media for gestural manipulation of a sound field
US11818560B2 (en) 2012-04-02 2023-11-14 Qualcomm Incorporated Systems, methods, apparatus, and computer-readable media for gestural manipulation of a sound field
US9307335B2 (en) 2012-07-31 2016-04-05 Japan Science And Technology Agency Device for estimating placement of physical objects
EP2809085A4 (en) * 2012-07-31 2015-12-09 Japan Science & Tech Agency Device for estimating placement of physical objects
WO2014024009A1 (en) * 2012-08-10 2014-02-13 Nokia Corporation Spatial audio user interface apparatus
US20150222948A1 (en) * 2012-09-29 2015-08-06 Shenzhen Prtek Co. Ltd. Multimedia Device Voice Control System and Method, and Computer Storage Medium
US9955210B2 (en) * 2012-09-29 2018-04-24 Shenzhen Prtek Co. Ltd. Multimedia device voice control system and method, and computer storage medium
US9711042B2 (en) 2012-10-11 2017-07-18 Vincent Stoltz Method and a module for controlling pieces of equipment by sound pulse
FR2996922A1 (en) * 2012-10-11 2014-04-18 Vincent Stoltz Method and module for controlling equipment by sound pulse
WO2014057188A1 (en) * 2012-10-11 2014-04-17 Stoltz Vincent Method and module for controlling equipment by sound pulse
US20170115827A1 (en) * 2013-03-15 2017-04-27 Elo Touch Solutions, Inc. Acoustic Touch Apparatus And Method Using Touch Sensitive Lamb Waves
US10678380B2 (en) * 2013-03-15 2020-06-09 Elo Touch Solutions, Inc. Acoustic touch apparatus and method using touch sensitive Lamb waves
US9743211B2 (en) * 2013-03-19 2017-08-22 Koninklijke Philips N.V. Method and apparatus for determining a position of a microphone
US20160029141A1 (en) * 2013-03-19 2016-01-28 Koninklijke Philips N.V. Method and apparatus for determining a position of a microphone
JP2015152316A (en) * 2014-02-10 2015-08-24 Ono Sokki Co., Ltd. Sound visualizing apparatus
EP3200050A4 (en) * 2014-09-24 2018-05-16 Boe Technology Group Co. Ltd. Touch screen and touch point positioning method
US9886938B2 (en) * 2015-02-10 2018-02-06 Navico Holding As Transducer array having a transceiver
US10319356B2 (en) 2015-02-10 2019-06-11 Navico Holding As Transducer array having a transceiver
US20160232884A1 (en) * 2015-02-10 2016-08-11 Navico Holding As Transducer Array Having a Transceiver
US10114119B2 (en) 2015-05-20 2018-10-30 Navico Holding As Sonar systems and methods using interferometry and/or beamforming for 3D imaging
US10547736B2 (en) * 2015-07-14 2020-01-28 Driving Management Systems, Inc. Detecting the location of a phone using RF wireless and ultrasonic signals
US20190199850A1 (en) * 2015-07-14 2019-06-27 Driving Management Systems, Inc. Detecting the location of a phone using rf wireless and ultrasonic signals
US10024957B2 (en) 2015-09-17 2018-07-17 Navico Holding As Adaptive beamformer for sonar imaging
US20180306890A1 (en) * 2015-10-30 2018-10-25 Hornet Industries, Llc System and method to locate and identify sound sources in a noisy environment
US20200176015A1 (en) * 2017-02-21 2020-06-04 Onfuture Ltd. Sound source detecting method and detecting device
US10891970B2 (en) * 2017-02-21 2021-01-12 Onfuture Ltd. Sound source detecting method and detecting device
US11143758B2 (en) 2017-10-13 2021-10-12 Navico Holding As Sonar transducer performance optimization
US11392290B2 (en) * 2020-06-26 2022-07-19 Intel Corporation Touch control surfaces for electronic user devices and related methods
US20220350481A1 (en) * 2020-06-26 2022-11-03 Intel Corporation Touch control surfaces for electronic user devices and related methods
US11893234B2 (en) * 2020-06-26 2024-02-06 Intel Corporation Touch control surfaces for electronic user devices and related methods
CN111857642A (en) * 2020-07-20 2020-10-30 Goertek Technology Co., Ltd. Control method of electronic device and readable storage medium
WO2023099355A1 (en) * 2021-12-02 2023-06-08 Universite De Lille Coincidence detector for locating a source
FR3130043A1 (en) * 2021-12-02 2023-06-09 Université de Lille Coincidence detector for locating a source
WO2024137044A1 (en) * 2022-12-23 2024-06-27 Qualcomm Incorporated User equipment inputs based on sounds and/or vibrations

Also Published As

Publication number Publication date
WO2002082249A3 (en) 2003-03-20
US6690618B2 (en) 2004-02-10
WO2002082249A2 (en) 2002-10-17

Similar Documents

Publication Publication Date Title
US6690618B2 (en) Method and apparatus for approximating a source position of a sound-causing event for determining an input used in operating an electronic device
Mao et al. RNN-based room scale hand motion tracking
EP1852772B1 (en) Method for a touch pad
US8139029B2 (en) Method and device for three-dimensional sensing
US8169404B1 (en) Method and device for planary sensory detection
CN103229071B (en) For the system and method estimated based on the object's position of ultrasonic reflection signal
US10754476B2 (en) Systems and methods for ultrasonic, millimeter wave and hybrid sensing
US11550419B2 (en) Touch-based input device
US12105217B2 (en) Method, device and system for determining relative angle between intelligent devices
CN111683325A (en) Sound effect control method and device, sound box, wearable device and readable storage medium
KR20080042560A (en) Touch panel using wave
Cheng et al. PD-FMCW: Push the limit of device-free acoustic sensing using phase difference in FMCW
GB2385125A (en) Using vibrations generated by movement along a surface to determine position
Lu et al. VPad: Virtual writing tablet for laptops leveraging acoustic signals
Zhao et al. UltraSnoop: Placement-agnostic Keystroke Snooping via Smartphone-based Ultrasonic Sonar
Chung et al. vTrack: virtual trackpad interface using mm-level sound source localization for mobile interaction
Liu et al. Acoustic-based 2-D target tracking with constrained intelligent edge device
Bai et al. WhisperWand: Simultaneous Voice and Gesture Tracking Interface
Lee et al. Sonicstrument: A Musical Interface with Stereotypical Acoustic Transducers.
Hanh et al. Impact Localization on Rigid Surfaces Using Hermitian Angle Distribution for Human–Computer Interface Applications
Huang et al. MM-Tap: Adaptive and Scalable Tap Localization on Ubiquitous Surfaces With mm-Level Accuracy
You et al. Acoustic Signal Processing for Acoustic Source Localisation in an Elastic Solid
Jeong et al. A comparative assessment of Wi-Fi and acoustic signal-based HCI methods on the practicality
García-Requejo et al. Ultrasonic Device-Free Localization System using Orthogonal Chirp-Based Multistatic Sonar
Jiang et al. Echo-ID: Smartphone Placement Region Identification for Context-Aware Computing

Legal Events

Date Code Title Description
AS Assignment

Owner name: CANESTA, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TOMASI, CARLO;SURUCU, FAHRI;REEL/FRAME:012766/0603

Effective date: 20020402

FPAY Fee payment

Year of fee payment: 4

AS Assignment

Owner name: HEILONGJIANG GOLDEN JUMPING GROUP CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CANESTA, INC.;REEL/FRAME:021679/0068

Effective date: 20081003

REMI Maintenance fee reminder mailed
FEPP Fee payment procedure

Free format text: PETITION RELATED TO MAINTENANCE FEES GRANTED (ORIGINAL EVENT CODE: PMFG); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Free format text: PETITION RELATED TO MAINTENANCE FEES FILED (ORIGINAL EVENT CODE: PMFP); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

LAPS Lapse for failure to pay maintenance fees
REIN Reinstatement after maintenance fee payment confirmed
FP Lapsed due to failure to pay maintenance fee

Effective date: 20120210

PRDP Patent reinstated due to the acceptance of a late maintenance fee

Effective date: 20120522

FPAY Fee payment

Year of fee payment: 8

SULP Surcharge for late payment
REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20160210