US20140372027A1 - Music-Based Positioning Aided By Dead Reckoning - Google Patents

Music-Based Positioning Aided By Dead Reckoning

Info

Publication number
US20140372027A1
US20140372027A1 (application Ser. No. 14/088,437)
Authority
US
Grant status
Application
Patent type
Prior art keywords: mobile device, positioning, burst, signature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US14088437
Inventor
Guobiao Zhang
Bruce Bing Wang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HangZhou HaiCun Information Tech Co Ltd
Original Assignee
HangZhou HaiCun Information Tech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Classifications

    • G01C 21/206: Instruments for performing navigational calculations specially adapted for indoor navigation
    • G01C 21/14: Dead reckoning executed aboard the object being navigated, by recording the course traversed by the object
    • G01C 21/165: Dead reckoning by integrating acceleration or speed (inertial navigation), combined with non-inertial navigation instruments
    • G01C 22/006: Pedometers
    • G01S 5/18: Position-fixing by co-ordinating two or more direction or position line determinations using ultrasonic, sonic or infrasonic waves
    • G01S 5/26: Position of receiver fixed by co-ordinating a plurality of position lines defined by path-difference measurements
    • G01S 5/30: Determining absolute distances from a plurality of spaced points of known location

Abstract

Music-based positioning (MP) by itself provides positioning service only sporadically and is therefore not well suited for indoor positioning. The present invention discloses a system and method for music-based positioning aided by dead reckoning (MP-DR). At each signature burst (i.e. a highly unique short musical segment suitable for positioning), musical sounds are used for positioning. Between signature bursts, positioning is performed by dead reckoning (DR). MP-DR is an ideal combination of MP and DR: DR extends the temporal coverage of MP, while MP provides the much-needed periodic calibrations for DR.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims priority to a provisional application entitled “Music-Based Positioning Aided By Dead Reckoning”, Ser. No. 61/835,527, filed Jun. 14, 2013.
  • BACKGROUND
  • 1. Technical Field of the Invention
  • The present invention relates to the field of mobile electronic systems, and more particularly to indoor positioning using human consumable sounds such as music.
  • 2. Prior Art
  • Sound has been suggested as a medium for positioning. For positioning in a large venue like a shopping mall or a supermarket, an important metric is the range of the sound. To limit the number of sound sources (e.g. loudspeakers), the sound used for positioning preferably has a long range.
  • Ultrasound, although widely used for ranging, fails in this respect. Ultrasound suffers from severe attenuation when transmitted in air. For a distance of 100 meters, the transmission loss of a 40 kHz ultrasound signal is ˜140 dB (FIG. 1). In fact, ultrasound can only be practically projected to a range of ˜15 meters in air. As a result, ultrasound is not suitable for positioning in a large venue.
  • On the other hand, audible sounds attenuate much less in air. For example, the transmission loss of a 1 kHz audible sound is only ˜40 dB over 100 meters (FIG. 1). To be projected to a long range, audible sounds are preferably music or other human consumable sounds, whose volume can be turned up without causing annoyance to humans in the immediate vicinity. Furthermore, large venues are generally equipped with public address (PA) systems, whose loudspeakers are deployed to provide good acoustic coverage. It would be very attractive to leverage the existing PA systems and use the human consumable sounds they produce for positioning in a large venue. Hereinafter, music is used as the primary example of human consumable sounds.
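By way of illustration, the figures above can be reproduced, to first order, by modeling transmission loss as spherical spreading plus frequency-dependent air absorption. The sketch below uses rough, assumed absorption coefficients (on the order of 1 dB/m near 40 kHz and well below 0.01 dB/m near 1 kHz); the exact values depend on temperature and humidity.

```python
import math

def transmission_loss_db(distance_m, absorption_db_per_m, ref_distance_m=1.0):
    """First-order transmission loss: spherical spreading plus air absorption."""
    spreading = 20.0 * math.log10(distance_m / ref_distance_m)  # 6 dB per doubling of distance
    absorption = absorption_db_per_m * distance_m               # frequency-dependent term
    return spreading + absorption

# Assumed absorption coefficients, for illustration only:
print(transmission_loss_db(100, 1.0))     # 40 kHz ultrasound: roughly 140 dB over 100 m
print(transmission_loss_db(100, 0.005))   # 1 kHz audible sound: roughly 40 dB over 100 m
```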
  • Although it has many advantages, music-based positioning (MP) faces a difficult challenge. A large venue is filled with background noise and multi-path reflections, so not every portion of the music can be used for positioning. For example, a portion of the music that is barely distinguishable from the background noise cannot be used. To be suitable for positioning, a musical segment (i.e. a burst) should possess enough uniquely identifiable properties. A figure of merit is its correlativity, which represents the relative strength of its auto-correlation vs. its correlation with other signals. A burst with a large correlativity is relatively uncorrelated with its lagged replica or with background noise. Within an audio content, a burst suitable for positioning is referred to as a signature burst. The auto-correlation function of a signature burst exhibits a distinct central peak with quickly diminishing side-lobe tails.
  • Litwhiler et al. (“A simple method for evaluating audible signals for acoustic measurements”, The Technology Interface, Vol. 7, No. 2, Spring 2007) taught a music-based positioning (MP) method. A music file is first sliced into bursts, each 1 s long. The correlativity of each burst is then calculated as the ratio between the peak value and the root-mean-square (rms) value of its auto-correlation function. Any burst with a correlativity higher than a pre-determined threshold (e.g. 20) is considered a signature burst. In the example illustrated in FIG. 2, there are 19 signature bursts (shown as cross-hatched bars) among the 60 bursts evaluated. The temporal coverage (i.e. the percentage of bursts within a period that are suitable for positioning) is ˜30%. The longest time interval without a signature burst (i.e. a non-signature interval) is 10 s. During a non-signature interval, no positioning can be performed using musical sounds. Because it can be used for positioning only sporadically, music was considered unsuitable for indoor positioning in the past.
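A minimal Python sketch of this screening procedure, assuming a mono track held in a NumPy array at a known sample rate; the windowing and normalization details of the cited method may differ.

```python
import numpy as np

def autocorrelation(x):
    """FFT-based auto-correlation for lags 0..len(x)-1."""
    n = len(x)
    spectrum = np.fft.rfft(x, 2 * n)                      # zero-pad to avoid circular wrap-around
    return np.fft.irfft(spectrum * np.conj(spectrum))[:n]

def correlativity(burst):
    """Peak-to-rms ratio of the burst's auto-correlation function."""
    ac = autocorrelation(np.asarray(burst, dtype=float))
    return np.max(np.abs(ac)) / np.sqrt(np.mean(ac ** 2))

def find_signature_bursts(samples, sample_rate, burst_seconds=1.0, threshold=20.0):
    """Slice the track into 1-second bursts and keep those exceeding the threshold."""
    burst_len = int(burst_seconds * sample_rate)
    return [i for i in range(len(samples) // burst_len)
            if correlativity(samples[i * burst_len:(i + 1) * burst_len]) > threshold]
```

The temporal coverage is then the number of returned burst indices divided by the number of bursts evaluated.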
  • OBJECTS AND ADVANTAGES
  • It is a principal object of the present invention to provide a positioning service in a large venue using the existing infrastructure (e.g. loudspeakers in the public address system and microphones in mobile devices).
  • It is a further object of the present invention to improve the temporal coverage of music-based positioning (MP).
  • It is a further object of the present invention to improve the accuracy of dead reckoning (DR).
  • In accordance with these and other objects of the present invention, the present invention discloses a system and method for music-based positioning aided by dead reckoning (MP-DR).
  • SUMMARY OF THE INVENTION
  • The present invention discloses a system and method for music-based positioning aided by dead reckoning (MP-DR). At each signature burst (i.e. a highly unique short musical segment suitable for positioning), musical sounds are used for positioning. During a non-signature interval (i.e. between successive signature bursts), positioning is performed by dead reckoning.
  • MP-DR is an ideal combination of music-based positioning (MP) and dead reckoning (DR), whose strengths and weaknesses exactly complement each other. MP provides accurate positioning with an error of ˜1%. However, because most musical segments are not suitable for positioning, MP can be performed only sporadically. The time gap between two successive MP measurements ranges from a few seconds to tens of seconds. On the other hand, DR suffers from accumulation of errors. After calibrating its location at a reference point, although DR initially has a small error, this error grows rapidly and becomes unacceptable after a mobile user walks for more than one minute or so. For MP-DR, DR extends temporal coverage for MP, while MP provides the much needed periodic calibrations for DR.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 displays the relationship between sound absorption in air and sound frequency;
  • FIG. 2 displays the correlativity of the first 60 bursts (each burst is 1 s long) in “Margaritaville” by Jimmy Buffett (from Litwhiler);
  • FIG. 3 is a flow chart of a preferred MP-DR method;
  • FIG. 4 displays error vs. time for music-based positioning aided by dead reckoning (MP-DR);
  • FIGS. 5A-5B illustrate a way to perform MP-DR between the signature bursts Si and Si+1: FIG. 5A illustrates the positioning of the mobile device using the DR-enhanced acoustic positioning method at Si; FIG. 5B illustrates the progression of location errors between Si and Si+1;
  • FIG. 6 is a functional block diagram of a mobile device in a preferred MP-DR system;
  • FIG. 7 is a functional block diagram of a preferred acoustic positioning (AP) module;
  • FIGS. 8A-8C illustrate three preferred signal generators.
  • It should be noted that all the drawings are schematic and not drawn to scale. Relative dimensions and proportions of parts of the device structures in the figures have been shown exaggerated or reduced in size for the sake of clarity and convenience in the drawings. The same reference symbols are generally used to refer to corresponding or similar features in the different embodiments.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Those of ordinary skill in the art will realize that the following description of the present invention is illustrative only and is not intended to be in any way limiting. Other embodiments of the invention will readily suggest themselves to such skilled persons from an examination of this disclosure.
  • As used herein, a “location” refers to a geographic area with a lower resolution. Because it has a low accuracy, a geographic area identified by dead reckoning (DR) is generally referred to as a location. Accordingly, the process of identifying a geographic area by DR is referred to as localization. A “position” refers to a geographic area with a higher resolution. Because it has a high accuracy, a geographic area identified by acoustic positioning (AP) or music-based positioning (MP) is generally referred to as a position. Accordingly, the process of identifying a geographic area by AP or MP is referred to as positioning.
  • In this specification, “human consumable sounds” are sounds whose volume can be turned up to project the sounds to a long range without causing annoyance to humans in the immediate vicinity. Human consumable sounds are generally produced from an audio content, including music, human speech and others. As music is used as the primary example of human consumable sounds, the terms “music-based positioning (MP)” and “acoustic positioning (AP)” are used interchangeably. They both refer to a positioning method using human consumable sounds.
  • Referring now to FIGS. 3-5B, a general description for music-based positioning aided by dead reckoning (MP-DR) is provided. At each signature burst (i.e. a highly unique short musical segment suitable for positioning), musical sounds are used for positioning. During a non-signature interval (i.e. between successive signature bursts), positioning is performed by dead reckoning. More details of MP-DR are disclosed below.
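The flow of FIG. 3 can be summarized by the following schematic loop. Here `acoustic_fix` and `dr_step` are placeholder callbacks standing in for the positioning and dead-reckoning routines described below; the names and time step are illustrative only.

```python
def run_mp_dr(signature_times, acoustic_fix, dr_step, t_end, dt=0.5):
    """Schematic MP-DR loop: an acoustic fix at every signature burst,
    dead reckoning during the non-signature intervals (FIG. 3)."""
    upcoming = sorted(signature_times)
    t = upcoming.pop(0)
    position = acoustic_fix(t, predicted=None)       # initial MP fix at the first burst (step 200)
    track = [(t, position)]
    while t < t_end:
        t += dt
        position = dr_step(position, dt)             # localization by dead reckoning (step 300)
        if upcoming and t >= upcoming[0]:
            position = acoustic_fix(upcoming.pop(0), predicted=position)  # re-fix at the next burst (step 400)
        track.append((t, position))
    return track
```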
  • At a first signature burst Si, a mobile device 100 is positioned using the signature burst Si (step 200 of FIG. 3). This is performed using a DR-enhanced acoustic positioning method, which includes the following steps. At the starting point TSi of the signature burst Si, at least two sound-transmitting devices (e.g. speakers) 10 a, 10 b simultaneously produce the signature burst Si (FIG. 5A). The time-of-flight (TOF) between the mobile device 100 and each speaker is measured and converted to a distance ra, rb by multiplying by the speed of sound. The intersections X, Y of circles 20 a (with its center at 10 a and a radius of ra) and 20 b (with its center at 10 b and a radius of rb) are the two possible positions of the mobile device 100. The position is further narrowed down to 30 a* (i.e. intersection X) by selecting the intersection (X or Y) that is located within the possible DR locations 40 a. Here, the possible DR locations 40 a include all locations predicted by dead reckoning; in this example, they are represented by a circle whose center 30 a is the estimated DR location and whose radius is the DR error. Besides TOF, time-difference-of-flight (TDOF) may also be used for positioning.
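The geometry of FIG. 5A reduces to intersecting two circles and letting the dead-reckoning prediction break the tie between X and Y. A minimal sketch follows; the speaker coordinates, time-of-flight values and DR location used in the example are hypothetical.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, nominal value in air

def circle_intersections(ca, ra, cb, rb):
    """Intersections of circle (center ca, radius ra) with circle (center cb, radius rb)."""
    (xa, ya), (xb, yb) = ca, cb
    d = math.hypot(xb - xa, yb - ya)
    if d == 0 or d > ra + rb or d < abs(ra - rb):
        return []                                      # circles do not intersect
    a = (ra ** 2 - rb ** 2 + d ** 2) / (2 * d)         # distance from ca to the chord
    h = math.sqrt(max(ra ** 2 - a ** 2, 0.0))
    mx, my = xa + a * (xb - xa) / d, ya + a * (yb - ya) / d
    return [(mx + h * (yb - ya) / d, my - h * (xb - xa) / d),
            (mx - h * (yb - ya) / d, my + h * (xb - xa) / d)]

def select_by_dr(candidates, dr_location):
    """Pick the intersection closest to the DR-estimated location 30a, i.e. the one
    expected to lie inside the circle of possible DR locations 40a."""
    return min(candidates,
               key=lambda p: math.hypot(p[0] - dr_location[0], p[1] - dr_location[1]))

# Hypothetical example: speakers 10a at (0, 0) and 10b at (20, 0); measured TOFs in seconds.
ra = 0.0354 * SPEED_OF_SOUND
rb = 0.0471 * SPEED_OF_SOUND
candidates = circle_intersections((0.0, 0.0), ra, (20.0, 0.0), rb)  # intersections X, Y
position = select_by_dr(candidates, dr_location=(11.0, 7.0))        # resolved position 30a*
```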
  • Although dead reckoning could have a large error (meters to tens of meters), this error is not passed on to the position 30 a*, because dead reckoning is only used to select one of the two intersections X, Y. The accuracy of the position 30 a* is primarily determined by the accuracy of the radii ra, rb. Because acoustic ranging has a high accuracy (˜1%, e.g. ˜0.3 meter at a range of 30 meters), the position 30 a* can achieve a high accuracy, typically around tens of centimeters (FIG. 4).
  • During a non-signature interval NSi (e.g. between two successive signature bursts Si and Si+1 of FIG. 4), the mobile device 100 is localized by dead reckoning (step 300 of FIG. 3). Dead reckoning (DR) is a process of calculating one's current location by using a previously determined location, and advancing that location based upon known or estimated speeds over elapsed time. Dead reckoning uses the accelerometer and the compass of the mobile device to track the mobile user. Based on the accelerometer reading, it is possible to tell whether the user has taken a step, and therefrom estimate the displacement. Based on the compass reading, the direction of each step can be tracked.
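A bare-bones sketch of one dead-reckoning update of this kind is given below; the step detector, stride length and heading convention are simplifications assumed purely for illustration.

```python
import math

def detect_step(accel_magnitude_g, threshold_g=1.15):
    """Crude step detector: flag a step when the acceleration magnitude rises well
    above 1 g (practical detectors use peak picking or zero-crossing logic)."""
    return accel_magnitude_g > threshold_g

def dr_update(position, heading_rad, step_detected, step_length_m=0.7):
    """Advance the estimated location by one stride along the compass heading
    (heading measured clockwise from north; +y is north in this sketch)."""
    if not step_detected:
        return position
    x, y = position
    return (x + step_length_m * math.sin(heading_rad),
            y + step_length_m * math.cos(heading_rad))
```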
  • Because its sensors are noisy, dead reckoning suffers from accumulation of errors, which can grow cubically with the total number of steps walked from the last reference point, where the location is calibrated. This is further illustrated in FIG. 5B. At position 30 a*, the location of the mobile device 100 is calibrated by acoustic positioning and has a small error. As the mobile user walks along the path 50, each step (e.g. 30 b, 30 c, 30 d) increases the location error, as indicated by the growing size of these circles.
  • At a second signature burst Si+1, the mobile device 100 is again positioned using the signature burst Si+1 (step 400 of FIG. 3). Similar to FIG. 5A, the mobile device 100 has two possible positions, i.e. two intersections of circles 20 a′, 20 b′ (FIG. 5B). Because possible DR locations 40 e have already been predicted from dead reckoning, the mobile device 100 can be easily positioned. Overall, the location error of the mobile device 100 exhibits a saw-tooth behavior: at the first signature burst Si, the error is small; during the non-signature interval NSi, the error grows and then sharply drops at the next signature burst Si+1 (FIG. 4).
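The saw-tooth behavior of FIG. 4 can be illustrated with a toy error model in which the error resets to the MP accuracy at each signature burst and then grows during the non-signature interval. All numeric values below (MP error, growth coefficient, burst times) are assumed for illustration only.

```python
import numpy as np

def sawtooth_error(signature_times, t_end, mp_error_m=0.3, growth_coeff=0.002, dt=0.1):
    """Toy error profile: reset to mp_error_m at each signature burst, then grow
    (here cubically in elapsed time, echoing the cubic growth noted above)."""
    bursts = sorted(signature_times)
    times = np.arange(bursts[0], t_end, dt)
    errors = np.empty_like(times)
    last_fix = bursts[0]
    for i, t in enumerate(times):
        while bursts and t >= bursts[0]:
            last_fix = bursts.pop(0)                             # MP calibration at a signature burst
        errors[i] = mp_error_m + growth_coeff * (t - last_fix) ** 3  # DR drift since the last fix
    return times, errors

times, errors = sawtooth_error([2.0, 9.0, 19.0, 25.0], t_end=30.0)
```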
  • Referring now to FIG. 6, a mobile device 100 used in a preferred MP-DR system is illustrated. The mobile device 100 can be any of a variety of different types of devices. For example, the device can be a cellular or other wireless phone, a laptop or netbook computer, a tablet or notepad computer, a mobile station, an entertainment appliance, a game console, an automotive computer, and so forth. Thus, the device may range from a full-resource device with substantial memory and processor resources (e.g. personal computers, game consoles) to a low-resource device with limited memory and/or processing resources (e.g. traditional set-top boxes, hand-held game consoles).
  • The mobile device 100 comprises an acoustic positioning (AP) module 60, a dead-reckoning (DR) module 70, a processor 80 and a memory 90. In some embodiments, the mobile device 100 may include many more components than those shown in FIG. 6. However, it is not necessary that all of these generally conventional components be shown in order to disclose an illustrative embodiment.
  • The AP module 60 is primarily a music-based positioning (MP) module. It measures the device position using human consumable sounds such as music. The measurement principles range from signal strength measurement to time-of-flight (TOF) measurement. The TOF measurement further includes pulsed time measurement, continuous-wave measurement (e.g. pattern-matching or phase-shift measurement) and others. More details of the AP module 60 are disclosed in FIGS. 7-8C.
  • The DR module 70 receives sensor data and executes a DR algorithm to determine the location of the mobile device based on vector analysis of changes in the sensor data. It comprises a plurality of inertial sensors that detect movement (e.g. rotation, motion, velocity, etc.), altitude, and/or direction. These inertial sensors can include an accelerometer, a compass, a gyroscope, and so forth. They collect data regarding the detected movement, position, and/or direction of the device.
  • The processor 80 accepts digital data as input, processes it according to instructions stored in the memory 90, and provides results as output. The memory 90 is adapted to store software. According to the teachings of the present invention, software is provided which includes a set of executable instructions, programs, and/or program modules adapted to control the AP module 60 and the DR module 70. In some preferred embodiments, the memory 90 also stores audio contents and the associated signature-burst meta-data.
  • Referring now to FIG. 7, a functional block diagram of a preferred MP module 60 is disclosed. The preferred MP module 60 comprises a sound receiver 62, a signal generator 64 and a correlator 66. The sound receiver 62 is generally a microphone. It receives the acoustic signals 61 from the speakers and converts them into electrical signals, i.e. the received audio signals. The signal generator 64 produces a replica of the transmitted audio signals. The correlator 66 has two inputs 65 a and 65 b: the first input 65 a includes the received audio signals, while the second input 65 b includes the replica of the transmitted audio signals. In order to provide real-time acoustic (or, music-based) positioning, the signature-burst meta-data 94 is transmitted to the mobile device 100 and stored in its memory 90. This signature-burst meta-data 94 includes the timing information (TSi, TSi+1 . . . ) of all signature bursts (Si, Si+1 . . . ) of the audio contents to be played. At the expected occurrence of a signature burst (e.g. Si), the correlator 66 is turned on. By matching the pattern of the transmitted audio signals in the received audio signals, the delay of the received audio signals with respect to the transmitted signals can be measured. This delay is the time-of-flight (TOF) between the speaker and the microphone.
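A minimal sketch of the pattern-matching step performed by the correlator 66, assuming the received audio is captured starting at the scheduled burst time TSi announced in the signature-burst meta-data 94; variable names are illustrative.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, nominal

def estimate_tof(received, replica, sample_rate):
    """Delay of the received signature burst relative to its transmitted replica.
    `received` is the microphone capture aligned to the scheduled burst start TSi;
    `replica` is the locally generated copy of the transmitted audio signals."""
    corr = np.correlate(received, replica, mode="full")       # pattern matching
    lag = int(np.argmax(np.abs(corr))) - (len(replica) - 1)   # peak location = delay in samples
    tof = max(lag, 0) / sample_rate                           # time-of-flight in seconds
    return tof, tof * SPEED_OF_SOUND                          # and the speaker-microphone distance
```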
  • Referring now to FIGS. 8A-8C, several preferred signal generators 64 are disclosed. In FIG. 8A, all remote speakers are simultaneously driven by an audio player and produce sounds from radio broadcast (e.g. AM/FM) signals. The preferred signal generator 64 is a radio receiver 64 a. It receives the same radio broadcast signals as the audio player. Through mixing, filtering and amplifying, it converts the radio signals into base-band signals 65 b. Because the radio signals propagate much faster (at the speed of light) than the acoustic signals (at the speed of sound), the base-band signals 65 b are considered a replica of the transmitted audio signals. In this preferred embodiment, the signature-burst meta-data 94 is preferably transmitted by the radio broadcast signals, too.
  • In FIG. 8B, all remote speakers are simultaneously driven by an audio player. The audio player plays a list of audio contents pre-defined by an operator. The signal generator 64 comprises an RF receiver 68 a and an audio synthesizer 64 b. At a pre-determined location (e.g. at the entrance) of the venue, the play-list (including the playback schedule) and the associated digital audio files 92 are downloaded to the mobile device via RF signals 67 (e.g. via WiFi and/or cellular signals) through the RF receiver 68 a. This does not require full WiFi coverage at the venue. The digital audio files 92 further include the signature-burst meta-data 94 and the audio data 96. The audio synthesizer 64 b converts the audio data 96 into a replica of the transmitted signals 65 b.
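One possible layout of the downloaded digital audio files 92, with the signature-burst meta-data 94 kept alongside the audio data 96, is sketched below; the field names and example values are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SignatureBurst:
    start_time_s: float    # TSi: scheduled start of the burst within the track
    duration_s: float      # burst length (1 s in the example of FIG. 2)
    correlativity: float   # figure of merit used to qualify the burst

@dataclass
class PlaylistEntry:
    audio_file: str                     # reference to the audio data 96
    playback_start_s: float             # venue playback schedule
    signature_bursts: List[SignatureBurst] = field(default_factory=list)  # meta-data 94

# Hypothetical entry: one track scheduled at t = 0 with two signature bursts.
playlist = [PlaylistEntry("track01.wav", 0.0,
                          [SignatureBurst(12.0, 1.0, 27.5),
                           SignatureBurst(19.0, 1.0, 31.2)])]
```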
  • FIG. 8C is similar to FIG. 8B, except that the memory 90 of the mobile device 100 already stores the digital audio files on the play-list. In this case, only the play-list (including the playback schedule) is downloaded to the mobile device 100 via the RF signals 67. This requires less download time and incurs less download cost.
  • While illustrative embodiments have been shown and described, it would be apparent to those skilled in the art that many more modifications than those mentioned above are possible without departing from the inventive concepts set forth herein. The invention, therefore, is not to be limited except in the spirit of the appended claims.

Claims (20)

    What is claimed is:
  1. A mobile device, comprising:
    an acoustic positioning (AP) module for positioning said mobile device using a signature burst, wherein said signature burst includes human consumable sounds suitable for positioning;
    a dead-reckoning (DR) module for localizing said mobile device during an interval without signature burst.
  2. The mobile device according to claim 1, wherein said human consumable sounds are sounds of an audio content.
  3. The mobile device according to claim 2, wherein said audio content is music or human speech.
  4. The mobile device according to claim 2, further comprising a memory for storing said audio content.
  5. The mobile device according to claim 2, further comprising a memory for storing a signature-burst meta-data associated with said audio content.
  6. The mobile device according to claim 1, wherein said human consumable sounds are simultaneously generated by at least two sound-transmitting devices.
  7. The mobile device according to claim 1, wherein said signature burst has a correlativity higher than a pre-determined value.
  8. The mobile device according to claim 7, wherein the correlativity of a burst is the ratio of the peak value and the root-mean-square (rms) value of the auto-correlation function of said burst.
  9. The mobile device according to claim 1, wherein said AP module further comprises a correlating element for calculating a correlation function between received audio signals and transmitted audio signals.
  10. The mobile device according to claim 9, wherein said AP module further comprises a sound-receiving element for converting said human consumable sounds into said received audio signals.
  11. The mobile device according to claim 9, wherein said AP module further comprises a radio-receiving element for converting a radio signal into a replica of said transmitted audio signals.
  12. The mobile device according to claim 9, wherein said AP module further comprises an audio-synthesizing element for converting an audio content into a replica of said transmitted audio signals.
  13. The mobile device according to claim 9, wherein said AP module measures the time-of-flight or time-difference-of-flight of said received audio signals with respect to said transmitted audio signals.
  14. The mobile device according to claim 1, wherein said DR module comprises at least an accelerometer, a compass and/or a gyroscope.
  15. A mobile device, comprising:
    a memory for storing a signature-burst meta-data associated with an audio content, wherein signature bursts include human consumable sounds suitable for positioning;
    an acoustic positioning module for positioning said mobile device using a signature burst of said audio content.
  16. The mobile device according to claim 15, wherein said audio content is music or human speech.
  17. The mobile device according to claim 15, wherein said human consumable sounds are simultaneously generated by at least two sound-transmitting devices.
  18. The mobile device according to claim 15, wherein said signature bursts have a correlativity higher than a pre-determined value.
  19. The mobile device according to claim 18, wherein the correlativity of a burst is the ratio between the peak value and the root-mean-square (rms) value of the auto-correlation function of said burst.
  20. The mobile device according to claim 15, further comprising a dead-reckoning module for localizing said mobile device during an interval without signature burst.
US14088437 2013-06-14 2013-11-24 Music-Based Positioning Aided By Dead Reckoning Pending US20140372027A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201361835527 2013-06-14 2013-06-14
US14088437 US20140372027A1 (en) 2013-06-14 2013-11-24 Music-Based Positioning Aided By Dead Reckoning

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US14088437 US20140372027A1 (en) 2013-06-14 2013-11-24 Music-Based Positioning Aided By Dead Reckoning
US14637380 US20150177000A1 (en) 2013-06-14 2015-03-03 Music-Based Positioning Aided By Dead Reckoning
US15382735 US20170097239A1 (en) 2013-06-14 2016-12-18 Music-Based Positioning Aided By Dead Reckoning

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14637380 Continuation US20150177000A1 (en) 2013-06-14 2015-03-03 Music-Based Positioning Aided By Dead Reckoning

Publications (1)

Publication Number Publication Date
US20140372027A1 (en) 2014-12-18

Family

ID=52019933

Family Applications (3)

Application Number Title Priority Date Filing Date
US14088437 Pending US20140372027A1 (en) 2013-06-14 2013-11-24 Music-Based Positioning Aided By Dead Reckoning
US14637380 Abandoned US20150177000A1 (en) 2013-06-14 2015-03-03 Music-Based Positioning Aided By Dead Reckoning
US15382735 Pending US20170097239A1 (en) 2013-06-14 2016-12-18 Music-Based Positioning Aided By Dead Reckoning

Family Applications After (2)

Application Number Title Priority Date Filing Date
US14637380 Abandoned US20150177000A1 (en) 2013-06-14 2015-03-03 Music-Based Positioning Aided By Dead Reckoning
US15382735 Pending US20170097239A1 (en) 2013-06-14 2016-12-18 Music-Based Positioning Aided By Dead Reckoning

Country Status (2)

Country Link
US (3) US20140372027A1 (en)
CN (1) CN104239030A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150177000A1 (en) * 2013-06-14 2015-06-25 Chengdu Haicun Ip Technology Llc Music-Based Positioning Aided By Dead Reckoning

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6650999B1 (en) * 1998-05-22 2003-11-18 Hans-Detlef Brust Method and device for finding a parked vehicle
US7091851B2 (en) * 2002-07-02 2006-08-15 Tri-Sentinel, Inc. Geolocation system-enabled speaker-microphone accessory for radio communication devices
US7149629B1 (en) * 2000-11-22 2006-12-12 Applanix Corporation, Inc. AINS land surveyor system with reprocessing, AINS-LSSRP
US20070045018A1 (en) * 2005-08-25 2007-03-01 Carter Scott J Systems and methods for controlling powered vehicles near a restricted region
US7236091B2 (en) * 2005-02-10 2007-06-26 Pinc Solutions Position-tracking system
US20080059988A1 (en) * 2005-03-17 2008-03-06 Morris Lee Methods and apparatus for using audience member behavior information to determine compliance with audience measurement system usage requirements
US7729204B2 (en) * 2007-06-08 2010-06-01 Microsoft Corporation Acoustic ranging
US8046160B2 (en) * 2005-03-18 2011-10-25 Gatekeeper Systems, Inc. Navigation systems and methods for wheeled objects
US8219384B2 (en) * 2010-05-26 2012-07-10 Google Inc. Acoustic model adaptation using geographic information
US8490128B2 (en) * 2009-02-12 2013-07-16 Digimarc Corporation Media processing methods and arrangements
US8666303B2 (en) * 2007-11-07 2014-03-04 The Nielsen Company (Us), Llc Methods and apparatus to collect media exposure information
US8856845B2 (en) * 2012-12-21 2014-10-07 TCL Research America Inc. Method and system for providing personalized contents

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6219385B1 (en) * 1998-12-23 2001-04-17 Itt Manufacturing Enterprises, Inc. Digital AM/FM positioning system (DAFPS)—an international positioning system
DE10035222A1 (en) * 2000-07-20 2002-02-07 Bosch Gmbh Robert Acoustic location of persons in detection area, involves deriving signal source position from received signal time displacements and sound detection element positions
US7336563B2 (en) * 2004-01-30 2008-02-26 Sonitor Technologies As Method and system for increased update rate in acoustic positioning
GB0426448D0 (en) * 2004-12-02 2005-01-05 Koninkl Philips Electronics Nv Position sensing using loudspeakers as microphones
WO2006078597A3 (en) * 2005-01-18 2009-04-16 Eric P Haeker Method and apparatus for generating visual images based on musical compositions
JP2010511172A (en) * 2006-11-30 2010-04-08 カルディナーレ、チコッティ、ジュセッペ The method for locating a remote device using an acoustic wave and the electromagnetic wave
EP2058803B1 (en) * 2007-10-29 2010-01-20 Harman/Becker Automotive Systems GmbH Partial speech reconstruction
US7890262B2 (en) * 2008-03-31 2011-02-15 Honeywell International Inc. Position estimation for navigation devices
EP2163284A1 (en) * 2008-09-02 2010-03-17 Zero Point Holding A/S Integration of audio input to a software application
US8660670B2 (en) * 2009-09-10 2014-02-25 Sam FREED Controller with artificial intelligence based on selection from episodic memory and corresponding methods
US8886980B2 (en) * 2010-03-29 2014-11-11 Qualcomm Incorporated Power efficient way of operating motion sensors
US20120143495A1 (en) * 2010-10-14 2012-06-07 The University Of North Texas Methods and systems for indoor navigation
JP5289517B2 (en) * 2011-07-28 2013-09-11 株式会社半導体理工学研究センター Sensor network system and communication method thereof
US20140350840A1 (en) * 2013-05-23 2014-11-27 Cellco Partnership D/B/A Verizon Wireless Crowd proximity device
US9589189B2 (en) * 2013-05-23 2017-03-07 Cellco Partnership Device for mapping physical world with virtual information
US20140372027A1 (en) * 2013-06-14 2014-12-18 Hangzhou Haicun Information Technology Co. Ltd. Music-Based Positioning Aided By Dead Reckoning
US20150168538A1 (en) * 2013-12-06 2015-06-18 Digimarc Corporation Mobile device indoor navigation
US9500739B2 (en) * 2014-03-28 2016-11-22 Knowles Electronics, Llc Estimating and tracking multiple attributes of multiple objects from multi-sensor data
US20160084937A1 (en) * 2014-09-22 2016-03-24 Invensense Inc. Systems and methods for determining position information using acoustic sensing

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6650999B1 (en) * 1998-05-22 2003-11-18 Hans-Detlef Brust Method and device for finding a parked vehicle
US7149629B1 (en) * 2000-11-22 2006-12-12 Applanix Corporation, Inc. AINS land surveyor system with reprocessing, AINS-LSSRP
US7091851B2 (en) * 2002-07-02 2006-08-15 Tri-Sentinel, Inc. Geolocation system-enabled speaker-microphone accessory for radio communication devices
US7236091B2 (en) * 2005-02-10 2007-06-26 Pinc Solutions Position-tracking system
US20080059988A1 (en) * 2005-03-17 2008-03-06 Morris Lee Methods and apparatus for using audience member behavior information to determine compliance with audience measurement system usage requirements
US8650586B2 (en) * 2005-03-17 2014-02-11 The Nielsen Company (Us), Llc Methods and apparatus for using audience member behavior information to determine compliance with audience measurement system usage requirements
US8046160B2 (en) * 2005-03-18 2011-10-25 Gatekeeper Systems, Inc. Navigation systems and methods for wheeled objects
US20070045018A1 (en) * 2005-08-25 2007-03-01 Carter Scott J Systems and methods for controlling powered vehicles near a restricted region
US7729204B2 (en) * 2007-06-08 2010-06-01 Microsoft Corporation Acoustic ranging
US8666303B2 (en) * 2007-11-07 2014-03-04 The Nielsen Company (Us), Llc Methods and apparatus to collect media exposure information
US8490128B2 (en) * 2009-02-12 2013-07-16 Digimarc Corporation Media processing methods and arrangements
US8219384B2 (en) * 2010-05-26 2012-07-10 Google Inc. Acoustic model adaptation using geographic information
US8856845B2 (en) * 2012-12-21 2014-10-07 TCL Research America Inc. Method and system for providing personalized contents

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150177000A1 (en) * 2013-06-14 2015-06-25 Chengdu Haicun Ip Technology Llc Music-Based Positioning Aided By Dead Reckoning
US20170097239A1 (en) * 2013-06-14 2017-04-06 HangZhou HaiCun Information Technology Co., Ltd. Music-Based Positioning Aided By Dead Reckoning

Also Published As

Publication number Publication date Type
CN104239030A (en) 2014-12-24 application
US20170097239A1 (en) 2017-04-06 application
US20150177000A1 (en) 2015-06-25 application

Similar Documents

Publication Publication Date Title
US20090316529A1 (en) Positioning of a Portable Electronic Device
Peng et al. BeepBeep: A high accuracy acoustic ranging system using COTS mobile devices
US20090286549A1 (en) Location Determination
US20130252628A1 (en) Crowd Sourcing With Robust Device Position Determination
US20120313900A1 (en) User interfaces
US8548494B2 (en) Localization of mobile computing devices in indoor environments
US9288597B2 (en) Distributed wireless speaker system with automatic configuration determination when new speakers are added
US20050249360A1 (en) Systems and methods for microphone localization
US20090233551A1 (en) Wireless communication terminals and methods using acoustic ranging synchronized to rf communication signals
US9560449B2 (en) Distributed wireless speaker system
US8224351B1 (en) Identifying a geographically nearby mobile computing device
US20110110531A1 (en) Apparatus, method and computer program for localizing a sound source
US20150256954A1 (en) Networked speaker system with follow me
US20090295639A1 (en) Autonomous ultrasonic indoor location system, apparatus and method
US20120026837A1 (en) Sound direction detection
Yu et al. TOA-based distributed localisation with unknown internal delays and clock frequency offsets in wireless sensor networks
US8660581B2 (en) Mobile device indoor navigation
Chang et al. Spinning beacons for precise indoor localization
US20120309411A1 (en) State estimation using motion context and multiple input observation types
US20070070812A1 (en) Positioning system using ultrasonic waves and method for operating the same
US20110142100A1 (en) Methods and apparatuses for identifying and mitigating interference in a wireless signal
US20140004876A1 (en) Indoor/Outdoor Differentiation Using Radio Frequency (RF) Transmitters
US20140080514A1 (en) Characterizing an indoor structure based on detected movements and/or position locations of a mobile device
US7729204B2 (en) Acoustic ranging
Tiete et al. SoundCompass: a distributed MEMS microphone array-based sensor for sound source localization