EP2761320B1 - Sound-based positioning - Google Patents
Sound-based positioning
- Publication number
- EP2761320B1 (application EP12836065.8A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- sound signal
- sound
- location
- receiving device
- determining
- Prior art date
- Legal status: Active (the status listed is an assumption and is not a legal conclusion)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/18—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
- G01S5/26—Position of receiver fixed by co-ordinating a plurality of position lines defined by path-difference measurements
Definitions
- GPS: Global Positioning System
- available consumer devices are limited in their communication capabilities, sensing capabilities (e.g., mobile device microphones), the accuracy of their internal clocks, available power, etc. Accordingly, obtaining highly accurate, real-time location information on a mobile user within enclosed buildings (or where GPS positioning is otherwise unavailable) is difficult without nontrivial modifications to the hardware of available mobile devices.
- WO 01/34264 A1 discloses an acoustic location system.
- an acoustic detection system comprising one or more spaced acoustic signal generators located at predetermined relative positions and an object comprising: (i) a detector for detecting acoustic signals generated by the one or more acoustic signal generators and converting the detected acoustic signals into corresponding electrical signals; and (ii) means for processing the corresponding electrical signals to identify the respective times of arrival of the acoustic signals.
- the position of the object relative to the predetermined relative positions of said acoustic signal generators is determined using the identified times of arrival.
- the acoustic signals output by the acoustic signal generators are spread spectrum electrical signals which reduces the noticeability of the acoustic signals to humans or animals.
- EP 0 492 015 A1 discloses a method and apparatus for navigating an automatic guided vehicle.
- Implementations described and claimed herein address the foregoing problems by using a receiving device to capture sound signals (e.g., ultrasonic) from multiple sound signal sources, selecting the sound signals satisfying a reliability condition for use in determining an initial position of the receiving device relative to the corresponding sound signal sources, determining the initial position of the receiving device using multilateration of the selected sound signals, and updating the current position of the receiving device as the reliability of individual sound signals varies in the presence of dynamically changing environmental interference, multipathing, and movement between the receiving device and the sound signal sources.
- Some modern mobile devices, such as smart phones, include microphones capable of detecting ultrasonic signals, which presents opportunities for using consumer-grade mobile devices to perform ultrasonic-based positioning.
- the ultrasonic bandwidth that is detectable by such devices is currently rather narrow (e.g., between 20 KHz and 22 KHz). Nevertheless, ultrasonic signals can be played within this limited bandwidth while providing sufficient information to allow the position of a receiving device to be determined relative to the ultrasonic signal sources.
- One implementation includes sound signal sources (e.g., speakers) distributed throughout a given area.
- the sound signals emitted by the sound signal sources are received by one or more receiving devices (e.g., mobile devices having microphones capable of accurately capturing ultrasonic sound signals), which use the received sound signals to compute a location within the given area.
- a receiving device can determine its initial position from the received sound signals using multilateration, a process of determining a position of a receiving device based on accurately computing the time difference of arrival (TDOA) of signals transmitted from multiple sound signal sources having known locations. In this manner, multilateration can be used to determine a position of a receiving device relative to a number of sound signal sources.
- TDOA: time difference of arrival
- TOA: time of arrival
- one implementation of multilateration involves a receiving device that receives sound signals from multiple sound signal sources at known locations. Differences in the time of arrival of each sound signal, which can be normalized based on known transmission timeslots, are used to determine differences in the distances between the receiving device and each sound signal source.
- With two sound signal sources, the receiving device can be located on a hyperboloid. With three sound signal sources, the receiving device can be located on a second hyperboloid, wherein the intersection of the two hyperboloids describes a curve on which the receiving device lies.
- With four sound signal sources, the receiving device can be located on a third hyperboloid, wherein the intersection of the three hyperboloids defines a unique point in three-dimensional space.
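- The multilateration step can be sketched numerically. This is a minimal illustration, not the patent's implementation: it assumes NumPy, a nominal speed of sound of 343 m/s, and hypothetical function names, and refines a position guess by Gauss-Newton iteration over the TDOA hyperboloid equations.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed nominal value)

def tdoa_residuals(pos, sources, tdoas):
    """Residuals of the hyperboloid equations: range difference to each
    source minus the range difference implied by the measured TDOA."""
    d0 = np.linalg.norm(pos - sources[0])
    return np.array([np.linalg.norm(pos - s) - d0 - SPEED_OF_SOUND * t
                     for s, t in zip(sources[1:], tdoas)])

def multilaterate(sources, tdoas, guess, iters=25):
    """Gauss-Newton refinement of a position estimate from TDOA measurements.
    sources: (N, dim) known source locations; tdoas: N-1 arrival-time
    differences measured relative to sources[0]."""
    pos = np.asarray(guess, dtype=float)
    for _ in range(iters):
        r = tdoa_residuals(pos, sources, tdoas)
        J = np.zeros((len(r), pos.size))
        for j in range(pos.size):  # numerical Jacobian, forward differences
            step = np.zeros_like(pos)
            step[j] = 1e-6
            J[:, j] = (tdoa_residuals(pos + step, sources, tdoas) - r) / 1e-6
        pos = pos - np.linalg.lstsq(J, r, rcond=None)[0]
    return pos
```

With four sources, three independent TDOAs over-determine a 2-D position, which is what lets a later reliability check test whether the solution converges near a single point.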
- FIG. 1 illustrates an example scenario 100 employing sound-based positioning.
- a shopper 102 (an example user) is carrying a mobile phone executing a positioning application as he moves through a store 104 (an example environment). The body of the shopper 102 is directed toward the northeast of the store 104.
- Multiple sound signal sources (e.g., a speaker 106) are positioned throughout the store 104, each sound signal source emitting a sound signal in its own time slot (e.g., in a round-robin fashion).
- Each sound signal can be received by an audio input (e.g., a microphone) of a receiving device (e.g., the mobile phone), provided the sound signal is strong enough to be captured by the receiving device.
- an ultrasonic sound signal emitted by the speaker 110 is captured by the receiving device, but the sound signal emitted by the speaker 116 is not strong enough to be captured by the receiving device by virtue of the speaker's distance from the receiving device.
- some sound signals are not received directly by the receiving device.
- the body of shopper 102 blocks the direct sound signal path between the speaker 114 and the receiving device.
- the receiving device may receive reflections of the sound signal from the speaker 114 off of the surrounding structures, such as the shelves, ceilings, and floors in the store. Reflected signals introduce additional distance along the sound signal path and therefore do not provide an accurate time of arrival measurement (without some type of normalization).
- the receiving device discerns between direct sound signals and reflected sound signals in order to omit the reflected sound signals from the positioning computation, although other implementations may be employed to account for certain types of reflections.
- known geometric relationships between a sound signal source and a reflecting surface and between a receiving device and a reflecting surface may be used to calculate the physical distance between the sound signal source and the receiving device along the reflection path and, therefore, the direct physical distance between the sound signal source and the receiving device.
- a sound-based positioning system can compute the set of possible intersections (positions) indicated by a number of captured sound signals, whether they are reflected or direct signals.
- the set of positions made possible by the multiple captured signals can be narrowed by other constraints to suggest the most reliable signal sources to be used for the most accurate positioning.
- a sound-based positioning application executing on the receiving device carried by the shopper 102 receives direct sound signals from speakers 106, 108, 110, and 112. It is also possible that the receiving device receives direct sound signals from other speakers, such as speakers 118 and 120, which can improve the accuracy and/or reliability of a positioning computation. Nevertheless, by receiving four reliable and direct sound signals, the receiving device can use differential time of arrival (DTOA) measurements and multilateration to compute its position relative to the signal sources within three-dimensional space, absent other information. Alternatively, the receiving device can use multilateration to compute its position relative to the signal sources within two-dimensional space based on three reliable and direct audio sources, absent other information.
- DTOA: differential time of arrival
- the reliability and accuracy of the positioning computation can be enhanced and/or the number of signal sources required for the positioning computation can be reduced (such that a subset of previously used sound signal sources are used).
- the sound-positioning application can compute a timing reference for each sound signal (e.g., the times the corresponding sound signal source transmitted and/or stopped transmitting, based on the known distance between the sound signal source and the receiving device). Given these references, the sound-positioning application can switch to non-differential TOA measurements, allowing accurate positioning using fewer sound signal sources than were used in the multilateration stage. Accordingly, as the shopper 102 moves about the store, some previously direct sound signals will become blocked by the shopper's body, other shoppers, shelves, signage, etc. Nevertheless, the sound-positioning application can continue to capture sound signals from various sound signal sources throughout the environment and accurately determine the shopper's position, even as the number of reliable sound signals varies.
- FIG. 2 illustrates another example scenario 200 employing sound-based positioning in which a shopper 202 is located at a different location and orientation relative to multiple signal sources (e.g., a speaker 206) distributed throughout a store 204, each signal source emitting a sound signal that can be received by an audio input (e.g., a microphone) of a receiving device (e.g., the mobile phone).
- the signal strength in the scenario 200 can affect which sound signals are captured by the receiving device.
- some sound signals are not received directly by the receiving device.
- the body of shopper 202 is turned toward the southeast of the store 204.
- the shopper's body blocks the direct sound signal paths between the receiving device and the speakers 206 and 208.
- the receiving device may receive reflections of the sound signal off of the surrounding structures, such as the shelves, ceilings, and floors in the store.
- the receiving device discerns between direct sound signals and reflected sound signals in order to omit the reflected sound signals from the positioning computation, or correctly accounts for the distance the signal has travelled based on the known geometric relationships of sound signal sources, reflecting surfaces, and the receiving device.
- the receiving device carried by the shopper 202 receives direct sound signals from speakers 210, 212, 214, and 216 in their assigned time slots. It is also possible that the receiving device receives direct sound signals from other speakers, such as speakers 218 and 220, which can improve the accuracy and/or reliability of a positioning computation.
- movement of a receiving device throughout a given area can alter in real-time the signal sources upon which the receiving device can base a positioning computation.
- the existence of reflected sound signals within the environment further complicates the selection of reliable sound signals suitable for use in positioning computations. Accordingly, the receiving device and/or a positioning system upon which it relies filters out unreliable sound signals in the environment and excludes their signal sources from positioning computations.
- FIG. 3 illustrates an example data flow diagram 300 for sound-based positioning.
- a receiving device such as a mobile phone, executes an operating system 302 to manage its resources and provide a platform upon which a sound-based positioning application can be run.
- a mobile device executes an operating system with an audio interface that manages the audio resources of the mobile device, such as a microphone 303 and one or more speakers (not shown), and executes a mobile positioning application capable of receiving sound signals from multiple sound signal sources positioned throughout an area (e.g., a store, a warehouse, a manufacturing floor, an office building, etc.).
- a recorder 304 such as a processor-executable software facility, records and digitizes sound signals 305 captured by the microphone 303.
- the recorder 304 stores the digitized sound signals into a recorder process queue 306, where the recorded sound signal is split into sound signal blocks for streaming Fast Fourier Transform (FFT) processing.
- FFT: Fast Fourier Transform
- the block size is on the order of 2048 audio samples long, although other block sizes may be employed.
- the recorder process queue 306 spawns a worker thread 308 that processes the sound signal blocks that are dequeued from the recorder process queue 306 for asynchronous processing.
- the worker thread 308 executes a peak finder 310, which processes each sound signal block processed by the worker thread 308.
- the peak finder 310 employs a cross-correlation manager 312 and a cross-correlator 314 to cross-correlate each sound signal block with a known transmitted signal to identify strong peaks in the correlated output.
- Cross correlation refers to a measure of similarity between two waveforms. One implementation, for example, delays one of the waveforms and then multiplies the waveforms together.
- By finding a correlation peak that has a particular shape and is above a predetermined threshold (e.g., a reliability condition) when compared to other correlation results, the positioning system omits signals that are excessively delayed by reflections (e.g., which result in a longer path that is detectable at the speed of sound).
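- The peak-finding and reliability test can be sketched as follows. This is an illustrative assumption rather than the patent's code: the function name, the threshold ratio, and the use of the median correlation level as a noise floor are choices made here for demonstration.

```python
import numpy as np

def find_signal_peak(block, reference, threshold_ratio=10.0):
    """Cross-correlate a recorded block against the known transmitted
    waveform and apply a simple reliability condition: the peak must stand
    well above the median correlation level, or the block is rejected."""
    corr = np.correlate(block, reference, mode="valid")
    peak_idx = int(np.argmax(np.abs(corr)))
    peak = abs(corr[peak_idx])
    floor = np.median(np.abs(corr)) + 1e-12  # correlation noise floor
    # return the sample offset of the signal within the block, or None
    return peak_idx if peak / floor >= threshold_ratio else None
```

A heavily delayed reflection arrives in a later block (or at a later offset) and a weak or smeared reflection fails the peak-to-floor ratio, so such signals are excluded from the positioning computation.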
- the cross-correlation manager 312 provides a housekeeping function to the cross-correlator 314, which performs the cross-correlation operations and outputs cross-correlated data 316 as a queue of processed (e.g., cross-correlated) sound signals (e.g., correlation peaks).
- the cross-correlated data 316 is processed by a signal source processor 318, which identifies the sound signal sources (e.g., speakers) from which the sound signals are received and the position of the signal sources.
- a source finder 320 of the signal source processor 318 determines an identification number of each correlation peak, thereby associating the correlation peak with a known signal source.
- signal sources are associated in signal source groups (e.g., between 8 and 16 speakers in one signal source group).
- a group identifier 322 identifies a signal source group in which the identified signal source is a member.
- a position recorder 324 associates the found signal source identifier with the corresponding audio timing of the sound signal (e.g., when the sound signal started relative to the start of the sound signal block).
- a multilateration processor 326 receives a set of identified signal sources and their respective timings, the set of known signal source locations, and any geometric constraints (e.g., aisles in a store) and estimates the position of the receiving device from this data using a multilateration operation.
- a callback utility 328 asynchronously calls the sound locator 330, which translates the user location and relative X/Y/Z coordinates into the coordinate system of the application floor plan map for presentation to a user via a user interface 332 (e.g., which can display the floor plan map and user location on a display).
- a tone can be used to identify an individual signal source.
- each signal source is identified by a locally unique tone from a set of tones spaced at intervals of 100 Hz from 20.1 KHz to 21.6 KHz.
- a "down-chirp" may be employed, using a linearly decreasing frequency.
- Other waveforms may be employed, as described with regard to other scales.
- 16 tones are used to identify an individual signal source.
- 16 tones are again used to identify a single signal source.
- the utilized bandwidth is again split into two frequency ranges (e.g., 20 KHz to 20.8 KHz and 20.8 KHz to 21.6 KHz, both at 100 Hz intervals) to scale to 64 signal sources.
- PNC: pseudo-noise coded
- the utilized bandwidth is again split into two frequency ranges (e.g., 19.0 KHz to 20.4 KHz with a center frequency (fc) at 19.7 KHz and 20.4 KHz to 21.8 KHz with a center frequency (fc) at 21.1 KHz).
- 16 PNC waveforms are used to encode 16 group identifiers.
- the utilized bandwidth is again split into two frequency ranges (e.g., 20 KHz to 20.8 KHz and 20.8 KHz to 21.6 KHz, both at 100 Hz intervals) to scale to 256 signal sources.
- a tone is encoded in the first frequency range to indicate one of 8 signal sources and another tone is encoded in the second frequency range to indicate one of 8 groups.
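- As a concrete illustration of such a two-range scheme, the mapping between identifiers and tone frequencies might look like the following sketch. The specific frequencies, the 8 x 8 split, and the function names are assumptions for demonstration, not values taken from the patent.

```python
# Hypothetical frequency plan: one tone from the lower range names the
# signal source, one tone from the upper range names its group
# (8 x 8 = 64 combinations in this sketch).
SOURCE_TONES = [20000 + 100 * n for n in range(8)]  # 20.0 KHz - 20.7 KHz
GROUP_TONES = [20800 + 100 * n for n in range(8)]   # 20.8 KHz - 21.5 KHz

def encode_ids(source_id, group_id):
    """Map (source, group) identifiers to the pair of tone frequencies."""
    return SOURCE_TONES[source_id], GROUP_TONES[group_id]

def decode_tones(source_tone, group_tone):
    """Recover (source, group) identifiers from received tone frequencies."""
    return SOURCE_TONES.index(source_tone), GROUP_TONES.index(group_tone)
```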
- each sound signal source in a given area is associated with a unique identifier.
- Each sound signal source emits its group identifier, after which the individual signal sources emit their signal source identifiers in a round-robin sequence with the other signal sources within that group. For example, consider a two-story building, where signal sources on the first floor are in a different group than the signal sources on the second floor. Accordingly, the initial group identifier signal indicates the floor on which the receiving device is positioned, and the subsequent sequence of signal source identifiers indicates the signal sources on that floor from which the signals are being captured. In this manner, signal source identifiers can be shared among different floors, distinguished by the indicated group identifier.
- a signal source can be identified using a variety of detection methods.
- a brute force method obtains a full scale cross-correlation of captured sound signals with each of the N waveforms and selects the waveform with the largest correlation peak.
- waveforms are maintained in a bandwidth of 1600 Hz, providing about 150 frequency bins (e.g., sub-ranges) in the spectral domain.
- a 256-point Fast Fourier Transform (FFT) can be sufficient to perform cross-correlation. After identifying the waveform having the largest correlation peak from the FFT cross-correlation, a full length cross correlation with the reference function of the identified waveform can be performed, thereby identifying the sound signal sources based on the sound signal received.
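- The brute-force identification described above can be sketched with FFT-based cross-correlation. The function name and test signals are hypothetical; the sketch assumes NumPy and simply picks the candidate waveform whose correlation peak is largest.

```python
import numpy as np

def identify_source(block, references):
    """Brute-force identification: FFT-based cross-correlation of the
    captured block against every candidate waveform; the waveform with the
    largest correlation peak names the sound signal source."""
    n = len(block) + max(len(r) for r in references) - 1
    B = np.fft.rfft(block, n)
    best_id, best_peak = -1, 0.0
    for i, ref in enumerate(references):
        # frequency-domain correlation: multiply by the conjugate spectrum
        corr = np.fft.irfft(B * np.conj(np.fft.rfft(ref, n)), n)
        peak = float(np.max(np.abs(corr)))
        if peak > best_peak:
            best_id, best_peak = i, peak
    return best_id
```

In practice, as the text notes, a short (e.g., 256-point) FFT can pre-select the candidate before a full-length cross-correlation refines the timing.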
- Doppler extraction can be used to determine direction and/or velocity of travel of the receiving device and/or the user holding the receiving device.
- Doppler extraction involves measuring the shift of the received tone's location in the frequency domain relative to the known frequency of the embedded tones.
- the Doppler shift can be iteratively determined by shifting the spectrum of the PNC waveforms by one or more frequency bins, multiplying with the spectrum of the recorded sound signal, performing a short length (e.g., 256) inverse fast Fourier transform (IFFT), and noting the peak of the resulting cross-correlation signal.
- IFFT: inverse fast Fourier transform
- the value of the frequency bin shift that maximizes the cross-correlation peak represents a Doppler shift in the recorded sound signal.
- the procedure is repeated in two dimensions, namely PNC waveform identifiers and bin shift.
- the pair of the PNC waveform identifier and the bin shift that maximizes the cross-correlation peak yields both the PNC waveform transmitted by the sound signal source as well as the Doppler shift in the recorded sound signal.
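- The iterative bin-shift search can be sketched as a two-dimensional loop over waveform identifiers and spectral shifts. This is a demonstration under stated assumptions (NumPy, whole-bin shifts, a hypothetical function name), not the patent's implementation.

```python
import numpy as np

def doppler_search(block, references, max_shift=4):
    """2-D search over (PNC waveform id, frequency-bin shift): the pair
    that maximizes the cross-correlation peak identifies both the waveform
    transmitted and the Doppler shift of the recording."""
    B = np.fft.fft(block)
    best = (-1, 0, 0.0)  # (waveform id, bin shift, peak value)
    for i, ref in enumerate(references):
        R = np.fft.fft(ref, len(block))
        for shift in range(-max_shift, max_shift + 1):
            # shift the reference spectrum by whole frequency bins, then
            # cross-correlate via a short inverse FFT
            corr = np.fft.ifft(B * np.conj(np.roll(R, shift)))
            peak = float(np.max(np.abs(corr)))
            if peak > best[2]:
                best = (i, shift, peak)
    waveform_id, bin_shift, _ = best
    return waveform_id, bin_shift
```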
- Other Doppler extraction techniques may also be employed.
- the Doppler shift, in combination with incremental location determinations, can be used to determine the receiving device's (and/or the user's) direction and/or velocity of travel, so as to set constraints on the user's movement.
- Doppler extraction allows prediction of the receiving device's possible positions at a time after the sound signal is detected, which can be used as a constraint.
- constraints can be used to improve the positioning accuracy and/or to reduce the number of reliable sound signals required to accurately determine location.
- FIG. 4 illustrates example operations 400 for sound-based positioning.
- a receiving operation 402 identifies an environment, such as a store, and receives a map of the environment, including signal source locations, their identifiers, their group identifiers, and other environmental constraints (e.g., where the receiving device can realistically be positioned).
- the receiving operation 402 executes when a sound-based positioning application is initiated on a receiving device.
- the receiving operation 402 detects that the receiving device has entered a known environment (e.g., based on a last known GPS position, based on a recognized Wi-Fi router MAC address, based on user input), and retrieves the map from its own storage or from an external data source (e.g., a Wi-Fi connected service).
- a capture operation 404 captures a sound signal associated with an identifiable signal source.
- Each signal source emits a sound signal in its own timeslot according to a signaling protocol, such as those described with regard to the small, medium, large, and mega scale environments above.
- the received signal is processed to identify the signal source and evaluate its reliability. For example, using cross-correlation of the captured signal with each of the waveforms supported in the environment, the capture operation 404 can select the waveform that yields the largest correlation peak to identify the sound signal source of the captured sound signal.
- a decision operation 406 determines whether the captured sound signal is reliable (e.g., a direct sound signal of sufficient strength to be accurately decoded). If not, the captured signal is ignored and a new sound signal is captured in the capture operation 404. Otherwise, a determining operation 408 determines the capture timestamp and the identity of the associated sound signal source using the cross-correlation result.
- Another decision operation 410 determines whether a sufficient number of fresh, reliable sound signals have been captured.
- a reliability condition having one or more components is applied against the captured sound signals. For example, one component may consider the shape of the sound signal waveform to assist in evaluating whether the captured sound signal is direct or reflected. A direct sound signal tends to have different characteristics than a reflected sound signal, which can be discerned using various techniques including without limitation cross-correlation, Doppler extraction, etc.
- Another component may evaluate the sound signal captured during the same timeslot in each cycle of sound signals. If the sound signals captured during the same time slot do not cross-correlate well with each other, it may be determined that one or more of the sound signals captured during those timeslots are reflected and therefore not reliable.
- Yet another component may consider whether the multilateration converges at or very close to a single point. Divergence in the multilateration solution may indicate that one of the component sound signals is not direct and therefore not reliable.
- Other components may also be employed in the reliability condition.
- a sound signal may be deemed "stale" or "not fresh enough" if the tracking time between a first sound signal and a last sound signal is greater than a certain threshold (e.g., in seconds). For example, as sound signals are received from various sound signal sources in a sequence of time slots, collecting a sufficient number of reliable sound signals to multilaterate may occur over a period of many time slots (e.g., enough for the receiving device to have moved a non-negligible distance between the first reliable sound signal and the last reliable sound signal). Accordingly, the accuracy of the position computation may be impaired if the receiving device moves too far during this "tracking" time.
- the decision operation 410 determines whether a sound signal block is stale (e.g., too old to contribute to an accurate position computation) and, therefore, unreliable. Stale sound signal blocks can be ignored.
- the determination about whether a sound signal is stale can be informed by a variety of factors (e.g., whether the user exhibits a Doppler shift indicating movement sufficient to cause a non-trivial change in position during the tracking time, whether the identity of direct and blocked signal sources changes, heuristics pertaining to shopper movement, etc.). For example, if the user does not show significant velocity based on a Doppler shift measurement of reflected sound signals, if the direct and blocked signal sources remain unchanged during the tracking time, and/or if shopper locations are statistically known to be acceptably accurate when the tracking time does not exceed a tracking threshold, then the sound signal may be deemed sufficiently "fresh."
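- A minimal sketch of such a freshness test might look like this; the tracking threshold, the data layout, and the function name are assumptions for illustration only.

```python
def fresh_sources(captures, tracking_threshold=2.0):
    """captures: list of (source_id, timestamp_seconds) for reliable
    detections, in arrival order. Keep the latest capture per source, and
    accept the set only if the span from oldest to newest retained capture
    stays under the tracking threshold (i.e., the receiver cannot have
    moved far while the set was being collected)."""
    latest = {}
    for source_id, t in captures:
        latest[source_id] = t  # later captures of a source supersede earlier
    times = sorted(latest.values())
    if times and times[-1] - times[0] <= tracking_threshold:
        return sorted(latest)
    return []  # set is stale; wait for fresher captures
```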
- environmental constraints may also be applied to reduce the number of fresh, reliable sound signals needed to accurately compute position based on DTOA. For example, if three fresh, reliable sound signals are captured (when a three-dimensional multilateration typically calls for four reliable sound signals) and the receiving device is assumed to be within a predefined height range, then two-dimensional positioning may be sufficient and a fourth fresh, reliable sound signal is not required. Likewise, if shoppers are assumed to be in the center of an aisle within acceptable tolerance, then the number of fresh, reliable sound signals required for accurate positioning may be reduced. Other environmental constraints may likewise reduce the number of fresh, reliable sound signals required and/or improve the accuracy and reliability of sound-based positioning results.
- Yet another environmental constraint that may be employed to reduce the number of fresh, reliable sound signals needed relates to a sequence of position results over a period of time. If the receiving device maintains a list of its most recent positions (and potentially, its velocities), it may assume that its direction (and/or velocity) of travel will not have changed more than a certain amount between position computations. For example, if several position results indicate that the receiving device has been moving north at one mile per hour in immediately previous cycles, then a constraint may be assumed that prevents the next computed position from falling outside a certain range of the previous position result.
- Yet another environmental constraint may be the knowledge of how far a user may have traveled in a given time period, assuming either the fastest velocity a user can achieve or a typical fastest velocity of a casual user in an environment (e.g., a store).
- the distance traveled may additionally take into account the constraints of walking in the real physical environment such as along the user pathways as opposed to jumping over shelves.
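- A movement-plausibility constraint of this kind can be sketched by gating a candidate fix on the distance a user could plausibly travel in the elapsed time; the maximum speed and the function name here are assumptions for illustration.

```python
import math

MAX_SPEED = 2.0  # m/s; assumed typical fastest casual walking speed

def plausible_fix(prev_position, candidate_position, elapsed_seconds):
    """Movement constraint: reject a candidate position that would require
    the user to have moved faster than is physically plausible since the
    previous position result."""
    travelled = math.dist(prev_position, candidate_position)
    return travelled <= MAX_SPEED * elapsed_seconds
```

A fuller version would measure distance along permitted pathways (aisles) rather than straight-line distance, matching the point about not jumping over shelves.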
- a positioning operation 412 uses multilateration to determine the position of the receiving device relative to the known locations of the sound signal sources, based on the fresh, reliable sound signals captured by the receiving device, the DTOA of the captured signals, the received map, and the sound signal source locations and identifiers.
- a presentation operation 414 presents an indication of the computed position within a map on the user interface.
- FIG. 5 illustrates example operations 500 for sound-based positioning using differential time of arrival and non-differential time of arrival.
- a determination operation 502 determines an initial position of the receiving device using differential time of arrival from x sound signal sources, in a process similar to that described with regard to FIG. 4 .
- a timing operation 504 determines a time reference based on the initial position and the distances between the initial position and each of the sound signal sources emitting the captured reliable sound signals. Given these distances, the transmission time of each sound signal can be computed, thereby yielding a timing reference for each sound signal.
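- The timing operation 504 can be sketched as follows (a simplified two-dimensional illustration with assumed names; the speed of sound is taken as 343 m/s): once an initial position is known, each source's emission time can be inferred, and later arrivals then yield absolute ranges.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, assumed value at room temperature

def timing_references(initial_xy, source_positions, arrival_times):
    """Infer when each source's signal was emitted: t_emit = t_arrival - d/c,
    where d is the distance from the initial position to that source."""
    refs = {}
    for sid, (sx, sy) in source_positions.items():
        d = math.hypot(initial_xy[0] - sx, initial_xy[1] - sy)
        refs[sid] = arrival_times[sid] - d / SPEED_OF_SOUND
    return refs

def toa_range(emit_time_ref, arrival_time):
    """With a timing reference, a single arrival time gives an absolute
    range to the source (non-differential TOA)."""
    return (arrival_time - emit_time_ref) * SPEED_OF_SOUND
```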
- a decision operation 506 determines whether an insufficient number of fresh, reliable sound signals for non-differential time of arrival positioning has been captured.
- the timing reference for each sound signal makes it possible to perform non-differential TOA measurements, thereby reducing the number of reliable sound signals needed to accurately determine the position of the receiving device relative to emitting sound signal sources.
- Another determining operation 508 determines a subsequent location based on non-differential TOA measurements and the timing references.
- Another decision operation 510 retests the number of fresh, reliable sound signals captured in a subsequent cycle to determine whether non-differential time of arrival positioning may still be accurately computed, in which case processing proceeds to the determining operation 508. Otherwise, processing proceeds to the determination operation 502 to determine a new position using DTOA.
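- The mode selection implied by operations 502-510 can be summarized in a small sketch (the thresholds are illustrative assumptions; the disclosure only requires that non-differential TOA needs fewer reliable signals than DTOA):

```python
# Assumed minimum source counts; not specified numerically in the text.
MIN_SOURCES_DTOA = 4  # e.g., three-dimensional multilateration
MIN_SOURCES_TOA = 3   # fewer suffice once timing references exist

def choose_mode(n_fresh_reliable, have_timing_refs):
    """Pick the positioning mode for the next cycle."""
    if have_timing_refs and n_fresh_reliable >= MIN_SOURCES_TOA:
        return "toa"
    if n_fresh_reliable >= MIN_SOURCES_DTOA:
        return "dtoa"
    return "insufficient"
```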
- FIG. 6 illustrates another example system (labeled as a mobile device 600) that may be useful in implementing the described technology.
- the mobile device 600 includes a processor 602, a memory 604, a display 606 (e.g., a touchscreen display), and other interfaces 608 (e.g., a keyboard).
- the memory 604 generally includes both volatile memory (e.g., RAM) and non-volatile memory (e.g., flash memory).
- An operating system 610, such as the Microsoft Windows® Phone 7 operating system, resides in the memory 604 and is executed by the processor 602, although it should be understood that other operating systems may be employed.
- One or more application programs 612 are loaded in the memory 604 and executed on the operating system 610 by the processor 602. Examples of applications 612 include without limitation email programs, scheduling programs, personal information managers, Internet browsing programs, multimedia player applications, etc.
- a notification manager 614 is also loaded in the memory 604 and is executed by the processor 602 to present notifications to the user. For example, when a promotion is triggered and presented to the shopper, the notification manager 614 can cause the mobile device 600 to beep or vibrate (via the vibration device 618) and display the promotion on the display 606.
- the mobile device 600 includes a power supply 616, which is powered by one or more batteries or other power sources and which provides power to other components of the mobile device 600.
- the power supply 616 may also be connected to an external power source that overrides or recharges the built-in batteries or other power sources.
- the mobile device 600 includes one or more communication transceivers 630 to provide network connectivity (e.g., mobile phone network, Wi-Fi®, Bluetooth®, etc.).
- the mobile device 600 also includes various other components, such as a positioning system 620 (e.g., a global positioning satellite transceiver), one or more accelerometers 622, one or more cameras 624, an audio interface 626 (e.g., a microphone, an audio amplifier and speaker and/or audio jack), and additional storage 628. Other configurations may also be employed.
- a sound-based positioning application may be embodied by instructions stored in memory 604 and/or storage devices 628 and processed by the processing unit 602. Sound signal blocks, positions, floor plan maps, respective timings, and other data may be stored in memory 604 and/or storage devices 628 as persistent datastores. It should be understood that device storage may be local (e.g., flash memory or a magnetic storage device) or remote (e.g., via a network-attached storage device, such as a DVD, a CD, or a magnetic storage device).
- An article of manufacture may comprise a storage medium to store logic.
- Examples of a storage medium may include one or more types of computer-readable storage media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth.
- Examples of the logic may include various software elements, such as software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof.
- an article of manufacture may store executable computer program instructions that, when executed by a computer, cause the computer to perform methods and/or operations in accordance with the described embodiments.
- the executable computer program instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, and the like.
- the executable computer program instructions may be implemented according to a predefined computer language, manner or syntax, for instructing a computer to perform a certain function.
- the instructions may be implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language.
- the embodiments of the invention described herein are implemented as logical steps in one or more computer systems.
- the logical operations of the present invention are implemented (1) as a sequence of processor-implemented steps executing in one or more computer systems and (2) as interconnected machine or circuit modules within one or more computer systems.
- the implementation is a matter of choice, dependent on the performance requirements of the computer system implementing the invention. Accordingly, the logical operations making up the embodiments of the invention described herein are referred to variously as operations, steps, objects, or modules.
- logical operations may be performed in any order, unless explicitly claimed otherwise or a specific order is inherently necessitated by the claim language.
Description
- Accurately determining the position of a user or mobile device within an indoor setting presents various challenges. For example, global positioning systems (GPS) technologies do not work well within an enclosed building, where the mobile device's communications with the GPS satellites can be impeded by the surrounding structure. Further, available consumer devices are limited in their communication capabilities, sensing capabilities (e.g., mobile device microphones), the accuracy of their internal clocks, available power, etc. Accordingly, obtaining highly accurate, real-time location information on a mobile user within enclosed buildings (or where GPS positioning is otherwise unavailable) is difficult without nontrivial modifications to the hardware of available mobile devices.
WO 01/34264 A1 -
EP 0 492 015 A1 discloses a method and apparatus for navigating an automatic guided vehicle. - Implementations described and claimed herein address the foregoing problems by using a receiving device to capture sound signals (e.g., ultrasonic) from multiple sound signal sources, selecting the sound signals satisfying a reliability condition for use in determining an initial position of the receiving device relative to the corresponding sound signal sources, determining the initial position of the receiving device using multilateration of the selected sound signals, and updating the current position of the receiving device as the reliability of individual sound signals varies in the presence of dynamically changing environmental interference, multipathing, and movement between the receiving device and the sound signal sources.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
- Other implementations are also described and recited herein.
FIG. 1 illustrates an example scenario employing sound-based positioning. -
FIG. 2 illustrates another example scenario employing sound-based positioning. -
FIG. 3 illustrates an example data flow diagram for sound-based positioning. -
FIG. 4 illustrates example operations for sound-based positioning. -
FIG. 5 illustrates example operations for sound-based positioning using differential time of arrival and non-differential time of arrival. -
FIG. 6 illustrates another example system that may be useful in implementing the described technology. - Some modern mobile devices, such as smart phones, include microphones capable of detecting ultrasonic signals, which presents opportunities for using consumer-grade mobile devices to perform ultrasonic-based positioning. The ultrasonic bandwidth that is detectable by such devices is currently rather narrow (e.g., between 20 KHz and 22 KHz). Nevertheless, ultrasonic signals can be played within this limited bandwidth while providing sufficient information to allow the position of a receiving device to be determined relative to the ultrasonic signal sources.
- One implementation, for example, includes sound signal sources (e.g., speakers) distributed throughout a given area. The sound signals emitted by the sound signal sources are received by one or more receiving devices (e.g., mobile devices having microphones capable of accurately capturing ultrasonic sound signals), which use the received sound signals to compute a location within the given area. A receiving device can determine its initial position from the received sound signals using multilateration, a process of determining a position of a receiving device based on accurately computing the time difference of arrival (TDOA) of signals transmitted from multiple sound signal sources having known locations. In this manner, multilateration can be used to determine a position of a receiving device relative to a number of sound signal sources. Thereafter, given the initial position of the receiving device, it is possible to derive a time reference for each of the sound signals and therefore continue updating the position of the receiving device using non-differential time of arrival (TOA) measurements, particularly as the number of reliable sound signals drops (e.g., from changing obstructions and interference between a signal source and a receiving device).
- Generally, one implementation of multilateration involves a receiving device that receives sound signals from multiple sound signal sources at known locations. Differences in the time of arrival of each sound signal, which can be normalized based on known transmission timeslots, are used to determine differences in the distances between the receiving device and each sound signal source. With two sound signal sources, the receiving device can be located on a hyperboloid. With three sound signal sources, the receiving device can be located on a second hyperboloid, wherein the intersection of the two hyperboloids describes a curve on which the receiving device lies. By adding a fourth sound signal source, the receiving device can be located on a third hyperboloid, wherein the intersection of the three hyperboloids defines a unique point in three-dimensional space.
- It should be understood, however, that errors in the measurement of the time of arrival of sound signals can degrade the accuracy of the position computation (e.g., the hyperboloids computed based on the received sound signals rarely intersect at an exact point in space). Accordingly, additional sound signal sources and/or optimization techniques (e.g., a least squares method or an extended Kalman filter) can be applied to improve the accuracy of computed positioning results.
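- As a toy illustration of TDOA-based multilateration (not the optimization the disclosure uses; this is a brute-force two-dimensional grid search with assumed geometry), the receiving device's position can be estimated as the point whose predicted arrival-time differences best match the measured ones:

```python
import itertools
import math

SPEED_OF_SOUND = 343.0  # m/s, assumed

def tdoa_residual(xy, speakers, arrivals):
    """Sum of squared differences between measured and predicted TDOAs,
    using the first speaker as the common reference."""
    dists = [math.hypot(xy[0] - sx, xy[1] - sy) for sx, sy in speakers]
    err = 0.0
    for i in range(1, len(speakers)):
        measured = arrivals[i] - arrivals[0]
        predicted = (dists[i] - dists[0]) / SPEED_OF_SOUND
        err += (measured - predicted) ** 2
    return err

def locate(speakers, arrivals, step=0.05, extent=10.0):
    """Brute-force search over a square area for the best-fitting point."""
    grid = [i * step for i in range(int(extent / step) + 1)]
    return min(itertools.product(grid, grid),
               key=lambda xy: tdoa_residual(xy, speakers, arrivals))
```

In practice a least squares solver or extended Kalman filter, as noted above, replaces the grid search, but the residual being minimized has the same form.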
FIG. 1 illustrates an example scenario 100 employing sound-based positioning. A shopper 102 (an example user) is carrying a mobile phone executing a positioning application as he moves through a store 104 (an example environment). The body of the shopper 102 is directed toward the northeast of the store 104. Multiple sound signal sources (e.g., a speaker 106) are positioned throughout the store 104, each sound signal source emitting a sound signal in its own time slot (e.g., in a round-robin fashion). Each sound signal can be received by an audio input (e.g., a microphone) of a receiving device (e.g., the mobile phone), provided the sound signal is strong enough to be captured by the receiving device. For example, an ultrasonic sound signal emitted by the speaker 110 is captured by the receiving device, but the sound signal emitted by the speaker 116 is not strong enough to be captured by the receiving device by virtue of the speaker's distance from the receiving device. Furthermore, some sound signals are not received directly by the receiving device. For example, the body of the shopper 102 blocks the direct sound signal path between the speaker 114 and the receiving device. Nevertheless, the receiving device may receive reflections of the sound signal from the speaker 114 off of the surrounding structures, such as the shelves, ceilings, and floors in the store. Reflected signals introduce additional distance along the sound signal path and therefore do not provide an accurate time of arrival measurement (without some type of normalization). In one implementation, the receiving device discerns between direct sound signals and reflected sound signals in order to omit the reflected sound signals from the positioning computation, although other implementations may be employed to account for certain types of reflections.
For example, in one implementation, known geometric relationships between a sound signal source and a reflecting surface and between a receiving device and a reflecting surface may be used to calculate the physical distance between the sound signal source and the receiving device along the reflection path and, therefore, the direct physical distance between the sound signal source and the receiving device. - In another implementation, a sound-based positioning system can compute the set of possible intersections (positions) indicated by a number of captured sound signals, whether they are reflected or direct signals. In some circumstances, the set of positions made possible by the multiple captured signals can be narrowed by other constraints to suggest the most reliable signal sources to be used for the most accurate positioning.
- As shown in FIG. 1 , a sound-based positioning application executing on the receiving device carried by the shopper 102 receives direct sound signals from some of the speakers and reflected sound signals from others. By applying environmental constraints (e.g., the shopper 102 cannot be positioned on top of a shelf or outside the store 104) or positional approximations (e.g., the shopper 102 is assumed to be positioned in the middle of an aisle), the reliability and accuracy of the positioning computation can be enhanced and/or the number of signal sources required for the positioning computation can be reduced (such that a subset of previously used sound signal sources are used). - Once an initial position of the
shopper 102 is determined using DTOA measurements and multilateration, the sound-positioning application can compute a timing reference for each sound signal (e.g., the times the corresponding sound signal source transmitted and/or stopped transmitting, based on the known distance between the sound signal source and the receiving device). Given these references, the sound-positioning application can switch to non-differential TOA measurements, allowing accurate positioning using fewer sound signal sources than were used in the multilateration stage. Accordingly, as the shopper 102 moves about the store, some previously direct sound signals will become blocked by the shopper's body, other shoppers, shelves, signage, etc. Nevertheless, the sound-positioning application can continue to capture sound signals from various sound signal sources throughout the environment and accurately determine the shopper's position, even as the number of reliable sound signals varies. -
FIG. 2 illustrates another example scenario 200 employing sound-based positioning in which a shopper 202 is located at a different location and orientation relative to multiple signal sources (e.g., a speaker 206) distributed throughout a store 204, each signal source emitting a sound signal that can be received by an audio input (e.g., a microphone) of a receiving device (e.g., the mobile phone). As with the scenario 100 shown in FIG. 1 , the signal strength in the scenario 200 can affect which sound signals are captured by the receiving device. Furthermore, some sound signals are not received directly by the receiving device. For example, in contrast to FIG. 1 , the body of the shopper 202 is turned toward the southeast of the store 204. Accordingly, the shopper's body blocks the direct sound signal paths between the receiving device and certain of the speakers. - As shown in
FIG. 2 , the receiving device carried by the shopper 202 receives direct sound signals from a different set of speakers than in the scenario 100 of FIG. 1 . -
FIGs. 1 and 2 , movement of a receiving device throughout a given area can alter in real-time the signal sources upon which the receiving device can base a positioning computation. Furthermore, the existence of reflected sound signals within the environment further complicates the selection of reliable sound signals suitable for use in positioning computations. Accordingly, the receiving device and/or a positioning system upon which it relies filters out unreliable sound signals in the environment and excludes their signal sources from positioning computations. -
FIG. 3 illustrates an example data flow diagram 300 for sound-based positioning. A receiving device, such as a mobile phone, executes an operating system 302 to manage its resources and provide a platform upon which a sound-based positioning application can be run. For example, in one implementation, a mobile device executes an operating system with an audio interface that manages the audio resources of the mobile device, such as a microphone 303 and one or more speakers (not shown), and executes a mobile positioning application capable of receiving sound signals from multiple sound signal sources positioned throughout an area (e.g., a store, a warehouse, a manufacturing floor, an office building, etc.). It should be understood that, although the described technology is suitable for indoor use where standard GPS signals are blocked by an enclosing structure, the described technology may also be used in outdoor areas and may be used in combination with GPS and Wi-Fi technology. - When the sound-based positioning application is executing, a
recorder 304, such as a processor-executable software facility, records and digitizes sound signals 305 captured by the microphone 303. The recorder 304 stores the digitized sound signals into a recorder process queue 306, where the recorded sound signal is split into sound signal blocks for streaming Fast Fourier Transform (FFT) processing. In one implementation, the block size is on the order of 2048 audio samples long, although other block sizes may be employed. - The
recorder process queue 306 spawns a worker thread 308 that processes the sound signal blocks that are dequeued from the recorder process queue 306 for asynchronous processing. The worker thread 308 executes a peak finder 310, which processes each sound signal block processed by the worker thread 308. The peak finder 310 employs a cross-correlation manager 312 and a cross-correlator 314 to cross-correlate each sound signal block with a known transmitted signal to identify strong peaks in the correlated output. Cross correlation refers to a measure of similarity between two waveforms. One implementation, for example, delays one of the waveforms and then multiplies the waveforms together. By finding a correlation peak that has a particular shape and is above a predetermined threshold (e.g., a reliability condition) when compared to other correlation results, the positioning system omits signals that are excessively delayed by reflections (e.g., which result in a longer path that is detectable at the speed of sound). The cross-correlation manager 312 provides a housekeeping function to the cross-correlator 314, which performs the cross-correlation operations and outputs cross-correlated data 316 as a queue of processed (e.g., cross-correlated) sound signals (e.g., correlation peaks). - The
cross-correlated data 316 is processed by a signal source processor 318, which identifies the sound signal sources (e.g., speakers) from which the sound signals are received and the position of the signal sources. A source finder 320 of the signal source processor 318 determines an identification number of each correlation peak, thereby associating the correlation peak with a known signal source. In one implementation, signal sources are associated in signal source groups (e.g., between 8 and 16 speakers in one signal source group). A group identifier 322 identifies a signal source group in which the identified signal source is a member. A position recorder 324 associates the found signal source identifier with the corresponding audio timing of the sound signal (e.g., when the sound signal started relative to the start of the sound signal block). - A
multilateration processor 326 receives a set of identified signal sources and their respective timings, the set of known signal source locations, and any geometric constraints (e.g., aisles in a store) and estimates the position of the receiving device from this data using a multilateration operation. A callback utility 328 asynchronously calls the sound locator 330, which translates the user location and relative X/Y/Z coordinates into the coordinate system of the application floor plan map for presentation to a user via a user interface 332 (e.g., which can display the floor plan map and user location on a display). - A variety of signal source identification schemes may be employed depending on the scale of the environment (e.g., the number of signal sources, the number of receiving devices supported simultaneously, and other environmental factors), although other schemes are contemplated beyond those disclosed herein. For example, in a small-scale environment (e.g., 8-16 signal sources), a tone can be used to identify an individual signal source. In one implementation, each signal source is identified by a locally unique tone drawn from a set of tones spaced at intervals of 100 Hz from 20.1 KHz to 21.6 KHz. In one example, each signal source emits a waveform including a chirp and a tone, such that, for example:
- In a medium-scale environment (e.g., 16-64 signal sources), for example, three approaches are described below, although others may also be used. In one approach, 16 tones are used to identify an individual signal source. The signal sources are divided into 2 groups of signal sources, scaling to up to 32 signal sources, such that, for example:
- In another approach for a medium-scale environment, 16 tones are again used to identify a single signal source. The signal sources are divided into 4 groups of signal sources, scaling to up to 64 signal sources, such that, for example:
- In yet a third approach for a medium-scale environment, the utilized bandwidth is again split into two frequency ranges (e.g., 20 KHz to 20.8 KHz and 20.8 KHz to 21.6 KHz, both at 100 Hz intervals) to scale to 64 signal sources. A tone is encoded in the first frequency range to indicate one of 8 signal sources and another tone is encoded in the second frequency range to indicate one of 8 groups, such that, for example:
- For a large-scale environment (e.g., 64-256 signal sources), for example, three approaches are described below, although others may also be used. In one approach, 16 pseudo-noise coded (PNC) waveforms are used to encode 16 signal source identifiers, each waveform spanning a frequency range (e.g., 20.0 KHz to 21.6 KHz). In addition, 16 tones distributed over a frequency range (e.g., 20.0 KHz to 21.6 KHz) are used to encode 16 group identifiers, such that, for example:
- In another approach for a large-scale environment, the utilized bandwidth is again split into two frequency ranges (e.g., 19.0 KHz to 20.4 KHz with a center frequency (fc) at 19.7 KHz and 20.4 KHz to 21.8 KHz with a center frequency (fc) at 21.1 KHz). In the first band, 16 PNC waveforms are used to encode 16 group identifiers. In the second band, 16 PNC waveforms are used to encode 16 signal source identifiers in a second group, such that, for example:
- In yet another approach for a large-scale environment, the utilized bandwidth is again split into two frequency ranges (e.g., 20 KHz to 20.8 KHz and 20.8 KHz to 21.6 KHz, both at 100 Hz intervals) to scale to 256 signal sources. A tone is encoded in the first frequency range to indicate one of 8 signal sources and another tone is encoded in the second frequency range to indicate one of 8 groups. In addition, the signal sources are divided into 4 super-groups of signal sources, such that, for example:
- For a mega-scale environment (e.g., 256-1024 signal sources), 16 pseudo-noise coded (PNC) waveforms are used to encode 16 signal source identifiers, each waveform spanning a frequency range (e.g., 20.0 KHz to 21.6 KHz). In addition, 16 tones distributed over a frequency range (e.g., 20.0 KHz to 21.6 KHz) are used to encode 16 group identifiers, such that, for example:
- Another option for expanding the scope of the supported environment involves designating each sound signal source in a given area with a unique identifier. Each sound signal source emits its group identifier, after which the individual signal sources emit their signal source identifiers in a round-robin sequence with the other signal sources within that group. For example, consider a two-story building, where signal sources on the first floor are in a different group than the signal sources on the second floor. Accordingly, the initial group identifier signal indicates the floor on which the receiving device is positioned, and the subsequent sequence of signal source identifiers indicates the signal sources on that floor from which the signals are being captured. In this manner, signal source identifiers can be shared among different floors, distinguished by the indicated group identifier.
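- The two-band tone scheme described above for medium- and large-scale environments (one tone selecting a signal source, a second tone selecting a group) can be sketched as follows; the exact tone frequencies here are an assumed reading of "100 Hz intervals" in the two bands, not the disclosure's published assignments:

```python
import numpy as np

FS = 48_000          # assumed sample rate capable of capturing ~21.6 KHz
N = 4_800            # 100 ms window -> exact 10 Hz FFT bins

def source_tone(i):  # i in 0..7 -> 20.0 .. 20.7 KHz
    return 20_000 + 100 * i

def group_tone(g):   # g in 0..7 -> 20.8 .. 21.5 KHz
    return 20_800 + 100 * g

def emit(i, g):
    """One tone in each band identifies the (source, group) pair."""
    t = np.arange(N) / FS
    return (np.sin(2 * np.pi * source_tone(i) * t)
            + np.sin(2 * np.pi * group_tone(g) * t))

def decode(x):
    """Recover (source index, group index) from the strongest tone per band."""
    mag = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, 1 / FS)
    def strongest(lo, hi):
        band = (freqs >= lo) & (freqs < hi)
        return freqs[band][np.argmax(mag[band])]
    i = round((strongest(19_950, 20_750) - 20_000) / 100)
    g = round((strongest(20_750, 21_550) - 20_800) / 100)
    return int(i), int(g)
```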
- Using these or other encoding schemes, a signal source can be identified using a variety of detection methods. In one approach, a brute force method obtains a full scale cross-correlation of captured sound signals with each of the N waveforms and selects the waveform with the largest correlation peak. In another approach, waveforms are maintained in a bandwidth of 1600 Hz, providing about 150 frequency bins (e.g., sub-ranges) in the spectral domain. A 256-point Fast Fourier Transform (FFT) can be sufficient to perform cross-correlation. After identifying the waveform having the largest correlation peak from the FFT cross-correlation, a full-length cross-correlation with the reference function of the identified waveform can be performed, thereby identifying the sound signal source based on the sound signal received.
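- A minimal, self-contained illustration of the cross-correlation step (the chirp parameters, noise level, and threshold factor are assumptions for the sketch): the known transmitted waveform is slid across a recorded block, and a sufficiently dominant correlation peak marks both the presence of the waveform and its arrival sample:

```python
import numpy as np

rng = np.random.default_rng(0)
FS = 48_000
t = np.arange(0, 0.01, 1 / FS)                      # 10 ms probe
# Chirp whose instantaneous frequency sweeps roughly 20.0 -> 21.6 KHz.
probe = np.sin(2 * np.pi * (20_000 + 80_000 * t) * t)

block = rng.normal(0.0, 0.1, 2048)                  # noisy recorded block
true_offset = 700                                   # arrival sample (assumed)
block[true_offset:true_offset + probe.size] += probe

corr = np.correlate(block, probe, mode="valid")     # slide probe over block
peak_index = int(np.argmax(np.abs(corr)))
peak_value = float(np.abs(corr)[peak_index])
# Crude reliability condition: peak must dominate the average correlation.
reliable = peak_value > 5.0 * float(np.abs(corr).mean())
```

The peak index recovers the arrival time of the probe within the block; weak or reflection-smeared peaks fail the threshold and would be discarded.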
- In the case of waveforms with embedded tones, Doppler extraction can be used to determine direction and/or velocity of travel of the receiving device and/or the user holding the receiving device. In one implementation, Doppler extraction involves a measure of the shift of the location of the tone in the frequency domain compared to the frequency location of the embedded tones.
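- A sketch of tone-based Doppler extraction (the nominal tone, window length, and interpolation method are assumptions): measure the received tone's peak frequency, compare it with the nominal embedded frequency, and convert the shift to a radial velocity via v = c * df / f0:

```python
import numpy as np

FS = 48_000
C = 343.0              # speed of sound, m/s (assumed)
F_NOMINAL = 20_500.0   # assumed embedded tone frequency

def observed_tone_freq(x):
    """Peak frequency of a Hann-windowed FFT, refined by parabolic
    interpolation around the strongest bin."""
    mag = np.abs(np.fft.rfft(x * np.hanning(x.size)))
    k = int(np.argmax(mag))
    a, b, c3 = mag[k - 1], mag[k], mag[k + 1]
    delta = 0.5 * (a - c3) / (a - 2 * b + c3)   # sub-bin offset
    return (k + delta) * FS / x.size

def radial_velocity(x, f_nominal=F_NOMINAL):
    """Doppler shift of the embedded tone, converted to radial velocity."""
    df = observed_tone_freq(x) - f_nominal
    return C * df / f_nominal
```

For a user moving toward the source at about 1.5 m/s, the 20.5 KHz tone arrives roughly 90 Hz high, and the shift-to-velocity conversion recovers the speed.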
- In another implementation involving a pseudo-noise code (PNC) waveform, the Doppler shift can be iteratively determined by shifting the spectrum of the PNC waveforms by one or more frequency bins, multiplying with the spectrum of the recorded sound signal, performing a short length (e.g., 256) inverse fast Fourier transform (IFFT), and noting the peak of the resulting cross-correlation signal. The value of the frequency bin shift that maximizes the cross-correlation peak represents a Doppler shift in the recorded sound signal.
- In yet another implementation involving multiple PNC waveforms, the procedure is repeated in two dimensions, namely PNC waveform identifiers and bin shift. The pair of the PNC waveform identifier and the bin shift that maximizes the cross-correlation peak yields both the PNC waveform transmitted by the sound signal source as well as the Doppler shift in the recorded sound signal.
- Other Doppler extraction techniques may also be employed. As described, the Doppler shift, in combination with incremental location determinations, can be used to determine the receiving device's (and/or the user's) direction and/or velocity of travel, so as to set constraints on the user's movement. As such, Doppler extraction allows prediction of the receiving device's possible positions at a time after which the sound signal is detected, which can be used as a constraint. Such constraints can be used to improve the positioning accuracy and/or to reduce the number of reliable sound signals required to accurately determine location.
FIG. 4 illustrates example operations 400 for sound-based positioning. A receiving operation 402 identifies an environment, such as a store, and receives a map of the environment, including signal source locations, their identifiers, their group identifiers, and other environmental constraints (e.g., where the receiving device can realistically be positioned). In one implementation, the receiving operation 402 executes when a sound-based positioning application is initiated on a receiving device. In an alternative implementation, the receiving operation 402 detects that the receiving device has entered a known environment (e.g., based on a last known GPS position, based on a recognized Wi-Fi router MAC address, based on user input), and retrieves the map from its own storage or from an external data source (e.g., a Wi-Fi connected service). - A
capture operation 404 captures a sound signal associated with an identifiable signal source. Each signal source emits a sound signal in its own timeslot according to a signaling protocol, such as those described with regard to the small, medium, large, and mega scale environments above. The received signal is processed to identify the signal source and evaluate its reliability. For example, using cross-correlation of the captured signal with each of the waveforms supported in the environment, the capture operation 404 can select the waveform that yields the largest correlation peak to identify the sound signal source of the captured sound signal. - A
decision operation 406 determines whether the captured sound signal is reliable (e.g., a direct sound signal of sufficient strength to be accurately decoded). If not, the captured signal is ignored and a new sound signal is captured in the capture operation 404. Otherwise, a determining operation 408 determines the capture timestamp and the identity of the associated sound signal source using the cross-correlation result. - Another
decision operation 410 determines whether a sufficient number of fresh, reliable sound signals have been captured. A reliability condition having one or more components is applied against the captured sound signals. For example, one component may consider the shape of the sound signal waveform to assist in evaluating whether the captured sound signal is direct or reflected. A direct sound signal tends to have different characteristics than a reflected sound signal, which can be discerned using various techniques including without limitation cross-correlation, Doppler extraction, etc. Furthermore, the component may evaluate a sound signal captured during the same timeslot in each cycle of sound signals. If the sound signals captured during the same timeslot do not cross-correlate well with each other, it may be determined that one or more of the sound signals captured during those timeslots are reflected and therefore not reliable. Yet another component may consider whether the multilateration converges at or very close to a single point. Divergence in the multilateration solution may indicate that one of the component sound signals is not direct and therefore not reliable. Other components may also be employed in the reliability condition. - In another perspective, a sound signal may be deemed "stale" or "not fresh enough" if the tracking time between a first sound signal and a last sound signal is greater than a certain threshold (e.g., in seconds). For example, as sound signals are received from various sound signal sources in a sequence of time slots, collecting a sufficient number of reliable sound signals to multilaterate may occur over a period of many time slots (e.g., enough for the receiving device to have moved a non-negligible distance between the first reliable sound signal and the last reliable sound signal). Accordingly, the accuracy of the position computation may be impaired if the receiving device moves too far during this "tracking" time. 
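The waveform-matching step described above (picking the source whose reference waveform yields the largest cross-correlation peak) can be sketched as follows; the synthetic waveforms and the NumPy-based correlation are assumptions for illustration, not the patented implementation.

```python
import numpy as np

def identify_source(captured, waveforms):
    """Return the id of the reference waveform with the largest
    cross-correlation peak against the captured samples, plus that peak."""
    best_id, best_peak = None, float("-inf")
    for source_id, reference in waveforms.items():
        # Full cross-correlation; the peak magnitude indicates how well
        # this source's waveform matches the captured block.
        corr = np.correlate(captured, reference, mode="full")
        peak = float(np.max(np.abs(corr)))
        if peak > best_peak:
            best_id, best_peak = source_id, peak
    return best_id, best_peak
```

A direct signal tends to produce a single sharp correlation peak, while reflections add smaller, delayed peaks, which suggests one way a reliability component could discriminate between them.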
In one implementation, the
decision operation 410 determines whether a sound signal block is stale (e.g., too old to contribute to an accurate position computation) and, therefore, unreliable. Stale sound signal blocks can be ignored. - Furthermore, the determination about whether a sound signal is stale can be informed by a variety of factors (e.g., whether the user exhibits a Doppler shift indicating movement that causes a non-trivial change in position during the tracking time, whether the identity of direct and blocked signal sources changes, heuristics pertaining to shopper movement, etc.). For example, if the user does not show significant velocity based on a Doppler shift measurement of reflected sound signals, if the direct and blocked signal sources remain unchanged during the tracking time, and/or if the locations of shoppers are statistically known to be acceptably accurate when the tracking time does not exceed a tracking threshold, then the sound signal may be deemed sufficiently "fresh."
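The staleness test reduces to a simple check on the span of capture timestamps; the two-second threshold below is purely an assumed value for the sketch.

```python
ASSUMED_TRACKING_THRESHOLD_S = 2.0  # illustrative threshold, not from the description

def is_stale(capture_timestamps, threshold_s=ASSUMED_TRACKING_THRESHOLD_S):
    """True if the tracking time between the first and last captured sound
    signals in a block exceeds the threshold, so the block should be ignored."""
    return (max(capture_timestamps) - min(capture_timestamps)) > threshold_s
```

As the description notes, the threshold itself could be relaxed when a Doppler measurement shows the device is nearly stationary.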
- In addition, environmental constraints may also be applied to reduce the number of fresh, reliable sound signals needed to accurately compute position based on DTOA. For example, if three fresh, reliable sound signals are captured (when three-dimensional multilateration typically calls for four reliable sound signals) and the receiving device is assumed to be within a predefined height range, then two-dimensional positioning may be sufficient and a fourth fresh, reliable sound signal is not required. Likewise, if shoppers are assumed to be in the center of the aisle within an acceptable tolerance, then the number of fresh, reliable sound signals required for accurate positioning may be reduced. Other environmental constraints may likewise reduce the number of fresh, reliable sound signals required and/or improve the accuracy and reliability of sound-based positioning results.
- Yet another environmental constraint that may be employed to reduce the number of fresh, reliable sound signals needed relates to a sequence of position results over a period of time. If the receiving device maintains a list of its most recent positions (and potentially, its velocities), it may assume that its direction (and/or velocity) of travel will not have changed more than a certain amount between any two position computations. For example, if the receiving device has several position results indicating that the receiving device is moving North at one mile per hour in immediately previous cycles, then a constraint may be assumed that prevents the next position of the receiving device from falling outside a certain range of the previous position result.
- Yet another environmental constraint may be knowledge of how far a user may have traveled in a given time period, assuming either the fastest velocity a user can achieve or a typical fastest velocity of a casual user in an environment (e.g., a store). The distance traveled may additionally take into account the constraints of walking in the real physical environment, such as following the user pathways as opposed to jumping over shelves.
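The travel-distance constraint in the last two paragraphs can be sketched as a plausibility check on a candidate position; the assumed walking speed and the function names are illustrative, not values from the description.

```python
import math

ASSUMED_MAX_WALKING_SPEED = 2.0  # m/s, a typical brisk walking pace (assumed)

def candidate_is_plausible(last_pos, candidate, elapsed_s,
                           max_speed=ASSUMED_MAX_WALKING_SPEED):
    """Reject candidate fixes farther from the last fix than a user
    could plausibly have walked in the elapsed time (straight-line distance)."""
    dx = candidate[0] - last_pos[0]
    dy = candidate[1] - last_pos[1]
    return math.hypot(dx, dy) <= max_speed * elapsed_s
```

A fuller version would measure distance along the mapped user pathways rather than the straight line, as the description suggests.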
- A
positioning operation 412 uses multilateration to determine the position of the receiving device relative to the known locations of the sound signal sources, based on the fresh, reliable sound signals captured by the receiving device, the DTOA of the captured signals, and the received map, including the sound signal source locations and identifiers. A presentation operation 414 presents an indication of the computed position within a map on the user interface. -
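As a concrete (and heavily simplified) illustration of the DTOA positioning step, the sketch below solves a 2-D multilateration by Gauss-Newton iteration on range-difference residuals. The solver structure, the nominal speed of sound, and the iteration count are assumptions for the sketch; the patented method is not limited to this formulation.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s (assumed nominal value)

def dtoa_residuals(pos, sources, arrival_times):
    """Range differences (relative to source 0) minus measured DTOA ranges."""
    dists = np.linalg.norm(sources - pos, axis=1)
    return (dists[1:] - dists[0]) - SPEED_OF_SOUND * (arrival_times[1:] - arrival_times[0])

def multilaterate_dtoa(sources, arrival_times, guess, iterations=30):
    """Gauss-Newton refinement of a 2-D position from DTOA measurements."""
    pos = np.asarray(guess, dtype=float)
    eps = 1e-6
    for _ in range(iterations):
        r = dtoa_residuals(pos, sources, arrival_times)
        # Numerical Jacobian of the residuals with respect to (x, y).
        jac = np.empty((r.size, 2))
        for axis in range(2):
            step = np.zeros(2)
            step[axis] = eps
            jac[:, axis] = (dtoa_residuals(pos + step, sources, arrival_times) - r) / eps
        pos = pos - np.linalg.lstsq(jac, r, rcond=None)[0]
    return pos
```

Because only differences of arrival times are used, the (unknown but shared) emission time cancels out, which is why DTOA typically needs one more source than a non-differential TOA solution of the same dimensionality.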
FIG. 5 illustrates example operations 500 for sound-based positioning using differential time of arrival and non-differential time of arrival. A determination operation 502 determines an initial position of the receiving device using differential time of arrival from x sound signal sources, in a process similar to that described with regard to FIG. 4. A timing operation 504 determines a time reference based on the initial position and the distances between the initial position and each of the sound signal sources emitting the captured reliable sound signals. Given these distances, the transmission time of each sound signal can be computed, thereby yielding a timing reference for each sound signal. - A
decision operation 506 determines whether a sufficient number of fresh, reliable sound signals for non-differential time of arrival positioning has been captured. In this operation, the timing reference for each sound signal makes it possible to perform non-differential TOA measurements, thereby reducing the number of reliable sound signals needed to accurately determine the position of the receiving device relative to the emitting sound signal sources. Another determining operation 508 determines a subsequent location based on non-differential TOA measurements and the timing references. Another decision operation 510 retests the number of fresh, reliable sound signals captured in a subsequent cycle to determine whether non-differential time of arrival positioning may still be accurately computed, in which case processing proceeds to the determining operation 508. Otherwise, processing proceeds to the determination operation 502 to determine a new position using DTOA. -
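The timing-reference idea behind operations 502-508 can be illustrated briefly: once the first DTOA fix is known, each source's emission time can be back-computed, after which a single arrival time yields a range directly. The names and values below are illustrative assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s (assumed nominal value)

def emission_time(arrival_time, known_receiver_pos, source_pos):
    """Back-compute when the source emitted, given a known receiver fix."""
    travel = math.dist(known_receiver_pos, source_pos) / SPEED_OF_SOUND
    return arrival_time - travel

def toa_distance(arrival_time, emission_t):
    """Non-differential TOA: range to one source via its timing reference."""
    return SPEED_OF_SOUND * (arrival_time - emission_t)
```

With per-source timing references in hand, each subsequent capture contributes a range on its own, which is why fewer sound signals suffice for the second location than for the first.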
FIG. 6 illustrates another example system (labeled as a mobile device 600) that may be useful in implementing the described technology. The mobile device 600 includes a processor 602, a memory 604, a display 606 (e.g., a touchscreen display), and other interfaces 608 (e.g., a keyboard). The memory 604 generally includes both volatile memory (e.g., RAM) and non-volatile memory (e.g., flash memory). An operating system 610, such as the Microsoft Windows® Phone 7 operating system, resides in the memory 604 and is executed by the processor 602, although it should be understood that other operating systems may be employed. - One or
more application programs 612 are loaded in the memory 604 and executed on the operating system 610 by the processor 602. Examples of applications 612 include without limitation email programs, scheduling programs, personal information managers, Internet browsing programs, multimedia player applications, etc. A notification manager 614 is also loaded in the memory 604 and is executed by the processor 602 to present notifications to the user. For example, when a promotion is triggered and presented to the shopper, the notification manager 614 can cause the mobile device 600 to beep or vibrate (via the vibration device 618) and display the promotion on the display 606. - The
mobile device 600 includes a power supply 616, which is powered by one or more batteries or other power sources and which provides power to other components of the mobile device 600. The power supply 616 may also be connected to an external power source that overrides or recharges the built-in batteries or other power sources. - The
mobile device 600 includes one or more communication transceivers 630 to provide network connectivity (e.g., mobile phone network, Wi-Fi®, Bluetooth®, etc.). The mobile device 600 also includes various other components, such as a positioning system 620 (e.g., a global positioning satellite transceiver), one or more accelerometers 622, one or more cameras 624, an audio interface 626 (e.g., a microphone, an audio amplifier and speaker and/or audio jack), and additional storage 628. Other configurations may also be employed. - In an example implementation, a sound-based positioning application, a peak finder, a cross correlation manager, a cross-correlator, a worker thread, a sound locator, a user interface, a multilateration processor, and other modules and services may be embodied by instructions stored in
memory 604 and/or storage devices 628 and processed by the processing unit 602. Sound signal blocks, positions, floor plan maps, respective timings, and other data may be stored in memory 604 and/or storage devices 628 as persistent datastores. It should be understood that device storage may be local (e.g., flash memory or a magnetic storage device) or remote (e.g., via a network-attached storage device, such as a DVD, a CD, or a magnetic storage device). - Some embodiments may comprise an article of manufacture. An article of manufacture may comprise a storage medium to store logic. Examples of a storage medium may include one or more types of computer-readable storage media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth. Examples of the logic may include various software elements, such as software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. In one embodiment, for example, an article of manufacture may store executable computer program instructions that, when executed by a computer, cause the computer to perform methods and/or operations in accordance with the described embodiments. The executable computer program instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, and the like. 
The executable computer program instructions may be implemented according to a predefined computer language, manner or syntax, for instructing a computer to perform a certain function. The instructions may be implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language.
- The embodiments of the invention described herein are implemented as logical steps in one or more computer systems. The logical operations of the present invention are implemented (1) as a sequence of processor-implemented steps executing in one or more computer systems and (2) as interconnected machine or circuit modules within one or more computer systems. The implementation is a matter of choice, dependent on the performance requirements of the computer system implementing the invention. Accordingly, the logical operations making up the embodiments of the invention described herein are referred to variously as operations, steps, objects, or modules. Furthermore, it should be understood that logical operations may be performed in any order, unless explicitly claimed otherwise or a specific order is inherently necessitated by the claim language.
- The above specification, examples, and data provide a complete description of the structure and use of exemplary embodiments of the invention. Since many embodiments of the invention can be made without departing from the scope of the invention, the invention resides in the claims hereinafter appended. Furthermore, structural features of the different embodiments may be combined in yet another embodiment without departing from the recited claims.
Claims (9)
- A method comprising: receiving by a receiving device (600) at a first location a sound signal from each of a set of sound signal sources, the sound signal emitted from each sound signal source being distinct from the sound signal emitted from another sound signal source and each sound signal source being uniquely identifiable within the set based on information provided in the respective sound signal; selecting sound signals received from a subset of the sound signal sources, the selected sound signals satisfying a reliability condition that filters out reflected sound signals, the location of each sound signal source providing a selected sound signal being known to the receiving device (600); and determining the first location of the receiving device (600) relative to the known locations of the subset of sound signal sources using a differential time of arrival measurement and cross-correlation; the method characterized by further comprising:
determining a second location of the receiving device based on the first location and on non-differential time of arrival measurements of further sound signals received at the second location from fewer than the subset of sound signal sources received at the first location. - The method of claim 1, wherein the operation of determining the second location comprises:
determining a timing reference of each captured sound signal based on a distance computed between the first location and locations of each of the sound signal sources in the subset. - The method of claim 1, wherein the operation of determining the second location comprises: determining a direction of travel for the receiving device (600); and evaluating the first location and the non-differential time of arrival measurement against the direction of travel to determine the second location.
- The method of claim 3, wherein the operation of determining the direction of travel comprises:
evaluating a plurality of previously determined locations of the receiving device (600), the previously determined locations being determined based on sound-based positioning. - The method of claim 1, wherein the operation of determining the second location comprises: determining a speed of travel of the receiving device (600); and evaluating the first location and the non-differential time of arrival measurement against the speed of travel to determine the second location.
- The method of claim 5, wherein the operation of determining the speed of travel comprises:
determining a Doppler shift in frequencies of sound signals received at the first location to determine the speed of travel of the receiving device (600). - The method of any one of the preceding claims, wherein the receiving device (600) is a mobile device (600) and the sound signal sources are stationary.
- One or more processor-readable storage media (604, 628) encoding processor-executable instructions for executing on an electronic device (600) the method of any one of the preceding claims.
- A system comprising: a recorder of a receiving device (600) configured to capture at a first location a sound signal from each of a set of sound signal sources, the sound signal emitted from each sound signal source being distinct from the sound signal emitted from another sound signal source and each sound signal source being uniquely identifiable within the set based on information provided in the respective sound signal; a signal source processor of the receiving device (600) configured to select sound signals received from a subset of the sound signal sources, the selected sound signals satisfying a reliability condition that filters out reflected sound signals, the location of each sound signal source providing a selected sound signal being known to the receiving device (600); and a sound locator of the receiving device (600) configured to determine the first location of the receiving device (600) relative to the known locations of the subset of sound signal sources using a differential time of arrival measurement and cross-correlation; characterized in that the sound locator is further configured to determine a second location of the receiving device (600) based on the first location and on non-differential time of arrival measurements of further sound signals received at the second location from fewer than the subset of sound signal sources received at the first location.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/250,482 US8644113B2 (en) | 2011-09-30 | 2011-09-30 | Sound-based positioning |
PCT/US2012/054533 WO2013048708A1 (en) | 2011-09-30 | 2012-09-10 | Sound-based positioning |
Publications (3)
Publication Number | Publication Date |
---|---|
EP2761320A1 EP2761320A1 (en) | 2014-08-06 |
EP2761320A4 EP2761320A4 (en) | 2015-04-08 |
EP2761320B1 true EP2761320B1 (en) | 2020-02-26 |
Family
ID=47992472
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12836065.8A Active EP2761320B1 (en) | 2011-09-30 | 2012-09-10 | Sound-based positioning |
Country Status (7)
Country | Link |
---|---|
US (1) | US8644113B2 (en) |
EP (1) | EP2761320B1 (en) |
JP (1) | JP6468840B2 (en) |
KR (1) | KR102019525B1 (en) |
CN (1) | CN103105602B (en) |
HK (1) | HK1183100A1 (en) |
WO (1) | WO2013048708A1 (en) |
Families Citing this family (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB201200831D0 (en) * | 2012-01-18 | 2012-02-29 | Sensewhere Ltd | Improved positioning system |
EP2850451A1 (en) | 2012-05-15 | 2015-03-25 | Albert-Ludwigs-Universität Freiburg | Handheld-device-based indoor localization system and method |
US9157982B2 (en) * | 2012-08-13 | 2015-10-13 | Symbol Technologies, Llc | Ultrasonic locationing system using regional addressing with ultrasonic tones |
US8942996B2 (en) * | 2012-09-24 | 2015-01-27 | Wal-Mart Stores, Inc. | Determination of customer proximity to a register through use of sound and methods thereof |
US8874135B2 (en) * | 2012-11-30 | 2014-10-28 | Cambridge Silicon Radio Limited | Indoor positioning using camera and optical signal |
US9678195B2 (en) | 2013-02-04 | 2017-06-13 | Takemetuit Inc. | Method of processing positioning signals in positioning systems to accurately determine a true arrival time of each signal |
CN104101863A (en) * | 2013-04-07 | 2014-10-15 | 苏州红亭信息科技有限公司 | Locating system based on intelligent mobile device and locating method |
CN104376849A (en) * | 2013-08-14 | 2015-02-25 | Abb技术有限公司 | System and method for distinguishing sounds, state monitoring system and mobile telephone |
EP2866046B1 (en) * | 2013-10-25 | 2015-12-30 | Albert-Ludwigs-Universität Freiburg | Self-locating mobile receiving device |
US20150117153A1 (en) * | 2013-10-25 | 2015-04-30 | Symbol Technologies, Inc. | Adaptive transmitter cluster area for ultrasonic locationing system |
JP6484986B2 (en) * | 2014-01-31 | 2019-03-20 | 株式会社リコー | POSITION INFORMATION TRANSMISSION SYSTEM, POSITION INFORMATION TRANSMISSION DEVICE, AND POSITION INFORMATION TRANSMISSION METHOD |
JP2016031243A (en) * | 2014-07-25 | 2016-03-07 | シャープ株式会社 | Phase difference calculation device, sound source direction detection device, and phase difference calculation method |
US10816638B2 (en) * | 2014-09-16 | 2020-10-27 | Symbol Technologies, Llc | Ultrasonic locationing interleaved with alternate audio functions |
FR3031190A1 (en) * | 2014-12-31 | 2016-07-01 | Loic Thomas | METHOD FOR LOCATING AT LEAST TWO-DIMENSIONAL OF A MOBILE TERMINAL IN A DISTURB SPACE AND INSTALLATION FOR IMPLEMENTING SAID METHOD |
US20160321917A1 (en) * | 2015-04-30 | 2016-11-03 | Board Of Regents, The University Of Texas System | Utilizing a mobile device as a motion-based controller |
US10849205B2 (en) | 2015-10-14 | 2020-11-24 | Current Lighting Solutions, Llc | Luminaire having a beacon and a directional antenna |
KR101869865B1 (en) * | 2015-12-21 | 2018-06-22 | 서울대학교산학협력단 | The method of indoor localization using pre-designed acoustic source data |
US10228445B2 (en) | 2016-03-30 | 2019-03-12 | International Business Machines Corporation | Signal propagating positioning system |
US10142798B2 (en) * | 2016-08-09 | 2018-11-27 | Symbol Technologies, Llc | Arrangement for, and method of, locating a mobile device in a venue by inferring transit timer values of ranging signals received by the mobile device in a time difference of arrival (TDOA)-based ultrasonic locationing system |
CN106353731A (en) * | 2016-09-14 | 2017-01-25 | 刘珉恺 | Audio positioning device and audio positioning method |
KR102045286B1 (en) * | 2016-11-16 | 2019-11-18 | 고려대학교 산학협력단 | Device and method for detecting ultrasonic-sensor attack |
CA3047610A1 (en) * | 2016-12-20 | 2018-06-28 | Appix Project Inc. | Systems and methods for displaying images across multiple devices |
EP3602100A4 (en) * | 2017-03-20 | 2020-12-30 | Takemetuit Inc. | System and method for enabling determination of a position of a receiver within a space |
CN107656244A (en) * | 2017-08-24 | 2018-02-02 | 南京安璞信息技术有限公司 | Based on the critical indoor locating system and method for listening domain ultrasonic wave reaching time-difference |
US10670693B2 (en) * | 2017-12-29 | 2020-06-02 | Sonitor Technologies As | Position determination system having a deconvolution decoder |
US10616853B2 (en) | 2017-12-29 | 2020-04-07 | Sonitor Technologies As | Location determination using acoustic-contextual data |
US11002825B2 (en) * | 2017-12-29 | 2021-05-11 | Sonitor Technologies As | Position determination system having a deconvolution decoder using a joint snr-time of arrival approach |
US10823830B2 (en) * | 2017-12-29 | 2020-11-03 | Sonitor Technologies As | Location determination using acoustic models |
US10291999B1 (en) | 2018-03-29 | 2019-05-14 | Cae Inc. | Method and system for validating a position of a microphone |
CA3000122C (en) | 2018-03-29 | 2019-02-26 | Cae Inc. | Method and system for determining a position of a microphone |
US11867791B2 (en) * | 2019-06-14 | 2024-01-09 | Lg Electronics Inc. | Artificial intelligence apparatus for determining path of user and method for the same |
WO2020251102A1 (en) * | 2019-06-14 | 2020-12-17 | 엘지전자 주식회사 | Artificial intelligence device for providing service on basis of movement path of user, and method therefor |
US11714158B2 (en) | 2019-08-21 | 2023-08-01 | University Of Washington | Position determination systems and methods utilizing error of multiple candidate positions |
GB201914236D0 (en) * | 2019-10-02 | 2019-11-13 | Forkbeard Tech As | Frequency-shift determination |
CN112098929B (en) * | 2020-01-20 | 2024-05-14 | 苏州触达信息技术有限公司 | Method, device and system for determining relative angle between intelligent devices and intelligent device |
CN113873444B (en) * | 2020-06-30 | 2023-03-10 | 华为技术有限公司 | Positioning method and electronic equipment |
US11624804B2 (en) * | 2020-10-08 | 2023-04-11 | Nokia Technologies Oy | System and method for location determination utilizing direct path information |
CN113030848A (en) * | 2021-03-19 | 2021-06-25 | 星阅科技(深圳)有限公司 | Device for distinguishing whether sound is directional sound source |
JP7049572B1 (en) * | 2021-07-30 | 2022-04-07 | あおみ建設株式会社 | Underwater positioning system and method |
WO2023191333A1 (en) * | 2022-03-28 | 2023-10-05 | 삼성전자 주식회사 | Electronic device and system for location inference |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4713768A (en) * | 1984-02-20 | 1987-12-15 | Hitachi, Ltd. | Method of localizing a moving body |
JPS60173485A (en) * | 1984-02-20 | 1985-09-06 | Hitachi Ltd | Position measurement system |
US5528232A (en) * | 1990-06-15 | 1996-06-18 | Savi Technology, Inc. | Method and apparatus for locating items |
EP0492015A1 (en) * | 1990-12-28 | 1992-07-01 | Uraco Impex Asia Pte Ltd. | Method and apparatus for navigating an automatic guided vehicle |
US5928309A (en) * | 1996-02-05 | 1999-07-27 | Korver; Kelvin | Navigation/guidance system for a land-based vehicle |
US6040800A (en) * | 1997-04-22 | 2000-03-21 | Ericsson Inc. | Systems and methods for locating remote terminals in radiocommunication systems |
US6353412B1 (en) * | 1998-03-17 | 2002-03-05 | Qualcomm, Incorporated | Method and apparatus for determining position location using reduced number of GPS satellites and synchronized and unsynchronized base stations |
AU1290401A (en) * | 1999-11-11 | 2001-06-06 | Scientific Generics Limited | Acoustic location system |
JP2001337157A (en) * | 2000-05-26 | 2001-12-07 | Toyo System Kk | Local positioning system using ultrasonic wave |
GB2364210A (en) | 2000-06-30 | 2002-01-16 | Nokia Oy Ab | Diversity receiver and method of receiving a multi carrier signal |
US6674687B2 (en) * | 2002-01-25 | 2004-01-06 | Navcom Technology, Inc. | System and method for navigation using two-way ultrasonic positioning |
NO318010B1 (en) * | 2002-12-04 | 2005-01-17 | Sonitor Technologies As | Ultrasonic localization system |
JP2004193782A (en) * | 2002-12-09 | 2004-07-08 | Toa Corp | Method of measuring sound wave propagation time between speaker and microphone, and apparatus thereof |
KR100480144B1 (en) | 2003-07-23 | 2005-04-07 | 엘지전자 주식회사 | Position detection apparatus and method for mobile robot |
KR100494847B1 (en) | 2003-11-21 | 2005-06-14 | 한국전자통신연구원 | Apparatus and Method for bidirectional and high-accurate position determination for ubiquitous computing environment |
JP2007528002A (en) | 2004-03-08 | 2007-10-04 | リー、ドン、ウォル | Position recognition system using ultrasound and control method thereof |
JP2006258442A (en) | 2005-03-15 | 2006-09-28 | Yamaha Corp | Position detection system, speaker system, and user terminal device |
US20090316529A1 (en) | 2005-05-12 | 2009-12-24 | Nokia Corporation | Positioning of a Portable Electronic Device |
US20070282565A1 (en) * | 2006-06-06 | 2007-12-06 | Honeywell International Inc. | Object locating in restricted environments using personal navigation |
JP5200228B2 (en) | 2006-08-22 | 2013-06-05 | 国立大学法人愛媛大学 | Position measuring device |
JP5069022B2 (en) * | 2007-03-06 | 2012-11-07 | ゼネラル・エレクトリック・カンパニイ | Method and system for accurate time delay estimation for use in ultrasound imaging |
JP2009025028A (en) * | 2007-07-17 | 2009-02-05 | Brother Ind Ltd | Positioning system |
JP2009139264A (en) * | 2007-12-07 | 2009-06-25 | Synthesize Ltd | Three-dimensional position determination system, and three-dimensional position determination method |
CN101526601B (en) * | 2008-03-04 | 2013-02-13 | 日电(中国)有限公司 | Self-adaptive localization method, equipment and system adopting TOA and RSS fusion mode |
KR100926464B1 (en) * | 2008-03-10 | 2009-11-13 | 숭실대학교산학협력단 | Apparatus and method for detecting damage point in oil pipeline using acoustic wave |
CN102089622A (en) | 2008-07-14 | 2011-06-08 | 矿井安全装置公司 | System and method of determining the location of mobile personnel |
US8489112B2 (en) | 2009-07-29 | 2013-07-16 | Shopkick, Inc. | Method and system for location-triggered rewards |
KR101040181B1 (en) | 2009-12-02 | 2011-06-09 | 고려대학교 산학협력단 | Position measurement system for mobile device |
-
2011
- 2011-09-30 US US13/250,482 patent/US8644113B2/en active Active
-
2012
- 2012-09-10 JP JP2014533559A patent/JP6468840B2/en active Active
- 2012-09-10 EP EP12836065.8A patent/EP2761320B1/en active Active
- 2012-09-10 WO PCT/US2012/054533 patent/WO2013048708A1/en active Application Filing
- 2012-09-10 KR KR1020147008428A patent/KR102019525B1/en active IP Right Grant
- 2012-09-28 CN CN201210377080.3A patent/CN103105602B/en active Active
-
2013
- 2013-09-09 HK HK13110439.3A patent/HK1183100A1/en unknown
Non-Patent Citations (1)
Title |
---|
None * |
Also Published As
Publication number | Publication date |
---|---|
WO2013048708A1 (en) | 2013-04-04 |
CN103105602B (en) | 2015-03-25 |
HK1183100A1 (en) | 2013-12-13 |
KR102019525B1 (en) | 2019-09-06 |
US8644113B2 (en) | 2014-02-04 |
EP2761320A4 (en) | 2015-04-08 |
EP2761320A1 (en) | 2014-08-06 |
JP2014531597A (en) | 2014-11-27 |
US20130083631A1 (en) | 2013-04-04 |
CN103105602A (en) | 2013-05-15 |
KR20140069069A (en) | 2014-06-09 |
JP6468840B2 (en) | 2019-02-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2761320B1 (en) | Sound-based positioning | |
Chen et al. | EchoTrack: Acoustic device-free hand tracking on smart phones | |
EP2641105B1 (en) | System and method for object position estimation based on ultrasonic reflected signals | |
EP2724554B1 (en) | Time difference of arrival determination with direct sound | |
Moutinho et al. | Indoor localization with audible sound—Towards practical implementation | |
Zhou et al. | BatTracker: High precision infrastructure-free mobile device tracking in indoor environments | |
JP6495166B2 (en) | Positioning system, positioning method, and positioning program | |
US9678195B2 (en) | Method of processing positioning signals in positioning systems to accurately determine a true arrival time of each signal | |
CN112154345B (en) | Acoustic positioning transmitter and receiver system and method | |
JP2011058928A (en) | System, device, method and program for estimating position | |
US10704914B2 (en) | Positioning method using broadcast speeches | |
Nishimura et al. | A proposal on direction estimation between devices using acoustic waves | |
US20230039932A1 (en) | Likelihood-based acoustic positioning | |
US20220308158A1 (en) | Frequency-shift determination | |
Lu et al. | The research of amplitude threshold method in ultrasound-based indoor distance-measurement system | |
CN116125385A (en) | Indoor voice positioning method and device based on WIFI | |
Akhlaq | Context Fusion in Location-Aware Pervasive Applications | |
Cho et al. | Location-DB Construction of Access Points for Wireless Location |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | PUAI | Public reference made under Article 153(3) EPC to a published international application that has entered the European phase | Original code: 0009012 |
| 20140318 | 17P | Request for examination filed | |
| | AK | Designated contracting states | Kind code of ref document: A1; designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | DAX | Request for extension of the European patent (deleted) | |
| 20150306 | RA4 | Supplementary search report drawn up and despatched (corrected) | |
| | RIC1 | Information provided on IPC code assigned before grant | IPC: G01S 5/26 20060101ALI20150302BHEP; IPC: G01S 5/18 20060101AFI20150302BHEP |
| 20150319 | 17Q | First examination report despatched | |
| | RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC |
| | GRAP | Despatch of communication of intention to grant a patent | Original code: EPIDOSNIGR1 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: grant of patent is intended |
| 20190920 | INTG | Intention to grant announced | |
| | GRAS | Grant fee paid | Original code: EPIDOSNIGR3 |
| | GRAA | (Expected) grant | Original code: 0009210 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: the patent has been granted |
| | AK | Designated contracting states | Kind code of ref document: B1; designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | REG | Reference to a national code | Country: GB; legal event code: FG4D |
| | REG | Reference to a national code | Country: CH; legal event code: EP |
| 20200315 | REG | Reference to a national code | Country: AT; legal event code: REF; ref document number: 1238316; kind code: T |
| | REG | Reference to a national code | Country: IE; legal event code: FG4D |
| | REG | Reference to a national code | Country: DE; legal event code: R096; ref document number: 602012068121 |
| | REG | Reference to a national code | Country: NL; legal event code: FP |
| | PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: RS, FI (effective 20200226); NO (effective 20200526) |
| | REG | Reference to a national code | Country: LT; legal event code: MG4D |
| | PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: LV, SE, HR (effective 20200226); BG (effective 20200526); GR (effective 20200527); IS (effective 20200626) |
| | PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SK, RO, EE, SM, DK, CZ, LT, ES (effective 20200226); PT (effective 20200719) |
| 20200226 | REG | Reference to a national code | Country: AT; legal event code: MK05; ref document number: 1238316; kind code: T |
| | REG | Reference to a national code | Country: DE; legal event code: R097; ref document number: 602012068121 |
| | PLBE | No opposition filed within time limit | Original code: 0009261 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: no opposition filed within time limit |
| 20200226 | PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: AT, IT |
| 20201127 | 26N | No opposition filed | |
| 20200226 | PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: SI, PL |
| 20200226 | PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: MC |
| | REG | Reference to a national code | Country: CH; legal event code: PL |
| 20200930 | REG | Reference to a national code | Country: BE; legal event code: MM |
| 20200910 | PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of non-payment of due fees: LU |
| | PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of non-payment of due fees: CH, BE, LI (effective 20200930); IE (effective 20200910) |
| 20200226 | PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: TR, MT, CY |
| 20200226 | PG25 | Lapsed in a contracting state [announced via postgrant information from national office to EPO] | Lapse because of failure to submit a translation of the description or to pay the fee within the prescribed time-limit: MK, AL |
| 20230501 | P01 | Opt-out of the competence of the unified patent court (UPC) registered | |
| 20230822 | PGFP | Annual fee paid to national office [announced via postgrant information from national office to EPO] | Country: NL; year of fee payment: 12 |
| 20230823 | PGFP | Annual fee paid to national office [announced via postgrant information from national office to EPO] | Country: GB; year of fee payment: 12 |
| 20230822 | PGFP | Annual fee paid to national office [announced via postgrant information from national office to EPO] | Country: FR, DE; year of fee payment: 12 |