WO2019239365A1 - System and method for position and orientation tracking of multiple mobile devices - Google Patents

System and method for position and orientation tracking of multiple mobile devices

Info

Publication number
WO2019239365A1
WO2019239365A1 (Application PCT/IB2019/054944)
Authority
WO
WIPO (PCT)
Prior art keywords
receiver
orientation
wireless
wireless receiver
tracking
Prior art date
Application number
PCT/IB2019/054944
Other languages
French (fr)
Other versions
WO2019239365A4 (en)
Inventor
Ankit PUROHIT
Original Assignee
Purohit Ankit
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Purohit Ankit filed Critical Purohit Ankit
Publication of WO2019239365A1 publication Critical patent/WO2019239365A1/en
Publication of WO2019239365A4 publication Critical patent/WO2019239365A4/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/005 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C 21/10 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration
    • G01C 21/12 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C 21/1654 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with electromagnetic compass
    • G01C 21/1656 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • G01C 21/20 Instruments for performing navigational calculations
    • G01C 21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/025 Services making use of location information using location based information parameters
    • H04W 4/026 Services making use of location information using location based information parameters using orientation information, e.g. compass
    • H04W 4/027 Services making use of location information using location based information parameters using movement velocity, acceleration information
    • H04W 4/029 Location-based management or tracking services

Definitions

  • the present invention relates to a system and a method for position and orientation tracking of multiple mobile devices, and more particularly, to a system and a method for wireless and inertial sensors based position and orientation tracking of multiple mobile devices.
  • beacons were introduced, namely iBeacon by Apple Inc. in 2013 and Eddystone from Google in 2015. Although these beacons offer a best possible accuracy of ~1 meter (m), the range of such devices is limited (~10 m). Wi-Fi routers are used for positioning as well. Although they offer an increased range (~100 m), the accuracy suffers (~5-10 m). Both these tracking methodologies operate by obtaining the received signal strength indicator (RSSI) value at the point of interest. One way these RSSI values are used to obtain location is by performing trilateration. But trilateration is prone to errors because the RSSI values are far from ideal and show fluctuations due to multipath fading. Another way to use these RSSI values is by radio-fingerprinting.
  • RSSI received signal strength indicator
  • radio signals are recorded over the area of interest and localization is performed by obtaining RSSI values corresponding to different emitters and comparing them with the available radio signal map.
  • ultra-wide band (UWB) based position tracking offers improved accuracy (~30 cm)
  • the range is limited again (~30 m), which can be improved by deploying repeater units, but it is a complex affair.
  • UWB ultra-wide band
  • VIO Visual Inertial Odometry
  • ToFs Times of Flight
  • AoAs Angles of Arrival
  • Wi-Fi and UWB based tracking perform position tracking only.
  • such methods have low accuracy, and the only way of tracking orientation in them is with the help of gyroscope sensors, which are error-prone due to drift.
  • accurate position and orientation tracking methods such as Visual Inertial Odometry (VIO) are computationally intensive, have drift and are quite sensitive to lighting conditions.
  • VIO Visual Inertial Odometry
  • various embodiments herein may include one or more systems and methods for performing estimation of position and orientation (localization) of an object using wireless signals, wherein the object can be a smartphone, drone, robot and the like.
  • This localization can be done either with sensors already built into the device or via a module installed onto the object.
  • the method includes calculating consecutive vectors of complex amplitudes of wireless signals impinging on the receiver, referred to as Γ_(p-1) and Γ_p.
  • the phases of the complex amplitudes in these complex amplitude vectors Γ_(p-1) and Γ_p are used to compute the displacements and thus perform motion estimation.
  • These complex amplitude vectors are obtained by making use of channel state information obtained at the wireless receiver and the steering matrix that is obtained from estimate of AoAs and/or ToFs of the wireless signals.
  • Traditional algorithms such as MUSIC and ESPRIT can make use of Channel State Information to perform AoAs and/or ToFs estimation, but they perform poorly in the case of a rotating/moving receiver. Therefore, the method of the present invention works by fusion of data from the wireless sensor and the inertial measurement unit (IMU) to perform joint estimation of AoAs and/or ToFs on a wireless receiver.
  • IMU inertial measurement unit
  • the method includes obtaining a channel state information from a wireless receiver and obtaining an angular velocity from an inertial measurement unit, wherein the inertial measurement unit includes accelerometer, gyroscope, magnetometer and the like. Furthermore, the data between the inertial sensors (mainly gyroscope giving out angular velocity) and the wireless sensor (giving out Channel State Information) is combined via maximum likelihood or expectation maximization to jointly perform Angles of Arrival (AoAs) and Times of Flight (ToFs) estimation on the wireless receiver. Once the estimates are calculated, their values can be used to get steering matrices.
  • AoAs Angles of Arrival
  • ToFs Times of Flight
  • Steering matrices and received CSI matrices can be used to compute vectors of complex amplitudes, such as Γ_(p-1) and Γ_p, of wireless signals arriving at the receiver. These complex amplitudes can then be used to perform displacement calculations as is described later.
  • the orientation can be estimated by combining the estimates from gyroscope with these estimated AoAs and with IMU inputs (say via a Kalman Filter), making the orientation estimation driftless. These calculated displacements and orientation changes can be used for tracking of the wireless receiver.
  • This mode of tracking can be further augmented by making use of maps of AoAs and of the phase & amplitude of the complex amplitudes in the vector of complex amplitudes (Γ) for incoming wireless signals, along with a map of the magnetic field, making use of the technique called Simultaneous Localization and Mapping (SLAM), thus making even precise absolute positioning possible.
  • SLAM Simultaneous Localization and Mapping
  • the amplitude and phase of the complex amplitudes in the complex amplitude vectors Γ & the AoAs for incoming wireless signals and magnetic field data can be used to create a map for performing localization using, for example, the Simultaneous Localization and Mapping (SLAM) technique.
  • SLAM Simultaneous Localization and Mapping
  • the method further comprises determining an angle of departure of the signal at a transmitter, obtaining the AoA of signals at the receiver, finding out the AoA for the Line of Sight signal and comparing the angle of departure and the angle of arrival of the LOS signal at the receiver for determining the absolute orientation of the wireless receiver. Further, the method comprises calculating a change in phase of the complex amplitudes occurring between the reception of two consecutive data packets of electromagnetic signals arriving at the receiver, wherein the change in phase is related to the displacement of the wireless receiver. Further, the change in phase between consecutive data packets is analysed for performing localization. The tracked orientation values can be used for performing displacement calculations wherein the calculated displacements can be used for tracking the position of the wireless receiver.
  • the angular velocity is measured by a gyroscope with the additional input from the magnetic sensor and the accelerometer and it can be combined with the AoAs estimates via, for example, a Kalman Filter to further increase the accuracy of the device orientation estimates.
  • the wireless receiver can be a rotating receiver, a vibrating receiver, a moving receiver and the like.
  • the method involves performing angle of arrival estimation at the mobile receiver and the angle of departure estimation of signal from the transmitter.
  • the AoAs/ToFs estimates are used to construct steering matrices that are then used to get the vector of complex amplitudes of signals impinging at the mobile receiver.
  • Complex amplitude of Line of Sight signal is identified using ToF estimates and the AoA/AoD estimates are used to perform corrections to the absolute value of complex amplitudes and these corrected values are used to obtain estimate of distance, which is then used with AoA/AoD estimates for receiver localization via triangulation/trilateration.
  • localization computations are performed locally on the device being tracked making the system scalable and with minimized latency.
  • Fig. 1 describes a multipath propagation of a wireless signal between a transmitter and a receiver, according to an exemplary implementation of the presently claimed subject matter.
  • Fig. 2 demonstrates an example of directional gain due to radiation pattern of a dipole antenna, according to an exemplary implementation of the presently claimed subject matter.
  • Fig. 3 describes a two-ray model of propagation and concept of complex amplitudes, according to an exemplary implementation of the presently claimed subject matter.
  • Fig. 4 describes N source signals impinging onto a receiver array with M receiving antenna elements (arranged in Uniform Linear Array (ULA) configuration) because of one transmitter, with spacing between the receiving antenna elements being d, according to an exemplary implementation of the presently claimed subject matter.
  • ULA Uniform Linear Array
  • Fig. 5 illustrates the introduction of a phase difference between two consecutive receiving antenna elements due to the k-th signal wavefront arriving at angle θ_k in the 2D case with ULA configuration, according to an exemplary implementation of the presently claimed subject matter.
  • Fig. 6 describes a schematic diagram for Angles of Arrival (AoAs) and/or Times of Flight (ToFs) estimation on rotating receiver, according to an exemplary implementation of the presently claimed subject matter.
  • AoAs Angles of Arrival
  • ToFs Times of Flight
  • Fig. 7 depicts the multiple angles of arrival due to the l-th access point at the p-th and (p-1)-th frames of time. Here, during displacement and rotation of the receiver array, the unit vectors' directions change in the receiver's reference frame, according to an exemplary implementation of the presently claimed subject matter.
  • Fig. 8 depicts the system configuration in 2 dimensions, according to an exemplary implementation of the presently claimed subject matter.
  • Fig. 9 shows one method by which displacement of the receiver array in global frame is estimated, according to an exemplary implementation of the presently claimed subject matter.
  • Fig. 10 shows another method by which displacement of the receiver array in receiver’s local frame is estimated, according to an exemplary implementation of the presently claimed subject matter.
  • Fig. 11 illustrates various components that constitute the tracking module, according to an exemplary implementation of the presently claimed subject matter.
  • Fig. 12 describes alternative embodiment of the invention, according to an exemplary implementation of the presently claimed subject matter.
  • FIG 13 illustrates a flowchart of a method for position and orientation tracking of multiple mobile devices, according to an exemplary implementation of the presently claimed subject matter.
  • the various embodiments of the present invention provide a system and a method for position and orientation tracking of multiple mobile devices, and more particularly, to a system and method for wireless and inertial sensors based position and orientation tracking of multiple mobile devices.
  • connections between components and/or modules within the figures are not intended to be limited to direct connections. Rather, these components and modules may be modified, re-formatted or otherwise changed by intermediary components and modules.
  • the present invention provides a system that relies on the change in phase of complex amplitudes of wireless signals impinging on the mobile device, that occurs between reception of two consecutive data packets obtained at the wireless receiver sensor. This change in phase is related to the displacement of the receiver array, and its estimation enables calculation of the displacement or velocity of the receiver array, using which position of the receiver array can be tracked. Furthermore, as AoAs are being computed for estimation of these complex amplitude vectors, these AoAs are used to estimate changes in orientation and thus perform orientation tracking in a drift- less manner.
  • the accuracy is of the order of ~1 cm.
  • the tracking is independent of lighting conditions of the environment, as lighting does not affect wireless signals. Also, it is the change in phase between complex amplitudes of received wireless signals of consecutive packets that is analyzed for performing localization, and these packets are spaced at around 1 milli-second or less in time, which means that the method of tracking is independent of changes occurring in environment because such changes occur on slower time scale and also there is no dependence on stored CSI values, making the method more robust. [0037] In another implementation, the tracking relies on reception of wireless packets for localization.
  • these packets are received normally at a rate of 500-1000 Hz and can also be done at higher frequency by modifying the wireless packet.
  • inertial sensors are capable of providing sensor measurements at ~1000 Hz easily.
  • the present invention can operate at higher frequencies compared to the existing systems. Also, the entire computation is done on the receiver, hence multiple devices can be tracked, which makes the system a scalable solution and with minimized latency.
  • the method involves performing localization.
  • the displacements of the object are calculated, which are obtained by analysing the change in the phase of the complex amplitudes of the wireless signals that arrive at the receiver, the vectors of complex amplitudes being referred to as Γ_(p-1) and Γ_p. Since a small displacement gives a significant change in phase, sub-cm level tracking can be achieved. But to derive Γ_p and Γ_(p-1), steering matrices such as A(θ, τ, p-1) and A(θ, τ, p) are required.
  • AoAs and ToFs of wireless signals are computed in the receiver's frame as per the invention's method, using the Channel State Information from the wireless sensors and the angular velocity from the IMU (gyroscope in particular).
  • FIG. 1 describes a multipath propagation of a wireless signal between a transmitter (102) and a receiver (104), according to an exemplary implementation of the present disclosure.
  • signals are exchanged between the transmitter (102) and the receiver (104) through two propagation mechanisms namely direct and indirect propagation.
  • the direct propagation occurs when there exists a Line of Sight (LOS) path between the transmitter (102) and the receiver (104) and signal arrives at the receiver from the LOS path.
  • indirect propagation happens in absence of a direct LOS path i.e. from a Non-Line of Sight (NLOS) path and it involves one or a combination of the reflection, diffraction, scattering, or refraction.
  • NLOS Non-Line of Sight
  • Fig. 2 demonstrates an example of directional gain due to the radiation pattern of a dipole antenna, according to an exemplary implementation of the present disclosure. It describes that the strength of the signal transmitted or received at an antenna is not uniform across all angles of transmission/reception but has an angular dependence. This means that if a signal s(t) is to be transmitted by an antenna, then the signal released at angle θ_T is s(t)·G_T(θ_T), where G_T(θ) is the gain function of the transmitter antenna configuration. Similarly, when an antenna receives a signal at angle θ_R, the received signal gets multiplied by G_R(θ_R), where G_R(θ) is the gain function of the receiver configuration. For passive antennas, i.e. antennas without in-built amplifying circuitry, this gain function is reciprocal, which means that if a transmitting antenna starts behaving as a receiving antenna then the gain function stays the same.
  • Fig. 3 describes a two-ray model of propagation and the concept of complex amplitudes, according to an exemplary implementation of the present disclosure. It depicts the case of a single input single output (SISO) transmission for a signal composed of a single frequency component with wavelength λ in a 2D scenario.
  • SISO single input single output
  • OFDM Orthogonal Frequency Division Multiplexing
  • the propagating signal undergoes propagation loss only and the electric field gets attenuated by a factor inversely proportional to the distance travelled by the signal along the LOS path and it acquires a phase shift as described by laws of electromagnetic propagation.
  • one signal component is released from the antenna at an angle θ_T1 and travels distance d_1 along the LOS path. Hence, it gets multiplied by a complex factor combining the 1/d_1 attenuation, the propagation phase shift and the antenna gains.
  • G_T(θ) & G_R(θ) are gain functions dependent on the departure/arrival angle of the signal for the transmitter (302)/receiver (304). Therefore, the electric field of the signal that is received at the receiver (304) is given by the sum of such complex amplitude terms over the LOS path and the reflected path, each multiplying the transmitted signal.
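  • Illustrative sketch (not part of the original disclosure): the following Python snippet numerically combines an LOS and one reflected component into a single complex channel coefficient, assuming free-space 1/d attenuation, a propagation phase of -2πd/λ, unit antenna gains and an assumed reflection coefficient; all numeric values are hypothetical.

```python
import numpy as np

# Two-ray sketch: each path contributes a complex amplitude whose magnitude
# falls off as 1/d and whose phase is -2*pi*d/lambda; the receiver observes
# the coherent sum of both paths.
c = 3e8
f = 5e9                        # 5 GHz carrier (assumed)
lam = c / f

def path_coeff(d, g_tx=1.0, g_rx=1.0, reflection=1.0):
    # attenuation ~ 1/d, propagation phase, antenna gains and an optional
    # complex reflection coefficient for the NLOS ray
    return g_tx * g_rx * reflection * np.exp(-1j * 2 * np.pi * d / lam) / d

d_los = 12.0                                   # direct-path length [m] (assumed)
d_nlos = 15.5                                  # reflected-path length [m] (assumed)
h = path_coeff(d_los) + path_coeff(d_nlos, reflection=-0.6)

print("h =", h, "| |h| =", f"{abs(h):.3e}", "| phase =", f"{np.angle(h):.3f}", "rad")
```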
  • CFR channel state information
  • MIMO Multiple Input Multiple Output
  • Fig. 4 describes N source signals impinging onto a receiver array with M receiving antenna elements (arranged in Uniform Linear Array (ULA) configuration) because of one transmitter, with spacing between the receiving antenna elements being d, according to an exemplary implementation of the present disclosure. It depicts the case of a Multiple Input Multiple Output (MIMO) reception of multiple signals at the receiver array in case of 2- dimensional propagation and for receiver configuration of Uniform Linear Array (ULA).
  • MIMO Multiple Input Multiple Output
  • the same concepts hold in 3 dimensions and for receivers with arbitrary configurations of the receiving antenna elements, and there are equations already derived in the literature for the same.
  • the MIMO uses multiple antenna configurations at the transmitter and receiver end to achieve higher data transfer rate and range by exploiting the phenomenon of multipath propagation.
  • a T × M MIMO configuration has T transmitting antenna elements and M receiving antenna elements.
  • the maximum number of spatially independent data streams supported by such a configuration is the minimum of T and M.
  • a 2 x 3 MIMO configuration can support maximum of 2 spatially independent data streams.
  • For T transmit antennas, there exist min(T, M) independent signals that leave the transmitter, and due to the presence of objects in the surroundings, N signal wavefronts impinge on the receiver array consisting of M receiving antenna elements. As discussed before, these N signals have N complex amplitude factors when arriving at the first element of the antenna array. Moreover, the receiver array geometry results in certain relationships between the signals received amongst the member antenna elements.
  • N signals with angles of arrival θ_1, θ_2, ..., θ_N are impinging at the receiver array containing M antenna elements.
  • Fig. 5 illustrates the introduction of a phase difference between two consecutive receiving antenna elements due to the k-th signal wavefront arriving at angle θ_k in the 2D case with ULA configuration, according to an exemplary implementation of the present disclosure. It describes a signal wavefront arriving at the receiver array with angle of arrival θ_k. The phase difference between two consecutive receiving elements is given by k·d·sin(θ_k), where k is the wavenumber and d is the spacing between the antenna elements.
  • Θ(t) is a 1 × N vector of AoAs in the receiver's frame of reference;
  • A(Θ(t)) is an M × N steering matrix that is dependent on the receiver configuration and on the angles of arrival of the signals,
  • Γ(t) is a time-dependent N × 1 vector of complex amplitudes of the N wireless signals arriving at the first antenna element due to the N impinging signals, and
  • n(t) is a vector of noise introduced into the receiving elements.
  • X(t) is a time-dependent M × 1 complex vector of CSI factors estimated at the receivers in the receiving array, so that X(t) = A(Θ(t))·Γ(t) + n(t). This is the equation for a single subcarrier.
  • in the case of multiple subcarriers (OFDM), the above equations get slight modifications.
  • the CSI matrix X is complex and of shape M × K. If we concatenate all columns of the CSI matrix X one by one into a single column vector X̄, the relation between X̄(t), Γ(t) and A(Θ, τ) is given as X̄(t) = A(Θ(t), τ(t))·Γ(t) + n(t), where:
  • b(θ_k(t), τ_k(t)) is the steering vector for the k-th impinging wavefront,
  • τ_k is the time of flight and θ_k is the angle of arrival of the k-th wavefront signal,
  • Δf is the spacing between the frequencies of subcarriers,
  • k is the wavenumber of the central frequency of communication, and
  • d is the distance between the antenna elements in the receiver array.
  • A(Θ, τ) is the time-dependent steering matrix, which is given by stacking the steering vectors in column form, A(Θ, τ) = [b(θ_1(t), τ_1(t)) ... b(θ_N(t), τ_N(t))], where the b(θ_k(t), τ_k(t)) are time-dependent steering vectors written in column form and of shape (MK) × 1.
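  • Illustrative sketch (not part of the original disclosure): a minimal Python construction of ULA steering vectors b(θ, τ) of shape (M·K) × 1 over M antennas and K subcarriers, the stacked steering matrix A(Θ, τ), and a least-squares recovery of the complex amplitude vector Γ from a stacked CSI vector, mirroring the definitions above; the carrier frequency, element spacing and subcarrier spacing are assumed values.

```python
import numpy as np

# Steering-vector sketch for a ULA with M elements (spacing d_elem) and K
# OFDM subcarriers spaced df apart: the AoA adds a per-antenna phase
# k_wave*d_elem*sin(theta) and the ToF adds a per-subcarrier phase 2*pi*df*tau.
c = 3e8
f_c = 5e9                      # centre frequency (assumed)
lam = c / f_c
k_wave = 2 * np.pi / lam       # wavenumber of the central frequency
M, K = 4, 30                   # antennas, subcarriers (assumed)
d_elem = lam / 2               # half-wavelength spacing (assumed)
df = 312.5e3                   # subcarrier spacing (assumed, OFDM-like)

def steering_vector(theta, tau):
    ant = np.exp(-1j * k_wave * d_elem * np.sin(theta) * np.arange(M))   # (M,)
    sub = np.exp(-1j * 2 * np.pi * df * tau * np.arange(K))              # (K,)
    return np.kron(sub, ant)                 # shape (M*K,), one column of A

def steering_matrix(thetas, taus):
    return np.column_stack([steering_vector(t, tau) for t, tau in zip(thetas, taus)])

A = steering_matrix([0.3, -0.7], [35e-9, 60e-9])        # two impinging wavefronts
# Given the stacked CSI X_bar, the complex amplitudes Gamma follow from a
# least-squares fit X_bar ≈ A @ Gamma (here with synthetic, noise-free CSI).
X_bar = A @ np.array([1.0 + 0.2j, 0.4 - 0.1j])
Gamma, *_ = np.linalg.lstsq(A, X_bar, rcond=None)
print("recovered complex amplitudes:", np.round(Gamma, 3))
```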
  • L packets are collected for determination of the angles of arrival and times of flight (614) of the N signals, and the time period between reception of each packet is T.
  • the time between reception of packets can be arbitrary, but it is taken constant for simplicity and arbitrary timings can be accounted for with slight modifications to equations below.
  • ω_j is the angular velocity (608) of the receiver at time step j.
  • the ToF of signals doesn’t change during the reception of wireless packets.
  • the related steering vectors are as have been described above. Therefore, the angles of arrival at the 0-th packet can be obtained by fusing the CSI data from the wireless sensors (606) and the angular velocity (608) data from the IMU sensors into the following minimization problem and solving it:
  • This minimization problem can be solved by a minimization module (612) that is either directly implemented on the main processor (1108) or there can be an additional minimization module (612) integrated as seen in Fig. 6, Fig. 9 and Fig. 10.
  • the minimization module (612) can be either an Application Specific Integrated Circuit (ASIC), Graphic processing unit (GPU), Tensor Processing Unit (TPU), Field Programmable Gate Array (FPGA) or even an additional central processing unit (CPU) that has processing capabilities.
  • ASIC Application Specific Integrated Circuit
  • GPU Graphic processing unit
  • TPU Tensor Processing Unit
  • FPGA Field Programmable Gate Array
  • CPU central processing unit
  • this kind of minimization problem can be converted into different forms and solved by techniques such as Maximum Likelihood (ML), Expectation Maximization (EM), Space Alternating Generalized Expectation (SAGE) Maximization and others which are described in literature on signal processing.
  • the angular velocities ω_j are measured by gyroscopes (1104) with additional input from magnetic field sensors (604) and/or accelerometers (1106) and these values are then fed into the minimization problem.
  • there is an additional phase shift that occurs between signals at the same antenna but at different frequencies, due to the time of flight of the signal.
  • This additional phase shift has been taken into account while defining the minimization problem, by including the joint estimation of time of flight (ToF) of signal.
  • This ToF can also be the relative time of flight (rToF), if there are correction procedures done before to sanitize the CSI data. But their estimation procedure remains the same. Therefore, ToF and rToF are used interchangeably in this document.
  • the CSI data obtained from the wireless receiver is combined with the angular velocity (608) obtained from the inertial (602)/magnetic sensors (604), and thus the angles of arrival and times of flight (614) of electromagnetic signals at rotating (even translating) receivers are estimated.
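  • Illustrative sketch (not part of the original disclosure): a toy Python example of the fusion idea for a single far-field source and a 2D ULA, where the known gyroscope rotation rate lets one parameter, the AoA at packet 0, explain the CSI of all L packets even though the receiver rotates between packets; a simple grid search stands in for the ML/EM/SAGE solvers mentioned above, and every numeric value is assumed.

```python
import numpy as np

# Toy joint-estimation sketch: a receiver rotating at a rate known from the
# gyroscope collects L CSI snapshots; a grid search over the AoA at packet 0
# finds the angle whose rotated steering vectors best explain all snapshots.
c, f_c = 3e8, 5e9
lam = c / f_c
M = 4
d_elem = lam / 2
k_wave = 2 * np.pi / lam

def steer(theta):
    return np.exp(-1j * k_wave * d_elem * np.sin(theta) * np.arange(M))

rng = np.random.default_rng(1)
L, T = 20, 1e-3                       # packets and inter-packet period (assumed)
omega = 2.0                           # gyroscope rotation rate [rad/s] (assumed)
theta0_true = 0.4                     # AoA at packet 0 in the receiver frame

# Synthetic CSI: as the receiver rotates by omega*T per packet, the AoA in the
# receiver frame shifts accordingly (2D case), plus a little noise.
X = np.stack([
    (0.8 + 0.3j) * steer(theta0_true - omega * j * T)
    + 0.01 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))
    for j in range(L)
])

def cost(theta0):
    # Residual after projecting each snapshot onto its rotated steering vector.
    r = 0.0
    for j in range(L):
        a = steer(theta0 - omega * j * T)
        gamma = (a.conj() @ X[j]) / (a.conj() @ a)      # per-packet amplitude
        r += np.linalg.norm(X[j] - gamma * a) ** 2
    return r

grid = np.linspace(-np.pi / 2, np.pi / 2, 721)
theta0_hat = grid[np.argmin([cost(t) for t in grid])]
print(f"true AoA at packet 0: {theta0_true:.3f} rad, estimated: {theta0_hat:.3f} rad")
```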
  • This technique can be used for example to perform beamforming when the rotating/moving mobile device acts as transmitter and results in optimum battery usage during transmissions and better signal acquisition during reception, or even to receive signals efficiently when the device acts as a receiver
  • angles of arrival and times of flight (614) of signals are determined, and as these signal sources do not change position so rapidly, these time dependent angles of arrivals are used to accurately track even orientation of the receiver, thus eliminating build-up of drift.
  • a calibration step can be added that compares Angle of Departure (AoD) at the transmitters and the AoA at the receiver and thus the true orientation of the receiver is established. Further, the readings can be fed into estimators such as Kalman Filter (906) that can result in even more accurate results.
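  • Illustrative sketch (not part of the original disclosure): a complementary filter is used below as a simple stand-in for the Kalman Filter (906) mentioned above, showing how a biased gyroscope integration can be corrected by AoA-derived absolute heading observations so that the heading error does not grow with time; all signals and gains are synthetic assumptions.

```python
import numpy as np

# Complementary-filter sketch (2D heading only): integrate the gyroscope and
# pull the estimate toward the AoA-derived heading observation at each packet.
rng = np.random.default_rng(2)
dt = 1e-3
steps = 5000
true_heading = np.cumsum(np.full(steps, 0.5) * dt)        # rotating at 0.5 rad/s
gyro = 0.5 + 0.05 + rng.normal(0, 0.02, steps)            # rate + bias + noise
aoa_heading = true_heading + rng.normal(0, 0.01, steps)   # AoA-based absolute heading

alpha = 0.02                        # correction gain (assumed)
est = 0.0
for k in range(steps):
    est += gyro[k] * dt                              # gyroscope propagation
    est += alpha * (aoa_heading[k] - est)            # driftless AoA correction

print(f"final error with fusion: {abs(est - true_heading[-1]):.4f} rad")
print(f"final error gyro-only:  {abs(np.sum(gyro) * dt - true_heading[-1]):.4f} rad")
```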
  • Fig. 7 depicts the multiple angles of arrival due to the l-th access point at the p-th and (p-1)-th frames of time. Here, during displacement and rotation of the receiver array, the unit vectors' directions change in the receiver's reference frame, according to an exemplary implementation of the present disclosure.
  • Fig. 8 depicts the system configuration in 2 dimensions, according to an exemplary implementation of the present disclosure.
  • there are Q transmitters (Access Points, APs) and the receiver has M receiving antenna elements. Between the (p-1)-th and p-th time steps, the receiver gets displaced by d.
  • D is a diagonal matrix of shape N x N that is given by:
  • r̂_i are unit vectors along the i-th angle of arrival of the signal in the global frame, as shown in Fig. 7, and d is the displacement of the receiver between time steps (p) and (p-1), as depicted in Fig. 8.
  • Fig. 9 shows one method by which displacement of the receiver array in global frame is estimated, according to an exemplary implementation of the present disclosure.
  • Fig. 10 shows another method by which displacement of the receiver array in receiver’s local frame is estimated, according to an exemplary implementation of the presently claimed subject matter.
  • the D matrix (904) is further calculated by solving the following minimization problem:
  • an (L-1) × 1 vector S is calculated by taking ratios of elements of the D matrix (904) as follows. This ratio enables us to get rid of the phase offset that gets introduced due to the different clocking frequencies of the transmitter and the receiver:
  • the R matrix (910) is then calculated row by row, where R_k, the k-th row of the R matrix (910), is constructed from the AoA of the k-th signal in the global frame of reference.
  • d is the displacement (912) of the receiver in the global frame.
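  • Illustrative sketch (not part of the original disclosure): a least-squares Python example of the displacement step in 2D, using the approximation that the phase change of the k-th path between packets (p-1) and p equals the wavenumber times the projection of the displacement d onto that path's global-frame arrival direction, and differencing against a reference path to cancel the common clock-induced phase offset (mirroring the ratio step that forms S from the D matrix); the angles, displacement and offset below are assumed values.

```python
import numpy as np

# Displacement-from-phase sketch: delta_phi_k ≈ k_wave * (u_k · d), where u_k
# is the unit vector along the k-th arrival direction in the global frame.
# A common clock-induced phase is removed by referencing all paths to path 0.
c, f_c = 3e8, 5e9
k_wave = 2 * np.pi * f_c / c

aoas = np.array([0.2, 1.1, -0.8, 2.4])                    # global-frame AoAs [rad]
U = np.column_stack([np.cos(aoas), np.sin(aoas)])         # unit vectors, shape (N, 2)

d_true = np.array([0.004, -0.002])                        # 4 mm, -2 mm displacement
common_offset = 0.7                                       # unknown clock-induced phase
delta_phi = k_wave * (U @ d_true) + common_offset         # measured phase changes

# Difference against path 0 to cancel the common offset, then solve for d.
A = k_wave * (U[1:] - U[0])
b = delta_phi[1:] - delta_phi[0]
d_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated displacement [m]:", np.round(d_hat, 5))
```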
  • the displacement calculation can also be done in the local frame of the receiver device.
  • the D matrix (904) and S are computed as described above.
  • the R matrix (910) is computed differently.
  • the k th element R k is calculated in receiver’s frame of reference as follows,
  • θ_k is the AoA of the k-th signal wavefront.
  • the displacement vector obtained is in the local frame of reference of the receiver. Taking into account the change in orientation of the receiver and the displacement (912) from the (p-1)-th to the p-th time step, the change in position of the receiver in its frame of reference can be calculated, as is done in the technique of Visual Inertial Odometry (VIO).
  • VIO Visual Inertial Odometry
  • the flowchart is as shown in Fig. 10. Although estimated parameters of impinging wireless signals for two consecutive data packets are used here for performing localization, any pair of received data packets can be used in a similar way for performing localization.
  • Fig. 11 illustrates various components that constitute the tracking module, according to an exemplary implementation of the present disclosure. It consists of a wireless module (1102), gyroscope (1104), accelerometer (1106), magnetic sensor (604) and a visual sensor (1112). It also consists of a processor (1108) or microcontroller for executing digital instructions such as those from ARM, Intel, Microchip and also a memory (1110) being either volatile or non-volatile such as ROM, RAM, Flash memory and such.
  • the module can also consist of one or more cameras, the input data of which can be combined with the position/orientation estimates of the current method to further increase the accuracy.
  • the device can also have special compute units such as GPU, TPU, ASIC, FPGA to assist the central processor in computations.
  • Fig. 12 describes alternative embodiment of the invention, according to an exemplary implementation of the present disclosure.
  • the angles of arrival and times of flight (614) of the incoming signal wavefronts can be calculated in the frame of reference of the receiver.
  • a signal wavefront is released at an angle θ_T from the transmitter and it travels in the LOS manner towards the receiver.
  • the wavefront impinges on the receiver array at an angle of θ_R.
  • the receiver calculates the complex amplitude vectors (902) of all such wavefronts that impinge due to the transmitter (1202). But using the steering matrix obtained from the angle of arrival estimation, it is now possible to resolve the complex amplitude coefficients of individual impinging wavefronts, as seen in the previous sections.
  • the information about AoA (614) at the receiver (1204) is calculated at the receiver, but the receiver has no information about the AoD at the transmitter.
  • the AoD values can be made available at the receiver as described in following workflow.
  • the receiver to be tracked (Rx) sends out series of packets addressed to the transmitter (Tx). Because of large dimensions of surroundings compared to small motions encountered between packets (time frame of ⁇ 1 ms), it can be assumed that the angle doesn’t change much when Rx moves for collective P packets.
  • AoA estimation algorithms such as MUSIC, ESPRIT and their variants can be employed on the Tx.
  • electromagnetic propagation is reciprocal in nature, therefore the AoA calculated at Tx is approximately equal to the AoD in the previous time step.
  • This AoD is used as θ_T, whereas the AoA at the receiver (1204) is used as θ_R.
  • the transmitter can encode information about its correction factor inside the packets already being sent out and hence, the receiver can know G_T(θ_T).
  • the receiver Rx is computing the angle of arrival θ_R (614) in its frame.
  • gain factors G_R(θ_R) and G_T(θ_T) are known at the receiver and used for correction.
  • the modified RSSI value can be employed.
  • r can be estimated, wherein r is the distance between Tx and Rx. So, if the location of Tx is known, Rx localizes itself w.r.t. Tx; otherwise, it is done by relative positioning. In the presence of multiple transmitters, triangulation is performed to determine the location of Rx. If there are multiple receivers to be tracked, each of them performs this position correction turn by turn, while relying on displacement calculation via phase when there is no LOS or while waiting for its turn to send signals to the transmitters.
  • the AoA & AoD are compared and absolute orientation of the receiver is determined because direction of LOS signal that departs the transmitter maintains its orientation in the global frame till it reaches the receiver.
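  • Illustrative sketch (not part of the original disclosure): a 2D Python example of this correction step, recovering the receiver's absolute heading by comparing the transmitter-side AoD with the receiver-frame AoA of the LOS path (using the convention that the AoA is the propagation direction of the incoming ray), and then placing the receiver from a gain-corrected LOS amplitude that is assumed to fall off as 1/r; the transmitter position, angles and amplitudes are all hypothetical values.

```python
import numpy as np

# Absolute orientation and relative position from one LOS link (2D sketch).
tx_pos = np.array([5.0, 2.0])      # transmitter position, global frame (assumed)
aod_global = 3.5                   # LOS departure angle at Tx, global frame [rad]
aoa_local = 0.9                    # LOS arrival angle in the receiver frame [rad]

# The LOS ray keeps its global direction, so (under the chosen convention) the
# global-frame arrival direction equals the departure direction; the receiver
# heading is the rotation that maps its local measurement onto that direction.
aoa_global = aod_global
heading = (aoa_global - aoa_local + np.pi) % (2 * np.pi) - np.pi
print(f"absolute receiver heading: {heading:.3f} rad")

# Range from a gain-corrected LOS amplitude, assuming |gamma_LOS| ∝ 1/r:
amp_ref, r_ref = 0.02, 1.0         # calibration amplitude at 1 m (assumed)
amp_los = 0.0016                   # corrected LOS amplitude magnitude (assumed)
r = r_ref * amp_ref / amp_los
rx_pos = tx_pos + r * np.array([np.cos(aod_global), np.sin(aod_global)])
print(f"range: {r:.2f} m, receiver position estimate: {np.round(rx_pos, 2)}")
```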
  • values for LOS paths can be used for even making a probability map using probabilistic techniques such as Simultaneous Localization and Mapping (SLAM) and thus estimate position of the receiver.
  • SLAM Simultaneous Localization and Mapping
  • this present invention can be combined with a visual sensor (1112) to perform a convoluted SLAM and achieve much better accuracies or enable even novel implementations.
  • a method involves usage of Wi-Fi as the wireless mode of operation for working indoors and it relies on the calculated displacements for determining its position. It can be used in an indoor setting, for example, to track an AR/VR headset. Another usage of the method, with an additional SLAM implementation, is in industrial settings for asset or personnel tracking. It can also be used in retail, for example, to enable customers to navigate inside malls or shopping complexes. It can even be used in the exhibition and trade fair industry. The above use cases are with routers as transmitters in Wi-Fi. But, as Wi-Fi is similar in protocol to telecommunication standards such as cellular 4G, the presently claimed subject matter can be implemented on 4G as well, providing positioning in all places that receive 4G cellular signals. It can be used for city-wide cm-level positioning, for example in drone path automation or for autonomous vehicles.
  • this method works in configuration with Wi-Fi (2.4/5/5.8 GHz), 3G/4G/5G, Bluetooth Low Energy, Ultra-Wide Band (UWB) and even on other, non-standard communication frequencies. It has been devised keeping in mind the 802.11n protocol, but can be similarly modified for other advanced protocols such as 802.11ac, 802.11ad, 802.11ah and 802.11aj. Furthermore, the combination with sensors such as visual/magnetic/LIDAR (maybe acoustic) sensors can further augment the method and other modes of its operation like SLAM (using phase/amplitude/magnetic field data).
  • SLAM using phase/amplitude/magnetic field data
  • Fig 13 illustrates a flowchart of a method for position and orientation tracking of multiple mobile devices, according to an exemplary implementation of the presently claimed subject matter.
  • step 1302 obtaining a channel state information (610) by a wireless sensor (606).
  • the wireless sensor (606) is configured to obtain a channel state information (610) from a wireless receiver (1204).
  • step 1304 obtaining an angular velocity (608) by an inertial sensor (602) and/or a magnetic sensor (604).
  • the inertial sensor (602) and/or the magnetic sensor (604) are configured to provide the angular velocity (608).
  • step 1306 fusing the data of the channel state information (610) with the angular velocity (608) by a minimization module (612). The minimization module (612) is configured to fuse the data of the channel state information (610) with the angular velocity (608).
  • step 1308 determining a plurality of angles of arrival and times of flight (614) of a plurality of electromagnetic signals by the minimization module (612) at the wireless receiver.
  • the minimization module (612) is configured to determine a plurality of angles of arrival and times of flight of a plurality of electromagnetic signals at the wireless receiver.
  • step 1310 tracking orientation of the wireless receiver by a tracking module (1100) by combining the determined angles of arrival and the obtained angular velocity (608) with a gyroscope input (1104).
  • the tracking module (1100) is configured to track orientation of the wireless receiver by combining the determined angles of arrival and the obtained angular velocity (608) with a gyroscope input (1104).
  • step 1312 performing displacement calculations with the tracked orientation values and tracking the position of the wireless receiver with the calculated displacement estimates by the tracking module (1100).
  • the tracking module (1100) is configured to perform the displacement calculation with the tracked orientation and to track the position of the wireless receiver with the calculated displacement estimates.
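  • Illustrative sketch (not part of the original disclosure): a hypothetical Python skeleton that strings the flowchart steps together (obtain CSI, obtain angular velocity, jointly estimate AoAs/ToFs, fuse orientation, compute displacement, update position); the sensor reads and the estimation calls are stubs returning synthetic values, not the actual minimization or phase-based solver.

```python
import numpy as np

# Hypothetical end-to-end loop mirroring the flowchart (steps 1302-1312).

def read_csi():                        # step 1302: CSI from the wireless sensor
    return np.ones((4, 30), dtype=complex)

def read_angular_velocity():           # step 1304: angular velocity from the IMU
    return 0.0

def estimate_aoa_tof(csi, omega):      # steps 1306-1308: fused joint estimation (stub)
    return np.array([0.3, -0.7]), np.array([35e-9, 60e-9])

def fuse_orientation(heading, omega, aoas, dt, alpha=0.05):   # step 1310
    gyro_pred = heading + omega * dt
    aoa_obs = aoas[0]                  # e.g. heading implied by a reference AoA
    return gyro_pred + alpha * (aoa_obs - gyro_pred)

def displacement_from_phase(prev_gamma, gamma, aoas):         # step 1312 (stub)
    return np.zeros(2)                 # placeholder for the least-squares solve

position, heading = np.zeros(2), 0.0
prev_gamma = None
dt = 1e-3
for _ in range(5):
    csi = read_csi()
    omega = read_angular_velocity()
    aoas, tofs = estimate_aoa_tof(csi, omega)
    heading = fuse_orientation(heading, omega, aoas, dt)
    gamma = csi.flatten()[: len(aoas)]                 # stand-in complex amplitudes
    if prev_gamma is not None:
        position = position + displacement_from_phase(prev_gamma, gamma, aoas)
    prev_gamma = gamma
print("tracked pose:", np.round(position, 3), round(heading, 3))
```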

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Electromagnetism (AREA)
  • Navigation (AREA)
  • Gyroscopes (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

One or more systems and methods are provided for performing estimation of position and orientation (localization) of an object using wireless signals. This localization can be done either with sensors already built into the device or via a module installed onto the object. The method further includes obtaining a channel state information (610) from a wireless receiver (1204) and obtaining an angular velocity (608) from an inertial measurement unit. Further, the data between the inertial sensor (602) and the wireless sensor (606) is fused by maximum likelihood, expectation maximization and other such variants to perform Angles of Arrival (AoAs) and Times of Flight (ToFs) estimation on the wireless receiver. The orientation can be estimated by combining the estimates from gyroscope (1104) with these estimated AoAs and with accelerometer (1106) and magnetic sensor (604) inputs, making the orientation estimation driftless. Once the estimations are calculated, the values can be used to perform displacement calculations and these calculated displacements can be used for tracking of the wireless receiver.

Description

SYSTEM AND METHOD FOR POSITION AND
ORIENTATION TRACKING OF MULTIPLE MOBILE
DEVICES
TECHNICAL FIELD
[0001] The present invention relates to a system and a method for position and orientation tracking of multiple mobile devices, and more particularly, to a system and a method for wireless and inertial sensors based position and orientation tracking of multiple mobile devices.
BACKGROUND
[0002] Conventionally, position tracking on electronic devices has been performed by use of inertial sensors such as accelerometers and gyroscopes. Accelerometers measure acceleration of the body which is doubly integrated over time to calculate the position. In the same way, gyroscopes sense the angular velocity of the object, and this angular velocity is integrated over time to find out the angular orientation of the body. This kind of tracking is referred to as Dead- Reckoning, but it is highly prone to errors because the sensor readings are noisy and integration of such noisy readings introduces errors that keep accumulating with time, and thus result in quick loss of tracking of the position and/or orientation.
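As a hypothetical numeric illustration of the drift described above (not part of the original disclosure), the following Python sketch doubly integrates the readings of a stationary accelerometer that reports only zero-mean noise; the integrated position error nevertheless grows with time.

```python
import numpy as np

# Dead-reckoning drift illustration: a stationary device whose accelerometer
# reports zero-mean noise still "drifts" once the readings are doubly
# integrated over time.
rng = np.random.default_rng(0)
dt = 0.01                        # 100 Hz sampling interval [s] (assumed)
t = np.arange(0.0, 60.0, dt)     # one minute of data
accel_noise = rng.normal(0.0, 0.05, size=t.size)   # m/s^2 noise, true accel = 0

velocity = np.cumsum(accel_noise) * dt             # first integration
position = np.cumsum(velocity) * dt                # second integration

print(f"position error after {t[-1]:.0f} s: {abs(position[-1]):.2f} m")
```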
[0003] Further, to offer better position estimates, Bluetooth Low Energy
(BLE) beacons were introduced, namely iBeacon by Apple Inc. in 2013 and Eddystone from Google in 2015. Although these beacons offer a best possible accuracy of ~1 meter (m), the range of such devices is limited (~10 m). Wi-Fi routers are used for positioning as well. Although they offer an increased range (~100 m), the accuracy suffers (~5-10 m). Both these tracking methodologies operate by obtaining the received signal strength indicator (RSSI) value at the point of interest. One way these RSSI values are used to obtain location is by performing trilateration. But trilateration is prone to errors because the RSSI values are far from ideal and show fluctuations due to multipath fading. Another way to use these RSSI values is by radio-fingerprinting. Here, radio signals are recorded over the area of interest and localization is performed by obtaining RSSI values corresponding to different emitters and comparing them with the available radio signal map. Here, it is difficult to map radio signals completely over the region of interest and also, the radio map is susceptible to a changing environment due to movement of people or objects. There also exists ultra-wide band (UWB) based position tracking; though it offers improved accuracy (~30 cm), the range is limited again (~30 m), which can be improved by deploying repeater units, but it is a complex affair. Also, there is a need for specially designed hardware for UWB tracking, which makes the related implementation costly.
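For illustration only (not part of the original disclosure), the following Python sketch shows the RSSI-based trilateration idea under an assumed log-distance path-loss model; the anchor positions, reference power and path-loss exponent are hypothetical, and the fluctuations and multipath effects described above are exactly what makes such distance estimates unreliable in practice.

```python
import numpy as np

# Minimal RSSI trilateration sketch (assumed log-distance path-loss model:
# RSSI(d) = P0 - 10 * n * log10(d / d0)); anchor positions and parameters
# are hypothetical.
P0, n, d0 = -40.0, 2.5, 1.0          # dBm at d0 = 1 m, path-loss exponent

def rssi_to_distance(rssi_dbm: float) -> float:
    return d0 * 10.0 ** ((P0 - rssi_dbm) / (10.0 * n))

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])   # known positions [m]
rssi = np.array([-55.0, -62.0, -60.0])                       # measured RSSI [dBm]
d = np.array([rssi_to_distance(r) for r in rssi])

# Linearised least squares: subtract the first circle equation from the rest.
A = 2.0 * (anchors[1:] - anchors[0])
b = (d[0] ** 2 - d[1:] ** 2
     + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
position, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated position:", position)
```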
[0004] Furthermore, for smartphones and mobile robots such as drones, technique of Visual Inertial Odometry (VIO) is being employed. It makes use of smart fusion of data from visual and inertial sensors. It has an inaccuracy of ~3% of distance travelled. Also, as there are images involved, the process is computationally quite intensive, thus these operate at lower update rates. Moreover, this technique cannot work in dark, or in texture-less surroundings, because it requires estimation of markers from obtained images for tracking. This marker estimation is rendered impossible in dark or texture-less surroundings.
[0005] In addition, there are also systems that estimate the Times of Flight
(ToFs) and the Angles of Arrival (AoAs) of signals at static receivers. These parameters are then used to find out the position of the transmitter relative to the fixed receivers located at known positions. These offer accuracy that is comparable to that given by UWB systems, but again are limited by the number of transmitters that can be tracked, because of the limitations on the number of transmitted packets that the receiver fleet can process. Moreover, existing methods for performing AoAs and/or ToFs estimation such as MUSIC and ESPRIT work well in the case of static receivers, but when the receiver rotates or moves, these methods become inaccurate. There are also methods that use fingerprinting of Channel State Information (CSI) values, making accuracy better, but with increased computational load (in the case of CSI-fingerprinting). They are prone to noise getting introduced due to changes in the environment and variable CSI values due to device orientation. [0006] Additionally, existing systems and methods such as BLE beacon,
Wi-Fi and UWB based tracking perform position tracking only. However, such methods have low accuracy, and the only way of tracking orientation in them is with the help of gyroscope sensors, which are error-prone due to drift. In contrast, accurate position and orientation tracking methods such as Visual Inertial Odometry (VIO) are computationally intensive, have drift and are quite sensitive to lighting conditions. Further, there are efforts underway to calculate position by using the phase change of CSI. But in such setups, a transmitter is tracked by static receiver arrays and the computation for position estimation occurs at a central server. Due to this approach the tracking method is not scalable and has large inherent latency.
[0007] Hence, there is a need of a system and a method for tracking that is accurate, has a good range, operates at higher frequency and is not susceptible to changing surroundings and is scalable.
SUMMARY
[0008] This summary is provided to introduce concepts related to a system and a method for position and orientation tracking of multiple mobile devices. This summary is neither intended to identify essential features of the present invention nor is it intended for use in determining or limiting the scope of the present invention.
[0009] For example, various embodiments herein may include one or more systems and methods for performing estimation of position and orientation (localization) of an object using wireless signals, wherein the object can be a smartphone, drone, robot and the like. This localization can be done either with sensors already built into the device or via a module installed onto the object. In one of the embodiments, the method includes calculating consecutive vectors of complex amplitudes of wireless signals impinging on the receiver, referred to as Γ_(p-1) and Γ_p. The phases of the complex amplitudes in these complex amplitude vectors Γ_(p-1) and Γ_p are used to compute the displacements and thus perform motion estimation. These complex amplitude vectors are obtained by making use of the channel state information obtained at the wireless receiver and the steering matrix that is obtained from estimates of the AoAs and/or ToFs of the wireless signals. Traditional algorithms such as MUSIC and ESPRIT can make use of Channel State Information to perform AoAs and/or ToFs estimation, but they perform poorly in the case of a rotating/moving receiver. Therefore, the method of the present invention works by fusion of data from the wireless sensor and the inertial measurement unit (IMU) to perform joint estimation of AoAs and/or ToFs on a wireless receiver.
[0010] Further, the method includes obtaining a channel state information from a wireless receiver and obtaining an angular velocity from an inertial measurement unit, wherein the inertial measurement unit includes an accelerometer, gyroscope, magnetometer and the like. Furthermore, the data between the inertial sensors (mainly the gyroscope giving out angular velocity) and the wireless sensor (giving out Channel State Information) is combined via maximum likelihood or expectation maximization to jointly perform Angles of Arrival (AoAs) and Times of Flight (ToFs) estimation on the wireless receiver. Once the estimates are calculated, their values can be used to get steering matrices. Steering matrices and received CSI matrices can be used to compute vectors of complex amplitudes, such as Γ_(p-1) and Γ_p, of wireless signals arriving at the receiver. These complex amplitudes can then be used to perform displacement calculations as is described later. The orientation can be estimated by combining the estimates from the gyroscope with these estimated AoAs and with IMU inputs (say via a Kalman Filter), making the orientation estimation driftless. These calculated displacements and orientation changes can be used for tracking of the wireless receiver. [0011] This mode of tracking can be further augmented by making use of maps of AoAs and of the phase & amplitude of the complex amplitudes in the vector of complex amplitudes (Γ) for incoming wireless signals, along with a map of the magnetic field, making use of the technique called Simultaneous Localization and Mapping (SLAM), thus making even precise absolute positioning possible. [0012] In another embodiment, the amplitude and phase of the complex amplitudes in the complex amplitude vectors Γ & the AoAs for incoming wireless signals and magnetic field data can be used to create a map for performing localization using, for example, the Simultaneous Localization and Mapping (SLAM) technique.
[0013] In another embodiment, the method further comprises determining an angle of departure of the signal at a transmitter, obtaining the AoA of signals at the receiver, finding out the AoA for the Line of Sight signal and comparing the angle of departure and the angle of arrival of the LOS signal at the receiver for determining the absolute orientation of the wireless receiver. Further, the method comprises calculating a change in phase of the complex amplitudes occurring between the reception of two consecutive data packets of electromagnetic signals arriving at the receiver, wherein the change in phase is related to the displacement of the wireless receiver. Further, the change in phase between consecutive data packets is analysed for performing localization. The tracked orientation values can be used for performing displacement calculations wherein the calculated displacements can be used for tracking the position of the wireless receiver. The angular velocity is measured by a gyroscope with additional input from the magnetic sensor and the accelerometer and it can be combined with the AoAs estimates via, for example, a Kalman Filter to further increase the accuracy of the device orientation estimates. The wireless receiver can be a rotating receiver, a vibrating receiver, a moving receiver and the like.
[0014] In another embodiment, the method involves performing angle of arrival estimation at the mobile receiver and the angle of departure estimation of signal from the transmitter. The AoAs/ToFs estimates are used to construct steering matrices that are then used to get the vector of complex amplitudes of signals impinging at the mobile receiver. Complex amplitude of Line of Sight signal is identified using ToF estimates and the AoA/AoD estimates are used to perform corrections to the absolute value of complex amplitudes and these corrected values are used to obtain estimate of distance, which is then used with AoA/AoD estimates for receiver localization via triangulation/trilateration. [0015] Further, localization computations are performed locally on the device being tracked making the system scalable and with minimized latency.
BRIEF DESCRIPTION OF ACCOMPANYING DRAWINGS
[0016] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and modules. [0017] The prior art, Fig. 1 describes a multipath propagation of a wireless signal between a transmitter and a receiver, according to an exemplary implementation of the presently claimed subject matter.
[0018] The prior art, Fig. 2 demonstrates an example of directional gain due to radiation pattern of a dipole antenna, according to an exemplary implementation of the presently claimed subject matter.
[0019] The prior art, Fig. 3 describes a two-ray model of propagation and concept of complex amplitudes, according to an exemplary implementation of the presently claimed subject matter.
[0020] The prior art, Fig. 4 describes N source signals impinging onto a receiver array with M receiving antenna elements (arranged in Uniform Linear Array (ULA) configuration) because of one transmitter, with spacing between the receiving antenna elements being d, according to an exemplary implementation of the presently claimed subject matter.
[0021] The prior art, Fig. 5 illustrates the introduction of a phase difference between two consecutive receiving antenna elements due to the k-th signal wavefront arriving at angle θ_k in the 2D case with ULA configuration, according to an exemplary implementation of the presently claimed subject matter. [0022] Fig. 6 describes a schematic diagram for Angles of Arrival (AoAs) and/or Times of Flight (ToFs) estimation on a rotating receiver, according to an exemplary implementation of the presently claimed subject matter.
[0023] Fig. 7 depicts the multiple angles of arrival due to the l-th access point at the p-th and (p-1)-th frames of time. Here, during displacement and rotation of the receiver array, the unit vectors' directions change in the receiver's reference frame, according to an exemplary implementation of the presently claimed subject matter.
[0024] Fig. 8 depicts the system configuration in 2 dimensions, according to an exemplary implementation of the presently claimed subject matter.
[0025] Fig. 9 shows one method by which displacement of the receiver array in global frame is estimated, according to an exemplary implementation of the presently claimed subject matter.
[0026] Fig. 10 shows another method by which displacement of the receiver array in receiver’s local frame is estimated, according to an exemplary implementation of the presently claimed subject matter.
[0027] Fig. 11 illustrates various components that constitute the tracking module, according to an exemplary implementation of the presently claimed subject matter. [0028] Fig. 12 describes alternative embodiment of the invention, according to an exemplary implementation of the presently claimed subject matter.
[0029] Fig 13 illustrates a flowchart of a method for position and orientation tracking of multiple mobile devices, according to an exemplary implementation of the presently claimed subject matter.
[0030] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative apparatuses embodying the principles of the present invention. Similarly, it will be appreciated that any flow charts, flow diagrams, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0031] The various embodiments of the present invention provide a system and a method for position and orientation tracking of multiple mobile devices, and more particularly, to a system and method for wireless and inertial sensors based position and orientation tracking of multiple mobile devices.
[0032] In the following description, for purpose of explanation, specific details are set forth in order to provide an understanding of the present claimed subject matter. It will be apparent, however, to one skilled in the art that the present claimed subject matter may be practiced without these details. One skilled in the art will recognize that embodiments of the present claimed subject matter, some of which are described below, may be incorporated into a number of systems.
[0033] However, the systems and methods are not limited to the specific embodiments described herein. Further, structures and devices shown in the figures are illustrative of exemplary embodiments of the present claimed subject matter and are meant to avoid obscuring of the present claimed subject matter.
[0034] Furthermore, connections between components and/or modules within the figures are not intended to be limited to direct connections. Rather, these components and modules may be modified, re-formatted or otherwise changed by intermediary components and modules.
[0035] The presently claimed subject matter provides an improved system and a method for position and orientation tracking of multiple mobile devices. [0036] In one of the embodiments, the present invention provides a system that relies on the change in phase of the complex amplitudes of wireless signals impinging on the mobile device, which occurs between the reception of two consecutive data packets obtained at the wireless receiver sensor. This change in phase is related to the displacement of the receiver array, and its estimation enables calculation of the displacement or velocity of the receiver array, using which the position of the receiver array can be tracked. Furthermore, as AoAs are being computed for the estimation of these complex amplitude vectors, these AoAs are used to estimate changes in orientation and thus perform orientation tracking in a drift-less manner. As the phase of the complex amplitudes changes rapidly with displacements on the order of a fraction of a wavelength (λ ≈ 6 cm for 5 GHz), the accuracy is of the order of ~1 cm. Moreover, the tracking is independent of the lighting conditions of the environment, as lighting does not affect wireless signals. Also, it is the change in phase between the complex amplitudes of received wireless signals of consecutive packets that is analyzed for performing localization, and these packets are spaced at around 1 millisecond or less in time, which means that the method of tracking is independent of changes occurring in the environment, because such changes occur on a slower time scale; there is also no dependence on stored CSI values, making the method more robust. [0037] In another implementation, the tracking relies on the reception of wireless packets for localization. These packets are normally received at a rate of 500-1000 Hz, and this can also be done at a higher frequency by modifying the wireless packet. Moreover, inertial sensors are capable of providing sensor measurements at ~1000 Hz easily. Thus, the present invention can operate at higher frequencies compared to the existing systems. Also, the entire computation is done on the receiver, hence multiple devices can be tracked, which makes the system a scalable solution with minimized latency.
[0038] In another implementation, the method involves performing localization. The displacements of the object are calculated by analysing the change in the phase of the complex amplitudes of the wireless signals that arrive at the receiver, the vectors of complex amplitudes being referred to as Γp-1 and Γp. Since a small displacement gives a significant change in phase, sub-centimetre level tracking can be achieved. But to derive Γp and Γp-1, steering matrices such as A(θ, τ, p-1) and A(θ, τ, p) are required. Further, for estimating these steering matrices A(θ, τ, p-1) and A(θ, τ, p), the AoAs and ToFs of the wireless signals are computed in the receiver’s frame as per the invention’s method, using the Channel State Information from the wireless sensors and the angular velocity from the IMU (gyroscope in particular).
[0039] It should be noted that the description merely illustrates the principles of the present invention. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described herein, embody the principles of the present invention. Furthermore, all examples recited herein are principally intended expressly to be only for explanatory purposes to help the reader in understanding the principles of the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass equivalents thereof. [0040] The prior art, Fig. 1 describes a multipath propagation of a wireless signal between a transmitter (102) and a receiver (104), according to an exemplary implementation of the present disclosure. As shown in Fig. 1, signals are exchanged between the transmitter (102) and the receiver (104) through two propagation mechanisms, namely direct and indirect propagation. Direct propagation occurs when there exists a Line of Sight (LOS) path between the transmitter (102) and the receiver (104) and the signal arrives at the receiver from the LOS path. In contrast, indirect propagation happens in the absence of a direct LOS path, i.e. from a Non-Line of Sight (NLOS) path, and it involves one or a combination of reflection, diffraction, scattering and refraction. These mechanisms allow the wireless signal to propagate over multiple paths between the transmitter (102) and the receiver (104), and this is referred to as multipath propagation.
[0041] The prior art, Fig. 2 demonstrates an example of directional gain due to the radiation pattern of a dipole antenna, according to an exemplary implementation of the present disclosure. It describes that the strength of the signal transmitted or received at an antenna is not uniform across all angles of transmission/reception but has an angular dependence. This means that if a signal s(t) is to be transmitted by the antenna, then the signal released at angle θT is s(t)·GT(θT), where GT(θ) is the gain function of the transmitter antenna configuration. Similarly, when an antenna receives a signal at angle θR, the received signal gets multiplied by GR(θR), where GR(θ) is the gain function of the receiver configuration. For passive antennas, i.e. antennas without in-built amplifying circuitry, this gain function is reciprocal, which means that if a transmitting antenna starts behaving as a receiving antenna then the gain function stays the same.
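For illustration only (the disclosure does not prescribe any particular antenna pattern), the following Python sketch applies angle-dependent gain functions to a signal, using the textbook half-wave dipole field pattern as a stand-in for GT(θ) and GR(θ); a real deployment would use the calibrated gain functions of the actual antennas.

```python
import numpy as np

def dipole_gain(theta):
    """Illustrative half-wave dipole field pattern |cos((pi/2)cos(theta)) / sin(theta)|.

    theta is measured from the dipole axis; this is only a stand-in for a
    calibrated gain function G(theta)."""
    theta = np.asarray(theta, dtype=float)
    return np.abs(np.cos(0.5 * np.pi * np.cos(theta)) / np.sin(theta))

def apply_directional_gain(s, theta_T, theta_R, G_T=dipole_gain, G_R=dipole_gain):
    """Signal released at departure angle theta_T and received at arrival angle
    theta_R is scaled by G_T(theta_T) * G_R(theta_R), as described above."""
    return s * G_T(theta_T) * G_R(theta_R)

# Example: a unit 5 GHz tone leaving at 60 deg and arriving at 45 deg.
t = np.linspace(0.0, 1e-6, 100)
s = np.exp(2j * np.pi * 5e9 * t)
received = apply_directional_gain(s, np.deg2rad(60.0), np.deg2rad(45.0))
print(np.abs(received[0]))
```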
[0042] The prior art, Fig. 3 describes a two-ray model of propagation and the concept of complex amplitudes, according to an exemplary implementation of the present disclosure. It depicts the case of a single input single output (SISO) transmission for a signal composed of a single frequency component with wavelength λ in a 2D scenario. The same concepts hold for 3D as well, although with azimuth and elevation angles included, and also for signals across multiple carrier frequencies such as in the Orthogonal Frequency Division Multiplexing (OFDM) protocol. Further, in the case of LOS propagation, the propagating signal undergoes propagation loss only; the electric field gets attenuated by a factor inversely proportional to the distance travelled by the signal along the LOS path and it acquires a phase shift as described by the laws of electromagnetic propagation. As seen in Fig. 3, one signal component is released from the antenna at an angle θ1T and travels a distance d1 along the LOS path. Hence, it gets multiplied by a factor of (1/d1)·e^{-jkd1}·GT(θ1T). For the NLOS path, the other signal component is released at an angle θ2T and travels a total distance of d2, but it undergoes reflection at the scatterer. Due to this reflection, the signal gets multiplied by a complex factor γ as well, the overall multiplication factor being (γ/d2)·e^{-jkd2}·GT(θ2T). Also, these signal components are received at the receiver (304) at angles θ1R and θ2R respectively. GT(θ) and GR(θ) are gain functions dependent on the departure/arrival angle of the signal for the transmitter (302)/receiver (304). Therefore, the electric field of the signal that is received at the receiver (304) is given by:

E_R(t) = s(t)·[ GR(θ1R)·(1/d1)·e^{-jkd1}·GT(θ1T) + GR(θ2R)·(γ/d2)·e^{-jkd2}·GT(θ2T) ] = Γ·s(t)

where s(t) is the signal transmitted by the transmitter, k is the wavenumber of the electromagnetic wave of the signal, k = 2π/λ, and Γ is the channel frequency response (CFR). In this case, it is also the channel state information (CSI) (610) for the SISO channel, as there are just one of each transmitter and receiver and a single frequency of communication.
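For illustration only, a minimal numpy sketch of the two-ray channel coefficient Γ reconstructed above; the gain functions, reflection coefficient and geometry are placeholder values, not values prescribed by the present disclosure.

```python
import numpy as np

def two_ray_cfr(d1, d2, gamma, theta_1T, theta_1R, theta_2T, theta_2R,
                G_T, G_R, wavelength=0.06):
    """Complex channel coefficient Gamma for the two-ray SISO model:
    an LOS component over distance d1 and a reflected (NLOS) component
    over distance d2 with complex reflection coefficient gamma."""
    k = 2.0 * np.pi / wavelength                       # wavenumber
    los = G_R(theta_1R) * (1.0 / d1) * np.exp(-1j * k * d1) * G_T(theta_1T)
    nlos = G_R(theta_2R) * (gamma / d2) * np.exp(-1j * k * d2) * G_T(theta_2T)
    return los + nlos

# Placeholder isotropic gains and geometry, purely for demonstration.
iso = lambda theta: 1.0
gamma = 0.4 * np.exp(1j * np.pi)                       # assumed reflection factor
Gamma = two_ray_cfr(5.0, 7.5, gamma,
                    np.deg2rad(10), np.deg2rad(20),
                    np.deg2rad(40), np.deg2rad(-35), iso, iso)
print(abs(Gamma), np.angle(Gamma))
```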
[0043] The above equation holds for SISO transmission, that is when there is only a single transmitter element and a single receiver element. In case of multiple transmitting and receiving elements, the expression for signal received at each receiving antenna element takes a form which is explained in the following section. Such form of communication is referred to as Multiple Input Multiple Output (MIMO) communication.
[0044] The prior art, Fig. 4 describes N source signals impinging onto a receiver array with M receiving antenna elements (arranged in Uniform Linear Array (ULA) configuration) because of one transmitter, with the spacing between the receiving antenna elements being d, according to an exemplary implementation of the present disclosure. It depicts the case of Multiple Input Multiple Output (MIMO) reception of multiple signals at the receiver array in the case of 2-dimensional propagation and for a receiver configuration of a Uniform Linear Array (ULA). The same concepts hold in 3 dimensions and for receivers with arbitrary configurations of the receiving antenna elements, and equations for these cases have already been derived in the literature. MIMO uses multiple antenna configurations at the transmitter and receiver end to achieve higher data transfer rate and range by exploiting the phenomenon of multipath propagation. A T x M MIMO configuration has T transmitter antenna elements and M receiving antenna elements. The maximum number of spatially independent data streams supported by such a configuration is the minimum of T and M. For example, a 2 x 3 MIMO configuration can support a maximum of 2 spatially independent data streams.
[0045] For T transmit antennas, there exist min(T, M) independent signals that leave the transmitter, and due to the presence of objects in the surroundings, N signal wavefronts impinge on the receiver array consisting of M receiving antenna elements. As discussed before, these N signals have N complex amplitude factors when arriving at the first element of the antenna array. Moreover, the receiver array geometry results in certain relationships between the signals received amongst the member antenna elements. In Fig. 4, N signals with angles of arrival θ1, θ2, ..., θN are impinging at the receiver array containing M antenna elements.
[0046] The prior art, Fig. 5 illustrates the introduction of phase difference between two consecutive receiving antenna elements due to the kth signal wavefront arriving at angle θk in the 2D case with ULA configuration, according to an exemplary implementation of the present disclosure. It describes a signal wavefront arriving at the receiver array with angle of arrival θk. The phase difference between two consecutive receiving elements is given by:

Δφ = (2π/λ)·d·sin(θk) = k·d·sin(θk)

Therefore, there exists a relation between the complex amplitudes obtained at each receiving antenna due to the signal coming in at angle θk. The contribution of this signal to the overall CSI is given as: x_k = b(θk)·Γk, where θk is the direction of arrival of the kth signal wavefront, Γk is the complex amplitude of the kth wavefront arriving at the 1st antenna element and b(θk) is the steering vector given by:

b(θk) = [ 1, e^{-j·k·d·sin(θk)}, e^{-j·2·k·d·sin(θk)}, ..., e^{-j·(M-1)·k·d·sin(θk)} ]^T

Therefore, for N signals impinging, the time-dependent vector of CSI at the receiver array is given by

X(t) = A(θ(t))·Γ(t) + n(t)

where,

A(θ(t)) = [ b(θ1(t)), b(θ2(t)), ..., b(θN(t)) ]

θ(t) is a 1 x N vector of AoAs in the receiver’s frame of reference; A(θ(t)) is an M x N steering matrix that is dependent on the receiver configuration and on the angles of arrival of the signals; Γ(t) is a time-dependent N x 1 vector of complex amplitudes of the N wireless signals arriving at the first antenna element due to the N impinging signals; and n(t) is a vector of noise introduced into the receiving elements. X(t) is a time-dependent M x 1 complex vector of CSI factors estimated at the receivers in the receiving array. This is the equation for a single subcarrier.
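A minimal numpy sketch of the single-subcarrier CSI model above, assuming a ULA with half-wavelength spacing; the number of antennas, the angles, the complex amplitudes and the noise level are illustrative placeholders, not values from the disclosure.

```python
import numpy as np

def steering_vector(theta, M, d, wavelength):
    """ULA steering vector b(theta) = [1, e^{-jkd sin}, ..., e^{-j(M-1)kd sin}]^T."""
    k = 2.0 * np.pi / wavelength
    m = np.arange(M)
    return np.exp(-1j * k * d * m * np.sin(theta))

def steering_matrix(thetas, M, d, wavelength):
    """Stack the steering vectors of the N impinging wavefronts column-wise: A(theta), shape (M, N)."""
    return np.column_stack([steering_vector(th, M, d, wavelength) for th in thetas])

# Illustrative scenario: M = 4 element ULA, lambda/2 spacing, N = 3 wavefronts.
wavelength = 0.06                                   # ~5 GHz
M, d = 4, wavelength / 2.0
thetas = np.deg2rad([-30.0, 10.0, 55.0])            # assumed AoAs
Gamma = np.array([1.0, 0.5 * np.exp(1j * 0.7), 0.3 * np.exp(-1j * 1.2)])  # complex amplitudes

A = steering_matrix(thetas, M, d, wavelength)
noise = 0.01 * (np.random.randn(M) + 1j * np.random.randn(M))
X = A @ Gamma + noise                               # X(t) = A(theta) Gamma(t) + n(t)
print(X.shape)                                      # (4,)
```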
[0047] For the case of multiple subcarriers, as in OFDM communications, the above equations get slight modifications. For a MIMO system with M receive antennas, operating with K subcarriers and with N wireless signals arriving at the receiver array, the CSI matrix X is complex and of shape M x K. If we concatenate all columns of the CSI matrix X one by one into a single column vector X̄, the relation between X̄(t), Γ(t) and A(θ, t) is given as:

X̄(t) = A(θ(t), τ(t))·Γ(t) + n(t), with A(θ(t), τ(t)) = [ b(θ1(t), τ1(t)), b(θ2(t), τ2(t)), ..., b(θN(t), τN(t)) ]

where b(θk(t), τk(t)) is the steering vector for the kth impinging wavefront, whose element corresponding to the mth subcarrier and the qth antenna element is e^{-j(2π·m·fδ·τk + q·k·d·sin θk)}; τk is the time of flight and θk is the angle of arrival of the kth wavefront signal; fδ is the spacing between the frequencies of the subcarriers; k is the wavenumber of the central frequency of communication; and d is the distance between the antenna elements in the receiver array. These wireless communication systems obtain the Channel State Information (610) as specified in the 802.11n protocol: briefly, a known training sequence (pilot sequence) is sent from the transmitter, the receiver decodes the signal, and the received symbols are then compared with the known transmitted symbols to estimate the Channel State Information matrix X. In the derivations below, we can use X in place of X̄. [0048] Fig. 6 describes a schematic diagram for Angles of Arrival (AoAs) and Times of Flight (ToFs) estimation on a rotating receiver, according to an exemplary implementation of the present disclosure. In the case of a moving/rotating receiver, the steering matrix, the complex amplitude vector and the noise are functions of time, making the observed CSI vector a function of time as well. This time dependency can be written in the form of the following equation:

X(t) = A(θ, t)·Γ(t) + n(t)

where,

A(θ, t) is the time-dependent steering matrix which is given by:

A(θ, t) = [ b(θ1(t), τ1(t)), b(θ2(t), τ2(t)), ..., b(θN(t), τN(t)) ]

where b(θk(t), τk(t)) are the time-dependent steering vectors written in column form and of shape (MK) x 1.
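Purely as an illustration (the per-element phase model used here follows the reconstruction given above, not a formula quoted verbatim from the original filing), a numpy sketch of the (MK) x 1 steering vector that jointly encodes AoA across the antenna elements and ToF across the K subcarriers; all numeric parameters are placeholders.

```python
import numpy as np

def steering_vector_aoa_tof(theta, tof, M, K, d, wavelength, f_delta):
    """(M*K) x 1 steering vector b(theta, tof): phase advances across antennas due to
    the AoA and across subcarriers due to the ToF, stacked subcarrier by subcarrier."""
    k = 2.0 * np.pi / wavelength
    q = np.arange(M)                                            # antenna index
    m = np.arange(K)                                            # subcarrier index
    antenna_phase = np.exp(-1j * k * d * q * np.sin(theta))     # shape (M,)
    subcarrier_phase = np.exp(-1j * 2.0 * np.pi * m * f_delta * tof)  # shape (K,)
    return np.kron(subcarrier_phase, antenna_phase)             # shape (M*K,)

# Placeholder parameters, for illustration only.
b = steering_vector_aoa_tof(theta=np.deg2rad(25.0), tof=35e-9,
                            M=3, K=30, d=0.03, wavelength=0.06, f_delta=312.5e3)
print(b.shape)   # (90,)
```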
[0049] In an exemplary embodiment, L packets are collected for the determination of the angles of arrival and times of flight (614) of the N signals, and the time period between the reception of each packet is T. In general, the time between reception of packets can be arbitrary, but it is taken to be constant for simplicity, and arbitrary timings can be accounted for with slight modifications to the equations below. The equation for the calculated CSI can be written as:

X_s = A(θ0 + Δθs, τ)·Γ_s + n_s

where s is the index for the sth packet obtained at the sth time step, s = 0, 1, 2, ..., L-1, and the steering matrix for the sth time step is given by:

A(θ0 + Δθs, τ) = [ b(θ1(0) + Δθs, τ1), b(θ2(0) + Δθs, τ2), ..., b(θN(0) + Δθs, τN) ]

here Δθs (the angular displacement till the sth packet from the 0th packet) is given by:

Δθs = Σ_{j=0}^{s-1} ωj·T

and ωj is the angular velocity (608) of the receiver at time step j. Here, we have made the approximation that the ToF of the signals does not change during the reception of the wireless packets. The related steering vectors are as described above. Therefore, the angles of arrival at the 0th packet can be obtained by fusing the CSI data from the wireless sensors (606) and the angular velocity (608) data from the IMU sensors into the following minimization problem and solving it:

{θ̂0, τ̂, Γ̂0, ..., Γ̂L-1} = argmin Σ_{s=0}^{L-1} ‖ X_s − A(θ0 + Δθs, τ)·Γ_s ‖²
[0050] This minimization problem can be solved by a minimization module (612) that is either directly implemented on the main processor (1108) or there can be an additional minimization module (612) integrated as seen in Fig. 6, Fig. 9 and Fig. 10. The minimization module (612) can be either an Application Specific Integrated Circuit (ASIC), Graphic processing unit (GPU), Tensor Processing Unit (TPU), Field Programmable Gate Array (FPGA) or even an additional central processing unit (CPU) that has processing capabilities. Furthermore, this kind of minimization problem can be converted into different forms and solved by techniques such as Maximum Likelihood (ML), Expectation Maximization (EM), Space Alternating Generalized Expectation (SAGE) Maximization and others which are described in literature on signal processing. In essence, the principle of data fusion stays the same irrespective of the method chosen to define and solve the minimization problem. This kind of problem can also be solved via neural networks by first training the model on known data and then using the trained model to perform the required estimation. The computation for this estimation can occur over computing devices such as CPU, GPU, FPGA, TPU or even neural compute sticks.
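The following Python sketch shows one possible way (not the claimed implementation) to set up such a fusion: it evaluates a least-squares projection residual of the per-packet model over a coarse AoA grid, using gyroscope-integrated angular displacements to rotate the candidate AoAs for each packet. A practical system would use ML/EM/SAGE or a refined solver as mentioned above; the single-path grid, array sizes and sampling are illustrative only.

```python
import numpy as np

def residual(csi_packets, omegas, T, theta0, tof, M, d, wavelength):
    """Sum over packets of || X_s - A_s A_s^+ X_s ||^2 for candidate initial AoAs theta0
    (array of N angles). tof is unused in this single-subcarrier sketch."""
    k = 2.0 * np.pi / wavelength
    total, delta = 0.0, 0.0
    for s, X_s in enumerate(csi_packets):
        if s > 0:
            delta += omegas[s - 1] * T                 # gyro-integrated rotation up to packet s
        thetas = theta0 + delta                        # candidate AoAs in receiver frame at packet s
        A_s = np.column_stack([np.exp(-1j * k * d * np.arange(M) * np.sin(th))
                               for th in thetas])
        gamma = np.linalg.pinv(A_s) @ X_s              # best-fit complex amplitudes Gamma_s
        total += np.linalg.norm(X_s - A_s @ gamma) ** 2
    return total

def grid_search_aoa(csi_packets, omegas, T, M, d, wavelength):
    """Coarse single-path grid search over the initial AoA; extend to joint N-path/ToF grids as needed."""
    grid = np.deg2rad(np.arange(-80.0, 80.0, 1.0))
    costs = [residual(csi_packets, omegas, T, np.array([th]), None, M, d, wavelength)
             for th in grid]
    return grid[int(np.argmin(costs))]

# Tiny synthetic check: one wavefront at +20 deg, receiver rotating at 0.5 rad/s.
M, d, wl, T = 4, 0.03, 0.06, 1e-3
true_theta, omegas = np.deg2rad(20.0), 0.5 * np.ones(9)
k = 2 * np.pi / wl
packets, delta = [], 0.0
for s in range(10):
    if s > 0:
        delta += omegas[s - 1] * T
    a = np.exp(-1j * k * d * np.arange(M) * np.sin(true_theta + delta))
    packets.append(a * np.exp(1j * 0.3 * s))           # arbitrary per-packet complex amplitude
print(np.rad2deg(grid_search_aoa(packets, omegas, T, M, d, wl)))   # ~20
```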
[0051] The angular velocities ωj are measured by gyroscopes (1104) with additional input from magnetic field sensors (604) and/or accelerometers (1106), and these values are then fed into the minimization problem. Moreover, as there are multiple subcarriers involved, there is a phase shift that occurs between signals at the same antenna but at different frequencies, due to the time of flight of the signal. This additional phase shift has been taken into account while defining the minimization problem, by including the joint estimation of the time of flight (ToF) of the signal. This ToF can also be the relative time of flight (rToF), if correction procedures are applied beforehand to sanitize the CSI data; the estimation procedure remains the same. Therefore, ToF and rToF are used interchangeably in this document. [0052] Furthermore, the CSI data obtained from the wireless receiver is combined with the angular velocity (608) obtained from the inertial (602)/magnetic sensors (604), and thus the angles of arrival and times of flight (614) of the electromagnetic signals at rotating (even translating) receivers are estimated. This technique can be used, for example, to perform beamforming when the rotating/moving mobile device acts as a transmitter, which results in optimum battery usage during transmissions and better signal acquisition during reception, or to receive signals efficiently when the device acts as a receiver.
[0053] Furthermore, as the angles of arrival and times of flight (614) of signals are determined, and as these signal sources do not change position so rapidly, these time dependent angles of arrivals are used to accurately track even orientation of the receiver, thus eliminating build-up of drift. Moreover, a calibration step can be added that compares Angle of Departure (AoD) at the transmitters and the AoA at the receiver and thus the true orientation of the receiver is established. Further, the readings can be fed into estimators such as Kalman Filter (906) that can result in even more accurate results.
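As a sketch only (the disclosure leaves the estimator open, mentioning for example a Kalman Filter (906)), a simple complementary-filter style fusion of gyroscope-integrated yaw with a drift-free yaw reference derived from a tracked AoA could look like the following; the gain alpha, the sign convention and the assumption of a quasi-static signal source are illustrative.

```python
import numpy as np

def fuse_yaw(yaw_prev, omega_z, dt, aoa_receiver_frame, aoa_global_frame, alpha=0.02):
    """Propagate yaw with the gyroscope, then nudge it towards the drift-free value
    implied by a quasi-static wireless source: yaw ~ aoa_global - aoa_receiver."""
    yaw_gyro = yaw_prev + omega_z * dt                       # dead-reckoned yaw (drifts)
    yaw_wifi = aoa_global_frame - aoa_receiver_frame         # absolute reference from AoA
    err = np.arctan2(np.sin(yaw_wifi - yaw_gyro), np.cos(yaw_wifi - yaw_gyro))
    return yaw_gyro + alpha * err                            # complementary correction
```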
[0054] Fig. 7 depicts the multiple angles of arrival due to the lth access point at the pth and (p-1)th frames of time. Here, during displacement and rotation of the receiver array, the unit vectors’ directions change in the receiver’s reference frame, according to an exemplary implementation of the present disclosure. Further, Fig. 8 depicts the system configuration in 2 dimensions, according to an exemplary implementation of the present disclosure. There are Q transmitters (Access Points, APs) and a receiver with M receiving antenna elements. Between the pth and (p-1)th packets, the receiver gets displaced by d.
[0055] As described above, the AoAs and ToFs are obtained at the receiver, and these can then be used to calculate the steering matrix at any time step in the range k = 0, 1, 2, ..., L-1. These steering matrices are then used to calculate the vectors of complex amplitudes of the wireless signals impinging at the position of the receiver at these time steps.
[0056] For the pth and (p-1)th wireless signal packets, their corresponding complex amplitude vectors at the position of the receiver at the pth and (p-1)th time steps can be written as:

Γp = A+(θ, τ, p)·X_p and Γp-1 = A+(θ, τ, p-1)·X_p-1

where A+ is the pseudo-inverse of the A matrix. When there is a displacement of d from the (p-1)th to the pth time step, the complex amplitudes are related as:

Γp = D·Γp-1

where D is a diagonal matrix of shape N x N that is given by:

D = diag( e^{-j·k·(d·n̂1)}, e^{-j·k·(d·n̂2)}, ..., e^{-j·k·(d·n̂N)} )

Here n̂i are the unit vectors along the ith angle of arrival of the signal in the global frame, as shown in Fig. 7, and d is the displacement of the receiver between time steps p and (p-1), as depicted in Fig. 8.
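An illustrative numpy check of the pseudo-inverse step and of the diagonal phase relation between consecutive complex-amplitude vectors; the exact form of D used here follows the reconstruction above, and the geometry and amplitudes are made up for the example.

```python
import numpy as np

wavelength = 0.06
k = 2.0 * np.pi / wavelength
M, d_elem = 4, wavelength / 2.0
thetas_global = np.deg2rad([-30.0, 10.0, 55.0])       # assumed AoAs in the global frame

def ula_matrix(thetas):
    q = np.arange(M)
    return np.column_stack([np.exp(-1j * k * d_elem * q * np.sin(th)) for th in thetas])

A = ula_matrix(thetas_global)                          # receiver not rotating in this toy case
Gamma_prev = np.array([1.0, 0.5 + 0.2j, 0.3j])         # complex amplitudes at packet p-1

# Displacement of a few millimetres between packets; D_ii = exp(-j k (d . n_i)).
disp = np.array([0.004, 0.003])
n_hat = np.column_stack([np.cos(thetas_global), np.sin(thetas_global)])
D = np.diag(np.exp(-1j * k * (n_hat @ disp)))
Gamma_p = D @ Gamma_prev

# Recover the complex amplitudes from the simulated CSI via the pseudo-inverse A+.
X_prev, X_p = A @ Gamma_prev, A @ Gamma_p
print(np.allclose(np.linalg.pinv(A) @ X_p, D @ (np.linalg.pinv(A) @ X_prev)))   # True
```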
[0057] Fig. 9 shows one method by which the displacement of the receiver array in the global frame is estimated, according to an exemplary implementation of the present disclosure. Fig. 10 shows another method by which the displacement of the receiver array in the receiver’s local frame is estimated, according to an exemplary implementation of the presently claimed subject matter. As described above, the D matrix (904) is further calculated by solving the following minimization problem:

D̂ = argmin_D ‖ Γp − D·Γp-1 ‖²

subject to:

D is diagonal.

This minimization problem can also be solved via minimization modules (612) such as a GPU, CPU or ASIC, as mentioned before. Then S, an (N-1) x 1 vector, is calculated by taking ratios of the elements of the D matrix (904) as follows. This ratio enables us to get rid of the phase offset that gets introduced due to the different clocking frequencies of the transmitter and the receiver:

S = [ D22/D11, D33/D11, ..., DNN/D11 ]^T

Now, using the angles of arrival in the angles of arrival and times of flight estimate (614) in the receiver’s frame of reference, and the bearing angle (908) of the receiver array in the global frame, the R matrix (910) is calculated row by row as:

Rk = -k·[ (cos θk|w − cos θ1|w)   (sin θk|w − sin θ1|w) ]

where Rk is the kth row of the R matrix (910) and θk|w are the AoAs of the kth signal in the global frame of reference. Now it is just a least squares problem of the form:

d̂ = argmin_d ‖ R·d − ∠S ‖²

The d is the displacement (912) of the receiver in the global frame. The displacement calculation can also be done in the local frame of the receiver device. For this purpose, the D matrix (904) and S are computed as described above. However, the R matrix (910) is computed differently. The kth row Rk is calculated in the receiver’s frame of reference as follows:

Rk = -k·[ (cos θk − cos θ1)   (sin θk − sin θ1) ]

Here, θk are the AoAs of the kth signal wavefront in the receiver’s frame. Again, a least squares problem of the form below is solved:

d̂ = argmin_d ‖ R·d − ∠S ‖²

Here, the displacement vector obtained is in the local frame of reference of the receiver. Taking into account the change in orientation of the receiver and the displacement (912) from the (p-1)th to the pth time step, the change in position of the receiver in its frame of reference can be calculated as is done in the technique of Visual Inertial Odometry (VIO). The flowchart is as shown in Fig. 10. Although estimated parameters of impinging wireless signals for two consecutive data packets are used here for performing localization, any pair of received data packets can be used in a similar way for performing localization.
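Continuing in the same illustrative spirit, a sketch of the least-squares displacement recovery from the phases of the D-matrix ratios; all quantities are simulated, and the form of D and of the R rows follows the reconstruction above rather than the original filing verbatim.

```python
import numpy as np

def estimate_displacement(D_diag, thetas_global, k):
    """Recover the 2D displacement d from S_k = D_kk / D_11:
    angle(S_k) = -k * d . (n_k - n_1), solved as a least squares problem R d = angle(S)."""
    S = D_diag[1:] / D_diag[0]                         # (N-1,) ratios; clock phase offset cancels
    n_hat = np.column_stack([np.cos(thetas_global), np.sin(thetas_global)])
    R = -k * (n_hat[1:] - n_hat[0])                    # (N-1, 2) matrix of rows R_k
    d_est, *_ = np.linalg.lstsq(R, np.angle(S), rcond=None)
    return d_est

# Self-contained demo with assumed geometry.
wavelength = 0.06
k = 2.0 * np.pi / wavelength
thetas = np.deg2rad([-30.0, 10.0, 55.0])
true_d = np.array([0.004, 0.003])
n_hat = np.column_stack([np.cos(thetas), np.sin(thetas)])
D_diag = np.exp(-1j * k * (n_hat @ true_d))
print(estimate_displacement(D_diag, thetas, k), true_d)   # the two should match
```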
[0058] Further, Fig. 11 illustrates various components that constitute the tracking module, according to an exemplary implementation of the present disclosure. It consists of a wireless module (1102), gyroscope (1104), accelerometer (1106), magnetic sensor (604) and a visual sensor (1112). It also consists of a processor (1108) or microcontroller for executing digital instructions such as those from ARM, Intel, Microchip and also a memory (1110) being either volatile or non-volatile such as ROM, RAM, Flash memory and such. The module can also consist of one or more cameras, the input data of which can be combined with the position/orientation estimates of the current method to further increase the accuracy. The device can also have special compute units such as GPU, TPU, ASIC, FPGA to assist the central processor in computations.
[0059] Additionally, if multiple transmitters are used, then the above equation is modified to take the corresponding time shifts into account, and those values are then concatenated into the above equation. Thus, the displacement (912) (or equivalently the velocity) of the receiver is calculated. This process is repeated, and the displacements (912) are added on (or the velocity integrated), to track the position of the receiver with time. Additional Kalman filters (906) can be integrated to make the tracking more robust and accurate. The entire process flow is as depicted in Fig. 9 and in Fig. 10, and the components that make up this tracking module are as shown in Fig. 11. This concept can further be extended to 3 dimensions. [0060] Fig. 12 describes an alternative embodiment of the invention, according to an exemplary implementation of the present disclosure. As seen in the previous sections, the angles of arrival and times of flight (614) of the incoming signal wavefronts can be calculated in the frame of reference of the receiver. Here, a signal wavefront is released at an angle θT from the transmitter and it travels in a LOS manner towards the receiver. The wavefront impinges on the receiver array at an angle θR. There are other NLOS wavefronts as well that impinge on the receiver. At its location, the receiver calculates the complex amplitude vectors (902) of all such wavefronts that impinge due to the transmitter (1202). Using the steering matrix obtained from the angle of arrival estimation, it is now possible to resolve the complex amplitude coefficients of the individual impinging wavefronts, as seen in the previous sections. Moreover, by calculating the variance of the time of flight (ToF) and AoA of the signals over the received packets, it can be classified which of those signals is actually LOS. Therefore, once the LOS signal is identified, its complex amplitude ΓLOS can be estimated.
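One simple heuristic (illustrative only; the disclosure does not prescribe a specific classifier) for picking the LOS path is to select the path whose estimated ToF and AoA vary least across packets, for example:

```python
import numpy as np

def pick_los_path(aoa_per_packet, tof_per_packet):
    """aoa_per_packet, tof_per_packet: arrays of shape (num_packets, num_paths).
    Returns the index of the path with the smallest combined, normalised ToF/AoA
    variance, treated here as the LOS candidate (stable paths vary little)."""
    aoa_var = np.var(aoa_per_packet, axis=0)
    tof_var = np.var(tof_per_packet, axis=0)
    score = aoa_var / (aoa_var.max() + 1e-12) + tof_var / (tof_var.max() + 1e-12)
    return int(np.argmin(score))
```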
[0061] Further, for the complex amplitude ΓLOS of the LOS path, the magnitude of the signal strength can be known by taking its absolute value |ΓLOS|. From the directional nature of the antennas, it follows that this signal strength magnitude is the true signal power decay multiplied by the gain factor due to the transmitter and receiver antennas, G = GT(θT)·GR(θR). If this factor is corrected for, it is then possible to obtain the actual signal strength at the position of the receiver due to the transmitter. Therefore, the actual decay in signal strength is given by:

|ΓLOS|corrected = |ΓLOS| / (GR(θR)·GT(θT))

[0062] Further, the information about the AoA (614) at the receiver (1204) is calculated at the receiver, but the receiver has no information about the AoD at the transmitter. The AoD values can be made available at the receiver as described in the following workflow. The receiver to be tracked (Rx) sends out a series of packets addressed to the transmitter (Tx). Because of the large dimensions of the surroundings compared to the small motions encountered between packets (time frame of <1 ms), it can be assumed that the angle does not change much while Rx moves over a collection of P packets. Under those conditions, the quasi-static condition holds on the Tx side and thus AoA estimation algorithms such as MUSIC, ESPRIT and their variants can be employed at the Tx. Also, electromagnetic propagation is reciprocal in nature, therefore the AoA calculated at Tx is approximately equal to the AoD in the previous time step. This AoD is used as θT, whereas the AoA at the receiver (1204) is used as θR. The transmitter can encode information about its correction factor inside the packets already being sent out and hence the receiver can know GT(θT). The receiver Rx computes the angle of arrival θR (614) in its own frame. Moreover, the radiation patterns of the receiver and transmitter antenna arrays are already known via calibration. Therefore, the gain factors GR(θR) and GT(θT) are known at the receiver and used for correction. Now, the modified RSSI value can be computed, and it is approximately equal to 1/r. Using this, r can be estimated, wherein r is the distance between Tx and Rx. So, if the location of Tx is known, Rx localizes itself with respect to Tx; otherwise, relative positioning is performed. In the presence of multiple transmitters, triangulation is performed to determine the location of Rx. If there are multiple receivers to be tracked, each of them performs this position correction turn by turn, while relying on the displacement calculation via phase when there is no LOS or while waiting for its turn to send signals to the transmitters. Moreover, the AoA and AoD are compared and the absolute orientation of the receiver is determined, because the direction of the LOS signal that departs the transmitter maintains its orientation in the global frame till it reaches the receiver. The uncorrected/corrected |ΓLOS| values for LOS paths can even be used for making a probability map using probabilistic techniques such as Simultaneous Localization and Mapping (SLAM) and thus estimating the position of the receiver. Moreover, the present invention can be combined with a visual sensor (1112) to perform a combined SLAM and achieve much better accuracies or enable even novel implementations.
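A small illustrative sketch of the gain correction and range estimate implied above, assuming the calibrated gain functions are available as Python callables and that the corrected LOS magnitude decays as roughly 1/r (free-space amplitude decay); the constant c0 mapping magnitude to metres is a made-up calibration placeholder.

```python
import numpy as np

def corrected_los_magnitude(gamma_los, theta_R, theta_T, G_R, G_T):
    """|Gamma_LOS| corrected for the directional gains of the two antenna arrays."""
    return np.abs(gamma_los) / (G_R(theta_R) * G_T(theta_T))

def estimate_range(gamma_los, theta_R, theta_T, G_R, G_T, c0=1.0):
    """Since the corrected magnitude is ~ c0 / r, invert it to get the Tx-Rx distance r."""
    return c0 / corrected_los_magnitude(gamma_los, theta_R, theta_T, G_R, G_T)

# Placeholder example with isotropic calibrated gains.
iso = lambda theta: 1.0
r = estimate_range(0.2 * np.exp(1j * 1.1), np.deg2rad(30), np.deg2rad(12), iso, iso, c0=1.0)
print(r)   # ~5.0 metres under the assumed calibration constant
```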
[0063] Here, the effect of multipath is removed from the RSSI value, and the field strength correction using the AoA/AoD has been accounted for. This results in accurate (drift-less) estimation of position.
[0064] In an exemplary implementation, a method involves the usage of Wi-Fi as the wireless mode of operation for working indoors, and it relies on the calculated displacements for determining its position. It can be used in an indoor setting, for example, to track an AR/VR headset. Another usage of the method, with an additional SLAM implementation, is in industrial settings for asset or personnel tracking. It can also be used in retail, for example to enable customers to navigate inside malls or shopping complexes, and even in the exhibition and trade fair industry. The above use cases are with routers as transmitters in Wi-Fi. But, as Wi-Fi is similar in protocol to telecommunication standards such as cellular 4G, the presently claimed subject matter can be implemented on 4G as well, providing positioning in all places that receive 4G cellular signals. It can be used for city-wide cm-level positioning, for example in drone path automation or for autonomous vehicles.
[0065] Further, this method works in configuration with Wi-Fi (2.4/5/5.8 GHz), 3G/4G/5G, Bluetooth Low Energy, Ultra-Wide Band (UWB) and even other, non-standard frequencies of communication. It has been devised keeping in mind the 802.11n protocol, but can be similarly modified for other advanced protocols such as 802.11ac, 802.11ad, 802.11ah and 802.11aj. Furthermore, the combination with sensors such as visual/magnetic/LIDAR (possibly acoustic) sensors can further augment the method and other modes of its operation like SLAM (using phase/amplitude/magnetic field data). Further, data from both cellular and Wi-Fi sensors can be used in combination with IMU readings (mainly the gyroscope) to further improve accuracy. [0066] Fig. 13 illustrates a flowchart of a method for position and orientation tracking of multiple mobile devices, according to an exemplary implementation of the presently claimed subject matter.
[0067] At step 1302, obtaining a channel state information (610) by a wireless sensor (606). In another embodiment, the wireless sensor (606) is configured to obtain a channel state information (610) from a wireless receiver (1204).
[0068] At step 1304, obtaining an angular velocity (608) by an inertial sensor (602) and/or a magnetic sensor (604). In another embodiment, the inertial sensor (602) and/or the magnetic sensor (604) are configured to obtain an angular velocity (608).
[0069] At step 1306, fusing the data of the channel state information (610) with the angular velocity (608) by a minimization module (612). In another embodiment, the minimization module (612) is configured to fuse the data of the channel state information (610) with the angular velocity (608).
[0070] At step 1308, determining a plurality of angles of arrival and times of flight (614) of a plurality of electromagnetic signals by the minimization module (612) at the wireless receiver. In another embodiment, the minimization module (612) is configured to determine a plurality of angles of arrival and times of flight of a plurality of electromagnetic signals at the wireless receiver.
[0071] At step 1310, tracking orientation of the wireless receiver by a tracking module (1100) by combining the determined angles of arrival and the obtained angular velocity (608) with a gyroscope input (1104). In another embodiment, the tracking module (1100) is configured to track orientation of the wireless receiver by combining the determined angles of arrival and the obtained angular velocity (608) with a gyroscope input (1104).
[0072] At step 1312, performing displacement calculations with the tracked orientation values and tracking the position of the wireless receiver with the calculated displacement estimates by the tracking module (1100). In another embodiment, the tracking module (1100) is configured to perform the displacement calculation with the tracked orientation and to track the position of the wireless receiver with the calculated displacement estimates. [0073] It should be noted that the description merely illustrates the principles of the present invention. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described herein, embody the principles of the present invention. Furthermore, all the used cases recited herein are principally intended expressly to be only for explanatory purposes to help the reader in understanding the principles of the invention and the concepts contributed by the inventor(s) to furthering the art and are to be construed as being without limitation to such specifically recited used cases and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass equivalents thereof.

Claims

1. A method for position and orientation tracking, said method comprising:
obtaining, by a wireless sensor (606), a channel state information
(610);
obtaining, by an inertial sensor (602) and/or a magnetic sensor
(604), an angular velocity (608);
fusing, by a minimization module (612), the data of the channel state information (610) with the angular velocity (608);
determining, by the minimization module (612), a plurality of angles of arrival and a plurality of times of flight (614) of a plurality of electromagnetic signals at the wireless receiver;
tracking, by a tracking module (1100), orientation of the wireless receiver by combining the determined angles of arrival and the obtained angular velocity (608) with a gyroscope input (1104);
performing, by the tracking module (1100), displacement calculation with the tracked orientation; and
tracking, by the tracking module (1100), position of the wireless receiver with the calculated displacement. 2. The method as claimed in claim 1, wherein the tracking step further comprises implementing a simultaneous localization and mapping (SLAM) technique for creating and using a map of estimated parameters such as angles of arrival, phase and/or amplitude of complex amplitudes of incoming wireless signals and magnetic field data, to determine the absolute position of the wireless receiver.
3. The method as claimed in claim 1, further comprises:
determining, by a transmitter (1202), an angle of departure at the transmitter;
comparing, by the tracking module (1100), the angle of departure and the angles of arrival for determining the absolute orientation of the wireless receiver.
4. The method as claimed in claim 1, wherein the tracking module (1100) further comprises calculating a change in phase of complex amplitudes of the electromagnetic signals occurring between the reception of consecutive data packets arriving at the wireless receiver (1204), wherein the change in phase is related to the displacement (912) of the wireless receiver.
5. The method as claimed in claim 4, wherein the tracking module (1100) further comprises analyzing said change in phase of complex amplitudes of incoming wireless signals between reception of consecutive data packets and change in orientation for performing localization.
6. The method as claimed in claim 1, wherein fusing, by the minimization module (612), the data of the channel state information (610) with the angular velocity (608) takes place via maximum likelihood, expectation maximization and other such variants.
7. The method as claimed in claim 1, wherein the angular velocity (608) is measured by a gyroscope (1104) with the additional input from the magnetic sensor (604) and/or the accelerometer (1106).
8. The method as claimed in claim 1, wherein the angular velocity (608) and/or displacements (912) are measured by a camera with the additional input from the inertial sensor (602) and/or the magnetic sensor (604).
9. The method as claimed in claim 1, wherein the wireless receiver (1204) includes a static receiver, a rotating receiver, a vibrating receiver, a moving receiver and the like.
10. A system for position and orientation tracking, said system further comprises: a wireless sensor (606) configured to obtain a channel state information (610);
an inertial sensor (602) and/or a magnetic sensor (604) configured to obtain an angular velocity (608);
a minimization module (612) configured to:
fuse the data of the channel state information (610) with the angular velocity (608);
determine a plurality of angles of arrival and times of flight (614) of a plurality of electromagnetic signals at the wireless receiver (1204);
a tracking module (1100) configured to:
track orientation of the wireless receiver (1204) by combining the determined angles of arrival and the obtained angular velocity (608) with a gyroscope input (1104);
perform displacement calculation with the tracked orientation; and
track position of the wireless receiver with the calculated displacement. 11. The system as claimed in claim 10, wherein the tracking module is further configured to implement technique of simultaneous localization and mapping (SLAM) to create and use a map of estimated parameters such as angles of arrivals, phase and/or amplitude of complex amplitudes of incoming wireless signals and magnetic field data, to determine absolute position of the wireless receiver.
12. The system as claimed in claim 10, further comprises:
a transmitter (1202) is configured to determine an angle of departure at the transmitter (1202); and
the tracking module (1100) is configured to compare the angle of departure and the angles of arrival to determine the absolute orientation of the wireless receiver (1204).
13. The system as claimed in claim 10, wherein the tracking module (1100) is further configured to calculate a change in phase of the complex amplitudes of electromagnetic signals occurring between the reception of consecutive data packets arriving at the wireless receiver (1204), wherein the change in phase is related to the displacement (912) of the wireless receiver.
14. The system as claimed in claim 13, wherein the tracking module (1100) is configured to analyze the change in phase of complex amplitudes of incoming wireless signals and change in orientation of receiver, between reception of consecutive data packets to perform localization.
15. The system as claimed in claim 10, wherein a gyroscope (1104) is configured to measure the angular velocity (608) with the additional input from the magnetic sensor (604) and/or the accelerometer (1106).
16. The system as claimed in claim 10, wherein a camera is configured to measure the angular velocity (608) and displacements (912) with the additional input from the accelerometer (1106) and/or the magnetic sensor (604).
PCT/IB2019/054944 2018-06-13 2019-06-13 System and method for position and orientation tracking of multiple mobile devices WO2019239365A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201821022131 2018-06-13
IN201821022131 2018-06-13

Publications (2)

Publication Number Publication Date
WO2019239365A1 true WO2019239365A1 (en) 2019-12-19
WO2019239365A4 WO2019239365A4 (en) 2020-03-05

Family

ID=68843042

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2019/054944 WO2019239365A1 (en) 2018-06-13 2019-06-13 System and method for position and orientation tracking of multiple mobile devices

Country Status (1)

Country Link
WO (1) WO2019239365A1 (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111708008A (en) * 2020-05-08 2020-09-25 南京工程学院 Underwater robot single-beacon navigation method based on IMU and TOF
CN112179332A (en) * 2020-09-30 2021-01-05 劢微机器人科技(深圳)有限公司 Hybrid positioning method and system for unmanned forklift
WO2021152513A1 (en) * 2020-01-31 2021-08-05 7hugs Labs SAS Low profile pointing device sensor fusion
CN113219447A (en) * 2021-04-09 2021-08-06 国电南瑞科技股份有限公司 Power transmission line distance stability measuring method based on millimeter wave array
US20210349177A1 (en) * 2020-05-08 2021-11-11 7hugs Labs SAS Low profile pointing device
US11233544B1 (en) 2020-08-17 2022-01-25 Qualcomm Incorporated Methods and apparatus for multipath improvements using multiple antennas
WO2022030160A1 (en) * 2020-08-03 2022-02-10 京セラ株式会社 Electronic device
CN114509069A (en) * 2022-01-25 2022-05-17 南昌大学 Indoor navigation positioning system based on Bluetooth AOA and IMU fusion
CN114543844A (en) * 2021-04-09 2022-05-27 恒玄科技(上海)股份有限公司 Audio playing processing method and device of wireless audio equipment and wireless audio equipment
CN114938490A (en) * 2022-05-16 2022-08-23 中国联合网络通信集团有限公司 Terminal positioning method, device and storage medium
CN115334448A (en) * 2022-08-15 2022-11-11 重庆大学 Accurate dynamic positioning method of unmanned self-following device based on Bluetooth and inertial sensor
EP4117317A1 (en) * 2021-07-01 2023-01-11 Sword Health, S.A. Assessment of position of motion trackers on a subject based on wireless communications
CN116761253A (en) * 2023-08-17 2023-09-15 湘江实验室 UWB weighted positioning method based on triangular area
WO2023184538A1 (en) * 2022-04-02 2023-10-05 Oppo广东移动通信有限公司 Information processing method, and device
EP4286794A1 (en) * 2022-06-03 2023-12-06 Sword Health, S.A. Gyroscope drift estimation and compensation with angle of arrival of electromagnetic waves

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
CHUANG ZHANG ET AL.: "Tracking Angles of Departure and Arrival in a Mobile Millimeter-Wave Channel", 20 December 2015 (2015-12-20), XP032922077 *
DURRANT-WHYTE H. ET AL.: "Simultaneous localization and mapping: part I", IEEE ROBOTICS & AUTOMATION MAGAZINE, vol. 13, no. 2, 5 June 2006 (2006-06-05), pages 99 - 110, XP055304478, DOI: 10.1109/MRA.2006.1638022 *
ELENA BERGAMINI ET AL.: "Estimating Orientation Using Magnetic and Inertial Sensors and Different Sensor Fusion Approaches: Accuracy Assessment in Manual and Locomotion Tasks", SENSORS, vol. 14, 2014, pages 18625 - 18649 *
FAN BINGFEI ET AL.: "An Adaptive Orientation Estimation Method for Magnetic and Inertial Sensors in the Presence of Magnetic Disturbances", 19 May 2017 (2017-05-19), XP055530136, Retrieved from the Internet <URL:https://doi.org/10.3390/s17051161> *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230195242A1 (en) * 2020-01-31 2023-06-22 7hugs Labs SAS Low profile pointing device sensor fusion
WO2021152513A1 (en) * 2020-01-31 2021-08-05 7hugs Labs SAS Low profile pointing device sensor fusion
CN111708008A (en) * 2020-05-08 2020-09-25 南京工程学院 Underwater robot single-beacon navigation method based on IMU and TOF
US20210349177A1 (en) * 2020-05-08 2021-11-11 7hugs Labs SAS Low profile pointing device
WO2022030160A1 (en) * 2020-08-03 2022-02-10 京セラ株式会社 Electronic device
US11233544B1 (en) 2020-08-17 2022-01-25 Qualcomm Incorporated Methods and apparatus for multipath improvements using multiple antennas
WO2022039882A1 (en) * 2020-08-17 2022-02-24 Qualcomm Incorporated Methods and apparatus for multipath improvements using multiple antennas
CN112179332A (en) * 2020-09-30 2021-01-05 劢微机器人科技(深圳)有限公司 Hybrid positioning method and system for unmanned forklift
CN113219447A (en) * 2021-04-09 2021-08-06 国电南瑞科技股份有限公司 Power transmission line distance stability measuring method based on millimeter wave array
CN114543844A (en) * 2021-04-09 2022-05-27 恒玄科技(上海)股份有限公司 Audio playing processing method and device of wireless audio equipment and wireless audio equipment
CN114543844B (en) * 2021-04-09 2024-05-03 恒玄科技(上海)股份有限公司 Audio playing processing method and device of wireless audio equipment and wireless audio equipment
EP4117317A1 (en) * 2021-07-01 2023-01-11 Sword Health, S.A. Assessment of position of motion trackers on a subject based on wireless communications
CN114509069B (en) * 2022-01-25 2023-11-28 南昌大学 Indoor navigation positioning system based on Bluetooth AOA and IMU fusion
CN114509069A (en) * 2022-01-25 2022-05-17 南昌大学 Indoor navigation positioning system based on Bluetooth AOA and IMU fusion
WO2023184538A1 (en) * 2022-04-02 2023-10-05 Oppo广东移动通信有限公司 Information processing method, and device
CN114938490B (en) * 2022-05-16 2024-04-09 中国联合网络通信集团有限公司 Terminal positioning method, device and storage medium
CN114938490A (en) * 2022-05-16 2022-08-23 中国联合网络通信集团有限公司 Terminal positioning method, device and storage medium
EP4286794A1 (en) * 2022-06-03 2023-12-06 Sword Health, S.A. Gyroscope drift estimation and compensation with angle of arrival of electromagnetic waves
CN115334448A (en) * 2022-08-15 2022-11-11 重庆大学 Accurate dynamic positioning method of unmanned self-following device based on Bluetooth and inertial sensor
CN115334448B (en) * 2022-08-15 2024-03-15 重庆大学 Accurate dynamic positioning method of unmanned self-following device based on Bluetooth and inertial sensor
CN116761253A (en) * 2023-08-17 2023-09-15 湘江实验室 UWB weighted positioning method based on triangular area
CN116761253B (en) * 2023-08-17 2023-10-20 湘江实验室 UWB weighted positioning method based on triangular area

Also Published As

Publication number Publication date
WO2019239365A4 (en) 2020-03-05

Similar Documents

Publication Publication Date Title
WO2019239365A1 (en) System and method for position and orientation tracking of multiple mobile devices
US10444324B2 (en) Single node location system and method
CN107003378B (en) Portable electronic device and method for determining geographical position of portable electronic device
KR101478170B1 (en) Sysem and method for position estimation using uplink access point
KR20170082016A (en) Method and apparatus for estimating position in terminal
KR20230087469A (en) radio frequency detection communication
Mahapatra et al. Localization based on RSSI exploiting gaussian and averaging filter in wireless sensor network
WO2019154992A1 (en) Position estimation device and communication device
Koivisto et al. Continuous high-accuracy radio positioning of cars in ultra-dense 5G networks
Schüssel Angle of arrival estimation using WiFi and smartphones
US20210396832A1 (en) Method and Apparatus for Determining the Angle of Departure
US11233544B1 (en) Methods and apparatus for multipath improvements using multiple antennas
Obreja et al. Indoor localization using radio beacon technology
Kumar Performance analysis of RSS-based localization in wireless sensor networks
US11368809B2 (en) Single antenna direction finding and localization
Li et al. CSI-based WiFi-inertial state estimation
Elayan et al. Towards an intelligent deployment of wireless sensor networks
KR101459915B1 (en) Method of Localization
Zhang et al. Localizing backscatters by a single robot with zero start-up cost
Carreño et al. Opportunistic hybrid VLC-IMU positioning
Guo et al. Rss-based localization using a single robot in complex environments
US20210080533A1 (en) Single antenna direction finding and localization
US20220099789A1 (en) Positioning system deployment
KR101751210B1 (en) Position estimation apparatus using uplink access point
US20230400549A1 (en) Enhanced assistance data for radio frequency sensing

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19819470

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19819470

Country of ref document: EP

Kind code of ref document: A1