WO2020087041A1 - Mixed reality device tracking - Google Patents

Mixed reality device tracking Download PDF

Info

Publication number
WO2020087041A1
Authority
WO
WIPO (PCT)
Prior art keywords
mobile device
wearable device
acoustic signal
signal
wearable
Prior art date
Application number
PCT/US2019/058215
Other languages
French (fr)
Inventor
Anran WANG
Laura Cristina Trutoiu
Brian T. Schowengerdt
Nicholas Michael Vallidis
Original Assignee
Magic Leap, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Magic Leap, Inc. filed Critical Magic Leap, Inc.
Publication of WO2020087041A1 publication Critical patent/WO2020087041A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/14 Systems for determining distance or velocity not using reflection or reradiation using ultrasonic, sonic, or infrasonic waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/18 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S2205/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S2205/01 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations specially adapted for specific applications
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/029 Location-based management or tracking services

Definitions

  • This disclosure relates in general to systems and methods for tracking the location of a mobile device, and in particular to systems and methods for tracking the location of a mobile device using a sensor-equipped wearable device.
  • Wearable systems frequently incorporate one or more mobile devices, such as may be held by a user of the wearable system.
  • Such applications can include those in which the mobile device acts as a mouse, pointer, stylus or other input device; and those in which the mobile device is used to indicate a position or orientation of the user’s hand (e.g., so a virtual object, such as a virtual sword, can be aligned to the hand and accordingly presented to the user via a display).
  • the quality of the user’s experience with such a wearable system can depend on the perceived accuracy and latency of the device.
  • applications requiring fine motor input, such as drawing applications, may be rendered useless if the system cannot reliably detect fine movements of the device, and if the results cannot be rendered to the user with a sufficiently low latency.
  • many such applications benefit from physical flexibility of the device; for instance, the usefulness of the device may be limited if the device must remain tethered to another system component (e.g., a host device, such as a wearable head unit); or within range of a fixed base station.
  • wearable systems may have strict limits on power consumption— for example, because size and weight restrictions of wearable devices may limit the size of a battery that may be used. Size and weight restrictions further limit the number and type of components that can be comfortably housed in the wearable device itself.
  • it is desirable for a tracking technique to track a location (e.g., a position and/or orientation) of a mobile device with relatively high accuracy and low latency; without requiring a tether or a base station; and with a minimum of power consumption. It is further desirable for such tracking techniques to operate with a minimum of specialized hardware; for instance, it is desirable for such tracking techniques to operate on a standard smartphone with conventional sensors.
  • a signal is received from a mobile device at a wearable device that comprises a microphone.
  • the signal comprises a first pure tone acoustic signal having a fixed frequency, and a chirp tone acoustic signal, where a frequency of the chirp tone acoustic signal varies with time.
  • Receiving the signal comprises detecting the signal via the microphone.
  • a relative velocity between the mobile device and the wearable device is determined based on the first pure tone acoustic signal.
  • a displacement between the mobile device and the wearable device is determined based on the chirp tone acoustic signal.
  • Sensor data is received at the wearable device from the mobile device.
  • the sensor data is indicative of an orientation of the mobile device with respect to an inertial frame.
  • a location of the mobile device with respect to the wearable device is determined using the relative velocity, the displacement, and the sensor data.
  • FIG. 1 illustrates an example wearable system according to one or more examples of the disclosure.
  • FIG. 2 illustrates an example handheld controller that can be used in conjunction with an example wearable system according to one or more examples of the disclosure.
  • FIG. 3 illustrates an example auxiliary unit that can be used in conjunction with an example wearable system according to one or more examples of the disclosure.
  • FIG. 4 illustrates an example functional block diagram for an example wearable system according to one or more examples of the disclosure.
  • FIG. 5 illustrates an example wearable system including a wearable unit and a remote peripheral according to one or more examples of the disclosure.
  • FIGs. 6A-6C illustrate an example of determining a position and phase of a remote peripheral according to one or more examples of the disclosure.
  • FIGs. 7A-7B illustrate an example of determining a position and orientation of a remote peripheral according to one or more examples of the disclosure.
  • FIG. 1 illustrates an example wearable device 200, which may be a head-mountable system configured to be worn on the head of a user.
  • wearable head unit 200 (which may be, e.g., a wearable augmented reality or mixed reality headgear unit) comprises a display (which may comprise left and right transmissive displays, and associated components for coupling light from the displays to the user’s eyes); left and right acoustic structures (e.g., speakers positioned adjacent to the user’s left and right ears, respectively); one or more sensors such as radar sensors (including transmitting and/or receiving antennas), infrared sensors, accelerometers, gyroscopes, magnetometers, GPS units, inertial measurement units (IMUs), and acoustic sensors; an orthogonal coil electromagnetic receiver (e.g., mounted to the left temple piece); left and right cameras (e.g., depth (time-of-flight) cameras) oriented away from the user; and left and right eye cameras oriented toward the user (e.g., for detecting the user’s eye movements).
  • wearable head unit 200 can incorporate any suitable display technology, and any suitable number, type, or combination of components without departing from the scope of the invention. In some examples, wearable head unit 200 may incorporate one or more microphones configured to detect audio signals generated by the user’s voice; such microphones may be positioned in a wearable head unit adjacent to the user’s mouth.
  • wearable head unit 200 may incorporate networking or wireless features (e.g., Wi-Fi capability, Bluetooth) to communicate with other devices and systems, including other wearable systems.
  • Wearable head unit 200 may further include a battery (which may be mounted in an auxiliary unit, such as a belt pack designed to be worn around a user’s waist), a processor, and a memory.
  • tracking components of wearable head unit 200 may provide input to a processor performing a Simultaneous Localization and Mapping (SLAM) and/or visual odometry algorithm.
  • Wearable head unit 200 may be a first component of a larger wearable system, such as a mixed reality system, that includes additional system components.
  • such a wearable system may also include a handheld controller 300, and/or an auxiliary unit 320, which may be a wearable belt pack, as described further below.
  • FIG. 2 illustrates an example handheld controller component 300 of a wearable system 200.
  • handheld controller 300 includes a grip portion 346 and one or more buttons 350 disposed along a top surface 348.
  • buttons 350 may be configured for use as an optical tracking target, e.g., for tracking six-degree-of-freedom (6DOF) motion of the handheld controller 300, in conjunction with a camera or other optical sensor (which in some examples may be mounted in wearable head unit 200).
  • handheld controller 300 includes tracking components (e.g., an IMU, radar sensors (including transmitting and/or receiving antennas), or other suitable sensors or circuitry), for detecting position or orientation, such as position or orientation relative to a wearable head unit or a belt pack.
  • such tracking components may be positioned in a handle of handheld controller 300 and/or facing out of any surface(s) of the handheld controller 300 (e.g., grip portion 346, top surface 348, and/or bottom surface 352), and/or may be mechanically coupled to the handheld controller.
  • Handheld controller 300 can be configured to provide one or more output signals corresponding to one or more of a pressed state of the buttons; or a position, orientation, and/or motion of the handheld controller 300 (e.g., via an IMU).
  • Such output signals may be used as input to a processor of wearable head unit 200, of handheld controller 300, or of another component of a wearable system (e.g., a wearable mixed reality system).
  • handheld controller 300 can include a processor, a memory, or other suitable computer system components.
  • a processor for example, can be used to execute any suitable process disclosed herein.
  • FIG. 3 illustrates an example auxiliary unit 320 of a wearable system, such as a wearable mixed reality system.
  • the auxiliary unit 320 can include, for example, one or more batteries to provide energy to operate the wearable head unit 200 and/or handheld controller 300, including displays and/or acoustic structures within those components; a processor (which may execute any suitable process disclosed herein); a memory; or any other suitable components of a wearable system.
  • auxiliary unit 320 may be better suited for housing large or heavy components (e.g., batteries), as it may more easily be positioned on parts of a user’s body, such as the waist or back, that are comparatively strong and less easily fatigued by heavy items.
  • sensing and/or tracking components may be positioned in auxiliary unit 320.
  • Such components can include, for instance, one or more IMUs and/or acoustic sensors.
  • the auxiliary unit 320 can use such components to determine the positions and/or orientations (e.g., 6DOF locations) of handheld controller 300; the wearable head unit 200; or the auxiliary unit itself.
  • the example auxiliary unit 320 includes a clip 2128 for attaching the auxiliary unit 320 to a user’s belt.
  • Other form factors are suitable for auxiliary unit 320 and will be apparent, including form factors that do not involve mounting the unit to a user’s belt.
  • auxiliary unit 320 is coupled to the wearable head unit 200 through a multiconduit cable that can, for example, include electrical wires and fiber optics. Wireless connections to and from the auxiliary unit 320 can also be used (e.g., Bluetooth, Wi-Fi, or any other wireless technology).
  • FIG. 4 shows an example functional block diagram that may correspond to an example mixed reality system (e.g., a mixed reality system including one or more of the components described above with respect to FIGs. 1-3).
  • example handheld controller 400B (which may correspond to handheld controller 300 (a “totem”)) includes a totem-to-headgear six degree of freedom (6DOF) totem subsystem 404A and one or more radar sensors 407 (which can include transmitting and/or receiving antennas); and example augmented reality headgear 400A (which may correspond to wearable head unit 200) includes a totem-to-headgear 6DOF headgear subsystem 404B.
  • the 6DOF totem subsystem 404A and the 6DOF headgear subsystem 404B cooperate to determine six coordinates (e.g., offsets in three translation directions and rotation along three axes) of the handheld controller 400B relative to the augmented reality headgear 400A.
  • the six degrees of freedom (6DOF) may be expressed relative to a coordinate system of the headgear 400A.
  • the three translation offsets may be expressed as X, Y, and Z offsets in such a coordinate system, as a translation matrix, or as some other representation.
  • These Cartesian coordinates (e.g., location) can be determined through known radar detection techniques (e.g., as described in further detail below).
  • the rotation degrees of freedom may be expressed as a sequence of yaw, pitch, and roll rotations, as a rotation matrix, as a quaternion, or as some other representation.
  • radar sensor 407 included in handheld controller 400B can comprise an antenna, or an array of multiple antennas, configured to transmit signals having specific radiation patterns (e.g., unique wave polarizations as described below) and at distinct frequencies that can be received by radar sensor 408 in the wearable head unit 400A and used for 6DOF tracking (e.g., as described in further detail below).
  • Additionally, one or more system components (e.g., wearable head unit 400A, handheld controller 400B, and/or auxiliary unit 400C) can include an Inertial Measurement Unit (IMU), accelerometer, gyroscope, or other sensor that can enhance orientation tracking, such as described below.
  • the wearable head unit 400A; one or more depth cameras 444 (and/or one or more non-depth cameras) included in the wearable head unit 400A; and/or one or more optical targets (e.g., buttons 350 of handheld controller 400B as described above, or dedicated optical targets included in the handheld controller 400B) can be used for 6DOF tracking.
  • the handheld controller 400B can include a camera, as described above; and the wearable head unit 400A can include an optical target for optical tracking in conjunction with the camera.
  • coordinates may need to be transformed from a local coordinate space (e.g., a coordinate space fixed relative to wearable head unit 400A) to an inertial coordinate space (e.g., a coordinate space fixed relative to the real environment); such transformations may be necessary for a display of wearable head unit 400A to present a virtual object at an expected position and orientation relative to the real environment (e.g., a virtual person sitting in a real chair, facing forward, regardless of the headgear’s position and orientation), rather than at a fixed position and orientation on the display (e.g., at the same position in the right lower corner of the display), to preserve the illusion that the virtual object exists in the real environment (and does not, for example, appear positioned unnaturally in the real environment as the wearable head unit 400A shifts and rotates).
  • a compensatory transformation between coordinate spaces can be determined by processing imagery from the depth cameras 444 using a SLAM and/or visual odometry procedure in order to determine the transformation of the headgear relative to a coordinate system.
  • the depth cameras 444 are coupled to a SLAM/visual odometry block 406 and can provide imagery to block 406.
  • SLAM/visual odometry block 406 implementation can include a processor configured to process this imagery and determine a position and orientation of the user’s head, which can then be used to identify a transformation between a head coordinate space and a real coordinate space.
  • an additional source of information on the user's head pose and location is obtained from IMU 409 (or another suitable sensor, such as an accelerometer or gyroscope).
  • Information from IMU 409 can be integrated with information from the SLAM/visual odometry block 406 to provide improved accuracy and/or more timely information on rapid adjustments of the user’s head pose and position.
  • the depth cameras 444 can supply 3D imagery to a hand gesture tracker 411, which may be implemented in a processor of wearable head unit 400A.
  • the hand gesture tracker 411 can identify a user’s hand gestures, for example by matching 3D imagery received from the depth cameras 444 to stored patterns representing hand gestures. Other suitable techniques of identifying a user’s hand gestures will be apparent.
  • one or more processors 416 may be configured to receive data from the wearable head unit’s headgear subsystem 404B, the radar sensor 408, the IMU 409, the SLAM/visual odometry block 406, depth cameras 444, a microphone 450; and/or the hand gesture tracker 411.
  • the processor 416 can also send and receive control signals from the totem system 404A.
  • the processor 416 may be coupled to the totem system 404A wirelessly, such as in examples where the handheld controller 400B is untethered.
  • Processor 416 may further communicate with additional components, such as an audio-visual content memory 418, a Graphical Processing Unit (GPU) 420, and/or a Digital Signal Processor (DSP) audio spatializer 422.
  • the DSP audio spatializer 422 may be coupled to a Head Related Transfer Function (HRTF) memory 425.
  • the GPU 420 can include a left channel output coupled to the left source of imagewise modulated light 424 and a right channel output coupled to the right source of imagewise modulated light 426. GPU 420 can output stereoscopic image data to the sources of imagewise modulated light 424, 426.
  • the DSP audio spatializer 422 can output audio to a left speaker 412 and/or a right speaker 414.
  • the DSP audio spatializer 422 can receive input from processor 419 indicating a direction vector from a user to a virtual sound source (which may be moved by the user, e.g., via the handheld controller 320).
  • the DSP audio spatializer 422 can determine a corresponding HRTF (e.g., by accessing a HRTF, or by interpolating multiple HRTFs). The DSP audio spatializer 422 can then apply the determined HRTF to an audio signal, such as an audio signal corresponding to a virtual sound generated by a virtual object. This can enhance the believability and realism of the virtual sound, by incorporating the relative position and orientation of the user relative to the virtual sound in the mixed reality environment—that is, by presenting a virtual sound that matches a user’s expectations of what that virtual sound would sound like if it were a real sound in a real environment.
  • auxiliary unit 400C may include a battery 427 to power its components and/or to supply power to another system component, such as wearable head unit 400A and/or handheld controller 400B.
  • FIG. 4 presents elements corresponding to various components of an example mixed reality system
  • various other suitable arrangements of these components will become apparent to those skilled in the art.
  • elements presented in FIG. 4 as being associated with auxiliary unit 400C could instead be associated with wearable head unit 400A and/or handheld controller 400B.
  • some mixed reality systems may forgo entirely a handheld controller 400B or auxiliary unit 400C.
  • Such changes and modifications are to be understood as being included within the scope of the disclosed examples.
  • As described above, it is desirable to track a location (e.g., a position and/or orientation) of a mobile device (e.g., handheld controller 300 or auxiliary unit 320 described above) relative to another component of the wearable system (e.g., wearable head unit 200, auxiliary unit 320) with relatively high accuracy and low latency, without requiring a tether or a base station, and with a minimum of power consumption; and it is desirable for such tracking techniques to operate with a minimum of specialized hardware, for instance on a standard smartphone with conventional sensors.
  • Acoustic sensors and inertial measurement units are examples of sensors that can be used to achieve the above advantages. Such sensors may be readily available and may be included in a variety of standard devices, such as smartphones. These sensors can be used to determine a position and/or orientation of a mobile device; for example, an acoustic sensor comprising an array of microphones (a receiver) can be used to determine a displacement (e.g., using 3D acoustic localization) between the microphone array and the source of an acoustic signal (a transmitter) detected by the microphone array. And an IMU can output one or more sensor data values (e.g., outputs of an accelerometer, gyroscope, a magnetometer, and/or other sensors) corresponding to the orientation of the IMU with respect to an inertial frame.
  • acoustic sensors and IMUs face challenges that limit the accuracy of such determinations.
  • acoustic sensors operate by transmitting and/or receiving sound waves. These sound waves may be difficult to reliably transmit and detect; for example, sound waves from the environment may interfere with a desired acoustic signal.
  • acoustic signals can reflect off of surfaces in the environment, creating multiple paths along which the signals can travel from a transmitter to a receiver, making it difficult to determine a time-of-flight (and thus a displacement value) corresponding to the line of sight between the transmitter and the receiver.
  • acoustic signals can be obstructed by objects and surfaces that lie between the transmitter and the receiver.
  • the time-of-flight of an acoustic signal can be dependent on environmental variations, such as fluctuations in the ambient room temperature, which can affect the speed of sound in air.
  • acoustic sensors typically cannot reliably determine an orientation (rather than a position) of a source of an acoustic signal.
  • While IMUs can be used to provide orientation data, their usefulness as position sensors is highly limited.
  • FIG. 5 illustrates an example wearable system 500, in use by a user 501, that can use one or more acoustic sensors (e.g., microphones) and one or more IMUs to determine a location (e.g., position and orientation) of a mobile device relative to a head unit.
  • Example system 500 includes a head unit 510 (which may correspond to wearable head unit 200) worn by user 501, and a mobile device 520 held by user 501.
  • mobile device 520 may be a specialized handheld device such as handheld controller 300 described above.
  • mobile device 520 may be a smartphone equipped with suitable sensors and components.
  • head unit 510 may include an array of one or more microphones 512 (e.g., four microphones) and one or more speakers 514, such as described above with respect to wearable head unit 200; and mobile device 520 may include an array of microphones 522 and one or more speakers 524.
  • Mobile device 520 may transmit one or more acoustic signals 530, which signals are received by head unit 510.
  • speaker 524 of mobile device 520 may transmit acoustic signals 530, which acoustic signals are received by microphone array 512 of head unit 510.
  • head unit 510 may transmit one or more acoustic signals 538, which signals are received by mobile device 520.
  • speaker 514 of head unit 510 may transmit acoustic signals 538, which acoustic signals are received by microphone array 522 of mobile device 520.
  • Each of acoustic signals 530 and/or 538 may be a composite signal comprising two or more individual component signals.
  • acoustic signals 530 and/or 538 may be processed by wearable system 500 to determine a position of mobile device 520 relative to head unit 510.
  • head unit 510 includes an IMU 516
  • mobile device 520 includes an IMU 526. IMU 516 and IMU 526 output data corresponding to the orientation of head unit 510 and mobile device 520, respectively, relative to an inertial frame.
  • Head unit 510 and/or mobile device 520 may each include one or more processors (e.g., CPUs, GPUs, DSPs) and/or one or more memories for performing the operations below based on acoustic signals 530, and the outputs of various sensors of each device (e.g., IMUs 516 and 526).
  • the operations below can be performed partially or entirely by head unit 510.
  • the operations below can be performed partially or entirely by mobile device 520.
  • the operations below can be performed partially or entirely by a separate host device (not shown), such as a network server or a host computer.
  • Some techniques for overcoming limitations of acoustic sensors use a first acoustic signal for determining a relative velocity between a transmitter and a receiver (e.g., by tracking changes in the phase of the signal as it travels between transmitter and receiver); and a second acoustic signal for determining the time-of-flight (and thus the corresponding distance) of the signal between the transmitter and the receiver.
  • the first acoustic signal can be used to determine changes in displacement (e.g., using the determined relative velocity between the transmitter and receiver), while distance tracking based on the second acoustic signal can be used to track absolute displacement.
  • phase tracking can exhibit relatively high accuracy (e.g., accuracy to within three millimeters in some examples), but is generally limited to tracking changes in displacement, not absolute displacement.
  • While absolute displacement can be estimated by integrating changes in displacement, such measurements may be unstable, and highly vulnerable to noise, drift, and other sources of error.
  • While absolute displacement tracking permits measuring absolute displacement, it generally exhibits lower accuracy than phase tracking, and is more subject to noise (e.g., from reflections of acoustic signals against surfaces that are not along a direct line of sight between transmitter and receiver).
  • FIGs. 6A-6C illustrate an example of acoustic signals 530 that may be transmitted from mobile device 520 to head unit 510 of wearable system 500.
  • speaker 524 of mobile device 520 transmits an acoustic signal 530 comprising two component signals: a chirp tone acoustic signal 532, and a pure tone acoustic signal 534.
  • a chirp tone acoustic signal is a sinusoidal signal with a frequency that is modulated as a function of time.
  • the frequency of chirp signal 532 could increase linearly from a first frequency f0 to a second frequency f1 between a first time t0 and a second time t1.
  • a pure tone acoustic signal is a sinusoidal signal wherein the frequency remains constant during at least a portion (e.g., a portion between a first time t0 and a second time t1) of the signal.
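  • To make the two component signals concrete, the sketch below synthesizes one frame of a composite signal in the spirit of signal 530: a linear FMCW chirp plus a fixed-frequency pure tone. The sample rate, frame length, tone frequency, and mixing weights are illustrative assumptions of this sketch, not values taken from the disclosure.

```python
import numpy as np
from scipy.signal import chirp

FS = 48_000      # sample rate in Hz (assumed)
FRAME = 0.040    # 40 ms per frame (assumed)

def make_composite_frame(f0=15_000.0, f1=21_000.0, f_tone=22_000.0):
    """One frame of a composite acoustic signal: chirp (cf. 532) plus pure tone (cf. 534)."""
    t = np.arange(int(FS * FRAME)) / FS
    chirp_532 = chirp(t, f0=f0, t1=FRAME, f1=f1, method="linear")  # frequency sweeps f0 -> f1
    tone_534 = np.sin(2.0 * np.pi * f_tone * t)                    # fixed-frequency tone
    return 0.5 * chirp_532 + 0.5 * tone_534                        # equal-weight mix (assumed)
```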
  • both chirp tone signal 532 and pure tone signal 534 are detected by microphone array 512 of head unit 510.
  • FIG. 6B shows chirp tone signal 532 transmitted by speaker 524 of mobile device 520 and detected by microphone array 512 of head unit 510, which outputs a signal 532' (not shown in FIG. 6B) corresponding to a detected version of chirp tone signal 532.
  • Chirp tone signal 532 can be used to compute an absolute displacement between head unit 510 and mobile device 520.
  • chirp tone signal 532 adheres to a template signal that is available to head unit 510; an autocorrelation can be performed between detected chirp tone signal 532' and transmitted chirp tone signal 532, according to techniques known in the art.
  • Peaks of the resulting autocorrelation signal can be identified; the locations of such peaks can correspond to the time-of-flight of chirp tone signal 532 from the transmitted location (speaker 524) to the receiving location (microphone array 512). From this time-of-flight, a displacement between the transmitted location and the receiving location can be computed (e.g., using a known velocity of chirp tone signal 532). Function 542 of FIG. 6B can represent such displacement over time.
  • Chirp tone signal 532 preferably comprises a signal with strong autocorrelation properties, such that off-peak autocorrelation coefficients are minimized; this facilitates the computation of the time-of-flight of chirp tone signal 532.
  • chirp tone signal 532 comprises a frequency-modulated continuous wave (FMCW), which in some examples comprises frequencies in a range of approximately 15-22 KHz.
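  • A minimal sketch of the time-of-flight step described above, assuming the receiver already holds the transmitted chirp as a template and that frame timing between the devices has been handled (e.g., via the clock-drift correction discussed later). The speed of sound is treated as a constant here although, as noted above, it varies with temperature.

```python
import numpy as np
from scipy.signal import correlate

SPEED_OF_SOUND = 343.0  # m/s near room temperature (assumed constant here)

def displacement_from_chirp(received, template, fs):
    """Estimate absolute displacement (cf. function 542) from the detected chirp 532'."""
    corr = correlate(received, template, mode="full")          # cross-correlate capture with template
    lag = int(np.argmax(np.abs(corr))) - (len(template) - 1)   # strongest peak -> delay in samples
    tof = max(lag, 0) / fs                                     # time of flight in seconds
    return tof * SPEED_OF_SOUND                                # displacement in meters
```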
  • FIG. 6C shows pure tone signal 534 transmitted by speaker 524 of mobile device 520 and detected by microphones 512 of head unit 510, which outputs a signal 534' (not shown in FIG. 6C) corresponding to a detected version of pure tone signal 534.
  • Pure tone signal 534 can be used to compute a relative velocity (e.g., relative change of displacement) between head unit 510 and mobile device 520. As mobile device 520 moves relative to head unit 510, their relative velocity can correspond to a phase offset dφ (e.g., ranging between zero and π) that is introduced to pure tone signal 534 as it propagates from speaker 524 of mobile device 520 to microphones 512 of head unit 510.
  • phase offset can be the result of Doppler effects caused by the relative motion of mobile device 520 and head unit 510 (e.g., by a change in time-of-flight caused by the relative motion).
  • pure tone signal 534 adheres to a template signal that is available to head unit 510.
  • the phase offset dφ can be determined by comparing received pure tone signal 534' to the template signal, for example according to techniques known in the art to determine a phase offset between two pure tone signals. From this information, a relative velocity between head unit 510 and mobile device 520 can be determined. For instance, phase offset dφ can be compared to a fixed wavelength of pure tone signal 534, according to techniques known in the art, to determine a relative velocity as a function of time (e.g., function 544 in FIG. 6C).
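  • The sketch below illustrates one way the phase-to-velocity step could be implemented: demodulate the received tone at its nominal frequency, unwrap the residual phase, and scale the phase rate by the wavelength. The block-averaging low-pass filter, the sign convention, and the assumption that the tone has already been isolated from the rest of the capture are simplifications of this sketch, not details from the disclosure.

```python
import numpy as np

def velocity_from_tone(received, f_tone, fs, c=343.0):
    """Estimate relative velocity (cf. function 544) from the detected pure tone 534'."""
    t = np.arange(len(received)) / fs
    baseband = received * np.exp(-2j * np.pi * f_tone * t)   # shift the tone to DC
    n = max(1, int(fs * 0.005))                               # 5 ms averaging blocks (assumed)
    usable = len(baseband) // n * n
    blocks = baseband[:usable].reshape(-1, n).mean(axis=1)    # crude low-pass filter
    phase = np.unwrap(np.angle(blocks))                       # accumulated phase offset
    dphi_dt = np.gradient(phase, n / fs)                      # phase rate in rad/s
    wavelength = c / f_tone
    return -dphi_dt * wavelength / (2.0 * np.pi)              # radial velocity in m/s (sign assumed)
```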
  • Neither function 542, indicating an absolute displacement as a function of time as described above, nor function 544, indicating a relative velocity as a function of time as described above, is independently ideal for determining a position of mobile device 520 relative to head unit 510.
  • function 542 can indicate an absolute displacement of mobile device 520 with respect to head unit 510, but is likely to contain noisy data with a high level of jitter—due in part to acoustic reflections, which can obscure the actual time-of-flight of an acoustic signal along the line of sight between transmitter and receiver.
  • While function 544 may generally be smoother and less noisy than function 542 (e.g., because changes in phase are typically less affected by environmental interference and reflections), function 544 cannot independently provide a reliable absolute position of mobile device 520 with respect to head unit 510. However, a combination of function 542 and function 544 can provide a more reliable estimate of that position than either function alone.
  • a Kalman filter 550 may be used to combine function 542 and function 544 to produce function 562, indicating the absolute position of mobile device 520 with respect to head unit 510 with less noise than present in function 542.
  • Kalman filter 550 can operate recursively: as new values of function 542 and function 544 become available, Kalman filter 550 updates function 562, representing a current estimate of the absolute position of mobile device 520 with respect to head unit 510, according to its update equation.
  • Kalman filter 550 may be a multi-hypothesis Kalman filter; a multi-hypothesis Kalman filter can address errors resulting from multiple peaks in an autocorrelation of chirp tone signal 532 (e.g., which may result from acoustic reflections, or multiple paths traveled by chirp tone signal 532 between the transmitter location and the receiver location).
  • older hypotheses e.g., previous values of function 542 and function 544
  • this technique can result in a relatively low-noise estimate of the absolute position of mobile device 520 with respect to head unit 510, as represented by function 562; further, this technique can operate with linear complexity, promoting the scalability of the described technique.
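  • As a rough illustration of how functions 542 and 544 might be fused, the sketch below implements a one-dimensional, single-hypothesis Kalman filter in which the phase-derived velocity drives the prediction step and the noisy chirp-derived displacement is the measurement. The disclosure describes a multi-hypothesis filter to cope with multipath peaks; handling multiple hypotheses, and the noise parameters chosen here, are beyond this sketch.

```python
class DisplacementKalman:
    """Single-hypothesis 1-D sketch in the spirit of Kalman filter 550.

    State: absolute displacement (cf. function 562). The velocity from the pure
    tone (cf. function 544) drives the prediction step; the chirp-based
    displacement (cf. function 542) is the noisy measurement.
    """
    def __init__(self, x0=0.0, p0=1.0, q=1e-4, r=1e-2):
        self.x, self.p = x0, p0      # estimate and its variance
        self.q, self.r = q, r        # process / measurement noise (assumed values)

    def step(self, velocity, displacement_meas, dt):
        # predict: integrate the phase-derived velocity
        self.x += velocity * dt
        self.p += self.q
        # update: correct with the chirp-derived absolute displacement
        k = self.p / (self.p + self.r)
        self.x += k * (displacement_meas - self.x)
        self.p *= (1.0 - k)
        return self.x                # current estimate (cf. function 562)
```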
  • IMU 516 provides an orientation of head unit 510
  • IMU 526 provides an orientation of mobile device 520, with respect to an inertial frame.
  • the difference between the output of IMU 526 and the output of IMU 516 can provide function 564, which indicates an orientation of mobile device 520, as a function of time, relative to head unit 510.
  • Function 564 can indicate the orientation of mobile device 520 along multiple rotation axes (e.g., pitch, roll, yaw). Function 564 can be combined with function 562 to provide a six-degree-of-freedom (6DOF) location 580 of the position (e.g., along X, Y, and Z axes) and orientation (e.g., pitch, roll, yaw) of mobile device 520 relative to head unit 510.
  • 6DOF location 580 can be represented, such as in a memory of head unit 510 and/or mobile device 520, in any suitable format (e.g., one or more vectors, matrices, and/or quaternions).
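  • One plausible way to assemble such a 6DOF location, assuming the acoustic pipeline has already been resolved to a 3-vector position in the head unit’s frame and that both IMUs report orientation as quaternions, is sketched below using a 4x4 homogeneous transform as the output representation (any of the representations mentioned above would do); the use of scipy and these conventions are choices of this sketch.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def six_dof_location(pos_head_to_mobile, q_head_inertial, q_mobile_inertial):
    """Assemble a 6DOF location (cf. 580) as a 4x4 homogeneous transform.

    pos_head_to_mobile: position of the mobile device in the head unit's frame
    (assumed already resolved from the acoustic pipeline to a 3-vector).
    q_*_inertial: IMU orientations as [x, y, z, w] quaternions.
    """
    r_head = R.from_quat(q_head_inertial)
    r_mobile = R.from_quat(q_mobile_inertial)
    r_rel = r_head.inv() * r_mobile          # orientation of mobile w.r.t. head (cf. function 564)
    pose = np.eye(4)
    pose[:3, :3] = r_rel.as_matrix()
    pose[:3, 3] = pos_head_to_mobile         # position (cf. function 562)
    return pose
```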
  • the 6DOF location 580 can then be made available to various software and/or hardware components of wearable system 500, or of external systems.
  • a software application executing on one or more processors of head unit 510 and/or mobile device 520 can query a current 6DOF location 580, for example via an application programming interface (API), which 6DOF indication is communicated (e.g., as a matrix) to the application in response.
  • the 6DOF location 580 can be used to determine an absolute position of mobile device 520 with respect to an inertial frame.
  • wearable system 500 can comprise tracking components (e.g., a GPS unit and/or IMU 516 of head unit 510) configured to output a position and/or orientation of the wearable system 500 relative to the inertial frame.
  • 6DOF location 580 can be combined with this output, using techniques known in the art, to determine the absolute position of mobile device 520 with respect to the inertial frame.
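  • If both the head unit’s inertial-frame pose and the 6DOF location 580 are kept as 4x4 homogeneous transforms (an assumption of this sketch, not a requirement of the disclosure), the combination referred to above reduces to a single matrix product:

```python
import numpy as np

def mobile_pose_in_inertial_frame(pose_inertial_head, pose_head_mobile):
    """Place the mobile device in the inertial frame by composing the two transforms."""
    # pose_inertial_head: head unit pose from GPS/IMU/SLAM; pose_head_mobile: 6DOF location 580
    return pose_inertial_head @ pose_head_mobile   # both are 4x4 numpy arrays
```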
  • In some examples, a Kalman filter 570 (e.g., a basic (single-hypothesis) Kalman filter) can be used to combine function 562 (e.g., the output of Kalman filter 550, based on function 542 and function 544) with function 564 (e.g., indicating an orientation of mobile device 520 based on the output of IMU 526).
  • transmitted signal 530 may need to be synchronized with its received version (e.g., signal 530 as detected by microphone array 512) in order for one or more of the above techniques to be performed.
  • chirp tone signal 532 may be generated by speaker 524 of mobile device 520, and detected as chirp tone signal 532' by microphone array 512 of head unit 510; signal 532 may then be autocorrelated using signal 532' (i.e., a time-delayed version of chirp tone signal 532).
  • Because signal 532 and signal 532' may be transmitted and received on different devices, with independent clocks—for example, signal 532 may be transmitted by mobile device 520, and signal 532' may be received by head unit 510—the synchronization of these signals (and, therefore, the reliability of the autocorrelation and the position calculations based on that autocorrelation) may be subject to the drift of the clocks of the respective devices.
  • a technique for correcting clock drift is based on a likelihood that any drift in the clocks of the two devices (i.e., mobile device 520 and head unit 510) is symmetric—that is, drift of a first clock (e.g., a clock of mobile device 520) will be offset by a drift of equal magnitude and opposite direction of a second clock (e.g., a clock of head unit 510). That is, the drift of the first clock as measured by head unit 510 can be expected to be of equal and opposite magnitude to the drift of the second clock as measured by mobile device 520.
  • a difference between the two clocks can be estimated, and corrected for, by transmitting reciprocal acoustic signals.
  • speaker 524 of mobile device 520 can transmit a first acoustic signal 530 to head unit 510.
  • speaker 514 of head unit 510 can transmit a second acoustic signal 538 to mobile device 520.
  • the first acoustic signal 530 and the second acoustic signal 538 may comprise pure tone signals of the same frequency; accordingly, either head unit 510 or mobile device 520 can determine a phase difference between the two signals. This phase difference can correspond to the difference between the clocks of the respective devices.
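  • The symmetry argument above can be captured in a few lines: if the phase measured at the head unit is the propagation phase plus the clock offset, and the phase measured at the mobile device is the propagation phase minus the same offset, then half the difference isolates the clock term. The sign convention and the reduction to a single scalar phase per device are assumptions of this sketch.

```python
def split_clock_and_propagation(phase_at_head, phase_at_mobile):
    """Separate clock offset from propagation phase using reciprocal tones (radians)."""
    clock_offset = 0.5 * (phase_at_head - phase_at_mobile)   # drift term (equal and opposite on each side)
    propagation = 0.5 * (phase_at_head + phase_at_mobile)    # shared propagation term
    return clock_offset, propagation
```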
  • the mobile device 520 may transmit sensor data from IMU 526 to the head unit 510 over one or more communication networks, such as Bluetooth, Wi-Fi, and other radio communication networks. For instance, the mobile device 520 may transmit data from IMU 526 indicative of an orientation of the mobile device 520 with respect to an inertial frame to the head unit 510 over one or more communication networks. In some examples, the head unit 510 may leverage such data received from the mobile device 520 over one or more communication networks in conjunction with sensor data from IMU 516 to determine an orientation of mobile device 520, as a function of time, relative to head unit 510.
  • the array of microphones 522 may receive second acoustic signal 538 transmitted by the one or more speakers 514.
  • the second acoustic signal 538 received by the array of microphones 522 may be analyzed by the mobile device 520, the head unit 510, or a combination thereof to determine a phase difference between the first acoustic signal 530 and the second acoustic signal 538.
  • the head unit 510 may measure the drift of the first clock based on the first acoustic signal 530 received by the array of one or more microphones 512, while the mobile device 520 may measure the drift of the second clock based on the second acoustic signal 538 received by the array of microphones 522 and subsequently transmit data indicative of the measured drift of the second clock to the head unit 510 over one or more communication networks.
  • the head unit 510 may leverage such data in conjunction with the measured drift of the first clock to calculate a phase difference between the first acoustic signal 530 and the second acoustic signal 538.
  • the head unit 510 and mobile device 520 may conduct one or more of the aforementioned communications over a wired connection instead of or in addition to the one or more aforementioned communication networks. It is to be understood that the head unit 510 and mobile device 520 may exchange additional information with each other over one or more communication networks, a wired connection, or a combination thereof.
  • Some examples of the disclosure are directed to a method of locating a mobile device, the method comprising: receiving, from the mobile device, at a wearable device comprising a microphone, a signal comprising: a first pure tone acoustic signal having a fixed frequency, and a chirp tone acoustic signal, a frequency of the chirp tone acoustic signal varying with time, wherein receiving the signal comprises detecting the signal via the microphone; determining, based on the first pure tone acoustic signal, a relative velocity between the mobile device and the wearable device; determining, based on the chirp tone acoustic signal, a displacement between the mobile device and the wearable device; receiving, at the wearable device from the mobile device, sensor data indicative of an orientation of the mobile device with respect to an inertial frame; and determining, using the determined relative velocity, the determined displacement, and the sensor data, a location of the mobile device with respect to the wearable device.
  • the relative velocity is determined using a phase offset associated with the first pure tone acoustic signal detected at the microphone of the wearable device.
  • the determined location comprises a six-degree-of-freedom representation of the mobile device with respect to the wearable device.
  • the method further comprises: in response to receiving the signal, transmitting to the mobile device a second pure tone acoustic signal having a fixed frequency equal to the fixed frequency of the first pure tone acoustic signal; and determining, using a phase difference between the first pure tone acoustic signal and the second pure tone acoustic signal, a clock drift, wherein the displacement between the mobile device and the wearable device is determined using the clock drift.
  • determining the location of the mobile device comprises computing a first relative displacement estimate via a first Kalman filter using the relative velocity and the displacement as inputs to the first Kalman filter.
  • determining the location of the mobile device further comprises computing a second relative displacement estimate via a second Kalman filter using the first relative displacement estimate and the sensor data as inputs to the second Kalman filter.
  • the mobile device is carried by a user of the wearable device.
  • the wearable device comprises a wearable headset device.
  • the wearable device comprises a belt pack.
  • the mobile device comprises a smartphone.
  • the mobile device comprises a handheld controller device. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the mobile device comprises a belt pack. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the fixed frequency is in the range of approximately 15 KHz to 22 KHz.
  • Some examples of the disclosure are directed to a wearable device comprising: a microphone; and one or more processors coupled to the microphone of the wearable device, wherein the wearable device is configured to: receive, from a mobile device, a signal comprising: a first pure tone acoustic signal having a fixed frequency, and a chirp tone acoustic signal, the frequency of the chirp tone acoustic signal varying with time, wherein receiving the signal comprises detecting the signal via the microphone of the wearable device, and receive, from the mobile device, sensor data indicative of an orientation of the mobile device with respect to an inertial frame; and wherein the one or more processors are configured to: determine, based on the first pure tone acoustic signal, a relative velocity between the mobile device and the wearable device; determine, based on the chirp tone acoustic signal, a displacement between the mobile device and the wearable device; and determine, using the determined relative velocity, the determined displacement, and the sensor data, a location of the mobile device with respect to the wearable device.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

Systems and methods of locating a mobile device are disclosed. According to a disclosed method, a signal is received from a mobile device at a wearable device that comprises a microphone. The signal comprises a first pure tone acoustic signal having a fixed frequency, and a chirp tone acoustic signal, where a frequency of the chirp tone acoustic signal varies with time. Receiving the signal comprises detecting the signal via the microphone. A relative velocity between the mobile device and the wearable device is determined based on the first pure tone acoustic signal. A displacement between the mobile device and the wearable device is determined based on the chirp tone acoustic signal. Sensor data is received at the wearable device from the mobile device. The sensor data is indicative of an orientation of the mobile device with respect to an inertial frame. A location of the mobile device with respect to the wearable device is determined using the relative velocity, the displacement, and the sensor data.

Description

REFERENCE TO RELATED APPLICATION
[0001] This application claims priority to U.S. Provisional Application No. 62/751,455, filed on October 26, 2018, the contents of which are incorporated by reference herein in their entirety.
FIELD
[0002] This disclosure relates in general to systems and methods for tracking the location of a mobile device, and in particular to systems and methods for tracking the location of a mobile device using a sensor-equipped wearable device.
BACKGROUND
[0003] Wearable systems frequently incorporate one or more mobile devices, such as may be held by a user of the wearable system. For some applications of wearable systems, it can be desirable to track the position and orientation of a mobile device with respect to another component of the wearable system, or with respect to a user of the system. Such applications can include those in which the mobile device acts as a mouse, pointer, stylus or other input device; and those in which the mobile device is used to indicate a position or orientation of the user’s hand (e.g., so a virtual object, such as a virtual sword, can be aligned to the hand and accordingly presented to the user via a display).
[0004] As with many input devices, the quality of the user’s experience with such a wearable system can depend on the perceived accuracy and latency of the device. For example, applications requiring fine motor input, such as drawing applications, may be rendered useless if the system cannot reliably detect fine movements of the device, and if the results cannot be rendered to the user with a sufficiently low latency. In addition, many such applications benefit from physical flexibility of the device; for instance, the usefulness of the device may be limited if the device must remain tethered to another system component (e.g., a host device, such as a wearable head unit); or within range of a fixed base station.
(Tracking a device in certain mobile and/or outdoor applications, in particular, may preclude the use of tethers or fixed base stations.) Moreover, wearable systems may have strict limits on power consumption— for example, because size and weight restrictions of wearable devices may limit the size of a battery that may be used. Size and weight restrictions further limit the number and type of components that can be comfortably housed in the wearable device itself.
[0005] Accordingly, it is desirable for a tracking technique to track a location (e.g., a position and/or orientation) of a mobile device with relatively high accuracy and low latency; without requiring a tether or a base station; and with a minimum of power consumption. It is further desirable for such tracking techniques to operate with a minimum of specialized hardware; for instance, it is desirable for such tracking techniques to operate on a standard smartphone with conventional sensors.
BRIEF SUMMARY
[0006] Examples of the disclosure describe systems and methods of locating a mobile device. According to a disclosed method, a signal is received from a mobile device at a wearable device that comprises a microphone. The signal comprises a first pure tone acoustic signal having a fixed frequency, and a chirp tone acoustic signal, where a frequency of the chirp tone acoustic signal varies with time. Receiving the signal comprises detecting the signal via the microphone. A relative velocity between the mobile device and the wearable device is determined based on the first pure tone acoustic signal. A displacement between the mobile device and the wearable device is determined based on the chirp tone acoustic signal. Sensor data is received at the wearable device from the mobile device. The sensor data is indicative of an orientation of the mobile device with respect to an inertial frame. A location of the mobile device with respect to the wearable device is determined using the relative velocity, the displacement, and the sensor data.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] FIG. 1 illustrates an example wearable system according to one or more examples of the disclosure. [0008] FIG. 2 illustrates an example handheld controller that can be used in conjunction with an example wearable system according to one or more examples of the disclosure.
[0009] FIG. 3 illustrates an example auxiliary unit that can be used in conjunction with an example wearable system according to one or more examples of the disclosure.
[0010] FIG. 4 illustrates an example functional block diagram for an example wearable system according to one or more examples of the disclosure.
[0011] FIG. 5 illustrates an example wearable system including a wearable unit and a remote peripheral according to one or more examples of the disclosure.
[0012] FIGs. 6A-6C illustrate an example of determining a position and phase of a remote peripheral according to one or more examples of the disclosure.
[0013] FIGs. 7A-7B illustrate an example of determining a position and orientation of a remote peripheral according to one or more examples of the disclosure.
DETAILED DESCRIPTION
[0014] In the following description of examples, reference is made to the accompanying drawings which form a part hereof, and in which it is shown by way of illustration specific examples that can be practiced. It is to be understood that other examples can be used and structural changes can be made without departing from the scope of the disclosed examples.
[0015] EXAMPLE WEARABLE SYSTEM
[0016] FIG. 1 illustrates an example wearable device 200, which may be a head-mountable system configured to be worn on the head of a user. In the example shown, wearable head unit 200 (which may be, e.g., a wearable augmented reality or mixed reality headgear unit) comprises a display (which may comprise left and right transmissive displays, and associated components for coupling light from the displays to the user’s eyes); left and right acoustic structures (e.g., speakers positioned adjacent to the user’s left and right ears, respectively); one or more sensors such as radar sensors (including transmitting and/or receiving antennas), infrared sensors, accelerometers, gyroscopes, magnetometers, GPS units, inertial
measurement units (IMU), acoustic sensors; an orthogonal coil electromagnetic receiver (e.g., mounted to the left temple piece); left and right cameras (e.g., depth (time-of-flight) cameras) oriented away from the user; and left and right eye cameras oriented toward the user (e.g., for detecting the user’s eye movements). However, wearable head unit 200 can incorporate any suitable display technology, and any suitable number, type, or combination of components without departing from the scope of the invention. In some examples, wearable head unit 200 may incorporate one or more microphones configured to detect audio signals generated by the user’s voice; such microphones may be positioned in a wearable head unit adjacent to the user’s mouth. In some examples, wearable head unit 200 may incorporate networking or wireless features (e.g., Wi-Fi capability, Bluetooth) to communicate with other devices and systems, including other wearable systems. Wearable head unit 200 may further include a battery (which may be mounted in an auxiliary unit, such as a belt pack designed to be worn around a user’s waist), a processor, and a memory. In some examples, tracking components of wearable head unit 200 may provide input to a processor performing a Simultaneous Localization and Mapping (SLAM) and/or visual odometry algorithm. Wearable head unit 200 may be a first component of a larger wearable system, such as a mixed reality system, that includes additional system components. In some examples, such a wearable system may also include a handheld controller 300, and/or an auxiliary unit 320, which may be a wearable belt pack, as described further below.
[0017] FIG. 2 illustrates an example handheld controller component 300 of a wearable system 200. In some examples, handheld controller 300 includes a grip portion 346 and one or more buttons 350 disposed along a top surface 348. In some examples, buttons 350 may be configured for use as an optical tracking target, e.g., for tracking six-degree-of-freedom (6DOF) motion of the handheld controller 300, in conjunction with a camera or other optical sensor (which in some examples may be mounted in wearable head unit 200). In some examples, handheld controller 300 includes tracking components (e.g., an IMU, radar sensors (including transmitting and/or receiving antennas), or other suitable sensors or circuitry), for detecting position or orientation, such as position or orientation relative to a wearable head unit or a belt pack. In some examples, such tracking components may be positioned in a handle of handheld controller 300 and/or facing out of any surface(s) of the handheld controller 300 (e.g., grip portion 346, top surface 348, and/or bottom surface 352), and/or may be mechanically coupled to the handheld controller. Handheld controller 300 can be configured to provide one or more output signals corresponding to one or more of a pressed state of the buttons; or a position, orientation, and/or motion of the handheld controller 300 (e.g., via an IMU). Such output signals may be used as input to a processor of wearable head unit 200, of handheld controller 300, or of another component of a wearable system (e.g., a wearable mixed reality system). Such input may correspond to a position, orientation, and/or movement of the handheld controller (and, by extension, to a position, orientation, and/or movement of a hand of a user holding the controller). Such input may also correspond to a user pressing buttons 350. In some examples, handheld controller 300 can include a processor, a memory, or other suitable computer system components. A processor, for example, can be used to execute any suitable process disclosed herein.
[0018] FIG. 3 illustrates an example auxiliary unit 320 of a wearable system, such as a wearable mixed reality system. The auxiliary unit 320 can include, for example, one or more batteries to provide energy to operate the wearable head unit 200 and/or handheld controller 300, including displays and/or acoustic structures within those components; a processor (which may execute any suitable process disclosed herein); a memory; or any other suitable components of a wearable system. Compared to wearable head units (e.g., wearable head unit 200) or handheld units (e.g., handheld controller 300), auxiliary unit 320 may be better suited for housing large or heavy components (e.g., batteries), as it may more easily be positioned on parts of a user’s body, such as the waist or back, that are comparatively strong and less easily fatigued by heavy items.
[0019] In some examples, sensing and/or tracking components may be positioned in auxiliary unit 320. Such components can include, for instance, one or more IMUs and/or acoustic sensors. In some examples, the auxiliary unit 320 can use such components to determine the positions and/or orientations (e.g., 6DOF locations) of handheld controller 300; the wearable head unit 200; or the auxiliary unit itself. As shown, the example auxiliary unit 320 includes a clip 2128 for attaching the auxiliary unit 320 to a user’s belt. Other form factors are suitable for auxiliary unit 320 and will be apparent, including form factors that do not involve mounting the unit to a user’s belt. In some examples, auxiliary unit 320 is coupled to the wearable head unit 200 through a multiconduit cable that can, for example, include electrical wires and fiber optics. Wireless connections to and from the auxiliary unit 320 can also be used (e.g., Bluetooth, Wi-Fi, or any other wireless technology).
[0020] FIG. 4 shows an example functional block diagram that may correspond to an example mixed reality system (e.g., a mixed reality system including one or more of the components described above with respect to FIGs. 1-3). As shown in FIG. 4, example handheld controller 400B (which may correspond to handheld controller 300 (a “totem”)) includes a totem-to-headgear six degree of freedom (6DOF) totem subsystem 404A and one or more radar sensors 407 (which can include transmitting and/or receiving antennas); and example augmented reality headgear 400A (which may correspond to wearable head unit 200) includes a totem-to-headgear 6DOF headgear subsystem 404B. In the example, the 6DOF totem subsystem 404A and the 6DOF headgear subsystem 404B cooperate to determine six coordinates (e.g., offsets in three translation directions and rotation along three axes) of the handheld controller 400B relative to the augmented reality headgear 400A. The six degrees of freedom (6DOF) may be expressed relative to a coordinate system of the headgear 400A. The three translation offsets may be expressed as X, Y, and Z offsets in such a coordinate system, as a translation matrix, or as some other representation. These Cartesian coordinates (e.g., location) can be determined through known radar detection techniques (e.g., as described in further detail below). The rotation degrees of freedom may be expressed as a sequence of yaw, pitch, and roll rotations, as a rotation matrix, as a quaternion, or as some other representation.
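By way of a non-limiting illustration, the following Python sketch shows one way such a 6DOF result could be held in memory, with the three translation offsets stored as a vector and the rotation stored as a quaternion that can be converted to a rotation matrix or to a yaw, pitch, and roll sequence; the class name, field names, and use of the SciPy rotation utilities are illustrative assumptions rather than elements of the disclosure.

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    class Pose6DOF:
        """One possible representation of the controller's pose in the headgear's
        coordinate system: three translation offsets plus a rotation stored as a
        quaternion (x, y, z, w), convertible to the other forms mentioned above."""
        def __init__(self, x, y, z, quat_xyzw):
            self.translation = np.array([x, y, z], dtype=float)
            self.rotation = R.from_quat(quat_xyzw)

        def as_matrix(self):
            # 4x4 homogeneous transform combining rotation and translation.
            m = np.eye(4)
            m[:3, :3] = self.rotation.as_matrix()
            m[:3, 3] = self.translation
            return m

        def yaw_pitch_roll(self):
            # Rotation expressed as an intrinsic yaw-pitch-roll (Z-Y-X) sequence, in radians.
            return self.rotation.as_euler("ZYX")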
[0021] In some examples, radar sensor 407 included in handheld controller 400B can comprise an antenna, or an array of multiple antennas, configured to transmit signals having specific radiation patterns (e.g., unique wave polarizations as described below) and at distinct frequencies that can be received by radar sensor 408 in the wearable head unit 400A and used for 6DOF tracking (e.g., as described in further detail below). Additionally, one or more system components (e.g., wearable head unit 400A, handheld controller 400B, and/or auxiliary unit 400C) can include an Inertial Measurement Unit (IMU), accelerometer, gyroscope, or other sensor that can enhance orientation tracking, such as described below.
[0022] In some examples, the wearable head unit 400A; one or more depth cameras 444 (and/or one or more non-depth cameras) included in the wearable head unit 400A; and/or one or more optical targets (e.g., buttons 350 of handheld controller 400B as described above, or dedicated optical targets included in the handheld controller 400B) can be used for 6DOF tracking. In some examples, the handheld controller 400B can include a camera, as described above; and the wearable head unit 400A can include an optical target for optical tracking in conjunction with the camera.
[0023] In some examples, it may become necessary to transform coordinates from a local coordinate space (e.g., a coordinate space fixed relative to wearable head unit 400A) to an inertial coordinate space (e.g., a coordinate space fixed relative to the real environment). For instance, such transformations may be necessary for a display of wearable head unit 400A to present a virtual object at an expected position and orientation relative to the real environment (e.g., a virtual person sitting in a real chair, facing forward, regardless of the headgear’s position and orientation), rather than at a fixed position and orientation on the display (e.g., at the same position in the right lower corner of the display), to preserve the illusion that the virtual object exists in the real environment (and does not, for example, appear positioned unnaturally in the real environment as the wearable head unit 400A shifts and rotates). In some examples, a compensatory transformation between coordinate spaces can be determined by processing imagery from the depth cameras 444 using a SLAM and/or visual odometry procedure in order to determine the transformation of the headgear relative to a coordinate system. In the example shown in FIG. 4, the depth cameras 444 are coupled to a SLAM/visual odometry block 406 and can provide imagery to block 406. The SLAM/visual odometry block 406 implementation can include a processor configured to process this imagery and determine a position and orientation of the user’s head, which can then be used to identify a transformation between a head coordinate space and a real coordinate space. Similarly, in some examples, an additional source of information on the user's head pose and location is obtained from IMU 409 (or another suitable sensor, such as an accelerometer or gyroscope). Information from IMU 409 can be integrated with information from the SLAM/visual odometry block 406 to provide improved accuracy and/or more timely information on rapid adjustments of the user’s head pose and position.
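As a minimal sketch of the compensatory transformation described above (assuming poses are represented as 4x4 homogeneous matrices, and using illustrative matrix names that are not taken from the disclosure), the object-in-headgear pose can be obtained by composing the inverse of the SLAM-estimated head pose with the object's pose in the real-environment coordinate space:

    import numpy as np

    def object_in_head_frame(T_world_head, T_world_object):
        """Given the headgear pose estimated by SLAM/visual odometry (head-to-world,
        4x4 homogeneous) and a virtual object anchored in the real environment
        (object-to-world), return the object's pose in the headgear's coordinate
        space for rendering."""
        return np.linalg.inv(T_world_head) @ T_world_object

    # Usage: as the headgear moves, T_world_head changes every frame while the
    # virtual chair's T_world_object stays fixed, so the chair appears stationary
    # in the room rather than pinned to the display.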
[0024] In some examples, the depth cameras 444 can supply 3D imagery to a hand gesture tracker 411, which may be implemented in a processor of wearable head unit 400A. The hand gesture tracker 411 can identify a user’s hand gestures, for example by matching 3D imagery received from the depth cameras 444 to stored patterns representing hand gestures. Other suitable techniques of identifying a user’s hand gestures will be apparent.
[0025] In some examples, one or more processors 416 may be configured to receive data from the wearable head unit’s headgear subsystem 404B, the radar sensor 408, the IMU 409, the SLAM/visual odometry block 406, depth cameras 444, a microphone 450; and/or the hand gesture tracker 411. The processor 416 can also send and receive control signals from the totem system 404A. The processor 416 may be coupled to the totem system 404A wirelessly, such as in examples where the handheld controller 400B is untethered. Processor 416 may further communicate with additional components, such as an audio-visual content memory 418, a Graphical Processing Unit (GPU) 420, and/or a Digital Signal Processor (DSP) audio spatializer 422. The DSP audio spatializer 422 may be coupled to a Head Related Transfer Function (HRTF) memory 425. The GPU 420 can include a left channel output coupled to the left source of imagewise modulated light 424 and a right channel output coupled to the right source of imagewise modulated light 426. GPU 420 can output stereoscopic image data to the sources of imagewise modulated light 424, 426. The DSP audio spatializer 422 can output audio to a left speaker 412 and/or a right speaker 414. The DSP audio spatializer 422 can receive input from processor 419 indicating a direction vector from a user to a virtual sound source (which may be moved by the user, e.g., via the handheld controller 320). Based on the direction vector, the DSP audio spatializer 422 can determine a corresponding HRTF (e.g., by accessing an HRTF, or by interpolating multiple HRTFs). The DSP audio spatializer 422 can then apply the determined HRTF to an audio signal, such as an audio signal corresponding to a virtual sound generated by a virtual object. This can enhance the believability and realism of the virtual sound by incorporating the position and orientation of the user relative to the virtual sound in the mixed reality environment— that is, by presenting a virtual sound that matches a user’s expectations of what that virtual sound would sound like if it were a real sound in a real environment.
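A minimal sketch of such direction-based spatialization is given below; it assumes a hypothetical bank of head-related impulse responses keyed by azimuth and elevation, uses nearest-neighbor selection in place of interpolation, and assumes a particular head-frame axis convention, none of which is mandated by the disclosure.

    import numpy as np
    from scipy.signal import fftconvolve

    def spatialize(mono_audio, direction, hrir_bank):
        """Apply a head-related impulse response chosen from a hypothetical bank
        keyed by (azimuth, elevation) in degrees. Nearest-neighbor selection
        stands in for HRTF interpolation; azimuth wrap-around is ignored here.
        direction: unit vector from the user's head to the virtual sound source,
                   in the head frame (x forward, y left, z up is assumed).
        hrir_bank: dict mapping (azimuth, elevation) to (left_ir, right_ir) arrays."""
        azimuth = np.degrees(np.arctan2(direction[1], direction[0]))
        elevation = np.degrees(np.arcsin(np.clip(direction[2], -1.0, 1.0)))
        key = min(hrir_bank, key=lambda k: (k[0] - azimuth) ** 2 + (k[1] - elevation) ** 2)
        left_ir, right_ir = hrir_bank[key]
        # Convolve the mono source with each ear's impulse response.
        return fftconvolve(mono_audio, left_ir), fftconvolve(mono_audio, right_ir)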
[0026] In some examples, such as shown in FIG. 4, one or more of processor 416, GPU 420, DSP audio spatializer 422, HRTF memory 425, and audio/visual content memory 418 may be included in an auxiliary unit 400C (which may correspond to auxiliary unit 320 described above). The auxiliary unit 400C may include a battery 427 to power its components and/or to supply power to another system component, such as wearable head unit 400A and/or handheld controller 400B. Including such components in an auxiliary unit, which can be mounted to a user’s waist, can limit the size and weight of wearable head unit 400A, which can in turn reduce fatigue of a user’s head and neck.
[0027] While FIG. 4 presents elements corresponding to various components of an example mixed reality system, various other suitable arrangements of these components will become apparent to those skilled in the art. For example, elements presented in FIG. 4 as being associated with auxiliary unit 400C could instead be associated with wearable head unit 400A and/or handheld controller 400B. Furthermore, some mixed reality systems may forgo entirely a handheld controller 400B or auxiliary unit 400C. Such changes and modifications are to be understood as being included within the scope of the disclosed examples.

[0028] MOBILE DEVICE TRACKING
[0029] For many applications of wearable systems such as described above, it can be desirable to track the position and orientation of a mobile device (e.g., handheld controller 300 or auxiliary unit 320 described above) with respect to another component of the wearable system (e.g., wearable head unit 200, auxiliary unit 320), or to a user of the system. Further, as described above, it can be desirable for a tracking technique to track a location (e.g., a position and/or orientation) of a mobile device with relatively high accuracy and low latency; without requiring a tether or a base station; and with a minimum of power consumption. It is further desirable for such tracking techniques to operate with a minimum of specialized hardware; for instance, it is desirable for such tracking techniques to operate on a standard smartphone with conventional sensors.
[0030] Acoustic sensors and inertial measurement units (IMUs) are examples of sensors that can be used to achieve the above advantages. Such sensors may be readily available and may be included in a variety of standard devices, such as smartphones. These sensors can be used to determine a position and/or orientation of a mobile device; for example, an acoustic sensor comprising an array of microphones (a receiver) can be used to determine a displacement (e.g., using 3D acoustic localization) between the microphone array and the source of an acoustic signal (a transmitter) detected by the microphone array. And an IMU can output one or more sensor data values (e.g., outputs of an accelerometer, a gyroscope, a magnetometer, and/or other sensors) corresponding to the orientation of the IMU with respect to an inertial frame.
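As a non-limiting illustration of 3D acoustic localization with a small microphone array, the following Python sketch estimates a direction toward the acoustic source from time differences of arrival under a far-field (plane-wave) assumption; the microphone geometry, the assumed speed of sound, and the function names are assumptions made for illustration only.

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s; assumed constant (ambient temperature affects this)

    def estimate_direction(mic_positions, tdoas):
        """Far-field direction-of-arrival estimate from time differences of arrival.
        mic_positions: (N, 3) microphone coordinates in the array frame.
        tdoas: (N-1,) arrival-time differences of mics 1..N-1 relative to mic 0, seconds.
        Returns a unit vector pointing from the array toward the acoustic source."""
        baselines = mic_positions[1:] - mic_positions[0]        # (N-1, 3)
        # Plane-wave model: c * tdoa_i is approximately -(r_i - r_0) . u
        rhs = -SPEED_OF_SOUND * np.asarray(tdoas)
        u, *_ = np.linalg.lstsq(baselines, rhs, rcond=None)
        return u / np.linalg.norm(u)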
[0031] However, used independently, acoustic sensors and IMUs face challenges that limit the accuracy of such determinations. For example, acoustic sensors operate by transmitting and/or receiving sound waves. These sound waves may be difficult to reliably transmit and detect; for example, sound waves from the environment may interfere with a desired acoustic signal. Further, acoustic signals can reflect off of surfaces in the environment, creating multiple paths along which the signals can travel from a transmitter to a receiver, making it difficult to determine a time-of-flight (and thus a displacement value) corresponding to the line of sight between the transmitter and the receiver. Similarly, acoustic signals can be obstructed by objects and surfaces that lie between the transmitter and the receiver. In addition, the time-of-flight of an acoustic signal can be dependent on environmental variations, such as fluctuations in the ambient room temperature, which can affect the speed of sound in air. These challenges may be magnified when multiple sound transmitters (e.g., speakers) and multiple sound receivers (e.g., microphones) are present. Further, acoustic sensors typically cannot reliably determine an orientation (rather than a position) of a source of an acoustic signal. And while IMUs can be used to provide orientation data, their usefulness as position sensors is highly limited. However, when data from an acoustic sensor is used for both phase tracking and distance measurement, and when this data is fused with data from an IMU, as described below, the result can provide a more robust indication of position and orientation than would be achievable using either type of sensor alone.
[0032] FIG. 5 illustrates an example wearable system 500, in use by a user 501, that can use one or more acoustic sensors (e.g., microphones) and one or more IMUs to determine a location (e.g., position and orientation) of a mobile device relative to a head unit. Example system 500 includes a head unit 510 (which may correspond to wearable head unit 200) worn by user 501, and a mobile device 520 held by user 501. In some examples, mobile device 520 may be a specialized handheld device such as handheld controller 300 described above.
In some examples, mobile device 520 may be a smartphone equipped with suitable sensors and components. In example system 500, head unit 510 may include an array of one or more microphones 512 (e.g., four microphones) and one or more speakers 514, such as described above with respect to wearable head unit 200; and mobile device 520 may include an array of microphones 522 and one or more speakers 524. Mobile device 520 may transmit one or more acoustic signals 530, which signals are received by head unit 510. For example, speaker 524 of mobile device 520 may transmit acoustic signals 530, which acoustic signals are received by microphone array 512 of head unit 510. Similarly, head unit 510 may transmit one or more acoustic signals 538, which signals are received by mobile device 520. For example, speaker 514 of head unit 510 may transmit acoustic signals 538, which acoustic signals are received by microphone array 522 of mobile device 520. Each of acoustic signals 530 and/or 538 may be a composite signal comprising two or more individual component signals. As described herein, acoustic signals 530 and/or 538 may be processed by wearable system 500 to determine a position of mobile device 520 relative to head unit 510. In addition, in the example shown, head unit 510 includes an IMU 516, and mobile device 520 includes an IMU 526. IMU 516 and IMU 526 output data corresponding to the orientation of head unit 510 and mobile device 520, respectively, relative to an inertial frame.
[0033] Head unit 510 and/or mobile device 520 may each include one or more processors (e.g., CPUs, GPUs, DSPs) and/or one or more memories for performing the operations below based on acoustic signals 530 and the outputs of various sensors of each device (e.g., IMUs 516 and 526). In some examples, the operations below can be performed partially or entirely by head unit 510. In some examples, the operations below can be performed partially or entirely by mobile device 520. In some examples, the operations below can be performed partially or entirely by a separate host device (not shown), such as a network server or a host computer.
[0034] Some techniques for overcoming limitations of acoustic sensors, such as those described above, use a first acoustic signal for determining a relative velocity between a transmitter and a receiver (e.g., by tracking changes in the phase of the signal as it travels between transmitter and receiver); and a second acoustic signal for determining the time-of-flight (and thus the corresponding distance) of the signal between the transmitter and the receiver. The first acoustic signal can be used to determine changes in displacement (e.g., using the determined relative velocity between the transmitter and receiver), while distance tracking based on the second acoustic signal can be used to track absolute displacement. Generally speaking, phase tracking can exhibit relatively high accuracy (e.g., accuracy to within three millimeters in some examples), but is generally limited to tracking changes in displacement, not absolute displacement. (While absolute displacement can be estimated by integrating changes in displacement, such measurements may be unstable, and highly vulnerable to noise, drift, and other sources of error.) In comparison, while absolute displacement tracking permits measuring absolute displacement, it generally exhibits lower accuracy than phase tracking, and is more subject to noise (e.g., from reflections of acoustic signals against surfaces that are not along a direct line of sight between transmitter and receiver).
[0035] FIGs. 6A-6C illustrate an example of acoustic signals 530 that may be transmitted from mobile device 520 to head unit 510 of wearable system 500. In the example, as shown in FIG. 6A, speaker 524 of mobile device 520 transmits an acoustic signal 530 comprising two component signals: a chirp tone acoustic signal 532, and a pure tone acoustic signal 534. As used herein, a chirp tone acoustic signal is a sinusoidal signal with a frequency that is modulated as a function of time. For example, the frequency of chirp signal 532 could increase linearly from a first frequency f₀ to a second frequency f₁ between a first time t₀ and a second time t₁. A pure tone acoustic signal is a sinusoidal signal wherein the frequency remains constant during at least a portion (e.g., a portion between a first time t₀ and a second time t₁) of the signal. As shown in FIG. 6A, both chirp tone signal 532 and pure tone signal 534 are detected by microphone array 512 of head unit 510.
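For illustration, a composite frame containing both component signals could be synthesized as follows; the sample rate, frame duration, tone frequency, and chirp band are assumed values chosen only to be consistent with the approximate 15-22 KHz range mentioned herein, not values required by the disclosure.

    import numpy as np

    FS = 48_000          # sample rate in Hz (assumed)
    F_TONE = 18_000.0    # pure tone frequency in Hz (assumed)
    F0, F1 = 15_000.0, 22_000.0   # chirp start/end frequencies in Hz (assumed)
    DURATION = 0.05      # seconds per frame (assumed)

    def composite_signal():
        t = np.arange(int(FS * DURATION)) / FS
        # Linear chirp: instantaneous frequency sweeps from F0 to F1 over the frame.
        k = (F1 - F0) / DURATION
        chirp = np.sin(2 * np.pi * (F0 * t + 0.5 * k * t**2))
        # Pure tone: constant frequency throughout the frame.
        tone = np.sin(2 * np.pi * F_TONE * t)
        return chirp + tone   # transmitted together as one composite acoustic signal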
[0036] FIG. 6B shows chirp tone signal 532 transmitted by speaker 524 of mobile device 520 and detected by microphone array 512 of head unit 510, which outputs a signal 532' (not shown in FIG. 6B) corresponding to a detected version of chirp tone signal 532. Chirp tone signal 532 can be used to compute an absolute displacement between head unit 510 and mobile device 520. In some examples, chirp tone signal 532 adheres to a template signal that is available to head unit 510; an autocorrelation can be performed between detected chirp tone signal 532' and transmitted chirp tone signal 532, according to techniques known in the art. Peaks of the resulting autocorrelation signal can be identified; the locations of such peaks can correspond to the time-of-flight of chirp tone signal 532 from the transmitting location (speaker 524) to the receiving location (microphone array 512). From this time-of-flight, a displacement between the transmitting location and the receiving location can be computed (e.g., using a known velocity of chirp tone signal 532). Function 542 of FIG. 6B can represent such displacement over time. Chirp tone signal 532 preferably comprises a signal with strong autocorrelation properties, such that off-peak autocorrelation coefficients are minimized; this facilitates the computation of the time-of-flight of chirp tone signal 532. In some examples, chirp tone signal 532 comprises a frequency-modulated continuous wave (FMCW), which in some examples comprises frequencies in a range of approximately 15-22 KHz.
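A minimal sketch of this time-of-flight computation is shown below; it assumes the received samples have already been synchronized to the moment of transmission (clock synchronization is discussed later in this disclosure), and it simply takes the strongest correlation peak, whereas a practical system may need to handle multiple peaks caused by reflections.

    import numpy as np
    from scipy.signal import correlate

    def estimate_displacement(received, template, fs, speed_of_sound=343.0):
        """Estimate transmitter-receiver distance from one received chirp frame.
        received: samples captured by a microphone, assumed to start at the
                  (synchronized) instant the chirp was emitted.
        template: the known transmitted chirp (the template signal above)."""
        corr = correlate(received, template, mode="full")
        # In 'full' mode, index i corresponds to a lag of i - (len(template) - 1) samples.
        lag = np.argmax(np.abs(corr)) - (len(template) - 1)
        time_of_flight = lag / fs
        return time_of_flight * speed_of_sound   # meters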
[0037] FIG. 6C shows pure tone signal 534 transmitted by speaker 524 of mobile device 520 and detected by microphones 512 of head unit 510, which outputs a signal 534' (not shown in FIG. 6C) corresponding to a detected version of pure tone signal 534. Pure tone signal 534 can be used to compute a relative velocity (e.g., relative change of displacement) between head unit 510 and mobile device 520. As mobile device 520 moves relative to head unit 510, their relative velocity can correspond to a phase offset Δφ (e.g., ranging between zero and π) that is introduced to pure tone signal 534 as it propagates from speaker 524 of mobile device 520 to microphones 512 of head unit 510. This phase offset can be the result of Doppler effects caused by the relative motion of mobile device 520 and head unit 510 (e.g., by a change in time-of-flight caused by the relative motion). In some examples, pure tone signal 534 adheres to a template signal that is available to head unit 510. The phase offset Δφ can be determined by comparing received pure tone signal 534' to the template signal, for example according to techniques known in the art to determine a phase offset between two pure tone signals. From this information, a relative velocity between head unit 510 and mobile device 520 can be determined. For instance, phase offset Δφ can be compared to a fixed wavelength of pure tone signal 534, according to techniques known in the art, to determine a relative velocity as a function of time (e.g., function 544 in FIG. 6C).
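The following sketch illustrates one way a per-frame phase measurement could be converted into a relative velocity; the single-bin phase estimate, the sign convention relating phase change to path-length change, and the assumption that the chirp component has been filtered out are all simplifications made for illustration.

    import numpy as np

    def phase_of_tone(frame, fs, f_tone):
        """Phase of the pure-tone component of one received frame, via a single
        DFT bin at the tone frequency (a real system would first suppress the
        chirp component and noise)."""
        n = np.arange(len(frame))
        ref = np.exp(-2j * np.pi * f_tone * n / fs)
        return np.angle(np.sum(frame * ref))

    def relative_velocity(phase_prev, phase_curr, frame_period, f_tone, c=343.0):
        wavelength = c / f_tone
        dphi = np.angle(np.exp(1j * (phase_curr - phase_prev)))   # wrap to (-pi, pi]
        # A growing path length delays the tone, i.e. retards its phase.
        delta_distance = -dphi / (2 * np.pi) * wavelength
        return delta_distance / frame_period   # m/s (positive = moving apart)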
[0038] Neither function 542, indicating an absolute displacement as a function of time as described above, nor function 544, indicating a relative velocity as a function of time as described above, is independently ideal for determining a position of mobile device 520 relative to head unit 510. For example, function 542 can indicate an absolute displacement of mobile device 520 with respect to head unit 510, but is likely to contain noisy data with a high level of jitter— due in part to acoustic reflections, which can obscure the actual time-of-flight of an acoustic signal along the line of sight between transmitter and receiver. Conversely, while function 544 may generally be smoother and less noisy than function 542 (e.g., because changes in phase are typically less affected by environmental interference and reflections), function 544 cannot independently provide a reliable absolute position of mobile device 520 with respect to head unit 510. However, a combination of function 542 (representing absolute displacement tracking) and function 544 (representing relative velocity) can together provide what neither can provide individually: a reliable, low-noise measurement of the absolute position of mobile device 520 with respect to head unit 510.
[0039] Various techniques are known in the art for combining the absolute displacement indicated by function 542 with the change in displacement indicated by function 544. In some examples, such as shown in FIG. 7A, a Kalman filter 550 may be used to combine function 542 and function 544 to produce function 562, indicating the absolute position of mobile device 520 with respect to head unit 510 with less noise than present in function 542. For example, Kalman filter 550 can operate recursively to update function 562 based on an equation:
[0040]    x̂(t) = x̂(t−1) + K · ( P − x̂(t−1) )

[0041] In the above equation describing Kalman filter 550, x̂(t) represents an estimated absolute position (e.g., described by function 562); x̂(t−1) represents a previously estimated absolute position (e.g., described by function 562); P represents a measured absolute position (e.g., described by function 542); and K represents a Kalman gain factor (e.g., which may be described by, or determined based on, function 544). As new sets of measurement data (e.g., which may be represented by function 542 and function 544) arrive, Kalman filter 550 updates function 562, representing a current estimate of the absolute position of mobile device 520 with respect to head unit 510, accordingly. In some examples, Kalman filter 550 may be a multi-hypothesis Kalman filter; a multi-hypothesis Kalman filter can address errors resulting from multiple peaks in an autocorrelation of chirp tone signal 532 (e.g., which may result from acoustic reflections, or multiple paths traveled by chirp tone signal 532 between the transmitter location and the receiver location). In some examples, older hypotheses (e.g., previous values of function 542 and function 544) can be removed from Kalman filter 550 according to statistical variance; that is, those hypotheses with the largest statistical variance can be removed from the Kalman filter. As will be familiar to those skilled in the art, this technique can result in a relatively low-noise estimate of the absolute position of mobile device 520 with respect to head unit 510, as represented by function 562; further, this technique can operate with linear complexity, promoting the scalability of the described technique.
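For illustration, a single-hypothesis scalar version of such a filter, fusing the chirp-based distance measurement with the phase-based velocity, might look as follows; the noise parameters and the use of the velocity as a prediction input are assumptions of this sketch, and a multi-hypothesis variant as described above would additionally maintain and prune several such estimates.

    class ScalarKalman:
        """Minimal single-hypothesis scalar Kalman filter that fuses a noisy
        absolute-displacement measurement (from the chirp) with a smooth
        velocity measurement (from the pure-tone phase). q and r are assumed
        process/measurement noise variances, chosen for illustration."""
        def __init__(self, x0, p0=1.0, q=1e-4, r=1e-2):
            self.x = x0      # current displacement estimate
            self.p = p0      # estimate variance
            self.q = q       # process noise (trust in the velocity prediction)
            self.r = r       # measurement noise (noisiness of the chirp distance)

        def update(self, measured_distance, relative_velocity, dt):
            # Predict: propagate the previous estimate using the phase-derived velocity.
            self.x += relative_velocity * dt
            self.p += self.q
            # Correct: blend in the chirp-based absolute measurement
            # (same form as the update equation above).
            k = self.p / (self.p + self.r)               # Kalman gain
            self.x += k * (measured_distance - self.x)
            self.p *= (1.0 - k)
            return self.x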
[0042] In example system 500, IMU 516 provides an orientation of head unit 510, and IMU 526 provides an orientation of mobile device 520, with respect to an inertial frame. As shown in FIG. 7B, the difference between the output of IMU 526 and the output of IMU 516 can provide function 564, which indicates an orientation of mobile device 520, as a function of time, relative to head unit 510. Function 564 can indicate the orientation of mobile device 520 along multiple rotation axes (e.g., pitch, roll, yaw). Function 564 can be combined with function 562 to provide a six-degree-of-freedom (6DOF) location 580 of the position (e.g., along X, Y, and Z axes) and orientation (e.g., pitch, roll, yaw) of mobile device 520 relative to head unit 510.
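A minimal sketch of this combination step is shown below, assuming each IMU's orientation is available as a quaternion with respect to the shared inertial frame and that the 6DOF result is packaged as a homogeneous transform; these representation choices are illustrative assumptions rather than requirements of the disclosure.

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def relative_orientation(quat_head, quat_mobile):
        """Orientation of the mobile device in the head unit's frame, from each
        IMU's orientation with respect to a shared inertial frame (quaternions
        in scipy's (x, y, z, w) order)."""
        return R.from_quat(quat_head).inv() * R.from_quat(quat_mobile)

    def six_dof_location(position_xyz, quat_head, quat_mobile):
        # One possible packaging of the 6DOF location: a 4x4 homogeneous
        # transform of the mobile device relative to the head unit.
        pose = np.eye(4)
        pose[:3, :3] = relative_orientation(quat_head, quat_mobile).as_matrix()
        pose[:3, 3] = np.asarray(position_xyz, dtype=float)
        return pose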
[0043] 6DOF location 580 can be represented, such as in a memory of head unit 510 and/or mobile device 520, in any suitable format (e.g., one or more vectors, matrices, and/or quaternions). The 6DOF location 580 can then be made available to various software and/or hardware components of wearable system 500, or of external systems. For instance, a software application executing on one or more processors of head unit 510 and/or mobile device 520 can query a current 6DOF location 580, for example via an application programming interface (API), which 6DOF indication is communicated (e.g., as a matrix) to the application in response.
[0044] In some examples, the 6DOF location 580 can be used to determine an absolute position of mobile device 520 with respect to an inertial frame. For example, wearable system 500 can comprise tracking components (e.g., a GPS unit and/or IMU 516 of head unit 510) configured to output a position and/or orientation of the wearable system 500 relative to the inertial frame. 6DOF location 580 can be combined with this output, using techniques known in the art, to determine the absolute position of mobile device 520 with respect to the inertial frame.
[0045] As above, various techniques are known in the art for combining the orientation indicated by function 564 with the displacement indicated by function 562. In some examples, such as shown in FIG. 7B, a Kalman filter 570 (e.g., a basic (single-hypothesis) Kalman filter) may be used to combine function 564 and function 562 to produce 6DOF location 580. That is, Kalman filter 570 may accept as inputs function 562 (e.g., the output of Kalman filter 550, based on function 542 and function 544), and function 564 (e.g., indicating an orientation of mobile device 520 based on the output of IMU 526). The operation of Kalman filter 570 can be analogous to the operation of Kalman filter 550, described above, and will be familiar to those skilled in the art.

[0046] In some examples, transmitted signal 530 may need to be synchronized with its received version (e.g., signal 530 as detected by microphone array 512) in order for one or more of the above techniques to be performed. For example, as described above, chirp tone signal 532 may be generated by speaker 524 of mobile device 520, and detected as chirp tone signal 532' by microphone array 512 of head unit 510; signal 532 may then be autocorrelated using signal 532' (i.e., a time-delayed version of chirp tone signal 532). To determine a time delay between signal 532 and signal 532', it may be necessary to synchronize signal 532 and signal 532' to a base time. However, since signal 532 and signal 532' may be transmitted and received on different devices, with independent clocks— for example, signal 532 may be transmitted by mobile device 520, and signal 532' may be received by head unit 510— the synchronization of these signals (and, therefore, the reliability of the autocorrelation and the position calculations based on that autocorrelation) may be subject to the drift of the clocks of the respective devices.
[0047] A technique for correcting clock drift is based on a likelihood that any drift in the clocks of the two devices (i.e., mobile device 520 and head unit 510) is symmetric— that is, drift of a first clock (e.g., a clock of mobile device 520) will be offset by a drift of equal magnitude and opposite direction of a second clock (e.g., a clock of head unit 510). That is, the drift of the first clock as measured by head unit 510 can be expected to be of equal and opposite magnitude to the drift of the second clock as measured by mobile device 520.
Accordingly, a difference between the two clocks can be estimated, and corrected for, by transmitting reciprocal acoustic signals. For example, as described above with respect to FIG. 5, speaker 524 of mobile device 520 can transmit a first acoustic signal 530 to head unit 510. In response, speaker 514 of head unit 510 can transmit a second acoustic signal 538 to mobile device 520. The first acoustic signal 530 and the second acoustic signal 538 may comprise pure tone signals of the same frequency; accordingly, either head unit 510 or mobile device 520 can determine a phase difference between the two signals. This phase difference can correspond to the difference between the clocks of the respective devices.
This measured difference can then be corrected for in the synchronization of the acoustic signals described above.

[0048] In some examples, the mobile device 520 may transmit sensor data from IMU 526 to the head unit 510 over one or more communication networks, such as Bluetooth, Wi-Fi, and other radio communication networks. For instance, the mobile device 520 may transmit data from IMU 526 indicative of an orientation of the mobile device 520 with respect to an inertial frame to the head unit 510 over one or more communication networks. In some examples, the head unit 510 may leverage such data received from the mobile device 520 over one or more communication networks in conjunction with sensor data from IMU 516 to determine an orientation of mobile device 520, as a function of time, relative to head unit 510. Furthermore, for at least some examples in which the mobile device 520 includes the array of microphones 522, the array of microphones 522 may receive second acoustic signal 538 transmitted by the one or more speakers 514. In these examples, the second acoustic signal 538 received by the array of microphones 522 may be analyzed by the mobile device 520, the head unit 510, or a combination thereof to determine a phase difference between the first acoustic signal 530 and the second acoustic signal 538. For instance, in at least some of these examples, the head unit 510 may measure the drift of the first clock based on the first acoustic signal 530 received by the array of one or more microphones 512, while the mobile device 520 may measure the drift of the second clock based on the second acoustic signal 538 received by the array of microphones 522 and subsequently transmit data indicative of the measured drift of the second clock to the head unit 510 over one or more communication networks. Upon receiving data indicative of the measured drift of the second clock from the mobile device 520 over one or more communication networks, the head unit 510 may leverage such data in conjunction with the measured drift of the first clock to calculate a phase difference between the first acoustic signal 530 and the second acoustic signal 538. In some examples, the head unit 510 and mobile device 520 may conduct one or more of the aforementioned communications over a wired connection instead of or in addition to the one or more aforementioned communication networks. It is to be understood that the head unit 510 and mobile device 520 may exchange additional information with each other over one or more communication networks, a wired connection, or a combination thereof.
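For illustration, the symmetric-drift idea above can be reduced to a short computation: each device measures, against its own clock, the phase of the tone it receives, and half the wrapped difference of the two measurements isolates the clock offset, since the common propagation delay cancels. The sign conventions and the modulo-one-period ambiguity noted below are assumptions of this sketch, not statements about the disclosed system.

    import numpy as np

    def clock_offset_from_phases(phase_at_head, phase_at_mobile, f_tone):
        """Estimate the offset between the two device clocks from reciprocal
        pure-tone measurements at the same frequency f_tone (Hz).
        phase_at_head:   phase of the mobile device's tone, measured against the
                         head unit's local clock (radians).
        phase_at_mobile: phase of the head unit's tone, measured against the
                         mobile device's local clock (radians).
        Both measurements contain the same propagation delay but opposite-signed
        clock offsets, so half of their wrapped difference isolates the offset."""
        dphi = np.angle(np.exp(1j * (phase_at_head - phase_at_mobile)))
        return dphi / (2.0 * np.pi * f_tone) / 2.0   # seconds, known only modulo one tone period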
[0049] Some examples of the disclosure are directed to a method of locating a mobile device, the method comprising: receiving, from the mobile device, at a wearable device comprising a microphone, a signal comprising: a first pure tone acoustic signal having a fixed frequency, and a chirp tone acoustic signal, a frequency of the chirp tone acoustic signal varying with time, wherein receiving the signal comprises detecting the signal via the microphone; determining, based on the first pure tone acoustic signal, a relative velocity between the mobile device and the wearable device; determining, based on the chirp tone acoustic signal, a displacement between the mobile device and the wearable device;
receiving, from the mobile device, at the wearable device, sensor data indicative of an orientation of the mobile device with respect to an inertial frame; and determining, using the relative velocity, the displacement, and the sensor data, a location of the mobile device with respect to the wearable device. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the relative velocity is determined using a phase offset associated with the first pure tone acoustic signal detected at the microphone of the wearable device. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the determined location comprises a six-degree-of-freedom representation of the mobile device with respect to the wearable device. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the method further comprises: in response to receiving the signal, transmitting to the mobile device a second pure tone acoustic signal having a fixed frequency equal to the fixed frequency of the first pure tone acoustic signal; and determining, using a phase difference between the first pure tone acoustic signal and the second pure tone acoustic signal, a clock drift, wherein the displacement between the mobile device and the wearable device is determined using the clock drift. Additionally or alternatively to one or more of the examples disclosed above, in some examples, determining the location of the mobile device comprises computing a first relative displacement estimate via a first Kalman filter using the relative velocity and the displacement as inputs to the first Kalman filter. Additionally or alternatively to one or more of the examples disclosed above, in some examples, determining the location of the mobile device further comprises computing a second relative displacement estimate via a second Kalman filter using the first relative displacement estimate and the sensor data as inputs to the second Kalman filter. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the mobile device is carried by a user of the wearable device. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the wearable device comprises a wearable headset device. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the wearable device comprises a belt pack. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the mobile device comprises a smartphone. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the mobile device comprises a handheld controller device. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the mobile device comprises a belt pack. Additionally or alternatively to one or more of the examples disclosed above, in some examples, the fixed frequency is in the range of approximately 15 KHz to 22 KHz.
[0050] Some examples of the disclosure are directed to a wearable device comprising: a microphone; and one or more processors coupled to the microphone of the wearable device, wherein the wearable device is configured to: receive, from a mobile device, a signal comprising: a first pure tone acoustic signal having a fixed frequency, and a chirp tone acoustic signal, the frequency of the chirp tone acoustic signal varying with time, wherein receiving the signal comprises detecting the signal via the microphone of the wearable device, and receive, from the mobile device, sensor data indicative of an orientation of the mobile device with respect to an inertial frame; and wherein the one or more processors are configured to: determine, based on the first pure tone acoustic signal, a relative velocity between the mobile device and the wearable device; determine, based on the chirp tone acoustic signal, a displacement between the mobile device and the wearable device; and determine, using the determined relative velocity, the determined displacement, and the sensor data, a location of the mobile device with respect to the wearable device.
[0051] Although the disclosed examples have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. For example, elements of one or more implementations may be combined, deleted, modified, or supplemented to form further implementations. Such changes and modifications are to be understood as being included within the scope of the disclosed examples as defined by the appended claims.

CLAIMS

What is claimed is:
1. A method of locating a mobile device, the method comprising:
receiving, from the mobile device, at a wearable device comprising a microphone, a signal comprising:
a first pure tone acoustic signal having a fixed frequency, and a chirp tone acoustic signal, a frequency of the chirp tone acoustic signal varying with time,
wherein receiving the signal comprises detecting the signal via the microphone;
determining, based on the first pure tone acoustic signal, a relative velocity between the mobile device and the wearable device;
determining, based on the chirp tone acoustic signal, a displacement between the mobile device and the wearable device;
receiving, from the mobile device, at the wearable device, sensor data indicative of an orientation of the mobile device with respect to an inertial frame; and
determining, using the relative velocity, the displacement, and the sensor data, a location of the mobile device with respect to the wearable device.
2. The method of claim 1, wherein the relative velocity is determined using a phase offset associated with the first pure tone acoustic signal detected at the microphone of the wearable device.
3. The method of claim 1, wherein the determined location comprises a six-degree-of-freedom representation of the mobile device with respect to the wearable device.
4. The method of claim 1, further comprising:
in response to receiving the signal, transmitting to the mobile device a second pure tone acoustic signal having a fixed frequency equal to the fixed frequency of the first pure tone acoustic signal; and determining, using a phase difference between the first pure tone acoustic signal and the second pure tone acoustic signal, a clock drift,
wherein the displacement between the mobile device and the wearable device is determined using the clock drift.
5. The method of claim 1, wherein determining the location of the mobile device comprises computing a first relative displacement estimate via a first Kalman filter using the relative velocity and the displacement as inputs to the first Kalman filter.
6. The method of claim 5, wherein determining the location of the mobile device further comprises computing a second relative displacement estimate via a second Kalman filter using the first relative displacement estimate and the sensor data as inputs to the second Kalman filter.
7. The method of claim 1, wherein the mobile device is carried by a user of the wearable device.
8. The method of claim 1, wherein the wearable device comprises a wearable headset device.
9. The method of claim 1, wherein the wearable device comprises a belt pack.
10. The method of claim 1, wherein the mobile device comprises a smartphone.
11. The method of claim 1, wherein the mobile device comprises a handheld controller device.
12. The method of claim 1, wherein the mobile device comprises a belt pack.
13. The method of claim 1, wherein the fixed frequency is in the range of approximately
15 KHz to 22 KHz.
14. The method of claim 1, wherein the wearable device further comprises tracking components configured to output a position of the wearable device relative to the inertial frame, the method further comprising:
determining, using the position of the wearable device relative to the inertial frame and the determined location of the mobile device with respect to the wearable device, an absolute position of the mobile device with respect to the inertial frame.
15. A wearable device comprising:
a microphone; and
one or more processors coupled to the microphone of the wearable device, wherein the wearable device is configured to:
receive, from a mobile device, a signal comprising:
a first pure tone acoustic signal having a fixed frequency, and a chirp tone acoustic signal, the frequency of the chirp tone acoustic signal varying with time, wherein receiving the signal comprises detecting the signal via the microphone of the wearable device, and
receive, from the mobile device, sensor data indicative of an orientation of the mobile device with respect to an inertial frame;
and wherein the one or more processors are configured to:
determine, based on the first pure tone acoustic signal, a relative velocity between the mobile device and the wearable device;
determine, based on the chirp tone acoustic signal, a displacement between the mobile device and the wearable device; and
determine, using the determined relative velocity, the determined displacement, and the sensor data, a location of the mobile device with respect to the wearable device.
PCT/US2019/058215 2018-10-26 2019-10-25 Mixed reality device tracking WO2020087041A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862751455P 2018-10-26 2018-10-26
US62/751,455 2018-10-26

Publications (1)

Publication Number Publication Date
WO2020087041A1 true WO2020087041A1 (en) 2020-04-30

Family

ID=70331852

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/058215 WO2020087041A1 (en) 2018-10-26 2019-10-25 Mixed reality device tracking

Country Status (1)

Country Link
WO (1) WO2020087041A1 (en)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100289662A1 (en) * 2008-01-11 2010-11-18 John Dasilva Personnel safety utilizing time variable frequencies
US8942719B1 (en) * 2010-09-08 2015-01-27 Sprint Communications Company L.P. Locating a nearby mobile device without using GPS
US20140192622A1 (en) * 2013-01-10 2014-07-10 Carnegie Mellon University, Center For Technology Transfer And Enterprise Creation Method and System for Ultrasonic Signaling, Ranging and Location Tracking

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
CAI ET AL., A SURVEY ON ACOUSTIC SENSING, 11 January 2019 (2019-01-11), XP55709221, Retrieved from the Internet <URL:https://arxiv.org/pdf/1901.03450.pdf> [retrieved on 20191222] *
ENS ET AL., ACOUSTIC SELF-CALIBRATING SYSTEM FOR INDOOR SMART PHONE TRACKING, 2 March 2015 (2015-03-02), XP055431903, Retrieved from the Internet <URL:http://downloads.hindawi.com/archive/2015/694695.pdf> [retrieved on 20191222] *
LAZIK ET AL., INDOOR PSEUDO-RANGING OF MOBILE DEVICES USING ULTRASONIC CHIRPS, 9 November 2012 (2012-11-09), XP058029979, Retrieved from the Internet <URL:https://users.ece.cmu.edu/-agr/resources/publications/p99-lazik.pdf> [retrieved on 20191222] *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114543844A (en) * 2021-04-09 2022-05-27 恒玄科技(上海)股份有限公司 Audio playing processing method and device of wireless audio equipment and wireless audio equipment
CN114543844B (en) * 2021-04-09 2024-05-03 恒玄科技(上海)股份有限公司 Audio playing processing method and device of wireless audio equipment and wireless audio equipment


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19875312

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19875312

Country of ref document: EP

Kind code of ref document: A1