WO2020182309A1 - Ultrasonic hand tracking system - Google Patents

Ultrasonic hand tracking system

Info

Publication number
WO2020182309A1
Authority
WO
WIPO (PCT)
Prior art keywords
communication device
mobile communication
augmented reality
orientation
frequency signal
Prior art date
2019-03-14
Application number
PCT/EP2019/056398
Other languages
French (fr)
Inventor
Pertti Ikalainen
Jari Savolainen
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
2019-03-14
Publication date
2020-09-17
Application filed by Huawei Technologies Co., Ltd. filed Critical Huawei Technologies Co., Ltd.
Priority to PCT/EP2019/056398 priority Critical patent/WO2020182309A1/en
Publication of WO2020182309A1 publication Critical patent/WO2020182309A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors


Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An apparatus and method enable hand tracking using a mobile communication device in an augmented reality system. The system includes a mobile communication device that is configured to transmit an ultrasound frequency signal and inertial measurement data. The augmented reality glasses apparatus is configured to detect the ultrasound frequency signal and inertial measurement data from the mobile communication device and to determine at least a position and orientation of the mobile communication device based on the detected ultrasound frequency signal and the detected inertial measurement data. The mobile communication device is used as a six degrees of freedom controller for tracking hand movements and interfacing with augmented reality glasses.

Description

ULTRASONIC HAND TRACKING SYSTEM
TECHNICAL FIELD
[0001] The aspects of the present disclosure relate generally to augmented reality systems, and more particularly to hand tracking using a mobile communication device in an augmented reality system.
BACKGROUND
[0002] Augmented Reality (AR) glasses enable interaction in six degrees of freedom. For example, the user's hand position and angle can be accurately located relative to the coordinates of the augmented reality glasses. One problem in augmented reality glasses systems is the user interface and the interaction with the virtual user interface in front of the augmented reality glasses user.
[0003] Hand tracking in an augmented reality system typically requires an accessory held in the hand, together with the augmented reality glasses, for full six degree of freedom hand tracking. It would be advantageous to limit the number of devices that are required for hand tracking and interaction with a virtual user interface in an augmented reality system.
[0004] Accordingly, it would be desirable to be able to provide a system that addresses at least some of the problems identified above.
SUMMARY
[0005] It is an object of the disclosed embodiments to provide an apparatus and method that enable hand tracking using a mobile communication device in an augmented reality system. This object is solved by the subject matter of the independent claims. Further advantageous modifications can be found in the dependent claims.
[0006] According to a first aspect the above and further objects and advantages are obtained by a system. In one embodiment, the system includes a mobile communication device that is configured to transmit an ultrasound frequency signal and rotational data of the mobile communication device. A receiver is configured to detect one or more of the ultrasound frequency signal and inertial measurement data from the mobile communication device, and a controller is configured to determine at least a position and orientation of the mobile communication device based on the detected ultrasound frequency signal and the detected rotational data. The aspects of the disclosed embodiments allow a wearer of augmented reality glasses to use a smartphone as a six degrees of freedom controller and track hand movements for interfacing with a virtual user interface.
[0007] In a possible implementation form of the system according to the first aspect the ultrasound frequency signal is transmitted by a speaker of the mobile communication device. The aspects of the disclosed embodiments allow the speaker of the smart phone to be used as the transmitter of the ultrasound signal.
[0008] In a possible implementation form of the system the mobile communication device further comprises an inertial measurement unit that is configured to detect the rotation of the mobile communication device about an axis.
[0009] In a possible implementation form of the system the inertial measurement unit is further configured to detect one or more of a three-dimensional acceleration of the mobile communication device or a three-dimensional magnetic field of a surrounding of the mobile communication device. By detecting and measuring different properties associated with the mobile communication device, position and orientation can be mapped to a coordinate system of the augmented reality glasses system. Using one or more of these properties, it is possible to draw the smartphone position and orientation relative to the left and right display of the augmented reality glasses and use this information for interaction with an augmented object, which position is also drawn to the displays.
[0010] In another possible implementation form of the system, the receiver comprises an augmented reality device. The aspects of the disclosed embodiments allow the wearer of augmented reality glasses to track hand movements for interfacing with a virtual user interface.
[0011] In further possible implementation form of the system the augmented reality device comprises augmented reality glasses. The aspects of the disclosed embodiments allow the wearer of augmented reality glasses to track hand movements for interfacing with a virtual user interface. The smartphone position and orientation relative to the left and right display of the augmented reality glasses can be used for interaction with an augmented reality object.
[0012] In a further possible implementation form of the system the controller is configured to map the position and orientation of the mobile communication device to an augmented reality coordinate system. The aspects of the disclosed embodiment enable the smartphone position and orientation relative to the left and right display of the augmented reality glasses to be detected and used for interaction with an augmented reality object, which position is also drawn to the displays.
[0013] In a further possible implementation form the position determined by the controller from the ultrasound frequency signal is three-axis position data. The ultrasound based three-axis position data and inertial measurement unit based rotation data can be combined and used to map the smartphone position and orientation to the coordinate system of the augmented reality device.
[0014] In another possible implementation form of the system the mobile communication device comprises a mobile phone. The aspects of the disclosed embodiments allow the augmented reality glasses wearer to use the mobile phone, or smartphone, as a six degrees of freedom controller, and track hand movements for interfacing with a virtual user interface.
[0015] According to a second aspect, the above and further objects and advantages are obtained by a method. In one embodiment, the method includes detecting an ultrasound signal and inertial measurement rotation data generated by a mobile communication device, and determining at least a position and orientation of the mobile communication device based on the detected ultrasound signal and the detected inertial measurement rotation data. The aspects of the disclosed embodiments allow the augmented reality glasses wearer to use a smartphone as a six degrees of freedom controller and track hand movements for interfacing with a virtual user interface. By detecting and measuring different properties associated with the mobile communication device, position and orientation can be mapped to a coordinate system of the augmented reality glasses system.
[0016] In a possible implementation form of the method the mobile communication device comprises a mobile phone. The aspects of the disclosed embodiments allow the augmented reality glasses wearer to use the mobile phone or smartphone as a six degrees of freedom controller, and track hand movements for interfacing with a virtual user interface in the augmented reality system.
[0017] In another possible implementation form of the method, the method further includes mapping the position and orientation of the mobile communication device to a coordinate system of augmented reality glasses. The aspects of the disclosed embodiment enable the smartphone position and orientation relative to the left and right display of the augmented reality glasses to be detected and used for interaction with an augmented reality object, which position is also drawn to the displays.
[0018] In a further possible implementation form of the method, the method includes interacting with an augmented reality object drawn to the coordinate system based on the position and orientation data of the mobile communication device. The aspects enable the smartphone to be used as the six degrees of freedom controller in an augmented reality system, eliminating the need for a separate controller.
[0019] In another possible implementation form of the method, the method further includes detecting a three-dimensional acceleration of the mobile communication device and determining at least the position and orientation of the mobile communication device based on the detected ultrasound frequency signal, the detected inertial measurement rotation data and the three-dimensional acceleration of the mobile communication device. The ultrasound based three-axis position data and inertial measurement unit based rotation data can be combined and used to map the smartphone position and orientation to the coordinate system of the augmented reality device.
[0020] In another possible implementation form of the method, the method further includes detecting a three-dimensional magnetic field of a surrounding of the mobile communication device and determining at least the position and orientation of the mobile communication device based on the detected ultrasound frequency signal, the detected inertial measurement rotation data and the three-dimensional magnetic field of a surrounding of the mobile communication device. By detecting and measuring different properties associated with the mobile communication device, position and orientation can be mapped to a coordinate system of the augmented reality glasses system. It is possible to draw the smartphone position and orientation relative to the left and right display of the augmented reality glasses and use this information for interaction with an augmented object, which position is also drawn to the displays.
[0021] These and other aspects, implementation forms, and advantages of the exemplary embodiments will become apparent from the embodiments described herein considered in conjunction with the accompanying drawings. It is to be understood, however, that the description and drawings are designed solely for purposes of illustration and not as a definition of the limits of the disclosed invention, for which reference should be made to the appended claims. Additional aspects and advantages of the invention will be set forth in the description that follows, and in part will be obvious from the description, or may be learned by practice of the invention. Moreover, the aspects and advantages of the invention may be realized and obtained by means of the instrumentalities and combinations particularly pointed out in the appended claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] In the following detailed portion of the present disclosure, the invention will be explained in more detail with reference to the example embodiments shown in the drawings, in which:
[0023] Figure 1 illustrates a schematic view of an exemplary system incorporating aspects of the disclosed embodiments.
[0024] Figure 2 illustrates a flowchart of a process incorporating aspects of the disclosed embodiments.
[0025] Figure 3 illustrates an exemplary implementation of a system incorporating aspects of the disclosed embodiments.
[0026] Figure 4 illustrates an architecture of an exemplary apparatus that can be used to practice aspects of the disclosed embodiments.
DETAILED DESCRIPTION OF THE DISCLOSED EMBODIMENTS
[0027] Referring to Figure 1, there can be seen a front view of an exemplary system 100 incorporating aspects of the disclosed embodiments. The aspects of the disclosed embodiments are directed to a system 100 that enables hand tracking for an augmented reality glasses apparatus 120 using a mobile communication device 110. In one embodiment, the system 100 includes a mobile communication device 110 that is configured to transmit an ultrasound frequency signal 10 as well as inertial measurement data 12. The ultrasound frequency signal 10 and inertial measurement data 12 are configured to be processed by an application engine 126 of the augmented reality glasses apparatus 120 to determine a position and orientation of the mobile communication device 110 relative to the apparatus 120. The position and orientation of the mobile communication device 110 can be mapped to a coordinate system of the apparatus 120. It is then possible to draw the position and orientation of the mobile communication device 110 to the left display 130 and the right display 132 of the apparatus 120.
[0028] The position and orientation of the mobile communication device 110 can be used for interaction with an augmented object 302 as is illustrated in Figure 3, for example. The aspects of the disclosed embodiments allow the wearer of the augmented reality glasses 120 to use a mobile communication device 110, such as a smartphone, as the six degrees of freedom controller in the augmented reality system 100. This allows the wearer to track hand movements for interfacing with a virtual user interface of the augmented reality system 100.
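As a rough illustration of the data flow just described, the following Python sketch combines an ultrasound-derived three-axis position with an IMU-derived rotation into a single 6DoF pose. The class and function names, the quaternion convention, and the sample values are illustrative assumptions, not part of the disclosure.

```python
# Minimal sketch of the pose fusion: ultrasound gives a 3-axis position,
# the IMU gives a rotation, and together they form the 6DoF pose of the
# phone in the glasses' coordinate frame. Names and conventions assumed.
from dataclasses import dataclass

import numpy as np


@dataclass
class Pose6DoF:
    position: np.ndarray   # (x, y, z) in metres, glasses frame
    rotation: np.ndarray   # unit quaternion (w, x, y, z)


def fuse_pose(ultrasound_xyz, imu_quaternion) -> Pose6DoF:
    """Combine ultrasound three-axis position with IMU rotation data."""
    q = np.asarray(imu_quaternion, dtype=float)
    q /= np.linalg.norm(q)   # keep the quaternion unit-length
    return Pose6DoF(np.asarray(ultrasound_xyz, dtype=float), q)


pose = fuse_pose([0.25, -0.10, 0.45], [0.92, 0.0, 0.38, 0.0])
print(pose.position, pose.rotation)
```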
[0029] In the example of Figure 1, the mobile communication device 110 is configured to generate an ultrasound signal 10 that is detected by a receiver 122. The detected ultrasound signal can be used for position determination. The receiver 122 in this example is an ultrasound receiver. In one embodiment, the ultrasound receiver 122 is a standard microphone such as that used in a mobile communication device. The ultrasound receiver 122 could also comprise an array of standard microphones of any suitable type. In one embodiment, the application engine 126 coupled to the receiver 122 is configured to run an ultrasound tracking algorithm and map the position of the mobile communication device 110 to the augmented scene 304, as shown in Figure 3, for example. In alternate embodiments, the application engine 126 is configured to map the position of the mobile communication device 110 to the augmented scene 304 using any suitable position algorithm. In this manner, the mobile communication device 110, or smart phone, is used as the six-degree-of-freedom (6DoF) controller for interaction with the augmented reality glasses 120.
[0030] The aspects of the disclosed embodiments can make use of a standard loudspeaker of the mobile communication device 110 as the ultrasound signal source or transmitter 112 for the system 100. The typical loudspeaker of a smartphone, such as the mobile communication device 110 shown in Figure 1, is designed for the human hearing range from approximately 20 Hz to 20 kHz. However, such speakers or loudspeakers are still capable of generating sound at frequencies above 20 kHz. The output audio power typically decreases above 20 kHz, but the power needed for ultrasound tracking in the system 100 is relatively small. For example, as is shown in Figure 3, the tracking range is small because the maximum distance needed for tracking the mobile communication device 110 in the system 100 is roughly the reach of the human arm from the wearer's head. This can be less than approximately 100 cm in some cases. According to the aspects of the disclosed embodiments, the speaker 112 is the transmitter of the ultrasound signal 10 generated by the mobile communication device 110 for position tracking in the system 100. The augmented reality glasses 120 have a corresponding receiver 122 that is configured to detect the ultrasound signal 10.
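The following sketch shows, under assumed parameters, what such a near-ultrasound tracking burst could look like: a 21 kHz tone at a 48 kHz sample rate, just above the nominal hearing limit and within what a typical phone speaker can still reproduce at the low power this short-range system needs. The frequency, burst length, and sample rate are illustrative choices, not values from the disclosure.

```python
# Hypothetical tracking-burst generator for the phone speaker.
import numpy as np

SAMPLE_RATE = 48_000   # Hz; must exceed twice the tone frequency (Nyquist)
TONE_FREQ = 21_000     # Hz; just above the ~20 kHz audible band
BURST_MS = 5           # short pulse suited to time-of-flight measurement

t = np.arange(int(SAMPLE_RATE * BURST_MS / 1000)) / SAMPLE_RATE
burst = np.sin(2 * np.pi * TONE_FREQ * t)
burst *= np.hanning(burst.size)   # taper edges to limit audible clicks
```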
[0031] In one embodiment, the mobile communication device 110 includes an inertial measurement unit (IMU) 114 that is configured to detect a rotation of the mobile communication device 110 about one or more axes. The inertial measurement unit data 12, or angular rotation vector from the inertial measurement unit 114, can be transmitted to the augmented reality glasses apparatus 120, where it is processed by the application engine 126, also referred to herein as controller 126. The inertial measurement unit data 12 can also be referred to as or include rotation and linear movement data. In this example, the application engine 126 is configured to determine at least the position and orientation of the mobile communication device 110 based on one or more of the ultrasound frequency signal 10 and the inertial measurement or rotation data 12.
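A minimal sketch of how the gyroscope rate samples could be integrated into the orientation the mobile communication device 110 transmits is shown below. First-order quaternion integration is one common approach; the quaternion convention, time step, and rates here are assumptions, not taken from the disclosure.

```python
# Sketch: integrate body angular rates into an orientation quaternion.
import numpy as np


def integrate_gyro(q, omega, dt):
    """Advance unit quaternion q (w, x, y, z) by body rates omega (rad/s)."""
    wx, wy, wz = omega
    omega_mat = np.array([
        [0.0, -wx, -wy, -wz],
        [wx,  0.0,  wz, -wy],
        [wy, -wz,  0.0,  wx],
        [wz,  wy, -wx,  0.0],
    ])
    q = q + 0.5 * dt * omega_mat @ q   # dq/dt = 0.5 * Omega(omega) * q
    return q / np.linalg.norm(q)       # renormalize to a unit quaternion


q = np.array([1.0, 0.0, 0.0, 0.0])            # start level
q = integrate_gyro(q, [0.0, 0.0, 0.5], 0.01)  # 0.5 rad/s yaw for 10 ms
```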
[0032] In one embodiment, referring also to Figure 1, the inertial measurement data 12 is transmitted from the mobile communication device 110 to the augmented reality glasses 120. Using, for example, a six degree of freedom algorithm of the application engine 126, the ultrasound based three-axis position and inertial measurement unit based rotation data are combined to determine the position and orientation of the mobile communication device 110 relative to the augmented reality glasses apparatus 120. For example, in one embodiment, distance is measured by the time of flight of ultrasound pulses. The pulses can be modulated to distinguish multiple sources. The direction is typically measured with a small array of microphones, such as for example four microphones, by measuring the time difference of arrival of the same pulse at all four microphones. In alternate embodiments, the inertial measurement unit data can be determined using any suitable method other than the method described above.
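The following sketch illustrates both measurements under simplifying assumptions: range from the time of flight of one pulse, and a far-field direction estimate from the time differences of arrival at a four-microphone array. The speed of sound, the array geometry, and the least-squares plane-wave formulation are illustrative, not taken from the disclosure.

```python
# Time-of-flight range and TDOA direction for a four-microphone array.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s at room temperature (assumed)


def distance_from_tof(t_emit, t_arrive):
    """Range from a single ultrasound pulse's flight time."""
    return SPEED_OF_SOUND * (t_arrive - t_emit)


def direction_from_tdoa(mic_positions, arrival_times):
    """Least-squares far-field direction toward the source.

    mic_positions: (4, 3) microphone coordinates on the glasses.
    arrival_times: (4,) pulse arrival times in seconds.
    """
    p = np.asarray(mic_positions, dtype=float)
    p = p - p[0]                                     # reference mic at origin
    dt = np.asarray(arrival_times) - arrival_times[0]
    # Plane wave with unit direction u toward the source: p_i . u = -c * dt_i
    u, *_ = np.linalg.lstsq(p[1:], -SPEED_OF_SOUND * dt[1:], rcond=None)
    return u / np.linalg.norm(u)
```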
[0033] In the example of Figure 1 the mobile communication device 110 includes a
Bluetooth communication module 118 that is connected to the application engine 116. The augmented reality glasses apparatus 120 includes a Bluetooth communication module 128 that is connected to the application engine 126 and is configured to communicate with the Bluetooth communication module 118. In one embodiment, the inertial measurement data 12 can be transmitted from the mobile communication device 110 to the augmented reality glasses 120 using this Bluetooth communication channel 16. In alternate embodiments, the inertial measurement data 12 can be communicated from the mobile communication device 110 to the augmented reality glasses apparatus 120 using any suitable communication channel other than a Bluetooth communication channel. For example, in one embodiment the application engine 116 is configured to process the inertial measurement data from the inertial measurement unit 114 and enable the transmitter 112 to send the data 12 to the augmented reality glasses 120.
[0034] In one embodiment, the mobile communication device 110 can include an accelerometer device 115. The accelerometer device 115 can be configured to detect a three-dimensional acceleration of the mobile communication device 110. In one embodiment, the accelerometer device 115 could be used to compensate for possible errors in the ultrasound detection. For example, when a reflection error causes the ultrasound position to jump but no movement is detected by the accelerometer 115, the application engine 126 can be configured to compensate for this error.
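The reflection compensation described in paragraph [0034] could be sketched as a simple plausibility check: if the ultrasound position jumps while the accelerometer reports essentially no motion, the jump is treated as a reflection artefact and the previous position is kept. The thresholds below are assumed tuning values, not figures from the disclosure.

```python
# Sketch: reject ultrasound position jumps that the accelerometer contradicts.
import numpy as np

JUMP_THRESHOLD = 0.15    # m; an ultrasound step larger than a plausible move
ACCEL_THRESHOLD = 0.5    # m/s^2 deviation from gravity; "no movement" band
GRAVITY = 9.81           # m/s^2


def filter_reflection(prev_pos, new_pos, accel_magnitude):
    jump = np.linalg.norm(np.asarray(new_pos) - np.asarray(prev_pos))
    if jump > JUMP_THRESHOLD and abs(accel_magnitude - GRAVITY) < ACCEL_THRESHOLD:
        return prev_pos    # discard the implausible ultrasound fix
    return new_pos
```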
[0035] In one embodiment, the mobile communication device 110 can include a magnetometer device 117. The magnetometer device 117 can be configured to detect the magnetic field of the Earth, which can be used for long-term stabilization of a gyroscope of the mobile communication device 110.
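One way such magnetometer-based stabilization could work is a complementary filter that slowly pulls the gyro-integrated yaw toward the magnetic heading, so that drift does not accumulate over time. The sketch below assumes a level device and an illustrative blend factor; neither is specified in the disclosure.

```python
# Sketch: complementary filter correcting gyro yaw drift with the compass.
import math

MAG_WEIGHT = 0.02   # small: trust the gyro short-term, the compass long-term


def correct_yaw(gyro_yaw, mag_x, mag_y):
    mag_heading = math.atan2(mag_y, mag_x)   # heading from Earth's field
    error = math.atan2(math.sin(mag_heading - gyro_yaw),
                       math.cos(mag_heading - gyro_yaw))  # wrap to [-pi, pi]
    return gyro_yaw + MAG_WEIGHT * error
```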
[0036] Figure 2 illustrates one example of a process 200 in a system 100 incorporating aspects of the disclosed embodiments. In one embodiment, an ultrasound signal 10 sent from the mobile communication device 110 is detected 202. In the example of Figure 1, the ultrasound signal 10 is sent from the sound output device 112 or speaker of the mobile communication device 110 and received by a receiver 122 of the augmented reality glasses apparatus 120. This ultrasound signal is detected 202 and processed by the application engine 126 of the augmented reality glasses apparatus 120.
[0037] Using the detected ultrasound signal, the augmented reality glasses apparatus 120 is configured to determine 206 a position and orientation of the mobile communication device 110 relative to the augmented reality glasses apparatus 120. The position and orientation of the mobile communication device 110 is then mapped to a coordinate system of the augmented reality glasses apparatus 120. This allows the position and orientation of the mobile communication device 110 to be presented in the left display 130 and right display 132 of the augmented reality glasses apparatus 120. The wearer of the augmented reality glasses apparatus 120 can then interact 210 with an augmented reality object such as the object 302 shown in Figure 3.
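As a rough sketch of this mapping and display step, the snippet below expresses the device position in the glasses' coordinate frame and projects it separately for the left display 130 and the right display 132, offset by half an assumed interpupillary distance. The pinhole model, focal length, and interpupillary distance are illustrative assumptions; the disclosure does not specify a projection method.

```python
# Sketch: project a glasses-frame 3D point into left and right displays.
IPD = 0.063      # m, assumed interpupillary distance
FOCAL = 500.0    # px, assumed pinhole focal length per display


def project_to_displays(device_pos_glasses):
    """Return (left, right) 2D display points for a point (x, y, z), z > 0."""
    x, y, z = device_pos_glasses
    points = {}
    for name, eye_dx in (("left", -IPD / 2), ("right", +IPD / 2)):
        xe = x - eye_dx                     # shift into this eye's frame
        points[name] = (FOCAL * xe / z, FOCAL * y / z)
    return points["left"], points["right"]


left, right = project_to_displays((0.10, -0.05, 0.40))
```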
[0038] In one embodiment, in addition to using ultrasound sound data that is received from the mobile communication device 110, the aspects of the disclosed embodiments also use inertial measurement data from the mobile communication device 110 for position and orientation determination 206. In this example, the mobile communication device is configured to measure its rotation around one or more axes. This rotational data, referred to herein as inertial measurement data, is detected 204 by the augmented reality glasses apparatus 120 and is used to determine 206 the position and orientation of the mobile communication device 110.
[0039] In one embodiment, acceleration data of the mobile communication device 110 can also be used to determine 206 the position and orientation of the mobile communication device 110. In this example, the mobile communication device 110 can include an accelerometer device 115 that is configured to detect an acceleration of the mobile communication device 110 as is described herein. This acceleration data, which in one embodiment includes three-dimensional acceleration data, can be transmitted by the mobile communication device 110 together with the ultrasound data. Once detected 212 by the augmented reality glasses apparatus 120, this acceleration data can also be used to determine 206 the position and orientation of the mobile communication device 110.
[0040] The aspects of the disclosed embodiments also allow magnetic field data of the mobile communication device 110 to be used by the augmented reality glasses apparatus 120 in determining 206 the position and orientation of the mobile communication device 110. In this example, the mobile communication device 110 includes the magnetometer device 117 as is described herein.
[0041] Figure 4 illustrates a block diagram of an exemplary apparatus 1000 appropriate for implementing aspects of the disclosed embodiments. The apparatus 1000 is appropriate for use in a wireless network and can be implemented in one or more of the mobile communication device 110 or the augmented reality glasses apparatus 120.
[0042] The apparatus 1000 includes or is coupled to a processor or computing hardware 1002, a memory 1004, a radio frequency (RF) unit 1006 and a user interface (UI) 1008. In certain embodiments, the UI 1008 may be removed from the apparatus 1000.
[0043] The processor 1002 may be a single processing device or may comprise a plurality of processing devices, including special purpose devices such as, for example, digital signal processing (DSP) devices, microprocessors, graphics processing units (GPUs), specialized processing devices, or general purpose central processing units (CPUs). The processor 1002 often includes a CPU working in tandem with a DSP to handle signal processing tasks. The processor 1002, which can be implemented in one or more of the mobile communication device 110 or augmented reality apparatus 120 described with respect to Figure 1, may be configured to implement any one or more of the methods and processes described herein.
[0044] In the example of Figure 4, the processor 1002 is configured to be coupled to a memory 1004 which may be a combination of various types of volatile and non-volatile computer memory such as for example read only memory (ROM), random access memory (RAM), magnetic or optical disk, or other types of computer memory. The memory 1004 is configured to store computer program instructions that may be accessed and executed by the processor 1002 to cause the processor 1002 to perform a variety of desirable computer implemented processes or methods such as the methods as described herein.
[0045] The program instructions stored in memory 1004 are organized as sets or groups of program instructions referred to in the industry with various terms such as programs, software components, software modules, units, etc. Each module may include a set of functionality designed to support a certain purpose, such as one or more of the functions described with respect to the system 100 of Figure 1. Also included in the memory 1004 are program data and data files which may be stored and processed by the processor 1002 while executing a set of computer program instructions.
[0046] The apparatus 1000 can also include or be coupled to an RF unit 1006, such as a transceiver, coupled to the processor 1002, that is configured to transmit and receive RF signals based on digital data 1012 exchanged with the processor 1002 and may be configured to transmit and receive radio signals with other nodes in a wireless network. In certain embodiments, the RF unit 1006 includes receivers capable of receiving and interpreting messages sent from satellites in the global positioning system (GPS), which work together with information received from other transmitters to obtain positioning information pertaining to the location of the device 1000. To facilitate transmitting and receiving RF signals, the RF unit 1006 includes an antenna unit 1010, which in certain embodiments may include a plurality of antenna elements.
[0047] Where the system 100 includes a user interface 1008, the UI 1008 may include one or more user interface elements such as a touch screen, keypad, buttons, or voice command processor, as well as other elements adapted for exchanging information with a user. The UI 1008 may also include a display unit configured to display a variety of information appropriate for a computing device or mobile user equipment and may be implemented using any appropriate display type, such as, for example, organic light emitting diodes (OLED) or a liquid crystal display (LCD), as well as less complex elements such as LEDs or indicator lamps.
[0048] The aspects of the disclosed embodiments are directed to the use of a mobile communication device such as a smart phone to be used as a six degrees of freedom controller for interaction with augmented reality glasses. Advantageously, a manufacturer of augmented reality glasses does not need to design a separate controller. Rather, a user's smart phone can be adapted and configured to be used as the controller in the system described herein.
[0049] Thus, while there have been shown, described and pointed out, fundamental novel features of the invention as applied to the exemplary embodiments thereof, it will be understood that various omissions, substitutions and changes in the form and details of devices and methods illustrated, and in their operation, may be made by those skilled in the art without departing from the spirit and scope of the presently disclosed invention. Further, it is expressly intended that all combinations of those elements, which perform substantially the same function in substantially the same way to achieve the same results, are within the scope of the invention. Moreover, it should be recognized that structures and/or elements shown and/or described in connection with any disclosed form or embodiment of the invention may be incorporated in any other disclosed or described or suggested form or embodiment as a general matter of design choice. It is the intention, therefore, to be limited only as indicated by the scope of the claims appended hereto.

Claims

1. A system (100), comprising: a mobile communication device (110) that is configured to transmit an ultrasound frequency signal and rotational data of the mobile communication device (110); a receiver (122) configured to detect the ultrasound frequency signal and rotational data from the mobile communication device (110); and a controller (126) configured to determine at least a position and orientation of the mobile communication device (110) based on the detected ultrasound frequency signal and the detected rotational data.
2. The system (100) according to claim 1, wherein the ultrasound frequency signal (10) is transmitted by a speaker (112) of the mobile communication device (110).
3. The system (100) according to any one of claims 1 or 2, wherein the mobile communication device (110) further comprises an inertial measurement unit (114) that is configured to detect the rotation of the mobile communication device (110) about an axis.
4. The system (100) according to any one of the preceding claims, wherein the inertial measurement unit (114) is further configured to detect one or more of a three-dimensional acceleration of the mobile communication device (110) or a three-dimensional magnetic field of a surrounding of the mobile communication device (110).
5. The system (100) according to any one of the preceding claims wherein the receiver (122) comprises an augmented reality device (120).
6. The system (100) according to any one of the previous claims, wherein the augmented reality device (120) comprises augmented reality glasses.
7. The system (100) according to any one of the preceding claims, wherein the controller (126) is configured to map the position and orientation of the mobile communication device (110) to an augmented reality coordinate system.
8. The system (100) according to any one of the preceding claims wherein the position determined by the controller (126) from the ultrasound frequency signal is three-axis position data.
9. The system (100) according to any one of the preceding claims wherein the mobile communication device (110) comprises a mobile phone.
10. A method (200) comprising: detecting (202) an ultrasound signal generated by a mobile communication device; detecting (204) inertial measurement rotation data generated by the mobile communication device; and determining (206) at least a position and orientation of the mobile communication device based on the detected ultrasound frequency signal and the detected inertial measurement rotation data.
11. The method (200) according to claim 10, wherein the mobile communication device comprises a mobile phone.
12. The method (200) according to any one of claims 10 or 11, further comprising mapping (208) the position and orientation of the mobile communication device to a coordinate system of augmented reality glasses.
13. The method (200) according to claim 12 comprising interacting (210) with an augmented reality object drawn to the coordinate system based on the position and orientation data of the mobile communication device.
14. The method (200) according to any one of claims 10 through 13 further comprising detecting (212) a three-dimensional acceleration of the mobile communication device and determining (206) at least the position and orientation of the mobile communication device based on the detected ultrasound frequency signal, the detected inertial measurement rotation data and the three-dimensional acceleration of the mobile communication device.
15. The method (200) according to any one of claims 10 through 14 further comprising detecting (214) a three-dimensional magnetic field of a surrounding of the mobile communication device and determining (206) at least the position and orientation of the mobile communication device based on the detected ultrasound frequency signal, the detected inertial measurement rotation data and the three-dimensional magnetic field of a surrounding of the mobile communication device.
PCT/EP2019/056398 2019-03-14 2019-03-14 Ultrasonic hand tracking system WO2020182309A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2019/056398 WO2020182309A1 (en) 2019-03-14 2019-03-14 Ultrasonic hand tracking system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2019/056398 WO2020182309A1 (en) 2019-03-14 2019-03-14 Ultrasonic hand tracking system

Publications (1)

Publication Number Publication Date
WO2020182309A1 true WO2020182309A1 (en) 2020-09-17

Family

ID=65812311

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/056398 WO2020182309A1 (en) 2019-03-14 2019-03-14 Ultrasonic hand tracking system

Country Status (1)

Country Link
WO (1) WO2020182309A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040201857A1 (en) * 2000-01-28 2004-10-14 Intersense, Inc., A Delaware Corporation Self-referenced tracking
US20160378176A1 (en) * 2015-06-24 2016-12-29 Mediatek Inc. Hand And Body Tracking With Mobile Device-Based Virtual Reality Head-Mounted Display
US20170244811A1 (en) * 2016-02-22 2017-08-24 Google Inc. Device pairing in augmented / virtual reality environment

Similar Documents

Publication Publication Date Title
US10638213B2 (en) Control method of mobile terminal apparatus
US10334076B2 (en) Device pairing in augmented/virtual reality environment
US9351090B2 (en) Method of checking earphone wearing state
US11647352B2 (en) Head to headset rotation transform estimation for head pose tracking in spatial audio applications
US9681268B2 (en) Mobile device position detection
US10132914B2 (en) Target device positioning method and mobile terminal
US11589183B2 (en) Inertially stable virtual auditory space for spatial audio applications
US20120114132A1 (en) Headset with accelerometers to determine direction and movements of user head and method
US20160134336A1 (en) Directional proximity detection
US7616186B2 (en) Acceleration reference devices, cellular communication terminal systems, and methods that sense terminal movement for cursor control
WO2021152513A1 (en) Low profile pointing device sensor fusion
WO2023064902A1 (en) Ultrasonic device-to-device communication for wearable devices
WO2021041668A1 (en) Head-tracking methodology for headphones and headsets
WO2020182309A1 (en) Ultrasonic hand tracking system
CN113432620B (en) Error estimation method and device, vehicle-mounted terminal and storage medium
CN113132051B (en) Reducing interference between electromagnetic tracking systems
TWI598612B (en) Matching system and matching method
WO2020087041A1 (en) Mixed reality device tracking
US20210210114A1 (en) Wearable device including a sound detection device providing location information for a body part
CN210716984U (en) Pipeline detection device
EP3661233A1 (en) Wearable beamforming speaker array
CN211577871U (en) Electronic equipment, positioning device and positioning assembly
US11277706B2 (en) Angular sensing for optimizing speaker listening experience
JP2016035632A (en) Menu selection system and menu selection method
JP6294183B2 (en) Menu selection device and menu selection method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19711569

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19711569

Country of ref document: EP

Kind code of ref document: A1