CN108107435B - Virtual reality tracking method and system based on ultrasonic waves - Google Patents

Virtual reality tracking method and system based on ultrasonic waves

Info

Publication number
CN108107435B
CN108107435B (application CN201711289669.7A)
Authority
CN
China
Prior art keywords
head
ultrasonic
path
data stream
initial position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711289669.7A
Other languages
Chinese (zh)
Other versions
CN108107435A (en)
Inventor
伍楷舜
谢海
陈孟奇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen University
Original Assignee
Shenzhen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen University
Priority to CN201711289669.7A
Publication of CN108107435A
Application granted
Publication of CN108107435B
Legal status: Active
Anticipated expiration legal status

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/06Systems determining the position data of a target

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Acoustics & Sound (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

The invention falls within the field of wireless sensing and human-computer interaction, and provides an ultrasound-based virtual reality tracking method comprising the following steps: S1, receiving ultrasonic data stream information with a head-mounted VR device; S2, performing data purification on the received ultrasonic data stream information; S3, estimating the relative path and the initial position information from phase changes; and S4, estimating and outputting the two-dimensional motion trajectory of the path change from the relative path and the initial position information. The method achieves high accuracy and high robustness and has considerable academic and application value for VR sensing, human-computer interaction, games, and the like. It dispenses with excess peripheral equipment: only a speaker and microphones are embedded in the head-mounted device, which reduces both complexity and cost, and the position of the target is estimated from the ultrasonic signals, realizing accurate and robust VR target positioning and tracking.

Description

Virtual reality tracking method and system based on ultrasonic waves
Technical Field
The invention belongs to the field of wireless sensing and human-computer interaction, and in particular relates to an ultrasound-based virtual reality tracking method and system.
Background
Today, experiencing a virtual reality environment in person, anywhere and at any time, is still a relatively complicated undertaking: existing VR experiences require either dedicated equipment, such as a head-mounted device or a handheld controller tracked by an infrared camera, or a specific environment, such as a dedicated play area. These limitations greatly reduce the comfort of the user experience. A truly immersive VR experience requires position tracking that works anytime and anywhere and updates displacement in real time.
Existing VR tracking systems fall into two main categories: outside-in position tracking systems and inside-out tracking systems. Outside-in systems have achieved good tracking and positioning accuracy; they mainly rely on an external device, such as an infrared camera, working with sensors on the head-mounted device to achieve high-accuracy positioning. However, such systems have drawbacks: wherever the user wants to experience VR, the dedicated hardware and peripherals must be installed there. If one wants to experience VR anywhere in the home, an infrared camera must be installed in every room, and the user must stay within 2 m of the infrared camera. In addition, such systems are sensitive to occlusion and illumination and fail when the infrared camera is blocked by furniture or operating in a dark environment. Inside-out systems use tracking algorithms based on true-color and depth perception in the head-mounted device for positioning and tracking; although they achieve high accuracy and require no peripherals, they add cameras to the head-mounted device, which increases complexity and makes it difficult to recognize transparent or texture-less objects.
Disclosure of Invention
The invention aims to provide an ultrasound-based virtual reality tracking method and system, so as to solve the technical problems described above.
The invention is realized as follows: an ultrasound-based virtual reality tracking method and system, wherein the virtual reality tracking method comprises the following steps:
S1, receiving ultrasonic data stream information with the head-mounted VR device;
S2, performing data purification on the received ultrasonic data stream information;
S3, estimating the relative path and the initial position information from phase changes;
and S4, estimating and outputting the two-dimensional motion trajectory of the path change from the relative path and the initial position information.
In a further technical scheme of the invention, the step S3 further comprises the following steps:
S31, removing redundant reflected data stream information during target movement by means of linear regression;
S32, calculating the moving direction of the target and the relative path distance using the phase change and the Doppler shift;
S33, calculating the power delay spectrum of the path arrival times by applying an inverse Fourier transform to the phase changes at different frequencies;
and S34, estimating the initial position information of the signal reflected by the moving target from the energy in the power delay spectrum.
In a further technical scheme of the invention, the step S2 further comprises the following steps:
S21, performing noise reduction on the received ultrasonic data stream information to remove interference points;
and S22, filtering the received ultrasonic data stream information to remove high-frequency signals.
In a further technical scheme of the invention, the step S1 further comprises the following step:
S11, transmitting, by the head-mounted VR device, the predefined ultrasonic data stream information.
The further technical scheme of the invention is as follows: a loudspeaker and more than two microphones are embedded in the head-mounted VR equipment, and the frequency of ultrasonic waves emitted by the head-mounted VR equipment is more than 17000 Hz; the received ultrasonic data stream signal is processed by a central server or an intelligent processing terminal.
It is another object of the present invention to provide an ultrasound-based virtual reality tracking system, comprising:
a data receiving module for receiving ultrasonic data stream information with the head-mounted VR device;
a purification processing module for performing data purification on the received ultrasonic data stream information;
a distance calculation module for estimating the relative path and the initial position information from phase changes;
and a trajectory calculation and output module for estimating and outputting the two-dimensional motion trajectory of the path change from the relative path and the initial position information.
In a further technical scheme of the invention, the distance calculation module further comprises:
a multipath effect elimination unit for removing redundant reflected data stream information during target movement by means of linear regression;
a relative distance calculation unit for calculating the moving direction of the target and the relative path distance using the phase change and the Doppler shift;
a power delay spectrum acquisition unit for calculating the power delay spectrum of the path arrival times by applying an inverse Fourier transform to the phase changes at different frequencies;
and an initial position acquisition unit for estimating the initial position information of the signal reflected by the moving target from the energy in the power delay spectrum.
In a further technical scheme of the invention, the purification processing module further comprises:
a noise reduction unit for performing noise reduction on the received ultrasonic data stream information to remove interference points;
and a filtering unit for filtering the received ultrasonic data stream information to remove high-frequency signals.
In a further technical scheme of the invention, the data receiving module further comprises:
a transmitting unit for transmitting the predefined ultrasonic data stream information from the head-mounted VR device.
The further technical scheme of the invention is as follows: a loudspeaker and more than two microphones are embedded in the head-mounted VR equipment, and the frequency of ultrasonic waves emitted by the head-mounted VR equipment is more than 17000 Hz; the received ultrasonic data stream signal is processed by a central server or an intelligent processing terminal.
The invention has the following beneficial effects: the method achieves high accuracy and high robustness and has considerable academic and application value for VR sensing, human-computer interaction, games, and the like; it dispenses with excess peripheral equipment, embedding only a speaker and microphones in the head-mounted device, which reduces both complexity and cost; and the position of the target is estimated from the ultrasonic signals, realizing accurate and robust VR target positioning and tracking.
Drawings
Fig. 1 is a flowchart of an ultrasound-based virtual reality tracking method according to an embodiment of the present invention.
Fig. 2 is a block diagram of an ultrasound-based virtual reality tracking system according to an embodiment of the present invention.
Detailed Description
Fig. 1 shows a flowchart of the ultrasound-based virtual reality tracking method provided by the present invention, which is detailed as follows:
step S1, receiving ultrasonic wave data stream information by using a head-mounted VR device; the VR glasses are embedded with a loudspeaker and two or more microphones, the loudspeaker is used for transmitting ultrasonic data streams, the microphones are used for receiving the ultrasonic data streams transmitted by hands or fingers, the head-mounted VR equipment embedded with the loudspeaker transmits predefined ultrasonic waves, the predefined ultrasonic waves are reflected by the moving hands or fingers and received by the embedded microphones and then transmitted to a central server or an intelligent processing terminal, and the purpose of positioning and tracking a moving object is achieved by utilizing a tracking algorithm. A loudspeaker and more than two microphones are embedded in the head-mounted VR equipment, and the frequency of ultrasonic waves emitted by the head-mounted VR equipment is more than 17000 Hz; the received ultrasonic data stream signal is processed by a central server or an intelligent processing terminal. The continuous ultrasonic data stream with fixed frequency f more than or equal to 17000HZ is sent out by the loudspeaker and is received by the embedded microphone after being reflected by the hand or the finger.
Step S2, performing data purification on the received ultrasonic data stream information. First, interference points are removed; then the data is filtered to suppress the unwanted high-frequency components while retaining the signal reflected by the hand or fingers, preparing it for the subsequent positioning and tracking. The high-frequency and noise components are filtered out with a low-pass or band-pass filter, such as a Butterworth filter or an FIR filter.
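A minimal sketch of such a purification step, using a Butterworth band-pass filter as one of the filter types named above; the sample rate, band edges and filter order are illustrative assumptions:
```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 48000                                 # assumed sample rate (Hz)

def purify(rx, low_hz=16500.0, high_hz=19000.0, order=6):
    """Band-pass the microphone signal so that only the ultrasonic band
    carrying the hand/finger reflections is kept; audible noise and
    out-of-band interference are removed."""
    sos = butter(order, [low_hz, high_hz], btype="bandpass", fs=FS, output="sos")
    return sosfiltfilt(sos, rx)

clean = purify(np.random.randn(FS))        # e.g. one second of raw microphone samples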
Step S3, estimating the relative path and the initial position information from phase changes. The change in the length of the relative path is estimated from the change of the phase, the initial position information of the moving target is estimated from the delay profile obtained by an inverse Fourier transform, and the moving target is located and tracked by combining the relative path and the initial position information. The position information of the moving target is estimated by evaluating the phase information of the ultrasonic signal: the arrival times of the different paths are obtained from the phase changes of the different paths at different frequencies, a delay profile is obtained by the inverse Fourier transform, and the energy distribution is analyzed to obtain the length of the path along which the sound is reflected by the movement of the hand or fingers. By mining these ultrasonic signals, the position trajectory of the head-mounted device is accurately recovered. The moving distance of the moving target is calculated by evaluating the phase change of the target-reflected signal received by a microphone; then, at the different frequencies, the time of arrival (TOA) of each path is calculated by applying an inverse Fourier transform to the phases, and the length of the path reflected by the moving object is estimated by analyzing the energy distribution over the TOA, thereby obtaining the initial position information of the target. The specific process is as follows: S31, removing redundant reflected data stream information during target movement according to the phase change; linear regression is used to fit the paths at different frequencies to the optimal path of the signal reflected by the hand or fingers, so as to filter out interfering signals. S32, calculating the moving direction of the target and the relative path distance using the phase change and the Doppler shift; first, the start and end of the dynamic vector (the signal reflected by the hand or fingers) are determined using the Doppler shift, then the mean value of the extracted segment is calculated and subtracted to obtain the dynamic vector of the target, and finally the path-length change of the moving target over a period of time is estimated from the phase change of the dynamic vector, calculated as follows:
d(t) = d(0) − (θ(t) − θ(0)) · v_c / (2π·f)
where d(t) is the length of the path from the speaker, via the moving object, to the microphone, θ(t) is the phase of the dynamic vector at time t, v_c is the speed of sound of the ultrasonic wave, generally 343 m/s, and f is the frequency of the ultrasonic wave emitted by the speaker. The moving distance of the moving object is therefore (d(t) − d(0))/2.
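One possible realization of this phase-based estimate is sketched below under assumed parameters (sample rate, carrier frequency, low-pass cut-off): the microphone signal is mixed down to baseband at one carrier, the static component is removed by subtracting the mean (the subtraction method described above), and the unwrapped phase of the remaining dynamic vector yields d(t) − d(0):
```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 48000          # assumed sample rate (Hz)
VC = 343.0          # speed of sound (m/s), as given in the text
F0 = 17350.0        # assumed carrier frequency (Hz)

def baseband(rx, f):
    """I/Q demodulation: mix the microphone signal down to baseband at carrier f
    and low-pass it, giving the complex channel response H(f, t)."""
    t = np.arange(len(rx)) / FS
    mixed = rx * np.exp(-2j * np.pi * f * t)
    sos = butter(4, 200.0, btype="low", fs=FS, output="sos")   # keep slow, hand-speed variations
    return sosfiltfilt(sos, mixed.real) + 1j * sosfiltfilt(sos, mixed.imag)

def path_length_change(rx, f=F0):
    """Relative path change d(t) - d(0) from the phase of the dynamic vector."""
    h = baseband(rx, f)
    dyn = h - h.mean()                      # subtract the static (mean) component
    phase = np.unwrap(np.angle(dyn))
    return -(phase - phase[0]) * VC / (2 * np.pi * f)

# The one-way displacement of the hand is then (d(t) - d(0)) / 2.
```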
S33, calculating a power delay spectrum of the path arrival time by applying inverse Fourier transform to the phase change under different frequencies; the calculation formula of the power delay spectrum is as follows:
P(τ, t) = | Σ_k H_p(k, t) · e^{j2π·f_k·τ} |²
where the sum runs over the k different frequencies, f_k denotes the k-th frequency, and H_p(k, t) is the ultrasonic signal received by the microphone at that frequency. S34, estimating the initial position information of the signal reflected by the moving target from the energy in the power delay spectrum. After the power delay spectrum has been calculated, the higher the energy, the stronger the signal reflected by the moving object; and because the ultrasonic signal reflected by the moving object reaches the microphone along different paths with different times of arrival (TOA), the length of the path reflected by the moving object can be estimated by finding the arrival time with the highest energy in the inverse Fourier transform result, and the initial position of the moving target is then calculated from the delay value of the peak in the power delay spectrum.
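A sketch of this step under the additional assumption that the frequencies f_k are evenly spaced by Δf (a spacing the patent does not specify): an inverse FFT across the per-frequency channel estimates H_p(k, t) gives the delay-domain response, and the strongest tap yields the TOA and hence the reflected path length:
```python
import numpy as np

VC = 343.0                                  # speed of sound (m/s)
DF = 350.0                                  # assumed spacing between the K carriers (Hz)

def power_delay_spectrum(h_k):
    """Power delay spectrum at one time instant from the K channel estimates H_p(k, t)."""
    return np.abs(np.fft.ifft(h_k)) ** 2

def initial_path_length(h_k):
    """Reflected path length from the delay tap with the highest energy."""
    pds = power_delay_spectrum(h_k)
    K = len(h_k)
    delays = np.arange(K) / (K * DF)        # delay of each tap, resolution 1/(K*DF) seconds
    toa = delays[np.argmax(pds)]            # time of arrival of the strongest reflection
    return toa * VC                         # speaker -> hand -> microphone path length

d0 = initial_path_length(np.ones(4, dtype=complex))   # toy input: 4 carriers, zero delay
```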
And step S4, estimating and outputting the two-dimensional motion trajectory of the path change from the relative path and the initial position information. A coordinate system is established from the positions of the two microphones and the speaker. Once the relative distance information and the initial position information have been obtained, the absolute position of the moving target can be derived, and its position in the coordinate system is calculated from the geometric relations. The position of the moving target is updated by continuously updating the path lengths of the signals at the two microphones and is then mapped into the VR glasses, thereby achieving real-time tracking and completing the interactive experience.
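The geometric relation can be sketched as follows: each speaker-target-microphone path length defines an ellipse with the speaker and that microphone as foci, and the target lies at the intersection of the two ellipses; the speaker and microphone coordinates and the numerical solver used here are illustrative assumptions:
```python
import numpy as np
from scipy.optimize import least_squares

SPK = np.array([0.0, 0.0])                  # assumed speaker position (m)
MICS = [np.array([-0.08, 0.0]),             # assumed microphone positions on the headset (m)
        np.array([0.08, 0.0])]

def locate(path_lengths, guess=(0.0, 0.3)):
    """2-D target position from the two speaker->target->microphone path lengths."""
    def residual(x):
        return [np.linalg.norm(x - SPK) + np.linalg.norm(x - m) - p
                for m, p in zip(MICS, path_lengths)]
    return least_squares(residual, guess).x

print(locate([0.62, 0.66]))                 # example pair of path lengths (m)
```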
Another object of the present invention is to provide an ultrasound-based virtual reality tracking system, as shown in Fig. 2, which comprises:
a data receiving module for receiving ultrasonic data stream information with the head-mounted VR device;
a purification processing module for performing data purification on the received ultrasonic data stream information;
a distance calculation module for estimating the relative path and the initial position information from phase changes;
and a trajectory calculation and output module for estimating and outputting the two-dimensional motion trajectory of the path change from the relative path and the initial position information.
The distance calculation module further comprises:
a multipath effect elimination unit for removing redundant reflected data stream information during target movement according to the phase change;
a relative distance calculation unit for calculating the moving direction of the target and the relative path distance using the phase change and the Doppler shift;
a power delay spectrum acquisition unit for calculating the power delay spectrum of the path arrival times by applying an inverse Fourier transform to the phase changes at different frequencies;
and an initial position acquisition unit for estimating the initial position information of the signal reflected by the moving target from the energy in the power delay spectrum.
The purification processing module further comprises:
a noise reduction unit for performing noise reduction on the received ultrasonic data stream information to remove interference points;
and a filtering unit for filtering the received ultrasonic data stream information to remove high-frequency signals.
The data receiving module further comprises:
a transmitting unit for transmitting the predefined ultrasonic data stream information from the head-mounted VR device.
A speaker and two or more microphones are embedded in the head-mounted VR device, and the frequency of the ultrasonic waves emitted by the head-mounted VR device is 17000 Hz or higher; the received ultrasonic data stream signal is processed by a central server or an intelligent processing terminal.
The system uses the speaker of the head-mounted device to emit ultrasonic signals, which are then received by the microphones of the head-mounted device, thereby providing an ultrasound-based position tracking and positioning system. A speaker embedded in the head-mounted VR device serves as the transmitter of the wireless ultrasonic waves, two or more microphones are embedded as receivers, and the position trajectory of the head-mounted device is accurately recovered by mining the ultrasonic signals. The moving distance of the moving target is calculated by evaluating the phase change of the target-reflected signal received by a microphone; then, at the different frequencies, the path time of arrival (TOA) is calculated by applying an inverse Fourier transform to the phases, and the length of the path reflected by the moving object is estimated by analyzing the energy distribution over the TOA, thereby obtaining the initial position information of the target. The moving target is located and tracked by combining the relative distance information and the initial position information. The system achieves high accuracy and high robustness and has considerable academic and application value for VR sensing, human-computer interaction, games, and the like.
The virtual reality tracking system of the invention is integrated into head-mounted VR glasses and mainly comprises a speaker and two or more microphones: the speaker transmits a continuous stream of ultrasonic signals, the microphones receive the data stream perturbed by the moving object, and the data received by the microphones are analyzed to determine the direction and moving distance of the moving object, thereby achieving target tracking. The moving distance of the hand or fingers is estimated mainly from the phase change of the signal they reflect. In the 2-D plane, the length of the path along which the sound is reflected by the hand or fingers is coarsely estimated by evaluating the time of arrival (TOA) derived from the phases of the different frequencies along the different paths, which determines the initial position information of the hand or fingers. Combining this with the fine-grained 1-D relative distance then achieves recognition and tracking of 2-D hand or finger motion. The system achieves high accuracy and high robustness and has considerable academic and application value for VR sensing, human-computer interaction, games, and the like.
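Tying the pieces together, a per-frame tracking loop might look like the following sketch; the function names refer to the earlier illustrative sketches (not to any implementation disclosed by the patent), and the frame-by-frame interface is an assumption:
```python
import numpy as np

def track(initial_lengths, per_frame_changes, locate):
    """Accumulate the trajectory of the hand/finger in the headset coordinate frame.

    initial_lengths   : [d1(0), d2(0)] from the power delay spectrum, one per microphone.
    per_frame_changes : iterable of [delta_d1, delta_d2] per frame from the phase tracker.
    locate            : geometric solver mapping two path lengths to an (x, y) position.
    """
    lengths = np.asarray(initial_lengths, dtype=float)
    trajectory = [locate(lengths)]
    for delta in per_frame_changes:
        lengths = lengths + np.asarray(delta, dtype=float)   # continuously updated path lengths
        trajectory.append(locate(lengths))                   # updated 2-D position
    return np.vstack(trajectory)
```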
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (8)

1. An ultrasound-based virtual reality tracking method, characterized by comprising the following steps:
S1, receiving ultrasonic data stream information with a head-mounted VR device;
S2, performing data purification on the received ultrasonic data stream information;
S3, estimating the relative path and the initial position information from phase changes;
S4, estimating and outputting the two-dimensional motion trajectory of the path change from the relative path and the initial position information;
wherein the step S3 further comprises the following steps:
S31, removing redundant reflected data stream information during target movement by means of linear regression;
S32, calculating the moving direction of the target and the relative path distance using the phase change and the Doppler shift;
s33, calculating a power delay spectrum of the path arrival time by applying inverse Fourier transform to the phase change under different frequencies, wherein the calculation formula of the power delay spectrum is as follows:
P(τ, t) = | Σ_k H_p(k, t) · e^{j2π·f_k·τ} |²
wherein the sum runs over the k different frequencies, f_k denotes the k-th frequency, and H_p(k, t) represents the ultrasonic signal received by the microphone;
and S34, estimating the initial position information of the signal reflected by the moving target from the energy in the power delay spectrum.
2. The virtual reality tracking method according to claim 1, wherein the step S2 further comprises the following steps:
S21, performing noise reduction on the received ultrasonic data stream information to remove interference points;
and S22, filtering the received ultrasonic data stream information to remove high-frequency signals.
3. The virtual reality tracking method according to claim 2, wherein the step S1 further comprises the following step: S11, transmitting, by the head-mounted VR device, the predefined ultrasonic data stream information.
4. The virtual reality tracking method according to any one of claims 1-3, wherein a speaker and two or more microphones are embedded in the head-mounted VR device, and the frequency of the ultrasonic waves emitted by the head-mounted VR device is 17000Hz or higher; the received ultrasonic data stream signal is processed by a central server or an intelligent processing terminal.
5. An ultrasound-based virtual reality tracking system, comprising:
a data receiving module for receiving ultrasonic data stream information with a head-mounted VR device;
a purification processing module for performing data purification on the received ultrasonic data stream information;
a distance calculation module for estimating the relative path and the initial position information from phase changes;
a trajectory calculation and output module for estimating and outputting the two-dimensional motion trajectory of the path change from the relative path and the initial position information;
wherein the distance calculation module further comprises:
a multipath effect elimination unit for removing redundant reflected data stream information during target movement by means of linear regression;
a relative distance calculation unit for calculating the moving direction of the target and the relative path distance using the phase change and the Doppler shift;
the power time delay spectrum acquisition unit is used for calculating the power time delay spectrum of the path arrival time by applying inverse Fourier transform to the phase change under different frequencies, and the calculation formula of the power time delay spectrum is as follows:
P(τ, t) = | Σ_k H_p(k, t) · e^{j2π·f_k·τ} |²
wherein the sum runs over the k different frequencies, f_k denotes the k-th frequency, and H_p(k, t) represents the ultrasonic signal received by the microphone;
and an initial position acquisition unit for estimating the initial position information of the signal reflected by the moving target from the energy in the power delay spectrum.
6. The virtual reality tracking system according to claim 5, wherein the purification processing module further comprises:
a noise reduction unit for performing noise reduction on the received ultrasonic data stream information to remove interference points;
and a filtering unit for filtering the received ultrasonic data stream information to remove high-frequency signals.
7. The virtual reality tracking system according to claim 6, wherein the data receiving module further comprises:
a transmitting unit for transmitting the predefined ultrasonic data stream information from the head-mounted VR device.
8. The virtual reality tracking system of any one of claims 5-7, wherein the head-mounted VR device has one speaker and two or more microphones embedded therein, and the frequency of the ultrasonic waves emitted by the head-mounted VR device is 17000Hz or higher; the received ultrasonic data stream signal is processed by a central server or an intelligent processing terminal.
CN201711289669.7A 2017-12-07 2017-12-07 Virtual reality tracking method and system based on ultrasonic waves Active CN108107435B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711289669.7A CN108107435B (en) 2017-12-07 2017-12-07 Virtual reality tracking method and system based on ultrasonic waves

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711289669.7A CN108107435B (en) 2017-12-07 2017-12-07 Virtual reality tracking method and system based on ultrasonic waves

Publications (2)

Publication Number Publication Date
CN108107435A CN108107435A (en) 2018-06-01
CN108107435B (en) 2020-01-17

Family

ID=62209192

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711289669.7A Active CN108107435B (en) 2017-12-07 2017-12-07 Virtual reality tracking method and system based on ultrasonic waves

Country Status (1)

Country Link
CN (1) CN108107435B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112327252B (en) * 2020-10-12 2022-07-15 中国海洋大学 Multi-loudspeaker and multi-microphone based sound wave multi-target tracking method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103308883A (en) * 2013-06-21 2013-09-18 北京交通大学 Arrival angle estimation method based on single antenna
CN105228101A (en) * 2015-09-07 2016-01-06 同济大学 Based on the radiation pattern adaptive approach of Doppler's characteristic of channel
CN107135540A (en) * 2016-02-29 2017-09-05 富士通株式会社 Positioner, method and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102420793A (en) * 2011-11-24 2012-04-18 天津大学 Tracking control method of digital communication receiver time and carrier frequency synchronization
WO2014143867A1 (en) * 2013-03-15 2014-09-18 Springs Window Fashions, Llc Window covering motorized lift and control system motor and operation
US10254392B2 (en) * 2015-09-09 2019-04-09 The United States Of America As Represented By The Secretary Of The Navy Reverse-ephemeris method for determining position, attitude, and time
CN105260024B (en) * 2015-10-15 2018-01-26 广东欧珀移动通信有限公司 A kind of method and device that gesture motion track is simulated on screen
CN105938399B (en) * 2015-12-04 2019-04-12 深圳大学 The text input recognition methods of smart machine based on acoustics
CN105718064A (en) * 2016-01-22 2016-06-29 南京大学 Gesture recognition system and method based on ultrasonic waves

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103308883A (en) * 2013-06-21 2013-09-18 北京交通大学 Arrival angle estimation method based on single antenna
CN105228101A (en) * 2015-09-07 2016-01-06 同济大学 Based on the radiation pattern adaptive approach of Doppler's characteristic of channel
CN107135540A (en) * 2016-02-29 2017-09-05 富士通株式会社 Positioner, method and electronic equipment

Also Published As

Publication number Publication date
CN108107435A (en) 2018-06-01

Similar Documents

Publication Publication Date Title
US20210216135A1 (en) Input device for ar/vr applications
CN108089702B (en) Human-computer somatosensory interaction method and system based on ultrasonic waves
CN105718064A (en) Gesture recognition system and method based on ultrasonic waves
AU2009227717B2 (en) Object and movement detection
WO2020124681A1 (en) Target location apparatus and method for bionic sonar based on double plecotus auritus auricles
CN103064061A (en) Sound source localization method of three-dimensional space
CN110069134B (en) Method for restoring aerial moving track of hand by using radio frequency signal
Wang et al. {MAVL}: Multiresolution analysis of voice localization
CN109901112B (en) Acoustic simultaneous positioning and mapping method based on multi-channel sound acquisition
CN105607042A (en) Method for locating sound source through microphone array time delay estimation
CN108107435B (en) Virtual reality tracking method and system based on ultrasonic waves
CN111643098A (en) Gait recognition and emotion perception method and system based on intelligent acoustic equipment
US10416305B2 (en) Positioning device and positioning method
CN109212481A (en) A method of auditory localization is carried out using microphone array
Van Dam et al. In-air ultrasonic 3D-touchscreen with gesture recognition using existing hardware for smart devices
JP6697982B2 (en) Robot system
CN108872939A (en) Interior space geometric profile reconstructing method based on acoustics mirror image model
Liu et al. Toward device-free micro-gesture tracking via accurate acoustic doppler-shift detection
Zayer et al. PAWdio: Hand input for mobile VR using acoustic sensing
Al-Sheikh et al. Sound source direction estimation in horizontal plane using microphone array
JP2022511271A (en) Control of the device by tracking hand movements using acoustic signals
CN107330462A (en) Gesture identification method and its device based on time frequency analysis
Lopez et al. 3-D audio with dynamic tracking for multimedia environtments
CN110515466B (en) Motion capture system based on virtual reality scene
CN111103980B (en) VR (virtual reality) environment interaction system and method based on FMCW (frequency modulated continuous wave)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant