WO2021206734A1 - 3D sound reconstruction using head-related transfer functions with wearable devices - Google Patents

3D sound reconstruction using head-related transfer functions with wearable devices

Info

Publication number
WO2021206734A1
WO2021206734A1 (PCT/US2020/027743)
Authority
WO
WIPO (PCT)
Prior art keywords
user
head
audio
audio signals
position coordinates
Prior art date
Application number
PCT/US2020/027743
Other languages
English (en)
Inventor
Mithra VANKIPURAM
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P.
Priority to PCT/US2020/027743
Publication of WO2021206734A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S7/00 Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/30 Control circuits for electronic adaptation of the sound field
    • H04S7/302 Electronic adaptation of stereophonic sound system to listener position or orientation
    • H04S7/303 Tracking of listener position or orientation
    • H04S7/304 For headphones
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00 Details of transducers, loudspeakers or microphones
    • H04R1/10 Earpieces; Attachments therefor; Earphones; Monophonic headphones
    • H04R1/1041 Mechanical or electronic switches, or control elements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R5/00 Stereophonic arrangements
    • H04R5/033 Headphones for stereophonic communication
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S2420/00 Techniques used in stereophonic systems covered by H04S but not provided for in its groups
    • H04S2420/01 Enhancing the perception of the sound image or of the spatial distribution using head related transfer functions [HRTF's] or equivalents thereof, e.g. interaural time difference [ITD] or interaural level difference [ILD]

Definitions

  • FIG. 3 illustrates a flowchart of a method to implement one or more HRTFs with a wearable device to provide a user with three-dimensional (3D) audio.
  • a system may utilize the position and orientation of a user’s head (e.g., the local position of the head) to provide sounds that are perceived by the user as emanating from a particular point or area (e.g., a geographically fixed point or area).
  • the user’s location (e.g., the global position of the user) may be used with the local position of the user’s head to determine how the 3D sound is provided to the user.
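The head-relative geometry described in the points above can be sketched as follows. This is a minimal illustration; the function name, the 2D coordinate convention, and the yaw convention (compass heading, 0 = north, clockwise) are assumptions for the example, not details from the publication.

```python
import math

def relative_azimuth(head_yaw_deg, head_pos, source_pos):
    """Angle of a geographically fixed source relative to the listener's facing direction.

    head_yaw_deg: compass heading of the head (0 = north, clockwise positive).
    head_pos, source_pos: (x, y) pairs in a shared global frame (x = east, y = north).
    Returns degrees in [-180, 180); positive means the source is to the right.
    """
    dx = source_pos[0] - head_pos[0]  # east offset of source from listener
    dy = source_pos[1] - head_pos[1]  # north offset of source from listener
    bearing = math.degrees(math.atan2(dx, dy))  # compass bearing of the source
    # Subtract the head's heading, then wrap into [-180, 180).
    return (bearing - head_yaw_deg + 180.0) % 360.0 - 180.0

# A listener facing north with a source due east hears it at +90 degrees;
# an HRTF for that azimuth would then be applied to the audio signal.
```

Because the source position is fixed in global coordinates, recomputing this angle as the head turns keeps the sound anchored to the same point in space.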
  • the wearable device may provide the audio signals to the user.
  • headphones (e.g., over-ear headphones, in-ear headphones, on-ear headphones, one or more earbuds, etc.)
  • headphones may include an integrated IMU in one or both sides of the headphones, where implemented (e.g., the left and/or right portion for each respective ear of the user).
  • Navigation systems may use auditory feedback to provide information to a user (e.g., through audio systems in automobiles or via headphones in communication with a mobile device).
  • a system may use 3D spatial audio to provide navigational feedback in the direction in which the user is guided. For example, the sound may appear to come from the destination or from the location where the user should turn.
  • portions of the audio may be modulated to enhance audio effects and provide the navigational information at a more intuitive level for users.
  • the volume of the audio may be modulated while directions are being provided (e.g., the volume may be increased as the user approaches a direction change or a target of the navigation).
  • other audio modulation may be implemented, such as altering audio level in a mix.
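A hedged sketch of the distance-based volume modulation described above; the thresholds and the linear ramp are illustrative tuning choices, not values taken from the publication.

```python
def navigation_gain(distance_m, near_m=10.0, far_m=200.0):
    """Playback gain that rises as the listener approaches a turn or target.

    near_m and far_m are hypothetical tuning constants: full volume at or
    inside near_m, a quiet floor at or beyond far_m.
    """
    if distance_m <= near_m:
        return 1.0          # at the direction change or target: full volume
    if distance_m >= far_m:
        return 0.1          # far away: quiet but still audible
    # Linear ramp from the quiet floor (far) up to full volume (near).
    return 0.1 + 0.9 * (far_m - distance_m) / (far_m - near_m)
```

The same shape could modulate a single stem's level in a mix rather than the master volume, matching the other modulation mentioned above.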
  • the mobile device may act to detect (e.g., intermittently detect) the position of the user’s head with another method outside of the sensors in the position device. For example, the mobile device may use cameras to capture the orientation of the user’s head to establish a known position of the user’s head (e.g., a baseline position, an updated verified position, an initial position). The position of the user’s head is considered “known” as the camera captures the actual position for a given time or time frame. The system may then rely on motion tracking provided by the position device to estimate the position of the user’s head between the known captures of the head as the head deviates from the detected known position.
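The capture-then-dead-reckon scheme above can be sketched as a one-axis tracker: gyroscope readings are integrated between camera captures, and each "known" camera fix discards the accumulated drift. The class and its single-axis simplification are assumptions for illustration; a real system would fuse all three axes and filter sensor noise.

```python
class HeadYawTracker:
    """Dead-reckon head yaw from gyro readings between camera 'known' fixes."""

    def __init__(self, initial_yaw_deg):
        # Start from a known (camera-captured) orientation.
        self.yaw = initial_yaw_deg % 360.0

    def on_gyro(self, yaw_rate_dps, dt_s):
        # Integrate angular velocity; estimation error (drift) accumulates
        # over time between known captures.
        self.yaw = (self.yaw + yaw_rate_dps * dt_s) % 360.0

    def on_camera_fix(self, measured_yaw_deg):
        # A camera capture gives the actual head pose for that time frame,
        # so the estimate is reset and accumulated IMU drift is discarded.
        self.yaw = measured_yaw_deg % 360.0
```

Between fixes the estimate is only as good as the IMU; the camera captures bound the error rather than eliminate it.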
  • the mobile device may detect the current global location of the user (e.g., via a global positioning system (GPS) and/or data from a compass of the mobile device) to set the global coordinates of the head.
  • the mobile device may employ a local position device or sensors, such as those discussed above (e.g., an IMU of the mobile device), to determine the orientation of the mobile device.
  • the mobile device may combine those local coordinates with the local coordinates detected from the user’s head, and then correlate those local coordinates with the detected GPS position.
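The correlation step above, combining the device's GPS fix and compass heading with the head's device-relative yaw, can be sketched as below. The function, its dictionary layout, and the assumption that the position device reports head yaw relative to the mobile device are all illustrative, not from the publication.

```python
def global_head_pose(gps_lat, gps_lon, device_heading_deg, head_yaw_rel_deg):
    """Map a locally tracked head orientation into global coordinates.

    gps_lat, gps_lon: the mobile device's GPS fix (shared with the head).
    device_heading_deg: the device's compass heading (0 = north, clockwise).
    head_yaw_rel_deg: head yaw relative to the device, from local tracking.
    """
    return {
        "lat": gps_lat,
        "lon": gps_lon,
        # Local head yaw offset from the device's global heading gives the
        # head's heading in the global frame.
        "head_heading_deg": (device_heading_deg + head_yaw_rel_deg) % 360.0,
    }
```

With the head pose expressed in global coordinates, a geographically fixed sound source can be rendered at the correct direction regardless of how the user turns.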
  • the mobile device may start receiving accelerometer and gyroscope readings from the IMU in the position device to update relative changes to the user’s HRTFs.
  • However, as such readings may be prone to errors and/or noise over time, additional (e.g., subsequent) readings of the known position of the user’s head may be performed to update the position of the user’s head.
  • Each known position capture may enable recalibration of the position of the user’s head to global coordinates when a user looks at the mobile device, and this recalibration can be used to more effectively generate the HRTFs.
  • the system 100 may include a first device (e.g., a mobile device 102) and a second device (e.g., a wearable device 104), where the mobile device 102 receives data from the wearable device 104 and may optionally transmit data to the wearable device 104 (e.g., instructions, status queries, etc.).
  • the mobile device 102 may include a position capture subsystem 106, a local coordinate tracking subsystem 108, a global tracking subsystem 110 (e.g., a global positioning system (GPS)), and a communications subsystem 112.
  • the mobile device 102 may include one or more of a processor 114, memory 116, and a network interface 118 (e.g., to enable wired and/or wireless communications with other devices) connected to a computer-readable storage medium 120 (e.g., a non-transitory computer-readable storage medium) via a communication bus 122.
  • the processor 114 may execute or otherwise process instructions stored in the computer-readable storage medium 120.
  • the instructions stored in the computer-readable storage medium 120 include operational modules 106 through 112 to implement the subsystems described herein.
  • the wearable device 104 may include a local coordinate tracking subsystem 124, a communications subsystem 126, and an audio subsystem 128.
  • the systems and respective devices may be similar to those examples illustrated in and described with reference to FIGS. 1 and 2.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Stereophonic System (AREA)

Abstract

Systems, devices, and methods may include determining a reference position of a user's head and monitoring movement of the user's head relative to the reference position. Head-related transfer functions (HRTFs) may be generated to provide audio signals for delivering three-dimensional (3D) sound to the user through an audio device.
PCT/US2020/027743 2020-04-10 2020-04-10 3D sound reconstruction using head-related transfer functions with wearable devices WO2021206734A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2020/027743 WO2021206734A1 (fr) 2020-04-10 2020-04-10 3D sound reconstruction using head-related transfer functions with wearable devices


Publications (1)

Publication Number Publication Date
WO2021206734A1 (fr) 2021-10-14

Family

ID=78023533

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/027743 WO2021206734A1 (fr) 2020-04-10 2020-04-10 3D sound reconstruction using head-related transfer functions with wearable devices

Country Status (1)

Country Link
WO (1) WO2021206734A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013052132A2 (fr) * 2011-10-03 2013-04-11 Qualcomm Incorporated Image-based head position tracking method and system
US20150189423A1 (en) * 2012-07-13 2015-07-02 Razer (Asia-Pacific) Pte. Ltd. Audio signal output device and method of processing an audio signal
US20160131761A1 (en) * 2014-11-10 2016-05-12 Valve Corporation Positional tracking systems and methods
US20170295446A1 (en) * 2016-04-08 2017-10-12 Qualcomm Incorporated Spatialized audio output based on predicted position data


Similar Documents

Publication Publication Date Title
KR102448284B1 (ko) Head-mounted display tracking system
EP2700907B1 (fr) Acoustic navigation method
US9459692B1 (en) Virtual reality headset with relative motion head tracker
CN110140099B (zh) System and method for tracking a controller
JP6391685B2 (ja) Orientation and visualization of virtual objects
US9351090B2 (en) Method of checking earphone wearing state
EP3631600B1 (fr) Commande dynamique des paramètres de performance dans un sous-système d'étalonnage de capteur à six degrés de liberté
CN104205880B (zh) Orientation-based audio control
US20170045736A1 (en) Image display device, computer program, and image display system
US20210400418A1 (en) Head to headset rotation transform estimation for head pose tracking in spatial audio applications
CN108022302B (zh) Inside-out spatially positioned AR stereoscopic display device
KR102230645B1 (ko) Virtual reality, augmented reality, and mixed reality systems with spatialized audio
WO2019176308A1 (fr) Information processing device, information processing method, and program
CN107014378A (zh) Gaze-tracking aiming control system and method
US20140219485A1 (en) Personal communications unit for observing from a point of view and team communications system comprising multiple personal communications units for observing from a point of view
US20210400420A1 (en) Inertially stable virtual auditory space for spatial audio applications
US20130191068A1 (en) Head tracking system
US10248191B2 (en) Virtual rigid framework for sensor subsystem
EP4206788A1 (fr) Systems and methods for correcting a vehicle-induced change of direction
US20210263592A1 (en) Hand and totem input fusion for wearable systems
CN113343457A (zh) Simulation test method, apparatus, device, and storage medium for autonomous driving
WO2021206734A1 (fr) 3D sound reconstruction using head-related transfer functions with wearable devices
EP4102470A1 (fr) Telecollaboration method and system
US11356788B2 (en) Spatialized audio rendering for head worn audio device in a vehicle
WO2020087041A1 (fr) Mixed reality device tracking

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20930131

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20930131

Country of ref document: EP

Kind code of ref document: A1