CN110312198B - Virtual sound source repositioning method and device for digital cinema - Google Patents

Virtual sound source repositioning method and device for digital cinema Download PDF

Info

Publication number
CN110312198B
CN110312198B CN201910608456.9A CN201910608456A CN110312198B CN 110312198 B CN110312198 B CN 110312198B CN 201910608456 A CN201910608456 A CN 201910608456A CN 110312198 B CN110312198 B CN 110312198B
Authority
CN
China
Prior art keywords
sound source
virtual sound
impulse response
response function
audio
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910608456.9A
Other languages
Chinese (zh)
Other versions
CN110312198A (en
Inventor
马士超
张坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LEONIS (BEIJING) INFORMATION TECHNOLOGY CO LTD
Original Assignee
LEONIS (BEIJING) INFORMATION TECHNOLOGY CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LEONIS (BEIJING) INFORMATION TECHNOLOGY CO LTD filed Critical LEONIS (BEIJING) INFORMATION TECHNOLOGY CO LTD
Priority to CN201910608456.9A priority Critical patent/CN110312198B/en
Publication of CN110312198A publication Critical patent/CN110312198A/en
Application granted granted Critical
Publication of CN110312198B publication Critical patent/CN110312198B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04SSTEREOPHONIC SYSTEMS 
    • H04S7/00Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/30Control circuits for electronic adaptation of the sound field
    • H04S7/302Electronic adaptation of stereophonic sound system to listener position or orientation
    • H04S7/303Tracking of listener position or orientation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04SSTEREOPHONIC SYSTEMS 
    • H04S2420/00Techniques used stereophonic systems covered by H04S but not provided for in its groups
    • H04S2420/01Enhancing the perception of the sound image or of the spatial distribution using head related transfer functions [HRTF's] or equivalents thereof, e.g. interaural time difference [ITD] or interaural level difference [ILD]

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Stereophonic System (AREA)

Abstract

The invention provides a virtual sound source repositioning method and a virtual sound source repositioning device for a digital cinema, which comprise the following steps: determining a room impulse response function RIR of the digital cinema; determining a head-related impulse response function (HRIR) from the MIT data set or the CIPIC data set; and transforming the virtual sound source from the initial position to the repositioning position through a room impulse response function RIR and a head-related impulse response function HRIR according to the initial position and the preset repositioning position of the virtual sound source. The method can realize the audio effect equivalent to a plurality of sound devices in a small digital cinema by using a small number of sound devices, and has more accurate repositioning effect.

Description

Virtual sound source repositioning method and device for digital cinema
Technical Field
The invention relates to the technical field of sound systems of digital cinema, in particular to a virtual sound source repositioning method and a virtual sound source repositioning device for a digital cinema.
Background
Digital cinema is a cinema that digitally projects movies. Such theaters use digital copies as the audio/video carrier and play them using state-approved digital projectors. In recent years, digital cinema has shown a trend towards miniaturization, and in small theaters there is often insufficient space to arrange a sufficient number of stereos, or some locations of the cinema are not suitable for installing the sound (e.g. the central location of the screen), which may affect the listening experience of the cinema audience.
Disclosure of Invention
The embodiment of the invention provides a virtual sound source repositioning method and device for a digital cinema, and solves the technical problem that the hearing experience of cinema audiences cannot be met in the prior art.
The virtual sound source repositioning method for the digital cinema comprises the following steps:
determining a room impulse response function RIR of the digital cinema;
determining a head-related impulse response function (HRIR) from the MIT data set or the CIPIC data set;
and transforming the virtual sound source from the initial position to the repositioning position through a room impulse response function RIR and a head-related impulse response function HRIR according to the initial position and the preset repositioning position of the virtual sound source.
The virtual sound source relocating device for digital cinema comprises:
the room impulse response function determining module is used for determining a room impulse response function RIR of the digital cinema;
a head-related impulse response function determining module for determining a head-related impulse response function HRIR from the MIT data set or the CIPIC data set;
and the repositioning module is used for transforming the virtual sound source from the initial position to the repositioning position through a room impulse response function RIR and a head-related impulse response function HRIR according to the initial position and the preset repositioning position of the virtual sound source.
The embodiment of the present invention further provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor implements the method when executing the computer program.
The embodiment of the invention also provides a computer readable storage medium, and the computer readable storage medium stores a computer program for executing the method.
In the embodiment of the invention, the virtual sound source is transformed from the initial position to the repositioning position based on the initial position and the preset repositioning position of the virtual sound source through the determined room impulse response function and the head-related impulse response function, and the method can realize the audio effect equivalent to that of a larger number of sounds in a small-sized digital cinema by using a smaller number of sounds.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a flowchart (one) of a virtual sound source relocation method for digital cinema according to an embodiment of the present invention;
FIG. 2 is a diagram of a room layout for RIR measurement according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an acoustic wave interference;
fig. 4 is a flowchart (two) of a virtual sound source relocation method for digital cinema according to an embodiment of the present invention;
FIG. 5 is a flow chart of a method for eliminating acoustic interference according to an embodiment of the present invention;
fig. 6 is a block diagram (one) of a virtual sound source relocating device for digital cinema according to an embodiment of the present invention;
fig. 7 is a block diagram of a virtual sound source relocating device for digital cinema according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The following describes related art to which the present invention relates.
(1) Room impulse response function
Room Impulse Response (RIR) refers to the sound spectrum filtering function of an audio source arriving from a loudspeaker at a room listener's position. In the same room, the impulse response from the sound source to the receiving point is unique and contains all the acoustic characteristics of the sound field in the room. Impulse response r for a given set of roomsj(k) Where j is 1, …, M is the total number of channels of audio s captured at the listener's position, k is the position parameter of the audio, and the reverberation signal y can be obtained byj(k):
yj(k)=s×rj(k)。
(2) Head-related impulse response function
The sound spectrum filtering function before the sound source reaches the tympanic membrane from the outer ear is called the head-related impulse response (HRIR). The HRIR in the frequency domain corresponds to the HRTF (head related transfer function) in the time domain. HRIR is a set of filters that use binaural time delay, binaural sound intensity difference, pinna frequency vibration, etc. to produce stereo effects. Binaural HRIRs can be viewed as frequency-dependent amplitude and time-delay transformations, which are mainly produced by the complex shape of the pinna. Given a set of binaural impulse responses ei(k) Where i is 1, …, N is the total number of channels s of audio received by the outer ear (if the dimensions are reduced by a downmixing operation, M and N may be different), k is a position parameter of the audio, and a binaural reverberation signal x can be obtained by the following formulai(k):
xi(k)=s×ei(k)。
Based on this, in the embodiment of the present invention, there is provided a virtual sound source relocation method for digital cinema, as shown in fig. 1, the method including:
step 101: determining a room impulse response function RIR of the digital cinema;
step 102: determining a head-related impulse response function (HRIR) from the MIT data set or the CIPIC data set;
step 103: according to the initial position and the preset repositioning position of the virtual sound source, the virtual sound source is transformed from the initial position to the repositioning position through a room impulse response function RIR and a head-related impulse response function HRIR;
the virtual sound source refers to a virtual position of an original audio (a sound emitted from the position is simulated in multiple channels). The initial position of the virtual sound source refers to the initial position of the original audio. The HRIR and RIR functions represent the frequency domain transform relationship between the original audio and the position transformed virtual audio.
The steps 101 to 103 of the present invention are specifically realized as follows:
step 101: room arrangement with measured room impulse response As shown in FIG. 2, A, B, C, D is a plurality of sets of movable loudspeakers, an array of microphonesIs arranged in the center of the room for recording audio signals, let rjroom(k) Representing the room impulse response function, jrom ∈ 1, …, M, M being the number of audio channels acquired at the listener position, k being the position parameter of the audio, inputjroomRepresenting the audio spectrum of the loudspeaker input, outputjroom(k) Representing the audio spectrum acquired at the listener position, the room impulse response function (RIR) of a digital cinema is measured according to the following formula:
Figure BDA0002121534150000041
step 102: a head-related impulse response function (HRIR) is calculated from the MIT or CIPIC data set according to the following formula:
Figure BDA0002121534150000042
wherein e isiroom(k) Representing a head-related impulse response function, iroom ∈ 1, …, N being the total number of audio channels received by the outer ear, k being a location parameter of the audio; inputiear(k) Representing the audio spectrum received by the outer ear; outputiear(k) Representing the audio spectrum acquired at the simulated eardrum location.
MIT data set or CIPIC data set, among others, are known common data sets suitable for fitting HRIR functions.
Step 103: transforming the virtual sound source from the initial position to a repositioned position by a room impulse response function (RIR) and a head-related impulse response function (HRIR) according to the initial position and a preset repositioned position of the virtual sound source, repositioning the processed audio signal modifiedi(k2,k1) Comprises the following steps:
Figure BDA0002121534150000043
wherein, modifiedi(k2,k1) Representing virtual sound source position changesThe latter output; s represents an original virtual sound source signal; k is a radical of1Representing an initial position of a virtual sound source; k is a radical of2Representing the repositioned position of the virtual source of sound; r isi(k2) Represents a pair k2The ith RIR transform of (1); r isi(k1) Represents a pair k1The ith RIR transform of (1); e.g. of the typei(k2) Represents a pair k2The ith HRIR transform of (1); i is 1 to the number of audio channels N.
Wherein r in the formula (3)i(k1)、ri(k2) Determined by calculation of formula (1), e in formula (3)i(k2) Determined by the calculation of formula (2).
In this step, the characteristics of delay, sound intensity, etc. of the multi-channel audio are also required to be adjusted, and such adjustment can be embodied in the RIR and HRIR functions.
In the embodiment of the present invention, when two or more rows of sound waves from different speakers are superimposed in a space, an interference enhancing or attenuating phenomenon occurs between the sound waves, as shown in fig. 3. If a peak between two columns of waves meets a peak (corresponding to the solid line) at some point in space (e.g., point a) at some time, the amplitude here is the sum of the amplitudes of the two columns of waves. If the wave crest and the wave trough (corresponding to the dotted line) of the two rows of waves meet at a certain point (for example, the point b) in the space at a certain moment, the amplitudes of the two rows of waves weaken each other at the point. Because the wavelength of each wavelength band (such as λ and λ/2 in fig. 3) of the stereo audio is different, the interference phenomenon generated by the audio of different frequencies at the same position is also different. This phenomenon can affect the listening experience of theatre patrons.
Based on this, as shown in fig. 4, the virtual sound source repositioning method for digital cinema proposed by the present invention further includes:
step 104: the repositioned virtual sound source (referred to as multi-channel audio) is subjected to sound wave interference cancellation processing to reduce sound wave interference caused by the distance difference of the speakers.
Specifically, as shown in fig. 5, the sound wave interference cancellation processing is performed on the repositioned virtual sound source as follows:
step 1041: dividing the relocated multi-channel audio into a plurality of frequency bands;
step 1042: determining delay variation and/or phase variation of each frequency band according to the position of the loudspeaker and the audio frequency spectrum characteristics;
wherein, according to the loudspeaker position and the audio frequency spectrum characteristic, determining the delay variation and/or phase variation of each frequency band according to the following formula:
Figure BDA0002121534150000051
where Δ t denotes the loudspeaker S1And S2The time difference required between them;
Figure BDA0002121534150000052
represents a loudspeaker S1And S2A desired phase difference therebetween; l is1Represents a loudspeaker S1Distance to listener; l is2Represents a loudspeaker S2Distance to listener; v represents the sound velocity under the current conditions; f represents a band center frequency; mod denotes modulo.
Step 1043: according to the delay change Delta t and/or phase change of each frequency band
Figure BDA0002121534150000053
The repositioned multi-channel audio is delayed and/or phased to attenuate acoustic interference.
The interference elimination method reduces sound wave interference caused by the distance difference of the loudspeakers by delaying/phase-adjusting multi-channel audio so as to improve the hearing experience of cinema audiences.
Based on the same inventive concept, the embodiment of the present invention further provides a virtual sound source relocating device for digital cinema, as described in the following embodiments. Since the principle of solving the problem of the virtual sound source relocating device for digital cinema is similar to the virtual sound source relocating method for digital cinema, the implementation of the virtual sound source relocating device for digital cinema can refer to the implementation of the virtual sound source relocating method for digital cinema, and repeated parts are not repeated. As used hereinafter, the term "unit" or "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
Fig. 6 is a block diagram showing a configuration of a virtual sound source relocating device for digital cinema according to an embodiment of the present invention, as shown in fig. 6, including:
a room impulse response function determining module 601, configured to determine a room impulse response function RIR of the digital cinema;
a head-related impulse response function determining module 602 for determining a head-related impulse response function HRIR from the MIT or CIPIC data set;
a repositioning module 603, configured to transform the virtual sound source from the initial position to a repositioned position according to the initial position of the virtual sound source and a preset repositioned position through the room impulse response function RIR and the head-related impulse response function HRIR.
In this embodiment of the present invention, the room impulse response function determining module 601 is specifically configured to: the room impulse response function RIR of the digital cinema is determined according to equation (1).
The head-related impulse response function determining module 602 is specifically configured to: the head-related impulse response function HRIR is determined from the MIT data set or the CIPIC data set according to equation (2).
The relocation module 603 is specifically configured to: the position of the virtual sound source after repositioning is obtained according to the formula (3).
In an embodiment of the present invention, as shown in fig. 7, the virtual sound source relocation apparatus for digital cinema further includes:
and an acoustic wave interference elimination processing module 604, configured to perform acoustic wave interference elimination processing on the repositioned virtual sound source.
Specifically, the acoustic wave interference cancellation processing module 604 is specifically configured to:
and carrying out sound wave interference elimination processing on the repositioned virtual sound source as follows:
dividing the relocated multi-channel audio into a plurality of frequency bands;
the delay variation and/or phase variation for each frequency band is determined using equation (4) based on the speaker position and the audio spectral characteristics.
And according to the delay change and/or the phase change of each frequency band, carrying out delay and/or phase adjustment on the repositioned multi-channel audio so as to weaken sound wave interference.
The embodiment of the present invention further provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor implements the method when executing the computer program.
The embodiment of the invention also provides a computer readable storage medium, and the computer readable storage medium stores a computer program for executing the method.
In summary, compared with the prior art, the virtual sound source repositioning method and device for digital cinema provided by the invention have the following advantages:
(1) the invention makes the position of the virtual sound source consistent with the position of the object by changing the position of the virtual sound source in the three-dimensional space, and the method can realize the audio effect equivalent to that of a plurality of sound devices by using a small number of sound devices in a small digital cinema;
(2) in the process of virtual sound source relocation, the room impulse response function (RIR) and the head-related impulse response function (HRIR) are comprehensively considered, and compared with a method for relocation by directly utilizing the Room Impulse Response (RIR), the method has a more accurate relocation effect;
(3) aiming at the interference phenomenon among the audio frequencies of all loudspeakers in the digital cinema, the invention provides an interference elimination method in the process of repositioning virtual sound sources, comprehensively considers two schemes of adding delay and modifying phase and combines the two schemes to weaken the interference phenomenon among all loudspeakers and obtain better interference elimination effect;
as will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) containing computer-usable program code.
The present invention has been described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. The present invention may be implemented by a method or system in which each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, is implemented by computer program instructions; and may provide these computer program instructions to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart block or blocks and/or flowchart block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, and the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks and/or flowchart block or blocks.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made to the embodiment of the present invention by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (9)

1. A virtual sound source relocation method for digital cinema, comprising:
determining a room impulse response function RIR of the digital cinema;
determining a head-related impulse response function (HRIR) from the MIT data set or the CIPIC data set;
according to the initial position and the preset repositioning position of the virtual sound source, the virtual sound source is transformed from the initial position to the repositioning position through a room impulse response function RIR and a head-related impulse response function HRIR;
transforming the virtual sound source from the initial position to the repositioned position by the room impulse response function RIR and the head-related impulse response function HRIR according to the initial position and the preset repositioned position of the virtual sound source by the following formula:
Figure FDA0002967297710000011
wherein, modifiedi(k2,k1) Representing the output after the virtual sound source position is changed; s represents an original virtual sound source signal; k is a radical of1Representing an initial position of a virtual sound source; k is a radical of2Representing a preset repositioning position of the virtual sound source; r isi(k2) Represents a pair k2The ith RIR transform of (1); r isi(k1) Represents a pair k1The ith RIR transform of (1); e.g. of the typei(k2) Represents a pair k2The ith HRIR transform of (1); i is 1 to the number of audio channels N.
2. The virtual sound source relocation method for digital cinema according to claim 1, further comprising:
and carrying out sound wave interference elimination processing on the repositioned virtual sound source.
3. The virtual sound source relocation method for digital cinema according to claim 2, wherein the relocated virtual sound source is subjected to the acoustic wave interference cancellation process as follows:
dividing the relocated multi-channel audio into a plurality of frequency bands;
determining delay variation and/or phase variation of each frequency band according to the position of the loudspeaker and the audio frequency spectrum characteristics;
and according to the delay change and/or the phase change of each frequency band, carrying out delay and/or phase adjustment on the repositioned multi-channel audio so as to weaken sound wave interference.
4. The virtual sound source relocation method for digital cinema as claimed in claim 1, wherein the room impulse response function RIR of the digital cinema is determined according to the following formula:
Figure FDA0002967297710000012
wherein r isjroom(k) Representing a room impulse response function, jrom ∈ 1, …, M being the number of audio channels acquired at the listener position, k being the position parameter of the audio; inputjroomAn audio spectrum representing a speaker input; outputjroom(k) Representing the audio spectrum captured at the listener position.
5. The virtual audio source relocation method for digital cinema according to claim 1, characterized in that the head-related impulse response function HRIR is determined from the MIT data set or CIPIC data set according to the following formula:
Figure FDA0002967297710000021
wherein e isiroom(k) Representing head-related impulse response functionsNumber, iroom ∈ 1, …, N is the total number of audio channels received by the outer ear, k is the location parameter of the audio; inputiear(k) Representing the audio spectrum received by the outer ear; outputiear(k) Representing the audio spectrum acquired at the simulated eardrum location.
6. A virtual sound source relocation method for digital cinema according to claim 3, characterized in that the delay variation and/or phase variation of each frequency band is determined from the speaker position and the audio spectral characteristics according to the following formula:
Figure FDA0002967297710000022
where Δ t denotes the loudspeaker S1And S2The time difference required between them;
Figure FDA0002967297710000023
represents a loudspeaker S1And S2A desired phase difference therebetween; l is1Represents a loudspeaker S1Distance to listener; l is2Represents a loudspeaker S2Distance to listener; v represents the sound velocity under the current conditions; f represents a band center frequency; mod denotes modulo.
7. A virtual sound source relocation apparatus for digital cinema, comprising:
the room impulse response function determining module is used for determining a room impulse response function RIR of the digital cinema;
a head-related impulse response function determining module for determining a head-related impulse response function HRIR from the MIT data set or the CIPIC data set;
the repositioning module is used for transforming the virtual sound source from the initial position to the repositioning position through a room impulse response function RIR and a head-related impulse response function HRIR according to the initial position and the preset repositioning position of the virtual sound source;
transforming the virtual sound source from the initial position to the repositioned position by the room impulse response function RIR and the head-related impulse response function HRIR according to the initial position and the preset repositioned position of the virtual sound source by the following formula:
Figure FDA0002967297710000024
wherein, modifiedi(k2,k1) Representing the output after the virtual sound source position is changed; s represents an original virtual sound source signal; k is a radical of1Representing an initial position of a virtual sound source; k is a radical of2Representing a preset repositioning position of the virtual sound source; r isi(k2) Represents a pair k2The ith RIR transform of (1); r isi(k1) Represents a pair k1The ith RIR transform of (1); e.g. of the typei(k2) Represents a pair k2The ith HRIR transform of (1); i is 1 to the number of audio channels N.
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method of any of claims 1 to 6 when executing the computer program.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program for executing the method of any one of claims 1 to 6.
CN201910608456.9A 2019-07-08 2019-07-08 Virtual sound source repositioning method and device for digital cinema Active CN110312198B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910608456.9A CN110312198B (en) 2019-07-08 2019-07-08 Virtual sound source repositioning method and device for digital cinema

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910608456.9A CN110312198B (en) 2019-07-08 2019-07-08 Virtual sound source repositioning method and device for digital cinema

Publications (2)

Publication Number Publication Date
CN110312198A CN110312198A (en) 2019-10-08
CN110312198B true CN110312198B (en) 2021-04-20

Family

ID=68079301

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910608456.9A Active CN110312198B (en) 2019-07-08 2019-07-08 Virtual sound source repositioning method and device for digital cinema

Country Status (1)

Country Link
CN (1) CN110312198B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111586553B (en) * 2020-05-27 2022-06-03 京东方科技集团股份有限公司 Display device and working method thereof

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1294782A (en) * 1998-03-25 2001-05-09 雷克技术有限公司 Audio signal processing method and appts.
CN101267687A (en) * 2007-03-12 2008-09-17 雅马哈株式会社 Array speaker apparatus
KR100932791B1 (en) * 2008-02-21 2009-12-21 한국전자통신연구원 Method of generating head transfer function for sound externalization, apparatus for processing 3D audio signal using same and method thereof
CN102572676A (en) * 2012-01-16 2012-07-11 华南理工大学 Real-time rendering method for virtual auditory environment
CN102665156A (en) * 2012-03-27 2012-09-12 中国科学院声学研究所 Virtual 3D replaying method based on earphone
CN104041081A (en) * 2012-01-11 2014-09-10 索尼公司 Sound Field Control Device, Sound Field Control Method, Program, Sound Field Control System, And Server
CN105792090A (en) * 2016-04-27 2016-07-20 华为技术有限公司 Method and device of increasing reverberation
CN108616789A (en) * 2018-04-11 2018-10-02 北京理工大学 The individualized virtual voice reproducing method measured in real time based on ears

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103237287B (en) * 2013-03-29 2015-03-11 华南理工大学 Method for processing replay signals of 5.1-channel surrounding-sound headphone with customization function
CN104240695A (en) * 2014-08-29 2014-12-24 华南理工大学 Optimized virtual sound synthesis method based on headphone replay
CN109068243A (en) * 2018-08-29 2018-12-21 上海头趣科技有限公司 A kind of 3D virtual three-dimensional sound sense of hearing auxiliary system and 3D virtual three-dimensional sound earphone

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1294782A (en) * 1998-03-25 2001-05-09 雷克技术有限公司 Audio signal processing method and appts.
CN101267687A (en) * 2007-03-12 2008-09-17 雅马哈株式会社 Array speaker apparatus
KR100932791B1 (en) * 2008-02-21 2009-12-21 한국전자통신연구원 Method of generating head transfer function for sound externalization, apparatus for processing 3D audio signal using same and method thereof
CN104041081A (en) * 2012-01-11 2014-09-10 索尼公司 Sound Field Control Device, Sound Field Control Method, Program, Sound Field Control System, And Server
CN102572676A (en) * 2012-01-16 2012-07-11 华南理工大学 Real-time rendering method for virtual auditory environment
CN102665156A (en) * 2012-03-27 2012-09-12 中国科学院声学研究所 Virtual 3D replaying method based on earphone
CN105792090A (en) * 2016-04-27 2016-07-20 华为技术有限公司 Method and device of increasing reverberation
CN108616789A (en) * 2018-04-11 2018-10-02 北京理工大学 The individualized virtual voice reproducing method measured in real time based on ears

Also Published As

Publication number Publication date
CN110312198A (en) 2019-10-08

Similar Documents

Publication Publication Date Title
US10063984B2 (en) Method for creating a virtual acoustic stereo system with an undistorted acoustic center
JP6546351B2 (en) Audio Enhancement for Head-Mounted Speakers
US7545946B2 (en) Method and system for surround sound beam-forming using the overlapping portion of driver frequency ranges
KR100608025B1 (en) Method and apparatus for simulating virtual sound for two-channel headphones
KR101877323B1 (en) Device and method for spatially selective audio playback
KR101524463B1 (en) Method and apparatus for focusing the sound through the array speaker
JP2011097561A (en) Audio system phase equalization
EP3466117A1 (en) Systems and methods for improving audio virtualisation
JP2018515032A (en) Acoustic system
US9355632B2 (en) Apparatus, method and electroacoustic system for reverberation time extension
US10440495B2 (en) Virtual localization of sound
CN117882394A (en) Apparatus and method for generating a first control signal and a second control signal by using linearization and/or bandwidth extension
US20200059750A1 (en) Sound spatialization method
CN110312198B (en) Virtual sound source repositioning method and device for digital cinema
US10945090B1 (en) Surround sound rendering based on room acoustics
JP2020526980A (en) Subband spatial audio enhancement
US9794717B2 (en) Audio signal processing apparatus and audio signal processing method
CN109923877B (en) Apparatus and method for weighting stereo audio signal
CN113645531B (en) Earphone virtual space sound playback method and device, storage medium and earphone
JP2023522995A (en) Acoustic crosstalk cancellation and virtual speaker technology
US20240056735A1 (en) Stereo headphone psychoacoustic sound localization system and method for reconstructing stereo psychoacoustic sound signals using same
US20090052701A1 (en) Spatial teleconferencing system and method
CN107534813B (en) Apparatus for reproducing multi-channel audio signal and method of generating multi-channel audio signal
TW202236255A (en) Device and method for controlling a sound generator comprising synthetic generation of the differential signal
WO1991020165A1 (en) Improved audio processing system and recordings made thereby

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant