WO2013051085A1 - Audio signal processing device, audio signal processing method, and audio signal processing program - Google Patents


Info

Publication number
WO2013051085A1
Authority
WO
WIPO (PCT)
Prior art keywords
phase difference
audio signal
signal processing
speakers
speaker
Prior art date
Application number
PCT/JP2011/072773
Other languages
English (en)
Japanese (ja)
Inventor
久司 大和田
一郎 菅井
知己 長谷川
輝夫 馬場
Original Assignee
パイオニア株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パイオニア株式会社 filed Critical パイオニア株式会社
Priority to PCT/JP2011/072773 priority Critical patent/WO2013051085A1/fr
Publication of WO2013051085A1 publication Critical patent/WO2013051085A1/fr

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S 7/00 Indicating arrangements; Control arrangements, e.g. balance control
    • H04S 7/30 Control circuits for electronic adaptation of the sound field
    • H04S 7/302 Electronic adaptation of stereophonic sound system to listener position or orientation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S 1/00 Two-channel systems

Definitions

  • the present invention relates to an audio signal processing apparatus that performs processing for localizing a sound image.
  • Patent Document 1 describes a technique for controlling the localization position of a sound source by changing the level difference between an L channel (left channel) and an R channel (right channel). This technology is used for auto pan (panning), which is one of the effects of DJ performance.
  • Patent Document 2 describes a technique for localizing the reproduced sound of an audio signal outside the speakers by giving the audio signals of the left and right channels a phase delay that increases with frequency, without changing their frequency characteristics.
  • An object of the present invention is to provide an audio signal processing device, an audio signal processing method, and an audio signal processing program capable of appropriately localizing a low-frequency component sound image at a desired position during speaker reproduction.
  • The audio signal processing device that processes the audio signals supplied to the two speakers includes phase difference control means for performing control to give a relative phase difference between the audio signals supplied to each of the two speakers so that an interaural phase difference within a predetermined range occurs at the listening position in a low frequency range at or below a predetermined value.
  • The audio signal processing method executed by the audio signal processing device that processes the audio signals supplied to the two speakers includes giving a relative phase difference between the audio signals supplied to each of the two speakers so that an interaural phase difference within a predetermined range occurs at the listening position in a low frequency range at or below a predetermined value.
  • The audio signal processing program executed by an audio signal processing apparatus that has a computer and processes audio signals supplied to two speakers causes the computer to function as phase difference control means for performing control to give a relative phase difference between the audio signals supplied to each of the two speakers so that an interaural phase difference within a predetermined range occurs at the listening position in a low frequency range at or below a predetermined value.
  • An audio signal processing device that performs processing on audio signals supplied to two speakers includes phase difference control means for performing control to give a relative phase difference between the audio signals supplied to each of the two speakers so that an interaural phase difference within a predetermined range occurs at the listening position in a low frequency range at or below a predetermined value.
  • the above audio signal processing apparatus performs processing on the audio signals supplied to the two speakers in order to control the localization position of the sound image.
  • the audio signal processing apparatus is preferably used as a DJ device.
  • The phase difference control means performs control to give a relative phase difference between the audio signals supplied to the two speakers so that an interaural phase difference within a predetermined range (a phase difference between the sounds the listener hears at the two ears) occurs at the listening position in a low frequency range at or below a predetermined value. A desired interaural phase difference can thereby be appropriately realized in the low frequency range, so the sound image of the low-frequency component can be appropriately localized at a desired position during reproduction by the speakers.
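As a minimal, non-authoritative illustration of this control (a sketch, not the patent's implementation), the following Python snippet generates a two-channel low-frequency test tone whose left channel is advanced relative to the right by a chosen inter-channel phase difference; the sample rate, duration, and function name are assumptions for the example.

```python
import numpy as np

def two_channel_tone(freq_hz, icpd_deg, fs=48000, dur=0.1):
    """Left/right sine tones whose relative (inter-channel) phase
    differs by icpd_deg; positive values advance the left channel."""
    t = np.arange(int(fs * dur)) / fs
    left = np.sin(2 * np.pi * freq_hz * t + np.deg2rad(icpd_deg))
    right = np.sin(2 * np.pi * freq_hz * t)
    return left, right

# 100 Hz tone with a 150-degree inter-channel phase difference
left, right = two_channel_tone(100.0, 150.0)
```

A DSP stage driving the speakers 2L and 2R would apply such a relative phase to the incoming program material in real time rather than synthesizing tones.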
  • The phase difference control means sets the phase difference based on the speaker opening angle formed by the speakers and the listening position, and on the distance between the speakers and the listening position. The phase difference can thereby be appropriately controlled according to the speaker opening angle and the speaker distance, and a desired interaural phase difference can be effectively realized.
  • the phase difference control means can increase the phase difference when the speaker opening angle is small compared to when the speaker opening angle is large. This is because when the speaker opening angle is small, the interaural phase difference required for appropriately localizing the sound image tends to be larger than when the speaker opening angle is large.
  • The phase difference control means can increase the phase difference when the distance is long compared to when the distance is short. This is because when the distance between the speakers and the listening position is long, the interaural phase difference necessary for appropriately localizing the sound image tends to be larger than when the distance is short.
  • the phase difference control means sets the phase difference based on the frequency of the audio signal.
  • the phase difference can be appropriately controlled according to the frequency of the audio signal, and a desired interaural phase difference can be effectively realized.
  • the phase difference control means can increase the phase difference when the frequency is low compared to when the frequency is high. This is because when the frequency of the audio signal is low, the inter-channel phase difference required to obtain the same interaural phase difference tends to be larger than when the frequency is high.
  • The phase difference control means can perform control to give the phase difference so that an interaural phase difference within a range of 25° to 65° is generated.
  • The low frequency range is the range below the lower limit of the frequency at which the localization position of a sound image can be controlled by changing the level difference between the audio signals supplied to the two speakers.
  • the “low range” used for the control is defined based on the lower limit value of the frequency at which the interaural phase difference necessary for appropriately localizing the sound image is obtained by the control related to “pan”. Specifically, a frequency band lower than the lower limit value can be used as a “low band”.
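The band check implied by this definition is a simple threshold. In the sketch below, the 200 Hz default is only a placeholder taken from the "for example, 200 [Hz] or less" figure mentioned later in the text, not a value fixed by the claims.

```python
def is_low_band(freq_hz, pan_lower_limit_hz=200.0):
    """True if freq_hz lies below the lower limit of the band in which
    level-difference ("pan") control can still localize the sound image.
    The 200 Hz default is an assumed placeholder."""
    return freq_hz <= pan_lower_limit_hz

# The simulation frequencies used later (50-130 Hz) all fall in the low band
low = [f for f in (50, 63, 80, 100, 130) if is_low_band(f)]
```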
  • The listening position is located on the perpendicular bisector of the line connecting the two speakers.
  • The two speakers are arranged in front of, directly beside, or behind the listening position.
  • An audio signal processing method executed by an audio signal processing apparatus that processes audio signals supplied to two speakers gives a relative phase difference between the audio signals supplied to each of the two speakers so that an interaural phase difference within a predetermined range occurs at the listening position in a low frequency range at or below a predetermined value.
  • An audio signal processing program executed by an audio signal processing apparatus that has a computer and processes audio signals supplied to two speakers causes the computer to function as phase difference control means for performing control to give a relative phase difference between the audio signals supplied to each of the two speakers so that an interaural phase difference within a predetermined range occurs at the listening position in a low frequency range at or below a predetermined value.
  • the above-described audio signal processing method and audio signal processing program can also appropriately localize a low-frequency component sound image at a desired position during reproduction by a speaker.
  • FIG. 1 is a diagram schematically illustrating an acoustic system 10 according to the first embodiment.
  • the acoustic system 10 is suitably used as a DJ device.
  • the acoustic system 10 mainly includes phase control units 1L and 1R and speakers 2L and 2R.
  • The phase control units 1L and 1R receive the same audio signal and perform control to add a phase to the input audio signal. Specifically, the phase control units 1L and 1R perform control to give a phase to the input audio signal so that a relative phase difference occurs between the audio signals output to the speakers 2L and 2R.
  • A desired phase difference may be realized by only one of the phase control units 1L and 1R adding a phase to the audio signal, or by both of the phase control units 1L and 1R adding phases to the audio signals.
  • the phase control units 1L and 1R are realized by an effector, a DSP, an amplifier, and the like.
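What a phase control unit does can be sketched in the frequency domain as a uniform phase rotation of all non-DC components. This is an assumed illustration only; a real effector/DSP implementation would more likely use allpass filters or delays.

```python
import numpy as np

def add_phase(signal, phase_deg):
    """Rotate the phase of all non-DC frequency components by phase_deg.
    Frequency-domain sketch of a phase control unit (assumed, not the
    patent's implementation)."""
    spec = np.fft.rfft(signal)
    spec[1:] = spec[1:] * np.exp(1j * np.deg2rad(phase_deg))
    return np.fft.irfft(spec, n=len(signal))
```

Applying `add_phase` to only one channel (or opposite half-shifts to both) realizes the relative inter-channel phase difference described above.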
  • the speakers 2L and 2R output the audio signals after being controlled by the phase controllers 1L and 1R, respectively.
  • The phase control units 1L and 1R control the phase difference between the audio signals of the speakers 2L and 2R, that is, the phase difference between the channels corresponding to the speakers 2L and 2R (hereinafter referred to as the "inter-channel phase difference"), thereby controlling the localization position of the sound image during reproduction by the speakers 2L and 2R.
  • the phase control units 1L and 1R correspond to the “audio signal processing device” in the present invention and function as “phase difference control means”.
  • FIG. 2 shows a specific example of an acoustic space to which the acoustic system 10 is applied.
  • The speakers 2L and 2R are arranged in front of the listening position (that is, the listener's position): the speaker 2L at the front left and the speaker 2R at the front right.
  • The listening position is generally located on the perpendicular bisector of the line connecting the speakers 2L and 2R.
  • The phase control units 1L and 1R control the inter-channel phase difference so as to localize the sound image at an arbitrary position between the speakers 2L and 2R during reproduction by the speakers 2L and 2R. Specifically, they localize the sound image near the center between the speakers 2L and 2R (hereinafter, "center localization"), as indicated by the broken-line area A1; near the speaker 2L (hereinafter, "left localization"), as indicated by the broken-line area A2; or near the speaker 2R (hereinafter, "right localization"), as indicated by the broken-line area A3.
  • FIG. 3 is a diagram for explaining a control method according to a comparative example.
  • the control method according to the comparative example is realized by an acoustic system 10x.
  • the acoustic system 10x includes multipliers 4L and 4R instead of the phase controllers 1L and 1R.
  • the multipliers 4L and 4R control the level of the audio signal by multiplying the input audio signal by a predetermined coefficient (a value between 0 and 1). Specifically, the multipliers 4L and 4R control the level difference between the audio signals output to the speakers 2L and 2R, respectively.
  • such multipliers 4L and 4R control the level difference between the audio signals of the speakers 2L and 2R, thereby controlling the localization position of the sound image during reproduction by the speakers 2L and 2R.
  • FIG. 3B shows a specific example of coefficients used by the multipliers 4L and 4R.
  • the horizontal axis indicates the localization position
  • the vertical axis indicates the coefficients used by the multipliers 4L and 4R.
  • the localization position shown in the center corresponds to “center localization”
  • the localization position shown on the left side corresponds to “left localization”
  • the localization position shown on the right side corresponds to "right localization"
  • For left localization, the coefficient of the multiplier 4L is set to a value larger than the coefficient of the multiplier 4R in order to make the level of the left channel larger than the level of the right channel. Conversely, for right localization, the coefficient of the multiplier 4R is set to a value larger than the coefficient of the multiplier 4L in order to make the level of the right channel larger than the level of the left channel.
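The comparative level-difference ("pan") control can be sketched as a cross-fade between coefficients in [0, 1]. The linear curve below is an assumption standing in for the coefficient curves of FIG. 3B, which are not given numerically in the text.

```python
def pan_coefficients(position):
    """position in [-1, 1]: -1 = left localization, 0 = center,
    +1 = right localization. Returns (left, right) multiplier
    coefficients, each a value between 0 and 1."""
    left = (1.0 - position) / 2.0   # larger toward left localization
    right = (1.0 + position) / 2.0  # larger toward right localization
    return left, right
```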
  • FIG. 4 shows an example of a simulation result when the control according to the comparative example is performed.
  • In this simulation, the interval between the speakers 2L and 2R is set to "4.2 [m]", the distance between the center position of the speakers 2L and 2R and the listening position to "2.1 [m]", and the distance between both ears of the listener to "0.25 [m]".
  • Other simulation conditions are as follows. Note that a low frequency is used as the frequency of the input signal.
  • Input signal: sine wave
  • Input signal frequency: 50, 63, 80, 100, 130 [Hz]
  • Sound field: free sound field / point sound source
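Under the stated conditions (sine input, free sound field, point sources), the interaural phase difference at the listening position can be estimated with a simple propagation model. The coordinates, the 343 m/s speed of sound, and all names below are assumptions for this sketch, not the patent's simulation code: each ear receives the sum of the two speaker signals with 1/d amplitude decay and propagation phase, and the interaural phase difference is the phase difference of the two summed signals.

```python
import numpy as np

C = 343.0  # assumed speed of sound [m/s]

def interaural_phase_deg(freq_hz, spk_l, spk_r, phase_l_deg=0.0,
                         phase_r_deg=0.0, ear_gap=0.25):
    """Interaural phase difference [deg] at a listening position at the
    origin, ears ear_gap apart on the x axis, for point sources in a
    free field."""
    k = 2 * np.pi * freq_hz / C                       # wavenumber
    ears = (np.array([-ear_gap / 2, 0.0]), np.array([ear_gap / 2, 0.0]))
    ph = []
    for ear in ears:
        total = 0j
        for spk, deg in ((np.array(spk_l, float), phase_l_deg),
                         (np.array(spk_r, float), phase_r_deg)):
            d = np.linalg.norm(spk - ear)
            total += np.exp(1j * (np.deg2rad(deg) - k * d)) / d
        ph.append(np.angle(total))
    return float(np.rad2deg(ph[0] - ph[1]))

# Geometry from the text: speakers 4.2 m apart, 2.1 m in front
ipd = interaural_phase_deg(100.0, (-2.1, 2.1), (2.1, 2.1),
                           phase_l_deg=150.0)
```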
  • FIG. 4B shows an example of the interaural level difference obtained by the control according to the comparative example.
  • the horizontal axis indicates the localization position
  • the vertical axis indicates the binaural level difference [dB].
  • FIG. 4C shows an example of the binaural phase difference obtained by the control according to the comparative example.
  • the horizontal axis indicates the localization position
  • the vertical axis indicates the binaural phase difference [°].
  • The results obtained for each of a plurality of frequencies are shown superimposed. Note that the "localization position" shown on the horizontal axis of FIGS. 4B and 4C is not the position where the sound image is actually localized, but the target position at which the control attempts to localize the sound image. (This definition also applies to the later graphs whose horizontal axis shows the "localization position".)
  • In the first embodiment, in order to appropriately localize the low-frequency component sound image at a desired position during reproduction by the speakers 2L and 2R, control that gives a relative phase difference between the channels is performed. That is, when realizing left localization or right localization in a low frequency range in which it is difficult to properly localize a sound image by controlling the level difference, the phase control units 1L and 1R control the inter-channel phase difference so that a desired interaural phase difference occurs.
  • the phase control units 1L and 1R control the inter-channel phase difference so that the interaural phase difference within a range from 25 [°] to 65 [°] is generated.
  • It is known that the interaural time difference (generally synonymous with the interaural phase difference) and the interaural level difference (the amplitude or level difference of the interaural transfer function) serve as cues for direction perception in the horizontal plane.
  • It is known that the relationship between the interaural time difference and the localization position is linear up to an interaural time difference of "630 [μsec]", and that the localization position does not change once the interaural time difference exceeds "1 [msec]".
  • The frequency ranges in which the interaural level difference and the interaural time difference affect localization are as shown in FIG. 5A. From FIG. 5A, in order to obtain a desired localization in a low frequency range (for example, 200 [Hz] or less), an interaural level difference of "10 [dB]" or more, or a sufficient interaural time difference (in other words, interaural phase difference), is considered necessary. Therefore, in this embodiment, in order to obtain a desired localization in the low frequency range, control is performed based not on the interaural level difference but on the interaural time difference, that is, on the interaural phase difference. Specifically, the inter-channel phase difference is controlled so that an interaural phase difference at which the sound image is properly localized in the low frequency range is obtained.
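Since the interaural time difference and the interaural phase difference are treated as generally synonymous here, their relation at a given frequency is simply IPD = 360·f·ITD (one period equals 360 degrees). The small converter below illustrates this; the 630 μs figure is the linearity limit cited above.

```python
def itd_to_ipd_deg(itd_sec, freq_hz):
    """Interaural time difference [s] -> interaural phase difference
    [deg] at frequency freq_hz: one period equals 360 degrees."""
    return 360.0 * freq_hz * itd_sec

# The 630-microsecond linearity limit corresponds, at 100 Hz, to an
# interaural phase difference of about 22.7 degrees.
ipd_at_limit = itd_to_ipd_deg(630e-6, 100.0)
```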
  • FIG. 5B shows a result of one example (hereinafter, referred to as “Experiment 1”) of the localization experiment using the binaural phase difference.
  • In Experiment 1, the phase difference and the level difference between the channels are controlled to form a state in which the sound image is localized to the left or right, and the interaural phase difference is measured in that state.
  • The experimental conditions in Experiment 1 are as follows.
    ・Location: anechoic chamber
    ・Signal: narrow-band noise burst signal
    ・Signal center frequency: 50, 63, 80, 100, 125, 160, 200 [Hz]
    ・Signal level: 90 [dB]
    ・Test subjects: 4 persons
  • FIG. 5B shows the frequency [Hz] on the horizontal axis and the interaural phase difference [°] on the vertical axis. That is, FIG. 5B shows the interaural phase difference at each frequency in a state where the sound image is localized to the left or right. From this, it is understood that localization of the sound image is obtained when an interaural phase difference of about 25 to 65 [°] occurs.
  • FIG. 5C shows the result of another example (hereinafter referred to as “Experiment 2”) of a localization experiment based on an interaural phase difference.
  • In Experiment 2, signals to which various interaural phase differences had been added were auditioned using headphones, and an answer was obtained as to whether or not localization was obtained for each interaural phase difference.
  • The experimental conditions in Experiment 2 are as follows.
    ・Signal: narrow-band noise burst signal
    ・Signal center frequency: 50, 63, 80, 100, 125, 160, 200 [Hz]
    ・Test subjects: 3 persons
  • FIG. 5C shows the interaural phase difference [°] on the horizontal axis and the probability [%] that localization is obtained on the vertical axis. The result shown in FIG. 5C is obtained by accumulating the response results at the various frequencies (center frequencies). From FIG. 5C, it can be seen that the probability that localization is obtained is 75 [%] or more for interaural phase differences of about 25 to 65 [°]. Therefore, it is considered that localization of a sound image is obtained when an interaural phase difference of about 25 to 65 [°] occurs.
  • In view of the results of Experiment 1 and Experiment 2, control is performed so that an interaural phase difference of about 25 to 65 [°] occurs in the low frequency range.
  • Specifically, when realizing left localization or right localization in the low frequency range, the phase control units 1L and 1R control the inter-channel phase difference so that an interaural phase difference of about 25 to 65 [°] in absolute value occurs. For example, by obtaining through experiments or simulations the interaural phase differences produced by various inter-channel phase differences, an inter-channel phase difference that produces an interaural phase difference of about 25 to 65 [°] in absolute value can be found.
  • When realizing left localization or right localization, the phase control units 1L and 1R can perform control to add a phase to the audio signal supplied to both or one of the speakers 2L and 2R so that the inter-channel phase difference found and stored in this way is realized.
  • FIG. 6 shows simulation conditions used in the control according to the first embodiment.
  • The distance between the speakers 2L and 2R is "4.2 [m]", the distance between the center position of the speakers 2L and 2R and the listening position is "2.1 [m]", and the distance between both ears of the listener is assumed to be "0.25 [m]".
  • Other simulation conditions are as follows. These simulation conditions are the same as the simulation conditions used in the control according to the above-described comparative example.
  • Input signal: sine wave
  • Input signal frequency: 50, 63, 80, 100, 130 [Hz]
  • Sound field: free field / point sound source
  • FIG. 6B shows a specific example of the inter-channel phase difference provided at each localization position in the control according to the first embodiment.
  • the horizontal axis indicates the localization position
  • the vertical axis indicates the inter-channel phase difference [°].
  • The inter-channel phase difference on the vertical axis takes a positive value when the phase of the left channel is more advanced than that of the right channel, and a negative value when the phase of the right channel is more advanced than that of the left channel.
  • The inter-channel phase difference is set so as to decrease monotonically as the localization position moves from left to right.
  • For left localization, the inter-channel phase difference is set to about "150 [°]"; that is, the phase of the left channel is set to advance about "150 [°]" relative to the phase of the right channel.
  • For center localization, the inter-channel phase difference is set to "0 [°]".
  • For right localization, the inter-channel phase difference is set to about "−150 [°]"; that is, the phase of the right channel is set to advance about "150 [°]" relative to the phase of the left channel.
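The FIG. 6B mapping (about +150 [°] for left localization, 0 [°] for center, about −150 [°] for right localization, decreasing monotonically) can be sketched as a piecewise-linear interpolation; the [−1, 1] position scale is an assumption for the example.

```python
import numpy as np

# Positions: -1 = left localization, 0 = center, +1 = right localization
POSITIONS = (-1.0, 0.0, 1.0)
ICPD_DEG = (150.0, 0.0, -150.0)  # values read from FIG. 6B

def icpd_for_position(position):
    """Inter-channel phase difference [deg] for a target localization
    position; positive means the left channel leads."""
    return float(np.interp(position, POSITIONS, ICPD_DEG))
```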
  • FIG. 7A shows an example of a simulation result when the control according to the above-described comparative example is performed (that is, when “pan” is performed).
  • the graph in FIG. 7A is the same as that shown in FIG. 4C.
  • FIG. 7B shows a simulation result example when the control according to the first embodiment is performed.
  • the horizontal axis indicates the localization position
  • the vertical axis indicates the binaural phase difference [°].
  • the results obtained for each of a plurality of frequencies are shown superimposed. For example, the interaural phase difference is measured by installing two microphones at positions corresponding to the listener's both ears. It is assumed that the control according to the first embodiment and the control according to the comparative example are performed using the same simulation conditions.
  • a desired binaural phase difference can be appropriately realized in a low frequency range. Therefore, according to the first embodiment, it is possible to appropriately localize a low-frequency component sound image at a desired position during reproduction by the speakers 2L and 2R.
  • phase control units 1L and 1R increase the inter-channel phase difference when the speaker opening angle is small compared to when the speaker opening angle is large. Further, the phase control units 1L and 1R increase the inter-channel phase difference when the speaker distance is long compared to when the speaker distance is short.
  • FIG. 8 is a diagram for explaining the definition of the speaker opening angle and the speaker distance.
  • the listening position is located on the vertical bisector 72 of the line segment 71 connecting the speakers 2L and 2R.
  • The speaker opening angle θ is defined as the angle formed by the line segment 73L connecting the speaker 2L and the listening position and the perpendicular bisector 72, or equivalently as the angle formed by the line segment 73R connecting the speaker 2R and the listening position and the perpendicular bisector 72. These two angles are equal because the listening position is located on the perpendicular bisector 72 of the line segment 71 connecting the speakers 2L and 2R.
  • The speaker distance L is defined as the length of the line segment 73L connecting the speaker 2L and the listening position, or equivalently as the length of the line segment 73R connecting the speaker 2R and the listening position. These two lengths are also equal because the listening position is located on the perpendicular bisector 72 of the line segment 71 connecting the speakers 2L and 2R.
  • Since the positions of the speakers 2L and 2R are uniquely determined by the listening position, the speaker opening angle θ, and the speaker distance L, the terms "speaker arrangement" and "speaker position" are used below for positions defined by the speaker opening angle θ and the speaker distance L.
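Under the FIG. 8 definitions, placing the listening position at the origin with the perpendicular bisector along the +y axis, the speaker coordinates follow directly from the speaker opening angle θ and the speaker distance L; the helper below is an illustrative sketch.

```python
import math

def speaker_positions(theta_deg, L):
    """(x, y) coordinates of speakers 2L and 2R for opening angle
    theta_deg (measured from the perpendicular bisector) and speaker
    distance L, with the listening position at the origin."""
    th = math.radians(theta_deg)
    x, y = L * math.sin(th), L * math.cos(th)
    return (-x, y), (x, y)
```

For the simulation geometry used earlier (speakers 4.2 m apart and 2.1 m in front of the listening position), this corresponds to θ = 45 [°] and L = 2.1·√2 ≈ 2.97 [m].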
  • FIG. 9 shows an example of a suitable speaker arrangement. Specifically, the speaker arrangement in which an interaural phase difference of about 25 to 65 [°] is obtained by controlling the inter-channel phase difference is illustrated.
  • FIGS. 9A and 9B express the speaker arrangement by showing the speaker opening angle θ [°] and the speaker distance L [m] in polar coordinates.
  • the speaker opening angle ⁇ is represented by an azimuth angle from a vertical line passing through the origin (listening position)
  • the speaker distance L is represented by a distance from the origin.
  • Only the speaker opening angle θ and the speaker distance L for the speaker 2L are shown; that is, only an arrangement example of the speaker 2L is shown.
  • FIG. 9A shows an example of speaker arrangement in which an interaural phase difference of about 25 to 65 [°] is obtained by controlling the inter-channel phase difference when the frequency is set to “50 [Hz]”.
  • FIG. 9B shows an example of speaker arrangement in which an interaural phase difference of about 25 to 65 [°] is obtained by controlling the inter-channel phase difference when the frequency is set to "130 [Hz]".
  • FIGS. 9A and 9B show that an interaural phase difference of about 25 to 65 [°] can be obtained if the speaker 2L is arranged within the frame represented by the bold line. Note that the results shown in FIGS. 9A and 9B are obtained, for example, by simulation as described above.
  • When the speaker opening angle θ is large, the speaker distance L at which a desired interaural phase difference can be obtained is longer than when the speaker opening angle θ is small; conversely, when the speaker opening angle θ is small, the speaker distance L at which a desired interaural phase difference is obtained is shorter than when the speaker opening angle θ is large. That is, when the speaker opening angle θ is large, a desired interaural phase difference can be obtained even if the speaker distance L is increased to some extent (for example, up to about 27 [m]), whereas when the speaker opening angle θ is small, the speaker distance L must be shortened to some extent to obtain the desired interaural phase difference (for example, down to about 7 [m]).
  • Although FIG. 9 shows only an arrangement example of the speaker 2L, the same naturally applies to the speaker 2R.
  • FIG. 10 illustrates the relationship between the speaker arrangement and the phase difference between channels. Specifically, an inter-channel phase difference that can obtain an interaural phase difference of about 25 to 65 [°] is illustrated for each speaker position.
  • FIGS. 10A and 10B express the speaker arrangement by showing the speaker opening angle θ [°] and the speaker distance L [m] in polar coordinates (the detailed definitions are the same as in FIG. 9).
  • the relationship between the speaker arrangement and the inter-channel phase difference is illustrated only for the speaker 2L.
  • FIG. 10A illustrates an inter-channel phase difference [°] necessary to obtain an interaural phase difference of “25 [°]” at each speaker position.
  • FIG. 10B illustrates an inter-channel phase difference [°] necessary to obtain an interaural phase difference of “65 [°]” at each speaker position.
  • the results shown in FIGS. 10A and 10B are obtained when the frequency is set to “100 [Hz]”. Such a result is obtained, for example, by a simulation as described above.
  • From FIGS. 10A and 10B, it turns out that the inter-channel phase difference necessary to obtain the desired interaural phase difference (25 [°] or 65 [°]) depends on the speaker opening angle θ and the speaker distance L. Specifically, when the speaker opening angle θ is large, the inter-channel phase difference necessary for obtaining a desired interaural phase difference is smaller than when the speaker opening angle θ is small. In other words, when the speaker opening angle θ is small, the inter-channel phase difference necessary to obtain the desired interaural phase difference is larger than when the speaker opening angle θ is large (that is, closer to "180 [°]").
  • Similarly, when the speaker distance L is short, the inter-channel phase difference required to obtain the desired interaural phase difference is smaller than when the speaker distance L is long; when the speaker distance L is long, the inter-channel phase difference necessary to obtain the desired interaural phase difference is larger than when the speaker distance L is short (that is, closer to "180 [°]").
  • Although FIG. 10 shows the relationship between the speaker arrangement and the inter-channel phase difference only for the speaker 2L, the same naturally applies to the speaker 2R.
  • The phase control units 1L and 1R control the inter-channel phase difference based on the relationship between the speaker arrangement and the inter-channel phase difference as shown in FIG. 10. That is, the phase control units 1L and 1R perform control to add a phase to the audio signal supplied to both or one of the speakers 2L and 2R so that a desired interaural phase difference is obtained at the currently set speaker opening angle θ and speaker distance L.
  • For example, an arithmetic expression that gives the inter-channel phase difference yielding a desired interaural phase difference for each speaker position is created by experiment or simulation, and the phase control units 1L and 1R can perform control by obtaining from this expression the inter-channel phase difference corresponding to the current speaker position.
  • Alternatively, the inter-channel phase differences at which a desired interaural phase difference is obtained for each speaker position are determined by experiment or simulation and stored as table data, and the phase control units 1L and 1R can perform control by reading from the table data the inter-channel phase difference corresponding to the current speaker position. Note that the current speaker position can be acquired, for example, by input from the user.
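The table-driven variant might look like the following sketch. The table entries are hypothetical placeholders (the patent obtains the real values by experiment or simulation); only their trend, a smaller inter-channel phase difference at larger opening angles, follows the text.

```python
# Hypothetical (opening angle [deg], speaker distance [m]) -> inter-channel
# phase difference [deg] entries; real values come from experiment/simulation.
PHASE_TABLE = {
    (30.0, 3.0): 165.0,
    (45.0, 3.0): 150.0,
    (60.0, 3.0): 130.0,
}

def icpd_from_table(theta_deg, distance_m):
    """Nearest-entry lookup by speaker position; theta and distance may
    come from user input, as the text notes."""
    key = min(PHASE_TABLE,
              key=lambda k: (k[0] - theta_deg) ** 2 + (k[1] - distance_m) ** 2)
    return PHASE_TABLE[key]
```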
  • the inter-channel phase difference can be appropriately controlled according to the speaker opening angle and the speaker distance, and the desired interaural phase difference can be effectively realized. Therefore, according to the second embodiment, the sound image of the low frequency component can be localized at a desired position more reliably during reproduction by the speakers 2L and 2R.
  • the speaker opening angle θ and the speaker distance L are not limited to the definitions shown in FIG.
  • the speaker opening angle can be defined as an angle formed by a line segment 73L connecting the speaker 2L and the listening position and a line segment 73R connecting the speaker 2R and the listening position.
  • the speaker opening angle in this example is twice the above-described speaker opening angle θ.
  • the speaker distance can be defined as the distance on the perpendicular bisector 72 from the speakers 2L and 2R to the listening position.
  • the speaker distance in this example is “L · cos θ” when expressed in terms of the speaker opening angle θ and the speaker distance L.
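The two conventions can be converted directly. A minimal sketch (the function name is ours, not the patent's):

```python
import math

def to_alternative_definition(theta_deg, L):
    """Convert (half opening angle theta, speaker-to-listener distance L)
    into the alternative definitions: the full opening angle 2*theta and
    the distance L*cos(theta) along the perpendicular bisector."""
    return 2.0 * theta_deg, L * math.cos(math.radians(theta_deg))
```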
  • the inter-channel phase difference may be controlled according to the frequency of the audio signal.
  • the inter-channel phase difference can be increased when the frequency is low compared to when the frequency is high (in other words, the inter-channel phase difference is reduced when the frequency is high compared to when it is low). This is because the inter-channel phase difference required to obtain the same interaural phase difference changes according to the frequency.
  • the inter-channel phase difference necessary to obtain the same interaural phase difference is considered to be larger when the frequency is low than when it is high. Therefore, in another example, the inter-channel phase difference is increased when the frequency is low compared to when it is high.
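One simple way to realize this frequency dependence is to scale a reference inter-channel phase inversely with frequency, capped at 180°. The 1/f law, reference frequency, and cap below are illustrative assumptions, not taken from the patent.

```python
def frequency_dependent_phase(base_phase_deg, f, f_ref=150.0, max_deg=180.0):
    """Enlarge the inter-channel phase difference below the reference
    frequency f_ref and reduce it above, capped at max_deg.
    The 1/f scaling law is an illustrative assumption."""
    return min(max_deg, base_phase_deg * (f_ref / f))
```

For a 60° reference phase, the control yields a larger phase at 75 Hz than at 300 Hz, matching the stated tendency.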
  • the present invention can be used for DJ equipment, for example.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Stereophonic System (AREA)

Abstract

The present invention relates to an audio signal processing device that processes audio signals supplied to two speakers in order to control the localization position of a sound image. Phase difference control means performs control so as to apply a relative phase difference between the audio signals supplied to each of the two speakers, such that an interaural phase difference within a predetermined range is produced at the listening position, for frequencies in a low-frequency range at or below a predetermined value. Consequently, the desired interaural phase difference can be obtained appropriately in the low-frequency range. Under these conditions, during reproduction by the speakers, it becomes possible to appropriately localize the sound image of the low-frequency component at a desired position.
PCT/JP2011/072773 2011-10-03 2011-10-03 Audio signal processing device, audio signal processing method, and audio signal processing program WO2013051085A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/072773 WO2013051085A1 (fr) 2011-10-03 2011-10-03 Audio signal processing device, audio signal processing method, and audio signal processing program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/072773 WO2013051085A1 (fr) 2011-10-03 2011-10-03 Audio signal processing device, audio signal processing method, and audio signal processing program

Publications (1)

Publication Number Publication Date
WO2013051085A1 (fr)

Family

ID=48043278

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/072773 WO2013051085A1 (fr) 2011-10-03 2011-10-03 Audio signal processing device, audio signal processing method, and audio signal processing program

Country Status (1)

Country Link
WO (1) WO2013051085A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104837106A (zh) * 2015-05-25 2015-08-12 上海音乐学院 Audio signal processing method and apparatus for spatialized sound

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58194500A (ja) * 1982-04-30 1983-11-12 Nippon Hoso Kyokai <Nhk> Audio signal zooming device
JPS6157797U (fr) * 1984-09-20 1986-04-18
JP2002354597A (ja) * 2001-03-22 2002-12-06 New Japan Radio Co Ltd Pseudo-stereo circuit and pseudo-stereo device
JP2003061198A (ja) * 2001-08-10 2003-02-28 Pioneer Electronic Corp Audio reproduction device
JP2011097561A (ja) * 2009-11-02 2011-05-12 Harman Becker Automotive Systems Gmbh Audio system phase equalization

Similar Documents

Publication Publication Date Title
KR101827032B1 (ko) Stereo image enlargement system
US10375503B2 (en) Apparatus and method for driving an array of loudspeakers with drive signals
US10356528B2 (en) Enhancing the reproduction of multiple audio channels
AU2015413301B2 (en) Apparatus and method for sound stage enhancement
US9609418B2 (en) Signal processing circuit
JP4924119B2 (ja) Array speaker device
US20050089181A1 (en) Multi-channel audio surround sound from front located loudspeakers
CN104641659A (zh) Speaker apparatus and audio signal processing method
US20110268299A1 (en) Sound field control apparatus and sound field control method
EP2856775A1 (fr) Stereophonic widening with arbitrarily configured loudspeakers
EP3089476A1 (fr) Système sonore
US20190037334A1 (en) Methods and systems for providing virtual surround sound on headphones
JP2007228526A (ja) Sound image localization apparatus
KR20120067294A (ko) 가상 서라운드 렌더링을 위한 스피커 어레이
JP6380060B2 (ja) Speaker device
US20170272889A1 (en) Sound reproduction system
US20080175396A1 (en) Apparatus and method of out-of-head localization of sound image output from headpones
WO2010149166A1 (fr) Digital signal processor (DSP)-based device for the auditory segregation of multiple sound inputs
WO2013057906A1 (fr) Audio signal reproduction apparatus and audio signal reproduction method
WO2013051085A1 (fr) Audio signal processing device, audio signal processing method, and audio signal processing program
US8929557B2 (en) Sound image control device and sound image control method
JP5418256B2 (ja) Audio processing device
US20090052676A1 (en) Phase decorrelation for audio processing
CN107534813B (zh) Apparatus for reproducing a multichannel audio signal and method for generating a multichannel audio signal
US11974106B2 (en) Array augmentation for audio playback devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11873572

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11873572

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP