EP1878308B1 - Device and method for generating and processing sound effects in spatial sound reproduction systems using a graphical user interface


Info

Publication number
EP1878308B1
Authority
EP
European Patent Office
Prior art keywords
impulse responses
sound
user interface
spatial
graphic user
Prior art date
Legal status
Not-in-force
Application number
EP06742644A
Other languages
German (de)
English (en)
Other versions
EP1878308A2 (fr)
Inventor
Frank Melchior
Jan Langhammer
Current Assignee
Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV
Original Assignee
Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV
Priority date
Filing date
Publication date
Application filed by Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV filed Critical Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV
Publication of EP1878308A2 publication Critical patent/EP1878308A2/fr
Application granted granted Critical
Publication of EP1878308B1 publication Critical patent/EP1878308B1/fr

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S7/00 Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/40 Visual indication of stereophonic sound image
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S2420/00 Techniques used in stereophonic systems covered by H04S but not provided for in its groups
    • H04S2420/13 Application of wave-field synthesis in stereophonic audio systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S7/00 Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/30 Control circuits for electronic adaptation of the sound field
    • H04S7/305 Electronic adaptation of stereophonic audio signals to reverberation of the listening space

Definitions

  • the present invention relates to modern audio technologies, and more particularly to the generation and processing of spatial sound impressions for sound reproduction systems.
  • the use of multiple loudspeakers makes it possible to precisely locate individual sound sources in the room and to create the impression, within the playback environment, of being inside a simulated space such as a stage or a cathedral.
  • two different playback concepts can be distinguished.
  • in conventional surround sound reproduction, which is also customary in the field of home entertainment, the location and room information is already mixed during the sound mixing process into the individual channels to be transmitted discretely; a reproduction system comprising several loudspeakers is then used to reproduce the individual channels.
  • the reproducing speakers should be in a predetermined position relative to the playback environment in order to achieve the best possible spatial impression.
  • More advanced systems, such as wave-field-synthesis-based room simulations, generate the drive signals for the individual speakers only during reproduction, based on the position information of a sound source with respect to the reproduction space and the room information of a reproduction environment to be simulated.
  • much more authentic results can be achieved with respect to localization and spatial impression, since the individual loudspeaker setup is taken into account during playback and can be used to create a wavefront in the playback environment that best represents the spatial impression to be simulated.
  • WFS wave field synthesis
  • Applied to acoustics, any shape of an incoming wavefront can be simulated by a large number of loudspeakers arranged side by side (a so-called speaker array).
  • the audio signals for each speaker must be fed with a time delay and an amplitude scaling such that the radiated sound fields of the individual speakers overlap properly.
  • the contribution of each source to each speaker is calculated separately and the resulting signals are added together. If the sources to be reproduced are located in a virtual room with reflective walls, the reflections must also be reproduced as additional sources via the loudspeaker array. The computational effort therefore depends heavily on the number of sound sources, the reflection characteristics of the room and the number of loudspeakers.
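The per-source, per-speaker calculation described above can be sketched in a few lines. The geometry, the simple 1/r amplitude model and the function name `speaker_signal` are illustrative assumptions, not the patent's formulas:

```python
import math

C = 343.0   # speed of sound in m/s
FS = 48000  # sample rate in Hz

def speaker_signal(sources, speaker_pos, n_samples):
    """Sum the delayed, amplitude-scaled contribution of every virtual
    source into one loudspeaker signal (simplified point-source model)."""
    out = [0.0] * n_samples
    for pos, signal in sources:
        r = math.dist(pos, speaker_pos)     # distance source -> speaker
        delay = int(round(r / C * FS))      # propagation delay in samples
        gain = 1.0 / max(r, 1e-6)           # crude 1/r amplitude decay
        for i, s in enumerate(signal):
            j = i + delay
            if j < n_samples:
                out[j] += gain * s          # contributions simply add up
    return out

# one virtual source 2 m in front of a speaker at the origin
sources = [((0.0, 2.0), [1.0])]
sig = speaker_signal(sources, (0.0, 0.0), 512)
```

Reflections would be handled the same way: each mirrored source is simply appended to `sources`, which is why the effort grows with the number of sources and speakers.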
  • the advantage of this technique is in particular that a natural spatial sound impression over a large area of the playback room is possible.
  • the direction and distance of sound sources are reproduced very accurately.
  • virtual sound sources can even be positioned between the real speaker array and the listener.
  • Wavefield synthesis (WFS, or sound field synthesis), as developed at TU Delft in the late 1980s, represents a holographic approach to sound reproduction. Its basis is the Kirchhoff-Helmholtz integral, which states that any sound field within a closed volume can be generated by means of a distribution of monopole and dipole sound sources (loudspeaker arrays) on the surface of this volume. Details can be found in MM Boone, ENG Verheijen, PF.
  • a synthesis signal is calculated for each loudspeaker of the loudspeaker array from an audio signal emitted by a virtual source at a virtual position, the synthesis signals being designed in amplitude and phase such that the wave resulting from the superposition of the individual sound waves emitted by the loudspeakers of the array corresponds to the wave that would result if the virtual source at the virtual position were a real source at a real position.
  • multiple virtual sources exist at different virtual locations.
  • the computation of the synthesis signals is performed for each virtual source at each virtual location, so that one virtual source typically contributes to several synthesis signals. Seen from a loudspeaker, that loudspeaker thus receives several synthesis signals which go back to different virtual sources. A superposition of these signals, which is possible due to the linear superposition principle, then gives the reproduced signal actually emitted by the speaker.
  • Spatial sound reproduction systems such as wave field synthesis make it possible to generate sound in 360 degrees around the audience area with optimal spatial resolution. So far, these systems have been used essentially for positioning discrete sound sources and for direct sound reproduction. In addition, all known linear signal processing can be applied to the signals of the sound sources generated in this way, such as adding reverberation. In spatial sound reproduction systems such as Wave Field Synthesis (WFS), it is also possible to generate spatial effects based on the direct sound. This happens, for example, in room simulation, in which, for reasons of efficiency, the reproduction can be simplified to a limited number of spatial directions (plane waves).
  • WFS Wave Field Synthesis
  • a room to be sounded is irradiated by as many individual loudspeakers as possible in order to allow the reconstruction of wavefronts with the best possible accuracy.
  • For localizing sound signals and generating a spatial impression, a variety of parameters are usually used, which have to be determined individually for each speaker during the mixing of the sound signal.
  • multichannel sound reproduction systems are characterized by extremely high complexity, so that the additional generation of spatial or location information during mixing requires a large number of parameters which describe, individually for each loudspeaker, the location information or additional linear signal processing steps (for generating acoustic effects).
  • This description, based on a large number of abstract mathematical parameters without direct intuitive meaning, is difficult to control, especially in wave field synthesis systems.
  • wave field synthesis offers the possibility of freely positioning sound sources on a two-dimensional listening plane. This is done by synthesizing different wavefronts depending on the position of the sound sources.
  • User interfaces as currently used position the sound source by means of a point in a plan view of the two-dimensional listening plane, the point representing the position of the sound source.
  • although the spatial position of the sound source is adequately visualized in this approach, the sound impression of depth (spatial impression) cannot be represented simultaneously in the visualization; there are thus discrepancies between real perception and presentation, so that only in a few exceptional cases is a visual image provided that corresponds to the real sound impression or allows conclusions about it.
  • the international patent application WO 2004/047485 A1 describes an audio reproduction system based on wave field synthesis and deals in particular with a possible modularization of the wave field synthesis system in order to adapt it successively to changing reproduction environments.
  • the object of the present invention is to provide a graphical user interface that allows to more efficiently control a sound reproduction system for generating a spatial sound impression.
  • the present invention is based on the finding that a sound reproduction system which can produce a spatial sound impression in a reproduction environment can be controlled efficiently and intuitively by means of a graphical user interface if an impulse response assigned to a spatial direction with respect to the reproduction environment, or a representation obtained from the impulse response, is displayed graphically, and if the possibility is created for a user to change this representation graphically, so that, based on the user change input, the changed impulse response can be displayed and detected in order to control the sound reproduction system. Since it is in principle possible to describe all known linear signal processing by impulse responses, the graphical user interface according to the invention can give a sound designer intuitive graphical access to directional sound effects, thus increasing the efficiency and quality of the control of the sound reproduction system.
  • the signals for plane waves can be generated by convolution with corresponding spatial impulse responses associated with corresponding spatial directions.
  • rooms can also be simulated, the impulse responses used according to the invention being visualized directly in addition to a description by the underlying parameters.
  • the new tool for sound design according to the invention consists of a simultaneous visualization of all direction-dependent impulse responses corresponding to a source. The sound design takes place through direct interaction with this visualization. The processing of the visual representation is converted into a parametric description, from which the corresponding impulse responses are generated.
  • the direction information or a spatiality is thus impressed on a sound signal by a mathematical convolution with an impulse response, which will be briefly explained below for a better understanding of the inventive concept.
  • a spatial signal, reflection pattern or location information is impressed on a tone signal f(y) by convolution with an impulse response g(x), so that the combined tone signal F(x) results according to the following convolution integral:
  • F(x) = ∫₋∞^∞ f(y) · g(x − y) dy
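In discrete time, the convolution integral becomes a finite sum. A minimal, self-contained illustration (the signal and impulse-response values are arbitrary):

```python
def convolve(f, g):
    """Discrete convolution F[n] = sum_k f[k] * g[n - k]."""
    n = len(f) + len(g) - 1
    F = [0.0] * n
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            F[i + j] += fi * gj
    return F

tone = [1.0, 0.5, 0.25]       # short sound signal f
ir = [1.0, 0.0, 0.0, 0.6]     # impulse response g: direct sound plus one echo
wet = convolve(tone, ir)
# the echo copies the tone, scaled by 0.6, three samples later
```

The impulse response thus acts as a recipe: each of its pulses produces a delayed, scaled copy of the original signal.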
  • a Dirac impulse is characterized by an infinitesimal width and additionally by the fact that its integral, as described above, is finite. In the case of a sound signal, this means that a Dirac pulse is arbitrarily short but carries a fixed acoustic energy.
  • the simplest impulse response is again a Dirac pulse, which is registered at the place of emission of the test pulse with a propagation delay t after the test pulse was transmitted. This is exactly the case if there is an ideal reflector in the direction in which the test pulse was emitted, reflecting the acoustic test signal without attenuation, the transit time between the location of emission and the reflector then being exactly t/2.
  • in the following, real pulses whose width is finite and whose area is A are also referred to as Dirac pulses.
  • such real pulses can be imagined, for example, as Gaussian curves of small width with area A.
  • if the reflector described above absorbed some of the acoustic energy, thus dampening the test signal, the reflected Dirac pulse received after the transit time t would have a smaller area B under the curve than the original pulse (B < A).
  • In addition to the idealized simple cases of an impulse response described so far, arbitrarily complex impulse responses are also possible. If, for example, two reflectors are located at different distances from the place of emission of the test signal, corresponding to the acoustic propagation times t1 and t2, the impulse response will consist of two Dirac pulses received at the times 2·t1 and 2·t2.
  • real acoustic scenes are very complex, so that a real impulse response will be a dense pulse sequence beginning with early reflections, whose components arriving later in time describe, for example, a reverberation.
  • an impulse response in the form of a Dirac pulse describes a delay or an echo.
  • a multiple echo can be represented by a sum of Dirac-shaped pulses.
  • in general, the impulse response that is convolved with the sound signal will be continuous, e.g. a signal rising sharply at a time t0 and then gently decaying, which describes a multiple reflection in which signals reflected at later times are more strongly attenuated.
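Such a sharply rising and gently decaying impulse response can be modelled, purely for illustration, by an exponential envelope; the function name, parameters and values are assumptions, not the patent's model:

```python
import math

def reverb_tail(t0, decay, length):
    """Impulse response that rises sharply at sample t0 and then decays
    exponentially -- a crude model of a dense multiple reflection."""
    return [0.0] * t0 + [math.exp(-decay * k) for k in range(length - t0)]

# silence until sample 4, then a sharp onset with a smooth decay
ir = reverb_tail(t0=4, decay=0.5, length=16)
```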
  • in real rooms, sound signals are additionally attenuated in a frequency-selective manner; for example, high-frequency sound signals are more strongly damped by carpets and wall hangings than low-frequency ones.
  • different impulse responses can therefore, for example, be used and visualized separately for several frequency ranges, or the visualization of the impulse response must cover both the time and the frequency range.
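A frequency-selective damping of this kind can be sketched with a simple one-pole low-pass filter; this is an illustrative stand-in for frequency-range-specific impulse responses, not the method of the patent:

```python
def lowpass(signal, alpha):
    """One-pole low-pass filter: attenuates high frequencies more strongly
    than low ones, mimicking absorption by carpets and wall hangings
    (smoothing factor alpha in (0, 1])."""
    out = []
    y = 0.0
    for x in signal:
        y = alpha * x + (1.0 - alpha) * y  # exponential smoothing
        out.append(y)
    return out

# a single sharp reflection, smeared and damped by the absorbing surface
reflection = [1.0] + [0.0] * 7
damped = lowpass(reflection, alpha=0.3)
```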
  • in one embodiment, the graphical user interface is used to represent the spatial position of a sound source relative to the sound reproduction system and to visualize the resulting impulse responses, which represent, individually for each loudspeaker of the reproduction system, the spatial orientation of the sound signal with respect to that loudspeaker.
  • the user can graphically change the position of the source with respect to the reproduction environment, whereby the loudspeaker-individual impulse responses, or the parameters for controlling the loudspeakers, result automatically from the illustrated wavefront of the point-shaped signal source.
  • a sound engineer thus has the opportunity to intuitively create the complex parameters needed to control the sound reproduction system.
  • An essential aspect here is that the possibility is additionally created of changing the impulse responses directly by graphical interaction with the user interface, it being immediately shown how the current change affects the perceived position of the sound source.
  • with the graphic user interface according to the invention it is thus advantageously possible to choose whether one wants to place the sound source directly according to physical reality or whether one wishes to use the possibilities of changing the impulse response creatively.
  • a sound engineer can therefore choose between two possibilities of visual sound processing and pursue the approach that is most advantageous for the desired sound result or the spatial sound impression that is to be achieved.
  • the graphical user interface according to the invention is used to display impulse responses that contain information about a room to be simulated.
  • in one embodiment, the display device represents the impulse responses, with respect to a fixed point within the playback environment, in the spatial directions for which they carry the spatial information.
  • a user thus has the advantage that he receives all the information concerning the spatial sound impression at the same time, or that he can change it simultaneously, the changed spatial sound impression resulting from a change being displayed and assessable at any time.
  • the graphical representation also makes it possible to carry out the design process detached from technical conditions.
  • in general, an impulse response function will be stored discretely, i.e. an associated amplitude value exists for discrete points in time.
  • the intuitive operation of the graphical user interface does not need to take this into account as the relevant parameters are automatically generated based on a graphical change in the displayed impulse response.
  • Another advantage is that the complexity of a system can easily be increased without the intuitiveness of the operation suffering from the increased number of parameters.
  • in a further embodiment, the impulse responses for multiple spatial directions can be represented or edited in a frequency-selective manner. This makes it possible to further increase the naturalness of the spatial impression, for example by assuming different frequency-dependent attenuation profiles for different spatial directions, which on the one hand increases the authenticity of the sound impression achieved, but on the other hand also increases the complexity of generating the parameters.
  • with the visual representation it is still possible to predict the achievable sound experience and to change it creatively, for example by introducing a strong artificial damping at a certain frequency for a freely selectable spatial direction. These changes are immediately visible, and their impact on the entire sound scene in the context of the overall system can be reliably predicted.
  • the same parameters can be used to describe the room for all spatial directions, which corresponds to a diffuse reverberation.
  • Direction-dependent spatial portions are only attached afterwards. This results in a specific room impulse response for each spatial direction; an unwanted deviation of the parameters for a spatial direction can be detected and corrected immediately.
  • a further advantage of the three-dimensional representation according to the invention is that the frequency-selective impulse response representation for each direction can easily be converted into a matrix representation by simple sampling, which makes further processing extremely efficient.
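The conversion into a matrix representation by sampling can be sketched as follows; the helper name and the trivial per-direction impulse responses are hypothetical:

```python
def to_matrix(irs_by_direction, n_samples):
    """Sample each direction's impulse response onto a common time grid,
    yielding a (directions x samples) matrix for efficient processing."""
    matrix = []
    for ir in irs_by_direction:
        # truncate long responses, zero-pad short ones
        row = list(ir[:n_samples]) + [0.0] * max(0, n_samples - len(ir))
        matrix.append(row)
    return matrix

# eight spatial directions, each with its own (here: trivial) impulse response
irs = [[1.0 / (d + 1)] for d in range(8)]
m = to_matrix(irs, n_samples=4)
```

Once in this rectangular form, all directions can be processed with uniform matrix operations.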
  • in one embodiment, individual delay times are set for a given number of spatial directions, the delay times being represented as Dirac-shaped impulse responses. These are shown relative to a fixed point in the playback environment in a three-dimensional view. It is particularly advantageous that the graphical manipulation, which allows shifting of the Dirac-shaped impulse responses with respect to a reference point, directly reflects the spatial effect visually.
  • the Dirac-shaped impulse responses corresponding to a delay describe precisely a reflection at an object, increasing the distance of the impulse response from the reference point in the graph corresponding to increasing the propagation time of the reflected signal.
  • the direct correspondence of the graphical representation to the simulated reality can thus be used to simulate in a most efficient manner, for example, spaces within which the reproduction environment is located.
  • a particular advantage of this simplified type of interior design is the high degree of intuitiveness of the presentation and the associated reduced probability of errors in the control of a sound reproduction system.
  • the graphical user interface for a sound reproduction system is operated with a signal generator which generates loudspeaker signals for a plurality of loudspeakers mounted at different spatial positions.
  • the high level of intuitiveness and user-friendliness of the graphical user interface makes it possible to manipulate the reproduction of signal sources in real time in such a way that the acoustic location of a sound signal, for example a singer on the stage, matches the visual impression.
  • for this, only tracking of the moving sound source within the graphical user interface according to the invention is necessary, which would not be feasible by means of classical parameter input for the loudspeaker system to be controlled.
  • Fig. 1 is a block diagram illustrating the operation of a graphical user interface 10 of the present invention, having a display means 12 for displaying an impulse response, means 14 for permitting a change in the graphic display, means 16 for receiving a user change input, and means 18 for detecting the changed impulse response.
  • the display device 12 graphically presents the impulse responses to the user such that the effects of changing the impulse responses presented can be intuitively interpreted and predicted.
  • the device for enabling the change of the graphic display 14 has access to the display device 12 and the data visualized by it.
  • for this, a user input is required, which is received by the means 16 for receiving a user change input; the change may be made, for example, by means of a computer mouse, a touch pad, or interaction and visualization techniques from virtual reality systems.
  • the display device 12 can now graphically display a changed impulse response.
  • by means of the means 14 for allowing a change and the means 16 for receiving a user change input, an iterative procedure of user input and subsequent graphical update becomes possible.
  • This has the great advantage that the effect of a user change can be checked directly, graphically or acoustically. Explicitly performing the changes and subsequently checking them by test listening within a sound reproduction system can thereby be eliminated, which contributes significantly to cost and time savings.
  • the modified impulse response is detected and, for example, stored for further use.
  • the possibility of storing the impulse response can advantageously be used to reuse an already generated impulse response, which describes a specific space to be simulated, for further projects.
  • Fig. 2 shows schematically how, based on the visualization of the graphical user interface shown in Fig. 3a or 3b, it is possible to determine the position of a sound source by means of a graphical user interface according to the invention, or to change an existing position so that a desired positional impression is created.
  • the position of a sound source relative to the reproduction environment is initially determined graphically.
  • the graphical user interface graphically illustrates, in the second step 22, the impulse responses representing the position of the sound source, which can be changed directly by the user.
  • Fig. 3a or 3b shows an embodiment of a graphical user interface according to the invention for determining the spatial position of a sound source or for changing the impulse responses representing the sound source.
  • the position of the ball describes the position of the sound source 30 in space. Based on the position of the point source 30, the wavefront 34 is shown which results from the sound radiation of the point-shaped signal source. If, for example, the point source 30 is moved to a point in space farther from the rendering environment 32, the wavefront 34 becomes flatter. If the point source 30 is moved closer to the speaker system, the corresponding wavefront becomes more curved.
  • the curvature of the wavefront can also be changed directly with the aid of two grab handles 36a and 36b. This directly affects the perceived position of the point source 30, which is automatically displayed by the graphical user interface according to the invention.
  • the graphical user interface in Fig. 3a or 3b also shows a delay radius 38, which serves to avoid acausal states in the reproduction of a wave-field-synthesis-based system, the position of the wavefront 34 being determined by the delay radius.
  • the delay radius 38 corresponds to a basic delay which a wave field synthesis system requires, and which corresponds to the distance of the loudspeaker farthest from the center of the system.
  • the basic delay makes it possible to position sources arbitrarily within and outside the speaker system / reconstruction area or the reproduction environment 32.
  • the position of the wavefront is defined by the intersection of the connecting line between the system center point and the position of the sound source 30 with the delay radius.
  • the thus determined position of the wavefront 34 is thus equivalent to a vanishing delay, since the delay radius 38 determines just the minimum delay time to be observed.
  • a real signal propagation time depends on the distance of the sound source to the listening space, determined by the distance between the sound source position and the center of the playback system. In the creation of imaginary auditory scenes this runtime is usually not desirable, because it restricts the positioning options of the sources, since, for example, temporal relationships in a music recording can thereby be changed. This delay can therefore be deactivated in wave field synthesis systems, while it may be required for an authentic sound impression.
  • This important additional parameter is represented in the graphic user interface according to the invention as a circle 40, the position of the circle 40 on the connecting line between the system center and the sound source 30 visualizing the set delay time.
  • if the circle 40 is located directly at the boundary of the delay radius 38, the illustrated transit time has its minimum possible value, which corresponds to the basic delay of the wave field synthesis system. If the case of a real sound propagation time is to be simulated, the circle 40 would be located directly below the sphere representing the sound source 30, it being understood that all intermediate values can additionally be displayed and adjusted.
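The interplay of the basic delay (delay radius 38) and the real propagation time visualized by circle 40 can be sketched numerically; the positions, radius and function name are illustrative assumptions:

```python
import math

C = 343.0  # speed of sound in m/s

def source_delay(source_pos, center, delay_radius, real_runtime=True):
    """Delay time visualized by circle 40: either the real propagation time
    from the source (clamped to the basic delay), or, with the runtime
    deactivated, just the basic delay given by the delay radius."""
    basic = delay_radius / C
    if not real_runtime:
        return basic                        # circle sits on the delay radius
    dist = math.dist(source_pos, center)
    return max(dist, delay_radius) / C      # never below the basic delay

# source 10 m from the system center, delay radius of 5 m
d_on = source_delay((0.0, 10.0), (0.0, 0.0), delay_radius=5.0)
d_off = source_delay((0.0, 10.0), (0.0, 0.0), delay_radius=5.0, real_runtime=False)
```

With the runtime deactivated, the displayed delay collapses to the minimum value, which corresponds to the circle resting on the delay radius.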
  • the important delay time parameters can be set and changed intuitively, which further increases the creative freedom and, moreover, increases the efficiency of the design process in spatial sound reproduction.
  • the graphical user interface according to the invention has the advantage of extremely great flexibility, so that further parameters can easily be added. For example, the area of the circle 40 could describe a ratio of diffuse sound to direct sound, which a listener perceives as a further cue for the distance of a sound source from the listening position; changing this ratio could be implemented, for example, by moving the circle 40 or changing its area.
  • the wave field synthesis algorithm calculates the impulse response IR L1..Ln (amplitude, delay) for each loudspeaker involved. If, at a time t, these impulse responses are lined up next to one another, their peaks form a sampled version of the wavefront emanating from the virtual sound source.
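This lining-up of per-loudspeaker impulse responses can be sketched as follows; the linear array geometry and the simple amplitude/delay model are illustrative assumptions, not the patent's algorithm:

```python
import math

C, FS = 343.0, 48000  # speed of sound (m/s), sample rate (Hz)

def speaker_irs(source_pos, speaker_positions, n_samples):
    """One (amplitude, delay) impulse response per loudspeaker; the peak
    positions, read across speakers, sample the virtual wavefront."""
    irs = []
    for sp in speaker_positions:
        r = math.dist(source_pos, sp)
        delay = int(round(r / C * FS))
        ir = [0.0] * n_samples
        if delay < n_samples:
            ir[delay] = 1.0 / max(r, 1e-6)  # single scaled Dirac-like peak
        irs.append(ir)
    return irs

# linear array of 5 speakers, source 1 m behind the middle one
speakers = [(x, 0.0) for x in (-2.0, -1.0, 0.0, 1.0, 2.0)]
irs = speaker_irs((0.0, -1.0), speakers, n_samples=1024)
peaks = [ir.index(max(ir)) for ir in irs]
# the middle speaker is closest, so its peak comes earliest
```

Plotting `peaks` against the speaker index would trace the curved wavefront that the graphical user interface displays and lets the user manipulate.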
  • in a graphical processing step (see Fig. 3a), the wavefront can be simplified and displayed with interaction elements. If the user interacts with these elements, the graphical representation of the wavefront changes. This change of representation can be impressed on the individual impulse responses IR L1..Ln in the next step.
  • the graphical user interface allows the manipulation of impulse responses, which are preferably to be computed for each individual loudspeaker which illuminates the reproduction volume 32.
  • the representation of the impulse responses results directly from the representation of the graphical user interface; for illustration, a connecting line 42 between the sound source 30 and an imaginary loudspeaker at the edge of the reproduction volume 32 is shown by way of example.
  • the impulse response to be calculated is given directly by the shape of the wavefront at the location at which the connecting line 42 intersects the wavefront 34.
  • the spatial position of a sound source 30, as seen in Fig. 3a or 3b, is translated for each individual speaker into a time delay and an amplitude.
  • the amplitude results directly from the height of the graphical representation of the wavefront 34, while the time delay is determined by the intersection of the line 42 with the wavefront 34, the length of the cut-off sections of the line 42 being relevant for determining the time delay.
  • the size of the ball 30 describing the sound source can be used to represent the volume of a sound source.
  • the above-mentioned manipulation of the direct-sound/diffuse-sound ratio can also be displayed here again. If the volume of the direct sound corresponds to the size of the ball 30, a distant sound source, for example, tends to be quieter and thus corresponds to a small sphere. A link with the distance-dependent calculation of the volume of a sound source is thus easily realized by this representation.
  • while Fig. 3a or 3b serves for positioning a sound source, that is, for determining a sound impression which reproduces the location of the sound source, it is explained with reference to Figs. 4-8 that the graphical user interface according to the invention is also suitable for visualizing, and allowing the change of, impulse responses which produce a sound impression corresponding to that of a room to be simulated, such as a cathedral.
  • Fig. 4 shows a possibility in which, first, in a positioning step 50, the sound sources are arranged in space, as has been described, for example, with reference to Fig. 3a or 3b.
  • impulse responses are then assigned to each sound source.
  • a spatial sound impression can be imprinted directly on the sound source when it lies in a spatial direction, with respect to the reproduction environment, for which a specific spatial sound impression is to be simulated.
  • an impulse response function is generated for each sound source and spatial direction, which must be transmitted to a reproduction system together with the sound source in a transfer step 54 in order to achieve the desired spatial sound impression during playback.
  • in a positioning step 60, impulse responses which describe the position are generated for the loudspeakers for each sound source.
  • the impression of space that is to be created in a listening direction can also be generated, since the loudspeakers used in the reproduction system are likewise assigned to fixed spatial directions, by additionally generating for each loudspeaker, in a room simulation step 62, an impulse response which contains the information about the room in the direction of the relevant loudspeaker.
  • in a transfer or storage step 64, the sound source signal must then be transmitted to the sound reproduction system, and a position impulse response and a room impulse response must be transmitted for each individual loudspeaker. Due to the flexibility of the graphic user interface according to the invention, a spatial sound impression can either be allocated individually to each sound source, or groups of sound sources arranged in a similar spatial direction with respect to the playback environment can be combined to represent a plurality of discrete spatial directions, whereby the computing capacity required during playback is reduced.
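The two-stage scheme of steps 60 to 64, in which each loudspeaker signal is shaped by both a position impulse response and a room impulse response, can be sketched as a cascade of two convolutions. This is a sketch only; the patent does not prescribe a particular implementation, and all names and the toy data are illustrative:

```python
import numpy as np

def loudspeaker_signal(source, position_ir, room_ir):
    """Convolve a source signal first with the impulse response
    describing its position and then with the room impulse response
    assigned to the loudspeaker's spatial direction."""
    positioned = np.convolve(source, position_ir)
    return np.convolve(positioned, room_ir)

source = np.array([1.0, 0.5])
position_ir = np.array([1.0, 0.0, 0.25])  # direct sound plus one weak tap
room_ir = np.array([1.0])                 # "dry" room: identity response

out = loudspeaker_signal(source, position_ir, room_ir)
```

Grouping several sources that share a spatial direction, as the text suggests, would let one room convolution serve the whole group, which is where the saving in computing capacity comes from.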
  • an embodiment of the graphical user interface according to the invention, showing the manipulation of an impulse response in an impulse-response time representation, is shown in Fig. 6.
  • the spatial directions with respect to a reproduction environment 70 are subdivided into eight discrete sectors 72a-72h. For each of the sectors 72a-72h, a common spatial impression is therefore achieved by means of an impulse-response time representation.
  • the envelopes of the eight impulse responses used for room simulation are extruded into surfaces. These surfaces are arranged in the form of an octagon and joined into a common surface 74.
  • the height of the surface above the area defined by the sectors 72a-72h corresponds to the amplitude of the impulse response.
  • the distance from the center of the reproduction environment 70 represents time; events occurring at the end of the impulse response are therefore farther from the center of the reproduction environment 70.
  • in this way, the amplitude curves of the room impulse responses over time can be represented according to their spatial direction.
  • the modification takes place interactively by moving interaction elements, exemplified here as 76a, 76b and 76c. It is thus possible to grasp the entire spatial sound situation at a glance and to recognize and eliminate deviations from the desired behavior.
  • for a real room, the reverberation time will usually be almost the same from all directions.
  • in the example shown, the reverberation time in the direction of sector 72h is reduced, which can easily be recognized from the asymmetry of the total surface 74, so that the difference from a real, evenly reverberant room is immediately apparent.
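The geometry of Fig. 6 — sector angle as spatial direction, radial distance as time, height as amplitude — can be sketched as a mapping from an impulse-response envelope to 3D surface points. This is an illustrative reconstruction; the names and the linear time-to-radius scaling are assumptions, not taken from the patent:

```python
import math

def sector_surface_points(envelope, sector_index, n_sectors=8, time_step=1.0):
    """Map the envelope of one sector's impulse response to 3D points:
    the polar angle encodes the spatial direction, the radius encodes
    time, and the height encodes the amplitude."""
    angle = 2.0 * math.pi * sector_index / n_sectors
    points = []
    for i, amplitude in enumerate(envelope):
        r = i * time_step  # later events lie farther from the center
        points.append((r * math.cos(angle), r * math.sin(angle), amplitude))
    return points

# Decaying envelope for the third of eight sectors (90 degrees).
pts = sector_surface_points([1.0, 0.5, 0.25], sector_index=2)
```

A shorter reverberation time in one sector shrinks that sector's ribbon of points radially, producing exactly the asymmetry of the total surface described above.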
  • Fig. 7 shows a representation of spatial impulse responses in a time-frequency representation. Shown are the reproduction environment 80 and eight time-frequency representations of impulse responses 82a-82h, associated with eight discrete spatial directions relative to the reproduction environment 80.
  • this makes it possible to visualize both the time and the frequency components of impulse responses in relation to their spatial directions and to manipulate them; they can be changed, for example, by means of interaction elements 86a-86c.
  • the interaction elements 86a-86c shown by way of example allow manipulation of the amplitude frequency response at a certain time, in the example shown here at the beginning of the impulse response.
  • low frequencies are located farther to the left and high frequencies farther to the right, so that it can be seen immediately that in the room simulation the low frequencies begin with a higher amplitude and persist longer than the high frequencies.
  • this complex relationship, which can be stored in the form of a matrix, for example by describing the surfaces 82a-82h, can be intuitively grasped and changed here.
  • this type of representation also makes it possible to add further effects or to recognize their influence; for example, strong reflections from certain spatial directions would be visible as elevations on the surfaces of the corresponding spatial impulse responses.
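The matrix form mentioned above — the surfaces 82a-82h stored as time-frequency data — can be sketched with a simple short-time Fourier analysis of an impulse response. This is a generic illustration; the patent does not fix the analysis method, and the frame length and toy data here are arbitrary:

```python
import numpy as np

def time_frequency_matrix(impulse_response, frame_len=4):
    """Split an impulse response into non-overlapping frames and take
    the magnitude spectrum of each frame: rows are time frames,
    columns are frequency bins, matching the surfaces of Fig. 7."""
    n_frames = len(impulse_response) // frame_len
    frames = np.reshape(impulse_response[:n_frames * frame_len],
                        (n_frames, frame_len))
    return np.abs(np.fft.rfft(frames, axis=1))

ir = np.array([1.0, 0.5, 0.25, 0.125, 0.0, 0.0, 0.0, 0.0])
tf = time_frequency_matrix(ir)  # shape: (time frames, frequency bins)
```

A strong late reflection would raise one row of this matrix, which is exactly the local elevation on the surface that the text describes.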
  • Fig. 8 shows another example of a graphical user interface according to the invention, in which the impulse responses of the individual spatial directions consist of discrete peaks. Shown are a reproduction environment 90, eight discrete spatial directions 92a-92i, and five exemplary delta-shaped impulse responses 94a-94e.
  • the wavefronts 94a-94e represent echoes from the spatial directions assigned to them. Their distance from the center of the reproduction environment indicates the time of the repetition of the original signal.
  • the position in time of the repetitions can be influenced by radial movements of the impulse responses toward or away from the center of the system.
  • the amplitude of the repetitions can be influenced by the height of the wavefronts in the vertical direction.
  • a time-frequency representation can also be implemented here in order to additionally impress an individual frequency response on each echo.
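The delta-shaped impulse responses 94a-94e of Fig. 8 — one discrete peak per echo, with radial distance as delay and height as amplitude — can be sketched as follows. The sample-based delays and the echo list are made up for illustration:

```python
import numpy as np

def echo_impulse_response(echoes, length):
    """Build an impulse response from (delay_in_samples, amplitude)
    pairs, one discrete peak per echo. Moving a peak radially changes
    its delay; changing its height changes its amplitude."""
    ir = np.zeros(length)
    ir[0] = 1.0  # direct sound
    for delay, amplitude in echoes:
        ir[delay] += amplitude
    return ir

ir = echo_impulse_response([(3, 0.5), (6, 0.25)], length=8)
signal = np.array([1.0, 0.0])
out = np.convolve(signal, ir)  # original plus two quieter repetitions
```

Imprinting an individual frequency response on each echo, as the text suggests, would amount to replacing each delta peak with a short filter kernel at the same position.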
  • Fig. 9 shows a system for visualizing and editing spatial sound effects 100, which is composed of a signal processing part 102 and a visualization and interaction part 104.
  • the signal processing consists in convolving incoming audio signals 106, by means of a mathematical convolution 108, with the impulse responses determined by means of the visualization and interaction part 104, in order to generate audio signals 110 which carry the sound impression of a room to be simulated.
  • the visualization and interaction part 104 has a display means for displaying calculated impulse responses 112, a means for receiving a user change input 114, a means for permitting a change of the graphical display 116, and a means for detecting the changed impulse response 118.
  • the means for receiving a user change input 114 comprises an interaction device 120 and a means for implementing the interaction 122.
  • the means for permitting a change in the graphical display of the impulse response 116 comprises an output device 124 for displaying the original impulse response and an image calculation unit 126 for visualizing the original impulse response.
  • a visual model 112 is generated on the basis of parameters which describe the impulse responses and thus contain the information about the room to be simulated. Once a suitable visual model has been created through repeated interaction and visualization, the means for detecting the changed impulse response 118 extracts the parameters on which the visualization is based and transmits them as impulse responses to the signal processing part 102.
  • the signal processing comprises the convolution of N input signals with N impulse responses to produce N output signals.
  • N can range from, for example, eight signals in the generation of reverberation effects for wave field synthesis to a very large number in the generation of an entire wave field. If multiple effects or sources are generated simultaneously, the output signals for each effect or source must be summed at the end.
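The signal processing just described — each input convolved with its impulse response, with the results summed per output channel when several sources or effects are active — can be sketched as follows. Variable names and sizes are arbitrary illustrations:

```python
import numpy as np

def render(sources, impulse_responses):
    """Convolve each source signal with its impulse response and sum
    the results into one output channel, as required when multiple
    sources or effects are generated simultaneously."""
    length = max(len(s) + len(h) - 1
                 for s, h in zip(sources, impulse_responses))
    out = np.zeros(length)
    for s, h in zip(sources, impulse_responses):
        y = np.convolve(s, h)
        out[:len(y)] += y
    return out

sources = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
irs = [np.array([0.5]), np.array([0.25])]
mix = render(sources, irs)
```

For the wave-field-synthesis case with a very large number of channels, the same loop would simply run over many more source/impulse-response pairs per loudspeaker.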
  • the impulse responses required for the signal processing are thus generated with the aid of the visualization and interaction part of the system. From an impulse response, sound-relevant parameters can be derived, whereby a distinction must be made between room signals and direct signals.
  • parameters can also be obtained from the interface. However, these can only be converted into impulse responses for the loudspeaker channels by applying the wave field synthesis algorithm; the parameters are thus at a more abstract level, but the structure of the block diagram in Fig. 9 does not change.
  • a significant advantage of the graphical user interface according to the invention is that complex mathematical parameters are made intuitively accessible. This makes it possible to generate or set these parameters while keeping the entire sound event in view at all times. It is particularly advantageous that, in the described embodiments based on 3D visualizations, the direction from which the reproduction environment is viewed can be varied, so that the emerging sound impression can be predicted even better by judging it from different spatial directions.
  • although in the described embodiments the graphic user interface has individual discrete function blocks, such a division is only to be understood as an example; in principle, any combinations of the individual function blocks are possible.
  • for example, the display 12 can be combined with the means 14 for altering the graphical display, as is partially the case in the illustrated embodiments, where the possibility of modification is already implemented as part of the display, for example in the form of the handles 36a and 36b in Fig. 3a and 3b.
  • the user input may be made by means of a mouse, a touchscreen, or any other means of moving a cursor on a screen.
  • the direct input of discrete change steps by means of a keyboard is also conceivable, for example in the case of a discretized representation of an impulse response, where the value of the impulse response can be adjusted in discrete steps within defined time ranges, which is easily possible by means of a conventional keyboard.
  • any other suitable representation of impulse response functions is also possible in order to allow setting or generation of a spatial impression according to the invention.
  • a direction-dependent spatial sound character could advantageously be represented by displaying, for each spatial direction, only the difference from a common impulse response function, so that one easily gets an impression of how the observed spatial direction differs in its spatial properties from the overall sound image (average sound image).
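The difference representation suggested here — showing for each spatial direction only the deviation from a common impulse response — could be computed as follows. This is a hypothetical sketch of the idea with toy data; nothing in it is prescribed by the patent:

```python
import numpy as np

# One impulse response per spatial direction (rows), toy data.
direction_irs = np.array([
    [1.0, 0.5, 0.25],
    [1.0, 0.5, 0.25],
    [1.0, 0.9, 0.25],  # this direction reverberates more strongly
])

average_ir = direction_irs.mean(axis=0)   # the common "average sound image"
differences = direction_irs - average_ir  # what the GUI would display

# A direction matching the average shows almost nothing, while the
# deviating direction stands out at the second sample.
```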
  • the order in which the impulse response functions describing the position of a sound source or the spatial impression are processed is not fixed. It is possible first to position all sound sources in the room and then to create the spatial impression, or first to define the room to be simulated and subsequently position the sound sources within it.
  • accordingly, the processing steps for a system for driving a sound reproduction system, which comprises a graphic user interface according to the invention and a signal generator for supplying loudspeaker signals, may differ.
  • the signal processing, which is represented here individually for each loudspeaker as the convolution of a sound signal with an impulse response function, can be implemented both continuously and discretely; alternative mathematical methods are also possible for imparting to a sound signal the spatial impression that an impulse response describes.
  • the space enclosing the reproduction environment is subdivided into eight discrete spatial directions, wherein a spatial sound character can be determined individually for each spatial direction.
  • the inventive method of operating a sound reproduction system by means of a graphical user interface may be implemented in hardware or in software.
  • the implementation can be carried out on a digital storage medium, in particular a floppy disk or CD with electronically readable control signals, which can cooperate with a programmable computer system such that the method according to the invention is carried out.
  • the invention thus also consists in a computer program product with a program code stored on a machine-readable carrier for carrying out the method according to the invention, when the computer program product runs on a computer.
  • the invention can thus be realized as a computer program with a program code for carrying out the method when the computer program runs on a computer.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Stereophonic System (AREA)
  • Electrophonic Musical Instruments (AREA)

Claims (20)

  1. Graphic user interface (10) for a sound reproduction system, designed to generate a spatial sound impression in a reproduction environment (32; 70; 80; 90), having the following features:
    a display means (12) for graphically displaying impulse responses (34; 74; 82a to 82h; 94a to 94e) which are associated with spatial directions of the reproduction environment (32; 70; 80; 90), the impulse responses being represented, relative to the reproduction environment, in the spatial directions with which they are associated;
    a means for permitting a modification (14) of the graphical display of the impulse responses (34; 74; 82a to 82h; 94a to 94e) by the user, a modification of the graphical display of the impulse responses (34; 74; 82a to 82h; 94a to 94e) being made possible at predetermined points;
    a means for receiving (16) a modification input from the user, in order to graphically represent modified impulse responses by means of the display means (12); and
    a means for detecting the modified impulse responses (18).
  2. Graphic user interface according to claim 1, wherein the display means (12) is designed to represent the impulse responses (74; 82a to 82h; 94a to 94e) as time courses of an intensity quantity.
  3. Graphic user interface according to claim 2, wherein the display means (12) is designed to represent the time courses of the impulse responses (74; 82a to 82h; 94a to 94e) such that they are subdivided into discrete time segments, an intensity quantity being associated with each time segment.
  4. Graphic user interface according to one of the preceding claims, wherein the display means (12) is designed to represent the impulse responses (82a to 82h) graphically as a function of frequency.
  5. Graphic user interface according to claim 4, wherein the display means (12) is designed to represent the frequency courses of the impulse responses (74; 82a to 82h; 94a to 94e) such that they are subdivided into discrete frequency segments, an intensity quantity being associated with each frequency segment.
  6. Graphic user interface according to one of the preceding claims, wherein the display means (12) is designed to represent the impulse responses (82a to 82h) graphically as a function of time and as a function of frequency in a three-dimensional representation, the function values being represented as a height above a two-dimensional surface, a first side of which has time as its measure and a second side of which, adjoining the first side, has frequency as its measure.
  7. Graphic user interface according to one of the preceding claims, wherein the display means (12) is designed to additionally display a graphical representation of the reproduction environment (32; 70; 80; 90) in a three-dimensional representation, the impulse responses (34; 74; 82a to 82h; 94a to 94e) being represented, relative to the reproduction environment (32; 70; 80; 90), in the spatial directions with which the impulse responses (34; 74; 82a to 82h; 94a to 94e) are associated.
  8. Graphic user interface according to one of the preceding claims, wherein the means for permitting a modification (14) of the graphical display of the impulse responses (34; 74; 82a to 82h; 94a to 94e) is designed to permit a modification of the graphical representation of the impulse responses (34; 74; 82a to 82h; 94a to 94e) at any arbitrary point of the graphical representation of the impulse responses (34; 74; 82a to 82h; 94a to 94e).
  9. Graphic user interface according to one of the preceding claims, wherein the means for permitting a modification (14) of the graphical display of the impulse responses (94a to 94e) is designed to permit, as a modification of the graphical representation of the impulse responses (94a to 94e), a shift of the impulse responses (94a to 94e) in time.
  10. Graphic user interface according to one of the preceding claims, wherein the means for receiving a modification input from the user (16) is designed to receive signals from a computer mouse, a touchpad, a touchscreen, a pointing device or a keyboard.
  11. Graphic user interface according to one of the preceding claims, wherein the means for detecting (18) the modified impulse responses is designed to sample, for detection, the graphically represented modified impulse response and to store the sampled values in a memory.
  12. Graphic user interface according to one of the preceding claims, wherein the display means (12) is designed to graphically display impulse responses (74; 82a to 82h; 94a to 94e) which contain information about a room to be simulated.
  13. Graphic user interface according to one of the preceding claims, wherein the display means (12) is designed to graphically display impulse responses (34) which contain information about the position of a sound source (30) relative to the reproduction environment (32).
  14. Device for driving a sound reproduction system which is designed to generate a spatial sound impression in a reproduction environment, having the following features:
    a graphic user interface (10) according to one of claims 1 to 13; and
    a signal generator (102) for supplying loudspeaker signals (110) for the loudspeakers of a plurality of loudspeakers which can be placed at different spatial positions.
  15. Driving device according to claim 14, wherein the signal generator (102) has a combining means (108) for combining sound signals (106) with the modified impulse responses, the sound signals (106) being intended for the loudspeakers which are arranged at spatial positions corresponding to the spatial directions with which the impulse responses are associated, in order to obtain loudspeaker signals (110), the combining means (108) being designed to combine such that the loudspeaker signals (110) contain the information about the room to be simulated.
  16. Driving device according to claim 15, wherein the signal generator (102) has a combining means (108) for combining sound signals (106) with the modified impulse responses, in order to obtain loudspeaker signals (110), the combining means (108) being designed to combine such that the loudspeaker signals (110) contain the information about the relative positions of a sound source associated with the sound signals (106).
  17. Driving device according to claim 15 or 16, wherein the combining means (108) is designed to convolve, during the combining, the sound signals (106) with the modified impulse responses.
  18. Method for operating a sound reproduction system designed to generate a spatial sound impression in a reproduction environment (32; 70; 80; 90), comprising the following steps:
    graphically displaying impulse responses (34; 74; 82a to 82h; 94a to 94e) which are associated with spatial directions of the reproduction environment (32; 70; 80; 90), the impulse responses being represented, relative to the reproduction environment, in the spatial directions with which they are associated;
    permitting a modification of the graphical display of the impulse responses (34; 74; 82a to 82h; 94a to 94e) by the user, a modification of the graphical display of the impulse responses (34; 74; 82a to 82h; 94a to 94e) being made possible at predetermined points;
    receiving a modification input from the user, in order to graphically represent modified impulse responses; and
    detecting the modified impulse responses.
  19. Method for driving a sound reproduction system, comprising the steps of the method according to claim 18 and the following additional step:
    supplying loudspeaker signals for the loudspeakers of a plurality of loudspeakers which can be placed at different spatial positions, on the basis of the modified impulse responses.
  20. Computer program with a program code for carrying out the method according to claim 18 or 19 when the computer program is executed on a computer.
EP06742644A 2005-05-04 2006-04-21 Dispositif et procede de production et de traitement d'effets sonores dans des systemes de reproduction sonore spatiale a l'aide d'une interface graphique d'utilisateur Not-in-force EP1878308B1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102005021378 2005-05-04
DE102005043641A DE102005043641A1 (de) 2005-05-04 2005-09-13 Vorrichtung und Verfahren zur Generierung und Bearbeitung von Toneffekten in räumlichen Tonwiedergabesystemen mittels einer graphischen Benutzerschnittstelle
PCT/EP2006/003709 WO2006117089A2 (fr) 2005-05-04 2006-04-21 Dispositif et procede de production et de traitement d'effets sonores dans des systemes de reproduction sonore spatiale a l'aide d'une interface graphique d'utilisateur

Publications (2)

Publication Number Publication Date
EP1878308A2 EP1878308A2 (fr) 2008-01-16
EP1878308B1 true EP1878308B1 (fr) 2009-08-19

Family

ID=37111576

Family Applications (1)

Application Number Title Priority Date Filing Date
EP06742644A Not-in-force EP1878308B1 (fr) 2005-05-04 2006-04-21 Dispositif et procede de production et de traitement d'effets sonores dans des systemes de reproduction sonore spatiale a l'aide d'une interface graphique d'utilisateur

Country Status (7)

Country Link
US (1) US8325933B2 (fr)
EP (1) EP1878308B1 (fr)
JP (1) JP4651710B2 (fr)
CN (1) CN101171882B (fr)
AT (1) ATE440459T1 (fr)
DE (2) DE102005043641A1 (fr)
WO (1) WO2006117089A2 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011119642A1 (de) 2011-11-28 2013-05-29 Shure Europe GmbH Vorrichtung und Verfahren zur Raumklangsimulation

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8799819B2 (en) 2007-03-01 2014-08-05 Apple Inc. Graphical user interface for multi-tap delay
US8396226B2 (en) 2008-06-30 2013-03-12 Costellation Productions, Inc. Methods and systems for improved acoustic environment characterization
JP5580585B2 (ja) * 2009-12-25 2014-08-27 日本電信電話株式会社 信号分析装置、信号分析方法及び信号分析プログラム
WO2011114310A2 (fr) * 2010-03-18 2011-09-22 Versonic Pte. Ltd. Système de mélange de son numérique avec des commandes graphiques
WO2012140525A1 (fr) 2011-04-12 2012-10-18 International Business Machines Corporation Convertir les sons d'une interface utilisateur en espace audio en 3d
US10448161B2 (en) 2012-04-02 2019-10-15 Qualcomm Incorporated Systems, methods, apparatus, and computer-readable media for gestural manipulation of a sound field
US9412375B2 (en) * 2012-11-14 2016-08-09 Qualcomm Incorporated Methods and apparatuses for representing a sound field in a physical space
KR102127640B1 (ko) 2013-03-28 2020-06-30 삼성전자주식회사 휴대 단말 및 보청기와 휴대 단말에서 음원의 위치를 제공하는 방법
USD784360S1 (en) 2014-05-21 2017-04-18 Dolby International Ab Display screen or portion thereof with a graphical user interface
CN106465036B (zh) * 2014-05-21 2018-10-16 杜比国际公司 配置经由家庭音频回放系统的音频的回放
US9706330B2 (en) * 2014-09-11 2017-07-11 Genelec Oy Loudspeaker control
USD828845S1 (en) 2015-01-05 2018-09-18 Dolby International Ab Display screen or portion thereof with transitional graphical user interface
WO2017192972A1 (fr) 2016-05-06 2017-11-09 Dts, Inc. Systèmes de reproduction audio immersifs
CN105979469B (zh) * 2016-06-29 2020-01-31 维沃移动通信有限公司 一种录音处理方法及终端
US10979844B2 (en) 2017-03-08 2021-04-13 Dts, Inc. Distributed audio virtualization systems
CN109933297B (zh) * 2017-12-15 2023-09-19 阿尔派株式会社 电子装置及其信号源的控制方法
CN115862665B (zh) * 2023-02-27 2023-06-16 广州市迪声音响有限公司 一种回声混响效果参数的可视化曲线界面系统

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2738099B1 (fr) 1995-08-25 1997-10-24 France Telecom Procede de simulation de la qualite acoustique d'une salle et processeur audio-numerique associe
JPH10257583A (ja) * 1997-03-06 1998-09-25 Asahi Chem Ind Co Ltd 音声処理装置およびその音声処理方法
JP2000356994A (ja) * 1999-06-15 2000-12-26 Yamaha Corp オーディオシステム、その制御方法および記録媒体
EP1158486A1 (fr) 2000-05-18 2001-11-28 TC Electronic A/S Méthode de traitement de signal
GB2357409A (en) * 1999-12-13 2001-06-20 Sony Uk Ltd Audio signal processing
GB2367409B (en) 2000-07-13 2003-12-03 Fire & Rescue Equipment Ltd As Shutter latch sensing switch
US20030007648A1 (en) 2001-04-27 2003-01-09 Christopher Currell Virtual audio system and techniques
JP4077279B2 (ja) * 2002-08-30 2008-04-16 アルパイン株式会社 残響レベル制御装置
DE10254404B4 (de) * 2002-11-21 2004-11-18 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Audiowiedergabesystem und Verfahren zum Wiedergeben eines Audiosignals
JP4464064B2 (ja) * 2003-04-02 2010-05-19 ヤマハ株式会社 残響付与装置および残響付与プログラム
JP2005080124A (ja) * 2003-09-02 2005-03-24 Japan Science & Technology Agency リアルタイム音響再現システム
EP1685554A1 (fr) 2003-10-09 2006-08-02 TEAC America, Inc. Procede, appareil et systeme pour synthetiser une performance audio a l'aide d'une convolution a frequences d'echantillonnage multiples
JP3931872B2 (ja) * 2003-10-09 2007-06-20 ヤマハ株式会社 パラメータ編集装置およびパラメータ編集方法を実現するためのプログラム

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011119642A1 (de) 2011-11-28 2013-05-29 Shure Europe GmbH Vorrichtung und Verfahren zur Raumklangsimulation
WO2013079051A1 (fr) 2011-11-28 2013-06-06 Shure Europe GmbH Dispositif et procédé de simulation d'environnement sonore

Also Published As

Publication number Publication date
ATE440459T1 (de) 2009-09-15
WO2006117089A3 (fr) 2007-05-10
CN101171882B (zh) 2010-10-06
CN101171882A (zh) 2008-04-30
US8325933B2 (en) 2012-12-04
DE502006004596D1 (de) 2009-10-01
JP4651710B2 (ja) 2011-03-16
DE102005043641A1 (de) 2006-11-09
WO2006117089A2 (fr) 2006-11-09
JP2008541520A (ja) 2008-11-20
US20080101616A1 (en) 2008-05-01
EP1878308A2 (fr) 2008-01-16

Similar Documents

Publication Publication Date Title
EP1878308B1 (fr) Dispositif et procede de production et de traitement d'effets sonores dans des systemes de reproduction sonore spatiale a l'aide d'une interface graphique d'utilisateur
EP1652405B1 (fr) Dispositif et procede de production, de mise en memoire ou de traitement d'une representation audio d'une scene audio
EP1671516B1 (fr) Procede et dispositif de production d'un canal a frequences basses
EP3005737B1 (fr) Pupitre de mixage, procédé et programme informatique de fourniture d'un signal audio
DE3413181C3 (fr)
EP1844627B1 (fr) Dispositif et procédé pour simuler un système de synthèse de champ d'onde
DE68922885T2 (de) Verfahren und Vorrichtung zur Schallbilderzeugung.
DE69726262T2 (de) Tonaufnahme- und -wiedergabesysteme
EP1972181B1 (fr) Dispositif et procédé de simulation de systèmes wfs et de compensation de propriétés wfs influençant le son
DE102005057406A1 (de) Verfahren zur Aufnahme einer Tonquelle mit zeitlich variabler Richtcharakteristik und zur Wiedergabe sowie System zur Durchführung des Verfahrens
EP1525776A1 (fr) Dispositif de correction de niveau dans un systeme de synthese de champ d'ondes
DE102005001395B4 (de) Verfahren und Vorrichtung zur Transformation des frühen Schallfeldes
EP2754151B1 (fr) Dispositif, procédé et système électroacoustique de prolongement d'un temps de réverbération
EP2503799B1 (fr) Procédé et système de calcul de fonctions HRTF par synthèse locale virtuelle de champ sonore
EP2485504B1 (fr) Production de zones silencieuses à l'intérieur de la zone d'auditeurs d'un système de retransmission à plusieurs canaux
DE60130654T2 (de) Verfahren zur verarbeitung eines signals
DE19745392A1 (de) Tonwiedergabevorrichtung und Verfahren zur Tonwiedergabe
WO2024099733A1 (fr) Procédé de correction dépendant de la direction de la réponse en fréquence de fronts d'ondes sonores
DE102024000725A1 (de) Virtuelle akustische Reflektoren
EP1900250B1 (fr) Procede electro-acoustique
DE4007841A1 (de) Tonsignalaufbereitungsverfahren

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20071010

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

DAX Request for extension of the european patent (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REF Corresponds to:

Ref document number: 502006004596

Country of ref document: DE

Date of ref document: 20091001

Kind code of ref document: P

LTIE Lt: invalidation of european patent or patent extension

Effective date: 20090819

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20091219

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20091130

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

NLV1 Nl: lapsed or annulled due to failure to fulfill the requirements of art. 29p and 29m of the patents act
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

REG Reference to a national code

Ref country code: IE

Ref legal event code: FD4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20091119

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20091221

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

Ref country code: IE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20100520

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20091120

BERE Be: lapsed

Owner name: FRAUNHOFER-GESELLSCHAFT ZUR FORDERUNG DER ANGEWAN

Effective date: 20100430

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20100430

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20100430

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100220

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20100421

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20090819

REG Reference to a national code

Ref country code: AT

Ref legal event code: MM01

Ref document number: 440459

Country of ref document: AT

Kind code of ref document: T

Effective date: 20110421

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110421

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 11

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 12

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 13

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20200423

Year of fee payment: 15

Ref country code: CH

Payment date: 20200423

Year of fee payment: 15

Ref country code: FR

Payment date: 20200421

Year of fee payment: 15

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20200423

Year of fee payment: 15

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 502006004596

Country of ref document: DE

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20210421

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210430

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210430

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211103

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210430

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210421