EP0719070B1 - Sound pickup device comprising a video system for adjusting parameters, and adjustment method - Google Patents

Sound pickup device comprising a video system for adjusting parameters, and adjustment method

Info

Publication number
EP0719070B1
Authority
EP
European Patent Office
Prior art keywords
sound
parameters
values
coefficients
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
EP95402816A
Other languages
German (de)
English (en)
French (fr)
Other versions
EP0719070A1 (fr)
Inventor
Yves Grenier
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Orange SA
Original Assignee
France Telecom SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by France Telecom SA filed Critical France Telecom SA
Publication of EP0719070A1 publication Critical patent/EP0719070A1/fr
Application granted granted Critical
Publication of EP0719070B1 publication Critical patent/EP0719070B1/fr
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00Details of transducers, loudspeakers or microphones
    • H04R1/20Arrangements for obtaining desired frequency or directional characteristics
    • H04R1/32Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only
    • H04R1/40Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers
    • H04R1/406Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers microphones
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R3/00Circuits for transducers, loudspeakers or microphones
    • H04R3/005Circuits for transducers, loudspeakers or microphones for combining the signals of two or more microphones

Definitions

  • the invention relates to a sound recording device with which a video pointing system has been associated.
  • This device can be particularly useful in certain applications and in particular during conferences, concerts or any other event deserving of perfect quality sound recording.
  • the device according to the invention makes it possible to capture, simultaneously and independently, the sounds coming from several sound sources, without having to place the sensors close to these sources, while giving the auditory impression that the sound is picked up near each source. For this, it reduces the reverberation of the sound as well as the level of ambient noise.
  • These devices include networks of sensors, a control unit using in particular filters for processing the signals received by the sensors, and means for adjusting the parameters characteristic of the sound recording.
  • Patent EP-A-0 381 498 further describes a sound pickup device comprising a circuit for changing the coefficients of the digital filters which makes it possible to vary, arbitrarily, the directional characteristics of the sound reception channels.
  • Patent EP-A-0 356 327 describes a device for inputting sound signals in the presence of interference signals from a jammer.
  • the problem addressed in this prior document is the elimination of interference signals and does not concern the realization of a device, designed for perfect-quality sound pickup, comprising a video pointing system.
  • patent EP-A-0 352 627 refers to a sound reproduction device comprising a volume controller, several sound regenerators, a unit for producing a test signal and a display unit.
  • the display unit is used to display the sound regenerator corresponding to the channel in which the test signal is sent, as well as the volume levels of the respective channels. This device is totally different from a sound pickup device in which an associated video pointing system improves the quality of the sound pickup.
  • the present invention overcomes this problem. Its object is in fact a device comprising a network of sensor elements, a control unit using in particular filters for processing the signals received by the sensors, a camera and a video screen.
  • the camera provides a video signal to the screen corresponding to the image of the area where the sound sources are located from which the sound is captured.
  • the video screen for its part, makes it possible to visualize, at the same time, the sound sources, filmed by the camera, and the variations of the characteristic parameters of each of the sound reception channels. Thus, a very precise adjustment of the parameters is carried out, taking into account the position and the size of the sound sources.
  • a more particular subject of the invention is a sound pickup device, comprising a network of sensors, a control unit and means for adjusting the parameters characteristic of the sound pickup, mainly characterized in that it further comprises a video camera, a video screen which displays a first video image, corresponding to the signal from the camera, and means for coupling the screen to the means for adjusting the characteristic parameters of each of the sound reception channels.
  • These coupling means make it possible to produce, for each of the sound reception channels, another video image, showing variations in the characteristic parameters, and to superimpose this image on the first video image, so as to control the adjustment of these parameters.
  • the filters used for processing the signals received by the sensors are filters with linear interpolation.
  • the optical field of a camera 100 covers the entire area where there are sound sources from which the device picks up sounds.
  • the video signal from the camera is then transmitted to a video screen 200 which displays a corresponding first video image.
  • the concept of screen covers any type of screen such as, for example, the screen of a video monitor.
  • the camera supplies, on the other hand, the value of its focal length to a control unit 300; this value is used for the angle calculations which will be described in more detail below.
  • adjustment means 400 make it possible to adjust the characteristic parameters of each of the sound reception channels.
  • the signal from these adjustment means 400 is transmitted to coupling means 500 of the screen 200 to the adjustment means 400.
  • the coupling means 500 make it possible to produce, for each of the sound reception channels, another video image and to superimpose it on the first image. This superimposition of images makes it possible to perform a precise adjustment of the characteristic parameters of each sound reception channel, and to control the variations of this adjustment with respect to the position and size of the sound sources from which the device picks up the sounds.
  • the signal representing the values of the adjustment parameters is also transmitted to the control unit 300 which has in particular the task of filtering the signals received by the network 600 of sensors, and of updating, periodically, the coefficients of the filters.
  • Figure 2 provides a more detailed diagram of a device according to the invention.
  • the network 600 of sensors comprises a number M of sensors 610 having the task of capturing the sounds coming from several sound sources and of transmitting the corresponding signals to a control unit 300. This control unit then processes these signals, in particular by filtering.
  • the number M of sensors 610 is preferably at least equal to 2 and the number m associated with each sensor 610 therefore varies from 1 to M.
  • to be able to process the signals received by the M sensors 610, the control unit must also know the values of the characteristic parameters of each sound reception channel. This is why signals corresponding to the values of these parameters are sent from the network 400 of adjustment means to the control unit.
  • This network 400 comprises a number R of adjustment means 410.
  • Each of these adjustment means 410 of the parameters characteristic of the sound pickup corresponds to a sound reception channel.
  • the number R of adjusting means 410, and consequently the number of sound reception channels, is preferably at least equal to 1 and the number r associated with each of these means therefore varies from 1 to R.
  • each channel r for receiving sound is associated with an output 710 where the signals are available.
  • coupling means 500, for coupling the video screen 200 to the adjustment means 400, are introduced into the structure of the device.
  • These coupling means 500 advantageously comprise, for each of the means 410 for adjusting the parameters of each of the sound reception channels, a video generator 510 and a video mixer 520.
  • the video generator 510 makes it possible to transform the signal from the corresponding adjustment means into a video signal.
  • the mixer 520 makes it possible to mix together the signals coming from the video generators, and also to mix them with the signal coming from the camera.
  • the signal from the last video mixer is then sent to the screen 200.
  • FIG. 3 illustrates the construction of a sensor 610.
  • a sensor comprises a microphone 611, a preamplifier 612, a low-pass filter 613 and an analog-digital converter 614.
  • the signal picked up by the microphone 611 is injected into the preamplifier 612 and is then filtered by the low-pass filter 613 to eliminate the spectral aliasing that could be introduced by the analog-digital converter 614.
  • each sensor receives a clock signal which sets the sampling frequency of the converter 614.
  • the sampled signal is quantized by the converter 614 and is transmitted, in digital form, to the control unit which will process it.
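The acquisition chain of a sensor 610 (FIG. 3) can be pictured with the short sketch below. It is illustrative only: the cutoff frequency, sampling rate, bit depth and the assumption of an analog signal already provided as a densely sampled array normalized to ±1 are not values given by the patent.

```python
# Illustrative sketch (not from the patent) of the per-sensor chain of FIG. 3:
# preamplified microphone signal -> anti-aliasing low-pass filter 613 ->
# clock-driven sampling -> quantization by the A/D converter 614.
import numpy as np
from scipy.signal import butter, lfilter

FS = 16000      # assumed sampling frequency of the converter (Hz)
CUTOFF = 7000   # assumed anti-aliasing cutoff, below FS/2 (Hz)
BITS = 16       # assumed quantization depth

def acquire(analog_signal, analog_rate):
    """Simulate one sensor 610: low-pass filter, sample, quantize."""
    b, a = butter(4, CUTOFF / (analog_rate / 2))         # low-pass filter 613
    filtered = lfilter(b, a, analog_signal)
    step = int(round(analog_rate / FS))                   # clock-driven sampling
    sampled = filtered[::step]
    q = 2 ** (BITS - 1)
    digital = np.clip(np.round(sampled * q), -q, q - 1)   # converter 614
    return digital.astype(np.int16)
```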
  • a preferred embodiment of an adjustment means 410 for the parameters corresponding to a reception channel r is illustrated in FIG. 4.
  • a command 411 makes it possible to set the values of the characteristic parameters of the channel r for receiving the corresponding sound.
  • this control can be mechanical or electronic: for example, a joystick, a rotary or linear knob, or a mouse acting on a potentiometer.
  • Each of the parameters is converted into a digital value, at a fixed rate, by an analog-digital converter 412.
  • These digital values are advantageously between 1 and a limit value.
  • the sampling rate of the values will preferably be lower than the sampling rate in the sensors 610.
  • a value of 25 Hz is chosen.
  • the set of parameter values is transmitted to the control unit 300 so that it proceeds to the processing of the signals.
  • the abscissa of the point aimed at on the screen is in a one-to-one relationship with the horizontal viewing angle denoted a(r), and the ordinate of the point aimed at on the screen is in a one-to-one relationship with the vertical viewing angle denoted b(r).
  • the width and height of the video screen correspond to the value of the focal length of the camera.
  • the camera 100 supplies the value of its focal length to the control unit 300, so that the latter can associate angle values with the abscissa and the ordinate of the point targeted on the screen, which are expressed in an arbitrary system of units such as, for example, percentages.
  • the control unit, knowing the value of the focal length of the camera, that is to say the value of the maximum aperture angle corresponding to the width of the screen (defined as the value 100%), can, by a simple ratio, determine the value of the horizontal viewing angle corresponding to any value of the abscissa of a target point on the screen.
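As an illustration of this "simple ratio", the sketch below maps a point targeted on the screen, expressed in percent of the screen width and height, to horizontal and vertical viewing angles. The way the maximum aperture angles are derived from the focal length (via an assumed sensor size) and the convention that 50% corresponds to the optical axis are assumptions for the example, not the patent's exact formulas.

```python
# Sketch of mapping screen coordinates in percent to viewing angles a(r), b(r).
import math

def aperture_angle(focal_length_mm, sensor_size_mm):
    """Maximum aperture angle (degrees) for one screen dimension."""
    return 2.0 * math.degrees(math.atan(sensor_size_mm / (2.0 * focal_length_mm)))

def screen_to_angles(x_percent, y_percent, focal_length_mm,
                     sensor_width_mm=36.0, sensor_height_mm=24.0):
    """Map screen coordinates (0..100 %) to horizontal/vertical viewing angles.

    The sensor dimensions are example values; 50 % is taken as the optical axis.
    """
    max_h = aperture_angle(focal_length_mm, sensor_width_mm)   # width  -> 100 %
    max_v = aperture_angle(focal_length_mm, sensor_height_mm)  # height -> 100 %
    a = (x_percent - 50.0) / 100.0 * max_h   # horizontal viewing angle a(r)
    b = (y_percent - 50.0) / 100.0 * max_v   # vertical viewing angle b(r)
    return a, b

# Example: a source targeted at (75 %, 40 %) with a 50 mm focal length.
print(screen_to_angles(75.0, 40.0, 50.0))
```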
  • A the maximum number of values corresponding to a (r)
  • B the maximum number of values corresponding to b (r)
  • C the maximum number of values corresponding to c (r)
  • D the maximum number of values corresponding to d (r)
  • P the maximum number of values corresponding to p (r).
  • a user advantageously fixes the value of a parameter, at least, among all these parameters.
  • the values of the parameters which are not set by the user advantageously receive a default value, or else a value deduced from another parameter.
  • the value taken can be equal to the width c (r) of the reception channel r.
  • the control unit 300 makes it possible to process the signals coming from the sensors 610. It also processes the signals, coming from the adjustment means, representing the values of the parameters. These parameter values influence the calculation of the values of the coefficients of the digital filters 310, that is to say, the directional characteristics of the sound reception channels. Consequently, the values of the parameters of the reception channels play an important role in the processing of the signals coming from the sensors, since these signals will not be processed in the same way depending on the directional characteristic set for each reception channel.
  • the processing which must be carried out on the signals coming from the M sensors 610 consists in forming, at each instant n, the R signals at the output of the focused channels. These signals will be available at outputs 710.
  • the signals received by the M sensors and converted into digital signals, by the analog-digital converters 614, at the sampling instants n, are denoted x(m, n).
  • the signal s(r, n) in the channel r is supplied, in digital form, by the control unit 300 to the corresponding output 710.
  • a variant would consist in supplying to the corresponding output 710 the signal s(r, n) in the channel r, in analog form, after passing through a digital-analog converter.
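Equations (1) and (2) themselves are not reproduced on this page. A common filter-and-sum reading, consistent with the notation x(m, n), h(q, r, m, n) and s(r, n) used in the description, is sketched below; it is an assumption, not the patent's verbatim formulas.

```python
# Minimal sketch of forming the R focused-channel outputs from the M sensor
# signals (assumed filter-and-sum form).
import numpy as np

def channel_outputs(x, h, n):
    """x: (M, n_samples) sensor signals; h: (Q, R, M) coefficients valid at time n.

    Returns s: (R,) channel outputs s(r, n).
    """
    Q, R, M = h.shape
    s = np.zeros(R)
    for r in range(R):
        for m in range(M):
            # y(r, m, n) = sum_q h(q, r, m, n) * x(m, n - q)
            y_rm = sum(h[q, r, m] * x[m, n - q] for q in range(Q) if n - q >= 0)
            s[r] += y_rm          # s(r, n) = sum_m y(r, m, n)
    return s
```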
  • the processing which must be carried out on the signals coming from the R adjustment means consists in modifying, at each instant n, the values of the coefficients of the filters in order to modify the directional characteristics of the sound reception channels.
  • the coefficients h(q, r, m, n) of the filter in the channel r, for the sensor m, depend on the instant n.
  • the coefficients are updated on the basis of information, that is to say on the basis of the parameter values, acquired by the control unit 300 from the R adjustment means 400 and transmitted every N samples to the control unit 300.
  • if the coefficients are updated at time n₀, they will be updated again at time n₀ + N.
  • a method of adjusting the characteristic parameters of the sound pickup furthermore consists in reconstituting, by calculation, the values of the coefficients of the filters between these two instants n₀ and n₀ + N.
  • the control unit 300 calculates, at every instant n, the values of the coefficients h(q, r, m, n) of the filters 310 from the parameter values received, at the sampling rate of the converters 412, from the R adjustment means 410.
  • the control unit determines, for each channel r of reception of the sound, the values of the coefficients h(q, r, m, n₀ + N) of the filters, which serve to interpolate, using equation (3), the values of the coefficients h(q, r, m, n) between the present time n₀ and the time n₀ + N at which the following information is received.
  • the values of the coefficients are therefore interpolated over time, at each sampling instant, between these two instants n₀ and n₀ + N, which are renewed at a regular rate that is preferably slower than the sampling frequency.
  • equations (1) and (2) can be applied twice. Indeed, these equations are applied a first time with filters having coefficients h(q, r, m, n₀), which gives the signals y₀(r, m, n) and s₀(r, n). They are applied a second time with filters having coefficients h(q, r, m, n₀ + N), which gives the signals y_N(r, m, n) and s_N(r, n).
  • s(r, n) = [(n − n₀) / N] · s_N(r, n) + [(n₀ + N − n) / N] · s₀(r, n)
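A minimal sketch of this interpolation follows. Equation (3) is not reproduced on this page, so the linear form used for the coefficients is an assumption; the output-domain cross-fade mirrors the formula just above.

```python
# Linear interpolation between two coefficient updates at n0 and n0 + N.
def interpolate_coeffs(h_n0, h_n0_plus_N, n, n0, N):
    """Interpolate filter coefficients at sample n, with n0 <= n <= n0 + N
    (assumed linear form of equation (3))."""
    w = (n - n0) / N
    return (1.0 - w) * h_n0 + w * h_n0_plus_N

def interpolate_outputs(s0, sN, n, n0, N):
    """Equivalent cross-fade of the two channel outputs s0(r, n) and sN(r, n)."""
    return ((n - n0) / N) * sN + ((n0 + N - n) / N) * s0
```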
  • Another variant of this method would consist in interpolating the values of the coefficients of the filters 310, not only in time but also in space.
  • the filter coefficients would also be interpolated between two positions, displayed on the screen, corresponding to the renewal of the filter coefficients.
  • the values of the coefficients of the filters 310 are a function of the settings, given by the operator through the controls 411 of the adjustment means 410, and described by the parameters a(r), b(r), c(r), d(r), p(r).
  • let F(a, b, c, d, p) denote this function. It provides, for each quintuplet (a, b, c, d, p) of parameter values, a vector of Q×M values representing the Q coefficients of the filters, for each of the M sensors, of a sound reception channel when the settings are (a, b, c, d, p).
  • the control unit applies this function F R times to obtain the values of the coefficients of the filters corresponding to the R reception channels formed.
  • a first step consists in determining the coordinates of the position of an actual sound source and the coordinates of the positions of fictitious sound sources taken as a reference.
  • from the coordinates of a real sound source, we determine, for example, the horizontal angle u_a of the beam centered on the direction defined by a, the vertical angle v_b of the beam centered on the direction defined by b, the horizontal angles u_a1 and u_a2 which form the horizontal limits of the beam centered on the direction defined by a and of width defined by c, and finally the vertical angles v_b1 and v_b2 which form the vertical limits of the beam centered on the direction defined by b and of width defined by d.
  • the origin in three-dimensional space is advantageously defined by the position of the camera 100.
  • the coordinates of the positions of the reference sources are then calculated from their expression, which is as follows: [p·cos(u_k)·cos(v_k), p·sin(u_k)·cos(v_k), p·sin(v_k)]
  • the distance z (k, m) between the source and the sensor is calculated.
  • the transfer functions are also calculated from the reference sources to the sensors, for the reference frequencies.
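The first step can be sketched as follows. The spherical-coordinate expression matches the formula above with the camera at the origin; the free-field point-source model used for the transfer functions t(m, k, f_i) (a pure delay plus 1/distance attenuation) and the value of the speed of sound are assumptions, since the page does not give their expression.

```python
# Sketch: positions of the reference (fictitious) sources, source-to-sensor
# distances z(k, m), and transfer functions t(m, k, f_i).
import numpy as np

C = 343.0  # speed of sound (m/s), assumed

def reference_positions(u, v, p):
    """u, v: arrays of horizontal/vertical angles (rad); p: distances (m)."""
    return np.stack([p * np.cos(u) * np.cos(v),
                     p * np.sin(u) * np.cos(v),
                     p * np.sin(v)], axis=1)          # (K, 3), camera at origin

def transfer_functions(src_pos, mic_pos, freqs):
    """Return t[m, k, i] for sensors m, reference sources k, frequencies f_i (Hz)."""
    # z[m, k]: distance between source k and sensor m
    z = np.linalg.norm(src_pos[None, :, :] - mic_pos[:, None, :], axis=2)
    delay = z / C
    return np.exp(-2j * np.pi * freqs[None, None, :] * delay[:, :, None]) / z[:, :, None]
```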
  • This transfer function makes it possible, in a second step, to establish the expressions of the gains obtained, for the fictitious sounds originating from reference sound sources, and to fix the gains, which it is desired to obtain, for these same fictitious sounds.
  • using the filter whose coefficients are f(m, q), the sound from a source located at a position k will be received, for a frequency f_i, with a gain g(k, f_i) which is determined according to the equation:
  • g(k, f_i) = Σ_m Σ_{q=0}^{Q−1} f(m, q) · t(m, k, f_i) · e^(−j2π·f_i·q)
  • the desired gains g_s(k, f_i), corresponding to the sounds originating from the sound sources situated at the reference positions, are fixed for the reference frequencies f_i.
  • in a third step, an expression of the difference between the gains obtained and the desired gains is established.
  • this difference represents an error, which can be reduced to a fixed threshold value using, for example, the least-squares calculation method.
  • this equation (7) represents a sum of squares and double products. This means that the criterion given by equation (7) is quadratic in g(k, f_i). Likewise, the criterion given by equation (6) is quadratic in f(m, q). The reduction of the error to a threshold value leads to a system in the unknowns f(m, q), which admits a unique solution. The solution for F is obtained by differentiating equation (7) with respect to the values of the coefficients f(m, q).
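Because the gain expression above is linear in the unknown coefficients f(m, q), the least-squares step can be sketched as an over-determined linear system built over all reference positions and reference frequencies. Equations (6) and (7) are not reproduced on this page, so the exact criterion, the normalization of the frequencies by the sampling rate, and the real-valued solution scheme below are illustrative assumptions.

```python
# Sketch of the least-squares design of the filter coefficients f(m, q).
import numpy as np

def design_filter(t, freqs, fs, g_desired, Q):
    """t: (M, K, F) transfer functions; freqs: (F,) reference frequencies in Hz;
    fs: sampling frequency; g_desired: (K, F) desired complex gains.

    Returns f: (M, Q) real filter coefficients minimizing the squared gain error.
    """
    M, K, F = t.shape
    rows, rhs = [], []
    for k in range(K):
        for i in range(F):
            # g(k, f_i) = sum_m sum_q f(m, q) * t(m, k, f_i) * exp(-j 2 pi (f_i/fs) q)
            row = np.array([t[m, k, i] * np.exp(-2j * np.pi * (freqs[i] / fs) * q)
                            for m in range(M) for q in range(Q)])
            rows.append(row)
            rhs.append(g_desired[k, i])
    A, b = np.array(rows), np.array(rhs)
    # Real-valued least squares: stack real and imaginary parts of the system.
    A_ri = np.vstack([A.real, A.imag])
    b_ri = np.concatenate([b.real, b.imag])
    f_flat, *_ = np.linalg.lstsq(A_ri, b_ri, rcond=None)
    return f_flat.reshape(M, Q)
```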
  • the values of the filter coefficients can be determined from the expression of the function F thus found. To be able to determine the values of these coefficients, there are two possibilities.
  • the values of the coefficients are determined, before any manipulation, from the function F and for fixed values of parameters, then they are stored in a table.
  • This table can, for example, be a two-dimensional table comprising QxM rows and AxBxCxDxP columns.
  • the quintuplets (a, b, c, d, p) of parameters, for example, define the indices of the columns, and the numbers q of the coefficients of the filters corresponding to each sensor m define the indices of the rows.
  • the dimension of the table can be higher if we decide to separate the quintuplets into 2, 3, 4 or 5 distinct parameters, and if we decide to distinguish the Q coefficients and the M sensors in order to store them in separate rows and columns.
  • this memorization of the values of the coefficients in a table makes it possible to change the values of the coefficients more quickly during the sound pickup operation, for fixed values of the parameters.
  • the coefficients will change values only when the values of the quintuplets of parameters, which are fixed and stored in the table, are reached. Between these quintuplet values, corresponding to the updating of filters, the values of the coefficients could, for example, be interpolated.
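A rough sketch of this first possibility is given below, under illustrative assumptions: the table is represented as a Python dictionary keyed by the stored quintuplets, with a nearest-entry lookup; the patent only requires that Q×M coefficients be stored per quintuplet, for example in a table of Q×M rows and A×B×C×D×P columns.

```python
# Sketch: precompute the filter coefficients for a grid of quintuplets
# (a, b, c, d, p) and look them up during sound pickup.
import itertools

def build_table(F, a_vals, b_vals, c_vals, d_vals, p_vals):
    """F(a, b, c, d, p) -> (Q, M) coefficients; returns dict keyed by quintuplet."""
    table = {}
    for quint in itertools.product(a_vals, b_vals, c_vals, d_vals, p_vals):
        table[quint] = F(*quint)          # Q x M coefficients for this setting
    return table

def lookup(table, a, b, c, d, p):
    """Return the coefficients of the nearest stored quintuplet."""
    key = min(table,
              key=lambda q: sum((x - y) ** 2 for x, y in zip(q, (a, b, c, d, p))))
    return table[key]
```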
  • the values of the coefficients of each filter are determined in real time, from the expression of the function F and for values of parameters varying continuously.
  • the coefficients of the filters are preferably updated at a regular rate and their values are interpolated according to equation (3) previously established.
  • the orientation of the camera and that of the sensor network must be connected, by any means, in order to avoid any discrepancy between, on the one hand, the image representing the position of the sound sources, and on the other hand, the images showing the variation of the characteristic parameters of the sound reception channels. In this way, one can visualize very precisely the variation of the parameters compared to the position and the size of the sound sources.
  • Another embodiment of a device according to the invention therefore relates to fixing the camera 100 relative to the network 600 of sensors.
  • the camera 100 is advantageously fixed on the same frame as the network 600 of sensors so that its pointing is strictly invariant with respect to the position of the sensors.
  • a variant of this system consists in not fixing the camera 100 on the same frame as the network 600 of sensors.
  • the sensor network must have a fixed position in space, and the camera must also have a fixed position and orientation in space, to obtain a pointing of the sound sources which is invariant with respect to the positions of the sensors.
  • it is possible to add to the device a remote control making it possible to adjust the video pointing system remotely.
  • a user does not necessarily have access to the video system, so that he cannot view the settings made.
  • it is also preferable to equip the device with an auditory feedback system, allowing the user to make the adjustments directly, using the audible signals reaching him.
  • auditory feedback is achieved, for example, by means of an earpiece placed in the user's ear canal and connected to the device by a cable or, better, by a radio link.

Landscapes

  • Health & Medical Sciences (AREA)
  • Otolaryngology (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • General Health & Medical Sciences (AREA)
  • Stereophonic System (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Circuit For Audible Band Transducer (AREA)
  • Filters That Use Time-Delay Elements (AREA)
  • Television Signal Processing For Recording (AREA)
EP95402816A 1994-12-21 1995-12-14 Dispositif de prise de sons comprenant un système vidéo pour le réglage de paramètres et procédé de réglage Expired - Lifetime EP0719070B1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR9415429A FR2728753A1 (fr) 1994-12-21 1994-12-21 Dispositif de prise de sons comprenant un systeme video pour le reglage de parametres et procede de reglage
FR9415429 1994-12-21

Publications (2)

Publication Number Publication Date
EP0719070A1 (fr) 1996-06-26
EP0719070B1 (fr) 1997-09-17

Family

ID=9470076

Family Applications (1)

Application Number Title Priority Date Filing Date
EP95402816A Expired - Lifetime EP0719070B1 (fr) 1994-12-21 1995-12-14 Dispositif de prise de sons comprenant un système vidéo pour le réglage de paramètres et procédé de réglage

Country Status (6)

Country Link
US (1) US5760825A
EP (1) EP0719070B1
JP (1) JP3575775B2
CA (1) CA2165512C
DE (1) DE69500732T2
FR (1) FR2728753A1

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5959667A (en) * 1996-05-09 1999-09-28 Vtel Corporation Voice activated camera preset selection system and method of operation
US6269483B1 (en) * 1998-12-17 2001-07-31 International Business Machines Corp. Method and apparatus for using audio level to make a multimedia conference dormant
US6868372B2 (en) 2000-04-12 2005-03-15 Home Box Office, Inc. Image and audio degradation simulator
FR2832282B1 (fr) * 2001-11-12 2004-07-30 France Telecom Systeme audiovisuel modulaire avec modules concatenables pour mettre en presence une scene locale et une scene distante
JP2003244800A (ja) * 2002-02-14 2003-08-29 Matsushita Electric Ind Co Ltd 音像定位装置
JP2005311604A (ja) * 2004-04-20 2005-11-04 Sony Corp 情報処理装置及び情報処理装置に用いるプログラム
JP5856295B2 (ja) * 2011-07-01 2016-02-09 ドルビー ラボラトリーズ ライセンシング コーポレイション 適応的オーディオシステムのための同期及びスイッチオーバ方法及びシステム

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4802227A (en) * 1987-04-03 1989-01-31 American Telephone And Telegraph Company Noise reduction processing arrangement for microphone arrays
CA1312369C (en) * 1988-07-20 1993-01-05 Tsutomu Ishikawa Sound reproducer
FR2635622A1 (fr) * 1988-08-19 1990-02-23 France Etat Dispositif de saisie de signaux sonores a elimination de brouilleur
JPH03245203A (ja) * 1990-02-23 1991-10-31 Yamatake Honeywell Co Ltd 予測制御における予測値のトレンド表示方法及び装置
JPH0678307A (ja) * 1992-07-06 1994-03-18 Sanyo Electric Co Ltd リモートコントロール装置及び電子機器制御システム
WO1994006246A1 (en) * 1992-08-27 1994-03-17 Kabushiki Kaisha Toshiba Moving picture encoder
US5335011A (en) * 1993-01-12 1994-08-02 Bell Communications Research, Inc. Sound localization system for teleconferencing using self-steering microphone arrays
US5548346A (en) * 1993-11-05 1996-08-20 Hitachi, Ltd. Apparatus for integrally controlling audio and video signals in real time and multi-site communication control method

Also Published As

Publication number Publication date
DE69500732T2 (de) 1998-02-12
US5760825A (en) 1998-06-02
DE69500732D1 (de) 1997-10-23
EP0719070A1 (fr) 1996-06-26
FR2728753B1
CA2165512A1 (fr) 1996-06-22
FR2728753A1 (fr) 1996-06-28
CA2165512C (fr) 2006-09-19
JP3575775B2 (ja) 2004-10-13
JPH08265894A (ja) 1996-10-11

Similar Documents

Publication Publication Date Title
CN101163204B (zh) 声音拾取设备和声音拾取方法
EP0790753B1 (fr) Système de spatialisation sonore, et procédé pour sa mise en oeuvre
US5696831A (en) Audio reproducing apparatus corresponding to picture
EP1836876B1 (fr) Procédé et dispositif d'individualisation de hrtfs par modélisation
EP1586220B1 (fr) Procede et dispositif de pilotage d'un ensemble de restitution a partir d'un signal multicanal
FR2738099A1 (fr) Procede de simulation de la qualite acoustique d'une salle et processeur audio-numerique associe
EP0719070B1 (fr) Dispositif de prise de sons comprenant un système vidéo pour le réglage de paramètres et procédé de réglage
CN110267166B (zh) 一种基于双耳效应的虚拟声场实时交互系统
US12069463B2 (en) Dynamic time and level difference rendering for audio spatialization
FR3065137A1 (fr) Procede de spatialisation sonore
WO2020043979A1 (fr) Procédé pour une restitution sonore spatialisée d'un champ sonore audible en une position d'un auditeur se déplaçant et système mettant en œuvre un tel procédé
US11751003B1 (en) Personalization of head-related transfer function
EP2478715B1 (en) Method for acquiring audio signals, and audio acquisition system thereof
US12200467B2 (en) System and method for improved processing of stereo or binaural audio
EP1502475B1 (fr) Procede et systeme de representation d un champ acoustique
FR3120449A1 (fr) Procédé de détermination d’une direction de propagation d’une source sonore par création de signaux sinusoïdaux à partir des signaux sonores reçus par des microphones.
EP2987339B1 (fr) Procédé de restitution sonore d'un signal numérique audio
CN115134499A (zh) 一种音视频监控方法及系统
FR3102925A1 (fr) Dispositif de test audiometrique
EP1438871A1 (fr) Dispositif de saisie et restitution du son utilisant plusieurs capteurs
CN116723229A (zh) 一种身临其境的远程音频传输系统及方法
CN120581021A (zh) 音频变焦方法、电子设备、存储介质及计算机程序产品
WO2024240950A1 (fr) Procédé pour générer une scène audio dans un système de spatialisation binaurale
WO2019020437A1 (fr) Procédé et système de traitement d'un signal audio incluant un encodage au format ambisonique

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): DE FR GB IT

17P Request for examination filed

Effective date: 19960709

17Q First examination report despatched

Effective date: 19960905

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

GRAH Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOS IGRA

GRAH Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOS IGRA

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB IT

ITF It: translation for a ep patent filed
GBT Gb: translation of ep patent filed (gb section 77(6)(a)/1977)

Effective date: 19970919

REF Corresponds to:

Ref document number: 69500732

Country of ref document: DE

Date of ref document: 19971023

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed
REG Reference to a national code

Ref country code: GB

Ref legal event code: IF02

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20081218

Year of fee payment: 14

REG Reference to a national code

Ref country code: GB

Ref legal event code: 732E

Free format text: REGISTERED BETWEEN 20090528 AND 20090603

REG Reference to a national code

Ref country code: FR

Ref legal event code: TP

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20091214

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20111230

Year of fee payment: 17

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20121128

Year of fee payment: 18

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20121219

Year of fee payment: 18

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 69500732

Country of ref document: DE

Effective date: 20130702

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130702

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20131214

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20140829

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20131231

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20131214