WO2002041664A2 - Automatically adjusting audio system - Google Patents

Automatically adjusting audio system

Info

Publication number
WO2002041664A2
WO2002041664A2 (PCT/EP2001/013304)
Authority
WO
WIPO (PCT)
Prior art keywords
user
speakers
audio
image
generating system
Prior art date
Application number
PCT/EP2001/013304
Other languages
English (en)
Other versions
WO2002041664A3 (fr)
Inventor
Miroslav Trajkovic
Srinivas Gutta
Antonio Colmenarez
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Priority to JP2002543259A priority Critical patent/JP2004514359A/ja
Priority to EP01989480A priority patent/EP1393591A2/fr
Publication of WO2002041664A2 publication Critical patent/WO2002041664A2/fr
Publication of WO2002041664A3 publication Critical patent/WO2002041664A3/fr

Links

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S 7/00 Indicating arrangements; Control arrangements, e.g. balance control
    • H04S 7/30 Control circuits for electronic adaptation of the sound field
    • H04S 7/302 Electronic adaptation of stereophonic sound system to listener position or orientation
    • H04S 7/303 Tracking of listener position or orientation

Definitions

  • the invention relates to audio systems, such as stereo systems, television audio systems and home theater systems.
  • the invention relates to systems and methods for adjusting audio systems.
  • British Patent GB 2,228,324 describes a system that adjusts the balance of a stereo system as a user moves, in order to maintain the stereo effect for the listener.
  • a signal emitter carried by the user emits signals to two separate receivers that are adjacent to two stereo speakers.
  • the signal emitted may be an ultrasonic signal, infra-red signal or radio signal and may be emitted in response to an initiating signal. (It may also be a wired electrical signal.)
  • the system uses the time it takes a respective receiver (adjacent a speaker) to receive the signal from the signal emitter to determine the distance between the user and the speaker. A distance between the user and each of the two speakers is so calculated.
  • GB 2,228,324 refers to the system determining the position of the user by determining the point where the user's distance from each speaker overlaps, but notes that determining position is not necessary for adjusting stereo balance.
  • Japanese Patent Abstract 5-137200 detects the position of a viewer in one of five angular zones with respect to the front of a television by pointing a separate infra-red detector at each zone. The balance of the stereo speakers flanking the television screen is said to be adjusted based on the zone the viewer is in.
  • Japanese Patent Abstract 4-130900 uses elapsed time of light transmission to calculate the distances between a listener and two light emitting and detecting parts. The distances between the user and the two parts and the distance between the two parts is used to calculate the position of the listener and to adjust the balance of the audio signal.
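The distance-based localization in these abstracts reduces to planar trilateration: with the two emitting/detecting parts at known positions a known distance apart, the listener sits at the intersection of two circles. A minimal sketch in Python (the coordinate placement, function name, and the assumption that the listener is in the half-plane in front of the parts are ours, not from the abstracts):

```python
import math

def locate_listener(d1, d2, baseline):
    """Locate a listener from measured distances to two reference points.

    The first part is placed at (0, 0) and the second at (baseline, 0);
    the listener is assumed to be in the half-plane y >= 0 (in front of
    the parts).  Returns the (x, y) intersection of the two range circles.
    """
    # Intersect the circles x^2 + y^2 = d1^2 and (x-b)^2 + y^2 = d2^2.
    x = (d1**2 - d2**2 + baseline**2) / (2 * baseline)
    # Clamp against small negative values from measurement noise.
    y = math.sqrt(max(d1**2 - x**2, 0.0))
    return x, y
```

For example, a listener at (1, 2) with the parts 3 units apart yields distances sqrt(5) and sqrt(8), from which the function recovers (1.0, 2.0).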
  • Japanese Patent Abstract 7-302210 uses an infra-red signal to measure the distance between a listening position and each of a series of speakers and to adjust an appropriate delay time for each speaker based on the distance between the speaker and the listening position.
  • One obvious difficulty with the prior art systems is that they either require a user to wear or carry a signal emitter (as in GB 2,228,324) in order to enjoy automatic adjustment of the balance of a stereo system, or, if not, rely on sensors (such as infra-red sensors) that are unreliable and/or crude in detecting the position of a listener.
  • use of infra-red detectors may fail to detect the listener, resulting in the above-mentioned systems failing to balance properly for the user's position.
  • other people or other items, such as pets, may be sensed by the sensors, resulting in an adjustment of the balance to someone or something other than the listener.
  • a home theater system typically has a multiplicity of speakers positioned about a room that are used to project audio, including audio effects, to a listener.
  • the audio is not simply "balanced" between speakers. Rather, the output of a particular speaker location may be raised and lowered or otherwise coordinated based on the audio effect to be projected to the listener at his or her location. For example, two speakers of a multiplicity of speakers may be driven in phase or out of phase, in order to project a particular audio effect to a listener at the listener's position.
  • the invention provides an audio system (including an audiovisual system) that can automatically adjust to the position of the listener or user of the system, including a change in position of the user.
  • the system uses image capturing and recognition that recognizes some or part of the contours of a human body, i.e., the user. Based on the position of the user in the field of view, the system determines position information of the user. In one embodiment of the system, for example, the angular position of the user is determined based on the location of the image of the user in the field of view of an imaging capturing device, and the system may adjust the output of two or more speakers based on the determined angle.
  • the image capturing device may be, for example, a video camera connected to a control unit or CPU that has image recognition software programmed to recognize all or part of the shape of a human body.
  • Various methods of detecting and tracking active contours such as the human body have been developed. For example, a "person finder” that finds and follows people's bodies (or head or hands, for example) in a video image is described in "Pfinder: Real-Time Tracking Of the Human Body” by Wren et al., M.I.T. Media Laboratory Perceptual Computing Section Technical Report No. 353, published in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 7, pp 780-85 (July 1997), the contents of which are hereby incorporated by reference.
  • control unit or CPU may be programmed to recognize the contours of a human head or even the contours of a particular user's face.
  • Software that can recognize faces in images is commercially available, such as the "FaceIt" software sold by Visionics and described at www.faceit.com.
  • Software incorporating such algorithms which may be used to detect human bodies, faces, etc. will be generally referred to as image recognition software, image recognition algorithm and the like in the description below.
  • the position of the recognized body or head relative to the field of view of the camera may be used, for example, to determine the angle of the user's location with respect to the camera.
  • the determined angle may be used to balance or otherwise adjust the audio output and effects to be projected by each speaker to the user's location.
  • the use of an image capturing device and related image sensing software that identifies the contour of a human body or a particular face makes the detection of the user more accurate and reliable.
  • Two or more such programmed image capturing devices having overlapping fields of view may be used to accurately determine the location of the user.
  • two separate cameras as described above may be separately located and each may be used to determine the user's position in a reference coordinate system.
  • the user's location may be used by the audio system, for example, to determine the distance between the user's present location and the fixed (known) position of each speaker in the reference coordinate system and to make the appropriate adjustments to the speaker output to provide the proper audio mix to the user's location, such as audio effects in a home theater system.
  • the invention comprises an audio generating system that outputs audio through two or more speakers.
  • the audio output of each of the two or more speakers is adjustable based upon the position of a user with respect to the positions of the two or more speakers.
  • the system includes at least one image capturing device (such as a video camera) that is trainable on a listening region and coupled to a processing section having image recognition software.
  • the processing section uses the image recognition software to identify the user in an image generated by the image capturing device.
  • the processing section also has software that generates at least one measurement of the position of the user based upon the position of the user in the image.
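The summary above describes a capture, recognize, locate, adjust cycle. A hypothetical sketch of that control flow (all four collaborator interfaces are invented stand-ins for the camera, image recognition software, position-measuring software, and audio output stage named above):

```python
def auto_adjust_pass(camera, recognizer, locator, mixer):
    """One pass of the adjustment pipeline: capture an image, find the
    user in it, derive a position measurement, and retune the speakers.
    Returns True if an adjustment was made, False if no user was found
    (in which case the previous speaker settings are retained)."""
    image = camera.capture()
    region = recognizer.find_user(image)      # body/face region, or None
    if region is None:
        return False
    position = locator.position_from(region, image)
    mixer.adjust(position)                    # rebalance speaker outputs
    return True
```

In a running system this pass would be repeated so that the output follows the user as he or she moves.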
  • Fig. 1 is a perspective view of a home theater system including automatic detection and locating of a user and adjustment of output in accordance with a first embodiment of the invention;
  • Fig. 1a is a diagram of portions of the control system of the system of Fig. 1;
  • Fig. 2a is an image that includes an image of a user captured by a first camera of the system of Fig. 1;
  • Fig. 2b is an image that includes an image of the user captured by a second camera of the system of Fig. 1;
  • Fig. 3 is a representative view of a stereo system including automatic detection and locating of a user and adjustment of output in accordance with a second embodiment of the invention; and
  • Fig. 3a is an image that includes an image of the user captured by a camera of the system of Fig. 3.
  • a user 10 is shown positioned amongst audio and visual components of a home theater system.
  • the home theater system is comprised of a video display screen 14 and a series of audio speakers 18a-e surrounding the perimeter of a comfortable viewing area for the display screen 14.
  • the system is also comprised of a control unit 22, shown in Fig. 1 positioned atop the display screen 14.
  • the control unit 22 may be positioned elsewhere or may be incorporated within the display unit 14 itself.
  • the control unit 22, display screen 14 and speakers 18a-e are all electrically connected with electrical wires and connectors. The wires are typically run beneath carpet in a room or within an adjacent wall, so they are not shown in Fig. 1.
  • the home theater system of Fig. 1 includes electrical components that produce visual output from display screen 14 and corresponding audio output from speakers 18a-e.
  • the audio and video processing for the home theater output typically occurs in the control unit 22, which may include a processor, memory and related processing software.
  • control units and related processing components are known and available in various commercial formats.
  • Audio and video input provided to the control unit 22 may come from a television signal, a cable signal, a satellite signal, a DVD and/or a VCR.
  • the control unit 22 processes the input signal and provides appropriate signals to the driving circuitry of the display screen 14, resulting in a video display, and also processes the input signal and provides appropriate driving signals to the speakers 18a-e, as shown in Fig. la.
  • the audio portion of the signal input to the control unit 22 may be a stereophonic signal or may support more complex audio processing, such as audio effects processing by the control unit 22.
  • the control unit 22 may drive speakers 18b, 18c, 18d in an overlapping sequence in order to simulate a car passing by on the right hand portion of the display.
  • the amplitude and phase of each speaker 18b, 18c, 18d is driven by the control unit 22 based on the received audio signal, as well as on the position of the speaker 18b, 18c, 18d relative to the user 10 as stored in the memory of the control unit 22.
  • the control unit 22 may receive and store the positions of the speakers 18a-e and the position of the user 10 with respect to a common reference system, such as the one defined by origin O and unit vectors (x,y,z) in Fig. 1.
  • the x, y and z coordinates of each speaker 18a-e and the user 10 in the reference coordinate system may be physically measured or otherwise determined and input to the control unit 22.
  • the position of user 10 in Fig. 1 is shown to have coordinates (Xp,Yp, Zp) in the reference coordinate system.
  • the reference coordinate system in general may be located in positions other than shown in Fig. 1. (As described further below, the reference coordinate system in Fig. 1 has its origin O at the location of camera 26a.)
  • control unit 22 may alternatively translate the coordinates to an internal reference coordinate system.
  • the position of the user 10 and the speakers 18a-e in such a common reference coordinate system enables the control unit 22 to determine the position of the user 10 with respect to each speaker 18a-e. (It is well known that subtracting the coordinates of the user 10 from the coordinates of a speaker, such as speaker 18a, gives their relative positions in the reference coordinate system.)
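The parenthetical above describes ordinary vector subtraction in the shared reference frame; as a concrete illustration (the function names are ours):

```python
import math

def relative_position(user, speaker):
    """Vector from the user to a speaker: subtract the user's coordinates
    from the speaker's coordinates in the common reference system."""
    return tuple(s - u for s, u in zip(speaker, user))

def user_speaker_distance(user, speaker):
    """Euclidean distance between the user and a speaker."""
    return math.dist(user, speaker)
```

With the user at (1, 2, 0) and a speaker at (4, 6, 0), the relative position is (3, 4, 0) and the distance is 5.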
  • Software within the control unit 22 electronically adjusts the driving signals for the audio output (such as volume, frequency, phase) of each speaker based upon the received audio signal, as well as the position of the user 10 relative to the speaker.
  • Electronic adjustment of the audio output by the control unit 22 based on the relative positions of the speakers 18a-e with respect to the user 10 is known in the art.
  • the control system may allow the user to manually adjust the audio output of each speaker 18a-e.
  • Such manual control of the audio components via the control unit 22 is also known in the art.
  • input may be provided to the control unit 22 through a remote control that wirelessly interfaces with the control unit 22 and projects a menu on the display screen 14, which allows, for example, input of positional data.
  • the home theater system of Fig. 1 can also automatically identify the user and the user's location in the reference coordinate system.
  • the locations of the user 10 and the speakers 18a-e in the reference coordinate system at origin O were presumed to be known by the control unit 22 based, for example, on manual input provided by the user.
  • the positions of the speakers 18a-e will still normally be known to the control unit 22, since they usually will remain fixed after they are placed.
  • the positions of the speakers 18a-e in the reference coordinate system are each manually input to the control unit 22 during the initial system set-up and generally remain fixed thereafter.
  • the speaker location may be changed, of course, and a new position(s) may be input, but this does not occur during normal usage of the system.
  • the control unit 22 adjusts the audio output to each speaker 18a-e based on the relative locations of the user 10 and the speakers 18a-e, as in the case of manual input of positions, as previously described.
  • the system is further comprised of two video cameras 26a, 26b located atop display screen 14 and directed toward the normal viewing area of the display screen 14.
  • Camera 26a is located at the origin O of the common reference coordinate system.
  • video cameras 26a, 26b may be positioned at other locations; the reference coordinate system may also be re-positioned to a different location of camera 26a or elsewhere.
  • Video cameras 26a, 26b interface with the control unit 22 and provide it with images captured in the viewing area.
  • Image recognition software is loaded in control unit 22 and is used by a processor therein to process the video images received from the cameras 26a, 26b.
  • the components, including memory, of the control unit 22 used for image recognition may be separate or may be shared with the other functions of the control unit 22, such as those shown in Fig. la. Alternatively, the image recognition may take place in a separate unit.
  • Fig. 2a depicts the image in the field of view of camera 26a on one side of the display screen of Fig. 1.
  • the image of Fig. 2a is transmitted to control unit 22, where it is processed using, for example, known image recognition software loaded therein.
  • An image recognition algorithm may be used to recognize the contours of a human body, such as the user 10.
  • image recognition software may be used that recognizes faces or may be programmed to recognize a particular face or faces, such as the face of user 10.
  • the control unit 22 is programmed to determine the point P1' at the center of the head of the user 10 in the image and the coordinates (x',y') with respect to the point O1' in the upper left-hand corner of the image.
  • the point O1' in the image of Fig. 2a corresponds approximately to the point (0,0,Zp) in the reference coordinate system of Fig. 1.
  • Fig. 2b depicts the image in the field of view of camera 26b on the other side of the display screen of Fig. 1.
  • the image of Fig. 2b is transmitted to control unit 22, where it is processed using image recognition software to recognize the user 10 or the image of the user's face.
  • the control unit determines the point P1" at the center of the head of the user 10 in the image of Fig. 2b and the coordinates (x",y") with respect to the point O1" in the upper left-hand corner of the image.
  • the coordinates (Xp,Yp, Zp) of the position P of the user 10 in the reference coordinate system of Fig. 1 may be uniquely determined using standard techniques of computer vision known as the "stereo problem".
  • Basic stereo techniques of three dimensional computer vision are described for example, in “Introductory Techniques for 3-D Computer Vision” by Trucco and Verri, (Prentice Hall, 1998) and, in particular, Chapter 7 of that text entitled “Stereopsis", the contents of which are hereby incorporated by reference.
  • D is the distance between cameras 26a, 26b.
  • Eqs. 1-4 are defined up to linear transformations determined by the camera geometry.
  • Equations 1-4 have three unknown variables (the coordinates Xp, Yp, Zp); their simultaneous solution gives the values of Xp, Yp, and Zp and thus the position of the user 10 in the reference coordinate system of Fig. 1.
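Equations 1-4 themselves are not reproduced in this text, but for the common textbook special case the passage's "stereo problem" refers to, two identical parallel cameras separated by the baseline D along the x-axis, the solution takes the standard disparity form below. The focal length parameter and image coordinates measured from each camera's principal point are assumptions of this simplified model, not the patent's exact equations:

```python
def triangulate(x1, y1, x2, baseline, focal):
    """Recover (X, Y, Z) from matched image coordinates (x1, y1) and
    (x2, y2) in two identical parallel cameras a distance `baseline`
    apart along the x-axis, with focal length `focal`.  Depth follows
    from the disparity x1 - x2; X and Y then follow by back-projection.
    """
    disparity = x1 - x2
    if disparity == 0:
        raise ValueError("zero disparity: point at infinity")
    Z = focal * baseline / disparity
    X = Z * x1 / focal
    Y = Z * y1 / focal
    return X, Y, Z
```

For a point at (1, 2, 10) seen by unit-focal cameras 0.5 apart, the image coordinates are (0.1, 0.2) and (0.05, 0.2), and the function recovers the original point.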
  • the coordinates (Xp, Yp, Zp) may be translated to another internal coordinate system of the control unit 22.
  • the processing required to determine the position (Xp,Yp, Zp) of the user and to translate the coordinates to another reference coordinate system, if necessary, may also take place in a processing unit other than control unit 22. For example, it may take place in a processing unit that also supports the image recognition processing, thus comprising a separate processing unit dedicated to the tasks of image detection and location.
  • the fixed positions of speakers 18a-e are known to the control unit 22 based on prior input. For example, once each speaker 18a-e is placed in the room as shown in Fig. 1, the coordinates (x,y,z) of each speaker 18a-e in the reference coordinate system, and the distance D between cameras 26a, 26b, may be measured and input in memory in the control unit 22.
  • the coordinates (Xp,Yp, Zp) of the user 10 as determined using the image recognition software (along with the post-recognition processing of the stereo problem described above) and the pre-stored coordinates of each speaker may then be used to determine the position of the user 10 with respect to each speaker 18a-e.
  • the audio processing of the control unit 22 may then appropriately adjust the audio output (including amplitude, frequency and phase) of each speaker 18a-e based upon the input audio signal and the position of the user 10 with respect to the speakers 18a-e.
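The patent does not specify how amplitude and phase are adjusted per speaker; one common scheme, offered here only as an illustrative sketch, equalizes arrival level with a 1/r gain model and aligns arrival times using the speed of sound:

```python
SPEED_OF_SOUND = 343.0  # metres per second, at room temperature

def speaker_compensation(distances):
    """Per-speaker (gain, delay) so that sound from every speaker arrives
    at the listener at the same level and the same time.

    `distances` maps speaker id -> distance to the listener in metres.
    Nearer speakers are attenuated (compensating 1/r amplitude loss
    relative to the farthest speaker) and delayed so their wavefronts
    arrive together with the farthest speaker's.
    """
    farthest = max(distances.values())
    comp = {}
    for spk, d in distances.items():
        gain = d / farthest                        # <= 1.0
        delay = (farthest - d) / SPEED_OF_SOUND    # seconds
        comp[spk] = (gain, delay)
    return comp
```

A speaker 2 m away, paired with one 4 m away, would be driven at half gain and delayed by 2/343 s (about 5.8 ms).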
  • the use of the video cameras 26a, 26b, image recognition software, and post-recognition processing to determine a detected user's position thus allows the location of the user of the home theater system of Fig. 1 to be automatically detected and determined. If the user moves, the processing is repeated and a new position is determined for the user, and the control unit 22 uses the new location to adjust the audio signals output by speakers 18a-e.
  • the automatic detection feature may be turned off so that the output of the speakers is based on a default or a manual input of the location of the user 10.
  • the image recognition software may also be programmed to recognize, for example, a number of different faces and the face of a particular user may be selected for recognition and automatic adjustment. Thus, the system may adjust to the position of a particular user in the viewing area.
  • the image recognition software may be used to detect all faces or human bodies in the viewing area and the processing may then automatically determine each of their respective locations.
  • the adjustment of the audio output of each speaker 18a-e may be determined by an algorithm that attempts to optimize the aural experience at the location of each detected user.
  • Fig. 1 depicted a home theater system
  • the automatic detection and adjustment may be used by other audiovisual systems or other purely audio systems. It may be used, for example, with a stereo system having a number of speakers to adjust the volume at each speaker location based on the determined location of the user with respect to the speakers in order to maintain a proper (or pre-determined) balance of the stereophonic sound at the location of the user.
  • a simpler embodiment of the invention applied to a two speaker stereo system is shown in Fig. 3.
  • the basic components of the stereo system comprise a stereo amplifier 130 attached to two speakers 100a, 100b.
  • a camera 110 is used to detect an image of a listening region, including the image of a listener 140 in the listening region.
  • Fig. 3 shows a simple reference coordinate system in the plane, having an origin O at the camera and comprised of the angle of an object with respect to the axis A of the camera 110.
  • the angle α is the angular position of speaker 100a;
  • the angle β is the angular position of speaker 100b; and
  • the angle θ is the angular position of the user 140.
  • Fig. 3 shows the top of the user's head.
  • the user 140 is assumed to listen to the stereo in the central region of Fig. 3 at an approximate distance D from the origin O.
  • the speakers 100a, 100b have a default balance at the position D along the axis A, which is approximately at the center of the listening area.
  • the angles α and β of the positions of speakers 100a, 100b are measured and pre-stored in processing unit 120.
  • the image captured by the camera 110 is transferred to the processing unit 120 that includes image recognition software that detects the contour of a human body, a particular face, etc., as described in the embodiment above.
  • the location of the detected body or face in the image is used by the processing unit to determine the angle θ corresponding to the position of the user 140 in the reference coordinate system.
  • a first-order determination of the angle θ is:
  • the processing unit 120 in turn sends a signal to the amplifier that adjusts the balance of speakers 100a, 100b based on the relative angular positions of the user 140 and the speakers 100a, 100b.
  • the output of speaker 100a is adjusted using a factor (α - θ) and the output of speaker 100b is adjusted using a factor (β + θ).
  • the balance of speakers 100a, 100b is thus automatically adjusted based upon the position of the user 140 with respect to the speakers 100a, 100b.
  • the adjustment of the balance based on the angular position θ of the user alone is an acceptable first-order adjustment.
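The two steps of this embodiment can be sketched as follows: a first-order linear mapping from the user's horizontal pixel position to the angle θ, and balance weights in the spirit of the (α - θ) and (β + θ) factors above. The linear field-of-view model and the normalization by α + β are our assumptions, not the patent's formulas:

```python
def user_angle(x_pixel, image_width, fov_deg):
    """First-order estimate of the user's angle from the camera axis A:
    linear in the horizontal offset of the detected head from the image
    centre, scaled by the camera's horizontal field of view."""
    offset = (x_pixel - image_width / 2.0) / image_width  # -0.5 .. 0.5
    return offset * fov_deg

def balance_gains(theta, alpha, beta):
    """Relative gains for speakers 100a and 100b given the user's angle
    theta; alpha and beta are the (positive) angles of the two speakers
    on either side of the axis.  As the user moves toward one speaker,
    its weight falls and the other's rises, recentring the stereo image.
    """
    span = alpha + beta
    gain_a = (alpha - theta) / span
    gain_b = (beta + theta) / span
    return gain_a, gain_b
```

A centred user (θ = 0) with symmetric speakers at α = β = 30° gets equal gains of 0.5; at θ = 10° the gains become 1/3 and 2/3.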

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Stereophonic System (AREA)
  • Television Receiver Circuits (AREA)

Abstract

The invention concerns an audio generating system that outputs sound through one or more speakers. The sound output by each speaker is adjustable according to the position of a user with respect to the location of the speakers. The system comprises at least one image capturing device (such as a video camera) that can be trained on a listening region and coupled to a processing section having image recognition software. The processing section uses the image recognition software to identify the user in an image generated by the image capturing device. This section also comprises software that generates at least one measurement of the user's location based on the user's position in the image.
PCT/EP2001/013304 2000-11-16 2001-11-14 Systeme audio a reglage automatique WO2002041664A2 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2002543259A JP2004514359A (ja) 2000-11-16 2001-11-14 自動調整音響システム
EP01989480A EP1393591A2 (fr) 2000-11-16 2001-11-14 Systeme audio a reglage automatique

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US71389800A 2000-11-16 2000-11-16
US09/713,898 2000-11-16

Publications (2)

Publication Number Publication Date
WO2002041664A2 true WO2002041664A2 (fr) 2002-05-23
WO2002041664A3 WO2002041664A3 (fr) 2003-12-18

Family

ID=24867986

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2001/013304 WO2002041664A2 (fr) 2000-11-16 2001-11-14 Systeme audio a reglage automatique

Country Status (3)

Country Link
EP (1) EP1393591A2 (fr)
JP (1) JP2004514359A (fr)
WO (1) WO2002041664A2 (fr)

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004004068A1 (de) * 2004-01-20 2005-08-04 Deutsche Telekom Ag Verfahren und Steuereinheit zur ortsbezogenen Einrichtung und Optimierung von Multimedia-Anlagen
WO2006005938A1 (fr) * 2004-07-13 2006-01-19 1...Limited Systeme de haut-parleur portable
FR2877534A1 (fr) * 2004-11-03 2006-05-05 France Telecom Configuration dynamique d'un systeme sonore
EP1677574A2 (fr) * 2004-12-30 2006-07-05 Mondo Systems, Inc. Système multimédia de traitement de signaux avec traitement centralisé de signaux
WO2006100644A2 (fr) * 2005-03-24 2006-09-28 Koninklijke Philips Electronics, N.V. Adaptation de l'orientation et de la position d'un dispositif electronique pour experiences d'immersion
EP1677515A3 (fr) * 2004-12-30 2007-05-30 Mondo Systems, Inc. Système de traitement de signaux audio et vidéo avec traitement centralisé de signaux
NL1029844C2 (nl) * 2004-09-21 2007-07-06 Samsung Electronics Co Ltd Werkwijze, inrichting en computerleesbaar medium om een 2-kanaalvirtueelgeluid weer te geven gebaseerd op de luisteraarpositie.
WO2007004134A3 (fr) * 2005-06-30 2007-07-19 Philips Intellectual Property Procede de controle d'un systeme
WO2007113718A1 (fr) 2006-03-31 2007-10-11 Koninklijke Philips Electronics N.V. Dispositif et procede pour traiter des donnees
US20090060235A1 (en) * 2007-08-31 2009-03-05 Samsung Electronics Co., Ltd. Sound processing apparatus and sound processing method thereof
WO2009124773A1 (fr) * 2008-04-09 2009-10-15 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Système de reproduction sonore et procédé pour réaliser une reproduction sonore en utilisant un suivi visuelle des visages
EP1833276A3 (fr) * 2006-03-08 2009-12-02 Sony Corporation Appareil de télévision
WO2010141149A2 (fr) 2009-06-03 2010-12-09 Transpacific Image, Llc Gestion de projections multimédia
EP2464127A1 (fr) * 2010-11-18 2012-06-13 LG Electronics Inc. Dispositif électronique générant un son stéréo synchronisé avec une image en mouvement stéréographique
EP2667636A1 (fr) * 2012-05-25 2013-11-27 Samsung Electronics Co., Ltd. Appareil d'affichage, appareil de commande de niveau auditif et procédé de correction du son
US8976986B2 (en) 2009-09-21 2015-03-10 Microsoft Technology Licensing, Llc Volume adjustment based on listener position
EP3032847A3 (fr) * 2014-12-08 2016-06-29 Harman International Industries, Incorporated Réglage de haut-parleurs par reconnaissance faciale
EP2517478B1 (fr) * 2009-12-24 2017-11-01 Nokia Technologies Oy Appareil
CN107318071A (zh) * 2016-04-26 2017-11-03 音律电子股份有限公司 扬声装置、其控制方法及播放控制系统
WO2018149275A1 (fr) * 2017-02-16 2018-08-23 深圳创维-Rgb电子有限公司 Procédé et appareil d'ajustement d'une sortie audio par un haut-parleur
US10171054B1 (en) 2017-08-24 2019-01-01 International Business Machines Corporation Audio adjustment based on dynamic and static rules
US10440473B1 (en) 2018-06-22 2019-10-08 EVA Automation, Inc. Automatic de-baffling
US10484809B1 (en) 2018-06-22 2019-11-19 EVA Automation, Inc. Closed-loop adaptation of 3D sound
US10511906B1 (en) 2018-06-22 2019-12-17 EVA Automation, Inc. Dynamically adapting sound based on environmental characterization
US10524053B1 (en) 2018-06-22 2019-12-31 EVA Automation, Inc. Dynamically adapting sound based on background sound
US10531221B1 (en) 2018-06-22 2020-01-07 EVA Automation, Inc. Automatic room filling
US10552115B2 (en) 2016-12-13 2020-02-04 EVA Automation, Inc. Coordination of acoustic sources based on location
EP2731360B1 (fr) * 2012-11-09 2020-02-19 Harman International Industries, Inc. Système d'amélioration audio automatique
US10708691B2 (en) 2018-06-22 2020-07-07 EVA Automation, Inc. Dynamic equalization in a directional speaker array
CN111782045A (zh) * 2020-06-30 2020-10-16 歌尔科技有限公司 一种设备角度调节方法、装置、智能音箱及存储介质
CN116736982A (zh) * 2023-06-21 2023-09-12 惠州中哲尚蓝柏科技有限公司 一种用于家庭影院的多媒体输出参数自动调节系统及方法

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8015590B2 (en) 2004-12-30 2011-09-06 Mondo Systems, Inc. Integrated multimedia signal processing system using centralized processing of signals
JP4789145B2 (ja) * 2006-01-06 2011-10-12 サミー株式会社 コンテンツ再生装置及びコンテンツ再生プログラム
TWI510106B (zh) * 2011-01-28 2015-11-21 Hon Hai Prec Ind Co Ltd 聲音輸出校正系統及方法

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4027338A1 (de) * 1990-08-29 1992-03-12 Drescher Ruediger Balanceregelung fuer stereoanlagen u. dgl.
JP2001054200A (ja) * 1999-08-04 2001-02-23 Mitsubishi Electric Inf Technol Center America Inc スピーカーへの音配給調整システム及びその方法

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04351197A (ja) * 1991-05-29 1992-12-04 Matsushita Electric Ind Co Ltd 指向性制御スピーカシステム

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4027338A1 (de) * 1990-08-29 1992-03-12 Drescher Ruediger Balanceregelung fuer stereoanlagen u. dgl.
JP2001054200A (ja) * 1999-08-04 2001-02-23 Mitsubishi Electric Inf Technol Center America Inc スピーカーへの音配給調整システム及びその方法

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 017, no. 213 (E-1356), 26 April 1993 (1993-04-26) -& JP 04 351197 A (MATSUSHITA ELECTRIC IND CO LTD), 4 December 1992 (1992-12-04) *
PATENT ABSTRACTS OF JAPAN vol. 2000, no. 19, 5 June 2001 (2001-06-05) -& JP 2001 054200 A (MITSUBISHI ELECTRIC INF TECHNOL CENTER AMERICA INC), 23 February 2001 (2001-02-23) *

Cited By (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004004068A1 (de) * 2004-01-20 2005-08-04 Deutsche Telekom Ag Verfahren und Steuereinheit zur ortsbezogenen Einrichtung und Optimierung von Multimedia-Anlagen
GB2431066A (en) * 2004-07-13 2007-04-11 1 Ltd Portable speaker system
WO2006005938A1 (fr) * 2004-07-13 2006-01-19 1...Limited Portable speaker system
GB2431066B (en) * 2004-07-13 2007-11-28 1 Ltd Portable speaker system
US7860260B2 (en) 2004-09-21 2010-12-28 Samsung Electronics Co., Ltd Method, apparatus, and computer readable medium to reproduce a 2-channel virtual sound based on a listener position
NL1029844C2 (nl) * 2004-09-21 2007-07-06 Samsung Electronics Co Ltd Method, device and computer-readable medium for reproducing 2-channel virtual sound based on the listener position
FR2877534A1 (fr) * 2004-11-03 2006-05-05 France Telecom Dynamic configuration of a sound system
WO2006048537A1 (fr) * 2004-11-03 2006-05-11 France Telecom Dynamic configuration of a sound system
WO2006073990A2 (fr) * 2004-12-30 2006-07-13 Mondo Systems, Inc. Integrated multimedia signal processing system using centralized signal processing
EP1677515A3 (fr) * 2004-12-30 2007-05-30 Mondo Systems, Inc. Audio and video signal processing system with centralized signal processing
EP1677574A3 (fr) * 2004-12-30 2006-09-20 Mondo Systems, Inc. Multimedia signal processing system with centralized signal processing
US7653447B2 (en) 2004-12-30 2010-01-26 Mondo Systems, Inc. Integrated audio video signal processing system using centralized processing of signals
WO2006073990A3 (fr) * 2004-12-30 2009-04-23 Mondo Systems Inc Integrated multimedia signal processing system using centralized signal processing
US7561935B2 (en) 2004-12-30 2009-07-14 Mondo System, Inc. Integrated multimedia signal processing system using centralized processing of signals
EP1677574A2 (fr) * 2004-12-30 2006-07-05 Mondo Systems, Inc. Multimedia signal processing system with centralized signal processing
WO2006100644A3 (fr) * 2005-03-24 2007-02-15 Koninkl Philips Electronics Nv Adapting the orientation and position of an electronic device for immersive experiences
WO2006100644A2 (fr) * 2005-03-24 2006-09-28 Koninklijke Philips Electronics, N.V. Adapting the orientation and position of an electronic device for immersive experiences
WO2007004134A3 (fr) * 2005-06-30 2007-07-19 Philips Intellectual Property Method of controlling a system
US9465450B2 (en) 2005-06-30 2016-10-11 Koninklijke Philips N.V. Method of controlling a system
US8120713B2 (en) 2006-03-08 2012-02-21 Sony Corporation Television apparatus
EP1833276A3 (fr) * 2006-03-08 2009-12-02 Sony Corporation Television apparatus
CN101416235B (zh) * 2006-03-31 2012-05-30 皇家飞利浦电子股份有限公司 Device and method for processing data
US8675880B2 (en) 2006-03-31 2014-03-18 Koninklijke Philips N.V. Device for and a method of processing data
WO2007113718A1 (fr) 2006-03-31 2007-10-11 Koninklijke Philips Electronics N.V. Device and method for processing data
EP2005414B1 (fr) * 2006-03-31 2012-02-22 Koninklijke Philips Electronics N.V. Device and method for processing data
EP2005414A1 (fr) * 2006-03-31 2008-12-24 Koninklijke Philips Electronics N.V. Device and method for processing data
EP2031905A3 (fr) * 2007-08-31 2010-02-17 Samsung Electronics Co., Ltd. Sound processing apparatus and sound processing method thereof
US20090060235A1 (en) * 2007-08-31 2009-03-05 Samsung Electronics Co., Ltd. Sound processing apparatus and sound processing method thereof
WO2009124773A1 (fr) * 2008-04-09 2009-10-15 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Sound reproduction system and method for performing sound reproduction using visual face tracking
WO2010141149A3 (fr) * 2009-06-03 2011-02-24 Transpacific Image, Llc Multimedia projection management
CN102484688A (zh) * 2009-06-03 2012-05-30 传斯伯斯克影像有限公司 Multimedia projection management
US8269902B2 (en) 2009-06-03 2012-09-18 Transpacific Image, Llc Multimedia projection management
WO2010141149A2 (fr) 2009-06-03 2010-12-09 Transpacific Image, Llc Multimedia projection management
US8976986B2 (en) 2009-09-21 2015-03-10 Microsoft Technology Licensing, Llc Volume adjustment based on listener position
EP2517478B1 (fr) * 2009-12-24 2017-11-01 Nokia Technologies Oy An apparatus
EP2464127A1 (fr) * 2010-11-18 2012-06-13 LG Electronics Inc. Electronic device generating stereo sound synchronized with a stereographic moving picture
US9100633B2 (en) 2010-11-18 2015-08-04 Lg Electronics Inc. Electronic device generating stereo sound synchronized with stereographic moving picture
US9420373B2 (en) 2012-05-25 2016-08-16 Samsung Electronics Co., Ltd. Display apparatus, hearing level control apparatus, and method for correcting sound
EP2667636A1 (fr) * 2012-05-25 2013-11-27 Samsung Electronics Co., Ltd. Display apparatus, hearing level control apparatus, and method for correcting sound
EP2731360B1 (fr) * 2012-11-09 2020-02-19 Harman International Industries, Inc. Automatic audio enhancement system
US9544679B2 (en) 2014-12-08 2017-01-10 Harman International Industries, Inc. Adjusting speakers using facial recognition
EP3032847A3 (fr) * 2014-12-08 2016-06-29 Harman International Industries, Incorporated Adjusting speakers using facial recognition
US9866951B2 (en) 2014-12-08 2018-01-09 Harman International Industries, Incorporated Adjusting speakers using facial recognition
CN107318071A (zh) * 2016-04-26 2017-11-03 音律电子股份有限公司 Speaker device, control method thereof, and playback control system
US10552115B2 (en) 2016-12-13 2020-02-04 EVA Automation, Inc. Coordination of acoustic sources based on location
WO2018149275A1 (fr) * 2017-02-16 2018-08-23 深圳创维-Rgb电子有限公司 Method and apparatus for adjusting audio output by a speaker
US10171054B1 (en) 2017-08-24 2019-01-01 International Business Machines Corporation Audio adjustment based on dynamic and static rules
US10440473B1 (en) 2018-06-22 2019-10-08 EVA Automation, Inc. Automatic de-baffling
US10484809B1 (en) 2018-06-22 2019-11-19 EVA Automation, Inc. Closed-loop adaptation of 3D sound
US10511906B1 (en) 2018-06-22 2019-12-17 EVA Automation, Inc. Dynamically adapting sound based on environmental characterization
US10524053B1 (en) 2018-06-22 2019-12-31 EVA Automation, Inc. Dynamically adapting sound based on background sound
US10531221B1 (en) 2018-06-22 2020-01-07 EVA Automation, Inc. Automatic room filling
US10708691B2 (en) 2018-06-22 2020-07-07 EVA Automation, Inc. Dynamic equalization in a directional speaker array
CN111782045A (zh) * 2020-06-30 2020-10-16 歌尔科技有限公司 Device angle adjustment method and apparatus, smart speaker, and storage medium
CN116736982A (zh) * 2023-06-21 2023-09-12 惠州中哲尚蓝柏科技有限公司 Automatic multimedia output parameter adjustment system and method for a home theater
CN116736982B (zh) * 2023-06-21 2024-01-26 惠州中哲尚蓝柏科技有限公司 Automatic multimedia output parameter adjustment system and method for a home theater

Also Published As

Publication number Publication date
EP1393591A2 (fr) 2004-03-03
WO2002041664A3 (fr) 2003-12-18
JP2004514359A (ja) 2004-05-13

Similar Documents

Publication Publication Date Title
EP1393591A2 (fr) Audio system with automatic adjustment
US9980040B2 (en) Active speaker location detection
US9906885B2 (en) Methods and systems for inserting virtual sounds into an environment
Ribeiro et al. Using reverberation to improve range and elevation discrimination for small array sound source localization
JP5091857B2 (ja) System control method
US9485556B1 (en) Speaker array for sound imaging
US9832447B2 (en) Image processing system and image processing program
KR20020094011A (ko) Automatic positioning of a display depending on the viewer's position
CN114208209B (zh) Audio processing system, method and medium
JPH1141577A (ja) Speaker position detection device
JP2023508002A (ja) Automatic audio device location selection
Mulder et al. An affordable optical head tracking system for desktop VR/AR systems
EP2850506A2 (fr) Input system
Liu et al. Multiple speaker tracking in spatial audio via PHD filtering and depth-audio fusion
Łopatka et al. Application of vector sensors to acoustic surveillance of a public interior space
JP2009194447A (ja) Remote controller position detection device, remote controller position detection system, remote controller position detection method, and program
US20220210588A1 (en) Methods and systems for determining parameters of audio devices
US7599502B2 (en) Sound control installation
Deldjoo et al. A low-cost infrared-optical head tracking solution for virtual 3d audio environment using the nintendo wii-remote
CN116261095A (zh) Audio system capable of dynamically adjusting the target listening point and eliminating interference from environmental objects
WO2019244315A1 (fr) Output control device, output control system, and output control method
Piérard et al. I-see-3d! an interactive and immersive system that dynamically adapts 2d projections to the location of a user's eyes
JP2005295181A (ja) Audio information generating device
EP2107390B1 (fr) Rotation angle determination for headphones
KR20110097388A (ko) Device detection system and method

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): JP

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR

WWE Wipo information: entry into national phase

Ref document number: 2001989480

Country of ref document: EP

ENP Entry into the national phase in:

Ref country code: JP

Ref document number: 2002 543259

Kind code of ref document: A

Format of ref document f/p: F

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWP Wipo information: published in national office

Ref document number: 2001989480

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 2001989480

Country of ref document: EP