EP1617702A1 - Portable electronic equipment with 3D audio rendering - Google Patents


Info

Publication number
EP1617702A1
Authority
EP
European Patent Office
Prior art keywords
electronic equipment
user
audio
portable electronic
location
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP04016438A
Other languages
German (de)
French (fr)
Other versions
EP1617702B1 (en)
Inventor
Henrik Sundström
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Mobile Communications AB
Original Assignee
Sony Ericsson Mobile Communications AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Ericsson Mobile Communications AB
Priority to EP04016438A (EP1617702B1)
Priority to DE602004029021T (DE602004029021D1)
Priority to AT04016438T (ATE480960T1)
Publication of EP1617702A1
Application granted
Publication of EP1617702B1
Legal status: Not-in-force
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S7/00 Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/30 Control circuits for electronic adaptation of the sound field
    • H04S7/302 Electronic adaptation of stereophonic sound system to listener position or orientation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R5/00 Stereophonic arrangements
    • H04R5/033 Headphones for stereophonic communication


Abstract

The present invention relates to a portable electronic equipment (5) with at least one loudspeaker (11) for outputting audio signals in a three-dimensional manner, detecting means (14; 15) for detecting the spatial location of a user in relation to the electronic equipment (5), and processing means (9) for dynamically adjusting the three-dimensional output of audio signals from the at least one loudspeaker (11) depending on the detected location of the user. The present invention further relates to a method for controlling the audio output from such a portable electronic equipment and a computer programme directly loadable into the internal memory of such a portable electronic equipment for performing the method steps.

Description

    TECHNICAL FIELD OF THE INVENTION
  • The present invention relates to a portable electronic equipment, for example a mobile phone which is able to output audio signals in an audio-spatial manner.
  • DESCRIPTION OF RELATED ART
  • The output of audio signals in an audio-spatial manner, also called three-dimensional audio, very simply means sound which comes from all around the listener. A human being can hear three-dimensionally in the real world using just two ears. On this basis, many three-dimensional audio products have been built which provide a realistic three-dimensional effect using just two speakers or a set of headphones. This technology is called three-dimensional audio: the ability to position sound anywhere in a three-dimensional space. Three-dimensional audio is therefore achievable in all usual playback environments, such as headphones, stereo speakers and of course multi-speaker arrays. In the "3D Audio Rendering and Evaluation Guidelines" issued by the Interactive Audio Special Interest Group, the term interactive three-dimensional audio is used for on-the-fly positioning of sounds anywhere in the three-dimensional space surrounding a listener. The output of the audio signals is changed interactively as the listener's head motion is simulated, for example using inputs from a joystick, a mouse or a head-tracking system. Interactive three-dimensional audio rendering is applicable to all kinds of technologies, such as 3D websites or video games, phone or video conferences with multiple participants placed in 3D audio space, air traffic control, aeroplanes and so forth.
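The on-the-fly positioning described above can be illustrated with a small sketch that is not taken from the patent or the cited guidelines: a constant-power pan law mapping a source's azimuth to left/right channel gains, which an interactive renderer would recompute whenever the simulated listener position changes.

```python
import math

def pan_gains(azimuth_deg: float) -> tuple[float, float]:
    """Constant-power pan: map azimuth (-90 = hard left,
    +90 = hard right) to (left, right) gains whose squared
    sum is always 1, so perceived loudness stays constant."""
    # Map [-90, +90] degrees onto a pan angle in [0, pi/2].
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2)
    return math.cos(theta), math.sin(theta)

# A centred source feeds both channels equally.
left, right = pan_gains(0.0)
print(round(left, 3), round(right, 3))  # 0.707 0.707
```

An interactive renderer would call `pan_gains` (or its full HRTF equivalent) each time the tracked position changes, rather than once at startup.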
  • All known (interactive) three-dimensional audio systems, however, are restricted to a known or assumed user position. A known user position is for example shown in Fig. 1, in which a user 1 wears a pair of headphones 2. In this case, the user's position is obviously known as being between the loudspeakers of the headphones. Fig. 2 schematically shows the home stereo situation, in which speakers 3, 4 are located at fixed positions, for example in a living room. The position of the user 1 is assumed to be at a certain location in the living room, e.g. on the couch, and the audio properties of the speakers 3, 4 are set for this specific situation. Another example is a user sitting in front of a desktop computer which has a pair of speakers. If the user plays a game on the computer or works in any kind of application, the position of the user is always fixed in front of the joystick or keyboard of the computer and thus in a fixed relationship to the speakers. In all these cases, the exactly known or assumed position of a user does not change, so that the three-dimensional audio properties of the speakers can be fixedly set.
  • For portable electronic equipment, e.g. mobile phones, personal digital assistants, pagers, communicators and so forth, the situation is different. Fig. 3 schematically shows different situations in which e.g. a mobile phone 5 can be used. In a normal use mode, a user 1 uses a mobile phone 5 by holding the phone to his or her ear in order to listen to the loudspeaker(s) and to speak into the microphone, as shown in Fig. 3A. Another situation is that the user 1 holds the mobile phone 5 in his or her hand in front of his or her face in order to look at the display and to listen to audio signals output from the loudspeaker(s), as shown in Fig. 3B. In a further situation, a mobile phone 5 can be placed on a table 6 so that a user 1 who is sitting or standing close to the table can listen to audio signals output from the mobile phone 5, as shown in Fig. 3C. In all these situations, the position of a user 1 in relation to the mobile phone is variable and changes depending on the desired application. The output of three-dimensional audio signals on the basis of a known or assumed user position is therefore not possible.
  • SUMMARY
  • The object of the present invention is therefore to provide a portable electronic equipment and a method for controlling the audio output from a portable electronic equipment which enable the output of three-dimensional audio signals in a flexible and variable manner depending on a respective application.
  • The above object is achieved by a portable electronic equipment according to claim 1, comprising at least one loudspeaker for outputting audio signals in a three-dimensional manner, detecting means for detecting the location of a user in relation to the electronic equipment, and processing means for dynamically adjusting the audio-spatial output of audio signals from the at least one loudspeaker depending on the detected location of the user.
  • The above object is also achieved by a method for controlling the audio output from a portable electronic equipment according to claim 8, comprising the steps of detecting the spatial location of a user in relation to the electronic equipment, and dynamically adjusting the audio-spatial output of audio signals from the at least one loudspeaker depending on the detected location of the user.
  • The portable electronic equipment and the method for controlling the audio output from such a portable electronic equipment according to the present invention therefore enable the variable and flexible adaptation of the audio-spatial output of audio signals depending on the detected spatial location of a user in a simple and easy manner.
  • The present invention therefore improves the three-dimensional audio experience in portable electronic equipments. The process of three-dimensional audio rendering is greatly simplified and can be flexibly adapted to the spatial location of the user.
  • Advantageously, the processing means continuously adjusts the audio-spatial output of the audio signals depending on the detected location of the user. Since the adjustment and adaptation of the audio-spatial output of the audio signals is continuously performed, the portable electronic equipment can be used in any kind of situation and location. The present invention therefore allows a dynamic and quick adjustment of the three-dimensional rendering in a continuous manner. A user therefore has the desired three-dimensional audio effect for any spatial relationship between himself and the portable electronic equipment. Hereby, the detecting means could detect the spatial location or position of a user at regular intervals. Alternatively, the detecting means could detect the spatial location of a user in relation to the electronic equipment only after a corresponding input instruction from a user.
  • Further advantageously, the detecting means detects the distance and the angular position of a user in relation to the electronic equipment. Hereby, the spatial location of a user in relation to the electronic equipment is calculated on the basis of the detected distance and the detected angular position. Other suitable detecting means for directly detecting and determining the spatial location of a user can of course be implemented. For example, the detecting means may comprise a camera for visually detecting the spatial location of a user. Alternatively or additionally, the detecting means may comprise a microphone for detecting the location of a user on the basis of audio signals received from the user. In this case, the microphone for detecting the spatial location of a user can be the same microphone as the one usually used for detecting speech or other audio signals from a user in order to transmit the signals to a communication partner, in case the portable electronic equipment is a portable radio communication device or the like.
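As an illustration of the distance-plus-angle variant described above (the function name and coordinate convention are assumptions for this sketch, not defined by the patent), the user's planar location relative to the device could be computed like this:

```python
import math

def user_location(distance_m: float, angle_deg: float) -> tuple[float, float]:
    """Convert a detected distance and angular position (0 degrees
    = straight ahead of the device, positive = to the device's right)
    into planar x/y coordinates relative to the device."""
    a = math.radians(angle_deg)
    return distance_m * math.sin(a), distance_m * math.cos(a)

x, y = user_location(1.0, 30.0)   # user 1 m away, 30 degrees to the right
print(round(x, 3), round(y, 3))   # 0.5 0.866
```

The resulting coordinates are what a renderer would feed into its listener-position update, regardless of whether they came from a camera, a microphone array or some other sensor.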
  • Further advantageously, the portable electronic equipment could comprise two or more loudspeakers for outputting audio signals. Although three-dimensional audio rendering is possible with one loudspeaker, two loudspeakers provide a better three-dimensional effect. Of course, more than two loudspeakers, e.g. three or four, could be used in order to improve the three-dimensional audio rendering. In the case of two loudspeakers, one could e.g. be the normal loudspeaker used to output received audio signals to a user, and the second could be the loudspeaker used to output alarm or ring signals, in case the portable electronic equipment is a portable radio communication device.
  • Advantageously, the electronic equipment is a portable radio equipment for communication in a wireless communication system such as GSM, UMTS or any other wireless communication system. Hereby, the term portable radio communication equipment includes all equipment such as mobile telephones, pagers, communicators, e.g. electronic organisers, smart phones and the like.
  • It is to be noted that the present invention further relates to a computer programme product directly loadable into the internal memory of a portable electronic equipment, comprising software code portions for performing the steps of the method for controlling the audio output from a portable electronic equipment according to the present invention, if said product is run on the portable electronic equipment.
  • Further, it should be emphasised that the term comprises/comprising when used in this specification is taken to specify the presence of stated features, integers, steps or components, but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In the following description, the present invention is explained in more detail in relation to the enclosed drawings, in which
    • Fig. 1 schematically shows a user with head phones,
    • Fig. 2 schematically shows a user with two loudspeakers on fixed locations,
    • Figs. 3A, 3B and 3C schematically show a user and a portable electronic equipment in different spatial relationships, and
    • Fig. 4 schematically shows a block diagram of a portable electronic equipment according to the present invention.
    DETAILED DESCRIPTION OF EMBODIMENT
  • The example of a portable electronic equipment schematically shown in Fig. 4 is a portable radio communication device, such as a mobile phone, a personal digital assistant or the like. However, it is to be understood that the portable electronic equipment according to the present invention is not limited or restricted to a portable radio communication device.
  • The portable radio communication device 5 shown in Fig. 4 comprises an antenna 7 and an RF part 8 for transmitting and receiving radio frequency signals. The RF part 8 is connected to a processing means 9, which is e.g. a baseband processor or any other suitable processing means. The processing means 9 can be implemented by means of hardware or software components or a mixture thereof. The processing means 9 either comprises or is connected to a memory means 10, in which data and/or software code is stored.
  • In the example shown in Fig. 4, the portable radio communication device comprises a microphone 14 for detecting audio signals, e.g. speech signals from a user. Further, the processing means 9 is connected to at least one speaker 11 for outputting audio signals, e.g. speech signals, music signals, alarm signals and so forth. Optionally, the processing means 9 could be connected to a second speaker 12. Further optionally, the processing means 9 is connected to a third speaker 13. It is to be understood that more than three speakers can be implemented in the radio communication device 5.
  • The radio communication device 5 further comprises a digital camera 15 which is connected to the processing means 9. The camera 15 enables a user to take still and/or moving pictures which are then further processed in the processing means 9 and optionally stored in the memory means 10. The pictures taken with the camera 15 can also be communicated via the RF part 8 and the antenna 7 to a communication partner in the wireless communication system.
  • The speaker 11 and optionally the speakers 12 and 13 are adapted to output audio signals in a spatial manner (i.e. three-dimensional manner). The three-dimensional or audio-spatial output of the audio signals is hereby controlled and processed by the processing means 9. The specific implementation of the three-dimensional audio rendering in the processing means 9 can be done on the basis of any known or future three-dimensional audio rendering model or system. The Interactive Audio Special Interest Group describes an interactive three-dimensional audio system consisting of essentially three layers, which are the application, the application programming interface and an audio renderer. The application is e.g. a software application, such as a game, a music playback-composition programme and the like. For most implementations, an interactive three-dimensional audio compatible application programming interface is required to do the translation between the application and the audio renderer. The audio renderer can either be hardware or software and must be able to interpret the received events and successfully produce a believable three-dimensional audio output. It is to be noted that the application usually accepts data from the user via a joy stick, a mouse, a key board or any other input device which provides the interactive element by modifying the final positional information of the user. The present invention enhances the system by suggesting the additional use of a detecting means for directly detecting the spatial location of the user in relation to the portable radio communication device 5. Hereby, the camera 15 can be used to detect the spatial location of the user. Alternatively or additionally, the microphone 14 can be used to detect the location of a user on the basis of received audio signals from the user. 
Both the visual and the aural detection of the spatial location of the user can be implemented on the basis of known or future detection technologies. It is to be understood that any other detection technology for detecting the spatial location of a user can be used, such as detection on the basis of infrared light, ultrasonic waves and so forth. Some detection technologies detect the spatial location of the user on the basis of the distance and the angular position of the user in relation to the radio communication device 5, whereas others are able to detect the location of the user directly.
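As an illustration of the aural detection described above, the angular position of a user can be estimated from the arrival-time difference of the user's voice at two microphones. The sketch below is not taken from the patent; the cross-correlation approach, sampling rate and microphone spacing are assumptions chosen for the example:

```python
import numpy as np

def estimate_azimuth(left, right, fs, mic_spacing, c=343.0):
    """Estimate the source azimuth (degrees) from the inter-microphone
    time delay, found as the peak of the cross-correlation."""
    corr = np.correlate(right, left, mode="full")
    # positive lag: right channel lags, i.e. source is toward the left mic
    lag = np.argmax(corr) - (len(left) - 1)
    tau = lag / fs
    # clamp to the physically possible range before taking arcsin
    s = np.clip(tau * c / mic_spacing, -1.0, 1.0)
    return np.degrees(np.arcsin(s))

# Demo: a 1 kHz tone reaching the right microphone 4 samples later.
fs = 48000
t = np.arange(1024) / fs
sig = np.sin(2 * np.pi * 1000 * t)
delay = 4
left = sig
right = np.concatenate([np.zeros(delay), sig[:-delay]])
angle = estimate_azimuth(left, right, fs, mic_spacing=0.15)  # ~11 degrees
```

A real implementation would additionally band-limit and window the signals, but the lag-to-angle geometry is the same.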
  • The processing means 9 dynamically and continuously adjusts the audio-spatial output of audio signals from the loudspeaker 11 and optionally the loudspeakers 12 and 13 depending on the spatial location and position of the user as detected by the detecting means or e.g. the camera 15 and/or the microphone 14. Hereby, the three-dimensional audio rendering can be implemented in one of the known or future technologies. The general and common feature is that the three-dimensional audio system reproduces and outputs a realistic three-dimensional sound field around the user on the basis of a replication of the three-dimensional audio cues which the ears of a user hear in the real world.
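By way of a deliberately simplified example, the dynamic adjustment of a two-loudspeaker output to the detected user direction could be modelled with a constant-power pan law. This is a stand-in for a full three-dimensional renderer, and the angle convention is an assumption for the sketch:

```python
import math

def pan_gains(azimuth_deg):
    """Constant-power pan law: map a detected azimuth in
    [-90 (left), +90 (right)] degrees to left/right loudspeaker gains
    whose squared sum is always 1 (constant perceived power)."""
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2)  # 0 .. pi/2
    return math.cos(theta), math.sin(theta)

# User detected straight ahead: both loudspeakers driven equally.
l, r = pan_gains(0.0)
```

As the detecting means reports a new user position, the gains would simply be recomputed, giving the continuous adjustment described above.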
  • Hereby, the two primary localisation cues are called the interaural intensity difference and the interaural time difference. Further listening cues are e.g. the outer ear effect and so forth. Most three-dimensional audio technologies are at some level based on the concept of head-related transfer functions. A head-related transfer function can be thought of as a set of audio filters, one for each ear, that contain all the listening cues applied to a sound as it travels from its origin (its source or position in space) through the environment and arrives at the listener's eardrums. The filters change depending on the direction from which the sound arrives at the listener. Hereby, not only the motion and position of a user's head but also the absolute position of the user's head in relation to the portable radio communication device 5 is essential for the three-dimensional audio rendering.
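The interaural time difference cue mentioned above is commonly approximated with Woodworth's spherical-head formula for a distant source. The sketch below is illustrative only; the average head radius of 8.75 cm is an assumption, not a value from the patent:

```python
import math

HEAD_RADIUS = 0.0875    # assumed average head radius in metres
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def interaural_time_difference(azimuth_deg):
    """Woodworth's spherical-head approximation of the ITD for a
    far-field source; azimuth 0 is straight ahead, 90 directly to
    one side."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (math.sin(theta) + theta)

itd = interaural_time_difference(90.0)  # roughly 0.65 ms at the side
```

The formula reproduces the well-known result that the maximum ITD between human ears is on the order of two thirds of a millisecond.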
  • The three-dimensional audio rendering as discussed above is implemented in the processing means 9 and/or the memory means 10, either on the basis of hardware, on the basis of software code, or on the basis of a mixture thereof. It is to be understood that, in addition to the detection of the spatial location of a user in relation to the portable radio communication device, additional means for detecting the head movement and head position of the user in relation to the portable radio communication device 5 can be implemented. The detection of the head movement can also be done by the microphone 14 and/or the camera 15 or any other suitable detection means.
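At its core, a software implementation of the HRTF-based rendering described above filters each sound through one head-related impulse response (HRIR) per ear. The two-tap filters below are made-up stand-ins for measured HRTF data, used only to show the mechanism; a real renderer would select measured filters matching the detected direction:

```python
import numpy as np

def render_binaural(mono, hrir_left, hrir_right):
    """Render a mono signal at the position encoded by a pair of
    head-related impulse responses (one FIR filter per ear)."""
    return np.convolve(mono, hrir_left), np.convolve(mono, hrir_right)

mono = np.array([1.0, 0.0, 0.0, 0.0])   # a unit impulse as test signal
# Hypothetical source to the listener's left: louder and earlier at
# the left ear, delayed by one sample and attenuated at the right.
hrir_l = np.array([0.9, 0.1])
hrir_r = np.array([0.0, 0.5])
left, right = render_binaural(mono, hrir_l, hrir_r)
```

Because the filters encode both the level and the time cues, updating them as the detected user position changes yields the dynamic three-dimensional output described above.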
  • It is to be noted that the schematic drawing in Fig. 4 only shows the elements essential for understanding the present invention. In reality, however, the portable radio communication device 5 will comprise further elements which are necessary for its operation, such as input keys, a display and so forth.

Claims (13)

  1. Portable electronic equipment (5), with
    at least one loudspeaker (11) for outputting audio signals in an audio-spatial manner, detecting means (14; 15) for detecting the spatial location of a user in relation to the electronic equipment (5), and
    processing means (9) for dynamically adjusting the audio-spatial output of audio signals from the at least one loudspeaker (11) depending on the detected location of the user.
  2. Portable electronic equipment (5) according to claim 1,
    characterized in,
    that the processing means (9) continuously adjusts the audio-spatial output of the audio signal depending on the detected location of the user.
  3. Portable electronic equipment (5) according to claim 1 or 2,
    characterized in,
    that the detecting means (14; 15) detects the distance and the angular position of a user in relation to the electronic equipment.
  4. Portable electronic equipment (5) according to claim 1, 2 or 3,
    characterized in,
    that the detecting means (14; 15) comprises a camera (15) for visually detecting the location of a user.
  5. Portable electronic equipment (5) according to one of the
    claims 1 to 4,
    characterized in,
    that the detecting means (14; 15) comprises a microphone (14) for detecting the location of a user on the basis of received audio signals.
  6. Portable electronic equipment (5) according to one of
    the claims 1 to 5,
    characterized by
    two loudspeakers (11, 12) for outputting audio signals.
  7. Portable electronic equipment (5) according to one of
    the claims 1 to 6,
    characterized in,
    that the electronic equipment is a portable radio equipment for communication in a wireless communication system.
  8. Method for controlling the audio output from a portable electronic equipment (5), with the steps of
    detecting the spatial location of a user in relation to the electronic equipment, and dynamically adjusting the audio-spatial output of audio signals from at least one loudspeaker of the electronic equipment depending on the detected location of the user.
  9. Method according to claim 8,
    characterized in,
    that in the processing step the audio-spatial output of the audio signal is continuously adjusted depending on the detected location of the user.
  10. Method according to claim 8 or 9,
    characterized in,
    that in the detecting step the distance and the angular position of a user in relation to the electronic equipment are detected.
  11. Method according to claim 8, 9 or 10,
    characterized in,
    that in the detecting step the location of a user is visually detected.
  12. Method according to one of the claims 8 to 11,
    characterized in,
    that in the detecting step the location of a user is detected on the basis of received audio signals.
  13. Computer program product directly loadable into the internal memory of a portable electronic equipment, comprising software code portions for performing the steps of one of the claims 8 to 12 when said product is run on a portable electronic equipment.
EP04016438A 2004-07-13 2004-07-13 Portable electronic equipment with 3D audio rendering Not-in-force EP1617702B1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP04016438A EP1617702B1 (en) 2004-07-13 2004-07-13 Portable electronic equipment with 3D audio rendering
DE602004029021T DE602004029021D1 (en) 2004-07-13 2004-07-13 Portable electronic device with 3D audio playback
AT04016438T ATE480960T1 (en) 2004-07-13 2004-07-13 PORTABLE ELECTRONIC DEVICE WITH 3D AUDIO PLAYBACK

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP04016438A EP1617702B1 (en) 2004-07-13 2004-07-13 Portable electronic equipment with 3D audio rendering

Publications (2)

Publication Number Publication Date
EP1617702A1 true EP1617702A1 (en) 2006-01-18
EP1617702B1 EP1617702B1 (en) 2010-09-08

Family

ID=34925737

Family Applications (1)

Application Number Title Priority Date Filing Date
EP04016438A Not-in-force EP1617702B1 (en) 2004-07-13 2004-07-13 Portable electronic equipment with 3D audio rendering

Country Status (3)

Country Link
EP (1) EP1617702B1 (en)
AT (1) ATE480960T1 (en)
DE (1) DE602004029021D1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2361395A (en) * 2000-04-15 2001-10-17 Central Research Lab Ltd A method of audio signal processing for a loudspeaker located close to an ear
US20030045274A1 (en) 2001-09-05 2003-03-06 Yoshiki Nishitani Mobile communication terminal, sensor unit, musical tone generating system, musical tone generating apparatus, musical tone information providing method, and program
GB2382241A (en) 2001-11-07 2003-05-21 Sendo Int Ltd Controlling the volume of a speaker in response to the proximity to an object


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009106916A1 (en) * 2008-02-28 2009-09-03 Sony Ericsson Mobile Communications Ab Head tracking for enhanced 3d experience using face detection
US20160099009A1 (en) * 2014-10-01 2016-04-07 Samsung Electronics Co., Ltd. Method for reproducing contents and electronic device thereof
KR20160039400A (en) * 2014-10-01 2016-04-11 삼성전자주식회사 Method for reproducing contents and an electronic device thereof
US10148242B2 (en) * 2014-10-01 2018-12-04 Samsung Electronics Co., Ltd Method for reproducing contents and electronic device thereof

Also Published As

Publication number Publication date
ATE480960T1 (en) 2010-09-15
EP1617702B1 (en) 2010-09-08
DE602004029021D1 (en) 2010-10-21

Similar Documents

Publication Publication Date Title
CN110121695B (en) Apparatus in a virtual reality domain and associated methods
EP3629145B1 (en) Method for processing 3d audio effect and related products
EP3588926B1 (en) Apparatuses and associated methods for spatial presentation of audio
US11625222B2 (en) Augmenting control sound with spatial audio cues
US9986362B2 (en) Information processing method and electronic device
TW202014849A (en) User interface for controlling audio zones
EP3550860B1 (en) Rendering of spatial audio content
CN109314834A (en) Improve the perception for mediating target voice in reality
CN111492342B (en) Audio scene processing
US20210092545A1 (en) Audio processing
JP2022083445A (en) Computer system for producing audio content for achieving user-customized being-there and method thereof
CN109327766B (en) 3D sound effect processing method and related product
CN108605195A (en) Intelligent audio is presented
EP1617702B1 (en) Portable electronic equipment with 3D audio rendering
WO2022054900A1 (en) Information processing device, information processing terminal, information processing method, and program
US11099802B2 (en) Virtual reality
US11696085B2 (en) Apparatus, method and computer program for providing notifications
Cohen et al. Cyberspatial audio technology
US10516961B2 (en) Preferential rendering of multi-user free-viewpoint audio for improved coverage of interest
CN113632060A (en) Device, method, computer program or system for indicating audibility of audio content presented in a virtual space
US20230370801A1 (en) Information processing device, information processing terminal, information processing method, and program
CN112689825A (en) Device, method and computer program for realizing remote user access to mediated reality content
Karjalainen et al. Application Scenarios of Wearable and Mobile Augmented Reality Audio
Beinhauer et al. Using Acoustic Landscapes for the Evaluation of Multimodal Mobile Applications

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL HR LT LV MK

17P Request for examination filed

Effective date: 20060620

AKX Designation fees paid

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

17Q First examination report despatched

Effective date: 20090327

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REF Corresponds to:

Ref document number: 602004029021

Country of ref document: DE

Date of ref document: 20101021

Kind code of ref document: P

REG Reference to a national code

Ref country code: NL

Ref legal event code: VDEP

Effective date: 20100908

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20101209

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20110110

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20101219

Ref country code: BE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20110609

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602004029021

Country of ref document: DE

Effective date: 20110609

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110731

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110731

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110713

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110713

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20101208

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20100908

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 13

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 14

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 15

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20180625

Year of fee payment: 15

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20180711

Year of fee payment: 15

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20190713

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190713

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20190731

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20200630

Year of fee payment: 17

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602004029021

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220201