WO2009039636A1 - Interactive sound synthesis - Google Patents

Interactive sound synthesis

Info

Publication number
WO2009039636A1
WO2009039636A1 (PCT/CA2008/001686)
Authority
WO
WIPO (PCT)
Prior art keywords
interaction
properties
elements
sound
game
Prior art date
Application number
PCT/CA2008/001686
Other languages
French (fr)
Inventor
Gunjan Porwal
Original Assignee
Ati Technologies Ulc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ati Technologies Ulc filed Critical Ati Technologies Ulc
Priority to EP08800377A priority Critical patent/EP2225753A4/en
Priority to CN200880113777A priority patent/CN101842830A/en
Priority to JP2010526117A priority patent/JP2010540989A/en
Publication of WO2009039636A1 publication Critical patent/WO2009039636A1/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/54 Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
    • A63F13/10
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45 Controlling the progress of the video game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/6063 Methods for processing data by generating or executing the game program for sound processing
    • A63F2300/6081 Methods for processing data by generating or executing the game program for sound processing generating an output signal, e.g. under timing constraints, for spatialization
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/64 Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A63F2300/643 Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car by determining the impact between objects, e.g. collision detection

Definitions

  • the pre-recorded sounds of various interactions are stored in the game memory along with various game related computer instructions.
  • the pre-recorded sounds are fetched from their storage location and are played back on occurrence of an interaction between game elements within the gaming environment. These pre-recorded sounds can be played back for the duration of the interaction, which again may be dictated by the game related computer instructions.
  • one or more interactions between game elements within a game environment are identified. Properties associated with each of the game elements are determined, and parameters of the interactions are calculated. Based on these parameters and the properties of the elements, sound is produced using stored sound samples.
  • FIG. 1 illustrates an exemplary system for synthesis of sound in a gaming system.
  • FIG. 2 illustrates a computing-based device for synthesizing sound.
  • FIG. 3 illustrates an exemplary method for determining sound in a gaming environment.
  • a system includes one or more agents for generating sound for all interactions that happen in the simulated environment based on the extent of interactions between one or more elements.
  • elements include, but are not limited to, trees, houses, tanks, people, animals etc.
  • Elements used in the simulation are assigned a property. For instance, elements like people would have the property of flesh and tanks would have the property of metal. These properties can also contain subclasses; for example, the sub-properties of metal can be tin, bronze, steel, etc.
  • These sounds can be generated by calculating the extent of an interaction and by determining the properties of the elements involved in the interaction.
  • modules determine these interactions and the elements involved in these interactions.
  • the interaction between any two elements is defined and a unit of sound is created based on this interaction or collision.
  • the interactions also have various parameters assigned to them. Based on these parameters and the properties of the elements, calculations can be carried out. Subsequently, an algorithm can be applied to generate the sound. For each instance, the parameters associated with the interaction are calculated. Thus, through continuous interactions, audio can be generated.
  • Fig. 1 illustrates an exemplary system 100 for synthesizing sound in a simulated environment such as a game environment, an architectural walk-through environment, or the like. In the embodiment described below, system 100 is presented as a gaming system.
  • System 100 includes a gaming device 102.
  • the gaming device 102 can be a computing-based device that is instrumented to perform one or more functions in response to the execution of one or more computer instructions.
  • the system 100 includes a display area 104 and one or more game controllers 106(l)-(n) that are connected to the gaming device 102.
  • the display area 104 is utilized for displaying output data, generated as a result of the processing performed by one or more processing modules in the gaming device 102.
  • the game controllers 106(l)-(n) allow users to control one or more features or functionalities that are associated with the gaming device 102.
  • the game controller 106(1) can allow a user to control game interactions during game play and perform other game related operations such as pause, end game and so on. It will be appreciated by a skilled person that other functionalities could be associated with the game controllers 106(l)-(n).
  • the system 100 also includes one or more sound producing devices 108(1)-(n), e.g., audio speakers, connected to the gaming device 102.
  • One or more modules of the gaming device 102 generate synthesized sound that is output via sound producing devices 108(l)-(n).
  • sound producing devices 108(l)-(n) can output sound in response to the execution of one or more computer instructions.
  • synthesizing sound during game play includes determining one or more interactions that occur in the gaming environment and generating sound based on the interactions.
  • the gaming device 102 includes interaction module 110 and synthesizing module 112.
  • Interaction module 110 determines one or more interactions between game elements during game play.
  • the game elements interact with other game elements in the game environment.
  • game elements can collide with one another, say rocks hitting a car.
  • Each of the interactions is evaluated based on the degree of interaction between the game elements to generate suitable sound.
  • the degree of interaction between a small rock and an object and between a large rock and the same object will differ.
  • Other physical parameters such as relative velocity of the game elements represented as objects, physical structure of the objects, material properties, transmission medium for the generated sound (e.g., water, air, etc.) and others can also be used as factors in generating an appropriate sound for the game element interactions.
  • the manner in which each of the interactions between one or more game elements is determined is based on properties that are associated with the interacting game elements.
  • the synthesizing module 112, on determining the interactions, generates sound in accordance with the properties associated with the interactions.
  • the working of the interaction module 110 and synthesizing module 112 is further described in detail in conjunction with Fig. 2.
  • Fig. 2 illustrates an exemplary gaming device 102 for generating sound in a gaming environment. The sound is generated based on interactions between gaming elements in the gaming environment.
  • the gaming device 102 includes one or more processors 202, along with one or more I/O interfaces 204 and one or more memories 206.
  • Processor(s) 202 can be a single processing unit or a number of units, all of which could include multiple computing units.
  • Memory 206 can include, for example, volatile memory (e.g., RAM) and non-volatile memory (e.g., ROM, EPROM, etc.).
  • the program instructions are stored in the memory 206 and are executed by the processor(s) 202.
  • I/O interfaces 204 provide input-output capabilities for the gaming device 102.
  • the I/O interfaces 204 can include one or more ports for connecting a number of game controllers 106(l)-(n) as input devices and sound producing devices 108(l)-(n) as one of the output devices, and so on.
  • the gaming device 102 receives input data from a user via the game controllers 106(l)-(n) connected through I/O interfaces 204.
  • Memory 206 includes program modules 208 and program data 210.
  • the program module(s) 208 includes interaction module 110, synthesizing module 112 and other program module(s) 212.
  • Program data 210 includes sound(s) data 214, game data 216 and other program data 218.
  • Other program data 218 stores various data that may be generated or required during the functioning of the gaming device 102.
  • Interaction module 110 detects interactions that occur between game elements during game play in a gaming environment. Typically, elements within a simulated/virtual environment are made up of polygons that are modeled and stored in the game data 216. The coordinates of its vertices define a polygon's position. The interaction module 110 detects an interaction whenever polygons of different game elements intersect or overlap. In one implementation, the interaction module 110 detects an interaction when the polygons constituting the game elements are present within a certain proximity of each other.
  • In a two-dimensional (2D) environment, interaction detection is straightforward: an interaction occurs whenever two or more polygons intersect. In three-dimensional (3D) environments, there are various methods to detect interactions. In one implementation, each element is approximated with a sphere, and the interaction module 110 checks whether the spheres intersect. This method is typically used because it is computationally inexpensive: the distance between the centers of the two spheres is checked, and if this distance is less than the sum of the two radii, an interaction occurs. In another implementation, the spheres can be broken down into smaller spheres, which are again checked for intersection. Other collision detection methods known in the art can also be used.
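The bounding-sphere test described above can be sketched in a few lines of Python. This is a minimal illustration; the function and variable names are my own, not from the patent:

```python
import math

def spheres_intersect(c1, r1, c2, r2):
    """Two bounding spheres intersect when the distance between
    their centers is less than the sum of their radii."""
    return math.dist(c1, c2) < r1 + r2

# A small rock (radius 0.5) near a car approximated by a sphere of radius 2.0:
print(spheres_intersect((0, 0, 0), 2.0, (2.2, 0, 0), 0.5))  # True, since 2.2 < 2.5
```

For finer results, each coarse sphere can be subdivided into smaller spheres and the same test repeated, as the text notes.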
  • the interaction module 110 determines the game elements involved in the interaction. For example, the interaction module 110 can identify a locomotive running on tracks as an interaction between the locomotive, the air, and the tracks. The unit sound associated with an interaction can be fetched from sound(s) data 214 after the properties and sub-properties that are associated with the game elements are identified. The interaction module 110 retrieves the information about the properties and sub-properties associated with the game elements from the game data 216. For example, a game element like a locomotive has properties that are indicative of the material it is made of, say metal. The sub-property may relate to the type of metal used, say steel, iron, lead, and so on. Similarly, the properties and sub-properties of the other game elements involved, say the railway tracks, can also be retrieved.
  • the game designer creates game elements while designing the game.
  • the designer also decides the properties and sub-properties of the game elements during this initial designing process and stores this information in game data 216.
  • the interaction module 110 retrieves this information about the game elements from the game data 216.
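The designer-assigned property/sub-property data described above could be represented as a simple lookup table. The sketch below is purely illustrative; the entries and names are hypothetical, not taken from actual game data:

```python
# Hypothetical game_data entries assigning a property and sub-property
# to each game element at design time, as the text describes.
game_data = {
    "locomotive":    {"property": "metal", "sub_property": "steel"},
    "railway_track": {"property": "metal", "sub_property": "iron"},
    "soldier":       {"property": "flesh", "sub_property": None},
}

def element_properties(name):
    """Retrieve the (property, sub-property) pair for a game element."""
    entry = game_data[name]
    return entry["property"], entry["sub_property"]

print(element_properties("locomotive"))  # ('metal', 'steel')
```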
  • Interaction module 110 further determines the parameters associated with each interaction. Examples of such parameters include, but are not limited to, the force of the interaction, the time period of the interaction, the area of the game elements involved in the interaction, and the yielding property of the game elements involved in the interaction. For example, when a rock collides with a car, interaction module 110 calculates the force with which the rock collides with the car, the area of the car affected by the collision, the time period of the interaction, and so on. In addition, the yielding properties of the car and rock can be determined, for example the extent of the deformation induced upon the car, the effect on the rock, and so on. A unit sound associated with an interaction is modulated (e.g., adapted) based on these calculations. For example, a pre-stored unit sound corresponding to the interaction between the rock and the car is modulated to produce the appropriate sound.
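The parameter-driven modulation of a unit sound might be sketched as follows. This is a toy illustration: nearest-sample stretching and linear gain are my own simplifying assumptions, and the patent does not specify a modulation algorithm:

```python
def modulate_unit_sound(unit_sound, force, duration,
                        base_force=1.0, base_duration=1.0):
    """Scale amplitude by interaction force and stretch (or loop) the
    unit sound so it covers the interaction's duration."""
    gain = force / base_force
    n = max(1, round(len(unit_sound) * duration / base_duration))
    return [unit_sound[int(i * len(unit_sound) / n) % len(unit_sound)] * gain
            for i in range(n)]

rock_on_car = [0.0, 0.5, -0.5, 0.1]   # hypothetical stored unit sound samples
loud_long = modulate_unit_sound(rock_on_car, force=2.0, duration=2.0)
print(len(loud_long))  # 8: twice the base duration in samples
```

A heavier rock (larger force) would thus produce a louder rendering of the same stored unit sound, and a longer scrape a longer one.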
  • the unit sound is stored for the interactions between all the game elements in a game environment, during creation of the game. For example, the unit sound for an interaction between a huge block of ice and water is generated and stored for a unit of ice interacting with a unit of water with unit force.
  • the unit sound associated with pre-defined interactions is stored in the sound(s) data 214.
  • unit sounds are generated by system 100 based on the physical properties of the interacting objects (e.g., density, stiffness, physical structure, material properties, etc.). The unit sounds generated in this manner (whether in real time or non-real time) can then be stored for later reference in sound(s) data 214.
  • the synthesizing module 112 modulates sounds corresponding to the interactions between the various game elements, occurring in a game environment. Modulation in this context implies adapting or processing the unit sound in accordance with the parameters of the interaction. One or more unit sounds can be retrieved from the sound(s) data 214 for each interaction that occurs in the game environment during game play.
  • the synthesizing module 112 processes the unit sounds, based on the parameters and/or properties associated with an interaction and/or the game elements involved in the interaction respectively. For example, the synthesizing module 112 modulates a unit sound associated with a collision between a car and a rock based on parameters such as size of the car and the rock, and the intensity of collision, and so on.
  • the synthesizing module 112 further processes the unit sound based on other factors associated with the interaction, which include, but are not limited to, the yielding property of the elements and the area of interaction.
  • the interaction module 110 determines whether the interactions between elements in a simulated environment are pre-defined or not. For new interactions, no unit sound exists in the sound(s) data 214. In such a case, the interaction module 110 can identify one or more pre-defined interactions that are similar to the new interaction. The identification is based on a comparison of the properties and sub-properties associated with the game elements involved in the new interaction with the properties and sub-properties of the pre-defined interactions. Once a similar pre-defined interaction is determined, the new interaction can be classified as a pre-defined interaction and accordingly a sound can be generated.
  • the interaction module 110 analyzes the properties and sub-properties of the game elements, namely the lead and the glass object.
  • the interaction module 110 identifies one or more interactions that are similar to this interaction based on the properties and sub-properties of game elements for the predefined interactions.
  • the unit sound associated with that pre-defined interaction is processed and appropriate sound is generated.
  • the sound that has to be generated is processed in conformance with the parameters of the interaction between the lead and the glass object.
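The similarity matching described above can be illustrated by counting shared (property, sub-property) pairs between the new interaction and each pre-defined one. The names and the overlap heuristic are assumptions for illustration only:

```python
def find_similar_interaction(new_props, predefined):
    """Pick the predefined interaction whose element properties share
    the most (property, sub-property) pairs with the new interaction."""
    def overlap(a, b):
        return len(set(a) & set(b))
    best = max(predefined, key=lambda entry: overlap(entry["props"], new_props))
    return best["name"] if overlap(best["props"], new_props) > 0 else None

predefined = [
    {"name": "iron_vs_water", "props": {("metal", "iron"), ("liquid", "water")}},
    {"name": "wood_vs_stone", "props": {("wood", "oak"), ("stone", "granite")}},
]
# New, undefined interaction: an aluminum object hitting water.
new = {("metal", "aluminum"), ("liquid", "water")}
print(find_similar_interaction(new, predefined))  # iron_vs_water
```

The matched interaction's unit sound would then be modulated with the new interaction's parameters, as the surrounding text describes.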
  • An exemplary method for synthesizing game related sound using natural physical laws is described with reference to Figs. 1 and 2. These methods are described in the general context of instructions that can be executed on a computing device. Generally, such instructions include routines, programs, objects, components, data structures, procedures, modules, functions, and the like that perform particular functions or implement particular abstract data types.
  • Fig. 3 illustrates an exemplary method 300 for synthesizing game related sound based on interactions between game elements in a game environment.
  • the exemplary method 300 further illustrates the process of synthesizing sounds based on the interacting game elements and their associated parameters to facilitate a realistic gaming experience for users.
  • the order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method, or an alternate method. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein.
  • the method can be implemented in any suitable hardware, software, firmware, or combination thereof, without departing from the scope of the invention.
  • interaction module 110 detects and identifies the interactions that occur between one or more game elements during game play. For example, the interaction module 110 determines the interactions between game elements like an aircraft crashing into the sea and represents them as a set of interactions between the aircraft and the sea and/or the aircraft and the air.
  • the interaction module 110 detects an interaction based on polygons intersecting or overlapping. In one implementation, the interaction module 110 detects an interaction when the polygons constituting the game elements are present within a certain proximity of each other. Many methods can be employed for interaction detection in two-dimensional and three-dimensional simulated environments, each having its own advantages and disadvantages. These interaction detection methods are known in the art.
  • the interaction module 110 identifies the game elements that are involved in the interactions.
  • the relevant game elements can be the aircraft, the sea, and the air.
  • the properties and sub-properties associated with each game element are retrieved.
  • the relevant properties and sub-properties for the interacting game elements are acquired from the game data 216.
  • the interaction module 110 extracts the properties and sub-properties associated with the game elements from game data 216.
  • the aircraft can have the property metal associated with it.
  • the interaction module 110 will calculate the force with which the aircraft crashes into the sea, the area of the sea that was involved in the interaction and the time period of the crash to name a few. These parameters are used to modulate the unit sound fetched from sound(s) data 214, to generate a particular sound, which is applicable to this interaction.
  • the process can be allowed to proceed, and the unit sound for that particular interaction can be retrieved from sound(s) data 214 (block 314).
  • the process can be allowed to proceed, and a check is performed for a similar interaction based on the properties and sub-properties of the game elements involved in the interactions (block 312). For example, if the interaction between an aluminum object and water is not predefined in the game data 216, the process will not proceed to block 314 but will instead proceed to block 312.
  • an appropriate predefined interaction between similar elements can be identified and the associated unit sound can be assigned to the undefined interaction. Similarity can be determined by analyzing the properties and sub-properties of the interacting game elements in the predefined and undefined interactions. For example, the unit sound associated with a predefined interaction between an iron object and water can be assigned to the undefined interaction of the previous example, between an aluminum object and water.
  • the interaction module 110 determines the similarity between a new and pre-defined interaction based on their associated properties and sub-properties. On making such a determination, the interaction module 110 classifies the new interaction as a pre-defined interaction.
  • synthesizing module 112 retrieves the unit sound associated with the predefined interaction.
  • the synthesizing module 112 can retrieve the unit sound associated with a predefined interaction between elements like iron and water.
  • the unit sound is retrieved from sound(s) data 214.
  • the synthesizing module 112 processes the unit sounds based on the parameters associated with an interaction. For example, the sounds for an aircraft crashing into the sea can be processed based on the parameters associated with this crash like the force of the crash, the area of the region affected by the crash and the volume of water dispersed because of the crash to name a few. In one implementation, the synthesizing module 112 retrieves the parameters associated with the interaction along with the unit sound associated with the interaction. In one implementation, the parameters can be retrieved from the game data 216 and the unit sound can be retrieved from sound(s) data 214.
  • Modulation in this context implies adapting or processing the unit sound in accordance with the parameters of the interaction.
  • the unit sound associated with a collision between game elements like two cars can be extended for the duration of the collision, the unit sound can be amplified in accordance with the nature of the crash, and so on.
  • the modulated sound is the resultant sound generated for an interaction by utilizing the unit sound and the parameters associated with the interaction.
  • the modulated sounds generated at the previous block (316) are sent to the sound producing devices 108(1)-(n).
  • the modulated sound associated with each interaction is combined and sent to the sound producing devices 108(l)-(n).
  • the combination of different sounds may involve mixing and filtering of sounds to produce realistic sounds during game play.
  • other module(s) 212 mix and filter the modulated sound. For example, in a game environment replicating combat operations, various interactions such as gunshots, movement of players and tanks can be occurring at an instance of time. Sounds for these interactions can be generated and mixed to produce realistic sounds at an instance of time during game play.
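The mixing step described above can be illustrated with a simple sum-and-normalize sketch. This is one of many possible mixing strategies; the patent does not prescribe one:

```python
def mix(sounds):
    """Sum the sample streams per index, then normalize so the
    combined signal never clips outside [-1, 1]."""
    n = max(len(s) for s in sounds)
    mixed = [sum(s[i] for s in sounds if i < len(s)) for i in range(n)]
    peak = max(1.0, max(abs(x) for x in mixed))
    return [x / peak for x in mixed]

# Hypothetical modulated sounds occurring at the same instant of game play:
gunshot = [0.9, -0.8, 0.3]
tank_engine = [0.4, 0.4, 0.4, 0.4]
combined = mix([gunshot, tank_engine])
print(max(abs(x) for x in combined) <= 1.0)  # True
```

A real engine would also apply filtering (e.g., distance attenuation) before output, as the bullet above suggests.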

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The subject matter relates to a method for synthesizing game related sound using the laws of physics. In one implementation, one or more interactions between game elements within a game environment are identified. Properties associated with each of the game elements are determined, and parameters of the interactions are calculated. Based on these parameters and the properties of the elements, stored sound samples are used to produce appropriate sound.

Description

INTERACTIVE SOUND SYNTHESIS
BACKGROUND
[0001] In recent years, sound synthesis for games has been developing due to the increasing capabilities of modern computer hardware, particularly processors and graphics accelerators. Most computer games include a number of objects or game elements like trees, houses, rocks, rivers, cars, etc. When these game elements or objects interact with one another in the gaming environment, a sound is produced: for example, the sound generated when a rock is hurled at a car, or the sound generated during a landslide. Greater realism in these sounds creates more interest for the user playing the game. Typically, the background sounds for games are pre-recorded in studios. The pre-recorded sounds can be either recordings of actual natural sounds or sounds synthesized through electronic means. The pre-recorded sounds of various interactions are stored in the game memory along with various game related computer instructions. The pre-recorded sounds are fetched from their storage location and are played back on occurrence of an interaction between game elements within the gaming environment. These pre-recorded sounds can be played back for the duration of the interaction, which again may be dictated by the game related computer instructions.
SUMMARY
[0002] This summary is provided to introduce concepts relating to synthesizing sound using physics. These concepts are further described below in the detailed description. The presented summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining the scope of the claimed subject matter.
[0003] In one implementation, one or more interactions between game elements within a game environment are identified. Properties associated with each of the game elements are determined, and parameters of the interactions are calculated. Based on these parameters and the properties of the elements, sound is produced using stored sound samples.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and components.
[0005] Fig. 1 illustrates an exemplary system for synthesis of sound in a gaming system.
[0006] Fig. 2 illustrates a computing-based device for synthesizing sound.
[0007] Fig. 3 illustrates an exemplary method for determining sound in a gaming environment.
DETAILED DESCRIPTION
[0008] Described herein are systems and methods for synthesizing game related sound based on natural physical laws. Typically, in present simulated environments and programs, these natural physical laws dictate the interactions between one or more elements. For example, the trajectory of a projectile is determined based on the physical laws. Based on the interactions that occur within the simulated environment, a sound is generated.
[0009] Conventional simulated systems generate sounds for separate interactions by playing back pre-recorded audio on occurrence of the interactions. For example, when a bullet is fired, an associated pre-recorded sound is played back, thus creating a realistic environment. As the visual aspects associated with the simulations become more intricate, it may become difficult to store the large number of audio files required for the intricate interactions that occur.
[0010] To this end, a system includes one or more agents for generating sound for all interactions that happen in the simulated environment based on the extent of interactions between one or more elements. Examples of such elements include, but are not limited to, trees, houses, tanks, people, animals etc. Elements used in the simulation are assigned a property. For instance, elements like people would have the property of flesh and tanks would have the property of metal. Now these properties can also contain subclasses, for example the sub-properties of metal can be tin, bronze, steel etc. These sounds can be generated by calculating the extent of an interaction and by determining the properties of the elements involved in the interaction.
[0011] Whenever an interaction occurs between elements, modules determine these interactions and the elements involved in these interactions. The interaction between any two elements is defined and a unit of sound is created based on this interaction or collision. The interactions also have various parameters assigned to them. Based on these parameters and properties of the elements calculations can be carried out. Subsequently an algorithm can be applied to generate the sound. For each instance, the parameters associated with the interaction are calculated. Thus through continuous interactions audio can be generated.
[0012] Generation of sound in the simulated environment circumvents the need to store and play back pre-recorded sounds. Consequently, separate audio files for different actions being implemented need not be stored.
[0013] In one implementation, previous interactions and their associated calculations are used for determining newly occurring interactions. In such cases, the newly occurring interactions are approximated as similar to previously occurring interactions, based on the parameters of the newly and previously occurring interactions. For such interactions, an audio file corresponding to the unit audio associated with the previously occurring interaction is processed and played back.
[0014] The technology described herein may be used in many different operating environments and systems. Multiple and varied implementations are described below. An exemplary environment that is suitable for practicing various implementations is discussed in the following sections.
[0015] Fig. 1 illustrates an exemplary system 100 for synthesizing sound in a simulated environment such as a game environment, an architectural walk-through environment, or the like. In the embodiment described below, system 100 is presented as a gaming system. System 100 includes a gaming device 102. In one implementation, the gaming device 102 can be a computing-based device that is instrumented to perform one or more functions in response to the execution of one or more computer instructions. The system 100 includes a display area 104 and one or more game controllers 106(1)-(n) that are connected to the gaming device 102. The display area 104 is utilized for displaying output data generated as a result of the processing performed by one or more processing modules in the gaming device 102.
[0016] The game controllers 106(1)-(n) allow users to control one or more features or functionalities associated with the gaming device 102. For example, the game controller 106(1) can allow a user to control game interactions during game play and perform other game-related operations such as pause, end game, and so on. It will be appreciated by a skilled person that other functionalities could be associated with the game controllers 106(1)-(n).
[0017] The system 100 also includes one or more sound producing devices 108(1)-(n) (e.g., audio speakers) connected to the gaming device 102. One or more modules of the gaming device 102 generate synthesized sound that is output via the sound producing devices 108(1)-(n). In one embodiment, the sound producing devices 108(1)-(n) can output sound in response to the execution of one or more computer instructions.
[0018] As indicated previously, synthesizing sound during game play includes determining one or more interactions that occur in the gaming environment and generating sound based on the interactions. To this end, the gaming device 102 includes interaction module 110 and synthesizing module 112.
[0019] Interaction module 110 determines one or more interactions between game elements during game play. The game elements interact with other game elements in the game environment; for example, game elements can collide with one another, say rocks hitting a car. Each of the interactions is evaluated based on the degree of interaction between the game elements to generate suitable sound. For example, the degree of interaction between a small rock and an object will differ from that between a large rock and the same object. Other physical parameters, such as the relative velocity of the game elements represented as objects, the physical structure of the objects, material properties, the transmission medium for the generated sound (e.g., water, air, etc.), and others, can also be used as factors in generating an appropriate sound for the game element interactions. In addition, the manner in which each of the interactions between one or more game elements is determined is based on properties that are associated with the interacting game elements.
[0020] Upon determining the interactions, the synthesizing module 112 generates sound in accordance with the properties associated with the interactions. The operation of the interaction module 110 and the synthesizing module 112 is described in further detail in conjunction with Fig. 2.
[0021] Fig. 2 illustrates an exemplary gaming device 102 for generating sound in a gaming environment. The sound is generated based on interactions between gaming elements in the gaming environment. The gaming device 102 includes one or more processors 202, along with one or more I/O interfaces 204 and one or more memories 206. Processor(s) 202 can be a single processing unit or a number of units, all of which could include multiple computing units. Memory 206 can include, for example, volatile memory (e.g., RAM) and non-volatile memory (e.g., ROM, EPROM, etc.). The program instructions are stored in the memory 206 and executed by the processor(s) 202.
[0022] I/O interfaces 204 provide input-output capabilities for the gaming device 102. The I/O interfaces 204 can include one or more ports for connecting a number of game controllers 106(1)-(n) as input devices, sound producing devices 108(1)-(n) as output devices, and so on. In one implementation, the gaming device 102 receives input data from a user via the game controllers 106(1)-(n) connected through the I/O interfaces 204.
[0023] Memory 206 includes program modules 208 and program data 210. The program module(s) 208 includes interaction module 110, synthesizing module 112 and other program module(s) 212. Program data 210 includes sound(s) data 214, game data 216 and other program data 218. Other program data 218 stores various data that may be generated or required during the functioning of the gaming device 102.
[0024] Interaction module 110 detects interactions that occur between game elements during game play in a gaming environment. Typically, elements within a simulated/virtual environment are made up of polygons that are modified and stored in the game data 216. A polygon's position is defined by the coordinates of its vertices. The interaction module 110 detects an interaction whenever polygons of different game elements intersect or overlap. In one implementation, the interaction module 110 detects an interaction when the polygons constituting the game elements are present within a certain proximity of each other.
[0025] In a two-dimensional (2D) environment, interaction detection is straightforward: an interaction occurs whenever two or more polygons intersect. In three-dimensional (3D) environments, there are various methods to detect interactions. In one implementation, each element is approximated with a sphere, and the interaction module 110 then checks whether the spheres intersect each other. This method is typically used because it is computationally inexpensive. The distance between the centers of the two spheres is checked; if this distance is less than the sum of the two radii, an interaction occurs. In another implementation, the spheres can be broken down into smaller spheres, which are again checked for intersection. Other collision detection methods known in the art can also be used.
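The sphere test described above can be sketched in a few lines of Python. The function name and the tuple representation of centers are illustrative assumptions, not part of the application:

```python
import math

def spheres_intersect(center_a, radius_a, center_b, radius_b):
    # Broad-phase test: two bounding spheres overlap when the distance
    # between their centers is less than the sum of their radii.
    return math.dist(center_a, center_b) < radius_a + radius_b
```

Because the check is a single distance comparison per pair of elements, it stays cheap even with many elements, which is why the description calls it computationally inexpensive.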
[0026] Once the interaction module 110 detects an interaction, it determines the game elements involved in the interaction. For example, the interaction module 110 can identify a locomotive running on tracks as an interaction between the locomotive, the air, and the tracks. The unit sound associated with an interaction can be fetched from sound(s) data 214 after the properties and sub-properties associated with the game elements are identified. The interaction module 110 retrieves the information about the properties and sub-properties associated with the game elements from the game data 216. For example, a game element like a locomotive has properties that are indicative of the material it is made of, say metal. The sub-property may relate to the specific type of metal used, say steel, iron, lead, and so on. Similarly, the properties and sub-properties of the other game elements involved, say the railway tracks, can also be retrieved.
[0027] The game designer creates game elements while designing the game. The designer also decides the properties and sub-properties of the game elements during this initial designing process and stores this information in game data 216. During game play, when an interaction occurs, the interaction module 110 retrieves this information about the game elements from the game data 216.
[0028] Interaction module 110 further determines the parameters associated with each interaction. Examples of such parameters include, but are not limited to, the force of the interaction, the time period of the interaction, the area of the game elements involved in the interaction, and the yielding property of the game elements involved in the interaction. For example, when a rock collides with a car, interaction module 110 calculates the force with which the rock collides with the car, the area of the car affected by the collision, the time period of the interaction, and so on. In addition, the yielding properties of the car and the rock can be determined, for example the extent of the deformation induced upon the car, the effect on the rock, and so on. A unit sound associated with an interaction is modulated (e.g., adapted) based on these calculations. For example, a pre-stored unit sound corresponding to the interaction between the rock and the car is modulated to produce an appropriate sound.
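The application does not fix a formula for these parameters; one plausible sketch derives the collision force from the impulse-momentum relation. The function name, the use of reduced mass, and the returned dictionary layout are all illustrative assumptions:

```python
def interaction_parameters(mass_a, mass_b, rel_velocity, contact_time, contact_area):
    # Approximate the collision force from the impulse: F = m_eff * dv / dt,
    # using the reduced mass of the two colliding bodies as m_eff.
    reduced_mass = (mass_a * mass_b) / (mass_a + mass_b)
    force = reduced_mass * rel_velocity / contact_time
    return {"force": force, "duration": contact_time, "area": contact_area}
```

A shorter contact time or higher relative velocity yields a larger force, matching the intuition that sharper impacts should produce louder sound.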
[0029] The unit sound is stored for the interactions between all the game elements in a game environment during creation of the game. For example, the unit sound for an interaction between a huge block of ice and water is generated and stored for a unit of ice interacting with a unit of water with unit force. In one implementation, the unit sound associated with pre-defined interactions is stored in the sound(s) data 214. In an alternative embodiment, unit sounds are generated by system 100 based on the physical properties of the interacting objects (e.g., density, stiffness, physical structure, material properties, etc.). The unit sounds generated in this manner (which may be in real time or non-real time) can then be stored for later reference in sound(s) data 214.
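The application leaves the synthesis model open, so the following is only one plausible way to generate a unit sound from physical properties: a single damped mode whose pitch rises with stiffness and falls with density. The function name, the damping constant, and the one-mode simplification are assumptions:

```python
import math

def unit_sound(stiffness, density, damping=6.0, sample_rate=8000, duration=0.25):
    # Single modal oscillator: frequency scales with sqrt(stiffness / density),
    # amplitude decays exponentially over the note's duration.
    freq_hz = math.sqrt(stiffness / density)
    n = int(sample_rate * duration)
    return [
        math.exp(-damping * i / sample_rate) * math.sin(2 * math.pi * freq_hz * i / sample_rate)
        for i in range(n)
    ]
```

A stiff, light material (high stiffness, low density) rings at a higher pitch than a soft, dense one, which is the qualitative behavior one would want stored in sound(s) data 214.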
[0030] The synthesizing module 112 modulates sounds corresponding to the interactions between the various game elements occurring in a game environment. Modulation in this context implies adapting or processing the unit sound in accordance with the parameters of the interaction. One or more unit sounds can be retrieved from the sound(s) data 214 for each interaction that occurs in the game environment during game play. The synthesizing module 112 processes the unit sounds based on the parameters and/or properties associated with an interaction and/or the game elements involved in it, respectively. For example, the synthesizing module 112 modulates a unit sound associated with a collision between a car and a rock based on parameters such as the sizes of the car and the rock, the intensity of the collision, and so on. If, in this example, the force of the collision is very large, the unit sound is amplified to a greater extent; if the collision occurs over an extended period, the unit sound is played for the duration of the collision; and so on. The synthesizing module 112 further processes the unit sound based on other factors associated with the interaction, which include but are not limited to the yielding property of the elements and the area of interaction.
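A minimal sketch of that modulation step, assuming samples in [-1, 1], a linear force-to-gain mapping, and tiling of the unit sound to cover the interaction's duration (all of which are illustrative choices, not taken from the application):

```python
def modulate(unit, force, duration, sample_rate=8000, unit_force=1.0):
    # Scale amplitude in proportion to the interaction force and repeat the
    # unit sound so playback covers the full duration of the interaction.
    gain = force / unit_force
    needed = round(duration * sample_rate)
    return [gain * unit[i % len(unit)] for i in range(needed)]
```

Doubling the force doubles the amplitude, and a longer collision simply loops the unit sound, which matches the amplify-and-extend behavior described above.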
[0031] In one embodiment, the interaction module 110 determines whether the interactions between elements in a simulated environment are pre-defined. For new interactions, no unit sound exists in the sound(s) data 214. In such a case, the interaction module 110 can identify one or more pre-defined interactions that are similar to the new interaction. The identification is based on a comparison of the properties and sub-properties associated with the game elements involved in the new interaction with the properties and sub-properties of the pre-defined interactions. Once a similar pre-defined interaction is determined, the new interaction can be classified as a pre-defined interaction and a sound can be generated accordingly.
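The comparison could be as simple as counting shared properties and sub-properties. The data layout below, sets of property strings per predefined interaction, is a hypothetical illustration of what might be stored in game data:

```python
def find_similar(new_interaction, predefined):
    # Score each predefined interaction by how many element properties and
    # sub-properties it shares with the new one; return the best match,
    # or None when nothing overlaps at all.
    def score(entry):
        return (len(new_interaction["properties"] & entry["properties"])
                + len(new_interaction["sub_properties"] & entry["sub_properties"]))
    best = max(predefined, key=score)
    return best if score(best) > 0 else None
```

With this scheme, a new lead-on-glass interaction would match a predefined iron-on-glass entry because both share the property "metal" and the element "glass".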
[0032] For example, if during game play a new type of interaction occurs, such as an interaction between a lead object and a glass object, then the interaction module 110 analyzes the properties and sub-properties of the game elements, namely the lead object and the glass object. The interaction module 110 identifies one or more interactions that are similar to this interaction based on the properties and sub-properties of the game elements for the predefined interactions. On finding a similar interaction, say between an iron object and a glass object, the unit sound associated with that pre-defined interaction is processed and an appropriate sound is generated. The sound to be generated is processed in conformance with the parameters of the interaction between the lead object and the glass object.

An exemplary method for synthesizing game-related sound using natural physical laws is described with reference to Figs. 1 and 2. These methods are described in the general context of instructions that can be executed on a computing device. Generally, such instructions include routines, programs, objects, components, data structures, procedures, modules, functions, and the like that perform particular functions or implement particular abstract data types.
[0033] Fig. 3 illustrates an exemplary method 300 for synthesizing game related sound based on interactions between game elements in a game environment. The exemplary method 300 further illustrates the process of synthesizing sounds based on the interacting game elements and their associated parameters to facilitate a realistic gaming experience for users. The order in which the method is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method, or an alternate method. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof, without departing from the scope of the invention.
[0034] At block 302, interactions occurring at any instant between one or more game elements within a game environment are detected and identified. In one implementation, interaction module 110 detects and identifies the interactions that occur between one or more game elements during game play. For example, the interaction module 110 determines interactions between game elements, such as an aircraft crashing into the sea, and represents them as a set of interactions between the aircraft and the sea and/or the aircraft and the air.
[0035] The interaction module 110 detects an interaction based on polygons intersecting or overlapping. In one implementation, the interaction module 110 detects an interaction when the polygons constituting the game elements are present within a certain proximity of each other. Many methods can be employed for interaction detection in two-dimensional and three-dimensional simulated environments, each having its own advantages and disadvantages. These interaction detection methods are known in the art.
[0036] At block 304, the game elements involved in the interaction are determined. In one implementation, the interaction module 110 identifies the game elements that are involved in the interactions. For example, in the above-mentioned example involving the aircraft and the sea, the relevant game elements can be the aircraft, the sea, and the air.
[0037] At block 306, the properties and sub-properties associated with each game element are retrieved. The relevant properties and sub-properties for the interacting game elements are acquired from the game data 216. In one implementation, the interaction module 110 extracts the properties and sub-properties associated with the game elements from game data 216. For example, the aircraft can have the property metal associated with it and can further be associated with a sub-property, say aluminum; the sea can have the property water; the air can have the property gas; and so on.
[0038] At block 308, parameters associated with the interaction are calculated. These calculations will affect the sound that is synthesized. For example, during game play, when an interaction occurs between the aircraft and the sea of the previous example, the interaction module 110 will calculate the force with which the aircraft crashes into the sea, the area of the sea involved in the interaction, and the time period of the crash, to name a few. These parameters are used to modulate the unit sound fetched from sound(s) data 214 to generate a particular sound applicable to this interaction.
[0039] At block 310, it is determined whether the interaction is predefined and has a predefined unit sound associated with it. If the interaction during game play can be verified against the predefined interactions stored in the game data 216 ('yes' path from block 310), the process proceeds and the unit sound for that particular interaction is retrieved from sound(s) data 214 (block 314). Alternately, if the interaction occurring during game play cannot be verified against the predefined interactions stored in the game data 216 ('no' path from block 310), a check is performed for a similar interaction based on the properties and sub-properties of the game elements involved (block 312). For example, if the interaction between an aluminum object and water is not predefined in the game data 216, the process will not proceed to block 314 but will instead proceed to block 312.
[0040] At block 312, for an undefined interaction, an appropriate predefined interaction between similar elements can be identified and the associated unit sound can be assigned to the undefined interaction. Similarity can be determined by analyzing the properties and sub-properties of the interacting game elements in the predefined and undefined interactions. For example, the unit sound associated with a predefined interaction between an iron object and water can be assigned to the undefined interaction of the previous example, between an aluminum object and water. In one implementation, the interaction module 110 determines the similarity between a new and a pre-defined interaction based on their associated properties and sub-properties. On making such a determination, the interaction module 110 classifies the new interaction as a pre-defined interaction. Alternatively, and as noted above, in the event that an undefined interaction between elements is identified, a new unit sound (which is later stored in sound(s) data 214) can be generated based on the properties associated with the interacting elements, for example each object's physical structure and material properties (e.g., density, stiffness).

[0041] At block 314, synthesizing module 112 retrieves the unit sound associated with the predefined interaction. For example, the synthesizing module 112 can retrieve the unit sound associated with a predefined interaction between elements like iron and water. In one implementation, the unit sound is retrieved from sound(s) data 214.
[0042] At block 316, the synthesizing module 112 processes the unit sounds based on the parameters associated with an interaction. For example, the sound for an aircraft crashing into the sea can be processed based on the parameters associated with the crash, such as the force of the crash, the area of the region affected, and the volume of water dispersed, to name a few. In one implementation, the synthesizing module 112 retrieves the parameters associated with the interaction along with the unit sound associated with the interaction. In one implementation, the parameters can be retrieved from the game data 216 and the unit sound from sound(s) data 214.
[0043] Once the parameters are calculated, they are used to modulate the unit sound. Modulation in this context implies adapting or processing the unit sound in accordance with the parameters of the interaction. For example, the unit sound associated with a collision between game elements like two cars can be extended for the duration of the collision, amplified in accordance with the nature of the crash, and so on. The modulated sound is the resultant sound generated for an interaction by utilizing the unit sound and the parameters associated with the interaction.

At block 318, the modulated sounds generated at the previous block (316) are sent to the sound producing devices 108(1)-(n). If a number of interactions occur at the same instant of time, the modulated sound associated with each interaction is combined and sent to the sound producing devices 108(1)-(n). The combination of different sounds may involve mixing and filtering to produce realistic sounds during game play. In one implementation, other module(s) 212 mix and filter the modulated sound. For example, in a game environment replicating combat operations, various interactions such as gunshots and the movement of players and tanks can occur at an instant of time. Sounds for these interactions can be generated and mixed to produce realistic sounds at that instant during game play.
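Combining simultaneous interaction sounds can be sketched as a sample-wise sum with simple peak normalization. A real engine would apply proper mixing and filtering as the description notes, so treat this as an illustrative stand-in with assumed list-of-samples tracks:

```python
def mix(tracks):
    # Sum the per-interaction sounds sample by sample, treating shorter
    # tracks as silent once they end, then divide by the peak so the
    # combined output stays within [-1, 1].
    length = max(len(t) for t in tracks)
    mixed = [sum(t[i] for t in tracks if i < len(t)) for i in range(length)]
    peak = max(1.0, max(abs(s) for s in mixed))
    return [s / peak for s in mixed]
```

Normalizing by the peak prevents clipping when many loud interactions, say gunshots and tank movement, land on the same instant.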
[0044] Although implementations have been described above for synthesizing game related sound using natural physics laws in language specific to structural features and/or methods, it is to be understood that the subject of the appended claims is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as exemplary implementations of the claimed subject matter.

Claims

1. A method for synthesizing sound, the method comprising: responsive to an interaction between a first element and a second element in a simulated environment, generating a sound based on one or more properties of the first and second elements and one or more parameters associated with the interaction.
2. The method of claim 1, wherein the method further comprises determining the one or more properties of the first and the second elements.
3. The method of claim 2, wherein the determining further comprises: classifying the interaction as a predefined interaction or a new interaction.
4. The method of claim 3, wherein the determining further comprises: comparing the one or more properties of the first and the second elements associated with a new interaction, with properties of elements associated with a predefined interaction; and classifying the new interaction as the pre-defined interaction based at least on the comparison.
5. The method of claim 1, wherein the one or more properties are associated with one or more sub-properties.
6. The method of claim 1, wherein the parameters are selected from a group comprising force of the interaction, time period of the interaction, interaction area of the first and second elements, and yielding property of the first and second elements.
7. The method of claim 1, wherein the generating a sound further comprises processing a unit sound based on the one or more parameters of the interaction.
8. A system for synthesizing sound, the system comprising: one or more processors; a memory; one or more interaction modules configured to: identify an interaction between a first element and a second element in a simulated environment; one or more synthesizing modules configured to: generate a sound based on one or more properties of the first and second elements and one or more parameters associated with the interaction.
9. The system of claim 8, wherein the simulated environment includes a gaming environment, video/audio animation, an architectural walk-through, and a virtual reality environment.
10. The system of claim 8, wherein the interaction module is further configured to determine the one or more properties of the first and second elements.
11. The system of claim 8, wherein the interaction module is further configured to classify the interaction as a predefined interaction or a new interaction.
12. The system of claim 11, wherein the interaction module is further configured to: compare the one or more properties of the first and the second elements associated with a new interaction with properties of the elements associated with a predefined interaction; and classify the new interaction as the predefined interaction based at least on the comparison.
13. The system of claim 8, wherein the interaction module is further configured to retrieve one or more sub-properties associated with the one or more properties of the first and second elements.
14. The system of claim 8, wherein the interaction module is further configured to select at least one parameter from a group comprising force of the interaction, time period of the interaction, interaction area of the first and second elements, and yielding property of the first and second elements.
15. One or more computer-readable media comprising computer executable instructions that, when executed, direct a device to perform acts comprising: responsive to an interaction between a first and a second element in a simulated environment, generating a sound based on one or more properties of the first and the second elements and one or more parameters associated with the interaction.
16. The computer-readable media of claim 15, wherein the acts further comprise determining one or more properties and sub-properties of the first and second elements.
17. The computer-readable media of claim 15, wherein the acts further comprise classifying the interaction as a predefined interaction or a new interaction.
18. The computer-readable media of claim 17, wherein the acts further comprise: comparing the one or more properties of the first and second elements associated with a new interaction with properties of the elements associated with a predefined interaction; and classifying the new interaction as a predefined interaction based at least on the comparing.
19. The computer-readable media of claim 15, wherein the acts further comprise selecting the parameters from a group comprising force of the interaction, time period of the interaction, interaction area of the first and second elements, and yielding property of the first and second elements involved in the interaction.
20. The computer-readable media of claim 15, wherein the generating can be based on at least one of the following: properties and sub-properties of the first and second elements associated with the interaction; and computation of the parameters associated with the interaction.
PCT/CA2008/001686 2007-09-28 2008-09-25 Interactive sound synthesis WO2009039636A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP08800377A EP2225753A4 (en) 2007-09-28 2008-09-25 Interactive sound synthesis
CN200880113777A CN101842830A (en) 2007-09-28 2008-09-25 Interactive sound synthesis
JP2010526117A JP2010540989A (en) 2007-09-28 2008-09-25 Interactive sound synthesis

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/864,146 US20090088246A1 (en) 2007-09-28 2007-09-28 Interactive sound synthesis
US11/864,146 2007-09-28

Publications (1)

Publication Number Publication Date
WO2009039636A1 true WO2009039636A1 (en) 2009-04-02

Family

ID=40509017

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2008/001686 WO2009039636A1 (en) 2007-09-28 2008-09-25 Interactive sound synthesis

Country Status (6)

Country Link
US (1) US20090088246A1 (en)
EP (1) EP2225753A4 (en)
JP (1) JP2010540989A (en)
KR (1) KR20100074225A (en)
CN (1) CN101842830A (en)
WO (1) WO2009039636A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112449210A (en) * 2019-08-28 2021-03-05 北京字节跳动网络技术有限公司 Sound processing method, sound processing device, electronic equipment and computer readable storage medium
US12022162B2 (en) 2019-08-28 2024-06-25 Beijing Bytedance Network Technology Co., Ltd. Voice processing method and apparatus, electronic device, and computer readable storage medium

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010022646A (en) * 2008-07-22 2010-02-04 Namco Bandai Games Inc Program, information storage medium, and image generation system
CN103854642B (en) * 2014-03-07 2016-08-17 天津大学 Flame speech synthesizing method based on physics
CN107115672A (en) * 2016-02-24 2017-09-01 网易(杭州)网络有限公司 Gaming audio resource player method, device and games system
CN109731331B (en) * 2018-12-19 2022-02-18 网易(杭州)网络有限公司 Sound information processing method and device, electronic equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5157214A (en) * 1989-04-10 1992-10-20 Matsushita Electric Industrial Co., Ltd. Musical sound synthesizing apparatus
US5371317A (en) * 1989-04-20 1994-12-06 Yamaha Corporation Musical tone synthesizing apparatus with sound hole simulation
US5466884A (en) * 1994-05-10 1995-11-14 The Board Of Trustees Of The Leland Stanford Junior University Music synthesizer system and method for simulating response of resonant digital waveguide struck by felt covered hammer
US5536902A (en) * 1993-04-14 1996-07-16 Yamaha Corporation Method of and apparatus for analyzing and synthesizing a sound by extracting and controlling a sound parameter
WO1999016049A1 (en) * 1997-09-23 1999-04-01 Kent Ridge Digital Labs (Krdl), National University Of Singapore Interactive sound effects system and method of producing model-based sound effects

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7006616B1 (en) * 1999-05-21 2006-02-28 Terayon Communication Systems, Inc. Teleconferencing bridge with EdgePoint mixing
WO2007068090A1 (en) * 2005-12-12 2007-06-21 Audiokinetic Inc. System and method for authoring media content

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2225753A4 *

Also Published As

Publication number Publication date
US20090088246A1 (en) 2009-04-02
KR20100074225A (en) 2010-07-01
JP2010540989A (en) 2010-12-24
EP2225753A1 (en) 2010-09-08
EP2225753A4 (en) 2010-11-17
CN101842830A (en) 2010-09-22


Legal Events

WWE (Wipo information: entry into national phase): Ref document number 200880113777.1; Country of ref document: CN
121 (Ep: the epo has been informed by wipo that ep was designated in this application): Ref document number 08800377; Country of ref document: EP; Kind code of ref document: A1
WWE (Wipo information: entry into national phase): Ref document number 1696/DELNP/2010; Country of ref document: IN
WWE (Wipo information: entry into national phase): Ref document number 2010526117; Country of ref document: JP
NENP (Non-entry into the national phase): Ref country code: DE
ENP (Entry into the national phase): Ref document number 20107009031; Country of ref document: KR; Kind code of ref document: A
WWE (Wipo information: entry into national phase): Ref document number 2008800377; Country of ref document: EP