USRE38276E1 - Tone generating apparatus for sound imaging - Google Patents
Tone generating apparatus for sound imaging
- Publication number
- USRE38276E1 (US application 08/798,654)
- Authority
- US
- United States
- Prior art keywords
- sound
- position information
- display
- sound source
- generator
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/0091—Means for obtaining special acoustic effects
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H7/00—Instruments in which the tones are synthesised from a data store, e.g. computer organs
- G10H7/002—Instruments in which the tones are synthesised from a data store, e.g. computer organs using a common processing for different operations or calculations, and a set of microinstructions (programme) to control the sequence thereof
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/155—Musical effects
- G10H2210/265—Acoustic effect simulation, i.e. volume, spatial, resonance or reverberation effects added to a musical sound, usually by appropriate filtering or delays
- G10H2210/281—Reverberation or echo
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/091—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
- G10H2220/096—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith using a touch screen
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/091—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
- G10H2220/101—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
- G10H2220/106—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters
- G10H2220/111—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters for graphical orchestra or soundstage control, e.g. on-screen selection or positioning of instruments in a virtual orchestra, using movable or selectable musical instrument icons
Definitions
- the present invention relates to a musical tone generating apparatus suitable for an electronic musical instrument, an automatic musical performance apparatus, or the like, and more particularly to a technique for reproducing a sound field corresponding to the positions of musical instruments arranged on the stage of a concert hall, jazz club house, or the like.
- in a conventional apparatus, sound effect control information is preset so that a desired sound effect (e.g. a reverberative effect) appropriate to a concert hall, jazz club house, or the like can be produced. When the sound effect for a specific concert hall is selected by an operator, or selected automatically, the sound effect control information for that hall is applied, and the sound effect is imparted to the musical tone signal.
- Such a conventional technique can, to some extent, present a desirable sound effect for listening to a performance; however, it cannot produce a sound field corresponding to the respective positions of the musical instruments arranged on the stage of the concert hall. That is, the conventional technique cannot convey the feeling of being at a live performance.
- in a live performance, many types of musical instruments are arranged at various positions on the stage of the concert hall, so the feeling given by the conventional technique differs from that of an actual sound field (in the position of the sound image, the frequency components of the musical tone, the magnitude of the sound effect, and so on). Accordingly, the conventional apparatus cannot present an accurate feeling of the sound field.
- an electronic musical instrument can have several speakers to reproduce a performance with the position of the sound image and the sound effect varied by adjusting volume controls, switches, or the like mounted on a panel of the apparatus.
- An object of the present invention is therefore to provide a musical tone generating apparatus which can, by a simple operation, reproduce sound fields corresponding to musical instruments as if these instruments were arranged on the stage of a concert hall or the like, so as to give the feeling of being at a live performance.
- Another object of the present invention is to provide a musical tone generating apparatus which allows each position of the musical instruments to be readily verified as if these instruments were arranged on a stage.
- Another object of the present invention is to provide a musical tone generating apparatus which provides a simple operation for reproducing the sound fields of musical instruments on respective stages.
- a musical tone generating apparatus comprising: a position information generating apparatus for generating musical instrument position information corresponding to positions of the musical instruments arranged on a stage of a performance place; an information converting apparatus for converting the musical instrument position information into musical tone parameter control information; a sound source apparatus for generating a musical tone source signal having a tone color corresponding to each of the musical instruments arranged on the stage; a musical tone control apparatus for controllably generating musical tone output signals corresponding to the musical tone parameter control information relative to the position of the musical instruments by receiving the musical tone source signal from the sound source apparatus; and an output apparatus for generating a musical tone from a plurality of output channels by receiving the musical tone output signal from the musical tone control apparatus so that a sound field is reproduced corresponding to the position of the musical instruments arranged on the stage.
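The claimed pipeline can be illustrated with a minimal sketch: position information is converted into parameter control information, which the musical tone control stage applies to a source signal to produce multi-channel output. The function name, the normalized position, and the simple linear panning law are all illustrative assumptions; the patent's actual conversion uses the ROM look-up tables described later with reference to FIG. 5.

```python
# Minimal sketch of the claimed signal path, assuming a normalized x
# position and a linear panning law (hypothetical names and law).
def generate_output(sample, px):
    """Map a source sample and normalized x position (0 = stage left,
    1 = stage right) to (left, right) output-channel levels."""
    mp_left, mp_right = 1.0 - px, px            # information converting stage
    return sample * mp_left, sample * mp_right  # musical tone control stage
```

An instrument placed a quarter of the way across the stage would thus sound stronger in the left channel than in the right.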
- since the operator can set the position information of the musical instruments in the position information generating apparatus, the apparent position of the musical instruments can even be moved to a desired position.
- the musical tone signal output can be read from a storage apparatus, or read from musical instruments.
- a musical tone generating apparatus comprising: a select apparatus for selecting a stage from among performance places; a storage apparatus for storing musical instrument position information which indicates a position of musical instruments arranged on a stage, and tone color indication information for indicating a tone color corresponding to each of the musical instruments; a reading apparatus for reading the musical instrument position information and the tone color indication information from the storage apparatus, in which both the musical instrument position information and the tone color indication information are selected by the select apparatus; an information converting apparatus for converting the musical instrument position information into musical tone parameter control information corresponding to a value of the plane coordinates and a variable which is determined by the value of the plane coordinates; a sound source apparatus for generating a musical tone source signal having a tone color corresponding to each of the musical instruments arranged on the stage; a musical tone control apparatus for controllably generating musical tone output signals in response to the musical tone parameter control information relative to the position of the musical instruments by receiving the musical tone source signal from the sound source apparatus; and an output apparatus for generating a musical tone from a plurality of output channels by receiving the musical tone output signal from the musical tone control apparatus so that a sound field is reproduced corresponding to the position of the musical instruments arranged on the stage.
- the musical instrument position information can be in the form of preset information corresponding to a predetermined stage as well as tone color indication information.
- FIG. 1 is a block diagram showing the construction of a musical tone generating apparatus of an embodiment
- FIG. 2 is a plan view showing the lay-out of select switches
- FIG. 3 is a plan view showing the lay-out of musical instruments arranged on a stage
- FIG. 4 is a diagram showing the control data lay-out of a memory
- FIG. 5 (A) to FIG. 5 (D) are diagrams showing the information stored in ROM 18 ;
- FIG. 6 is a diagram showing parameter control circuit 44 ;
- FIG. 7 is a diagram showing reverberative circuit 64 ;
- FIG. 8 is a flow chart showing a main routine of the musical tone generating apparatus
- FIG. 9 is a flow chart showing a subroutine of hall select switch HSS.
- FIG. 10 is a flow chart showing a subroutine for initializing sound images
- FIG. 11 is a flow chart showing a subroutine for detecting a movement of sound images.
- FIG. 12 is a flow chart showing a subroutine for setting a feature of the information.
- FIG. 1 shows a circuit diagram of an electronic musical instrument in accordance with an embodiment, in which the electronic musical instrument is controlled by a microcomputer to generate a musical tone.
- Major components are connected to bus 10 . These components are keyboard circuit 12 , a group of select elements 14 , CPU (central processing unit) 16 , ROM (read only memory) 18 , RAM (random access memory) 20 , a group of registers 22 , floppy disk unit 24 , display panel interface 26 , touch panel interface 28 , sound source interface 30 , and externally input interface 32 .
- Keyboard circuit 12 detects keyboard information corresponding to respective keys of the keyboards which are composed of an upper keyboard, a lower keyboard, and a pedal keyboard.
- the group of select elements 14 comprises select elements for controlling a musical tone, for controlling a performance, and for controlling other functions, in which the operation of each select element is detected. These select elements are described later by reference to FIG. 2 .
- CPU 16 executes many types of control processes to generate a musical tone in accordance with a control program stored in ROM 18 .
- ROM 18 also stores musical tone parameter control information which is described later by reference to FIG. 5 .
- the control processes are described later by reference to FIG. 8 to FIG. 12 .
- RAM 20 stores display control data which is read from floppy disk unit 24 . This display control data is used for a certain stage.
- the group of registers 22 is used for the control processes when CPU 16 executes the control program.
- Floppy disk unit 24 is used for reading and writing the display control data from and to a floppy disk which stores many different types of display control data for use in a plurality of performance places. The details of the above are described later by reference to FIG. 4 .
- Display panel interface 26 and touch panel interface 28 are connected to display panel 34 A and touch panel 34 B, respectively, in which both display panel 34 A and touch panel 34 B are incorporated in musical instrument position setting device 34 . Accordingly, display panel interface 26 transfers display data DS to display panel 34 A, and touch panel interface 28 receives musical instrument position data PS from touch panel 34 B corresponding to the touch position detected by touch panel 34 B.
- Musical instrument position setting device 34 is described later by reference to FIG. 3 .
- Sound source interface 30 transfers sound source control information TS to distributing circuit 36 , in which sound source control information TS is composed of key-on and key-off signals corresponding to the operation of the keyboard; performance information such as key data (tone pitch data) corresponding to a depressed key; musical tone parameter control information PD read from ROM 18 ; and tone color indication data TSD and reverberation control data RVD, both read from RAM 20 .
- Externally input interface 32 receives performance information corresponding to the operation of the keyboard, and performance information read from a memory device incorporated in the electronic musical instrument. This input performance information is supplied to distributing circuit 36 through sound source interface 30 , together with the performance information from keyboard circuit 12 .
- Distributing circuit 36 generates first sound source control information S 1 , second sound source control information S 2 , and third sound source control information S 3 depending on the type of the musical instruments indicated by sound source control information TS.
- the first, second, and third sound source control information S 1 , S 2 , and S 3 is supplied to first sound source control circuit (TG 1 ) 38 , second sound source control circuit (TG 2 ) 40 , and third sound source control circuit (TG 3 ) 42 , respectively.
- distributing circuit 36 receives musical tone parameter control information PD and reverberation control data RVD, both also being contained in sound source control information TS, and this musical tone parameter control information PD and reverberation control data RVD is directly supplied to parameter control circuit 44 .
- first sound source control information S 1 represents tone color indication data corresponding to musical instrument 1 (e.g. piano) and performance information based on operation of the upper keyboard
- second sound source control information S 2 represents other tone color indication data corresponding to musical instrument 2 (e.g. violin) and performance information based on the lower keyboard
- third sound source control information S 3 represents other tone color indication data corresponding to musical instrument 3 (e.g. bass) and performance information based on the pedal keyboard.
- other performance information can be supplied from an electronic musical instrument through externally input interface 32 , sound source interface 30 , and distributing circuit 36 , instead of the performance information input from keyboard circuit 12 based on the upper keyboard, lower keyboard, and pedal keyboard. Various types of electronic musical instruments can thus be used to play an ensemble, which can even be an automatic performance ensemble.
- First sound source control circuit TG 1 therefore supplies digital musical tone signals S 11 to parameter control circuit 44 corresponding to first sound source control information S 1
- second sound source control circuit TG 2 supplies digital musical tone signal S 12 to parameter control circuit 44 corresponding to second sound source control information S 2
- third sound source control circuit TG 3 supplies digital musical tone signal S 13 to parameter control circuit 44 corresponding to third sound source control information S 3 .
- Parameter control circuit 44 thus controls digital musical tone signals S 11 , S 12 , and S 13 based on musical tone parameter control information PD, and generates a reverberative effect signal based on reverberation control data RVD. Parameter control circuit 44 then converts such digital musical tone signals S 11 , S 12 , and S 13 into analog musical tone signals AS(R) for the right channel, and AS(L) for the left channel by a digital-analog converter incorporated in parameter control circuit 44 . The details of parameter control circuit 44 are described later by reference to FIG. 6 and FIG. 7 .
- Musical tone signals AS(R) and AS(L) are supplied to right speaker 48 R and left speaker 48 L through amplifiers 46 R and 46 L, respectively, to generate musical tones.
- FIG. 2 shows a lay-out of the select elements, each of which is related to this embodiment, and each of which is arranged in the group of select elements 14 .
- performance mode switch PMS is used for indicating a normal performance mode; that is, when it is depressed, a manual performance (or an automatic performance) can be carried out without reproducing the sound field of a selected concert hall. After depression, light-emitting element PML, which is mounted beside performance mode switch PMS, is turned on.
- Hall select switch HSS comprises N switches, which are laterally arranged in the panel. Adjacent to the N switches are respective light-emitting elements HSL. Accordingly, when one of the hall select switches HSS is depressed to select a particular concert hall, the corresponding light-emitting element HSL is turned on. The manual performance (or the automatic performance) is then carried out with reproduction of a sound field for the concert hall which is selected by the hall select switch HSS.
- FIG. 3 shows a plan view of musical instrument position setting device 34 which comprises a transparent touch panel 34 B having matrix-arranged switches, and display panel 34 A arranged behind touch panel 34 B.
- Display panel 34 A has a hall symbol HSY corresponding to a stage of a performance place such as a concert hall, hall name HNM such as “HALL 1 ”, musical instrument display frame FLM, musical instrument symbol ISY, and musical instrument name INM.
- Musical instrument display frame FLM is displayed as a rectangle in touch panel 34 B, and musical instrument symbol ISY and musical instrument name INM are displayed in each musical instrument display frame FLM.
- hall name HNM is displayed at the left-top corner of display panel 34 A as “HALL 1 ”
- musical instrument symbol ISY is displayed at the bottom-left of the display panel as “Pp” for a piano
- musical instrument name INM is displayed in musical instrument display frame FLM as “piano”.
- a symbol “Pv” is displayed at the bottom-middle of the display panel for the violin, whose name “violin” is also displayed in its musical instrument display frame
- a symbol “Pb” is displayed at the top-right of the display panel for the bass, whose name “bass” is also displayed in its musical instrument display frame.
- Touch panel 34 B has rectangular coordinates which are represented by a character W corresponding to the width of the stage of a concert hall, and by a character H corresponding to the depth thereof.
- the origin of the coordinates P 0 (0, 0) is set at the top-left corner of touch panel 34 B; the y axis is set in the vertical direction and the x axis in the horizontal direction. Accordingly, the position of the piano is indicated by P p (x 1 , y 1 ); similarly, the position of the violin is indicated by P v (x 2 , y 2 ), and the position of the bass is indicated by P b (x 3 , y 3 ).
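A touch point on this coordinate system can be normalized against the stage width W and depth H described above. This short sketch assumes clamping to the [0, 1] range and a hypothetical function name; the patent does not specify the normalization procedure.

```python
# Hypothetical sketch: map a touch-panel point (x, y) to normalized stage
# coordinates, with w the stage width W and h the stage depth H
# (origin at the top-left corner, as in FIG. 3).
def normalize_position(x, y, w, h):
    """Return (Px, Py) in [0, 1] from panel coordinates."""
    px = min(max(x / w, 0.0), 1.0)
    py = min(max(y / h, 0.0), 1.0)
    return px, py
```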
- the positions can be adjusted by touching with a finger within a musical instrument display frame FLM on touch panel 34 B corresponding to, for example, the piano position, and moving the finger to a desired position to set the piano in position.
- musical instrument display frame FLM, musical instrument name INM, and musical instrument symbol ISY move with the movement of the finger contact point.
- the display position of the piano is finally set in touch panel 34 B.
- the position of the violin and bass can also be set in touch panel 34 B in the same manner as described above.
- the position of the musical instruments can be selectively and readily arranged as if on the stage of a concert hall by touching and moving the finger over the surface of the touch panel 34 B.
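The drag interaction described above can be sketched as a hit test over the display frames followed by a move. The (left, top, width, height) frame representation and both function names are hypothetical; the patent describes only matrix-arranged touch switches.

```python
# Hypothetical sketch of the touch-and-drag interaction: a touch inside an
# instrument's display frame FLM picks it up, and finger movement shifts it.
def hit_test(frames, x, y):
    """Return the name of the display frame containing (x, y), if any.
    Each frame is an assumed (left, top, width, height) tuple."""
    for name, (fx, fy, fw, fh) in frames.items():
        if fx <= x < fx + fw and fy <= y < fy + fh:
            return name
    return None

def move_frame(frames, name, dx, dy):
    """Shift a picked-up frame by the finger's movement (dx, dy)."""
    fx, fy, fw, fh = frames[name]
    frames[name] = (fx + dx, fy + dy, fw, fh)
```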
- FIG. 4 shows a format of display control data stored in a floppy disk.
- the display control data is composed of hall index data and hall data.
- Hall index data is composed of hall 1 (e.g. a small concert hall), hall 2 (e.g. a large concert hall), hall 3 (e.g. an outdoor stage), and hall N (e.g. a jazz club house).
- Hall data is composed of hall characteristic data and musical instrument data. This hall data is described later.
- floppy disk unit 24 reads the display control data from the floppy disk, and then writes it into RAM 20 with the format shown in FIG. 4 .
- the hall data has identification data ID followed by hall characteristic data and musical instrument data. This hall data is used for hall 1 .
- the hall characteristic data is composed of the number of bytes K 0 occupied by hall name data HNMD, the number of bytes L 0 occupied by hall symbol data HSYD, and the number of bytes M 0 occupied by reverberation control data RVD, as well as the actual hall name data HNMD indicating a hall name, the actual hall symbol data HSYD indicating a hall symbol, and the actual reverberation control data RVD which controls the reverberative effect.
- the term HAD 0 represents the head address at which the hall characteristic data is written into RAM 20 .
- hall name data HNMD, hall symbol data HSYD, and reverberation control data RVD are read from RAM 20 depending on the respective number of bytes occupied by HNMD, HSYD, and RVD.
- Musical instrument data is composed of data of musical instrument 1 (e.g. a piano), data of musical instrument 2 (e.g. a violin), and data of musical instrument 3 (e.g. a bass).
- Data of musical instrument 1 is composed of data which indicates the number of bytes K 1 occupied by musical instrument name data INMD, data which indicates the number of bytes L 1 occupied by musical instrument symbol data ISYD, and data which indicates the number of bytes M 1 occupied by tone color indication data TSD, as well as the actual musical instrument name data INMD, the actual musical instrument symbol data ISYD, the actual tone color indication data which indicates a tone color (e.g. the tone color of the piano) of the musical instrument, data which indicates the musical instrument position in the x direction (x 1 ), and data which indicates the musical instrument position in the y direction (y 1 ).
- the term HAD 1 represents the head address at which the data of musical instrument 1 is written into RAM 20 .
- musical instrument name data INMD, musical instrument symbol data ISYD, and tone color indication data TSD are read from RAM 20 depending on the respective number of bytes occupied by the respective INMD, ISYD, and TSD data; and musical instrument position data PS (x 1 , y 1 ) is read from RAM 20 , in which X axis component x 1 is stored in storage area X 1 , and Y axis component y 1 is stored in storage area Y 1 .
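Reading length-prefixed fields such as INMD, ISYD, and TSD can be sketched as follows. The exact byte layout (a one-byte length immediately preceding each field) is an assumption for illustration; the patent states only that each field is read according to the number of bytes it occupies.

```python
# Hypothetical sketch of reading a record of length-prefixed fields, like
# the musical instrument data (byte counts K1/L1/M1, then the fields).
def read_record(data):
    """Parse three assumed [length][payload] fields, e.g. name, symbol,
    and tone color indication data."""
    fields = []
    pos = 0
    for _ in range(3):
        length = data[pos]                 # assumed one-byte length prefix
        pos += 1
        fields.append(bytes(data[pos:pos + length]))
        pos += length
    return fields
```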
- Using head addresses HAD 2 and HAD 3 , the data of musical instruments 2 and 3 is read from RAM 20 , and musical instrument position data (x 2 , y 2 ) and (x 3 , y 3 ) indicates the positions of musical instruments 2 and 3 , respectively.
- This musical instrument position data (x 2 , y 2 ) and (x 3 , y 3 ) is not shown in FIG. 4, but X axis components x 2 and x 3 are stored in storage areas X 2 and X 3 , and Y axis components y 2 and y 3 are stored in storage areas Y 2 and Y 3 , respectively.
- These (x 2 , y 2 ) and (x 3 , y 3 ) components indicate musical instrument position data read from RAM 20 , but not musical instrument position data PS transferred from musical instrument position setting device 34 .
- FIG. 5 (A) to FIG. 5 (D) show five types of musical tone parameter control information PD stored in respective memory portions of ROM 18 .
- One of the memory portions stores information as shown in FIG. 5 (A).
- This information is composed of a normalized value P y which indicates the value of the y coordinate of a musical instrument on the stage of the hall, and a first multiplication constant MP 1 which determines the position of a sound image in a y direction of the stage.
- the first multiplication constant MP 1 is directly proportional to the normalized value P y .
- Another memory portion stores information as shown in FIG. 5 (B).
- This information is composed of the normalized value P y which indicates the value of the y coordinate of a musical instrument on the stage of the hall, and a fourth multiplication constant MP 4 which determines the magnitude of a reverberative effect in the y direction of the stage.
- the fourth multiplication constant MP 4 is inversely proportional to the normalized value P y .
- Another memory portion stores information as shown in FIG. 5 (C).
- This information is composed of a normalized value P y which indicates the value of the y coordinate of a musical instrument, and a filtering constant CF which determines a cut-off frequency of a low-pass filter.
- the filtering constant CF is directly proportional to the normalized value P y .
- Another memory portion stores information as shown in FIG. 5 (D).
- This information is composed of a normalized value P x which indicates the value of the x coordinate of a musical instrument, and second and third multiplication constants MP 2 and MP 3 which determine the position of a sound image in the direction to the right and left of the stage.
- the multiplication constant MP 2 is directly proportional to the normalized value P x as shown by “L 2 ”, while the multiplication constant MP 3 is inversely proportional to the normalized value P x as shown by “L 3 ”.
- both of the values P y and P x are determined either from the musical instrument position data (e.g. indicated by x 1 and y 1 ) read from RAM 20 , or from musical instrument position data PS transferred from musical instrument position setting device 34 .
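The FIG. 5 relationships can be sketched as one mapping from the normalized position to the five constants. Strictly linear forms, and rendering the decreasing relationships as 1 − value, are assumptions made here for simplicity; the patent specifies only direct or inverse proportionality via the ROM tables, and the function name is hypothetical.

```python
# Hypothetical look-up sketch of the FIG. 5 tables: MP1 and CF rise with
# Py, MP4 falls with Py, MP2 rises with Px while MP3 falls with it.
def tone_parameters(px, py):
    """Derive the five constants from a normalized position (Px, Py)."""
    return {
        "MP1": py,         # sound-image depth, direct in Py (FIG. 5(A))
        "MP4": 1.0 - py,   # reverberation send, decreasing in Py (FIG. 5(B))
        "CF": py,          # low-pass cut-off, direct in Py (FIG. 5(C))
        "MP2": px,         # right-channel level, direct in Px (curve L2)
        "MP3": 1.0 - px,   # left-channel level, decreasing in Px (curve L3)
    }
```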
- FIG. 6 shows parameter control circuit 44 .
- This parameter control circuit 44 comprises three parameter controllers CN 1 , CN 2 , and CN 3 .
- These parameter controllers CN 1 , CN 2 , and CN 3 receive digital musical tone signals S 11 , S 12 , and S 13 from first sound source control circuit TG 1 , second sound source control circuit TG 2 , and third sound source control circuit TG 3 , respectively. Since parameter controllers CN 1 , CN 2 , and CN 3 are identical in construction, only parameter controller CN 1 is described in this embodiment.
- Digital musical tone signal S 11 is supplied to multiplier 50 to be multiplied by first multiplication constant MP 1 .
- a multiplication value output from multiplier 50 is supplied to low-pass filter 52 to control a frequency corresponding to filtering constant CF.
- a value output from low-pass filter 52 is supplied to multiplier 54 to be multiplied by second multiplication constant MP 2 , then supplied to multiplier 56 to be multiplied by third multiplication constant MP 3 , and also supplied to multiplier 58 to be multiplied by fourth multiplication constant MP 4 .
- Multiplied values output from multipliers 54 and 56 are supplied to adders 60 and 62 , respectively, while a multiplied value output from multiplier 58 is supplied to reverberation circuit 64 .
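The per-sample behavior of parameter controller CN 1 can be sketched as below. The multiplier ordering follows FIG. 6, but the one-pole form of low-pass filter 52 and the class interface are assumptions for illustration.

```python
# Hypothetical per-sample model of parameter controller CN1 (FIG. 6):
# multiply by MP1, low-pass filter with coefficient CF, then split into
# right (MP2), left (MP3), and reverberation-send (MP4) branches.
class ParameterController:
    def __init__(self, mp1, mp2, mp3, mp4, cf):
        self.mp1, self.mp2, self.mp3 = mp1, mp2, mp3
        self.mp4, self.cf = mp4, cf
        self.lp_state = 0.0  # memory of the assumed one-pole low-pass

    def process(self, sample):
        scaled = sample * self.mp1                           # multiplier 50
        self.lp_state += self.cf * (scaled - self.lp_state)  # low-pass 52
        right = self.lp_state * self.mp2        # multiplier 54 -> adder 60
        left = self.lp_state * self.mp3         # multiplier 56 -> adder 62
        reverb_send = self.lp_state * self.mp4  # multiplier 58 -> circuit 64
        return right, left, reverb_send
```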
- FIG. 7 shows reverberation circuit 64 .
- Input data IN is supplied to adder ADD, and data output from adder ADD is supplied to delay circuit DL.
- Data output from delay circuit DL is supplied to multiplier MPL, and then data output from multiplier MPL is supplied to adder ADD as a feedback.
- Delay control data RVD 1 , which is a part of reverberation control data RVD, is supplied to delay circuit DL to set a delay time, and multiplication constant data RVD 2 is supplied to multiplier MPL to multiply the data output from delay circuit DL, so that output data OUT is output from delay circuit DL with a reverberative effect assigned.
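The FIG. 7 topology is a feedback comb filter, which can be sketched per sample as follows. Treating RVD 1 as a delay length in samples and RVD 2 as a feedback gain is an assumption about units; the patent gives no concrete values.

```python
# Hypothetical sketch of reverberation circuit 64 (FIG. 7): adder ADD feeds
# delay circuit DL, whose output both forms OUT and, scaled by multiplier
# MPL, feeds back into ADD.
from collections import deque

def make_reverb(delay_samples, feedback):
    line = deque([0.0] * delay_samples, maxlen=delay_samples)

    def process(sample):
        delayed = line[0]                         # output of delay circuit DL
        line.append(sample + delayed * feedback)  # adder ADD feeds DL
        return delayed                            # OUT is taken from DL
    return process
```

A single impulse fed through this sketch re-emerges as a train of echoes spaced by the delay length and decaying by the feedback gain, which is the reverberative effect the circuit assigns.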
- Output data OUT is supplied to both adders 60 and 62 to be added to the data output from multipliers 54 and 56 , respectively.
- Data output from adder 60 is digital musical tone signal SR 1 for the right channel, which is supplied to adder 66 , while data output from adder 62 is digital musical tone signal SL 1 for the left channel, which is supplied to adder 70 .
- Digital musical tone signals SR 2 and SR 3 for the right channel are also supplied from parameter controllers CN 2 and CN 3 to adder 66 to be added to digital musical tone signal SR 1 .
- digital musical tone signals SL 2 and SL 3 for the left channel are supplied from parameter controllers CN 2 and CN 3 to adder 70 to be added to digital musical tone signal SL 1 .
- Added data output from adder 66 is converted into analog musical tone signal AS(R) for the right channel by D-A converter 68 to output to a speaker.
- Added data output from adder 70 is also converted into analog musical tone signal AS(L) for the left channel by D-A converter 72 to output to a speaker.
- The sound image can be moved in the y direction of the stage shown in FIG. 3 when first multiplication constant MP 1 is changed with respect to normalized value P y , which indicates the y coordinate of the musical instrument, as shown in FIG. 5 (A).
- A fine variation of tone color can be produced corresponding to the position of the musical instrument in the y direction of the stage when filtering constant CF is changed with respect to normalized value P y as shown in FIG. 5 (C).
- In multipliers 54 and 56 , the sound image can be moved in the x direction of the stage shown in FIG. 3 when second and third multiplication constants MP 2 and MP 3 are changed with respect to normalized value P x , which indicates the x coordinate of the musical instrument, as shown in FIG. 5 (D).
- In multiplier 58 , the magnitude of the reverberative effect can be adjusted according to the position in the y direction of the stage when fourth multiplication constant MP 4 is changed with respect to normalized value P y as shown in FIG. 5 (B).
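Summarizing the signal path of one parameter controller CNi: the source signal is scaled by MP 1 (depth in the y direction), would be shaped by a filter using constant CF, and is then split into a right-channel contribution (MP 2), a left-channel contribution (MP 3), and a reverberation send (MP 4). A per-sample sketch under those assumptions; the CF filter itself is omitted for brevity:

```python
def parameter_controller(sample, mp1, mp2, mp3, mp4):
    """One parameter controller CNi of FIG. 6, applied per sample.

    mp1..mp4 -- multiplication constants from musical tone parameter
                control information PD (FIG. 5 (A), (D), (D), (B))
    Returns (right, left, reverb_send) contributions.
    """
    s = sample * mp1      # depth control in the y direction (FIG. 5 (A))
    # a filter using constant CF would shape tone color here (FIG. 5 (C)); omitted
    right = s * mp2       # multiplier 54 -> adder 60 (x-direction image, FIG. 5 (D))
    left = s * mp3        # multiplier 56 -> adder 62
    send = s * mp4        # multiplier 58 -> reverberation circuit 64 (FIG. 5 (B))
    return right, left, send
```

Because MP 2 and MP 3 vary oppositely with P x , the right/left balance places the image along x, while MP 1 and MP 4 together convey distance along y.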
- Adders 60 , 62 , 66 , and 70 electrically mix their inputs with the adjusted musical tone signals, and output the resulting musical tone signals to the two speakers.
- Alternatively, several musical tones can be mixed in the air by using several speakers, in which case the number of adders can be reduced.
- The group of registers 22 used in this embodiment is described next.
- Mode register MOD: this register stores a value from “0” to “2”: “0” for the normal performance mode, “1” for the musical instrument position setting mode, and “2” for a performance mode having a reproduction of a sound field (referred to as the reproducing performance mode in the following).
- Switch number register SNO: this register stores the switch number (1 to N) of a hall select switch HSS when that hall select switch HSS is turned on.
- Switch flags SFL 1 to SFL n : a “1” is set in the flag corresponding to a hall select switch HSS (1 to N) when that hall select switch HSS is turned on.
- Head address registers ADR 0 to ADR 3 : these registers store head addresses HAD 0 to HAD 3 shown in FIG. 4 .
- x coordinate register P x : this register stores the normalized value P x which indicates the x coordinate.
- y coordinate register P y : this register stores the normalized value P y which indicates the y coordinate.
- Control variable register i: this register stores a control variable i.
- FIG. 8 shows the flow chart of a main routine which is started by turning on a power switch.
- In step 80 , an initialization routine is executed to initialize each register.
- In step 82 , a “0” is set in mode register MOD for the normal performance mode, which turns light-emitting element PML on.
- In step 84 , the process decides whether mode register MOD is “0” or “2” (a performance mode) or not. When this decision is “Y”, the process moves to step 86 ; otherwise it moves to step 94 .
- In step 86 , the process decides whether keyboard circuit 12 has a key-on event or not. When this decision is “Y”, the process moves to step 88 ; otherwise it moves to step 90 .
- In step 88 , the process executes tone generation. That is, a key-on signal and key data corresponding to the depressed key are supplied to the sound source to generate a musical tone, then the process moves to step 90 .
- In step 90 , the process decides whether keyboard circuit 12 has a key-off event or not. When this decision is “Y”, the process moves to step 92 ; otherwise it moves to step 94 .
- In step 92 , the process executes a reduction of sound; that is, the key-off signal and the key data for the released key are supplied to the sound source corresponding to the keyboard which produced the key-off event, to start the reduction of the musical tone corresponding to the released key, then the process moves to step 94 .
- In step 94 , the process decides whether hall select switch HSS has an on-event or not. When this decision is “Y”, the process moves to step 96 ; otherwise it moves to step 98 .
- In step 96 , a subroutine for the ON-state of hall select switch HSS is executed, then the process moves to step 98 . Details of this subroutine are described later by reference to FIG. 9 .
- In step 98 , other processes are executed, such as setting tone color, tone volume, and the like, then the process moves back to step 84 to repeat the processes.
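The main routine of FIG. 8 is, in outline, a polling loop that dispatches key and switch events. A schematic sketch with the hardware scan mocked as an event list; the handler names and the event-tuple representation are hypothetical, not from the patent:

```python
def main_routine(events, registers):
    """Schematic of FIG. 8: poll key and switch events and dispatch handlers.

    events    -- iterable of (kind, data) tuples standing in for the hardware scan
    registers -- dict holding mode register MOD, etc.
    Returns a log of dispatched actions instead of driving real hardware.
    """
    log = []
    for kind, data in events:
        in_performance_mode = registers["MOD"] in (0, 2)   # step 84
        if in_performance_mode and kind == "key_on":       # steps 86-88
            log.append(("generate_tone", data))
        elif in_performance_mode and kind == "key_off":    # steps 90-92
            log.append(("release_tone", data))
        elif kind == "hall_select":                        # steps 94-96
            log.append(("hall_select_subroutine", data))
        # step 98: other processes (tone color, tone volume, ...) omitted
    return log
```

Note that key events are acted on only in the performance modes (MOD of “0” or “2”), matching the branch at step 84.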
- FIG. 9 shows the flow chart of a subroutine when one of the hall select switches HSS is turned on.
- In step 100 , the number n of the hall select switch HSS that was turned on is set in switch number register SNO, then the process moves to step 102 .
- In step 102 , the process decides whether mode register MOD is “2” (the reproducing performance mode) or not. When this decision is “Y”, the process moves to step 104 ; otherwise it moves to step 108 .
- In step 104 , the process decides whether switch flag SFL n is “1” (the sound field of the stage corresponding to the value n set in switch number register SNO is being reproduced) or not. When this decision is “Y”, the process moves to step 106 ; otherwise it moves to step 108 .
- In step 106 , a “0” is set in mode register MOD, and light-emitting element PML is turned on.
- A “0” is also set in the respective switch flags SFL 1 to SFL n to turn the light-emitting elements HSL off.
- The process then returns to the main routine shown in FIG. 8 .
- Thus, when the hall select switch HSS corresponding to the sound field currently being reproduced is turned on again, the reproducing performance mode is canceled and the apparatus returns to the normal performance mode.
- In step 108 , a “1” is set in mode register MOD and light-emitting element PML is turned off, then the process moves to step 110 . The mode is changed from the normal performance mode to the musical instrument position setting mode when the process has come from step 102 , and from the reproducing performance mode to the musical instrument position setting mode when it has come from step 104 .
- In step 110 , a “1” is set in switch flag SFL n to turn light-emitting element HSL on.
- A “0” is also set in the switch flags SFL other than switch flag SFL n to turn the respective light-emitting elements HSL off; the selected stage is thus indicated by the light-emitting element corresponding to the one hall select switch HSS which is turned on, then the process moves to step 112 .
- In step 112 , display control data for the selected stage is written into RAM 20 from the floppy disk, then the process moves to step 114 .
- In step 114 , head addresses HAD 0 to HAD 3 shown in FIG. 4 are set in head address registers ADR 0 to ADR 3 , then the process moves to step 116 .
- In step 116 , an initial display is indicated in display panel 34 A, then the process moves to step 118 . That is, hall name data HNMD and hall symbol data HSYD, which are a part of the hall characteristic data corresponding to the selected stage, are read from RAM 20 , and hall name HNM and hall symbol HSY are indicated at predetermined positions of display panel 34 A based on that data.
- When hall name data HNMD is read from RAM 20 , a “3” is added to head address HAD 0 , which is set in address register ADR 0 , to indicate its head address, and hall name data HNMD is read depending on the value of bytes K 0 .
- When hall symbol data HSYD is read from RAM 20 , the value of bytes K 0 is added to address “HAD 0 +3” to indicate the head address of hall symbol data HSYD, and hall symbol data HSYD is read depending on the value of bytes L 0 .
- After displaying hall name HNM and hall symbol HSY, musical instrument name data INMD, musical instrument symbol data ISYD, and musical instrument position data (e.g. the values x 1 and y 1 ) are read from RAM 20 , and display data for a musical instrument is formed, consisting of musical instrument name INM and musical instrument symbol ISY, both surrounded by musical instrument display frame FLM, and indicated in display panel 34 A.
- Display data for the other two musical instruments is formed from similar data in the same way and indicated in display panel 34 A.
- Reading the musical instrument data from RAM 20 is described here for the case of musical instrument 1 .
- The head address is indicated by adding a “3” to head address HAD 1 , which is set in address register ADR 1 , and musical instrument name data INMD is read corresponding to the value of bytes K 1 .
- The values of bytes L 1 and M 1 (for musical instrument symbol data ISYD and tone color indication data TSD) are also added to the address “HAD 1 +3+K 1 ” to indicate the head address of the musical instrument position data, then the values x 1 and y 1 are, in turn, read from RAM 20 .
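The address arithmetic described above amounts to walking a length-prefixed record: byte counts K, L, and M, then the variable-length name, symbol, and tone color fields, then the position data. A sketch of that walk, assuming one byte per count and per coordinate (an assumption consistent with the “+3” offsets in the text, but not stated there):

```python
def parse_instrument_record(ram, had):
    """Walk one musical instrument record starting at head address HADi.

    Layout assumed from the text: [K][L][M] byte counts, then name data
    (K bytes), symbol data (L bytes), tone color data (M bytes), then
    the x and y position bytes of musical instrument position data PS.
    """
    k, l, m = ram[had], ram[had + 1], ram[had + 2]
    p = had + 3                          # "HADi + 3": head address of name data
    name = ram[p:p + k]                  # read depending on the value of bytes K
    symbol = ram[p + k:p + k + l]        # "HADi + 3 + K": head of symbol data
    tone = ram[p + k + l:p + k + l + m]  # tone color indication data TSD
    x = ram[p + k + l + m]               # position data PS: x component
    y = ram[p + k + l + m + 1]           # position data PS: y component
    return name, symbol, tone, (x, y)
```

The hall characteristic data record can be walked the same way, except that its third field is reverberation control data RVD and it carries no position bytes.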
- In step 118 , a sound image initialization is executed as shown in FIG. 10 , which is described later.
- In step 120 , the sound image movement described by reference to FIG. 11 is executed, then the process returns to the main routine shown in FIG. 8 .
- FIG. 10 shows the sound image initialization.
- First, reverberation control data RVD is read from RAM 20 and set in reverberation circuit 64 .
- When reverberation control data RVD is read from RAM 20 , the value of bytes L 0 of hall symbol data HSYD is added to address “HAD 0 +3+K 0 ” to indicate the head address of reverberation control data RVD, and reverberation control data RVD is read depending on the value of bytes M 0 , then the process moves to step 124 .
- In step 124 , a “1” is set in control variable register i, then the process moves to step 126 .
- In step 126 , the process decides whether the value of control variable register i is greater than “3” or not. When this decision is “N”, the process moves to step 128 ; otherwise it returns to the subroutine shown in FIG. 9 .
- In step 128 , tone color indication data TSD for musical instrument i is read from RAM 20 and set in sound source control circuit TGi.
- When tone color indication data TSD is read from RAM 20 , the value of bytes L 1 corresponding to musical instrument symbol data ISYD is added to the address “HAD 1 +3+K 1 ” to indicate the head address of tone color indication data TSD, then this tone color indication data TSD is read depending on the value of bytes M 1 , then the process moves to step 130 .
- In step 130 , a characteristic setting for the musical instrument is executed by a subroutine which is described later by reference to FIG. 12 , then the process moves to step 132 .
- In step 132 , control variable register i is incremented by “1”, then the process returns to step 126 to repeat steps 126 to 132 until control variable i is greater than “3”.
- When control variable i is greater than “3”, the tone color setting and characteristic setting processes for the three musical instruments are terminated.
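Steps 124 to 132 thus form a simple loop over the three musical instruments. In outline, with two callbacks as hypothetical stand-ins for loading TSD into TGi and for the FIG. 12 subroutine:

```python
def sound_image_initialization(instruments, set_tone_color, set_characteristics):
    """Schematic of FIG. 10, steps 124-132: initialize each of the 3 instruments.

    instruments         -- list of 3 instrument records read from RAM 20
    set_tone_color      -- stand-in for setting TSD in sound source circuit TGi (step 128)
    set_characteristics -- stand-in for the FIG. 12 characteristic setting (step 130)
    """
    i = 1                                           # step 124: control variable i
    while i <= 3:                                   # step 126: loop until i > "3"
        set_tone_color(i, instruments[i - 1])       # step 128
        set_characteristics(i, instruments[i - 1])  # step 130
        i += 1                                      # step 132: increment i by "1"
```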
- FIG. 11 shows a subroutine for the sound image movement.
- In step 140 , the process decides whether musical instrument position data (the x and y coordinates) is indicated on touch panel 34 B or not. When this decision is “Y”, the process moves to step 142 ; otherwise it moves to step 158 .
- In step 142 , a “1” is set in control variable register i, then the process moves to step 144 .
- In step 144 , the process decides whether the values of the x and y coordinates fall within musical instrument display frame FLM of musical instrument i or not. When this decision is “Y”, the process moves to step 146 ; otherwise it moves to step 154 .
- In step 146 , the values of the x and y coordinates are written into storage areas Xi and Yi of RAM 20 , respectively, then the process moves to step 148 .
- In step 148 , the display position of musical instrument i is changed to the desired position in display panel 34 A corresponding to the values of the Xi and Yi coordinates, then the process moves to step 150 .
- In step 150 , the characteristic setting is executed by a subroutine which is described later by reference to FIG. 12 , then the process moves to step 152 .
- In step 152 , the process decides whether musical instrument position data is indicated on touch panel 34 B or not. When this decision is “Y”, the process returns to step 146 to repeat steps 146 to 152 . Thus, the values of the Xi and Yi coordinates can be changed in response to the touch position while the finger keeps touching touch panel 34 B and moves to another position, so as to set a desired position of the musical instrument in display panel 34 A.
- When the decision of step 152 is “N”, the process moves to step 140 to repeat the processes described above.
- After setting the position of musical instrument 1 , if the finger then touches touch panel 34 B to position musical instrument 2 , the decision of step 144 is “N” for i=1 because the values of the x and y coordinates are indicated within musical instrument display frame FLM of musical instrument 2 . The process therefore moves to step 154 .
- In step 154 , control variable register i is incremented by “1”, then the process moves to step 156 .
- In step 156 , the process decides whether control variable i is greater than “3” or not. When this decision is “N”, the process returns to step 144 .
- In step 144 , with i now “2”, the decision is “Y” because the values of the x and y coordinates are indicated within musical instrument display frame FLM for musical instrument 2 .
- The position of musical instrument 2 can then be established by executing steps 146 to 152 .
- For musical instrument 3 , the decision of step 144 is “N” twice, so steps 154 to 156 are executed twice after steps 140 to 142 before the process moves to step 146 .
- The position of musical instrument 3 can then be established by steps 146 to 152 .
- When the finger touches an area which is not part of any musical instrument display frame FLM, the decision of step 156 is “Y” after step 154 has been executed three times, and the process returns to step 140 .
- When the finger does not touch touch panel 34 B, the decision of step 140 is “N” and the process moves to step 158 .
- In step 158 , the process decides whether performance mode switch PMS indicates an on-event or not. When this decision is “N”, the process returns to step 140 ; otherwise it moves to step 160 .
- When performance mode switch PMS is turned on after the positions have been set, the decision of step 158 is “Y”, and the process moves to step 160 .
- In step 160 , a “2” is set in mode register MOD to turn light-emitting element PML on.
- The mode is thus changed from the musical instrument position setting mode to the reproducing performance mode, which enables a manual performance (or an automatic performance) with reproduction of the sound field corresponding to the selected stage.
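Steps 142 to 156 amount to scanning the three display frames FLM for the one containing the touched point. A compact sketch, with frames represented as rectangles (an assumption; the text only says the coordinates fall within a frame):

```python
def find_touched_instrument(touch, frames):
    """Schematic of FIG. 11, steps 142-156: find which frame FLM contains the touch.

    touch  -- (x, y) coordinates from touch panel 34B
    frames -- list of 3 rectangles (x0, y0, x1, y1), one per instrument
    Returns the 1-based instrument number i, or None when the touch lies
    outside every frame (the decision of step 156 becomes "Y").
    """
    x, y = touch
    i = 1                                        # step 142
    while i <= 3:                                # bound checked in step 156
        x0, y0, x1, y1 = frames[i - 1]
        if x0 <= x <= x1 and y0 <= y <= y1:      # step 144
            return i                             # steps 146-152 would follow
        i += 1                                   # step 154
    return None
```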
- FIG. 12 shows a subroutine of the characteristic setting.
- In step 170 , normalized value P x , which is the result of dividing the value of the x coordinate stored in storage area Xi by the length W shown in FIG. 3 , is set in storage area Px.
- Similarly, normalized value P y , which is the result of dividing the value of the y coordinate stored in storage area Yi by the length H shown in FIG. 3 , is set in storage area Py.
- The P x and P y values (the contents of Px and Py) are then converted into five types of musical tone parameter control information PD (first multiplication constant MP 1 to fourth multiplication constant MP 4 , and filtering constant CF), and the data is set in each of parameter controllers CN 1 , CN 2 , and CN 3 shown in FIG. 6 .
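The characteristic setting thus reduces to a normalization (P x = x/W, P y = y/H) followed by five conversions. A sketch in which linear curves stand in for the FIG. 5 tables; the exact curve shapes are not given in the text, so every mapping below is an assumption:

```python
def characteristic_setting(xi, yi, w, h):
    """Schematic of FIG. 12: normalize coordinates and derive the five parameters.

    xi, yi -- instrument position from storage areas Xi and Yi
    w, h   -- stage width W and depth H from FIG. 3
    """
    px = xi / w                        # step 170: normalized value Px
    py = yi / h                        # normalized value Py
    pd = {                             # musical tone parameter control information PD
        "MP1": 1.0 - 0.5 * py,               # FIG. 5 (A) curve, assumed linear
        "MP2": px,                           # FIG. 5 (D), right-channel gain (assumed)
        "MP3": 1.0 - px,                     # FIG. 5 (D), left-channel gain (assumed)
        "MP4": 0.2 + 0.6 * py,               # FIG. 5 (B) reverb depth, assumed linear
        "CF": 1000.0 + 4000.0 * (1.0 - py),  # FIG. 5 (C) filter constant, assumed
    }
    return px, py, pd
```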
- The sound field of the selected stage is thus reproduced in response to the data read from RAM 20 .
- The sound field of the selected stage is also reproduced in accordance with the positions of musical instruments set by musical instrument position setting device 34 .
- In this embodiment, touch panel 34 B is used for indicating the musical instrument position, but select elements such as a variable resistor, a switch, and the like can be used instead of touch panel 34 B.
- In this embodiment, the stage is selected in combination with the musical instruments, but the stage can also be selected separately from the musical instruments.
- The musical instrument position information can also be stored in a storage area together with performance information so that the sound image can be moved.
Abstract
A musical tone generating apparatus includes a position information generating device to generate musical instrument position information (PS) as plane coordinate values. This information (PS) is stored in a memory device, or selectively determined by a manual operation. The apparatus also includes an information converting device to convert information (PS) into musical tone parameter control information (PD). This control information (PD) controls musical tone source signals (S11, S12, and S13) to generate a sound field corresponding to the positions of musical instruments arranged on a stage. This enables an operator to verify the musical instrument positions on the stage, thereby providing a feeling of being at a live performance.
Description
This reissue application is a continuation of reissue application Ser. No. 08/345,531, filed on Nov. 28, 1994, now abandoned, which is a continuation of reissue application Ser. No. 08/084,812, filed on Jun. 29, 1993, now abandoned, which is a reissue application of U.S. Pat. No. 5,027,689 granted Jul. 2, 1991.
1. Field of the Invention
The present invention relates to a musical tone generating apparatus desirable for an electronic musical instrument, an automatic musical performance apparatus, or the like, and more particularly to a technique for reproducing a sound field corresponding to the positions of musical instruments arranged on a stage of a concert hall, jazz club house, or the like.
2. Prior Art
In a conventional sound effect technique, sound effect control information is preset in an apparatus so that a sound effect (e.g. a reverberative effect) desirable for a concert hall, jazz club house, or the like can be presented. Then, when the sound effect for a specific concert hall is selected by an operator, or automatically selected, the specific sound effect of that concert hall is applied to a musical tone signal based on the sound effect control information.
Such a conventional technique can present, to some extent, a desirable sound effect for listening to a performance; however, a sound field cannot be produced corresponding to the respective positions of the musical instruments arranged on the stage of the concert hall, that is, the conventional technique cannot present the feeling of being at a live performance. In other words, the feeling given by the conventional technique differs from that of an actual sound field (related to the position of the sound image, the frequency components of the musical tone, the magnitude of the sound effect, or the like), since in a live performance many types of musical instruments are arranged at various positions on the stage of the concert hall. Accordingly, the conventional apparatus cannot present an accurate feeling of the sound field.
On the other hand, it is well known that an electronic musical instrument can have several speakers to reproduce a performance with the position of the sound image and sound effect varied by the adjustment of volume controls, switches, or the like, in which these volume controls and switches are mounted on a panel of the apparatus.
However, this is very complicated in that many select elements such as volume controls and switches must be adjusted to reproduce a desirable feeling of the sound field; in particular, it is not easy to adjust a sound field based on an image of the positions of the musical instruments as if they were arranged on the stage of the concert hall. Up until recently, sound effect control information has thus been preset in the apparatus to reproduce the sound effect corresponding to a stage of a concert hall, requiring a great deal of information to be preset in the apparatus, and an apparatus of highly complicated construction.
An object of the present invention is therefore to provide a musical tone generating apparatus which can reproduce, by a simple operation, sound fields corresponding to musical instruments as if these musical instruments were arranged on a stage of a concert hall, or the like, so as to obtain the feeling of being at a live performance.
Another object of the present invention is to provide a musical tone generating apparatus which can readily verify each position of the musical instruments as if these musical instruments are arranged on a stage.
Another object of the present invention is to provide a musical tone generating apparatus which can provide a simple operation to reproduce the sound fields of musical instruments on respective stages.
In a first aspect of the invention, there is provided a musical tone generating apparatus comprising: a position information generating apparatus for generating musical instrument position information corresponding to positions of the musical instruments arranged on a stage of a performance place; an information converting apparatus for converting the musical instrument position information into musical tone parameter control information; a sound source apparatus for generating a musical tone source signal having a tone color corresponding to each of the musical instruments arranged on the stage; a musical tone control apparatus for controllably generating musical tone output signals corresponding to the musical tone parameter control information relative to the position of the musical instruments by receiving the musical tone source signal from the sound source apparatus; and an output apparatus for generating a musical tone from a plurality of output channels by receiving the musical tone output signal from the musical tone control apparatus so that a sound field is reproduced corresponding to the position of the musical instruments arranged on the stage.
The operator can set the position information of the musical instruments in the position information generating apparatus, so that even the apparent position of the musical instruments can be moved to a desired position.
The musical tone signal output can be read from a storage apparatus, or read from musical instruments.
In a second aspect of the invention, there is provided a musical tone generating apparatus comprising: a select apparatus for selecting a stage from among performance places; a storage apparatus for storing musical instrument position information which indicates a position of musical instruments arranged on a stage, and tone color indication information for indicating a tone color corresponding to each of the musical instruments; a reading apparatus for reading the musical instrument position information and the tone color indication information from the storage apparatus, in which both the musical instrument position information and the tone color indication information are selected by the select apparatus; an information converting apparatus for converting the musical instrument position information into musical tone parameter control information corresponding to a value of the plane coordinates and a variable which is determined by the value of the plane coordinates; a sound source apparatus for generating a musical tone source signal having a tone color corresponding to each of the musical instruments arranged on the stage; a musical tone control apparatus for controllably generating musical tone output signals in response to the musical tone parameter control information relative to the position of the musical instruments by receiving the musical tone source signal from the sound source apparatus; and an output apparatus for generating a musical tone from a plurality of output channels by receiving the musical tone output signal from the musical tone control apparatus so that a sound field is reproduced corresponding to the position of the musical instruments arranged on the stage.
The musical instrument position information can be in the form of preset information corresponding to a predetermined stage as well as tone color indication information.
FIG. 1 is a block diagram showing the construction of a musical tone generating apparatus of an embodiment;
FIG. 2 is a plan view showing the lay-out of select switches;
FIG. 3 is a plan view showing the lay-out of musical instruments arranged on a stage;
FIG. 4 is a diagram showing the control data lay-out of a memory;
FIG. 5(A) to FIG. 5(D) are diagrams showing the information memorized in ROM 18;
FIG. 6 is a diagram showing parameter control circuit 44;
FIG. 7 is a diagram showing reverberation circuit 64;
FIG. 8 is a flow chart showing a main routine of the musical tone generating apparatus;
FIG. 9 is a flow chart showing a subroutine of hall select switch HSS;
FIG. 10 is a flow chart showing a subroutine for initializing sound images;
FIG. 11 is a flow chart showing a subroutine for detecting a movement of sound images; and
FIG. 12 is a flow chart showing a subroutine for setting a feature of the information.
Hereinafter, an embodiment of the present invention is described by reference to the drawings.
FIG. 1 shows a circuit diagram of an electronic musical instrument in accordance with an embodiment, in which the electronic musical instrument is controlled by a microcomputer to generate a musical tone.
In FIG. 1, major components are connected to bus 10. These components are composed of keyboard circuit 12, a group of select elements 14, CPU (central processing unit) 16, ROM (read only memory) 18, RAM (random access memory) 20, a group of registers 22, floppy disk unit 24, display panel interface 26, touch panel interface 28, sound source interface 30, and externally input interface 32.
The group of select elements 14 comprises select elements for controlling a musical tone, for controlling a performance, and for controlling other functions, and the operation information of each select element is detected. These select elements are described later by reference to FIG. 2.
The group of registers 22 is used for the control processes when CPU 16 executes the control program.
Externally input interface 32 receives performance information corresponding to the operation of the keyboard, and performance information read from a memory device incorporated in the electronic musical instrument. This input performance information is supplied to distributing circuit 36 through sound source interface 30, together with performance information from keyboard circuit 12.
Distributing circuit 36 generates first sound source control information S1, second sound source control information S2, and third sound source control information S3 depending on the type of the musical instruments indicated by sound source control information TS. The first, second, and third sound source control information S1, S2, and S3 is supplied to first sound source control circuit (TG1) 38, second sound source control circuit (TG2) 40, and third sound source control circuit (TG3) 42, respectively. In addition, distributing circuit 36 receives musical tone parameter control information PD and reverberation control data RVD, both also contained in sound source control information TS, and this musical tone parameter control information PD and reverberation control data RVD are directly supplied to parameter control circuit 44.
In the sound source control information described above, first sound source control information S1 represents tone color indication data corresponding to musical instrument 1 (e.g. piano) and performance information based on the upper keyboard in operation, second sound source control information S2 represents other tone color indication data corresponding to musical instrument 2 (e.g. violin) and performance information based on the lower keyboard, and third sound source control information S3 represents other tone color indication data corresponding to musical instrument 3 (e.g. bass) and performance information based on the pedal keyboard.
In the above description, other performance information can be supplied from an external electronic musical instrument through externally input interface 32, sound source interface 30, and distributing circuit 36, instead of the performance information input from keyboard circuit 12 based on the upper keyboard, lower keyboard, and pedal keyboard, so that various types of electronic musical instruments can be used to play an ensemble, which can even be an automatic performance ensemble.
First sound source control circuit TG1 therefore supplies digital musical tone signal S11 to parameter control circuit 44 corresponding to first sound source control information S1, second sound source control circuit TG2 supplies digital musical tone signal S12 to parameter control circuit 44 corresponding to second sound source control information S2, and similarly, third sound source control circuit TG3 supplies digital musical tone signal S13 to parameter control circuit 44 corresponding to third sound source control information S3.
Musical tone signal AS(R) and musical tone signal AS(L) are supplied to right speaker 48R and left speaker 48L through amplifier 46R and amplifier 46L, respectively, to generate musical tones.
FIG. 2 shows a lay-out of the select elements, each of which is related to this embodiment, and each of which is arranged in the group of select elements 14.
In FIG. 2, performance mode switch PMS is used for indicating the normal performance mode; that is, a manual performance (or an automatic performance) can be carried out without reproducing the sound field of a selected concert hall when it is depressed. Upon depression, light-emitting element PML, which is mounted beside performance mode switch PMS, is turned on.
Hall select switch HSS comprises N switches, which are laterally arranged in the panel, adjacent to which are respective light-emitting elements HSL. Accordingly, when one of the hall select switches HSS is depressed to select a particular concert hall, the corresponding light-emitting element HSL is turned on. The manual performance (or the automatic performance) is then carried out with reproduction of a sound field for the concert hall which is selected by the hall select switch HSS.
On the other hand, when the previously depressed hall select switch HSS corresponding to the turned-on light-emitting element HSL is depressed again, that light-emitting element HSL is turned off, and light-emitting element PML is also turned off to terminate the manual performance.
FIG. 3 shows a plan view of musical instrument position setting device 34 which comprises a transparent touch panel 34B having matrix-arranged switches, and display panel 34A arranged behind touch panel 34B.
After roughly inputting the positions of all musical instruments in display panel 34A, the positions can be adjusted by touching a finger within musical instrument display frame FLM in touch panel 34B corresponding to, for example, the piano position, and moving the finger to a desired position to set the piano in position. At this time, musical instrument display frame FLM, musical instrument name INM, and musical instrument symbol ISY move with the movement of the finger contact point. When the finger stops moving, the display position of the piano is finally set. Similarly, the positions of the violin and bass can also be set in the same manner as described above. Thus, the positions of the musical instruments can be selectively and readily arranged, as if on the stage of a concert hall, by touching and moving the finger over the surface of touch panel 34B.
FIG. 4 shows a format of display control data stored in a floppy disk. The display control data is composed of hall index data and hall data. Hall index data is composed of hall 1 (e.g. a small concert hall), hall 2 (e.g. a large concert hall), hall 3 (e.g. an outdoor stage), and hall N (e.g. a jazz club house). Hall data is composed of hall characteristic data and musical instrument data. This hall data is described later.
For example, when hall 1 is selected by one of the hall select switches HSS, floppy disk unit 24 reads the display control data from the floppy disk, and then writes it into RAM 20 with the format shown in FIG. 4.
The hall data has identification data ID followed by hall characteristic data and musical instrument data. This hall data is used for hall 1. The hall characteristic data is composed of a value of bytes K0 occupied by hall name data HNMD, a value of bytes L0 occupied by hall symbol data HSYD, a value of bytes M0 occupied by reverberation control data RVD, as well as actual hall name data HNMD indicated by a hall name, actual hall symbol data HSYD indicated by a hall symbol, and actual reverberation control data RVD which controls the reverberative effect. The term HAD0 represents a head address of RAM 20 when the hall characteristic data is written into RAM 20. Corresponding to the head address HAD0, hall name data HNMD, hall symbol data HSYD, and reverberation control data RVD are read from RAM 20 depending on the respective value of bytes occupied by the respective HNMD, HSYD, and RVD.
Musical instrument data is composed of data of musical instrument 1 (e.g. a piano), data of musical instrument 2 (e.g. a violin), and data of musical instrument 3 (e.g. a bass).
Data of musical instrument 1 is composed of data which indicates a value of bytes K1 occupied by musical instrument name data INMD, data which indicates a value of bytes L1 occupied by musical instrument symbol data ISYD, and data which indicates a value of bytes M1 occupied by tone color indication data TSD, as well as actual musical instrument name data INMD, actual musical instrument symbol data ISYD, actual tone color indication data TSD which indicates a tone color (e.g. the tone color of the piano) of the musical instrument, data which indicates the musical instrument position in the x direction (x1), and data which indicates the musical instrument position in the y direction (y1). The term HAD1 represents a head address of RAM 20 when the data of musical instrument 1 is written into RAM 20. Corresponding to the head address HAD1, musical instrument name data INMD, musical instrument symbol data ISYD, and tone color indication data TSD are read from RAM 20 depending on the respective number of bytes occupied by the respective INMD, ISYD, and TSD data; and musical instrument position data (x1, y1) is read from RAM 20, in which X axis component x1 is stored in storage area X1, and Y axis component y1 is stored in storage area Y1.
Data of musical instruments 2 and 3 is handled similarly to the data of musical instrument 1 described above; details are therefore omitted for the sake of simplicity.
With the terms HAD2 and HAD3 representing head addresses, data of musical instruments 2 and 3 is read from RAM 20, and musical instrument position data (x2, y2) and (x3, y3) indicates the positions of musical instruments 2 and 3, respectively. This musical instrument position data (x2, y2) and (x3, y3) is not shown in FIG. 4, but X axis components x2 and x3 are stored in storage areas X2 and X3, and Y axis components y2 and y3 are stored in storage areas Y2 and Y3, respectively. These (x2, y2) and (x3, y3) components indicate musical instrument position data read from RAM 20, not musical instrument position data PS transferred from musical instrument position setting device 34.
FIG. 5(A) to FIG. 5(D) show five types of musical tone parameter control information PD stored in respective memory portions of ROM 18.
One of the memory portions stores information as shown in FIG. 5(A). This information is composed of a normalized value Py which indicates the value of the y coordinate of a musical instrument on the stage of the hall, and a first multiplication constant MP1 which determines the position of a sound image in a y direction of the stage. The first multiplication constant MP1 is directly proportional to the normalized value Py. Thus, when Py=1 and MP1=1, a sound image is produced corresponding to a musical instrument positioned at the most front side of the stage.
Another memory portion stores information as shown in FIG. 5(B). This information is composed of the normalized value Py which indicates the value of the y coordinate of a musical instrument on the stage of the hall, and a fourth multiplication constant MP4 which determines the magnitude of a reverberative effect in the y direction of the stage. The fourth multiplication constant MP4 is inversely proportional to the normalized value Py. Thus, when Py=0 and MP4=1, a reverberative effect can be produced corresponding to a musical instrument positioned at the most rear side of the stage.
Another memory portion stores information as shown in FIG. 5(C). This information is composed of a normalized value Py which indicates the value of the y coordinate of a musical instrument, and a filtering constant CF which determines a cut-off frequency of a low-pass filter. The filtering constant CF is directly proportional to the normalized value Py. When Py=1 and CF=fs/2 (fs is a sampling frequency for digital musical tone signals), the sound spreads to high tones, corresponding to a musical instrument positioned at the most front side of the stage.
Another memory portion stores information as shown in FIG. 5(D). This information is composed of a normalized value Px which indicates the value of the x coordinate of a musical instrument, and second and third multiplication constants MP2 and MP3 which determine the position of a sound image in the right and left directions of the stage. The multiplication constant MP2 is directly proportional to the normalized value Px as shown by “L2”, while the multiplication constant MP3 is inversely proportional to the normalized value Px as shown by “L3”. Thus, when Px=1, MP2=1, and MP3=0, a sound image is produced corresponding to a musical instrument which is positioned at the rightmost side of the stage. When Px=0, MP2=0, and MP3=1, a sound image is produced corresponding to a musical instrument which is positioned at the leftmost side of the stage.
On the other hand, the normalized value Py, which indicates the position of a musical instrument along the y coordinate, and the normalized value Px, which indicates the position of a musical instrument along the x coordinate, are both determined from the musical instrument position data (e.g. indicated by x1 and y1) read from RAM 20, and from musical instrument position data PS transferred from musical instrument position setting device 34.
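The relationships of FIG. 5(A) to FIG. 5(D) can be summarized as a handful of linear mappings from the normalized position to the control values. The following sketch is illustrative only: the function name, the exactly linear form of each proportionality, and the sampling frequency FS are assumptions, not taken from the specification.

```python
# Hypothetical sketch of the FIG. 5 mappings: the five types of musical
# tone parameter control information PD as functions of the normalized
# instrument position (Px, Py), each coordinate in the range 0.0 to 1.0.

FS = 44100.0  # assumed sampling frequency fs for the digital tone signals


def control_parameters(px, py):
    """Return (MP1, MP2, MP3, MP4, CF) for one normalized position."""
    mp1 = py              # FIG. 5(A): directly proportional to Py
    mp4 = 1.0 - py        # FIG. 5(B): inversely proportional to Py
    cf = py * (FS / 2.0)  # FIG. 5(C): cut-off rises toward the stage front
    mp2 = px              # FIG. 5(D), line L2: right-channel weight
    mp3 = 1.0 - px        # FIG. 5(D), line L3: left-channel weight
    return mp1, mp2, mp3, mp4, cf
```

At the corners this reproduces the cases named in the text, e.g. Py=1 gives MP1=1 (a sound image at the most front side of the stage) and Px=0 gives MP3=1 (a sound image at the leftmost side).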
FIG. 6 shows parameter control circuit 44. This parameter control circuit 44 comprises three parameter controllers CN1, CN2, and CN3. These parameter controllers CN1, CN2, and CN3 receive digital musical tone signals S11, S12, and S13 from first sound source control circuit TG1, second sound source control circuit TG2, and third sound source control circuit TG3, respectively. Since parameter controllers CN1, CN2, and CN3 are identical in construction, only parameter controller CN1 is described in this embodiment.
Digital musical tone signal S11 is supplied to multiplier 50 to be multiplied by first multiplication constant MP1. The multiplication value output from multiplier 50 is supplied to low-pass filter 52, whose cut-off frequency is controlled corresponding to filtering constant CF.
A value output from low-pass filter 52 is supplied to multiplier 54 to be multiplied by second multiplication constant MP2, then supplied to multiplier 56 to be multiplied by third multiplication constant MP3, and also supplied to multiplier 58 to be multiplied by fourth multiplication constant MP4.
Multiplied values output from multipliers 54 and 56 are supplied to adders 60 and 62, respectively, while a multiplied value output from multiplier 58 is supplied to reverberation circuit 64.
FIG. 7 shows reverberation circuit 64. Input data IN is supplied to adder ADD, and data output from adder ADD is supplied to delay circuit DL. Data output from delay circuit DL is supplied to multiplier MPL, and then data output from multiplier MPL is supplied to adder ADD as a feedback. Delay control data RVD1 which is a part of reverberation control data RVD is supplied to delay circuit DL to set a delay time, and multiplication constant data RVD2 is supplied to multiplier MPL to be multiplied by the data output from delay circuit DL, so that output data OUT is output from delay circuit DL with a reverberative effect assigned.
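The feedback loop of FIG. 7 is a recursive (comb-filter) structure: the adder output is delayed, and the delayed output is scaled by the multiplier and fed back to the adder. A minimal sketch, assuming an integer sample delay for delay control data RVD1 and a scalar gain for multiplication constant data RVD2 (the helper name and the `tail` parameter are illustrative):

```python
def reverberate(samples, delay, gain, tail):
    """Recursive comb filter modeling FIG. 7:
    out[n] = in[n - delay] + gain * out[n - delay],
    where delay plays the role of RVD1 and gain the role of RVD2."""
    n_out = len(samples) + tail  # extra samples let the echo tail ring out
    out = [0.0] * n_out
    for n in range(n_out):
        # delayed input arriving from delay circuit DL
        x = samples[n - delay] if delay <= n < len(samples) + delay else 0.0
        # feedback path through multiplier MPL back into adder ADD
        fb = gain * out[n - delay] if n >= delay else 0.0
        out[n] = x + fb
    return out
```

A unit impulse therefore produces echoes spaced `delay` samples apart, each attenuated by `gain`, which is the decaying repetition characteristic of a reverberative effect.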
Output data OUT is supplied to both adders 60 and 62 to be added to the data output from multipliers 54 and 56, respectively.
Data output from adder 60 is digital musical tone signal SR1 for the right channel, which is supplied to adder 66, while data output from adder 62 is digital musical tone signal SL1 for the left channel, which is supplied to adder 70.
Digital musical tone signals SR2 and SR3 for the right channel are also supplied from parameter controllers CN2 and CN3 to adder 66 to be added to digital musical tone signal SR1. In addition, digital musical tone signals SL2 and SL3 for the left channel are supplied from parameter controllers CN2 and CN3 to adder 70 to be added to digital musical tone signal SL1.
Added data output from adder 66 is converted into analog musical tone signal AS(R) for the right channel by D-A converter 68 and output to a speaker. Added data output from adder 70 is likewise converted into analog musical tone signal AS(L) for the left channel by D-A converter 72 and output to a speaker.
According to FIG. 6, in multiplier 50, the sound image can be moved in the y direction of the stage shown in FIG. 3, when first multiplication constant MP1 is changed with respect to normalized value Py which indicates the y coordinate of the musical instrument as shown in FIG. 5(A).
In low-pass filter 52, the fine variation of tone color can be produced corresponding to the position of the musical instrument in the y direction of the stage, when filtering constant CF is changed with respect to normalized value Py which indicates the y coordinate of the musical instrument as shown in FIG. 5(C).
In multipliers 54 and 56, a sound image can be moved in the x direction of the stage as shown in FIG. 3, when second and third multiplication constants MP2 and MP3 are changed with respect to normalized value Px which indicates the x coordinate of the musical instrument as shown in FIG. 5(D).
In multiplier 58, the magnitude of reverberative effect can be adjusted in the y direction of the stage, when fourth multiplication constant MP4 is changed with respect to normalized value Py which indicates the y coordinate of the musical instrument as shown in FIG. 5(B).
In this embodiment, adders 60, 62, 66, and 70 electrically mix the adjusted musical tone signals and output the mixed musical tone signals to two speakers. However, several musical tones can also be mixed in the air space by using several speakers, in which case the number of adders can be reduced.
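Ignoring low-pass filter 52 for brevity, the signal path of one parameter controller CNi in FIG. 6 can be sketched as follows. This is an illustrative model only: the function name, the list-based signals, and the `reverb` callable (any function mapping a sample list to a sample list, such as a comb filter) are assumptions.

```python
def parameter_controller(signal, mp1, mp2, mp3, mp4, reverb):
    """Model of one parameter controller CNi of FIG. 6, with
    low-pass filter 52 omitted. Returns (right, left) signals."""
    depth = [s * mp1 for s in signal]        # multiplier 50: y-direction depth
    wet = reverb([s * mp4 for s in depth])   # multiplier 58 -> reverberation 64
    right = [d * mp2 + w for d, w in zip(depth, wet)]  # multiplier 54, adder 60
    left = [d * mp3 + w for d, w in zip(depth, wet)]   # multiplier 56, adder 62
    return right, left
```

The right outputs SR1 to SR3 of the three controllers are then summed (adder 66) and the left outputs SL1 to SL3 likewise (adder 70) before D-A conversion.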
The group of registers 22 is described next for use in this embodiment.
(1) Mode register MOD: this register stores from “0” to “2”, “0” for a normal performance mode, “1” for a musical instrument position setting mode, and “2” for a performance mode having a reproduction of a sound field (referred to as a reproduction performance mode in the following).
(2) Switch number register SNO: this register stores a switch number (1 to N) of hall select switch HSS when hall select switch HSS is turned on.
(3) Switch flags SFL1 to SFLn: these registers set “1” to a flag corresponding to a hall select switch HSS (1 to N) when hall select switch HSS is turned on.
(4) Head address registers ADR0 to ADR3: these registers are for storing head addresses HAD0 to HAD3 shown in FIG. 4.
(5) x coordinate register Px: this register is for storing the normalized value Px which indicates the x coordinate.
(6) y coordinate register Py: this register is for storing the normalized value Py which indicates the y coordinate.
(7) Control variable register i: this register is for storing a control variable i.
FIG. 8 shows the flow chart of a main routine which is started by turning on a power switch.
In step 80, an initialize routine is executed to initialize each register.
In step 82, a “0” is set in mode register MOD for the normal performance mode. This makes light-emitting element PML turn on.
In step 84, the process decides whether mode register MOD is “0” or “2” (the performance mode). When this decision is “Y”, the process moves to step 86, otherwise it moves to step 94.
In step 86, the process decides whether keyboard circuit 12 has a key-on event of the keyboard or not. When this decision is “Y”, the process moves to step 88, otherwise it moves to step 90.
In step 88, the process executes a tone generation. That is, a key-on signal and key data corresponding to the depressed key detected by keyboard circuit 12 are supplied to the corresponding sound source to generate a musical tone, then the process moves to step 90.
In step 90, the process decides whether keyboard circuit 12 has a key-off event of the keyboard or not. When this decision is “Y”, the process moves to step 92, otherwise it moves to step 94.
In step 92, the process executes a reduction of sound, that is, the key-off signal and the key data for a released key are supplied to the sound source corresponding to the keyboard which made the key-off event to start reduction of the musical tone corresponding to the released key, then the process moves to step 94.
In step 94, the process decides whether hall select switch HSS has an on-event or not. When this decision is “Y”, the process moves to step 96, otherwise it moves to step 98.
In step 96, a subroutine is executed for the ON-state of hall select switch HSS, then the process moves to step 98. Details of this subroutine are described later by reference to FIG. 9.
In step 98, another process is executed such as a setting process of a tone color, tone volume, and the like, then the process moves back to step 84 to repeat the processes.
FIG. 9 shows the flow chart of a subroutine when one of the hall select switches HSS is turned on.
In step 100, the number n of the hall select switch HSS which is turned on is set in switch number register SNO, then the process moves to step 102.
In step 102, the process decides whether mode register MOD is “2” (reproducing performance mode) or not. When this decision is “Y”, the process moves to step 104, otherwise it moves to step 108.
In step 104, the process decides whether switch flag SFLn is “1” or not (i.e. whether the sound field of the stage corresponding to the value n set in switch number register SNO is being reproduced). When this decision is “Y”, the process moves to step 106, otherwise it moves to step 108.
In step 106, a “0” is set in mode register MOD, and light-emitting element PML is turned on. A “0” is set in the respective switch flags SFL1 to SFLn to turn the light-emitting elements HSL off. Afterwards, the process returns to the main routine shown in FIG. 8. In this case, the hall select switch HSS whose sound field corresponding to the value n is being reproduced has been turned on again, so the reproduction mode is canceled to return to the normal performance mode.
In step 108, a “1” is set in mode register MOD and light-emitting element PML is turned off, then the process moves to step 110. The mode is changed from the normal performance mode to the musical instrument position setting mode when the process has come from step 102, and from the reproducing performance mode to the musical instrument position setting mode when it has come from step 104.
In step 110, a “1” is set in switch flag SFLn to turn light-emitting element HSL on. A “0” is also set in the switch flags SFL other than switch flag SFLn to turn the respective light-emitting elements HSL off; the selected stage is thus indicated by the light-emitting element corresponding to the one hall select switch HSS which is turned on. The process then moves to step 112.
In step 112, display control data for the selected stage is written into RAM 20 from the floppy disk, then the process moves to step 114.
In step 114, head addresses HAD0 to HAD3 shown in FIG. 4 are set in head address registers ADR0 to ADR3, then the process moves to step 116.
In step 116, an initialized display is indicated in display panel 34A, then the process moves to step 118. That is, hall name data HNMD and hall symbol data HSYD are read from RAM 20, in which the data is a part of the hall characteristic data corresponding to the selected stage, then hall name HNM and hall symbol HSY are indicated in a predetermined position of display panel 34A based on that data. When hall name data HNMD is read from RAM 20, a “3” is added to head address HAD0 which is set in address register ADR0 to indicate the head address, and then hall name data HNMD is read depending on a value of bytes K0. When hall symbol data HSYD is read from RAM 20, the value of bytes K0 is added to address “HAD0+3” to indicate the head address of hall symbol data HSYD, hall symbol data HSYD is therefore read depending on a value of bytes L0.
After displaying hall name HNM and hall symbol HSY, musical instrument name data INMD, musical instrument symbol data ISYD, and musical instrument position data (e.g. each value of the x1 and y1 values) are read from RAM 20, and display data for a musical instrument is therefore formed consisting of the musical instrument name INM and musical instrument symbol ISY, both surrounded by musical instrument display frame FLM indicated in display panel 34A.
Display data for the other two musical instruments is also formed from similar data as described above and indicated in display panel 34A.
Reading the musical instrument data from RAM 20 is described for the case of musical instrument 1. The head address is indicated by adding a “3” to head address HAD1 which is set in address register ADR1, so that musical instrument name data INMD is read corresponding to the value of bytes K1. This value of bytes K1 is added to “HAD1+3” to indicate the head address of musical instrument symbol data ISYD, then this musical instrument symbol data ISYD is read depending on the value of bytes L1. The values of bytes L1 and M1 (the latter occupied by tone color indication data TSD) are then added to the address “HAD1+3+K1” to indicate the head address of the musical instrument position data, then each of the values x1 and y1 is, in turn, read from RAM 20.
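The address arithmetic above amounts to parsing a variable-length record: three length bytes, three variable-length fields, then the position pair. A hypothetical sketch, treating RAM 20 as a flat list of byte values (any layout detail beyond what FIG. 4 states is an assumption):

```python
def read_instrument(ram, had):
    """Parse one musical instrument record starting at head address had:
    length bytes K, L, M at offsets 0..2, then name (K bytes),
    symbol (L bytes), tone color data (M bytes), then x and y."""
    k, l, m = ram[had], ram[had + 1], ram[had + 2]
    p = had + 3                      # "HAD + 3" indicates the name data
    name = ram[p:p + k]; p += k      # INMD, K bytes
    symbol = ram[p:p + l]; p += l    # ISYD, L bytes
    tone = ram[p:p + m]; p += m      # TSD, M bytes
    x, y = ram[p], ram[p + 1]        # position data (x, y)
    return name, symbol, tone, (x, y)
```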
In step 118, a sound image initialization is executed as shown in FIG. 10 which is described later.
In step 120, a sound image movement described by reference to FIG. 11 is executed, then the process returns to the main routine shown in FIG. 8.
FIG. 10 shows the sound image initialization.
In step 122, reverberation control data RVD is read from RAM 20 to set in reverberation circuit 64. When reverberation control data RVD is read from RAM 20, a value of bytes L0 of hall symbol data HSYD is added to address “HAD0+3+K0” to indicate the head address of reverberation control data RVD, then reverberation control data RVD is read depending on the value of bytes of M0, then the process moves to step 124.
In step 124, a “1” is added to control variable register i, then the process moves to step 126.
In step 126, the process decides whether the value of control variable register i is greater than “3” or not. When this decision is “N”, the process moves to step 128, otherwise it returns to the subroutine shown in FIG. 9.
In step 128, tone color indication data TSD for musical instrument i is read from RAM 20 and set in sound source control circuit TGi (i = 1 to 3). When tone color indication data TSD is read from RAM 20 (described here for musical instrument 1), the value of bytes L1 corresponding to musical instrument symbol data ISYD is added to the address “HAD1+3+K1” to indicate the head address of tone color indication data TSD, then this tone color indication data TSD is read depending on the value of bytes M1, then the process moves to step 130.
In step 130, a characteristic setting of the musical instrument is executed by a subroutine which is described later by reference to FIG. 12, then the process moves to step 132.
In step 132, control variable register i is incremented by “1”, then the process returns to step 126 to repeat step 126 to step 132 until control variable i is greater than “3”.
When control variable i is greater than “3”, the tone color setting and characteristic setting processes for the three musical instruments are terminated.
FIG. 11 shows a subroutine for the sound image movement.
In step 140, the process decides whether musical instrument position data (the x and y coordinates) is indicated in touch panel 34B, or not. When this decision is “Y”, the process moves to step 142, otherwise it moves to step 158.
In step 142, a “1” is added to control variable register i, then the process moves to step 144.
In step 144, the process decides whether each of the values for the x and y coordinates is indicated within musical instrument display frame FLM or not. When this decision is “Y”, then the process moves to step 146, otherwise it moves to step 154.
In step 146, each value of the x and y coordinates is written into storage area Xi and Yi of RAM 20, respectively, then the process moves to step 148.
In step 148, the display position of musical instrument i is changed to a desired position in display panel 34A corresponding to each value of the Xi and Yi coordinates, then the process moves to step 150.
In step 150, the characteristic setting is executed by a subroutine which is described later by reference to FIG. 12, then the process moves to step 152.
In step 152, the process decides whether the musical instrument position data is indicated in touch panel 34B or not. When this decision is “Y”, then the process returns to step 146 to repeat step 146 to step 152. Thus, each value of the Xi and Yi coordinates can be changed in response to the touch position of the finger while the finger keeps touching touch panel 34B and moves to another position in touch panel 34B, to set a desired position of a musical instrument in display panel 34A. When the decision of step 152 is “N”, the process moves to step 140 to repeat the processes described above.
After setting the position of musical instrument 1, if the finger then touches touch panel 34B to position musical instrument 2, the decision of step 144 is “N” because each value of the x and y coordinates is indicated in musical instrument display frame FLM of musical instrument 2, not of musical instrument 1. The process therefore moves to step 154.
In step 154, control variable register i is incremented by “1”, then the process moves to step 156.
In step 156, the process decides whether control variable i is greater than “3” or not. When this decision is “N”, the process returns to step 144.
On returning to step 144, the decision is “Y” so that each value of the x and y coordinates is indicated in musical instrument display frame FLM for musical instrument 2. The position of musical instrument 2 can then be established by executing step 146 to step 152.
Afterwards, if the finger touches touch panel 34B to position musical instrument 3, the decision of step 144 is “N”, so steps 154 and 156 have to be executed twice after executing steps 140 and 142 before the process moves to step 146. Thus, the position of musical instrument 3 can be established by step 146 to step 152.
In touch panel 34B, when the finger touches an area which is not a part of any musical instrument display frame FLM, the decision of step 156 becomes “Y” after step 154 is executed three times, and the process then returns to step 140.
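Steps 142 to 156 amount to a hit test: scan the display frames in order until one contains the touch point, or give up after the third frame. A hedged sketch with axis-aligned rectangles (representing a frame FLM as a rectangle tuple is an assumption of this sketch):

```python
def find_touched_instrument(x, y, frames):
    """Return the 1-based index i of the musical instrument display
    frame FLM containing the touch point (x, y), or None when the
    touch lies outside every frame. frames is a list of
    (x0, y0, x1, y1) rectangles."""
    for i, (x0, y0, x1, y1) in enumerate(frames, start=1):
        if x0 <= x <= x1 and y0 <= y <= y1:
            return i           # steps 144-146: frame i is being moved
    return None                # step 156 exceeded: no frame touched
```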
On the other hand, when the finger does not touch touch panel 34B, the decision of step 140 is “N”, then the process moves to step 158.
In step 158, the process decides whether performance mode switch PMS indicates an on-event or not. When this decision is “N”, then the process returns to step 140, otherwise it moves to step 160.
Accordingly, if performance mode switch PMS is turned on after or before setting the position of at least one of the three musical instruments 1 to 3, the decision of step 158 is then “Y”, and the process moves to step 160.
In step 160, a “2” is set in mode register MOD to turn light-emitting element PML on. Thus, the performance mode is changed from the musical instrument position setting mode to the performance reproducing mode, which enables manual performance (or automatic performance) with reproduction of the sound field corresponding to the selected stage.
The musical instrument position established in steps 146 to 152 (each of the revised Xi and Yi values) can be transferred to a floppy disk driven by floppy disk unit 24.

FIG. 12 shows a subroutine of the characteristic setting. In step 170, normalized value Px, which is the result of dividing the value of the x coordinate stored in storage area Xi by the length W shown in FIG. 3, is set in x coordinate register Px. In addition, normalized value Py, which is the result of dividing the value of the y coordinate stored in storage area Yi by the length H shown in FIG. 3, is set in y coordinate register Py.
In step 172, the values Px and Py (the contents of registers Px and Py) are converted into the five types of musical tone parameter control information PD (first multiplication constant MP1 to fourth multiplication constant MP4, and filtering constant CF), then the data is set in each of parameter controllers CN1, CN2, and CN3 shown in FIG. 6.
As a result, in FIG. 10, the sound field of the selected stage is reproduced in response to the data read from RAM 20. In FIG. 11, the sound field of the selected stage is reproduced in accordance with the positions of musical instruments set by musical instrument position setting device 34.
In this embodiment, touch panel 34B is used for indicating the musical instrument position, but select elements such as a variable resistor, a switch, and the like can be used instead of touch panel 34B.
Also in this embodiment, the stage is selected in combination with the musical instruments, but the stage can also be selected separately from the musical instruments.
In addition, in the case where this invention is used for an automatic performance, the musical instrument position information can be stored in a storage area together with a plurality of performance information so that a sound image can be moved.
The preferred embodiment described herein is illustrative and not restrictive; the scope of the invention is indicated by the appended claims, and all variations which fall within the claims are intended to be embraced therein.
Claims (71)
1. A musical tone generating apparatus for providing a performance effect of a plurality of musical instruments arranged in a performance place, comprising:
tone color designating means for designating a tone color corresponding to each musical instrument arranged in the performance place;
position information generating means for generating musical instrument position information corresponding to a position of each musical instrument arranged at the performance place;
display means for displaying images of musical instruments at positions corresponding to the musical instrument position information;
information converting means for converting the musical instrument position information into musical tone parameter control information;
musical tone generating means for generating musical tone signals;
musical tone control means for controlling the musical tone signals in accordance with the musical tone parameter control information; and
output means for outputting musical tones in accordance with the controlled musical tone signals outputted from the musical tone control means.
2. A musical tone generating apparatus according to claim 1 , wherein the musical instrument position information comprises a value in a plane coordinate system and a variable which is determined by the value of the plane coordinates.
3. A musical tone generating apparatus according to claim 1 , wherein the information converting means comprises a CPU (central processing unit) having a control program, a ROM (read only memory), and a RAM (random access memory) to convert the musical instrument position information into the musical tone parameter control information, this musical tone parameter control information being transferred to the musical tone control means together with sound source control information.
4. A musical tone generating apparatus according to claim 1 wherein the position information generating means comprises a display means for displaying a position of musical instruments corresponding to the musical instrument position information.
5. A musical tone generating apparatus according to claim 1 , wherein the display means comprises a transparent type touch panel and a display panel arranged behind the touch panel for indicating the respective positions of the musical instruments.
6. A musical tone generating apparatus according to claim 2 , wherein the plane coordinate system is the x and y cartesian coordinate system, and each of the musical instrument positions is indicated by x and y cartesian coordinates.
7. A musical tone generating apparatus according to claim 5 , wherein a surface of the display means includes x and y coordinates thereon.
8. A musical tone generating apparatus according to claim 5 , wherein the musical tone control means comprises a parameter control circuit for generating analog musical tone signals output to the right and left channels.
9. A musical tone generating apparatus for providing a performance effect of a plurality of musical instruments arranged in a performance place, comprising:
tone color designating means for designating a tone color corresponding to each musical instrument arranged in the performance place;
position information generating means for generating musical instrument position information corresponding to a position of each musical instrument arranged at the performance place;
information converting means for converting the musical instrument position information into musical tone parameter control information;
musical tone control means for controlling musical tone parameters in accordance with the musical tone parameter control information; and
output means for outputting a musical tone in accordance with the musical tone parameter outputted from the musical tone control means, wherein the musical tone control means comprises a low pass filter and wherein the musical tone parameter control information comprises:
a first multiplication constant MP1 which is directly proportional to a normalized value Py, in which the normalized value Py indicates a position of a y coordinate in the stage, and the first multiplication constant MP1 determines a position in a y direction of the stage;
a fourth multiplication constant MP4 which is inversely proportional to the normalized value Py, in which the fourth multiplication constant MP4 determines a magnitude of a reverberative effect in the y direction of the stage;
a filtering constant CF which is directly proportional to the normalized value Py, in which the filtering constant CF determines a cut-off frequency of the low pass filter; and
a second multiplication constant MP2 which is directly proportional to a normalized value Px and a third multiplication constant MP3 which is inversely proportional to the normalized value Px, in which the second and third multiplication constants MP2 and MP3 determine the position of a sound image in an x direction of the stage, and the normalized value Px indicates a position of an x coordinate of the stage.
10. A musical tone generating apparatus comprising:
select means for selecting a stage among performance places;
storage means for storing musical instrument position information which indicates the position of a musical instrument arranged on the stage, and the tone color corresponding to the musical instrument;
reading means for reading the musical instrument position information and the tone color from the storage means, in which both the musical instrument position information and the tone color are selected by the select means;
display means for displaying images of musical instruments at positions corresponding to the musical instrument position information;
information converting means for converting the musical instrument position information into musical tone parameter control information in response to a value of the plane coordinates and a variable which is determined by the value of the plane coordinates;
musical tone control means for controlling musical tone parameters in accordance with the musical tone parameter control information; and
output means for outputting a musical tone in accordance with the musical tone parameter outputted from the musical tone control means.
11. A musical tone generating apparatus according to claim 10 , wherein the select means comprises select elements having variable resistors.
12. A musical tone generating apparatus according to claim 10 , wherein the storage means comprises a ROM.
13. A musical tone generating apparatus according to claim 10 , wherein the reading means is controlled by a computer program stored in a CPU (central processing unit) to read the musical instrument position information and the tone color from the storage means.
14. A musical tone generating apparatus according to claim 10 , wherein the select means comprises select elements having variable switches.
15. A musical tone generating apparatus according to claim 10 , wherein the storage means comprises a floppy disk.
16. A sound processing apparatus for moving a visual image position and a sound image position together, the sound processing apparatus comprising:
display means for displaying a visual image representing a sound source;
sound generating means including a tone generator and a plurality of loudspeakers for generating a sound corresponding to the sound source;
position information generating means for generating sound source position information;
control means for controlling the display means based on the position information so that the display means displays the visual image at a visual image position corresponding to the position information and for controlling the sound generating means based on the position information so that a sound image of the sound generated by the sound generating means is produced at a sound image position corresponding to the position information,
wherein the visual image position and the sound image position are updated in response to changes in the position information.
17. A sound processing apparatus according to claim 16 , wherein the sound source is a musical instrument having a given tone color.
18. A sound processing apparatus according to claim 16, wherein the position information generating means generates data representing an x-y position in a plane coordinate system as the position information.
19. A sound processing apparatus according to claim 18 , wherein the position information generating means includes a transparent touch panel disposed in front of the display means, and the x-y position is designated on the touch panel by a performer at the position of the visual image.
20. A sound processing apparatus according to claim 16 , wherein the position information generating means includes a variable resistor and the position information is designated with the variable resistor by a performer.
21. A sound processing apparatus according to claim 16 , further comprising:
reverberation effect imparting means for imparting a reverberation effect, based on the position information, to the sound generated by the sound generating means to impart a sound image effect to the sound generated.
22. A sound processing apparatus according to claim 21 , wherein the reverberation effect imparting means controls the amount of the reverberation effect to be imparted in response to the position information.
23. A sound processing apparatus according to claim 21 , wherein the sound generating means generates a sound corresponding to data input through an external interface.
24. A sound processing apparatus according to claim 16 , further comprising:
sound modification means for modifying the sound generated by the sound generating means based on the position information so that the sound generating means generates the sound modified in response to the position information, wherein the sound modification means modifies a sound image effect in response to the position information.
25. A sound processing apparatus according to claim 24 , wherein the sound modification means includes a low pass filter.
26. A sound processing apparatus according to claim 25 , wherein the sound modification means controls the cut-off frequency of the low pass filter in response to the position information.
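Claims 21-26 describe modifying the generated sound (reverberation amount, low-pass cut-off frequency) in response to the position information. A minimal sketch of the low-pass branch, assuming a one-pole filter and a linear position-to-cut-off law, neither of which is fixed by the claims:

```python
import math

def cutoff_from_position(py, cf_min=200.0, cf_max=8000.0):
    """Farther back on the stage (small Py) -> duller sound (lower cut-off).
    The linear law and the 200-8000 Hz range are illustrative assumptions."""
    return cf_min + (cf_max - cf_min) * py

def one_pole_lowpass(samples, cutoff_hz, sample_rate=44100.0):
    """Filter a sample sequence with a one-pole low-pass whose cut-off is
    set from the sound-source position; a stand-in for the claimed sound
    modification means (the filter topology is an assumption)."""
    # One-pole recurrence: y[n] = y[n-1] + a * (x[n] - y[n-1])
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    y, out = 0.0, []
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out
```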
27. A sound processing apparatus for generating a sound, the sound processing apparatus comprising:
a display for displaying a visual image representing a sound source located in an image space;
sound generating means including a tone generator for generating a sound corresponding to a sound characteristic of the sound source;
position information generating means for generating sound source position information representing a desired display position;
storage means for storing conversion characteristics;
information converting means for converting the position information to sound parameters in accordance with the conversion characteristics stored in the storage means; and
control means for controlling the display based on the position information so that the display displays the visual image at a visual image position corresponding to the position information, and for controlling the sound generating means based on the converted sound parameters so that a sound image of the sound signals generated by the sound generating means is produced at a sound source position corresponding to the position information.
28. A sound processing apparatus for generating a sound, the sound processing apparatus comprising:
sound generating means including a tone generator for generating a sound corresponding to a sound characteristic of the sound source;
storage means for storing sound source position information;
reading means for reading the sound source position information from the storage means;
control means for controlling the tone generator based on the sound source position information so that a sound image of the sound signals generated by the tone generator is produced at a position corresponding to the sound source position information; and
a display for displaying a visual image corresponding to the sound source in an image space, wherein the control means controls the display so that the display displays the visual image at a position corresponding to the sound source position information.
29. A sound processing apparatus according to claim 28 , wherein the storage means comprises a floppy disk which stores the sound source position information.
30. A sound processing apparatus according to claim 22 , wherein the storage means further stores symbol information representing visual images to be displayed, and the display means displays a visual image based on the symbol information.
31. A sound processing apparatus according to claim 22 , wherein the visual image has a shape of a corresponding sound source.
32. A sound processing apparatus according to claim 28 , further comprising:
position information generating means for generating position information in response to a player's operation, wherein the sound source position information is changed based on the generated position information, and the sound generating means and the display are controlled based on the changed sound source position information.
33. A sound processing apparatus for generating a sound, the sound processing apparatus comprising:
sound generating means including a tone generator for generating a sound corresponding to a sound characteristic of a sound source;
a display for displaying a visual image corresponding to a sound source in an image space;
storage means for storing sound source position information;
reading means for reading the sound source position information from the storage means;
control means for controlling the sound generating means based on the sound source position information so that a sound image of the sound generated by the sound generating means is produced at a position corresponding to the sound source position information; and
position information generating means for generating position information in response to a player's operation, wherein the sound source position information is changed based on the generated position information, and the control means controls the sound generating means and the display means based on the changed sound source position information.
34. A sound processing apparatus according to claim 33, wherein the position information generating means generates data representing an x-y position in a plane coordinate system as the position information.
35. A sound processing apparatus according to claim 34 , wherein the position information generating means includes a transparent touch panel disposed on the display means, and the x-y position is designated on the touch panel by a performer.
36. A sound processing apparatus according to claim 34 , wherein the position information generating means includes a variable resistor, wherein the x-y position is responsive to changes in the variable resistor.
37. A sound processing apparatus for generating a sound, the sound processing apparatus comprising:
a sound generator including a tone generator and speakers that generates a sound corresponding to a sound characteristic of a sound source;
an external memory device which stores sound source position information;
a reading circuit which reads the sound source position information from the external memory device and which causes the sound source position information to be stored within a local memory;
a controller coupled to the sound generator and responsive to the sound source position information, wherein the controller controls a position of a sound image of the sound generated by the sound generator in response to the sound source position information;
a display for displaying a visual image corresponding to a sound source located in an image space; and
a position information generator which generates position information in response to a player's operation, wherein the sound source position information is changed based on the generated position information, and the controller controls the sound generator based on the changed sound source position information.
38. A sound processing apparatus according to claim 37 , wherein the external memory device is a floppy disk and the reading circuit is a floppy disk control unit.
39. A sound processing apparatus according to claim 37 , further comprising:
a display which displays a visual image corresponding to a sound source located in an image space;
wherein the controller is responsive to the sound source position information so that the visual image appears at a position corresponding to the sound source position information.
40. A sound processing apparatus according to claim 39, wherein the external memory device stores symbol information representing visual images to be displayed, and the visual image is displayed based on the symbol information read out by the reading circuit.
41. A method of moving a visual image position and a sound image position simultaneously, the method comprising the steps of:
displaying a visual image representing a sound source located in an image space on a display;
generating a sound corresponding to a sound source;
generating position information in response to player control;
controlling the display based on the position information so that the display produces a visual image corresponding to a sound source at a visual position corresponding to the position information; and
controlling a sound generator which includes a tone generator and speakers based on the position information so that the sound generator produces a sound image corresponding to the sound source at a sound image position corresponding to the position information;
wherein the visual image position and the sound image position are updated in response to changes in the position information.
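The method steps of claim 41 amount to driving the display and the sound generator from a single shared position value. A sketch, assuming equal-power panning and hypothetical `display` and `mixer` callables standing in for the claimed display and sound generator:

```python
import math

def equal_power_pan(px):
    """Left/right gains from a normalized x position (0 = left, 1 = right).
    Equal-power (rather than linear) gains are an implementation assumption."""
    theta = px * math.pi / 2.0
    return math.cos(theta), math.sin(theta)

def update(position, display, mixer):
    """One step of the claimed method: the same position information moves
    both the visual image and the sound image. `display` and `mixer` are
    hypothetical callables, not names from the patent."""
    px, py = position
    display(px, py)                    # move the visual image
    left, right = equal_power_pan(px)  # move the sound image with it
    mixer(left, right)
    return left, right
```

Because both controls consume the same position value, the visual image position and the sound image position update together whenever the position information changes, as the claim requires.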
42. A method of sound processing comprising the steps of:
displaying a visual image representing a sound source located in an image space on a display;
generating, with a tone generator, sounds corresponding to the sound source;
reading conversion characteristics from a storage device;
generating sound source position information corresponding to a desired display position;
converting the position information to sound parameters in accordance with the characteristics read from the storage device;
controlling the display based upon the position information so that the display displays the visual image at a visual image position corresponding to the position information; and
controlling the tone generator based on the converted sound parameters so that the tone generator produces a sound image at a sound source position corresponding to the position information.
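Claims 27 and 42 recite converting the position information to sound parameters "in accordance with conversion characteristics" read from storage. One plausible realization, assumed here, is a stored breakpoint table with linear interpolation between entries:

```python
import bisect

def convert(position, table):
    """Convert position information to a sound parameter using stored
    conversion characteristics, modeled here as a sorted breakpoint table
    [(position, value), ...] with linear interpolation between entries.
    The table representation and interpolation are assumptions."""
    xs = [p for p, _ in table]
    ys = [v for _, v in table]
    if position <= xs[0]:       # clamp below the first breakpoint
        return ys[0]
    if position >= xs[-1]:      # clamp above the last breakpoint
        return ys[-1]
    i = bisect.bisect_right(xs, position)
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (position - x0) / (x1 - x0)
```

A table such as `[(0.0, -1.0), (0.5, 0.0), (1.0, 1.0)]` then acts as a pan curve, mapping an x position to a left/right balance value.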
43. A method of sound processing comprising the steps of:
generating a sound signal in response to a designation and generating a sound based upon the sound signal corresponding to a sound characteristic of a sound source;
storing sound source position information into an external storage;
reading out the sound source position information from the external storage so that the sound source position information is stored within a local memory;
controlling a sound generator which includes a tone generator and speakers in accordance with the locally stored sound source position information so that the sound generator generates a sound at a sound image position corresponding to the sound source position information;
displaying a visual image representing a sound source located in an image space on a display; and
generating position information in response to a player's operation and changing the sound source position information based on the generated position information, and wherein said controlling step controls the sound generator based on the changed sound source position information.
44. A method of sound processing according to claim 43 , the method further comprising the steps of:
controlling a display in accordance with the sound source position information so that a visual image displayed by the display is provided at a position corresponding to the sound source position information.
45. A method of sound processing according to claim 44 , the method further comprising the step of:
storing symbol information representing visual images to be displayed into the external storage,
wherein the visual image is displayed based on the symbol information.
46. A sound processing apparatus for moving a visual image position and a sound image position together, the sound processing apparatus comprising:
a display that displays a visual image corresponding to a sound source located in an image space;
a sound generator having a tone generator and a plurality of loudspeakers for generating a sound corresponding to the sound source;
a position information generator that generates position information;
a controller that controls the display based on the position information so that the display displays the visual image at a visual position corresponding to the position information and that controls the sound generator based on the position information so that a sound image of the sound generated by the sound generator is produced at a sound position corresponding to the position information;
wherein the visual image and the sound image are moved together in response to changes in the position information.
47. A sound processing apparatus according to claim 46 , further comprising:
a reverberation effect imparting circuit that imparts a reverberation effect, based on the position information, to the sound generated by the sound generator.
48. A sound processing apparatus according to claim 46 , further comprising
a sound modifier that modifies the sound generated by the sound generator based on the position information so that the sound generator generates the sound modified in response to the position information.
49. A sound processing apparatus according to claim 46 , wherein the sound source is a musical instrument having a given tone color.
50. A sound processing apparatus according to claim 46, wherein the position information generator generates data representing an x-y position in a plane coordinate system as the position information.
51. A sound processing apparatus according to claim 50 , wherein the position information generator includes a transparent touch panel disposed in front of the display, and the x-y position is designated on the touch panel by a performer at the position of the visual image.
52. A sound processing apparatus according to claim 46 , wherein the position information generator includes a variable resistor and the position information is designated with the variable resistor by a performer.
53. A sound processing apparatus comprising:
a sound generator including a tone generator and speakers that generates a sound corresponding to a sound source;
a display that displays a visual image corresponding to a sound source located in an image space;
a position information generator that generates sound source position information corresponding to a display position;
a storage medium that stores conversion characteristics;
an information converter that converts the position information to sound parameters in accordance with the conversion characteristics stored in the storage medium; and
a sound image position controller that controls the sound generator based on the converted sound parameters so that the sound image of the sound generated by the sound generator is produced at a sound source position corresponding to the position information.
54. A sound processing apparatus comprising:
a sound generator including a tone generator and speakers that generates a sound corresponding to a sound characteristic of a sound source;
a storage medium that stores sound source position information;
a reader that reads the sound source position information from the storage medium;
a controller that controls the sound generator based on the sound source position information so that the sound image of the sound generated by the sound generator is produced at a position corresponding to the sound source position information; and
a display that displays a visual image corresponding to the sound source, wherein the controller controls the display to display the visual image at a position corresponding to the sound source position information.
55. A sound processing apparatus comprising:
a sound generator including a tone generator driving a plurality of loudspeakers with a sound signal;
an external memory device which stores sound source position information;
a reading circuit which reads the sound source position information from the external memory device;
a controller coupled to the sound generator and responsive to the sound source position information, wherein the controller alters the position of the sound image generated by the sound generator in response to changes in the sound source position information; and
a display that displays a visual image corresponding to a sound source located in an image space.
56. A method of sound processing comprising the steps of:
generating a sound corresponding to a sound characteristic of a sound source;
storing sound source position information;
reading out the stored sound source position information;
controlling a sound generator which includes a tone generator and speakers in accordance with the sound source position information so that the sound generator produces a sound image at a position corresponding to the sound source position information; and
displaying a visual image corresponding to the sound source, wherein said controlling step causes the displaying step to display the visual image at a position corresponding to the sound source position information.
57. A sound processing apparatus comprising:
a display that displays a visual image corresponding to a sound source located in an image space;
a sound generator having a tone generator and a plurality of loudspeakers for generating a sound corresponding to the sound source;
a position information generator that generates position information; and
a controller that controls the display based on the position information so that the display displays the visual image at a visual position corresponding to the position information and controls the sound generator based on the position information so that a sound image of the sound generated by the sound generator is produced at a sound position corresponding to the visual position.
58. A sound processing apparatus according to claim 57 , further comprising:
a converter that converts the position information into sound parameter information.
59. Sound processing apparatus comprising:
a designating device that designates a plurality of sound sources;
a display that displays a plurality of visual images, located in an image space, each of which represents one of the plurality of sound sources;
a sound generator including a tone generator and a plurality of loudspeakers that generates a plurality of sounds each of which corresponds to one of the plurality of sound sources;
a position information generator that generates position information for each of the plurality of sound sources; and
a controller that controls the display based on the position information so that the display displays the plurality of visual images at visual positions corresponding to the position information and that controls the sound generator based on the position information so that sound images of the sounds generated by the sound generator are produced at sound positions corresponding to the position information.
60. Sound processing apparatus comprising:
a display having a display panel that displays a plurality of visual images, located in an image space, each of which has a shape representing one of a plurality of sound sources;
a sound generator including a tone generator and a plurality of loudspeakers that generates a plurality of sounds each of which corresponds to one of the plurality of sound sources;
a device with which a player provides a moving operation input for each of the plurality of sound sources; and
a controller that controls the display so that the display moves the plurality of visual images in response to the moving operation and that controls the sound generator so that sound images of the sounds generated by the sound generator are moved in response to the moving operation.
61. Sound processing apparatus comprising:
a selecting device that selects a performance space;
a sound generator including a tone generator that generates a sound;
an acoustic effector that imparts to the sound an acoustic effect corresponding to the selected performance space;
a display; and
a controller that controls the display so that the display displays a visual image representing the selected performance space.
62. Sound processing apparatus according to claim 61 , wherein the display is controlled to display a name of the selected performance space.
63. Sound processing apparatus according to claim 61 , wherein the sound generator simultaneously generates a plurality of sounds having a plurality of tone colors, and the display is controlled to display a plurality of visual images representing the plurality of tone colors.
64. Sound processing apparatus according to claim 63 , wherein the display is controlled to display names of the plurality of tone colors.
65. Sound processing apparatus according to claim 63, wherein the acoustic effector imparts a reverberative effect to the sound as the acoustic effect.
66. Sound processing apparatus according to claim 63 , wherein the acoustic effector imparts to the sound an effect, in which a sound image of the sound is produced at a desired position, as the acoustic effect.
67. Sound processing apparatus according to claim 66 , wherein the acoustic effector is controlled by the controller to move the position of the sound image.
68. Sound processing apparatus comprising:
a storage medium that stores a set of sound information and acoustic effect information for each of a plurality of performance spaces;
a selecting device that selects one of the plurality of performance spaces;
a display which displays information about the selected performance space and/or sound;
a sound generator including a tone generator that generates a sound based on the sound information which corresponds to the selected performance space and is read out from the storage medium; and
an acoustic effector that imparts an acoustic effect to the sound based on the acoustic effect information which corresponds to the selected performance space and is read out from the storage medium.
69. Sound processing apparatus according to claim 68 , wherein the storage medium stores reverberative information as the acoustic effect information, and the acoustic effector imparts a reverberation to the sound based on the reverberative information.
70. Sound processing apparatus according to claim 68 , wherein the storage medium stores position information as the acoustic effect information, and the acoustic effector imparts to the sound an effect in which a sound image of the sound is produced at a position corresponding to the position information read out as the acoustic effect information.
71. Sound processing apparatus according to claim 70, wherein the apparatus further comprises a position designating device which designates a desired position in response to a player's operation, and, when one of the plurality of performance spaces is selected by the selecting device, the sound image of the sound is produced at a position corresponding to the position information which corresponds to the selected performance space, and, when the desired position is designated by the position designating device, the sound image of the sound is produced at a position corresponding to the desired position.
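Claims 61-71 describe selecting a performance space whose stored acoustic-effect information (for example reverberation and sound-image position) then configures the sound, with a player-designated position taking precedence (claim 71). A minimal sketch; the space names, field names, and values below are illustrative, not taken from the patent:

```python
# Hypothetical per-space acoustic-effect records; contents are invented
# for illustration only.
PERFORMANCE_SPACES = {
    "concert_hall": {"reverb_time_s": 2.0, "image_position": (0.5, 0.8)},
    "jazz_club":    {"reverb_time_s": 0.6, "image_position": (0.4, 0.3)},
    "cathedral":    {"reverb_time_s": 5.0, "image_position": (0.5, 0.9)},
}

def select_space(name, override_position=None):
    """Return the acoustic-effect settings for a selected performance space.
    A player-designated position, when given, overrides the stored one,
    mirroring the precedence described in claim 71."""
    info = dict(PERFORMANCE_SPACES[name])   # copy so the store is unchanged
    if override_position is not None:
        info["image_position"] = override_position
    return info
```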
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/798,654 USRE38276E1 (en) | 1988-09-02 | 1997-02-11 | Tone generating apparatus for sound imaging |
Applications Claiming Priority (12)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP63-220010 | 1988-09-02 | ||
JP63220012A JP2605821B2 (en) | 1988-09-02 | 1988-09-02 | Music control device |
JP63-220009 | 1988-09-02 | ||
JP63220009A JP3089421B2 (en) | 1988-09-02 | 1988-09-02 | Sound processing device |
JP63220011A JP2629874B2 (en) | 1988-09-02 | 1988-09-02 | Music parameter controller |
JP63220010A JPH0267599A (en) | 1988-09-02 | 1988-09-02 | Musical sound generating device |
JP63-220011 | 1988-09-02 | ||
JP63-220012 | 1988-09-02 | ||
US07/401,158 US5027689A (en) | 1988-09-02 | 1989-08-31 | Musical tone generating apparatus |
US8481293A | 1993-06-29 | 1993-06-29 | |
US34553194A | 1994-11-28 | 1994-11-28 | |
US08/798,654 USRE38276E1 (en) | 1988-09-02 | 1997-02-11 | Tone generating apparatus for sound imaging |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US07/401,158 Reissue US5027689A (en) | 1988-09-02 | 1989-08-31 | Musical tone generating apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
USRE38276E1 true USRE38276E1 (en) | 2003-10-21 |
Family
ID=28795406
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/798,654 Expired - Lifetime USRE38276E1 (en) | 1988-09-02 | 1997-02-11 | Tone generating apparatus for sound imaging |
Country Status (1)
Country | Link |
---|---|
US (1) | USRE38276E1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112396918A (en) * | 2020-11-24 | 2021-02-23 | 辽东学院 | Five-line visual display technology-based blackboard for vocal music teaching and method |
Citations (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS4925901A (en) | 1972-06-29 | 1974-03-07 | ||
JPS5131225A (en) | 1974-09-11 | 1976-03-17 | Matsushita Electric Ind Co Ltd | |
JPS554012A (en) | 1978-06-22 | 1980-01-12 | Casio Computer Co Ltd | Musical tone generator in electronic instrument |
US4188504A (en) | 1977-04-25 | 1980-02-12 | Victor Company Of Japan, Limited | Signal processing circuit for binaural signals |
US4219696A (en) | 1977-02-18 | 1980-08-26 | Matsushita Electric Industrial Co., Ltd. | Sound image localization control system |
US4275267A (en) * | 1979-05-30 | 1981-06-23 | Koss Corporation | Ambience processor |
JPS57116500A (en) | 1981-01-12 | 1982-07-20 | Keio Giken Kogyo Kk | Signal processor |
JPS57195195A (en) | 1981-05-26 | 1982-11-30 | Mitsubishi Electric Corp | Purification of orthophosphoric ester oil for electric insulation |
JPS58160992A (en) * | 1982-03-19 | 1983-09-24 | カシオ計算機株式会社 | Electronic musical instrument |
US4410761A (en) | 1980-11-05 | 1983-10-18 | Willi Schickedanz | Stereo loudspeaker system for a picture reproducing screen |
JPS5987100A (en) | 1982-11-09 | 1984-05-19 | Ebara Infilco Co Ltd | Drying method for water-contg. matter |
JPS59100498A (en) | 1982-11-30 | 1984-06-09 | カシオ計算機株式会社 | Electronic musical instrument |
JPS59187300A (en) | 1983-04-07 | 1984-10-24 | 渡辺 尚道 | Pyramid energy generator |
JPS6075887A (en) * | 1983-10-03 | 1985-04-30 | カシオ計算機株式会社 | Sound image static apparatus |
US4577540A (en) * | 1982-09-09 | 1986-03-25 | Casio Computer Co., Ltd. | Electronic musical instrument having a pan-pot function |
JPS61184594A (en) | 1985-02-12 | 1986-08-18 | 松下電器産業株式会社 | Musical sound source unit |
JPS61257099A (en) | 1985-05-10 | 1986-11-14 | Nippon Gakki Seizo Kk | Acoustic control device |
US4648116A (en) * | 1984-10-10 | 1987-03-03 | Ayal Joshua | Sound panning apparatus |
JPS6253100A (en) | 1985-09-02 | 1987-03-07 | Nippon Gakki Seizo Kk | Acoustic characteristic controller |
JPS62236022A (en) | 1986-04-08 | 1987-10-16 | Matsushita Electric Ind Co Ltd | Transparent touch input device |
JPS6348237A (en) | 1986-08-14 | 1988-02-29 | Agency Of Ind Science & Technol | Production of ethanol |
US4731848A (en) * | 1984-10-22 | 1988-03-15 | Northwestern University | Spatial reverberator |
JPS6360700A (en) | 1986-08-29 | 1988-03-16 | Matsushita Electric Ind Co Ltd | Acoustic effect device |
JPS6398593A (en) | 1986-10-15 | 1988-04-30 | 株式会社日立製作所 | Earthquakeproof supporter for large-sized vessel |
JPS63222323A (en) | 1987-03-12 | 1988-09-16 | Hitachi Maxell Ltd | Magnetic recording medium |
US4817149A (en) * | 1987-01-22 | 1989-03-28 | American Natural Sound Company | Three-dimensional auditory display apparatus and method utilizing enhanced bionic emulation of human binaural sound localization |
US4893120A (en) * | 1986-11-26 | 1990-01-09 | Digital Electronics Corporation | Touch panel using modulated light |
US5027687A (en) * | 1987-01-27 | 1991-07-02 | Yamaha Corporation | Sound field control device |
US5040220A (en) * | 1986-09-30 | 1991-08-13 | Yamaha Corporation | Control circuit for controlling reproduced tone characteristics |
US5046097A (en) * | 1988-09-02 | 1991-09-03 | Qsound Ltd. | Sound imaging process |
US5105462A (en) | 1989-08-28 | 1992-04-14 | Qsound Ltd. | Sound imaging method and apparatus |
US5164840A (en) * | 1988-08-29 | 1992-11-17 | Matsushita Electric Industrial Co., Ltd. | Apparatus for supplying control codes to sound field reproduction apparatus |
- 1997-02-11: US application US 08/798,654 filed, issued as USRE38276E1 (status: not active, Expired - Lifetime)
Patent Citations (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS4925901A (en) | 1972-06-29 | 1974-03-07 | ||
JPS5131225A (en) | 1974-09-11 | 1976-03-17 | Matsushita Electric Ind Co Ltd | |
US4219696A (en) | 1977-02-18 | 1980-08-26 | Matsushita Electric Industrial Co., Ltd. | Sound image localization control system |
US4188504A (en) | 1977-04-25 | 1980-02-12 | Victor Company Of Japan, Limited | Signal processing circuit for binaural signals |
JPS554012A (en) | 1978-06-22 | 1980-01-12 | Casio Computer Co Ltd | Musical tone generator in electronic instrument |
US4275267A (en) * | 1979-05-30 | 1981-06-23 | Koss Corporation | Ambience processor |
US4410761A (en) | 1980-11-05 | 1983-10-18 | Willi Schickedanz | Stereo loudspeaker system for a picture reproducing screen |
JPS57116500A (en) | 1981-01-12 | 1982-07-20 | Keio Giken Kogyo Kk | Signal processor |
JPS57195195A (en) | 1981-05-26 | 1982-11-30 | Mitsubishi Electric Corp | Purification of orthophosphoric ester oil for electric insulation |
JPS58160992A (en) * | 1982-03-19 | 1983-09-24 | カシオ計算機株式会社 | Electronic musical instrument |
US4577540A (en) * | 1982-09-09 | 1986-03-25 | Casio Computer Co., Ltd. | Electronic musical instrument having a pan-pot function |
JPS5987100A (en) | 1982-11-09 | 1984-05-19 | Ebara Infilco Co Ltd | Drying method for water-contg. matter |
JPS59100498A (en) | 1982-11-30 | 1984-06-09 | カシオ計算機株式会社 | Electronic musical instrument |
JPS59187300A (en) | 1983-04-07 | 1984-10-24 | 渡辺 尚道 | Pyramid energy generator |
JPS6075887A (en) * | 1983-10-03 | 1985-04-30 | カシオ計算機株式会社 | Sound image static apparatus |
US4648116A (en) * | 1984-10-10 | 1987-03-03 | Ayal Joshua | Sound panning apparatus |
US4731848A (en) * | 1984-10-22 | 1988-03-15 | Northwestern University | Spatial reverberator |
JPS61184594A (en) | 1985-02-12 | 1986-08-18 | Matsushita Electric Industrial Co., Ltd. | Musical sound source unit |
JPS61257099A (en) | 1985-05-10 | 1986-11-14 | Nippon Gakki Seizo Kk | Acoustic control device |
JPS6253100A (en) | 1985-09-02 | 1987-03-07 | Nippon Gakki Seizo Kk | Acoustic characteristic controller |
JPS62236022A (en) | 1986-04-08 | 1987-10-16 | Matsushita Electric Ind Co Ltd | Transparent touch input device |
JPS6348237A (en) | 1986-08-14 | 1988-02-29 | Agency Of Ind Science & Technol | Production of ethanol |
JPS6360700A (en) | 1986-08-29 | 1988-03-16 | Matsushita Electric Ind Co Ltd | Acoustic effect device |
US5040220A (en) * | 1986-09-30 | 1991-08-13 | Yamaha Corporation | Control circuit for controlling reproduced tone characteristics |
JPS6398593A (en) | 1986-10-15 | 1988-04-30 | Hitachi, Ltd. | Earthquakeproof supporter for large-sized vessel |
US4893120A (en) * | 1986-11-26 | 1990-01-09 | Digital Electronics Corporation | Touch panel using modulated light |
US4817149A (en) * | 1987-01-22 | 1989-03-28 | American Natural Sound Company | Three-dimensional auditory display apparatus and method utilizing enhanced bionic emulation of human binaural sound localization |
US5027687A (en) * | 1987-01-27 | 1991-07-02 | Yamaha Corporation | Sound field control device |
JPS63222323A (en) | 1987-03-12 | 1988-09-16 | Hitachi Maxell Ltd | Magnetic recording medium |
US5164840A (en) * | 1988-08-29 | 1992-11-17 | Matsushita Electric Industrial Co., Ltd. | Apparatus for supplying control codes to sound field reproduction apparatus |
US5046097A (en) * | 1988-09-02 | 1991-09-03 | Qsound Ltd. | Sound imaging process |
US5105462A (en) | 1989-08-28 | 1992-04-14 | Qsound Ltd. | Sound imaging method and apparatus |
Non-Patent Citations (2)
Title |
---|
Sakamoto, Gotoh, Kogure, and Shimbo, "Controlling Sound Image Localization in Stereo Reproduction," J. Audio Eng. Soc., vol. 29, Nov. 1981 at 794. |
Sakamoto, Gotoh, Kogure, and Shimbo, "Controlling Sound Image Localization in Stereo Reproduction: Part II," J. Audio Eng. Soc., vol. 30, Oct. 1982 at 719. |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112396918A (en) * | 2020-11-24 | 2021-02-23 | 辽东学院 | Blackboard for vocal music teaching based on five-line-staff visual display technology, and method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5027689A (en) | Musical tone generating apparatus | |
EP1357538B1 (en) | Method for making electronic tones close to acoustic tones, recording system and tone generating system | |
US4577540A (en) | Electronic musical instrument having a pan-pot function | |
US5354948A (en) | Tone signal generation device for generating complex tones by combining different tone sources | |
JP2967471B2 (en) | Sound processing device | |
JP3183385B2 (en) | Performance information input device for electronic musical instruments | |
USRE38276E1 (en) | Tone generating apparatus for sound imaging | |
JPH0720866A (en) | Electronic musical instrument | |
JP2650489B2 (en) | Electronic musical instrument | |
JP3089421B2 (en) | Sound processing device | |
JP2605885B2 (en) | Tone generator | |
JP2576528B2 (en) | Musical sound visualization device | |
JP3055557B2 (en) | Sound processing device | |
JP2629874B2 (en) | Music parameter controller | |
JP3045106B2 (en) | Sound processing device | |
JP2605821B2 (en) | Music control device | |
US5345036A (en) | Volume control apparatus for an automatic player piano | |
JP3360604B2 (en) | Display device for musical tone control element group and recording medium storing display program for musical tone control element group | |
JP3055556B2 (en) | Sound processing device | |
JPH075873A (en) | Sound processor | |
JP3543384B2 (en) | Electronic musical instrument | |
JP2888712B2 (en) | Music generator | |
JPH0267599A (en) | Musical sound generating device | |
JPH056170A (en) | Electronic musical instrument | |
JP2530892B2 (en) | Keyboard type electronic musical instrument |