WO2011030761A1 - Music game system, computer program of same, and method of generating sound effect data - Google Patents

Music game system, computer program of same, and method of generating sound effect data

Info

Publication number
WO2011030761A1
WO2011030761A1 PCT/JP2010/065337 JP2010065337W WO2011030761A1 WO 2011030761 A1 WO2011030761 A1 WO 2011030761A1 JP 2010065337 W JP2010065337 W JP 2010065337W WO 2011030761 A1 WO2011030761 A1 WO 2011030761A1
Authority
WO
WIPO (PCT)
Prior art keywords
sound
data
voice
sound effect
pitch
Prior art date
Application number
PCT/JP2010/065337
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
右寺 修
宜隆 西村
Original Assignee
株式会社コナミデジタルエンタテインメント
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社コナミデジタルエンタテインメント (Konami Digital Entertainment Co., Ltd.)
Priority to US 13/394,967 (published as US 2012/0172099 A1)
Priority to CN 201080039640.3 (published as CN 102481488 B)
Publication of WO 2011/030761 A1

Classifications

    • G10H 1/361: Recording/reproducing of accompaniment for use with an external source, e.g. karaoke systems
    • G10H 1/368: Recording/reproducing of accompaniment displaying animated or moving pictures synchronized with the music or audio part
    • G10H 1/40: Rhythm
    • G10H 2210/066: Musical analysis for pitch analysis as part of wider processing for musical purposes, e.g. transcription, musical performance evaluation; pitch recognition, e.g. in polyphonic sounds; estimation or use of missing fundamental
    • G10H 2220/106: Graphical user interface [GUI] for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters
    • G10H 2220/135: Musical aspects of games or videogames; musical instrument-shaped game input interfaces
    • G10H 2230/015: PDA [personal digital assistant] or palmtop computing devices used for musical purposes, e.g. portable music players, tablet computers, e-readers or smart phones in which mobile telephony functions need not be used
    • A63F 13/2145: Input arrangements for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
    • A63F 13/215: Input arrangements comprising means for detecting acoustic signals, e.g. using a microphone
    • A63F 13/424: Processing input control signals by mapping them into game commands, involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
    • A63F 13/44: Processing input control signals involving timing of operations, e.g. performing an action within a time slot
    • A63F 13/54: Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
    • A63F 13/814: Musical performances, e.g. by evaluating the player's ability to follow a notation
    • A63F 13/92: Video game devices specially adapted to be hand-held while playing
    • A63F 2300/1081: Input via voice recognition
    • A63F 2300/206: Game information storage, e.g. cartridges, CD ROMs, DVDs, smart cards
    • A63F 2300/6045: Methods for mapping control signals received from the input arrangement into game commands
    • A63F 2300/6081: Methods for sound processing, generating an output signal, e.g. under timing constraints, for spatialization
    • A63F 2300/638: Controlling the execution of the game in time according to the timing of operation or a time limit
    • A63F 2300/8047: Music games

Definitions

  • the present invention relates to a music game system or the like in which sound input by a player is reflected in game content.
  • JP 2002-136664 A; Japanese Patent Laid-Open No. 10-268876
  • In these prior-art games, the game content is changed by capturing the player's voice: the pitch of the player's voice is detected, and the action of a character is changed based on a comparison of that pitch with a reference pitch.
  • In such games, however, it would be desirable for the sound input by the player to be reflected in the game content as a material, so that the game can be enjoyed with the input sound itself.
  • Accordingly, an object of the present invention is to provide a music game system capable of discriminating a voice input by a player and forming a musical scale based on the discrimination result, as well as a computer program and a sound effect data generation method therefor.
  • The music game system of the present invention comprises: a voice input device for inputting voice; a sound output device for reproducing and outputting game sound; sound effect data storage means for storing sound effect data used to output, from the sound output device, each of a plurality of sound effects having different pitches; sequence data storage means for storing sequence data describing the sound effects to be output in response to the player's operations; pitch determination means for determining a pitch representative of the input voice based on the voice data of the voice input by the voice input device; scale generation means for generating, based on the pitch determination result of the pitch determination means, a plurality of sound data having different pitches from the voice data so that a musical scale is formed; and sound effect data storage control means for storing the plurality of sound data generated by the scale generation means in the sound effect data storage means as at least a part of the sound effect data.
  • The computer program of the present invention is for a music game system comprising a voice input device for inputting voice, a sound output device for reproducing and outputting game sound, sound effect data storage means for storing sound effect data used to output, from the sound output device, each of a plurality of sound effects having different pitches, and sequence data storage means for storing sequence data describing the sound effects to be output in response to the player's operations. The program causes a computer incorporated in the system to function as: pitch determination means for determining a pitch representative of the input voice based on the voice data of the voice input by the voice input device; scale generation means for generating, based on the pitch determination result of the pitch determination means, a plurality of sound data having different pitches from the voice data so that a musical scale is formed; and sound effect data storage control means for storing the plurality of sound data generated by the scale generation means in the sound effect data storage means as at least a part of the sound effect data.
  • According to the present invention, voice data is generated from the voice input by the player to the voice input device, and the pitch determination means determines a pitch representative of that voice data. Based on the pitch determination result, the scale generation means then generates, from the voice data whose pitch has been determined, a plurality of sound data having different pitches.
  • These plural sound data form a musical scale.
  • The plural sound data are stored in the sound effect data storage means as sound effect data and are used as the sound effects to be output in response to the player's operations.
  • Because a scale is formed from a voice arbitrarily input by the player, a melody based on the input voice can be played, and the input voice can be reflected in the game content as a material so that the player can enjoy the game with the sound he or she has input.
  • the pitch determination unit may determine the pitch of the voice by specifying a representative frequency from voice data of the voice input by the voice input device.
  • the pitch of the voice is determined by specifying the frequency having the maximum distribution as a representative value with reference to the frequency spectrum of the voice data.
  • The scale generation means may generate a scale spanning at least one octave. According to this aspect, a melody can be played using the generated scale. If more sound data is generated, the range of the scale can be expanded, increasing the number of melodies that can be played and enriching the game content.
  • The music game system may further comprise an input device having at least one operation unit, and the sound effect corresponding to the description of the sequence data may be reproduced from the sound output device based on the player's operation of the input device.
  • According to this aspect, by operating the operation unit, the player can reproduce sound effects composed of the scale formed from his or her own input sound. The input sound is thus reflected in the game content as a material, and the player can enjoy the game with the sound he or she has input.
  • The sound effect data generation method of the present invention comprises: a pitch determination step of determining a pitch representative of the input voice based on the voice data of the voice input by a voice input device; a scale generation step of generating, based on the pitch determination result of the pitch determination step, a plurality of sound data having different pitches from the voice data so that a musical scale is formed; and a storage step of storing the plurality of sound data generated in the scale generation step in storage means as sound effect data to be output from a sound output device.
  • The computer program and the sound effect data generation method of the present invention achieve the same effects as the music game system described above.
  • The present invention is not limited to a music game system and can be applied to various electronic devices such as electronic musical instruments.
  • As described above, in the present invention, voice data is generated from the voice input by the player to the voice input device, and a pitch representative of that voice data is determined by the pitch determination means. Based on the pitch determination result, the scale generation means then generates, from the voice data whose pitch has been determined, a plurality of sound data having different pitches, and these sound data form a musical scale. The plural sound data are stored in the sound effect data storage means as sound effect data and are used as the sound effects to be output in response to the player's operations.
  • Because a scale is formed from a voice arbitrarily input by the player, a melody based on the input voice can be played, and the input voice can be reflected in the game content so that the player can enjoy the game with the sound he or she has input.
  • The same effects are achieved by the method of generating sound effect data.
  • FIG. 1 is a functional block diagram of a game machine according to one embodiment of the present invention.
  • The game machine 1 includes a housing 2 that a player (user) can hold, a first monitor 3 disposed on the right side of the housing 2, a second monitor 4 disposed on the left side of the housing 2, a plurality of push button switches 5 disposed above the first monitor 3, and a cross key 6 disposed below the first monitor 3.
  • a transparent touch panel 7 is superimposed on the surface of the first monitor 3.
  • the touch panel 7 is a known input device that outputs a signal corresponding to the contact position when the player touches with a touch pen or the like.
  • In addition, the game machine 1 is provided with various input devices and output devices found in an ordinary portable game machine, such as a power switch, a volume operation switch, and a power lamp; these are not illustrated.
  • a control unit 10 as a computer is provided inside the game machine 1.
  • the control unit 10 includes a game control unit 11 as a control subject, and a pair of display control units 12 and 13 and an audio output control unit 14 that operate according to an output from the game control unit 11.
  • the game control unit 11 is configured as a unit in which a microprocessor and various peripheral devices such as an internal storage device (for example, ROM and RAM) necessary for the operation of the microprocessor are combined.
  • The display control units 12 and 13 draw images corresponding to the image data supplied from the game control unit 11 into frame buffers and output video signals corresponding to the drawn images to the monitors 3 and 4, respectively, whereby predetermined images are displayed on the monitors 3 and 4.
  • The sound output control unit 14 generates a sound reproduction signal corresponding to the sound reproduction data given from the game control unit 11 and outputs it to the speaker 8, whereby predetermined sounds (including musical tones and the like) are reproduced from the speaker 8.
  • the game control unit 11 is connected with the push button switch 5, the cross key 6 and the touch panel 7 described above as input devices, and in addition to these, a voice input device (microphone) 9 is connected.
  • various input devices may be connected to the game control unit 11.
  • an external storage device 20 is connected to the game control unit 11.
  • the external storage device 20 is a storage medium that can hold the storage even when power is not supplied, such as a nonvolatile semiconductor memory device such as an EEPROM or a magnetic storage device.
  • the storage medium of the external storage device 20 is detachable from the game machine 1.
  • the external storage device 20 stores a game program 21 and game data 22.
  • The game program 21 is a computer program necessary for executing the music game according to a predetermined procedure on the game machine 1, and includes a sequence control module 23, a pitch determination module 24, and a scale generation module 25 for realizing the functions according to the present invention.
  • The game control unit 11 executes an operating program stored in the internal storage device to carry out the various initial settings necessary to operate as the game machine 1, and then reads the game program 21 from the external storage device 20 and executes it, thereby setting up an environment for executing the music game according to the game program 21.
  • When the sequence control module 23 of the game program 21 is executed by the game control unit 11, a sequence processing unit 15 is generated in the game control unit 11.
  • Likewise, when the pitch determination module 24 of the game program 21 is executed by the game control unit 11, a pitch determination unit 16 is generated in the game control unit 11, and when the scale generation module 25 is executed by the game control unit 11, a scale generation unit 17 is generated in the game control unit 11.
  • the sequence processing unit 15, the pitch determination unit 16, and the scale generation unit 17 are logical devices realized by a combination of computer hardware and a computer program.
  • The sequence processing unit 15 performs the music game processing of instructing operations to the player in accordance with the reproduction of the music selected by the player and of generating sound effects in response to the player's operations.
  • the pitch determination unit 16 takes in an arbitrary sound input by the player to the sound input device 9 and performs a predetermined process described later to determine a representative value of the frequency.
  • the scale generation unit 17 generates a plurality of sound data whose pitches are changed based on the representative values determined by the pitch determination unit 16. These sound data form a scale of a predetermined octave number and constitute sound effects.
  • the game program 21 includes various program modules necessary for executing the music game in addition to the modules 23 to 25 described above, and the game control unit 11 generates logical devices corresponding to these modules. However, their illustration is omitted.
  • the game data 22 includes various data to be referred to when the music game is executed according to the game program 21.
  • the game data 22 includes music data 26, sound effect data 27, and image data 28.
  • the music data 26 is data necessary for reproducing and outputting music to be played from the speaker 8.
  • In FIG. 2, only one piece of music data 26 is shown, but in reality the player can select the music to be played from a plurality of pieces of music.
  • In that case, a plurality of pieces of music data 26 are recorded in the game data 22 together with information identifying each piece.
  • the sound effect data 27 is data in which a plurality of types of sound effects to be output from the speaker 8 in response to the operation of the player are recorded in association with unique codes for each sound effect. Sound effects include musical instruments and various other types of sounds.
  • a vocal sound for outputting text from the speaker 8 is also included as a kind of sound effect.
  • the sound effect data 27 is prepared for a predetermined number of octaves by changing the pitch for each type.
  • the image data 28 is data for displaying the background image, various objects, icons, and the like in the game screen on the monitors 3 and 4.
  • the game data 22 further includes sequence data 29.
  • the sequence data 29 is data defining operations and the like to be instructed to the player. At least one sequence data 29 is prepared for one piece of music data 26.
  • the game operation instruction screen 100 is displayed on the first monitor 3, and the game information screen 110 is displayed on the second monitor 4.
  • On the operation instruction screen 100, a first lane 101, a second lane 102, and a third lane 103 extending in the vertical direction are displayed in a state visually separated from one another by dividing lines 104. Operation reference marks 105 are displayed at the lower ends of the lanes 101, 102, and 103, respectively.
  • In the lanes 101, 102, and 103, objects 106 are displayed as operation instruction marks in accordance with the sequence data 29.
  • the object 106 appears at the upper end of the lanes 101, 102, and 103 at an appropriate time in the music and is scrolled downward as the music progresses as indicated by an arrow A in FIG.
  • the player is requested to touch the lane 101, 102, or 103 on which the object 106 is displayed with an operation member such as the touch pen 120 as the object 106 reaches the operation reference mark 105.
  • a time difference between the time when the object 106 matches the operation reference sign 105 and the time of the touch operation of the player is detected. The smaller the deviation time, the higher the player's operation is evaluated.
  • sound effects corresponding to each object 106 are reproduced from the speaker 8 in response to the touch operation.
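To make the timing evaluation concrete, the following is a minimal sketch in Python. The judgment windows and the function name judge_touch are illustrative assumptions; the patent only states that the deviation between the operation time and the touch time is detected and that smaller deviations are evaluated more highly.

```python
# Minimal sketch of the timing evaluation described above. The window widths
# and grade names are illustrative assumptions; the patent only states that
# smaller deviations between the reference time and the touch are rated higher.

def judge_touch(operation_time: float, touch_time: float) -> str:
    """Grade a touch by its deviation (in seconds) from the operation time."""
    deviation = abs(touch_time - operation_time)
    if deviation <= 0.05:
        return "perfect"
    if deviation <= 0.10:
        return "great"
    if deviation <= 0.20:
        return "good"
    return "miss"

# Example: the object lines up with the operation reference mark at t = 12.50 s
# and the player touches the lane at t = 12.53 s.
print(judge_touch(12.50, 12.53))  # -> perfect
```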
  • For example, when an object 106 displayed in the second lane 102 reaches the operation reference mark 105, the player should touch the second lane 102 in accordance with its arrival.
  • the touch position may be anywhere within the second lane 102. That is, in this embodiment, three operation units are formed by a combination of the lanes 101, 102, and 103 displayed on the first monitor 3 and the touch panel 7 superimposed on them.
  • In the following description, each of the lanes 101, 102, and 103 may also be referred to as an operation unit.
  • the sound effect corresponding to each object 106 reproduced in response to the touch operation is selected from a plurality of sound effects recorded in the sound effect data 27.
  • the sound effect data 27 includes original data 27 a recorded in advance in the game data 22 and user data 27 b obtained based on the sound input by the player using the sound input device 9.
  • In the original data 27a, a plurality of sound effects A1, B1, ... are recorded; the sound effect A1 will be described as an example.
  • The other sound effects B1, C1, ... have similar sound data.
  • The user data 27b is similar to the original data 27a in the structure of the sound data of its sound effects A2, B2, ..., but differs from the original data 27a, which is recorded in advance, in that its sound data is generated from the sound input by the player using the sound input device 9.
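As an illustration of how the sound effect data 27 described above might be organized in memory, here is a minimal sketch. The class and field names (SoundEffect, SoundEffectData, original, user) are assumptions; the patent only specifies that each sound effect is a set of pitched sound data identified by a unique code and that the data is divided into pre-recorded original data 27a and player-derived user data 27b.

```python
# Minimal sketch of the sound effect data layout described above.
# All names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class SoundEffect:
    code: str                                               # unique code, e.g. "A1"
    sounds: dict[str, bytes] = field(default_factory=dict)  # pitch label -> waveform data

@dataclass
class SoundEffectData:
    original: dict[str, SoundEffect] = field(default_factory=dict)  # pre-recorded data 27a
    user: dict[str, SoundEffect] = field(default_factory=dict)      # player-derived data 27b

# The original data ships with the game; the user data is filled in later by
# the scale generation process from the player's recorded sound.
effects = SoundEffectData()
effects.original["A1"] = SoundEffect("A1", {"C4": b"...", "D4": b"...", "E4": b"..."})
effects.user["A2"] = SoundEffect("A2")  # populated after the player records a sound
```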
  • the sequence data 29 includes an initial setting unit 29a and an operation sequence unit 29b.
  • In the initial setting unit 29a, initial settings for playing the game that differ for each piece of music are described, such as the tempo of the music (for example, the BPM), information specifying the sound effects to be generated when the lanes 101 to 103 are operated, and information specifying other game execution conditions.
  • operation designation information 29c and sound effect switching instruction information 29d are described in the operation sequence portion 29b.
  • the operation designation information 29c the operation timing of the lanes 101 to 103 is described in association with information for designating any of the lanes 101 to 103. That is, as partly illustrated in FIG. 5, the operation designation information 29c includes a plurality of pieces of information in which the time (operation time) when the operation should be performed in the music and the information designating the operation unit (lane) are associated with each other. It is structured as a set of records. The operation time is described by separating a bar number, a beat number, and a time value in the music by commas.
  • The operation unit is described as "button 1" when the first lane 101 is designated, "button 2" when the second lane 102 is designated, and "button 3" when the third lane 103 is designated.
  • In the example of FIG. 5, the operation times and operation units are designated such that the first lane 101 is to be touched at the start ("000") of the first beat of the first bar, the second lane 102 is to be touched at the start ("000") of the second beat of the first bar, and the third lane 103 is to be touched when a time corresponding to "025" has elapsed from the start of the second beat of the first bar.
  • the sound effect switching instruction information 29d is inserted at an appropriate position in the middle of the operation specifying information 29c.
  • the sound effect switching instruction information 29d is described in association with the time on the music for which the sound effect is to be changed and sound data of the sound effect that should be generated when the lanes 101 to 103 are operated.
  • In this way, the sound effect that is generated when a lane designated by the operation designation information 29c is touched can be changed.
  • the time on the music is described in the same format as the operation time in the operation designation information 29c.
  • the sound effect switching instruction information 29d designates one of the sound data of the original data 27a and the user data 27b recorded in the sound effect data 27 for each lane.
  • the sound effect switching instruction information 29d is inserted at a time on the music to which the sound effect is to be switched, and the setting of the sound effect is maintained until an instruction is given by the next sound effect switching instruction information 29d.
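A minimal sketch of how such sequence data records might be parsed follows. It assumes the comma-separated "bar,beat,value" time format and the "button 1" to "button 3" lane labels described above; the single-line record layout, the interpretation of the third field as hundredths of a beat, and all function names are assumptions for illustration.

```python
# Minimal sketch of parsing operation designation records in the
# "bar,beat,value" + "button N" format described above. The one-line record
# layout and the hundredths-of-a-beat interpretation of the third field are
# illustrative assumptions.
from dataclasses import dataclass

@dataclass
class OperationRecord:
    bar: int     # bar (measure) number, starting at 1
    beat: int    # beat number within the bar, starting at 1
    value: int   # subdivision within the beat, e.g. 25 for "025"
    lane: int    # 1, 2 or 3

def parse_operation(record: str) -> OperationRecord:
    """Parse a record such as '01,2,025,button 3'."""
    fields = [part.strip() for part in record.split(",")]
    bar, beat, value = (int(x) for x in fields[:3])
    lane = int(fields[3].removeprefix("button "))
    return OperationRecord(bar, beat, value, lane)

def operation_seconds(rec: OperationRecord, bpm: float, beats_per_bar: int = 4) -> float:
    """Convert a (bar, beat, value) operation time into seconds from the start of the music."""
    beats = (rec.bar - 1) * beats_per_bar + (rec.beat - 1) + rec.value / 100.0
    return beats * 60.0 / bpm

rec = parse_operation("01,2,025,button 3")
print(rec, operation_seconds(rec, bpm=120.0))  # lane 3, 0.625 s into a 120 BPM piece
```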
  • the sequence processing unit 15 of the game control unit 11 controls the display of each of the lanes 101 to 103 so that the object 106 matches the operation reference sign 105 at the operation time specified by the operation specifying information 29c described above. Further, the sequence processing unit 15 controls to switch the sound effect generated when the player touches the designated lanes 101 to 103 at the time on the music designated by the sound effect switching instruction information 29d.
  • When the game control unit 11 reads the game program 21 and completes the initial settings necessary to execute the music game, it stands by for a game start instruction from the player.
  • the instruction to start the game includes, for example, an operation for specifying data used in the game such as selection of music to be played in the game or difficulty level.
  • the procedure for receiving these instructions may be the same as that of a known music game or the like.
  • When the start of the game is instructed, the game control unit 11 reads the music data 26 corresponding to the music selected by the player and outputs it to the audio output control unit 14, thereby starting reproduction of the music from the speaker 8; the control unit 10 thus functions as a music reproducing means. In addition, in synchronization with the reproduction of the music, the game control unit 11 reads out the sequence data 29 corresponding to the player's selection, generates the image data necessary for drawing the operation instruction screen 100 and the information screen 110 with reference to the image data 28, and outputs it to the display control units 12 and 13, whereby the operation instruction screen 100 and the information screen 110 are displayed on the monitors 3 and 4. Furthermore, during execution of the music game, the game control unit 11 repeatedly executes the sequence processing routine shown in FIG. 6 at a predetermined cycle as the processing necessary for displaying the operation instruction screen 100 and the like.
  • the sequence processing unit 15 of the game control unit 11 first acquires the current time on the music in step S1. For example, timing is started with the internal clock of the game control unit 11 with the music reproduction start time as a reference, and the current time is acquired from the value of the internal clock.
  • Next, the sequence processing unit 15 acquires from the sequence data 29 the operation timing data existing within a time length corresponding to the display range of the operation instruction screen 100.
  • the display range is set to a time range corresponding to two measures of music from the current time to the future.
  • the sequence processing unit 15 calculates the coordinates in the operation instruction screen 100 of all the objects 106 to be displayed on the lanes 101 to 103.
  • As an example, the calculation is performed as follows. Based on the designation of the lane 101 to 103 associated with each operation time included in the display range, that is, the designation of one of "button 1" to "button 3" in the example of FIG. 5, it is determined in which lane each object 106 should be arranged. Further, the position of each object 106 in the time-axis direction from the operation reference mark 105 (that is, in the moving direction of the object 106) is determined according to the time difference between its operation time and the current time. As a result, the coordinates necessary for arranging each object 106 along the time axis from the operation reference mark 105 in its designated lane 101 to 103 are obtained.
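A minimal sketch of this coordinate calculation is given below. It assumes a pixel coordinate system in which the operation reference mark sits at a fixed y-coordinate near the bottom of the lane and objects scroll down toward it; the screen dimensions, the display range in seconds, and the linear mapping are all illustrative assumptions.

```python
# Minimal sketch of placing an object in its lane so that it reaches the
# operation reference mark exactly at its operation time. The screen
# dimensions, the display range and the linear mapping are assumptions.

def object_y(operation_time: float, current_time: float,
             display_range: float, reference_y: float = 180.0,
             top_y: float = 0.0) -> float | None:
    """Return the object's y-coordinate, or None if it lies outside the display range."""
    remaining = operation_time - current_time   # seconds until the object should be hit
    if remaining < 0 or remaining > display_range:
        return None
    # 0 s remaining -> at the reference mark; display_range s remaining -> at the top edge
    return reference_y - (remaining / display_range) * (reference_y - top_y)

# Example: with a display range of 4.0 s (about two bars at 120 BPM), an object
# due in 1.0 s is drawn three quarters of the way down its lane.
print(object_y(operation_time=11.0, current_time=10.0, display_range=4.0))  # -> 135.0
```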
  • In step S4, the sequence processing unit 15 determines whether or not sound effect switching instruction information 29d is present in the data acquired from the sequence data 29.
  • If it is present, the sequence processing unit 15 acquires the current time in step S5 and compares it with the time on the music designated by the sound effect switching instruction information 29d, thereby determining whether or not the current time corresponds to the timing of the switching instruction. If it does, then in step S6 the sequence processing unit 15 changes the sound effect to be generated for each of the lanes 101 to 103 designated by the subsequent operation designation information 29c to the sound effect specified by the sound effect switching instruction information 29d.
  • For example, after the start of the third beat of the first bar of the music, the sound data sd_101, sd_105, and sd_106 of the sound effect A2 in the user data 27b of the sound effect data 27 are assigned to the lanes 101, 102, and 103, respectively, and when the player touches the lanes 101 to 103 the corresponding sound data are reproduced. If no sound effect switching instruction information 29d is present in step S4, or if the current time does not correspond to the switching timing in step S5, the sequence processing unit 15 proceeds to step S7.
  • In step S7, the sequence processing unit 15 generates the image data necessary for drawing the operation instruction screen 100 based on the coordinates of the objects 106 calculated in step S3. Specifically, the image data is generated so that each object 106 is arranged at its calculated coordinates. The image of the object 106 may be acquired from the image data 28.
  • In step S8, the sequence processing unit 15 outputs the image data to the display control unit 12, whereby the operation instruction screen 100 is displayed on the first monitor 3.
  • After step S8, the sequence processing unit 15 ends the current sequence processing routine.
  • the object 106 is scroll-displayed in the lanes 101 to 103 so that the object 106 reaches the operation reference mark 105 at the operation time described in the sequence data 29.
  • Creation of a sound effect is started, for example, when the player gives an instruction to start it while the music game is not being executed.
  • In that case, the pitch determination unit 16 first executes the pitch determination processing routine shown in FIG. 7, and the scale generation unit 17 then executes the scale generation processing routine based on the result of the pitch determination processing routine.
  • the pitch determination unit 16 of the game control unit 11 acquires the voice input by the player in step S11.
  • While the voice input device 9 is able to capture voice, the player inputs a voice, and raw voice data is generated.
  • The pitch determination unit 16 then performs A/D conversion on the raw voice data: the analog signal of the raw voice data is converted into a digital signal, and the voice data of the input voice is generated.
  • FIG. 9 shows an example of audio data.
  • The audio data in FIG. 9 is a digitized waveform of a guitar sound; the horizontal axis indicates time (the duration) and the vertical axis indicates amplitude (the dynamic range).
  • a well-known technique may be used for A / D conversion.
  • the pitch discriminating unit 16 obtains the frequency spectrum of the voice data in step S13.
  • FIG. 10 shows a frequency spectrum generated by fast Fourier transform from the audio data obtained in step S12. The horizontal axis indicates the frequency, and the vertical axis indicates the frequency distribution degree. The generation of the frequency spectrum is not limited to the calculation by the fast Fourier transform, and various known techniques may be used.
  • the pitch determination unit 16 determines a representative value from the frequency spectrum obtained in step S13.
  • the representative value is the maximum value of the frequency spectrum distribution.
  • the peak frequency indicated by the arrow p is a representative value.
  • the pitch of the voice data based on the voice input by the player is determined based on the frequency of the representative value thus determined.
  • Alternatively, the representative value may be calculated from the data of a band q spanning both sides of such a maximum peak. Even when the peak is unclear, for example when it is spread over a wide range of frequencies, this method allows the representative value to be calculated from a certain band.
  • the pitch determination unit 16 ends the current pitch determination processing routine. With the above processing, representative values are determined for the voice data based on the voice input by the player, and the unique pitch is determined.
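A minimal sketch of the pitch determination described above is shown below, using NumPy's FFT. The function name, the use of a magnitude spectrum, and the 50 percent threshold used to define the band around the peak are illustrative assumptions; a practical implementation would also apply windowing and noise handling.

```python
# Minimal sketch of the pitch determination described above: compute the
# frequency spectrum of the recorded voice data and pick a representative
# frequency, either the peak itself or the weighted centre of the band of
# bins whose magnitude is at least half of the peak (a simple stand-in for
# the band q around an unclear peak).
import numpy as np

def representative_frequency(samples: np.ndarray, sample_rate: int,
                             use_band: bool = False) -> float:
    spectrum = np.abs(np.fft.rfft(samples))                    # magnitude spectrum
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    peak = int(np.argmax(spectrum))                            # bin with the maximum distribution
    if not use_band:
        return float(freqs[peak])
    mask = spectrum >= 0.5 * spectrum[peak]                    # band of strong bins
    return float(np.average(freqs[mask], weights=spectrum[mask]))

# Example: a 440 Hz test tone sampled at 44.1 kHz is identified as 440 Hz.
sample_rate = 44100
t = np.arange(sample_rate) / sample_rate
tone = np.sin(2 * np.pi * 440.0 * t)
print(round(representative_frequency(tone, sample_rate)))      # -> 440
```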
  • the scale generation unit 17 executes the scale generation processing routine of FIG.
  • the scale generation unit 17 generates a plurality of sound data forming a scale from the sound data whose representative value is determined in step S21.
  • the scale generation unit 17 frequency-converts the sound data based on the representative value so that the representative value of each sound data becomes the frequency of each sound forming a scale having a predetermined number of octaves.
  • FIG. 11 shows an example of sound data subjected to frequency conversion. The waveform of FIG. 11 is obtained by frequency-converting the audio data of FIG. 9 upward by one octave.
  • the scale generation unit 17 stores the generated sound data set in the sound effect data 27.
  • After step S22, the scale generation unit 17 ends the current scale generation processing routine.
  • a plurality of pieces of sound data having different representative value frequencies are generated based on the sound data for which the representative value is determined, and a scale is formed.
  • a set of sound data forming the scale is stored as sound effects in the user data 27b of the sound effect data 27.
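The frequency conversion that forms the scale can be sketched as follows. This example shifts the recorded sound by simple linear-interpolation resampling so that its representative frequency lands on each semitone of an equal-tempered scale starting at the detected pitch; resampling also changes the duration, which is a simplification, and the patent does not specify the actual conversion algorithm or the spacing of the generated scale.

```python
# Minimal sketch of the scale generation described above: resample the recorded
# sound so that its representative frequency becomes each semitone of an
# equal-tempered scale over a chosen number of octaves. Linear resampling
# (which also shortens or lengthens the sound) is an illustrative simplification.
import numpy as np

def pitch_shift(samples: np.ndarray, ratio: float) -> np.ndarray:
    """Resample so that every frequency in the sound is multiplied by `ratio`."""
    out_len = int(len(samples) / ratio)
    src_positions = np.arange(out_len) * ratio
    return np.interp(src_positions, np.arange(len(samples)), samples)

def generate_scale(samples: np.ndarray, representative_hz: float,
                   octaves: int = 1) -> dict[str, np.ndarray]:
    """Build one pitched sound per semitone of an `octaves`-octave scale."""
    scale = {}
    for semitone in range(12 * octaves + 1):
        ratio = 2.0 ** (semitone / 12.0)                 # equal-temperament frequency ratio
        target_hz = representative_hz * ratio
        scale[f"{target_hz:.1f} Hz"] = pitch_shift(samples, ratio)
    return scale

# Example: a recorded 220 Hz note shifted up one octave is halved in length.
sample_rate = 44100
note = np.sin(2 * np.pi * 220.0 * np.arange(sample_rate) / sample_rate)
print(len(note), len(pitch_shift(note, 2.0)))            # -> 44100 22050
```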
  • the external storage device 20 of the game machine 1 functions as sound effect data storage means and sequence data storage means.
  • The control unit 10 functions as pitch determination means by causing the pitch determination unit 16 to execute the processing of steps S11 to S14 in FIG. 7, functions as scale generation means by causing the scale generation unit 17 to execute step S21 of the scale generation processing routine, and functions as sound effect data storage control means by causing the scale generation unit 17 to execute step S22 of that routine.
  • the present invention can be implemented in various forms without being limited to the above-described forms.
  • In the above description, the music game machine 1 has been described as an example of a device that provides the pitch determination means, the scale generation means, and the sound effect data storage control means, but the present invention is not limited thereto.
  • For example, by applying the present invention to an electronic musical instrument or another electronic device, a melody can be played with an arbitrary voice input by the player.
  • The music game system of the present invention is not limited to one realized by a portable game machine; it may be realized in an appropriate form such as a stationary game machine for home use, an arcade game machine installed in a commercial facility, or a network game system realized using a network.
  • the input device is not limited to an example using a touch panel, and input devices having various configurations such as a push button, a lever, and a trackball can be used.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Human Computer Interaction (AREA)
  • Electrophonic Musical Instruments (AREA)
PCT/JP2010/065337 2009-09-11 2010-09-07 Music game system, computer program of same, and method of generating sound effect data WO2011030761A1 (ja)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/394,967 US20120172099A1 (en) 2009-09-11 2010-09-07 Music game system, computer program of same, and method of generating sound effect data
CN201080039640.3A CN102481488B (zh) 2009-09-11 2010-09-07 Music game system and method of generating sound effect data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009-210571 2009-09-11
JP2009210571A JP5399831B2 (ja) Music game system, computer program of same, and method of generating sound effect data

Publications (1)

Publication Number Publication Date
WO2011030761A1 true WO2011030761A1 (ja) 2011-03-17

Family

ID=43732433

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/065337 WO2011030761A1 (ja) 2009-09-11 2010-09-07 Music game system, computer program of same, and method of generating sound effect data

Country Status (4)

Country Link
US (1) US20120172099A1 (zh)
JP (1) JP5399831B2 (zh)
CN (1) CN102481488B (zh)
WO (1) WO2011030761A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6360280B2 (ja) * 2012-10-17 2018-07-18 任天堂株式会社 Game program, game device, game system, and game processing method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08123448A (ja) * 1994-10-18 1996-05-17 Sega Enterp Ltd Image processing device using waveform analysis of audio signals
JP2001009152A (ja) * 1999-06-30 2001-01-16 Konami Co Ltd Game system and computer-readable storage medium
JP2002215151A (ja) * 2001-01-22 2002-07-31 Sega Corp Acoustic signal output method and BGM generation method
JP2002351489A (ja) * 2001-05-29 2002-12-06 Namco Ltd Game information, information storage medium, and game device
JP2008054851A (ja) * 2006-08-30 2008-03-13 Namco Bandai Games Inc Program, information storage medium, and game device
JP2008178449A (ja) * 2007-01-23 2008-08-07 Yutaka Kojima Puzzle game system and ten-key characters

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1106209C (zh) * 1989-01-10 2003-04-23 任天堂株式会社 Electronic game machine capable of producing pseudo-stereophonic sound
EP1280150A3 (en) * 1994-12-02 2008-05-14 Sony Computer Entertainment Inc. Method for processing and apparatus for recording and reproducing sound data
DE69632351T2 (de) * 1995-09-29 2005-05-25 Yamaha Corp., Hamamatsu Verfahren und Vorrichtung zur Musiktonerzeugung
TR199902422T2 (xx) * 1997-04-14 2000-03-21 Thomson Consumer Electronics,Inc. System for compiling data from multiple sources to create a combined program guide for display
US6464585B1 (en) * 1997-11-20 2002-10-15 Nintendo Co., Ltd. Sound generating device and video game device using the same
AU2325800A (en) * 1999-03-08 2000-09-28 Faith, Inc. Data reproducing device, data reproducing method, and information terminal
JP2003509729A (ja) * 1999-09-16 2003-03-11 ハンスルソフト コーポレーション リミテッド Method and apparatus for playing a musical instrument based on digital music files
JP3630075B2 (ja) * 2000-05-23 2005-03-16 ヤマハ株式会社 Submelody generation apparatus and method, and storage medium
JP4206332B2 (ja) * 2003-09-12 2009-01-07 株式会社バンダイナムコゲームス Input device, game system, program, and information storage medium
JP3981382B2 (ja) * 2005-07-11 2007-09-26 株式会社コナミデジタルエンタテインメント Game program, game device, and game control method
CN1805003B (zh) * 2006-01-12 2011-05-11 深圳市蔚科电子科技开发有限公司 Pitch training method
US20080200224A1 (en) * 2007-02-20 2008-08-21 Gametank Inc. Instrument Game System and Method
JP4467601B2 (ja) * 2007-05-08 2010-05-26 ソニー株式会社 Beat emphasis device, audio output device, electronic apparatus, and beat output method


Also Published As

Publication number Publication date
JP5399831B2 (ja) 2014-01-29
CN102481488A (zh) 2012-05-30
JP2011056122A (ja) 2011-03-24
US20120172099A1 (en) 2012-07-05
CN102481488B (zh) 2015-04-01


Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201080039640.3

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10815358

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 13394967

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10815358

Country of ref document: EP

Kind code of ref document: A1