JP5399831B2 - Music game system, computer program thereof, and method of generating sound effect data - Google Patents

Music game system, computer program thereof, and method of generating sound effect data

Info

Publication number
JP5399831B2
Authority
JP
Japan
Prior art keywords
sound
data
sound effect
voice
pitch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2009210571A
Other languages
Japanese (ja)
Other versions
JP2011056122A (en)
Inventor
修 右寺
宜隆 西村
Original Assignee
Konami Digital Entertainment Co., Ltd. (株式会社コナミデジタルエンタテインメント)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konami Digital Entertainment Co., Ltd. (株式会社コナミデジタルエンタテインメント)
Priority to JP2009210571A
Publication of JP2011056122A
Application granted
Publication of JP5399831B2
Legal status: Active (current)
Anticipated expiration

Classifications

    • A63F 13/54: Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
    • A63F 13/424: Processing input control signals of video game devices by mapping the input signals into game commands, involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
    • A63F 13/814: Musical performances, e.g. by evaluating the player's ability to follow a notation
    • G10H 1/368: Recording/reproducing of accompaniment for use with an external source, e.g. karaoke systems, displaying animated or moving pictures synchronized with the music or audio part
    • G10H 1/40: Rhythm
    • A63F 13/215: Input arrangements for video game devices characterised by their sensors, purposes or types, comprising means for detecting acoustic signals, e.g. using a microphone
    • A63F 2300/1081: Input via voice recognition
    • A63F 2300/206: Game information storage, e.g. cartridges, CD-ROMs, DVDs, smart cards
    • A63F 2300/6045: Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • A63F 2300/6081: Methods for processing data by generating or executing the game program for sound processing, generating an output signal, e.g. under timing constraints, for spatialization
    • A63F 2300/638: Methods for processing data by generating or executing the game program for controlling the execution of the game in time according to the timing of operation or a time limit
    • A63F 2300/8047: Music games
    • G10H 2210/066: Musical analysis of a raw acoustic signal or an encoded audio signal for pitch analysis as part of wider processing for musical purposes, e.g. transcription, musical performance evaluation; pitch recognition, e.g. in polyphonic sounds; estimation or use of missing fundamental
    • G10H 2220/106: Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters
    • G10H 2220/135: Musical aspects of games or videogames; musical instrument-shaped game input interfaces
    • G10H 2230/015: PDA [personal digital assistant] or palmtop computing devices used for musical purposes, e.g. portable music players, tablet computers, e-readers or smart phones in which mobile telephony functions need not be used

Description

  The present invention relates to a music game system or the like in which sound input by a player is reflected in game content.

Music game machines whose game content changes in response to a player's voice input are well known. For example, some reflect the input voice in a character's actions (see Patent Document 1), while others record and score the player's singing so that players can compete with one another (see Patent Document 2).

Patent Document 1: JP 2002-136664 A
Patent Document 2: JP 10-268876 A

Each of the above game machines changes the game content by capturing the player's voice: the pitch of the voice is detected and, based on a comparison with a reference pitch, the character's actions are changed. None of them, however, uses the sound input by the player as material reflected in the game content so that the game can be enjoyed with the input sound itself.

Accordingly, an object of the present invention is to provide a music game system capable of discriminating the pitch of a voice input by a player and forming a musical scale based on the discrimination result, as well as a computer program thereof and a method of generating sound effect data.

The music game system of the present invention comprises: a voice input device (9) for inputting voice; a sound output device (8) for reproducing and outputting game sounds; sound effect data storage means (20) for storing sound effect data (27) used to output a plurality of sound effects having different pitches from the sound output device; sequence data storage means (20) for storing sequence data (29) describing the relationship between a player's operations and the sound effects to be output in response to those operations; pitch determination means (10) for determining a pitch representative of the input voice based on voice data of the voice input through the voice input device; scale generation means (10) for selecting a plurality of pitches different from the representative pitch determined by the pitch determination means so that a musical scale is formed, and generating sound data by frequency-converting the voice data for each of the selected pitches; and sound effect data storage control means (10) for storing the plurality of sound data generated by the scale generation means in the sound effect data storage means as at least part of the sound effect data.

The computer program for a music game system according to the present invention causes a computer (10) incorporated in a music game system to function as pitch determination means (10), scale generation means (10), and sound effect data storage control means (10). The music game system comprises a voice input device (9) for inputting voice, a sound output device (8) for reproducing and outputting game sounds, sound effect data storage means (20) for storing sound effect data (27) used to output a plurality of sound effects having different pitches from the sound output device, and sequence data storage means (20) for storing sequence data (29) describing the relationship between a player's operations and the sound effects to be output in response to those operations. The pitch determination means determines a pitch representative of the input voice based on voice data of the voice input through the voice input device; the scale generation means selects a plurality of pitches different from the representative pitch so that a musical scale is formed and generates sound data by frequency-converting the voice data for each of the selected pitches; and the sound effect data storage control means stores the plurality of generated sound data in the sound effect data storage means as at least part of the sound effect data.

In the present invention, voice data is generated from the voice that the player inputs to the voice input device, and the pitch determination means determines a pitch representative of that voice data. Based on the pitch determination result, the scale generation means generates a plurality of sound data with different pitches from the voice data, and these sound data form a musical scale. The sound data are stored in the sound effect data storage means as sound effect data and used as the sound effects output in response to the player's operations. Because a scale is formed from a voice arbitrarily input by the player, a melody can be played with the input voice; in other words, the input voice is reflected in the game content as material, and the game can be enjoyed with the player's own voice.

In one form of the music game system of the present invention, the pitch determination means may determine the pitch of the voice by identifying a representative frequency from the voice data of the voice input by the voice input device. In this aspect, for example, the pitch is determined by taking as the representative value the frequency with the largest magnitude in the frequency spectrum of the voice data.

In one form of the music game system of the present invention, the scale generation means may generate a scale spanning at least one octave. In this aspect, a melody can be played using the generated scale. If more sound data are generated, the range of the scale can be widened, increasing the number of melodies that can be played and enriching the game content.

In one form of the music game system of the present invention, the system further includes an input device having at least one operation unit, and a sound effect according to the description of the sequence data may be reproduced from the sound output device based on the player's operation of the input device. In this aspect, by operating the operation unit, the player can reproduce sound effects composed of the scale formed from the player's own voice input. The input sound is thus reflected in the game content as material, and the game can be enjoyed with the sound input by the player.

The sound effect data generation method according to the present invention causes a computer incorporated in a music game system to execute a pitch determination step, a scale generation step, and a sound effect data storage control step. The music game system comprises a voice input device (9) for inputting voice, a sound output device (8) for reproducing and outputting game sounds, sound effect data storage means (20) for storing sound effect data (27) used to output a plurality of sound effects having different pitches from the sound output device, and sequence data storage means (20) for storing sequence data (29) describing the relationship between a player's operations and the sound effects to be output in response to those operations. In the pitch determination step, a pitch representative of the input voice is determined based on voice data of the voice input by the voice input device; in the scale generation step, a plurality of pitches different from the representative pitch are selected so that a musical scale is formed, and sound data is generated by frequency-converting the voice data for each of the selected pitches; and in the sound effect data storage control step, the plurality of generated sound data are stored in the sound effect data storage means as at least part of the sound effect data.

The computer program for a music game system and the sound effect data generation method according to the present invention realize the same music game system and therefore provide the same effects.

In the above description, reference signs from the accompanying drawings are given in parentheses to facilitate understanding of the invention, but the invention is not limited to the illustrated embodiments.

As described above, in the music game system and the computer program thereof according to the present invention, voice data is generated from the voice that the player inputs to the voice input device, and the pitch determination means determines a pitch representative of that voice data. Based on the pitch determination result, the scale generation means generates a plurality of sound data with different pitches from the voice data, and these sound data form a musical scale. The sound data are stored in the sound effect data storage means as sound effect data and used as the sound effects output in response to the player's operations. Because a scale is formed from a voice arbitrarily input by the player, a melody can be played with the input voice, and the game can be enjoyed with the player's own voice reflected in the game content. The sound effect data generation method achieves the same effects.

FIG. 1 shows the external appearance of a game machine according to one embodiment of the present invention. FIG. 2 is a functional block diagram of the game machine. FIG. 3 is an enlarged view of the operation instruction screen displayed as part of the game screen. FIG. 4 shows an example of the content of the sound effect data. FIG. 5 shows an example of the content of the sequence data. FIG. 6 is a flowchart of the sequence processing routine executed by the game control unit. FIG. 7 is a flowchart of the pitch determination processing routine executed by the game control unit. FIG. 8 is a flowchart of the scale generation processing routine executed by the game control unit. FIG. 9 is a graph showing an example of voice data. FIG. 10 is a graph showing the frequency spectrum of the voice data of FIG. 9. FIG. 11 is a graph showing sound data obtained by frequency-converting the voice data of FIG. 9.

Hereinafter, an embodiment in which the present invention is applied to a portable game machine will be described. As shown in FIG. 1, the game machine 1 includes a housing 2 that a player (user) can hold, a first monitor 3 disposed on the right side of the housing 2, a second monitor 4 disposed on the left side of the housing 2, a plurality of push-button switches 5 disposed above the first monitor 3, and a cross key 6 disposed below the first monitor 3. A transparent touch panel 7 is superimposed on the surface of the first monitor 3. The touch panel 7 is a known input device that outputs a signal corresponding to the contact position when the player touches it with a touch pen or the like. The game machine 1 also has the various input and output devices found on an ordinary portable game machine, such as a power switch, a volume control switch, and a power lamp, but these are omitted from the figures.

As shown in FIG. 2, a control unit 10 serving as a computer is provided inside the game machine 1. The control unit 10 includes a game control unit 11 as its main controller, and a pair of display control units 12 and 13 and an audio output control unit 14 that operate according to outputs from the game control unit 11. The game control unit 11 is configured as a unit combining a microprocessor with the various peripheral devices needed for its operation, such as internal storage devices (for example, ROM and RAM). The display control units 12 and 13 draw images corresponding to image data supplied from the game control unit 11 into their frame buffers and output the corresponding video signals to the monitors 3 and 4, so that predetermined images are displayed on the monitors 3 and 4. The audio output control unit 14 generates an audio reproduction signal corresponding to audio reproduction data supplied from the game control unit 11 and outputs it to the speaker 8, so that predetermined sounds (including musical tones and the like) are reproduced from the speaker 8.

The push-button switches 5, the cross key 6, and the touch panel 7 described above are connected to the game control unit 11 as input devices, and a voice input device (microphone) 9 is connected as well. Various other input devices may also be connected to the game control unit 11. Furthermore, an external storage device 20 is connected to the game control unit 11. The external storage device 20 is a storage medium that retains its contents even when power is not supplied, such as a nonvolatile semiconductor memory device (for example, an EEPROM) or a magnetic storage device. The storage medium of the external storage device 20 is detachable from the game machine 1.

The external storage device 20 stores a game program 21 and game data 22. The game program 21 is a computer program necessary for executing a music game on the game machine 1 according to a predetermined procedure, and includes a sequence control module 23, a pitch determination module 24, and a scale generation module 25 for realizing the functions according to the present invention. When the game machine 1 is started, the game control unit 11 carries out, in accordance with data stored in its internal storage device, the various initial settings necessary for it to operate as the game machine 1. It then reads the game program 21 from the external storage device 20 and executes it, establishing an environment for running the music game according to the game program 21. When the sequence control module 23 of the game program 21 is executed by the game control unit 11, a sequence processing unit 15 is generated in the game control unit 11. Likewise, when the pitch determination module 24 is executed, a pitch determination unit 16 is generated, and when the scale generation module 25 is executed, a scale generation unit 17 is generated in the game control unit 11.

The sequence processing unit 15, the pitch determination unit 16, and the scale generation unit 17 are logical devices realized by the combination of computer hardware and a computer program. The sequence processing unit 15 executes the music game processing: it instructs the player to perform operations in time with the reproduction of the music piece selected by the player and generates sound effects in response to the player's operations. The pitch determination unit 16 captures an arbitrary sound that the player inputs to the voice input device 9 and performs predetermined processing, described later, to determine a representative frequency value. The scale generation unit 17 generates a plurality of sound data with changed pitches based on the representative value determined by the pitch determination unit 16; these sound data form a scale spanning a predetermined number of octaves and constitute a sound effect. The game program 21 also includes various other program modules necessary for executing the music game, and the game control unit 11 generates logical devices corresponding to those modules, but they are not shown.

The game data 22 includes various data referred to when the music game is executed according to the game program 21. For example, the game data 22 includes music data 26, sound effect data 27, and image data 28. The music data 26 is the data needed to reproduce the music to be played from the speaker 8. Although FIG. 2 shows a single piece of music data 26, the player can in fact select the music to play from a plurality of pieces, and the game data 22 records a plurality of pieces of music data 26 together with information identifying each piece. The sound effect data 27 records a plurality of types of sound effects to be output from the speaker 8 in response to the player's operations, each associated with a unique code. The sound effects include instrument sounds and various other kinds of sounds; vocal sounds that output words from the speaker 8 are also included as one kind of sound effect. For each type, the sound effect data 27 is prepared over a predetermined number of octaves by varying the pitch. The image data 28 is the data for displaying the background images, various objects, icons, and the like of the game screens on the monitors 3 and 4.

The game data 22 further includes sequence data 29. The sequence data 29 defines the operations and the like to be instructed to the player. At least one set of sequence data 29 is prepared for each piece of music data 26.

Next, an outline of the music game executed on the game machine 1 will be described. As shown in FIG. 1, while the music game is running, an operation instruction screen 100 is displayed on the first monitor 3 and a game information screen 110 is displayed on the second monitor 4. As shown in FIG. 3, the operation instruction screen 100 is displayed as being divided by dividing lines 104 into a first lane 101, a second lane 102, and a third lane 103, each extending in the vertical direction. An operation reference mark 105 is displayed at the lower end of each of the lanes 101, 102, and 103. While the music game is being executed, that is, while the music is being reproduced, objects 106 serving as operation instruction marks are displayed in the lanes 101, 102, and 103 according to the sequence data 29.

The object 106 appears at the upper end of a lane 101, 102, or 103 at the appropriate time in the music and scrolls downward as the music progresses, as indicated by arrow A in FIG. 3. The player is required to touch the lane in which the object 106 is displayed with an operation member such as the touch pen 120 at the moment the object 106 reaches the operation reference mark 105. When the player performs a touch operation, the time difference between the moment the object 106 coincides with the operation reference mark 105 and the moment of the player's touch is detected; the smaller the difference, the higher the player's operation is evaluated. In addition, the sound effect corresponding to each object 106 is reproduced from the speaker 8 in response to the touch operation. In the example of FIG. 3, an object 106 is just about to reach the operation reference mark 105 in the second lane 102, and the player should touch the second lane 102 in time with its arrival; the touch position may be anywhere within the second lane 102. In this embodiment, therefore, three operation units are formed by the combination of the lanes 101, 102, and 103 displayed on the first monitor 3 and the touch panel 7 superimposed on them. Hereinafter, the lanes 101, 102, and 103 are also used as terms denoting these operation units.
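The timing evaluation described above can be illustrated with a minimal sketch. The patent only states that a smaller deviation is evaluated more highly; the grade names and thresholds below are assumptions made purely for illustration.

```python
# Minimal sketch of the touch-timing evaluation: the absolute difference between
# the moment the object 106 coincides with the operation reference mark 105 and
# the moment of the player's touch determines the grade. Grade names and
# thresholds (in seconds) are illustrative assumptions, not values from the patent.
def evaluate_touch(object_time: float, touch_time: float) -> str:
    deviation = abs(touch_time - object_time)
    if deviation <= 0.05:
        return "PERFECT"
    if deviation <= 0.10:
        return "GREAT"
    if deviation <= 0.20:
        return "GOOD"
    return "MISS"

print(evaluate_touch(object_time=12.500, touch_time=12.420))  # "GREAT" (0.08 s early)
```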

The sound effect corresponding to each object 106 that is reproduced in response to a touch operation is selected from the plurality of sound effects recorded in the sound effect data 27. As shown in FIG. 4, the sound effect data 27 includes original data 27a recorded in the game data 22 in advance and user data 27b obtained from sounds the player inputs through the voice input device 9. Both the original data 27a and the user data 27b record a plurality of sound effects A1, B1, and so on. Taking the sound effect A1 as an example, it is associated with sound data sd_000, sd_001, sd_002, and so on, and the other sound effects B1, C1, ... have similar sound data. The user data 27b has the same sound data structure as the original data 27a for its sound effects A2, B2, ..., but differs in that its sound data are generated from sounds the player inputs through the voice input device 9 rather than being recorded in advance.
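For illustration only, the arrangement of FIG. 4 could be modeled as follows. This is a minimal Python sketch with assumed type names and empty placeholder payloads, not the patent's actual storage format.

```python
# Minimal sketch (assumed names) of the sound effect data 27 of FIG. 4:
# original data 27a recorded in advance and user data 27b built from the
# player's voice, where each sound effect (A1, B1, ...) maps to a set of
# pitched sound-data entries identified by codes such as sd_000.
from dataclasses import dataclass, field

@dataclass
class SoundEffect:
    name: str                                                   # e.g. "A1"
    sound_data: dict[str, bytes] = field(default_factory=dict)  # code -> waveform samples

@dataclass
class SoundEffectData:
    original: dict[str, SoundEffect] = field(default_factory=dict)  # original data 27a
    user: dict[str, SoundEffect] = field(default_factory=dict)      # user data 27b

effects = SoundEffectData()
effects.original["A1"] = SoundEffect("A1", {"sd_000": b"", "sd_001": b"", "sd_002": b""})
effects.user["A2"] = SoundEffect("A2", {})   # filled in later by the scale generation routine
```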

Next, the details of the sequence data 29 will be described. As shown in FIG. 5, the sequence data 29 includes an initial setting section 29a and an operation sequence section 29b. The initial setting section 29a describes information that differs for each piece of music and is needed to start the game, such as the tempo of the music (for example, its BPM), information specifying the sound effects to be generated when the lanes 101 to 103 are operated, and information specifying other game execution conditions.

The operation sequence section 29b, on the other hand, describes operation designation information 29c and sound effect switching instruction information 29d. The operation designation information 29c associates information designating one of the lanes 101 to 103 with the time at which that lane is to be operated. That is, as partly illustrated in FIG. 5, the operation designation information 29c is structured as a set of records, each pairing the time in the music at which an operation should be performed (the operation time) with information designating the operation unit (lane). The operation time is described as a bar number, a beat number, and a time value within the beat, separated by commas. The time within the beat is the elapsed time from the start of the beat, expressed as the number of unit times from the start of the beat when the length of one beat is divided equally into n unit times. For example, when n = 100, the time one quarter of the way into the second beat of the first bar of the music is designated as the operation time "01, 2, 025". The operation unit is described as "button 1" when the first lane 101 is designated, "button 2" for the second lane 102, and "button 3" for the third lane 103. In the example of FIG. 5, the operation times and operation units specify that the first lane 101 is to be touched at the start (000) of the first beat of the first bar, the second lane 102 at the start (000) of the second beat of the first bar, and the third lane 103 when the time corresponding to "025" has elapsed from the start of the second beat of the first bar.
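As a worked example of this operation-time format, the following minimal Python sketch converts a record such as "01,2,025" into seconds. The 4/4 meter, the fixed BPM taken from the initial setting section 29a, and the function name are assumptions for illustration.

```python
# Minimal sketch: convert an operation-time record "bar,beat,unit" (with each
# beat divided into n = 100 unit times, as in the example of FIG. 5) into
# seconds, assuming a 4/4 meter and a constant tempo in BPM.
def operation_time_to_seconds(record: str, bpm: float,
                              beats_per_bar: int = 4, n: int = 100) -> float:
    bar, beat, unit = (int(x) for x in record.split(","))
    seconds_per_beat = 60.0 / bpm
    # Bars and beats are 1-based in the example of FIG. 5.
    beats_from_start = (bar - 1) * beats_per_bar + (beat - 1) + unit / n
    return beats_from_start * seconds_per_beat

# "01,2,025" at 120 BPM: one beat plus a quarter of a beat -> 0.625 s
print(operation_time_to_seconds("01,2,025", bpm=120.0))
```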

Sound effect switching instruction information 29d is inserted at appropriate positions among the operation designation information 29c. Each piece of sound effect switching instruction information 29d associates the time in the music at which the sound effects are to be changed with the sound data of the sound effects to be generated when the lanes 101 to 103 are operated, thereby changing the sound effect generated when a lane designated by the operation designation information 29c is touched. The time in the music is described in the same format as the operation time in the operation designation information 29c. For each lane, the sound effect switching instruction information 29d designates sound data from either the original data 27a or the user data 27b recorded in the sound effect data 27. The information is inserted at the time in the music at which the sound effects should be switched, and the resulting setting is maintained until the next piece of sound effect switching instruction information 29d gives a new instruction.
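The way a switching instruction takes effect and then persists until the next one can be sketched as follows. The record layout and names are assumptions; the sound data codes follow the FIG. 5 example discussed later, where user-data sound effect A2 is assigned to the lanes from the third beat of the first bar (1.0 s at an assumed 120 BPM).

```python
# Minimal sketch (assumed record layout) of applying the sound effect switching
# instruction information 29d: each instruction pairs a time in the music with a
# lane-to-sound-data mapping, and the latest instruction whose time has been
# reached stays in effect until the next one.
LaneSounds = dict[str, str]   # lane name -> sound data code

def current_lane_sounds(instructions: list[tuple[float, LaneSounds]],
                        current_time: float,
                        defaults: LaneSounds) -> LaneSounds:
    mapping = dict(defaults)
    for time, change in sorted(instructions, key=lambda item: item[0]):
        if time <= current_time:
            mapping.update(change)        # later instructions override earlier ones
    return mapping

instructions = [(1.0, {"lane101": "sd_101", "lane102": "sd_105", "lane103": "sd_106"})]
print(current_lane_sounds(instructions, current_time=1.2,
                          defaults={"lane101": "sd_000", "lane102": "sd_001",
                                    "lane103": "sd_002"}))
```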

The sequence processing unit 15 of the game control unit 11 controls the display in each of the lanes 101 to 103 so that the object 106 coincides with the operation reference mark 105 at the operation time designated by the operation designation information 29c described above. The sequence processing unit 15 also switches, at the time in the music designated by the sound effect switching instruction information 29d, the sound effect generated when the player touches each designated lane 101 to 103.

Next, the processing performed by the game control unit 11 when the music game is executed on the game machine 1 will be described. When the game control unit 11 has read the game program 21 and completed the initial settings necessary for executing the music game, it waits for a game start instruction from the player. The game start instruction includes, for example, operations that specify the data to be used in the game, such as selecting the music to be played and the difficulty level. The procedure for receiving these instructions may be the same as in known music games.

When the game start is instructed, the game control unit 11 reads the music data 26 corresponding to the music selected by the player and outputs it to the audio output control unit 14, thereby starting reproduction of the music from the speaker 8. The control unit 10 thus functions as music reproduction means. In synchronization with the reproduction of the music, the game control unit 11 also reads the sequence data 29 corresponding to the player's selection and, referring to the image data 28, generates the image data necessary for drawing the operation instruction screen 100 and the information screen 110 and outputs it to the display control units 12 and 13, so that the operation instruction screen 100 and the information screen 110 are displayed on the monitors 3 and 4. Furthermore, while the music game is being executed, the game control unit 11 repeatedly executes the sequence processing routine shown in FIG. 6 at a predetermined cycle as the processing needed to display the operation instruction screen 100 and the like.

When the sequence processing routine of FIG. 6 starts, the sequence processing unit 15 of the game control unit 11 first acquires the current time in the music in step S1. For example, timekeeping is started by the internal clock of the game control unit 11 at the start of music reproduction, and the current time is obtained from the value of the internal clock. In the following step S2, the sequence processing unit 15 acquires from the sequence data 29 the operation timing data within the time range corresponding to the display range of the operation instruction screen 100. The display range is set, for example, to a time range corresponding to two bars of the music from the current time into the future.

In the next step S3, the sequence processing unit 15 calculates the coordinates, within the operation instruction screen 100, of all the objects 106 to be displayed in the lanes 101 to 103. As an example, the calculation is performed as follows. Based on the lane designation associated with each operation time in the display range, that is, the designation of "button 1" to "button 3" in the example of FIG. 5, it is determined in which of the lanes 101 to 103 each object 106 should be placed. The position of each object 106 in the time-axis direction measured from the operation reference mark 105 (that is, along the movement direction of the object 106) is then determined according to the time difference between its operation time and the current time. This yields the coordinates needed to arrange each object 106 along the time axis from the operation reference mark 105 in its designated lane.
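A minimal sketch of this mapping from time difference to screen position is given below. The linear scrolling, the lane height in pixels, and the function name are assumptions for illustration, not the patent's actual calculation.

```python
# Minimal sketch of the coordinate calculation in step S3, assuming a lane of
# fixed pixel height whose lower end is the operation reference mark 105 and a
# display range covering two bars of music into the future.
def object_y(op_time: float, current_time: float,
             display_range: float, lane_height: float) -> float | None:
    """Vertical position of an object: 0 at the reference mark, lane_height at the top."""
    dt = op_time - current_time
    if dt < 0 or dt > display_range:
        return None                              # outside the display range
    # The object enters at the top of the lane and reaches the reference
    # mark (y = 0) exactly when dt = 0.
    return lane_height * dt / display_range

# Two bars of 4/4 at 120 BPM = 4.0 s of display range.
print(object_y(op_time=13.0, current_time=12.0, display_range=4.0, lane_height=400.0))  # 100.0
```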

When the coordinate calculation for the objects 106 is complete, the sequence processing unit 15 proceeds to step S4 and determines whether sound effect switching instruction information 29d is present in the data acquired from the sequence data 29. If it is present, the sequence processing unit 15 obtains the current time in step S5, compares it with the time in the music designated by the sound effect switching instruction information 29d, and determines whether the current time corresponds to the switching timing. If it does, in step S6 the sequence processing unit 15 changes the sound effects to be generated for the lanes 101 to 103 designated by the subsequent operation designation information 29c to the sound effects specified by the sound effect switching instruction information 29d. For example, in the example of FIG. 5, the sound data sd_101, sd_105, and sd_106 of sound effect A2 in the user data 27b of the sound effect data 27 are assigned to the lanes 101, 102, and 103 respectively from the start of the third beat of the first bar of the music, and when the player touches the lanes 101 to 103 the corresponding sound data are reproduced. If no sound effect switching instruction information 29d is found in step S4, or if the current time does not correspond to the switching timing in step S5, the sequence processing unit 15 proceeds to step S7.

When the switching of sound effects is complete, the sequence processing unit 15 proceeds to step S7 and generates the image data necessary for drawing the operation instruction screen 100 based on the coordinates of the objects 106 calculated in step S3. Specifically, the image data is generated so that each object 106 is placed at its calculated coordinates. The image of the object 106 may be obtained from the image data 28.

In the following step S8, the sequence processing unit 15 outputs the image data to the display control unit 12, whereby the operation instruction screen 100 is displayed on the first monitor 3. When step S8 is complete, the sequence processing unit 15 ends the current pass of the sequence processing routine. By repeating this processing, the objects 106 are scroll-displayed in the lanes 101 to 103 so that each object 106 reaches the operation reference mark 105 at the operation time described in the sequence data 29.

Next, the processing of the pitch determination unit 16 and the scale generation unit 17 when a sound effect is created on the game machine 1 from a voice input by the player will be described. Creation of a sound effect is started, for example, when the player instructs it while the music game is not being executed. When creation of a sound effect starts, the pitch determination unit 16 first executes the pitch determination processing routine shown in FIG. 7, and then the scale generation unit 17 executes the scale generation processing routine shown in FIG. 8 based on the result of the pitch determination processing routine.

When the pitch determination processing routine of FIG. 7 starts, the pitch determination unit 16 of the game control unit 11 acquires the voice input by the player in step S11. When the player inputs a voice while the voice input device 9 is able to capture sound, raw voice data is produced. In the following step S12, the pitch determination unit 16 performs A/D conversion on the raw voice data, converting the analog signal into a digital signal and producing the voice data of the input voice. FIG. 9 shows an example of such voice data: the digitized waveform of a guitar sound, with the amplitude (dynamic range) plotted against elapsed time. A well-known technique may be used for the A/D conversion.

Then, in step S13, the pitch determination unit 16 obtains the frequency spectrum of the voice data. FIG. 10 shows a frequency spectrum generated by a fast Fourier transform from the voice data obtained in step S12; the horizontal axis indicates frequency and the vertical axis indicates the magnitude of each frequency component. Generation of the frequency spectrum is not limited to computation by the fast Fourier transform, and various known techniques may be used. In the following step S14, the pitch determination unit 16 determines a representative value from the frequency spectrum obtained in step S13. The representative value is the frequency at which the spectrum is largest; in the graph of FIG. 10, the peak frequency indicated by the arrow p is the representative value. The pitch of the voice data based on the voice input by the player is determined from the frequency of this representative value. Alternatively, the representative value may be calculated from the data of the band q extending to both sides of the largest peak; even when the peak is unclear, for example when it is broad, this method allows a representative value to be calculated from a band of a certain width. When step S14 is complete, the pitch determination unit 16 ends the current pass of the pitch determination processing routine. Through this processing, a representative value is determined for the voice data based on the voice input by the player, and a unique pitch is determined.
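Steps S13 and S14 amount to taking a magnitude spectrum and picking its peak. The following minimal NumPy sketch illustrates the idea under assumed parameters (sample rate, Hann window); it is not the patent's actual implementation.

```python
# Minimal sketch of steps S13-S14: compute the magnitude spectrum of the voice
# data with an FFT and use the peak frequency as the representative value.
import numpy as np

def representative_frequency(samples: np.ndarray, sample_rate: int) -> float:
    """Return the peak frequency (Hz) of the magnitude spectrum (step S14)."""
    windowed = samples * np.hanning(len(samples))     # reduce spectral leakage (assumed)
    spectrum = np.abs(np.fft.rfft(windowed))          # step S13: frequency spectrum
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return float(freqs[np.argmax(spectrum)])          # peak indicated by arrow p in FIG. 10

# Example: a 440 Hz test tone sampled at 44.1 kHz.
sr = 44100
t = np.arange(sr) / sr
print(representative_frequency(np.sin(2 * np.pi * 440.0 * t), sr))  # approximately 440.0
```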

When the representative value has been obtained by the pitch determination processing routine, the scale generation unit 17 executes the scale generation processing routine of FIG. 8. In step S21, the scale generation unit 17 generates, from the voice data whose representative value has been determined, a plurality of sound data forming a scale: it frequency-converts the voice data based on the representative value so that the representative value of each resulting sound data matches the frequency of one of the notes of a scale spanning a predetermined number of octaves. FIG. 11 shows an example of frequency-converted sound data; its waveform is the voice data of FIG. 9 shifted upward by one octave. In step S22, the scale generation unit 17 stores the generated set of sound data in the sound effect data 27, specifically in the user data 27b. When step S22 is complete, the scale generation unit 17 ends the current pass of the scale generation processing routine. Through this processing, a plurality of sound data whose representative frequencies differ are generated from the voice data whose representative value was determined, and a scale is formed. The set of sound data forming the scale is stored as a sound effect in the user data 27b of the sound effect data 27.
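The patent does not specify the frequency conversion algorithm. The following minimal sketch uses naive resampling (which also changes the duration) to shift the representative frequency of the captured voice to each note of a one-octave equal-tempered scale, as one simple way to realize step S21; all names and parameters are assumptions.

```python
# Minimal sketch of step S21: generate one sound data entry per note of a
# one-octave equal-tempered scale by naive resampling of the voice data.
import numpy as np

def shift_to_ratio(samples: np.ndarray, ratio: float) -> np.ndarray:
    """Resample so every frequency in the signal is multiplied by `ratio`."""
    new_len = int(round(len(samples) / ratio))
    old_idx = np.arange(len(samples))
    new_idx = np.linspace(0, len(samples) - 1, new_len)
    return np.interp(new_idx, old_idx, samples)

def build_scale(samples: np.ndarray, rep_freq: float,
                target_freqs: list[float]) -> dict[float, np.ndarray]:
    """Generate the set of pitched sound data stored as user data 27b."""
    return {f: shift_to_ratio(samples, f / rep_freq) for f in target_freqs}

rep = 220.0                                           # assumed representative frequency
scale_freqs = [rep * 2 ** (k / 12) for k in range(13)]  # 13 notes including the octave
sr = 22050
t = np.arange(sr) / sr
voice = np.sin(2 * np.pi * rep * t)                   # stand-in for the captured voice data
user_sounds = build_scale(voice, rep, scale_freqs)
print(len(user_sounds), len(user_sounds[rep * 2]))    # 13 notes; octave-up version is half as long
```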

In the embodiment described above, the external storage device 20 of the game machine 1 functions as the sound effect data storage means and the sequence data storage means. The control unit 10 functions as the pitch determination means by having the pitch determination unit 16 execute steps S11 to S14 of FIG. 7, as the scale generation means by having the scale generation unit 17 execute step S21 of FIG. 8, and as the sound effect data storage control means by having the scale generation unit 17 execute step S22 of FIG. 8.

The present invention is not limited to the embodiment described above and can be implemented in various forms. For example, although the music game machine 1 has been described as an example of a device in which the pitch determination means, the scale generation means, and the sound effect data storage control means function, the present invention is not limited to this and may be applied to various electronic devices, such as electronic musical instruments. When the present invention is applied to an electronic musical instrument, a melody can be played with an arbitrary voice input by the player.

The music game system of the present invention is not limited to one realized by a portable game machine; it may be realized in any appropriate form, such as a stationary home game machine, an arcade game machine installed in a commercial facility, or a game system realized over a network. The input device is not limited to one using a touch panel, and input devices of various configurations, such as push buttons, levers, and trackballs, can be used.

1 music game machine, 8 speaker (sound output device), 9 voice input device, 10 control unit (computer), 16 pitch determination unit, 17 scale generation unit, 20 external storage device (sound effect data storage means, sequence data storage means), 27 sound effect data, 29 sequence data

Claims (6)

  1. A music game system comprising:
    a voice input device for inputting voice;
    a sound output device for reproducing and outputting game sounds;
    sound effect data storage means for storing sound effect data for outputting each of a plurality of sound effects having different pitches from the sound output device;
    sequence data storage means for storing sequence data describing a relationship between a player's operation and the sound effect to be output in response to the operation;
    pitch determination means for determining a pitch representative of the input voice based on voice data of the voice input by the voice input device;
    scale generation means for selecting a plurality of pitches different from the representative pitch determined by the pitch determination means so that a musical scale is formed, and generating sound data by frequency conversion of the voice data for each of the selected pitches; and
    sound effect data storage control means for storing the plurality of sound data generated by the scale generation means in the sound effect data storage means as at least part of the sound effect data.
  2. The music game system according to claim 1, wherein the pitch determination means determines the pitch of the voice by specifying a representative frequency from the voice data of the voice input by the voice input device.
  3. The music game system according to claim 1 or 2, wherein the scale generation means generates a scale spanning at least one octave.
  4. The music game system according to any one of claims 1 to 3, further comprising an input device having at least one operation unit, wherein a sound effect according to the description of the sequence data is reproduced from the sound output device based on the player's operation of the input device.
  5. A computer program for a music game system, the music game system comprising a voice input device for inputting voice, a sound output device for reproducing and outputting game sounds, sound effect data storage means for storing sound effect data for outputting each of a plurality of sound effects having different pitches from the sound output device, and sequence data storage means for storing sequence data describing a relationship between a player's operation and the sound effect to be output in response to the operation, the computer program causing a computer incorporated in the music game system to function as:
    pitch determination means for determining a pitch representative of the input voice based on voice data of the voice input by the voice input device;
    scale generation means for selecting a plurality of pitches different from the representative pitch determined by the pitch determination means so that a musical scale is formed, and generating sound data by frequency conversion of the voice data for each of the selected pitches; and
    sound effect data storage control means for storing the plurality of sound data generated by the scale generation means in the sound effect data storage means as at least part of the sound effect data.
  6. A method of generating sound effect data, executed by a computer incorporated in a music game system comprising a voice input device for inputting voice, a sound output device for reproducing and outputting game sounds, sound effect data storage means for storing sound effect data for outputting each of a plurality of sound effects having different pitches from the sound output device, and sequence data storage means for storing sequence data describing a relationship between a player's operation and the sound effect to be output in response to the operation, the method comprising:
    a pitch determination step of determining a pitch representative of the input voice based on voice data of the voice input by the voice input device;
    a scale generation step of selecting a plurality of pitches different from the representative pitch determined in the pitch determination step so that a musical scale is formed, and generating sound data by frequency conversion of the voice data for each of the selected pitches; and
    a sound effect data storage control step of storing the plurality of sound data generated in the scale generation step in the sound effect data storage means as at least part of the sound effect data.
JP2009210571A 2009-09-11 2009-09-11 Music game system, computer program thereof, and method of generating sound effect data Active JP5399831B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009210571A JP5399831B2 (en) 2009-09-11 2009-09-11 Music game system, computer program thereof, and method of generating sound effect data

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2009210571A JP5399831B2 (en) 2009-09-11 2009-09-11 Music game system, computer program thereof, and method of generating sound effect data
US13/394,967 US20120172099A1 (en) 2009-09-11 2010-09-07 Music game system, computer program of same, and method of generating sound effect data
CN201080039640.3A CN102481488B (en) 2009-09-11 2010-09-07 Music game system and method of generating sound effect data
PCT/JP2010/065337 WO2011030761A1 (en) 2009-09-11 2010-09-07 Music game system, computer program of same, and method of generating sound effect data

Publications (2)

Publication Number Publication Date
JP2011056122A JP2011056122A (en) 2011-03-24
JP5399831B2 true JP5399831B2 (en) 2014-01-29

Family

ID=43732433

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009210571A Active JP5399831B2 (en) 2009-09-11 2009-09-11 Music game system, computer program thereof, and method of generating sound effect data

Country Status (4)

Country Link
US (1) US20120172099A1 (en)
JP (1) JP5399831B2 (en)
CN (1) CN102481488B (en)
WO (1) WO2011030761A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6360280B2 (en) * 2012-10-17 2018-07-18 任天堂株式会社 Game program, game device, game system, and game processing method

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1106209C (en) * 1989-01-10 2003-04-23 任天堂株式会社 Electronic gaming device with pseude-stereophonic sound generating cap abilities
JPH08123448A (en) * 1994-10-18 1996-05-17 Sega Enterp Ltd Image processor using waveform analysis of sound signal
AT243878T (en) * 1994-12-02 2003-07-15 Sony Computer Entertainment Inc Method for generating sound source data, recording medium and processor for such data.
EP1011091B1 (en) * 1995-09-29 2004-04-28 Yamaha Corporation Musical tone-generating method and musical tone-generating apparatus
AU4496797A (en) * 1997-04-14 1998-11-11 Thomson Consumer Electronics, Inc System for forming program guide information for user initiation of control and communication functions
US6464585B1 (en) * 1997-11-20 2002-10-15 Nintendo Co., Ltd. Sound generating device and video game device using the same
JP4236024B2 (en) * 1999-03-08 2009-03-11 株式会社フェイス Data reproducing apparatus and information terminal
JP2001009152A (en) * 1999-06-30 2001-01-16 Konami Co Ltd Game system and storage medium readable by computer
EP1212747A1 (en) * 1999-09-16 2002-06-12 Hanseulsoft Co., Ltd. Method and apparatus for playing musical instruments based on a digital music file
JP3630075B2 (en) * 2000-05-23 2005-03-16 ヤマハ株式会社 Sub-melody generation apparatus and method, and storage medium
JP4497264B2 (en) * 2001-01-22 2010-07-07 株式会社セガ Game program, game apparatus, sound effect output method, and recording medium
JP2002351489A (en) * 2001-05-29 2002-12-06 Namco Ltd Game information, information storage medium, and game machine
JP4206332B2 (en) * 2003-09-12 2009-01-07 任天堂株式会社 Input device, game system, program, and information storage medium
JP3981382B2 (en) * 2005-07-11 2007-09-26 株式会社コナミデジタルエンタテインメント Game program, game device, and game control method
CN1805003B (en) * 2006-01-12 2011-05-11 深圳市蔚科电子科技开发有限公司 Pitch training method
JP4108719B2 (en) * 2006-08-30 2008-06-25 株式会社バンダイナムコゲームス Program, information storage medium, and game device
JP2008178449A (en) * 2007-01-23 2008-08-07 Yutaka Kojima Puzzle game system and numeric keypad character
US20080200224A1 (en) * 2007-02-20 2008-08-21 Gametank Inc. Instrument Game System and Method
JP4467601B2 (en) * 2007-05-08 2010-05-26 ソニー株式会社 Beat enhancement device, audio output device, electronic device, and beat output method

Also Published As

Publication number Publication date
CN102481488B (en) 2015-04-01
JP2011056122A (en) 2011-03-24
US20120172099A1 (en) 2012-07-05
CN102481488A (en) 2012-05-30
WO2011030761A1 (en) 2011-03-17


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20111017

RD04 Notification of resignation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7424

Effective date: 20120703

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20130108

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20130307

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20131001

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20131024

R150 Certificate of patent or registration of utility model

Ref document number: 5399831

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250