CN116635119A - Game system, computer program used in the game system, and control method - Google Patents


Info

Publication number: CN116635119A
Application number: CN202180087065.2A
Authority: CN (China)
Other languages: Chinese (zh)
Inventors: 佐原祐, 宫崎泰地
Applicant and current assignee: Konami Digital Entertainment Co., Ltd. (the listed assignee may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45: Controlling the progress of the video game
    • A63F13/48: Starting a game, e.g. activating a game device or waiting for other players to join a multiplayer session
    • A63F13/79: Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A63F13/798: Assessing skills or ranking players, e.g. for generating a hall of fame
    • A63F13/814: Musical performances, e.g. by evaluating the player's ability to follow a notation
    • A63F13/847: Cooperative playing, e.g. requiring coordinated actions from several players to achieve a common goal


Abstract

Provided is a game system capable of causing a plurality of characters in a game to execute, before a performance starts, the same kind of pace-matching process that a plurality of persons execute before a real performance. An HMD game machine (4) provides a music game including a performance performed by a plurality of characters (54), the plurality of characters (54) including a user character (54C) operated by the user. The HMD game machine (4) also judges whether or not a start condition is satisfied, a necessary condition of the start condition being the formation of a cooperative state between the user character (54C) and another character (54B), the cooperative state being formed when the other character (54B) performs a response action responding to a cooperative action performed by the user character (54C). When the start condition is satisfied, the performance is started on that basis; until the start condition is satisfied, the start of the performance is held on standby.

Description

Game system, computer program used in the game system, and control method
Technical Field
The present invention relates to a game system and the like that is connected to an input device for inputting a user's game play actions and to a display device for displaying a game screen that includes a plurality of characters, including a user character operated by the user's game play actions, and that provides a game including a performance performed by the plurality of characters.
Background
There is a game system of the following type: the game system is connected to an input device for inputting a user's game play actions and to a display device for displaying a game screen that includes a plurality of characters, including a user character operated by the user's game play actions, and provides a game including a performance performed by the plurality of characters. As such a game, a game system is known that provides a music game in which the player gives a live singing performance as the leader of a band (see, for example, Patent Document 1).
Prior art literature
Patent literature
Patent Document 1: Japanese Patent No. 6727807
Disclosure of Invention
Problems to be solved by the invention
In a real performance by a plurality of persons, the performance is often started after the performers match their pace with one another by eye contact or the like. Such a pace-matching process can therefore be expected to improve the sense of presence and the sense of unity. Meanwhile, the music game of Patent Document 1 realizes a live performance of the player's own songs together with band members. In that music game, the performance starts in response to the input of a start action (a hand motion) by the player, whereupon the game transitions to the live sequence. That is, in that music game, the live performance is started by the player's hand motion. However, this merely uses the start action as a substitute for pressing a play button; it does not match the pace among a plurality of characters. There is therefore room to improve the sense of presence and the like of performances in the game.
Accordingly, an object of the present invention is to provide a game system and the like capable of causing a plurality of characters in a game to execute, before a performance starts, the same kind of pace-matching process that a plurality of persons execute in a real performance.
Solution for solving the problem
A game system according to the present invention is a game system connected to an input device for inputting a user's game play actions and to a display device for displaying a game screen so as to include a plurality of characters including a user character that is a character operated by the user's game play actions, the game system providing a game including a performance performed by the plurality of characters, the game system including: a condition discrimination unit that discriminates whether or not a start condition is satisfied, a necessary condition of the start condition including the formation of a cooperative state, which is a state formed at least between at least one character and another character when the other character performs a response action, that is, a prescribed action responding to an action performed by the at least one character; and a progress control unit that controls the progress of the game such that, when the start condition is satisfied, the performance is started on the basis of that satisfaction, while until the start condition is satisfied, the start of the performance is held on standby.
A computer program according to the present invention is configured to cause a computer connected to the input device and the display device to function as each unit of the game system described above.
Further, a control method according to the present invention is a control method for causing a computer mounted in a game system to execute a condition discrimination process and a progress control process, the game system being connected to an input device for inputting a user's game play actions and to a display device for displaying a game screen so as to include a plurality of characters including a user character that is a character operated by the user's game play actions, the game system providing a game including a performance performed by the plurality of characters, wherein in the condition discrimination process, it is discriminated whether or not a start condition is satisfied, a necessary condition of the start condition including the formation of a cooperative state, which is a state formed at least between at least one character and another character when the other character performs a response action, that is, a prescribed action responding to an action performed by the at least one character, and in the progress control process, the progress of the game is controlled such that, when the start condition is satisfied, the performance is started on the basis of that satisfaction, while until the start condition is satisfied, the start of the performance is held on standby.
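As a rough illustration of the condition discrimination unit and the progress control unit described above, the following Python sketch holds the start of the performance on standby until a cooperative state has been formed between the user character and every other character. All names here (`Character`, `try_start_performance`, the `responded` flag) are hypothetical; the actual embodiment is not limited to this form.

```python
from dataclasses import dataclass


@dataclass
class Character:
    name: str
    # True once this character has performed the response action
    # (e.g. returned eye contact) to the user character's cooperative action.
    responded: bool = False


def cooperative_state_formed(user_cue_given: bool, others: list) -> bool:
    """The cooperative state is formed when the user character has performed
    its cooperative action and every other character has responded to it."""
    return user_cue_given and all(c.responded for c in others)


def try_start_performance(user_cue_given: bool, others: list) -> str:
    """Condition discrimination + progress control: start the performance
    only once the start condition (the cooperative state) is satisfied;
    otherwise keep the start of the performance on standby."""
    if cooperative_state_formed(user_cue_given, others):
        return "performance_started"
    return "standby"
```

Here the cooperative state is modeled simply as one flag per character; in the embodiment, the response actions described later would be what sets it.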
Drawings
Fig. 1 is a diagram showing an outline configuration of a game system according to an embodiment of the present invention.
Fig. 2 is a functional block diagram showing a main part of a control system of the game system.
Fig. 3 is a diagram schematically showing an example of a game screen.
Fig. 4 is an explanatory diagram for explaining the positional relationship of each character in the virtual three-dimensional space.
Fig. 5 is a diagram schematically showing an example of the game screen in the case where the field-of-view range has moved to the obliquely rightward range in the example of fig. 4.
Fig. 6 is an explanatory diagram for explaining an example of the flow until the performance starts.
Fig. 7 is an explanatory diagram for explaining a field of view set in a game screen.
Fig. 8 is an explanatory diagram for explaining an example of the response action.
Fig. 9 is an explanatory diagram for explaining a flow of changes in the motion of other character images.
Fig. 10 is a diagram showing an example of the structure of the performance data.
Fig. 11 is a flowchart showing an example of the procedure of the performance selection process.
Fig. 12 is a flowchart showing an example of the procedure of the cooperation response processing.
Fig. 13 is a flowchart showing an example of a procedure of the music piece start processing.
Fig. 14 is a flowchart showing an example of a procedure of the performance data generation process.
Detailed Description
An example of a game system according to an embodiment of the present invention will be described below. First, the overall configuration of the game system will be described with reference to fig. 1. The game system 1 includes a center server 2 as a server device. The center server 2 is configured by combining a plurality of server units, which are computer devices, to form a single logical server device. Alternatively, the center server 2 may be logically constituted by cloud computing.
One game device or an appropriate number of game devices are connected to the center server 2 as client devices connectable via the network 3. The game devices may suitably include various types, such as arcade game machines (commercial game machines installed in a store or the like that allow users to play a game, in exchange for a predetermined fee, within a range corresponding to that fee); in the example of fig. 1, user terminal devices 4 are shown. Specifically, the center server 2 is connected via the network 3 to a plurality of user terminal devices 4, which are examples of game devices.
The user terminal device 4 is a network-connectable computer device used personally by a user. By installing various computer software, the user terminal device 4 allows the user to enjoy the various services provided by the center server 2. Such computer software (application programs) includes game applications for providing paid or free games. The user terminal device 4 functions as a game machine by executing such a game application. Such user terminal devices 4 include, for example, stationary or notebook personal computers, stationary home game machines, mobile tablet terminal devices, mobile phones (including smartphones), and various other mobile terminal devices. Any of these terminal devices can be used as appropriate as the user terminal device 4, but in the example of fig. 1, an HMD (head mounted display, or head mounted device) type game machine is used.
An HMD game machine is a well-known game machine worn on the head so that the display surface of its display occupies a large part of the user's field of view. HMD type game machines include, for example, dedicated HMD game machines; game machines composed of a suitable mobile terminal device, such as a mobile phone, combined with a box that holds the terminal device so that its display surface faces the user's field of view; and glasses-type projectors (so-called smart glasses) that project an image so that it comes into focus on the inside of the eyeball. The dedicated HMD game machines further include linked game machines that connect to a tablet terminal device, smartphone, or the like and function as a game machine through the tablet terminal device's application programs. Any of these various HMD type game machines can be used as the user terminal device 4. Hereinafter, the HMD type game machine functioning as the user terminal device 4 is given the same reference numeral as the user terminal device 4 and is sometimes referred to as the HMD game machine 4.
The HMD game machine 4 is appropriately provided with various input devices for inputting the user's game play actions. For example, a sensor that detects head motion as a game play action is built into the HMD game machine 4, and this sensor may function as an input device. Alternatively, another device independent of the HMD game machine 4 may be connected to it by an appropriate method and function as an input device. As described above, various input devices may be provided as appropriate, and in the example of fig. 1, a joystick OS is provided as such an input device. The joystick OS is a well-known input device connected to the HMD game machine 4 via a predetermined wireless communication standard. The joystick OS can be used to input various operations (game play actions), and accordingly an appropriate number of joysticks OS can be connected to each HMD game machine 4; as an example, two joysticks OS (only one of which is shown in fig. 1) are connected to each HMD game machine 4 so as to be operated by the left and right hands, respectively. The HMD game machine 4 provides a game that progresses through game play actions input via such joysticks OS.
The HMD game machine 4 can appropriately provide various games, for example action games, simulation games, and role-playing games; as an example, it provides a music game. A music game is one kind of timing game. A timing game is a type of game in which the execution timing of an appropriate game play action is evaluated. In the case of a music game, the execution times at which appropriate game play actions should be executed are presented together with a musical composition. Moreover, in a music game, times matching the melody of the musical composition are used as the execution times. That is, a music game is a type of game in which the user is guided, in a manner matching the melody of the musical composition, as to the times at which appropriate game play actions should be executed, and is evaluated on the times at which those game play actions are actually executed. In addition, in a music game, for example, a plurality of musical compositions are prepared for playing the game, and a composition selected from among them is used in actual play. Such a music game can be provided appropriately via various output devices, and is provided, for example, through a game screen displayed on a display.
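The timing-game evaluation described above can be pictured with a small sketch: an actual input time is graded by how far it deviates from the execution time to be evaluated. The grading windows below (50 ms and 120 ms) and the grade names are illustrative assumptions, not values taken from this disclosure.

```python
def evaluate_timing(input_time_ms: float, target_time_ms: float) -> str:
    """Grade a game play action by how far its actual input time deviates
    from the execution time at which it should have been executed."""
    delta = abs(input_time_ms - target_time_ms)
    if delta <= 50:       # within 50 ms of the execution time (assumed window)
        return "PERFECT"
    if delta <= 120:      # within 120 ms (assumed window)
        return "GOOD"
    return "MISS"
```

For example, an input 30 ms after the target time would grade as "PERFECT" under these assumed windows, while one 200 ms late would grade as "MISS".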
The game provided by the HMD game machine 4 includes a plurality of characters (which may include various objects such as vehicles and animals) and a performance performed by the plurality of characters. The plurality of characters can perform various performances as appropriate depending on the type of game, such as an action game, but in the case of a music game, as an example, they perform on musical instruments (hereinafter simply referred to as a performance). Accordingly, the game screen sometimes includes a plurality of characters for carrying out such a performance. Details of such game screens are described later.
The network 3 may be configured appropriately as long as the HMD game machine 4 can be connected to the center server 2. As an example, the network 3 is configured to realize network communication using the TCP/IP protocol. Typically, the Internet as a WAN and intranets as LANs are combined to constitute the network 3. In the example of fig. 1, the center server 2 is connected to the network 3 via a router 3a, and the HMD game machine 4 is connected to the network 3 via an access point 3b. The network 3 is not limited to systems using the TCP/IP protocol. Various modes using wired communication lines, wireless lines (including infrared communication, short-range wireless communication, and the like), or the like can be employed as the network 3.
The center server 2 provides various Web services to the users of the HMD game machines 4 via the network 3. The Web services include a distribution service that distributes various data or software (including updates of data and the like) to each HMD game machine 4. In addition, when the music game is played as a competitive or cooperative game, the Web services may appropriately include various services such as a matching service for matching a user with users (opponents or collaborators) of other HMD game machines 4 via the network 3, and a service for assigning a user ID for identifying each user.
Next, the main parts of the control system of the game system 1 will be described with reference to fig. 2. First, the center server 2 is provided with a control unit 21 and a storage unit 22 as storage means. The control unit 21 is configured as a computer combining a CPU, which is an example of a processor that executes various arithmetic processing and operation control according to predetermined computer programs, with the internal memory and other peripheral devices necessary for its operation.
The storage unit 22 is an external storage device implemented by a storage unit including a nonvolatile storage medium (computer-readable storage medium) such as a hard disk array. The storage unit 22 may be configured to hold all data in one storage unit, or may be configured to store data in a plurality of storage units in a distributed manner. The program PG1 is recorded in the storage unit 22 as an example of a computer program for causing the control unit 21 to execute various processes required for providing various services to the user. The storage unit 22 stores server data necessary for providing various services. Such server data includes various data for services, but in the example of fig. 2, as one of such various data, musical composition data MD, sequence data QD, play data PD, and performance data OD are shown.
The music data MD is data for playing back each musical composition; it is used, for example, for playback of each composition in the music game. The sequence data QD is data describing each execution time at which an appropriate game play action should be executed in the music game. The sequence data QD is used to guide these execution times to the user. In addition, when a game play action is actually executed by the user, it is evaluated against the execution times of the sequence data QD. That is, the sequence data QD is used both for guidance of each execution time and for evaluation against it. For this purpose, the sequence data QD describes each execution time in association with information on the appropriate game play action to be executed at that time. When a plurality of musical compositions or a plurality of difficulty levels are prepared in the music game, sequence data QD is prepared for each composition and each difficulty level. Furthermore, when the play targets of the music game include a plurality of characters, the execution times and the appropriate game play actions may differ depending on the selected character (or on the musical instrument, which differs for each character as described later). In that case, sequence data QD is also prepared for each such character (or instrument). The play data PD is data describing information on each user's past play results. The play data PD is used to carry over a user's play results up to the previous session (past records) to subsequent games, and to carry over the settings specific to each user. The performance data OD is data for realizing the performance. Details of the performance data OD are described later.
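A minimal sketch of how the sequence data QD might be organized, keyed by composition, difficulty, and instrument as the paragraph describes. The field names (`time_ms`, `action`) and the lookup helper are hypothetical illustrations, not the actual data format of the embodiment.

```python
# Hypothetical sequence data: each entry correlates an execution time
# (milliseconds from the start of the musical composition) with the
# game play action expected at that time.
sequence_data = {
    ("song_01", "easy", "drums"): [
        {"time_ms": 500, "action": "hit_left_drum"},
        {"time_ms": 1000, "action": "hit_right_cymbal"},
    ],
}


def actions_in_window(song: str, difficulty: str, instrument: str,
                      start_ms: int, end_ms: int) -> list:
    """Look up the execution times that fall inside a guidance window,
    e.g. to decide which instruction objects to make appear next."""
    entries = sequence_data.get((song, difficulty, instrument), [])
    return [e for e in entries if start_ms <= e["time_ms"] < end_ms]
```

Keying by (composition, difficulty, instrument) mirrors the statement that sequence data is prepared per composition, per difficulty level, and per character or instrument.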
In addition to this, the server data may further include various data for realizing various services. Such data may include, for example, image data or ID management data, etc. The image data is data for causing a display device to display various images such as a game screen. The ID management data is data for managing various IDs such as user IDs. However, illustration of these data is omitted.
The control unit 21 is provided with a Web service management section 24 as a logic device implemented by a combination of a hardware resource of the control unit 21 and a program PG1 as a software resource. The Web service management unit 24 executes various processes for providing the HMD game machine 4 with the Web service described above.
On the other hand, the HMD game machine 4 is provided with a control unit 41 and a storage unit 42 as storage means. The control unit 41 is configured as a computer combining a CPU, which is an example of a processor that executes various arithmetic processing and operation control according to predetermined computer programs, with the internal memory and other peripheral devices necessary for its operation.
The storage unit 42 is an external storage device implemented by a storage unit including a nonvolatile storage medium (a computer-readable storage medium) such as a hard disk or a semiconductor storage device. The program PG2 is recorded in the storage unit 42 as an example of a computer program for causing the control unit 41 to execute the various processes required to provide various services to the user. In addition, game data necessary for providing the music game is recorded in the storage unit 42. Such game data includes various data for the music game; in the example of fig. 2, the music data MD, sequence data QD, play data PD, and performance data OD are shown as examples.
The music data MD, sequence data QD, play data PD, and performance data OD can be provided to the storage unit 42 by various methods, such as initial installation or provision via various storage media; as an example, they are provided from the center server 2 via the distribution service. When sounds other than music are used, the game data may also include various data for the game, such as sound data for playing back those sounds. Such data can appropriately include image data, ID management data, and the like provided by the distribution service or the like, similarly to the music data MD and so on. However, illustration of these data is omitted.
In the control unit 41, various logic devices are constituted by a combination of hardware resources of the control unit 41 and the program PG2 as a software resource. Further, various processes necessary for providing a music game (including processes necessary for enjoying a Web service provided by the Web service management section 24 of the central server 2) are executed by these logic devices, and in the example of fig. 2, the progress control section 43 and the data management section 44 are shown as logic devices associated with a music game.
The progress control unit 43 is a logic device for the various processes required for the progress of the game. Such processes include, for example, processes that carry out various preparations for playing the game, processes that guide the user through each execution time for a game play action, and processes that evaluate the game play actions executed by the user. The preparations for playing the game include appropriate elements such as various settings, for example the provision of selection opportunities and the determination of when to start guidance for each execution time. Specifically, as examples of such processes, the progress control unit 43 executes the performance selection process, the cooperation response process, and the musical composition start process. The data management unit 44, on the other hand, is a logic device that executes various processes related to management of the game data recorded in the storage unit 42. For example, the data management unit 44 executes processes for acquiring game data supplied from the center server 2 and saving it in the storage unit 42. As an example of such processes, the data management unit 44 executes the performance data generation process. Details of the procedures of the performance selection process, the cooperation response process, the musical composition start process, and the performance data generation process are described later.
Various output devices and input devices can be appropriately provided in the HMD type game machine 4, and in the example of fig. 2, a display 47 and a speaker SP are provided as an example of the output devices, and a sensor SM is provided as an example of the input devices. The display 47 is a well-known display device for displaying a game screen or the like. The speaker SP is a well-known sound playing device for playing various sounds including music. The sensor SM is a well-known detection device (detection means) for detecting various states of the HMD game machine 4. The sensor SM may suitably include various detection devices according to the state of the detection target, and may suitably include, for example, an eye tracking sensor or the like for tracking the line of sight of the user, and in the example of fig. 2, an acceleration sensor SM1 and a gyro sensor SM2 are shown as examples of such various sensors.
The acceleration sensor SM1 is a well-known detection device for detecting accelerations generated in the HMD game machine 4 (for example, accelerations in three axial directions). By detecting such accelerations, the acceleration sensor SM1 can be used flexibly as appropriate; for example, it is used to detect states such as whether the HMD game machine 4 is level, or its inclination or orientation. Similarly, the gyro sensor SM2 is a well-known detection device for detecting changes in angle about reference axes (for example, three axes). By detecting such angles, the gyro sensor SM2 can likewise be used flexibly; for example, it is used to detect states such as the rotation, inclination, or orientation of the HMD game machine 4. The detection results of the acceleration sensor SM1 and the gyro sensor SM2 may each be used alone (in which case the other sensor may be omitted), or may be used in combination, for example to detect various states such as the orientation of the HMD game machine 4.
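Combining an acceleration sensor and a gyro sensor to estimate orientation, as described above, is commonly implemented with a complementary filter. The sketch below is a generic illustration of that technique, not code from this disclosure; the blend factor `alpha` is an assumed value.

```python
def complementary_filter(prev_angle_deg: float, gyro_rate_dps: float,
                         accel_angle_deg: float, dt_s: float,
                         alpha: float = 0.98) -> float:
    """Fuse the gyro's integrated angular rate (smooth, but drifts over time)
    with the accelerometer's gravity-based tilt estimate (noisy, but
    drift-free) into a single head-orientation angle in degrees."""
    gyro_estimate = prev_angle_deg + gyro_rate_dps * dt_s
    return alpha * gyro_estimate + (1 - alpha) * accel_angle_deg
```

Run each frame, this trusts the gyro for short-term changes while the accelerometer slowly corrects the accumulated drift, which matches the idea of using the two sensors' results in combination to detect orientation.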
Various other output devices and input devices may be connected to the HMD game machine 4 as appropriate; in the example of fig. 2, the joystick OS described above is connected. The joystick OS is connected to the control unit 41 as described above and outputs various input signals corresponding to game play actions to the control unit 41.
Next, an example of a game screen for playing the music game will be described with reference to fig. 3. Fig. 3 is a diagram schematically showing an example of a game screen displayed on the display 47 of the HMD game machine 4. The music game can include various game screens; the example of fig. 3 shows a game screen that guides the user through each execution time of the game play actions to be executed. The music game may be configured appropriately, and as an instrument performance, the performance sound of the instrument is often played back in response to game play actions. In this case, in order to enhance the sense of presence, a character that performs the playing motions of the instrument in association with those game play actions may appear on the game screen. Similar characters corresponding to other users (collaborators) or to the computer may also appear. The example of fig. 3 shows a game screen including these characters. More specifically, it shows a game screen including two characters: a character that performs playing motions on a drum set, as an example of an instrument, in association with the user's game play actions, and a character that performs playing motions on another instrument in association with another user's game play actions or the like. Such a game screen may be configured appropriately, but the example of fig. 3 shows a case where the performance is configured to take place in a virtual three-dimensional space. The virtual three-dimensional space may be appropriately cropped from the imaging result of a virtual camera and displayed as the game screen; the example of fig. 3 shows a case where the virtual three-dimensional space is cropped so as to show the field of view of the character corresponding to the user (hereinafter sometimes referred to as the user character), imaged by a virtual camera set approximately at the position of that character's eyes, and displayed as the game screen. In this case, the game screen 50 includes instruction objects 51, a score display area 52, a stage area 53, character images 54, and a drum set image 55.
The drum set image 55 is an image imitating a drum set as a musical instrument. The drum set image 55 may correspond to an appropriate drum set; in the example of fig. 3, it includes two drum images 55a and four cymbal images 55b. Each drum image 55a and each cymbal image 55b is an image corresponding to a drum or a cymbal in the drum set, respectively, and they are shown in the same arrangement as an actual drum set. Specifically, the two drum images 55a are arranged side by side near the center of the drum set image 55. The four cymbal images 55b are arranged two by two on the left and right sides so as to sandwich the two drum images 55a, with the two cymbal images 55b on each side divided into upper and lower positions.
The score display area 52 is an area for displaying the score obtained. The score display area 52 may be configured as appropriate; in the example of fig. 3, it is formed in a square shape and displayed so as to be positioned on the back side (the user side being the near side) of the virtual three-dimensional space. Each instruction object 51 is an image corresponding to an execution timing (in other words, an identification image indicating that execution timing). An instruction object 51 appears at a suitable position on the back side of the virtual three-dimensional space at a suitable timing, and moves along a predetermined movement path so as to approach the near side, that is, the user. Such a movement path may be set as appropriate, but each is set to pass through one of the drum images 55a and cymbal images 55b. On each movement path, the drum image 55a or cymbal image 55b functions as a reference image indicating the current time. Therefore, each instruction object 51 moves along its movement path so as to coincide with one of the drum images 55a and cymbal images 55b at the corresponding execution timing. The movement path may also be displayed, but in the example of fig. 3 it is not. Each execution timing can be appropriately guided by the relative displacement between the instruction object 51 (the instruction identification image) and one of the drum images 55a and the like (the identification image of the current time); as an example, each execution timing is guided by the movement of the instruction object 51 toward the drum image 55a or the like.
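The coincidence of an instruction object 51 with its reference image at the execution timing can be illustrated with a short sketch. The following is a minimal illustration, not taken from the patent (all function and variable names are assumptions), of interpolating an object's position along a straight movement path so that it reaches the reference image exactly at the execution timing.

```python
# Hypothetical sketch: move an instruction object from its spawn point on the
# back side of the virtual space toward the reference image (drum/cymbal) so
# that it arrives exactly at the execution timing.

def object_position(spawn_pos, target_pos, spawn_time, hit_time, now):
    """Linear interpolation: t = 0 at spawn, t = 1 at the execution timing."""
    t = (now - spawn_time) / (hit_time - spawn_time)
    t = max(0.0, min(1.0, t))  # clamp so the object never overshoots the target
    return tuple(s + (g - s) * t for s, g in zip(spawn_pos, target_pos))
```

Halfway through the approach the object sits midway along the path, and at `hit_time` its position coincides with the target, which is when the relative displacement guides the user's tap.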
The instruction objects 51 may be classified into various types; the example of fig. 3 includes two types, a first instruction object 51A and a second instruction object 51B. The first instruction object 51A and the second instruction object 51B are instruction objects 51 that request different game play actions at their execution timings. Specifically, both request the user to perform an action of tapping a lower cymbal image 55b as game play, but the first instruction object 51A corresponds to a request to tap the lower-left cymbal image 55b, and the second instruction object 51B to a request to tap the lower-right cymbal image 55b. Therefore, after appearing at appropriate positions, the first instruction object 51A and the second instruction object 51B move toward the near side so as to coincide, at their respective execution timings, with the position of the lower-left cymbal image 55b and the position of the lower-right cymbal image 55b, respectively.
The tapping action requested of the user is input through the joystick OS. Specifically, for example, in accordance with the coincidence of the first instruction object 51A with the lower-left cymbal image 55b, the user is requested to swing the joystick OS so as to perform a tapping action at the position of that cymbal image 55b in the virtual three-dimensional space. Therefore, in order to make such a tapping action on each cymbal image 55b or the like function as an appropriate game play for each execution timing, information on the movement path of each instruction object 51 and the cymbal image 55b or the like on that path is described in the sequence data QD as information on the appropriate game play. The smaller the deviation time between the actual execution time of the tapping action and the intended coincidence time (the execution timing described in the sequence data QD), the higher the evaluation. The same applies to the second instruction object 51B.
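How the sequence data QD might associate execution timings with their targets can be sketched as follows. The record layout below is purely an assumption for illustration; the patent does not specify a concrete data format.

```python
# Hypothetical layout for sequence data QD: each record pairs an execution
# timing (seconds into the musical piece) with the drum/cymbal to be tapped.
SEQUENCE_QD = [
    {"time": 1.00, "target": "cymbal_lower_left"},   # first instruction object 51A
    {"time": 1.50, "target": "cymbal_lower_right"},  # second instruction object 51B
    {"time": 2.00, "target": "drum_left"},
]

def nearest_timing(sequence, target, actual_time):
    """Find the execution timing for this target closest to the actual tap,
    so that the deviation time can be computed against it."""
    times = [r["time"] for r in sequence if r["target"] == target]
    return min(times, key=lambda t: abs(t - actual_time))
```

A tap on the lower-left cymbal at 1.1 seconds would thus be compared against the 1.00-second record, giving a deviation time of 0.1 seconds.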
The determination of the action of tapping a drum image 55a or the like can be performed as appropriate, and is performed as follows, for example. First, the drum images 55a and the cymbal images 55b are arranged at predetermined positions defined by spatial coordinates in the virtual three-dimensional space, and the instruction objects 51 move toward those predetermined positions (spatial coordinates). Further, an evaluation range (a range identical to or slightly larger than the drum image 55a or the like) is set for each drum image 55a and the like with respect to its predetermined position. The stick image 56 moves in the virtual three-dimensional space in response to operation of the joystick OS, and whether or not the tapping action has been performed is determined based on whether or not the position (spatial coordinates) of its tip portion has entered the evaluation range of a drum image 55a or cymbal image 55b. Then, the deviation time between the time when the stick image 56 entered the evaluation range (the time when the game play was actually performed) and the time when the instruction object 51 should reach the drum image 55a or the like (the execution timing in the sequence data QD) is calculated, and the evaluation is determined based on that deviation time. The evaluation can be performed as appropriate, and is realized, for example, by grades indicating the evaluation result, such as "perfect", "very good", "good", or "bad". If the tapping action is not performed within a predetermined time around an execution timing, it is determined to be a miss (the evaluation result corresponding to a miss). Similarly, if the tapping action is performed earlier or later than the period corresponding to "bad", it is determined to be a miss. In addition, when a tapping action is performed in a period belonging to none of these, the action is not used for any purpose and can be ignored.
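The two-step judgment described above (the stick tip entering an evaluation range, then grading by deviation time) could be sketched as below. The spherical evaluation range and the concrete timing windows are assumptions for illustration, not values taken from the patent.

```python
def in_evaluation_range(tip_pos, center, radius):
    """True when the stick tip's spatial coordinates are inside the
    evaluation range (modeled here as a sphere around the target)."""
    return sum((a - b) ** 2 for a, b in zip(tip_pos, center)) <= radius ** 2

def evaluate(deviation_time):
    """Map the absolute deviation (seconds) from the execution timing in the
    sequence data QD to a grade; the window widths are illustrative."""
    d = abs(deviation_time)
    if d <= 0.03:
        return "perfect"
    if d <= 0.08:
        return "very good"
    if d <= 0.15:
        return "good"
    if d <= 0.30:
        return "bad"
    return "miss"  # earlier or later than the "bad" window
```

A tap registered only when the tip enters the range, then graded by how far its time lies from the scheduled timing, matches the two-stage determination the text describes.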
In addition, the instruction objects 51 may include types corresponding to the upper-left and upper-right cymbal images 55b, types corresponding to the drum images 55a, and the like, and actions of tapping the drum images 55a and the like may similarly be requested through these types in the same manner as with the first instruction object 51A, but their display is omitted in the example of fig. 3. The various instruction objects 51 such as the first instruction object 51A may be displayed in an appropriate manner; for example, each may be displayed in the same shape as the current-time identification image, such as the cymbal image 55b, on its movement path. Similarly, the various instruction objects 51 may be displayed so as to be distinguishable from each other; the first instruction object 51A and the second instruction object 51B may be displayed in different colors, for example. The same applies to the drum images 55a and the like in the drum set image 55; for example, the drum images 55a may be displayed in colors different from each other. In this case, so that the user can easily recognize the target of game play, each instruction object 51 may be displayed in the same color as the corresponding drum image 55a or the like.
The stage area 53 is an area corresponding to a stage formed in the virtual three-dimensional space. In the virtual three-dimensional space, each character performs its performance action on this stage. Accordingly, the drum set image 55 and the character images 54 corresponding to the characters are arranged in the stage area 53. An appropriate number of character images 54 may be arranged in the stage area 53; in the example of fig. 3, three character images 54 are displayed. Specifically, the character images 54 include a first character image 54A playing a keyboard, a second character image 54B playing a guitar, and a third character image 54C playing drums. These character images 54A, 54B, and 54C correspond to three characters forming a single performance unit that plays the keyboard, guitar, and drums (each an example of a musical instrument) in the virtual three-dimensional space. These three characters may be appropriately associated with the user and collaborators (including the computer), and each may perform based on the game play behavior of whoever it is associated with; for example, the first character image 54A and the second character image 54B may be associated with the computer, and the third character image 54C with the user. In this example, the first to third character images 54A to 54C function as the plurality of characters of the present invention.
The first to third character images 54A to 54C may be displayed as appropriate; in the example of fig. 3, the whole bodies of the first character image 54A and the second character image 54B are displayed so as to correspond to the field of view of the user character (the third character image 54C), while only both hands (only a part of the body) of the third character image 54C (the user character) are displayed. In addition, in order to show the user character holding sticks for striking the drums in the virtual three-dimensional space, two stick images 56 are displayed at the left and right hands of the third character image 54C, respectively. In this case, the third character image 54C also operates so as to reproduce the tapping action in response to the user's execution of the tapping action. More specifically, in association with the execution of the tapping action, the third character image 54C performs an action (performance action) of tapping the targeted cymbal image 55b or the like with the stick image 56. The same applies to the first character image 54A and the second character image 54B. That is, in association with the game play behavior of the collaborators (including the computer) (the game play behavior may be performed as appropriate according to the type of the targeted instrument or input device, for example pressing the keys in the case of a keyboard, or plucking the strings in the case of a guitar), the first character image 54A and the second character image 54B also perform the corresponding performance actions on the game screen 50.
The game screen 50 will be further described with reference to figs. 4 and 5. Fig. 4 is an explanatory diagram for explaining the positional relationship of the characters in the virtual three-dimensional space. The example of fig. 4 schematically shows the stage and characters in the virtual three-dimensional space observed from above. Since the stage and the like in the virtual three-dimensional space correspond to the stage area 53 and the like of the game screen 50, in the example of fig. 4 the stage and characters are given the same reference numerals as the stage area 53, the character images 54, and the like of the game screen 50 for convenience of explanation.
As shown in fig. 4, the character images 54 (the characters in the virtual three-dimensional space) are arranged in the stage area 53 (the stage in the virtual three-dimensional space) at predetermined intervals. The position and interval of each character image 54 may be variable or fixed as appropriate; as an example, each character image 54 is fixedly arranged at a predetermined position in the stage area 53.
In addition, the third character image 54C corresponds to the user character. Accordingly, the field of view range IA is set for the third character image 54C. The field of view range IA corresponds to the photographing range of the virtual camera used to draw the game screen 50. The field of view range IA may be set to an appropriate range; the example of fig. 4 shows the field of view range IA set when the third character image 54C faces the front. The field of view range IA moves in accordance with the motion of the third character image 54C, in other words, in accordance with the motion of the user. Specifically, since the HMD type game machine 4 is worn on the user's head, the head of the third character image 54C also moves in accordance with the movement of the user's head. For example, when the user's head (the HMD type game machine 4) faces the front of the body, the head of the third character image 54C faces the same direction in the virtual three-dimensional space, and the field of view range IA is accordingly set to the front side (the range shown by the solid line in the example of fig. 4).
On the other hand, when the user turns to face right from the front-facing state, the movement of the head is detected by the sensor SM of the HMD type game machine 4 and reflected in the movement of the third character image 54C. That is, with such head movement, the head of the third character image 54C also turns to the right in the same manner in the virtual three-dimensional space. In this case, the angle of the field of view range IA changes in the right direction in accordance with the movement of the head. Specifically, with the movement of the user's head to the right, the field of view range IA moves to the right inclination range IA1 indicated by the dash-dot line. Then, the portion of the virtual three-dimensional space included in the right inclination range IA1 is displayed as the game screen 50.
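The change of the field of view range IA with head movement can be sketched as a rotation of the view direction by the yaw angle reported by the sensor. The function name and angle convention below are assumptions for illustration (positive angles rotate the direction counterclockwise when viewed from above).

```python
import math

def rotate_view(direction, yaw_deg):
    """Rotate a horizontal (x, z) view direction about the vertical axis by
    the yaw angle detected by the head-tracking sensor."""
    a = math.radians(yaw_deg)
    x, z = direction
    return (x * math.cos(a) - z * math.sin(a),
            x * math.sin(a) + z * math.cos(a))
```

The virtual camera's photographing range is then re-centered on the rotated direction, so the portion of the virtual three-dimensional space inside the new range (e.g. the right inclination range IA1) is what gets drawn as the game screen 50.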
Fig. 5 is a diagram schematically showing an example of the game screen 50 in the case where the field of view range IA has shifted to the right inclination range IA1 in the example of fig. 4. The example of fig. 5 also corresponds to the game screen 50 in the case where the user faces obliquely upward to the right relative to the example of fig. 3. Such a change in the up-down direction of the head is also detected by the sensor SM and reflected in the game screen 50. Therefore, in this case, as shown in fig. 5, the second character image 54B moves to the vicinity of the center of the game screen 50 in accordance with the movement of the field of view (field of view range IA) compared with the example of fig. 3, and the first character image 54A disappears from the display. The stage area 53, the score display area 52, and the drum set image 55 change in the same manner.
Next, the flow up to the start of play in the music game will be described with reference to figs. 6 to 10. As described above, in association with the game play behavior (tapping action) of the user, the character image 54 also performs the same tapping action, that is, a performance action, on the game screen 50. Accordingly, each character begins its performance action with the start of game play. The character images 54 other than the user character likewise perform their performance actions with the start of game play. The performance is formed by the combination of these performance actions, in other words, by the performance action performed by the performance unit as a whole. As a result, the start of game play corresponds to the start of the performance actions and of the performance. Such a performance, that is, game play, is started when the start condition is satisfied. In the performance, the first character image 54A and the second character image 54B may be operated by other users as described above, but the case where both the first character image 54A and the second character image 54B are operated (controlled) by the computer will be described below.
Fig. 6 is an explanatory diagram for explaining an example of the flow up to the start of the performance. As shown in fig. 6, the start condition is satisfied when the start action is performed after a cooperative state has been formed between the plurality of character images 54 (characters), whereupon the performance, that is, game play, starts. Specifically, before game play (guidance of execution timings by the instruction objects 51) starts, each character image 54 is displayed on the game screen 50 in a standby state. Such a standby state can be expressed as appropriate; in the case of a computer-controlled character, it is expressed by a predetermined standby action. An appropriate action may be used as the standby action; in the case of the second character image 54B, for example, an action of tuning the guitar is performed. In the standby state, the first character image 54A also performs an appropriate standby action such as lightly touching the keyboard, but its display is omitted in the example of fig. 6 for convenience of explanation.
On the other hand, the user's actions (operations) are reflected in the third character image 54C operated by the user (the same applies to other characters controlled by other users). The user is allowed various actions in the standby state, and such actions include the cooperative action. With the user's cooperative action, the third character image 54C also performs the same cooperative action. The cooperative action may be performed so as to target both the second character image 54B and the first character image 54A at the same time, or may be performed individually so as to correspond to either one of them, for example. When the third character image 54C performs the cooperative action toward the second character image 54B, the computer determines that the action has been performed and causes the second character image 54B to perform a response action responding to the cooperative action. Then, when the response action has been performed by the second character image 54B, a cooperative state is formed between the third character image 54C and the second character image 54B. In this example, the third character image 54C and the second character image 54B function as the one character and the other character of the present invention, respectively. In addition, the cooperative action and the response action function as the action and the response action of the present invention, respectively.
The cooperative state with the second character image 54B and the first character image 54A can be formed as appropriate. For example, when the cooperative action is performed toward both the second character image 54B and the first character image 54A, the cooperative state may be formed for the performance unit as a whole through the response action of either one of them, that is, as long as the cooperative state is formed with either one (a part) of the performance unit; as an example, however, the cooperative state is formed for each character individually. Accordingly, the cooperative action performed by the third character image 54C toward the first character image 54A and the response action performed by the first character image 54A in response are carried out between the third character image 54C and the first character image 54A in the same manner. When the third character image 54C has formed a cooperative state with both the second character image 54B and the first character image 54A (in other words, with the whole of the performance unit), the performance unit as a whole is in the cooperative state. The cooperative action may also be performed by a character other than the user character, such as the second character image 54B; in that case, the cooperative state may be formed when the user returns a response action to such a cooperative action. As described above, the cooperative action may be performed by an appropriate character; as in the example of fig. 6, it is performed by the user character as an example.
The start condition may include various requirements, which may be satisfied as appropriate; for example, it includes the formation of the cooperative state for the whole performance unit and the execution of the start action in that cooperative state. Therefore, the start condition is satisfied when the third character image 54C (the user) performs the start action while the whole performance unit is in the cooperative state. When the start condition is satisfied, the performance starts. In this example, the start action functions as the special action of the present invention.
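A minimal sketch of the start-condition check, assuming a simple mapping from each other member of the performance unit to whether a cooperative state has been formed with it (the mapping and names are assumptions, not from the patent):

```python
def start_condition_met(cooperative_with, start_action_performed):
    """The start condition holds when the cooperative state has been formed
    with every other member of the performance unit (the whole unit) and the
    user character then performs the start action."""
    return all(cooperative_with.values()) and start_action_performed
```

For example, `start_condition_met({"54A": True, "54B": True}, True)` evaluates to True, while any member still outside the cooperative state, or a missing start action, keeps the condition unsatisfied.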
The performance may be started as appropriate upon the start condition. For example, since play of the music game is realized by guiding each execution timing (displaying the instruction objects 51) so as to match the melody of a musical piece, the performance corresponds to play of the music game, which includes playback of the musical piece. Accordingly, playback of the musical piece for game play could also begin immediately upon satisfaction of the start condition, that is, immediately after the start action. The performance can thus be started as appropriate; as an example, an opening of the performance starts first upon the start condition. That is, the performance includes a portion that opens the performance and a performance portion in which each character image 54 actually performs, and the opening portion starts first when the start condition is satisfied.
The opening of the performance may be carried out suitably, for example so as to function as a countdown to the playback of the musical piece. The opening of the performance thus functions as a preparation section for matching timing with the playback of the musical piece. The countdown function can be implemented suitably, for example by sequentially illuminating each character image 54 with a spotlight at regular intervals. That is, after the start action, a presentation in which the spotlights sequentially illuminate the character images 54 begins first, and playback of the musical piece starts once all of them have been illuminated. Then, with the playback of the musical piece, display of the instruction objects 51 begins, and the performance actions (game play behavior) corresponding to the instruction objects 51 also begin. That is, the actual performance actions performed by the character images 54 (the performance section) start. In this example, the preparation section and the performance section function as the preparation section and the performance section of the present invention, respectively.
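The preparation section's countdown could be scheduled as below: one spotlight event per character at a regular interval, then the start of music playback once everyone has been illuminated. The event names and the interval are assumptions for illustration.

```python
def preparation_schedule(character_ids, start_time, interval):
    """Return (time, event) pairs for the preparation section: spotlights in
    sequence, then the musical piece starts after every character is lit."""
    events = [(start_time + i * interval, "spotlight:" + cid)
              for i, cid in enumerate(character_ids)]
    events.append((start_time + len(character_ids) * interval, "start_music"))
    return events
```

Because the spotlights fall at fixed intervals, the schedule doubles as a countdown that lets the user match timing with the first beat of the musical piece.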
The cooperative action may be any appropriate action. For example, as the cooperative action, an action such as the drummer striking the cymbals several times with a fixed rhythm, as an actual band (performance unit) does at the start of a performance, may be employed. Alternatively, a call (vocalization) in which the user character calls out to another character may be employed as the cooperative action. Various actions can thus be adopted as the cooperative action as appropriate; as an example, an action of gazing at the target character is adopted. Such a gaze action may be determined as appropriate; for example, as in the example of fig. 5, when only one character image 54 is included in the field of view range IA, it may be determined that that character image 54 is being gazed at, in which case the field of view range IA is used for determining the gaze action. The gaze action can thus be determined as appropriate; as an example, a visual field range is set in a part of the field of view range IA, and the presence or absence of the gaze action is determined based on that visual field range. Accordingly, a visual field range for discriminating the line of sight of the user (the line of sight of the user character) is set in the game screen 50 (or the virtual three-dimensional space).
Fig. 7 is an explanatory diagram for explaining the visual field range set in the game screen 50. Fig. 7 illustrates an example of the visual field range set in the game screen 50 of the example of fig. 5. As shown in fig. 7, the visual field range IR includes a central visual field range IR1 and a peripheral visual field range IR2. The visual field range IR may be formed at an appropriate position of the game screen 50, and that position may be variable. When the position of the visual field range IR is variably set on the game screen 50, it can be set appropriately based on various detection results, such as eye tracking of the line of sight. The visual field range IR may thus be set in the game screen 50 as appropriate; as an example, it is formed fixedly near the center of the game screen 50 (the field of view range). The visual field range IR may be visualized, but as an example it is set to be invisible. That is, in the example of fig. 7, the central visual field range IR1 is shown with a dot pattern and the peripheral visual field range IR2 with right-oblique hatching for convenience, but as in the example of fig. 5, the visual field range IR is not displayed on the actual game screen 50.
The central visual field range IR1 is the range used for determining the gaze action. The central visual field range IR1 may be formed in an appropriate shape, for example a circle of a predetermined size. The determination of the gaze action using the central visual field range IR1 may be performed as appropriate: for example, it may be determined that gazing is occurring when a main portion such as the head is included in the central visual field range IR1, or when a predetermined proportion or more of the character image 54 is included in (enters) the central visual field range IR1. The predetermined proportion may be set as appropriate; as an example, half or more of the area of the target character image 54 is used. For example, in the example of fig. 7, more than half of the second character image 54B is within the central visual field range IR1. In this case, it is determined that the user character (the third character image 54C) is gazing at the second character image 54B.
On the other hand, the peripheral visual field range IR2 is a region formed around the central visual field range IR1 so as to include it. The peripheral visual field range IR2 may be formed in an appropriate shape, for example an ellipse of a predetermined size containing the central visual field range IR1 near its center. The peripheral visual field range IR2 may be used as appropriate or may be omitted; for example, it may be used in preparation for determining the cooperative action. Specifically, when a character image 54 enters the peripheral visual field range IR2, the computer determines that there is a possibility that that character image 54 will be gazed at, and begins determining whether the central visual field range IR1 includes the character image 54. As an example, such a visual field range IR is set in the game screen 50 to realize the determination of whether the cooperative action has been performed, that is, whether the user's character image 54 is gazing at another character image 54.
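The two-stage gaze test can be sketched as follows: the peripheral range IR2 arms the check, and gazing is decided when half or more of the character's on-screen area falls inside the central range IR1 (the "half or more" ratio follows the example in the text; modeling the ranges as axis-aligned rectangles is a simplification for illustration).

```python
def overlap_ratio(char_box, field_box):
    """Fraction of the character's on-screen rectangle (x0, y0, x1, y1)
    that lies inside the given visual field rectangle."""
    x0 = max(char_box[0], field_box[0]); y0 = max(char_box[1], field_box[1])
    x1 = min(char_box[2], field_box[2]); y1 = min(char_box[3], field_box[3])
    inter = max(0.0, x1 - x0) * max(0.0, y1 - y0)
    area = (char_box[2] - char_box[0]) * (char_box[3] - char_box[1])
    return inter / area if area > 0 else 0.0

def is_gazing(char_box, central_ir1, peripheral_ir2):
    """IR2 entry arms the check; gazing needs >= half the area inside IR1."""
    if overlap_ratio(char_box, peripheral_ir2) == 0.0:
        return False  # not near the line of sight: IR1 need not be checked
    return overlap_ratio(char_box, central_ir1) >= 0.5
```

Checking IR2 first mirrors the preparatory role the text gives it: the costlier central-range test only runs for characters that might be gazed at.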
Fig. 8 is an explanatory diagram for explaining an example of the response action. As the response action, various actions (reactions) corresponding to, or independent of, the cooperative action may be employed as appropriate; for example, an action in which the character briefly plays its own instrument, a thumbs-up, or the like may be employed to indicate that the cooperative action has been recognized. As an example of such an action, a gaze action is employed. That is, after the user character performs the action of gazing at another character as the cooperative action, the other character performs the same action of gazing at the user character as the response action. Fig. 8 schematically illustrates an example of another character image 54 performing such a gaze action as the response action. In fig. 8, (A) shows the character image 54 before the response action, and (B) shows the character image 54 during the response action.
As shown in fig. 8(A), before the response action, the other character image 54 (the computer-controlled character) does not face the near side, that is, the direction of the character image 54 (the user) that performed the gaze as the cooperative action. Before the response, the other character image 54 may face an appropriate direction, for example a direction away from the user; in the example of fig. 8, its head faces left and its line of sight is directed the same way. On the other hand, as shown in fig. 8(B), during the response action the other character image 54 faces the near side and directs its line of sight the same way. That is, when the user performs the gaze action as the cooperative action (when a portion of the character image 54 equal to or greater than the predetermined proportion is included in the central visual field range IR1), the other character image 54 also performs the same gaze action toward the user in response, returning the gaze to the user. The same applies when the other character image 54 is facing away from the user (away from the near side). That is, the other character image 54 performs the gaze action as the response action toward the user regardless of its orientation before the response action. Before the performance starts, the movement of the other character image 54 and the like may also be controlled so as not to face away from the user, in order to make it easier for the user to gaze at it.
The direction of the line of sight of the other character image 54 may be determined as appropriate, for example by setting a visual field range IR similar to that of the user character for the other character image 54 and determining whether the user character has entered that visual field range IR, more specifically whether a portion of the user character equal to or greater than the predetermined proportion has entered its central visual field range IR1. Such a gaze action (an action of returning the gaze) may be performed at an appropriate timing, for example with a time lag after the cooperative action. That is, the action of returning the gaze may be performed after the user has already looked away. The action of returning the gaze can thus be performed at an appropriate timing; as an example, it is performed while the user is still gazing, so as to achieve eye contact between the two. In this case, if the user looks away before the other character image 54 gazes back, it may be determined that the response action has not been completed (in which case the cooperative action must be performed again); alternatively, the user may be permitted to look away once the other character image 54 has started gazing back, for example. As an example, the character image 54 that is the target of the cooperative action (the one gazed at) executes such a response action.
Fig. 9 is an explanatory diagram for explaining the flow of changes in the action of the other character image 54 gazed at as the cooperative action. For example, in the example of fig. 3, an appropriate character image 54 such as the first character image 54A may function as such an other character image 54, but the example of fig. 9 shows the case where the second character image 54B of the example of fig. 3 functions as such an other character image 54. In this case, as shown in fig. 9, the second character image 54B first enters the standby state on the game screen 50 as described above, and performs the standby action in that standby (idle) state (S1). In addition, the second character image 54B (the computer) determines whether or not the user character, that is, the third character image 54C, performs the cooperative action in the standby state (S2). More specifically, the second character image 54B determines whether or not the third character image 54C is gazing at it (whether or not a portion of itself of at least a certain proportion is included in the user's central visual field range IR1). When the third character image 54C is not gazing at it (no portion of itself of at least the certain proportion is included in the user's central visual field range IR1), the second character image 54B continues the standby state.
On the other hand, when the third character image 54C gazes at it (a portion of itself of at least the certain proportion is included in the central visual field range IR1), the second character image 54B performs a turn-of-head action (response action) of turning the head toward the third character image 54C (S3). The response action may be constituted appropriately, and may be constituted only by the turn-of-head action (the action of looking in the direction of the third character image 54C), but as an example it also includes a response gesture. A response gesture is an action used to convey to the user that the response action has been executed. That is, the second character image 54B performs a response gesture as part of the response action after the turn-of-head action (S4). As such a response gesture, various actions such as waving, forming a loop with the thumb and index finger (an OK sign), nodding, and the like, as well as voicing, can be used; as an example, a thumbs-up action is used. That is, the second character image 54B performs the thumbs-up action subsequent to or simultaneously with the turn-of-head action. Then, the second character image 54B completes the response action by performing the turn-of-head action and the response gesture, whereby the cooperative state is formed.
The flow until the cooperative state is formed may be set appropriately; for example, the cooperative state may be formed when the user (user character) performs an appropriate action such as a thumbs-up action in a state in which the user is gazing at another character and the other character has returned the gaze, and the other character performs an appropriate action such as the same thumbs-up action in response. Alternatively, part of these actions may be omitted as appropriate. In these cases, parts of the various actions mutually performed in order to form the cooperative state may function appropriately as the cooperative action and the response action. The same applies to the start condition. For example, the start condition may be satisfied when the cooperative state is formed between the user character and all the other characters. That is, the start action need not be included in the necessary conditions of the start condition. In this case, the performance may be started at an appropriate time, for example, immediately after the cooperative state is formed with the last other character. In addition, in order to distinguish actions using parts of the hands, such as the thumbs-up action, from the operation of the stick image 56, various buttons for such actions may be provided on the operation lever OS as appropriate. Alternatively, a camera for photographing the entire user (or a main part) may be separately provided, and appropriate actions including movements of various parts such as the thumbs may be detected by such a camera.
The second character image 54B may perform an appropriate action in the cooperative state (the standby period in which formation of the cooperative state with the first character image 54A is awaited); as an example, it performs the standby action in the same manner as in the standby state (S5). Further, it is determined whether or not the user has performed the start action in the cooperative state with the second character image 54B and the like (the standby action period) (S6). Various actions may be adopted as such a start action; as an example, the thumbs-up action is used. That is, it is determined whether or not the user has performed the thumbs-up action as the start action in the cooperative state with the second character image 54B and the like. When the thumbs-up action, that is, the start action, is not performed within a predetermined time from the formation of the cooperative state, the second character image 54B releases the cooperative state and returns to the standby state.
On the other hand, when the thumbs-up action, that is, the start action, is performed within the predetermined time from the formation of the cooperative state, the second character image 54B further performs a response gesture with respect to the start action (S7). Such a response gesture may be omitted, but as an example it is performed. This response gesture may be different from the response gesture of the response action, but as an example it is configured to be the same as the response gesture of the response action. That is, the same thumbs-up action is performed as the response gesture in response to the start action performed by the third character image 54C. The first character image 54A also changes its action through the same flow and forms the cooperative state with the third character image 54C.
The necessary conditions of the start condition may consist only of the start action performed in a state in which the cooperative state is formed with all the characters other than the character corresponding to the user, or may include other necessary conditions. The response gesture (S7) performed by all the character images 54 other than the third character image 54C may also function as such another necessary condition. As described above, the necessary conditions of the start condition may be set appropriately; as an example, the performance is started after the response gesture (S7) has been performed by both the first character image 54A and the second character image 54B. More specifically, the start condition is satisfied upon the response gesture (S7) of both the first character image 54A and the second character image 54B, and the start performance begins. Then, after the countdown of the start performance, the playing of the musical composition, that is, the performance section, begins. As an example, the actions of the other characters change through such a flow until the performance starts. When the characters other than the user character are controlled by the computer, the response gesture (S7) is automatically executed in association with the execution of the start action. Therefore, in this case, the determination of whether or not the start action has been performed can function in the same manner as the determination of whether or not the start condition is satisfied.
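The flow of fig. 9 (S1 to S7) can be sketched for a single computer-controlled character roughly as follows. The class names, the tick-based timing, and the simplification of the start condition to a single other character are all assumptions made for illustration.

```python
from enum import Enum, auto

class State(Enum):
    STANDBY = auto()      # S1: standby (idle) action
    COOPERATIVE = auto()  # after S3/S4: cooperative state formed

COOP_TIMEOUT = 10  # assumed number of updates before the release condition

class OtherCharacter:
    def __init__(self):
        self.state = State.STANDBY
        self.ticks_in_coop = 0
        self.actions = []  # log of performed actions, for illustration

    def update(self, user_gazing, user_start_action):
        """Advance one step; return True when the start condition holds."""
        if self.state is State.STANDBY:
            if user_gazing:  # S2: cooperative (gaze) action detected
                self.actions += ["turn_head", "thumbs_up"]  # S3 + S4
                self.state = State.COOPERATIVE
                self.ticks_in_coop = 0
            return False
        # S5/S6: in the cooperative state, waiting for the start action
        if user_start_action:
            self.actions.append("thumbs_up")  # S7: response gesture
            return True  # start condition satisfied (single other character)
        self.ticks_in_coop += 1
        if self.ticks_in_coop >= COOP_TIMEOUT:  # release after timeout
            self.state = State.STANDBY
        return False
```

With several other characters, the start condition would instead be evaluated only once every character's instance has reached the cooperative state.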
Next, details of the performance data OD will be described. When the performance action of an other character image 54 such as the first character image 54A or the second character image 54B is controlled by the computer, that other character image 54 may perform an appropriate performance action, or may perform a performance action (one action or a plurality of actions) set in advance; as an example, when a user has played the game through that other character image 54 in the past, the same performance action as in that game play is performed. That is, the computer controls the action of the other character image 54 so as to trace (reproduce) the same action as the performance action (play performance) performed by the user at the time of that game play. The performance data OD is data for causing an other character to perform such a user's performance. For example, when a user plays the game via the second character image 54B (selects it as the user character in the character selection opportunity for selecting the character to be used for play), the play content during that play (the guitar performance performed via the second character image 54B) is recorded in the performance data OD and managed. The same applies to the performance of a game played by a user via the first character image 54A.
Fig. 10 is a diagram showing an example of the structure of the performance data OD. As shown in fig. 10, the performance data OD includes a video data unit OD1 and an information management unit OD2. The video data unit OD1 is a portion configured as video data for causing an other character image 54 to perform a performance action. Such a video data unit may be configured as various kinds of video data as appropriate; as an example, since the motion of each character image 54 is configured as motion capture data, the video data unit is likewise configured as motion capture data. The motion capture data may be generated appropriately, for example, by detecting the user's motion with a camera that captures the user's entire body and converting it into data, or by detecting the user's motion from markers attached to the user's body and converting it into data. While the motion capture data (video data unit OD1) can be generated appropriately in such ways, as an example, the motion of the HMD type game machine 4 (the motion of the head) and the operation of the joystick OS over time are recorded, and these motions and operations are converted to generate the motion capture data. The video data unit is prepared for each performance action. On the other hand, the information management unit OD2 is a portion in which information for managing each video data unit is described. The information management unit OD2 can appropriately include various information necessary for managing each video data unit; in the example of fig. 10, it includes, for each video data unit (performance action), a performance record ODR for managing information about that video data unit (performance action). In order to realize such management, the performance record ODR contains the information of "performance ID", "character", "musical composition", "user ID", and "date and time". In the performance record ODR, these pieces of information are recorded in association with each other.
In this example, the performance data OD functions as the candidate data of the present invention.
The "performance ID" is information indicating a performance ID unique to each performance action in order to manage each performance action (video data unit). The "character" is information for specifying the character corresponding to the performance action. As such information, information capable of identifying each character is used; as an example, information of a character ID unique to each character is used. Specifically, for example, in the case of a performance action corresponding to the performance of a game played through the third character image 54C, the information of the character ID corresponding to the third character image 54C is described in "character". The same applies to the first character image 54A and the like. In addition, each character corresponds to the musical instrument being played. Therefore, the information of "character" also functions as information on the musical instrument that is the playing target. Thus, information of a "musical instrument" (for example, an ID of the musical instrument) may also be described instead of "character". The "musical composition" is information indicating the musical composition used in the performance action. In "musical composition", information capable of specifying the musical composition may be described; as an example, information of a musical composition ID unique to each musical composition is described. The "user ID" is information indicating a user ID unique to each user in order to identify each user. The performance actions usable as play performances may be limited to each user's own play performances, but as an example they may also be used by other users. The "date and time" is information indicating the date and time of the game play corresponding to the actual performance of each performance action. The performance data OD is not limited to these pieces of information; for example, information necessary for realizing the performance may be managed as appropriate. Alternatively, part of such information may be omitted as appropriate.
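As a sketch, one performance record ODR described above might be represented as follows; the field names are illustrative assumptions based on the listed items, not the actual data layout.

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative shape of one performance record ODR in the information
# management unit OD2; field names are assumptions.
@dataclass(frozen=True)
class PerformanceRecord:
    performance_id: str   # "performance ID": unique per performance action
    character_id: str     # "character": also identifies the instrument
    music_id: str         # "musical composition"
    user_id: str          # "user ID" of the user whose play is reproduced
    played_at: datetime   # "date and time" of the original game play

# The video data unit OD1 (motion capture data) would be stored
# separately and looked up through performance_id.
rec = PerformanceRecord("perf-001", "char-guitar", "music-42",
                        "user-7", datetime(2021, 12, 24, 20, 0))
```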
Next, the procedures of the performance selection processing, the cooperation response processing, the musical composition start processing, and the performance data generation processing will be described. The performance selection processing is processing for providing a performance selection opportunity for selecting the performance actions that should be performed by the characters other than the user character. For example, in the case where the user plays the music game via the third character image 54C, the performance actions to be executed by the first character image 54A and the second character image 54B, respectively, are selected in the performance selection opportunity. Specifically, when there are a plurality of game play results for the same character, the performance data OD includes a plurality of video data units OD1 (a plurality of performance actions) for that same character. In this case, the performance action to be executed by the other character is selected in the performance selection opportunity from among the candidates of the plurality of performance actions. An example of processing for providing such a performance selection opportunity is shown in fig. 11. In this case, the progress control section 43 starts the performance selection processing of fig. 11 every time a character selection opportunity for selecting the character to be used in playing the music game is provided to the user, and first acquires the selection result of the character in the character selection opportunity (step S101).
Next, the progress control section 43 provides a performance selection opportunity for selecting the performance actions to be performed by the characters other than the character selected in the character selection opportunity (step S102). Specifically, the progress control section 43 refers to the performance data OD and provides, for each target character, a performance selection opportunity for selecting one performance action from among a plurality of candidates of performance actions (which may include the play results of other users and a plurality of predetermined actions prepared in advance). Such a performance selection opportunity can be realized appropriately, for example, through a selection screen (not shown) for performance selection that includes the information necessary for selecting a performance action. The information necessary for selecting a performance action may include, for example, various information such as the user to whom the performance action corresponds (whose play it is) and information related to the play, such as that user's level and score. The options in the performance selection opportunity are presented in correspondence with the musical composition and instrument selected by the user. For example, performance data OD generated for the same musical composition as that selected by the user and for an instrument different from that selected by the user is presented to the user as options in the performance selection opportunity. In addition, the conditions defining the options may, for example, include the period in which the performance data OD was produced, as appropriate.
Next, the progress control section 43 decides the performance actions of the other characters based on the selection results in the performance selection opportunity (step S103). After this decision, the progress control section 43 ends the performance selection processing of this time. In this way, a performance selection opportunity for selecting the performance actions to be performed by the characters other than the user character is realized. In addition, in the performance selection opportunity, a plurality of candidate performance actions, including those based on the game play results of other users, are presented. Therefore, through such a performance selection opportunity, the other characters can be caused to perform, in the performance, performance actions corresponding to actual game play performances.
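A possible sketch of the candidate filtering implied by the performance selection opportunity (same musical composition, a character or instrument other than the user's) is the following; the dictionary keys are assumptions.

```python
# Records are plain dicts here; the keys mirror the performance record
# fields described above and are illustrative assumptions.
def selection_candidates(records, music_id, user_character_id):
    """Candidates for the performance selection opportunity: same musical
    composition, but a character (instrument) other than the user's."""
    return [r for r in records
            if r["music_id"] == music_id
            and r["character_id"] != user_character_id]

records = [
    {"performance_id": "p1", "music_id": "m1", "character_id": "guitar"},
    {"performance_id": "p2", "music_id": "m1", "character_id": "bass"},
    {"performance_id": "p3", "music_id": "m2", "character_id": "bass"},
]
```

Further defining conditions, such as the period in which the performance data OD was produced, would simply add clauses to the filter.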
The cooperation response processing is processing for bringing each character into the cooperative state. When the action of another character is controlled by another user, whether or not the response action is performed depends on that user; the example of fig. 12, however, shows the case where the action of each character is controlled by the computer (progress control section 43). In this case, every time the user character (for example, the third character image 54C in the example of fig. 3) is operated by the user in a state in which another character is included in the peripheral visual field range IR2, the progress control section 43 starts the cooperation response processing of fig. 12, and first determines whether or not the user's operation corresponds to a cooperative action (step S201). For example, when the user's operation corresponds to a gaze action such that a portion of another character of at least a certain proportion is included in the central visual field range IR1, that operation (action) corresponds to a cooperative action. Therefore, the progress control section 43 determines whether or not the user's operation corresponds to an action in which a portion of another character of at least a certain proportion is included in the central visual field range IR1. When the user's operation does not correspond to such an action (gaze action), that is, when the user's operation does not correspond to a cooperative action, the progress control section 43 skips the subsequent processing and ends the cooperation response processing of this time.
On the other hand, when the user's operation corresponds to an action (gaze action) in which a portion of another character of at least a certain proportion is included in the central visual field range IR1, that is, when the user's operation corresponds to a cooperative action, the progress control section 43 causes the target character of the cooperative action, that is, the character a portion of which of at least the certain proportion is included in the central visual field range IR1, to execute the response action (step S202). Specifically, for example, in the case where a gaze (return-of-gaze) action and a response gesture (thumbs-up) are employed as the response action as described above, the action of the target character is controlled so that the target character performs the action of gazing at the user character and performs the thumbs-up action.
Next, the progress control section 43 forms the cooperative state between the user character and the character that executed the response action in step S202 (step S203). The formation of the cooperative state may be realized appropriately; for example, when the cooperative state is expressed by a dedicated action, it may be realized by executing such an action, and as an example it is realized by updating a flag for managing the presence or absence of the cooperative state. Specifically, a flag for managing the presence or absence of the cooperative state is set for each character. Accordingly, the progress control section 43 realizes the formation of the cooperative state by updating that flag to a state indicating the formation of the cooperative state. After the formation of the cooperative state, the progress control section 43 ends the cooperation response processing of this time. In this way, the actions of the other characters are controlled so as to perform response actions in response to cooperative actions. That is, the actions of the other characters are controlled such that the cooperative state is actively formed between the user character and the other characters. Then, when the cooperative state is formed, that state is managed by the flag.
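The cooperation response processing of fig. 12 can be summarized in a brief sketch; the flag dictionary, function name, and action labels are assumptions made for illustration.

```python
def cooperation_response(is_gaze_action, target_character, coop_flags):
    """S201: check whether the user's operation is the cooperative (gaze)
    action; S202: have the target execute the response action; S203: set
    the per-character cooperative-state flag."""
    if not is_gaze_action:                   # not a cooperative action
        return None                          # skip the remaining steps
    response = ["gaze_back", "thumbs_up"]    # S202: response action
    coop_flags[target_character] = True      # S203: cooperative state formed
    return response

flags = {}
```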
The musical composition start processing is processing for starting the performance (the guidance of each execution timing) with the satisfaction of the start condition as a trigger. Fig. 13 shows an example of the musical composition start processing executed in the case where the other characters act in accordance with the flow of fig. 9. In this case, the progress control section 43 starts the musical composition start processing of fig. 13 every time the start action is performed by the user and every time a predetermined time has elapsed from the formation of the cooperative state, and first determines whether or not the start condition is satisfied (step S301). Specifically, as described above, the start condition is satisfied, as an example, when the start action (for example, the thumbs-up action) is performed and the response gestures responding to that start action are performed. Accordingly, the progress control section 43 may determine that the start condition is satisfied when all the other characters have performed the response gesture; however, since the response gesture is automatically executed by the computer (progress control section 43) in association with the execution of the start action, it determines, as an example, that the start condition is satisfied when the start action is performed in a state in which the user character and all the other characters have formed the cooperative state.
When the start condition is not satisfied, the progress control section 43 determines whether or not a release condition for releasing the cooperative state is satisfied (step S302). The release condition may be set appropriately; as an example, it is satisfied when a predetermined time has elapsed since the formation of the cooperative state. Therefore, the progress control section 43 determines whether or not the predetermined time has elapsed from the formation of the cooperative state. When the release condition is not satisfied, that is, when the predetermined time has not elapsed since the formation of the cooperative state, the cooperative state is maintained (step S303). That is, the progress control section 43 maintains the state of the flag indicating the cooperative state as it is. On the other hand, when the release condition is satisfied, that is, when the predetermined time has elapsed since the formation of the cooperative state, the progress control section 43 releases the cooperative state (step S304). Specifically, the progress control section 43 changes the state of the flag indicating the cooperative state so as to correspond to the non-cooperative state. After the maintenance or the release of the cooperative state, the progress control section 43 ends the musical composition start processing of this time.
On the other hand, when the start condition is satisfied in step S301, that is, when the start action is performed in a state in which the user character and all the other characters have formed the cooperative state, the progress control section 43 starts the performance (step S305). Specifically, the progress control section 43 causes the game screen 50 to display the start performance. Next, after the start performance, the progress control section 43 starts the playing of the musical composition, in other words, the performance section (step S303). Then, after the playback of the musical composition is started, the progress control section 43 ends the musical composition start processing of this time. In this way, the performance is started with a start condition that includes the formation of the cooperative state between the user character and the other characters as a necessary condition. More specifically, the start condition is satisfied when the start action is performed in a state in which the cooperative state is formed between the user character and all the other characters, whereupon the start performance, that is, the preparation section of the performance, begins. Then, after the start performance, the playing of the musical composition, that is, the performance section, begins.
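A compact sketch of the branching of fig. 13 (steps S301 to S305) follows; the return values, function name, and the numeric release time are assumptions made for illustration.

```python
RELEASE_TIME = 10.0  # assumed predetermined time (seconds) before release

def music_start_step(start_action, all_in_coop_state, elapsed_since_coop):
    """One pass of the musical composition start processing: S301 checks
    the start condition, S302 the release condition; the returned string
    names the branch taken (S305, S304, or S303)."""
    if start_action and all_in_coop_state:    # S301: start condition holds
        return "start_performance"            # S305: then the music begins
    if elapsed_since_coop >= RELEASE_TIME:    # S302: release condition
        return "release_cooperation"          # S304
    return "maintain_cooperation"             # S303
```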
The performance data generation processing is processing for generating the performance data OD based on the game play results of each user. The performance data OD may be generated appropriately, for example, based on the results of all game plays of all users, and as an example it is generated when a user wishes to generate it. In this case, when the performance section starts in the game play of a user who wishes to generate the performance data OD (the user may appropriately be given an opportunity to confirm whether or not generation is desired), the data management section 44 starts the performance data generation processing of fig. 14, and first records the operations of the user in the performance section, that is, the actions of the user character (step S401). Next, the data management section 44 generates the performance data OD based on the recording result of step S401 (step S402). Specifically, the data management section 44 generates a video data unit OD1 for reproducing the actions of the user character and an information management unit OD2 in which various information corresponding to that video data unit OD1 is recorded. After the generation of the performance data OD, the data management section 44 ends the performance data generation processing of this time. In this way, the performance data OD for reproducing performance actions corresponding to the game play results of each user is generated. Note that the performance data OD may also be generated uniformly for every play and stored when a predetermined condition is satisfied, for example, when the user so desires or when the best score is achieved.
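The recording and packaging of fig. 14 (S401, S402) can be sketched as follows; the frame representation and the dictionary layout are assumptions about the OD1/OD2 split described above.

```python
def generate_performance_data(frames, character_id, music_id, user_id):
    """S401 corresponds to collecting `frames` (per-frame motion samples);
    S402 packages them into the OD1 / OD2 pair described above."""
    video_data_unit = list(frames)  # OD1: the recorded motion data
    info_management = {             # OD2: the associated performance record
        "character": character_id,
        "musical_composition": music_id,
        "user_id": user_id,
    }
    return {"OD1": video_data_unit, "OD2": info_management}
```

In practice each frame would hold the HMD pose and stick operations over time, from which the motion capture data is produced.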
As described above, according to this embodiment, the cooperative state is formed by the cooperative action of the user character (for example, the third character image 54C) and the response action of another character (for example, the second character image 54B) in response to that cooperative action. Thus, through these cooperative actions and response actions, a plurality of characters in the game can be made to imitate the process of getting in step with each other. More specifically, when the user character performs the gaze action, the cooperative state is formed by the return-of-gaze action and the thumbs-up action performed by the other character that was gazed at. This allows the user character to imitate, with the other characters, the process of getting in step through eye contact.
In addition, the performance is started when the start condition is satisfied, while the start of the performance stands by until the start condition is satisfied. The necessary conditions of the start condition, which serves as the trigger for starting the performance, include the formation of the cooperative state. That is, the performance starts after the formation of the cooperative state. Therefore, before the performance starts, a plurality of characters in the game can be made to execute the same process as the process of getting in step performed in an actual performance by a plurality of persons, more specifically, the same process as getting in step through eye contact. As a result, the sense of presence and the sense of unity of the performance can be improved.
In addition, when the computer controls another character so as to cause it to execute the response action in association with the execution of the cooperative action by the user character, the cooperative state can be reliably formed with the cooperative action of the user character as a trigger. Therefore, the timing at which the start condition is satisfied can be matched to the user's state. Further, when such another character acts during the performance so as to trace a user's play performance, that play performance, that is, an actual performance of the performance action, can be reflected in the actions of the characters other than the user character during the performance. Therefore, the realism of the performance can be further improved, and the sense of presence is further heightened. Furthermore, by making use of the results of such performance actions, each user can be encouraged to use other characters, or other users can be encouraged to use that user's performance results, which in turn promotes the use of the game.
In the manner described above, the progress control section 43 of the HMD type game machine 4 functions as the condition determination means and the progress control means of the present invention by executing the musical composition start processing of fig. 13. Specifically, the progress control section 43 functions as the condition determination means by executing step S301 of fig. 13. The progress control section 43 functions as the progress control means by executing the processing of step S305 and step S303 of fig. 13. Similarly, the progress control section 43 of the HMD type game machine 4 functions as the character control means of the present invention by executing step S202 of the cooperation response processing of fig. 12. In addition, the progress control section 43 of the HMD type game machine 4 functions as the opportunity providing means of the present invention by executing step S102 of the performance selection processing of fig. 11.
The present invention is not limited to the above-described embodiment, and may be implemented with appropriate modifications and alterations. For example, in the above-described embodiment, the processing of fig. 11 to fig. 14 is executed by the HMD type game machine 4. However, the present invention is not limited to such a mode. For example, all or part of the processing of fig. 11 to fig. 14 may be executed by the central server 2. For example, when all of the processing shown in fig. 11 to fig. 13 is executed by the central server 2, the central server 2 alone (which may include a plurality of server devices) may function as the game system of the present invention. Conversely, the HMD type game machine 4 alone can function as the game system of the present invention. That is, in the game system of the present invention, the central server 2 may be omitted as appropriate.
Various aspects of the present invention derived from the above-described embodiments and modifications are described below. In the following description, the corresponding members illustrated in the drawings are denoted in parentheses for ease of understanding of the aspects of the present invention, but the present invention is not limited to the illustrated embodiments.
A game system of the present invention is connected to an input device (OS) for inputting a user's game play and a display device (47), the display device (47) displaying a game screen (50) so as to include a plurality of characters (54), the plurality of characters (54) including a user character (54C) that is a character operated through the user's game play, the game system providing a game including a performance executed by the plurality of characters, wherein the game system (4) includes: a condition determination unit (43) that determines whether or not a start condition is satisfied, the necessary conditions of the start condition including the formation of a cooperative state that is formed at least between one character (54C) among the plurality of characters and another character (54B) when a response action, which is a prescribed action responding to an action performed by the one character, is performed by the other character; and a progress control unit (43) that controls the progress of the game in the following manner: when the start condition is satisfied, the performance is started with the satisfaction of the start condition as a trigger, and the start of the performance stands by until the start condition is satisfied.
According to the present invention, a cooperative state is formed by an action performed by one character and a response action performed by another character in response to that action. These actions thus allow a plurality of characters in the game to simulate a process of getting in step with one another. Further, the performance is started when the start condition is satisfied, and the start of the performance stands by until then. The necessary conditions of the start condition, which serves as the trigger for starting the performance, include the formation of the cooperative state; that is, the performance begins only after the cooperative state has been formed. Therefore, before the performance starts, the plurality of characters in the game can be made to execute the same kind of synchronizing process that a group of performers carries out before an actual performance. As a result, the sense of presence or unity of the performance can be improved.
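As a minimal sketch of the relationship between the condition determination unit and the progress control unit described above (the names, data structures, and string phase labels here are illustrative assumptions, not taken from the patent), the standby-until-satisfied behavior might look like:

```python
from dataclasses import dataclass, field

@dataclass
class GameState:
    # pairs of character ids currently in a cooperative state (illustrative)
    cooperative_pairs: set = field(default_factory=set)
    performance_started: bool = False

def start_condition_satisfied(state: GameState) -> bool:
    # necessary condition: at least one cooperative state has been formed
    return len(state.cooperative_pairs) > 0

def progress_control(state: GameState) -> str:
    # start the performance upon satisfaction of the start condition;
    # otherwise the start of the performance stands by
    if start_condition_satisfied(state):
        state.performance_started = True
        return "performance"
    return "standby"

state = GameState()
print(progress_control(state))             # no cooperative state yet -> standby
state.cooperative_pairs.add(("54C", "54B"))
print(progress_control(state))             # start condition satisfied -> performance
```

In this sketch the performance is never entered before a cooperative pair exists, mirroring the "stand by until the start condition is satisfied" control described above.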
The plurality of characters may be controlled as appropriate as long as at least one user character is included. For example, the plurality of characters may each be controlled by a different user. Alternatively, all characters other than the user character may be controlled by the computer. The action serving as the trigger for forming the cooperative state may be executed by a user character (including characters corresponding to other users) or by a computer-controlled character. For example, one embodiment of the game system of the present invention may further include a character control unit (43) that, when the user character functions as the one character, controls the other character so that the other character executes the response action in response to the action performed by the user character. In this case, the other character is controlled so as to perform the response action whenever an action for forming the cooperative state is performed. This makes it possible to actively form the cooperative state in response to an action performed by the user character, so that the timing at which the start condition is satisfied can be matched to the state of the user.
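The character control unit's behavior could be sketched as a simple lookup from a user-character action to the response action that forms the cooperative state; the mapping, function name, and action labels are hypothetical:

```python
from typing import Optional

# hypothetical mapping from a user-character action to the response action
# that the computer-controlled character executes to form the cooperative state
RESPONSE_ACTIONS = {
    "look_at": "look_back",   # an eye-contact-style pairing
    "raise_hand": "nod",
}

def control_other_character(user_action: str) -> Optional[str]:
    # when the user character performs an action that can form a cooperative
    # state, return the response action for the other character to execute;
    # actions outside the mapping produce no response
    return RESPONSE_ACTIONS.get(user_action)
```

Because the response is triggered directly by the user character's action, the cooperative state (and hence the start condition) is reached at a timing driven by the user, as the paragraph above describes.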
The actions of the respective characters in the performance may be controlled as appropriate; for example, all of the characters, or only the characters other than the user character, may be controlled by the computer. A computer-controlled character may perform any suitable action or a predetermined action (including both a fixed action and an action that varies according to various conditions), and the number of such actions may be one or more. Specifically, for example, as a means of automatically controlling the other character so that it performs actions in the performance, the following mode may be adopted: the game system further includes an opportunity providing unit (43) that, when the action of the user character in the performance is controlled by the game play behavior, provides a selection opportunity for selecting the action that the other character should actually execute in the performance from among a plurality of actions, based on candidate data (OD) that describes the other character in association with the plurality of actions that are candidates for the action to be executed by the other character in the performance.
Alternatively, when a computer-controlled character has an actual record of having been used as a user character in past play, actions performed in accordance with the game play behavior of a user (including not only the user playing this time but also other users) may be executed in the performance. Specifically, in the aspect of the present invention that provides the selection opportunity, the game may be provided such that each user selects the user character from among the plurality of characters, and when the other character has an actual record of performing in a performance while selected as a user character by some user in the past, the plurality of actions may include the actions that the other character performed in that performance. In this case, the play record of a user can be reflected in the actions of characters other than the user character in the performance.
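The opportunity providing unit and its past-record variant could be sketched as follows; the candidate-data layout and function names are assumptions for illustration only:

```python
def build_candidates(base_actions, past_records, character_id):
    # candidate data (OD): actions associated with the character, extended
    # with actions it actually performed when it was used as a user
    # character in past play
    past = [r["action"] for r in past_records if r["character"] == character_id]
    # deduplicate while preserving order
    return list(dict.fromkeys(list(base_actions) + past))

def select_action(candidates, choice):
    # the selection opportunity: pick the action actually to be executed
    return candidates[choice]

candidates = build_candidates(
    ["wave", "spin"],
    [{"character": "54B", "action": "backflip"},
     {"character": "54X", "action": "bow"}],
    "54B",
)
print(candidates)  # ['wave', 'spin', 'backflip']
```

The past-play record simply widens the candidate list, so a character another user once controlled can replay that user's actions in a later performance.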
Various actions can be suitably adopted as the action and the response action for forming the cooperative state. For example, actions that move parts of the character such as the head, hands, or legs may be used, as may actions such as vocalizing or making eye contact. Specifically, for example, in one aspect of the game system of the present invention, when the action of bringing the other character into a field of view (IR) set for the one character functions as the action, and the action of bringing the one character into a field of view (IR) set for the other character functions as the response action, the cooperative state may be formed when the other character, having been brought into the field of view of the one character, brings the one character into its own field of view. In this case, as a simulation of the process of getting in step, the user character and the other characters can be made to execute an action corresponding to eye contact.
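A 2D geometric sketch of this mutual field-of-view check follows; the positions, facing vectors, and the 90-degree view angle are illustrative assumptions, not values from the patent:

```python
import math

def in_field_of_view(observer_pos, observer_dir, target_pos, fov_deg=90.0):
    # the target is inside the observer's field of view (IR) when the angle
    # between the facing direction and the direction to the target is within
    # half the field-of-view angle
    to_target = math.degrees(math.atan2(target_pos[1] - observer_pos[1],
                                        target_pos[0] - observer_pos[0]))
    facing = math.degrees(math.atan2(observer_dir[1], observer_dir[0]))
    diff = abs((to_target - facing + 180.0) % 360.0 - 180.0)
    return diff <= fov_deg / 2.0

def cooperative_state_formed(a_pos, a_dir, b_pos, b_dir):
    # formed when each character brings the other into its own field of view,
    # i.e. an eye-contact-like mutual condition
    return (in_field_of_view(a_pos, a_dir, b_pos)
            and in_field_of_view(b_pos, b_dir, a_pos))

# two characters facing each other -> cooperative state formed
print(cooperative_state_formed((0, 0), (1, 0), (5, 0), (-1, 0)))  # True
# the second character looks away -> no cooperative state
print(cooperative_state_formed((0, 0), (1, 0), (5, 0), (1, 0)))   # False
```

The asymmetric case (only one character sees the other) never forms the state, which is what makes the check behave like eye contact rather than one-sided observation.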
The start condition may include only the formation of the cooperative state as a necessary condition, or may include various other necessary conditions as appropriate, such as conditions on the user's game play behavior, play status, or game state. For example, in one aspect of the game system of the present invention, the necessary conditions of the start condition may further include execution of a special action by the one character in the cooperative state, so that the start condition is satisfied when the special action is executed in the cooperative state.
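The two necessary conditions of this variant combine conjunctively; a one-line sketch (names are illustrative):

```python
def start_condition(cooperative_state_formed: bool,
                    special_action_executed: bool) -> bool:
    # both necessary conditions must hold: the cooperative state has been
    # formed, and the special action was executed while in that state
    return cooperative_state_formed and special_action_executed

print(start_condition(True, False))  # False: the cooperative state alone is not enough
print(start_condition(True, True))   # True
```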
The performance may include appropriate parts; for example, it may include not only a part in which the plurality of characters actually perform, but also various other parts as appropriate. Specifically, in one mode of the game system of the present invention, the performance may include a performance section in which the plurality of characters actually perform and a preparation section for indicating when the performance section will start, and the progress control unit may control the progress of the game such that the preparation section is started, as the start of the performance, upon satisfaction of the start condition, whereby the performance section starts after the preparation section.
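The standby / preparation / performance ordering described above can be sketched as a small state machine; the phase names are illustrative:

```python
def advance_phase(phase: str, start_condition_met: bool) -> str:
    # standby -> preparation upon satisfaction of the start condition
    # (the preparation section, e.g. a count-in, indicates when the
    # performance section will start); preparation -> performance
    if phase == "standby":
        return "preparation" if start_condition_met else "standby"
    if phase == "preparation":
        return "performance"
    return phase

phase = "standby"
phase = advance_phase(phase, False)  # still standing by
phase = advance_phase(phase, True)   # preparation starts as the start of the performance
phase = advance_phase(phase, True)   # the performance section begins
print(phase)  # performance
```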
Various games can be provided as the game, and various performances by a plurality of characters can be executed in accordance with the kind of game. For example, a play in a sports game such as a soccer game or a baseball game may be utilized as the performance, with the start of that play governed by the start condition. Alternatively, a dance, a musical performance, or the like in a music game may be utilized as the performance. Specifically, for example, in one embodiment of the game system of the present invention, a music game may be provided as the game, in which the execution timing of a game play action to be executed by the user is guided so as to match the melody of a musical piece, and the user character executes a performance action for playing the piece in accordance with that game play action; in this music game, the performance actions by each of the plurality of characters are executed as the performance.
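In such a music game, the guided execution timing is typically compared against the actual input time; a sketch of that judgment, with hypothetical window widths:

```python
def judge_play_action(input_ms: float, guided_ms: float,
                      great_window_ms: float = 50.0,
                      good_window_ms: float = 100.0) -> str:
    # the user's play action is judged by how far it deviates from the
    # execution timing guided to match the melody; the window widths and
    # grade labels here are illustrative assumptions
    delta = abs(input_ms - guided_ms)
    if delta <= great_window_ms:
        return "great"
    if delta <= good_window_ms:
        return "good"
    return "miss"

print(judge_play_action(1020.0, 1000.0))  # great
print(judge_play_action(1080.0, 1000.0))  # good
print(judge_play_action(1300.0, 1000.0))  # miss
```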
A computer program (PG2) of the present invention is configured to cause a computer (41), connected to the input device and the display device, to function as each unit of the game system described above.
Further, the control method of the present invention causes a computer (41) mounted on a game system (4) to execute a condition determination process and a progress control process, wherein the game system is connected to an input device (OS) for inputting a game play behavior of a user and to a display device (47), the display device (47) displays a game screen (50) so as to include a plurality of characters (54), the plurality of characters (54) including a user character (54C) that is a character operated by the game play behavior of the user, and the game system provides a game including a performance executed by the plurality of characters. In the condition determination process, it is determined whether or not a start condition is satisfied, the necessary conditions of the start condition including formation of a cooperative state, which is a state formed at least between one character (54C) among the plurality of characters and another character (54B) in a case where a response action, which is a prescribed action responding to an action performed by the one character, is performed by the other character. In the progress control process, the progress of the game is controlled such that the performance is started upon satisfaction of the start condition, and the start of the performance stands by until the start condition is satisfied. The game system of the present invention can be realized by the computer program or the control method of the present invention.
Description of the reference numerals
1: a game system; 2: a center server; 4: HMD type game machine (game system); 41: a control unit (computer); 43: a progress control unit (condition determination unit, progress control unit, character control unit, opportunity providing unit); 47: a display (display device); 50: a game screen; 54: character images (characters); 54B: a second character image (the other character); 54C: a third character image (user character, the one character); IR: a field of view; OS: an operation lever (input device).

Claims (10)

1. A game system connected to an input device for inputting a game play behavior of a user and a display device for displaying a game screen so as to include a plurality of characters, the plurality of characters including a user character that is a character operated by the game play behavior of the user, the game system providing a game including a performance performed by the plurality of characters, the game system comprising:
a condition determination unit that determines whether or not a start condition is satisfied, the necessary conditions of the start condition including formation of a cooperative state, which is a state formed at least between one character among the plurality of characters and another character in a case where a response action, which is a prescribed action responding to an action performed by the one character, is performed by the other character; and
a progress control unit that controls the progress of the game such that the performance is started upon satisfaction of the start condition, and the start of the performance stands by until the start condition is satisfied.
2. The game system according to claim 1, wherein,
the game system further comprises a character control unit that, when the user character functions as the one character, controls the other character so that the other character executes the response action in response to the action performed by the user character.
3. The game system according to claim 2, wherein,
the game system further comprises an opportunity providing unit that, when the action of the user character in the performance is controlled by the game play behavior, provides a selection opportunity for selecting the action that the other character should actually execute in the performance from among a plurality of actions, based on candidate data that describes the other character in association with the plurality of actions that are candidates for the action to be executed by the other character in the performance.
4. The game system according to claim 3, wherein,
the game is provided in such a manner that each user selects the user character from among the plurality of characters, and
when there is an actual record of the other character having been used as the user character by a user in playing the game, the plurality of actions include an action that the other character performed in that record.
5. The game system according to any one of claims 1 to 4, wherein,
in a case where the action of bringing the other character into a field of view set for the one character functions as the action and the action of bringing the one character into a field of view set for the other character functions as the response action, the cooperative state is formed when the other character, having been brought into the field of view of the one character, brings the one character into the field of view of the other character.
6. The game system according to any one of claims 1 to 5, wherein,
the necessary conditions of the start condition further include execution of a special action by the one character in the cooperative state, and the start condition is satisfied when the special action is executed in the cooperative state.
7. The game system according to any one of claims 1 to 6, wherein,
the performance includes a performance section in which the plurality of characters actually perform and a preparation section for indicating when the performance section will start, and
the progress control unit controls the progress of the game such that the preparation section is started, as the start of the performance, upon satisfaction of the start condition, whereby the performance section starts after the preparation section.
8. The game system according to any one of claims 1 to 7, wherein,
a music game is provided as the game, in which the execution timing of a game play action to be executed by the user is guided so as to match the melody of a musical piece, and the user character executes a performance action for playing the piece in accordance with the game play action, and
the performance actions by each of the plurality of characters are executed as the performance in the music game.
9. A computer program configured to cause a computer, connected to the input device and the display device, to function as each unit of the game system according to any one of claims 1 to 8.
10. A control method for causing a computer mounted on a game system to execute a condition determination process and a progress control process, wherein the game system is connected to an input device for inputting a game play behavior of a user and a display device for displaying a game screen so as to include a plurality of characters, the plurality of characters including a user character that is a character operated by the game play behavior of the user, and the game system provides a game including a performance executed by the plurality of characters, wherein,
in the condition determination process, it is determined whether or not a start condition is satisfied, the necessary conditions of the start condition including formation of a cooperative state, which is a state formed at least between one character among the plurality of characters and another character in a case where a response action, which is a prescribed action responding to an action performed by the one character, is performed by the other character, and
in the progress control process, the progress of the game is controlled such that the performance is started upon satisfaction of the start condition, and the start of the performance stands by until the start condition is satisfied.
CN202180087065.2A 2020-12-25 2021-11-24 Game system, computer program used in the game system, and control method Pending CN116635119A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2020-217979 2020-12-25
JP2020217979A JP2022102913A (en) 2020-12-25 2020-12-25 Game system, computer program used for the same, and control method
PCT/JP2021/043007 WO2022137958A1 (en) 2020-12-25 2021-11-24 Game system, computer program employed in same, and control method

Publications (1)

Publication Number Publication Date
CN116635119A true CN116635119A (en) 2023-08-22

Family

ID=82157634

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180087065.2A Pending CN116635119A (en) 2020-12-25 2021-11-24 Game system, computer program used in the game system, and control method

Country Status (4)

Country Link
JP (1) JP2022102913A (en)
KR (1) KR20230104959A (en)
CN (1) CN116635119A (en)
WO (1) WO2022137958A1 (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002320771A (en) * 2001-04-26 2002-11-05 Square Co Ltd Video game device and its control method, program for video game, and computer readable recording medium recording the program
US8678896B2 (en) * 2007-06-14 2014-03-25 Harmonix Music Systems, Inc. Systems and methods for asynchronous band interaction in a rhythm action game
JP2010000257A (en) * 2008-06-20 2010-01-07 Namco Bandai Games Inc Game controller case, game controller case set, program, and information storage medium
JP5052548B2 (en) * 2009-02-27 2012-10-17 株式会社スクウェア・エニックス Video game processing apparatus and video game processing program
JP6706473B2 (en) * 2015-09-14 2020-06-10 株式会社コーエーテクモゲームス Information processing apparatus, display control method, and display control program
JP6727807B2 (en) 2015-12-29 2020-07-22 株式会社バンダイナムコアミューズメント Game device and program
JP6832061B2 (en) * 2015-12-29 2021-02-24 株式会社バンダイナムコエンターテインメント Game equipment and programs
JP6624175B2 (en) * 2017-08-18 2019-12-25 株式会社コナミデジタルエンタテインメント GAME DEVICE, GAME DEVICE PROGRAM, AND GAME SYSTEM

Also Published As

Publication number Publication date
KR20230104959A (en) 2023-07-11
WO2022137958A1 (en) 2022-06-30
JP2022102913A (en) 2022-07-07


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination