WO2022137958A1 - Game system, computer program used therefor, and control method - Google Patents
Game system, computer program used therefor, and control method
- Publication number
- WO2022137958A1 (PCT/JP2021/043007)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- character
- performance
- action
- user
- game
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/45—Controlling the progress of the video game
- A63F13/48—Starting a game, e.g. activating a game device or waiting for other players to join a multiplayer session
- A63F13/70—Game security or game management aspects
- A63F13/79—Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
- A63F13/798—Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/814—Musical performances, e.g. by evaluating the player's ability to follow a notation
- A63F13/847—Cooperative playing, e.g. requiring coordinated actions from several players to achieve a common goal
Definitions
- The present invention relates to a game system or the like that is connected to an input device for inputting a user's play action and to a display device that displays a game screen containing a plurality of characters, including a user character operated through the user's play action, and that provides a game including performances executed by the plurality of characters.
- There are known game systems that are connected to an input device for inputting a user's play action and to a display device that displays a game screen containing a plurality of characters, including a user character operated through the user's play action, and that provide games including performances performed by those characters.
- For example, a game system is known that provides a music game in which the user plays the part of the vocalist of a band (see, for example, Patent Document 1).
- Accordingly, an object of the present invention is to provide a game system or the like capable of causing a plurality of characters in a game to perform, before the start of a performance, a process similar to the breath-matching that a group of performers carries out before beginning a performance together.
- The game system of the present invention is connected to an input device for inputting a user's play action and to a display device that displays a game screen containing a plurality of characters, including a user character operated through the user's play action, and provides a game including a performance executed by the plurality of characters. The system comprises: condition determining means for determining, when another character executes a corresponding action, namely a predetermined action corresponding to an action executed by at least one of the plurality of characters, whether or not a start condition is satisfied, the start condition including at least the formation of a cooperative state between the one character and the other character; and progress control means for controlling the progress of the game so that the performance is started when the start condition is satisfied, while the start of the performance is waited for until the start condition is satisfied.
- The computer program of the present invention is configured to cause a computer connected to the input device and the display device to function as each of the means of the above-mentioned game system.
- The control method of the present invention causes a computer, incorporated in a game system that is connected to an input device for inputting a user's play action and to a display device that displays a game screen containing a plurality of characters, including a user character operated through the user's play action, and that provides a game including a performance executed by the plurality of characters, to execute: a condition determination procedure for determining, when another character executes a corresponding action as a predetermined action corresponding to an action performed by at least one of the plurality of characters, whether or not a start condition is satisfied, the start condition including at least the formation of a cooperative state between the one character and the other character; and a progress control procedure for controlling the progress of the game so that the performance is started when the start condition is satisfied, and the start of the performance is waited for until the start condition is satisfied.
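As a non-authoritative sketch, the condition-determining means and progress-control means described above might be modeled as follows; all class, method, and character names here are illustrative and not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class PerformanceGate:
    """Illustrative sketch: the performance starts only once every character
    has joined the cooperative state, e.g. by executing the corresponding
    action (such as a pre-performance breath-matching cue)."""
    character_ids: frozenset
    ready: set = field(default_factory=set)

    def register_corresponding_action(self, character_id) -> None:
        # A character that executes the corresponding action joins
        # the cooperative state.
        if character_id in self.character_ids:
            self.ready.add(character_id)

    def start_condition_satisfied(self) -> bool:
        # The start condition holds when the cooperative state is
        # formed among all of the characters.
        return self.ready == set(self.character_ids)

gate = PerformanceGate(frozenset({"user", "guitar", "bass"}))
gate.register_corresponding_action("user")
gate.register_corresponding_action("guitar")
assert not gate.start_condition_satisfied()   # start of performance is waited for
gate.register_corresponding_action("bass")
assert gate.start_condition_satisfied()       # performance may now begin
```

The design point is simply that the progress control loop polls (or is notified by) the condition check and withholds the performance until it returns true.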
- the game system 1 includes a center server 2 as a server device.
- the center server 2 may be configured as one logical server device by combining server units as a plurality of computer devices. Alternatively, the center server 2 may be logically configured by using cloud computing.
- An appropriate number of game devices are connected to the center server 2 as client devices that can be connected via the network 3.
- The game devices may appropriately include various game machines such as arcade game machines (commercial game machines installed in facilities such as amusement arcades, which allow the user to play a game within a range corresponding to the play fee in exchange for payment of a predetermined price), but in the example of FIG. 1, the user terminal device 4 is shown. Specifically, a plurality of user terminal devices 4 are connected to the center server 2 via the network 3 as an example of a game device.
- the user terminal device 4 is a computer device that can be connected to a network and is used for personal use by the user. By implementing various computer software, the user terminal device 4 can allow the user to enjoy various services provided by the center server 2.
- Such computer software includes an application for a game for providing a paid or free game. Then, the user terminal device 4 functions as a game machine through the execution of such a game application.
- Such a user terminal device 4 includes, for example, a desktop or notebook personal computer, a stationary home-use game machine, and various mobile terminal devices such as a portable tablet terminal device and a mobile phone (including a smartphone). Any of these may be used as appropriate as the user terminal device 4, but in the example of FIG. 1, an HMD (head-mounted display, or head-mounted device) type game machine is used.
- the HMD type game machine is a well-known game device that is mounted on the head so that the display surface of the display occupies most of the user's field of view.
- The HMD type game machine includes, for example, an HMD type dedicated game device, a game device composed of a combination of an appropriate mobile terminal device such as a mobile phone and a case that accommodates it so as to direct the user's view to its display surface, and a glasses-type projector (so-called smart glasses) that projects an image so that the image is focused on the retina.
- the HMD-type dedicated game device includes an interlocking game device that is connected to a tablet terminal device, a smartphone, or the like and functions as a game machine through the application of the tablet terminal device. Any of these various HMD-type game machines may be used as the user terminal device 4.
- the HMD-type game machine that functions as the user terminal device 4 may be referred to as the HMD-type game machine 4 with the same reference numerals as those of the user terminal device 4.
- the HMD type game machine 4 is appropriately provided with various input devices for inputting the user's play action.
- the HMD type game machine 4 may have a built-in sensor that detects the movement of the head as a play action, and the sensor may function as an input device.
- a device different from the HMD type game machine 4 may be connected to the HMD type game machine 4 by an appropriate method, and the other device may function as an input device.
- the HMD type game machine 4 may be appropriately provided with various input devices, and in the example of FIG. 1, an operation stick OS is provided as such an input device.
- the operation stick OS is a well-known input device connected to the HMD type game machine 4 through a predetermined wireless communication standard.
- The operation stick OS may be used for inputting various operations (play actions), and an appropriate number of operation sticks OS may be connected to each HMD type game machine 4 accordingly; as an example, two operation sticks OS (only one of which is shown in FIG. 1) are connected to each HMD type game machine 4 so as to be operated by the left and right hands, respectively.
- the HMD type game machine 4 provides a game that progresses through a play action input via such an operation stick OS.
- The HMD type game machine 4 may appropriately provide various games, for example an action game, a simulation game, or a role-playing game, but a music game is provided as an example.
- Music games are a type of timing game.
- a timing game is a type of game that evaluates the execution time of an appropriate play action.
- the execution time at which the appropriate play action should be performed is provided together with the music.
- a time that matches the rhythm of the music is used as an execution time. That is, the music game is a type of game that guides the user to the time when an appropriate play act should be executed according to the rhythm of the music, and evaluates the time when the play act is actually executed.
- In a music game, a plurality of songs are prepared for play, and a song selected from among them is used for the actual play.
- a music game may be appropriately provided via various output devices, but as an example, it is provided through a game screen displayed on a display.
- the game provided by the HMD type game machine 4 includes a plurality of characters (including various objects such as a car or an animal) and performances executed by the plurality of characters.
- a plurality of characters may appropriately perform various performances according to the type of an action game or the like, but in the case of a music game, a performance of playing a musical instrument (hereinafter referred to as a performance performance) is executed as an example. Therefore, the game screen may include a plurality of characters that perform such a performance performance. Details of such a game screen will be described later.
- the network 3 may be appropriately configured as long as the HMD type game machine 4 can be connected to the center server 2.
- the network 3 is configured to realize network communication using the TCP/IP protocol.
- the network 3 is configured by combining the Internet as a WAN and the intranet as a LAN.
- the center server 2 is connected to the network 3 via a router 3a, and each HMD type game machine 4 is connected to the network 3 via an access point 3b.
- the network 3 is not limited to a form using the TCP/IP protocol.
- various forms using a wired line for communication, a wireless line (including infrared communication, short-range wireless communication, etc.) and the like may be used.
- the center server 2 provides various Web services to the user of the HMD type game machine 4 via the network 3.
- the Web service includes a distribution service that distributes various data or software (including updates of data and the like) to each HMD type game machine 4.
- The Web service may also appropriately include various other services, such as a matching service that matches the user with users of other HMD type game machines 4 (competitors or cooperators) via the network 3 when the music game is played as a battle type or cooperative type game, and a service that assigns a user ID for identifying each user.
- the center server 2 is provided with a control unit 21 and a storage unit 22 as storage means.
- The control unit 21 is configured as a computer in which a CPU, as an example of a processor that executes various arithmetic processes and operation controls according to a predetermined computer program, is combined with an internal memory and other peripheral devices necessary for its operation.
- the storage unit 22 is an external storage device realized by a storage unit including a non-volatile storage medium (computer-readable storage medium) such as a hard disk array.
- the storage unit 22 may be configured to hold all the data on one storage unit, or may be configured to distribute and store the data in a plurality of storage units.
- the program PG1 is recorded in the storage unit 22 as an example of a computer program that causes the control unit 21 to execute various processes necessary for providing various services to the user.
- the storage unit 22 stores server data necessary for providing various services.
- The server data includes various data for the services; in the example of FIG. 2, music data MD, sequence data QD, play data PD, and performance data OD are shown as examples of such data.
- Music data MD is data for playing each music.
- the music data MD is used for playing each music in, for example, a music game.
- the sequence data QD is data that describes each execution time in which an appropriate play action should be executed in a music game.
- the sequence data QD is used to inform the user of each such execution time. Further, when the play action is actually executed by the user, the play action is evaluated based on the execution time of the sequence data QD. That is, the sequence data QD is used for guidance of each execution time and its evaluation. Therefore, in the sequence data QD, information on each execution time and an appropriate play action to be executed at the execution time is described so as to be associated with each other.
- the sequence data QD is prepared for each song or each difficulty level.
- each execution time and appropriate play action may differ depending on the selected character (or a musical instrument different for each character as described later).
- the sequence data QD is also prepared for each such character (or musical instrument).
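The description above says the sequence data QD associates each execution time with the appropriate play action to be executed at that time, and is prepared per song, per difficulty level, and per character (or instrument). A minimal sketch of such a record, with purely illustrative field names, keys, and values (none of them are taken from the patent), might look like:

```python
# Hypothetical in-memory shape of sequence data QD: keyed by
# (song, difficulty, instrument), each entry pairing an execution
# time with the target to be hit at that time.
sequence_qd = {
    ("song_01", "hard", "drums"): [
        {"time": 1.50, "target": "cymbal_left_low"},
        {"time": 2.00, "target": "drum_center"},
        {"time": 2.50, "target": "cymbal_right_low"},
    ],
}

def actions_at(song, difficulty, instrument, t, window=0.1):
    """Return the appropriate play actions whose execution time falls
    within +/- window seconds of time t (used both for guidance and
    for evaluating an actually executed play action)."""
    notes = sequence_qd.get((song, difficulty, instrument), [])
    return [n["target"] for n in notes if abs(n["time"] - t) <= window]

assert actions_at("song_01", "hard", "drums", 2.05) == ["drum_center"]
assert actions_at("song_01", "hard", "drums", 5.00) == []
```

Keying by character/instrument mirrors the statement that the same song may have different execution times and play actions per character.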
- the play data PD is data in which information regarding the past play results of each user is described.
- The play data PD is used to carry over each user's play results (past achievements) from previous plays to subsequent plays, and to carry over setting contents unique to each user.
- the performance data OD is data for realizing a performance performance. The details of the performance data OD will be described later.
- The server data may also include various other data for realizing the various services, such as image data and ID management data.
- the image data is data for displaying various images such as a game screen on a display device.
- the ID management data is data for managing various IDs such as user IDs. However, their illustration is omitted.
- the control unit 21 is provided with a Web service management unit 24 as a logical device realized by a combination of the hardware resources of the control unit 21 and the program PG1 as software resources.
- the Web service management unit 24 executes various processes for providing the above-mentioned Web service to the HMD type game machine 4.
- The HMD type game machine 4 is provided with a control unit 41 and a storage unit 42 as storage means. The control unit 41 is configured as a computer in which a CPU, as an example of a processor that executes various arithmetic processes and operation controls according to a predetermined computer program, is combined with an internal memory and other peripheral devices necessary for its operation.
- the storage unit 42 is an external storage device realized by a storage unit including a non-volatile storage medium (computer-readable storage medium) such as a hard disk or a semiconductor storage device.
- the program PG2 is recorded in the storage unit 42 as an example of a computer program that causes the control unit 41 to execute various processes necessary for providing various services to the user. Further, the storage unit 42 records game data necessary for providing the music game.
- game data includes various data for music games, and in the example of FIG. 2, music data MD, sequence data QD, play data PD, and performance data OD are shown as examples.
- The music data MD, sequence data QD, play data PD, and performance data OD may be provided to the storage unit 42 by various methods, such as initial installation or provision through various storage media; as an example, they are provided from the center server 2 through the distribution service.
- the game data may include various data for the game, for example, voice data for reproducing such various voices when there is another voice different from the music.
- Such data may appropriately include image data provided through a distribution service or the like, ID management data, or the like, as in the case of music data MD or the like.
- these illustrations are omitted.
- In the control unit 41, various logical devices are configured by the combination of the hardware resources of the control unit 41 and the program PG2 as a software resource. Various processes necessary for providing the music game (including processes necessary for enjoying the Web services provided by the Web service management unit 24 of the center server 2) are then executed through these logical devices.
- a progress control unit 43 and a data management unit 44 are shown as logical devices related to a music game.
- the progress control unit 43 is a logical device that performs various processes necessary for the progress of the game. Such a process includes, for example, a process of executing various preparations for play, a process of guiding the user to each execution time of the play action, and a process of evaluating the play action executed by the user.
- The preparations for play include appropriate elements such as various settings, for example the provision of selection opportunities and the determination of the start of guidance for each execution time.
- the progress control unit 43 executes a performance selection process, a cooperative response process, and a music start process as an example of such various processes.
- the data management unit 44 is a logical device that performs various processes related to the management of the game data recorded in the storage unit 42.
- the data management unit 44 executes a process of acquiring game data provided from the center server 2 and storing the game data in the storage unit 42.
- the data management unit 44 executes a performance data generation process as an example of such various processes. The details of the procedure of the performance selection process, the cooperative response process, the music start process, and the performance data generation process will be described later.
- the display 47 and the speaker SP are examples of the output devices, and the sensor SM is an example of the input devices.
- the display 47 is a well-known display device for displaying a game screen or the like.
- the speaker SP is a well-known audio reproduction device for reproducing various audios including music.
- the sensor SM is a well-known detection device (detection means) for detecting various states of the HMD type game machine 4.
- The sensor SM may appropriately include various detection devices according to the state to be detected, for example a sensor for eye tracking that tracks the user's line of sight; in the example of FIG. 2, an acceleration sensor SM1 and a gyro sensor SM2 are shown.
- the acceleration sensor SM1 is a well-known detection device for detecting the acceleration (for example, acceleration in the triaxial direction) generated in the HMD type game machine 4.
- the acceleration sensor SM1 may be appropriately utilized through the detection of such acceleration, but as an example, it is used for detecting a state such as a horizontal state, an inclination of the HMD type game machine 4, or an orientation.
- the gyro sensor SM2 is a well-known detection device for detecting a change in angle with respect to a reference axis (for example, three axes). The gyro sensor SM2 may be appropriately utilized through such angle detection, but as an example, it is used for detecting a state such as rotation, tilt, or orientation of the HMD type game machine 4.
- The acceleration sensor SM1 and the gyro sensor SM2 may each be used independently (when only one is used, the other may be omitted), but as an example they are used in combination to detect various states such as the orientation of the HMD type game machine 4.
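One common way to combine an accelerometer and a gyroscope for orientation detection is a complementary filter; the patent does not specify any fusion method, so the following is only an illustrative sketch of why the combination is useful (short-term gyro accuracy, long-term accelerometer stability):

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """One filter step: integrate the gyro's angular rate for responsiveness
    and blend in the accelerometer's gravity-derived angle to cancel the
    gyro's drift. alpha weights the gyro path; values near 1 trust the
    gyro on short timescales."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Device held still at a 10-degree tilt: even with the gyro reporting zero
# rate, the estimate converges toward the accelerometer's reading.
angle = 0.0
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.0, accel_angle=10.0, dt=0.01)
assert 9.0 < angle < 10.0
```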
- Various other output devices or input devices may be appropriately connected to the HMD type game machine 4, but in the example of FIG. 2, the above-mentioned operation stick OS is connected.
- the operation stick OS is as described above, but is connected to the control unit 41 and outputs various input signals according to the play action to the control unit 41.
- FIG. 3 is a diagram schematically showing an example of a game screen displayed on the display 47 of the HMD type game machine 4.
- A music game may include various game screens, but the example of FIG. 3 shows a game screen for guiding each execution time at which a play action should be executed. A music game is often configured so that the playing sound of a musical instrument is reproduced along with the play action, as if the user were playing the instrument; in such a case, a character that executes the instrument-playing action may appear on the game screen to improve the sense of presence, and similar characters corresponding to other users (cooperators) or to the computer may also appear.
- FIG. 3 shows a game screen that includes these characters. More specifically, it includes both a character that performs the action of playing a drum, as an example of a musical instrument, according to the user's play action, and characters that perform the actions of playing other instruments according to other users' play actions. Such a game screen may be appropriately configured; the example of FIG. 3 shows a case where it is configured to present a virtual three-dimensional space. The virtual three-dimensional space may be appropriately cut out as the shooting result of a virtual camera and displayed as the game screen; in the example of FIG. 3, the screen corresponds to the viewpoint of the character corresponding to the user (hereinafter sometimes referred to as the user character).
- the game screen 50 includes an instruction object 51, a score display area 52, a stage area 53, a character image 54, and a drum set image 55.
- the drum set image 55 is an image imitating a drum set of a musical instrument.
- the drum set image 55 may correspond to an appropriate drum set, but in the example of FIG. 3, two drum images 55a and four cymbal images 55b are included.
- Each drum image 55a and each cymbal image 55b are images corresponding to the drums and cymbals in the drum set, respectively. These are displayed in the same arrangement as the actual drum set.
- the two drum images 55a are arranged near the center in the drum set image 55 so as to be arranged side by side.
- the four cymbal images 55b are arranged on the left and right sides of the two drum images 55a so as to sandwich the two drum images 55a in the drum set image 55.
- the two cymbal images 55b on the left side and the two cymbal images 55b on the right side are both arranged so as to be vertically separated from each other.
- the score display area 52 is an area for displaying a score (acquired score).
- the score display area 52 may be appropriately configured, but in the example of FIG. 3, it is formed in a square shape and is displayed so as to be located on the back side (user side is in front) in the virtual three-dimensional space.
- the instruction object 51 is an image corresponding to each execution time (in other words, a sign image indicating each execution time).
- the instruction object 51 appears at an appropriate position on the back side in the virtual three-dimensional space at an appropriate time, and moves along a predetermined movement path so as to approach the front side, that is, the user.
- Such a movement path may be appropriately set, but is set so as to pass through either the drum image 55a or the cymbal image 55b.
- each drum image 55a and each cymbal image 55b functions as a marker image indicating a reference of the current time. Therefore, each instruction object 51 moves along such a movement path so as to match either the drum image 55a or the cymbal image 55b at the corresponding execution time.
- the movement route may be displayed, but the movement route is not displayed in the example of FIG.
- Each execution time may be appropriately guided through the relative displacement between the instruction object 51 (instruction marker image) and each drum image 55a or the like (current-time marker image); as an example, it is guided through the movement of the instruction object 51 toward each drum image 55a or the like.
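Since each instruction object 51 must coincide with its marker image exactly at the execution time, its on-screen position can be derived by interpolating along the movement path as a function of time. A simple linear sketch, with purely illustrative coordinates and timing values:

```python
def instruction_object_position(spawn_pos, marker_pos, spawn_time, hit_time, now):
    """Linearly interpolate the instruction object from its spawn point
    toward the marker image so that it coincides with the marker exactly
    at the execution time. Positions are (x, y, z) in the virtual space."""
    span = hit_time - spawn_time
    u = min(max((now - spawn_time) / span, 0.0), 1.0)  # progress clamped to [0, 1]
    return tuple(s + (m - s) * u for s, m in zip(spawn_pos, marker_pos))

# Halfway through its travel, the object is midway between its spawn
# point on the back side and the cymbal image on the front side.
pos = instruction_object_position((0, 0, 10), (0, -1, 0), 0.0, 2.0, 1.0)
assert pos == (0.0, -0.5, 5.0)
```

A real implementation would follow the (possibly curved, undisplayed) movement path described in the sequence data QD rather than a straight line; linearity here is an assumption for brevity.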
- the instruction object 51 may be classified into various types, but in the example of FIG. 3, two types of the first instruction object 51A and the second instruction object 51B are included.
- The first instruction object 51A and the second instruction object 51B are instruction objects 51 that request different play actions at their respective execution times. Specifically, both request the user to tap a lower cymbal image 55b as the play action, but the first instruction object 51A corresponds to a request for the tapping operation on the lower-left cymbal image 55b, while the second instruction object 51B corresponds to a request for the tapping operation on the lower-right cymbal image 55b. Therefore, after appearing at appropriate positions, the first instruction object 51A and the second instruction object 51B move toward the front side so as to coincide with the positions of the left and right lower cymbal images 55b at their respective execution times.
- The tapping action required of the user is input through the operation stick OS.
- the user executes an operation of tapping the position of the cymbal image 55b in the virtual three-dimensional space.
- Specifically, an operation of swinging the operation stick OS is required. Therefore, in the sequence data QD, information such as the movement path of each instruction object 51, or the cymbal image 55b or the like on that path, is described as information on the appropriate play action so that the action of hitting each cymbal image 55b or the like functions as an appropriate play action at each execution time. The smaller the deviation between the time at which the tapping operation is actually executed and the time at which coincidence should occur (the execution time described in the sequence data QD), the higher the evaluation. The same applies to the second instruction object 51B.
- Each drum image 55a and each cymbal image 55b are arranged at predetermined positions defined by spatial coordinates in the virtual three-dimensional space, and each instruction object 51 moves toward the drum image 55a or the like at its predetermined position (spatial coordinates). Further, an evaluation range (a range equal to or slightly larger than each drum image 55a or the like) is set for each drum image 55a or the like based on that predetermined position.
- The stick image 56 moves in the virtual three-dimensional space in accordance with operations on the operation stick OS, and whether or not a tapping operation has been executed is determined according to whether or not the position (spatial coordinates) of its tip is within the evaluation range of a drum image 55a or cymbal image 55b. Then, the deviation time between the time at which the stick image 56 entered the evaluation range (the time of the actual play action) and the time at which the instruction object 51 should reach the drum image 55a or the like (the execution time described in the sequence data QD) is calculated, and the evaluation is determined according to that deviation time. This evaluation may be performed as appropriate, but is realized, for example, through criteria such as perfect, great, good, or bad.
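As a minimal sketch of the hit determination and deviation-based grading described above; the evaluation-range margin and the grading thresholds below are assumed values for illustration, not values from this specification.

```python
import math

def is_hit(stick_tip, target_center, target_radius, margin=0.02):
    """True when the stick tip lies inside the evaluation range, modelled
    as a sphere equal to or slightly larger than the drum/cymbal image."""
    dx, dy, dz = (a - b for a, b in zip(stick_tip, target_center))
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= target_radius + margin

# Deviation (seconds) between the actual hit time and the execution time
# described in the sequence data QD, mapped to a grade. Thresholds are
# purely illustrative assumptions.
GRADES = [(0.03, "perfect"), (0.06, "great"), (0.10, "good"), (0.15, "bad")]

def grade(hit_time, execution_time):
    deviation = abs(hit_time - execution_time)
    for limit, name in GRADES:
        if deviation <= limit:
            return name
    return "miss"
```

A real implementation would run the sphere test per frame against every drum image 55a and cymbal image 55b, then grade only the nearest matching instruction object 51.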
- If the tapping operation is not executed within a predetermined time based on each execution time, it is judged as a miss (an evaluation result corresponding to a miss). Similarly, if the tapping operation is executed earlier or later than the period corresponding to bad, it is judged as a miss. If the tapping operation is executed at a time not belonging to any of these periods, the operation is not used for anything and may be ignored.
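The handling of a tapping operation relative to the nearest execution time, including the case where a stray tap is simply ignored, can be sketched as follows; the window widths are assumptions.

```python
# Window widths (seconds) around each execution time; assumed values.
JUDGE_WINDOW = 0.15   # within this range the tap is graded (down to "bad")
MISS_WINDOW = 0.30    # beyond the judge window but inside this: a miss

def classify_tap(tap_time, execution_times):
    """Return ("judged" | "miss" | "ignored", nearest execution time or None)."""
    if not execution_times:
        return ("ignored", None)
    nearest = min(execution_times, key=lambda t: abs(t - tap_time))
    deviation = abs(tap_time - nearest)
    if deviation <= JUDGE_WINDOW:
        return ("judged", nearest)
    if deviation <= MISS_WINDOW:
        return ("miss", nearest)
    # A tap far from every execution time is not used for anything.
    return ("ignored", None)
```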
- The instruction objects 51 also include a type corresponding to each of the upper-left and upper-right cymbal images 55b, a type corresponding to each drum image 55a, and the like. Through these types, operations of hitting the drum images 55a and the like may be requested as appropriate in the same manner as with the first instruction object 51A, but their display is omitted in the example of FIG. 3.
- The various instruction objects 51, such as the first instruction object 51A, may be displayed in an appropriate form, but as an example, they are all displayed in the same shape as the current-time marker image located on their movement path, such as the cymbal image 55b.
- The various instruction objects 51, such as the first instruction object 51A, may be displayed so as to be distinguishable from each other as appropriate; as an example, the first instruction object 51A and the second instruction object 51B are displayed in mutually different color schemes.
- Similarly, the drum images 55a and the like in the drum set image 55 may be displayed in mutually different color schemes.
- In this case, each instruction object 51 may be displayed in the same color as the corresponding drum image 55a or the like so that the user can easily identify the play target.
- the stage area 53 is an area corresponding to the stage formed in the virtual three-dimensional space.
- each character performs a performance operation on the stage. Therefore, the character image 54 and the drum set image 55 corresponding to each character are arranged in the stage area 53.
- An appropriate number of character images 54 may be arranged in the stage area 53, but in the example of FIG. 3, three character images 54 are displayed.
- the character image 54 includes a first character image 54A for playing a keyboard, a second character image 54B for playing a guitar, and a third character image 54C for playing a drum.
- These character images 54A, 54B, and 54C respectively correspond to three characters forming one performance unit for playing a keyboard, a guitar, and drums (all examples of musical instruments) in the virtual three-dimensional space. These three characters may correspond as appropriate to the user and collaborators (including the computer), and may execute performance operations based on any of their play actions; as an example, the first character image 54A and the second character image 54B correspond to the computer, and the third character image 54C corresponds to the user.
- the first character image 54A to the third character image 54C function as a plurality of characters of the present invention.
- The first character image 54A to the third character image 54C may be displayed as appropriate, but in the example of FIG. 3, which corresponds to the field of view of the user character (third character image 54C), the whole bodies of the first character image 54A and the second character image 54B are displayed, while only both hands (only a part of the body) are displayed for the third character image 54C (user character).
- Two stick images 56 are displayed in the left and right hands of the third character image 54C so as to show the user character holding sticks for hitting the drums in the virtual three-dimensional space.
- When the user executes a tapping operation, the third character image 54C also operates so as to reproduce that tapping operation.
- That is, as the tapping operation is executed, the third character image 54C executes an operation (performance operation) of hitting the target cymbal image 55b or the like with the stick image 56.
- The play action by a collaborator (including the computer) may be appropriate to the type of the target instrument or input device; for example, it is an action of striking a key in the case of a keyboard, or an action of plucking a string in the case of a guitar.
- Based on such play actions, the first character image 54A and the second character image 54B also execute similar performance operations on the game screen 50.
- FIG. 4 is an explanatory diagram for explaining the positional relationship of each character in the virtual three-dimensional space.
- The example of FIG. 4 schematically shows the stage in the virtual three-dimensional space and each character as viewed from above. Since the stage and the like in the virtual three-dimensional space correspond to the stage area 53 and the like on the game screen 50, for convenience of explanation the stage and each character in the example of FIG. 4 are given the same reference numerals as the stage area 53, each character image 54, and the like on the game screen 50.
- each character image 54 (each character in the virtual three-dimensional space) is arranged in the stage area 53 (stage in the virtual three-dimensional space) at a predetermined interval.
- The position or interval of each character image 54 may be set as appropriate and may be variable, but as an example they are fixed. That is, each character image 54 is fixedly arranged at a predetermined position in the stage area 53.
- the third character image 54C corresponds to the user character. Therefore, the field of view range IA is set in the third character image 54C.
- the field of view IA corresponds to the shooting range of the virtual camera for drawing the game screen 50.
- The visual field range IA may be set as an appropriate range, but the example of FIG. 4 shows the visual field range IA set when the third character image 54C faces the front. Further, this visual field range IA moves according to the movement of the third character image 54C, in other words, the movement of the user.
- Specifically, the HMD type game machine 4 is worn on the user's head during play, and the head of the third character image 54C also moves with the movement of the user's head.
- The visual field range IA is set to the front side (the range indicated by the solid line) in the example of FIG. 4.
- For example, when the user turns his or her head to the right, the head of the third character image 54C also moves so as to face the same rightward direction in the virtual three-dimensional space.
- the field of view IA changes its angle to the right according to the movement of its head.
- the visual field range IA moves to the right tilt range IA1 indicated by the alternate long and short dash line as the user moves his / her head toward the right.
- the virtual three-dimensional space included in the right tilt range IA1 is displayed as the game screen 50.
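The way the visual field range IA follows the head orientation detected by the HMD can be sketched as below, modelling the range as a view cone rotated by the head's yaw and pitch; the axis conventions and the half angle are assumptions, not values from this specification.

```python
import math

def view_direction(yaw_deg, pitch_deg):
    """Unit forward vector of the virtual camera from head yaw/pitch.

    The angles would come from the HMD's sensor SM; the axis convention
    (x right, y up, z forward) is an assumption."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.sin(yaw) * math.cos(pitch),
            math.sin(pitch),
            math.cos(yaw) * math.cos(pitch))

def in_view(target_dir, forward, half_angle_deg=45.0):
    """True when a target direction lies inside the visual field range IA,
    modelled as a cone with an assumed half angle."""
    dot = sum(a * b for a, b in zip(target_dir, forward))
    return dot >= math.cos(math.radians(half_angle_deg))
```

Turning the head to the right increases the yaw, so a character straight ahead leaves the cone and a character on the right enters it, matching the shift from the range IA to the right tilt range IA1.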
- FIG. 5 is a diagram schematically showing an example of the game screen 50 when the visual field range IA has moved to the right tilt range IA1 in the example of FIG. 4. The example of FIG. 5 also corresponds to the game screen 50 when the user faces diagonally upward to the right relative to the example of FIG. 3. Such a vertical change of the head is also detected by the sensor SM and reflected on the game screen 50. Therefore, in this case, as shown in FIG. 5, the second character image 54B is displayed closer to the center of the game screen 50 than in the example of FIG. 3 as the field of view (visual field range IA) moves, and the display of the first character image 54A has disappeared. Similar changes also occur in the stage area 53, the score display area 52, and the drum set image 55.
- The user character (third character image 54C) also executes a similar tapping operation, that is, a performance operation, on the game screen 50 in accordance with the user's play action (tapping operation). Therefore, each character executes a performance operation with the start of play.
- Character images 54 other than the user character also perform the same performance operation with the start of play.
- A performance performance is formed by the combination of these performance operations, in other words, by the performance operations of the single performance unit as a whole.
- The start of play corresponds to the start of the performance operations and of the performance performance.
- Such a performance performance, that is, game play, is started when the start condition is satisfied.
- The first character image 54A and the second character image 54B may be operated by other users as described above, but the case where both are operated (controlled) by the computer will be described below.
- FIG. 6 is an explanatory diagram for explaining an example of the flow up to the start of the performance performance.
- As shown in FIG. 6, when the start action is executed after a cooperative state is formed between the plurality of character images 54 (characters), the start condition is satisfied, and the performance performance, that is, play of the game, is started.
- Specifically, each character image 54 is first displayed in a standby state before the start of play (before the guidance of execution times by the instruction objects 51).
- Such a standby state may be expressed as appropriate, but in the case of a computer-controlled character, it is expressed by a predetermined standby operation.
- An appropriate operation may be executed as such a standby operation, but in the case of the second character image 54B, for example, an operation of tuning the guitar is executed.
- The first character image 54A also executes an appropriate standby operation, such as an operation of lightly playing the keyboard, but its display is omitted in the example of FIG. 6 for convenience of explanation.
- On the other hand, the user's operation is reflected in the third character image 54C operated by the user (when another character is controlled by another user, the same applies to that character).
- the user is allowed various actions in the standby state, and such actions include cooperative actions.
- the third character image 54C also executes the same cooperation action.
- The cooperation action may be executed so that both the second character image 54B and the first character image 54A are targeted at the same time, but as an example, it is executed so as to correspond individually to one of the second character image 54B and the first character image 54A.
- When the cooperation action is executed, the computer determines the execution of that action and causes the second character image 54B to execute a response action for responding to the cooperation action. Then, when the response action is executed by the second character image 54B, a cooperative state is formed between the third character image 54C and the second character image 54B.
- the third character image 54C and the second character image 54B function as one character of the present invention and another character, respectively. Further, the cooperation action and the response action function as the action of the present invention and the corresponding action, respectively.
- The cooperative state may also be formed as appropriate with the first character image 54A, in the same way as with the second character image 54B. The cooperative state, when formed with one of the second character image 54B and the first character image 54A (a part of the performance unit), may be treated as formed for the entire performance unit, but as an example, it is formed individually for each character. Therefore, a cooperation action by the third character image 54C toward the first character image 54A, and a response action by the first character image 54A in response to it, are similarly executed between the third character image 54C and the first character image 54A.
- When the third character image 54C forms a cooperative state with both the second character image 54B and the first character image 54A (in other words, with the entire performance unit), the entire performance unit is in the cooperative state.
- The cooperation action may also be executed by a character other than the user character, such as the second character image 54B, and the cooperative state may be formed when the user returns a response action to such a cooperation action.
- The cooperation action may be executed by an appropriate character, but as an example, it is executed by the user character as shown in the example of FIG. 6.
- The start condition may include various requirements and may be satisfied as appropriate, but as an example, its requirements include the formation of a cooperative state across the entire performance unit and the execution of a start action in that cooperative state. Therefore, the start condition is satisfied when the third character image 54C (user) executes the start action while the entire performance unit is in the cooperative state. When the start condition is satisfied, the performance performance is started.
- the initiation action serves as a special action of the present invention.
- The performance performance may be started as appropriate when the start condition is satisfied.
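The start condition above can be sketched as a simple check; the dict-based representation of per-character cooperative states and the function name are illustrative assumptions, not part of this specification.

```python
def start_condition_met(cooperative, start_action_executed):
    """Start condition: a cooperative state formed across the entire
    performance unit AND the start action executed in that state.

    `cooperative` maps each non-user character to whether a cooperative
    state has been formed with it."""
    return all(cooperative.values()) and start_action_executed
```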
- The performance performance corresponds to play of the music game, and since play of the music game is realized by guiding each execution time (displaying the instruction objects 51) in accordance with the rhythm of the music, it includes the reproduction of the music. Therefore, the reproduction of the music for play may be started immediately after the start condition is satisfied, that is, immediately after the start action.
- Alternatively, the performance performance may be started as appropriate; as an example, a start effect is first executed when the start condition is satisfied. That is, the performance performance includes a part for the start effect and a performance part in which each character image 54 actually executes the performance operation, and the start effect is started first when the start condition is satisfied.
- The start effect may be realized as appropriate, but as an example, it is realized so as to function as a countdown to the reproduction of the music. Therefore, the start effect functions as a preparation part for matching the timing to the reproduction of the music.
- The countdown function may be realized as appropriate, but as an example, it is realized by an effect in which a spotlight is shone on each character image 54 in order at regular intervals. That is, after the start action, the effect of shining a spotlight on each character image 54 in order is started first, and after the spotlight has been shone on all of them, the reproduction of the music is started. Then, as the music is played, the display of the instruction objects 51 is started, and performance operations (play actions) for the instruction objects 51 are also started. That is, the actual performance operations (performance part) by each character image 54 are started.
- The part for the start effect and the performance part function as the preparation part and the performance part of the present invention, respectively.
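The countdown by the start effect, in which a spotlight is shone on each character image 54 in turn at regular intervals before the music starts, could be scheduled as in the following sketch; the interval value and the function name are assumptions.

```python
def spotlight_schedule(characters, interval, start_time=0.0):
    """Return the time at which each character is spotlighted in turn,
    plus the time at which the music reproduction (performance part)
    starts; the regular interval is an assumed parameter."""
    times = {c: start_time + i * interval for i, c in enumerate(characters)}
    music_start = start_time + len(characters) * interval
    return times, music_start
```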
- the cooperative action may be an appropriate action.
- For example, an action in which the drummer lightly taps a cymbal or the like several times with a constant rhythm, as is done at the start of a performance by an actual band (performance unit), may be adopted as the cooperation action.
- a voice call (vocalization) in which the user character calls another character to start may be adopted as the cooperative action.
- In this way, various actions may be adopted as cooperation actions as appropriate, but as an example, an action of directing the line of sight to the target character is adopted. The action of directing the line of sight may be determined as appropriate; for example, when only one character image 54 is included in the visual field range IA, as in the example of FIG. 5, it may be determined that the line of sight is directed to that character image 54.
- the visual field range IA may be used to determine the movement of directing the line of sight.
- The action of directing the line of sight may be determined as appropriate in this way, but as an example, a visual field range is set in a part of the visual field range IA, and the presence or absence of the action of directing the line of sight is determined based on that visual field range. Therefore, a visual field range for determining the user's line of sight (the user character's line of sight) is set on the game screen 50 (or in the virtual three-dimensional space).
- FIG. 7 is an explanatory diagram for explaining the visual field range set on the game screen 50.
- The example of FIG. 7 shows the visual field range set on the game screen 50 of the example of FIG. 5.
- the visual field range IR includes the central visual field range IR1 and the peripheral visual field range IR2.
- The visual field range IR may be formed at an appropriate position on the game screen 50, and the position may be variable. Further, when the position of the visual field range IR is variably set on the game screen 50, the position may be set as appropriate based on various detection results, such as eye tracking.
- the field of view IR may be appropriately set on the game screen 50, but as an example, it is fixedly formed near the center of the game screen 50 (field of view).
- The visual field range IR may be visualized, but as an example it is set to be invisible. That is, in the example of FIG. 7, the central visual field range IR1 is shown with a dot pattern and the peripheral visual field range IR2 with right-slanting diagonal lines for convenience, but the visual field range IR is not displayed on the actual game screen 50, as in the example of FIG. 5.
- the central visual field range IR1 is a range used for discriminating the motion of directing the line of sight.
- The central visual field range IR1 may be formed in an appropriate shape, but as an example, it is formed as a circle of a predetermined size. Further, the determination of the action of directing the line of sight may be realized as appropriate using the central visual field range IR1; for example, it may be determined that the line of sight is directed when a main part such as the head is included. As an example, however, when a certain ratio or more of a character image 54 is included in the central visual field range IR1, it is determined that the line of sight is directed to that character image 54.
- Such a certain ratio may be set as appropriate, but as an example, an area of half or more of the target character image 54 is used.
- In the example of FIG. 7, half or more of the second character image 54B is in the central visual field range IR1. In this case, it is determined that the user character (third character image 54C) is looking at the second character image 54B.
- the peripheral visual field range IR2 is a region formed around the central visual field range IR1 so as to include it.
- The peripheral visual field range IR2 may be formed in an appropriate shape, but as an example, it is formed as an ellipse of a predetermined size including the central visual field range IR1 near its center.
- Peripheral visual field range IR2 may be appropriately used or omitted, but is used as an example in preparation for determining a cooperative action.
- Specifically, when a character image 54 enters the peripheral visual field range IR2, the computer determines that the line of sight may be about to be directed to that character image 54, and starts determining whether or not the central visual field range IR1 includes the character image 54.
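The two-stage gaze determination described above, a peripheral check that merely arms the central-range check and a central check against the half-or-more area ratio, can be sketched as follows; the area-based inputs and the return labels are an assumed representation.

```python
def gaze_state(char_area_in_central, char_area_total, in_peripheral,
               ratio=0.5):
    """Classify the user's gaze at one character image.

    Returns "gazing" when at least `ratio` (half, as in the example) of
    the character's on-screen area lies inside the central visual field
    range IR1, "candidate" when the character is merely inside the
    peripheral visual field range IR2 (so central-range checking starts),
    and "none" otherwise."""
    if char_area_total > 0 and char_area_in_central / char_area_total >= ratio:
        return "gazing"
    if in_peripheral:
        return "candidate"
    return "none"
```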
- Such a visual field range IR is set on the game screen 50, and through it, it is determined whether or not a cooperation action has been executed, that is, whether or not the user's character image 54 is looking at another character image 54.
- FIG. 8 is an explanatory diagram for explaining an example of the response action.
- As the response action, various actions may be adopted as appropriate, either in accordance with the cooperation action or independently of it; for example, an action in which each character lightly plays his or her own instrument, or various actions indicating recognition of the cooperation action, such as a thumb-up, may be adopted. As an example of this kind of action, an action of directing the line of sight is used. That is, after the user character executes an action of directing the line of sight to another character as a cooperation action, the other character executes the same action of directing the line of sight toward the user character as a response action.
- the example of FIG. 8 schematically shows an example of another character image 54 that executes such an action of directing the line of sight as a response action. Further, in the example of FIG. 8, (A) shows the character image 54 before the response action, and (B) shows the character image 54 at the time of the response action.
- As shown in FIG. 8(A), before the response action, the other character image 54 (a character controlled by the computer) does not face the front side, that is, the direction of the character image 54 (user) that directed its line of sight as the cooperation action. Before the response action, the other character image 54 may face any appropriate direction, for example even with its back turned, but in the example of FIG. 8 its head faces to the left and its line of sight likewise faces to the left. On the other hand, as shown in FIG. 8(B), at the time of the response action the other character image 54 faces the front side, and its line of sight likewise faces the front.
- the line of sight is returned to the user in response to the action of directing the line of sight.
- The other character image 54 performs this operation of directing its line of sight toward the user even when it has its back turned toward the user (front side), that is, even when it faces the direction in which its back is turned. In other words, regardless of its orientation before the response action, the other character image 54 executes an operation of directing its line of sight toward the user as the response action.
- However, the movements of the other character images 54 and the like may be controlled so that they do not turn their backs toward the user, so that the user can easily direct his or her line of sight to them.
- The direction of the line of sight of the other character image 54 may be determined as appropriate, but as an example, the other character image 54 has the same visual field range IR as the user character, and it is determined whether or not the user character is within that visual field range IR. Specifically, it is determined according to whether or not a certain ratio or more of the user character is included in the central visual field range IR1 of that visual field range IR. Further, such an action of directing the line of sight (an action of returning the line of sight) may be executed at an appropriate timing, for example with a time difference from the cooperation action. That is, the operation of returning the line of sight in response to the action of directing the line of sight may be executed after the user has withdrawn his or her line of sight.
- the operation of returning the line of sight in this way may be executed at an appropriate timing, but as an example, it is executed while the user is turning the line of sight so that eye contact by both parties is realized.
- If the user removes his or her line of sight before the other character image 54 starts the action of returning the line of sight, the response action is incomplete (in this case, the cooperation action must be executed again to obtain the response action).
- Once the other character image 54 has started the action of returning the line of sight, the user is permitted to remove his or her line of sight.
- In this way, the character image 54 targeted by the cooperation action (the one to which the line of sight is directed) executes such a response action.
- FIG. 9 is an explanatory diagram for explaining the flow of changes in the movement of another character image 54 whose line of sight is directed as a cooperative action.
- An appropriate character image 54, such as the first character image 54A in the example of FIG. 3, may serve as this other character image 54, but the example of FIG. 9 shows the case where the second character image 54B of the example of FIG. 3 serves this role.
- As shown in FIG. 9, the second character image 54B first forms a standby state on the game screen 50 as described above, and executes a standby operation in that standby (idle) state (S1). Further, the second character image 54B (computer) determines whether or not the user character, that is, the third character image 54C, has executed a cooperation action in the standby state (S2).
- Specifically, the second character image 54B determines whether or not the third character image 54C has directed its line of sight toward it (whether or not a certain ratio or more of itself is included in the user's central visual field range IR1). When the third character image 54C has not directed its line of sight toward it (a certain ratio or more of itself is not included in the user's central visual field range IR1), the second character image 54B continues in the standby state.
- On the other hand, when the third character image 54C has directed its line of sight toward it, the second character image 54B executes a turning motion toward the third character image 54C (response action) (S3).
- The response action may be configured as appropriate; it may consist only of the turning motion (the motion of directing the line of sight toward the third character image 54C), but as an example it also includes a response gesture.
- the response gesture is an action for informing the user of the execution of the response action. That is, the second character image 54B executes a response gesture as one of the response actions following the turning motion (S4).
- As the response gesture, various actions may be adopted, such as waving, making a ring with the thumb and index finger (OK sign), various head movements such as nodding, or vocalization, but as an example, a thumb-up (raising the thumb) operation is adopted. That is, the second character image 54B executes a thumb-up operation following, or at the same time as, the turning motion. Then, the second character image 54B completes the response action by executing the turning motion and the response gesture, and forms the cooperative state.
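The S1 to S4 flow of FIG. 9 amounts to a small state machine for the computer-controlled character; the sketch below models it with simplified state names and a gaze-ratio check, all of which are assumptions for illustration.

```python
class PartnerCharacter:
    """Minimal sketch of the S1-S4 flow for a computer-controlled
    character (e.g. the second character image 54B)."""

    def __init__(self):
        self.state = "standby"          # S1: standby (idle) operation

    def update(self, user_gaze_ratio, gaze_threshold=0.5):
        if self.state == "standby":
            # S2: has the user character directed its line of sight here?
            if user_gaze_ratio >= gaze_threshold:
                self.state = "turning"  # S3: turn toward the user character
        elif self.state == "turning":
            self.state = "gesture"      # S4: thumb-up response gesture
        elif self.state == "gesture":
            self.state = "cooperative"  # response action complete
        return self.state
```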
- Alternatively, the flow up to the formation of the cooperative state may be any appropriate flow. For example, the user (user character) may perform an appropriate operation such as a thumb-up operation while directing the line of sight to another character and having that character return the line of sight, and the cooperative state may be formed when the other character performs a similar appropriate operation such as a thumb-up operation in response.
- Alternatively, some of these operations may be omitted as appropriate. That is, some of the various actions executed mutually in order to form the cooperative state may function as appropriate as the cooperation action and the response action.
- The same applies to the start condition. For example, the start condition may be satisfied when a cooperative state has been formed between the user character and all of the other characters; that is, the start condition does not have to include the start action among its requirements.
- In this case, the start effect may be started as appropriate; for example, it is executed immediately after the cooperative state is formed with the last remaining other character.
- Further, the operation stick OS may be provided as appropriate with various buttons for operations using a part of the hand, such as a thumb-up operation, in order to distinguish them from operations on the stick image 56.
- Alternatively, a separate camera for photographing the entire user (or a main part of the user) may be provided, and such a camera may detect appropriate operations, including operations of various parts such as a thumb-up operation.
- The second character image 54B may execute an appropriate operation in the cooperative state (while waiting for a cooperative state to be formed with the first character image 54A), but as an example, it executes a standby operation in the same manner as in the standby state (S5). Further, it is determined whether or not the start action is executed by the user while in the cooperative state with the second character image 54B and the like (during the standby operation) (S6). Various actions may function as such a start action as appropriate, but as an example, a thumb-up operation functions as the start action. That is, it is determined whether or not the thumb-up operation is executed as the start action by the user in the cooperative state with the second character image 54B and the like. If the thumb-up operation, that is, the start action, is not executed within a predetermined time from the formation of the cooperative state, the second character image 54B cancels the cooperative state and returns to the standby state.
- On the other hand, when the thumb-up operation, that is, the start action, is executed within the predetermined time from the formation of the cooperative state, the second character image 54B further executes a response gesture to the start action (S7).
- Such a response gesture may be omitted, but it is executed as an example. Further, this response gesture may be different from the response gesture as the response action, but as an example it is configured in the same way. That is, the same thumb-up operation is executed as the response gesture in response to the start action by the third character image 54C.
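The S5 to S7 flow, keeping the standby operation in the cooperative state, cancelling it on timeout, and responding to a timely start action with the response gesture, can be sketched as below; the timeout value and the return labels are assumptions.

```python
def cooperative_state_update(elapsed, start_action, timeout=5.0):
    """S5-S7 sketch for a computer-controlled character in the
    cooperative state.

    A start action within the (assumed) timeout triggers the response
    gesture S7; past the timeout the cooperative state is cancelled and
    the character returns to standby; otherwise it keeps its standby
    operation (S5/S6)."""
    if start_action and elapsed <= timeout:
        return "response_gesture"       # S7
    if elapsed > timeout:
        return "standby"                # cooperative state cancelled
    return "waiting"                    # S5/S6: keep standby operation
```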
- the operation of the first character image 54A changes in the same flow, and a cooperative state or the like is formed with the third character image 54C.
- The requirements of the start condition may consist only of the start action executed in the cooperative state with all characters other than the character corresponding to the user, or may include other requirements. As such another requirement, the response gestures (S7) by all the character images 54 other than the third character image 54C may function.
- Such requirements of the start condition may be set as appropriate, but as an example, the performance is started after the response gesture (S7) is executed by both the first character image 54A and the second character image 54B. More specifically, the start condition is satisfied by the response gestures (S7) from both the first character image 54A and the second character image 54B, and the start effect is started. Then, after the countdown by the start effect, the reproduction of the music, that is, the performance part, is started.
- The movements of the other characters change in such a flow, and the performance is started.
- In this example, the response gesture (S7) is automatically executed along with the execution of the start action. Therefore, in this case, the determination of the presence or absence of the start action may function in the same manner as the determination of whether or not the start condition is satisfied.
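The linked-state flow above (S5: stand by while linked, S6: watch for a start action within a time limit, S7: respond) can be sketched as a small state machine. This is a hedged illustration only; the class, the 5-second timeout, and the method names are assumptions, not the patent's implementation.

```python
LINK_TIMEOUT = 5.0  # assumed time allowed between linking and the start action

class BandCharacter:
    """One non-user character that links, waits, and responds (S5-S7)."""

    def __init__(self, name):
        self.name = name
        self.linked = False
        self.linked_at = None
        self.responded = False

    def form_link(self, now):
        # A cooperation action succeeded: enter the linked (standby) state.
        self.linked = True
        self.linked_at = now
        self.responded = False

    def tick(self, now):
        # S5/S6: if no start action arrives in time, fall back to standby.
        if self.linked and now - self.linked_at > LINK_TIMEOUT:
            self.linked = False

    def on_start_action(self, now):
        # S7: a timely thumb-up is answered with a response gesture.
        if self.linked and now - self.linked_at <= LINK_TIMEOUT:
            self.responded = True

def start_condition_met(band):
    # The performance may begin once every linked character has responded.
    return all(c.linked and c.responded for c in band)

band = [BandCharacter("54A"), BandCharacter("54B")]
for c in band:
    c.form_link(now=0.0)
for c in band:
    c.on_start_action(now=2.0)   # user thumbs up 2 s after linking
ready = start_condition_met(band)

late = BandCharacter("54A")
late.form_link(now=0.0)
late.tick(now=6.0)               # timeout: back to the standby state
```

Keeping the timeout per character mirrors the text's behavior of each character independently canceling its linked state and returning to standby.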
- The other character images 54 may execute appropriate performance operations; for example, predetermined performance motions prepared in advance may be used. One motion or a plurality of motions may be prepared as such set performance motions. Alternatively, a performance operation corresponding to an actual past play may be executed.
- The performance data OD is data for causing another character to reproduce the actual result of such a user's performance operation. For example, when the user plays the music game through the second character image 54B, the play content of that play (the actual result of the guitar performance via the second character image 54B) is recorded in the performance data OD and managed. The same applies to the actual results when the user plays through the first character image 54A or the like.
- FIG. 10 is a diagram showing an example of the configuration of the performance data OD. As shown in FIG. 10, the performance data OD includes a moving image data unit OD1 and an information management unit OD2.
- the moving image data unit OD1 is a part configured as moving image data for causing another character image 54 to perform a performance operation.
- Such a moving image data unit may be appropriately configured as various types of moving image data, but as an example, since the operation of each character image 54 is configured as motion capture data, the moving image data unit is also configured as motion capture data.
- This motion capture data may be generated as appropriate; for example, the user's motion may be detected by a camera that captures the user's entire body and converted into data, or the movement may be detected from markers attached to the user's body and converted into data.
- In this example, the motion capture data (moving image data unit OD1) is generated by recording, over time, the movement of the HMD type game machine 4 (that is, the movement of the head) and the movement of the operation stick OS, and converting these into motion.
- Such a moving image data unit is prepared for each performance operation.
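As a rough illustration of how such a moving image data unit might be produced, the sketch below samples a head pose (standing in for the HMD type game machine 4) and a stick pose over time into a list of frames. The sampling rate, field names, and the stand-in sensor are all assumptions.

```python
def record_motion(sample_pose, duration_s, rate_hz=60):
    """Record (t, head, stick) frames over duration_s seconds."""
    frames = []
    for i in range(int(duration_s * rate_hz)):
        t = i / rate_hz
        head, stick = sample_pose(t)
        frames.append({"t": t, "head": head, "stick": stick})
    return frames

def fake_sensor(t):
    # Stand-in for the real devices: head bobs slightly, stick sweeps.
    head = (0.0, 1.6 + 0.05 * (t % 1.0), 0.0)   # (x, y, z) in metres
    stick = (0.3 * t, 1.2, 0.2)
    return head, stick

clip = record_motion(fake_sensor, duration_s=2.0)  # one moving image data unit
```

A real implementation would pull poses from the HMD and stick drivers instead of `fake_sensor`, and would likely retarget the two poses onto a full character skeleton afterwards.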
- the information management unit OD2 is a part in which information for managing each video data unit is described.
- The information management unit OD2 may appropriately include various information necessary for managing each moving image data unit, but in the example of FIG. 10, the information management unit OD2 includes performance records ODR, each of which manages the information related to one moving image data unit (performance operation).
- the performance record ODR includes information of "performance ID”, “character”, “music”, “user ID”, and “date and time” for such management.
- the performance record ODR records such information so as to be related to each other. Further, in this example, the performance data OD functions as the candidate data of the present invention.
- the "performance ID” is information indicating a unique performance ID for each performance operation in order to manage each performance operation (video data unit).
- the "character” is information for identifying a character corresponding to the performance operation. As such information, appropriate information that can identify each character may be used, but as an example, information of a character ID unique to each character is used. Specifically, for example, in the case of a performance operation corresponding to a performance played through the third character image 54C, the information of the character ID corresponding to the third character image 54C is described in the "character". The same applies to the case of the first character image 54A and the like. In addition, each character is associated with a musical instrument to be played. Therefore, the "character” information also functions as information on the musical instrument to be played.
- the "musical instrument” (for example, the ID of the musical instrument) may be described instead of the "character”.
- the "musical piece” is information indicating the musical piece used during the performance operation. Appropriate information that can identify each music may be described in the "music", but as an example, information of a unique music ID is described for each music.
- the "user ID” is information indicating a user ID unique to each user in order to identify each user.
- The performance operations that can be used may be limited, for each user, to that user's own play records, but as an example, the play records of other users can also be used.
- the "date and time” is information indicating the date and time of the play corresponding to the actual performance of each performance operation.
- The performance data OD is not limited to the above information; for example, information necessary for realizing the performance may be managed as appropriate. Alternatively, some of this information may be omitted as appropriate.
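The record layout of FIG. 10 can be mirrored in a small data structure: one motion clip (OD1) per performance operation plus one management record (ODR). The dataclass and ID formats below are illustrative assumptions, not the patent's storage format.

```python
from dataclasses import dataclass

@dataclass
class PerformanceRecord:      # one entry of the information management unit OD2
    performance_id: str       # unique ID per performance operation
    character_id: str         # also identifies the instrument played
    music_id: str
    user_id: str
    date_time: str            # date and time of the underlying play

performance_data = {
    # moving image data units OD1, keyed by performance ID (frames omitted)
    "clips": {"perf-001": [], "perf-002": []},
    "records": [
        PerformanceRecord("perf-001", "char-54B", "song-12", "user-7",
                          "2020-12-25 10:00"),
        PerformanceRecord("perf-002", "char-54A", "song-12", "user-9",
                          "2020-12-26 18:30"),
    ],
}

def candidates_for(data, character_id, music_id):
    """Performance operations usable for one character and one music piece."""
    return [r.performance_id for r in data["records"]
            if r.character_id == character_id and r.music_id == music_id]
```

Because "character" doubles as the instrument identifier in the text above, filtering by character and music is enough to build the candidate list used later at the performance selection opportunity.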
- the performance selection process is a process for providing a performance selection opportunity for selecting a performance operation to be performed by a character other than the user character. For example, when the user plays a music game via the third character image 54C, a performance operation to be executed by the first character image 54A and the second character image 54B is selected at the performance selection opportunity.
- the performance data OD includes a plurality of moving image data units OD1 (plural performance operations) for the same character. In this case, the performance operation to be executed by another character is selected from the plurality of performance operation candidates at the performance selection opportunity.
- FIG. 11 shows an example of the procedure for providing such a performance selection opportunity.
- The progress control unit 43 starts the performance selection process of FIG. 11 every time the user is provided with a character selection opportunity for selecting the character to be used for playing the music game, and first acquires the selection result of that opportunity (step S101).
- the progress control unit 43 provides a performance selection opportunity for selecting a performance operation to be executed by a character other than the character selected in the character selection opportunity (step S102).
- Specifically, the progress control unit 43 refers to the performance data OD and provides the performance selection opportunity so that, for each target character, one performance operation is selected from a plurality of performance operation candidates (which may include the performance records of other users and a plurality of predetermined motions prepared in advance). Such a performance selection opportunity may be realized as appropriate, but as an example, it is realized through a selection screen (not shown) for the performance selection opportunity that includes the information necessary for selecting a performance operation.
- Such information may appropriately include, for example, the user corresponding to the actual play of each performance operation (whose performance operation it is) or information on that user's various achievements such as level and score.
- the choices in the performance selection opportunity are presented corresponding to the music and the musical instrument selected by the user.
- Specifically, performance data OD generated for the same music as that selected by the user but for an instrument different from the one selected by the user is presented to the user as options in the performance selection opportunity.
- the limiting condition of the options may appropriately include, for example, the period during which the performance data OD is created.
- The progress control unit 43 determines the performance operation of each other character based on the selection result in the performance selection opportunity (step S103). Then, after this determination, the progress control unit 43 ends the current performance selection process. As a result, a performance selection opportunity for selecting the performance operations to be executed by the characters other than the user character is realized. Further, in the performance selection opportunity, a plurality of performance operations, including those based on the play records of other users, are presented as candidates. Therefore, through such a performance selection opportunity, performance operations corresponding to actual past plays are executed by the other characters in the performance.
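Steps S101 to S103 amount to: take the user's character choice, then choose one candidate operation for each remaining character. A minimal sketch follows; the `pick` callback stands in for the on-screen selection, and all names are assumptions.

```python
def performance_selection(all_characters, user_choice, candidates_by_char, pick):
    """Return {character: chosen performance operation} for non-user characters."""
    chosen = {}
    for c in all_characters:
        if c == user_choice:              # S101: the user's own character
            continue
        options = candidates_by_char.get(c) or ["default-motion"]
        chosen[c] = pick(c, options)      # S102/S103: selection opportunity
    return chosen

candidates = {"54A": ["perf-001", "perf-007"], "54B": ["perf-003"]}
plan = performance_selection(["54A", "54B", "54C"], "54C", candidates,
                             pick=lambda c, opts: opts[0])
```

Falling back to a `"default-motion"` when no recorded play exists matches the text's note that predetermined motions prepared in advance may also serve as candidates.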
- The cooperation response process is a process for forming the linked state between the characters.
- Each character may be operated by another user; in this case, whether or not the linked state is formed is determined according to whether or not the response action is executed by that other user, and the linked state is formed when the response action is executed.
- the example shows the case where the operation of each character is controlled by a computer (progress control unit 43).
- The progress control unit 43 starts the cooperation response process of FIG. 12 every time the user character (for example, the third character image 54C in the example of FIG. 3) is operated by the user in a situation where the peripheral visual field range IR2 includes another character, and first determines whether or not the user's operation corresponds to the cooperation action (step S201).
- Specifically, the progress control unit 43 determines whether or not the user's operation corresponds to an operation that includes another character in the central visual field range IR1 at a certain ratio or more (an operation of directing the line of sight). If it does not, that is, if the user's operation does not correspond to the cooperation action, the progress control unit 43 skips the subsequent processing and ends the current cooperation response process.
- On the other hand, when the user's operation corresponds to the cooperation action, the progress control unit 43 causes the character targeted by the cooperation action, that is, the character included in the central visual field range IR1 at a certain ratio or more, to execute the response action (step S202).
- As described above, the response action includes the action of returning the line of sight (directing the line of sight to the user character) and the response gesture (thumb-up).
- Therefore, the movement of the target character is controlled so as to execute the action of directing the line of sight to the user's character and the thumb-up movement.
- Subsequently, the progress control unit 43 forms a linked state between the user character and the character that executed the response action in step S202 (step S203).
- Such formation of the linked state may be realized as appropriate; for example, when the linked state is expressed by a dedicated motion, it may be realized by executing such a motion, but as an example, it is realized by updating a flag for managing the presence or absence of the linked state. Specifically, a parameter (flag) for managing the presence or absence of the linked state is set for each character, and the progress control unit 43 realizes the formation of the linked state by updating this flag to the state indicating that the linked state is formed. Then, after the linked state is formed, the progress control unit 43 ends the current cooperation response process.
- In this way, the actions of the other characters are controlled so as to execute the response action according to the cooperation action. That is, the operation of the other character is controlled so that the linked state is actively formed between the user character and the other character. Then, when the linked state is formed, that state is managed by the flag.
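The cooperation response process (S201-S203) can be sketched as a gaze test plus a flag update: a gaze that brings another character sufficiently inside the central visual field IR1 counts as the cooperation action, the target responds, and a per-character flag records the linked state. The angles and threshold below are assumptions for illustration.

```python
CENTRAL_FOV_DEG = 30.0    # assumed half-angle of the central visual field IR1

def in_central_field(gaze_deg, target_deg, fov=CENTRAL_FOV_DEG):
    """True if the target bearing lies within the central visual field."""
    diff = (target_deg - gaze_deg + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    return abs(diff) <= fov

def cooperation_response(gaze_deg, characters):
    """characters: {name: bearing in degrees}. Returns linked-state flags."""
    flags = {}
    for name, bearing in characters.items():
        if in_central_field(gaze_deg, bearing):   # S201: cooperation action?
            # S202: the target returns the gaze and thumbs up (not modelled);
            # S203: the linked-state flag is set.
            flags[name] = True
        else:
            flags[name] = False
    return flags

flags = cooperation_response(gaze_deg=0.0,
                             characters={"54A": 10.0, "54B": 120.0})
```

A production version would test what fraction of the character's bounding volume falls inside IR1 rather than a single bearing, per the "certain ratio or more" wording above.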
- The music start process is a process for starting the performance (guiding the execution times) triggered by the start condition.
- The example of FIG. 13 shows the music start process executed when the other characters operate in the flow described above.
- The progress control unit 43 starts the music start process of FIG. 13 every time the start action is executed by the user and every time a predetermined time elapses from the formation of the linked state, and first determines whether or not the start condition is satisfied (step S301). Specifically, as described above, the start condition is satisfied when the start action (for example, the thumb-up operation) is executed and the response gesture in response to the start action is executed.
- The progress control unit 43 may determine that the start condition is satisfied when all the other characters have executed the response gesture; however, in this example, the response gesture is automatically executed by the computer (progress control unit 43) along with the execution of the start action. Therefore, as an example, it is determined that the start condition is satisfied when the start action is executed in a situation where the linked state is formed between the user character and all the other characters.
- When the start condition is not satisfied, the progress control unit 43 determines whether or not the release condition for canceling the linked state is satisfied (step S302).
- Such a release condition may be set as appropriate, but as an example, it is satisfied when a predetermined time has elapsed from the formation of the linked state. Therefore, the progress control unit 43 determines whether or not the predetermined time has elapsed from the formation of the linked state.
- When the release condition is not satisfied, that is, when the predetermined time has not elapsed from the formation of the linked state, the progress control unit 43 maintains the linked state (step S303). That is, the progress control unit 43 keeps the state of the flag indicating the linked state as it is.
- On the other hand, when the release condition is satisfied, the progress control unit 43 cancels the linked state (step S304). Specifically, the progress control unit 43 changes the flag indicating the linked state so as to indicate the non-linked state. Then, after maintaining or canceling the linked state, the progress control unit 43 ends the current music start process.
- On the other hand, when the start condition is satisfied in step S301, that is, when the start action is executed in a situation where the linked state is formed between the user character and all the other characters, the progress control unit 43 starts the start effect (step S305). Specifically, the progress control unit 43 displays the start effect on the game screen 50. Subsequently, after the start effect, the progress control unit 43 starts the reproduction of the music, in other words, the performance part (step S306). Then, after starting the reproduction of the music, the progress control unit 43 ends the current music start process.
- As described above, the performance is started when the start condition, which includes as a requirement the formation of the linked state between the user character and the other characters, is satisfied.
- Specifically, the start condition is satisfied when the start action is executed in a situation where the linked state is formed between the user character and all the other characters, and the start effect of the performance, that is, the preparation part, is started. Then, after the start effect, the reproduction of the music, that is, the performance part, is started.
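The music start process of FIG. 13 can be sketched as follows: the start action launches the preparation part (start effect) and then the performance part only when every character is linked; otherwise each link is maintained or released on timeout. The class, timing values, and step labels in the comments are assumptions following the flow described above.

```python
LINK_TIMEOUT = 5.0   # assumed release-condition timeout

class MusicStart:
    def __init__(self, characters):
        self.linked = {c: False for c in characters}
        self.linked_since = {}
        self.phase = "waiting"      # -> "start_effect" -> "performance"

    def link(self, character, now):
        self.linked[character] = True
        self.linked_since[character] = now

    def music_start_process(self, start_action, now):
        # S301: start action while linked with ALL other characters?
        if start_action and self.linked and all(self.linked.values()):
            self.phase = "start_effect"             # S305: preparation part
            return
        # S302: otherwise check the release condition per character.
        for c in self.linked:
            if self.linked[c] and now - self.linked_since[c] > LINK_TIMEOUT:
                self.linked[c] = False              # S304: cancel the link
            # else S303: the linked state is simply maintained

    def on_start_effect_finished(self):
        self.phase = "performance"                  # music playback begins

game = MusicStart(["54A", "54B"])
game.link("54A", now=0.0)
game.link("54B", now=1.0)
game.music_start_process(start_action=True, now=2.0)
phase_after_action = game.phase
game.on_start_effect_finished()

stale = MusicStart(["54A"])
stale.link("54A", now=0.0)
stale.music_start_process(start_action=False, now=10.0)  # timed out
```

Splitting the transition into two steps keeps the countdown of the preparation part between the start condition and the start of music playback, as the surrounding text describes.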
- the performance data generation process is a process for generating performance data OD based on the play performance of each user.
- The performance data OD may be generated as appropriate; for example, it may be generated uniformly based on the results of all plays of all users, but as an example, it is generated when the user desires to generate the performance data OD.
- When the user desires to generate the performance data OD, the data management unit 44 starts the performance data generation process of FIG. 14, and first records the user's operation in the performance part, that is, the operation of the user character (step S401).
- the data management unit 44 generates the performance data OD based on the recording result of step S401 (step S402). Specifically, the data management unit 44 generates a moving image data unit OD1 for reproducing the operation of the user character and an information management unit OD2 for recording various information corresponding thereto. Then, after the performance data OD is generated, the data management unit 44 ends the performance data generation process this time. As a result, the performance data OD for reproducing the performance operation corresponding to the play performance of each user is generated. It should be noted that the performance data OD may be uniformly generated, and may be saved when a predetermined condition is satisfied, such as when the user desires or when the best score is calculated.
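Steps S401-S402 reduce to: record the user character's motion during the performance part, then package it as a moving image data unit (OD1) plus its management information (OD2). The function below is a hedged sketch; the structure and field names are assumptions.

```python
def generate_performance_data(frames, character_id, music_id, user_id,
                              played_at):
    """S401 has produced `frames`; S402 packages them as performance data."""
    performance_id = f"perf-{user_id}-{played_at}"
    return {
        "OD1": {"performance_id": performance_id, "frames": frames},
        "OD2": {
            "performance_id": performance_id,
            "character": character_id,   # also identifies the instrument
            "music": music_id,
            "user_id": user_id,
            "date_time": played_at,
        },
    }

od = generate_performance_data(
    frames=[{"t": 0.0, "head": (0, 1.6, 0)}],
    character_id="char-54B", music_id="song-12",
    user_id="user-7", played_at="2020-12-25T10:00",
)
```

Sharing one performance ID between OD1 and OD2 keeps each clip retrievable from its management record, matching the pairing described for FIG. 10.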
- As described above, the linked state is formed by the cooperation action by the user character (for example, the third character image 54C) and the response action by another character (for example, the second character image 54B) in response to that cooperation action. Therefore, it is possible to imitate, in the game, the process in which a plurality of characters get in sync (adjust their breathing) through these cooperation actions and response actions. More specifically, when the operation of directing the line of sight is executed by the user character, the linked state is formed by the operation in which the other character to which the line of sight is directed returns the line of sight, together with the thumb-up operation. This makes it possible to imitate the process of getting in sync through eye contact between the user character and the other characters.
- Further, the start condition that triggers the start of such a performance includes the formation of the linked state as a requirement. That is, the performance is started after the formation of the linked state. For this reason, prior to the start of the performance, the plurality of characters in the game can be made to perform a process similar to the process of getting in sync that is performed in an actual performance by multiple people, more specifically, getting in sync through eye contact. As a result, the sense of presence or unity of the performance can be improved.
- In addition, the linked state can be actively formed through the cooperation action by the user character. Therefore, the time at which the start condition is satisfied can be adjusted to the convenience of the user.
- Further, past users' play records, that is, actual performance results, can be reflected in the movements of characters other than the user character in the performance. As a result, the reality of the performance and, by extension, its sense of presence can be further improved.
- Moreover, by utilizing the actual results of such performance operations, each user can be encouraged to use other characters, or other users can be encouraged to use that user's performance results, so that the use of the game can also be promoted.
- The progress control unit 43 of the HMD type game machine 4 functions as the condition determination means and the progress control means of the present invention by executing the music start process of FIG. 13. Specifically, the progress control unit 43 functions as the condition determination means by executing step S301 of FIG. 13, and functions as the progress control means by executing steps S305 and S306 of FIG. 13. Similarly, the progress control unit 43 of the HMD type game machine 4 functions as the character control means of the present invention by executing step S202 in the cooperation response process of FIG. 12, and functions as the opportunity providing means of the present invention by executing step S102 in the performance selection process of FIG. 11.
- the present invention is not limited to the above-mentioned form, and may be carried out in a form in which appropriate modifications or changes are made.
- the processes of FIGS. 11 to 14 are executed by the HMD type game machine 4.
- the present invention is not limited to such a form.
- all or part of the processes of FIGS. 11 to 14 may be executed by the center server 2.
- For example, when the center server 2 executes all of the processes of FIGS. 11 to 13, the center server 2 alone (which may include a plurality of server devices) may function as the game system of the present invention. Alternatively, a single HMD type game machine 4 may function as the game system of the present invention. That is, in the game system of the present invention, the center server 2 may be omitted as appropriate.
- The game system of the present invention is a game system (4) connected to an input device (OS) for inputting a user's play act and to a display device (47) that displays a game screen (50) so as to include a plurality of characters (54) including a user character (54C) as a character operated through the user's play act, the game system providing a game including a performance executed by the plurality of characters. The game system comprises: a condition determination means (43) for determining whether or not a start condition is satisfied, the start condition including as a requirement the formation of a linked state formed at least between one character and another character when the other character (54B) executes a corresponding action as a predetermined action corresponding to an action executed by at least one character (54C) among the plurality of characters; and a progress control means (43) for controlling the progress of the game so that the performance is started when the start condition is satisfied, while the start of the performance is waited until the start condition is satisfied.
- According to this configuration, the linked state is formed by an action by one character and a corresponding action, corresponding to that action, by another character. Therefore, it is possible to imitate, in the game, the process in which a plurality of characters get in sync through these actions and corresponding actions. Also, while the performance is started when the start condition is satisfied, the start of the performance is waited until the start condition is satisfied.
- The start condition that triggers the start of such a performance includes the formation of the linked state as a requirement. In other words, the performance starts after the formation of the linked state. Therefore, before the start of the performance, it is possible to have the plurality of characters in the game perform a process similar to the process of getting in sync that is performed in a performance by a plurality of people. As a result, the sense of presence or unity of the performance can be improved.
- a plurality of characters may be appropriately controlled as long as they include at least one user character.
- a plurality of characters may be controlled by a plurality of users.
- all characters other than the user character may be controlled by the computer.
- the action that triggers the formation of the cooperative state may be executed by a user character (including a character corresponding to another user) or may be executed by a computer-controlled character.
- For example, when the user character functions as the one character, an aspect may be adopted that includes a character control means (43) for controlling the other character so as to execute the corresponding action in response to the action by the user character. In this case, when the action for forming the linked state is executed, the other characters are controlled so as to execute the corresponding action.
- each character may be appropriately controlled, for example, all may be controlled by a computer, or only characters other than the user character may be controlled by the computer.
- The computer-controlled character may operate as appropriate; for example, it may execute a predetermined motion prepared in advance (including both a fixed motion and a motion that changes depending on various conditions), and one or a plurality of such motions may be prepared. Specifically, for example, as an embodiment in which another character is automatically controlled so as to execute the corresponding action, when the operation of the user character in the performance is controlled through the play act, an embodiment may be adopted that includes an opportunity providing means (43) for providing a selection opportunity for selecting, from among a plurality of operations, the operation that the other character should actually execute in the performance, based on candidate data (OD) described so that the other character is associated with the plurality of operations as candidates for the operation to be executed in the performance.
- Further, if the computer-controlled character has a track record of having been used as a user character in the past, it may execute the operation executed according to the play act of another user (not only the user playing the game this time) in that play.
- Specifically, the game may be provided such that each user selects the user character from the plurality of characters, and the plurality of operations may include, if the other character has a track record of having been selected by each user in the past as the user character, the operation executed by the other character in that track record. In this case, the past play records of users can be reflected in the behavior of characters other than the user character in the performance.
- Various actions may be appropriately adopted as the action for forming the linked state and as the corresponding action.
- various actions that operate each part of the character such as the head, hands, or legs may be appropriately used as an action for forming a cooperative state or a corresponding action.
- an action such as vocalization or eye contact may be appropriately used as an action for forming a cooperative state or a corresponding action.
- For example, the linked state may be formed when the other character included in the visual field range of the one character puts the one character into its own visual field range, such that the action of putting the other character into the visual field range (IR) set for the one character functions as the action, and the action of putting the one character into the visual field range (IR) set for the other character functions as the corresponding action.
- In this case, the user character and the other character can be made to execute an action corresponding to eye contact as an imitation of the process of getting in sync.
- the start condition may include only the formation of a cooperative state as a requirement, or may appropriately include various other requirements. As such other requirements, appropriate conditions such as a user's playing action, playing situation, or game convenience may be used.
- For example, the start condition may further include, as a requirement, a special action to be executed by the one character in the linked state, and may be satisfied when the special action is executed in the linked state.
- the performance may include an appropriate part, for example, not only a part in which a plurality of characters actually execute the performance, but also various other parts as appropriate.
- For example, the performance may include a performance part in which the plurality of characters actually execute the performance, and a preparation part for indicating the start time of the performance part. In this case, the progress control means may control the progress of the game so that the performance part is started after the preparation part by starting the preparation part as the start of the performance when the start condition is satisfied.
- various games may be provided as appropriate.
- various performances by a plurality of characters may be appropriately performed depending on the type of the game.
- a game play in a sports game such as a soccer game or a baseball game may be used as a performance, and a kick-off or a play ball may be started when a start condition is satisfied.
- dance, performance, or the like in a music game may be used as a performance.
- For example, as the game, a music game may be provided in which the user is informed of the execution times of play acts to be executed in accordance with the rhythm of a music piece and the user character executes, along with the play acts, a performance operation of playing the music piece; as the performance, the performance operation by each of the plurality of characters may be executed in such a music game.
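Guiding play acts to the rhythm of a track, as described above, typically means snapping each input to the nearest beat and judging it against a tolerance window. The BPM, window width, and judgment labels below are assumptions for illustration only.

```python
BPM = 120
BEAT_S = 60.0 / BPM             # 0.5 s between beats at 120 BPM
WINDOW_S = 0.10                 # +/- 100 ms around a beat counts as a hit

def nearest_beat(t):
    """Time of the beat closest to t."""
    return round(t / BEAT_S) * BEAT_S

def judge(tap_time):
    """Return 'hit' if the tap lands within the window around a beat."""
    if abs(tap_time - nearest_beat(tap_time)) <= WINDOW_S:
        return "hit"
    return "miss"

results = [judge(t) for t in (0.02, 0.51, 0.80)]
```

A fuller implementation would grade several window widths (e.g. perfect/good/miss) and drive the character's performance operation from the same timeline.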
- The computer program (PG2) of the present invention is configured to cause a computer (41) connected to the input device and the display device to function as each means of the above-described game system.
- The control method of the present invention is a control method for causing a computer incorporated in a game system (4) to execute procedures, the game system being connected to an input device (OS) for inputting a user's play act and to a display device (47) that displays a game screen (50) so as to include a plurality of characters (54) including a user character (54C) as a character operated through the user's play act, and providing a game including a performance executed by the plurality of characters. The method causes the computer to execute: a condition determination procedure for determining whether or not a start condition is satisfied, the start condition including as a requirement the formation of a linked state formed at least between one character and another character when the other character (54B) executes a corresponding action as a predetermined action corresponding to an action executed by at least one character (54C) among the plurality of characters; and a progress control procedure for controlling the progress of the game so that the performance is started upon fulfillment of the start condition when the start condition is satisfied, while the start of the performance is waited until the start condition is satisfied.
- the game system of the present invention can be realized through the computer program of the present invention or the control method.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Business, Economics & Management (AREA)
- Computer Security & Cryptography (AREA)
- General Business, Economics & Management (AREA)
- Theoretical Computer Science (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
2 Center server
4 HMD type game machine (game system)
41 Control unit (computer)
43 Progress control unit (condition determination means, progress control means, character control means, opportunity providing means)
47 Display (display device)
50 Game screen
54 Character image (character)
54B Second character image (other character)
54C Third character image (user character, one character)
IR Visual field range
OS Operation stick (input device)
Claims (10)
- 1. A game system connected to an input device for inputting a user's play actions and to a display device that displays a game screen so as to include a plurality of characters including a user character operated through the user's play actions, the game system providing a game that includes a performance executed by the plurality of characters, the game system comprising: condition determination means for determining whether or not a start condition is satisfied, the start condition including as a requirement the formation of a linked state formed at least between one character and another character when the other character executes a corresponding action as a predetermined action corresponding to an action executed by at least the one character among the plurality of characters; and progress control means for controlling the progress of the game so that the performance is started, triggered by fulfillment of the start condition when the start condition is satisfied, while the start of the performance is waited until the start condition is satisfied.
- 2. The game system according to claim 1, further comprising character control means for controlling the other character so as to execute the corresponding action in response to the action by the user character when the user character functions as the one character.
- 3. The game system according to claim 2, further comprising opportunity provision means for providing, when the movements of the user character in the performance are controlled through the play actions, a selection opportunity for selecting the movement that the other character should actually execute in the performance from among a plurality of movements, based on candidate data described so as to associate the other character with the plurality of movements as candidates for the movements that the other character should execute in the performance.
- 4. The game system according to claim 3, wherein the game is provided so that the user character is selected from the plurality of characters by each user, and the plurality of movements include, when there is a past record of the other character having been used as the user character by a user to play the game, the movements that the other character executed in that record.
- 5. The game system according to any one of claims 1 to 4, wherein the linked state is formed when the other character, included in the visual field range of the one character, brings the one character into the visual field range of the other character, such that the movement of bringing the other character into the visual field range set for the one character functions as the action, and the movement of bringing the one character into the visual field range set for the other character functions as the corresponding action.
- 6. The game system according to any one of claims 1 to 5, wherein the start condition further includes as a requirement a special action to be executed by the one character in the linked state, and is satisfied when the special action is executed in the linked state.
- 7. The game system according to any one of claims 1 to 6, wherein the performance includes a performance part in which the plurality of characters actually execute the performance and a preparation part for indicating the timing of the start of the performance part, and the progress control means controls the progress of the game so that, triggered by fulfillment of the start condition, the preparation part is started as the start of the performance, whereby the performance part is started after the preparation part.
- 8. The game system according to any one of claims 1 to 7, wherein, as the game, a music game is provided that guides the timing at which the user should execute play actions in time with the rhythm of a musical piece and in which, in response to those play actions, the user character executes playing movements of performing the musical piece, and, as the performance, the playing movements by each of the plurality of characters are executed in the music game.
- 9. A computer program configured to cause a computer connected to the input device and the display device to function as each means of the game system according to any one of claims 1 to 8.
- 10. A control method that causes a computer, incorporated in a game system that is connected to an input device for inputting a user's play actions and to a display device that displays a game screen so as to include a plurality of characters including a user character operated through the user's play actions, and that provides a game including a performance executed by the plurality of characters, to execute: a condition determination procedure of determining whether or not a start condition is satisfied, the start condition including as a requirement the formation of a linked state formed at least between one character and another character when the other character executes a corresponding action as a predetermined action corresponding to an action executed by at least the one character among the plurality of characters; and a progress control procedure of controlling the progress of the game so that the performance is started, triggered by fulfillment of the start condition when the start condition is satisfied, while the start of the performance is waited until the start condition is satisfied.
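Claim 5's mutual visual-field requirement — the linked state forming when each character brings the other into its visual field range (IR) — could be expressed, purely as an illustrative sketch with hypothetical names and geometry the application does not specify, as an angular cone test in 2D:

```python
# Hypothetical sketch of claim 5's mutual visual-field check; the
# application does not define the geometry, so a simple 2D cone of
# fov_deg degrees centered on the character's facing angle is assumed.
import math

def in_visual_field(observer_pos, observer_facing_deg, target_pos, fov_deg=90.0):
    """Return True if target_pos lies within the observer's visual field
    range, modeled as a cone of fov_deg centered on observer_facing_deg."""
    dx = target_pos[0] - observer_pos[0]
    dy = target_pos[1] - observer_pos[1]
    angle_to_target = math.degrees(math.atan2(dy, dx))
    # Signed angular difference normalized to (-180, 180].
    diff = (angle_to_target - observer_facing_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0

def linked_state(pos_a, facing_a, pos_b, facing_b, fov_deg=90.0):
    # The linked state forms when the other character, already inside the
    # one character's visual field, brings the one character into its own
    # visual field as well (mutual visibility).
    return (in_visual_field(pos_a, facing_a, pos_b, fov_deg)
            and in_visual_field(pos_b, facing_b, pos_a, fov_deg))
```

For example, two characters facing each other along the x-axis form the linked state, while the check fails if the second character faces away.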
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202180087065.2A CN116635119A (zh) | 2020-12-25 | 2021-11-24 | 游戏系统、该游戏系统中使用的计算机程序以及控制方法 |
KR1020237019791A KR20230104959A (ko) | 2020-12-25 | 2021-11-24 | 게임 시스템, 거기에 사용하는 컴퓨터 프로그램 및 제어 방법 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-217979 | 2020-12-25 | ||
JP2020217979A JP2022102913A (ja) | 2020-12-25 | 2020-12-25 | ゲームシステム、それに用いるコンピュータプログラム、及び制御方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022137958A1 true WO2022137958A1 (ja) | 2022-06-30 |
Family
ID=82157634
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/043007 WO2022137958A1 (ja) | 2020-12-25 | 2021-11-24 | ゲームシステム、それに用いるコンピュータプログラム、及び制御方法 |
Country Status (4)
Country | Link |
---|---|
JP (1) | JP2022102913A (ja) |
KR (1) | KR20230104959A (ja) |
CN (1) | CN116635119A (ja) |
WO (1) | WO2022137958A1 (ja) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002320771A (ja) * | 2001-04-26 | 2002-11-05 | Square Co Ltd | ビデオゲーム装置およびその制御方法、ならびにビデオゲームのプログラムおよびそのプログラムを記録したコンピュータ読取り可能な記録媒体。 |
JP2010000257A (ja) * | 2008-06-20 | 2010-01-07 | Namco Bandai Games Inc | ゲームコントローラケース、ゲームコントローラケースセット、プログラム、及び情報記憶媒体 |
US20100029386A1 (en) * | 2007-06-14 | 2010-02-04 | Harmonix Music Systems, Inc. | Systems and methods for asynchronous band interaction in a rhythm action game |
JP2010200792A (ja) * | 2009-02-27 | 2010-09-16 | Square Enix Co Ltd | ビデオゲーム処理装置、ビデオゲーム処理方法、およびビデオゲーム処理プログラム |
JP2017055851A (ja) * | 2015-09-14 | 2017-03-23 | 株式会社コーエーテクモゲームス | 情報処理装置、表示制御方法、及び表示制御プログラム |
JP2017119033A (ja) * | 2015-12-29 | 2017-07-06 | 株式会社バンダイナムコエンターテインメント | ゲーム装置及びプログラム |
JP2019033996A (ja) * | 2017-08-18 | 2019-03-07 | 株式会社コナミデジタルエンタテインメント | ゲーム装置、ゲーム装置のプログラム、及び、ゲームシステム |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6727807B2 (ja) | 2015-12-29 | 2020-07-22 | 株式会社バンダイナムコアミューズメント | ゲーム装置及びプログラム |
- 2020
- 2020-12-25 JP JP2020217979A patent/JP2022102913A/ja active Pending
- 2021
- 2021-11-24 KR KR1020237019791A patent/KR20230104959A/ko unknown
- 2021-11-24 CN CN202180087065.2A patent/CN116635119A/zh active Pending
- 2021-11-24 WO PCT/JP2021/043007 patent/WO2022137958A1/ja active Application Filing
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002320771A (ja) * | 2001-04-26 | 2002-11-05 | Square Co Ltd | ビデオゲーム装置およびその制御方法、ならびにビデオゲームのプログラムおよびそのプログラムを記録したコンピュータ読取り可能な記録媒体。 |
US20100029386A1 (en) * | 2007-06-14 | 2010-02-04 | Harmonix Music Systems, Inc. | Systems and methods for asynchronous band interaction in a rhythm action game |
JP2010000257A (ja) * | 2008-06-20 | 2010-01-07 | Namco Bandai Games Inc | ゲームコントローラケース、ゲームコントローラケースセット、プログラム、及び情報記憶媒体 |
JP2010200792A (ja) * | 2009-02-27 | 2010-09-16 | Square Enix Co Ltd | ビデオゲーム処理装置、ビデオゲーム処理方法、およびビデオゲーム処理プログラム |
JP2017055851A (ja) * | 2015-09-14 | 2017-03-23 | 株式会社コーエーテクモゲームス | 情報処理装置、表示制御方法、及び表示制御プログラム |
JP2017119033A (ja) * | 2015-12-29 | 2017-07-06 | 株式会社バンダイナムコエンターテインメント | ゲーム装置及びプログラム |
JP2019033996A (ja) * | 2017-08-18 | 2019-03-07 | 株式会社コナミデジタルエンタテインメント | ゲーム装置、ゲーム装置のプログラム、及び、ゲームシステム |
Also Published As
Publication number | Publication date |
---|---|
KR20230104959A (ko) | 2023-07-11 |
CN116635119A (zh) | 2023-08-22 |
JP2022102913A (ja) | 2022-07-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106924966B (zh) | 游戏装置以及处理方法 | |
US20110028214A1 (en) | Music-based video game with user physical performance | |
US10441876B2 (en) | Video game integrating recorded video | |
JP6123066B2 (ja) | ゲーム装置及びゲームプログラム | |
JP6123118B2 (ja) | ゲーム装置及びゲームプログラム | |
JP2019101050A (ja) | 仮想空間において楽器の演奏を支援するためのプログラム、楽器の選択を支援するためにコンピュータで実行される方法、および情報処理装置 | |
JP6233809B2 (ja) | ゲームシステム、それに用いられる制御方法及びコンピュータプログラム | |
US9751019B2 (en) | Input methods and devices for music-based video games | |
JP6621156B1 (ja) | ゲームシステム、それに用いるコンピュータプログラム、及び制御方法 | |
WO2022137958A1 (ja) | ゲームシステム、それに用いるコンピュータプログラム、及び制御方法 | |
WO2022191170A1 (ja) | ゲームシステム、それに用いるコンピュータプログラム、及び制御方法 | |
JP6449561B2 (ja) | プログラム、ゲーム装置及びゲームシステム | |
JP2021104299A (ja) | 観戦システム、観戦システム用のコンピュータプログラム、及び観戦システムの制御方法 | |
JP7093590B1 (ja) | ゲームシステム、コンピュータプログラム及び制御方法 | |
JP7317364B2 (ja) | ゲームシステム、それに用いるコンピュータプログラム、及び制御方法 | |
JP7174456B1 (ja) | ゲームシステム、それに用いるコンピュータプログラム、及び制御方法 | |
WO2022137375A1 (ja) | 方法、コンピュータ可読媒体、および情報処理装置 | |
JP6661176B1 (ja) | ゲームシステム、それに用いるコンピュータプログラム、及び制御方法 | |
JP6328286B2 (ja) | ゲーム装置及びゲームプログラム | |
WO2013069647A1 (ja) | ゲーム機、それに用いる制御方法及び、コンピュータプログラム | |
JP2022157561A (ja) | ゲームシステム及びプログラム | |
JP2021010720A (ja) | ゲームシステム、それに用いるコンピュータプログラム、及び制御方法 | |
JP2021010656A (ja) | ゲームシステム、それに用いるコンピュータプログラム、及び制御方法 | |
JP2021104289A (ja) | 観戦システム、観戦システム用のコンピュータプログラム、及び観戦システムの制御方法 | |
JP2021104288A (ja) | 観戦システム、観戦システム用のコンピュータプログラム、及び観戦システムの制御方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21910111 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20237019791 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202180087065.2 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21910111 Country of ref document: EP Kind code of ref document: A1 |