WO2024171277A1 - Information processing device and game play method - Google Patents

Information processing device and game play method

Info

Publication number
WO2024171277A1
WO2024171277A1 (PCT/JP2023/004907)
Authority
WO
WIPO (PCT)
Prior art keywords
play
game
agent
unit
user
Prior art date
Application number
PCT/JP2023/004907
Other languages
French (fr)
Japanese (ja)
Inventor
達紀 網本
宣之 中村
Original Assignee
株式会社ソニー・インタラクティブエンタテインメント
Priority date
Filing date
Publication date
Application filed by 株式会社ソニー・インタラクティブエンタテインメント
Priority to PCT/JP2023/004907
Publication of WO2024171277A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F 13/67 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/70 Game security or game management aspects
    • A63F 13/79 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/85 Providing additional services to players

Definitions

  • This disclosure relates to technology in which an agent plays a game on behalf of a user.
  • Patent document 1 discloses a system in which an AI (artificial intelligence) character progresses through a game on behalf of a user by handing over gameplay to a gameplay controller.
  • Patent document 2 discloses a method of training a machine learning model to mimic the user's gameplay using gameplay data generated when a user plays a video game.
  • the machine learning model is an artificial neural network that learns and mimics the user's gameplay style and tendencies. By training the machine learning model, the user's AI bot can perform gameplay that mimics the user.
  • AI agents in strategic games such as chess, shogi, and go have evolved significantly, and AI agents that outperform humans have appeared.
  • However, in games with relatively complex rules such as role-playing games and action games, it is still not easy to create AI agents with high playing accuracy.
  • AI agents are often used as opponents for users, but if the playing accuracy of AI agents improves, users will be able to have the AI agents play on their behalf and progress through the game. Therefore, there is a need to build a system that allows users to effectively use AI agents with high playing accuracy.
  • An information processing device includes one or more processors having hardware, and the one or more processors receive operation information from a user, execute a game based on the operation information, notify the user of play units that an agent can play, and have the agent play the play units on behalf of the user.
  • Another aspect of the game playing method of the present disclosure is a method of playing a game in an information processing device, which accepts operation information from a user, executes a game based on the operation information, notifies the user of a play unit that an agent can play, and has the agent play that play unit on behalf of the user.
  • FIG. 1 is a diagram showing a game system according to an embodiment.
  • FIG. 2 is a diagram illustrating a hardware configuration of an information processing device.
  • FIG. 3 is a diagram illustrating functional blocks of the information processing device.
  • FIG. 4 is a diagram showing an example of a game image.
  • FIG. 5 is a diagram showing an example of a plurality of tasks that make up an activity.
  • FIG. 6 is a diagram illustrating functional blocks of the server device.
  • FIG. 7 is a diagram showing an example of a displayed notification area.
  • FIG. 8 is a diagram showing an example of a displayed notification area.
  • FIG. 9 is a diagram showing an example of a displayed notification area.
  • FIG. 10 is a diagram showing an example of a displayed notification area.
  • FIG. 11 is a diagram showing an example of a displayed notification area.
  • FIG. 1 shows a game system 1 according to an embodiment of the present disclosure.
  • the game system 1 of the embodiment includes an information processing device 10 operated by a user and a server device 5.
  • An access point (hereinafter referred to as "AP") 8 has the functions of a wireless access point and a router, and the information processing device 10 connects to the AP 8 wirelessly or via a wired connection to communicate with the server device 5 on a network 3 such as the Internet.
  • Although FIG. 1 shows one user and one information processing device 10, in the game system 1, multiple information processing devices 10 operated by multiple users are connected to the server device 5 via the network 3.
  • the information processing device 10 is connected wirelessly or via a wire to an input device 6 operated by a user, and the input device 6 outputs information operated by the user to the information processing device 10.
  • When the information processing device 10 receives operation information from the input device 6, it reflects the information in the processing of the system software and game software, and causes the output device 4 to output the processing results.
  • In the game system 1, the information processing device 10 is a game device (game console) that executes a game, and the input device 6 is a device such as a game controller that supplies user operation information to the information processing device 10.
  • the input device 6 may also be an input interface such as a keyboard or mouse.
  • the auxiliary storage device 2 is a large-capacity storage device such as an HDD (hard disk drive) or SSD (solid state drive), and may be an internal storage device, or an external storage device connected to the information processing device 10 via a USB (Universal Serial Bus) or the like.
  • the output device 4 may be a television having a display for outputting images and a speaker for outputting sound.
  • the output device 4 may be connected to the information processing device 10 via a wired cable, or may be connected wirelessly.
  • the server device 5 provides network services to users of the game system 1.
  • the server device 5 manages user accounts that identify each user, and each user signs in to the network service provided by the server device 5 using their user account.
  • users can register game save data and trophies, which are virtual rewards acquired during game play, in the server device 5.
  • the save data and trophies can be synchronized even if the user uses an information processing device other than the information processing device 10.
  • the server device 5 in the embodiment has a function of generating an AI agent that plays a game and providing it to the information processing device 10.
  • the server device 5 collects play data indicating the game play situation from multiple information processing devices 10.
  • the play data may be operation information of a player (user) operating the input device 6, or may be game footage (play footage) drawn based on the player's operation information.
  • the server device 5 uses play data from one or more players to machine-learn a game operation model, and generates an AI agent (hereinafter sometimes simply referred to as an "agent") that can play the game autonomously.
  • the user can have an agent play the game on his/her behalf. While the agent is playing the game, the user does not perform any game operations related to the game, and the agent plays the game autonomously or automatically. For example, if the user is unable to operate the game because he/she is having a meal, the user can have the agent play the game on his/her behalf, and the game can progress. The user can also take over operation of the game that the agent is playing midway.
  • the agent of the embodiment is generated by learning past play data from one or more players.
  • the agent is a learned game operation model, and autonomously or automatically generates operation information for the input device 6 according to the progress of the game.
  • the agent may generate game commands that the game can understand.
  • the learned game operation model is provided from the server device 5 to the information processing device 10 to assist the user in playing the game.
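  • As a minimal illustration of the agent interface sketched in the items above, the following Python sketch maps a game state to controller-style operation information. The class and field names (GameState, OperationInfo, act) are assumptions for illustration only, not the actual data format used by the information processing device 10 or the server device 5.

```python
# A hedged sketch of the learned game operation model interface: game state in,
# operation information (or a game command) out. All names here are illustrative.
from dataclasses import dataclass
from typing import Protocol


@dataclass
class GameState:
    frame: bytes            # latest rendered frame of the game (play footage)
    elapsed_seconds: float  # progress indicator within the current play unit


@dataclass
class OperationInfo:
    buttons: dict           # e.g. {"cross": True, "circle": False}
    left_stick: tuple       # (x, y) stick deflection in the range -1.0 .. 1.0


class GameOperationModel(Protocol):
    """Learned agent: autonomously generates operation information according to game progress."""

    def act(self, state: GameState) -> OperationInfo:
        ...
```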
  • FIG. 2 shows the hardware configuration of the information processing device 10.
  • the information processing device 10 is configured to include a main power button 20, a power ON LED 21, a standby LED 22, a system controller 24, a clock 26, a device controller 30, a media drive 32, a USB module 34, a flash memory 36, a wireless communication module 38, a wired communication module 40, a subsystem 50, and a main system 60.
  • the main system 60 comprises a main CPU (Central Processing Unit), a memory and memory controller serving as the main storage device, a GPU (Graphics Processing Unit), etc.
  • the GPU is primarily used for the calculation and processing of game programs.
  • the main CPU has the function of starting up the system software and executing the game program installed in the auxiliary storage device 2 in the environment provided by the system software.
  • the subsystem 50 comprises a sub-CPU, a memory and memory controller serving as the main storage device, etc., but does not comprise a GPU.
  • While the main CPU has the function of executing game programs installed in the auxiliary storage device 2, the sub-CPU does not have such a function. However, the sub-CPU has the function of accessing the auxiliary storage device 2 and the function of sending and receiving data to and from the server device 5.
  • the sub-CPU is configured with only these limited processing functions, and therefore can operate with less power consumption compared to the main CPU. These functions of the sub-CPU are executed when the main CPU is in standby mode.
  • the main power button 20 is an input unit through which the user inputs operations, is provided on the front of the housing of the information processing device 10, and is operated to turn on or off the power supply to the main system 60 of the information processing device 10.
  • the power ON LED 21 lights up when the main power button 20 is turned on, and the standby LED 22 lights up when the main power button 20 is turned off.
  • the system controller 24 detects when the main power button 20 is pressed by the user.
  • the clock 26 is a real-time clock that generates current date and time information and supplies it to the system controller 24, the subsystem 50, and the main system 60.
  • the device controller 30 is configured as an LSI (Large-Scale Integrated Circuit) that transfers information between devices like a south bridge. As shown in the figure, devices such as the system controller 24, media drive 32, USB module 34, flash memory 36, wireless communication module 38, wired communication module 40, subsystem 50, and main system 60 are connected to the device controller 30.
  • the device controller 30 absorbs differences in the electrical characteristics and data transfer speeds of each device and controls the timing of data transfer.
  • the media drive 32 is a drive device that operates by mounting a ROM medium 44 on which application software such as games and license information are recorded, and reads programs, data, and the like from the ROM medium 44.
  • the ROM medium 44 is a read-only recording medium such as an optical disk, magneto-optical disk, or Blu-ray disk.
  • the USB module 34 is a module that connects to an external device via a USB cable.
  • the USB module 34 may also be connected to the auxiliary storage device 2 and the camera 7 via a USB cable.
  • the flash memory 36 is an auxiliary storage device that constitutes the internal storage.
  • the wireless communication module 38 wirelessly communicates with the input device 6 using a communication protocol such as the Bluetooth (registered trademark) protocol or the IEEE802.11 protocol.
  • the wired communication module 40 communicates with an external device via a wired connection and connects to the network 3 via the AP 8.
  • FIG. 3 shows functional blocks of the information processing device 10.
  • the information processing device 10 has a processing unit 100 and a communication unit 102, and executes game software.
  • the processing unit 100 has a reception unit 110, a game execution unit 112, a game image and sound generation unit 114, an output processing unit 120, a play data acquisition unit 122, an agent acquisition unit 130, a game play controller 132, an event information acquisition unit 134, a notification unit 136, and a transmission processing unit 140.
  • the game image and sound generation unit 114 has a game image generation unit 116 and a game sound generation unit 118.
  • the information processing device 10 includes a computer, which executes a program to realize the various functions shown in FIG. 3.
  • the computer includes hardware such as a memory into which the program is loaded, one or more processors that execute the loaded program, an auxiliary storage device, and other LSIs.
  • the processor is composed of multiple electronic circuits including semiconductor integrated circuits and LSIs, and the multiple electronic circuits may be mounted on a single chip or on multiple chips.
  • the functional blocks shown in FIG. 3 are realized by cooperation between hardware and software, and therefore, it will be understood by those skilled in the art that these functional blocks can be realized in various forms using only hardware, only software, or a combination thereof.
  • the communication unit 102 receives information (operation information) of the user operating the input device 6 to play the game, and provides it to the reception unit 110.
  • the reception unit 110 receives the user's operation information, and provides it to the game execution unit 112 as well as to the play data acquisition unit 122.
  • the communication unit 102 communicates with the server device 5 to send and receive various information or data.
  • the communication unit 102 may have the functions of both the wireless communication module 38 and the wired communication module 40.
  • the game execution unit 112 executes game software based on the user's operation information.
  • the game software includes at least a game program, image data, and sound data. While the user is playing a game, the reception unit 110 receives the user's operation information, and the game execution unit 112 performs calculation processing to move a player character in a virtual space based on the user's operation information.
  • the game image generation unit 116 includes a GPU, and receives the results of the calculation processing in the virtual space, and generates a game image from a viewpoint position (virtual camera) in the virtual space.
  • the game sound generation unit 118 also generates game sound from a viewpoint position in the virtual space.
  • the output processing unit 120 outputs the game images and game sounds from the output device 4.
  • FIG. 4 shows an example of a game image displayed on the output device 4.
  • When an event occurs in the game, the game program of the embodiment outputs event information including information for identifying the event that occurred (event ID) to the event information acquisition unit 134.
  • a game includes multiple activities, each of which has a start condition and an end condition.
  • An activity is one of the play units made up of stages, quests, missions, tournaments, sessions, etc. incorporated into the game, and the game progress from when the start condition is met to when the end condition is met constitutes one activity. Activities are set appropriately by the game maker, and for example, one quest that appears in the game progress may constitute one activity, or multiple quests may constitute one activity. When multiple quests constitute one activity, the end condition for that activity is to complete all of the multiple quests.
  • An activity is a single unit of play, but may be made up of multiple tasks.
  • When a game maker sets up an activity, they can divide it and set multiple tasks. In this case, one activity is made up of multiple tasks.
  • FIG. 5 shows an example of multiple tasks that make up an activity.
  • activity A is made up of task a, task b, and task c, and the user completes activity A by completing tasks a, b, and c in that order.
  • task a may be assigned to the first quest, task b to the second quest, and task c to the third quest.
  • task a may be assigned to the first game scene, task b to the middle game scene, and task c to the last game scene.
  • Each task makes up one play unit. Game makers have the authority to freely set the play units, which are activities and tasks.
  • In terms of granularity, the relationship between these play units is: first play unit (activity) > second play unit (task). Note that one task may be composed of multiple subtasks, in which case the subtasks may be called a "third play unit."
  • the game program When an activity is started, the game program outputs event information including information identifying the start event of the activity (event ID) to the event information acquisition unit 134. When an activity is ended, the game program outputs event information including information identifying the end event of the activity (event ID) to the event information acquisition unit 134.
  • the event information may include information identifying the activity (activity ID), and may also include information indicating the result of the activity (for example, success or failure).
  • the event ID is set to be different for each activity.
  • Similarly, when a task is started, the game program outputs event information including information identifying the start event of the task (event ID) to the event information acquisition unit 134, and when a task is ended, it outputs event information including information identifying the end event of the task (event ID) to the event information acquisition unit 134.
  • the event information may include information identifying the task (task ID), and may also include information indicating the result of the task (for example, success or failure).
  • the event ID is set to be different for each task.
  • When the event information acquisition unit 134 acquires the event information, it generates event data by adding to the event information a user identifier (user account) that identifies the user, a game identifier (game ID) that identifies the game, and time information (timestamp) that indicates the time when the event occurred, and provides the event data to the transmission processing unit 140.
  • the game program may output the event information including the game ID and/or the timestamp to the event information acquisition unit 134.
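  • The following sketch shows one possible shape of that event data record, with the raw event information wrapped together with the user account, game ID, and timestamp; the field names and types are assumptions for illustration, not the actual transmitted format.

```python
# A hedged sketch of the event data assembled by the event information acquisition unit 134.
from dataclasses import dataclass
from typing import Optional
import time


@dataclass
class EventData:
    event_id: str                 # identifies the start or end event of an activity or task
    user_account: str             # user identifier added on the device
    game_id: str                  # game identifier added on the device
    timestamp: float              # time the event occurred (Unix epoch seconds, assumed)
    activity_id: Optional[str] = None
    task_id: Optional[str] = None
    result: Optional[str] = None  # e.g. "success" or "failure"


def build_event_data(event_id: str, user_account: str, game_id: str, **extra) -> EventData:
    """Wrap raw event information with identifiers and a timestamp before transmission."""
    return EventData(event_id=event_id, user_account=user_account,
                     game_id=game_id, timestamp=time.time(), **extra)
```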
  • the transmission processing unit 140 transmits the event data provided from the event information acquisition unit 134 to the server device 5 via the communication unit 102.
  • When the reception unit 110 receives operation information from a user, it provides the information to the game execution unit 112 and also to the play data acquisition unit 122.
  • the play data acquisition unit 122 generates play data by adding to the operation information the user account, the game ID, and time information (timestamp) indicating when the operation information was provided, and provides the play data to the transmission processing unit 140.
  • the transmission processing unit 140 transmits the play data provided by the play data acquisition unit 122 to the server device 5 via the communication unit 102.
  • the play data acquisition unit 122 may provide the transmission processing unit 140 with game images (play footage) generated by the game image generation unit 116 as play data, in addition to the user's operation information. At this time, the transmission processing unit 140 transmits the user's operation information and play footage as play data to the server device 5 via the communication unit 102.
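  • A corresponding play data record might look like the sketch below: the user's operation information plus, optionally, a play-footage frame from the game image generation unit 116. The field names are illustrative assumptions rather than the actual format sent to the server device 5.

```python
# A minimal sketch of a play data record handed to the transmission processing unit 140.
from dataclasses import dataclass
from typing import Optional


@dataclass
class PlayDataRecord:
    user_account: str
    game_id: str
    timestamp: float                            # time the operation information was received
    operation_info: dict                        # raw controller state from the input device 6
    play_footage_frame: Optional[bytes] = None  # encoded game image, when footage is also sent
```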
  • the process of transmitting event data and play data is performed by all information processing devices 10 connected to the server device 5, and the server device 5 collects event data and play data for various games from the multiple information processing devices 10.
  • FIG. 6 shows functional blocks of the server device 5.
  • the server device 5 of the embodiment includes a processing unit 200, a communication unit 202, and a recording device 230.
  • the processing unit 200 includes a play data acquisition unit 210, an event data acquisition unit 212, an agent generation unit 214, and an agent providing unit 216.
  • When the play data acquisition unit 210 acquires play data transmitted from the information processing device 10, it records the play data in the play data recording unit 232.
  • When the event data acquisition unit 212 acquires event data transmitted from the information processing device 10, it records the event data in the event data recording unit 234.
  • When the agent generation unit 214 generates an agent that plays the game, it records the agent in the game operation model recording unit 236.
  • the server device 5 is an information processing device equipped with a computer, which executes a program to realize the various functions shown in FIG. 6.
  • the computer includes hardware such as a memory into which the programs are loaded, one or more processors that execute the loaded programs, an auxiliary storage device, and other LSIs.
  • the processor is composed of multiple electronic circuits including semiconductor integrated circuits and LSIs, and the multiple electronic circuits may be mounted on a single chip or on multiple chips.
  • the functional blocks shown in FIG. 6 are realized by the cooperation of hardware and software, and therefore, it will be understood by those skilled in the art that these functional blocks can be realized in various forms by hardware alone, software alone, or a combination of both.
  • the play data is used as learning data for training a game operation model, which is a machine learning model (or an artificial intelligence model).
  • the play data may be used as teacher data in supervised learning.
  • the play data includes operation information of the input device 6 operated by the player (user) to play the game, a user account, a game ID, and a timestamp.
  • the agent generation unit 214 may use the play data of multiple players as teacher data to train a game operation model.
  • the game operation model may be trained to output operation information (or game commands) of the input device 6 when the game video (play video) that is the teacher data is input.
  • the game operation model may be a CNN (Convolutional Neural Network) that is equivalent to a multi-layer neural network including an input layer, one or more convolutional layers, and an output layer, and the agent generation unit 214 may learn each coupling coefficient (weight) in the CNN using a learning method such as deep learning.
  • the agent generation unit 214 may further enhance the play accuracy of the game operation model by subjecting the game operation model generated by supervised learning to reinforcement learning. Play accuracy can be evaluated based on the play time of the game operation model, the remaining HP of the player character, and other status values.
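  • As a sketch of the kind of CNN-based game operation model and supervised training step described above, the following PyTorch code takes a play-footage frame and outputs logits over controller buttons. The layer sizes, the button set, and the loss are assumptions for illustration, not the patent's actual architecture.

```python
# A hedged sketch: supervised learning of a CNN game operation model from play footage
# (input) and the player's recorded button operations (teacher data).
import torch
import torch.nn as nn


class GameOperationCNN(nn.Module):
    def __init__(self, num_buttons: int = 8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=4, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((6, 6)),
        )
        self.head = nn.Linear(32 * 6 * 6, num_buttons)

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        # frame: (batch, 3, H, W) play-footage image; output: one logit per button
        x = self.features(frame)
        return self.head(x.flatten(1))


model = GameOperationCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.BCEWithLogitsLoss()


def train_step(frames: torch.Tensor, pressed_buttons: torch.Tensor) -> float:
    """One supervised step: pressed_buttons holds 0/1 per button from the player's operation info."""
    optimizer.zero_grad()
    loss = loss_fn(model(frames), pressed_buttons)
    loss.backward()
    optimizer.step()
    return loss.item()
```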
  • the agent generation unit 214 generates an AI agent for each play unit of the game, thereby improving the play accuracy of the AI agent dedicated to each play unit.
  • the game software includes an "activity,” which is a first play unit, and a "task,” which is a second play unit that is even finer than that. It may also include a "subtask,” which is a third play unit that is even finer than a task.
  • the agent generation unit 214 generates an AI agent (game operation model) for each such play unit, thereby improving the play accuracy of the AI agent.
  • the agent generation unit 214 refers to the timestamps included in the play data and the timestamps included in the event data to identify the play data (operation information and play video) used in each unit of play (for example, an activity or task). For example, the agent generation unit 214 refers to the event data of player A and recognizes that the start event of task a occurs at 10:00 on January 1st, and the end event of task a occurs at 10:05 on January 1st, and identifies that player A's play data between 10:00 and 10:05 on January 1st is play data related to task a. In this way, the agent generation unit 214 can generate an AI agent for each unit of play by identifying the play data related to each unit of play.
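  • The timestamp matching in the player-A example could be sketched as below; the record attributes follow the earlier illustrative sketches and are assumptions, not the actual server-side implementation.

```python
# A hedged sketch of identifying the play data that belongs to one play unit by matching
# the start/end event timestamps against the timestamps in the play data.

def play_data_for_unit(event_records, play_records, start_event_id, end_event_id):
    """Collect the play data recorded between a play unit's start event and end event."""
    start_ts = next(e.timestamp for e in event_records if e.event_id == start_event_id)
    end_ts = next(e.timestamp for e in event_records if e.event_id == end_event_id)
    return [p for p in play_records if start_ts <= p.timestamp <= end_ts]
```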
  • activity A shown in FIG. 5 is composed of task a, task b, and task c.
  • Comparing an AI agent specialized in an activity with an AI agent specialized in a task, it is expected that the playing accuracy of the AI agent specialized in the task, which is a smaller unit of play, will be higher. Therefore, it is preferable for the agent generation unit 214 to generate AI agents specialized in tasks, which are small units of play. With regard to activity A, the agent generation unit 214 therefore generates AI agents specialized in each of task a, task b, and task c.
  • the agent generation unit 214 evaluates the playing accuracy of each AI agent using an evaluation index such as playing time. For example, if the playing time of an AI agent is longer than a target playing time, the agent generation unit 214 determines that the playing accuracy of the AI agent is low. If the playing accuracy does not meet a predetermined standard, the agent generation unit 214 continues to perform supervised learning and/or reinforcement learning of the AI agent in order to improve the playing accuracy.
  • the agent generation unit 214 records an AI agent (game operation model) whose playing accuracy meets a predetermined standard in the game operation model recording unit 236.
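  • A minimal sketch of this play-time check is given below; the criterion (play time within a target) and the model store are stand-ins for illustration, not the actual interface of the game operation model recording unit 236.

```python
# A hedged sketch: only agents whose play time clears the target are recorded; the rest
# continue supervised and/or reinforcement learning.

def meets_play_accuracy_standard(play_time_seconds: float, target_seconds: float) -> bool:
    """True when the agent's measured play time does not exceed the target play time."""
    return play_time_seconds <= target_seconds


def record_if_qualified(agent_id: str, play_time: float, target: float, model_store: dict) -> bool:
    """Record the agent (stand-in for the game operation model recording unit 236) if it qualifies."""
    if meets_play_accuracy_standard(play_time, target):
        model_store[agent_id] = play_time
        return True
    return False
```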
  • the AI agent recorded in the game operation model recording unit 236 is provided to the user before the start of the game and is used to support the user's game play, etc.
  • the agent generation unit 214 may generate an AI agent specialized in activity A.
  • If the time that the AI agent specialized in activity A takes to play activity A (i.e., task a, task b, and task c) is shorter than the total of the time that an AI agent specialized in task a takes to play task a, the time that an AI agent specialized in task b takes to play task b, and the time that an AI agent specialized in task c takes to play task c, the play accuracy of the AI agent specialized in activity A may be evaluated as being high. In that case, the agent generation unit 214 may record the AI agent specialized in activity A in the game operation model recording unit 236.
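  • The comparison can be illustrated with a small worked example; the times below are invented for illustration only.

```python
# A hedged sketch: prefer the activity-level agent when its time for the whole activity
# beats the combined times of the per-task agents.

def prefer_activity_agent(activity_time: float, task_times: list) -> bool:
    """True when the agent specialized in the activity outperforms the task-level agents."""
    return activity_time < sum(task_times)


# Example: 280 s for the activity-A agent versus 100 s + 120 s + 90 s = 310 s for the
# agents specialized in task a, task b, and task c.
assert prefer_activity_agent(280.0, [100.0, 120.0, 90.0])
```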
  • When an activity is made up of multiple tasks, the agent generation unit 214 thus generates both an AI agent dedicated to the activity and AI agents dedicated to its tasks, and by comparing the play accuracy of the two, the user can be provided with whichever is superior. In this way, the server device 5 generates dedicated AI agents specialized for each play unit from the play data of multiple players.
  • the game execution unit 112 executes the game based on information obtained by the user operating the input device 6.
  • the agent acquisition unit 130 acquires an agent generated for the game from the server device 5. This allows the user to have the agent play the game on their behalf during the game.
  • the agent acquisition unit 130 records the acquired agent (game operation model) in the auxiliary storage device 2.
  • the server device 5 records agents whose playing accuracy meets a predetermined standard in the game operation model recording unit 236, but does not record agents whose playing accuracy does not meet the predetermined standard in the game operation model recording unit 236. Therefore, within a single game, there are play units in which agents are generated and play units in which agents are not generated. Therefore, the agent acquisition unit 130 cannot acquire agents for all play units in a game, and there are play units in which agents cannot be acquired.
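  • One way the device could summarize, per activity, whether an agent can play all, part, or none of its tasks is sketched below; the identifiers and labels are illustrative assumptions that mirror the notification areas described next.

```python
# A hedged sketch of deriving the "agent can play all / part / none" status shown in the
# notification area from the set of play units for which an agent was acquired.

def activity_agent_status(task_ids: list, available_agents: set) -> str:
    """Summarize agent availability for an activity from the agents acquired for its tasks."""
    playable = [t for t in task_ids if t in available_agents]
    if not playable:
        return "agent play not possible"
    if len(playable) == len(task_ids):
        return "agent can play all tasks"
    return "agent can play part of the activity"


# Example matching FIG. 7: part of activity A, none of activity B, all of activity C.
agents = {"A:a", "A:c", "C:1", "C:2", "C:3", "C:4"}
print(activity_agent_status(["A:a", "A:b", "A:c"], agents))         # part of the activity
print(activity_agent_status(["B:a", "B:b"], agents))                # not possible
print(activity_agent_status(["C:1", "C:2", "C:3", "C:4"], agents))  # all tasks
```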
  • the notification unit 136 may notify the user of play units (activities or tasks) that the agent can play.
  • FIG. 7 shows an example of a notification area 150 displayed on the output device 4.
  • When a user performs a predetermined operation during gameplay, the notification unit 136 generates information for notifying the user of a list of recommended activities and whether or not AI play is possible for each.
  • When the notification information generated by the notification unit 136 is provided to the output processing unit 120, the output processing unit 120 displays the notification area 150 containing that information superimposed on a game image. Note that the output processing unit 120 may display the notification area 150 on the output device 4 not only during gameplay but also at other times.
  • the notification area 150 shown in FIG. 7 displays a plurality of activities recommended to be played, and here activity A, activity B, and activity C are displayed.
  • the display area 150a for activity A shows that it is possible to play using an agent in part of activity A. This means that, for example, if activity A is made up of three tasks, it is not possible to use an agent in all three tasks, but it is possible to use an agent in at least one task.
  • the display area 150b for Activity B indicates that it is not possible to play Activity B using an agent. Therefore, the user must play Activity B on his or her own, without relying on an agent.
  • Display area 150c for activity C shows that all tasks in activity C can be played using an agent. This means that, for example, if activity C is made up of four tasks, an agent can be used in all four tasks.
  • Each display area 150a, 150b, 150c is configured as a GUI (Graphical User Interface), and when the user selects one of the display areas, the output processing unit 120 displays a notification area including a group of tasks that make up the activity of that display area. Here, it is assumed that the user has selected display area 150a.
  • FIG. 8 shows an example of a notification area 152 displayed on the output device 4.
  • When the notification unit 136 generates a list of the tasks included in activity A together with information indicating whether AI play is possible for each, and provides this to the output processing unit 120, the output processing unit 120 displays the notification area 152 containing this information superimposed on the game image. Task a, task b, and task c included in activity A are displayed in the notification area 152 shown in FIG. 8.
  • the display area 152a for task a shows that it is possible to play task a using an agent.
  • the display area 152b for task b shows that it is not possible to play task b using an agent.
  • the display area 152c for task c shows that it is possible to play task c using an agent.
  • the reception unit 110 provides information on the user's operation of the input device 6 to the game execution unit 112, and the game execution unit 112 executes the game based on the user's operation information.
  • In this case, the agent is not started, and the user plays the game by himself/herself.
  • the user may select the tasks for which agent play is desired. For example, if the user wishes to have an agent play tasks a and c, the user presses display area 152a and display area 152c once each to select them, and then selects the AI play button 152e, which is a button GUI. When the AI play button 152e is selected, the game play controller 132 starts the agent, and the game execution unit 112 executes the game based on the operation information of the agent. Note that in a play unit (task b) for which no agent exists, an agent cannot be started, so the game execution unit 112 executes the game based on the operation information of the user.
  • If the user wishes to have an agent play only task a, he or she presses display area 152a once to select it and then selects the AI play button 152e. If the user wishes to have an agent play only task c, he or she presses display area 152c once to select it and then selects the AI play button 152e.
  • The operation to select a play unit in this way is accepted by the reception unit 110, and the game play controller 132 activates an agent corresponding to the play unit selected by the user and causes it to play the game.
  • the game play controller 132 starts up an agent (game operation model) for task a.
  • the agent, as a player, generates operation information (or game commands) for the game, and the game execution unit 112 executes the game program for task a based on the agent's operation information.
  • the game play controller 132 starts up the agent and has the agent play the game on behalf of the user.
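  • The hand-off between agent play and user play described above could be sketched as follows; the game interface used here is a hypothetical stand-in, not the actual API of the game execution unit 112.

```python
# A hedged sketch: for each play unit the user selected, start its agent when one exists,
# otherwise the game runs on the user's own operation information (as with task b).

def run_selected_units(selected_tasks, agents, game):
    """Drive each selected play unit with its agent when available, else with the user."""
    for task_id in selected_tasks:
        agent = agents.get(task_id)        # None for play units with no generated agent
        game.start_unit(task_id)
        while not game.unit_finished(task_id):
            if agent is not None:
                operation = agent.act(game.current_state())  # agent plays on the user's behalf
            else:
                operation = game.read_user_operation()       # user plays by himself/herself
            game.step(operation)
```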
  • FIG. 9 shows an example of a notification area 154 that is displayed while an agent is playing task a. While the agent is playing the game, the notification unit 136 generates information to notify the user that the agent is playing and provides this information to the output processing unit 120, and the output processing unit 120 displays the notification area 154 containing this notification information superimposed on the game image. This allows the user to confirm that the agent is playing.
  • FIG. 10 shows an example of a notification area 156 that is displayed when the agent finishes playing task a.
  • the notification unit 136 generates information for notifying the end of the agent's play and provides it to the output processing unit 120, and the output processing unit 120 displays the notification area 156 including the notification information superimposed on the game image. This allows the user to recognize that they need to play and to begin operating the input device 6.
  • the game play controller 132 starts an agent (game operation model) for task c.
  • the agent generates operation information for the game, and the game execution unit 112 executes the game program for task c based on the operation information of the agent.
  • the output processing unit 120 displays the notification area 154 shown in FIG. 9 superimposed on the game image.
  • the output processing unit 120 may also present the user with notification information indicating that the agent is available.
  • FIG. 11 shows an example of a notification area 158 that is displayed when agent play is available.
  • When a play scene for which an agent has been generated is displayed, the notification unit 136 generates notification information indicating that agent play is available and provides it to the output processing unit 120, and the output processing unit 120 displays the notification area 158 containing this notification information superimposed on the game image. This allows the user to recognize that an agent is available. For example, when the user selects the notification area 158, the game play controller 132 may launch the agent and begin agent play.
  • While the agent is playing, control rights for the game can be returned to the user. For example, the user may take back control rights for the game by displaying the notification area 152 and selecting the user play button 152d.
  • Conversely, the operation authority of the game may be controlled so that it is transferred to the agent. Furthermore, if the user is unable to complete a task successfully, the notification unit 136 may generate notification information indicating that an agent is available, and the output processing unit 120 may display this notification information superimposed on the game image. When the user's progress in the game is stalled, it is possible to support the user's play by suggesting the use of an agent.
  • In the embodiment described above, an agent is used as a player on behalf of a user, but an agent may also be used, for example, to generate help videos (hint videos), or as an NPC.
  • In the embodiment, the game execution unit 112 is provided in the information processing device 10, but in a modified example, the game execution unit 112 may be provided in the server device 5, and the game system 1 may provide a cloud gaming service.
  • the server device 5 may also include a game play controller 132, and the server device 5 may function like the information processing device 10.
  • the playing accuracy required of the agent only needs to be moderately high, and does not need to be extremely high.
  • the playing accuracy required may be at the same level as a player with average skill.
  • the playing accuracy may be at a level that a human can achieve, and there is no need to require high-speed play that is impossible for a human to achieve.
  • A modified example may also provide an analysis device that can suggest dividing play units into lengths that are easy to learn. For example, when a game developer test-plays a certain play unit (such as a task) in game software under development, the analysis device can determine the ease of learning from the operation information and gameplay footage from the test play.
  • the analysis device analyzes the operation information and play footage of one play unit, and if the context (play content) has changed, proposes dividing the play unit.
  • the analysis device may determine that the play unit is long and unsuitable for learning, and may suggest dividing the play unit.
  • the analysis device may determine that the play unit contains multiple play contents and suggest dividing the play unit. In this way, when the analysis device detects a change in the play content from the operation information and play footage in one provisionally set play unit, it may suggest to the game maker that the one play unit be divided.
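  • A rough sketch of such a split suggestion is shown below: a provisionally set play unit is flagged when it is long or when the operation statistics shift noticeably between its first and second halves, which is taken as a change in play content. The length threshold and the change metric are assumptions for illustration.

```python
# A hedged sketch of the analysis device's split suggestion from test-play data.

def suggest_split(unit_length_seconds: float,
                  first_half_button_rates: dict,
                  second_half_button_rates: dict,
                  max_length_seconds: float = 600.0,
                  change_threshold: float = 0.3) -> bool:
    """True when the test-played unit looks too long or its play content appears to change."""
    if unit_length_seconds > max_length_seconds:
        return True
    buttons = set(first_half_button_rates) | set(second_half_button_rates)
    drift = sum(abs(first_half_button_rates.get(b, 0.0) - second_half_button_rates.get(b, 0.0))
                for b in buttons)
    return drift > change_threshold
```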
  • This disclosure can be used in technology where an agent plays a game on behalf of a user.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Computer Security & Cryptography (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A reception unit (110) receives user operation information, and a game execution unit (112) executes a game on the basis of the operation information. A notification unit (136) generates notification information for issuing notification of a play unit that an agent can play, and an output processing unit (120) displays the notification information so as to be superimposed over a game image. A game play controller (132) allows an agent generated for each play unit to play the game.

Description

Information processing device and game play method
 This disclosure relates to technology in which an agent plays a game on behalf of a user.
 Patent document 1 discloses a system in which an AI (artificial intelligence) character progresses through a game on behalf of a user by handing over gameplay to a gameplay controller. Patent document 2 discloses a method of training a machine learning model to mimic the user's gameplay using gameplay data generated when a user plays a video game. The machine learning model is an artificial neural network that learns and mimics the user's gameplay style and tendencies. By training the machine learning model, the user's AI bot can perform gameplay that mimics the user.
 Patent document 1: Japanese Translation of PCT Application Publication No. 2019-520154. Patent document 2: Japanese Translation of PCT Application Publication No. 2022-525413.
 AI agents in strategic games such as chess, shogi, and go have evolved significantly, and AI agents that outperform humans have appeared. However, in games with relatively complex rules, such as role-playing games and action games, it is still not easy to create AI agents with high playing accuracy.
 Generally, AI agents are often used as opponents for users, but if the playing accuracy of AI agents improves, users will be able to have the AI agents play on their behalf and progress through the game. Therefore, there is a need to build a system that allows users to effectively use AI agents with high playing accuracy.
 An information processing device according to one embodiment of the present disclosure includes one or more processors having hardware, and the one or more processors receive operation information from a user, execute a game based on the operation information, notify the user of play units that an agent can play, and have the agent play the play units on behalf of the user.
 Another aspect of the game playing method of the present disclosure is a method of playing a game in an information processing device, which accepts operation information from a user, executes a game based on the operation information, notifies the user of a play unit that an agent can play, and has the agent play that play unit on behalf of the user.
 In addition, any combination of the above components, and conversions of the expressions of this disclosure between methods, devices, systems, recording media, computer programs, etc., are also valid aspects of this disclosure.
 FIG. 1 is a diagram showing a game system according to an embodiment. FIG. 2 is a diagram illustrating a hardware configuration of an information processing device. FIG. 3 is a diagram illustrating functional blocks of the information processing device. FIG. 4 is a diagram showing an example of a game image. FIG. 5 is a diagram showing an example of a plurality of tasks that make up an activity. FIG. 6 is a diagram illustrating functional blocks of the server device. FIGS. 7 to 11 are diagrams each showing an example of a displayed notification area.
 FIG. 1 shows a game system 1 according to an embodiment of the present disclosure. The game system 1 of the embodiment includes an information processing device 10 operated by a user and a server device 5. An access point (hereinafter referred to as "AP") 8 has the functions of a wireless access point and a router, and the information processing device 10 connects to the AP 8 wirelessly or via a wired connection to communicate with the server device 5 on a network 3 such as the Internet. Although FIG. 1 shows one user and one information processing device 10, in the game system 1, multiple information processing devices 10 operated by multiple users are connected to the server device 5 via the network 3.
 The information processing device 10 is connected wirelessly or via a wire to an input device 6 operated by a user, and the input device 6 outputs information operated by the user to the information processing device 10. When the information processing device 10 receives operation information from the input device 6, it reflects the information in the processing of the system software and game software, and causes the output device 4 to output the processing results. In the game system 1, the information processing device 10 is a game device (game console) that executes a game, and the input device 6 is a device such as a game controller that supplies user operation information to the information processing device 10. The input device 6 may also be an input interface such as a keyboard or mouse.
 The auxiliary storage device 2 is a large-capacity storage device such as an HDD (hard disk drive) or SSD (solid state drive), and may be an internal storage device, or an external storage device connected to the information processing device 10 via a USB (Universal Serial Bus) or the like. The output device 4 may be a television having a display for outputting images and a speaker for outputting sound. The output device 4 may be connected to the information processing device 10 via a wired cable, or may be connected wirelessly.
 Camera 7, which is an imaging device, is provided near output device 4 and captures an image of the space around output device 4. While FIG. 1 shows an example in which camera 7 is attached to the top of output device 4, it may be disposed on the side or bottom of output device 4, and in either case, it is disposed in a position where it can capture an image of a user positioned in front of output device 4. Camera 7 may be a stereo camera.
 The server device 5 provides network services to users of the game system 1. The server device 5 manages user accounts that identify each user, and each user signs in to the network service provided by the server device 5 using their user account. By signing in to the network service from the information processing device 10, users can register game save data and trophies, which are virtual rewards acquired during game play, in the server device 5. By registering the save data and trophies in the server device 5, the save data and trophies can be synchronized even if the user uses an information processing device other than the information processing device 10.
 The server device 5 in the embodiment has a function of generating an AI agent that plays a game and providing it to the information processing device 10. To generate the AI agent, the server device 5 collects play data indicating the game play situation from multiple information processing devices 10. In the embodiment, the play data may be operation information of a player (user) operating the input device 6, or may be game footage (play footage) drawn based on the player's operation information. The server device 5 uses play data from one or more players to machine-learn a game operation model, and generates an AI agent (hereinafter sometimes simply referred to as an "agent") that can play the game autonomously.
 In the game system 1 of the embodiment, the user can have an agent play the game on his/her behalf. While the agent is playing the game, the user does not perform any game operations related to the game, and the agent plays the game autonomously or automatically. For example, if the user is unable to operate the game because he/she is having a meal, the user can have the agent play the game on his/her behalf, and the game can progress. The user can also take over operation of the game that the agent is playing midway.
 実施形態のエージェントは、1人または複数のプレイヤによる過去のプレイデータを学習することで生成される。エージェントは、学習済みのゲーム操作モデルであって、ゲームの進行状況に応じた入力装置6の操作情報を自律的または自動的に生成する。エージェントは、ゲームが理解できるゲームコマンドを生成してもよい。ゲームを開始する前に、学習済みのゲーム操作モデルはサーバ装置5から情報処理装置10に提供されて、ユーザのゲームプレイを支援する。 The agent of the embodiment is generated by learning past play data from one or more players. The agent is a learned game operation model, and autonomously or automatically generates operation information for the input device 6 according to the progress of the game. The agent may generate game commands that the game can understand. Before the game starts, the learned game operation model is provided from the server device 5 to the information processing device 10 to assist the user in playing the game.
 図2は、情報処理装置10のハードウェア構成を示す。情報処理装置10は、メイン電源ボタン20、電源ON用LED21、スタンバイ用LED22、システムコントローラ24、クロック26、デバイスコントローラ30、メディアドライブ32、USBモジュール34、フラッシュメモリ36、無線通信モジュール38、有線通信モジュール40、サブシステム50およびメインシステム60を有して構成される。 FIG. 2 shows the hardware configuration of the information processing device 10. The information processing device 10 is configured to include a main power button 20, a power ON LED 21, a standby LED 22, a system controller 24, a clock 26, a device controller 30, a media drive 32, a USB module 34, a flash memory 36, a wireless communication module 38, a wired communication module 40, a subsystem 50, and a main system 60.
 The main system 60 comprises a main CPU (Central Processing Unit), memory and a memory controller serving as the main storage device, a GPU (Graphics Processing Unit), and the like. The GPU is used mainly for arithmetic processing of the game program. The main CPU has the function of starting up the system software and, in the environment provided by the system software, executing a game program installed in the auxiliary storage device 2. The subsystem 50 comprises a sub-CPU, memory and a memory controller serving as the main storage device, and the like, but does not comprise a GPU.
 While the main CPU has the function of executing game programs installed in the auxiliary storage device 2, the sub-CPU does not. However, the sub-CPU has the function of accessing the auxiliary storage device 2 and the function of sending and receiving data to and from the server device 5. The sub-CPU is configured with only such limited processing functions and can therefore operate with less power consumption than the main CPU. These functions of the sub-CPU are executed while the main CPU is in a standby state.
 The main power button 20 is an input unit through which the user performs operation input; it is provided on the front of the housing of the information processing device 10 and is operated to turn the power supply to the main system 60 of the information processing device 10 on or off. The power-ON LED 21 lights up when the main power button 20 is turned on, and the standby LED 22 lights up when the main power button 20 is turned off. The system controller 24 detects the pressing of the main power button 20 by the user.
 The clock 26 is a real-time clock that generates current date and time information and supplies it to the system controller 24, the subsystem 50, and the main system 60.
 The device controller 30 is configured as an LSI (Large-Scale Integrated Circuit) that, like a south bridge, transfers information between devices. As shown in the figure, devices such as the system controller 24, the media drive 32, the USB module 34, the flash memory 36, the wireless communication module 38, the wired communication module 40, the subsystem 50, and the main system 60 are connected to the device controller 30. The device controller 30 absorbs differences in electrical characteristics and data transfer speeds between the devices and controls the timing of data transfer.
 The media drive 32 is a drive device that mounts and drives a ROM medium 44 on which application software such as a game and license information are recorded, and reads programs, data, and the like from the ROM medium 44. The ROM medium 44 is a read-only recording medium such as an optical disc, a magneto-optical disc, or a Blu-ray disc.
 The USB module 34 is a module that connects to external devices via USB cables. The USB module 34 may connect to the auxiliary storage device 2 and the camera 7 via USB cables. The flash memory 36 is an auxiliary storage device constituting the internal storage. The wireless communication module 38 communicates wirelessly with the input device 6 using a communication protocol such as the Bluetooth (registered trademark) protocol or the IEEE 802.11 protocol. The wired communication module 40 communicates with external devices by wire and connects to the network 3 via the AP 8.
 FIG. 3 shows functional blocks of the information processing device 10. The information processing device 10 comprises a processing unit 100 and a communication unit 102, and executes game software. The processing unit 100 comprises a reception unit 110, a game execution unit 112, a game image and sound generation unit 114, an output processing unit 120, a play data acquisition unit 122, an agent acquisition unit 130, a game play controller 132, an event information acquisition unit 134, a notification unit 136, and a transmission processing unit 140. The game image and sound generation unit 114 has a game image generation unit 116 and a game sound generation unit 118.
 The information processing device 10 includes a computer, and the computer executes a program to realize the various functions shown in FIG. 3. The computer includes, as hardware, a memory into which the program is loaded, one or more processors that execute the loaded program, an auxiliary storage device, and other LSIs. The processor is composed of multiple electronic circuits including semiconductor integrated circuits and LSIs, and the multiple electronic circuits may be mounted on a single chip or on multiple chips. The functional blocks shown in FIG. 3 are realized by cooperation between hardware and software, and therefore it will be understood by those skilled in the art that these functional blocks can be realized in various forms by hardware alone, software alone, or a combination thereof.
 The communication unit 102 receives information on the user's operation of the input device 6 for game play (operation information) and provides it to the reception unit 110. The reception unit 110 receives the user's operation information and provides it to the game execution unit 112 and also to the play data acquisition unit 122. The communication unit 102 communicates with the server device 5 to send and receive various information and data. The communication unit 102 may combine the functions of the wireless communication module 38 and the wired communication module 40.
 The game execution unit 112 executes the game software based on the user's operation information. The game software includes at least a game program, image data, and sound data. While the user is playing the game, the reception unit 110 receives the user's operation information, and the game execution unit 112 performs arithmetic processing to move the player character in a virtual space based on the user's operation information. The game image generation unit 116 includes a GPU and, receiving the results of the arithmetic processing in the virtual space, generates a game image from a viewpoint position (virtual camera) in the virtual space. The game sound generation unit 118 generates game sound from the viewpoint position in the virtual space. The output processing unit 120 outputs the game image and game sound from the output device 4.
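 For illustration only, the per-frame flow just described can be sketched as follows in Python. The class and function names are hypothetical stand-ins for the functional blocks 110, 112, 116, 120 and 122; the embodiment does not prescribe any particular implementation.

```python
# Minimal sketch of the per-frame flow described above (hypothetical names).
from dataclasses import dataclass
from typing import List


@dataclass
class GameState:
    player_x: float = 0.0          # simplified stand-in for the virtual-space state


@dataclass
class PlayRecord:
    frame: int
    operation: str                 # e.g. a button or stick input


def execute_game(state: GameState, operation: str) -> GameState:
    """Game execution unit 112: advance the virtual space from operation information."""
    if operation == "right":
        state.player_x += 1.0
    elif operation == "left":
        state.player_x -= 1.0
    return state


def generate_image(state: GameState) -> str:
    """Game image generation unit 116: render from the virtual-camera viewpoint."""
    return f"frame rendered at player_x={state.player_x:.1f}"


def run_frames(operations: List[str]) -> List[PlayRecord]:
    """Reception unit 110 feeds each operation both to the game execution unit
    and to the play data acquisition unit, as described in the text."""
    state = GameState()
    play_data: List[PlayRecord] = []
    for frame, op in enumerate(operations):
        state = execute_game(state, op)          # game execution unit 112
        print(generate_image(state))             # output processing unit 120
        play_data.append(PlayRecord(frame, op))  # play data acquisition unit 122
    return play_data


if __name__ == "__main__":
    run_frames(["right", "right", "left"])
```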
 FIG. 4 shows an example of a game image displayed on the output device 4.
 When an event occurs in the game, the game program of the embodiment outputs event information including information identifying the event that occurred (an event ID) to the event information acquisition unit 134.
 A game includes multiple activities, each of which has a start condition and an end condition set for it. An activity is one of the play units made up of stages, quests, missions, tournaments, sessions, and the like incorporated into the game, and the game progress from when the start condition is met until the end condition is met constitutes one activity. Activities are set as appropriate by the game maker; for example, one quest that appears in the course of the game may constitute one activity, or multiple quests may constitute one activity. When multiple quests constitute one activity, completing all of the quests is the end condition of that activity.
 One activity is one play unit, but it may be made up of multiple tasks. When a game maker sets one activity, it can divide that activity and set multiple tasks. In this case, one activity is made up of multiple tasks.
 FIG. 5 shows an example of multiple tasks that make up an activity. In this example, activity A is made up of task a, task b, and task c, and the user clears activity A by clearing the tasks in the order task a, task b, task c. For example, if three quests constitute activity A, task a may be assigned to the first quest, task b to the second quest, and task c to the third quest. If one quest is set as activity A, task a may be assigned to the first game scene, task b to the middle game scene, and task c to the last game scene. Each task constitutes one play unit. The game maker has the authority to freely set the play units, that is, the activities and tasks.
 If an activity is called a "first play unit" and a task a "second play unit," the relationship
 first play unit > second play unit
 holds between the two. Note that one task may be composed of multiple subtasks, in which case a subtask may be called a "third play unit."
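 As a purely illustrative sketch, the play-unit hierarchy could be represented with the following Python data structures; the class names and fields are assumptions and the embodiment does not mandate any particular data model.

```python
# Illustrative sketch of the play-unit hierarchy (hypothetical classes).
from dataclasses import dataclass, field
from typing import List


@dataclass
class SubTask:                     # third play unit
    subtask_id: str


@dataclass
class Task:                        # second play unit
    task_id: str
    subtasks: List[SubTask] = field(default_factory=list)


@dataclass
class Activity:                    # first play unit
    activity_id: str
    tasks: List[Task] = field(default_factory=list)


# Activity A from FIG. 5, divided by the game maker into tasks a, b and c.
activity_a = Activity("A", [Task("a"), Task("b"), Task("c")])
print([t.task_id for t in activity_a.tasks])   # ['a', 'b', 'c']
```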
 When the game program starts an activity, it outputs event information including information identifying the start event of that activity (an event ID) to the event information acquisition unit 134. When the game program ends an activity, it outputs event information including information identifying the end event of that activity (an event ID) to the event information acquisition unit 134. In addition to the event ID, the event information may include information identifying the activity (an activity ID) and may also include information indicating the result of the activity (for example, success or failure). The event ID is set to be different for each activity.
 Similarly, when the game program starts a task, it outputs event information including information identifying the start event of that task (an event ID) to the event information acquisition unit 134. When the game program ends a task, it outputs event information including information identifying the end event of that task (an event ID) to the event information acquisition unit 134. In addition to the event ID, the event information may include information identifying the task (a task ID) and may also include information indicating the result of the task (for example, success or failure). The event ID is set to be different for each task.
 When the event information acquisition unit 134 acquires event information, it generates event data by adding to the event information a user identifier identifying the user (a user account), a game identifier identifying the game (a game ID), and time information indicating when the event occurred (a timestamp), and provides the event data to the transmission processing unit 140. The game program may output event information including the game ID and/or the timestamp to the event information acquisition unit 134. The transmission processing unit 140 transmits the event data provided by the event information acquisition unit 134 to the server device 5 via the communication unit 102.
 As described above, when the reception unit 110 receives the user's operation information, it provides the information to the game execution unit 112 and also to the play data acquisition unit 122. The play data acquisition unit 122 generates play data by adding the user account, the game ID, and time information indicating the time the information was provided (a timestamp) to the operation information, and provides the play data to the transmission processing unit 140. The transmission processing unit 140 transmits the play data provided by the play data acquisition unit 122 to the server device 5 via the communication unit 102.
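 As one concrete illustration of the records described above, the event data and play data could be shaped as follows. This is a minimal sketch; the field names, value formats, and example values are assumptions, not part of the embodiment.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class EventData:
    event_id: str                 # start/end event of an activity or task
    user_account: str
    game_id: str
    timestamp: float              # when the event occurred
    activity_id: Optional[str] = None
    task_id: Optional[str] = None
    result: Optional[str] = None  # e.g. "success" or "failure"


@dataclass
class PlayData:
    operation: str                # input-device operation information
    user_account: str
    game_id: str
    timestamp: float              # when the operation was received


# Example: task a of activity A starts, the user presses a button, task a ends.
records = [
    EventData("task_a_start", "userA", "game001", 1000.0, "A", "a"),
    PlayData("button_x", "userA", "game001", 1001.5),
    EventData("task_a_end", "userA", "game001", 1300.0, "A", "a", "success"),
]
print(records)
```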
 The play data acquisition unit 122 may provide the transmission processing unit 140 with the game images (play video) generated by the game image generation unit 116 as play data, in addition to the user's operation information. In this case, the transmission processing unit 140 transmits the user's operation information and the play video as play data to the server device 5 via the communication unit 102.
 In the game system 1, the process of transmitting event data and play data is performed by all information processing devices 10 connected to the server device 5, and the server device 5 collects event data and play data for various games from the multiple information processing devices 10.
 FIG. 6 shows functional blocks of the server device 5. The server device 5 of the embodiment comprises a processing unit 200, a communication unit 202, and a recording device 230. The processing unit 200 comprises a play data acquisition unit 210, an event data acquisition unit 212, an agent generation unit 214, and an agent providing unit 216. When the play data acquisition unit 210 acquires play data transmitted from an information processing device 10, it records the play data in the play data recording unit 232. When the event data acquisition unit 212 acquires event data transmitted from an information processing device 10, it records the event data in the event data recording unit 234. When the agent generation unit 214 generates an agent that plays the game, it records the agent in the game operation model recording unit 236.
 The server device 5 is an information processing device equipped with a computer, and the computer executes a program to realize the various functions shown in FIG. 6. The computer includes, as hardware, a memory into which the program is loaded, one or more processors that execute the loaded program, an auxiliary storage device, and other LSIs. The processor is composed of multiple electronic circuits including semiconductor integrated circuits and LSIs, and the multiple electronic circuits may be mounted on a single chip or on multiple chips. The functional blocks shown in FIG. 6 are realized by cooperation between hardware and software, and therefore it will be understood by those skilled in the art that these functional blocks can be realized in various forms by hardware alone, software alone, or a combination thereof.
 In the embodiment, the play data is used as training data for training a game operation model, which is a machine learning model (or artificial intelligence model). The play data may be used as teacher data in supervised learning. As described above, the play data includes the operation information of the input device 6 operated by a player (user) for game play, a user account, a game ID, and a timestamp. The agent generation unit 214 may train the game operation model using the play data of multiple players as teacher data.
 Considering game operation by the user, during game play the user recognizes the game situation by watching the game video (play video) and operates the input device 6 according to the game situation to progress through the game. Therefore, when the play data includes the play video rendered in response to the operation information of the input device 6, the game operation model may be trained so that, when the game video (play video) serving as teacher data is input, it outputs operation information of the input device 6 (or game commands). When game video is used as the input of the game operation model, the game operation model (machine learning model) may be a CNN (Convolutional Neural Network), corresponding to a multi-layer neural network including an input layer, one or more convolutional layers, and an output layer, and the agent generation unit 214 may learn the coupling coefficients (weights) of the CNN using a learning method such as deep learning.
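 One possible shape of such a game operation model and one supervised-learning step are sketched below in PyTorch. The framework, the 84×84 input resolution, the layer sizes, and the set of sixteen output commands are assumptions made for illustration; the embodiment does not fix any of them.

```python
import torch
import torch.nn as nn


class GameOperationModel(nn.Module):
    """CNN mapping an 84x84 RGB play-video frame to scores over input-device operations."""

    def __init__(self, num_commands: int = 16):
        super().__init__()
        self.features = nn.Sequential(            # input layer + convolutional layers
            nn.Conv2d(3, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, stride=1), nn.ReLU(),
        )
        self.head = nn.Sequential(                 # output layer over game commands
            nn.Flatten(),
            nn.Linear(64 * 7 * 7, 512), nn.ReLU(),
            nn.Linear(512, num_commands),
        )

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, 3, 84, 84) play-video frames; returns (batch, num_commands)
        return self.head(self.features(frames))


# One supervised-learning step: play video as input, recorded operations as teacher data.
model = GameOperationModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

frames = torch.randn(8, 3, 84, 84)                # stand-in for play-video frames
operations = torch.randint(0, 16, (8,))           # stand-in for recorded commands
loss = loss_fn(model(frames), operations)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```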
 The agent generation unit 214 may improve the play accuracy of the game operation model by further applying reinforcement learning to the game operation model generated by supervised learning. Play accuracy can be evaluated using state values such as the play time of the game operation model and the remaining HP of the player character.
 In a game with relatively complicated rules, such as a role-playing game, the play content often changes from one game scene to another, and it is not easy to raise the play accuracy of an AI agent (game operation model) that clears everything from the start of the game. In the embodiment, therefore, the agent generation unit 214 generates an AI agent for each play unit of the game and raises the play accuracy of the AI agent dedicated to each play unit. As described above, the game software includes "activities," which are first play units, and "tasks," which are finer second play units. It may also include "subtasks," which are third play units finer than tasks. The agent generation unit 214 generates an AI agent (game operation model) for each such play unit and raises the play accuracy of the AI agent.
 The agent generation unit 214 refers to the timestamps included in the play data and the timestamps included in the event data to identify the play data (operation information and play video) used in each play unit (for example, an activity or a task). For example, when the agent generation unit 214 refers to player A's event data and recognizes that the start event of task a occurred at 10:00 on January 1 and the end event of task a occurred at 10:05 on January 1, it identifies player A's play data between 10:00 and 10:05 on January 1 as the play data relating to task a. By identifying the play data relating to each play unit in this way, the agent generation unit 214 can generate an AI agent for each play unit.
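 The segmentation described here could be realized, for instance, by slicing the play-data stream between a play unit's start and end timestamps. The following is a sketch under assumed record formats; the dictionary keys and example times are placeholders.

```python
from typing import Dict, List, Tuple


def slice_play_data(events: List[Dict], play_data: List[Dict]) -> Dict[Tuple[str, str], List[Dict]]:
    """Group play-data records by (player, play unit) using event timestamps."""
    segments: Dict[Tuple[str, str], List[Dict]] = {}
    for start in (e for e in events if e["kind"] == "start"):
        # Find the matching end event for the same player and play unit.
        end = next((e for e in events
                    if e["kind"] == "end"
                    and e["unit"] == start["unit"]
                    and e["user"] == start["user"]
                    and e["t"] >= start["t"]), None)
        if end is None:
            continue
        key = (start["user"], start["unit"])
        segments[key] = [p for p in play_data
                         if p["user"] == start["user"] and start["t"] <= p["t"] <= end["t"]]
    return segments


events = [
    {"kind": "start", "unit": "task_a", "user": "playerA", "t": 600},   # 10:00
    {"kind": "end",   "unit": "task_a", "user": "playerA", "t": 605},   # 10:05
]
play_data = [
    {"user": "playerA", "t": 601, "op": "right"},
    {"user": "playerA", "t": 604, "op": "attack"},
    {"user": "playerA", "t": 610, "op": "menu"},   # outside task a, excluded
]
print(slice_play_data(events, play_data))   # only the 10:00-10:05 records remain
```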
 For example, activity A shown in FIG. 5 is made up of task a, task b, and task c. If the agent generation unit 214 generates both an AI agent specialized in the activity and AI agents specialized in the tasks, the play accuracy of the AI agents specialized in the tasks, which are the smaller play units, is expected to be higher. For this reason, it is preferable for the agent generation unit 214 to generate AI agents specialized in tasks, which are the smaller play units. For activity A, therefore, the agent generation unit 214 generates AI agents specialized in task a, task b, and task c, respectively.
 The agent generation unit 214 evaluates the play accuracy of each AI agent using an evaluation index such as play time. For example, if an AI agent's play time is longer than a target play time, the agent generation unit 214 determines that the play accuracy of that AI agent is low. If the play accuracy does not meet a predetermined standard, the agent generation unit 214 continues supervised learning and/or reinforcement learning of that AI agent in order to raise its play accuracy.
 The agent generation unit 214 records an AI agent (game operation model) whose play accuracy meets the predetermined standard in the game operation model recording unit 236. The AI agent recorded in the game operation model recording unit 236 is provided to the user before the start of the game and is used, for example, to support the user's game play.
 The agent generation unit 214 may also generate an AI agent specialized in activity A. When play accuracy is evaluated by play time, if the time the AI agent specialized in activity A takes to play activity A (that is, task a, task b, and task c) is shorter than the total of the time the AI agent specialized in task a takes to play task a, the time the AI agent specialized in task b takes to play task b, and the time the AI agent specialized in task c takes to play task c, the play accuracy of the AI agent specialized in activity A may be evaluated as higher. In this case, the agent generation unit 214 may record the AI agent specialized in activity A in the game operation model recording unit 236.
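 The evaluation and comparison above amount to simple rules over measured play times, for example as sketched below; the target time and the measured values are placeholders, not data from the embodiment.

```python
from typing import List


def meets_standard(play_time: float, target_time: float) -> bool:
    """An agent's play accuracy is judged sufficient if it plays no slower than the target."""
    return play_time <= target_time


def prefer_activity_agent(activity_time: float, task_times: List[float]) -> bool:
    """Prefer the activity-specialized agent when it beats the task agents' total time."""
    return activity_time < sum(task_times)


# Placeholder measurements: activity-A agent vs. agents for tasks a, b and c.
print(meets_standard(play_time=290.0, target_time=300.0))                             # True
print(prefer_activity_agent(activity_time=280.0, task_times=[110.0, 95.0, 100.0]))    # True
```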
 As shown in FIG. 5, when an activity is made up of multiple tasks, the agent generation unit 214 generates an AI agent dedicated to the activity and AI agents dedicated to the tasks, and by comparing the play accuracy of the two in advance, the user can use whichever AI agent is superior. In this way, the server device 5 generates dedicated AI agents specialized in particular play units from the play data of multiple players.
 Returning to FIG. 3, the game execution unit 112 executes the game based on information obtained by the user operating the input device 6. Before the game execution unit 112 starts the game, the agent acquisition unit 130 acquires the agents generated for that game from the server device 5. This allows the user to have an agent play the game in their place partway through the game. The agent acquisition unit 130 records the acquired agents (game operation models) in the auxiliary storage device 2.
 As described above, the server device 5 records agents whose play accuracy meets the predetermined standard in the game operation model recording unit 236, but does not record agents whose play accuracy does not meet the predetermined standard. Therefore, within a single game there are play units for which an agent has been generated and play units for which no agent has been generated. Consequently, the agent acquisition unit 130 cannot acquire agents for all play units of the game, and there are play units for which no agent can be acquired.
 Therefore, the notification unit 136 may notify the user of the play units (activities or tasks) that an agent can play.
 FIG. 7 shows an example of a notification area 150 displayed on the output device 4. When the user performs a predetermined operation during game play, the notification unit 136 generates information for notifying the user of a list of recommended activities and whether AI play is available for each. When the notification information generated by the notification unit 136 is provided to the output processing unit 120, the output processing unit 120 displays the notification area 150 including this information superimposed on the game image. The output processing unit 120 may display the notification area 150 on the output device 4 not only during game play. The notification area 150 shown in FIG. 7 displays multiple activities recommended for play; here, activity A, activity B, and activity C are displayed.
 The display area 150a for activity A shows that play using an agent is possible in part of activity A. This means that, for example, if activity A is made up of three tasks, an agent cannot be used in all three tasks but can be used in at least one task.
 The display area 150b for activity B shows that play using an agent is not possible in activity B. The user therefore has to play activity B by themselves, without relying on an agent.
 The display area 150c for activity C shows that play using an agent is possible in all of activity C. This means that, for example, if activity C is made up of four tasks, an agent can be used in all four tasks.
 Each of the display areas 150a, 150b, and 150c is configured as a GUI (Graphical User Interface), and when the user selects one of the display areas, the output processing unit 120 displays a notification area containing the group of tasks that make up the activity of that display area. Here, it is assumed that the user has selected the display area 150a.
 FIG. 8 shows an example of a notification area 152 displayed on the output device 4. When the notification unit 136 generates a list of the tasks included in activity A and information for notifying whether AI play is available, and provides them to the output processing unit 120, the output processing unit 120 displays the notification area 152 including this information superimposed on the game image. The notification area 152 shown in FIG. 8 displays task a, task b, and task c included in activity A.
 The display area 152a for task a shows that play using an agent is possible in task a. The display area 152b for task b shows that play using an agent is not possible in task b. The display area 152c for task c shows that play using an agent is possible in task c.
 When the user selects the user play button 152d, which is a button GUI, in the notification area 152, the reception unit 110 provides the information on the user's operation of the input device 6 to the game execution unit 112, and the game execution unit 112 executes the game based on the user's operation information. In this case, no agent is started, and the user plays the game by themselves.
 If the user wishes to have an agent play tasks, the user may select the tasks for which agent play is desired. For example, if the user wishes agent play for task a and task c, the user presses the display area 152a and the display area 152c once each to put them into the selected state and then selects the AI play button 152e, which is a button GUI. When the AI play button 152e is selected, the game play controller 132 starts an agent, and the game execution unit 112 executes the game based on the agent's operation information. In a play unit for which no agent exists (task b), no agent can be started, so the game execution unit 112 executes the game based on the user's operation information.
 If the user wishes agent play for task a only, the user presses the display area 152a once to put it into the selected state and then selects the AI play button 152e. If the user wishes agent play for task c only, the user presses the display area 152c once to put it into the selected state and then selects the AI play button 152e. The operation of selecting play units in this way is accepted by the reception unit 110, and the game play controller 132 starts the agents corresponding to the play units selected by the user and has them play the game.
 The behavior when the user selects the AI play button 152e with task a and task c selected is described below. At this point the game play controller 132 starts the agent (game operation model) for task a. The agent, as a player, generates operation information (or game commands) for the game, and the game execution unit 112 executes the game program for task a based on the agent's operation information. In this way, when the user asks an agent to play, the game play controller 132 starts the agent and has the agent play the game in the user's place.
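 The per-task hand-off just described could be organized roughly as below. This is a sketch with hypothetical names; `agents` maps a task to its game operation model only when one exists and was selected by the user.

```python
from typing import Callable, Dict, Iterable, Optional

Operation = str
GetOperation = Callable[[str], Operation]   # returns an operation for a game situation


def play_activity(tasks: Iterable[str],
                  agents: Dict[str, GetOperation],
                  user_input: GetOperation) -> None:
    """Run each task with its selected agent if one exists, otherwise with the user."""
    for task in tasks:
        controller: Optional[GetOperation] = agents.get(task)
        if controller is not None:
            print(f"{task}: agent is playing")                    # cf. notification area 154
        else:
            print(f"{task}: please play this task yourself")      # cf. notification area 156
            controller = user_input
        operation = controller(f"situation of {task}")            # feeds game execution unit 112
        print(f"{task}: executing game with operation '{operation}'")


# Task a and task c have selected agents; task b must be played by the user.
agents = {"task a": lambda s: "agent_command", "task c": lambda s: "agent_command"}
play_activity(["task a", "task b", "task c"], agents, user_input=lambda s: "user_command")
```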
 FIG. 9 shows an example of a notification area 154 that is displayed while the agent is playing task a. During game play by the agent, the notification unit 136 generates information for notifying the user that the agent is playing and provides it to the output processing unit 120, and the output processing unit 120 displays the notification area 154 including this notification information superimposed on the game image. This allows the user to confirm that the agent is playing.
 Referring to FIG. 8, in activity A there is a dedicated agent for task a but no dedicated agent for task b. Therefore, when the agent finishes playing task a, the user has to play the following task b by themselves.
 FIG. 10 shows an example of a notification area 156 that is displayed when the agent finishes playing task a. The notification unit 136 generates information for notifying the user that play by the agent is ending and provides it to the output processing unit 120, and the output processing unit 120 displays the notification area 156 including this notification information superimposed on the game image. This allows the user to recognize that they need to play and to start operating the input device 6.
 When the user finishes playing task b, the game play controller 132 starts the agent (game operation model) for task c. The agent generates operation information for the game, and the game execution unit 112 executes the game program for task c based on the agent's operation information. At this time, the output processing unit 120 displays the notification area 154 shown in FIG. 9 superimposed on the game image.
 In the example above, the user selects an activity from the notification area 150 and asks an agent to play the game, but the output processing unit 120 may also present the user with notification information indicating that an agent is available.
 FIG. 11 shows an example of a notification area 158 that is displayed when agent play is available. When a scene of a play unit for which an agent has been generated is displayed, the notification unit 136 generates notification information indicating that the agent can play and provides it to the output processing unit 120, and the output processing unit 120 displays the notification area 158 including this notification information superimposed on the game image. This allows the user to recognize that an agent is available. For example, when the user selects the notification area 158, the game play controller 132 may start the agent and begin play by the agent.
 While the agent is playing, the user can return the right to operate the game to themselves by performing a predetermined operation. For example, the user may return the operation right to themselves by displaying the notification area 152 and selecting the user play button 152d. Alternatively, when the user operates the input device 6 during play by the agent, the operation right of the game may be returned to the user.
 Conversely, when the user holds the operation right but leaves the input device 6 unoperated for a predetermined time, control may be performed so that the operation right of the game is transferred to the agent. When the user cannot complete a task successfully, the notification unit 136 may generate notification information indicating that an agent is available, and the output processing unit 120 may display this notification information superimposed on the game image. By suggesting the use of an agent when the user's progress in the game has stalled, it becomes possible to support the user's play.
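 One way to decide who holds the operation right at any moment, reflecting the hand-over rules above, is sketched below; the timeout value is an illustrative assumption and is not specified by the embodiment.

```python
IDLE_TIMEOUT_SEC = 60.0   # assumed threshold for handing control to the agent


def holder_of_operation_right(user_pressed_button: bool,
                              seconds_since_last_input: float,
                              agent_available: bool,
                              current_holder: str) -> str:
    """Return "user" or "agent" according to the hand-over rules in the text."""
    if user_pressed_button:
        return "user"                      # any user input takes the operation right back
    if (current_holder == "user"
            and agent_available
            and seconds_since_last_input >= IDLE_TIMEOUT_SEC):
        return "agent"                     # an idle user hands the right to the agent
    return current_holder


print(holder_of_operation_right(False, 75.0, True, "user"))    # -> "agent"
print(holder_of_operation_right(True, 0.0, True, "agent"))     # -> "user"
```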
 The present disclosure has been described above based on an embodiment. This embodiment is an example, and those skilled in the art will understand that various modifications are possible in the combinations of its components and processing processes, and that such modifications are also within the scope of the present disclosure. In the embodiment, the agent is used as a player in place of the user, but the agent can also be used, for example, to generate help videos (hint videos), and can also be used for NPCs.
 In the embodiment, the game execution unit 112 is provided in the information processing device 10, but in a modification the game execution unit 112 may be provided in the server device 5, and the game system 1 may provide a cloud gaming service. In this case, the server device 5 may also include the game play controller 132, and the server device 5 may function like the information processing device 10.
 When training agents in the server device 5, the play accuracy required of an agent only needs to be moderately high and does not need to be extremely high. For example, the required play accuracy may be at the same level as that of a player with average skill. In other words, the play accuracy may be at a level a human can achieve, and high-speed play impossible for a human need not be required.
 The embodiment showed an example in which learning for task b did not go well and the play accuracy of the agent for task b could not be raised. This is thought to be because task b was not appropriate as a play unit for learning.
 Therefore, when developing game software, it is preferable for the game maker to use an analysis device that can propose dividing play units into lengths that are easy to learn. For example, when a game developer test-plays a certain play unit (for example, a task) in game software under development, the analysis device may judge the ease of learning from the operation information and play video of the test play.
 Specifically, the analysis device analyzes the operation information and play video of one play unit and, when the context (play content) changes, proposes dividing that play unit.
 When the operation information is analyzed in time series, if there are long periods without operation or menu display operations are performed frequently, the analysis device may determine that the play unit is long and unsuitable for learning and propose dividing the play unit.
 When the play video is analyzed in time series, if a large scene change exists, the analysis device may determine that the play unit contains multiple kinds of play content and propose dividing the play unit.
 In this way, when the analysis device detects a change in play content from the operation information and play video of one provisionally set play unit, it may propose to the game maker that the play unit be divided.
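 For illustration, the three checks just described could be combined as follows. This is a minimal sketch; the thresholds, the record format, and the frame-dissimilarity measure are assumptions and the embodiment does not specify how the analysis device computes them.

```python
from typing import Dict, List


def propose_split(ops: List[Dict], frame_diffs: List[float],
                  max_idle: float = 30.0, max_menu_ops: int = 5,
                  scene_change_threshold: float = 0.8) -> List[str]:
    """Return the reasons, if any, for proposing that a provisional play unit be divided.

    ops:         time-ordered records like {"t": seconds, "op": "menu" | "move" | ...}
    frame_diffs: per-frame dissimilarity of the play video, normalized to [0, 1]
    """
    reasons: List[str] = []
    # Long periods without operation suggest the unit is too long for learning.
    gaps = [b["t"] - a["t"] for a, b in zip(ops, ops[1:])]
    if gaps and max(gaps) > max_idle:
        reasons.append("long idle period in operation log")
    # Frequent menu operations likewise suggest the unit should be shortened.
    if sum(1 for o in ops if o["op"] == "menu") > max_menu_ops:
        reasons.append("frequent menu operations")
    # A large scene change suggests the unit mixes more than one kind of play content.
    if any(d > scene_change_threshold for d in frame_diffs):
        reasons.append("large scene change in play video")
    return reasons


ops = [{"t": 0.0, "op": "move"}, {"t": 45.0, "op": "menu"}, {"t": 46.0, "op": "attack"}]
print(propose_split(ops, frame_diffs=[0.1, 0.2, 0.9]))
# -> ['long idle period in operation log', 'large scene change in play video']
```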
 The present disclosure can be used in technology in which an agent plays a game on behalf of a user.
1: game system, 5: server device, 6: input device, 10: information processing device, 100: processing unit, 102: communication unit, 110: reception unit, 112: game execution unit, 114: game image and sound generation unit, 116: game image generation unit, 118: game sound generation unit, 120: output processing unit, 122: play data acquisition unit, 130: agent acquisition unit, 132: game play controller, 134: event information acquisition unit, 136: notification unit, 140: transmission processing unit, 200: processing unit, 202: communication unit, 210: play data acquisition unit, 212: event data acquisition unit, 214: agent generation unit, 216: agent providing unit, 230: recording device, 232: play data recording unit, 234: event data recording unit, 236: game operation model recording unit.

Claims (8)

  1.  An information processing device comprising one or more processors having hardware, wherein
      the one or more processors
      accept user operation information,
      execute a game based on the operation information,
      notify a user of a play unit that an agent is able to play, and
      cause the agent to play the play unit in place of the user.
  2.  The information processing device according to claim 1, wherein
      the one or more processors
      accept an operation of selecting a play unit that the agent is able to play, and
      cause the agent to play the selected play unit.
  3.  The information processing device according to claim 1, wherein
      the one or more processors
      display, during play by the agent, information indicating that the agent is playing.
  4.  The information processing device according to claim 2, wherein
      the one or more processors
      display, when the agent becomes unable to play, information indicating that play by the agent will be stopped.
  5.  The information processing device according to claim 1, wherein
      the one or more processors
      display, when the agent is able to play during play by the user, information indicating that the agent is able to play.
  6.  The information processing device according to claim 1, wherein
      an agent is generated for each play unit, and
      the one or more processors
      cause a different agent to play each play unit.
  7.  A method of playing a game in an information processing device, comprising:
      accepting user operation information;
      executing a game based on the operation information;
      notifying a user of a play unit that an agent is able to play; and
      causing the agent to play the play unit in place of the user.
  8.  A program for causing a computer to realize:
      a function of accepting user operation information;
      a function of executing a game based on the operation information;
      a function of notifying a user of a play unit that an agent is able to play; and
      a function of causing the agent to play the play unit in place of the user.
PCT/JP2023/004907 2023-02-14 2023-02-14 Information processing device and game play method WO2024171277A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/004907 WO2024171277A1 (en) 2023-02-14 2023-02-14 Information processing device and game play method

Publications (1)

Publication Number Publication Date
WO2024171277A1 true WO2024171277A1 (en) 2024-08-22

Family

ID=92420954

Country Status (1)

Country Link
WO (1) WO2024171277A1 (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012143408A (en) * 2011-01-12 2012-08-02 Square Enix Co Ltd Network game system, game device, and program
JP2012205749A (en) * 2011-03-29 2012-10-25 Konami Digital Entertainment Co Ltd Game system, control method of game system and program
JP2016526952A (en) * 2013-05-30 2016-09-08 エンパイア テクノロジー ディベロップメント エルエルシー Control a multiplayer online role-playing game
JP2016154708A (en) * 2015-02-25 2016-09-01 株式会社コロプラ Game program with automatic control function
JP2019520154A (en) * 2016-06-30 2019-07-18 株式会社ソニー・インタラクティブエンタテインメント Control mode to play a specific task during gaming application
JP2021534931A (en) * 2018-11-05 2021-12-16 ソニー・インタラクティブエンタテインメント エルエルシー Training of artificial intelligence (AI) model using cloud gaming network
JP2020130474A (en) * 2019-02-15 2020-08-31 株式会社コナミデジタルエンタテインメント Game system, computer program used therefor, and control method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
FUN FUN SHOGI: "Shogi wars Gameplay English Commentary for beginners #7 【Power of KISHIN (God of Shogi)】", 27 February 2021 (2021-02-27), XP093201358, Retrieved from the Internet <URL:https://www.youtube.com/watch?v=qIY8yrqQsLY> *
