WO2022137376A1 - Method, computer-readable medium, and information processing device - Google Patents


Info

Publication number
WO2022137376A1
WO2022137376A1 (PCT/JP2020/048107)
Authority
WO
WIPO (PCT)
Prior art keywords
game
user
progress
character
user terminal
Prior art date
Application number
PCT/JP2020/048107
Other languages
French (fr)
Japanese (ja)
Inventor
馬場 功淳 (Naruatsu Baba)
Original Assignee
株式会社コロプラ (COLOPL, Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社コロプラ (COLOPL, Inc.)
Priority to JP2022570845A (national-phase publication JPWO2022137376A1)
Priority to PCT/JP2020/048107
Publication of WO2022137376A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/85 Providing additional services to players
    • A63F 13/86 Watching games played by other players
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/85 Providing additional services to players
    • A63F 13/87 Communicating with other players during game play, e.g. by e-mail or chat

Definitions

  • This disclosure relates to methods, computer-readable media, and information processing devices.
  • Non-Patent Document 1 discloses a romance simulation game whose main purpose is to virtually deepen friendship with a girl character. The user selects the most suitable action for the character from the presented options, and the story progresses through the character's reactions to the selected actions.
  • In the game of Non-Patent Document 1, the character's response patterns are prepared in advance. The character's response to the user's input operation is selected from these patterns and output, and the game progresses accordingly. The variation in the character's behavior therefore never extends beyond the data prepared in advance, so the user cannot feel, in the relationship with the character, the reality of the character existing in the real world, and eventually tires of the game. Generally, in a game developed with the intention of having users play for a long time, how to keep users from tiring of the game is an important issue, and games are always required to provide compelling content that motivates users to play. For example, if a character appearing in a game has a strong sense of reality, the user can more easily immerse himself or herself in the game world and find interest in the relationship with the character.
  • One aspect of the present disclosure therefore aims to enhance the sense of immersion in the game world and to make the game more engaging.
  • One aspect of the present disclosure is a method performed by a computer serving as a user terminal that includes a processor, a memory, a display unit, and an operation unit. The user terminal is configured to be able to communicate via a network with an external device that exists in a space physically separated from the space in which the user terminal exists, a space that cannot be seen by the user of the user terminal, and that controls a character in response to input from a performer, different from the user, who plays at least one character appearing in the game.
  • The method includes: a step in which the processor advances the game in accordance with the user's input operation input to the computer via the operation unit; a step of sequentially transmitting to the external device progress information for making it possible to display a game screen of the game in progress, the game screen being the screen displayed on the display unit or a simulated screen obtained by simplifying that screen;
  • a step of receiving from the external device voice data of a voice that the performer utters, at an arbitrary timing, toward the game screen displayed in real time and that the user cannot hear directly, the voice data having been acquired by the external device, which displays the game screen of the game in progress in real time so that the performer can see it, based on the sequentially transmitted progress information;
  • a step of operating the character, triggered by the reception of the voice data, while the game is in progress; and a step of, after the end of the game, receiving the voice data again from the outside in response to a request to watch the progressed game, and re-operating the character based on that voice data.
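  • The claimed flow on the user-terminal side can be illustrated in code. The following Python sketch is illustrative only, not the disclosed implementation: all names (UserTerminal, advance_game, poll_voice, and so on) are invented, and the network transport to the external device is stubbed with in-memory queues.

        # Sketch of the claimed user-terminal flow (hypothetical names;
        # the network transport is stubbed with in-memory queues).
        import queue
        import time

        class UserTerminal:
            def __init__(self, to_device: queue.Queue, from_device: queue.Queue):
                self.to_device = to_device      # progress information -> external device
                self.from_device = from_device  # voice data <- external device
                self.received_voice = []        # kept here for the replay step

            def advance_game(self, user_input: str) -> None:
                """Advance the game per the user's input operation and sequentially
                transmit progress information that lets the external device display
                (or simulate) the game screen for the performer."""
                progress_info = {"screen": "in_progress", "last_input": user_input,
                                 "t": time.time()}
                self.to_device.put(progress_info)

            def poll_voice(self) -> None:
                """Reception of voice data is the trigger: the character is operated
                (made to speak) as soon as the data arrives."""
                try:
                    voice = self.from_device.get_nowait()
                except queue.Empty:
                    return
                self.received_voice.append(voice)
                self.operate_character(voice)

            def operate_character(self, voice: dict) -> None:
                print(f"character speaks: {voice['text']!r}")

            def replay(self) -> None:
                """After the game ends, re-operate the character from the same voice
                data in response to a viewing request (in the claim, the voice data
                is received again from the outside; it is kept locally here)."""
                for voice in self.received_voice:
                    self.operate_character(voice)

        to_device, from_device = queue.Queue(), queue.Queue()
        terminal = UserTerminal(to_device, from_device)
        terminal.advance_game("swing_racket")    # progress info goes out
        from_device.put({"text": "Nice shot!"})  # performer speaks at any time
        terminal.poll_voice()                    # trigger: character speaks now
        terminal.replay()                        # watching the finished game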
  • Another aspect of the present disclosure is a computer-readable medium containing computer-executable instructions that, when executed, cause the processor to perform the steps included in the above method.
  • Another aspect of the present disclosure is a method performed by a first user terminal, including a first processor and a first memory, of a first user other than at least one second user who viewed the real-time distribution of a game performed by a performer. The method includes: a step of receiving from the outside, after the real-time distribution of the game has ended, recorded voice data related to the game, namely voice data for controlling at least one character appearing in the game; and a step of executing a reproduction of the distributed game by operating the character based on the voice data.
  • The distributed game is a game generated by a second user terminal, including a second processor, a second memory, a display unit, and an operation unit, executing: advancement of the game; sequential transmission of progress information for making it possible to display a game screen of the game in progress, the game screen including the screen displayed on the display unit or a simulated screen simplified from that screen; reception from the external device of voice data of a voice that is uttered at an arbitrary timing and that the second user cannot hear directly; and, triggered by the reception of the voice data, operation of a character appearing during the progress of the game by causing the character to speak at least the content of the voice data.
  • Another aspect of the present disclosure is a computer-readable medium containing computer-executable instructions that, when executed, cause the first processor to perform the steps included in the above method.
  • Another aspect of the present disclosure is an information processing device serving as a user terminal that includes a processor, a memory, a display unit, and an operation unit. The user terminal is configured to be able to communicate via a network with an external device that exists in a space physically separated from the space in which the user terminal exists, a space that cannot be seen by the user of the user terminal, and that controls a character in response to input from a performer, different from the user, who plays at least one character appearing in the game.
  • By reading the program stored in the memory, the processor: advances the game in accordance with the user's input operation input to the computer via the operation unit; sequentially transmits to the external device, as information indicating the progress of the game, progress information for making it possible to display a game screen of the game in progress, the game screen being the screen displayed on the display unit or a simulated screen obtained by simplifying that screen; receives from the external device voice data of a voice that the performer utters at an arbitrary timing toward the game screen, which the external device displays in real time so that the performer can see it based on the sequentially transmitted progress information, and that the user cannot hear directly; and operates the character, triggered by the reception of the voice data.
  • Another aspect of the present disclosure is an information processing device serving as a first user terminal, including a first processor and a first memory, of a first user other than at least one second user who viewed the real-time distribution of a game performed by a performer. The first processor is configured to receive from the outside, after the real-time distribution of the game has ended, recorded voice data related to the game, namely voice data for controlling at least one character appearing in the game, and to execute a reproduction of the distributed game by operating the character based on the voice data.
  • The distributed game is generated by a process executed by a computer serving as a second user terminal, of a second user, that includes a second processor, a second memory, a display unit, and an operation unit. The second user terminal is configured to be able to communicate via a network with an external device that exists in a space physically separated from the space in which the second user terminal exists, a space that cannot be seen by the second user, and that controls the character in response to input from the performer who plays the character.
  • By reading the program stored in the second memory, the second processor: advances the game in accordance with the second user's input operation input to the computer via the operation unit; sequentially transmits to the external device, as information indicating the progress of the game, progress information for making it possible to display a game screen of the game in progress, the game screen including the screen displayed on the display unit or a simulated screen that simplifies that screen; receives from the external device voice data of a voice that the performer utters at an arbitrary timing toward the game screen, which the external device displays in real time so that the performer can see it based on the sequentially transmitted progress information, and that the second user cannot hear directly; and, triggered by the reception of the voice data, operates a character appearing during the progress of the game by causing the character to speak at least the content of the voice data. The distributed game is the game thus generated, and the present aspect is an information processing device that reproduces it.
  • Another aspect of the present disclosure is a method for controlling a character appearing in a game, performed by a computer that includes a processor, a memory, and a display unit, that controls the character in response to input from a performer who plays at least one character appearing in the game, that exists in a space physically separated from the space in which the user terminal of a user different from the performer exists, a space in which the user cannot see the performer, and that is able to communicate with the user terminal via a network.
  • The method includes: a step in which the processor displays in real time, on the display unit of the computer so that the performer can see it, a game screen of the game in progress based on progress information sequentially transmitted from the user terminal that advances the game, the game screen including the screen displayed on the display unit of the user terminal or a simulated screen simplified from that screen; a step of accepting a voice that the performer utters, at an arbitrary timing, toward the game screen and that the user cannot hear directly; a step of transmitting voice data of the accepted voice to the user terminal; a step of, after the end of the game, in response to a request from the user terminal, transmitting the voice data to the user terminal in order to cause the character to speak the content of the voice data on the user terminal as at the end of the game;
  • and a step of transmitting the voice data to a terminal other than the user terminal in order to cause the character to speak the content of the voice data on that terminal as at the end of the game.
  • Another aspect of the present disclosure is a computer-readable medium containing computer-executable instructions that, when executed, cause the processor to perform the steps included in the above method.
  • Another aspect of the present disclosure is an information processing device for controlling a character appearing in a game. The information processing device includes a processor, a memory, and a display unit; it is a computer that controls the character in response to input from a performer who plays at least one character appearing in the game, exists in a space physically separated from the space in which the user terminal of a user different from the performer exists, a space in which the user cannot see the performer, and is configured to be able to communicate with the user terminal via a network.
  • By reading the program stored in the memory, the processor: displays in real time, on the display unit of the computer so that the performer can see it, a game screen of the game in progress based on progress information sequentially transmitted from the user terminal that advances the game, the game screen including the screen displayed on the display unit of the user terminal or a simulated screen simplified from that screen; accepts a voice that the performer utters, at an arbitrary timing, toward the game screen displayed in real time and that the user cannot hear directly; transmits voice data of the accepted voice to the user terminal; and, after the end of the game, in response to a request from the user terminal, transmits the voice data so as to cause the character to speak the content of the voice data on the user terminal as at the end of the game.
  • According to one aspect of the present disclosure, the effect of making a game more engaging is obtained.
  • The game system according to the present disclosure is a system for providing a game to a plurality of users.
  • The game system will be described below with reference to the drawings. The present invention is not limited to these examples; it is defined by the scope of the claims, and all modifications within the meaning and scope equivalent to the claims are intended to be included in the present invention. In the following description, the same elements are designated by the same reference numerals in the description of the drawings, and duplicate descriptions are not repeated.
  • FIG. 1 is a diagram showing a hardware configuration of the game system 1.
  • The game system 1 includes a plurality of user terminals 100 and a server 200. Each user terminal 100 connects to the server 200 via the network 2.
  • The network 2 is composed of the Internet and various mobile communication systems constructed from radio base stations (not shown). Examples of these mobile communication systems include so-called 3G and 4G mobile communication systems, LTE (Long Term Evolution), and wireless networks (for example, Wi-Fi (registered trademark)) that can be connected to the Internet through a predetermined access point.
  • The server 200 (computer, information processing device) may be a general-purpose computer such as a workstation or a personal computer.
  • The server 200 includes a processor 20, a memory 21, a storage 22, a communication IF 23, and an input/output IF 24. These components of the server 200 are electrically connected to each other by a communication bus.
  • The user terminal 100 may be a mobile terminal such as a smartphone, a feature phone, a PDA (Personal Digital Assistant), or a tablet computer.
  • The user terminal 100 may be a game device suitable for game play.
  • The user terminal 100 includes a processor 10, a memory 11, a storage 12, a communication interface (IF) 13, an input/output IF 14, a touch screen 15 (display unit), a camera 17, and a distance measuring sensor 18.
  • These components included in the user terminal 100 are electrically connected to each other by a communication bus.
  • The user terminal 100 may be provided with an input/output IF 14 to which a display (display unit) configured separately from the user terminal 100 main body can be connected, in place of or in addition to the touch screen 15.
  • The user terminal 100 may be configured to be able to communicate with one or more controllers 1020.
  • The controller 1020 establishes communication with the user terminal 100 according to a communication standard such as Bluetooth (registered trademark).
  • The controller 1020 may have one or more buttons or the like, and transmits output values based on the user's input operations on the buttons or the like to the user terminal 100.
  • The controller 1020 may have various sensors such as an acceleration sensor and an angular velocity sensor, and transmits the output values of these sensors to the user terminal 100.
  • The controller 1020 may have the camera 17 and the distance measuring sensor 18.
  • The user terminal 100 has the user who uses the controller 1020 input user identification information, such as the user's name or login ID, via the controller 1020, for example at the start of the game.
  • The user terminal 100 can thereby associate the controller 1020 with the user, and can identify which user a received output value belongs to based on its source (the controller 1020).
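  • As a rough illustration of this association, the terminal can key each received output value by the controller that sent it; the following Python sketch uses invented IDs and function names.

        # Hypothetical sketch: associate controllers with users so that each
        # received output value can be attributed to the right player.
        controller_to_user: dict[str, str] = {}

        def register(controller_id: str, login_id: str) -> None:
            # called when a user enters a name / login ID via the controller
            controller_to_user[controller_id] = login_id

        def on_output_value(controller_id: str, value: dict) -> None:
            user = controller_to_user.get(controller_id, "<unregistered>")
            print(f"input {value} attributed to user {user!r}")

        register("ctrl-A", "alice")
        register("ctrl-B", "bob")
        on_output_value("ctrl-A", {"button": "X"})  # -> attributed to 'alice'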
  • When the user terminal 100 communicates with a plurality of controllers 1020, each user grips one controller 1020, so that multiplayer can be realized on the single user terminal 100 without communicating with other devices such as the server 200 via the network 2.
  • Each user terminal 100 may also communicate with other user terminals 100 according to a wireless standard such as the wireless LAN (Local Area Network) standard (connecting without going through the server 200), thereby realizing local multiplayer across a plurality of user terminals 100.
  • When the above-mentioned multiplayer is realized locally on a single user terminal 100, the user terminal 100 may further include at least a part of the various functions, described later, of the server 200. Further, even when the above-mentioned multiplayer is realized locally by a plurality of user terminals 100, the user terminal 100 may communicate with the server 200.
  • For example, information indicating a play result, such as a score or a win or loss in a certain game, may be associated with user identification information and transmitted to the server 200.
  • The controller 1020 may be configured to be detachable from the user terminal 100.
  • In this case, a coupling portion for the controller 1020 may be provided on at least one surface of the housing of the user terminal 100.
  • The user terminal 100 may accept the attachment of a storage medium 1030, such as an external memory card, via the input/output IF 14. The user terminal 100 can thereby read programs and data recorded on the storage medium 1030.
  • The program recorded on the storage medium 1030 is, for example, a game program.
  • The user terminal 100 may store a game program acquired by communicating with an external device such as the server 200 in the memory 11 of the user terminal 100, or may store a game program acquired by reading it from the storage medium 1030 in the memory 11.
  • As described above, the user terminal 100 includes the communication IF 13, the input/output IF 14, the touch screen 15, the camera 17, and the distance measuring sensor 18 as examples of mechanisms for inputting information to the user terminal 100.
  • Each of the above-mentioned parts serving as an input mechanism can be regarded as an operation unit configured to accept the user's input operations.
  • When the operation unit is configured by at least one of the camera 17 and the distance measuring sensor 18, the operation unit detects an object 1010 in the vicinity of the user terminal 100 and identifies an input operation from the detection result.
  • For example, the user's hand, a marker having a predetermined shape, or the like is detected as the object 1010, and an input operation is identified based on the color, shape, movement, or type of the object 1010 obtained as the detection result.
  • More specifically, when the user's hand is detected from a captured image of the camera 17, the user terminal 100 identifies and accepts a gesture (a series of movements of the user's hand) detected from the captured image as the user's input operation.
  • The captured image may be a still image or a moving image.
  • When the operation unit is configured by the touch screen 15, the user terminal 100 identifies and accepts the user's operation performed on the input unit 151 of the touch screen 15 as the user's input operation.
  • When the operation unit is configured by the communication IF 13, the user terminal 100 identifies and accepts a signal (for example, an output value) transmitted from the controller 1020 as the user's input operation.
  • When the operation unit is configured by the input/output IF 14, a signal output from an input device (not shown) other than the controller 1020 connected to the input/output IF 14 is identified and accepted as the user's input operation.
  • The game system 1 further includes an operation instruction device 300.
  • The operation instruction device 300 connects to each of the server 200 and the user terminals 100 via the network 2. At least one operation instruction device 300 is provided in the game system 1.
  • A plurality of operation instruction devices 300 may be provided depending on the number of user terminals 100 that use the service provided by the server 200.
  • One operation instruction device 300 may be provided for one user terminal 100.
  • One operation instruction device 300 may be provided for a plurality of user terminals 100.
  • The operation instruction device 300 may be a computer such as a server, a desktop personal computer, a laptop computer, or a tablet, or a computer group combining these.
  • The operation instruction device 300 includes a processor 30, a memory 31, a storage 32, a communication IF 33, an input/output IF 34, and a touch screen 35 (display unit). These components included in the operation instruction device 300 are electrically connected to each other by a communication bus.
  • The operation instruction device 300 may include an input/output IF 34 to which a display (display unit) configured separately from the operation instruction device 300 main body can be connected, in place of or in addition to the touch screen 35.
  • The operation instruction device 300 may be configured to be able to communicate, wirelessly or by wire, with peripheral devices such as one or more microphones 3010, one or more motion capture devices 3020, and one or more controllers 3030.
  • A wirelessly connected peripheral device establishes communication with the operation instruction device 300 according to a communication standard such as Bluetooth (registered trademark).
  • The microphone 3010 acquires the voice generated in its surroundings and converts it into an electric signal.
  • The voice converted into an electric signal is transmitted to the operation instruction device 300 as voice data and is received by the operation instruction device 300 via the communication IF 33.
  • The motion capture device 3020 tracks the motion (including facial expressions, mouth movements, and the like) of a tracking target (for example, a person) and transmits the output values, which are the tracking results, to the operation instruction device 300.
  • The motion data, which is these output values, is received by the operation instruction device 300 via the communication IF 33.
  • The motion capture method of the motion capture device 3020 is not particularly limited.
  • The motion capture device 3020 selectively includes whatever mechanisms are needed to capture motion, such as cameras, various sensors, markers, a suit worn by a model (person), and signal transmitters, depending on the method adopted.
  • The controller 3030 may have one or more physical input mechanisms such as buttons, levers, sticks, and wheels.
  • The controller 3030 transmits output values based on the input operations that the operator of the operation instruction device 300 inputs to these input mechanisms to the operation instruction device 300.
  • The controller 3030 may have various sensors such as an acceleration sensor and an angular velocity sensor, and may transmit the output values of these sensors to the operation instruction device 300.
  • The above output values are received by the operation instruction device 300 via the communication IF 33.
  • Herein, a person who operates the operation instruction device 300 using the input unit 351 or the controller 3030 is called an operator. The operator also includes a voice actor who inputs voice via the microphone 3010 and a model who inputs movement via the motion capture device 3020.
  • The operation instruction device 300 may include a camera and a distance measuring sensor (not shown).
  • Alternatively, the motion capture device 3020 and the controller 3030 may have a camera and a distance measuring sensor.
  • As described above, the operation instruction device 300 includes the communication IF 33, the input/output IF 34, and the touch screen 35 as examples of mechanisms for inputting information to the operation instruction device 300. A camera and a distance measuring sensor may further be provided as necessary.
  • Each of the above-mentioned parts serving as an input mechanism can be regarded as an operation unit configured to accept the operator's input operations.
  • The operation unit may be composed of the touch screen 35.
  • In this case, the operation instruction device 300 identifies and accepts the operation performed on the input unit 351 of the touch screen 35 as the operator's input operation.
  • When the operation unit is configured by the communication IF 33, the operation instruction device 300 identifies and accepts a signal (for example, an output value) transmitted from the controller 3030 as the operator's input operation.
  • When the operation unit is configured by the input/output IF 34, a signal output from an input device (not shown) other than the controller 3030 connected to the input/output IF 34 is identified and accepted as the operator's input operation.
  • In the game system 1, the server 200 and the user terminal 100 cooperate to execute the game program 131, and the game played by the user is advanced on the user terminal 100.
  • While the user terminal 100 executes the game, the operation instruction device 300 (character control device) executes the character control program 134 and can control the operation of at least some of the characters appearing in the game.
  • The game executed by the game system 1 is not limited to a specific genre; the system may execute a game of any genre.
  • For example, the game may be a sports game themed on tennis, table tennis, dodgeball, baseball, soccer, hockey, or the like, a puzzle game, a quiz game, an RPG, an adventure game, a shooting game, a simulation game, a training game, or an action game.
  • The game system 1 is also not limited to a specific play form; it may execute a game of any play form.
  • Examples include a single-play game by a single user, a multi-play game by a plurality of users, and, among multi-play games, a battle game in which a plurality of users play against each other and a cooperative play game in which a plurality of users cooperate.
  • The battle game may include a battle game on the subject of a sport such as tennis or baseball.
  • The battle game may include a board game in which two players play against each other, such as shogi, go, chess, or Othello.
  • The battle game may include a race game in which a plurality of users operate vehicles, players, or the like around the same course and compete on time.
  • The processor 10 controls the operation of the entire user terminal 100.
  • The processor 20 controls the operation of the entire server 200.
  • The processor 30 controls the operation of the entire operation instruction device 300.
  • The processors 10, 20, and 30 include a CPU (Central Processing Unit), an MPU (Micro Processing Unit), and a GPU (Graphics Processing Unit).
  • The processor 10 reads a program from the storage 12, described later, and expands it into the memory 11, described later.
  • The processor 20 reads a program from the storage 22, described later, and expands it into the memory 21, described later.
  • The processor 30 reads a program from the storage 32, described later, and expands it into the memory 31, described later. The processors 10, 20, and 30 then execute the expanded programs.
  • The memories 11, 21, and 31 are main storage devices.
  • The memories 11, 21, and 31 are composed of storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory).
  • The memory 11 provides a work area to the processor 10 by temporarily storing programs and various data that the processor 10 reads from the storage 12, described later.
  • The memory 11 also temporarily stores various data generated while the processor 10 operates according to a program.
  • The memory 21 provides a work area to the processor 20 by temporarily storing various programs and data that the processor 20 reads from the storage 22, described later.
  • The memory 21 also temporarily stores various data generated while the processor 20 operates according to a program.
  • The memory 31 provides a work area to the processor 30 by temporarily storing various programs and data that the processor 30 reads from the storage 32, described later.
  • The memory 31 also temporarily stores various data generated while the processor 30 operates according to a program.
  • The program may be a game program for realizing the game by the user terminal 100.
  • The program may be a game program for realizing the game through the cooperation of the user terminal 100 and the server 200.
  • The program may be a game program for realizing the game through the cooperation of the user terminal 100, the server 200, and the operation instruction device 300.
  • As an example, the game realized by the cooperation of the user terminal 100 and the server 200, and the game realized by the cooperation of the user terminal 100, the server 200, and the operation instruction device 300, may be games executed on a browser started on the user terminal 100.
  • The program may also be a game program for realizing the game through the cooperation of a plurality of user terminals 100.
  • The various data include data related to the game, such as user information and game information, and instructions or notifications transmitted and received between the devices of the game system 1.
  • The storages 12, 22, and 32 are auxiliary storage devices.
  • The storages 12, 22, and 32 are composed of storage devices such as a flash memory or an HDD (Hard Disk Drive).
  • Various data related to the game are stored in the storages 12, 22, and 32.
  • The communication IF 13 controls the transmission and reception of various data in the user terminal 100.
  • The communication IF 23 controls the transmission and reception of various data in the server 200.
  • The communication IF 33 controls the transmission and reception of various data in the operation instruction device 300.
  • The communication IFs 13, 23, and 33 control, for example, communication via a wireless LAN (Local Area Network), Internet communication via a wired LAN, a wireless LAN, or a mobile phone network, and communication using short-range wireless communication.
  • The input/output IF 14 is an interface through which the user terminal 100 accepts data input and an interface through which the user terminal 100 outputs data.
  • The input/output IF 14 may input and output data via USB (Universal Serial Bus) or the like.
  • The input/output IF 14 may include, for example, a physical button, a camera, a microphone, or a speaker of the user terminal 100.
  • The input/output IF 24 of the server 200 is an interface through which the server 200 accepts data input and an interface through which the server 200 outputs data.
  • The input/output IF 24 may include, for example, an input unit that is an information input device such as a mouse or a keyboard, and a display unit that is a device for displaying and outputting images.
  • The input/output IF 34 of the operation instruction device 300 is an interface through which the operation instruction device 300 accepts data input and an interface through which the operation instruction device 300 outputs data.
  • The input/output IF 34 may include, for example, information input devices such as a mouse, a keyboard, a stick, and a lever, a device for displaying and outputting images such as a liquid crystal display, and connections for transmitting and receiving data to and from the peripheral devices (the microphone 3010, the motion capture device 3020, and the controller 3030).
  • The touch screen 15 of the user terminal 100 is an electronic component combining an input unit 151 and a display unit 152.
  • The touch screen 35 of the operation instruction device 300 is an electronic component combining an input unit 351 and a display unit 352.
  • The input units 151 and 351 are, for example, touch-sensitive devices, and are configured by, for example, touch pads.
  • The display units 152 and 352 are configured by, for example, a liquid crystal display, an organic EL (Electro-Luminescence) display, or the like.
  • The input units 151 and 351 have a function of detecting the position at which a user's operation (mainly a physical contact operation such as a touch operation, a slide operation, a swipe operation, or a tap operation) is input on the input surface, and transmitting information indicating that position as an input signal.
  • The input units 151 and 351 may be provided with touch sensing units (not shown).
  • The touch sensing units may adopt any method, such as a capacitance method or a resistance film method.
  • The user terminal 100 may include one or more sensors for identifying the holding posture of the user terminal 100.
  • This sensor may be, for example, an acceleration sensor or an angular velocity sensor.
  • The processor 10 can identify the holding posture of the user terminal 100 from the output of the sensor and perform processing according to that posture.
  • For example, when the user terminal 100 is held vertically, the processor 10 may perform portrait-screen display, in which a vertically long image is displayed on the display unit 152.
  • Conversely, when the user terminal 100 is held horizontally, the processor 10 may perform landscape-screen display, in which a horizontally long image is displayed on the display unit. In this way, the processor 10 may switch between portrait-screen display and landscape-screen display according to the holding posture of the user terminal 100.
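  • A minimal sketch of such posture-dependent switching, assuming the sensor supplies a gravity vector whose components lie in the screen plane (the API and threshold are illustrative, not from the disclosure):

        # Illustrative: choose portrait or landscape layout from the
        # accelerometer's gravity components along the screen axes.
        def display_mode(ax: float, ay: float) -> str:
            # gravity mostly along the long (y) axis -> held vertically
            return "portrait" if abs(ay) >= abs(ax) else "landscape"

        print(display_mode(0.1, 9.7))  # held vertically   -> portrait
        print(display_mode(9.8, 0.3))  # held horizontally -> landscape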
  • The camera 17 includes an image sensor and the like, and generates a captured image by converting incident light entering from the lens into an electric signal.
  • The distance measuring sensor 18 is a sensor that measures the distance to the object to be measured.
  • The distance measuring sensor 18 includes, for example, a light source that emits pulsed light and a light receiving element that receives that light.
  • The distance measuring sensor 18 measures the distance to the object from the timing of light emission by the light source and the timing of reception of the reflected light produced when the emitted light is reflected by the object to be measured.
  • The distance measuring sensor 18 may have a light source that emits light with directivity.
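  • This is the usual time-of-flight relation: with c the speed of light and Δt the round-trip time between emission and reception of the pulse, the distance is d = c·Δt/2. A one-function sketch:

        # Time-of-flight distance: half the round trip at the speed of light.
        C = 299_792_458.0  # speed of light [m/s]

        def tof_distance_m(round_trip_s: float) -> float:
            return C * round_trip_s / 2.0

        print(tof_distance_m(6.67e-9))  # ~1.0 m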
  • The camera 17 and the distance measuring sensor 18 may be provided, for example, on the side surface of the housing of the user terminal 100.
  • The distance measuring sensor 18 may be provided in the vicinity of the camera 17.
  • As the camera 17, for example, an infrared camera can be used.
  • In this case, the camera 17 may be provided with a lighting device that emits infrared rays, a filter that blocks visible light, and the like. This makes it possible to further improve the accuracy of object detection based on the captured image of the camera 17, whether outdoors or indoors.
  • The processor 10 may perform one or more of the processes shown in (1) to (5) below on, for example, the captured image of the camera 17.
  • (1) The processor 10 performs image recognition processing on the captured image of the camera 17 to identify whether or not the captured image includes the user's hand.
  • As the analysis technique adopted in the above-mentioned image recognition processing, the processor 10 may use, for example, a technique such as pattern matching.
  • (2) The processor 10 detects the user's gesture from the shape of the user's hand.
  • For example, the processor 10 identifies the number of the user's fingers (the number of extended fingers) from the shape of the user's hand detected from the captured image.
  • The processor 10 further identifies the gesture performed by the user from the identified number of fingers.
  • For example, when the number of fingers is five, the processor 10 determines that the user has made a "par" (paper) gesture. When the number of fingers is zero (no finger is detected), the processor 10 determines that the user has made a "goo" (rock) gesture. When the number of fingers is two, the processor 10 determines that the user has made a "choki" (scissors) gesture. (3) The processor 10 performs image recognition processing on the captured image of the camera 17 to detect whether the user's finger is in a state in which only the index finger is raised, or whether the user's finger is flicked.
  • (4) The processor 10 detects the distance between an object 1010 in the vicinity of the user terminal 100 (the user's hand or the like) and the user terminal 100, based on at least one of the image recognition result for the captured image of the camera 17 and the output value of the distance measuring sensor 18.
  • For example, from the size of the shape of the user's hand identified from the captured image of the camera 17, the processor 10 detects whether the user's hand is near the user terminal 100 (for example, at a distance less than a predetermined value) or far from it (for example, at a distance equal to or greater than the predetermined value).
  • When the captured image is a moving image, the processor 10 may also detect whether the user's hand is approaching or moving away from the user terminal 100.
  • (5) Based on the image recognition result for the captured image of the camera 17, the processor 10 recognizes that the user is waving the hand in the shooting direction of the camera 17, or recognizes that the user is waving the hand in a direction orthogonal to the shooting direction of the camera.
  • In this way, the processor 10 detects, through image recognition of the captured image of the camera 17, whether the user is holding the hand closed (a "goo" gesture or another gesture such as "par"). The processor 10 also detects the shape of the user's hand, how the user is moving the hand, and whether the user is moving the hand toward or away from the user terminal 100. Such operations can correspond to operations using a pointing device such as a mouse or a touch panel. For example, the user terminal 100 moves a pointer on the touch screen 15 in response to the movement of the user's hand, and when the user's "goo" gesture is detected, the user terminal 100 recognizes that the user is continuing a selection operation.
  • The continuation of a selection operation corresponds, for example, to the state in which a mouse button is held down after being clicked, or to the state in which the touch panel continues to be touched after a touch-down operation.
  • The user terminal 100 can also recognize such a series of gestures as an operation corresponding to a swipe operation (or a drag operation).
  • When the user terminal 100 detects, from the captured image of the camera 17, a gesture in which the user flicks a finger, it may recognize the gesture as an operation corresponding to a mouse click or a tap on the touch panel.
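  • As a toy illustration of determinations (1), (2), and (4) above, assuming image recognition has already produced a finger count and an apparent hand size (all thresholds and names are invented):

        # Toy sketch: gesture and proximity determination from
        # image-recognition results; thresholds are illustrative only.
        def classify_gesture(finger_count: int) -> str:
            if finger_count == 5:
                return "par (paper)"
            if finger_count == 0:
                return "goo (rock)"
            if finger_count == 2:
                return "choki (scissors)"
            return "unknown"

        def hand_proximity(apparent_size_px: int, near_px: int = 200) -> str:
            # a larger hand in the captured image means a closer hand
            return "near" if apparent_size_px >= near_px else "far"

        print(classify_gesture(2))   # -> "choki (scissors)"
        print(hand_proximity(320))   # -> "near"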
  • FIG. 2 is a block diagram showing the functional configurations of the user terminal 100, the server 200, and the operation instruction device 300 included in the game system 1.
  • Each of the user terminal 100, the server 200, and the operation instruction device 300 may include functional configurations, not shown, that are necessary for functioning as a general computer and functional configurations necessary for realizing known functions in a game.
  • The user terminal 100 has a function as an input device that accepts the user's input operations and a function as an output device that outputs game images and sound.
  • The user terminal 100 functions as a control unit 110 and a storage unit 120 through the cooperation of the processor 10, the memory 11, the storage 12, the communication IF 13, the input/output IF 14, and the like.
  • The server 200 has a function of communicating with each user terminal 100 and supporting the user terminal 100 in advancing the game. For example, when the user terminal 100 downloads the application of this game for the first time, the server 200 provides the user terminal 100 with data to be stored in the user terminal 100 for the first start of the game. The server 200 also transmits operation instruction data for operating a character to the user terminal 100.
  • The operation instruction data may include motion capture data capturing in advance the movement of an actor such as a model, may include voice data recording the voice of an actor such as a voice actor, may include operation history data indicating a history of input operations for operating the character, or may include a motion command group in which commands associated with the above-mentioned series of input operations are arranged in chronological order.
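  • The operation instruction data can thus carry several alternative payloads. The following container type is hypothetical (the field names are invented for illustration; the disclosure does not specify a format):

        # Hypothetical container for operation instruction data; each optional
        # field corresponds to one payload enumerated above.
        from dataclasses import dataclass, field
        from typing import Optional

        @dataclass
        class OperationInstructionData:
            motion_capture: Optional[bytes] = None    # pre-captured model movement
            voice: Optional[bytes] = None             # recorded voice-actor audio
            operation_history: Optional[list] = None  # timestamped input-operation log
            motion_commands: list[str] = field(default_factory=list)  # chronological commands

        data = OperationInstructionData(voice=b"...pcm...",
                                        motion_commands=["wave", "smile"])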
  • The server 200 may have a function of communicating with each user terminal 100 participating in the game to mediate exchanges between the user terminals 100, and a synchronization control function. The server 200 also has a function of mediating between the user terminal 100 and the operation instruction device 300. As a result, the operation instruction device 300 can supply operation instruction data to a user terminal 100, or to a group of user terminals 100, in a timely manner and without mistaking the destination.
  • The server 200 functions as a control unit 210 and a storage unit 220 through the cooperation of the processor 20, the memory 21, the storage 22, the communication IF 23, the input/output IF 24, and the like.
  • The operation instruction device 300 has a function of generating operation instruction data for instructing the operation of a character on the user terminal 100 and supplying that data to the user terminal 100.
  • The operation instruction device 300 functions as a control unit 310 and a storage unit 320 through the cooperation of the processor 30, the memory 31, the storage 32, the communication IF 33, the input/output IF 34, and the like.
  • The storage units 120, 220, and 320 store a game program 131, game information 132, and user information 133.
  • The game program 131 is the game program executed by the user terminal 100, the server 200, and the operation instruction device 300.
  • The game information 132 is data that the control units 110, 210, and 310 refer to when executing the game program 131.
  • The user information 133 is data related to the user's account.
  • The storage unit 320 further stores a character control program 134.
  • The character control program 134 is a program executed by the operation instruction device 300 for controlling the operation of characters appearing in the game based on the above-mentioned game program 131.
  • The control unit 210 comprehensively controls the server 200 by executing the game program 131 stored in the storage unit 220. For example, the control unit 210 transmits various data, programs, and the like to the user terminal 100. The control unit 210 receives part or all of the game information or the user information from the user terminal 100. When the game is a multi-play game, the control unit 210 may receive a multi-play synchronization request from the user terminal 100 and transmit data for synchronization to the user terminal 100. The control unit 210 also communicates with the user terminal 100 and the operation instruction device 300 as necessary to transmit and receive information.
  • The control unit 210 functions as a progress support unit 211 and a sharing support unit 212 according to the description of the game program 131.
  • The control unit 210 can also function as other functional blocks (not shown) to support the progress of the game on the user terminal 100, depending on the nature of the game to be executed.
  • The progress support unit 211 communicates with the user terminal 100 and supports the user terminal 100 in progressing through the various parts included in this game. For example, when the user terminal 100 advances the game, the progress support unit 211 provides the user terminal 100 with information necessary for advancing the game.
  • The sharing support unit 212 communicates with a plurality of user terminals 100 and supports a plurality of users in sharing each other's decks on their respective user terminals 100. The sharing support unit 212 may also have a function of matching online user terminals 100 with the operation instruction device 300. As a result, information can be transmitted and received smoothly between the user terminal 100 and the operation instruction device 300.
  • The control unit 110 comprehensively controls the user terminal 100 by executing the game program 131 stored in the storage unit 120. For example, the control unit 110 advances the game according to the game program 131 and the user's operations. While the game is in progress, the control unit 110 also communicates with the server 200 and the operation instruction device 300 as necessary to transmit and receive information.
  • According to the description of the game program 131, the control unit 110 functions as an operation reception unit 111, a display control unit 112, a user interface (hereinafter, UI) control unit 113, an animation generation unit 114, a game progress unit 115, an analysis unit 116, and a progress information generation unit 117.
  • The control unit 110 can also function as other functional blocks (not shown) to advance the game, depending on the nature of the game to be executed.
  • The operation reception unit 111 detects and accepts the user's input operations on the input unit 151.
  • The operation reception unit 111 determines what input operation has been performed from the action the user exerts on the console via the touch screen 15 and the other input/output IF 14, and outputs the result to each element of the control unit 110.
  • For example, the operation reception unit 111 accepts an input operation on the input unit 151, detects the coordinates of the input position of the operation, and identifies the type of the input operation.
  • The operation reception unit 111 identifies, for example, a touch operation, a slide operation, a swipe operation, a tap operation, and the like as types of input operations. When a continuously detected input is interrupted, the operation reception unit 111 detects that the contact input has been released from the touch screen 15.
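  • A simplified classifier for these input-operation types, working from touch-down and touch-up coordinates and times, might look as follows (the thresholds are invented; real implementations are platform-specific):

        # Simplified touch-type classifier; thresholds are illustrative only.
        import math

        def classify_touch(x0, y0, t0, x1, y1, t1,
                           move_px=10.0, tap_s=0.3, swipe_px_s=500.0) -> str:
            dist = math.hypot(x1 - x0, y1 - y0)
            dt = max(t1 - t0, 1e-6)
            if dist < move_px:
                return "tap" if dt <= tap_s else "long touch"
            return "swipe" if dist / dt >= swipe_px_s else "slide"

        print(classify_touch(100, 100, 0.0, 102, 101, 0.1))  # -> "tap"
        print(classify_touch(100, 100, 0.0, 400, 100, 0.2))  # -> "swipe"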
  • The UI control unit 113 controls the UI objects to be displayed on the display unit 152 in order to construct the UI.
  • A UI object is a tool by which the user makes inputs necessary for the progress of the game to the user terminal 100, or a tool by which the user obtains, from the user terminal 100, information output during the progress of the game.
  • UI objects are, for example, but not limited to, icons, buttons, lists, and menu screens.
  • The animation generation unit 114 generates animations showing the motions of various objects based on the control modes of those objects. For example, the animation generation unit 114 may generate an animation expressing how a character moves as if it were actually there, moving its mouth and changing its facial expression.
  • The display control unit 112 outputs, to the display unit 152 of the touch screen 15, a game screen reflecting the results of the processing executed by each of the above elements.
  • The display control unit 112 may display a game screen including the animation generated by the animation generation unit 114 on the display unit 152. The display control unit 112 may also draw the above-described UI objects controlled by the UI control unit 113 superimposed on the game screen.
  • The game progress unit 115 advances the game.
  • In the present embodiment, the game progress unit 115 advances the game executed by the game system 1 (hereinafter, this game) according to the user's input operations input via the operation reception unit 111.
  • The game progress unit 115 advances the game according to the specification of each of its parts.
  • As an example, this game is a competitive tennis game and is divided into a tutorial part, a battle part, and a lottery part.
  • In the tutorial part, the game progress unit 115 provides a novice user with the knowledge necessary for playing the other parts (the battle part and the lottery part), or provides a simple practice mode for learning the core operations of the battle part, which is the main part of the game.
  • The game progress unit 115 advances the tutorial part according to the user's input operations and the game program 131 downloaded in advance to the storage unit 120.
  • In the battle part, the game progress unit 115 has a tennis player operated by the user play against a tennis player operated by another user.
  • The game progress unit 115 shares information with the user terminal 100 operated by the other user via the server 200, and advances the tennis match while keeping the terminals synchronized. That is, the game progress unit 115 advances the battle part according to the user's input operations, the other user's input operations, and the game program 131.
  • In the lottery part, the game progress unit 115 executes a lottery and has the user acquire the won game medium.
  • A game medium is digital data usable in this game, for example an item that can strengthen the tennis player operated by the user so as to give an advantage in matches.
  • The game progress unit 115 advances the lottery part according to the user's input operations, the game program 131 downloaded in advance to the storage unit 120, and the lottery result executed by the server 200.
  • Here, "having the user acquire a game medium" may mean, as an example, changing the status of a game medium managed in association with the user from unusable to usable.
  • The game medium may be stored in at least one of the memories included in the game system 1 (the memory 11, the memory 21, or the memory 31) in association with user identification information, a user terminal ID, or the like.
  • The analysis unit 116 analyzes (renders) operation instruction data and instructs the game progress unit 115 to operate the character based on the analysis result.
  • The analysis unit 116 starts rendering the operation instruction data, triggered by the reception, via the communication IF 13, of the operation instruction data supplied by the operation instruction device 300.
  • The analysis unit 116 transmits the analysis result to the game progress unit 115, which immediately operates the character based on the operation instruction data. That is, the game progress unit 115 operates the character based on the operation instruction data with the reception of that data as a trigger. This makes it possible to show the user a character that operates in real time.
  • The progress information generation unit 117 generates progress information indicating the progress of the game being executed by the game progress unit 115 and sends it, in a timely manner, to the server 200 or the operation instruction device 300.
  • The progress information may include, for example, information specifying the currently displayed game screen, or may include a progress log indicating the progress of the game in chronological order using characters, symbols, and the like.
  • In some embodiments of the game system 1, the progress information generation unit 117 may be omitted.
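  • The progress information can thus be as light as a screen identifier or as rich as a chronological log. A hypothetical payload covering both options (field names invented):

        # Hypothetical progress-information payload: a screen identifier
        # plus a chronological progress log, as described above.
        import json, time

        progress_info = {
            "screen_id": "battle_part/match_3",
            "progress_log": [
                {"t": time.time() - 5.0, "event": "serve: user"},
                {"t": time.time(),       "event": "rally: 4 shots"},
            ],
        }
        payload = json.dumps(progress_info)  # sent to the server 200 or the
        print(payload)                       # operation instruction device 300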
  • The control unit 310 comprehensively controls the operation instruction device 300 by executing the character control program 134 stored in the storage unit 320. For example, the control unit 310 generates operation instruction data according to the character control program 134 and the operator's operations, and supplies that data to the user terminal 100. The control unit 310 may further execute the game program 131 as necessary. The control unit 310 also communicates with the server 200 and the user terminal 100 on which this game is being executed, to transmit and receive information.
  • According to the description of the character control program 134, the control unit 310 functions as an operation reception unit 311, a display control unit 312, a UI control unit 313, an animation generation unit 314, a progress simulation unit 315, a character control unit 316, and a reaction processing unit 317.
  • The control unit 310 can also function as other functional blocks (not shown) to control characters appearing in the game, depending on the nature of the game executed by the game system 1.
  • The operation reception unit 311 detects and accepts the operator's input operations on the input unit 351.
  • The operation reception unit 311 determines what input operation has been performed from the action the operator exerts on the console via the touch screen 35 and the other input/output IF 34, and outputs the result to each element of the control unit 310.
  • The details of the functions of the operation reception unit 311 are almost the same as those of the operation reception unit 111 of the user terminal 100.
  • The UI control unit 313 controls the UI objects to be displayed on the display unit 352.
  • The animation generation unit 314 generates animations showing the motions of various objects based on the control modes of those objects.
  • For example, the animation generation unit 314 may generate an animation that reproduces the game screen actually displayed on the user terminal 100 that is the communication partner.
  • The display control unit 312 outputs, to the display unit 352 of the touch screen 35, a game screen reflecting the results of the processing executed by each of the above elements.
  • The details of the functions of the display control unit 312 are substantially the same as those of the display control unit 112 of the user terminal 100.
  • the progress simulation unit 315 grasps the progress of the game on the user terminal 100 based on the progress information indicating the progress of the game received from the user terminal 100. Then, the progress simulation unit 315 presents the progress of the user terminal 100 to the operator by simulating the behavior of the user terminal 100 in the operation instruction device 300.
  • For example, the progress simulation unit 315 may display a reproduction of the game screen displayed on the user terminal 100 on the display unit 352 of its own device. Further, the progress simulation unit 315 may display, on the display unit 352, the progress of the game on the user terminal 100 in the form of the above-mentioned progress log.
  • Specifically, the progress simulation unit 315 grasps the progress of the game on the user terminal 100 based on the progress information. Then, the progress simulation unit 315 may reproduce the game screen currently displayed on the user terminal 100, either faithfully or in simplified form, on the display unit 352 of its own device based on the game program 131. Alternatively, the progress simulation unit 315 may grasp the current progress of the game, predict the development of the game after the present time based on the game program 131, and output the prediction result to the display unit 352.
  • the character control unit 316 controls the behavior of the character displayed on the user terminal 100. Specifically, it generates operation instruction data for operating the character and supplies it to the user terminal 100. For example, the character control unit 316 generates operation instruction data instructing the character to be controlled to speak, based on the voice data input by an operator (a voice actor or the like) via the microphone 3010. The operation instruction data generated in this way includes at least the above-mentioned voice data. Further, for example, the character control unit 316 generates operation instruction data instructing the character to be controlled to perform a motion, based on the motion capture data input by an operator (a model or the like) via the motion capture device 3020. The operation instruction data generated in this way includes at least the above-mentioned motion capture data.
  • Further, for example, the character control unit 316 may generate operation instruction data based on the operator's operation history on an input mechanism such as the controller 3030; the operation instruction data generated in this way includes at least the above-mentioned operation history data.
  • the operation history data is, for example, information in which operation logs, each indicating which button of the controller 3030 the operator pressed, at what timing, and while which screen was displayed on the display unit, are arranged in chronological order.
  • the display unit here may be a display unit linked to the controller 3030, may be the display unit 352 of the touch screen 35, or may be another display unit connected via the input/output IF 34.
  • the character control unit 316 identifies a command instructing an action of the character associated with each input operation input by the operator via the above-mentioned input mechanism or operation unit. Then, the character control unit 316 may arrange the commands in the order in which they were input to generate a motion command group indicating a series of actions of the character, and generate motion instruction data instructing the character to operate according to the motion command group.
  • the motion instruction data generated in this way includes at least the above-mentioned motion command group.
  • the reaction processing unit 317 receives feedback on the user's reaction from the user terminal 100 and outputs this to the operator of the operation instruction device 300.
  • the user terminal 100 can create a comment addressed to the character while the character is operated according to the above-mentioned operation instruction data.
  • the reaction processing unit 317 receives the comment data of the comment and outputs it.
  • the reaction processing unit 317 may display the text data corresponding to the user's comment on the display unit 352, or may output the voice data corresponding to the user's comment from a speaker (not shown).
  • the functions of the user terminal 100, the server 200, and the operation instruction device 300 shown in FIG. 2 are merely examples. Each device of the user terminal 100, the server 200, and the operation instruction device 300 may have at least a part of the functions of the other devices. Further, another device other than the user terminal 100, the server 200, and the operation instruction device 300 may be used as a component of the game system 1, and the other device may be made to execute a part of the processing in the game system 1. That is, the computer that executes the game program in the present embodiment may be any of the user terminal 100, the server 200, the operation instruction device 300, and another device, or may be realized by a combination of a plurality of these devices.
  • the user terminal 100 is configured to execute the following steps in order to improve the interest of the game based on the game program 131.
  • the user terminal 100 executes: a step of advancing the game in response to a user's input operation input to the user terminal 100 via an operation unit (for example, the input unit 151); a step of transmitting progress information indicating the progress of the game to the operation instruction device 300, which controls at least one character appearing in the game; a step of receiving, from the operation instruction device 300, voice data corresponding to the character's remarks input in the operation instruction device 300 according to the progress of the game; and a step of operating the character by causing the character appearing during the progress of the game to speak at least the content of the voice data, using the reception of the voice data as a trigger.
  • the operation instruction device 300 is configured to execute the following steps in order to improve the interest of the game based on the character control program 134.
  • the operation instruction device 300 executes a step of displaying a progress screen indicating the progress of the game on the display unit 352, based on the progress information indicating the progress of the game received from the user terminal 100 on which the game is being advanced.
  • the operator of the operation instruction device 300 can grasp how far the user has advanced the game.
  • the operator or the voice actor who receives the instruction from the operator can input the voice data according to the progress of the user's game to the operation instruction device 300.
  • the voice data input from the operator to the operation instruction device 300 based on the progress is supplied to the user terminal 100.
  • the user terminal 100 causes the character to speak the content of the voice data while the game is in progress.
  • the user can play the game while recognizing the existence of a character who speaks the content according to the progress of the game being played. Since the character can be made to speak according to the progress of the game, the user can feel the reality as if the character is playing the game together. As a result, it has the effect of increasing the immersive feeling of the game and improving the interest of the game.
  • FIG. 3 is a diagram showing an example of a data structure of screen transition information.
  • the screen transition information is configured to include each item of "game part” and "screen information”.
  • the item "game part” stores the identification information of each part constituting this game.
  • this game is a competitive tennis game, and is composed of three parts: a tutorial part, a competitive part, and a lottery part.
  • the item "screen information” stores information that defines the screen displayed in each part and its transition. For example, in this game, in the tutorial part, the introduction screen of ID "0001" is displayed on the display unit 152, then the practice screen of ID "0002” is displayed, and finally, the explanation screen of ID "0003” is displayed. It is displayed and the tutorial part ends.
  • the user terminal 100 and the operation instruction device 300 may store and share this screen transition information in their respective storage units.
  • By sharing this screen transition information, the user terminal 100 can notify the operation instruction device 300, simply by transmitting a screen ID, of which game screen of which part it is currently displaying, and the operation instruction device 300 can easily determine this from the received screen ID.
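  • As an illustration only, the shared screen transition information of FIG. 3 might be represented as in the following minimal sketch; the item names follow the figure, while the representation and function names are assumptions, not the actual implementation.

```python
# Minimal sketch of the screen transition information of FIG. 3
# (item names follow the figure; the representation itself is assumed).
SCREEN_TRANSITION_INFO = {
    # item "game part" -> item "screen information": screens in display order
    "tutorial": [
        ("0001", "introduction screen"),
        ("0002", "practice screen"),
        ("0003", "explanation screen"),
    ],
    "competition": [],  # battle-part screens, omitted in this sketch
    "lottery": [],      # lottery-part screens, omitted in this sketch
}

def part_of_screen(screen_id):
    """Both devices share this table, so a screen ID alone identifies
    which part's game screen the user terminal is displaying."""
    for part, screens in SCREEN_TRANSITION_INFO.items():
        if any(sid == screen_id for sid, _name in screens):
            return part
    raise KeyError(screen_id)

assert part_of_screen("0002") == "tutorial"
```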
  • FIG. 4 is a diagram showing an example of a data structure of progress information generated by the progress information generation unit 117 of the user terminal 100.
  • the progress information is configured to include each item of "screen ID" and "progress log”.
  • the above-mentioned screen ID is stored in the item "screen ID”.
  • the progress information generation unit 117 identifies the game screen currently displayed on the display unit 152 by the game progress unit 115, and stores the screen ID in the item.
  • the item "progress log” stores the progress log of the game being executed by the game progress unit 115.
  • the game progress unit 115 records the progress log in the storage unit 120 periodically or every time an event to be recorded occurs.
  • the progress information generation unit 117 stores the latest progress log recorded by the game progress unit 115 in the item.
  • the progress log is, for example, a series of records, arranged in chronological order, each associating the time at which an event occurred with the content of that event.
  • the progress information generation unit 117 may store the entire progress log in the item of "progress log", or may store only the record of the progress log that has not yet been reported to the operation instruction device 300.
  • the progress information may further include items necessary for grasping the progress of the game, depending on the nature of the game. For example, in a competitive part of a competitive tennis game, in a virtual game space, the positions of the players in the match and the positions of the balls that the players hit each other change according to the input operation of the user.
  • An object whose attribute changes according to a user's input operation, such as the player and the ball described above, is hereinafter referred to as a dynamic object.
  • the attribute of an object refers to, for example, the position, size, or shape of the object in the game space. Therefore, in the present embodiment, the progress information may be further configured to include the item of "dynamic object".
  • the item "dynamic object” stores attribute-related information necessary for identifying the attributes of the dynamic object.
  • the attribute-related information may be, for example, coordinate information for specifying the position of the dynamic object in the game space.
  • the attribute-related information may be information for specifying the movement path of the ball, for example, when the dynamic object is a ball.
  • the progress information generation unit 117 may acquire the velocity vector, rotation axis, rotation amount, and the like of the ball immediately after it collides with the racket from the game progress unit 115, and store them in the "dynamic object" item as attribute-related information.
  • the progress information generation unit 117 generates progress information including each of the above items periodically or every time a predetermined event occurs during the progress of the game, and transmits the progress information to the operation instruction device 300. As a result, the operation instruction device 300 can grasp the progress of the game on the user terminal 100.
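  • A minimal sketch of how the user terminal might assemble the progress information of FIG. 4 is shown below; the item names follow the figure, and the function name, data shapes, and transmission step are assumptions.

```python
import time

def generate_progress_info(screen_id, progress_log, dynamic_objects=None):
    """Assemble the progress information of FIG. 4 (sketch; names assumed).

    progress_log:    records associating event times with event contents,
                     in chronological order
    dynamic_objects: attribute-related information, e.g. the ball's velocity
                     vector and rotation immediately after hitting the racket
    """
    info = {
        "screen ID": screen_id,
        "progress log": list(progress_log),
    }
    if dynamic_objects is not None:
        info["dynamic object"] = dynamic_objects
    return info

progress = generate_progress_info(
    screen_id="0002",
    progress_log=[(time.time(), "skill activation button became usable")],
    dynamic_objects={"ball": {"velocity": (3.0, 1.2, -0.5),
                              "rotation_axis": (0.0, 1.0, 0.0),
                              "rotation_amount": 45.0}},
)
# transmit_to_operation_instruction_device(progress)  # transmission omitted
```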
  • FIG. 5 is a diagram showing an example of a data structure of operation instruction data processed by the game system 1 according to the present embodiment.
  • the operation instruction data is configured to include the items "destination" and "creation source", which are meta information, and the items "character ID", "voice", "movement", and "attention point", which are the contents of the data.
  • the destination designation information is stored in the item "destination".
  • the destination designation information is information indicating to which device the operation instruction data is transmitted.
  • the destination designation information may be, for example, an address unique to the user terminal 100, or may be identification information of the group to which the user terminal 100 belongs. It may be a symbol (for example, "ALL") indicating that the destination is all user terminals 100 satisfying a certain condition.
  • the creation source information is stored in the item "creation source".
  • the creation source information is information indicating which device created the operation instruction data.
  • the creation source information is information related to a user, such as a user ID, a user terminal ID, and a unique address of the user terminal, which can identify a specific user.
  • the creation source information may be an ID or an address indicating the server 200 or the operation instruction device 300, and if the creation source is the server 200 or the operation instruction device 300, the value of the item is left empty.
  • the item itself may not be provided in the operation instruction data.
  • the item "character ID" stores a character ID for uniquely identifying a character appearing in this game.
  • the character ID stored here represents which character's action is indicated by the action instruction data.
  • the item "voice” stores voice data to be expressed in the character.
  • Motion data that specifies the movement of the character is stored in the item "movement".
  • the motion data may be motion capture data acquired by the operation instruction device 300 via the motion capture device 3020.
  • the motion capture data may be data that tracks the movement of the actor's entire body, may be data that tracks the facial expression and mouth movement of the actor, or may be both.
  • the motion data may be a motion command group instructing a series of movements of the character specified by an operation input by the operator of the operation instruction device 300 via the controller 3030.
  • For example, suppose the commands "raise the right hand", "raise the left hand", "walk", and "run" are assigned to buttons A, B, C, and D of the controller 3030, respectively. When the operator presses button A, button B, button C, and button D in that order, a motion command group in which the commands "raise the right hand", "raise the left hand", "walk", and "run" are arranged in that order is generated, and these motion commands are stored in the "movement" item as motion data.
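  • As an illustration of the button-to-command arrangement just described, the following sketch assumes a simple mapping table and list assembly; the names are hypothetical.

```python
# Sketch of the controller example above: the commands assigned to buttons
# A-D are arranged in the order the operator pressed them (names assumed).
BUTTON_COMMANDS = {
    "A": "raise the right hand",
    "B": "raise the left hand",
    "C": "walk",
    "D": "run",
}

def motion_command_group(pressed_buttons):
    """Arrange commands in input order; stored in the "movement" item."""
    return [BUTTON_COMMANDS[b] for b in pressed_buttons]

assert motion_command_group(["A", "B", "C", "D"]) == [
    "raise the right hand", "raise the left hand", "walk", "run",
]
```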
  • the voice data and the motion data are included in the operation instruction data in a synchronized state.
  • the item "attention point" stores attention information for specifying a point to which the user should pay attention in the interaction between the character and the user on the game screen displayed on the display unit 152 of the user terminal 100.
  • the attention information may be identification information that identifies an object that the user wants to pay attention to among the objects arranged on the game screen, for example, an object ID.
  • the attention information may be position coordinates (second position coordinates) indicating a specific position of the display unit 152 for displaying the game screen.
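  • Gathering the items of FIG. 5, one operation instruction data record might look like the following sketch; the item names follow the figure, and every value is an illustrative assumption.

```python
# Sketch of one operation instruction data record per FIG. 5
# (item names follow the figure; every value is illustrative only).
operation_instruction_data = {
    # meta information
    "destination": "ALL",         # a unique address, a group ID, or "ALL"
    "creation source": None,      # left empty when created by the server 200
                                  # or the operation instruction device 300
    # contents of the data
    "character ID": "chr_0001",   # which character this data operates
    "voice": b"<voice data>",     # speech to be uttered by the character
    "movement": b"<motion capture data>",  # kept synchronized with "voice"
    "attention point": {"object ID": "skill_activation_button"},
}
```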
  • the progress simulation unit 315 of the operation instruction device 300 may reproduce the game screen displayed on the display unit 152 of the user terminal 100 on the display unit 352.
  • the operator of the operation instruction device 300 can specify a position to be noticed by the user by an input operation via the input unit 351.
  • the first position coordinates in the display unit 352 on which the input operation is performed are converted into the above-mentioned second position coordinates by the character control unit 316.
  • For the above conversion, the character control unit 316 refers to the specifications (number of pixels, aspect ratio, etc.) of the touch screen 15 acquired in advance from the user terminal 100, and to the position at which the game screen is reproduced and displayed on the display unit 352.
  • the character control unit 316 can obtain the second position coordinates on the touch screen 15 of the user terminal 100 corresponding to the first position coordinates on the display unit 352 of the own device.
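  • A minimal sketch of this first-to-second coordinate conversion is given below, assuming a simple proportional mapping between the region where the game screen is reproduced on the display unit 352 and the touch screen 15; the function and parameter names are assumptions.

```python
def to_second_coords(first_xy, repro_origin, repro_size, terminal_size):
    """Convert first position coordinates on the display unit 352 into
    second position coordinates on the user terminal's touch screen 15.

    first_xy:      the operator's input position on the display unit 352
    repro_origin:  top-left corner where the game screen is reproduced
    repro_size:    (width, height) of the reproduced game screen
    terminal_size: (width, height) of the touch screen 15, taken from the
                   specifications acquired in advance from the user terminal
    """
    fx, fy = first_xy
    ox, oy = repro_origin
    rw, rh = repro_size
    tw, th = terminal_size
    # simple proportional mapping; assumes matching aspect ratios
    return ((fx - ox) * tw / rw, (fy - oy) * th / rh)

# a tap at (500, 400) inside a 640x480 reproduction placed at (100, 80)
# maps to (675.0, 540.0) on a 1080x810 terminal screen
assert to_second_coords((500, 400), (100, 80), (640, 480), (1080, 810)) == (675.0, 540.0)
```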
  • the display unit 352 and the input unit 351 may form a touch screen 35.
  • In this case, the first position coordinates of the touch operation (input operation) input by the operator on the simulated screen displayed on the display unit 352 are supplied to the character control unit 316 as position coordinates on the touch screen 35 via the input unit 351.
  • the input unit 351 and the display unit 352 may be formed separately.
  • When the input unit 351 is an input device formed separately from the display unit 352, such as a mouse or a keyboard, the operator can, for example, click a predetermined position on the screen displayed on the display unit 352 with the mouse.
  • the first position coordinates of the click operation (input operation) input on the simulated screen are determined based on the input timing of the click operation and the display position of the cursor on the display unit 352 at that time, and are supplied to the character control unit 316.
  • the game progress unit 115 of the user terminal 100 can use the information of the "attention point" to reliably convey to the user the place to which the user should pay attention. For example, even if a character's utterance contains a demonstrative such as "this", "here", "over there", or "that", the game progress unit 115 grasps the point indicated by the demonstrative and can highlight it. Therefore, the user can know exactly where the demonstrative is pointing. As a result, a conversation including demonstratives is established between the user and the character, and immersive, fast-paced, and natural communication can be realized, as if the character were playing together with the user on the spot.
  • the game progress unit 115 can operate the character appearing in the game as intended by the creator of the operation instruction data. Specifically, when the operation instruction data includes voice data, the game progress unit 115 causes the character to speak based on the voice data; when the operation instruction data includes motion data, the game progress unit 115 moves the character based on the motion data, that is, generates an animation of the character so that it moves based on the motion data.
  • FIG. 6 is a diagram showing an example of a progress screen displayed on the display unit 352 of the operation instruction device 300.
  • the progress simulation unit 315 displays the progress information acquired from the user terminal 100 on the display unit 352 as, for example, the progress screen 400.
  • the progress screen 400 includes a simulated screen 401 and a progress log 402 as an example. Further, the progress screen 400 may include a UI component 403 for the operator (or voice actor) to input voice data to the own device via the microphone 3010.
  • the progress screen 400 may include, as the simulated screen 401, the same screen as is actually displayed on the display unit 152 of the user terminal 100, or a simplified version of that screen.
  • the progress simulation unit 315 analyzes the progress information acquired from the user terminal 100.
  • the progress simulation unit 315 identifies the game screen displayed on the user terminal 100 based on the screen ID included in the progress information.
  • the progress simulation unit 315 displays the specified game screen on the display unit 352.
  • the progress simulation unit 315 may reproduce the above-mentioned game screen in detail based on the game information 132, the user information 133, and the game program 131 stored in the storage unit 320.
  • the progress simulation unit 315 preferably generates a simulated screen 401 that simplifies the above-mentioned game screen.
  • the simulated screen 401 includes only the minimum information necessary for the operator of the operation instruction device 300 to determine the progress of the game among the information arranged on the above-mentioned game screen.
  • the simulated screen 401 may include, for example, the layout of each object, the description of the function of each object, and the status of each object, with the drawing of the appearance of the object omitted as the minimum information.
  • the progress simulation unit 315 reproduces the attributes of the dynamic objects arranged on the game screen on the simulated screen 401, based on the attribute-related information of the dynamic objects included in the progress information.
  • the progress simulation unit 315 may calculate the movement trajectory of the ball 404 based on the attribute-related information of the ball 404, which is a dynamic object, and move the ball 404 accordingly.
  • the progress simulation unit 315 specifies the position of each player in the game space based on the attribute-related information of the player 405 operated by the user and the player 406 operated by the opponent user, which are dynamic objects.
  • the progress simulation unit 315 determines the size of each player based on the positional relationship between each player and the virtual camera, and arranges each player at the specified position.
  • the display size of each player is defined in advance as attribute-related information, and the progress simulation unit 315 may arrange each player according to the defined display size.
  • the progress simulation unit 315 arranges the progress log included in the progress information on the progress screen 400. For example, as shown in the figure, the progress simulation unit 315 generates a progress log 402 including each record associated with the event occurrence time and the content of the event as text data.
  • the operator of the operation instruction device 300 can grasp the progress of the game being executed on the user terminal 100 by checking the progress screen 400.
  • the progress simulation unit 315 may include the prediction result of predicting the future game development from the current game progress in the progress screen 400 based on the game program 131.
  • the progress simulation unit 315 may display the prediction result superimposed on the simulated screen 401, may display it so as to be switchable with the simulated screen 401, or may display it side by side with the simulated screen 401.
  • the progress simulation unit 315 may additionally display the prediction result in the progress log 402. In this case, it is preferable that the progress simulation unit 315 displays the prediction result in a display mode different from that of the event that has already occurred, for example, in a different character color.
  • the operator of the operation instruction device 300 can thereby not only make the character speak about the current situation according to the progress of the game on the user terminal, but also give advice about future developments based on the prediction result.
  • For example, suppose progress information indicating that the ball has been hit back based on the user's input operation is supplied from the user terminal 100 to the operation instruction device 300. In this case, the progress simulation unit 315 can predict, based on the game program 131, what kind of ball the computer opponent will return next, and can superimpose the predicted return trajectory on the simulated screen 401.
  • Seeing this, the operator can input a voice advising the character, such as "Move to the right!".
  • As another example, the progress simulation unit 315 detects, based on the progress information, a move that the user is about to make, and predicts the subsequent development if that move were actually made. If the progress simulation unit 315 determines that the move would make defeat certain, it may display a pop-up message to that effect on the simulated screen 401 or in another margin area of the progress screen 400. The operator who sees this can input a voice advising the character, such as "You should stop there!".
  • Suppose the operation reception unit 311 accepts the operator's touch operation on the UI component 403 arranged on the progress screen 400 while input from the microphone 3010 is disabled, that is, while the microphone is off.
  • In this case, the character control unit 316 enables input from the microphone 3010, acquires the voice data input via the microphone 3010, and includes it in the operation instruction data. Conversely, suppose the operation reception unit 311 accepts the operator's touch operation on the UI component 403 while input from the microphone 3010 is enabled, that is, while the microphone is on. In this case, the character control unit 316 disables input from the microphone 3010 again.
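  • The toggle behavior described above might be sketched as follows; the class and attribute names are assumptions.

```python
class MicrophoneToggle:
    """Sketch of the UI component 403 behavior: each touch operation
    toggles whether input from the microphone 3010 is enabled."""

    def __init__(self):
        self.mic_enabled = False  # input from the microphone starts disabled

    def on_touch(self):
        """Enable input if it was disabled, and disable it again if enabled."""
        self.mic_enabled = not self.mic_enabled
        return self.mic_enabled

toggle = MicrophoneToggle()
assert toggle.on_touch() is True   # first touch: start capturing voice data
assert toggle.on_touch() is False  # second touch: stop capturing
```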
  • In this way, while confirming the progress of the game, the operator can decide the content of the utterance on the spot so as to match that progress, and can input the voice data corresponding to the content of the utterance to the operation instruction device 300.
  • When the operation reception unit 311 receives a touch operation on the display area of the simulated screen 401 from the operator, the character control unit 316 may identify which object on the simulated screen 401 is designated by the touch operation, and acquire the identification information of that object.
  • the character control unit 316 may store the identification information in the item of "attention point" of the operation instruction data.
  • Alternatively, the character control unit 316 converts the first position coordinates of the above-mentioned touch operation on the touch screen 35 into the second position coordinates on the touch screen 15 of the user terminal 100, according to the specifications of the display unit 152 of the user terminal 100.
  • the character control unit 316 may store the second position coordinates for indicating the position of the operator's touch operation in the coordinate system of the display unit 152 in the item of "attention point" of the operation instruction data.
  • In this way, the operator can confirm the progress of the game, input voice data corresponding to the utterance content of the character so as to match that progress, and, at the same time as the utterance, specify the location to which the user should pay attention.
  • Game screen: When the game progress unit 115 of the user terminal 100 receives the operation instruction data from the operation instruction device 300 while the game is in progress, it superimposes the character specified by the operation instruction data on the game screen being displayed, and operates the character based on the operation instruction data. For example, suppose the game progress unit 115 receives the operation instruction data while displaying the practice screen of the tutorial part. In this case, the game progress unit 115 superimposes the character 801 on the practice screen 800 displayed on the display unit 152 and operates the character 801 based on the received operation instruction data.
  • FIG. 7 is a diagram showing an example of a game screen displayed on the display unit 152 of the user terminal 100.
  • the practice screen 800 displayed second in the tutorial part of this game is illustrated.
  • the layout of the practice screen 800 is almost the same as the layout of the battle screen displayed in the battle part.
  • the game progress unit 115 draws a court on the game space on the practice screen 800. Then, the game progress unit 115 arranges the player 802 operated by the user in front and the player 803 operated by the opponent (COM in the tutorial part) in the back. By operating the player 802 and playing the tutorial part while looking at the practice screen 800, the user can learn the operation method as if it were the actual performance in the battle part.
  • the progress information generation unit 117 generates progress information in a timely manner while the game progress unit 115 is progressing the tutorial part, and transmits the progress information to the operation instruction device 300.
  • For example, the progress information generation unit 117 includes, in the progress information, a progress log containing the event that the skill activation button 804 changed from the unusable state to the usable state, and transmits the progress information to the operation instruction device 300.
  • the game progress unit 115 does not have to display the character 801 until the operation instruction data is supplied from the operation instruction device 300.
  • the pre-made operation instruction data supplied in advance when the application of this game is downloaded may be stored in the storage unit 120 together with the game program 131.
  • the game progress unit 115 may operate the character 801 according to the game program 131 based on the pre-made operation instruction data read from the storage unit 120.
  • the operation instruction device 300 that has received the progress information from the user terminal 100 displays the progress information on the display unit 352 as the progress screen 400 shown in FIG. 6 described above.
  • the operator who has confirmed the progress screen 400 can determine, for example, that the user should be advised to use the skill activation button 804.
  • the operator or voice actor 701 inputs a voice 700 including advice according to the progress of the game to the operation instruction device 300 via the microphone 3010.
  • the operator or the model (person) 702 may, if necessary, input the movement of the character to the operation instruction device 300 via the motion capture device 3020.
  • the operator may operate the simulated screen 401 via the touch screen 35 and specify a portion of the practice screen 800 that the user wants to pay attention to, if necessary.
  • operation instruction data including at least the voice data, together with motion capture data and attention information added as needed, is generated by the character control unit 316 and transmitted to the user terminal 100.
  • the game progress unit 115 uses the reception of the operation instruction data as a trigger to superimpose the character 801 on the practice screen 800 based on the received operation instruction data. Then, the motion indicated by the motion capture data included in the operation instruction data is reflected in the motion of the character 801. As described above, the motion capture data is obtained by capturing the movement of the model 702 via the motion capture device 3020 at the installation location of the operation instruction device 300. Therefore, the movement of the model 702 is directly reflected in the movement of the character 801 displayed on the display unit 152.
  • the game progress unit 115 outputs the voice data 805 included in the operation instruction data supplied from the operation instruction device 300 as the voice emitted by the character 801 in synchronization with the movement of the character 801.
  • the voice data is obtained by acquiring the voice 700 of the voice actor 701 through the microphone 3010 at the installation location of the operation instruction device 300. Therefore, the voice data 805 corresponding to the voice 700 emitted by the voice actor 701 is output as it is from the speaker of the user terminal 100.
  • the game progress unit 115 highlights the part on the practice screen 800 pointed to by the attention information included in the above-mentioned operation instruction data. For example, when the attention information is the identification information pointing to the skill activation button 804 or the position coordinates where the skill activation button 804 is arranged, the game progress unit 115 highlights the skill activation button 804. For example, the game progress unit 115 changes the display mode of the skill activation button 804. As an example, the game progress unit 115 may change the color of the skill activation button 804, or may attach animations such as blinking, edging, and moving in small steps up, down, left, and right to the skill activation button 804.
  • the game progress unit 115 may superimpose the instruction object 806 that specifies the point of interest on the practice screen 800. Further, the game progress unit 115 may draw the instruction object 806 as an equipment of the character 801 and output an effect in which the character 801 uses the instruction object 806 to point to the skill activation button 804.
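  • A sketch of how the user terminal might apply the attention information to highlight the indicated object is shown below; the highlight modes mirror the examples above, while the data shapes and names are assumptions.

```python
def apply_attention(game_screen_objects, attention):
    """Highlight the object designated by the "attention point" item.

    attention carries either an object ID or second position coordinates;
    the chosen highlight mirrors the modes described above (recoloring,
    blinking, edging, or small up/down/left/right steps).
    """
    if "object ID" in attention:
        target = game_screen_objects[attention["object ID"]]
    else:
        # hit-test the second position coordinates against object bounds
        x, y = attention["coords"]
        target = next(obj for obj in game_screen_objects.values()
                      if obj["bbox"][0] <= x <= obj["bbox"][2]
                      and obj["bbox"][1] <= y <= obj["bbox"][3])
    target["highlight"] = "blink"
    return target

objects = {"skill_activation_button": {"bbox": (10, 10, 90, 40)}}
apply_attention(objects, {"object ID": "skill_activation_button"})
assert objects["skill_activation_button"]["highlight"] == "blink"
```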
  • the voice of the voice actor 701 that actually exists at the installation location of the operation instruction device 300 is emitted according to the progress of the game on the user terminal 100, and it is directly reflected in the voice of the character 801.
  • the movement of the existing model 702 may be input to the operation instruction device 300 and supplied to the user terminal 100 together with the voice data.
  • Thereby, the character 801, which speaks according to the progress of the game, can also be made to move in accordance with that speech.
  • the simulated screen 401 may be operated by the operator via the touch screen 35 at the installation location of the operation instruction device 300.
  • the part on the practice screen 800 that the user wants to pay attention to in relation to the utterance content of the character 801 can be transmitted from the operation instruction device 300 to the user terminal 100.
  • the user can be notified of the point of interest on the practice screen 800 as the character 801 speaks according to the progress of the game.
  • the game progress unit 115 may display the character 801 to be displayed during the progress of the game on the display unit 152 in a display mode according to the play results so far.
  • For example, when the user has acquired an item (a costume, an accessory, or the like) that can be worn by the character 801 in the battle part or the lottery part played so far, the object of the item may be combined with the character 801. In this way, an item acquired by the user by playing the game can be reflected in the clothing of the character 801.
  • Alternatively, the user may consume valuable data (items, tips (thrown money), etc.) that is usable in the game and purchased with virtual currency or through billing, before the game starts, while the game is in progress, or after the game ends.
  • the user can feel the attachment to the character 801 and enjoy the game even more. Further, the user's motivation to upgrade the clothing of the character 801 can be cultivated, and as a result, the motivation to play the game can be strengthened.
  • the game progress unit 115 may allow the user to input a comment addressed to the character 801 in response to the operation of the character 801.
  • the game progress unit 115 may arrange a comment input button 807 on the practice screen 800.
  • the UI may be for the user to select a desired comment from some prepared comments.
  • the UI may be for the user to edit characters and enter comments.
  • the UI may be for the user to input a comment by voice.
  • In a case where a user's input operation is always required for the progress of the game and the user cannot afford to perform an operation for inputting a comment, the comment input button 807 need not be provided, and the configuration may instead be such that the user always inputs comments by voice.
  • the user can play the game in real time while enjoying interactive interaction with the character 801.
  • FIG. 8 is a flowchart showing a flow of processing executed by each device constituting the game system 1.
  • step S101 when the game progress unit 115 of the user terminal 100 receives an input operation for starting a game from the user, it accesses the server 200 and requests login.
  • step S102 the progress support unit 211 of the server 200 confirms that the status of the user terminal 100 is online, and responds that the login has been accepted.
  • step S103 the game progress unit 115 advances the game according to the input operation of the user while communicating with the server 200 as necessary.
  • the game progress unit 115 advances a tutorial part, a battle part, or a lottery part.
  • step S104 the progress support unit 211 supports the progress of the game on the user terminal 100 by providing necessary information to the user terminal 100 as needed.
  • when the live distribution time arrives, the sharing support unit 212 of the server 200 proceeds from YES in step S105 to step S106.
  • the live distribution time is, for example, predetermined by the game master and may be managed by the server 200 and the operation instruction device 300. Further, the live distribution time may be notified to the user terminal 100 in advance, or may be kept secret until the actual live distribution time is reached. In the former case, live distribution can be stably supplied to the user, and in the latter case, live distribution with special added value can be supplied to the user as a surprise distribution.
  • the sharing support unit 212 searches for one or more user terminals 100 having the right to receive live distribution.
  • the conditions for receiving the live distribution may be set as appropriate by the game master, but at a minimum they are that the application of this game is installed and that the terminal is online at the time of the live distribution.
  • For example, the sharing support unit 212 searches, as the user terminals 100 having the right to receive the live distribution, for the specific user terminals 100 that have made a reservation in advance to receive the live distribution at the above-mentioned live distribution time.
  • Alternatively, the sharing support unit 212 may search for the user terminals 100 that are online at the time of the live distribution, that is, those running the application of this game, as the user terminals 100 having the right to receive the live distribution.
  • the sharing support unit 212 may further add, as a condition, that the user terminal 100 be owned by a user who has paid the consideration for receiving the live distribution.
  • step S107 the sharing support unit 212 notifies the operation instruction device 300 of one or more detected user terminals 100.
  • the sharing support unit 212 may notify the operation instruction device 300 of the terminal ID of the user terminal 100, the user ID of the user who is the owner of the user terminal 100, the address of the user terminal 100, and the like.
  • step S108 the sharing support unit 212 notifies the user terminal 100 detected in step S106 of the operation instruction device 300 specified as the execution subject of the live distribution.
  • the sharing support unit 212 may notify the user terminal 100, for example, the address or device ID of the operation instruction device 300.
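  • The search and notifications of steps S106 to S108 might be sketched as follows; the condition fields and function names are assumptions based on the conditions described above.

```python
def find_live_distribution_targets(terminals):
    """Step S106 sketch: detect user terminals 100 that have the right to
    receive the live distribution (condition fields are assumptions)."""
    return [
        t for t in terminals
        if t["app_installed"] and t["online"]          # minimum conditions
        and (t["reserved"] or t["running_app"])        # reservation or online play
        and t.get("paid", True)                        # optional payment condition
    ]

def build_notifications(targets, distributor_address):
    """Steps S107-S108 sketch: what the server tells each side."""
    to_device = [{"terminal ID": t["id"], "address": t["address"]}
                 for t in targets]                      # to operation instruction device 300
    to_terminals = {"distributor": distributor_address} # to each user terminal 100
    return to_device, to_terminals

terminals = [{"id": "u1", "address": "10.0.0.1", "app_installed": True,
              "online": True, "reserved": True, "running_app": True}]
print(build_notifications(find_live_distribution_targets(terminals), "device-300"))
```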
  • In step S109, when the live distribution time arrives, the character control unit 316 of the operation instruction device 300 proceeds from YES in step S109 to the processing of step S111 and after. Alternatively, the character control unit 316 may start the processing of step S111 and after upon receiving, from the server 200, the request to start the live distribution and the information about the user terminals 100 of the live distribution destination (after step S107).
  • live distribution may be configured to be started in response to a user's request.
  • the user terminal 100 sends a request to the server 200 to request live distribution of the operation instruction data according to the input operation of the user.
  • the server 200 returns a response to the user terminal 100 that the live distribution is possible if there is an operation instruction device 300 that can support the live distribution.
  • When matching is established, the information of the counterpart device is notified to each of the user terminal 100 and the operation instruction device 300.
  • step S110 the progress information generation unit 117 of the user terminal 100 generates progress information and transmits it to the operation instruction device 300.
  • Specifically, while the game is progressing in the game progress unit 115 after step S103, the progress information generation unit 117 updates the progress log periodically or every time an event occurs, and generates and transmits the progress information in a timely manner.
  • the progress simulation unit 315 of the operation instruction device 300 simulates the progress of the game on the user terminal 100 on its own device based on the progress information.
  • the progress simulation unit 315 generates a progress screen 400 showing the progress of the game on the user terminal 100 based on the progress information, and displays it on the display unit 352.
  • the progress simulation unit 315 arranges the progress log 402 included in the progress information on the progress screen 400 as shown in FIG.
  • the progress simulation unit 315 identifies the game screen being displayed on the user terminal 100 based on the screen ID included in the progress information, and arranges the simulation screen 401 of the game screen on the progress screen 400.
  • the progress simulation unit 315 may reproduce the dynamic object on the simulation screen 401 based on the attribute-related information of the dynamic object included in the progress information.
  • step S112 the character control unit 316 receives the voice emitted by an actor such as a voice actor as voice data via the microphone 3010.
  • step S113 the character control unit 316 acquires the motion input by the actor such as the model via the motion capture device 3020 as motion capture data.
  • step S114 the character control unit 316 specifies a location on the game screen that the operator wants the user to pay attention to, based on the touch operation input by the operator via the input unit 351 of the touch screen 35.
  • steps S112 to S114 may be executed in any order.
  • step S115 the character control unit 316 generates operation instruction data. Specifically, the character control unit 316 identifies a character to be superimposed on the game screen of the user terminal 100, and stores the character ID of the character in the item of "character ID" of the operation instruction data. Which character is superimposed may be scheduled in advance by the game master and registered in the operation instruction device 300. Alternatively, the operator of the operation instruction device 300 may specify in advance to the operation instruction device 300 which character the operation instruction data should be created.
  • the character control unit 316 stores at least the voice data acquired in step S112 in the “voice” item of the operation instruction data. If there is motion capture data acquired in step S113, the character control unit 316 stores the motion capture data in the “movement” item of the operation instruction data.
  • the character control unit 316 associates the voice data with the motion capture data so that the voice data and the motion capture data are synchronized with each other.
  • When there is a touch operation received in step S114, the character control unit 316 generates attention information based on the touch operation and stores it in the "attention point" item of the operation instruction data.
  • the character control unit 316 stores, as destination designation information in the "destination" item of the operation instruction data, the group identification information of the group of the one or more user terminals 100 notified by the server 200 in step S107, or, in the case of a single user terminal, the address of that user terminal 100.
  • step S116 the character control unit 316 transmits the operation instruction data generated as described above to each user terminal 100 designated as a destination via the communication IF 33.
  • It is desirable that the character control unit 316 acquire the voice data and motion capture data produced as the actor speaks and moves, immediately render them into operation instruction data, and distribute the data to each user terminal 100 in real time.
  • steps S112 to S116 may be executed prior to step S111 after step S107.
  • step S117 the analysis unit 116 of the user terminal 100 receives the above-mentioned operation instruction data via the communication IF 13.
  • the analysis unit 116 may receive the operation instruction data at a time previously announced to be live-streamed from the operation instruction device 300 or the server 200.
  • step S118 the analysis unit 116 immediately analyzes the received operation instruction data by using the reception as a trigger.
  • In step S119, the game progress unit 115 superimposes the character specified by the operation instruction data analyzed by the analysis unit 116 on the game screen being displayed on the display unit 152, and operates the character based on the operation instruction data. Specifically, the game progress unit 115 superimposes the character 801 on the display of the practice screen 800 or the like shown in FIG. 7 on the display unit 152. Almost simultaneously with the actors, such as the voice actor 701 and the model 702, speaking and moving at the place where the operation instruction device 300 is installed, the game progress unit 115 reflects their voice and movement in real time in the speech and movement of the character 801 on the practice screen 800.
  • the analysis unit 116 and the game progress unit 115 continue rendering and reproducing the real-time moving image as long as the operation instruction data continues to be received from the operation instruction device 300. Specifically, while no input operation is received from the user and the operation instruction data continues to be received, the game progress unit 115 returns from NO in step S120 to step S103 and repeats the subsequent steps.
  • If the operation reception unit 111 receives an input operation from the user while the character is operating based on the operation instruction data, the game progress unit 115 proceeds from YES in step S120 to step S121.
  • the operation reception unit 111 accepts an input operation for the comment input button 807 on the practice screen 800.
  • the game progress unit 115 transmits the comment data generated in response to the above-mentioned input operation to the operation instruction device 300. Specifically, the game progress unit 115 may transmit the comment ID of the selected comment as comment data. Alternatively, the game progress unit 115 may transmit the text data of the text input by the user as comment data. Alternatively, the game progress unit 115 may transmit the voice data of the voice input by the user as comment data. Alternatively, the game progress unit 115 may recognize the voice input by the user, convert it into text data, and transmit it as comment data.
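  • The comment data alternatives of step S121 might be sketched as follows; the function names are assumptions, and the speech recognizer is a placeholder.

```python
def make_comment_data(kind, payload):
    """Sketch of the step S121 alternatives: a comment ID of a selected
    comment, edited text, raw voice, or voice recognized into text."""
    if kind == "selected":
        return {"comment ID": payload}
    if kind == "text":
        return {"text": payload}
    if kind == "voice":
        return {"voice": payload}
    if kind == "voice_to_text":
        return {"text": recognize_speech(payload)}
    raise ValueError(kind)

def recognize_speech(voice_bytes):
    """Placeholder for an on-terminal speech recognizer (assumption)."""
    return "<recognized text>"

assert make_comment_data("text", "Nice serve!") == {"text": "Nice serve!"}
```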
  • step S122 the reaction processing unit 317 of the operation instruction device 300 receives the comment data transmitted from the user terminal 100 via the communication IF 33.
  • In step S123, the reaction processing unit 317 outputs the received comment data to the operator of the operation instruction device 300.
  • the reaction processing unit 317 displays the text data included in the comment data on the display unit 352. This allows the operator or actor to receive feedback on how the user responded to the character they moved. The operator or actor can then determine further character actions in response to this feedback. That is, the operation instruction device 300 returns to step S112, continues to acquire voice data and motion capture data as needed, and continues to provide operation instruction data to the user terminal 100.
  • the user terminal 100 receives the operation instruction data transmitted from the operation instruction device 300 after the content of the input operation in the own terminal is received by the operation instruction device 300.
  • Specifically, the user terminal 100 receives operation instruction data including voice data corresponding to the content of the character's speech, motion capture data corresponding to the movement of the character, and the like. Then, the user terminal 100 continuously operates the character based on the operation instruction data. As a result, the user can experience real-time interactive interaction with the character.
  • the user terminal 100 may receive a motion command group in which one or more commands instructing the operation of the character are arranged in the order instructed by the operator of the operation instruction device 300.
  • Like the motion capture data, the motion command group is also associated with the voice data so as to be synchronized with it.
  • the game progress unit 115 of the user terminal 100 can move the character according to the motion command group in accordance with the utterance of the content of the voice data.
  • the operation instruction device 300 communicates with a plurality of user terminals 100 participating in the multiplayer game, and grasps the progress of the game in each user terminal 100 based on the progress information of each user terminal 100. Then, the operation instruction device 300 generates operation instruction data for causing the character to behave according to the overall progress, and distributes the operation instruction data to each user terminal 100.
  • the game executed in the game system 1 according to the present embodiment is, for example, a car racing game in which a plurality of users operate their respective vehicles to go around the same course and compete on time.
  • the progress simulation unit 315 of the operation instruction device 300 acquires progress information from each of the plurality of user terminals 100 participating in the same race.
  • the progress simulation unit 315 may acquire integrated progress information indicating the progress of the entire race from the server 200 that supports the progress of the race and performs synchronization control.
  • the progress simulation unit 315 may arrange the acquired progress information for each user side by side on the progress screen, or may integrate the progress information and arrange the overall progress obtained by the integration on the progress screen. Alternatively, the progress simulation unit 315 may arrange the already-integrated progress information received from the server 200 on the progress screen.
  • the character control unit 316 distributes the operation instruction data generated as described above to a plurality of user terminals 100 participating in the same race.
  • the progress information for each individual user terminal 100 may include, for example, the items "user ID", "user name", "current position", "rank", "lap", and "time (difference from the top)". Also, the integrated progress information generated by the server 200 may include items such as "elapsed time", "top lap", "course map", and "current position of each race car".
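  • As an illustration only, the individual and integrated progress information just listed might be represented as follows; the item names follow the text and all values are illustrative.

```python
# Item names follow the text; all values are illustrative only.
individual_progress = {
    "user ID": "u42",
    "user name": "Player A",
    "current position": (120.5, 34.2),
    "rank": 1,
    "lap": 2,
    "time (difference from the top)": 0.0,
}

integrated_progress = {
    "elapsed time": 95.3,
    "top lap": 2,
    "course map": "course_01",
    "current position of each race car": {
        "u42": (120.5, 34.2),
        "u43": (98.1, 30.0),
    },
}
```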
  • FIG. 9 is a diagram showing another example of the progress screen displayed on the display unit 352 of the operation instruction device 300.
  • the progress simulation unit 315 integrates the progress information acquired from each user terminal 100, and displays the progress screen 500 on the display unit 352.
  • the progress screen 500 includes, as an example, the integrated progress diagram 501, the individual progress list 502, and the integrated progress information 503. Further, the progress screen 500 may include a UI component 403 for the operator (or voice actor) to input voice data to the own device via the microphone 3010.
  • the progress simulation unit 315 analyzes the progress information acquired from the user terminal 100 and the server 200. Based on the analysis result, the progress simulation unit 315 specifically generates the integrated progress diagram 501 as follows.
  • the progress simulation unit 315 draws a course map based on the information of the "course map" item included in the integrated progress information, and plots the race car objects operated by each user on the course map based on the information of the "current position of each race car" item.
  • the progress simulation unit 315 generates an individual progress list 502 as follows.
  • the progress simulation unit 315 extracts each item of "user name”, “rank”, “lap”, and “time” from the individual progress information. Then, the information of each item is arranged so that each extracted item is listed for each user, and the individual progress list 502 is generated.
  • the progress simulation unit 315 may extract the item of "top lap” from the integrated progress information and reflect this in the integrated progress information 503.
  • the operator of the operation instruction device 300 can grasp the whole picture of the progress of the game being executed by the plurality of user terminals 100 by checking the progress screen 500.
  • the progress screen 500 may be provided with a UI component 403.
  • In this way, the operator can confirm the overall progress of the game, decide the content of the utterance on the spot so as to match that progress, and input the voice data corresponding to the content of the utterance to the operation instruction device 300.
  • FIG. 10 is a diagram showing an example of a game screen displayed on the display unit 152 of the user terminal 100.
  • the race screen 600 displayed during the progress of the race in this game is shown.
  • the race screen 600 shows, as an example, a screen displayed on the user terminal 100 of the user running at the top of the race.
  • the game progress unit 115 draws a course in the game space on the race screen 600, for example, and arranges the race car operated by the user. If a race car operated by another user fits within the angle of view, given its positional relationship with the virtual camera in the game space, the game progress unit 115 may also arrange the other user's race car on the race screen 600.
  • the progress information generation unit 117 generates progress information in a timely manner while the game progress unit 115 is proceeding with the race, and transmits the progress information to the operation instruction device 300.
  • the operation instruction device 300, which receives individual progress information from each user terminal 100 participating in the race and integrated progress information from the server 200 that synchronizes the race, displays these pieces of progress information, either as they are or in processed form, on the display unit 352 as the progress screen 500 shown in FIG. 9.
  • the operator who confirms the progress screen 500 can grasp the situation of the race and decide on the content of the live commentary.
  • the operator or the voice actor 701 inputs a voice including a live commentary suitable for the race development to the operation instruction device 300 via the microphone 3010. Further, the operator or the model 702 may input the movement of the character to the operation instruction device 300 via the motion capture device 3020, if necessary.
  • the game progress unit 115 uses the reception of the operation instruction data as a trigger to superimpose the character 601 on the race screen 600 based on the received operation instruction data. For example, the area on which the character 601 can be superimposed is preferably predetermined for each game screen, at a place that does not interfere with the user's play.
  • the motion indicated by the motion capture data included in the operation instruction data is reflected in the motion of the character 601. As in the first embodiment, the motion capture data is obtained by capturing the movement of the model 702 via the motion capture device 3020 at the installation location of the operation instruction device 300. Therefore, the movement of the model 702 is directly reflected in the movement of the character 601 displayed on the display unit 152.
  • the game progress unit 115 outputs the voice data 602 included in the operation instruction data supplied from the operation instruction device 300 as the voice emitted by the character 601 in synchronization with the movement of the character 601. Similar to the first embodiment, the voice data is obtained by acquiring the voice of the voice actor 701 through the microphone 3010 at the installation location of the operation instruction device 300. Therefore, the voice data 602 corresponding to the voice emitted by the voice actor 701 is output as it is from the speaker of the user terminal 100.
  • the operation instruction device 300 can transmit the above-mentioned operation instruction data to all the user terminals 100 participating in the game. Therefore, the character 601 is superimposed and displayed together with the race screen 600 on each of the display units 152 of the participating user terminals 100.
  • the present invention can be applied even in a multiplayer game in which a plurality of user terminals 100 participate.
• The user can request the progress of a completed live distribution part, and the live distribution part can be re-progressed based on the received operation instruction data. As a result, the user can look back at a live stream again, and even a user who missed it can still watch the live stream.
• Here, a scene after the end of the live distribution time is assumed.
  • the character here is assumed to be a character (including an avatar object) that is not a target of direct operation by the user.
  • the "live distribution part” mainly includes the above-mentioned battle part, but may further include a lottery part and a tutorial part (the same applies hereinafter).
  • the user terminal 100 is configured to execute the following steps in order to improve the interest of the game based on the game program 131.
• The user terminal 100 (computer) executes: a step of requesting, for example via an operation unit such as the input unit 151, the progress of a completed live distribution part; a step of receiving, from the server 200 or the operation instruction device 300, the recorded operation instruction data related to the completed live distribution part; and a step of advancing the completed live distribution part by operating the character based on the recorded operation instruction data.
• The recorded operation instruction data includes motion data and voice data input by the operator associated with the character.
  • the operator includes not only the model and the voice actor but also the operator who performs some operation on the operation instruction device 300, but does not include the user.
• The recorded operation instruction data is preferably stored in the storage unit 220 of the server 200 or the storage unit 320 of the operation instruction device 300, and delivered to the user terminal 100 again in response to a request from the user terminal 100.
• The progress of the completed live distribution part based on the recorded operation instruction data differs depending on whether or not the user has progressed the live distribution part in real time. Specifically, when it is determined that the user has a track record of advancing the live distribution part in real time, it is preferable to advance again the same live distribution part that the user advanced in real time (return delivery). In return delivery, it is preferable to allow selective progression of the live distribution part. On the other hand, when it is determined that the user has no record of progressing the live distribution part in real time, it is preferable to proceed with the live distribution part in a progress mode different from the one progressed in real time (missed delivery).
• This includes cases where the user could have progressed the real-time live distribution part but did not actually do so. For missed delivery, it is preferable to perform a limited progression of the live distribution part.
  • the game progress unit 115 further receives the user action history information in the live distribution part.
• The user action history information is a data set of user actions recorded via input operations during the progress of the live distribution part, separate from the contents of the recorded operation instruction data.
• The user action history information is preferably associated with the recorded operation instruction data and stored in the storage unit 220 of the server 200 or the storage unit 320 of the operation instruction device 300. Alternatively, the user action history information may be stored in the storage unit 120 of the user terminal 100.
  • FIG. 11 is a diagram showing an example of a data structure of user behavior history information.
• The user action history information includes, for example, items such as the action time, action type, and action details of actions the user performed in the live distribution part, and is associated with a user ID that identifies the user. The item "action time" is time information indicating when the user performed the action in the live distribution part; the item "action type" is a type indicating the user's action; and the item "action details" is the specific content of the user's action.
• Such actions may include, for example, changing items worn by the character, such as clothing (so-called dress-up). Such actions may also include selecting a time for later playback of a particular portion of the live distribution part. In addition, such actions may include the acquisition of rewards, points, and the like during the live distribution part.
• The user action history information is preferably mutually associated with the data structure of the operation instruction data and with the data structure of the game information described later with reference to FIG. 12. It should be understood by those skilled in the art that these data structures are merely examples and are not limited thereto.
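• As a concrete illustration of the layout of FIG. 11, the record below models the three items named above. This TypeScript sketch is an assumption for readability only; the actual field names, types, and action categories are not specified by this disclosure.

```typescript
// Hypothetical record mirroring FIG. 11: one entry per user action.
type ActionType = "comment" | "tip" | "dressUp" | "timeSelect" | "reward";

interface UserActionRecord {
  actionTimeMs: number;   // "action time": offset from the start of the live
  actionType: ActionType; // "action type"
  detail: string;         // "action details", e.g. comment text or an item id
}

interface UserActionHistory {
  userId: string;              // the user ID that identifies the user
  liveId: string;              // ties the history to one live distribution part
  actions: UserActionRecord[]; // recorded in order of action time
}
```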
  • FIG. 12 is a diagram showing an example of the data structure of the game information 132 processed by the system 1 according to the present embodiment.
  • the items provided in the game information 132 are appropriately determined according to the genre, nature, content, etc. of the game, and the exemplary items do not limit the scope of the present invention.
  • the game information 132 may be configured to include each item of "play history”, “item”, “delivery history”, and "game object". Each of these items may be appropriately referred to when the game progress unit 115 advances the game.
  • the user's play history is stored in the item "play history”.
  • the play history is information indicating whether or not the user's play is completed for each scenario stored in the storage unit 120.
• A scenario is, for example, a unit of video distribution (for example, a tutorial, a game, a lottery, or the like in each game part), or a unit of playback of part or all of a video in the return delivery or missed delivery described later.
  • the play history may include a list of fixed scenarios downloaded at the first time of play and a list of acquired scenarios acquired later. In each list, statuses such as "played", “unplayed”, “playable”, and "unplayable” are associated with each scenario.
  • the item "item” stores a list of items owned by the user as a game medium.
  • the item is, for example, a clothing item worn by a character.
  • the user can make the character wear the clothing items obtained by playing the scenario and customize the appearance of the character.
  • the item "Distribution history” stores a list of videos, so-called back numbers, that were live-distributed by the operator in the past in the live distribution part.
  • the video that is PUSH-distributed in real time can be viewed only at that time.
  • the moving images for past distribution are recorded by the server 200 or the operation instruction device 300, and can be PULL distributed in response to a request from the user terminal 100.
  • the back number may be made available for download by the user for a fee.
• The item "game object" stores data of various objects that appear in the live distribution part, such as the character 801, as well as enemy objects and obstacle objects in the battle game.
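• For orientation, the game information 132 described above might be modeled as follows. This is a hypothetical sketch: the statuses mirror the ones named for the play history, and all other fields are assumed simplifications rather than the actual structure of this disclosure.

```typescript
type ScenarioStatus = "played" | "unplayed" | "playable" | "unplayable";

// Hypothetical model of the game information 132 (FIG. 12).
interface GameInformation {
  playHistory: {
    fixedScenarios: Record<string, ScenarioStatus>;    // downloaded at first play
    acquiredScenarios: Record<string, ScenarioStatus>; // acquired later
  };
  items: string[];            // clothing items etc. owned as game media
  deliveryHistory: string[];  // ids of past live distributions ("back numbers")
  gameObjects: string[];      // character, enemy, and obstacle object ids
}
```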
  • FIG. 13 is a flowchart showing an example of a basic game progress of a game executed based on the game program according to the present embodiment.
  • the processing flow is applied to the scenes after the end of the live distribution time when the real-time live distribution part has already been completed.
• In step S301, the user terminal 100 newly requests, via the operation unit, the progress of the completed live distribution part.
• In step S302, in response to the request in step S301, the user terminal 100 receives the recorded operation instruction data related to the completed live distribution part from the server 200 or the operation instruction device 300.
• The recorded operation instruction data includes motion data and voice data input by the operator associated with the character.
  • the user terminal 100 may receive various progress record data acquired and recorded along with the movement of the character during the progress of the real-time live distribution part.
• The progress record data may include viewer behavior data recording how the users who participated in the real-time live distribution part behaved in response to the movement of the character.
• The viewer behavior data is data including a record of the in-live behavior of all the users who advanced the real-time live distribution part in real time (that is, the viewers who participated in the live).
• The viewer behavior data preferably includes messaging content, such as text messages and icons, sent by viewers to the character in real time during the live.
• The viewer behavior data may be, for example, the user action history information shown in FIG. 11.
  • the recorded operation instruction data and progress record data may be received by the user terminal 100 as separate data, and each may be analyzed (rendered).
• Alternatively, in the server 200 or the operation instruction device 300, the previously recorded operation instruction data and the viewer behavior data may be combined, and the combined data set may be received by the user terminal 100 at one time. Receiving the combined data set makes it possible to reduce the load of the subsequent data analysis (rendering) on the user terminal 100.
• Hereinafter, it is assumed that the progress record data is combined with the recorded operation instruction data (that is, the recorded operation instruction data includes the progress record data).
• In step S303, the game progress unit 115 determines whether or not the user has a track record of progressing the live distribution part in real time. The determination may be performed, for example, based on whether there is a record of the operation instruction data having been sent to the user terminal 100. Alternatively, it may be performed by referring to the item "play history" shown in FIG. 12 and checking whether the status of the live distribution part is "played", or by referring to the item "distribution history" and checking whether there is a record of live distribution from the character in the past. In addition, when recorded operation instruction data is already stored in the storage unit 120 of the user terminal 100, it may be determined that the live distribution part has already been advanced in real time. The determination may also be performed by combining these, or by any other method.
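• The determination in step S303 thus combines several independent signals, any one of which may suffice. A possible shape of that check is sketched below; all names are hypothetical, and the choice of an OR over the signals is just one of the combination methods the text allows.

```typescript
// Hypothetical inputs to the step-S303 decision described above.
interface TrackRecordSources {
  sentInstructionLog: boolean;       // record that operation instruction data was sent
  playHistoryStatus?: "played" | "unplayed" | "playable" | "unplayable"; // "play history"
  hasPastLiveFromCharacter: boolean; // from the "distribution history" item
  localRecordedDataExists: boolean;  // recorded data already in storage unit 120
}

// True when the user is judged to have progressed the live
// distribution part in real time (the YES branch of step S303).
function hasRealtimeTrackRecord(src: TrackRecordSources): boolean {
  return (
    src.sentInstructionLog ||
    src.playHistoryStatus === "played" ||
    src.hasPastLiveFromCharacter ||
    src.localRecordedDataExists
  );
}
```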
• If it is determined in step S303 that the user has a track record of advancing the live distribution part in real time (YES), the progress of the completed live distribution part is a "return delivery". On the other hand, if it is determined in step S303 that the user has no record of advancing the live distribution part in real time (NO), the progress of the completed live distribution part is a "missed delivery". As mentioned above, the user experience differs between return delivery and missed delivery.
• If it is determined in step S303 that the user has a track record of advancing the live distribution part in real time, the processing flow proceeds from YES in step S303 to step S304.
• In step S304, the game progress unit 115 acquires and analyzes the user action history information of the live distribution part shown in FIG. 11.
  • the user action history information may be acquired from the server 200 or the operation instruction device 300, or may be used directly when it is already stored in the storage unit 120 of the user terminal 100.
• In step S305, the game progress unit 115 re-progresses the completed live distribution part (that is, the above-mentioned return delivery). Specifically, the recorded operation instruction data and the user action history information analyzed in step S304 are used to re-progress the live distribution part.
• In the return delivery, it is preferable to give the user a choice of costumes to be worn by the character, as in the live distribution part. For example, the user may be able to select one of the costumes assigned to the user's ranking band or below. As a result, in the return delivery, a larger number of costume options can be provided to high-ranked users.
• Also, a thrown (tipped) item input in the real-time live distribution part may be reflected in the movement mode of the character. For example, if the user acquired a clothing item (here, a "necklace") as a reward, the live distribution part may be re-progressed with the character operating based on that item (that is, wearing the necklace). That is, the re-progress of the live distribution part reflects the user action history information and the reward information; it is similar to the live distribution part that progressed in real time, yet unique to the user. In this way, the live distribution part is re-progressed.
• Further, in the return delivery, the user can specify a specific action time and have the live distribution part selectively advanced from there. For example, if the user input a comment 2 minutes and 45 seconds after the start of the live distribution part, the user can advance the live distribution part again by specifying the timing from 2 minutes and 45 seconds onward.
• Such selective progression is preferably made feasible based on action times corresponding to the consumption of valuable data by the user's input operations and to records of actions such as changing the character's clothing items.
• Alternatively, the live distribution part can be selectively progressed using action time data specifying a period. For example, if the user selects the period from 2 minutes 45 seconds to 5 minutes 10 seconds after the start of the live distribution part, the user can re-progress the live distribution part over that period.
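• Both the time-specified and the period-specified re-progression described above amount to filtering the recorded data by action time. A minimal sketch, assuming only that each recorded event carries a millisecond offset (the function and type names are hypothetical):

```typescript
interface TimedEvent {
  timeMs: number; // offset from the start of the live distribution part
}

// Re-progress only the portion of a recording that falls inside the
// user-selected window, e.g. from 2:45 (165000 ms) to 5:10 (310000 ms).
function selectReplayWindow<T extends TimedEvent>(
  recording: T[],
  fromMs: number,
  toMs: number = Number.POSITIVE_INFINITY, // open-ended: "from there onward"
): T[] {
  return recording.filter(ev => ev.timeMs >= fromMs && ev.timeMs <= toMs);
}

// Usage: jump to the moment the user commented at 2 minutes 45 seconds.
// const window = selectReplayWindow(frames, 165_000);
```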
• On the other hand, if it is determined in step S303 that the user has no record of advancing the live distribution part in real time, the processing flow proceeds from NO in step S303 to step S306.
• In step S306, the game progress unit 115 executes a limited progression of the completed live distribution part (that is, the above-mentioned missed delivery).
• The reason the missed delivery is restricted is based on the idea that, although the user had the right to receive the live stream, the user can be considered to have waived this right, so it is not necessarily required to reproduce and present the entire live stream to the user.
• In the missed delivery as well, the progress of the live distribution part is executed using the recorded operation instruction data.
• For example, suppose the user acquired a clothing item (e.g., a "necklace") as a reward. The live distribution part progressed in real time would image-synthesize the item so that the character acts while wearing it (that is, the character acts while wearing the necklace or other clothing item). That is, in the real-time live distribution part, the movement mode of the character was associated with clothing items. In the missed delivery, by contrast, clothing items and the like are not associated with the movement mode of the character; that is, image composition is not performed to make the character act while wearing the item. In this respect, the progress of the completed live distribution part is limited in that it does not reflect information such as clothing items and is not unique to the user.
• Similarly, as described above, a thrown (tipped) item input in the real-time live distribution part can be reflected in the movement mode of the character. In the missed delivery, however, an item the user threw in the real-time live distribution part is not reflected in the movement mode of the character (although items thrown by other viewer users who were watching the live stream in real time may be reflected in the movement mode of the character in the missed delivery).
• The return delivery, by contrast, is premised on the user having viewed the real-time live distribution in the past.
• Further, in the missed delivery, unlike the live distribution part that progressed in real time, it is preferable to limit the user actions that can be accepted. Specifically, in the live distribution part that progressed in real time, the consumption of valuable data by the user's input operations (in one example, throwing money, charging by purchasing items, and the like) could be accepted. In the progress of the completed live distribution part, on the other hand, the consumption of such valuable data may be restricted so as not to be accepted. More specifically, in the live distribution part progressed in real time, a user interface (UI) including buttons and screens for executing the consumption of valuable data is displayed on the display unit 152, and the user is assumed to be able to execute the consumption of valuable data through input operations on such a UI. In the missed delivery, on the other hand, such a UI should be hidden so that the user cannot explicitly perform the input operation. As a result, in the return delivery and the missed delivery, the user cannot throw in tipped items or the like to support the character.
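• The restriction described above reduces to a mode check before showing the valuable-data UI. A minimal sketch, under assumed names (the mode enumeration and function are illustrative only):

```typescript
type DeliveryMode = "realtime" | "return" | "missed";

// Show the buttons/screens for consuming valuable data (tips, item
// purchases) only while the live distribution part progresses in real
// time; hide them in both return delivery and missed delivery.
function isValuableDataUiVisible(mode: DeliveryMode): boolean {
  return mode === "realtime";
}
```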
• The live distribution part and the like include, but are not limited to, user-participatory events as described in Embodiments 1 and 2, through which the user is provided with an interactive experience with the character.
• User-participatory events can include games such as those of Embodiments 1 and 2, questionnaires provided by characters, quizzes given by characters, battles with characters (e.g., rock-paper-scissors games, bingo games), and the like. Then, as in real-time live distribution, the participation result of such a user-participatory event is fed back to the user in the missed delivery.
• For example, in a quiz, the result of the correctness determination is fed back to the user.
• In the missed delivery, the program may automatically make only simple judgments (such as correctness judgments) and give feedback.
• If the user's answer differs from the one given during live participation, a display such as "The answer is different from that during the live" may be output to the user terminal by comparing it with the answers of the users who participated in the live.
• On the other hand, in the missed delivery, the user may be restricted from earning predetermined game points for the above feedback. In the real-time live distribution part, predetermined game points may be associated with the user and added to the points the user owns, whereas in the missed delivery such points may not be associated with the user. For example, in a game in which a plurality of users who are game players are ranked based on the points they own, advancing the completed live distribution part therefore does not affect the ranking.
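• The point restriction can be expressed the same way as the UI restriction: award and associate points only for real-time participation, so replays never move point-based rankings. A hypothetical sketch:

```typescript
type DeliveryMode = "realtime" | "return" | "missed";

interface UserPoints {
  userId: string;
  points: number; // points the user owns, used for ranking
}

// Add feedback points only when they are earned in the real-time live
// distribution part; in return/missed delivery no points are associated
// with the user, so the ranking is unaffected.
function awardFeedbackPoints(u: UserPoints, earned: number, mode: DeliveryMode): void {
  if (mode === "realtime") {
    u.points += earned;
  }
}
```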
  • the user terminal 100 may request the progress of the completed live distribution part again. That is, it is preferable that the return delivery or the missed delivery can be repeatedly executed a plurality of times. In this case, the processing flow returns to step S301.
• As described above, with the user terminal 100, even after the live distribution part has progressed in real time, the user can proceed with the live distribution part again in various modes. As a result, the user feels more attached to the character through the experience of interacting with the character with a rich sense of reality, and can therefore play other parts that operate the character with even greater interest. This has the effect of increasing the immersive feeling of the game and improving its interest.
• <Modification 1> In the above, whether the progress of the completed live distribution part is a return delivery or a missed delivery is determined based on whether or not the user has a track record of advancing the live distribution part in real time. Instead, the user may be allowed to select either the return delivery or the missed delivery. Alternatively, regardless of the presence or absence of such a track record, only the missed delivery may be provided to the user.
• <Modification 2> As described above, the progress of the completed live distribution part may be requested again; that is, the return delivery or the missed delivery may be repeatedly executed a plurality of times. The second and subsequent return deliveries or missed deliveries may take into account the record of the previous return delivery or missed delivery.
• For example, history data of the first delivery may be stored in the storage unit 220 of the server 200 or the storage unit 320 of the operation instruction device 300. After that, when the recorded operation instruction data related to the completed live distribution part is requested again from the user terminal 100, the first delivery history data is delivered from the server 200 or the operation instruction device 300 together with the recorded operation instruction data. The user terminal 100 refers to the received first delivery history data, and if the first return delivery or missed delivery was interrupted partway through, resumes the progress of the second return delivery or missed delivery from that point. As a result, the user can perform the return delivery or missed delivery efficiently.
• Note that if the first delivery was a return delivery, a return delivery should be executed from the second time onward, and if the first delivery was a missed delivery, a missed delivery should be executed from the second time onward. Further, when the recorded operation instruction data already exists in the user terminal 100, the user terminal 100 need not receive the recorded operation instruction data again. As a result, the amount of data received by the user terminal 100 can be reduced.
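• Modification 2's resume behavior can be pictured as storing a watermark of how far the previous replay got. The sketch below is an assumption about how such delivery history data could look and be applied; none of the names come from this disclosure.

```typescript
// Hypothetical history record for a previous return/missed delivery.
interface DeliveryHistory {
  liveId: string;
  mode: "return" | "missed"; // the same mode is reused on later replays
  stoppedAtMs: number | null; // null = the previous replay ran to the end
}

// Decide where the second and subsequent replays should start.
function resumeOffsetMs(history: DeliveryHistory | undefined): number {
  if (history && history.stoppedAtMs !== null) {
    return history.stoppedAtMs; // resume from where the first replay stopped
  }
  return 0; // no history, or finished previously: start from the beginning
}
```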
• <Modification 3> As described above, whether the progress of the completed live distribution part is a return delivery or a missed delivery is determined according to whether the user actually advanced the live distribution part in real time (step S303 in FIG. 13).
• In the third modification, when it is determined that the user progressed the live distribution part only partway in real time, it is preferable to resume the progress of the completed live distribution part from that point.
• How far the user advanced the live distribution part in real time can be determined from the user action history information described above with reference to FIG. 11. That is, the user action history information may record how far the user progressed in a specific live distribution part.
• In this case, the resumption of the completed live distribution part should be a missed delivery, which is a limited progression. As a result, the user can execute the missed delivery efficiently.
• FIG. 14 shows examples of screens displayed on the display unit 152 of the user terminal 100 based on the game program according to the present embodiment, and examples of transitions between these screens.
• The screens include a home screen 800A, a live selection screen 800B for live distribution, and a missed selection screen 800C for missed delivery.
• From the home screen 800A, a transition to the live selection screen 800B is possible.
• From the live selection screen 800B, transitions to the home screen 800A and to the missed selection screen 800C are possible.
• The actual distribution screen (not shown) is reached by transition from the live selection screen 800B and from the missed selection screen 800C.
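• The allowed transitions of FIG. 14 form a small directed graph; notably, the missed selection screen is reachable only via the live selection screen. A hypothetical encoding (screen names are illustrative stand-ins for 800A/800B/800C and the distribution screen):

```typescript
type Screen = "home" | "liveSelect" | "missedSelect" | "distribution";

// Directed edges matching FIG. 14: home <-> liveSelect,
// liveSelect <-> missedSelect, and both selection screens -> distribution.
// There is deliberately no home -> missedSelect edge.
const transitions: Record<Screen, Screen[]> = {
  home: ["liveSelect"],
  liveSelect: ["home", "missedSelect", "distribution"],
  missedSelect: ["liveSelect", "distribution"],
  distribution: [],
};

function canTransition(from: Screen, to: Screen): boolean {
  return transitions[from].includes(to);
}
```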
  • the home screen 800A displays various menus for advancing the live distribution part on the display unit 152 of the user terminal 100.
• When the game progress unit 115 receives an input operation for starting the game, it first displays the home screen 800A.
  • the home screen 800A includes a "live" icon 802 for transitioning to the live selection screen 800B.
• Upon receiving an input operation on the "live" icon 802 on the home screen 800A, the game progress unit 115 causes the display unit 152 to display the live selection screen 800B.
• The live selection screen 800B presents the user with information on lives available for distribution. On the live selection screen 800B, a list of one or more items of live notification information for notifying the user in advance of the live distribution time and the like is displayed.
  • the live announcement information includes at least the live delivery date and time.
  • the live announcement information may include free / paid live information, an advertisement image including an image of a character appearing in the live, and the like.
• The live selection screen 800B may display notification information regarding a live distribution scheduled for the near future on the pop-up screen 806.
• The server 200 searches for one or more user terminals 100 that have the right to receive the live distribution. The right to receive the live distribution is conditioned on the consideration for receiving it having been paid (for example, holding a ticket). The corresponding live notification information is then displayed on the user terminals 100 that have the right to receive the live distribution.
  • the user terminal 100 accepts a live playback operation, for example, a selection operation for a live at the live distribution time on the live selection screen 800B (more specifically, a touch operation for a live image). Accordingly, the game progress unit 115 shifts the display unit 152 to the actual distribution screen (not shown). As a result, the user terminal 100 can advance the live distribution part and advance the live viewing process in real time.
  • the game progress unit 115 operates the character in the live distribution part based on the received operation instruction data.
  • the game progress unit 115 generates a moving image reproduction screen including a character that operates based on the operation instruction data in the live distribution part, and displays it on the display unit 152.
• Further, the live selection screen 800B may display, on the display unit 152, a "return (x)" icon 808 for transitioning to the previously displayed screen and a "missed delivery" icon 810 for transitioning to the missed selection screen 800C.
  • the game progress unit 115 shifts the screen 800B to the home screen 800A in response to an input operation for the “return (x)” icon 808 on the live selection screen 800B.
• In response to an input operation on the "missed delivery" icon 810 on the live selection screen 800B, the game progress unit 115 shifts from the screen 800B to the missed selection screen 800C.
• The missed selection screen 800C displays delivered information related to one or more lives delivered in the past, in particular, delivered information for lives whose live distribution part the user did not actually advance in real time.
• When the operation unit of the user terminal 100 receives an input operation on the live distribution information displayed on the missed selection screen 800C, for example on the image 830 including the character appearing in the live, the game progress unit 115 can proceed with the live distribution part that has already been completed.
• The delivered information about each live may further include the playback time 812 of the delivered live, the period until the end of delivery (in days or the like) 814, information 816 indicating how many days before the present the live was delivered, the past delivery date and time, and the like.
• The missed selection screen 800C includes a "return (←)" icon 818 for transitioning to the live selection screen 800B. In response to an input operation on the "return (←)" icon 818, the game progress unit 115 transitions to the live selection screen 800B.
• Although not limited to this, it is preferable that the missed selection screen 800C be reached only from the live selection screen 800B and not directly from the home screen 800A.
• This is because the missed delivery is provided for users who missed the live distribution and is merely a function accompanying the live distribution function.
• One of the purposes of this game is to enhance its fun by allowing the user to watch the live stream in real time, support the character in real time, and deepen the interaction with the character. Therefore, in order to guide the user toward watching the live distribution in real time rather than the missed delivery, in which real-time interaction with the character (operator) is not possible, it is preferable here that the missed selection screen 800C cannot be reached directly from the home screen 800A.
• In the above, delivered information for lives whose live distribution part the user did not advance in real time is displayed; however, delivered information about all lives delivered in the past may instead be displayed in a list for each live.
• In this case as well, it is preferable that either the return delivery or the missed delivery be executed depending on whether or not the user progressed the live distribution part in real time. Specifically, when it is determined that the user has a track record of advancing the live distribution part in real time, the above-mentioned return delivery is performed. On the other hand, when it is determined that the user has no such record, the missed delivery is performed. As described above with respect to FIG. 13, different user experiences can be provided by the return delivery and the missed delivery.
• The control blocks of the control unit 110 (particularly, the operation reception unit 111, the display control unit 112, the UI control unit 113, the animation generation unit 114, the game progress unit 115, the analysis unit 116, and the progress information generation unit 117), the control blocks of the control unit 210 (particularly, the progress support unit 211 and the shared support unit 212), and the control blocks of the control unit 310 (particularly, the operation reception unit 311, the display control unit 312, the UI control unit 313, the animation generation unit 314, the progress simulation unit 315, the character control unit 316, and the reaction processing unit 317) may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
• In the latter case, the control unit 110, the control unit 210, or the control unit 310, or an information processing device including a plurality of these units, is provided with a CPU that executes the instructions of a program, which is software realizing each function; a ROM (Read Only Memory) or storage device (referred to as a "recording medium") in which the program and various data are recorded so as to be readable by a computer (or CPU); a RAM (Random Access Memory) into which the program is loaded; and the like. The object of the present invention is then achieved by the computer (or CPU) reading the program from the recording medium and executing it.
• As the recording medium, a "non-transitory tangible medium", for example, a tape, a disk, a card, a semiconductor memory, a programmable logic circuit, or the like can be used.
• The program may also be supplied to the computer via any transmission medium (a communication network, a broadcast wave, or the like) capable of transmitting it. Note that one aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
• (Item 1) A method is performed by a computer as a user terminal comprising a processor, a memory, a display unit, and an operation unit. The user terminal is configured to be able to communicate via a network with an external device that exists in a space physically separated from the space in which the user terminal exists, and that controls at least one character appearing in the game in response to input from a performer who is different from the user of the user terminal and who plays the character in a space invisible to the user. The method includes: a step of advancing, by the processor, the game in response to the user's input operations input to the computer via the operation unit; a step of sequentially transmitting, to the external device, as information indicating the progress of the game, progress information for enabling display of a game screen of the game in progress, including the screen displayed on the display unit or a simulated screen that simplifies that screen; a step of receiving, from the external device, voice data acquired by the external device, which displays the game screen of the game in progress in real time so that the performer can see it based on the sequentially transmitted progress information, the voice data being of voice uttered by the performer at an arbitrary timing toward the game screen displayed in real time and not directly audible to the user; a step of, triggered by receiving the voice data, operating the character by causing a character appearing during the progress of the game to speak at least the content of the voice data; and a step of, after the end of the game, operating the character again based on voice data received again from the outside in response to a request to view the progressed game.
• In the above method, the receiving step may also receive, from the external device, motion data input by the performer who plays the character in the external device, together with the voice data; the step of operating the character may move the character according to the motion data in accordance with the utterance of the content of the voice data; and the step of operating the character again may operate the character again based on the motion data in addition to the voice data.
• In the above method, the record of actions may include time information, and the step of operating the character again may follow a designation of the time information made by the user's input operation via the operation unit while the game was in progress.
• In the above method, the record of actions may include the consumption of valuable data by the user's input operations, and during execution of the step of operating the character again, the character's mode of operation may be determined based on the consumption of the valuable data.
• (Item 7) A method is executed by a first user terminal of a first user, the first user terminal comprising a first processor and a first memory, the first user being other than at least one second user who viewed, in real time, the real-time distribution of a game performed by a performer. The method includes, by the first processor: after the real-time distribution of the game is completed, a step of receiving, from the outside, recorded voice data related to the game, the voice data being for controlling at least one character appearing in the game; and a step of executing playback of the distributed game by operating the character based on the voice data. The distributed game is a game generated by processing executed by a computer as a second user terminal of a second user, the second user terminal comprising a second processor, a second memory, a display unit, and an operation unit, and being configured to communicate via a network with an external device that exists in a space physically separated from the space in which the second user terminal exists and that controls the character in response to input from a performer who is different from the second user of the second user terminal and who plays the character in a space invisible to the second user. The processing includes, by the second processor: advancing the game in response to the second user's input operations input to the computer via the operation unit; sequentially transmitting, to the external device, as information indicating the progress of the game, progress information for enabling display of a game screen of the game in progress, including the screen displayed on the display unit or a simulated screen that simplifies that screen; receiving, from the external device, voice data acquired by the external device, which displays the game screen of the game in progress in real time so that the performer can see it based on the sequentially transmitted progress information, the voice data being of voice uttered by the performer at an arbitrary timing toward the game screen displayed in real time and not directly audible to the second user; and, triggered by receiving the voice data, operating the character by causing a character appearing during the progress of the game to speak at least the content of the voice data.
• In the above method, the step of receiving the voice data executed by the computer may also receive, from the external device, motion data input by the performer who plays the character in the external device, together with the voice data; the step of operating the character executed by the computer may move the character according to the motion data in accordance with the utterance of the content of the voice data; and the step of executing playback of the distributed game executed by the first user terminal may play back the distributed game by operating the character based on the motion data in addition to the voice data.
• (Item 10) In the above method, the first user terminal further includes a display unit, and the display unit is configured to be able to display a first screen for displaying a menu related to real-time distribution, a second screen for displaying a viewable real-time distribution, and a third screen for displaying information about the distributed game, the third screen being configured so as to be transitioned to from the second screen and not to be transitioned to from the first screen.
• (Item 11) The method of (Item 10) further includes a step of transitioning from the second screen to the third screen when an input operation on the second screen by the first user is accepted.
• A computer-readable medium stores computer-executable instructions that, when executed, cause the processor to execute the steps included in the method of any one of (Item 1) to (Item 6).
• A computer-readable medium stores computer-executable instructions that, when executed, cause the first processor to execute the steps included in the method of any one of (Item 7) to (Item 11).
• An information processing apparatus serves as a user terminal including a processor, a memory, a display unit, and an operation unit. The user terminal is configured to be able to communicate via a network with an external device that exists in a space physically separated from the space in which the user terminal exists, and that controls at least one character appearing in the game in response to input from a performer who is different from the user of the user terminal and who plays the character in a space invisible to the user. By reading a program stored in the memory, the processor is configured to: advance the game in response to the user's input operations input to the computer via the operation unit; sequentially transmit, to the external device, progress information for enabling display of a game screen of the game in progress, including the screen displayed on the display unit or a simulated screen that simplifies that screen; receive, from the external device, voice data acquired by the external device, which displays the game screen of the game in progress in real time so that the performer can see it based on the sequentially transmitted progress information, the voice data being of voice uttered by the performer at an arbitrary timing toward the game screen displayed in real time and not directly audible to the user; triggered by receiving the voice data, operate the character by causing a character appearing during the progress of the game to speak at least the content of the voice data; and, after the end of the game, operate the character again based on voice data received again from the outside in response to a request to view the progressed game.
• An information processing apparatus serves as a first user terminal of a first user, comprising a first processor and a first memory, the first user being other than at least one second user who viewed, in real time, the real-time distribution of a game performed by a performer. By reading a program stored in the first memory, the first processor is configured to: after the real-time distribution of the game is completed, receive, from the outside, recorded voice data related to the game, the voice data being for controlling at least one character appearing in the game; and execute playback of the distributed game by operating the character based on the voice data. The distributed game is a game generated by processing executed by a computer as a second user terminal of a second user, the second user terminal comprising a second processor, a second memory, a display unit, and an operation unit, and being configured to communicate via a network with an external device that exists in a space physically separated from the space in which the second user terminal exists and that controls the character in response to input from a performer who is different from the second user of the second user terminal and who plays the character in a space invisible to the second user. The processing includes, by the second processor reading a program stored in the second memory: advancing the game in response to the second user's input operations input to the computer via the operation unit; sequentially transmitting, to the external device, as information indicating the progress of the game, progress information for enabling display of a game screen of the game in progress, including the screen displayed on the display unit or a simulated screen that simplifies that screen; receiving, from the external device, voice data acquired by the external device, which displays the game screen of the game in progress in real time so that the performer can see it based on the sequentially transmitted progress information, the voice data being of voice uttered by the performer at an arbitrary timing toward the game screen displayed in real time and not directly audible to the second user; and, triggered by receiving the voice data, operating the character by causing a character appearing during the progress of the game to speak at least the content of the voice data.
• A method is for controlling a character appearing in a game. The method is performed by a computer comprising a processor, a memory, and a display unit, the computer controlling at least one character appearing in the game in response to input from a performer who plays the character, and being capable of communicating via a network with a user terminal of a user different from the performer, the user terminal existing in a space physically separated from the space in which the computer exists, such that the user cannot see the performer. The method includes, by the processor: a step of displaying, in real time on the display unit of the computer so that the performer can visually recognize it, the game screen of the game in progress, including the screen displayed on the display unit of the user terminal or a simulated screen that simplifies that screen, based on progress information sequentially transmitted from the user terminal in which the game is progressing, the progress information enabling display of the game screen of the game in progress; a step of accepting voice uttered by the performer at an arbitrary timing toward the game screen and not directly audible to the user; a step of transmitting the voice data of the accepted voice to the user terminal so as to cause the character to speak the content of the voice data on the user terminal; and a step of, after the end of the game, transmitting the voice data to a terminal other than the user terminal.
• A computer-readable medium contains computer-executable instructions that, upon execution, cause the processor to perform the steps included in the above method.
• An information processing device is for controlling a character appearing in a game. The information processing device includes a processor, a memory, and a display unit; it is a computer that controls at least one character appearing in the game in response to input from a performer who plays the character, and it exists in a space physically separated from the space in which the user terminal of a user different from the performer exists, so that the user cannot see the performer. The information processing device is configured to be able to communicate with the user terminal via a network. By reading a program stored in the memory, the processor is configured to: display, in real time on the display unit of the computer so that the performer can visually recognize it, the game screen of the game in progress, including the screen displayed on the display unit of the user terminal or a simulated screen that simplifies that screen, based on progress information sequentially transmitted from the user terminal that advances the game; accept voice uttered by the performer at an arbitrary timing toward the game screen displayed in real time and not directly audible to the user; transmit the voice data of the accepted voice to the user terminal; and, after the end of the game, in response to a request from the user terminal, cause the character to speak the content of the voice data on the user terminal in the same manner as during the finished game.
• 1 game system, 2 network, 10, 20, 30 processor, 11, 21, 31 memory, 12, 22, 32 storage, 13, 23, 33 communication IF (operation unit), 14, 24, 34 input/output IF (operation unit, display unit), 15, 35 touch screen (display unit, operation unit), 17 camera (operation unit), 18 distance measurement sensor (operation unit), 100 user terminal (computer, information processing device), 110, 210, 310 control unit, 111, 311 operation reception unit, 112, 312 display control unit, 113, 313 UI control unit, 114, 314 animation generation unit, 115 game progress unit, 116 analysis unit, 117 progress information generation unit, 120, 220, 320 storage unit, 131 game program, 132 game information, 133 user information, 134 character control program, 151, 351 input unit (operation unit), 152, 352 display unit, 200 server (computer), 211 progress support unit, 212 shared support unit, 300 operation instruction device (computer, NPC control device, character control device, information processing device), 315 progress simulation unit, 316 character control unit, 3

Abstract

The present invention enhances the amusement of a game. Provided is a method executed by a computer serving as a user terminal comprising a processor, a memory, a display unit, and an operation unit. The user terminal can communicate with an external device that is present in a space physically separated from the space in which the user terminal is present, and that controls a character in accordance with input from a performer who is different from the user and who portrays the character during the game. The method comprises: a step in which the game is advanced, by means of the processor, according to user input; a step in which progress information of the ongoing game is sequentially transmitted to the external device; a step in which audio data, which is acquired by the external device that displays a game screen in real time to the performer on the basis of the progress information, is received, the audio data being audio spoken by the performer to the game screen at an arbitrary timing; a step in which, when the audio data is received, content of the audio data is caused to be spoken by the character in the game so as to operate the character; and a step in which, after completion of the game, the character is operated again on the basis of audio data that is received again from the outside in accordance with a request for viewing the advanced game.

Description

Method, computer-readable medium, and information processing device
One aspect of the present disclosure is a method performed by a computer as a user terminal including a processor, a memory, a display unit, and an operation unit. The user terminal is configured to be able to communicate via a network with an external device that exists in a space physically separated from the space in which the user terminal exists and that controls at least one character appearing in the game in response to input from a performer who is different from the user of the user terminal and who plays the character in a space invisible to the user. The method includes: a step of advancing, by the processor, the game in response to the user's input operations input to the computer via the operation unit; a step of sequentially transmitting, to the external device, as information indicating the progress of the game, progress information for enabling display of a game screen of the game in progress, including the screen displayed on the display unit or a simulated screen that simplifies that screen; a step of receiving, from the external device, voice data acquired by the external device, which displays the game screen of the game in progress in real time so that the performer can see it based on the sequentially transmitted progress information, the voice data being of voice uttered by the performer at an arbitrary timing toward the game screen displayed in real time and not directly audible to the user; a step of, triggered by receiving the voice data, operating the character by causing a character appearing during the progress of the game to speak at least the content of the voice data; and a step of, after the end of the game, operating the character again based on voice data received again from the outside in response to a request to view the progressed game.
Another aspect of the present disclosure is a computer-readable medium storing computer-executable instructions that, when executed, cause the processor to perform the steps included in the above method.
Another aspect of the present disclosure is a method executed by a first user terminal of a first user, the first user terminal comprising a first processor and a first memory, the first user being other than at least one second user who viewed, in real time, the real-time distribution of a game performed by a performer. The method includes, by the first processor: after the real-time distribution of the game is completed, a step of receiving, from the outside, recorded voice data related to the game, the voice data being for controlling at least one character appearing in the game; and a step of executing playback of the distributed game by operating the character based on the voice data. The distributed game is a game generated by processing executed by a computer as a second user terminal of a second user, the second user terminal comprising a second processor, a second memory, a display unit, and an operation unit, and being configured to communicate via a network with an external device that exists in a space physically separated from the space in which the second user terminal exists and that controls the character in response to input from a performer who is different from the second user of the second user terminal and who plays the character in a space invisible to the second user. The processing includes, by the second processor: advancing the game in response to the second user's input operations input to the computer via the operation unit; sequentially transmitting, to the external device, as information indicating the progress of the game, progress information for enabling display of a game screen of the game in progress, including the screen displayed on the display unit or a simulated screen that simplifies that screen; receiving, from the external device, voice data acquired by the external device, which displays the game screen of the game in progress in real time so that the performer can see it based on the sequentially transmitted progress information, the voice data being of voice uttered by the performer at an arbitrary timing toward the game screen displayed in real time and not directly audible to the second user; and, triggered by receiving the voice data, operating the character by causing a character appearing during the progress of the game to speak at least the content of the voice data.
 Another aspect of the present disclosure is a computer-readable medium storing computer-executable instructions that, when executed, cause the above-described first processor to perform the steps included in the above-described method.
 Another aspect of the present disclosure is an information processing device serving as a user terminal that includes a processor, a memory, a display unit, and an operation unit. The user terminal is configured to communicate, via a network, with an external device that exists in a space physically separated from the space in which the user terminal exists, and that controls at least one character appearing in a game in response to input from a performer who is a person different from the user of the user terminal and who plays the character in a space that the user cannot see. By reading a program stored in the memory, the processor is configured to: advance the game in response to input operations of the user entered into the computer via the operation unit; sequentially transmit, to the external device, as information indicating the progress of the game, progress information for enabling display of a game screen of the game in progress, the game screen including the screen displayed on the display unit or a simulated screen that simplifies that screen; receive, from the external device, voice data acquired by the external device, which displays the game screen of the game in progress in real time based on the sequentially transmitted progress information so that the performer can view it, the voice data being of voice uttered by the performer at an arbitrary timing in response to the game screen displayed in real time and not directly audible to the user; triggered by the reception of the voice data, cause the character appearing during the progress of the game to operate by causing the character to speak at least the content of the voice data; and, after the game ends, cause the character to operate again based on the voice data received again from outside in response to a request to view the game that was played.
 Another aspect of the present disclosure is an information processing device serving as a first user terminal of a first user, the information processing device including a first processor and a first memory, the first user being a user other than at least one second user who viewed, in real time, a real-time distribution of a game performed by a performer. By reading a program stored in the first memory, the first processor is configured to: after the real-time distribution of the game ends, receive, from outside, recorded voice data relating to the game, the voice data being for controlling at least one character appearing in the game; and replay the distributed game by causing the character to operate based on the voice data. The distributed game is a game generated by processing executed by a computer serving as a second user terminal of a second user, the second user terminal including a second processor, a second memory, a display unit, and an operation unit. The second user terminal is configured to communicate, via a network, with an external device that exists in a space physically separated from the space in which the second user terminal exists, and that controls the character in response to input from a performer who is a person different from the second user of the second user terminal and who plays the character in a space that the second user cannot see. By reading a program stored in the second memory, the second processor executes: advancing the game in response to input operations of the second user entered into the computer via the operation unit; sequentially transmitting, to the external device, as information indicating the progress of the game, progress information for enabling display of a game screen of the game in progress, the game screen including the screen displayed on the display unit or a simulated screen that simplifies that screen; receiving, from the external device, voice data acquired by the external device, which displays the game screen of the game in progress in real time based on the sequentially transmitted progress information so that the performer can view it, the voice data being of voice uttered by the performer at an arbitrary timing in response to the game screen displayed in real time and not directly audible to the second user; and, triggered by the reception of the voice data, causing the character appearing during the progress of the game to operate by causing the character to speak at least the content of the voice data.
 Another aspect of the present disclosure is a method for controlling a character appearing in a game. The method is executed by a computer that includes a processor, a memory, and a display unit and that controls at least one character appearing in the game in response to input from a performer who plays the character; the computer exists in a space that is physically separated from the space in which the user terminal of a user different from the performer exists and in which the user cannot see the performer, and can communicate with the user terminal via a network. The method includes, by the processor: displaying, in real time on the display unit of the computer, based on progress information sequentially transmitted from the user terminal that advances the game, the progress information being for enabling display of a game screen of the game in progress, the game screen of the game in progress, including the screen displayed on the display unit of the user terminal or a simulated screen that simplifies that screen, so that the performer can view it; accepting voice uttered by the performer at an arbitrary timing in response to the game screen displayed in real time, the voice not being directly audible to the user; transmitting voice data of the accepted voice to the user terminal; and, after the game ends, in response to a request from the user terminal, transmitting the voice data to the user terminal so that, on the user terminal, the character speaks the content of the voice data in the same manner as during the ended game, or, in response to a request from a terminal other than the user terminal, transmitting the voice data to the terminal other than the user terminal so that, on that terminal, the character speaks the content of the voice data in the same manner as during the ended game.
 Another aspect of the present disclosure is a computer-readable medium storing computer-executable instructions that, when executed, cause the processor to perform the steps included in the above-described method.
 Another aspect of the present disclosure is an information processing device for controlling a character appearing in a game. The information processing device is a computer that includes a processor, a memory, and a display unit and that controls at least one character appearing in the game in response to input from a performer who plays the character; the computer exists in a space that is physically separated from the space in which the user terminal of a user different from the performer exists and in which the user cannot see the performer, and is configured to communicate with the user terminal via a network. By reading a program stored in the memory, the processor is configured to: display, in real time on the display unit of the computer, based on progress information sequentially transmitted from the user terminal that advances the game, the progress information being for enabling display of a game screen of the game in progress, the game screen of the game in progress, including the screen displayed on the display unit of the user terminal or a simulated screen that simplifies that screen, so that the performer can view it; accept voice uttered by the performer at an arbitrary timing in response to the game screen displayed in real time, the voice not being directly audible to the user; transmit voice data of the accepted voice to the user terminal; and, after the game ends, in response to a request from the user terminal, transmit the voice data to the user terminal so that, on the user terminal, the character speaks the content of the voice data in the same manner as during the ended game, or, in response to a request from a terminal other than the user terminal, transmit the voice data to the terminal other than the user terminal so that, on that terminal, the character speaks the content of the voice data in the same manner as during the ended game.
 According to one aspect of the present disclosure, the interest of the game can be enhanced.
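 For reference, the interaction loop described in the above aspects can be summarized in code form. The following is a minimal, illustrative sketch in Python; the helper names, queue-based transport, and message shapes are assumptions made for exposition only and are not part of the disclosed implementation.

```python
# Illustrative sketch of the interaction described in the above aspects:
# the user terminal advances the game on user input, sequentially sends
# progress information to the external device, and, triggered by receiving
# the performer's voice data from that device, makes a character speak it.
# All names (queues, helpers, message fields) are assumptions.

from queue import Queue, Empty

to_external_device: Queue = Queue()    # progress information out
from_external_device: Queue = Queue()  # performer voice data in

def on_user_input(operation: str) -> None:
    game_state = advance_game(operation)
    # Progress information lets the external device render the game screen
    # (or a simplified simulated screen) in real time for the performer.
    to_external_device.put({"type": "progress", "state": game_state})

def poll_voice() -> None:
    try:
        voice = from_external_device.get_nowait()
    except Empty:
        return
    # Receiving voice data is the trigger: the character speaks its content.
    character_speak(voice["audio"])

def advance_game(operation: str) -> str:
    return f"state-after-{operation}"

def character_speak(audio: bytes) -> None:
    print(f"character speaks {len(audio)} bytes of performer audio")

on_user_input("tap")
from_external_device.put({"type": "voice", "audio": b"\x00" * 1600})
poll_voice()
```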
[Brief description of the drawings]
A diagram showing the hardware configuration of the game system.
A block diagram showing the functional configurations of the user terminal, the server, and the operation instruction device.
A diagram showing an example of the data structure of screen transition information.
A diagram showing an example of the data structure of progress information.
A diagram showing an example of the data structure of operation instruction data.
A diagram showing an example of a progress screen displayed on the display unit of the operation instruction device.
A diagram showing an example of a game screen displayed on the display unit of the user terminal.
A flowchart showing the flow of processing executed by each device constituting the game system.
A diagram showing another example of a progress screen displayed on the display unit of the operation instruction device.
A diagram showing an example of a game screen displayed on the display unit of the user terminal.
A diagram showing an example of the data structure of user action history information according to an embodiment.
A diagram showing an example of the data structure of game information according to an embodiment.
A flowchart showing an example of the basic game progress of a game according to an embodiment.
A diagram showing an example of transitions of screens displayed on the display unit of a user terminal according to an embodiment.
 [Embodiment 1]
 The game system according to the present disclosure is a system for providing a game to a plurality of users. The game system will be described below with reference to the drawings. The present invention is not limited to these examples; it is defined by the claims, and all modifications within the meaning and scope equivalent to the claims are intended to be included in the present invention. In the following description, the same elements are given the same reference numerals in the description of the drawings, and duplicate descriptions are not repeated.
 <Hardware configuration of game system 1>
 FIG. 1 is a diagram showing the hardware configuration of the game system 1. As illustrated, the game system 1 includes a plurality of user terminals 100 and a server 200. Each user terminal 100 connects to the server 200 via a network 2. The network 2 includes the Internet and various mobile communication systems constructed with radio base stations (not shown). Examples of such mobile communication systems include so-called 3G and 4G mobile communication systems, LTE (Long Term Evolution), and wireless networks (for example, Wi-Fi (registered trademark)) that can connect to the Internet through predetermined access points.
 The server 200 (computer, information processing device) may be a general-purpose computer such as a workstation or a personal computer. The server 200 includes a processor 20, a memory 21, a storage 22, a communication IF 23, and an input/output IF 24. These components of the server 200 are electrically connected to one another by a communication bus.
 The user terminal 100 (computer, information processing device) may be a mobile terminal such as a smartphone, a feature phone, a PDA (Personal Digital Assistant), or a tablet computer. The user terminal 100 may be a game device suitable for game play. As illustrated, the user terminal 100 includes a processor 10, a memory 11, a storage 12, a communication interface (IF) 13, an input/output IF 14, a touch screen 15 (display unit), a camera 17, and a ranging sensor 18. These components of the user terminal 100 are electrically connected to one another by a communication bus. Instead of or in addition to the touch screen 15, the user terminal 100 may include an input/output IF 14 to which a display (display unit) configured separately from the main body of the user terminal 100 can be connected.
 Further, as shown in FIG. 1, the user terminal 100 may be configured to be able to communicate with one or more controllers 1020. The controller 1020 establishes communication with the user terminal 100 in accordance with a communication standard such as Bluetooth (registered trademark). The controller 1020 may have one or more buttons or the like, and transmits to the user terminal 100 output values based on the user's input operations on those buttons or the like. The controller 1020 may also have various sensors such as an acceleration sensor and an angular velocity sensor, and transmits the output values of these sensors to the user terminal 100.
 Instead of or in addition to the user terminal 100 including the camera 17 and the ranging sensor 18, the controller 1020 may have the camera 17 and the ranging sensor 18.
 It is desirable that the user terminal 100 causes a user who uses the controller 1020 to input, via the controller 1020, user identification information such as the user's name or login ID, for example at the start of a game. This enables the user terminal 100 to associate the controller 1020 with the user, and to identify, based on the transmission source (controller 1020) of a received output value, which user the output value belongs to.
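 A minimal sketch of this association follows; the identifiers and helper names (register_user, handle_output) are illustrative assumptions, not part of the disclosed system.

```python
# Minimal sketch: associating controllers with users so that received
# output values can be attributed to the correct user.

controller_to_user: dict[str, str] = {}

def register_user(controller_id: str, user_id: str) -> None:
    """Called once, e.g. at game start, when the user enters a login ID."""
    controller_to_user[controller_id] = user_id

def handle_output(controller_id: str, output_value: dict) -> None:
    """Attribute an incoming output value to the user holding the controller."""
    user_id = controller_to_user.get(controller_id)
    if user_id is None:
        return  # unregistered controller; ignore or prompt for login
    print(f"input from user {user_id}: {output_value}")

register_user("ctrl-1", "alice")
handle_output("ctrl-1", {"button": "A", "pressed": True})
```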
 When the user terminal 100 communicates with a plurality of controllers 1020, each user holds one of the controllers 1020, so that multiplayer can be realized on that single user terminal 100 without communicating with another device such as the server 200 via the network 2. In addition, by the user terminals 100 communicating with one another in accordance with a wireless standard such as the wireless LAN (Local Area Network) standard (connecting without going through the server 200), multiplayer can also be realized locally with a plurality of user terminals 100. When the above-described multiplayer is realized locally with one user terminal 100, the user terminal 100 may further include at least some of the various functions of the server 200 described later. When the above-described multiplayer is realized locally with a plurality of user terminals 100, the plurality of user terminals 100 may include the various functions of the server 200 described later in a distributed manner.
 Even when the above-described multiplayer is realized locally, the user terminal 100 may communicate with the server 200. For example, information indicating play results such as scores or wins and losses in a certain game may be associated with user identification information and transmitted to the server 200.
 The controller 1020 may be configured to be attachable to and detachable from the user terminal 100. In this case, a coupling portion for the controller 1020 may be provided on at least one surface of the housing of the user terminal 100. When the user terminal 100 and the controller 1020 are coupled by wire via the coupling portion, the user terminal 100 and the controller 1020 transmit and receive signals over the wire.
 As shown in FIG. 1, the user terminal 100 may accept, via the input/output IF 14, the attachment of a storage medium 1030 such as an external memory card. This allows the user terminal 100 to read programs and data recorded on the storage medium 1030. The program recorded on the storage medium 1030 is, for example, a game program.
 The user terminal 100 may store, in the memory 11 of the user terminal 100, a game program acquired by communicating with an external device such as the server 200, or may store, in the memory 11, a game program acquired by reading it from the storage medium 1030.
 As described above, the user terminal 100 includes the communication IF 13, the input/output IF 14, the touch screen 15, the camera 17, and the ranging sensor 18 as examples of mechanisms for inputting information to the user terminal 100. Each of the above-described units serving as an input mechanism can be regarded as an operation unit configured to accept the user's input operations.
 For example, when the operation unit is constituted by at least one of the camera 17 and the ranging sensor 18, the operation unit detects an object 1010 in the vicinity of the user terminal 100 and identifies an input operation from the detection result of the object. As an example, the user's hand as the object 1010, a marker having a predetermined shape, or the like is detected, and an input operation is identified based on the color, shape, movement, type, or the like of the object 1010 obtained as the detection result. More specifically, when the user's hand is detected from an image captured by the camera 17, the user terminal 100 identifies and accepts, as the user's input operation, a gesture (a series of movements of the user's hand) detected based on the captured image. The captured image may be a still image or a moving image.
 Alternatively, when the operation unit is constituted by the touch screen 15, the user terminal 100 identifies and accepts, as the user's input operation, the user's operation performed on the input unit 151 of the touch screen 15. Alternatively, when the operation unit is constituted by the communication IF 13, the user terminal 100 identifies and accepts, as the user's input operation, a signal (for example, an output value) transmitted from the controller 1020. Alternatively, when the operation unit is constituted by the input/output IF 14, the user terminal 100 identifies and accepts, as the user's input operation, a signal output from an input device (not shown) different from the controller 1020 connected to the input/output IF 14.
 In the present embodiment, the game system 1 further includes an operation instruction device 300. The operation instruction device 300 connects to each of the server 200 and the user terminal 100 via the network 2. At least one operation instruction device 300 is provided in the game system 1. A plurality of operation instruction devices 300 may be provided depending on the number of user terminals 100 using the service provided by the server 200. One operation instruction device 300 may be provided for one user terminal 100, or one operation instruction device 300 may be provided for a plurality of user terminals 100.
 The operation instruction device 300 (NPC control device, character control device) may be a computer such as a server, a desktop personal computer, a laptop computer, or a tablet, or a group of computers combining these. As illustrated, the operation instruction device 300 includes a processor 30, a memory 31, a storage 32, a communication IF 33, an input/output IF 34, and a touch screen 35 (display unit). These components of the operation instruction device 300 are electrically connected to one another by a communication bus. Instead of or in addition to the touch screen 35, the operation instruction device 300 may include an input/output IF 34 to which a display (display unit) configured separately from the main body of the operation instruction device 300 can be connected.
 Further, as shown in FIG. 1, the operation instruction device 300 may be configured to be able to communicate, wirelessly or by wire, with peripheral devices such as one or more microphones 3010, one or more motion capture devices 3020, and one or more controllers 3030. A wirelessly connected peripheral device establishes communication with the operation instruction device 300 in accordance with a communication standard such as Bluetooth (registered trademark).
 The microphone 3010 acquires sound generated in its surroundings and converts it into an electrical signal. The sound converted into an electrical signal is transmitted to the operation instruction device 300 as voice data and is accepted by the operation instruction device 300 via the communication IF 33.
 The motion capture device 3020 tracks the motion (including facial expressions, mouth movements, and the like) of a tracking target (for example, a person) and transmits the output values as the tracking result to the operation instruction device 300. The motion data, which are the output values, are accepted by the operation instruction device 300 via the communication IF 33. The motion capture method of the motion capture device 3020 is not particularly limited. Depending on the method adopted, the motion capture device 3020 selectively includes any mechanism for capturing motion, such as cameras, various sensors, markers, a suit worn by a model (person), and signal transmitters.
 The controller 3030 may have one or more physical input mechanisms such as buttons, levers, sticks, and wheels. The controller 3030 transmits to the operation instruction device 300 output values based on input operations that the operator of the operation instruction device 300 enters into those input mechanisms. The controller 3030 may also have various sensors such as an acceleration sensor and an angular velocity sensor, and may transmit the output values of these sensors to the operation instruction device 300. The above-described output values are accepted by the operation instruction device 300 via the communication IF 33. In the following, a person who performs some input operation on the operation instruction device 300 using the operation unit provided in the operation instruction device 300 or the various input mechanisms communicably connected to the operation instruction device 300 is referred to as an operator. Operators include a person who operates the operation instruction device 300 using the input unit 351, the controller 3030, and the like, a voice actor who inputs voice via the microphone 3010, and a model who inputs movement via the motion capture device 3020.
 The operation instruction device 300 may include a camera and a ranging sensor (not shown). Instead of or in addition to the operation instruction device 300 including them, the motion capture device 3020 and the controller 3030 may have a camera and a ranging sensor.
 As described above, the operation instruction device 300 includes the communication IF 33, the input/output IF 34, and the touch screen 35 as examples of mechanisms for inputting information to the operation instruction device 300. It may further include a camera and a ranging sensor as necessary. Each of the above-described units serving as an input mechanism can be regarded as an operation unit configured to accept the user's input operations.
 The operation unit may be constituted by the touch screen 35. In this case, the operation instruction device 300 identifies and accepts, as the user's input operation, the user's operation performed on the input unit 351 of the touch screen 35. Alternatively, when the operation unit is constituted by the communication IF 33, the operation instruction device 300 identifies and accepts, as the user's input operation, a signal (for example, an output value) transmitted from the controller 3030. Alternatively, when the operation unit is constituted by the input/output IF 34, the operation instruction device 300 identifies and accepts, as the user's input operation, a signal output from an input device (not shown) different from the controller 3030 connected to the input/output IF 34.
 <Game overview>
 In the present embodiment, as an example, in the game system 1, the server 200 and the user terminal 100 cooperate to execute the game program 131 and advance the game played by the user on the user terminal 100. Further, in the present embodiment, in the game system 1, the operation instruction device 300 (character control device) executes the character control program 134 and can control the operations of at least some of the characters appearing in the game executed by the user terminal 100.
 The game system 1 is not limited to a specific genre and may be a system for executing games of any genre. For example, the game may be a sports-themed game such as tennis, table tennis, dodgeball, baseball, soccer, or hockey, a puzzle game, a quiz game, an RPG, an adventure game, a shooting game, a simulation game, a training game, an action game, or the like.
 Further, the game system 1 is not limited to a specific play form and may be a system for executing games of any play form. For example, the game may be a single-player game played by a single user, a multiplayer game played by a plurality of users, and, among multiplayer games, a competitive game in which a plurality of users compete against one another or a cooperative play game in which a plurality of users cooperate. For example, competitive games may include, as described above, games based on sports such as tennis or baseball. Competitive games may also include board games in which two players compete, such as shogi, go, chess, and Othello. Competitive games may also include racing games in which a plurality of users each operate a vehicle, an athlete, or the like to go around the same course and compete for time.
 <Hardware components of each device>
 The processor 10 controls the operation of the entire user terminal 100. The processor 20 controls the operation of the entire server 200. The processor 30 controls the operation of the entire operation instruction device 300. The processors 10, 20, and 30 include a CPU (Central Processing Unit), an MPU (Micro Processing Unit), and a GPU (Graphics Processing Unit).
 The processor 10 reads programs from the storage 12 described later and loads them into the memory 11 described later. The processor 20 reads programs from the storage 22 described later and loads them into the memory 21 described later. The processor 30 reads programs from the storage 32 described later and loads them into the memory 31 described later. The processors 10, 20, and 30 execute the loaded programs.
 The memories 11, 21, and 31 are main storage devices. The memories 11, 21, and 31 are constituted by storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory). The memory 11 provides a work area to the processor 10 by temporarily storing the programs and various data that the processor 10 reads from the storage 12 described later. The memory 11 also temporarily stores various data generated while the processor 10 operates in accordance with the programs. The memory 21 provides a work area to the processor 20 by temporarily storing the various programs and data that the processor 20 reads from the storage 22 described later. The memory 21 also temporarily stores various data generated while the processor 20 operates in accordance with the programs. The memory 31 provides a work area to the processor 30 by temporarily storing the various programs and data that the processor 30 reads from the storage 32 described later. The memory 31 also temporarily stores various data generated while the processor 30 operates in accordance with the programs.
 In the present embodiment, the program may be a game program for realizing the game on the user terminal 100. Alternatively, the program may be a game program for realizing the game through cooperation between the user terminal 100 and the server 200. Alternatively, the program may be a game program for realizing the game through cooperation among the user terminal 100, the server 200, and the operation instruction device 300. As an example, the game realized through cooperation between the user terminal 100 and the server 200 and the game realized through cooperation among the user terminal 100, the server 200, and the operation instruction device 300 may be games executed on a browser launched on the user terminal 100. Alternatively, the program may be a game program for realizing the game through the cooperation of a plurality of user terminals 100. The various data include data related to the game, such as user information and game information, as well as instructions or notifications transmitted and received between the devices of the game system 1.
 The storages 12, 22, and 32 are auxiliary storage devices. The storages 12, 22, and 32 are constituted by storage devices such as flash memories or HDDs (Hard Disk Drives). Various data related to the game are stored in the storages 12, 22, and 32.
 The communication IF 13 controls the transmission and reception of various data in the user terminal 100. The communication IF 23 controls the transmission and reception of various data in the server 200. The communication IF 33 controls the transmission and reception of various data in the operation instruction device 300. The communication IFs 13, 23, and 33 control, for example, communication via a wireless LAN (Local Area Network), Internet communication via a wired LAN, a wireless LAN, or a mobile phone network, and communication using short-range wireless communication and the like.
 The input/output IF 14 is an interface through which the user terminal 100 accepts data input and through which the user terminal 100 outputs data. The input/output IF 14 may input and output data via USB (Universal Serial Bus) or the like. The input/output IF 14 may include, for example, physical buttons, a camera, a microphone, or a speaker of the user terminal 100. The input/output IF 24 of the server 200 is an interface through which the server 200 accepts data input and through which the server 200 outputs data. The input/output IF 24 may include, for example, an input unit that is an information input device such as a mouse or a keyboard, and a display unit that is a device for displaying and outputting images. The input/output IF 34 of the operation instruction device 300 is an interface through which the operation instruction device 300 accepts data input and through which the operation instruction device 300 outputs data. The input/output IF 34 may include, for example, information input devices such as a mouse, a keyboard, a stick, and a lever, devices for displaying and outputting images such as a liquid crystal display, and connection portions for transmitting and receiving data to and from the peripheral devices (the microphone 3010, the motion capture device 3020, and the controller 3030).
 The touch screen 15 of the user terminal 100 is an electronic component combining an input unit 151 and a display unit 152. The touch screen 35 of the operation instruction device 300 is an electronic component combining an input unit 351 and a display unit 352. The input units 151 and 351 are, for example, touch-sensitive devices and are constituted by, for example, touch pads. The display units 152 and 352 are constituted by, for example, a liquid crystal display, an organic EL (Electro-Luminescence) display, or the like.
 The input units 151 and 351 have a function of detecting the position at which the user's operation (mainly physical contact operations such as touch operations, slide operations, swipe operations, and tap operations) is input on the input surface and transmitting information indicating that position as an input signal. The input units 151 and 351 only need to include a touch sensing unit (not shown). The touch sensing unit may adopt any method, such as a capacitance method or a resistive film method.
 Although not shown, the user terminal 100 may include one or more sensors for identifying the holding posture of the user terminal 100. Such a sensor may be, for example, an acceleration sensor, an angular velocity sensor, or the like. When the user terminal 100 includes such a sensor, the processor 10 can identify the holding posture of the user terminal 100 from the sensor output and perform processing according to the holding posture. For example, when the user terminal 100 is held vertically, the processor 10 may use a portrait display in which a vertically long image is displayed on the display unit 152. On the other hand, when the user terminal 100 is held horizontally, the processor 10 may use a landscape display in which a horizontally long image is displayed on the display unit. In this way, the processor 10 may be able to switch between the portrait display and the landscape display according to the holding posture of the user terminal 100.
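 A minimal sketch of such posture-dependent switching follows, assuming a hypothetical accelerometer interface that reports gravity components along the device axes; the function name and threshold logic are illustrative only.

```python
# Minimal sketch of switching between portrait and landscape display based
# on an accelerometer reading. The sensor interface is an illustrative
# assumption, not part of the disclosure.

def choose_orientation(accel_x: float, accel_y: float) -> str:
    """Pick a display mode from gravity components along the device axes."""
    # Gravity dominates the axis pointing "down"; compare magnitudes.
    return "portrait" if abs(accel_y) >= abs(accel_x) else "landscape"

print(choose_orientation(accel_x=0.3, accel_y=9.7))  # portrait
print(choose_orientation(accel_x=9.6, accel_y=0.5))  # landscape
```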
 The camera 17 includes an image sensor and the like, and generates a captured image by converting incident light entering through a lens into an electrical signal.
 The ranging sensor 18 is a sensor that measures the distance to an object to be measured. The ranging sensor 18 includes, for example, a light source that emits pulse-modulated light and a light receiving element that receives the light. The ranging sensor 18 measures the distance to the object to be measured from the timing at which the light source emits light and the timing at which the light emitted from the light source is reflected by the object and received. The ranging sensor 18 may have a light source that emits directional light.
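 The measurement principle just described is time of flight: the emitted light travels to the object and back, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch, with illustrative timestamp values:

```python
# Minimal sketch of the time-of-flight principle the ranging sensor relies on:
# distance = c * (t_receive - t_emit) / 2, since the light travels out and back.
# The timestamps below are illustrative values, not sensor specifications.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_m(t_emit_s: float, t_receive_s: float) -> float:
    """Distance to the object from emission/reception timestamps in seconds."""
    return SPEED_OF_LIGHT * (t_receive_s - t_emit_s) / 2.0

# A round trip of 4 nanoseconds corresponds to roughly 0.6 m.
print(distance_m(0.0, 4e-9))  # ~0.5996 m
```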
 Here, an example in which the user terminal 100 uses the camera 17 and the ranging sensor 18 to accept, as the user's input operation, the result of detecting an object 1010 in the vicinity of the user terminal 100 will be further described. The camera 17 and the ranging sensor 18 may be provided, for example, on a side surface of the housing of the user terminal 100. The ranging sensor 18 may be provided in the vicinity of the camera 17. As the camera 17, for example, an infrared camera can be used. In this case, the camera 17 may be provided with a lighting device that emits infrared rays, a filter that blocks visible light, and the like. This makes it possible to further improve the detection accuracy of objects based on images captured by the camera 17, regardless of whether the terminal is outdoors or indoors.
 The processor 10 may perform, for example, one or more of the processes shown in (1) to (5) below on the image captured by the camera 17.
 (1) The processor 10 performs image recognition processing on the image captured by the camera 17 to identify whether the captured image contains the user's hand. As the analysis technique adopted in this image recognition processing, the processor 10 may use, for example, a technique such as pattern matching.
 (2) The processor 10 also detects the user's gesture from the shape of the user's hand. For example, the processor 10 identifies the number of the user's fingers (the number of extended fingers) from the shape of the user's hand detected from the captured image. The processor 10 further identifies the gesture performed by the user from the identified number of fingers. For example, when the number of fingers is five, the processor 10 determines that the user has performed a "paper" gesture. When the number of fingers is zero (no fingers detected), the processor 10 determines that the user has performed a "rock" gesture. When the number of fingers is two, the processor 10 determines that the user has performed a "scissors" gesture.
 (3) The processor 10 performs image recognition processing on the image captured by the camera 17 to detect whether the user's hand is in a state in which only the index finger is raised, or whether the user's finger has made a flicking motion.
 (4) The processor 10 detects the distance between the user terminal 100 and an object 1010 in the vicinity of the user terminal 100 (such as the user's hand) based on at least one of the image recognition result of the image captured by the camera 17, the output value of the ranging sensor 18, and the like. For example, based on the size of the shape of the user's hand identified from the image captured by the camera 17, the processor 10 detects whether the user's hand is near the user terminal 100 (for example, at a distance less than a predetermined value) or far from it (for example, at a distance equal to or greater than the predetermined value). When the captured image is a moving image, the processor 10 may detect whether the user's hand is approaching or moving away from the user terminal 100.
 (5) When it is found, based on the image recognition result of the image captured by the camera 17 or the like, that the distance between the user terminal 100 and the user's hand is changing while the user's hand is being detected, the processor 10 recognizes that the user is waving the hand along the shooting direction of the camera 17. When an object is intermittently detected and not detected by the ranging sensor 18, which has stronger directivity than the shooting range of the camera 17, the processor 10 recognizes that the user is waving the hand in a direction orthogonal to the shooting direction of the camera.
 In this way, the processor 10 detects, through image recognition of the image captured by the camera 17, whether the user is clenching the hand (whether the gesture is "rock" or some other gesture such as "paper"). The processor 10 also detects, together with the shape of the user's hand, how the user is moving that hand, and whether the user is moving the hand toward or away from the user terminal 100. Such operations can be made to correspond to operations using a pointing device such as a mouse or a touch panel. For example, the user terminal 100 moves a pointer on the touch screen 15 in accordance with the movement of the user's hand and detects the user's "rock" gesture. In this case, the user terminal 100 recognizes that the user is continuing a selection operation. The continuation of a selection operation corresponds, for example, to a state in which a mouse button remains clicked and held down, or a state in which the touch panel remains touched after a touch-down operation. Further, when the user moves the hand while the user's "rock" gesture is being detected, the user terminal 100 can recognize such a series of gestures as an operation corresponding to a swipe operation (or a drag operation). Further, when the user terminal 100 detects, based on the detection result of the user's hand from the image captured by the camera 17, a gesture in which the user flicks a finger, it may recognize that gesture as an operation corresponding to a mouse click or a tap operation on a touch panel.
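 The finger-count classification in (2) and the mapping to pointing-device operations just described can be summarized in a short sketch; the function names, event labels, and motion flags below are illustrative assumptions standing in for the hand-tracking pipeline.

```python
# Minimal sketch of the finger-count-to-gesture mapping and its translation
# into pointing-device-style events, as described above.

def classify_gesture(extended_fingers: int) -> str:
    """Map a detected number of extended fingers to a gesture label."""
    return {0: "rock", 2: "scissors", 5: "paper"}.get(extended_fingers, "unknown")

def to_pointer_event(gesture: str, hand_moving: bool, finger_flick: bool) -> str:
    """Translate a gesture plus motion cues into a pointing-device operation."""
    if finger_flick:
        return "click/tap"
    if gesture == "rock":
        # A clenched hand held while moving corresponds to a drag/swipe;
        # held still, it corresponds to a sustained selection (button down).
        return "drag/swipe" if hand_moving else "select (hold)"
    return "pointer move"

print(classify_gesture(5))                                             # paper
print(to_pointer_event("rock", hand_moving=True, finger_flick=False))  # drag/swipe
```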
 <Functional configuration of game system 1>
 FIG. 2 is a block diagram showing the functional configurations of the user terminal 100, the server 200, and the operation instruction device 300 included in the game system 1. Each of the user terminal 100, the server 200, and the operation instruction device 300 may include a functional configuration (not shown) necessary for functioning as a general computer and a functional configuration necessary for realizing known functions in a game.
 The user terminal 100 has a function as an input device that accepts the user's input operations and a function as an output device that outputs the images and sounds of the game. The user terminal 100 functions as a control unit 110 and a storage unit 120 through the cooperation of the processor 10, the memory 11, the storage 12, the communication IF 13, the input/output IF 14, and the like.
 The server 200 has a function of communicating with each user terminal 100 and supporting the user terminal 100 in advancing the game. For example, when the user terminal 100 downloads an application related to this game for the first time, the server 200 provides the user terminal 100 with the data to be stored on the user terminal 100 at the start of the first game. For example, the server 200 transmits operation instruction data for causing a character to operate to the user terminal 100. The operation instruction data may include motion capture data in which the movements of an actor such as a model have been captured in advance, voice data in which the voice of an actor such as a voice actor has been recorded, operation history data indicating the history of input operations for causing the character to operate, or a motion command group in which commands associated with the above-described series of input operations are arranged in chronological order. When this game is a multiplayer game, the server 200 may have a function of communicating with each user terminal 100 participating in the game to mediate exchanges between the user terminals 100, and a synchronization control function. The server 200 also has a function of mediating between the user terminal 100 and the operation instruction device 300. This enables the operation instruction device 300 to supply operation instruction data to a user terminal 100 or a group of user terminals 100 in a timely manner without mistaking the destination. The server 200 functions as a control unit 210 and a storage unit 220 through the cooperation of the processor 20, the memory 21, the storage 22, the communication IF 23, the input/output IF 24, and the like.
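 A minimal sketch of a container for the operation instruction data described above follows. The field names are illustrative assumptions; the disclosure only states that the data may carry motion capture data, recorded voice, an input-operation history, and/or a time-ordered group of motion commands.

```python
# Minimal sketch of an operation-instruction-data container, assuming
# illustrative field names and payload types.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MotionCommand:
    timestamp_ms: int  # position in the timeline
    command: str       # e.g. a motion name associated with an input operation

@dataclass
class OperationInstructionData:
    destination: str                          # terminal or group to address
    motion_capture: Optional[bytes] = None    # pre-captured actor movement
    voice: Optional[bytes] = None             # recorded actor voice
    operation_history: list[str] = field(default_factory=list)
    motion_commands: list[MotionCommand] = field(default_factory=list)

data = OperationInstructionData(
    destination="user-terminal-100",
    voice=b"...",  # recorded audio payload
    motion_commands=[MotionCommand(0, "wave"), MotionCommand(1200, "bow")],
)
print(len(data.motion_commands))  # 2
```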
The operation instruction device 300 has a function of generating operation instruction data for instructing the actions of a character on the user terminal 100 and supplying that data to the user terminal 100. The operation instruction device 300 functions as a control unit 310 and a storage unit 320 through the cooperation of the processor 30, the memory 31, the storage 32, the communication IF 33, the input/output IF 34, and the like.
The storage units 120, 220, and 320 store a game program 131, game information 132, and user information 133. The game program 131 is the game program executed by the user terminal 100, the server 200, and the operation instruction device 300. The game information 132 is data that the control units 110, 210, and 310 refer to when executing the game program 131. The user information 133 is data related to the user's account. In the storage units 220 and 320, the game information 132 and the user information 133 are stored for each user terminal 100. The storage unit 320 further stores a character control program 134. The character control program 134 is a program executed by the operation instruction device 300, and is a program for controlling the actions of characters that appear in a game based on the above-described game program 131.
(Functional configuration of server 200)
The control unit 210 comprehensively controls the server 200 by executing the game program 131 stored in the storage unit 220. For example, the control unit 210 transmits various data, programs, and the like to the user terminal 100. The control unit 210 receives part or all of the game information or the user information from the user terminal 100. When the game is a multiplayer game, the control unit 210 may receive a multiplayer synchronization request from the user terminal 100 and transmit synchronization data to the user terminal 100. The control unit 210 also communicates with the user terminal 100 and the operation instruction device 300 as necessary to send and receive information.
The control unit 210 functions as a progress support unit 211 and a sharing support unit 212 according to the description of the game program 131. The control unit 210 can also function as other functional blocks (not shown) to support the progress of the game on the user terminal 100, depending on the nature of the game being executed.
The progress support unit 211 communicates with the user terminal 100 and supports the user terminal 100 in progressing through the various parts included in this game. For example, when the user terminal 100 advances the game, the progress support unit 211 provides the user terminal 100 with the information necessary to advance the game.
The sharing support unit 212 communicates with a plurality of user terminals 100 and supports a plurality of users in sharing each other's decks on their respective user terminals 100. The sharing support unit 212 may also have a function of matching an online user terminal 100 with the operation instruction device 300. This allows information to be sent and received smoothly between the user terminal 100 and the operation instruction device 300.
(Functional configuration of user terminal 100)
The control unit 110 comprehensively controls the user terminal 100 by executing the game program 131 stored in the storage unit 120. For example, the control unit 110 advances the game according to the game program 131 and the user's operation. Further, the control unit 110 communicates with the server 200 and the operation instruction device 300 as necessary to transmit and receive information while the game is in progress.
The control unit 110 functions as an operation reception unit 111, a display control unit 112, a user interface (hereinafter, UI) control unit 113, an animation generation unit 114, a game progress unit 115, an analysis unit 116, and a progress information generation unit 117 according to the description of the game program 131. The control unit 110 can also function as other functional blocks (not shown) to advance the game, depending on the nature of the game being executed.
The operation reception unit 111 detects and accepts the user's input operations on the input unit 151. The operation reception unit 111 determines what kind of input operation has been performed from the action the user has exerted on the console via the touch screen 15 and the other input/output IF 14, and outputs the result to each element of the control unit 110.
For example, the operation reception unit 111 accepts an input operation on the input unit 151, detects the coordinates of the input position of that operation, and identifies the type of the operation. The operation reception unit 111 identifies, as types of input operations, for example, a touch operation, a slide operation, a swipe operation, and a tap operation. When a continuously detected input is interrupted, the operation reception unit 111 detects that the contact input on the touch screen 15 has been released.
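As a purely illustrative sketch of the kind of classification the operation reception unit 111 might perform, the following Python fragment distinguishes tap, slide, and swipe operations from the coordinates and timing of a touch sequence. All threshold values and names are assumptions; the present embodiment does not prescribe concrete criteria.

    from dataclasses import dataclass

    # Illustrative thresholds; the present embodiment does not specify values.
    TAP_MAX_DURATION = 0.2     # seconds
    SLIDE_MIN_DISTANCE = 10.0  # pixels
    SWIPE_MIN_SPEED = 500.0    # pixels per second

    @dataclass
    class TouchEvent:
        x: float
        y: float
        timestamp: float

    def classify_input(down: TouchEvent, up: TouchEvent) -> str:
        """Classify one touch-down/touch-up pair (hypothetical logic)."""
        duration = up.timestamp - down.timestamp
        distance = ((up.x - down.x) ** 2 + (up.y - down.y) ** 2) ** 0.5
        if distance < SLIDE_MIN_DISTANCE:
            return "tap" if duration <= TAP_MAX_DURATION else "touch"
        speed = distance / duration if duration > 0 else float("inf")
        return "swipe" if speed >= SWIPE_MIN_SPEED else "slide"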
The UI control unit 113 controls the UI objects displayed on the display unit 152 to construct the UI. A UI object is a tool for the user to make inputs to the user terminal 100 that are necessary for the progress of the game, or a tool for the user to obtain from the user terminal 100 information that is output while the game is in progress. UI objects are, for example, but not limited to, icons, buttons, lists, and menu screens.
The animation generation unit 114 generates animations showing the motions of various objects based on the control modes of those objects. For example, the animation generation unit 114 may generate an animation that expresses a character moving as if it were actually there, moving its mouth, or changing its facial expression.
The display control unit 112 outputs, to the display unit 152 of the touch screen 15, a game screen that reflects the results of the processing executed by each of the above elements. The display control unit 112 may display a game screen including the animation generated by the animation generation unit 114 on the display unit 152. The display control unit 112 may also draw the above-described UI objects controlled by the UI control unit 113 superimposed on the game screen.
The game progress unit 115 advances the game. In the present embodiment, the game progress unit 115 advances the game executed in the game system 1 according to the present embodiment (hereinafter, this game) in response to the user's input operations received via the operation reception unit 111. When this game is divided into a plurality of parts, such as a first part, a second part, and so on, the game progress unit 115 advances the game according to the specifications of each part.
As an example, suppose this game is a competitive tennis game divided into a tutorial part, a battle part, and a lottery part. In the tutorial part, for example, the game progress unit 115 provides a novice user with the knowledge needed to play the other parts (the battle part and the lottery part), and provides a simple practice mode for learning the core operations of the battle part, which is the main part of this game. The game progress unit 115 advances the tutorial part according to the user's input operations and the game program 131 downloaded to the storage unit 120 in advance.
In the battle part, the game progress unit 115 causes a tennis player operated by the user to play against a tennis player operated by another user. The game progress unit 115 shares information with the user terminal 100 operated by the other user via the server 200 and advances the tennis match while keeping the terminals synchronized. That is, the game progress unit 115 advances the battle part according to the user's input operations, the other user's input operations, and the game program 131.
In the lottery part, the game progress unit 115 executes a lottery and causes the user to acquire the winning game medium. A game medium is digital data that can be used in this game, for example, an item that strengthens the tennis player operated by the user so as to give the user an advantage in a match. The game progress unit 115 advances the lottery part according to the user's input operations, the game program 131 downloaded to the storage unit 120 in advance, and the lottery result executed by the server 200. Here, "causing the user to acquire the game medium" may mean, as an example, changing the status of a game medium managed in association with the user from unusable to usable. Alternatively, it may mean storing the game medium in at least one of the memories included in the game system 1 (the memory 11, the memory 21, or the memory 31) in association with user identification information, a user terminal ID, or the like.
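A minimal Python sketch of the two acquisition variants described above; the store, field names, and helper functions are hypothetical and only illustrate the status transition from unusable to usable and the association with a user ID.

    # Hypothetical in-memory store keyed by (user_id, medium_id).
    game_media_status: dict = {}

    def grant_game_medium(user_id: str, medium_id: str) -> None:
        # Variant 1: flip the status of a managed game medium to usable.
        game_media_status[(user_id, medium_id)] = "usable"

    def store_game_medium(memory: dict, user_id: str, medium_id: str) -> None:
        # Variant 2: store the medium in one of the memories (11, 21, or 31)
        # in association with the user identification information.
        memory.setdefault(user_id, []).append(medium_id)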
The analysis unit 116 analyzes (renders) the operation instruction data and instructs the game progress unit 115 to operate the character based on the analysis result. In the present embodiment, the analysis unit 116 starts rendering the operation instruction data, triggered by the reception of the operation instruction data supplied by the operation instruction device 300 via the communication IF 13. The analysis unit 116 transmits the analysis result to the game progress unit 115 and instructs it to immediately operate the character based on the operation instruction data. That is, the game progress unit 115, triggered by the reception of the operation instruction data, operates the character based on that data. This makes it possible to show the user a character that moves in real time.
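The trigger-driven behaviour of the analysis unit 116 could be sketched as follows; here, incoming, render, and animate are hypothetical stand-ins for the communication IF, the rendering step, and the instruction to the game progress unit 115.

    import queue

    incoming = queue.Queue()  # operation instruction data as it arrives

    def analysis_loop(render, animate) -> None:
        """Receipt of operation instruction data is itself the trigger."""
        while True:
            instruction = incoming.get()  # blocks until data is received
            frames = render(instruction)  # analyze (render) the instruction data
            animate(frames)               # have the character act immediately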
The progress information generation unit 117 generates progress information indicating the progress of the game being executed by the game progress unit 115 and transmits it, in a timely manner, to the server 200 or the operation instruction device 300. The progress information may include, for example, information specifying the currently displayed game screen, or a progress log that indicates the progress of the game in chronological order using characters, symbols, and the like. In an embodiment in which the server 200 and the operation instruction device 300 of the game system 1 do not require progress information, the progress information generation unit 117 may be omitted.
(Functional configuration of operation instruction device 300)
The control unit 310 comprehensively controls the operation instruction device 300 by executing the character control program 134 stored in the storage unit 320. For example, the control unit 310 generates operation instruction data according to the character control program 134 and the operator's operations, and supplies the data to the user terminal 100. The control unit 310 may further execute the game program 131 as necessary. The control unit 310 also communicates with the server 200 and the user terminal 100 running this game to send and receive information.
The control unit 310 functions as an operation reception unit 311, a display control unit 312, a UI control unit 313, an animation generation unit 314, a progress simulation unit 315, and a character control unit 316 according to the description of the character control program 134. The control unit 310 can also function as other functional blocks (not shown) to control the characters appearing in the game, depending on the nature of the game executed in the game system 1.
The operation reception unit 311 detects and accepts the operator's input operations on the input unit 351. The operation reception unit 311 determines what kind of input operation has been performed from the action the operator has exerted on the console via the touch screen 35 and the other input/output IF 34, and outputs the result to each element of the control unit 310. The details of the functions of the operation reception unit 311 are substantially the same as those of the operation reception unit 111 in the user terminal 100.
The UI control unit 313 controls the UI objects displayed on the display unit 352.
The animation generation unit 314 generates animations showing the motions of various objects based on the control modes of those objects. For example, the animation generation unit 314 may generate an animation that reproduces the game screen actually being displayed on the user terminal 100 with which it is communicating.
The display control unit 312 outputs, to the display unit 352 of the touch screen 35, a game screen that reflects the results of the processing executed by each of the above elements. The details of the functions of the display control unit 312 are substantially the same as those of the display control unit 112 in the user terminal 100.
The progress simulation unit 315 grasps the progress of the game on the user terminal 100 based on the progress information received from the user terminal 100. The progress simulation unit 315 then presents the progress of the user terminal 100 to the operator by reproducing, in simulated form, the behavior of the user terminal 100 on the operation instruction device 300.
For example, the progress simulation unit 315 may display a reproduction of the game screen being displayed on the user terminal 100 on the display unit 352 of its own device. The progress simulation unit 315 may also display the progress of the game on the user terminal 100 on the display unit 352 in the form of the above-described progress log.
Part of the functions of the progress simulation unit 315 may also be realized by the control unit 310 executing the game program 131. For example, the progress simulation unit 315 first grasps the progress of the game on the user terminal 100 based on the progress information. The progress simulation unit 315 may then reproduce on the display unit 352 of its own device, either fully or in simplified form, the game screen currently displayed on the user terminal 100 based on the game program 131. Alternatively, the progress simulation unit 315 may grasp the current progress of the game, predict how the game will proceed from that point on based on the game program 131, and output the prediction result to the display unit 352.
The character control unit 316 controls the behavior of the character displayed on the user terminal 100. Specifically, it generates operation instruction data for operating the character and supplies the data to the user terminal 100. For example, the character control unit 316 generates operation instruction data instructing the controlled character to speak, based on voice data that an operator (such as a voice actor) has input via the microphone 3010. Operation instruction data generated in this way includes at least the above-described voice data. As another example, it generates operation instruction data instructing the controlled character to perform movements based on motion capture data that an operator (such as a model) has input via the motion capture device 3020. Operation instruction data generated in this way includes at least the above-described motion capture data. As yet another example, it generates operation instruction data instructing that the controlled character be operated based on the history of input operations the operator has input via an input mechanism such as the controller 3030 or an operation unit such as the input unit 351, that is, based on operation history data. Operation instruction data generated in this way includes at least the above-described operation history data. The operation history data is, for example, information in which operation logs, each indicating which button of the controller 3030 the operator pressed at which timing while which screen was displayed on the display unit, are organized in chronological order. The display unit here is a display unit linked to the controller 3030, and may be the display unit 352 of the touch screen 35 or another display unit connected via the input/output IF 34. Alternatively, the character control unit 316 identifies the commands instructing the character's actions that are associated with the input operations the operator has input via the above-described input mechanism or operation unit. The character control unit 316 may then arrange the commands in the order in which they were input to generate a motion command group indicating a series of actions of the character, and generate operation instruction data instructing that the character be operated according to that motion command group. Operation instruction data generated in this way includes at least the above-described motion command group.
The reaction processing unit 317 receives feedback about the user's reactions from the user terminal 100 and outputs it to the operator of the operation instruction device 300. In the present embodiment, for example, while the user terminal 100 is operating the character according to the above-described operation instruction data, the user can write a comment addressed to the character. The reaction processing unit 317 receives the comment data of that comment and outputs it. The reaction processing unit 317 may display text data corresponding to the user's comment on the display unit 352, or may output voice data corresponding to the user's comment from a speaker (not shown).
The functions of the user terminal 100, the server 200, and the operation instruction device 300 shown in FIG. 2 are merely examples. Each of the user terminal 100, the server 200, and the operation instruction device 300 may have at least some of the functions of the other devices. Further, yet another device other than the user terminal 100, the server 200, and the operation instruction device 300 may be made a component of the game system 1, and that device may be made to execute part of the processing in the game system 1. That is, the computer that executes the game program in the present embodiment may be any of the user terminal 100, the server 200, the operation instruction device 300, or another device, and the functions may be realized by a combination of a plurality of these devices.
<Processing overview>
In the present embodiment, the user terminal 100 is configured to execute the following steps based on the game program 131 in order to improve the interest of the game. Specifically, the user terminal 100 executes a step of advancing the game in response to the user's input operations entered into the user terminal 100 via an operation unit (for example, the input unit 151); a step of transmitting progress information indicating the progress of the game to the operation instruction device 300, which controls at least one character appearing in the game; a step of receiving, from the operation instruction device 300, voice data corresponding to the character's speech that was input on the operation instruction device 300 in accordance with the progress of the game; and a step of operating the character by, triggered by the reception of the voice data, causing the character appearing during the progress of the game to utter at least the content of that voice data.
Likewise, in the present embodiment, the operation instruction device 300 is configured to execute the following steps based on the character control program 134 in order to improve the interest of the game. Specifically, the operation instruction device 300 executes a step of displaying, on the display unit 352, a progress screen indicating the progress of the game based on the progress information received from the user terminal 100 that is advancing the game; a step of accepting voice data corresponding to the character's speech that is input to the operation instruction device 300 in accordance with the progress of the game indicated by the progress screen displayed on the display unit 352; and a step of transmitting the accepted voice data to the user terminal 100.
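Put together, the two sides form a simple report-and-respond loop. The following Python sketch shows one possible shape of it; send, recv, poll, display, capture_voice, and speak are hypothetical transport and I/O helpers, and the field names are assumptions.

    def terminal_tick(state: dict, send, poll, speak) -> None:
        # User terminal side: report progress, then act on any voice data.
        send({"screen_id": state["screen_id"], "progress_log": state["log"]})
        voice = poll()            # returns None until voice data arrives
        if voice is not None:     # reception of voice data is the trigger
            speak(voice["character_id"], voice["audio"])

    def device_tick(recv, display, capture_voice, send) -> None:
        # Operation instruction device side: show progress, accept and send voice.
        progress = recv()         # progress information from the terminal
        display(progress)         # progress screen shown to the operator
        voice = capture_voice()   # voice input matched to the displayed progress
        if voice is not None:
            send(voice)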
According to the above configuration, while the user is playing the game, the progress of the game on the user terminal 100 is reported to the operation instruction device 300. The operator of the operation instruction device 300 can therefore grasp how far the user has advanced the game. This allows the operator, or a voice actor instructed by the operator, to input voice data matched to the progress of the user's game into the operation instruction device 300. The voice data input into the operation instruction device 300 by the operator based on that progress is supplied to the user terminal 100. The user terminal 100 causes the character to utter the content of the voice data while the game is in progress. As a result, the user can play the game while being aware of the presence of a character who speaks in a way that matches the progress of the game being played. Since the character can be made to speak in step with the progress of the game, the user can feel, toward the character, a sense of reality as if they were playing the game together. This has the effect of heightening the sense of immersion in the world of the game and improving the interest of the game.
<Data structure>
(Screen transition information)
FIG. 3 is a diagram showing an example of the data structure of the screen transition information. As an example, the screen transition information includes the items "game part" and "screen information".
The item "game part" stores the identification information of each part constituting this game. As an example, this game is a competitive tennis game composed of three parts: a tutorial part, a battle part, and a lottery part.
The item "screen information" stores information defining the screens displayed in each part and their transitions. For example, in the tutorial part of this game, the introduction screen with ID "0001" is displayed on the display unit 152, then the practice screen with ID "0002" is displayed, and finally the explanation screen with ID "0003" is displayed, after which the tutorial part ends.
In the present embodiment, the user terminal 100 and the operation instruction device 300 may each store this screen transition information in their respective storage units and share it. This way, simply by sending and receiving a screen ID between the user terminal 100 and the operation instruction device 300, the user terminal 100 can easily convey to the operation instruction device 300 which game screen of which part it is currently displaying.
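As an illustration, the shared screen transition information and the screen-ID lookup might look like the following in Python; the table contents beyond the tutorial part of FIG. 3 are assumptions.

    # Screen transition information shared by both storage units (FIG. 3).
    SCREEN_TRANSITIONS = {
        "tutorial": ["0001", "0002", "0003"],  # introduction -> practice -> explanation
        # "battle" and "lottery" would be defined analogously (assumed).
    }

    def locate_screen(screen_id: str) -> str:
        """Resolve a received screen ID to its game part and step."""
        for part, screens in SCREEN_TRANSITIONS.items():
            if screen_id in screens:
                return f"part={part}, step={screens.index(screen_id) + 1}"
        return "unknown screen"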
(Progress information)
FIG. 4 is a diagram showing an example of the data structure of the progress information generated by the progress information generation unit 117 of the user terminal 100. As an example, the progress information includes the items "screen ID" and "progress log".
The item "screen ID" stores the above-described screen ID. The progress information generation unit 117 identifies the game screen that the game progress unit 115 is currently displaying on the display unit 152 and stores its screen ID in this item.
The item "progress log" stores the progress log of the game being executed by the game progress unit 115. For example, as the battle part progresses, the game progress unit 115 records the progress log in the storage unit 120 periodically, or each time an event that should be logged occurs. The progress information generation unit 117 stores the latest progress log recorded by the game progress unit 115 in this item.
The progress log is, for example, a set of records arranged in chronological order, in each of which the time an event occurred is associated with the content of that event. The progress information generation unit 117 may store the entire progress log in the "progress log" item, or may store only the records of the progress log that have not yet been reported to the operation instruction device 300.
The progress information may further include items necessary for grasping the progress of the game, depending on the nature of the game. For example, in the battle part of the competitive tennis game, the positions of the competing players in the virtual game space and the position of the ball the players hit back and forth change in response to the user's input operations. An object whose attributes change in response to the user's input operations, like the players and the ball described above, is hereinafter referred to as a dynamic object. The attributes of an object refer, for example, to the object's position, size, or shape in the game space. In the present embodiment, therefore, the progress information may further include a "dynamic object" item.
The item "dynamic object" stores the attribute-related information necessary for identifying the attributes of a dynamic object. The attribute-related information may be, for example, coordinate information for identifying the position of the dynamic object in the game space. Alternatively, when the dynamic object is a ball, for example, the attribute-related information may be information for identifying the movement path of the ball. For example, the progress information generation unit 117 may acquire from the game progress unit 115 the velocity vector, rotation axis, amount of rotation, and the like of the ball immediately after it collides with the racket, and store them in the "dynamic object" item as attribute-related information.
The progress information generation unit 117 generates progress information including each of the above items periodically while the game is in progress, or each time a predetermined event occurs, and transmits it to the operation instruction device 300. This allows the operation instruction device 300 to grasp the progress of the game on the user terminal 100.
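A minimal sketch of progress information assembled from the items described above, assuming illustrative field names (the present embodiment fixes only the items, not their encoding):

    import time

    def log_event(progress_log: list, event: str) -> None:
        """Append a timestamped record; the log stays in chronological order."""
        progress_log.append({"time": time.time(), "event": event})

    def build_progress_info(screen_id: str, unreported: list, ball: dict) -> dict:
        """Assemble one progress-information message for the device 300."""
        return {
            "screen_id": screen_id,
            "progress_log": unreported,  # records not yet reported
            "dynamic_object": {          # ball state right after hitting the racket
                "velocity": ball["velocity"],
                "spin_axis": ball["spin_axis"],
                "spin_rate": ball["spin_rate"],
            },
        }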
(Operation instruction data)
FIG. 5 is a diagram showing an example of the data structure of the operation instruction data processed in the game system 1 according to the present embodiment. As an example, the operation instruction data includes the items "destination" and "creator", which are meta information, and the items "character ID", "voice", "movement", and "attention point", which are the contents of the data.
The item "destination" stores destination designation information. The destination designation information is information indicating to which device the operation instruction data is to be transmitted. The destination designation information may be, for example, an address unique to a user terminal 100, or the identification information of a group to which user terminals 100 belong. It may also be a symbol (for example, "ALL") indicating that all user terminals 100 satisfying a certain condition are the destination.
The item "creator" stores creation source information. The creation source information is information indicating which device created the operation instruction data. The creation source information is information related to a user that can identify a specific user, such as a user ID, a user terminal ID, or the unique address of a user terminal. The creation source information may also be an ID or address indicating the server 200 or the operation instruction device 300; when the creation source is the server 200 or the operation instruction device 300, the value of this item may be left empty, or the item itself may be omitted from the operation instruction data.
The item "character ID" stores a character ID for uniquely identifying a character appearing in this game. The character ID stored here indicates which character's actions the operation instruction data is meant to direct.
The item "voice" stores voice data to be expressed by the character. The item "movement" stores motion data specifying the character's movements. The motion data may be, as one example, motion capture data acquired by the operation instruction device 300 via the motion capture device 3020. The motion capture data may be data tracking the movements of the actor's entire body, data tracking the actor's facial expressions and mouth movements, or both. As another example, the motion data may be a motion command group, specified by operations the operator of the operation instruction device 300 has input via the controller 3030, that directs a series of movements of the character. For example, suppose the commands "raise the right hand", "raise the left hand", "walk", and "run" are assigned to buttons A, B, C, and D of the controller 3030, respectively. When the operator presses buttons A, B, C, and D in succession, a motion command group in which the commands "raise the right hand", "raise the left hand", "walk", and "run" are arranged in that order is stored in the "movement" item as motion data. In the present embodiment, the voice data and the motion data are included in the operation instruction data in a mutually synchronized state.
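The button-to-command example above can be sketched directly; the mapping is taken from the example, and everything else is assumed.

    # Command assignment from the example above (controller 3030).
    BUTTON_COMMANDS = {
        "A": "raise the right hand",
        "B": "raise the left hand",
        "C": "walk",
        "D": "run",
    }

    def build_motion_command_group(presses: list) -> list:
        """Arrange the commands in the order the buttons were pressed."""
        return [BUTTON_COMMANDS[b] for b in presses if b in BUTTON_COMMANDS]

    # Pressing A, B, C, D in succession yields:
    # ["raise the right hand", "raise the left hand", "walk", "run"]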
The item "attention point" stores attention information for identifying, on the game screen displayed on the display unit 152 of the user terminal 100, a spot the user should be made to notice during the interaction between the character and the user. As one example, the attention information may be identification information, such as an object ID, identifying the object among those arranged on the game screen to which the user's attention should be drawn. Alternatively, the attention information may be position coordinates (second position coordinates) indicating a specific position on the display unit 152 that displays the game screen. In the present embodiment, the progress simulation unit 315 of the operation instruction device 300 may reproduce, on the display unit 352, the game screen displayed on the display unit 152 of the user terminal 100. In this case, the operator of the operation instruction device 300 can designate, with an input operation via the input unit 351, the spot to which the user's attention should be drawn. The first position coordinates on the display unit 352 at which the input operation was made are converted into the above-described second position coordinates by the character control unit 316. For this conversion, the character control unit 316 refers, for example, to the specifications of the touch screen 15 (such as the number of pixels and the aspect ratio) acquired in advance from the user terminal 100, and to the position at which the reproduced game screen is displayed on the display unit 352. Using this information, the character control unit 316 can obtain the second position coordinates on the touch screen 15 of the user terminal 100 that correspond to the first position coordinates on the display unit 352 of its own device.
For example, the display unit 352 and the input unit 351 may constitute the touch screen 35. In this case, the first position coordinates of a touch operation (input operation) that the operator inputs on the simulated screen displayed on the display unit 352 are acquired by the character control unit 316 via the input unit 351 as position coordinates on the touch screen 35. In another embodiment, the input unit 351 and the display unit 352 may be formed separately. For example, when the input unit 351 is an input device formed separately from the display unit 352, such as a mouse or a keyboard, the operator uses the mouse to click a desired position on the simulated screen displayed on the display unit 352. The first position coordinates of the click operation (input operation) on the simulated screen are determined based on the input timing of the click operation and the display position of the cursor on the display unit 352 at that time, and are supplied to the character control unit 316.
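The coordinate conversion described above reduces to a normalize-and-rescale step. A minimal Python sketch, assuming the reproduced screen's origin and size on the display unit 352 and the touch screen 15 resolution are known; all parameter names are illustrative.

    def to_second_coords(x1: float, y1: float,
                         sim_origin: tuple, sim_size: tuple,
                         terminal_size: tuple) -> tuple:
        """Map first position coordinates (display unit 352) to second
        position coordinates (touch screen 15 of the user terminal 100)."""
        # Normalize within the reproduced game screen ...
        u = (x1 - sim_origin[0]) / sim_size[0]
        v = (y1 - sim_origin[1]) / sim_size[1]
        # ... then rescale to the terminal's pixel counts.
        return (u * terminal_size[0], v * terminal_size[1])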
When the game progress unit 115 of the user terminal 100 uses the "attention point" information, it can render a display that reliably conveys to the user the spot on the game screen to which attention should be drawn. For example, even if the character's utterance contains a demonstrative such as "this", "here", "over there", or "this way", the game progress unit 115 can grasp the spot the demonstrative refers to and highlight it. The user can therefore know exactly where the demonstrative is pointing. As a result, conversations containing demonstratives can be carried on between the user and the character. Consequently, natural, well-paced communication full of presence, as if the character were playing right there alongside the user, can be realized between the user and the character.
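One hedged sketch of how a game progress unit might act on the "attention point" item; the renderer callbacks and dictionary shapes are assumptions.

    def highlight_attention(attention: dict, objects: dict, outline, marker) -> None:
        """Emphasize the spot a demonstrative ("this", "here", ...) refers to."""
        if "object_id" in attention:
            obj = objects.get(attention["object_id"])
            if obj is not None:
                outline(obj)               # e.g. draw a ring around the object
        elif "position" in attention:
            marker(attention["position"])  # e.g. pulse at the second coordinates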
By receiving such operation instruction data, the game progress unit 115 can make the character appearing in the game act exactly as the creator of the operation instruction data intended. Specifically, when the operation instruction data contains voice data, the game progress unit 115 makes the character speak based on that voice data; when the operation instruction data contains motion data, it moves the character based on that motion data, that is, it generates an animation of the character moving in accordance with the motion data.
<Screen example>
(Progress screen)
FIG. 6 is a diagram showing an example of a progress screen displayed on the display unit 352 of the operation instruction device 300. In the present embodiment, the progress simulation unit 315 displays the progress information acquired from the user terminal 100 on the display unit 352 as, for example, the progress screen 400.
As an example, the progress screen 400 includes a simulated screen 401 and a progress log 402. The progress screen 400 may further include a UI component 403 for the operator (or voice actor) to input voice data to the device via the microphone 3010. The progress screen 400 may include, as the simulated screen 401, the same screen as is actually being displayed on the display unit 152 of the user terminal 100, or a simulated screen 401 that simplifies that screen.
The progress simulation unit 315 analyzes the progress information acquired from the user terminal 100. Based on the screen ID included in the progress information, the progress simulation unit 315 identifies the game screen being displayed on the user terminal 100 and displays the identified game screen on the display unit 352. For example, the progress simulation unit 315 may reproduce the game screen in detail based on the game information 132, the user information 133, and the game program 131 stored in the storage unit 320. To reduce the processing load, however, the progress simulation unit 315 preferably generates a simulated screen 401 that simplifies the game screen. The simulated screen 401 contains, of the information arranged on the game screen, only the minimum information the operator of the operation instruction device 300 needs to judge the progress of the game. As the minimum information, the simulated screen 401 may, for example, omit the rendering of the objects' appearance and include only the layout of each object, a description of each object's function, and each object's status.
Furthermore, the progress simulation unit 315 preferably reproduces, on the simulated screen 401, the attributes of the dynamic objects arranged on the game screen, based on the attribute-related information of the dynamic objects included in the progress information. For example, the progress simulation unit 315 may calculate the movement trajectory of the ball 404, a dynamic object, based on the attribute-related information of the ball 404, and move the ball 404 along that trajectory. Alternatively, the progress simulation unit 315 identifies the position of each player in the game space based on the attribute-related information of the dynamic objects: the player 405 operated by the user and the player 406 operated by the opposing user. The progress simulation unit 315 then determines the size of each player based on the positional relationship between each player and the virtual camera, and places each player at the identified position. Alternatively, the display size of each player may be defined in advance as attribute-related information, and the progress simulation unit 315 may place each player according to the defined display size.
The progress simulation unit 315 places the progress log included in the progress information on the progress screen 400. For example, as illustrated, the progress simulation unit 315 generates a progress log 402 containing, as text data, records in each of which the time an event occurred is associated with the content of that event.
As described above, the operator of the operation instruction device 300 can grasp the progress of the game being executed on the user terminal 100 by checking the progress screen 400.
Furthermore, based on the game program 131, the progress simulation unit 315 may include in the progress screen 400 a prediction result that forecasts how the game will unfold from the current progress. The progress simulation unit 315 may display the prediction result superimposed on the simulated screen 401, display it so that it can be switched with the simulated screen 401, or display it alongside the simulated screen 401. Alternatively, the progress simulation unit 315 may display the prediction result as an addition to the progress log 402. In that case, the progress simulation unit 315 preferably displays the prediction result in a display mode different from that of events that have already occurred, for example in a different character color.
This allows the operator of the operation instruction device 300, in addition to having the character speak about the current situation in step with the progress of the game on the user terminal, to have the character give advice about what lies ahead based on the prediction result. For example, suppose that in the competitive tennis game the user is playing a part in which the user competes against the computer. In this case, progress information indicating that the ball has been hit back based on the user's input operation is supplied from the user terminal 100 to the operation instruction device 300. The progress simulation unit 315 can then predict, based on the game program 131, what kind of ball the computer will return next, and display the predicted return trajectory superimposed on the simulated screen 401. Seeing this, the operator can input voice so that the character gives advice such as "Move to the right, quick! It's coming this way!", and can designate the position that "this way" refers to. As another example, in a competitive board game such as shogi or Othello, the progress simulation unit 315 may detect, based on the progress information, the move the user is about to make, and predict how the game would subsequently unfold if that move were actually made. If the progress simulation unit 315 judges that the move would make defeat inevitable, it may display a pop-up message to that effect on the simulated screen 401 or in another blank area of the progress screen 400. Seeing this, the operator can input voice to have the character advise, "You'd better not play that one!".
Suppose the operation reception unit 311 accepts the operator's touch operation on the UI component 403 placed on the progress screen 400 while the microphone is off, that is, while input from the microphone 3010 is disabled. In this case, the character control unit 316 enables input from the microphone 3010, acquires the voice data input via the microphone 3010, and includes it in the operation instruction data. Conversely, suppose the operation reception unit 311 accepts the operator's touch operation on the UI component 403 while the microphone is on, that is, while input from the microphone 3010 is enabled. In this case, the character control unit 316 disables input from the microphone 3010 again.
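The toggling behaviour of UI component 403 amounts to a single piece of state. A minimal sketch, assuming a hypothetical microphone wrapper:

    class MicToggle:
        """Each tap on UI component 403 flips the microphone state."""

        def __init__(self, mic):
            self.mic = mic          # hypothetical wrapper around microphone 3010
            self.enabled = False    # starts in the mic-off state

        def on_tap(self) -> None:
            self.enabled = not self.enabled
            if self.enabled:
                self.mic.start_capture()  # captured audio is placed into the
            else:                         # operation instruction data
                self.mic.stop_capture()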
As described above, after confirming the progress of the game, the operator can decide on the spot what the character should say to match that progress and input the voice data corresponding to that utterance into the operation instruction device 300.
When the operation reception unit 311 accepts a touch operation from the operator on the display area of the simulated screen 401, the character control unit 316 may identify which object on the simulated screen 401 was designated by the touch operation and acquire the identification information of that object. The character control unit 316 may store that identification information in the "attention point" item of the operation instruction data. Alternatively, the character control unit 316 converts the first position coordinates of the touch operation on the touch screen 35 into second position coordinates on the touch screen 15 of the user terminal 100 according to the specifications of the display unit 152 of the user terminal 100. The character control unit 316 may store, in the "attention point" item of the operation instruction data, the second position coordinates for indicating the position of the operator's touch operation in the coordinate system of the display unit 152.
As described above, the operator can confirm the progress of the game and, while inputting voice data corresponding to what the character should say to match that progress, simultaneously designate the point on the game screen that the user should pay attention to at the moment of the utterance.
(Game screen)
When the game progress unit 115 of the user terminal 100 receives operation instruction data from the operation instruction device 300 while the game is in progress, it superimposes the character specified by that data on the game screen being displayed, and operates the character based on the operation instruction data. For example, suppose the game progress unit 115 receives operation instruction data while displaying the practice screen of the tutorial part. In this case, the game progress unit 115 superimposes the character 801 on the practice screen 800 displayed on the display unit 152, and operates the character 801 based on the received operation instruction data.
FIG. 7 is a diagram showing an example of a game screen displayed on the display unit 152 of the user terminal 100. FIG. 7 illustrates, as an example of such a game screen, the practice screen 800 displayed second in the tutorial part of this game.
As one example, the layout of the practice screen 800 is almost the same as that of the battle screen displayed in the battle part. Specifically, the game progress unit 115 draws a court in the game space on the practice screen 800. The game progress unit 115 then places the player 802 operated by the user in the foreground and the player 803 operated by the opponent (the COM in the tutorial part) in the background. By operating the player 802 and playing the tutorial part while looking at the practice screen 800, the user can learn the controls much as in an actual match in the battle part. While the game progress unit 115 advances the tutorial part, the progress information generation unit 117 generates progress information in a timely manner and transmits it to the operation instruction device 300. For example, suppose a rally continues, the skill gauge fills, and the player 802 transitions to a state in which a skill can be activated. At this time, the progress information generation unit 117 includes in the progress information a progress log containing the event that the skill activation button 804 has transitioned from unusable to usable, and transmits it to the operation instruction device 300.
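As one illustrative, non-limiting sketch of what such a progress log entry might look like (the field names below are assumptions; the specification states only that the progress information carries a screen ID, a progress log, and dynamic-object attribute information):

```typescript
// Illustrative shape of progress information sent from the user terminal 100
// to the operation instruction device 300; field names are assumptions.
interface ProgressEvent {
  timestamp: string; // when the event occurred in the session
  event: string;     // e.g. a UI state transition
}

interface ProgressInfo {
  screenId: string;             // identifies the game screen being displayed
  progressLog: ProgressEvent[]; // recent in-game events
}

const skillReady: ProgressInfo = {
  screenId: "practice_screen_800",
  progressLog: [
    { timestamp: "00:01:23", event: "skill_button_804: unusable -> usable" },
  ],
};
```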
The game progress unit 115 need not display the character 801 until operation instruction data is supplied from the operation instruction device 300. Alternatively, pre-made operation instruction data supplied in advance when the application of this game was downloaded may be stored in the storage unit 120 together with the game program 131. In this case, the game progress unit 115 may operate the character 801 in accordance with the game program 131, based on the pre-made operation instruction data read from the storage unit 120.
The operation instruction device 300, having received the progress information from the user terminal 100, displays it on the display unit 352 as the progress screen 400 shown in FIG. 6 described above. Having checked the progress screen 400, the operator can judge, for example, that the user should be advised to use the skill activation button 804. The operator or the voice actor 701 inputs voice 700 containing advice suited to the progress of the game into the operation instruction device 300 via the microphone 3010. The operator or the model (person) 702 may also, as needed, input the character's movement into the operation instruction device 300 via the motion capture device 3020. In addition, the operator may, as needed, operate the simulated screen 401 via the touch screen 35 to designate the point on the practice screen 800 that the user should pay attention to. In this way, operation instruction data containing at least the voice data, motion capture data added as needed, and the attention information is generated by the character control unit 316 and transmitted to the user terminal 100.
In the present embodiment, when the analysis unit 116 receives the operation instruction data from the operation instruction device 300, this reception serves as the trigger for the game progress unit 115 to superimpose the character 801 on the practice screen 800 based on the received operation instruction data. The motion indicated by the motion capture data included in the operation instruction data is then reflected in the motion of the character 801. As described above, the motion capture data is obtained by capturing the movement of the model 702 via the motion capture device 3020 at the location where the operation instruction device 300 is installed. The movement of the model 702 is therefore reflected directly in the movement of the character 801 displayed on the display unit 152.
The game progress unit 115 also outputs the voice data 805 included in the operation instruction data supplied from the operation instruction device 300 as the voice uttered by the character 801, in synchronization with the character 801's movement. The voice data is obtained by capturing the voice 700 of the voice actor 701 via the microphone 3010 at the location where the operation instruction device 300 is installed. The voice data 805 corresponding to the voice 700 uttered by the voice actor 701 is therefore output unaltered from the speaker of the user terminal 100.
The game progress unit 115 also highlights the location on the practice screen 800 indicated by the attention information included in the above-described operation instruction data. For example, when the attention information is identification information pointing to the skill activation button 804, or the position coordinates at which the skill activation button 804 is arranged, the game progress unit 115 highlights the skill activation button 804. For instance, the game progress unit 115 changes the display mode of the skill activation button 804: it may change the button's color, or attach to the button animations such as blinking, outlining, or jiggling up, down, left, and right.
Alternatively, the game progress unit 115 may superimpose on the practice screen 800 an instruction object 806 that designates the point of interest. Furthermore, the game progress unit 115 may draw the instruction object 806 as a piece of equipment of the character 801 and output a presentation in which the character 801 uses the instruction object 806 to point at the skill activation button 804.
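As one illustrative, non-limiting sketch of how a terminal might dispatch on the attention information, assuming the two forms described above (an object ID, or converted coordinates); the type and function names are assumptions:

```typescript
// Illustrative dispatch on the "attention point" item of the operation
// instruction data; identifiers are assumptions, not from the specification.
type AttentionInfo =
  | { kind: "objectId"; id: string }           // e.g. "skill_button_804"
  | { kind: "coords"; x: number; y: number };  // second position coordinates

function highlightAttentionPoint(info: AttentionInfo): void {
  if (info.kind === "objectId") {
    // Emphasize the designated UI object, e.g. recolor, blink, or jiggle it.
    console.log(`highlight object ${info.id} (recolor / blink / jiggle)`);
  } else {
    // Place an instruction object (e.g. a pointer held by the character)
    // at the converted coordinates on the game screen.
    console.log(`draw instruction object at (${info.x}, ${info.y})`);
  }
}
```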
According to the configuration described above, the voice of the real voice actor 701 at the location of the operation instruction device 300 is uttered in response to the progress of the game on the user terminal 100 and is reflected as-is in the voice of the character 801. Similarly, the movement of the real model 702 may be input into the operation instruction device 300 and supplied to the user terminal 100 together with the voice data. This makes it possible to animate the character 801 in step with its utterances as the game progresses. Furthermore, the simulated screen 401 may be operated by the operator via the touch screen 35 at the location of the operation instruction device 300. In this case, the point on the practice screen 800 that the user should pay attention to in relation to the character 801's utterance can be conveyed from the operation instruction device 300 to the user terminal 100. The user can thus be shown the point of interest on the practice screen 800 at the very moment the character 801 speaks in response to the progress of the game.
Seeing the character 801 behave in line with the progress of the game during play, as described above, the user can feel a sense of reality about the character 801, as if it existed in the real world. Moreover, the user can feel as if playing the game together with the character 801 and, as a result, can enjoy the game all the more.
Furthermore, the game progress unit 115 may display the character 801 shown during game progress on the display unit 152 in a display mode corresponding to the user's play results so far.
As one example, if an item that the character 801 can wear (clothing, accessories, and other apparel) has been acquired in a previously played battle part or lottery part, the game progress unit 115 may composite the object of that item onto the character 801. According to this configuration, items the user has acquired by playing the game can be reflected in the apparel of the character 801. The user may also be able to consume during the game valuable data (items, tips, and the like) purchased before, during, or after the game with virtual currency usable in the game, or through billing.
This lets the user feel more attached to the character 801 and enjoy the game all the more. Furthermore, it can cultivate the user's desire to upgrade the character 801's apparel and, as a result, strengthen the motivation to play the game.
Furthermore, in the present embodiment, the game progress unit 115 may allow the user to input a comment addressed to the character 801 in reaction to the character 801's actions. As one example, the game progress unit 115 may place a comment input button 807 on the practice screen 800. The user touches the comment input button 807 to call up a UI for inputting a comment, and operates that UI to input a comment addressed to the character 801. The UI may be one that lets the user select a desired comment from several prepared comments, one that lets the user compose a comment by editing text, or one that lets the user input a comment by voice.
Alternatively, in a game such as a competitive tennis game, in which the user's input operations are constantly required to keep the game going and the user has no time to spare for a comment input operation, the game may be configured so that the user can always input voice, without providing the comment input button 807.
According to the configuration described above, the user can play the game while enjoying interactive exchanges with the character 801 in real time.
<Processing flow>
FIG. 8 is a flowchart showing the flow of processing executed by each device constituting the game system 1.
In step S101, upon receiving an input operation from the user to start the game, the game progress unit 115 of the user terminal 100 accesses the server 200 and requests login.
In step S102, the progress support unit 211 of the server 200 confirms that the status of the user terminal 100 is online and responds that the login has been accepted.
In step S103, the game progress unit 115 advances the game in response to the user's input operations, communicating with the server 200 as necessary. For example, the game progress unit 115 advances a tutorial part, a battle part, or a lottery part.
In step S104, the progress support unit 211 supports the progress of the game on the user terminal 100, for example by providing the user terminal 100 with necessary information as needed.
In step S105, when the live distribution time arrives at which operation instruction data is to be delivered live from the operation instruction device 300 to the user terminal 100, the sharing support unit 212 of the server 200 proceeds from YES in step S105 to step S106. The live distribution time is, for example, determined in advance by the game master and may be managed by the server 200 and the operation instruction device 300. The user terminal 100 may be notified of the live distribution time in advance, or the time may be kept secret until it actually arrives. In the former case, live distribution can be supplied to the user reliably; in the latter case, a live distribution with special added value can be supplied to the user as a surprise.
In step S106, the sharing support unit 212 searches for one or more user terminals 100 entitled to receive the live distribution. The conditions for receiving the live distribution may be set by the game master as appropriate; at a minimum, the application of this game must be installed and the terminal must be online at the live distribution time. In the present embodiment, as one example, the sharing support unit 212 searches for specific user terminals 100 that made an advance reservation to receive the live distribution at the aforementioned live distribution time, treating them as the user terminals 100 entitled to receive it. Alternatively, the sharing support unit 212 may search for user terminals 100 that are online at the live distribution time, that is, terminals on which the application of this game is running, as the user terminals 100 entitled to receive the live distribution. Alternatively, the sharing support unit 212 may further add the condition that the user terminal 100 be owned by a user who has already paid the consideration for receiving the live distribution.
In step S107, the sharing support unit 212 notifies the operation instruction device 300 of the one or more detected user terminals 100. For example, the sharing support unit 212 may notify the operation instruction device 300 of the terminal ID of each user terminal 100, the user ID of the user who owns it, its address, and the like.
In step S108, the sharing support unit 212 notifies the user terminals 100 detected in step S106 of the operation instruction device 300 identified as the entity executing the live distribution. The sharing support unit 212 may, for example, notify the user terminals 100 of the address or device ID of the operation instruction device 300.
Meanwhile, in step S109, when the live distribution time arrives, the character control unit 316 of the operation instruction device 300 proceeds from YES in step S109 to the processing from S111 onward. Alternatively, the character control unit 316 may start the processing from S111 onward when it receives from the server 200 a request to start the live distribution together with the information about the destination user terminals 100 (after step S107).
Note that the live distribution may be configured to start in response to a user request. In this case, the user terminal 100 sends the server 200 a request for live distribution of operation instruction data in accordance with the user's input operation. If an operation instruction device 300 capable of handling the live distribution is available at that time, the server 200 returns to the user terminal 100 a response indicating that live distribution is possible. The server 200 also notifies each of the matched user terminal 100 and operation instruction device 300 of the information about its communication counterpart.
In step S110, the progress information generation unit 117 of the user terminal 100 generates progress information and transmits it to the operation instruction device 300. While the game is being advanced by the game progress unit 115 from step S103 onward, the progress information generation unit 117 updates the progress log periodically or whenever an event occurs, and generates and transmits progress information in a timely manner.
In step S111, the progress simulation unit 315 of the operation instruction device 300 simulates on its own device the progress of the game on the user terminal 100, based on the progress information. For example, the progress simulation unit 315 generates the progress screen 400 showing the progress of the game on the user terminal 100 based on the progress information, and displays it on the display unit 352. As one example, as shown in FIG. 6, the progress simulation unit 315 places the progress log 402 included in the progress information on the progress screen 400. The progress simulation unit 315 also identifies, based on the screen ID included in the progress information, the game screen being displayed on the user terminal 100, and places the simulated screen 401 of that game screen on the progress screen 400. Furthermore, the progress simulation unit 315 may reproduce dynamic objects on the simulated screen 401 based on the attribute-related information of the dynamic objects included in the progress information.
In step S112, the character control unit 316 receives, as voice data via the microphone 3010, the voice uttered by an actor such as a voice actor.
In step S113, the character control unit 316 acquires, as motion capture data, the movement that an actor such as a model inputs via the motion capture device 3020.
In step S114, the character control unit 316 identifies the location on the game screen that the operator wants the user to pay attention to, based on the touch operation the operator inputs via the input unit 351 of the touch screen 35.
Note that steps S112 to S114 may be executed in any order.
In step S115, the character control unit 316 generates the operation instruction data. Specifically, the character control unit 316 identifies the character to be superimposed on the game screen of the user terminal 100, and stores that character's character ID in the "character ID" item of the operation instruction data. Which character is superimposed may be scheduled in advance by the game master and registered in the operation instruction device 300. Alternatively, the operator of the operation instruction device 300 may specify in advance, to the operation instruction device 300, for which character the operation instruction data is to be created. The character control unit 316 stores at least the voice data acquired in step S112 in the "voice" item of the operation instruction data. If there is motion capture data acquired in step S113, the character control unit 316 stores it in the "movement" item of the operation instruction data. It is preferable that the character control unit 316 associate the voice data and the motion capture data with each other so that they are synchronized. If a touch operation was accepted in step S114, the character control unit 316 generates attention information based on it and stores the information in the "attention point" item of the operation instruction data. The character control unit 316 stores, as destination designation information in the "destination" item of the operation instruction data, the group identification information of the group of the one or more user terminals 100 notified by the server 200 in step S107, or, in the case of a single terminal, the address of that user terminal 100.
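The items named above can be gathered into a single record. As one illustrative, non-limiting sketch (the item names follow this description, while the concrete types are assumptions):

```typescript
// Illustrative shape of the operation instruction data assembled in step S115.
// Item names follow the specification; the concrete types are assumptions.
interface OperationInstructionData {
  destination: string;     // group ID, or a single terminal's address
  characterId: string;     // which character to superimpose
  voice: ArrayBuffer;      // voice data captured in step S112
  movement?: ArrayBuffer;  // motion capture data from step S113, if any
  attentionPoint?:         // attention information from step S114, if any
    | { kind: "objectId"; id: string }
    | { kind: "coords"; x: number; y: number };
}
```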
In step S116, the character control unit 316 transmits the operation instruction data generated as described above, via the communication IF 33, to each user terminal 100 designated as a destination. It is desirable that the character control unit 316 render the voice data and motion capture data obtained from the actor's speech and movement into operation instruction data as soon as they are acquired, and distribute it to each user terminal 100 in real time.
Note that steps S112 to S116 may be executed after step S107 and ahead of step S111.
In step S117, the analysis unit 116 of the user terminal 100 receives the above-described operation instruction data via the communication IF 13. For example, the analysis unit 116 may receive the operation instruction data at the time announced in advance by the operation instruction device 300 or the server 200 as the live distribution time.
In step S118, the analysis unit 116, triggered by the reception, immediately analyzes the received operation instruction data.
In step S119, the game progress unit 115 superimposes the character specified by the operation instruction data analyzed by the analysis unit 116 on the game screen being displayed on the display unit 152, and operates the character based on that operation instruction data. Specifically, the game progress unit 115 superimposes the character 801 on the practice screen 800 or the like shown in FIG. 7 while it is displayed on the display unit 152. Almost simultaneously with the actors, such as the voice actor 701 and the model 702, speaking and moving at the location of the operation instruction device 300, the game progress unit 115 reflects that voice and movement, in real time, in the speech and movement of the character 801 on the practice screen 800. The analysis unit 116 and the game progress unit 115 continue rendering and playing back the real-time video for as long as operation instruction data continues to be received from the operation instruction device 300. Specifically, while no input operation is accepted from the user and operation instruction data is being received, the game progress unit 115 returns from NO in step S120 to step S103 and repeats the subsequent steps.
In step S120, if the operation reception unit 111 accepts an input operation from the user while the character is operating based on the operation instruction data, the game progress unit 115 proceeds from YES in step S120 to step S121. For example, the operation reception unit 111 accepts an input operation on the comment input button 807 on the practice screen 800.
In step S121, the game progress unit 115 transmits the comment data generated in response to the above input operation to the operation instruction device 300. Specifically, the game progress unit 115 may transmit the comment ID of a selected comment as the comment data. Alternatively, the game progress unit 115 may transmit, as the comment data, the text data of a sentence input by the user, the voice data of a voice input by the user, or text data obtained by recognizing the voice input by the user and converting it into text.
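The four forms of comment data described here can be modeled as a tagged union. As one illustrative, non-limiting sketch (identifiers are assumptions):

```typescript
// Illustrative tagged union covering the comment data variants in step S121.
type CommentData =
  | { kind: "commentId"; id: string }          // a prepared comment was selected
  | { kind: "text"; text: string }             // free text typed by the user
  | { kind: "voice"; audio: ArrayBuffer }      // raw voice input
  | { kind: "recognizedText"; text: string };  // speech recognized into text

function sendComment(comment: CommentData): void {
  // In the described flow this would be sent to the operation instruction
  // device 300; raw audio is omitted from the log since it is binary.
  console.log("sending comment data:", JSON.stringify(
    comment.kind === "voice" ? { kind: comment.kind } : comment));
}
```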
In step S122, the reaction processing unit 317 of the operation instruction device 300 receives, via the communication IF 33, the comment data transmitted from the user terminal 100.
In step S123, the reaction processing unit 317 outputs the received comment data on the operation instruction device 300. For example, the reaction processing unit 317 displays the text data included in the comment data on the display unit 352. This allows the operator or the actors to receive feedback showing how the user reacted to the character they operated. The operator or the actors can then decide on the character's further actions in response to this feedback. That is, the operation instruction device 300 returns to step S112, continues acquiring voice data and, as needed, motion capture data, and keeps providing operation instruction data to the user terminal 100. After the content of its input operation has been received by the operation instruction device 300, the user terminal 100 receives the operation instruction data transmitted from that device. Specifically, the user terminal 100 receives operation instruction data containing voice data corresponding to what the character says, motion capture data corresponding to the character's movement, and so on. The user terminal 100 then continuously operates the character based on that operation instruction data. As a result, the user can be given the experience of real-time, interactive exchanges with the character. Note that, instead of motion capture data, the user terminal 100 may receive a motion command group in which one or more commands instructing the character's actions are arranged in the order instructed by the operator of the operation instruction device 300. Like the motion capture data, the motion command group is also linked to and synchronized with the voice data. This allows the game progress unit 115 of the user terminal 100 to move the character in accordance with the motion command group as the content of the voice data is uttered.
[Embodiment 2]
In the present embodiment, the operation instruction device 300 communicates with a plurality of user terminals 100 participating in a multiplayer game, and grasps the progress of the game on each user terminal 100 based on per-terminal progress information. The operation instruction device 300 then generates operation instruction data for making the character behave in a manner that matches the overall progress, and distributes it to each user terminal 100.
The game executed in the game system 1 according to the present embodiment (hereinafter, this game) is, as one example, a car racing game in which a plurality of users each operate a vehicle to lap the same course and compete on time.
The progress simulation unit 315 of the operation instruction device 300 according to the present embodiment acquires progress information from each of the plurality of user terminals 100 participating in the same race. The progress simulation unit 315 may also acquire, from the server 200 that supports the progress of the race and performs synchronization control, integrated progress information indicating the progress of the race as a whole. The progress simulation unit 315 may arrange the acquired per-user progress information side by side on the progress screen, or may integrate the individual pieces of progress information and place the resulting overall progress on the progress screen. Alternatively, the progress simulation unit 315 may place the already-integrated progress information received from the server 200 on the progress screen.
The character control unit 316 distributes the operation instruction data generated as described above to the plurality of user terminals 100 participating in the same race.
In the present embodiment, the progress information individual to each user terminal 100 may include, for example, the items "user ID", "user name", "current position", "rank", "lap", and "time (gap to the leader)". The integrated progress information generated by the server 200 may include the items "elapsed time", "leader's lap", "course map", and "current position of each race car".
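As one illustrative, non-limiting sketch of these two progress-information shapes, using the listed items as fields (the concrete types are assumptions):

```typescript
// Illustrative shapes for the per-terminal and integrated progress information
// in the car racing game; item names follow the specification, types are assumed.
interface IndividualProgress {
  userId: string;
  userName: string;
  currentPosition: { x: number; y: number }; // position on the course
  rank: number;
  lap: number;
  timeGapToLeader: number; // seconds behind the leader
}

interface IntegratedProgress {
  elapsedTime: number; // seconds since the race started
  leaderLap: number;
  courseMap: string;   // e.g. an ID of the course layout
  raceCarPositions: Record<string, { x: number; y: number }>; // keyed by user ID
}
```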
<Screen example>
(Progress screen)
FIG. 9 is a diagram showing another example of the progress screen displayed on the display unit 352 of the operation instruction device 300. In the present embodiment, the progress simulation unit 315 integrates the progress information acquired from each user terminal 100 and displays the progress screen 500 on the display unit 352.
As one example, the progress screen 500 includes an integrated progress diagram 501, an individual progress list 502, and integrated progress information 503. The progress screen 500 may further include the UI component 403 with which the operator (or voice actor) inputs voice data into the device via the microphone 3010.
The progress simulation unit 315 analyzes the progress information acquired from the user terminals 100 and the server 200. Based on the analysis results, the progress simulation unit 315 specifically generates the integrated progress diagram 501 as follows. The progress simulation unit 315 draws a course map based on the "course map" information included in the integrated progress information, and plots the race car object operated by each user on that course map based on the "current position of each race car" information.
The progress simulation unit 315 also generates the individual progress list 502 as follows. The progress simulation unit 315 extracts the "user name", "rank", "lap", and "time" items from each piece of individual progress information. It then arranges the information of the extracted items so that they are listed per user, generating the individual progress list 502.
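As one illustrative, non-limiting sketch of this assembly (the function names and compact types below are assumptions):

```typescript
// Illustrative assembly of the integrated progress diagram 501 and the
// individual progress list 502; identifiers are assumptions.
type Pos = { x: number; y: number };
interface RaceRow { userName: string; rank: number; lap: number; timeGapToLeader: number; }

function buildIntegratedDiagram(courseMap: string, cars: Record<string, Pos>): string[] {
  // One entry per race car plotted on the course map.
  return Object.entries(cars).map(
    ([userId, p]) => `plot ${userId} at (${p.x}, ${p.y}) on ${courseMap}`
  );
}

function buildIndividualList(rows: RaceRow[]): string[] {
  // Extract "user name", "rank", "lap", and "time" per user, listed by rank.
  return rows
    .slice()
    .sort((a, b) => a.rank - b.rank)
    .map(r => `${r.rank}. ${r.userName}  lap ${r.lap}  +${r.timeGapToLeader}s`);
}
```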
The progress simulation unit 315 may extract the "leader's lap" item from the integrated progress information and reflect it in the integrated progress information 503.
As described above, by checking the progress screen 500, the operator of the operation instruction device 300 can grasp the full picture of the progress of the game being executed on the plurality of user terminals 100.
The progress screen 500 may also be provided with the UI component 403. This allows the operator, after confirming the overall progress of the game, to decide on the spot what should be said to match that progress and input the voice data corresponding to that utterance into the operation instruction device 300.
(Game screen)
FIG. 10 is a diagram showing an example of a game screen displayed on the display unit 152 of the user terminal 100. FIG. 10 illustrates, as an example of such a game screen, the race screen 600 displayed while a race is in progress in this game. The race screen 600 shown is, as one example, the screen displayed on the user terminal 100 of the user running at the top of the race.
The game progress unit 115 draws, for example, a course in the game space on the race screen 600, and places the race car operated by the user. When a race car operated by another user falls within the angle of view given its positional relationship to the virtual camera in the game space, the game progress unit 115 may also place that other user's race car on the race screen 600.
While the game progress unit 115 advances the race, the progress information generation unit 117 generates progress information in a timely manner and transmits it to the operation instruction device 300.
The operation instruction device 300, having received individual progress information from each user terminal 100 participating in the race and integrated progress information from the server 200 that synchronizes the race, displays these pieces of progress information, either as-is or after processing, on the display unit 352 as the progress screen 500 shown in FIG. 9. The operator who checks the progress screen 500 can grasp the state of the race and decide on the content of the live commentary. The operator or the voice actor 701 inputs voice containing commentary suited to how the race is unfolding into the operation instruction device 300 via the microphone 3010. The operator or the model 702 may also, as needed, input the character's movement into the operation instruction device 300 via the motion capture device 3020.
In the present embodiment, when the analysis unit 116 receives the operation instruction data from the operation instruction device 300, this reception serves as the trigger for the game progress unit 115 to superimpose the character 601 on the race screen 600 based on the received operation instruction data. For example, the area in which the character 601 can be superimposed is preferably determined in advance for each game screen, at a position that does not interfere with the user's play. The motion indicated by the motion capture data included in the operation instruction data is then reflected in the motion of the character 601. As in Embodiment 1, the motion capture data is obtained by capturing the movement of the model 702 via the motion capture device 3020 at the location of the operation instruction device 300. The movement of the model 702 is therefore reflected directly in the movement of the character 601 displayed on the display unit 152.
The game progress unit 115 also outputs the voice data 602 included in the operation instruction data supplied from the operation instruction device 300 as the voice uttered by the character 601, in synchronization with the character 601's movement. As in Embodiment 1, the voice data is obtained by capturing the voice of the voice actor 701 via the microphone 3010 at the location of the operation instruction device 300. The voice data 602 corresponding to the voice uttered by the voice actor 701 is therefore output unaltered from the speaker of the user terminal 100.
The operation instruction device 300 can transmit the above-described operation instruction data to all user terminals 100 participating in the game. The character 601 is therefore superimposed and displayed together with the race screen 600 on the display unit 152 of each participating user terminal 100.
According to the configuration described above, the present invention can also be applied to a multiplayer game in which a plurality of user terminals 100 participate.
[Embodiment 3]
In the present embodiment, even after a real-time live distribution has ended, the user can request the progress of the ended live distribution part and advance the live distribution part anew based on received operation instruction data. This allows the user to watch the live distribution again, and to watch it belatedly even if it was missed. The description below assumes the time after the live distribution time has ended. The character here is assumed to be a character that is not the target of direct operation by the user (including avatar objects and the like). The "live distribution part" mainly includes the battle part described above, but may further include the lottery part and the tutorial part (the same applies hereinafter).
<Processing overview>
In the present embodiment, the user terminal 100 is configured to execute the following steps, based on the game program 131, in order to enhance the interest of the game. Specifically, the user terminal 100 (computer) executes: a step of requesting, via an operation unit such as the input unit 151, the progress of an ended live distribution part; a step of receiving, from the server 200 or the operation instruction device 300, the recorded operation instruction data pertaining to the ended live distribution part; and a step of advancing the ended live distribution part by operating the character based on the recorded operation instruction data. Here, the recorded operation instruction data includes motion data and voice data input by an operator associated with the character. The operator includes not only the model and the voice actor but also a worker who performs some operation on the operation instruction device 300; it does not include the user. The recorded operation instruction data is preferably stored in the storage unit 220 of the server 200 or the storage unit 320 of the operation instruction device 300, and delivered to the user terminal 100 anew in response to a request from the user terminal 100.
In the present embodiment, the way in which the ended live distribution part is advanced based on the recorded operation instruction data preferably differs depending on whether or not the user advanced the live distribution part in real time. Specifically, when it is determined that the user has a record of having advanced the live distribution part in real time, it is preferable to advance again a live distribution part equivalent to the one the user advanced in real time (look-back distribution). In look-back distribution, it is preferable to allow selective progression through the live distribution part. On the other hand, when it is determined that the user has no record of having advanced the live distribution part in real time, it is preferable to advance the live distribution part in a progression mode different from the real-time one (missed distribution). A determination of no record for missed distribution includes, for example, the case in which the user had the right to receive the live distribution and could have advanced the real-time live distribution part at the live distribution time, but did not actually do so. In missed distribution, it is preferable to execute a restricted progression of the live distribution part.
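As one illustrative, non-limiting sketch of this branching, assuming a boolean viewing record per user (identifiers are assumptions):

```typescript
// Illustrative branch between look-back and missed distribution.
type ReplayMode = "lookBack" | "missed";

function chooseReplayMode(advancedInRealTime: boolean): ReplayMode {
  // Look-back: the user watched live and may progress selectively.
  // Missed: the user did not watch live and gets a restricted progression.
  return advancedInRealTime ? "lookBack" : "missed";
}
```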
<Functional configuration of system 1>
In the user terminal 100 according to the present embodiment, when it is determined that the user has a record of having advanced the live distribution part in real time, the game progress unit 115 further receives and analyzes user action history information for that live distribution part. The user action history information is a data set, separate from the contents of the recorded operation instruction data, recording the user's actions accepted through input operations during the progress of the live distribution part. The user action history information is preferably associated with the recorded operation instruction data and stored in the storage unit 220 of the server 200 or the storage unit 320 of the operation instruction device 300. In addition to or instead of this, the user action history information may be stored in the storage unit 120 of the user terminal 100.
FIG. 11 is a diagram showing an example of the data structure of the user action history information. The user action history information includes, for example, items such as the action time at which the user acted within the live distribution part, the action type, and the action details, and is associated with a user ID that identifies the user. The item "action time" is time information indicating when the user acted within the live distribution part; the item "action type" indicates the kind of action; and the item "action details" is the concrete content of the action. For example, the actions specified by the "action type" and "action details" items may include the consumption of valuable data through the user's input operations (for example, throwing tip items, or paying charges through item purchases), comment input, and changes to items such as the character's apparel (so-called dress-up). Such actions may also include selecting a time for later playback of a specific progress portion of the live distribution part. Beyond these, such actions may include the acquisition of rewards, points, and the like during the live distribution part. The user action history information is preferably cross-referenced with the data structure of the operation instruction data and the data structure of the game information described later with reference to FIG. 12. It should be understood by those skilled in the art that these data structures are merely examples and are not limiting.
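As one illustrative, non-limiting sketch of one such record, using the items of FIG. 11 as fields (the concrete types and example values are assumptions):

```typescript
// Illustrative shape of a user action history record per FIG. 11;
// item names follow the specification, the concrete types are assumptions.
interface UserActionRecord {
  userId: string;     // identifies the user
  actionTime: string; // when the action occurred within the live part
  actionType:
    | "consumeValuableData" // tips, item purchases, and other charges
    | "comment"
    | "changeItem"          // so-called dress-up of the character
    | "selectPlaybackTime"  // mark a portion for later playback
    | "earnReward";         // rewards, points, etc.
  actionDetails: string;    // concrete content of the action
}

const example: UserActionRecord = {
  userId: "user-0001",
  actionTime: "00:12:45",
  actionType: "consumeValuableData",
  actionDetails: "threw tip item 'star' x3",
};
```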
FIG. 12 is a diagram showing an example of the data structure of the game information 132 processed by the system 1 according to the present embodiment. The items provided in the game information 132 are determined as appropriate according to the genre, nature, content, and the like of the game, and the items shown as examples do not limit the scope of the present invention. As one example, the game information 132 may be configured to include the items "play history", "items", "distribution history", and "game objects". Each of these items may be referred to as appropriate when the game progress unit 115 advances the game.
The item "play history" stores the user's play history. The play history is information indicating, for each scenario stored in the storage unit 120, whether the user has completed playing it. Here, a scenario may be, for example, a unit of video distribution (for example, the tutorial, game, or lottery in each game part), or part or all of a video played back in the look-back distribution or missed distribution described later (for example, each unit in which playback is executed). The play history may also include, for example, a list of fixed scenarios downloaded at the first play and a list of acquired scenarios obtained later. In each list, a status such as "played", "unplayed", "playable", or "unplayable" is linked to each scenario.
The item "items" stores a list of items the user owns as game media. In this game, an item is, as one example, a piece of apparel to be worn by the character. The user can have the character wear apparel items obtained by playing scenarios and thereby customize the character's appearance.
The item "distribution history" stores a list of videos previously live-distributed by the operator in the live distribution part, so-called back numbers. In the live distribution part, a video that is PUSH-distributed in real time can be viewed only at that time. Videos from past distributions, by contrast, are recorded on the server 200 or the operation instruction device 300 and can be PULL-distributed in response to a request from the user terminal 100. In the present embodiment, as one example, back numbers may be made downloadable for a charge paid by the user.
 The item "game objects" stores data of various objects that appear in the live distribution part, such as the character 801 and enemy objects and obstacle objects in a battle game.
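 A minimal sketch of the game information 132 of FIG. 12 follows; the statuses mirror the "play history" description above, while the field names and types are assumptions for illustration only:

```kotlin
// Completion statuses associated with each scenario in the play history lists.
enum class ScenarioStatus { PLAYED, UNPLAYED, PLAYABLE, UNPLAYABLE }

// Game information 132, with one field per item of FIG. 12.
data class GameInfo(
    val playHistory: Map<String, ScenarioStatus>, // scenario ID -> completion status
    val items: List<String>,                      // owned items (e.g. clothing for the character)
    val distributionHistory: List<String>,        // back numbers of past live distributions
    val gameObjects: List<String>                 // objects appearing in the live distribution part
)
```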
 <Processing flow>
 FIG. 13 is a flowchart showing an example of the basic game progress of a game executed based on the game program according to the present embodiment. The processing flow applies to scenes after the end of the live distribution time, where the real-time live distribution part has already ended.
 In step S301, the operation unit of the user terminal 100 newly requests the progress of an ended live distribution part. In step S302, in response to the request in step S301, the user terminal 100 receives, from the server 200 or the action instruction device 300, the recorded action instruction data relating to the ended live distribution part.
 The recorded action instruction data includes motion data and audio data input by the operator associated with the character. In addition to the recorded action instruction data, the user terminal 100 may receive various progress record data acquired and recorded along with the character's movements while the real-time live distribution part was in progress. Specifically, the progress record data may include viewer action data on how the users who participated in the real-time live distribution part acted in response to the character's movements. The viewer action data is data containing a record of the in-live actions of all users who advanced the real-time live distribution part in real time (that is, the viewers who participated in the live). In particular, the viewer action data preferably includes the content of messaging, such as text messages and icons, that viewers sent to the character in real time during the live. By advancing the ended live distribution part using the progress record data in this way, the viewers' reactions in the live distribution part that progressed in real time can be faithfully reproduced, and the sense of presence of the real-time live space can be further improved. The viewer action data may be, for example, the user action history information shown in FIG. 11.
 Note that the user terminal 100 may receive the recorded action instruction data and the progress record data as separate data and analyze (render) each of them. Alternatively, the recorded action instruction data and the viewer action data may be combined in advance on the server 200 or the action instruction device 300, and the user terminal 100 may receive the combined data set at once. Receiving the combined data set can reduce the load of the subsequent data analysis (rendering) on the user terminal 100. In the following description, it is assumed that the progress record data is combined with the recorded action instruction data (that is, the recorded action instruction data includes the progress record data).
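 The two reception options described above could be sketched as follows; all names are assumptions, not taken from the patent. The terminal either fetches the action instruction data and the progress record data separately and merges them, or receives a set combined in advance on the server side:

```kotlin
data class InstructionData(val motion: ByteArray, val audio: ByteArray)
data class ProgressRecordData(val viewerActions: List<String>) // e.g. messages sent during the live
data class CombinedDataSet(val instructions: InstructionData, val progress: ProgressRecordData?)

// Hypothetical interface standing in for the server 200 / action instruction device 300.
interface DeliverySource {
    fun fetchInstructions(liveId: String): InstructionData
    fun fetchProgressRecord(liveId: String): ProgressRecordData?
    fun fetchCombined(liveId: String): CombinedDataSet
}

// Receiving the pre-combined set spares the terminal one request and the
// local merge before rendering, reducing its analysis (rendering) load.
fun receiveForReplay(src: DeliverySource, liveId: String, preCombined: Boolean): CombinedDataSet =
    if (preCombined) src.fetchCombined(liveId)
    else CombinedDataSet(src.fetchInstructions(liveId), src.fetchProgressRecord(liveId))
```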
 Next, in step S303, the game progress unit 115 determines whether or not the user has a record of having advanced the live distribution part in real time. The determination may be made, for example, based on whether there is a record that the action instruction data was transmitted to the user terminal 100. Alternatively, it may be made by referring to the item "play history" shown in FIG. 12 and checking whether the live distribution part has the "played" status, or by referring to the item "distribution history" and checking whether there is a record of a past live distribution from the character. In addition, when recorded action instruction data is already stored in the storage unit 120 of the user terminal 100, it may be determined that the live distribution part has already been advanced in real time. The determination may also be made by combining these, or by any other method.
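 The step S303 determination could look like the following sketch; the signals checked (send record, play history, distribution history, locally stored data) follow the text above, but the function and field names are assumptions:

```kotlin
data class LiveRecords(
    val instructionDataSentToTerminal: Boolean, // server-side record of transmission to the terminal
    val playHistoryStatus: String?,             // "play history" item, e.g. "played"
    val hasPastDistributionRecord: Boolean,     // "distribution history" item
    val instructionDataStoredLocally: Boolean   // recorded data already in the storage unit 120
)

// Any one signal suffices; combining them (here with OR) is one of the
// options named above.
fun watchedLiveInRealTime(r: LiveRecords): Boolean =
    r.instructionDataSentToTerminal ||
    r.playHistoryStatus == "played" ||
    r.hasPastDistributionRecord ||
    r.instructionDataStoredLocally

// watchedLiveInRealTime(...) == true  -> look-back distribution (step S305)
// watchedLiveInRealTime(...) == false -> missed distribution (step S306)
```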
 If it is determined in step S303 that the user has a record of having advanced the live distribution part in real time (YES), the progress of the ended live distribution part is a "look-back distribution". On the other hand, if it is determined in step S303 that the user has no record of having advanced the live distribution part in real time (NO), the progress of the ended live distribution part is a "missed distribution". As described above, the user experience differs between the look-back distribution and the missed distribution.
 If it is determined in step S303 that the user has a record of having advanced the live distribution part in real time, the processing flow proceeds from YES in step S303 to step S304. In step S304, the game progress unit 115 acquires and analyzes the user action history information of the live distribution part shown in FIG. 11. The user action history information may be acquired from the server 200 or the action instruction device 300, or, if it is already stored in the storage unit 120 of the user terminal 100, it may be used directly.
 (Look-back distribution)
 Subsequently, in step S305, the game progress unit 115 executes the renewed progress of the ended live distribution part (that is, the above-described "look-back distribution"). Specifically, the renewed progress of the live distribution part is executed using the recorded action instruction data and the user action history information analyzed in step S304.
 In the look-back distribution, as in the live distribution part, it is preferable to give the user a choice of costumes for the character to wear. For example, the user may be allowed to select one of the costumes assigned to the ranking band associated with that user or below. In this way, the look-back distribution can offer higher-ranked users a larger selection of costumes.
 Furthermore, in the look-back distribution, tipping items thrown in during the real-time live distribution part may be reflected in the character's manner of movement. For example, if the user has acquired a clothing item (here, a "necklace"), the character is made to move based on that item (that is, wearing the necklace). The renewed progress of the live distribution part may be executed in this way. In other words, the renewed progress of the live distribution part reflects the user action history information and the reward information; it is similar to the live distribution part that progressed in real time, while also being unique to that user.
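 A minimal sketch of this compositing step, assuming a hook that applies the user's earned items to the character before playback; all names here are illustrative:

```kotlin
data class CharacterAppearance(val baseModel: String, val wornItems: List<String>)

// In a look-back distribution, items the user earned during the real-time live
// (e.g. a necklace obtained via tipping) are reflected in the character's
// appearance, making the replay unique to that user.
fun lookBackAppearance(base: String, earnedClothing: List<String>): CharacterAppearance =
    CharacterAppearance(baseModel = base, wornItems = earnedClothing)

fun main() {
    // A user who earned a "necklace" sees the character wearing it on replay.
    println(lookBackAppearance("character801", listOf("necklace")))
}
```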
 Also, in the look-back distribution, the live distribution part is advanced again. At this time, it is preferable to have the live distribution part play back selectively in accordance with time information designated by the user's input operation via the operation unit, as recorded during the first progress. Specifically, by using the "action time" data included in the user action history information described with reference to FIG. 11, the user can designate a specific action time and selectively advance the live distribution part from that point. For example, if the user had entered a comment 2 minutes and 45 seconds after the start of the live distribution part, the user can designate the timing 2 minutes and 45 seconds in and advance the live distribution part again from there. In addition to the record of comment input described above, such renewed progress is preferably made executable based on "action times" corresponding to records of actions such as the consumption of valuable data by the user's input operations and changes to items such as the character's clothing.
 Furthermore, in the look-back distribution, if the user had selected a specific progress portion by an input operation while the real-time live distribution part was running, only the selected specific progress portion can be selectively advanced in the renewed progress of the live distribution part. This allows the user to efficiently play back only the specific progress portion of the live distribution part later. Specifically, when the user has selected a specific progress portion and a record of that action is registered in the user action history information, the live distribution part can be selectively advanced using the action time data. For example, if the user had selected the period from 2 minutes 45 seconds to 5 minutes 10 seconds after the start of the live distribution part, the user can advance the live distribution part again over that period.
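 Both forms of time-based selection described above (jumping to a recorded action time, and replaying a selected range) could be sketched as follows, with assumed names:

```kotlin
import java.time.Duration

// A playback window within the recorded live distribution part;
// end == null means "play to the end".
data class PlaybackWindow(val start: Duration, val end: Duration?)

// Jump to the moment of a recorded action, e.g. a comment entered at 2:45.
fun windowFromActionTime(actionTime: Duration): PlaybackWindow =
    PlaybackWindow(start = actionTime, end = null)

// Replay only a specific progress portion the user selected during the live,
// e.g. 2:45 to 5:10.
fun windowFromSelection(from: Duration, to: Duration): PlaybackWindow {
    require(from < to) { "selection must be a forward range" }
    return PlaybackWindow(start = from, end = to)
}

fun main() {
    val fromComment = windowFromActionTime(Duration.ofMinutes(2).plusSeconds(45))
    val portionOnly = windowFromSelection(
        Duration.ofMinutes(2).plusSeconds(45),
        Duration.ofMinutes(5).plusSeconds(10)
    )
    println("$fromComment / $portionOnly")
}
```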
 (Missed distribution)
 Returning to FIG. 13, if it is determined in step S303 that the user has no record of having advanced the live distribution part in real time, the processing flow proceeds from NO in step S303 to step S306. In step S306, the game progress unit 115 executes the restricted progress of the ended live distribution part (that is, the above-described "missed distribution"). The missed distribution is restricted based on the idea that, although the user had the right to receive the live distribution, the user can be regarded as having waived that right, so it is not necessarily required to reproduce the entire live distribution and present it to the user.
 Specifically, in the missed distribution, the progress of the live distribution part is executed using the recorded action instruction data. As described above, if the user had acquired a clothing item (for example, a "necklace"), the live distribution part that progressed in real time composited images so that the character moved wearing that item (that is, with the character wearing the necklace or other clothing item). In other words, in the real-time live distribution part, the character's manner of movement was associated with the clothing item. In the missed distribution, however, unlike the live distribution part that progressed in real time, no clothing item or the like is associated with the character's manner of movement. That is, no image compositing is performed to make the character move wearing the item. The progress of the ended live distribution part is restricted in that information such as clothing items is not reflected and the progress is not unique to the user.
 According to the configuration and method described above, in the look-back distribution, tipping items thrown in during the real-time live distribution part can be reflected in the character's manner of movement. In the missed distribution, on the other hand, tipping items thrown in during the real-time live distribution part are not reflected in the character's manner of movement (although tipping items thrown in by other viewer users who were watching the live distribution in real time may be reflected in the character's manner of movement in the missed distribution). In other words, the look-back distribution (for users who watched the real-time live distribution in the past) can deliver images of higher interest than the missed distribution. This gives users an incentive to watch the look-back distribution rather than the missed distribution, and as a result can guide users toward the real-time live distribution.
 Also, in the missed distribution, unlike the live distribution part that progressed in real time, it is preferable to restrict the user actions that can be accepted. Specifically, in the live distribution part that progressed in real time, the consumption of valuable data by the user's input operations (in one example, throwing tips, payment through item purchases, and the like) could be accepted. In the progress of the ended live distribution part, on the other hand, the consumption of such valuable data may be restricted so as not to be accepted. More specifically, in the live distribution part that progressed in real time, a user interface (UI) including buttons and screens for executing the consumption of valuable data is assumed to be displayed on the display unit 352, and the user can execute the consumption of valuable data through input operations on such a UI. In the missed distribution, on the other hand, such a UI is preferably hidden so that the user cannot explicitly perform the input operation. As a result, in the look-back distribution and the missed distribution, the user 3 cannot throw in tipping items or the like to support the character.
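 The restrictions discussed above could be expressed as a per-mode feature gate, as in the following sketch; the enum and field names are assumptions:

```kotlin
enum class DeliveryMode { REALTIME_LIVE, LOOK_BACK, MISSED }

data class PlaybackFeatures(
    val reflectOwnClothingItems: Boolean, // composite the user's earned items onto the character
    val showValuableDataUi: Boolean,      // tipping / purchase UI visible
    val acceptValuableDataSpend: Boolean  // spending operations accepted at all
)

// Real-time lives allow everything; a look-back replays the user's own items
// but accepts no new spending; a missed distribution is restricted further.
fun featuresFor(mode: DeliveryMode): PlaybackFeatures = when (mode) {
    DeliveryMode.REALTIME_LIVE -> PlaybackFeatures(true, true, true)
    DeliveryMode.LOOK_BACK     -> PlaybackFeatures(true, false, false)
    DeliveryMode.MISSED        -> PlaybackFeatures(false, false, false)
}
```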
 Furthermore, in the look-back distribution and the missed distribution, as in the live distribution part and the like that progress in real time, the user can participate in the live distribution part and the like in a simulated manner. The live distribution part and the like include (but are not limited to) user-participation events as described in Embodiments 1 and 2, and the user is provided with an interactive experience with the character. Examples of user-participation events may include games as in Embodiments 1 and 2, questionnaires provided by the character, quizzes posed by the character, matches against the character (for example, a rock-paper-scissors game or a bingo game), and the like. Then, as in the real-time live distribution, the participation results of such user-participation events are fed back to the user in the missed distribution as well. For example, in the look-back distribution, if the user participates in and answers a four-choice quiz event posed by the character, the result of the correctness determination is fed back to the user. (However, if a user who did not participate in the live in real time answers a questionnaire, quiz, or the like in the missed distribution, or if a user gives an answer in the look-back distribution different from the one given during live participation, the content of those answers is not reflected; the program may instead automatically perform only a simple determination (such as a correctness determination) and feed the result back.) Also, in the look-back distribution, if the user gives an answer different from the one given during live participation, the answer may be compared with that user's answer during live participation, and a display such as "Your answer differs from the one you gave during the live" may be displayed and output on the user terminal.
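 A minimal sketch of the quiz feedback during a replay, assuming the answer given during the real-time live was recorded; the comparison message mirrors the example above, and all names are illustrative:

```kotlin
fun quizFeedback(correctChoice: Int, replayAnswer: Int, liveAnswer: Int?): List<String> {
    val messages = mutableListOf<String>()
    // A simple correctness determination is fed back even in a replay.
    messages += if (replayAnswer == correctChoice) "Correct!" else "Wrong answer."
    // In a look-back distribution, an answer that differs from the one given
    // during live participation can be pointed out to the user.
    if (liveAnswer != null && liveAnswer != replayAnswer) {
        messages += "Your answer differs from the one you gave during the live."
    }
    return messages
}
```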
 Also, in the missed distribution, unlike the live distribution part that progressed in real time, the user may be restricted from acquiring predetermined game points for the above feedback. Specifically, in the live distribution part that progressed in real time, as a result of the user playing a specific scenario, predetermined game points may be associated with the user and added to the user's held points. In the progress of the ended live distribution part, on the other hand, such points may not be associated with the user. Because the user's held points are not increased, in a game in which, for example, a plurality of users who are game players are ranked based on points, the ranking is not affected even if a user advances the ended live distribution part.
 After the end of the look-back distribution (step S305) or the missed distribution (step S306), the user terminal 100 may again request the progress of the ended live distribution part. That is, the look-back distribution or the missed distribution is preferably repeatable any number of times. In this case, the processing flow returns to step S301.
 According to the configuration and method described above, on the user terminal 100, even after the live distribution part has progressed in real time, the user can advance the live distribution part again in various modes. Through the experience of richly realistic interaction with the character, the user comes to feel more attached to the character, and can therefore play other parts that operate the character with even greater interest. As a result, this has the effect of heightening the sense of immersion in the game world and improving the interest of the game.
 <Modification 1>
 In Embodiment 3 above, whether the progress of an ended live distribution part becomes a look-back distribution or a missed distribution is determined based on whether or not the user has a record of having advanced the live distribution part in real time (step S303 in FIG. 13). In contrast, in Modification 1 of the present embodiment, the configuration may allow the user to choose between the look-back distribution and the missed distribution. Alternatively, the configuration may provide only the missed distribution to the user, regardless of whether or not such a record exists.
 <Modification 2>
 In Embodiment 3 above, after the end of the look-back distribution (step S305 in FIG. 13) or the missed distribution (step S306 in FIG. 13), the progress of the ended live distribution part may be requested again. That is, the look-back distribution or the missed distribution could be executed repeatedly any number of times. In the present Modification 2, it is preferable that the second and subsequent look-back or missed distributions reflect the record of the previous look-back or missed distribution.
 When a look-back distribution or missed distribution is performed for the first time, the first-time distribution history data may be stored in the storage unit 220 of the server 200 or the storage unit 320 of the action instruction device 300. Thereafter, when the recorded action instruction data relating to the ended live distribution part is requested again from the user terminal 100, the first-time distribution history data is delivered from the server 200 or the action instruction device 300 together with the recorded action instruction data. The user terminal 100 refers to the received first-time distribution history data, and if the first look-back or missed distribution was completed only partway, the user terminal 100 resumes the progress of the second look-back or missed distribution from that point. This allows the user to execute the look-back or missed distribution efficiently.
 Note that if the first distribution was a look-back distribution, the look-back distribution is preferably executed from the second time onward as well, and if the first was a missed distribution, the missed distribution is preferably executed from the second time onward as well. Further, when the recorded action instruction data already exists on the user terminal 100, the user terminal 100 may refrain from receiving the recorded action instruction data again. This saves the amount of data the user terminal 100 receives.
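 Modification 2 could be sketched as follows: the second and subsequent replays keep the mode of the first, resume from where the previous one stopped, and skip re-downloading cached instruction data. All names are assumptions:

```kotlin
import java.time.Duration

// Distribution history data recorded for the previous replay.
data class DeliveryHistory(val mode: String, val stoppedAt: Duration?) // e.g. "look-back", 12:30

data class ReplayPlan(val mode: String, val startAt: Duration, val download: Boolean)

fun planReplay(previous: DeliveryHistory?, hasLocalInstructionData: Boolean): ReplayPlan =
    ReplayPlan(
        mode = previous?.mode ?: "missed",              // keep the same mode as the first replay
        startAt = previous?.stoppedAt ?: Duration.ZERO, // resume from the interruption point
        download = !hasLocalInstructionData             // skip re-receiving cached data
    )
```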
 <Modification 3>
 In Embodiment 3, whether the progress of an ended live distribution part becomes a look-back distribution or a missed distribution is determined according to whether the user has a record of having advanced the live distribution part in real time (step S303 in FIG. 13). In the present Modification 3, when it is determined that the user had advanced the live distribution part in real time only partway, it is preferable to resume the progress of the ended live distribution part from that point. A record of how far the user advanced the live distribution part in real time can be determined from the user action history information described above with reference to FIG. 11. That is, the user action history information may record up to what time the user advanced a specific live distribution part. Although not limited to this, the resumption of the ended live distribution part is preferably a missed distribution, which is a restricted progress. This allows the user to execute the missed distribution efficiently.
 <Example display screens of the user terminal 100>
 FIG. 14 shows examples of screens displayed on the display unit 152 of the user terminal 100 based on the game program according to the present embodiment, and examples of transitions between these screens. The screen examples include a home screen 800A, a live selection screen 800B for live distribution, and a missed-distribution selection screen 800C. In the transition example, the home screen 800A can transition to the live selection screen 800B. The live selection screen 800B can transition to the home screen 800A and to the missed-distribution selection screen 800C. Likewise, the missed-distribution selection screen 800C can transition to the live selection screen 800B. Note that the actual distribution screen (not shown) is transitioned to from the live selection screen 800B and the missed-distribution selection screen 800C.
 (Home screen)
 The home screen 800A displays, on the display unit 152 of the user terminal 100, various menus for advancing the live distribution part. When the game progress unit 115 accepts an input operation to start the game, it first displays the home screen 800A. Specifically, the home screen 800A includes a "Live" icon 802 for transitioning to the live selection screen 800B. Upon accepting an input operation on the "Live" icon 802 on the home screen 800A, the game progress unit 115 causes the display unit 152 to display the live selection screen 800B.
 (Live selection screen)
 The live selection screen 800B presents distributable live information to the user. In particular, it displays a list of announcement information about one or more lives, for notifying the user in advance of the live distribution time and the like. The live announcement information includes at least the live distribution date and time. The live announcement information may further include free/paid information about the live, an advertisement image including an image of a character appearing in the live, and the like. The live selection screen 800B may also display, in a pop-up screen 806 on the live selection screen, announcement information about the live distribution scheduled in the nearest future.
 When the live distribution time arrives, the server 200 searches for one or more user terminals 100 that have the right to receive the live distribution. The right to receive the live distribution may be conditioned on, for example, having paid the consideration for receiving the live distribution (for example, holding a ticket). The corresponding live announcement information is displayed on the user terminals 100 that have the right to receive the live distribution.
 The user terminal 100 accepts a live playback operation, for example, a selection operation on a live whose distribution time has arrived on the live selection screen 800B (more specifically, a touch operation on the live's image). In response, the game progress unit 115 transitions the display unit 152 to the actual distribution screen (not shown). This allows the user terminal 100 to advance the live distribution part and carry out the live viewing processing in real time. When the live viewing processing is executed, the game progress unit 115 makes the character move in the live distribution part based on the received action instruction data. The game progress unit 115 generates a video playback screen including the character moving based on the action instruction data in the live distribution part, and causes the display unit 152 to display it.
 The live selection screen 800B may also display, on the display unit 152, a "Back (×)" icon 808 for transitioning to the screen displayed immediately before, and a "Missed distribution" icon 810 for transitioning to the missed-distribution selection screen 800C. Here, in response to an input operation on the "Back (×)" icon 808 on the live selection screen 800B, the game progress unit 115 transitions the screen 800B to the home screen 800A. On the other hand, in response to an input operation on the missed-distribution icon 810 on the live selection screen 800B, the game progress unit 115 transitions the screen 800B to the missed-distribution selection screen 800C.
 (Missed-distribution selection screen)
 The missed-distribution selection screen 800C displays, among the distributed information about one or more lives distributed in the past, in particular the distributed information for which the user has no record of having advanced the live distribution part in real time. When the operation unit of the user terminal 100 accepts an input operation on the distributed live information displayed on the missed-distribution selection screen 800C, for example on an image 830 including a character who appeared in the live, the game progress unit 115 can advance the ended live distribution part again after the live distribution part has ended.
 As shown in the example of the missed-distribution selection screen 800C, the distributed information about lives may further include the playback time 812 of each distributed live, the period (number of days, etc.) 814 remaining until the end of distribution, information 816 indicating how many days ago the live was distributed counting from the present, past distribution dates and times, and the like. The missed-distribution selection screen 800C further includes a "Back (<)" icon 818 for transitioning to the live selection screen 800B. In response to an input operation on the "Back (<)" icon 818, the game progress unit 115 transitions to the live selection screen 800B.
 In the present embodiment, although not limited to this, the missed-distribution selection screen 800C is preferably reachable only from the live selection screen 800B and not directly from the home screen 800A. The missed distribution is provided for users who missed the live distribution and is merely a function accompanying the live distribution function. Moreover, one of the purposes of this game is to heighten the interest of the game by having users watch the real-time live distribution, cheer the character on in real time, and deepen their interaction with the character. For this reason, in order to guide users to watch the live distribution in real time rather than the missed distribution, in which real-time interaction with the character (operator) is not possible, it is preferable here that the home screen 800A cannot transition directly to the missed-distribution selection screen 800C.
 Note that the missed-distribution selection screen 800C is configured to display distributed information for which the user has no record of having advanced the live distribution part in real time. Instead, distributed information about all lives distributed in the past may be displayed as a list, one entry per live. In this case, either the look-back distribution or the missed distribution is preferably executed depending on whether the user has a record of having advanced the live distribution part in real time. Specifically, when it is determined that the user has such a record, the result is the look-back distribution described above. On the other hand, when it is determined that the user has no such record, the result is the missed distribution. As described above with reference to FIG. 13, the look-back distribution and the missed distribution can provide different user experiences.
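 The screen transitions of FIG. 14 amount to a small allow-list; the following sketch makes the deliberate absence of a direct home-to-missed transition explicit. The enum and function names are assumptions:

```kotlin
enum class Screen { HOME, LIVE_SELECTION, MISSED_SELECTION, DISTRIBUTION }

// Allowed transitions per FIG. 14. Note that HOME cannot reach
// MISSED_SELECTION directly, which guides users toward real-time lives.
val allowedTransitions: Map<Screen, Set<Screen>> = mapOf(
    Screen.HOME to setOf(Screen.LIVE_SELECTION),
    Screen.LIVE_SELECTION to setOf(Screen.HOME, Screen.MISSED_SELECTION, Screen.DISTRIBUTION),
    Screen.MISSED_SELECTION to setOf(Screen.LIVE_SELECTION, Screen.DISTRIBUTION)
)

fun canTransition(from: Screen, to: Screen): Boolean =
    allowedTransitions[from]?.contains(to) ?: false
```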
 [Example of implementation by software]
 The control blocks of the control unit 110 (in particular, the operation acceptance unit 111, display control unit 112, UI control unit 113, animation generation unit 114, game progress unit 115, analysis unit 116, and progress information generation unit 117), the control blocks of the control unit 210 (in particular, the progress support unit 211 and sharing support unit 212), and the control blocks of the control unit 310 (in particular, the operation acceptance unit 311, display control unit 312, UI control unit 313, animation generation unit 314, progress simulation unit 315, character control unit 316, and reaction processing unit 317) may be realized by logic circuits (hardware) formed on an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
 In the latter case, the control unit 110, the control unit 210, the control unit 310, or an information processing device including a plurality of these comprises a CPU that executes the instructions of a program, which is software realizing each function; a ROM (Read Only Memory) or storage device (referred to as a "recording medium") on which the program and various data are recorded so as to be readable by a computer (or CPU); a RAM (Random Access Memory) into which the program is loaded; and the like. The object of the present invention is achieved by the computer (or CPU) reading the program from the recording medium and executing it. As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used. The program may also be supplied to the computer via any transmission medium (a communication network, broadcast waves, or the like) capable of transmitting the program. Note that one aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
 The present invention is not limited to the embodiments described above; various modifications are possible within the scope indicated by the claims, and embodiments obtained by appropriately combining the technical means disclosed in different embodiments are also included in the technical scope of the present invention.
 [Supplementary notes]
 The contents relating to one aspect of the present invention are listed below.
 (Item 1) A method is described. According to one aspect of the present disclosure, the method is a method executed by a computer serving as a user terminal comprising a processor, a memory, a display unit, and an operation unit, wherein the user terminal is configured to be able to communicate via a network with an external device that exists in a space physically separated from the space in which the user terminal exists, and that controls at least one character appearing in a game in response to input from a performer who is a person different from the user of the user terminal and who plays the character in a space the user cannot see. The method includes, by the processor: a step of advancing the game in response to the user's input operations entered into the computer via the operation unit; a step of sequentially transmitting to the external device, as information indicating the progress of the game, progress information for enabling display of a game screen of the game in progress, the game screen including the screen displayed on the display unit or a simulated screen that simplifies that screen; a step of receiving from the external device audio data of speech acquired by the external device, which displays the game screen of the game in progress in real time based on the sequentially transmitted progress information so that the performer can view it, the speech being uttered by the performer at arbitrary timing toward the game screen displayed in real time and not directly audible to the user; a step of, triggered by the reception of the audio data, making the character appearing in the ongoing game move by having the character speak at least the content of the audio data; and a step of, after the game ends, making the character move again based on the audio data received again from outside in response to a request to view the game that was advanced.
 (Item 2) In (Item 1), the receiving step receives, from the external device, motion data input by the performer playing the character on the external device together with the audio data; the step of making the character move moves the character in accordance with the motion data in time with having the character speak the content of the audio data; and the step of making the character move again makes the character move again based on the motion data in addition to the audio data.
 (Item 3) In (Item 1) or (Item 2), the step of making the character move again is executed based on a record of actions made by the user's input operations.
 (Item 4) In (Item 3), the record of actions includes time information, and the step of making the character move again follows a designation of the time information made by the user's input operation via the operation unit while the game was in progress.
 (Item 5) In (Item 3) or (Item 4), the actions include the selection of a specific progress portion by the user's input operation via the operation unit while the game was in progress, and in the step of making the character move again, only the selected specific progress portion is advanced.
 (Item 6) In any one of (Item 3) to (Item 5), the record of actions includes the consumption of valuable data by the user's input operations, and during execution of the step of making the character move again, the character's manner of movement is determined based on the consumption of the valuable data.
 (Item 7) A method is described. According to one aspect of the present disclosure, the method is a method executed by a first user terminal of a first user other than at least one second user who watched, in real time, a real-time distribution of a game performed by a performer, the first user terminal comprising a first processor and a first memory. The method includes, by the first processor: a step of receiving from outside, after the end of the real-time distribution of the game, recorded audio data relating to the game, the audio data being for controlling at least one character appearing in the game; and a step of executing playback of the distributed game by making the character move based on the audio data. The distributed game is a game generated by processing executed by a computer serving as a second user terminal of the second user comprising a second processor, a second memory, a display unit, and an operation unit, wherein the second user terminal is configured to be able to communicate via a network with an external device that exists in a space physically separated from the space in which the second user terminal exists and that controls the character in response to input from a performer who is a person different from the second user of the second user terminal and who plays the character in a space the second user cannot see, and wherein the second processor executes: advancing the game in response to the second user's input operations entered into the computer via the operation unit; sequentially transmitting to the external device, as information indicating the progress of the game, progress information for enabling display of a game screen of the game in progress, the game screen including the screen displayed on the display unit or a simulated screen that simplifies that screen; receiving from the external device audio data of speech acquired by the external device, which displays the game screen of the game in progress in real time based on the sequentially transmitted progress information so that the performer can view it, the speech being uttered by the performer at arbitrary timing toward the game screen displayed in real time and not directly audible to the second user; and, triggered by the reception of the audio data, making the character appearing in the ongoing game move by having the character speak at least the content of the audio data.
 (Item 8) In (Item 7), the step, executed by the computer, of receiving the audio data receives, from the external device, motion data input by the performer playing the character on the external device together with the audio data; the step, executed by the computer, of making the character move moves the character in accordance with the motion data in time with having the character speak the content of the audio data; and the step, executed by the first user terminal, of executing playback of the distributed game executes playback of the distributed game by making the character move based on the motion data in addition to the audio data.
 (Item 9) In (Item 7) or (Item 8), the distributed game is advanced with restrictions compared with the case where the first user advances the game in real time.
 (Item 10) In any one of (Item 7) to (Item 9), the first user terminal further comprises a display unit, and the display unit is configured to be able to display a first screen that displays a menu relating to real-time distribution, a second screen that is transitioned to from the first screen and displays viewable real-time distributions, and a third screen for displaying information about distributed games, the third screen being configured to be transitioned to from the second screen and not from the first screen.
 (Item 11) In (Item 10), the method further includes a step of transitioning from the second screen to the third screen upon accepting an input operation by the first user on the second screen.
 (Item 12) A computer-readable medium is described. According to one aspect of the present disclosure, the computer-readable medium is a computer-readable medium storing computer-executable instructions that, when executed, cause a processor to execute the steps included in the method of any one of (Item 1) to (Item 6).
 (Item 13) A computer-readable medium is described. According to one aspect of the present disclosure, the computer-readable medium is a computer-readable medium storing computer-executable instructions that, when executed, cause a first processor to execute the steps included in the method of any one of (Item 7) to (Item 11).
 (Item 14) An information processing device is described. According to one aspect of the present disclosure, the information processing device is an information processing device serving as a user terminal comprising a processor, a memory, a display unit, and an operation unit, wherein the user terminal is configured to be able to communicate via a network with an external device that exists in a space physically separated from the space in which the user terminal exists and that controls at least one character appearing in a game in response to input from a performer who is a person different from the user of the user terminal and who plays the character in a space the user cannot see. The processor is configured, by reading a program stored in the memory, to execute: advancing the game in response to the user's input operations entered into the computer via the operation unit; sequentially transmitting to the external device, as information indicating the progress of the game, progress information for enabling display of a game screen of the game in progress, the game screen including the screen displayed on the display unit or a simulated screen that simplifies that screen; receiving from the external device audio data of speech acquired by the external device, which displays the game screen of the game in progress in real time based on the sequentially transmitted progress information so that the performer can view it, the speech being uttered by the performer at arbitrary timing toward the game screen displayed in real time and not directly audible to the user; triggered by the reception of the audio data, making the character appearing in the ongoing game move by having the character speak at least the content of the audio data; and, after the game ends, making the character move again based on the audio data received again from outside in response to a request to view the game that was advanced.
 (Item 15) An information processing device has been described. According to an aspect of the present disclosure, the information processing device is a first user terminal of a first user, comprising a first processor and a first memory, the first user being someone other than the at least one second user who watched, in real time, a real-time distribution of a game performed by a performer. By reading a program stored in the first memory, the first processor is configured to: after the real-time distribution of the game has ended, receive from outside recorded voice data relating to the game, the voice data being for controlling at least one character appearing in the game; and replay the distributed game by operating the character based on that voice data. The distributed game is a game generated by processing executed by a computer serving as a second user terminal of the second user, the second user terminal comprising a second processor, a second memory, a display unit, and an operation unit, and being configured to communicate via a network with an external device that is located in a space physically separate from the space in which the second user terminal is located and that controls the character in response to input from the performer, the performer being a person other than the second user who plays the character in a space the second user cannot see. By reading a program stored in the second memory, the second processor: advances the game in response to the second user's input operations entered through the operation unit; sequentially transmits to the external device, as information indicating the progress of the game, progress information enabling display of a game screen of the game in progress, including the screen shown on the display unit or a simulated screen that simplifies that screen; receives from the external device voice data acquired by the external device, which displays the game screen in real time, based on the sequentially transmitted progress information, so that the performer can view it, the voice data being speech uttered by the performer at arbitrary timing in response to that screen and not directly audible to the second user; and, triggered by receipt of the voice data, operates the character appearing in the game in progress by causing it to speak at least the content of the voice data.
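The replay path in Item 15 is driven entirely by the recorded voice data fetched after the distribution ends. The following is a minimal sketch of that path, assuming the server exposes per-game voice records as a time-ordered list; the record fields and the fetch function are assumptions made for illustration, not the disclosure's actual interfaces:

```python
import time
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class VoiceRecord:
    offset_s: float   # seconds from the start of the original session
    payload: str      # stand-in for encoded audio


def fetch_recorded_voice(game_id: str) -> List[VoiceRecord]:
    # Stub: a real client would request this from the server after the
    # real-time distribution of the game has ended.
    return [VoiceRecord(0.5, "hello"), VoiceRecord(2.0, "well played!")]


def replay_distributed_game(game_id: str, speak: Callable[[str], None]) -> None:
    """Drive the character from recorded voice data alone: each record
    re-triggers the same utterance it triggered live."""
    start = time.monotonic()
    for rec in fetch_recorded_voice(game_id):
        delay = rec.offset_s - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)   # keep the original pacing
        speak(rec.payload)      # the character speaks the recorded content


replay_distributed_game("game-001", speak=lambda s: print("character:", s))
```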
 (Item 16) A method has been described. According to an aspect of the present disclosure, the method is a method for controlling a character appearing in a game, executed by a computer that comprises a processor, a memory, and a display unit and that controls at least one character appearing in the game in response to input from a performer who plays the character. The computer is located in a space that is physically separate from the space in which the user terminal of a user other than the performer is located and in which the user cannot see the performer, and is capable of communicating with the user terminal via a network. The method comprises, by the processor: displaying in real time, on the display unit of the computer so that the performer can view it, a game screen of the game in progress, including the screen shown on the display unit of the user terminal or a simulated screen that simplifies that screen, based on progress information sequentially transmitted from the user terminal advancing the game; accepting speech uttered by the performer at arbitrary timing in response to the game screen displayed in real time, the speech not being directly audible to the user; transmitting voice data of the accepted speech to the user terminal; and, after the game has ended, transmitting the voice data to the user terminal in response to a request from the user terminal so that the character speaks the content of the voice data there just as it did during the completed game, or transmitting the voice data to a terminal other than the user terminal in response to a request from that terminal so that the character speaks the content of the voice data there in the same way.
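A minimal sketch of the performer-side computer in Item 16 follows; queues stand in for the network, rendering and audio capture are stubbed, and every name is an assumption made for illustration:

```python
import queue
from typing import List

progress_in: "queue.Queue[dict]" = queue.Queue()   # from the user terminal
voice_out: "queue.Queue[bytes]" = queue.Queue()    # to the user terminal
voice_log: List[bytes] = []                        # kept for post-game requests


def render_mock_screen(progress: dict) -> None:
    # Real-time display of the (possibly simplified) game screen so the
    # performer can react to the user's play.
    print(f"[screen] scene={progress.get('scene')} hp={progress.get('hp')}")


def on_performer_speech(audio: bytes) -> None:
    # The performer may speak at any time; the audio is never played to
    # the user directly but is shipped to the terminal as voice data.
    voice_log.append(audio)
    voice_out.put(audio)


def serve_recorded_voice(requesting_terminal_id: str) -> List[bytes]:
    # After the game ends, the same voice data is sent again on request,
    # whether from the original user terminal or from another terminal.
    return list(voice_log)


# Drain whatever progress info has arrived and mirror it, then speak.
progress_in.put({"scene": "cafe", "hp": 3})
while not progress_in.empty():
    render_mock_screen(progress_in.get())
on_performer_speech(b"nice move!")
```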
 (Item 17) A computer-readable medium has been described. According to an aspect of the present disclosure, the computer-readable medium stores computer-executable instructions that, when executed, cause a processor to perform the steps included in the method above.
 (Item 18) An information processing device has been described. According to an aspect of the present disclosure, the information processing device is a device for controlling a character appearing in a game. The information processing device comprises a processor, a memory, and a display unit; is a computer that controls at least one character appearing in the game in response to input from a performer who plays the character; is located in a space that is physically separate from the space in which the user terminal of a user other than the performer is located and in which the user cannot see the performer; and is configured to communicate with the user terminal via a network. By reading a program stored in the memory, the processor is configured to: display in real time, on the display unit of the computer so that the performer can view it, a game screen of the game in progress, including the screen shown on the display unit of the user terminal or a simulated screen that simplifies that screen, based on progress information sequentially transmitted from the user terminal advancing the game; accept speech uttered by the performer at arbitrary timing in response to the game screen displayed in real time, the speech not being directly audible to the user; transmit voice data of the accepted speech to the user terminal; and, after the game has ended, transmit the voice data to the user terminal in response to a request from the user terminal, or to a terminal other than the user terminal in response to a request from that terminal, so that the character speaks the content of the voice data there just as it did during the completed game.
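The final clause of Item 18 branches on who asks for the recorded voice data after the game ends. A small sketch of that delivery branch follows; the class, fields, and terminal identifiers are illustrative assumptions, and the note about restrictions for non-owner terminals is a design possibility suggested by claim 9, not something this item specifies:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class VoiceArchive:
    owner_terminal: str
    records: List[bytes] = field(default_factory=list)

    def handle_request(self, requester: str) -> List[bytes]:
        # The same voice data is returned either way, so the character
        # can speak the same lines on either terminal; a real server
        # might attach viewing restrictions for non-owner terminals.
        kind = ("owner replay" if requester == self.owner_terminal
                else "third-party viewing")
        print(f"serving {len(self.records)} voice records for {kind}")
        return list(self.records)


archive = VoiceArchive("terminal-A", [b"line1", b"line2"])
archive.handle_request("terminal-A")   # request from the user terminal
archive.handle_request("terminal-B")   # request from another terminal
```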
1 game system, 2 network, 10, 20, 30 processor, 11, 21, 31 memory, 12, 22, 32 storage, 13, 23, 33 communication IF (operation unit), 14, 24, 34 input/output IF (operation unit, display unit), 15, 35 touch screen (display unit, operation unit), 17 camera (operation unit), 18 distance measurement sensor (operation unit), 100 user terminal (computer, information processing device), 110, 113, 210, 310 control unit, 111, 311 operation reception unit, 112, 312 display control unit, 113, 313 UI control unit, 114, 314 animation generation unit, 115 game progress unit, 116 analysis unit, 117 progress information generation unit, 120, 220, 320 storage unit, 131 game program, 132 game information, 133 user information, 134 character control program, 151, 351 input unit (operation unit), 152, 352 display unit, 200 server (computer), 211 progress support unit, 212 sharing support unit, 300 operation instruction device (computer, NPC control device, character control device, information processing device), 315 progress simulation unit, 316 character control unit, 317 reaction processing unit, 1010 object, 1020, 3030 controller, 1030 storage medium, 3010 microphone, 3020 motion capture device

Claims (18)

  1.  A method executed by a computer serving as a user terminal comprising a processor, a memory, a display unit, and an operation unit,
     wherein the user terminal is configured to communicate via a network with an external device that is located in a space physically separate from the space in which the user terminal is located and that controls at least one character appearing in a game in response to input from a performer, the performer being a person other than the user of the user terminal who plays the character in a space the user cannot see,
     the method comprising, by the processor:
      advancing the game in response to the user's input operations entered into the computer via the operation unit;
      sequentially transmitting to the external device, as information indicating the progress of the game, progress information enabling display of a game screen of the game in progress, the game screen including the screen shown on the display unit or a simulated screen that simplifies the screen;
      receiving from the external device voice data acquired by the external device, which displays the game screen of the game in progress in real time, based on the sequentially transmitted progress information, so that the performer can view it, the voice data being speech uttered by the performer at arbitrary timing in response to the game screen displayed in real time and not directly audible to the user;
      triggered by receipt of the voice data, operating the character appearing in the game in progress by causing the character to speak at least the content of the voice data; and
      after the game has ended, operating the character again based on the voice data received anew from outside in response to a request to view the game that was played.
  2.  The method according to claim 1, wherein:
     the receiving step receives, from the external device together with the voice data, motion data input by the performer playing the character at the external device;
     the step of operating the character moves the character according to the motion data in time with the utterance of the content of the voice data; and
     the step of operating the character again operates the character again based on the motion data in addition to the voice data.
  3.  The method according to claim 1 or 2, wherein the step of operating the character again is executed based on a record of actions taken through the user's input operations.
  4.  The method according to claim 3, wherein the record of actions includes time information, and the step of operating the character again follows a designation of time information made by the user's input operation via the operation unit while the game was in progress.
  5.  The method according to claim 3 or 4, wherein the actions include selection of a specific progress portion by the user's input operation via the operation unit while the game was in progress, and wherein, in the step of operating the character again, only the selected specific progress portion is played.
  6.  The method according to any one of claims 3 to 5, wherein the record of actions includes consumption of valuable data through the user's input operations, and wherein, during execution of the step of operating the character again, the manner in which the character operates is determined based on the consumption of the valuable data.
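Claims 3 to 6 together make the replay a function of the recorded user actions: time information, the selection of a specific progress portion, and valuable-data consumption. The sketch below shows one way a replay routine could honor all three; the field names and the "enhanced/plain" styling are illustrative assumptions, not the disclosure's terms:

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class ActionRecord:
    start_s: float                 # time information (claim 4)
    selected_part: Optional[str]   # specific progress portion (claim 5)
    currency_spent: int            # valuable-data consumption (claim 6)


@dataclass
class VoiceEvent:
    offset_s: float
    part: str
    line: str


def replay(events: List[VoiceEvent], record: ActionRecord) -> None:
    for ev in events:
        if ev.offset_s < record.start_s:
            continue                    # honor the designated time information
        if record.selected_part and ev.part != record.selected_part:
            continue                    # play only the selected progress portion
        style = "enhanced" if record.currency_spent > 0 else "plain"
        print(f"[{style}] {ev.part} @ {ev.offset_s:.1f}s: {ev.line}")


replay(
    [VoiceEvent(1.0, "intro", "welcome back"), VoiceEvent(5.0, "battle", "go!")],
    ActionRecord(start_s=2.0, selected_part="battle", currency_spent=10),
)
```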
  7.  A method executed by a first user terminal of a first user, the first user terminal comprising a first processor and a first memory, the first user being someone other than the at least one second user who watched, in real time, a real-time distribution of a game performed by a performer, the method comprising, by the first processor:
     after the real-time distribution of the game has ended, receiving from outside recorded voice data relating to the game, the voice data being for controlling at least one character appearing in the game; and
     replaying the distributed game by operating the character based on the voice data,
     wherein the distributed game is a game generated by processing executed by a computer serving as a second user terminal of the second user, the second user terminal comprising a second processor, a second memory, a display unit, and an operation unit,
     the second user terminal being configured to communicate via a network with an external device that is located in a space physically separate from the space in which the second user terminal is located and that controls the character in response to input from the performer, the performer being a person other than the second user who plays the character in a space the second user cannot see, and
     the processing comprising, by the second processor:
      advancing the game in response to the second user's input operations entered into the computer via the operation unit;
      sequentially transmitting to the external device, as information indicating the progress of the game, progress information enabling display of a game screen of the game in progress, the game screen including the screen shown on the display unit or a simulated screen that simplifies the screen;
      receiving from the external device voice data acquired by the external device, which displays the game screen of the game in progress in real time, based on the sequentially transmitted progress information, so that the performer can view it, the voice data being speech uttered by the performer at arbitrary timing in response to the game screen displayed in real time and not directly audible to the second user; and
      triggered by receipt of the voice data, operating the character appearing in the game in progress by causing the character to speak at least the content of the voice data.
  8.  The method according to claim 7, wherein:
     the step of receiving the voice data, executed by the computer, receives, from the external device together with the voice data, motion data input by the performer playing the character at the external device;
     the step of operating the character, executed by the computer, moves the character according to the motion data in time with the utterance of the content of the voice data; and
     the step of replaying the distributed game, executed by the first user terminal, replays the distributed game by operating the character based on the motion data in addition to the voice data.
  9.  The method according to claim 7 or 8, wherein the distributed game is played back with restrictions compared with the case in which the first user advances the game in real time.
  10.  The method according to any one of claims 7 to 9, wherein the first user terminal further comprises a display unit configured to display a first screen that displays a menu relating to real-time distribution, a second screen, reached from the first screen, that displays a viewable real-time distribution, and a third screen for displaying information about the distributed game, and wherein the third screen is configured to be reached from the second screen and not from the first screen.
  11.  The method according to claim 10, further comprising transitioning from the second screen to the third screen upon accepting an input operation by the first user on the second screen.
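Claims 10 and 11 define a small screen graph: the third screen is reachable from the second but not from the first. An allow-list state machine is one straightforward way to enforce that; the sketch below is illustrative, with screen names and the transition helper invented for the example:

```python
# Allowed transitions between the three screens of claims 10-11.
ALLOWED = {
    "first": {"second"},            # menu -> viewable real-time distribution
    "second": {"first", "third"},   # live view -> back, or distributed-game info
    "third": {"second"},            # game info -> back to the live view
}


def transition(current: str, target: str) -> str:
    if target not in ALLOWED.get(current, set()):
        raise ValueError(f"no transition {current} -> {target}")
    return target


screen = "first"
screen = transition(screen, "second")   # user opens a distribution
screen = transition(screen, "third")    # claim 11: input on the second screen
# transition("first", "third") would raise: not reachable from the menu.
```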
  12.  A computer-readable medium storing computer-executable instructions that, when executed, cause the processor to perform the steps included in the method according to any one of claims 1 to 6.
  13.  A computer-readable medium storing computer-executable instructions that, when executed, cause the first processor to perform the steps included in the method according to any one of claims 7 to 11.
  14.  An information processing device serving as a user terminal comprising a processor, a memory, a display unit, and an operation unit,
     wherein the user terminal is configured to communicate via a network with an external device that is located in a space physically separate from the space in which the user terminal is located and that controls at least one character appearing in a game in response to input from a performer, the performer being a person other than the user of the user terminal who plays the character in a space the user cannot see, and
     wherein the processor, by reading a program stored in the memory, is configured to:
      advance the game in response to the user's input operations entered into the computer via the operation unit;
      sequentially transmit to the external device, as information indicating the progress of the game, progress information enabling display of a game screen of the game in progress, the game screen including the screen shown on the display unit or a simulated screen that simplifies the screen;
      receive from the external device voice data acquired by the external device, which displays the game screen of the game in progress in real time, based on the sequentially transmitted progress information, so that the performer can view it, the voice data being speech uttered by the performer at arbitrary timing in response to the game screen displayed in real time and not directly audible to the user;
      triggered by receipt of the voice data, operate the character appearing in the game in progress by causing the character to speak at least the content of the voice data; and
      after the game has ended, operate the character again based on the voice data received anew from outside in response to a request to view the game that was played.
  15.  An information processing device serving as a first user terminal of a first user, comprising a first processor and a first memory, the first user being someone other than the at least one second user who watched, in real time, a real-time distribution of a game performed by a performer,
     wherein the first processor, by reading a program stored in the first memory, is configured to:
      after the real-time distribution of the game has ended, receive from outside recorded voice data relating to the game, the voice data being for controlling at least one character appearing in the game; and
      replay the distributed game by operating the character based on the voice data,
     wherein the distributed game is a game generated by processing executed by a computer serving as a second user terminal of the second user, the second user terminal comprising a second processor, a second memory, a display unit, and an operation unit,
     the second user terminal being configured to communicate via a network with an external device that is located in a space physically separate from the space in which the second user terminal is located and that controls the character in response to input from the performer, the performer being a person other than the second user who plays the character in a space the second user cannot see, and
     the processing comprising, by the second processor reading a program stored in the second memory:
      advancing the game in response to the second user's input operations entered into the computer via the operation unit;
      sequentially transmitting to the external device, as information indicating the progress of the game, progress information enabling display of a game screen of the game in progress, the game screen including the screen shown on the display unit or a simulated screen that simplifies the screen;
      receiving from the external device voice data acquired by the external device, which displays the game screen of the game in progress in real time, based on the sequentially transmitted progress information, so that the performer can view it, the voice data being speech uttered by the performer at arbitrary timing in response to the game screen displayed in real time and not directly audible to the second user; and
      triggered by receipt of the voice data, operating the character appearing in the game in progress by causing the character to speak at least the content of the voice data.
  16.  A method for controlling a character appearing in a game,
     the method being executed by a computer that comprises a processor, a memory, and a display unit and that controls at least one character appearing in the game in response to input from a performer who plays the character, the computer being located in a space that is physically separate from the space in which the user terminal of a user other than the performer is located and in which the user cannot see the performer, and being capable of communicating with the user terminal via a network,
     the method comprising, by the processor:
      displaying in real time, on the display unit of the computer so that the performer can view it, a game screen of the game in progress, the game screen including the screen shown on the display unit of the user terminal or a simulated screen that simplifies the screen, based on progress information enabling display of the game screen of the game in progress that is sequentially transmitted from the user terminal advancing the game;
      accepting speech uttered by the performer at arbitrary timing in response to the game screen displayed in real time, the speech not being directly audible to the user;
      transmitting voice data of the accepted speech to the user terminal; and
      after the game has ended, transmitting the voice data to the user terminal in response to a request from the user terminal so as to cause the character to speak the content of the voice data on the user terminal just as during the completed game, or transmitting the voice data to a terminal other than the user terminal in response to a request from that terminal so as to cause the character to speak the content of the voice data on that terminal just as during the completed game.
  17.  A computer-readable medium storing computer-executable instructions that, when executed, cause the processor to perform the steps included in the method according to claim 16.
  18.  An information processing device for controlling a character appearing in a game,
     wherein the information processing device comprises a processor, a memory, and a display unit; is a computer that controls at least one character appearing in the game in response to input from a performer who plays the character; is located in a space that is physically separate from the space in which the user terminal of a user other than the performer is located and in which the user cannot see the performer; and is configured to communicate with the user terminal via a network, and
     wherein the processor, by reading a program stored in the memory, is configured to:
      display in real time, on the display unit of the computer so that the performer can view it, a game screen of the game in progress, the game screen including the screen shown on the display unit of the user terminal or a simulated screen that simplifies the screen, based on progress information enabling display of the game screen of the game in progress that is sequentially transmitted from the user terminal advancing the game;
      accept speech uttered by the performer at arbitrary timing in response to the game screen displayed in real time, the speech not being directly audible to the user;
      transmit voice data of the accepted speech to the user terminal; and
      after the game has ended, transmit the voice data to the user terminal in response to a request from the user terminal so as to cause the character to speak the content of the voice data on the user terminal just as during the completed game, or transmit the voice data to a terminal other than the user terminal in response to a request from that terminal so as to cause the character to speak the content of the voice data on that terminal just as during the completed game.
PCT/JP2020/048107 2020-12-23 2020-12-23 Method, computer-readable medium, and information processing device WO2022137376A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2022570845A JPWO2022137376A1 (en) 2020-12-23 2020-12-23
PCT/JP2020/048107 WO2022137376A1 (en) 2020-12-23 2020-12-23 Method, computer-readable medium, and information processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/048107 WO2022137376A1 (en) 2020-12-23 2020-12-23 Method, computer-readable medium, and information processing device

Publications (1)

Publication Number Publication Date
WO2022137376A1

Family

ID=82158685

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/048107 WO2022137376A1 (en) 2020-12-23 2020-12-23 Method, computer-readable medium, and information processing device

Country Status (2)

Country Link
JP (1) JPWO2022137376A1 (en)
WO (1) WO2022137376A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014131737A (en) * 2012-12-26 2014-07-17 Sony Computer Entertainment America Llc Systems and methods for ranking of cloud executed mini-games based on tag content and social network content
JP6525091B1 (en) * 2018-06-12 2019-06-05 株式会社セガゲームス Matching system
JP2019205645A (en) * 2018-05-29 2019-12-05 株式会社コロプラ Game program, character control program, method, and information processing device

Also Published As

Publication number Publication date
JPWO2022137376A1 (en) 2022-06-30

Similar Documents

Publication Publication Date Title
JP2020044136A (en) Viewing program, distribution program, method for executing viewing program, method for executing distribution program, information processing device, and information processing system
JP7170077B2 (en) program
JP6796115B2 (en) Game programs, game methods, and information processing equipment
JP7349348B2 (en) Character control program, method, and information processing device
JP7344948B2 (en) system
JP6595043B1 (en) GAME PROGRAM, METHOD, AND INFORMATION PROCESSING DEVICE
JP6672380B2 (en) Game program, character control program, method, and information processing device
JP6826573B2 (en) Game programs, methods, and information processing equipment
JP2023085442A (en) program
WO2022137376A1 (en) Method, computer-readable medium, and information processing device
JP6639561B2 (en) Game program, method, and information processing device
JP6923726B1 (en) Methods, computer-readable media, and information processing equipment
JP7095006B2 (en) Game programs, character control programs, methods, and information processing equipment
WO2022137340A1 (en) Information processing method, computer-readable medium, and information processing device
WO2022137343A1 (en) Information processing method, computer-readable medium, and information processing device
JP7258923B2 (en) program
JP7078585B2 (en) Game programs, methods, and information processing equipment
WO2022113330A1 (en) Method, computer-readable medium, and information processing device
WO2022113335A1 (en) Method, computer-readable medium, and information processing device
WO2022113327A1 (en) Method, computer-readable medium, computer system, and information processing device
JP2021045557A (en) Game program, game method, and information processing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20966867

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022570845

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20966867

Country of ref document: EP

Kind code of ref document: A1