WO2022137340A1 - Information processing method, computer-readable medium, and information processing device - Google Patents


Info

Publication number
WO2022137340A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
game
character
progress
operation instruction
Application number
PCT/JP2020/047932
Other languages
French (fr)
Japanese (ja)
Inventor
功淳 馬場
Original Assignee
株式会社コロプラ
Application filed by 株式会社コロプラ
Priority to PCT/JP2020/047932
Priority to JP2022570818A
Publication of WO2022137340A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/215 Input arrangements comprising means for detecting acoustic signals, e.g. using a microphone
    • A63F13/216 Input arrangements using geographical information, e.g. location of the game device or player using GPS
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428 Processing input control signals involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A63F13/45 Controlling the progress of the video game
    • A63F13/49 Saving the game status; Pausing or ending the game
    • A63F13/497 Partially or entirely replaying previous game actions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/85 Providing additional services to players

Definitions

  • This disclosure relates to information processing methods, computer-readable media, and information processing devices.
  • Non-Patent Document 1 discloses a romance simulation game whose main purpose is to virtually deepen friendship with a girl character. The user selects the action best suited to the character from the presented options, and the story progresses through repetition of the character's reactions to those actions.
  • In the game of Non-Patent Document 1, the character's response patterns are prepared in advance. The character's response is then determined from those patterns according to the user's input operation and output, and the game progresses. The variation in the character's behavior therefore never extends beyond the data prepared in advance, so the user cannot feel, in the relationship with the character, the reality of the character existing in the real world, and eventually tires of it. Generally, in a game developed with the intention of having users play for a long time, it is important to address the problem of users tiring of the game; games are always required to provide compelling content that motivates users to play. In particular, in a game whose interest lies in the relationship with a character, it is preferable that the character have a strong sense of reality so that the user can immerse himself or herself in the game world.
  • One aspect of the present disclosure aims to enhance the sense of immersion in the game world and to improve the interest of the game.
  • According to one aspect of the present disclosure, the information processing method is a method by which a computer including a processor, a memory, and an operation unit advances a game. The method comprises, by the processor: a step of executing the progress of a first part in which a character is operated in response to the user's input operation via the operation unit; a step of accepting a specific action by the user in the progress of the first part, whereby a right is granted to the computer and, based on that right, the progress of the game becomes switchable from the first part to a second part; a step of receiving operation instruction data specifying the operation of the character, transmitted from an external device located in an external space physically separated from the space in which the user is located; and a step of executing the progress of the second part by operating the character based on the received operation instruction data. The operation instruction data includes motion data and voice data input by an operator who plays the character; the operator is located in the external space physically separated from the space in which the user is located and is a person different from the user.
  • According to another aspect of the present disclosure, the information processing method is a method by which a computer including a processor, a memory, and an operation unit advances a game. The method comprises, by the processor: a step of executing the progress of a first part in which a character is operated in response to the user's input operation via the operation unit; a step of accepting a specific action by the user in the progress of the first part in order to make the progress of the game switchable from the first part to a second part; a step of, when the progress of the second part is not advanced in real time, requesting the progress of the completed second part and receiving recorded operation instruction data from an external device located in an external space physically separated from the space in which the user is located; and a step of executing the progress of the completed second part by operating the character based on the received operation instruction data. The operation instruction data includes motion data and voice data input by an operator who plays the character; the operator is located in the external space physically separated from the space in which the user is located and is a person different from the user.
  • According to another aspect of the present disclosure, the information processing apparatus includes: a first part progress unit that executes the progress of a first part in which a character is operated in response to the user's input operation; a reception unit that accepts a specific action by the user in the progress of the first part in order to make the progress of the game switchable from the first part to a second part; and a second part progress unit that, when the progress of the second part is not advanced in real time, requests the progress of the completed second part, receives recorded operation instruction data from an external device located in an external space physically separated from the space in which the user is located, and executes the progress of the completed second part by operating the character based on the received operation instruction data. The operation instruction data includes motion data and voice data input by an operator who plays the character; the operator is located in the external space physically separated from the space in which the user is located and is a person different from the user.
  • According to one aspect of the present disclosure, there is an effect of improving the interest of the game.
  • the game system according to the present disclosure is a system for providing a game to a plurality of users who are game players.
  • Hereinafter, the game system will be described with reference to the drawings. The present invention is not limited to these examples; it is indicated by the scope of the claims, and all modifications within the meaning and scope equivalent to the claims are intended to be included in the present invention. In the following description, the same elements are given the same reference numerals in the description of the drawings, and duplicate descriptions are not repeated.
  • FIG. 1 is a diagram showing a hardware configuration of the game system 1.
  • the game system 1 includes a plurality of user terminals 100 and a server 200. Each user terminal 100 connects to the server 200 via the network 2.
  • The network 2 is composed of the Internet and various mobile communication systems constructed with wireless base stations (not shown). Examples of these mobile communication systems include so-called 3G and 4G mobile communication systems, LTE (Long Term Evolution), and wireless networks that can connect to the Internet through a predetermined access point (for example, Wi-Fi (registered trademark)).
  • the server 200 (computer, information processing device) may be a general-purpose computer such as a workstation or a personal computer.
  • the server 200 includes a processor 20, a memory 21, a storage 22, a communication IF 23, and an input / output IF 24. These configurations of the server 200 are electrically connected to each other by a communication bus.
  • the user terminal 100 may be a mobile terminal such as a smartphone, a feature phone, a PDA (Personal Digital Assistant), or a tablet computer.
  • the user terminal 100 may be a game device suitable for game play.
  • The user terminal 100 includes a processor 10, a memory 11, a storage 12, a communication interface (IF) 13, an input / output IF 14, a touch screen 15 (display unit), a camera 17, and a distance measuring sensor 18.
  • These configurations included in the user terminal 100 are electrically connected to each other by a communication bus.
  • the user terminal 100 may be provided with an input / output IF 14 to which a display (display unit) configured separately from the user terminal 100 main body can be connected in place of or in addition to the touch screen 15.
  • the user terminal 100 may be configured to be communicable with one or more controllers 1020.
  • the controller 1020 establishes communication with the user terminal 100 according to a communication standard such as Bluetooth (registered trademark).
  • the controller 1020 may have one or more buttons or the like, and transmits an output value based on a user's input operation to the buttons or the like to the user terminal 100.
  • the controller 1020 may have various sensors such as an acceleration sensor and an angular velocity sensor, and transmits the output values of the various sensors to the user terminal 100.
  • the controller 1020 may have the camera 17 and the distance measuring sensor 18.
  • the user terminal 100 causes a user who uses the controller 1020 to input user identification information such as the user's name or login ID via the controller 1020, for example, at the start of a game.
  • As a result, the user terminal 100 can associate the controller 1020 with the user and, based on the source (controller 1020) of a received output value, identify which user that output value belongs to.
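  • As a rough sketch (not the patent's API; the class and method names below are invented for illustration), such an association could be kept as a simple mapping from controller ID to user ID, filled in at game start and consulted whenever an output value arrives:

```python
# Minimal sketch of associating controllers with users, assuming each
# controller 1020 reports a unique source ID with every output value.
# Class and method names are invented for illustration.

class ControllerRegistry:
    def __init__(self):
        self._user_by_controller = {}  # controller_id -> user name / login ID

    def register(self, controller_id: str, user_id: str) -> None:
        """Called at game start, after the user enters a name or login ID."""
        self._user_by_controller[controller_id] = user_id

    def resolve(self, controller_id: str) -> str:
        """Identify which user an incoming output value belongs to."""
        return self._user_by_controller[controller_id]


registry = ControllerRegistry()
registry.register("controller-1", "alice")
registry.register("controller-2", "bob")
print(registry.resolve("controller-2"))  # -> bob
```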
  • When the user terminal 100 communicates with a plurality of controllers 1020, each user grips one controller 1020, so that multiplayer can be realized on the single user terminal 100 without communicating with other devices such as the server 200 via the network 2.
  • Each user terminal 100 may also communicate with the others according to a wireless standard such as the wireless LAN (Local Area Network) standard (a communication connection made without going through the server 200), thereby realizing local multiplayer among a plurality of user terminals 100.
  • When the above-mentioned multiplayer is realized locally by one user terminal 100, the user terminal 100 may further include at least some of the various functions, described later, of the server 200. When the above-mentioned multiplayer is realized locally by a plurality of user terminals 100, those user terminals 100 may share the various functions, described later, of the server 200 among themselves.
  • Even when the above-mentioned multiplayer is realized locally, the user terminal 100 may communicate with the server 200.
  • For example, information indicating a play result, such as a score or a win or loss in a certain game, may be associated with user identification information and transmitted to the server 200.
  • the controller 1020 may be configured to be detachable from the user terminal 100.
  • a coupling portion with the controller 1020 may be provided on at least one surface of the housing of the user terminal 100.
  • the user terminal 100 may accept the attachment of a storage medium 1030 such as an external memory card via the input / output IF14. As a result, the user terminal 100 can read the program and data recorded on the storage medium 1030.
  • the program recorded on the storage medium 1030 is, for example, a game program.
  • The user terminal 100 may store, in the memory 11, a game program acquired by communicating with an external device such as the server 200, or a game program acquired by reading from the storage medium 1030.
  • the user terminal 100 includes a communication IF 13, an input / output IF 14, a touch screen 15, a camera 17, and a distance measuring sensor 18 as an example of a mechanism for inputting information to the user terminal 100.
  • Each of the above-mentioned parts serving as an input mechanism can be regarded as an operation unit configured to accept the user's input operation.
  • When the operation unit is configured by at least one of the camera 17 and the distance measuring sensor 18, the operation unit detects an object 1010 in the vicinity of the user terminal 100 and identifies an input operation from the detection result.
  • As an example, the user's hand, a marker having a predetermined shape, or the like is detected as the object 1010, and an input operation is identified based on the color, shape, movement, or type of the object 1010 obtained as the detection result.
  • More specifically, when the user's hand is detected from an image captured by the camera 17, the user terminal 100 identifies and accepts a gesture (a series of movements of the user's hand) detected from the captured image as the user's input operation.
  • the captured image may be a still image or a moving image.
  • When the operation unit is configured by the touch screen 15, the user terminal 100 identifies and accepts an operation performed on the input unit 151 of the touch screen 15 as the user's input operation.
  • When the operation unit is configured by the communication IF 13, the user terminal 100 identifies and accepts a signal (for example, an output value) transmitted from the controller 1020 as the user's input operation.
  • When the operation unit is configured by the input / output IF 14, the user terminal 100 identifies and accepts, as the user's input operation, a signal output from an input device (not shown) other than the controller 1020 connected to the input / output IF 14.
  • the game system 1 further includes an operation instruction device 300.
  • the operation instruction device 300 connects to each of the server 200 and the user terminal 100 via the network 2. At least one operation instruction device 300 is provided in the game system 1.
  • a plurality of operation instruction devices 300 may be provided depending on the number of user terminals 100 that use the service provided by the server 200.
  • One operation instruction device 300 may be provided for one user terminal 100.
  • One operation instruction device 300 may be provided for a plurality of user terminals 100.
  • The operation instruction device 300 may be a computer such as a server, a desktop personal computer, a laptop computer, or a tablet, or a group of computers combining these.
  • the operation instruction device 300 includes a processor 30, a memory 31, a storage 32, a communication IF 33, an input / output IF 34, and a touch screen 35 (display unit). These configurations included in the operation instruction device 300 are electrically connected to each other by a communication bus.
  • the operation instruction device 300 may include an input / output IF 34 to which a display (display unit) configured separately from the operation instruction device 300 main body can be connected in place of or in addition to the touch screen 35.
  • The operation instruction device 300 may be configured to communicate, via wireless or wired connection, with peripheral devices such as one or more microphones 3010, one or more motion capture devices 3020, and one or more controllers 3030.
  • the wirelessly connected peripheral device establishes communication with the operation instruction device 300 according to a communication standard such as Bluetooth (registered trademark).
  • the microphone 3010 acquires the voice generated in the surroundings and converts it into an electric signal.
  • the voice converted into an electric signal is transmitted to the operation instruction device 300 as voice data, and is received by the operation instruction device 300 via the communication IF 33.
  • the motion capture device 3020 tracks the motion (including facial expressions, mouth movements, etc.) of the tracking target (for example, a person), and transmits the output value as the tracking result to the operation instruction device 300.
  • the motion data which is an output value, is received by the operation instruction device 300 via the communication IF 33.
  • the motion capture method of the motion capture device 3020 is not particularly limited.
  • The motion capture device 3020 selectively includes the mechanisms needed for capturing motion under the adopted method, such as a camera, various sensors, markers, a suit worn by a model (person), and a signal transmitter.
  • the controller 3030 may have one or more physical input mechanisms such as buttons, levers, sticks, and wheels.
  • the controller 3030 transmits an output value based on an input operation input to the input mechanism by the operator of the operation instruction device 300 to the operation instruction device 300.
  • the controller 3030 may have various sensors such as an acceleration sensor and an angular velocity sensor, and may transmit the output values of the various sensors to the operation instruction device 300.
  • the above output value is received by the operation instruction device 300 via the communication IF 33.
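  • To make the three peripheral inputs concrete, here is a hedged sketch of how the data arriving over the communication IF 33 might be typed; the field names and shapes are assumptions for illustration, not taken from the patent:

```python
# Illustrative containers for the three kinds of peripheral input the
# operation instruction device 300 receives. All field names are assumed.
from dataclasses import dataclass
from typing import List


@dataclass
class VoiceSample:           # from the microphone 3010
    timestamp_ms: int
    pcm_bytes: bytes         # voice converted to an electric signal, digitized


@dataclass
class MotionFrame:           # from the motion capture device 3020
    timestamp_ms: int
    joint_rotations: List[float]  # one tracked pose, flattened per joint


@dataclass
class ControllerEvent:       # from the controller 3030
    timestamp_ms: int
    control: str             # e.g. "button_a", "stick_x"
    value: float             # output value based on the operator's input
```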
  • In the present embodiment, the operator includes a person who operates the operation instruction device 300 using the input unit 351 or the controller 3030, a voice actor who inputs voice through the microphone 3010, and a model who inputs movement via the motion capture device 3020. The operator is not a user who is a game player.
  • the operation instruction device 300 may include a camera and a distance measuring sensor (not shown).
  • the motion capture device 3020 and the controller 3030 may have a camera and a distance measuring sensor.
  • the operation instruction device 300 includes a communication IF 33, an input / output IF 34, and a touch screen 35 as an example of a mechanism for inputting information to the operation instruction device 300. If necessary, a camera and a distance measuring sensor may be further provided.
  • Each of the above-mentioned parts serving as an input mechanism can be regarded as an operation unit configured to accept an input operation.
  • the operation unit may be composed of the touch screen 35.
  • When the operation unit is configured by the touch screen 35, the operation instruction device 300 identifies and accepts an operation performed on the input unit 351 of the touch screen 35 as the operator's input operation.
  • When the operation unit is configured by the communication IF 33, the operation instruction device 300 identifies and accepts a signal (for example, an output value) transmitted from the controller 3030 as the operator's input operation.
  • When the operation unit is configured by the input / output IF 34, the operation instruction device 300 identifies and accepts, as the operator's input operation, a signal output from an input device (not shown) other than the controller 3030 connected to the input / output IF 34.
  • The game executed by the game system 1 according to the first embodiment is, for example, a game in which one or more characters appear and at least one of those characters is operated based on operation instruction data.
  • the character appearing in the game may be a player character (hereinafter, PC) or a non-player character (hereinafter, NPC).
  • the PC is a character that can be directly operated by a user who is a game player.
  • An NPC is a character that operates according to a game program and operation instruction data, that is, a character that cannot be directly operated by a user who is a game player. In the following, when it is not necessary to distinguish between the two, "character" is used as a generic term.
  • this game is a training simulation game.
  • In this game, the main character, operated by the user, deepens interaction with the character and works on the character so as to make the character a famous video distributor and realize the character's dream; achieving this is the objective of the game.
  • the training simulation game may include an element of a love simulation game in which the main character aims to increase intimacy through interaction with a character.
  • this game includes at least a live distribution part as an example.
  • the operation instruction data is supplied to the user terminal 100 running the game from a device other than the user terminal 100 at an arbitrary timing.
  • the user terminal 100 analyzes (renders) the operation instruction data by using the reception of the operation instruction data as a trigger.
  • the live distribution part is a part in which the user terminal 100 presents a character that operates according to the above-mentioned analyzed operation instruction data to the user in real time. As a result, the user can feel the reality as if the character really exists, and can further immerse himself in the game world and enjoy the game.
  • the game may be composed of a plurality of play parts.
  • the character properties may differ from part to part, such as one character being a PC in one part and an NPC in another part.
  • the game genre is not limited to a specific genre.
  • The game system 1 can execute games of all genres: for example, sports-themed games such as tennis, table tennis, dodgeball, baseball, soccer, and hockey, as well as puzzle games, quiz games, RPGs (Role-Playing Games), adventure games, shooting games, simulation games, training games, and action games.
  • the play form of the game executed in the game system 1 is not limited to a specific play form.
  • The game system 1 can execute a game of any play form: for example, a single-player game by a single user, or a multiplayer game by a plurality of users, including battle games in which a plurality of users play against each other and cooperative play games in which a plurality of users cooperate.
  • the processor 10 controls the operation of the entire user terminal 100.
  • the processor 20 controls the operation of the entire server 200.
  • the processor 30 controls the operation of the entire operation instruction device 300.
  • Processors 10, 20 and 30 include a CPU (Central Processing Unit), an MPU (Micro Processing Unit), and a GPU (Graphics Processing Unit).
  • the processor 10 reads a program from the storage 12 described later and expands it into the memory 11 described later.
  • the processor 20 reads a program from the storage 22 described later and expands it into the memory 21 described later.
  • the processor 30 reads a program from the storage 32 described later and expands it into the memory 31 described later. Processor 10, processor 20 and processor 30 execute the expanded program.
  • the memories 11, 21 and 31 are the main storage devices.
  • the memories 11, 21 and 31 are composed of storage devices such as ROM (Read Only Memory) and RAM (Random Access Memory).
  • the memory 11 provides a work area to the processor 10 by temporarily storing a program and various data read from the storage 12 described later by the processor 10.
  • the memory 11 also temporarily stores various data generated while the processor 10 is operating according to the program.
  • the memory 21 provides a work area to the processor 20 by temporarily storing various programs and data read from the storage 22 described later by the processor 20.
  • the memory 21 also temporarily stores various data generated while the processor 20 is operating according to the program.
  • the memory 31 provides a work area to the processor 30 by temporarily storing various programs and data read from the storage 32 described later by the processor 30.
  • the memory 31 also temporarily stores various data generated while the processor 30 is operating according to the program.
  • the program may be a game program for realizing the game by the user terminal 100.
  • the program may be a game program for realizing the game in collaboration with the user terminal 100 and the server 200.
  • the program may be a game program for realizing the game in cooperation with the user terminal 100, the server 200, and the operation instruction device 300.
  • As an example, the game realized by the cooperation of the user terminal 100 and the server 200, and the game realized by the cooperation of the user terminal 100, the server 200, and the operation instruction device 300, may be games executed on a browser started on the user terminal 100.
  • the program may be a game program for realizing the game by the cooperation of a plurality of user terminals 100.
  • the various data include data related to the game such as user information and game information, and instructions or notifications to be transmitted / received between the devices of the game system 1.
  • Storages 12, 22 and 32 are auxiliary storage devices.
  • the storages 12, 22 and 32 are composed of a storage device such as a flash memory or an HDD (Hard Disk Drive).
  • Various data related to the game are stored in the storages 12, 22 and 32.
  • the communication IF 13 controls the transmission and reception of various data in the user terminal 100.
  • the communication IF 23 controls the transmission / reception of various data in the server 200.
  • the communication IF 33 controls the transmission / reception of various data in the operation instruction device 300.
  • The communication IFs 13, 23, and 33 control communication using, for example, a wireless LAN (Local Area Network), Internet communication via a wired LAN, a wireless LAN, or a mobile phone network, and short-range wireless communication.
  • the input / output IF 14 is an interface for the user terminal 100 to accept data input, and an interface for the user terminal 100 to output data.
  • the input / output IF 14 may input / output data via USB (Universal Serial Bus) or the like.
  • the input / output IF 14 may include, for example, a physical button, a camera, a microphone, a speaker, or the like of the user terminal 100.
  • the input / output IF 24 of the server 200 is an interface for the server 200 to receive data input, and an interface for the server 200 to output data.
  • the input / output IF 24 may include, for example, an input unit that is an information input device such as a mouse or a keyboard, and a display unit that is a device that displays and outputs an image.
  • the input / output IF 34 of the operation instruction device 300 is an interface for the operation instruction device 300 to receive data input, and an interface for the operation instruction device 300 to output data.
  • The input / output IF 34 may include, for example, connections to information input devices such as a mouse, a keyboard, a stick, or a lever, to devices that display and output images such as a liquid crystal display, and to the peripheral devices (the microphone 3010, the motion capture device 3020, and the controller 3030) for sending and receiving data.
  • the touch screen 15 of the user terminal 100 is an electronic component that combines an input unit 151 and a display unit 152.
  • the touch screen 35 of the operation instruction device 300 is an electronic component in which an input unit 351 and a display unit 352 are combined.
  • the input units 151 and 351 are, for example, touch-sensitive devices, and are configured by, for example, a touch pad.
  • the display units 152 and 352 are configured by, for example, a liquid crystal display, an organic EL (Electro-Luminescence) display, or the like.
  • The input units 151 and 351 detect the position at which the user's operation (mainly a physical contact operation such as a touch, slide, swipe, or tap operation) is input on the input surface, and have a function of transmitting information indicating that position as an input signal.
  • the input units 151 and 351 may be provided with a touch sensing unit (not shown).
  • the touch sensing unit may adopt any method such as a capacitance method or a resistance film method.
  • the user terminal 100 may include one or more sensors for specifying the holding posture of the user terminal 100.
  • This sensor may be, for example, an acceleration sensor, an angular velocity sensor, or the like.
  • the processor 10 can also specify the holding posture of the user terminal 100 from the output of the sensor and perform processing according to the holding posture.
  • For example, when the user terminal 100 is held vertically, the processor 10 may use a vertical screen display in which a vertically long image is displayed on the display unit 152.
  • When the user terminal 100 is held horizontally, the processor 10 may use a horizontal screen display in which a horizontally long image is displayed on the display unit. In this way, the processor 10 may be able to switch between the vertical screen display and the horizontal screen display according to the holding posture of the user terminal 100.
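  • A minimal sketch of such posture-based switching, assuming a simple two-axis reading from the acceleration sensor (the axis convention and threshold are assumptions, not from the patent):

```python
# Decide portrait vs. landscape from an acceleration sensor reading.
# When the device is roughly still, gravity dominates the reading, so
# whichever axis carries more of it indicates the holding posture.

def holding_posture(accel_x: float, accel_y: float) -> str:
    return "portrait" if abs(accel_y) >= abs(accel_x) else "landscape"


print(holding_posture(accel_x=0.3, accel_y=9.7))  # -> portrait
print(holding_posture(accel_x=9.6, accel_y=0.5))  # -> landscape
```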
  • the camera 17 includes an image sensor and the like, and generates a captured image by converting the incident light incident from the lens into an electric signal.
  • the distance measuring sensor 18 is a sensor that measures the distance to the object to be measured.
  • the distance measuring sensor 18 includes, for example, a light source that emits pulse-converted light and a light receiving element that receives the light.
  • The distance measuring sensor 18 measures the distance to the object to be measured from the interval between the timing at which the light source emits light and the timing at which the light, reflected back by the object to be measured, is received.
  • the distance measuring sensor 18 may have a light source that emits light having directivity.
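  • The description above implies an ordinary time-of-flight measurement: the light travels to the object and back, so the distance is half the round trip. A minimal sketch (the function name and units are illustrative):

```python
# Time-of-flight distance estimate: emitted light is reflected by the
# object to be measured, so halve the round-trip travel distance.
SPEED_OF_LIGHT_M_PER_S = 299_792_458


def distance_m(emit_time_s: float, receive_time_s: float) -> float:
    round_trip_s = receive_time_s - emit_time_s
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2


print(distance_m(0.0, 6.67e-9))  # ~6.67 ns round trip -> about 1.0 m
```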
  • the camera 17 and the distance measuring sensor 18 may be provided on the side surface of the housing of the user terminal 100, for example.
  • The distance measuring sensor 18 may be provided in the vicinity of the camera 17.
  • As the camera 17, for example, an infrared camera can be used.
  • the camera 17 may be provided with a lighting device that irradiates infrared rays, a filter that blocks visible light, and the like. This makes it possible to further improve the detection accuracy of the object based on the captured image of the camera 17, regardless of whether it is outdoors or indoors.
  • the processor 10 may perform one or more of the processes shown in the following (1) to (5), for example, on the captured image of the camera 17.
  • (1) The processor 10 performs image recognition processing on the image captured by the camera 17 to identify whether or not the captured image includes the user's hand.
  • the processor 10 may use, for example, a technique such as pattern matching as the analysis technique adopted in the above-mentioned image recognition process.
  • (2) The processor 10 detects the user's gesture from the shape of the user's hand.
  • the processor 10 specifies, for example, the number of fingers of the user (the number of extended fingers) from the shape of the user's hand detected from the captured image.
  • the processor 10 further identifies the gesture performed by the user from the number of identified fingers.
  • For example, the processor 10 determines that the user has made the "par" (paper) gesture when the number of fingers is five, the "goo" (rock) gesture when the number of fingers is zero (no fingers detected), and the "choki" (scissors) gesture when the number of fingers is two. (3) The processor 10 performs image recognition processing on the image captured by the camera 17 to detect whether the user's hand is in a state where only the index finger is raised, or whether the user has flicked a finger.
  • (4) The processor 10 detects the distance between an object 1010 (for example, the user's hand) in the vicinity of the user terminal 100 and the user terminal 100, based on at least one of the image recognition result for the image captured by the camera 17 and the output value of the distance measuring sensor 18.
  • For example, the processor 10 detects, from the size of the shape of the user's hand identified in the image captured by the camera 17, whether the user's hand is near the user terminal 100 (for example, at a distance less than a predetermined value) or far from it (for example, at a distance equal to or greater than the predetermined value).
  • the processor 10 may detect whether the user's hand is approaching or moving away from the user terminal 100.
  • (5) The processor 10 recognizes, from the image captured by the camera 17, whether the user is waving the hand in the shooting direction of the camera 17 or waving the hand in a direction orthogonal to that shooting direction.
  • In this way, the processor 10 detects, by image recognition of the image captured by the camera 17, whether the user is holding the hand closed (the "goo" gesture) or not (another gesture such as "par"). The processor 10 also detects the shape of the user's hand, how the user is moving the hand, and whether the user is moving the hand toward or away from the user terminal 100. Such operations can correspond to operations with a pointing device such as a mouse or a touch panel. For example, the user terminal 100 moves a pointer on the touch screen 15 in response to the movement of the user's hand, and while it detects the user's "goo" gesture, it recognizes that the user is continuing a selection operation.
  • The continuation of the selection operation corresponds to, for example, the state in which a mouse button is kept pressed after being clicked, or the state in which contact continues after a touch-down operation on a touch panel.
  • The user terminal 100 can also recognize such a series of gestures, that is, moving the hand while the "goo" gesture is maintained, as an operation corresponding to a swipe operation (or a drag operation).
  • When the user terminal 100 detects, from the detection result of the user's hand in the image captured by the camera 17, a gesture in which the user flicks a finger, it may recognize that gesture as an operation corresponding to a mouse click or a tap on a touch panel.
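  • The finger-count mapping in (2) above can be summarized in a few lines; this is a hedged sketch (the function name and the fallback value are assumptions):

```python
# Map the number of extended fingers to the gestures named in the text:
# "goo" (rock, closed fist), "choki" (scissors), "par" (paper, open palm).

def classify_gesture(extended_fingers: int) -> str:
    if extended_fingers == 0:
        return "goo"
    if extended_fingers == 2:
        return "choki"
    if extended_fingers == 5:
        return "par"
    return "unknown"  # counts the text does not assign a gesture to


assert classify_gesture(0) == "goo"
assert classify_gesture(2) == "choki"
assert classify_gesture(5) == "par"
```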
  • FIG. 2 is a block diagram showing a functional configuration of a user terminal 100, a server 200, and an operation instruction device 300 included in the game system 1.
  • Each of the user terminal 100, the server 200, and the operation instruction device 300 may also include functional configurations, not shown, necessary for functioning as a general computer and for realizing known functions in a game.
  • the user terminal 100 has a function as an input device that accepts a user's input operation and a function as an output device that outputs a game image or sound.
  • the user terminal 100 functions as a control unit 110 and a storage unit 120 by the cooperation of the processor 10, the memory 11, the storage 12, the communication IF 13, the input / output IF 14, and the like.
  • The server 200 has a function of communicating with each user terminal 100 and supporting the user terminal 100 in advancing the game. For example, when the user terminal 100 downloads the application related to this game for the first time, the server 200 provides the user terminal 100 with the data to be stored in it at the start of the first game. The server 200 also transmits, to the user terminal 100, operation instruction data for operating the character.
  • The operation instruction data may include motion capture data that captures the movement of an actor such as a model in advance, voice data that records the voice of an actor such as a voice actor, operation history data indicating the history of input operations for making the character operate, or a motion command group in which commands associated with such a series of input operations are arranged in chronological order.
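  • One way such operation instruction data could be laid out is sketched below; the patent specifies only the kinds of content, not a concrete schema, so every field name is an assumption:

```python
# Illustrative layout for operation instruction data. Any of the four
# content fields may be present, matching the alternatives listed above.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class OperationInstructionData:
    character_id: str
    motion_capture_data: Optional[list] = None   # pre-captured actor movement
    voice_data: Optional[bytes] = None           # recorded voice-actor audio
    operation_history: Optional[list] = None     # chronological input-operation log
    motion_commands: Optional[List[str]] = None  # commands in chronological order
```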
  • The server 200 may have a function of communicating with each user terminal 100 participating in the game to mediate exchanges between the user terminals 100, and a synchronization control function. Further, the server 200 has a function of mediating between the user terminal 100 and the operation instruction device 300. As a result, the operation instruction device 300 can supply operation instruction data to a user terminal 100, or to a group of a plurality of user terminals 100, in a timely manner and without mistaking the destination.
  • the server 200 functions as a control unit 210 and a storage unit 220 by the cooperation of the processor 20, the memory 21, the storage 22, the communication IF23, the input / output IF24, and the like.
  • the operation instruction device 300 has a function of generating operation instruction data for instructing the operation of a character in the user terminal 100 and supplying the operation instruction data to the user terminal 100.
  • the operation instruction device 300 functions as a control unit 310 and a storage unit 320 in cooperation with the processor 30, the memory 31, the storage 32, the communication IF 33, the input / output IF 34, and the like.
  • the storage units 120, 220 and 320 store the game program 131, the game information 132 and the user information 133.
  • the game program 131 is a game program executed by the user terminal 100, the server 200, and the operation instruction device 300.
  • the game information 132 is data that the control units 110, 210, and 310 refer to when executing the game program 131.
  • the user information 133 is data related to the user's account.
  • the storage unit 320 further stores the character control program 134.
  • the character control program 134 is a program executed by the operation instruction device 300, and is a program for controlling the operation of a character appearing in a game based on the above-mentioned game program 131.
  • the control unit 210 comprehensively controls the server 200 by executing the game program 131 stored in the storage unit 220. For example, the control unit 210 transmits various data, programs, and the like to the user terminal 100. The control unit 210 receives a part or all of the game information or the user information from the user terminal 100. When the game is a multiplayer game, the control unit 210 may receive a request for synchronization of multiplayer from the user terminal 100 and transmit data for synchronization to the user terminal 100. Further, the control unit 210 communicates with the user terminal 100 and the operation instruction device 300 as necessary to send and receive information.
  • the control unit 210 functions as a progress support unit 211 and a shared support unit 212 according to the description of the game program 131.
  • the control unit 210 can also function as another functional block (not shown) in order to support the progress of the game on the user terminal 100, depending on the nature of the game to be executed.
  • the progress support unit 211 communicates with the user terminal 100 and supports the user terminal 100 to progress various parts included in this game. For example, when the user terminal 100 advances the game, the progress support unit 211 provides the user terminal 100 with information necessary for advancing the game.
  • the sharing support unit 212 communicates with a plurality of user terminals 100, and supports a plurality of users to share each other's decks on each user terminal 100. Further, the sharing support unit 212 may have a function of matching the online user terminal 100 with the operation instruction device 300. As a result, information can be smoothly transmitted and received between the user terminal 100 and the operation instruction device 300.
  • the control unit 110 comprehensively controls the user terminal 100 by executing the game program 131 stored in the storage unit 120. For example, the control unit 110 advances the game according to the game program 131 and the user's operation. Further, the control unit 110 communicates with the server 200 and the operation instruction device 300 as necessary to transmit and receive information while the game is in progress.
  • The control unit 110 functions as an operation reception unit 111, a display control unit 112, a user interface (hereinafter, UI) control unit 113, an animation generation unit 114, a game progress unit 115, an analysis unit 116, and a progress information generation unit 117, according to the description of the game program 131.
  • the control unit 110 can also function as other functional blocks (not shown) in order to advance the game, depending on the nature of the game to be executed.
  • the operation reception unit 111 detects and accepts a user's input operation to the input unit 151.
  • The operation reception unit 111 determines what input operation has been performed from the action exerted by the user on the console via the touch screen 15 and the other input / output IF 14, and outputs the result to each element of the control unit 110.
  • the operation receiving unit 111 receives an input operation for the input unit 151, detects the coordinates of the input position of the input operation, and specifies the type of the input operation.
  • The operation receiving unit 111 identifies, for example, a touch operation, a slide operation, a swipe operation, a tap operation, and the like as the type of input operation. When a continuously detected input is interrupted, the operation receiving unit 111 detects that contact input has been released from the touch screen 15.
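  • As a hedged sketch of how those input-operation types might be told apart (the thresholds are invented for illustration; the patent does not give any):

```python
# Classify an input operation from its start point, end point, and
# duration: little movement is a tap or touch, larger movement is a
# slide or, if fast enough, a swipe.

def classify_input(start_xy, end_xy, duration_ms: float) -> str:
    dx = end_xy[0] - start_xy[0]
    dy = end_xy[1] - start_xy[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist < 10:                  # barely moved
        return "tap" if duration_ms < 200 else "touch"
    speed = dist / duration_ms     # pixels per millisecond
    return "swipe" if speed > 1.0 else "slide"


print(classify_input((100, 100), (103, 101), 120))  # -> tap
print(classify_input((100, 100), (400, 100), 150))  # -> swipe
```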
  • the UI control unit 113 controls the UI object to be displayed on the display unit 152 in order to construct the UI.
  • the UI object is a tool for the user to make an input necessary for the progress of the game to the user terminal 100, or a tool for obtaining information output during the progress of the game from the user terminal 100.
  • UI objects are, but are not limited to, icons, buttons, lists, menu screens, and the like.
  • the animation generation unit 114 generates an animation showing the motion of various objects based on the control mode of various objects. For example, the animation generation unit 114 may generate an animation or the like that expresses how the character moves as if it were there, moves the mouth, or changes the facial expression.
  • the display control unit 112 outputs a game screen reflecting the processing result executed by each of the above elements to the display unit 152 of the touch screen 15.
  • the display control unit 112 may display the game screen including the animation generated by the animation generation unit 114 on the display unit 152. Further, the display control unit 112 may superimpose and draw the above-mentioned UI object controlled by the UI control unit 113 on the game screen.
  • the game progress unit 115 advances the game.
  • the game progress unit 115 advances the game in response to a user's input operation input via the operation reception unit 111.
  • the game progress unit 115 causes one or more characters to appear and operates the characters while the game is in progress.
  • The game progress unit 115 may operate the character according to the game program 131 downloaded in advance, according to the user's input operation, or according to operation instruction data supplied from the operation instruction device 300.
  • the game progress unit 115 advances the game according to the specifications of each part.
  • the first part is a story part in which the story in the game progresses by interacting with the character.
  • the game progress unit 115 advances the story part as follows. Specifically, the game progress unit 115 operates the character according to the game program 131 downloaded in advance or the operation instruction data (first operation instruction data) also downloaded in advance. The game progress unit 115 identifies an option selected by the user based on the input operation of the user received by the operation reception unit 111, and causes the character to perform an operation associated with the option.
  • The second part is, for example, a live distribution part in which the character is operated based on operation instruction data supplied from the operation instruction device 300. In this case, the game progress unit 115 advances the live distribution part by operating the character based on the operation instruction data from the operation instruction device 300.
  • the analysis unit 116 analyzes (renders) the operation instruction data and instructs the game progress unit 115 to operate the character based on the analysis result.
  • The analysis unit 116 starts rendering the operation instruction data, triggered by the reception, via the communication IF 13, of the operation instruction data supplied by the operation instruction device 300.
  • The analysis unit 116 passes the analysis result to the game progress unit 115, which immediately operates the character based on the operation instruction data. That is, the game progress unit 115 uses the reception of the operation instruction data as the trigger for operating the character based on it. This makes it possible to show the user a character that operates in real time.
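  • The reception-triggered flow can be sketched as follows; the function names are placeholders, not the patent's API, and the rendering step is reduced to a stub:

```python
# Reception of operation instruction data triggers analysis (rendering),
# and the result immediately drives the character, as described above.

def render(packet: dict) -> dict:
    """Analysis unit 116: decode motion and voice data into a usable form."""
    return {"motion": packet.get("motion"), "voice": packet.get("voice")}


def operate_character(result: dict) -> None:
    """Game progress unit 115: make the character act right away."""
    print("character performs:", result["motion"])


def on_operation_instruction_received(packet: dict) -> None:
    operate_character(render(packet))  # reception itself is the trigger


on_operation_instruction_received({"motion": "wave", "voice": b"..."})
```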
  • the progress information generation unit 117 generates progress information indicating the progress of the game being executed by the game progress unit 115, and sends it to the server 200 or the operation instruction device 300 in a timely manner.
  • the progress information may include, for example, information that specifies the currently displayed game screen, or may include a progress log indicating the progress of the game in chronological order by characters, symbols, and the like.
  • the progress information generation unit 117 may be omitted.
  • the control unit 310 comprehensively controls the operation instruction device 300 by executing the character control program 134 stored in the storage unit 320. For example, the control unit 310 generates operation instruction data according to the operation of the character control program 134 and the operator, and supplies the operation instruction data to the user terminal 100. The control unit 310 may further execute the game program 131, if necessary. Further, the control unit 310 communicates with the server 200 and the user terminal 100 running the game to send and receive information.
  • the control unit 310 functions as an operation reception unit 311, a display control unit 312, a UI control unit 313, an animation generation unit 314, a progress simulation unit 315, and a character control unit 316 according to the description of the character control program 134.
  • the control unit 310 can also function as another functional block (not shown) in order to control a character appearing in the game according to the nature of the game executed in the game system 1.
  • the operation reception unit 311 detects and accepts the operator's input operation to the input unit 351.
  • The operation reception unit 311 determines what input operation has been performed on the console via the touch screen 35 and the other input / output IF 34 from the action exerted by the operator, and outputs the result to each element of the control unit 310.
  • the details of the function of the operation reception unit 311 are almost the same as those of the operation reception unit 111 in the user terminal 100.
  • the UI control unit 313 controls the UI object to be displayed on the display unit 352.
  • the animation generation unit 314 generates an animation showing the motion of various objects based on the control mode of various objects.
  • the animation generation unit 314 may generate an animation or the like that reproduces the game screen actually displayed on the user terminal 100 that is the communication partner.
  • the display control unit 312 outputs a game screen reflecting the processing result executed by each of the above-mentioned elements to the display unit 352 of the touch screen 35.
  • the details of the functions of the display control unit 312 are substantially the same as those of the display control unit 112 in the user terminal 100.
  • the progress simulation unit 315 grasps the progress of the game on the user terminal 100 based on the progress information indicating the progress of the game received from the user terminal 100. Then, the progress simulation unit 315 presents the progress of the user terminal 100 to the operator by simulating the behavior of the user terminal 100 in the operation instruction device 300.
  • the progress simulation unit 315 may display a reproduction of the game screen displayed on the user terminal 100 on the display unit 352 of the own device. Further, the progress simulation unit 315 may display the progress of the game on the display unit 352 as the above-mentioned progress log on the user terminal 100.
  • Specifically, the progress simulation unit 315 grasps the progress of the game on the user terminal 100 based on the progress information. Based on the game program 131, the progress simulation unit 315 may then reproduce the game screen currently displayed on the user terminal 100, either fully or in simplified form, on the display unit 352 of its own device. Alternatively, the progress simulation unit 315 may grasp the current progress of the game, predict the progress of the game from the present time onward based on the game program 131, and output the prediction result to the display unit 352.
  • The character control unit 316 controls the behavior of the character displayed on the user terminal 100. Specifically, it generates operation instruction data for operating the character and supplies this to the user terminal 100. For example, the character control unit 316 generates operation instruction data instructing the character to be controlled to speak, based on voice data input by an operator (a voice actor or the like) via the microphone 3010; operation instruction data generated in this way includes at least the above-mentioned voice data. Further, for example, it generates operation instruction data instructing the character to be controlled to move, based on motion capture data input by an operator (a model or the like) via the motion capture device 3020; operation instruction data generated in this way includes at least the above-mentioned motion capture data.
  • Further, for example, the character control unit 316 generates operation instruction data instructing the character to be controlled to perform an operation, based on the history of input operations that the operator inputs via the controller 3030; operation instruction data generated in this way includes at least the above-mentioned operation history data.
  • The operation history data is, for example, information in which operation logs, indicating which button of the controller 3030 the operator pressed at what timing while which screen was displayed on the display unit, are arranged in chronological order.
  • The display unit here may be a display unit linked to the controller 3030, the display unit 352 of the touch screen 35, or another display unit connected via the input / output IF 34.
  • Alternatively, the character control unit 316 identifies commands, each instructing an operation of the character, associated with the input operations input by the operator via the above-mentioned input mechanisms or operation unit. The character control unit 316 may then arrange the commands in the order in which they were input to generate a motion command group indicating a series of actions of the character, and generate operation instruction data instructing that the character be operated according to that motion command group.
• the operation instruction data generated in this way includes at least the above-mentioned motion command group; a sketch of building such a command group follows.
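As a concrete illustration of the motion command group described above, the following is a minimal sketch in Python. The button-to-command assignments follow the controller 3030 example given later in this description, but the function and field names are illustrative assumptions, not part of the specification.

```python
# Minimal sketch (illustrative names): building a motion command group from
# the operator's button presses on the controller 3030 and wrapping it in
# operation instruction data.

BUTTON_COMMANDS = {
    "A": "raise the right hand",
    "B": "raise the left hand",
    "C": "walk",
    "D": "run",
}

def build_operation_instruction_data(pressed_buttons, character_id):
    # Arrange the commands in the order in which the buttons were pressed.
    motion_command_group = [BUTTON_COMMANDS[b] for b in pressed_buttons]
    return {"character ID": character_id, "movement": motion_command_group}

data = build_operation_instruction_data(["A", "B", "C", "D"], character_id=802)
print(data["movement"])
# ['raise the right hand', 'raise the left hand', 'walk', 'run']
```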
  • the reaction processing unit 317 receives feedback on the user's reaction from the user terminal 100 and outputs this to the operator of the operation instruction device 300.
  • the user terminal 100 can create a comment addressed to the character while the character is operated according to the above-mentioned operation instruction data.
  • the reaction processing unit 317 receives the comment data of the comment and outputs it.
  • the reaction processing unit 317 may display the text data corresponding to the user's comment on the display unit 352, or may output the voice data corresponding to the user's comment from a speaker (not shown).
• the functions of the user terminal 100, the server 200, and the operation instruction device 300 shown in FIG. 2 are merely examples. Each of the user terminal 100, the server 200, and the operation instruction device 300 may have at least a part of the functions of the other devices. Further, a device other than the user terminal 100, the server 200, and the operation instruction device 300 may be used as a component of the game system 1, and that device may be made to execute a part of the processing in the game system 1. That is, the computer that executes the game program in the present embodiment may be any of the user terminal 100, the server 200, the operation instruction device 300, or another device, or the execution may be realized by a combination of a plurality of these devices.
  • the progress simulation unit 315 may be omitted.
  • the control unit 310 can function as the reaction processing unit 317 according to the description of the character control program 134.
  • FIG. 3 is a flowchart showing an example of the basic game progress of this game.
  • the game is divided into, for example, two gameplay parts.
  • the first part is a story part and the second part is a live distribution part.
• the game may include an acquisition part that allows the user to acquire, in exchange for valuable data possessed by the user, a game medium, that is, digital data that can be used in the game.
  • the play order of each part is not particularly limited.
  • FIG. 3 shows a case where the user terminal 100 executes a game in the order of a story part, an acquisition part, and a live distribution part.
• in step S1, the game progress unit 115 executes the story part.
  • the story part includes a fixed scenario S11 and an acquisition scenario S12 (described later).
  • the story part includes, for example, a scene in which the main character operated by the user and the character interact with each other.
  • the "scenario" collected as digital data corresponds to one episode of a story related to a character, is supplied from the server 200, and is temporarily stored in the storage unit 120.
  • the game progress unit 115 reads out one scenario stored in the storage unit 120, and advances one scenario according to the input operation of the user until the end is reached.
• the scenario includes an option to be selected by the user, a response pattern of the character corresponding to the option, and the like, and different endings may be obtained in one scenario depending on which option the user selects.
  • the game progress unit 115 presents a plurality of options corresponding to the action from the main character to the character so that the user can select them, and advances the scenario according to the options selected by the user.
  • the game progress unit 115 may make the user acquire a reward according to the ending.
  • the reward is provided to the user, for example, as a game medium which is digital data that can be used in the game.
  • the game medium may be, for example, an item such as clothing that can be worn by the character.
  • "to make the user acquire the reward” may, as an example, change the status of the game medium as the reward managed in association with the user from unusable to usable.
  • the game medium may be stored in at least one of the memories (memory 11, memory 21, memory 31) included in the game system 1 in association with the user identification information, the user terminal ID, or the like.
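To picture the status change described above, here is a minimal sketch in Python. The per-user dictionary keyed by user identification information and the medium ID are illustrative assumptions.

```python
# Minimal sketch (illustrative names): making the user acquire a reward by
# changing the status of a game medium, managed in association with the
# user, from "unusable" to "usable".

user_game_media = {
    "user-0001": {"clothing-item-803": "unusable"},
}

def grant_reward(user_id, medium_id):
    media = user_game_media.setdefault(user_id, {})
    media[medium_id] = "usable"  # the user can now use this game medium

grant_reward("user-0001", "clothing-item-803")
assert user_game_media["user-0001"]["clothing-item-803"] == "usable"
```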
• in step S3, the game progress unit 115 executes the acquisition part.
  • the game medium acquired by the user may be a new scenario different from the scenario provided to the user terminal 100 at the time of the first download.
  • the former scenario will be referred to as a fixed scenario, and the latter scenario will be referred to as an acquisition scenario.
• when it is not necessary to distinguish between the two, each is simply referred to as a scenario.
  • the game progress unit 115 causes the user to possess an acquisition scenario different from the fixed scenario that the user already possesses, in exchange for consuming the user's valuable data.
  • the scenario to be acquired by the user may be determined by the game progress unit 115 or the progress support unit 211 of the server 200 according to a predetermined rule. More specifically, the game progress unit 115 or the progress support unit 211 may execute a lottery and randomly determine a scenario to be acquired by the user from a plurality of acquisition scenarios.
• the acquisition part may be executed at any time before or after the story part and the live distribution part.
• in step S4, the game progress unit 115 determines whether or not operation instruction data has been received from an external device via the network. While no operation instruction data is received from the external device, the game progress unit 115 may return from NO in step S4 to, for example, step S1 and execute the story part. Alternatively, the game progress unit 115 may execute the acquisition part of step S3. On the other hand, when operation instruction data is received from the external device, the game progress unit 115 proceeds from YES in step S4 to step S5.
• in step S5, the game progress unit 115 executes the live distribution part (second part). Specifically, the game progress unit 115 advances the live distribution part by operating the character according to the operation instruction data received in step S4.
• in step S1, the user simply interacts, via the UI, with a character that shows predetermined reactions in the scenario.
• in the live distribution part, on the other hand, the user can freely and interactively interact with a character that operates in real time based on the operation instruction data transmitted from the external device.
• more specifically, the analysis unit 116 receives, from the operation instruction device 300, operation instruction data including voice data and motion data generated according to the content of the user's input operation.
• then, the game progress unit 115 causes the character to speak based on the voice data included in the received operation instruction data, and moves the character based on the above-mentioned motion data. Thereby, the character's reaction to the above-mentioned input operation of the user can be presented to the user.
  • the user terminal 100 is configured to execute the following steps in order to improve the interest of the game based on the game program 131.
• the user terminal 100 operates an NPC, that is, a character that neither the user nor another user operates, based on first operation instruction data that specifies an operation of the NPC and that is stored in advance in the memory 11, and on second operation instruction data received from an NPC control device (operation instruction device 300).
  • the user terminal 100 operates the NPC in the first part based on the first operation instruction data downloaded in advance.
  • the user terminal 100 receives the second operation instruction data from the operation instruction device 300, and in the second part, operates the NPC based on the second operation instruction data. Since the NPC can be operated based on the second operation instruction data received from the operation instruction device 300, the operation of the NPC is unconventional and its expression is greatly expanded. Therefore, the user can feel the reality as if the NPC is in the real world through the relationship with the NPC during the game play. As a result, it has the effect of increasing the immersive feeling of the game and improving the interest of the game.
  • FIG. 4 is a diagram showing an example of a data structure of operation instruction data processed by the game system 1 according to the present embodiment.
• the operation instruction data is composed of the items "destination" and "creation source", which are meta information, and the items "character ID", "voice", and "movement", which are the contents of the data.
  • the destination designation information is stored in the item "destination".
  • the destination designation information is information indicating to which device the operation instruction data is transmitted.
  • the destination designation information may be, for example, an address unique to the user terminal 100, or may be identification information of the group to which the user terminal 100 belongs. It may be a symbol (for example, "ALL") indicating that the destination is all user terminals 100 satisfying a certain condition.
  • the creation source information is stored in the item "creation source".
  • the creation source information is information indicating which device created the operation instruction data.
• the creation source information is, for example, information related to a user that can identify a specific user (hereinafter referred to as user-related information), such as a user ID, a user terminal ID, or a unique address of the user terminal.
• the creation source information may instead be an ID or an address indicating the server 200 or the operation instruction device 300; when the creation source is the server 200 or the operation instruction device 300, the value of the item may be left empty, or the item itself may not be provided in the operation instruction data.
• the item "character ID" stores a character ID for uniquely identifying a character appearing in this game.
  • the character ID stored here represents which character's action is indicated by the action instruction data.
  • the item "voice” stores voice data to be expressed in the character.
  • Motion data that specifies the movement of the character is stored in the item "movement".
• the motion data may be motion capture data acquired by the operation instruction device 300 via the motion capture device 3020.
  • the motion capture data may be data that tracks the movement of the actor's entire body, may be data that tracks the facial expression and mouth movement of the actor, or may be both.
  • the motion data may be a motion command group instructing a series of movements of the character specified by an operation input by the operator of the operation instruction device 300 via the controller 3030.
• for example, suppose the commands "raise the right hand", "raise the left hand", "walk", and "run" are assigned to buttons A, B, C, and D of the controller 3030, respectively, and the operator presses button A, button B, button C, and button D in succession. In this case, a motion command group in which the commands "raise the right hand", "raise the left hand", "walk", and "run" are arranged in that order is stored in the "movement" item as motion data. In this embodiment, the voice data and the motion data are included in the operation instruction data in a synchronized state. The data structure is sketched below.
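A minimal sketch of this data structure in Python follows; the specification names the items but not their concrete types, so the types below are assumptions.

```python
# Minimal sketch of the operation instruction data of FIG. 4: "destination"
# and "creation source" are meta information; "character ID", "voice", and
# "movement" are the contents of the data. Types are assumed.

from dataclasses import dataclass, field
from typing import List, Optional, Union

@dataclass
class OperationInstructionData:
    destination: str                # terminal address, group ID, or "ALL"
    creation_source: Optional[str]  # user-related info; None for the game master
    character_id: int               # uniquely identifies the character
    voice: bytes = b""              # voice data the character speaks
    # motion capture data, or a motion command group such as
    # ["raise the right hand", "raise the left hand", "walk", "run"]
    movement: Union[bytes, List[str]] = field(default_factory=list)
```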
• the game progress unit 115 can operate a character appearing in the game as intended by the creator of the operation instruction data. Specifically, when the operation instruction data includes voice data, the game progress unit 115 causes the character to speak based on the voice data. Further, when the operation instruction data includes motion data, the game progress unit 115 moves the character based on the motion data, that is, it generates an animation in which the character moves based on the motion data.
  • FIG. 5 is a diagram showing an example of the data structure of the game information 132 processed by the game system 1 according to the present embodiment.
  • the items provided in the game information 132 are appropriately determined according to the genre, nature, content, etc. of the game, and the exemplary items do not limit the scope of the present invention.
• the game information 132 is configured to include the items "play history", "item", "intimacy", "fame", and "distribution history". Each of these items is appropriately referred to when the game progress unit 115 advances the game.
  • the user's play history is stored in the item "play history".
  • the play history is information indicating whether or not the user's play is completed for each scenario stored in the storage unit 120.
  • the play history includes a list of fixed scenarios downloaded at the beginning of the play and a list of acquisition scenarios acquired later in the acquisition part. In each list, statuses such as "played”, “unplayed”, “playable”, and “unplayable” are associated with each scenario.
  • the item "item” stores a list of items owned by the user as a game medium.
  • the item is, for example, a clothing item worn by a character.
  • the user can make the character wear the items obtained by playing the scenario and customize the appearance of the character.
  • the item "Intimacy” stores intimacy, which is one of the character's statuses.
• the intimacy is a parameter that indicates how friendly the "hero", the user's alter ego, is with the character.
  • the game progress unit 115 may advance the game in the user's favor as the intimacy is higher.
  • the game progress unit 115 may increase or decrease the intimacy depending on whether the play result of the scenario is good or bad.
• for example, the game progress unit 115 increases the intimacy more as the user selects options well and reaches a better ending in the scenario.
• conversely, the game progress unit 115 may reduce the intimacy when the user reaches a bad ending in the scenario.
  • the item "Familiarity" stores the fame level, which is one of the character's statuses.
  • the name recognition is a parameter indicating the popularity and recognition of the character as a video distributor.
• One of the purposes of this game is to support the video distribution activity of the character, raise the character's fame, and realize the character's dream.
• for example, a special scenario may be offered as a reward to a user who has raised the fame level to a certain level.
  • the item "Distribution history” stores a list of videos, so-called back numbers, that have been live-distributed from characters in the past in the live distribution part.
• in the live distribution part, the moving image that is PUSH-distributed in real time can be viewed only at that time.
• on the other hand, moving images from past distributions are recorded by the server 200 or the operation instruction device 300, and can be PULL-distributed in response to a request from the user terminal 100.
  • the back number may be made available for download by the user for a fee.
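A minimal sketch of the game information 132 of FIG. 5 in Python follows; the item names track the description above, while the concrete types are assumptions.

```python
# Minimal sketch of the game information 132: play history, items, intimacy,
# fame, and distribution history, as described above. Types are assumed.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class GameInformation:
    # scenario ID -> "played" / "unplayed" / "playable" / "unplayable"
    play_history: Dict[str, str] = field(default_factory=dict)
    items: List[str] = field(default_factory=list)   # e.g., clothing items
    intimacy: int = 0                                # hero-character friendliness
    fame: int = 0                                    # popularity as a video distributor
    distribution_history: List[str] = field(default_factory=list)  # back numbers
```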
  • FIG. 6 is a diagram showing an example of a quest presentation screen 400 displayed on the display unit 152 of the user terminal 100.
  • the game progress unit 115 presents a quest to the user according to the game program 131 while the scenario is in progress. Specifically, the game progress unit 115 causes the character to speak a request item corresponding to a quest to the hero in a dialogue between the hero and the character. At this time, for example, the game progress unit 115 may display the quest presentation screen 400 shown in FIG. 6 on the display unit 152.
• the method of presenting the series of actions of "making the character speak the request item" is not particularly limited.
  • the game progress unit 115 may display a character that utters the request as a still image based on the text data stored in the storage unit 120 in advance.
• for example, the game progress unit 115 displays, on the display unit 152, the quest presentation screen 400 including the character 401, a balloon 402 indicating that the character 401 is speaking, and the text data of the request item arranged in the balloon 402.
  • the game progress unit 115 may display an animation of the character who utters the request item based on the operation instruction data corresponding to the scene in which the request item is uttered, which is stored in the storage unit 120 in advance.
• for example, the game progress unit 115 moves the character 401 according to the motion capture data included in the operation instruction data, and outputs the voice data included in the operation instruction data as voice from a speaker (not shown) included in the user terminal 100.
  • the game progress unit 115 may realize the quest by a location information game using the location registration information of the user terminal 100.
  • the game progress unit 115 acquires the current position information (for example, address information, latitude / longitude information, etc.) of the user terminal 100 from a position registration system (not shown) provided in the user terminal 100. Then, based on the acquired current position information, a map 403 around the place where the user terminal 100 is located is generated and arranged on the quest presentation screen 400.
• the map data that is the source for generating the map 403 may be stored in advance in the storage unit 120 of the user terminal 100, or may be acquired via the network from another service providing device that provides map data.
• the game progress unit 115 determines a position (address, latitude/longitude, etc.) at which an object that can resolve the request (hereinafter referred to as a target) can be acquired, and superimposes a target icon 404 at the position on the map corresponding to the determined position.
  • the position of the target may be randomly determined by the game progress unit 115, or may be determined in advance according to the contents of the scenario, the quest, and the target.
• when the user, holding the user terminal 100, moves to the position of the target, the game progress unit 115 determines that the main character has reached the target, causes the user to acquire the target, and determines that the quest has been cleared; the position-matching judgment is sketched below.
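One way to implement this position-matching judgment is a distance check against the target's latitude/longitude. The following Python sketch uses a haversine distance with an assumed tolerance radius, which the specification does not prescribe.

```python
# Minimal sketch (assumed tolerance): judging that the hero has reached the
# target by comparing the terminal's current position with the target's
# position on the map.

import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance between two latitude/longitude points, in meters.
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def reached_target(current, target, tolerance_m=30.0):
    return haversine_m(*current, *target) <= tolerance_m

# A terminal about 20 m from the target icon 404 counts as having reached it.
print(reached_target((35.65860, 139.74540), (35.65877, 139.74550)))  # True
```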
  • the game progress unit 115 may generate a quest resolution screen 500 and display it on the display unit 152.
  • FIG. 7 is a diagram showing an example of a quest resolution screen 500 displayed on the display unit 152 of the user terminal 100.
  • the quest resolution screen 500 includes a character 401.
• the game progress unit 115 causes the character 401 to perform the action of "thanking the hero for resolving the request".
  • the game progress unit 115 may cause the character 401 to perform this operation based on the operation instruction data stored in advance.
• alternatively, the game progress unit 115 may reproduce the scene in which the character 401 expresses thanks by arranging a still image of the character 401 and the text data 501 corresponding to the content of the statement on the quest resolution screen 500.
• the game progress unit 115 may release one new fixed scenario related to the requesting character 401 as a reward for clearing the quest, transitioning it to a state in which the user can play it. Specifically, the game progress unit 115 reads the play history shown in FIG. 5 and updates the status of the predetermined fixed scenario from "unplayable" to "playable".
  • the game progress unit 115 may increase the intimacy between the main character and the character based on the fact that the quest has been cleared.
• the game progress unit 115 may be configured to increase the intimacy more as the play content of the quest (time required, distance traveled, number of targets acquired, degree of the character's joy, rarity of the acquired target, etc.) is better, as sketched below.
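As an illustration of such an adjustment, the following Python sketch raises intimacy more for better play. The weights and thresholds are assumptions; the specification only says that better play content should raise intimacy more and that a bad ending may lower it.

```python
# Minimal sketch (assumed weights): increasing intimacy according to the
# quest play content, and decreasing it on a bad ending.

def update_intimacy(intimacy, *, cleared, time_required_s, targets_acquired,
                    bad_ending=False):
    if bad_ending:
        return max(0, intimacy - 5)      # a bad ending may reduce intimacy
    if cleared:
        bonus = 1 + targets_acquired     # more targets -> better play
        if time_required_s < 600:        # a quick clear counts as better play
            bonus += 2
        intimacy += bonus
    return intimacy

print(update_intimacy(10, cleared=True, time_required_s=420, targets_acquired=3))
# 16
```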
• as the user clears quests and selects options, the dialogue with the character progresses and the scenario advances.
• when the scenario reaches one of its endings, the user has completed playing the scenario.
  • the game progress unit 115 may allow the user to acquire an item as a reward for playing the scenario by the user.
  • the item is, for example, a clothing item to be worn by the character 401.
• the game progress unit 115 determines the item to be acquired by the user based on a predetermined rule. For example, the game progress unit 115 may give the user an item associated in advance with the played scenario, or may give an item determined according to the play content of the scenario (time required to clear the quest, intimacy acquired, whether good choices were selected, etc.). Alternatively, the item to be given to the user may be randomly determined from a plurality of candidates.
  • the game progress unit 115 may generate a reward screen 600 for notifying the user of the acquired item and display it on the display unit 152.
  • FIG. 8 is a diagram showing an example of a reward screen 600 displayed on the display unit 152 of the user terminal 100.
  • the reward screen 600 may include an icon 601 of the acquired item and a name 602 of the item.
• this allows the user to confirm the items that he or she has acquired.
• the game progress unit 115 adds the above-mentioned acquired item to the item list stored in the item "item" shown in FIG. 5.
• when the game progress unit 115 receives operation instruction data from an external device such as the operation instruction device 300, it operates the character based on the operation instruction data in the live distribution part. For example, in the live distribution part, it generates a moving image reproduction screen 800 including a character that operates based on the operation instruction data, and displays the screen on the display unit 152.
  • FIG. 9 is a diagram showing an example of a moving image reproduction screen 800 displayed on the display unit 152 of the user terminal 100.
  • the moving image reproduction screen 800 includes at least a character (character 802 in the illustrated example) that was a dialogue partner in the story part.
• the game progress unit 115 reflects the movement indicated by the motion capture data included in the operation instruction data supplied from the external device (here, the operation instruction device 300) in the movement of the character 802.
• as described above, the motion capture data is obtained by acquiring the movement of the model 702 at the installation location of the operation instruction device 300 via the motion capture device 3020. Therefore, the movement of the model 702 is directly reflected in the movement of the character 802 displayed on the display unit 152.
• the game progress unit 115 outputs the voice data 801 included in the operation instruction data supplied from the operation instruction device 300 as the voice emitted by the character 802, in synchronization with the movement of the character 802.
  • the voice data is obtained by acquiring the voice 700 of the voice actor 701 via the microphone 3010 at the installation location of the operation instruction device 300. Therefore, the voice data 801 corresponding to the voice 700 emitted by the voice actor 701 is output as it is from the speaker of the user terminal 100.
• in this way, the voice and movements of the voice actor 701 and the model 702 present at the installation location of the operation instruction device 300 are directly reflected in the voice and movements of the character 802.
• the user can feel the reality of the character 802 as if it exists in the real world, and can immerse himself or herself in the game world.
  • the game progress unit 115 may determine the play result of the story part based on the input operation of the user in the story part (first part). Then, in the live distribution part (second part), the game progress unit 115 may display the character to be operated based on the operation instruction data on the display unit 152 in a display mode according to the play result.
• it is preferable that the game progress unit 115 synthesizes the object of the item with the object of the character 802.
  • the item acquired by the user playing the story part can be reflected in the clothing of the character 802 operating in the live distribution part.
• for example, when the user has acquired a fashion item (for example, a rabbit-ear ("usamimi") band), the game progress unit 115 reads out the information of the clothing item from the game information 132 shown in FIG. 5, and synthesizes the object of that item (in the illustrated example, the clothing item 803) with the character 802.
  • the user can feel the attachment to the character 802 and enjoy the live distribution part even more. Further, the user's motivation to upgrade the clothing of the character 802 can be cultivated, and as a result, the motivation to play the story part can be strengthened.
• the game progress unit 115 may allow the user to input a comment addressed to the character 802 in response to the operation of the character 802.
  • the game progress unit 115 arranges a comment input button 804 on the moving image reproduction screen 800.
  • the user touches the comment input button 804 to call a UI for inputting a comment, operates the UI, and inputs a comment addressed to the character 802.
  • the UI may be for the user to select a desired comment from some prepared comments.
  • the UI may be for the user to edit characters and enter comments.
  • the UI may be for the user to input a comment by voice.
  • FIG. 10 is a flowchart showing a flow of processing executed by each device constituting the game system 1.
• in step S101, when the game progress unit 115 of the user terminal 100 receives an input operation for starting the game from the user, it accesses the server 200 and requests login.
• in step S102, the progress support unit 211 of the server 200 confirms that the status of the user terminal 100 is online, and responds that the login has been accepted.
• in step S103, the game progress unit 115 advances the game according to the user's input operations while communicating with the server 200 as necessary.
  • the game progress unit 115 may advance the story part or the acquisition part for acquiring a new scenario.
• in step S104, the progress support unit 211 supports the progress of the game on the user terminal 100 by providing necessary information to the user terminal 100 as needed.
• when the live distribution time arrives, the sharing support unit 212 of the server 200 proceeds from YES in step S105 to step S106.
  • the live distribution time is, for example, predetermined by the game master and managed by the server 200 and the operation instruction device 300. Further, the live distribution time may be notified to the user terminal 100 in advance, or may be kept secret until the actual live distribution time is reached. In the former case, live distribution can be stably supplied to the user, and in the latter case, live distribution with special added value can be supplied to the user as a surprise distribution.
• in step S106, the sharing support unit 212 searches for one or more user terminals 100 having the right to receive the live distribution.
• the conditions for receiving the live distribution may be set as appropriate by the game master, but at least include that the application of this game is installed and that the terminal is online at the time of the live distribution.
• here, a user terminal 100 that is online at the time of live distribution, that is, one that is running the application of this game, is searched for as a user terminal 100 that has the right to receive the live distribution.
• the sharing support unit 212 may further add, as a condition, that the user terminal 100 is owned by a user who has paid the consideration for receiving the live distribution.
• alternatively, the sharing support unit 212 may search for a specific user terminal 100 that has reserved in advance to receive the live distribution at the above-mentioned live distribution time as a user terminal 100 that has the right to receive the live distribution.
• in step S107, the sharing support unit 212 notifies the operation instruction device 300 of the one or more detected user terminals 100.
  • the sharing support unit 212 may notify the operation instruction device 300 of the terminal ID of the user terminal 100, the user ID of the user who is the owner of the user terminal 100, the address of the user terminal 100, and the like.
• in step S108, when the live distribution time arrives, the character control unit 316 of the operation instruction device 300 proceeds from YES in step S108 to steps S109 and S110. Either of steps S109 and S110 may be executed first.
• in step S109, the character control unit 316 acquires, as voice data, the voice input by an actor such as a voice actor via the microphone 3010.
• in step S110, the character control unit 316 acquires, as motion capture data, the motion input by an actor such as a model via the motion capture device 3020.
• in step S111, the character control unit 316 generates operation instruction data (second operation instruction data). Specifically, the character control unit 316 identifies the character whose moving image is to be delivered at the above-mentioned live distribution start time, and stores the character ID of that character in the "character ID" item of the operation instruction data. Which character's moving image is to be delivered at what time may be scheduled in advance by the game master and registered in the operation instruction device 300. Alternatively, the operator of the operation instruction device 300 may specify in advance, to the operation instruction device 300, the character for which the operation instruction data should be created. The character control unit 316 stores the voice data acquired in step S109 in the "voice" item of the operation instruction data.
  • the character control unit 316 stores the motion capture data acquired in step S110 in the “movement” item of the operation instruction data.
  • the character control unit 316 associates the voice data with the motion capture data so that the voice data and the motion capture data are synchronized with each other.
• the character control unit 316 stores, in the "destination" item of the operation instruction data, destination designation information such as the group identification information of a group of user terminals 100 or the address of an individual user terminal 100, so that the destination is the one or more user terminals 100 notified by the server 200 in step S107.
• in step S112, the character control unit 316 transmits the operation instruction data generated as described above to each user terminal 100 designated as the destination via the communication IF 33.
• it is desirable that the character control unit 316 acquires the voice data and motion capture data produced as the actors speak and move, immediately turns them into operation instruction data, and distributes the data to each user terminal 100 in real time, as sketched below.
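The following Python sketch outlines steps S109 to S112 as a single loop; `mic`, `mocap`, and `send_to` stand in for the microphone 3010, the motion capture device 3020, and the network transport, all of which are assumed interfaces, not APIs from the specification.

```python
# Minimal sketch (assumed capture/transport interfaces): acquire voice and
# motion capture data, pack them into synchronized operation instruction
# data, and distribute the data to each destination terminal in real time.

import time

def live_distribution_loop(mic, mocap, send_to, destinations, character_id,
                           frame_s=1 / 30):
    while True:
        voice = mic.read(frame_s)       # step S109: voice data from the actor
        motion = mocap.read(frame_s)    # step S110: motion capture data
        data = {                        # step S111: generate operation instruction data
            "destination": destinations,
            "creation source": None,    # created on the game master side
            "character ID": character_id,
            "voice": voice,
            "movement": motion,         # kept synchronized with the voice frame
        }
        for terminal in destinations:   # step S112: transmit to each destination
            send_to(terminal, data)
        time.sleep(frame_s)
```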
• in step S113, the analysis unit 116 of the user terminal 100 receives the above-mentioned operation instruction data via the communication IF 13.
  • the analysis unit 116 may receive the operation instruction data at a time previously announced to be live-streamed from the operation instruction device 300 or the server 200.
• in step S114, the analysis unit 116 analyzes the received operation instruction data, using the reception as a trigger.
• in step S115, when the game progress unit 115 receives the above-mentioned operation instruction data, it starts the live distribution part if the live distribution part is not being executed. At this time, if another part is being executed, the game progress unit 115 interrupts the progress of that part and then starts the live distribution part.
• here, it is desirable that the game progress unit 115 outputs, to the display unit 152, a message to the effect that the part being executed is temporarily suspended because the live distribution has started, and saves the progress of that part in the storage unit 120.
• the game progress unit 115 may omit step S115. In this case, the game progress unit 115 may output, to the display unit 152, a message to the effect that the distribution of the operation instruction data (that is, the body of the moving image to be live-streamed by the character) has started.
• in step S116, the game progress unit 115 advances the live distribution part by operating the character based on the operation instruction data analyzed by the analysis unit 116. Specifically, the game progress unit 115 causes the display unit 152 to display the moving image reproduction screen 800 shown in FIG. 9 and the like. The game progress unit 115 reflects, in real time, the voices and movements that the actors such as the voice actor 701 and the model 702 produce at the place where the operation instruction device 300 is installed in the speech and movement of the character 802 on the screen 800, at almost the same time as they occur.
• the analysis unit 116 and the game progress unit 115 continue rendering and reproducing the real-time moving image while operation instruction data continues to be received from the operation instruction device 300. Specifically, while no input operation is accepted from the user and operation instruction data is being received, the game progress unit 115 returns from NO in step S117 to step S113 and repeats the subsequent steps.
• in step S117, if the operation reception unit 111 receives an input operation from the user while the character is operating based on the operation instruction data, the game progress unit 115 proceeds from YES in step S117 to step S118.
  • the operation receiving unit 111 accepts an input operation for the comment input button 804 on the moving image reproduction screen 800.
• in step S118, the game progress unit 115 transmits the comment data generated in response to the above-mentioned input operation to the operation instruction device 300.
  • the game progress unit 115 may transmit the comment ID of the selected comment as comment data.
  • the game progress unit 115 may transmit the text data of the text input by the user as comment data.
  • the game progress unit 115 may transmit the voice data of the voice input by the user as comment data.
  • the game progress unit 115 may recognize the voice input by the user, convert it into text data, and transmit it as comment data.
• in step S119, the reaction processing unit 317 of the operation instruction device 300 receives, via the communication IF 33, the comment data transmitted from the user terminal 100.
• in step S120, the reaction processing unit 317 outputs the received comment data to the operator of the operation instruction device 300.
• for example, the reaction processing unit 317 displays the text data included in the comment data on the display unit 352. This allows the operator to receive feedback on how the users responded to the character he or she moved. The operator can then determine the further actions of the character according to this feedback. That is, the operation instruction device 300 returns to step S109, continues to acquire voice data and motion capture data, and continues to provide operation instruction data to the user terminal 100.
• in this way, the user terminal 100 receives operation instruction data that the operation instruction device 300 transmits after the content of the input operation on the own terminal has been received by the operation instruction device 300.
• specifically, the user terminal 100 receives operation instruction data including voice data corresponding to the content of the character's speech, motion capture data corresponding to the movement of the character, and the like. Then, the user terminal 100 continuously operates the character based on the operation instruction data. As a result, the user can experience real-time interactive interaction with the character.
  • the user terminal 100 may receive a motion command group in which one or more commands instructing the operation of the character are arranged in the order instructed by the operator of the operation instruction device 300.
  • the character for live-streaming the moving image in the live-streaming part does not have to be an NPC in the other part. That is, the present invention can also be applied to a game in which a PC operating based on a user's operation in another part performs live distribution of a moving image as an NPC in the live distribution part.
  • the user terminal 100 is configured to execute the following steps in order to improve the interest of the game based on the game program 131.
• the user terminal 100 executes a step of advancing the first part by operating a character according to the user's input operation input to the computer (user terminal 100) via the operation unit (input/output IF 14, touch screen 15, camera 17, distance measurement sensor 18), and a step of advancing the second part by operating the character based on operation instruction data that specifies the operation of the character and that is received from the NPC control device (operation instruction device 300).
  • the operation instruction data includes at least one of voice data and motion capture data.
• the user terminal 100 transmits the content of the user's input operation to the NPC control device, receives the operation instruction data determined by the NPC control device based on that content, and operates the character using the reception of the operation instruction data as a trigger.
  • FIG. 11 is a flowchart showing a basic game progress of a game executed based on the game program according to the second modification of the embodiment.
• Step S1a is the same as step S1 in FIG. 3. That is, the game progress unit 115 executes the story part (first part).
  • the story part includes a fixed scenario S11a and an acquisition scenario S12a.
  • a scene in which the main character operated by the user and the character interact with each other is included.
  • the "scenario" collected as digital data corresponds to one episode of a story related to a character, is supplied from the server 200, and is temporarily stored in the storage unit 120.
  • the game progress unit 115 reads out one scenario stored in the storage unit 120, and advances one scenario according to the input operation of the user until the end is reached.
• the scenario includes an option to be selected by the user, a response pattern of the character corresponding to the option, and the like, and different endings may be obtained in one scenario depending on which option the user selects.
  • the game progress unit 115 presents a plurality of options corresponding to the action from the main character to the character so that the user can select them, and advances the scenario according to the options selected by the user.
  • the character may be the above-mentioned NPC, and is not a target of direct operation by any user who is a game player here.
• while the story part of step S1a is in progress, in step S13a, the game progress unit 115 receives a specific action by the user. In response to this, the game progress unit 115 proceeds to step S4a, and the switch from the story part to the live distribution part is performed. As long as no specific action by the user is accepted in step S13a, the game progress unit 115 preferably continues to execute the story part of step S1a.
• the result of a specific action by the user in the story part includes, for example, the position of the user terminal 100, acquired by the above-mentioned position registration system included in the user terminal 100, reaching a predetermined position. More specifically, as described with respect to FIG. 6, the quest is realized by a location information game using the location registration information of the user terminal 100, and the user, holding the user terminal 100, moves to the position determined by the game progress unit 115. Then, when the current position information of the user terminal 100 matches the determined position, the progress of the game may be switched automatically to the live distribution part, in place of, or in addition to, causing the user to acquire the target (FIG. 8).
  • virtual location information may be applied instead of the actual location registration information of the user terminal 100 acquired by the location registration system. That is, the result of a specific action by the user in the story part may include that the virtual position of the character being operated by the user during the game becomes a predetermined position.
• alternatively, the result of a specific action by the user in the story part includes the completion of a given scenario associated with the story part. More specifically, in the story part, when the user clears one or more quests or selects options, the dialogue with the character progresses and the scenario advances. Then, when the scenario reaches one of its endings, the user has completed playing the scenario. As a result, the game may automatically switch from the story part to the live distribution part, as sketched below.
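The switching judgment of step S13a can be pictured as a simple predicate over the two results described above; the following Python sketch uses illustrative names and is not from the specification.

```python
# Minimal sketch (illustrative names): either result of the user's specific
# action -- reaching the predetermined position or completing the scenario --
# triggers the switch from the story part to the live distribution part.

def should_switch_to_live(position_matched, scenario_completed):
    return position_matched or scenario_completed

part = "story"
if should_switch_to_live(position_matched=True, scenario_completed=False):
    part = "live distribution"   # proceed to steps S4a and S5a
print(part)  # live distribution
```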
• step S4a is the same as step S4 in FIG. 3. That is, the game progress unit 115 determines whether or not operation instruction data has been received from the external device (the server 200 or the operation instruction device 300) via the network. While no operation instruction data is received from the external device, the game progress unit 115 may return from NO in step S4a to, for example, step S1a and continue to execute the story part. On the other hand, when operation instruction data is received from the external device, the game progress unit 115 proceeds from YES in step S4a to step S5a.
• Step S5a is the same as step S5 in FIG. 3. That is, the game progress unit 115 executes the live distribution part (second part). Specifically, the game progress unit 115 advances the live distribution part by operating the character according to the operation instruction data received in step S4a. In step S1a, the user simply interacts, via the UI, with a character that shows predetermined reactions in the scenario. However, in the live distribution part, the user can freely and interactively interact with a character that operates in real time based on the operation instruction data transmitted from the external device. More specifically, the analysis unit 116 receives operation instruction data including voice data and motion data that an operator (including a voice actor and a model) associated with the NPC inputs according to the content of the user's input operation.
• then, the game progress unit 115 causes the character to speak based on the voice data included in the received operation instruction data, and moves the character based on the above-mentioned motion data. This allows the user and the operator to interact in real time and interactively while their actions are synchronized. That is, the character's reaction to the above-mentioned input operation of the user can be presented to the user.
• in step S105, instead of determining whether it is the live distribution time, the server 200 preferably judges whether the specific action by the user has been accepted. That is, when this determination condition is satisfied, the server 200 and the operation instruction device 300 provide the live distribution of the live distribution part to the user terminal 100. Conversely, when the determination condition is not satisfied, the progress of the game is controlled so that the user terminal 100 does not proceed to the live distribution part.
• when the determination condition is satisfied, the user terminal 100 operates the NPC based on the operation instruction data and can execute the progress of the live distribution part. Specifically, when the operation instruction device 300 has already started the live distribution through steps S108 to S110, the user terminal 100 may receive the real-time live distribution from the middle. Alternatively, the live distribution may be started using the satisfaction of the determination condition as a trigger, so that the user terminal 100 can receive the supply of the live distribution from the beginning. It should be noted that the specific action by the user serving as the determination condition is determined in advance by, for example, the game master, and is managed by the server 200 and the operation instruction device 300.
  • the user terminal 100 operates the NPC in the first part based on the first operation instruction data downloaded in advance. Then, switching from the first part to the second part is performed according to the result of the user performing a specific action in the first part.
  • the user terminal 100 receives the second operation instruction data from the operation instruction device 300, and in the second part, operates the NPC based on the second operation instruction data. Since the NPC can be operated based on the second operation instruction data received from the operation instruction device 300, the operation of the NPC is unconventional and its expression is greatly expanded. Therefore, the user can feel the reality as if the NPC is in the real world through the relationship with the NPC during the game play. As a result, it has the effect of increasing the immersive feeling of the game and improving the interest of the game. Further, in order to move to the second part, the user needs to perform a specific action in the first part, so that the game quality can be further enhanced.
  • the character for live-streaming the video in the live-streaming part does not have to be an NPC in the other part. That is, the present invention can also be applied to a game in which a PC operating based on a user's operation in another part performs live distribution of a moving image as an NPC in the live distribution part.
• instead of the configuration that automatically switches to the live distribution part, the right to receive the live distribution for advancing the live distribution part may be granted to the user.
• the right here may be in the form of a ticket, and the user holding the ticket has the right to access the delivered live distribution.
• for such a user, the live distribution part can be advanced when the live distribution time comes.
• on the other hand, users who do not hold tickets cannot proceed with the live distribution part.
  • the live distribution time may be notified to the user terminal 100 in advance, or may be kept secret until the actual live distribution time is reached. In the former case, live distribution can be stably supplied to the user, and in the latter case, live distribution with special added value can be supplied to the user as a surprise distribution.
  • the game executed by the game system 1 according to the second embodiment is, as an example, a training simulation game including elements of a love simulation game, as in the first embodiment.
  • the game includes at least a live distribution part.
  • the game may be composed of a single live distribution part or may be composed of a plurality of parts.
  • the character whose operation is controlled by the operation instruction device 300 may be a PC or an NPC.
  • a character that operates as an NPC in the live distribution part may operate as a PC according to a user's input operation in another part.
  • the character may operate as a PC in the live distribution part according to the input operation of the user. Then, when the live distribution is started, the character may be switched to the NPC and operate according to the operation instruction data supplied from the operation instruction device 300.
  • the user terminal 100 is configured to execute the following steps in order to improve the interest of the game based on the game program 131.
• the user terminal 100 executes a step of operating a character in response to the user's input operation input to the user terminal 100 (computer) via an operation unit such as the input unit 151, a step of receiving operation instruction data that specifies the operation of the character and that is transmitted by multicast from the server 200 or the operation instruction device 300 (character control device), and a step of operating the character based on the received operation instruction data.
• here, the step of operating the character is started using, as a trigger, the reception in the receiving step of the operation instruction data transmitted by multicast.
  • the user terminal 100 may be configured to execute the following steps in order to improve the interest of the game based on the game program 131.
• specifically, the user terminal 100 executes a step of operating a character in response to the user's input operation input to the user terminal 100 via an operation unit, a step of receiving operation instruction data that specifies the operation of the character and that has been transmitted from the server 200 or the operation instruction device 300 to the user terminal 100, and a step of operating the character based on the received operation instruction data.
  • "the operation instruction data was transmitted to the user terminal 100" means, for example, that the operation instruction data was transmitted by unicast.
• when the destination designation information includes an address unique to the own terminal, the user terminal 100 can determine that the operation instruction data is addressed to the own terminal, that is, that it was transmitted by unicast.
• in this case, it is preferable that the step of operating the character is started using, as a trigger, the reception in the receiving step of operation instruction data that was transmitted by unicast and to which identification information of the user or the user terminal is not associated.
  • the analysis unit 116 further analyzes the meta information of the operation instruction data.
  • the meta information is information that defines the properties of the action instruction data separately from the contents of the action instruction data.
  • the meta information is, for example, destination designation information, creation source information, and the like.
• for example, the analysis unit 116 determines whether the operation instruction data was multicast-transmitted or unicast-transmitted based on the destination designation information of the transmitted operation instruction data.
  • Multicast transmission means that the server 200 or the operation instruction device 300 transmits the same information to a predetermined group including the own terminal. For example, the operation instruction data in which "ALL" is set as the destination designation information is sent to all the user terminals running the application of this game.
  • Unicast transmission means that the server 200 or the operation instruction device 300 transmits information to its own terminal.
  • the destination designation information is stored in, for example, the item “destination” of the operation instruction data shown in FIG.
• it can be assumed that the operation instruction data transmitted by multicast was created by the game master, not by a specific user.
  • the above-mentioned operation instruction data may be created by a device belonging to a provider (operating organization) that provides the service of the game in the game system 1.
• since the server 200 or the operation instruction device 300 knows the information about all the users and user terminals, it can create operation instruction data and transmit it by multicast to the user terminals while the application is running. Therefore, the user terminal 100 can determine that operation instruction data transmitted by multicast was created by the game master.
  • the analysis unit 116 may have the following functions as an example. Specifically, the analysis unit 116 renders the operation instruction data transmitted by multicast. Then, the analysis unit 116 instructs the game progress unit 115 to operate the character based on the rendering result. More preferably, the analysis unit 116 renders the operation instruction data in real time by using the reception of the operation instruction data transmitted by multicast as a trigger. Subsequently, the analysis unit 116 instructs the game progress unit 115 to operate the character based on the rendering result.
• the analysis unit 116 may have the following functions in place of, or in addition to, the above-mentioned functions. Specifically, the analysis unit 116 renders unicast-transmitted operation instruction data, for example, operation instruction data to which information related to a specific user, such as a user ID or a user terminal ID, is not associated. Then, the analysis unit 116 instructs the game progress unit 115 to operate the character based on the rendering result. More preferably, when the analysis unit 116 determines that the operation instruction data was transmitted by unicast, it determines, based on the creation source information of the operation instruction data, whether the data was created by a specific user terminal.
  • the analysis unit 116 determines that the operation instruction data to which the information related to the specific user is associated with the creation source information is created by the specific user terminal.
  • the analysis unit 116 determines that the operation instruction data in which the value of the creation source information is empty and the operation instruction data to which the creation source information is not associated are not created by a specific user terminal.
  • the operation instruction data that is not created by a specific user terminal is considered to have been created by the game master.
  • the analysis unit 116 renders the operation instruction data in real time, triggered by the reception of the operation instruction data that is unicast and is not associated with the related information of the specific user as the creation source information. Subsequently, the analysis unit 116 instructs the game progress unit 115 to operate the character based on the rendering result.
  • the user terminal 100 can operate the character in real time based on the operation instruction data delivered from the game master, reflecting the intention of the game master. Therefore, the character can be provided with a sense of reality as if the character really exists there.
  • FIG. 11 is a flowchart showing a flow of processing for analyzing operation instruction data executed by the user terminal 100 according to the present embodiment.
• the processing executed by each device of the game system 1 is substantially the same as the processing shown in FIG. 10.
• in step S114, the user terminal 100 analyzes the operation instruction data as shown below.
• in step S201, the analysis unit 116 acquires the destination designation information from the "destination" item of the operation instruction data.
• in step S202, the analysis unit 116 determines, based on the destination designation information, whether or not the operation instruction data was transmitted by multicast.
• when the destination designation information is, for example, group identification information or a symbol such as "ALL", the analysis unit 116 determines that the operation instruction data was transmitted by multicast.
• in that case, the analysis unit 116 proceeds from YES in step S202 to step S115 and the subsequent steps shown in FIG. 10. That is, the game progress unit 115, triggered by the reception in the own terminal of the operation instruction data transmitted by multicast, causes the character to operate in real time based on the operation instruction data.
• on the other hand, when the destination designation information is, for example, an address unique to the own terminal, the analysis unit 116 determines that the operation instruction data was transmitted by unicast.
• in this case, the analysis unit 116 may determine that it is not necessary to reproduce the data in real time, store the received operation instruction data in the storage unit 120, and return to step S103 shown in FIG. 10.
  • meanwhile, the game progress unit 115 may advance a part that operates the character according to input operations entered by the user on the user terminal 100.
  • alternatively, the analysis unit 116 may proceed from NO in step S202 to step S203.
  • in step S203, the analysis unit 116 acquires the creation source information from the item "creation source" of the operation instruction data.
  • in step S204, the analysis unit 116 determines whether or not the creation source information indicates user-related information related to a specific user.
  • the user-related information is, for example, a user ID, a terminal ID of the user terminal 100, an address of the user terminal 100, or the like.
  • when the creation source information does not indicate such user-related information, the analysis unit 116 determines that the operation instruction data was created not by a specific user but by the game master. The analysis unit 116 then proceeds from NO in step S204 to step S115 and the subsequent steps shown in FIG. That is, triggered by the reception at its own terminal of operation instruction data to which no user-related information is associated, the game progress unit 115 causes the character to operate in real time based on that data.
  • when the creation source information does indicate such user-related information, the analysis unit 116 determines that the operation instruction data was created by the specific user. Because the data was not supplied from the game master, the analysis unit 116 determines that it need not be reproduced in real time, and proceeds from YES in step S204 to step S205.
  • in step S205, the analysis unit 116 saves the operation instruction data created by the specific user in the storage unit 120, and returns to step S103 and the subsequent steps shown in FIG. This flow is sketched below.
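  • The following is a hedged sketch of steps S201 to S205 as a single routine. The `is_multicast` predicate and the `game_progress`/`storage` interfaces are assumptions introduced for illustration, not part of the source.

```python
def analyze_operation_instruction(instr, game_progress, storage,
                                  is_multicast) -> None:
    # S201/S202: inspect the destination designation information.
    if is_multicast(instr.destination):
        # YES in S202: act the character in real time (step S115 onward).
        game_progress.act_character(instr)
        return
    # NO in S202 -> S203/S204: inspect the creation source information.
    if not instr.creation_source:
        # NO in S204: data supplied by the game master -> real time.
        game_progress.act_character(instr)
    else:
        # YES in S204 -> S205: created by a specific user; no real-time
        # reproduction is needed, so save it and return to step S103.
        storage.save(instr)
```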
  • in this way, in addition to operating the character according to the user's input operations, the character can be operated based on the operation instruction data received from the server 200 or the operation instruction device 300. The character's movements are therefore not limited to predetermined patterns, and their expression is greatly expanded. As a result, the user can watch the character's movements and feel a reality, as if the character were in the real world. Through the experience of interacting with a character with such a rich sense of reality, the user feels more attached to the character, and can play another part that operates the character with even greater interest. This has the effect of increasing the immersive feeling of the game and improving its interest.
  • the analysis unit 116 may omit the step of determining, based on the destination designation information, whether or not the operation instruction data was multicast-transmitted, and may instead perform only the step of determining, based on the creation source information, whether or not user-related information of a specific user is associated with the operation instruction data.
  • the game (hereinafter, this game) executed by the game system 1 according to the third embodiment is, as an example, a training simulation game including elements of a love simulation game, as in the first and second embodiments.
  • the game includes at least a live distribution part.
  • the game may be composed of a single live distribution part or may be composed of a plurality of parts. In one example, it may be composed of a combination of a story part and a live distribution part as shown in FIGS. 3 and 11. Further, in the live distribution part, the character whose operation is controlled by the operation instruction device 300 may be a PC or an NPC.
  • a character that operates as an NPC in the live distribution part may operate as a PC in another part according to an input operation of a user who is a game player.
  • the character may operate as a PC in the live distribution part according to an input operation of a user who is a game player. Then, when the live distribution is started, the character may be switched to the NPC and operate according to the operation instruction data supplied from the operation instruction device 300.
  • even after the real-time live distribution has once ended, the user can request the progress of the completed live distribution part, and the live distribution part can be advanced again based on the received operation instruction data.
  • the user can thus look back at the live stream again, and even a user who missed it can still watch it.
  • the character here is assumed to be an NPC that is not a target of direct operation by a user who is a game player.
  • the user terminal 100 is configured to execute the following steps in order to improve the interest of the game based on the game program 131.
  • specifically, the user terminal 100 (computer) executes: a step of requesting the progress of a completed live distribution part via an operation unit such as the input unit 151; a step of receiving, from the server 200 or the operation instruction device 300 (character control device), the recorded operation instruction data related to the completed live distribution part; and a step of advancing the completed live distribution part by operating the NPC based on the recorded operation instruction data.
  • the recorded operation instruction data includes motion data and voice data input by the operator associated with the NPC.
  • the operator includes not only a model and a voice actor but also an operator who performs some operation on the operation instruction device 300 (character control device), but does not include a user who is a game player.
  • the recorded operation instruction data is preferably stored in the storage unit 220 of the server 200 or the storage unit 320 of the operation instruction device 300, and delivered again to the user terminal 100 in response to a request from the user terminal 100.
  • the progress of the completed live distribution part based on the recorded operation instruction data differs depending on whether or not the user has a track record of progressing the live distribution part in real time. Specifically, when it is determined that the user has such a track record, it is preferable to advance again the same live distribution part as the one the user progressed in real time (return delivery). In return delivery, a selective progression of the live delivery part is preferably performed. On the other hand, when it is determined that the user has no such track record, it is preferable to proceed with the live distribution part in a progress mode different from the one progressed in real time (missed delivery).
  • missed delivery includes the case where the real-time live distribution part could have been progressed but was not actually progressed. For missed delivery, a limited progression of the live stream part is preferably performed.
  • the analysis unit 116 further receives the user action history information in the live distribution part.
  • the user action history information is a data set of the user's actions recorded through input operations during the progress of the live distribution part, separate from the contents of the recorded operation instruction data.
  • the user action history information is preferably associated with the recorded operation instruction data and stored in the storage unit 220 of the server 200 or the storage unit 320 of the operation instruction device 300.
  • the user behavior history information may be stored in the storage unit 120 of the user terminal 100.
  • FIG. 13 is a diagram showing an example of a data structure of user behavior history information.
  • the user action history information includes, for example, items such as the action time, action type, and action details of the actions the user took in the live distribution part, and is associated with a user ID that identifies the user.
  • the item "action time" is time information indicating when the user acted in the live distribution part; the item "action type" is a type indicating the user's action; and the item "action details" is the concrete content of the user's action.
  • the action types may include, for example: consumption of valuable data by the user's input operation (for example, tipping, or billing through the purchase of items); comment input; and changing items worn by the character, such as clothing (so-called dress-up).
  • an action may include selection of a time for later playing back a specific progress portion of the live distribution part (for example, a recording operation of the specific progress portion).
  • such actions may include the acquisition of rewards, points, etc. during the live distribution part.
  • the user action history information is preferably associated with the data structure of the operation instruction data described in FIG. 4 and with the data structure of the game information described in FIG. 5. It should be understood by those skilled in the art that these data structures are merely examples and are not limited thereto. A sketch of such a history record follows below.
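  • As a concrete illustration of the data structure in FIG. 13, the following is a minimal sketch assuming Python dataclasses. The field names, the time unit, and the example entries are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class UserAction:
    # Items of FIG. 13; field names are illustrative.
    action_time: float   # seconds from the start of the live distribution part
    action_type: str     # e.g. "tip", "comment", "dress_up"
    action_detail: str   # concrete content of the action

@dataclass
class UserActionHistory:
    user_id: str                 # identifies the acting user
    actions: list[UserAction]

# Example: a user tipped at 2:45 and later changed the character's clothing.
history = UserActionHistory(
    user_id="user-123",
    actions=[
        UserAction(165.0, "tip", "threw 100 coins"),
        UserAction(310.0, "dress_up", "equipped rabbit-ear band"),
    ],
)
```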
  • FIG. 14 is a flowchart showing an example of a basic game progress of a game executed based on the game program according to the present embodiment.
  • the processing flow applies to the situation after the end of the live distribution time, when the real-time live distribution part has already been completed.
  • in step S301, the user terminal 100 newly requests the progress of the completed live distribution part via an input operation on the input unit 151 (operation unit).
  • in step S302, in response to the request in step S301, the user terminal 100 receives the recorded operation instruction data related to the completed live distribution part from the server 200 or the operation instruction device 300 (character control device).
  • the recorded action instruction data includes motion data and voice data input by the operator associated with the character.
  • the user terminal 100 may receive various progress record data acquired and recorded along with the movement of the character during the progress of the real-time live distribution part.
  • the progress record data may include viewer behavior data in which the user who participated in the real-time live distribution part behaves in accordance with the movement of the character.
  • the viewer behavior data is data including a record of the behavior during the live of all the users (that is, the viewers who participated in the live) who have advanced the real-time live distribution part in real time.
  • the viewer behavior data preferably includes messaging content, such as text messages and icons, sent by the viewers to the character in real time during the live.
  • the recorded operation instruction data and the progress record data may be received by the user terminal 100 as separate data, and each may be analyzed (rendered) separately.
  • alternatively, at the server 200 or the operation instruction device 300, the previously recorded operation instruction data and the viewer behavior data may be combined, and the combined data set may be received by the user terminal 100 at one time. Receiving the combined data set reduces the load of the subsequent data analysis (rendering) on the user terminal 100.
  • in that case, the progress record data is combined with the recorded operation instruction data (that is, the recorded operation instruction data includes the progress record data), as sketched below.
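  • The following is a sketch of the server-side combination just described: the recorded operation instruction data and the viewer behavior data are merged into a single payload so that the user terminal analyzes (renders) only one stream. The `timestamp` attribute and all names are assumptions for illustration.

```python
def combine_for_delivery(recorded_instructions: list,
                         viewer_behavior: list) -> dict:
    """Bundle both recordings into one payload, ordered by timestamp,
    so the client only has to analyze (render) one stream."""
    merged = sorted(
        [("instruction", i.timestamp, i) for i in recorded_instructions] +
        [("behavior", b.timestamp, b) for b in viewer_behavior],
        key=lambda entry: entry[1],
    )
    return {"combined": True, "events": merged}
```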
  • in step S303, the game progress unit 115 determines whether or not the user has a track record of progressing the live distribution part in real time.
  • the determination may be performed, for example, by referring to the item "destination" shown in FIG. 4 and checking whether there is a record of the operation instruction data having been sent to the user terminal 100.
  • the determination may also be executed by referring to the item "play history" shown in FIG. 5 and checking whether the status is "played", or by referring to the item "distribution history" and checking whether there is a record of live distribution from the character in the past.
  • when the recorded operation instruction data is already stored in the storage unit 120 of the user terminal 100, it may be determined that the live distribution part has already been advanced in real time.
  • the determination may also be performed by combining these checks, or by any other method; a combined check is sketched below.
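  • The following is a hedged sketch of the step S303 determination, combining the signals named above. The predicate names (`was_sent_to`, `play_status`, and so on) are assumptions for illustration.

```python
def has_realtime_track_record(user_id: str, live_id: str,
                              send_log, game_info, local_store) -> bool:
    # 1) Item "destination" (FIG. 4): was the data ever sent to this terminal?
    if send_log.was_sent_to(user_id, live_id):
        return True
    # 2) Items "play history" / "distribution history" (FIG. 5).
    if game_info.play_status(user_id, live_id) == "played":
        return True
    if game_info.has_past_distribution(user_id, live_id):
        return True
    # 3) Recorded operation instruction data already in the storage unit 120.
    return local_store.has_recorded_data(live_id)
```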
  • if it is determined in step S303 that the user has a track record of advancing the live distribution part in real time (YES), the progress of the completed live distribution part is a "return delivery". On the other hand, if it is determined in step S303 that the user has no such track record (NO), the progress of the completed live distribution part is a "missed delivery". As mentioned above, the user experience differs between return delivery and missed delivery.
  • when it is determined in step S303 that the user has a track record of advancing the live distribution part in real time, the processing flow proceeds from YES in step S303 to step S304.
  • in step S304, the analysis unit 116 acquires and analyzes the user action history information of the live distribution part shown in FIG. 13.
  • the user action history information may be acquired from the server 200 or the operation instruction device 300, or may be used directly when it is already stored in the storage unit 120 of the user terminal 100.
  • in step S305, the game progress unit 115 re-progresses the completed live distribution part (that is, the above-mentioned return delivery).
  • the recorded operation instruction data and the user action history information analyzed in step S304 are used for re-progressing the live distribution part.
  • for example, if the user acquired an item (for example, a rabbit-ear "Usamimi" band) as a reward, the NPC is made to operate wearing that item (that is, wearing the Usamimi band).
  • in this way, the live distribution part may be re-progressed. That is, the re-progressed live distribution part reflects the user action history information and the reward information; it is similar to the live distribution part that progressed in real time, yet unique to the user.
  • the re-progress of the live distribution part may be executed selectively according to time information specified by the user's input operation via the operation unit, the time information having been recorded when the live distribution part was first advanced.
  • for example, the user can specify a specific action time, and the live distribution part can be selectively advanced from that point. If, for instance, the user input a comment 2 minutes and 45 seconds after the start of the live distribution part, the user can advance the live distribution part again by specifying the timing from 2 minutes and 45 seconds onward. Such selective progression is preferably made feasible based on the recorded action times corresponding to actions such as the consumption of valuable data by the user's input operations, or changes of items such as the character's clothing.
  • likewise, the live distribution part can be progressed selectively over a period by using the action time data. For example, if the user selects the period from 2 minutes 45 seconds to 5 minutes 10 seconds after the start of the live distribution part, the user can re-progress the live distribution part over that period. This selective replay is sketched below.
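  • The following is a minimal sketch of the selective re-progress: recorded events are filtered by a user-designated time span, where candidate times come from the recorded action times. The event structure and the `replay` helper are assumptions for illustration.

```python
from typing import Optional

def select_replay_span(events: list, start_s: float,
                       end_s: Optional[float] = None) -> list:
    """Keep only recorded events whose timestamp falls in the chosen span."""
    return [e for e in events
            if e.timestamp >= start_s
            and (end_s is None or e.timestamp <= end_s)]

# Example: replay from the comment input at 2:45 onward:
#   replay(select_replay_span(recorded_events, 165.0))
# Example: replay only the period 2:45-5:10 selected by the user:
#   replay(select_replay_span(recorded_events, 165.0, 310.0))
```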
  • when it is determined in step S303 that the user has no track record of advancing the live distribution part in real time, the processing flow proceeds from NO in step S303 to step S306.
  • in step S306, the game progress unit 115 executes a limited progression of the completed live distribution part (that is, the above-mentioned missed delivery).
  • the missed delivery is limited based on the following idea: the user had the right to receive the live distribution but can be considered to have waived that right, so it is not necessary to reproduce and present the entire live distribution to the user.
  • in the missed delivery, the progress of the live distribution part is executed using the recorded operation instruction data.
  • for example, when a reward item was acquired, in the live distribution part that progressed in real time the image was synthesized so that the NPC operated wearing that item; that is, the operation mode of the NPC was associated with the reward.
  • in the missed delivery, by contrast, the reward is not associated with the operation mode of the NPC. That is, the image composition processing that makes the NPC wear the item and operate is not performed. In this respect, the progress of the completed live distribution part is limited: it does not reflect the reward information and is not unique to the user.
  • in the missed delivery, unlike the live distribution part that progressed in real time, it is also preferable to limit the user actions that can be accepted. Specifically, in the live distribution part that progressed in real time, consumption of valuable data by user input operations (for example, tipping or billing through the purchase of items) could be accepted, whereas in the progress of the completed live distribution part such consumption of valuable data may be restricted so as not to be accepted. More specifically, in the live distribution part progressed in real time, a user interface (UI) including a button and a screen for executing the consumption of valuable data was displayed on the display unit 152, and the user could execute the consumption of valuable data through input operations on that UI. In the missed delivery, such a UI is preferably hidden so that the user cannot perform the input operation at all. These restrictions are sketched below.
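  • The following sketch expresses the mode-dependent restriction just described: only the real-time mode shows the valuable-data UI and accepts consumption. The enum values and interfaces are assumptions for illustration.

```python
from enum import Enum

class DeliveryMode(Enum):
    REALTIME = "realtime"
    RETURN = "return"       # re-delivery for users with a track record
    MISSED = "missed"       # limited progression

def valuable_data_ui_visible(mode: DeliveryMode) -> bool:
    # Only real-time live distribution shows the tipping/billing UI.
    return mode is DeliveryMode.REALTIME

def accept_consumption(mode: DeliveryMode, request) -> bool:
    if mode is not DeliveryMode.REALTIME:
        return False        # consumption of valuable data is not accepted
    return request.is_valid()
```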
  • even in the missed delivery, the user can play a specific scenario associated with the live delivery part, just as in the live delivery part that progresses in real time.
  • such specific scenarios include, for example, user-participatory events, which provide the user with an interactive experience with the character.
  • user-participatory events include questionnaires provided by the character, quizzes given by the character, battles with the character (for example, rock-paper-scissors or bingo), and the like. As in the case of real-time live distribution, the participation result of such a user-participatory event is fed back to the user in the missed delivery.
  • in the case of a quiz, for example, the result of the correctness determination is fed back to the user.
  • in the return delivery, the user's answer may be compared with the answer the user gave while participating in the live, and if the answers differ, a message such as "The answer is different from the one during the live" may be displayed and output to the user terminal 100.
  • on the other hand, acquisition of the predetermined game points linked to the above feedback may be restricted for the user.
  • in the live distribution part progressed in real time, the predetermined game points may be associated with the user and added to the points the user owns.
  • in the progress of the completed live distribution part, by contrast, such points may not be associated with the user.
  • regarding the points owned by the user, in the case of a game in which a plurality of users who are game players are ranked based on points, for example, a user's advancing the completed live distribution part thus does not affect the ranking.
  • the user terminal 100 may request the progress of the completed second part (live distribution part) again. That is, it is preferable that the return delivery or the missed delivery can be repeatedly executed a plurality of times. In this case, the processing flow returns to step S301.
  • in this way, with the user terminal 100, even after the live distribution part has progressed in real time, the user can proceed with the live distribution part again in various modes. As a result, the user becomes more attached to the character through the experience of realistic interaction with the character, and can play another part that operates the character with even greater interest. This has the effect of increasing the immersive feeling of the game and improving its interest.
  • <Modification 1> In the above, whether the progress of the completed live distribution part is a return delivery or a missed delivery is determined based on whether or not the user has a track record of advancing the live distribution part in real time.
  • alternatively, the user may be allowed to select between the return delivery and the missed delivery. Or, regardless of the presence or absence of such a track record, only the missed delivery may be provided to the user.
  • <Modification 2> In the above, the progress of the completed second part could be requested again; that is, the return delivery or the missed delivery could be repeatedly executed a plurality of times.
  • in this modification, the second and subsequent return deliveries or missed deliveries are made to correspond to the record of the previous return delivery or missed delivery.
  • specifically, first delivery history data is stored in the storage unit 220 of the server 200 or the storage unit 320 of the operation instruction device 300. After that, when the recorded operation instruction data related to the completed live distribution part is requested again from the user terminal 100, the first delivery history data is delivered from the server 200 or the operation instruction device 300 (character control device) together with the recorded operation instruction data.
  • the received first delivery history data is referred to, and if the first return delivery or missed delivery was performed only partway, the user terminal 100 resumes the progress of the second return delivery or missed delivery from the continuation. As a result, the user can perform the return delivery or missed delivery efficiently.
  • if the first delivery was a return delivery, a return delivery is preferably executed from the second time onward; if the first delivery was a missed delivery, a missed delivery is preferably executed from the second time onward. Further, when the recorded operation instruction data already exists in the user terminal 100, the user terminal 100 need not receive it again, which saves the amount of data received by the user terminal 100. This resume-and-reuse behavior is sketched below.
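  • The following is a sketch, under assumed storage interfaces, of Modification 2: the previous delivery history decides where to resume and which mode to keep, and locally cached recorded data is reused instead of being downloaded again.

```python
def start_completed_part(user_id: str, live_id: str,
                         server, local_store):
    history = server.get_distribution_history(user_id, live_id)
    # Reuse recorded data already on the terminal to save received data.
    data = (local_store.load(live_id)
            if local_store.has_recorded_data(live_id)
            else server.fetch_recorded_data(live_id))
    # Resume from the continuation if the previous run stopped partway.
    resume_at = history.last_position_s if history else 0.0
    # The second and subsequent runs keep the first run's mode
    # (return delivery stays return delivery, missed stays missed).
    mode = history.mode if history else None
    return data, resume_at, mode
```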
  • <Modification 3> In the above, whether the progress of the completed live distribution part is a return delivery or a missed delivery is determined according to the user's track record of advancing the live distribution part in real time (step S303 in FIG. 14).
  • in this modification, when it is determined that the user progressed the live distribution part in real time only partway, the progress of the completed live distribution part is preferably resumed from the continuation.
  • the record of how far the user advanced the live distribution part in real time can be determined from the user action history information described above with reference to FIG. 13. That is, the user action history information records how far the user progressed with respect to a specific live distribution part.
  • the resumed progress of the completed live distribution part is preferably a missed delivery, that is, a limited progression. As a result, the user can execute the missed delivery efficiently.
  • FIG. 15 shows an example of a screen displayed on the display unit 152 of the user terminal 100 based on the game program according to the present embodiment, and an example of a transition between these screens.
  • the screens include a home screen 850A, a live selection screen 850B for live distribution, a missed selection screen 850C for missed delivery, and a game screen 850D for the location information game part.
  • the home screen 850A can transition to the live selection screen 850B and the game screen 850D.
  • the live selection screen 850B can transition to the home screen 850A, the missed selection screen 850C, and the game screen 850D.
  • the missed selection screen 850C can transition to the live selection screen 850B.
  • the game screen 850D can transition to the home screen 850A and the live selection screen 850B.
  • the actual distribution screen (not shown) can be transitioned to from the live selection screen 850B and the missed selection screen 850C. This transition graph is sketched below.
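  • The transitions above can be expressed as a small adjacency table, as in the following sketch; the screen keys are shorthand assumptions, with "distribution" standing for the actual distribution screen (not shown).

```python
TRANSITIONS: dict[str, set[str]] = {
    "home_850A":   {"live_850B", "game_850D"},
    "live_850B":   {"home_850A", "missed_850C", "game_850D", "distribution"},
    "missed_850C": {"live_850B", "distribution"},
    "game_850D":   {"home_850A", "live_850B"},
}

def can_transition(src: str, dst: str) -> bool:
    return dst in TRANSITIONS.get(src, set())

# There is deliberately no direct route to the missed selection screen
# from the home screen or the game screen (see the discussion below).
assert not can_transition("home_850A", "missed_850C")
assert not can_transition("game_850D", "missed_850C")
```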
  • the home screen 850A displays, on the display unit 152 of the user terminal 100, various menus for advancing the location information game part (first part) or the live distribution part (second part).
  • when the game progress unit 115 receives an input operation for starting the location information game part and/or the live distribution part, the game progress unit 115 first displays the home screen 850A.
  • the home screen 850A includes a "live" icon 852 for transitioning to the live selection screen 850B and an "outing" icon 854 for transitioning to the game screen 850D of the location information game.
  • upon receiving an input operation on the "live" icon 852 on the home screen 850A, the game progress unit 115 causes the display unit 152 to display the live selection screen 850B.
  • the live selection screen 850B presents the user with announcement information on lives that can be distributed (live announcement information).
  • the live announcement information includes at least the live delivery date and time.
  • the live announcement information may also include free/paid live information, an advertisement image including an image of a character appearing in the live, and the like.
  • the live selection screen 850B may display announcement information regarding a live distribution to be distributed in the near future as a pop-up 856 on the live selection screen.
  • the server 200 searches for one or more user terminals 100 having the right to receive the live distribution.
  • the right to receive the live distribution is granted when the user terminal 100 satisfies a predetermined condition.
  • the predetermined conditions include: that the consideration for receiving the live distribution has been paid (for example, the user holds a ticket); that the scenario has been cleared in the location information game part; and that the current position of the user terminal 100, or of a character including the main character in the location information game part, is within a specific area/position where a live distribution source or the like is located. The corresponding live announcement information is displayed on the user terminals 100 that have the right to receive the live distribution. A sketch of such a condition check follows below.
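  • The following sketch expresses the right-granting check. The source lists the conditions without stating whether they are alternatives or cumulative; the sketch treats them as alternatives, and all interfaces are assumptions for illustration.

```python
def has_live_right(user, live, current_pos) -> bool:
    # Consideration paid, e.g. the user holds a ticket for this live.
    if user.holds_ticket(live.id):
        return True
    # The relevant scenario was cleared in the location information game part.
    if user.cleared_scenario(live.scenario_id):
        return True
    # The terminal (or the character including the main character) is in the
    # specific area/position where the live distribution source is located.
    return live.area.contains(current_pos)
```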
  • the user terminal 100 accepts a live playback operation (for example, a selection operation for a live that has reached the live distribution time on the live selection screen 850B). Specifically, it is better to accept touch operations on live images. Accordingly, the game progress unit 115 shifts the display unit 152 to the actual distribution screen (not shown). As a result, the user terminal 100 can advance the live distribution part and advance the live viewing process in real time.
  • when the live viewing process is executed, the video playback unit 117 operates the character in the live distribution part based on the received operation instruction data. That is, the video playback unit 117 uses the operation instruction data in the live distribution part to generate a video playback screen (for example, a video as shown in FIG. 9) including the character to be operated, and displays it on the display unit 152.
  • the character may be either an NPC or a PC.
  • the live selection screen 850B may display, on the display unit 152, a "back (x)" icon 858 for transitioning to the screen displayed immediately before, and a "missed delivery" icon 860 for transitioning to the missed selection screen 850C.
  • upon receiving an input operation on the "back (x)" icon 858, the game progress unit 115 shifts from the live selection screen 850B to the screen displayed immediately before. Specifically, the game progress unit 115 shifts to the home screen 850A when the screen displayed immediately before was the home screen 850A, and to the game screen 850D when it was the game screen 850D. That is, a history-back function is preferably executed for the "back (x)" icon 858.
  • the broken-line arrows shown in FIG. 15 indicate that, in response to the input operation on the "back (x)" icon 858, the screen selectively transitions from the live selection screen 850B to either the home screen 850A or the game screen 850D.
  • upon receiving an input operation on the "missed delivery" icon 860, the game progress unit 115 shifts from the live selection screen 850B to the missed selection screen 850C.
  • the missed selection screen 850C displays, among the delivered information about one or more lives delivered in the past, the delivered information for which the user did not progress the live distribution part in real time.
  • the operation unit 151 of the user terminal 100 accepts input operations (for example, touch operations) on the delivered live information displayed on the missed selection screen 850C, for example, an image 880 including a character appearing in the live.
  • in response, the game progress unit 115 can re-progress the completed live distribution part after the end of the live distribution part.
  • the re-progress here is preferably, but not necessarily, a missed delivery.
  • the delivered information about a live may further include the playback time 862 of each delivered live, the period remaining until the end of delivery (days or the like) 864, information 866 indicating how many days ago the live was delivered, the past delivery date and time, and the like.
  • the missed selection screen 850C includes a "back (←)" icon 868 for transitioning to the live selection screen 850B. In response to an input operation on the "back (←)" icon 868, the game progress unit 115 transitions to the live selection screen 850B.
  • the missed selection screen 850C is preferably, though not necessarily, transitioned to only from the live selection screen 850B, and not directly from the home screen 850A or the game screen 850D.
  • this is because the missed delivery is performed for users who missed the live distribution, and is merely a function accompanying the live distribution function.
  • one of the purposes of this game is to enhance the fun of the game by allowing the user to watch the live stream in real time, support the character in real time, and deepen the interaction with the character. For this reason, guiding the user to watch the live distribution in real time should be prioritized over the missed delivery, in which real-time interaction with the character (player) is not possible. Therefore, in the present embodiment, it is preferable not to transition directly from the home screen 850A or the game screen 850D to the missed selection screen 850C.
  • in the above, the missed selection screen 850C displays the delivered information for which the user did not progress the live distribution part in real time; instead, the delivered information about all lives delivered in the past may be displayed in a list, live by live.
  • in that case, either the return delivery or the missed delivery is preferably executed depending on whether or not the user progressed the corresponding live distribution part in real time. Specifically, when it is determined that the user has a track record of advancing the live distribution part in real time, the above-mentioned return delivery is preferably used; when it is determined that the user has no such track record, the missed delivery is preferably used. As mentioned above, the return delivery and the missed delivery provide different user experiences.
  • the game screen 850D is a screen displayed on the display unit 152 in the location information game part.
  • the game progress unit 115 presents a quest to the user while the scenario is in progress in the location information game part.
  • the game progress unit 115 may realize the quest by a location information game using the location registration information of the user terminal 100.
  • the game progress unit 115 acquires the current position information (for example, address information, latitude / longitude information, etc.) of the user terminal 100 from the position registration system (not shown) provided in the user terminal 100. Then, based on the acquired current position information, a map 874 around the place where the user terminal 100 is located is generated and arranged on the game screen 850D.
  • the map data from which the map 874 is generated may be stored in advance in the storage unit 120 of the user terminal 100, or may be acquired via the network from another service providing device (not shown) that provides map data.
  • the game progress unit 115 determines a position (an address, latitude/longitude, or the like) at which a privilege can be obtained, and superimposes a portal icon 876 on the corresponding position on the map.
  • the user can carry the user terminal 100, move to the position of the portal icon 876 on the map 874, clear the game associated with the portal, obtain the privilege, and thereby clear the quest.
  • the position of the portal may be determined randomly by the game progress unit 115, or may be predetermined according to the contents of the scenario, quest, and privilege. A sketch of the position check follows below.
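  • The following is a minimal sketch of granting the privilege when the terminal's current position is near the portal. The 50 m radius, the flat-earth distance approximation, and the interfaces are all assumptions for illustration.

```python
import math

def near_portal(cur_lat: float, cur_lon: float,
                portal_lat: float, portal_lon: float,
                radius_m: float = 50.0) -> bool:
    # Equirectangular approximation, adequate at quest scales.
    mean_lat = math.radians((cur_lat + portal_lat) / 2)
    dx = math.radians(cur_lon - portal_lon) * math.cos(mean_lat) * 6_371_000
    dy = math.radians(cur_lat - portal_lat) * 6_371_000
    return math.hypot(dx, dy) <= radius_m

def try_clear_quest(user, portal, cur_lat, cur_lon) -> bool:
    if near_portal(cur_lat, cur_lon, portal.lat, portal.lon):
        user.grant_privilege(portal.privilege)  # e.g. a live-distribution ticket
        return True
    return False
```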
  • the privilege may be in the form of a ticket related to the right to receive the above-mentioned live distribution. That is, only the user who has acquired this privilege can watch the corresponding live distribution through the live selection screen 850B in the later live distribution part.
  • the location information game part may also be realized without using the location registration information of the user terminal 100.
  • in that case, virtual position information on the map 874 is used instead of the actual location registration information of the user terminal 100.
  • the game screen 850D displays a "home" icon 878 and a "live" icon 872.
  • upon receiving an input operation on the "home" icon 878, the game progress unit 115 causes the display unit 152 to display the home screen 850A. Likewise, upon receiving an input operation on the "live" icon 872, the game progress unit 115 causes the display unit 152 to display the live selection screen 850B.
  • the game screen 850D can thus transition to the home screen 850A or the live selection screen 850B; that is, the live selection screen 850B can be transitioned to not only from the home screen 850A but also from the game screen 850D. As described above, for the purpose of inducing the user to watch the live distribution in real time, the game screen 850D is preferably configured not to transition directly to the missed selection screen 850C.
  • the control blocks of the control unit 110 (particularly, the operation reception unit 111, the display control unit 112, the UI control unit 113, the animation generation unit 114, the game progress unit 115, the analysis unit 116, and the progress information generation unit 117), the control blocks of the control unit 210 (particularly, the progress support unit 211 and the shared support unit 212), and the control blocks of the control unit 310 (particularly, the operation reception unit 311, the display control unit 312, the UI control unit 313, the animation generation unit 314, the progress simulation unit 315, the character control unit 316, and the reaction processing unit 317) may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
  • in the latter case, the control unit 110, the control unit 210, or the control unit 310, or an information processing device including a plurality of these units, includes a CPU that executes the instructions of a program, which is software realizing each function; a ROM (Read Only Memory) or storage device (referred to as a "recording medium") in which the program and various data are recorded so as to be readable by a computer (or CPU); a RAM (Random Access Memory) into which the program is expanded; and the like. The object of the present invention is achieved by the computer (or CPU) reading the program from the recording medium and executing it.
  • as the recording medium, a "non-transitory tangible medium", for example, a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit, can be used.
  • the program may be supplied to the computer via any transmission medium (communication network, broadcast wave, etc.) capable of transmitting the program. It should be noted that one aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the above program is embodied by electronic transmission.
  • (Item 1) An information processing method for the progress of a game by a computer comprising a processor (10), a memory (11), and an operation unit (the communication IF 13, the input/output IF 14, the touch screen 15, the camera 17, the distance measurement sensor 18, the input unit 151, and the like) was described. The method includes: a step (step S1a) of executing the progress of a first part (story part) in which a character is operated in response to the user's input operation via the operation unit; a step (step S13a) of accepting a specific action by the user in the first part, wherein, according to the result of the specific action, a right to execute a first progress of a second part (live distribution part) is granted to the computer, and based on that right, the progress of the game becomes switchable from the first part to the second part; a step (step S4a) of receiving operation instruction data specifying the operation of the character, transmitted from an external device (the server 200 or the operation instruction device 300) located in an external space physically separated from the space in which the user is located; and a step (step S5a) of executing the first (real-time) progress of the second part by operating the character based on the received operation instruction data. The operation instruction data includes motion data and voice data input by an operator who plays the character, and the operator is a person different from the user, located in the external space physically separated from the space in which the user is located.
  • according to the above method, second operation instruction data is received from the character control device, and in the second part, the character is operated based on the operation instruction data received from the character control device.
  • as a result, the character's operation is not stereotyped, and its expression is greatly expanded. Therefore, through the relationship with the character, the user can feel a reality as if the character were in the real world. This has the effect of increasing the immersive feeling of the game and improving its interest. Further, since the user needs to perform the specific action in the first part in order to move to the second part, the quality of the game can be further enhanced.
  • in the above method, the second part is a part for live distribution performed in real time, and the first progress of the second part becomes executable when the live distribution time associated with the above right arrives.
  • in the above method, the computer further comprises a location registration system, and the result of the specific action includes the position of the computer acquired by the location registration system becoming a predetermined position in the first part.
  • in the above method, the result of the specific action includes the virtual position of the character operated by the user in the first part becoming the predetermined position.
  • the above method may further include a step (step S301) of requesting the progress of the completed second part again, a step (step S302) of receiving the operation instruction data from the external device again, and a step of executing a second progress of the second part by operating the character based on the operation instruction data received again.
  • (Item 6) An information processing method for the progress of a game by a computer comprising a processor (10), a memory (11), and an operation unit (the communication IF 13, the input/output IF 14, the touch screen 15, the camera 17, the distance measuring sensor 18, the input unit 151, and the like) was described. The method includes: a step of executing the progress of the first part in which a character is operated in response to the user's input operation via the operation unit; a step of accepting a specific action by the user in the progress of the first part so that the progress of the game becomes switchable from the first part to the second part; a step of requesting the progress of the completed second part when the progress of the second part was not progressed in real time; a step of receiving recorded operation instruction data from an external device located in an external space physically separated from the space in which the user is located; and a step of executing the progress of the completed second part by operating the character based on the received recorded operation instruction data.
  • the operation instruction data includes motion data and voice data input by the operator who plays the character, and the operator is a person different from the user, located in an external space physically separated from the space in which the user is located.
  • according to the above method, with the user terminal 100, even after the live distribution part has progressed in real time, the user can proceed with the live distribution part again in various modes. As a result, the user becomes more attached to the character through the experience of realistic interaction with the character, and can play another part that operates the character with even greater interest. This has the effect of increasing the immersive feeling of the game and improving its interest.
  • in the above method, the step of executing the progress of the completed second part includes making the character speak based on the voice data included in the operation instruction data, and moving the character based on the motion data included in the operation instruction data.
  • in the above method, the step of executing the progress of the completed second part includes operating the character based on the operation instruction data, triggered by the reception of the operation instruction data.
  • (Item 10) A computer-readable medium storing computer-executable instructions was described. According to certain aspects of the disclosure, when the computer-executable instructions are executed, the computer-readable medium causes a processor to perform the steps included in any one of (Item 1) to (Item 10).
  • (Item 11) An information processing device (the user terminal 100) was described. The information processing device comprises: a first part progressing unit that executes the progress of a first part (story part) in which a character is operated in response to the user's input operation; a reception unit that accepts a specific action by the user in the progress of the first part so that the progress of the game becomes switchable from the first part to the second part; and a second part progressing unit that, when the progress of the second part was not progressed in real time, requests the progress of the completed second part, receives recorded operation instruction data from an external device located in an external space physically separated from the space in which the user is located, and executes the progress of the completed second part by operating the character based on the received operation instruction data.
  • the operation instruction data includes motion data and voice data input by the operator who plays the character, and the operator is a person different from the user, located in an external space physically separated from the space in which the user is located.
  • the device according to (Item 11) has the same effect as the method according to (Item 6).
  • 1 game system, 2 network, 10, 20, 30 processor, 11, 21, 31 memory, 12, 22, 32 storage, 13, 23, 33 communication IF (operation unit), 14, 24, 34 input/output IF (operation unit), 15, 35 touch screen (display unit, operation unit), 17 camera (operation unit), 18 distance measurement sensor (operation unit), 100 user terminal (computer, information processing device), 110, 210, 310 control unit, 111, 311 operation reception unit, 112, 312 display control unit, 113, 313 UI control unit, 114, 314 animation generation unit, 115 game progress unit, 116 analysis unit, 117 progress information generation unit, 120, 220, 320 storage unit, 131 game program, 132 game information, 133 user information, 134 character control program, 151, 351 input unit (operation unit), 152, 352 display unit, 200 server (computer), 211 progress support unit, 212 shared support unit, 300 operation instruction device (NPC control device, character control device), 315 progress simulation unit, 316 character control unit, 317 reaction processing unit

Abstract

This information processing method for progressing a game includes: step (S1a) in which a processor implements the progression of a first part (story part), in which a character is made to act in response to an input operation of a user, said input operation being via an operation unit; a step (S13a) in which a specific behavior of the user is received during the first part, a computer is given, in response to the result of the specific behavior, the privilege to implement a first progression of a second part (livestream part), and on the basis of this privilege, it becomes possible for the progression of the game to switch from the first part to the second part; a step (S4a) in which action instruction data that has been transmitted from an external device and designates an action of the character is received, said external device being located in an external space that is physically removed from the space where the user is located; and a step (S5a) in which the first progression of the second part is implemented by causing the character to act on the basis of the received action instruction data, wherein the action instruction data includes motion data and voice data that have been inputted by an operator playing the character, and the operator is a different person from the user and is located in the external space that is physically separated from the space where the user is located.

Description

Information processing method, computer-readable medium, and information processing device
The present disclosure relates to an information processing method, a computer-readable medium, and an information processing device.
Conventionally, games in which a story progresses so that the ending differs depending on the options selected by the user are widely known. For example, Non-Patent Document 1 discloses a romance simulation game whose main purpose is to virtually deepen friendship with a girl character. The user selects, from the presented options, the action toward the character that the user considers best, and the story progresses as the character repeatedly reacts to those actions.
In the game disclosed in Non-Patent Document 1, the character's response patterns are prepared in advance. The character's response to the user's input operation is determined from among those response patterns and output, and the game progresses. The variations of the character's movement therefore never extend beyond the contents of the data prepared in advance. As a result, the user cannot feel, in the relationship with the character, a reality as if the character were in the real world, and eventually tires of the game. Generally, in a game developed with the intention of having the user play for a long time, how to deal with the problem of the user tiring of the game is important. Games are always required to provide compelling content that motivates users to play. In particular, in a game in which the user finds interest in the relationship with a character, the character preferably has a sense of reality high enough for the user to become immersed in the world of the game.
One aspect of the present disclosure aims to enhance the sense of immersion in the world of a game and to improve the interest of the game.
An information processing method according to the present disclosure is an information processing method for the progress of a game by a computer comprising a processor, a memory, and an operation unit. The method includes: a step in which the processor executes the progress of a first part in which a character is operated in response to the user's input operation via the operation unit; a step of accepting a specific action by the user in the first part, wherein a right to execute a first progress of a second part is granted to the computer according to the result of the specific action, and based on that right, the progress of the game becomes switchable from the first part to the second part; a step of receiving operation instruction data specifying the operation of the character, transmitted from an external device located in an external space physically separated from the space in which the user is located; and a step of executing the first progress of the second part by operating the character based on the received operation instruction data. The operation instruction data includes motion data and voice data input by an operator who plays the character, and the operator is a person different from the user, located in the external space physically separated from the space in which the user is located.
An information processing method according to the present disclosure is an information processing method for the progress of a game by a computer comprising a processor, a memory, and an operation unit. The method includes: a step in which the processor executes the progress of a first part in which a character is operated in response to the user's input operation via the operation unit; a step of accepting a specific action by the user in the progress of the first part so that the progress of the game becomes switchable from the first part to the second part; a step of requesting the progress of the completed second part when the progress of the second part was not progressed in real time; a step of receiving recorded operation instruction data from an external device located in an external space physically separated from the space in which the user is located; and a step of executing the progress of the completed second part by operating the character based on the received operation instruction data. The operation instruction data includes motion data and voice data input by an operator who plays the character, and the operator is a person different from the user, located in the external space physically separated from the space in which the user is located.
An information processing device according to the present disclosure comprises: a first part progressing unit that executes the progress of a first part in which a character is operated in response to the user's input operation; a reception unit that accepts a specific action by the user in the progress of the first part so that the progress of the game becomes switchable from the first part to the second part; and a second part progressing unit that, when the progress of the second part was not progressed in real time, requests the progress of the completed second part, receives recorded operation instruction data from an external device located in an external space physically separated from the space in which the user is located, and executes the progress of the completed second part by operating the character based on the received operation instruction data. The operation instruction data includes motion data and voice data input by an operator who plays the character, and the operator is a person different from the user, located in the external space physically separated from the space in which the user is located.
According to one aspect of the present disclosure, an effect of improving the interest of a game is achieved.
A diagram showing the hardware configuration of the game system.
A block diagram showing the functional configurations of the user terminal, the server, and the operation instruction device.
A flowchart showing an example of the basic game progress of a game executed based on the information processing method according to the present embodiment.
A diagram showing an example of the data structure of the operation instruction data.
A diagram showing an example of the data structure of the game information.
A diagram showing an example of the quest presentation screen displayed on the display unit of the user terminal.
A diagram showing an example of the quest solution screen displayed on the display unit of the user terminal.
A diagram showing an example of the reward screen displayed on the display unit of the user terminal.
A diagram showing an example of the video playback screen displayed on the display unit of the user terminal.
A flowchart showing the flow of processing executed in the game system.
A flowchart showing an example of the basic game progress of a game executed based on the information processing method according to the present embodiment.
A flowchart showing the flow of processing, executed by the user terminal, for analyzing the operation instruction data.
A diagram showing an example of the data structure of the user action history information of the live distribution part.
A flowchart showing an example of the basic game progress of a game executed based on the information processing method according to the present embodiment.
A diagram showing an example of transitions between game screens displayed on the display unit of the user terminal.
[Embodiment 1]
The game system according to the present disclosure is a system for providing a game to a plurality of users who are game players. The game system is described below with reference to the drawings. The present invention is not limited to these examples; it is indicated by the scope of the claims, and all modifications within the meaning and scope equivalent to the claims are intended to be included in the present invention. In the following description, the same elements are given the same reference numerals in the description of the drawings, and duplicate descriptions are not repeated.
<Hardware configuration of game system 1>
FIG. 1 is a diagram showing the hardware configuration of the game system 1. As shown in the figure, the game system 1 includes a plurality of user terminals 100 and a server 200. Each user terminal 100 connects to the server 200 via the network 2. The network 2 is composed of the Internet and various mobile communication systems constructed by wireless base stations (not shown). Examples of such mobile communication systems include so-called 3G and 4G mobile communication systems, LTE (Long Term Evolution), and wireless networks (for example, Wi-Fi (registered trademark)) connectable to the Internet through predetermined access points.
The server 200 (computer, information processing device) may be a general-purpose computer such as a workstation or a personal computer. The server 200 includes a processor 20, a memory 21, a storage 22, a communication IF 23, and an input/output IF 24. These components of the server 200 are electrically connected to one another by a communication bus.
The user terminal 100 (computer, information processing device) may be a mobile terminal such as a smartphone, a feature phone, a PDA (Personal Digital Assistant), or a tablet computer. The user terminal 100 may also be a game device suitable for game play. As illustrated, the user terminal 100 includes a processor 10, a memory 11, a storage 12, a communication interface (IF) 13, an input/output IF 14, a touch screen 15 (display unit), a camera 17, and a distance measuring sensor 18. These components of the user terminal 100 are electrically connected to one another by a communication bus. Note that the user terminal 100 may include, instead of or in addition to the touch screen 15, an input/output IF 14 to which a display (display unit) configured separately from the main body of the user terminal 100 can be connected.
As shown in FIG. 1, the user terminal 100 may be configured to communicate with one or more controllers 1020. The controller 1020 establishes communication with the user terminal 100 in accordance with a communication standard such as Bluetooth (registered trademark). The controller 1020 may have one or more buttons or the like, and transmits to the user terminal 100 an output value based on the user's input operation on those buttons. The controller 1020 may also have various sensors such as an acceleration sensor and an angular velocity sensor, and transmits the output values of those sensors to the user terminal 100.
Note that instead of, or in addition to, the user terminal 100 including the camera 17 and the distance measuring sensor 18, the controller 1020 may have the camera 17 and the distance measuring sensor 18.
It is desirable that, for example at the start of a game, the user terminal 100 has the user of a controller 1020 input user identification information, such as the user's name or login ID, via that controller 1020. This allows the user terminal 100 to associate the controller 1020 with the user, so that, based on the sender of a received output value (the controller 1020), the terminal can identify which user the output value belongs to.
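As a rough sketch of the controller-to-user association just described, the terminal might keep a simple mapping from controller identifiers to user identifiers; the identifier types and function names below are illustrative assumptions, not part of the specification.

```typescript
// Hypothetical sketch: remember which user logged in through which controller,
// then attribute each received output value to a user by its sender.
const controllerToUser = new Map<string, string>(); // controllerId -> userId

function registerController(controllerId: string, userId: string): void {
  controllerToUser.set(controllerId, userId);
}

function userForOutput(controllerId: string): string | undefined {
  // Returns undefined when the controller has not been registered yet.
  return controllerToUser.get(controllerId);
}
```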
When the user terminal 100 communicates with a plurality of controllers 1020, each user holds one controller 1020, and multiplayer can be realized on that single user terminal 100 without communicating with other devices such as the server 200 via the network 2. Multiplayer can also be realized locally with a plurality of user terminals 100 by having the user terminals 100 connect to one another using a wireless standard such as a wireless LAN (Local Area Network) standard (that is, connecting without going through the server 200). When the above multiplayer is realized locally with one user terminal 100, the user terminal 100 may further include at least part of the various server functions described later. When the above multiplayer is realized locally with a plurality of user terminals 100, the plurality of user terminals 100 may include, in a distributed manner, the various server functions described later.
Even when the above multiplayer is realized locally, the user terminal 100 may communicate with the server 200. For example, it may associate information indicating a play result, such as a score or a win or loss in a given game, with user identification information and transmit it to the server 200.
The controller 1020 may also be configured to be detachable from the user terminal 100. In this case, a coupling portion for the controller 1020 may be provided on at least one surface of the housing of the user terminal 100. When the user terminal 100 and the controller 1020 are coupled by wire via this coupling portion, they send and receive signals over that wire.
As shown in FIG. 1, the user terminal 100 may accept the attachment of a storage medium 1030, such as an external memory card, via the input/output IF 14. This allows the user terminal 100 to read programs and data recorded on the storage medium 1030. The program recorded on the storage medium 1030 is, for example, a game program.
The user terminal 100 may store, in its memory 11, a game program acquired by communicating with an external device such as the server 200, or a game program acquired by reading it from the storage medium 1030.
As described above, the user terminal 100 includes, as examples of mechanisms for inputting information to the user terminal 100, the communication IF 13, the input/output IF 14, the touch screen 15, the camera 17, and the distance measuring sensor 18. Each of the above units serving as an input mechanism can be regarded as an operation unit configured to accept the user's input operations.
For example, when the operation unit is constituted by at least one of the camera 17 and the distance measuring sensor 18, the operation unit detects an object 1010 in the vicinity of the user terminal 100 and identifies an input operation from the detection result for that object. As an example, the user's hand as the object 1010, a marker of a predetermined shape, or the like is detected, and an input operation is identified based on the color, shape, movement, or type of the object 1010 obtained as the detection result. More specifically, when the user's hand is detected in an image captured by the camera 17, the user terminal 100 identifies and accepts a gesture (a series of movements of the user's hand) detected from that captured image as the user's input operation. The captured image may be a still image or a moving image.
Alternatively, when the operation unit is constituted by the touch screen 15, the user terminal 100 identifies and accepts the user's operation performed on the input unit 151 of the touch screen 15 as the user's input operation. When the operation unit is constituted by the communication IF 13, the user terminal 100 identifies and accepts a signal (for example, an output value) transmitted from the controller 1020 as the user's input operation. When the operation unit is constituted by the input/output IF 14, the user terminal 100 identifies and accepts, as the user's input operation, a signal output from an input device (not shown) other than the controller 1020 that is connected to the input/output IF 14.
In the present embodiment, the game system 1 further includes an operation instruction device 300. The operation instruction device 300 connects to each of the server 200 and the user terminals 100 via the network 2. At least one operation instruction device 300 is provided in the game system 1. A plurality of operation instruction devices 300 may be provided depending on the number of user terminals 100 using the service provided by the server 200. One operation instruction device 300 may be provided for one user terminal 100, or one operation instruction device 300 may be provided for a plurality of user terminals 100.
The operation instruction device 300 (NPC control device, character control device) may be a computer such as a server, a desktop personal computer, a laptop computer, or a tablet, or a group of such computers combined. As illustrated, the operation instruction device 300 includes a processor 30, a memory 31, a storage 32, a communication IF 33, an input/output IF 34, and a touch screen 35 (display unit). These components of the operation instruction device 300 are electrically connected to one another by a communication bus. Note that the operation instruction device 300 may include, instead of or in addition to the touch screen 35, an input/output IF 34 to which a display (display unit) configured separately from the main body of the operation instruction device 300 can be connected.
As shown in FIG. 1, the operation instruction device 300 may be configured to communicate, wirelessly or by wire, with peripheral devices such as one or more microphones 3010, one or more motion capture devices 3020, and one or more controllers 3030. A wirelessly connected peripheral device establishes communication with the operation instruction device 300 in accordance with a communication standard such as Bluetooth (registered trademark).
The microphone 3010 picks up sound generated in its surroundings and converts it into an electrical signal. The sound converted into an electrical signal is transmitted to the operation instruction device 300 as voice data and is received by the operation instruction device 300 via the communication IF 33.
The motion capture device 3020 tracks the motion of a tracking target (for example, a person), including facial expressions and mouth movements, and transmits the output values of the tracking result to the operation instruction device 300. The motion data, which are the output values, are received by the operation instruction device 300 via the communication IF 33. The motion capture method of the motion capture device 3020 is not particularly limited. Depending on the method adopted, the motion capture device 3020 selectively includes any mechanism for capturing motion, such as a camera, various sensors, markers, a suit worn by a model (person), and a signal transmitter.
The controller 3030 may have one or more physical input mechanisms such as buttons, levers, sticks, and wheels. The controller 3030 transmits to the operation instruction device 300 output values based on input operations that the operator of the operation instruction device 300 has entered on those input mechanisms. The controller 3030 may also have various sensors such as an acceleration sensor and an angular velocity sensor, and may transmit the output values of those sensors to the operation instruction device 300. The above output values are received by the operation instruction device 300 via the communication IF 33. In the following, a person who performs some input operation on the operation instruction device 300, using an operation unit provided on the operation instruction device 300 or any of the various input mechanisms communicably connected to it, is referred to as an operator. Operators include people who operate the operation instruction device 300 using the input unit 351, the controller 3030, and the like, voice actors who input voice via the microphone 3010, and models who input movement via the motion capture device 3020. Note that operators are not included among the users who are game players.
The operation instruction device 300 may include a camera and a distance measuring sensor (not shown). Instead of, or in addition to, the operation instruction device 300 including them, the motion capture device 3020 and the controller 3030 may have a camera and a distance measuring sensor.
As described above, the operation instruction device 300 includes, as examples of mechanisms for inputting information to the operation instruction device 300, the communication IF 33, the input/output IF 34, and the touch screen 35. It may further include a camera and a distance measuring sensor as needed. Each of the above units serving as an input mechanism can be regarded as an operation unit configured to accept the user's input operations.
The operation unit may be constituted by the touch screen 35. In this case, the operation instruction device 300 identifies and accepts an operation performed on the input unit 351 of the touch screen 35 as the user's input operation. When the operation unit is constituted by the communication IF 33, the operation instruction device 300 identifies and accepts a signal (for example, an output value) transmitted from the controller 3030 as the user's input operation. When the operation unit is constituted by the input/output IF 34, the operation instruction device 300 identifies and accepts, as the user's input operation, a signal output from an input device (not shown) other than the controller 3030 that is connected to the input/output IF 34.
<Game overview>
The game executed by the game system 1 according to Embodiment 1 (hereinafter, the present game) is, as an example, a game in which one or more characters appear and at least one of those characters is made to act based on operation instruction data. The characters appearing in the game may be player characters (hereinafter, PCs) or non-player characters (hereinafter, NPCs). A PC is a character that can be directly operated by a user who is a game player. An NPC is a character that acts according to the game program and operation instruction data, that is, a character that cannot be directly operated by a user who is a game player. In the following, where there is no particular need to distinguish between the two, "character" is used as the generic term.
As an example, the present game is a training simulation game. Specifically, in this training simulation game, the protagonist, namely the user, deepens interaction with a character and works on the character, thereby turning the character into a famous video distributor and realizing the character's dream. The training simulation game may further include elements of a romance simulation game in which the protagonist aims to increase intimacy with the character through interaction.
Furthermore, as an example, the present game preferably includes at least a live distribution part. In the game system 1, operation instruction data is supplied to a user terminal 100 running the present game, at an arbitrary timing, from a device other than that user terminal 100. The user terminal 100, triggered by the reception of the operation instruction data, analyzes (renders) that data. The live distribution part is a part in which the user terminal 100 presents to the user, in real time, a character that acts according to the analyzed operation instruction data. This lets the user feel a sense of reality, as if the character really exists, and become still more immersed in the game world and enjoy the game.
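The flow of the live distribution part can be pictured as a publish/subscribe channel: a device other than the terminal publishes operation instruction data at an arbitrary time, and each subscribed terminal renders it upon arrival. This is only a conceptual sketch; the channel class and its method names are assumptions for illustration, not the actual protocol.

```typescript
// Conceptual sketch of the live distribution flow (names assumed).
type Listener = (data: string) => void;

class LiveChannel {
  private listeners: Listener[] = [];
  subscribe(listener: Listener): void { this.listeners.push(listener); }
  // Called by the distributing side at an arbitrary timing.
  publish(data: string): void { this.listeners.forEach((l) => l(data)); }
}

const channel = new LiveChannel();
// Terminal side: reception triggers rendering and real-time presentation.
channel.subscribe((data) => console.log(`render and present: ${data}`));
channel.publish("operation instruction frame");
```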
In the present embodiment, the game may be composed of a plurality of play parts. In that case, the nature of a character may differ from part to part; for example, a single character may be a PC in one part and an NPC in another.
The genre of the game is not limited to any particular genre. The game system 1 can execute games of any genre, for example games based on sports such as tennis, table tennis, dodgeball, baseball, soccer, and hockey, as well as puzzle games, quiz games, RPGs (Role-Playing Games), adventure games, shooting games, simulation games, training games, and action games.
The play form of games executed in the game system 1 is likewise not limited to any particular form. The game system 1 can execute games of any play form, for example single-player games played by a single user, multiplayer games played by a plurality of users, and, among multiplayer games, competitive games in which a plurality of users play against one another and cooperative games in which a plurality of users cooperate.
<Hardware components of each device>
The processor 10 controls the overall operation of the user terminal 100. The processor 20 controls the overall operation of the server 200. The processor 30 controls the overall operation of the operation instruction device 300. The processors 10, 20, and 30 include a CPU (Central Processing Unit), an MPU (Micro Processing Unit), and a GPU (Graphics Processing Unit).
The processor 10 reads a program from the storage 12, described later, and loads it into the memory 11, described later. The processor 20 reads a program from the storage 22, described later, and loads it into the memory 21, described later. The processor 30 reads a program from the storage 32, described later, and loads it into the memory 31, described later. The processors 10, 20, and 30 execute the loaded programs.
The memories 11, 21, and 31 are main storage devices, composed of storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory). The memory 11 provides a work area to the processor 10 by temporarily storing the program and various data that the processor 10 reads from the storage 12, described later; it also temporarily stores various data generated while the processor 10 operates according to the program. The memory 21 provides a work area to the processor 20 by temporarily storing the various programs and data that the processor 20 reads from the storage 22, described later; it also temporarily stores various data generated while the processor 20 operates according to the programs. The memory 31 provides a work area to the processor 30 by temporarily storing the various programs and data that the processor 30 reads from the storage 32, described later; it also temporarily stores various data generated while the processor 30 operates according to the programs.
In the present embodiment, the program may be a game program for realizing the game on the user terminal 100. Alternatively, the program may be a game program for realizing the game through the cooperation of the user terminal 100 and the server 200, or through the cooperation of the user terminal 100, the server 200, and the operation instruction device 300. Note that a game realized through the cooperation of the user terminal 100 and the server 200, and a game realized through the cooperation of the user terminal 100, the server 200, and the operation instruction device 300, may each be, as an example, a game executed in a browser launched on the user terminal 100. Alternatively, the program may be a game program for realizing the game through the cooperation of a plurality of user terminals 100. The various data include data related to the game, such as user information and game information, as well as instructions and notifications sent and received between the devices of the game system 1.
The storages 12, 22, and 32 are auxiliary storage devices, composed of storage devices such as a flash memory or an HDD (Hard Disk Drive). Various data related to the game are stored in the storages 12, 22, and 32.
The communication IF 13 controls the transmission and reception of various data in the user terminal 100. The communication IF 23 controls the transmission and reception of various data in the server 200. The communication IF 33 controls the transmission and reception of various data in the operation instruction device 300. The communication IFs 13, 23, and 33 control, for example, communication via a wireless LAN (Local Area Network), Internet communication via a wired LAN, a wireless LAN, or a mobile phone network, and communication using short-range wireless communication and the like.
The input/output IF 14 is an interface through which the user terminal 100 accepts data input and outputs data. The input/output IF 14 may input and output data via USB (Universal Serial Bus) or the like. The input/output IF 14 may include, for example, physical buttons, a camera, a microphone, or a speaker of the user terminal 100. The input/output IF 24 of the server 200 is an interface through which the server 200 accepts data input and outputs data. The input/output IF 24 may include, for example, an input unit that is an information input device such as a mouse or a keyboard, and a display unit that is a device for displaying and outputting images. The input/output IF 34 of the operation instruction device 300 is an interface through which the operation instruction device 300 accepts data input and outputs data. The input/output IF 34 may include, for example, information input devices such as a mouse, a keyboard, a stick, and a lever, devices for displaying and outputting images such as a liquid crystal display, and connections for sending and receiving data to and from the peripheral devices (the microphone 3010, the motion capture device 3020, and the controller 3030).
The touch screen 15 of the user terminal 100 is an electronic component combining an input unit 151 and a display unit 152. The touch screen 35 of the operation instruction device 300 is an electronic component combining an input unit 351 and a display unit 352. The input units 151 and 351 are, for example, touch-sensitive devices, constituted by, for example, touch pads. The display units 152 and 352 are constituted by, for example, a liquid crystal display or an organic EL (Electro-Luminescence) display.
The input units 151 and 351 have a function of detecting the position at which a user operation (mainly physical contact operations such as touch, slide, swipe, and tap operations) is input on the input surface and transmitting information indicating that position as an input signal. The input units 151 and 351 need only include a touch sensing unit (not shown). The touch sensing unit may adopt any method, such as a capacitive method or a resistive film method.
Although not shown, the user terminal 100 may include one or more sensors for identifying the holding posture of the user terminal 100. Such a sensor may be, for example, an acceleration sensor or an angular velocity sensor. When the user terminal 100 includes such a sensor, the processor 10 can identify the holding posture of the user terminal 100 from the sensor output and perform processing according to that posture. For example, when the user terminal 100 is held vertically, the processor 10 may use a portrait display in which a vertically long image is displayed on the display unit 152; when the user terminal 100 is held horizontally, it may use a landscape display in which a horizontally long image is displayed on the display unit. In this way, the processor 10 may be able to switch between portrait and landscape display according to the holding posture of the user terminal 100.
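A minimal sketch of such a posture-based display switch might compare the gravity components reported by an acceleration sensor; the axis convention and comparison rule below are assumptions for illustration.

```typescript
type Orientation = "portrait" | "landscape";

// Hypothetical sketch: gravity dominates the Y axis when the device is held
// upright, and the X axis when it is held sideways.
function orientationFromGravity(gx: number, gy: number): Orientation {
  return Math.abs(gy) >= Math.abs(gx) ? "portrait" : "landscape";
}

// e.g. orientationFromGravity(0.1, 9.8) === "portrait"
```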
The camera 17 includes an image sensor and the like, and generates a captured image by converting light incident through its lens into an electrical signal.
The distance measuring sensor 18 is a sensor that measures the distance to a measurement target. The distance measuring sensor 18 includes, for example, a light source that emits pulse-modulated light and a light receiving element that receives the light. The distance measuring sensor 18 measures the distance to the measurement target from the timing at which the light source emits light and the timing at which the light emitted from the light source is reflected by the measurement target and received. The distance measuring sensor 18 may have a light source that emits directional light.
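The timing-based measurement described above amounts to a time-of-flight computation: the light travels to the target and back, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch, with illustrative names:

```typescript
const SPEED_OF_LIGHT_M_PER_S = 299_792_458;

// Time-of-flight sketch: emitTimeSec / receiveTimeSec stand in for the
// sensor's emission and reception timestamps.
function distanceMeters(emitTimeSec: number, receiveTimeSec: number): number {
  const roundTripSec = receiveTimeSec - emitTimeSec;
  return (SPEED_OF_LIGHT_M_PER_S * roundTripSec) / 2; // halve: there and back
}
```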
Here, an example is further described in which the user terminal 100 uses the camera 17 and the distance measuring sensor 18 to detect an object 1010 in the vicinity of the user terminal 100 and accepts the detection result as the user's input operation. The camera 17 and the distance measuring sensor 18 may be provided, for example, on a side surface of the housing of the user terminal 100. The distance measuring sensor 18 may be provided near the camera 17. As the camera 17, for example, an infrared camera can be used. In this case, the camera 17 may be provided with an illumination device that emits infrared light, a filter that blocks visible light, and the like. This can further improve the accuracy of object detection based on images captured by the camera 17, whether outdoors or indoors.
The processor 10 may perform, for example, one or more of the following processes (1) to (5) on the image captured by the camera 17.
(1) The processor 10 performs image recognition processing on the image captured by the camera 17 to determine whether the captured image contains the user's hand. The processor 10 may use a technique such as pattern matching as the analysis technique adopted in this image recognition processing.
(2) The processor 10 detects the user's gesture from the shape of the user's hand. For example, the processor 10 identifies the number of the user's fingers (the number of extended fingers) from the shape of the user's hand detected in the captured image, and further identifies the gesture the user has made from that number. For example, when the number of fingers is five, the processor 10 determines that the user has made a "paper" gesture. When the number of fingers is zero (no fingers detected), the processor 10 determines that the user has made a "rock" gesture. When the number of fingers is two, the processor 10 determines that the user has made a "scissors" gesture.
(3) The processor 10 performs image recognition processing on the image captured by the camera 17 to detect whether only the user's index finger is raised, or whether the user's finger has made a flicking motion.
(4) The processor 10 detects the distance between an object 1010 in the vicinity of the user terminal 100 (such as the user's hand) and the user terminal 100, based on at least one of the image recognition result for the image captured by the camera 17 and the output value of the distance measuring sensor 18. For example, based on the size of the shape of the user's hand identified from the image captured by the camera 17, the processor 10 detects whether the user's hand is near the user terminal 100 (for example, at a distance less than a predetermined value) or far away (for example, at a distance equal to or greater than the predetermined value). When the captured image is a moving image, the processor 10 may detect whether the user's hand is approaching or moving away from the user terminal 100.
(5) When, based on the image recognition result for the image captured by the camera 17 or the like, it is found that the distance between the user terminal 100 and the user's hand is changing while the user's hand is being detected, the processor 10 recognizes that the user is waving a hand in the shooting direction of the camera 17. When an object is alternately detected and not detected by the distance measuring sensor 18, which has stronger directivity than the shooting range of the camera 17, the processor 10 recognizes that the user is waving a hand in a direction orthogonal to the shooting direction of the camera.
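Process (2) above reduces to a small mapping from the number of extended fingers to a gesture label. A sketch under that assumption (the detection of the finger count itself is outside this snippet):

```typescript
type Gesture = "rock" | "scissors" | "paper" | "unknown";

// Finger-count to gesture mapping, per process (2).
function gestureFromFingerCount(extendedFingers: number): Gesture {
  switch (extendedFingers) {
    case 0: return "rock";     // no fingers detected
    case 2: return "scissors";
    case 5: return "paper";
    default: return "unknown";
  }
}
```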
In this way, through image recognition of the images captured by the camera 17, the processor 10 detects whether the user is clenching a hand (whether the gesture is "rock" or some other gesture such as "paper"). The processor 10 also detects, together with the shape of the user's hand, how the user is moving that hand, and whether the user is moving the hand toward or away from the user terminal 100. Such operations can be made to correspond to operations using a pointing device such as a mouse or a touch panel. For example, the user terminal 100 moves a pointer on the touch screen 15 according to the movement of the user's hand, and detects the user's "rock" gesture. In this case, the user terminal 100 recognizes that the user is continuing a selection operation. Continuing a selection operation corresponds, for example, to a mouse remaining clicked and held down, or to the touched state being maintained after a touch-down operation on a touch panel. If the user moves the hand further while the "rock" gesture is being detected, the user terminal 100 can also recognize this series of gestures as an operation corresponding to a swipe operation (or drag operation). When the user terminal 100 detects a gesture in which the user flicks a finger, based on the detection of the user's hand in the images captured by the camera 17, it may recognize that gesture as an operation corresponding to a mouse click or a tap on a touch panel.
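The correspondence to pointing-device operations sketched above can be expressed as a small state mapping: a closed hand acts like a held-down button, movement while closed like a drag, and a finger flick like a click. The flags and event names are illustrative assumptions.

```typescript
type PointerAction = "move" | "press" | "drag" | "click";

// Hand state to pointer semantics, per the correspondence described above.
function pointerActionFromHand(closed: boolean, moving: boolean, flicked: boolean): PointerAction {
  if (flicked) return "click";                  // finger flick -> click / tap
  if (closed) return moving ? "drag" : "press"; // clenched hand -> held button
  return "move";                                // open-hand movement -> pointer move
}
```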
<Functional configuration of game system 1>
FIG. 2 is a block diagram showing the functional configurations of the user terminal 100, the server 200, and the operation instruction device 300 included in the game system 1. The user terminal 100, the server 200, and the operation instruction device 300 may each include functional configurations, not shown, that are necessary to function as a general computer and functional configurations necessary to realize known functions in games.
The user terminal 100 has a function as an input device that accepts the user's input operations and a function as an output device that outputs the game's images and sound. The user terminal 100 functions as a control unit 110 and a storage unit 120 through the cooperation of the processor 10, the memory 11, the storage 12, the communication IF 13, the input/output IF 14, and the like.
The server 200 has a function of communicating with each user terminal 100 and supporting the user terminal 100 in advancing the game. For example, when the user terminal 100 downloads the application for the present game for the first time, the server 200 provides the user terminal 100 with data that should be stored on the user terminal 100 when the game is first started. For example, the server 200 transmits operation instruction data for making a character act to the user terminal 100. The operation instruction data may include motion capture data capturing in advance the movements of an actor such as a model, voice data recording the voice of an actor such as a voice actor, operation history data indicating a history of input operations for making the character act, or a motion command group in which commands associated with such a series of input operations are arranged in time series. When the present game is a multiplayer game, the server 200 may have a function of communicating with each user terminal 100 participating in the game to mediate exchanges between user terminals 100, and a synchronization control function. The server 200 also has a function of mediating between the user terminal 100 and the operation instruction device 300. This enables the operation instruction device 300 to supply operation instruction data in a timely manner, without mistaking the destination, to a user terminal 100 or to a group of user terminals 100. The server 200 functions as a control unit 210 and a storage unit 220 through the cooperation of the processor 20, the memory 21, the storage 22, the communication IF 23, the input/output IF 24, and the like.
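The kinds of content enumerated for the operation instruction data suggest a structure along the following lines; every field name here is an assumption for illustration, since the specification only lists the categories of data.

```typescript
// Illustrative shape of operation instruction data (field names assumed).
interface MotionCommand {
  timeMs: number;   // position in the time series
  command: string;  // command associated with an input operation
}

interface OperationInstructionData {
  motionCaptureData?: unknown[];    // actor movement captured in advance
  voiceData?: ArrayBuffer;          // recorded actor voice
  operationHistory?: unknown[];     // history of input operations
  motionCommands?: MotionCommand[]; // commands arranged in time series
}
```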
The operation instruction device 300 has a function of generating operation instruction data for directing the actions of characters on the user terminal 100 and supplying it to the user terminal 100. The operation instruction device 300 functions as a control unit 310 and a storage unit 320 through the cooperation of the processor 30, the memory 31, the storage 32, the communication IF 33, the input/output IF 34, and the like.
The storage units 120, 220, and 320 store a game program 131, game information 132, and user information 133. The game program 131 is the game program executed by the user terminal 100, the server 200, and the operation instruction device 300. The game information 132 is data that the control units 110, 210, and 310 refer to when executing the game program 131. The user information 133 is data related to users' accounts. In the storage units 220 and 320, the game information 132 and the user information 133 are stored for each user terminal 100. The storage unit 320 further stores a character control program 134. The character control program 134 is a program executed by the operation instruction device 300, for controlling the actions of characters that appear in the game based on the game program 131 described above.
(Functional configuration of server 200)
The control unit 210 controls the server 200 in an integrated manner by executing the game program 131 stored in the storage unit 220. For example, the control unit 210 transmits various data, programs, and the like to the user terminal 100. The control unit 210 receives part or all of the game information or user information from the user terminal 100. When the game is a multiplayer game, the control unit 210 may receive a multiplayer synchronization request from a user terminal 100 and transmit data for synchronization to the user terminal 100. The control unit 210 also communicates with the user terminal 100 and the operation instruction device 300 as needed to send and receive information.
The control unit 210 functions as a progress support unit 211 and a sharing support unit 212 according to the description of the game program 131. The control unit 210 can also function as other functional blocks, not shown, to support the progress of the game on the user terminal 100, depending on the nature of the game being executed.
The progress support unit 211 communicates with the user terminal 100 and supports the user terminal 100 in progressing through the various parts included in the present game. For example, when the user terminal 100 advances the game, the progress support unit 211 provides the user terminal 100 with the information necessary to advance the game.
The sharing support unit 212 communicates with a plurality of user terminals 100 and supports a plurality of users in sharing one another's decks on their respective user terminals 100. The sharing support unit 212 may also have a function of matching online user terminals 100 with the operation instruction device 300. This allows information to be sent and received smoothly between the user terminal 100 and the operation instruction device 300.
(Functional configuration of user terminal 100)
The control unit 110 controls the user terminal 100 in an integrated manner by executing the game program 131 stored in the storage unit 120. For example, the control unit 110 advances the game according to the game program 131 and the user's operations. While the game is in progress, the control unit 110 also communicates with the server 200 and the operation instruction device 300 as needed to send and receive information.
The control unit 110 functions as an operation reception unit 111, a display control unit 112, a user interface (hereinafter, UI) control unit 113, an animation generation unit 114, a game progress unit 115, an analysis unit 116, and a progress information generation unit 117 according to the description of the game program 131. The control unit 110 can also function as other functional blocks, not shown, to advance the game, depending on the nature of the game being executed.
The operation reception unit 111 detects and accepts the user's input operations on the input unit 151. From the actions the user exerts on the console via the touch screen 15 and the other input/output IF 14, it determines what input operation was performed and outputs the result to each element of the control unit 110.
For example, the operation reception unit 111 accepts an input operation on the input unit 151, detects the coordinates of the input position of that operation, and identifies the type of the operation. The operation reception unit 111 identifies, as types of input operations, for example touch operations, slide operations, swipe operations, and tap operations. When continuously detected input is interrupted, the operation reception unit 111 detects that contact input on the touch screen 15 has been released.
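Distinguishing tap, slide, and swipe operations as described can be sketched by examining the trajectory between touch-down and release; the thresholds below are illustrative assumptions, not values from the specification.

```typescript
interface TouchSample { x: number; y: number; tMs: number; }

// Rough classification of a contact trajectory (thresholds assumed).
function classifyOperation(samples: TouchSample[]): "tap" | "slide" | "swipe" {
  const first = samples[0];
  const last = samples[samples.length - 1];
  const distancePx = Math.hypot(last.x - first.x, last.y - first.y);
  const durationMs = Math.max(1, last.tMs - first.tMs);
  if (distancePx < 10) return "tap";                        // barely moved
  return distancePx / durationMs > 0.5 ? "swipe" : "slide"; // fast flick vs slow drag
}
```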
The UI control unit 113 controls the UI objects displayed on the display unit 152 to construct the UI. A UI object is a tool with which the user gives the user terminal 100 the input necessary for the progress of the game, or a tool with which the user obtains from the user terminal 100 information output during the progress of the game. UI objects include, but are not limited to, icons, buttons, lists, and menu screens.
The animation generation unit 114 generates animations showing the motions of various objects based on how those objects are controlled. For example, the animation generation unit 114 may generate animation expressing a character moving as if it were actually there, moving its mouth, or changing its facial expression.
The display control unit 112 outputs, to the display unit 152 of the touch screen 15, a game screen reflecting the results of the processing executed by each of the above elements. The display control unit 112 may display the game screen including the animation generated by the animation generation unit 114 on the display unit 152. It may also draw the above-described UI objects controlled by the UI control unit 113 superimposed on the game screen.
The game progress unit 115 advances the game. In the present embodiment, the game progress unit 115 advances the present game according to the user's input operations entered via the operation reception unit 111. While the game is in progress, the game progress unit 115 makes one or more characters appear and makes those characters act. The game progress unit 115 may make a character act according to the game program 131 downloaded in advance, according to the user's input operations, or according to operation instruction data supplied from the operation instruction device 300.
When the present game is composed of a plurality of parts, such as a first part, a second part, and so on, the game progress unit 115 advances the game according to the specifications of each part.
To explain concretely with an example, suppose that the first part is a story part in which the story in the game progresses through dialogue with a character. In this case, the game progress unit 115 advances the story part as follows. Specifically, the game progress unit 115 makes the character act according to the game program 131 downloaded in advance, or according to operation instruction data (first operation instruction data) likewise downloaded in advance. Based on the user's input operation accepted by the operation reception unit 111, the game progress unit 115 identifies the option the user has chosen and makes the character perform the action associated with that option. Suppose that the second part is a live distribution part in which the character is made to act based on operation instruction data supplied from the operation instruction device 300. In this case, the game progress unit 115 makes the character act based on the operation instruction data from the operation instruction device 300 to advance the live distribution part.
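The per-part behavior just described (the user's chosen option drives the character in the story part, while received data drives it in the live distribution part) could be organized as in the following sketch; all names are illustrative assumptions.

```typescript
type Part = "story" | "live";

// Per-part control flow (names assumed for illustration).
function advancePart(part: Part, chosenOption?: string, instruction?: object): void {
  if (part === "story") {
    // Story part: act out the motion associated with the user's chosen option.
    performOptionAction(chosenOption);
  } else {
    // Live distribution part: act on data from the operation instruction device.
    performInstructedAction(instruction);
  }
}

function performOptionAction(option?: string): void { /* pre-downloaded animation */ }
function performInstructedAction(data?: object): void { /* live animation */ }
```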
The analysis unit 116 analyzes (renders) operation instruction data and instructs the game progress unit 115 to make the character act based on the analysis result. In the present embodiment, the analysis unit 116 starts rendering the operation instruction data, triggered by the reception, via the communication IF 13, of the operation instruction data supplied by the operation instruction device 300. The analysis unit 116 conveys the analysis result to the game progress unit 115 and instructs it to immediately make the character act based on the operation instruction data. That is, the game progress unit 115, triggered by the reception of the operation instruction data, makes the character act based on that data. This makes it possible to show the user a character that moves in real time.
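The reception-triggered rendering could be pictured as a callback chain: arrival of data starts rendering, and the rendered result is handed straight to the part that moves the character. A sketch with assumed names:

```typescript
// Reception-triggered rendering sketch (class and method names assumed).
class AnalysisUnit {
  constructor(private moveCharacter: (animation: string) => void) {}

  // Invoked when operation instruction data arrives over the communication IF.
  onDataReceived(rawData: string): void {
    const animation = this.render(rawData); // rendering starts on reception
    this.moveCharacter(animation);          // character acts without delay
  }

  private render(rawData: string): string {
    return `animation(${rawData})`; // stand-in for the real rendering step
  }
}

// Usage: wire the unit to whatever plays the animation in real time.
const unit = new AnalysisUnit((a) => console.log(`play: ${a}`));
unit.onDataReceived("live frame");
```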
 The progress information generation unit 117 generates progress information indicating the progress of the game being executed by the game progress unit 115, and sends it to the server 200 or the operation instruction device 300 as appropriate. The progress information may include, for example, information specifying the currently displayed game screen, or a progress log showing the progress of the game in chronological order using text, symbols, and the like. In embodiments of the game system 1 in which the server 200 and the operation instruction device 300 do not require progress information, the progress information generation unit 117 may be omitted.
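 As a non-limiting illustration of the progress information described above, the following Python sketch assembles a snapshot containing a screen identifier and a chronological progress log. The field names, the JSON encoding, and the function name are assumptions introduced for illustration only; they are not part of the disclosure.

```python
import json
import time

def build_progress_info(screen_id, progress_log):
    """Assemble a progress-information payload of the kind described above.

    screen_id identifies the currently displayed game screen; progress_log is
    a chronological list of short text entries describing game events. Both
    field names are illustrative assumptions.
    """
    payload = {
        "screen": screen_id,       # information specifying the current game screen
        "log": progress_log,       # progress shown in chronological order
        "timestamp": time.time(),  # when this snapshot was generated
    }
    return json.dumps(payload)

# Example: a snapshot that could be sent to the server 200 or the
# operation instruction device 300.
info = build_progress_info("story_part/scene_03",
                           ["quest presented", "option 2 selected"])
```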
 (Functional configuration of operation instruction device 300)
 The control unit 310 comprehensively controls the operation instruction device 300 by executing the character control program 134 stored in the storage unit 320. For example, the control unit 310 generates operation instruction data according to the character control program 134 and the operator's operations, and supplies it to the user terminal 100. The control unit 310 may further execute the game program 131 as necessary. In addition, the control unit 310 communicates with the server 200 and with the user terminals 100 on which the game is running, to send and receive information.
 The control unit 310 functions as an operation reception unit 311, a display control unit 312, a UI control unit 313, an animation generation unit 314, a progress simulation unit 315, and a character control unit 316, in accordance with the description of the character control program 134. The control unit 310 can also function as other functional blocks (not shown) for controlling the characters appearing in the game, depending on the nature of the game executed in the game system 1.
 The operation reception unit 311 detects and accepts the operator's input operations on the input unit 351. From the actions the operator performs on the console via the touch screen 35 and the other input/output IF 34, the operation reception unit 311 determines what input operation has been made, and outputs the result to the respective elements of the control unit 310. The details of the functions of the operation reception unit 311 are substantially the same as those of the operation reception unit 111 in the user terminal 100.
 The UI control unit 313 controls the UI objects displayed on the display unit 352.
 The animation generation unit 314 generates animations showing the motion of various objects, based on the control modes of those objects. For example, the animation generation unit 314 may generate an animation that reproduces the game screen actually displayed on the user terminal 100 with which it is communicating.
 The display control unit 312 outputs, to the display unit 352 of the touch screen 35, a game screen reflecting the results of the processing executed by each of the elements described above. The details of the functions of the display control unit 312 are substantially the same as those of the display control unit 112 in the user terminal 100.
 The progress simulation unit 315 grasps the progress of the game on the user terminal 100 based on progress information, received from the user terminal 100, that indicates the game's progress. The progress simulation unit 315 then presents the progress of the user terminal 100 to the operator by reproducing, in simulated form, the behavior of that user terminal 100 on the operation instruction device 300.
 For example, the progress simulation unit 315 may display a reproduction of the game screen shown on the user terminal 100 on the display unit 352 of its own device. The progress simulation unit 315 may also display the progress of the game on the user terminal 100 on the display unit 352 in the form of the progress log described above.
 Part of the functions of the progress simulation unit 315 may also be realized by the control unit 310 executing the game program 131. For example, the progress simulation unit 315 first grasps the progress of the game on the user terminal 100 based on the progress information. The progress simulation unit 315 may then reproduce, on the display unit 352 of its own device, the game screen currently displayed on the user terminal 100, either fully or in simplified form, based on the game program 131. Alternatively, the progress simulation unit 315 may grasp the current progress of the game, predict how the game will progress from that point onward based on the game program 131, and output the prediction result to the display unit 352.
 The character control unit 316 controls the behavior of the character displayed on the user terminal 100. Specifically, it generates operation instruction data for operating the character and supplies that data to the user terminal 100. For example, the character control unit 316 generates operation instruction data instructing the character under control to speak, based on voice data that an operator (such as a voice actor) has input via the microphone 3010. Operation instruction data generated in this way includes at least the voice data described above. As another example, it generates operation instruction data instructing the character under control to perform movements based on motion capture data that an operator (such as a model) has input via the motion capture device 3020. Operation instruction data generated in this way includes at least the motion capture data described above. As yet another example, it generates operation instruction data instructing that the character under control be operated based on the history of input operations that the operator has entered via an input mechanism such as the controller 3030 or an operation unit such as the input unit 351, that is, based on operation history data. Operation instruction data generated in this way includes at least the operation history data described above. The operation history data is, for example, information in which operation logs, each indicating which button of the controller 3030 the operator pressed at what timing while which screen was displayed on the display unit, are organized in chronological order. The display unit here is a display unit linked to the controller 3030, and may be the display unit 352 of the touch screen 35 or another display unit connected via the input/output IF 34. Alternatively, the character control unit 316 may identify commands instructing character actions that are associated with the input operations the operator has entered via the input mechanism or operation unit described above. The character control unit 316 may then arrange those commands in the order in which they were input to generate a motion command group representing a series of character actions, and generate operation instruction data instructing that the character be operated according to that motion command group. Operation instruction data generated in this way includes at least the motion command group described above.
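 The following is a minimal sketch of the command-history variant described above: an operation history is ordered chronologically and each button press is mapped to its associated character-motion command, yielding a motion command group. The button-to-command mapping and the record shapes are illustrative assumptions (the mapping mirrors the controller example given with FIG. 4 below).

```python
# Hypothetical mapping from controller buttons to character-motion commands.
BUTTON_COMMANDS = {
    "A": "raise_right_hand",
    "B": "raise_left_hand",
    "C": "walk",
    "D": "run",
}

def to_motion_command_group(operation_history):
    """Order the operation log by time and map each button press to the
    motion command associated with it, producing a motion command group."""
    ordered = sorted(operation_history, key=lambda entry: entry["time"])
    return [BUTTON_COMMANDS[e["button"]]
            for e in ordered if e["button"] in BUTTON_COMMANDS]

# Example operation history: which button was pressed at what timing,
# and which screen was displayed at the time.
history = [
    {"time": 0.0, "button": "A", "screen": "studio"},
    {"time": 0.4, "button": "C", "screen": "studio"},
]
print(to_motion_command_group(history))  # ['raise_right_hand', 'walk']
```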
 The reaction processing unit 317 receives feedback about the user's reactions from the user terminal 100 and outputs it to the operator of the operation instruction device 300. In the present embodiment, for example, the user can create a comment addressed to the character on the user terminal 100 while the character is being operated according to the operation instruction data described above. The reaction processing unit 317 receives the comment data of such a comment and outputs it. The reaction processing unit 317 may display text data corresponding to the user's comment on the display unit 352, or may output voice data corresponding to the user's comment from a speaker (not shown).
 Note that the functions of the user terminal 100, the server 200, and the operation instruction device 300 shown in FIG. 2 are merely examples. Each of the user terminal 100, the server 200, and the operation instruction device 300 may have at least some of the functions of the other devices. Furthermore, yet another device besides the user terminal 100, the server 200, and the operation instruction device 300 may be made a component of the game system 1, and that device may be made to execute part of the processing in the game system 1. That is, the computer that executes the game program in the present embodiment may be any of the user terminal 100, the server 200, the operation instruction device 300, or another device, or the processing may be realized by a combination of a plurality of these devices.
 In the present embodiment, the progress simulation unit 315 may be omitted. Also, in the present embodiment, the control unit 310 can function as the reaction processing unit 317 in accordance with the description of the character control program 134.
 <Game structure>
 FIG. 3 is a flowchart showing an example of the basic progression of this game. The game is divided into, for example, two gameplay parts. As one example, the first part is a story part and the second part is a live distribution part. The game may additionally include an acquisition part that allows the user to acquire game media, which are digital data usable in the game, in exchange for valuable data held by the user. In the present embodiment, the play order of the parts is not particularly limited. FIG. 3 shows a case where the user terminal 100 executes the game in the order of the story part, the acquisition part, and the live distribution part.
 In step S1, the game progress unit 115 executes the story part. The story part includes fixed scenarios S11 and acquisition scenarios S12 (described later). The story part includes, for example, scenes in which the hero operated by the user and a character converse. In the present embodiment, as one example, a "scenario" bundled as digital data corresponds to one episode of a story about a character; it is supplied from the server 200 and temporarily stored in the storage unit 120. In the story part, the game progress unit 115 reads out one scenario stored in the storage unit 120 and advances that scenario in response to the user's input operations until it reaches an ending. A scenario includes options for the user to select, the character's response patterns corresponding to those options, and so on, and different endings may be reached within a single scenario depending on which options the user selects. Specifically, the game progress unit 115 presents a plurality of selectable options corresponding to actions the hero can take toward the character, and advances the scenario according to the option the user selects.
 In step S2, when the user has played a scenario to the end, the game progress unit 115 may let the user acquire a reward corresponding to the ending. The reward is provided to the user, for example, as a game medium, which is digital data usable in the game. The game medium may be, for example, an item such as clothing that the character can be made to wear. Here, "letting the user acquire the reward" may mean, as one example, transitioning the status of the game medium serving as the reward, which is managed in association with the user, from unusable to usable. Alternatively, it may mean storing the game medium in at least one of the memories included in the game system 1 (memory 11, memory 21, memory 31) in association with the user identification information, the user terminal ID, or the like.
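 A minimal sketch of the status transition mentioned above might look as follows; the storage layout and names are illustrative assumptions only.

```python
# Game media managed in association with each user; the reward starts out
# with the status "unusable".
user_items = {
    "user_42": {"rabbit_ear_band": "unusable"},
}

def grant_reward(user_id, item_id):
    """Transition the reward's status from unusable to usable for this user."""
    user_items.setdefault(user_id, {})[item_id] = "usable"

grant_reward("user_42", "rabbit_ear_band")
print(user_items["user_42"]["rabbit_ear_band"])  # usable
```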
 In step S3, the game progress unit 115 executes the acquisition part. The game medium the user acquires in the acquisition part may be a new scenario different from the scenarios provided to the user terminal 100 at the time of the initial download. In the following, the scenarios provided at the initial download are referred to as fixed scenarios, and the newly acquired ones as acquisition scenarios. When there is no need to distinguish between the two, they are simply referred to as scenarios.
 In the acquisition part, for example, the game progress unit 115 lets the user possess an acquisition scenario distinct from the fixed scenarios the user already holds, in exchange for consuming the user's valuable data. The scenario the user acquires may be determined according to predetermined rules by the game progress unit 115 or by the progress support unit 211 of the server 200. More specifically, the game progress unit 115 or the progress support unit 211 may run a lottery and randomly determine, from among a plurality of acquisition scenarios, the scenario the user acquires. The acquisition part may be executed at any timing before or after the story part and the live distribution part.
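 The lottery described here could, as a rough sketch, consume the user's valuable data and draw one acquisition scenario at random. The cost value and all names below are illustrative assumptions.

```python
import random

def draw_scenario(wallet, candidates, cost=100):
    """Consume valuable data and randomly pick one acquisition scenario,
    or return None if the user cannot afford the draw."""
    if wallet["valuable_data"] < cost:
        return None                    # not enough valuable data to draw
    wallet["valuable_data"] -= cost    # consume the user's valuable data
    return random.choice(candidates)   # random choice among acquisition scenarios

wallet = {"valuable_data": 300}
scenario = draw_scenario(wallet, ["scenario_A", "scenario_B", "scenario_C"])
```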
 In step S4, the game progress unit 115 determines whether operation instruction data has been received from an external device via the network. As long as no operation instruction data has been received from an external device, the game progress unit 115 may return from NO in step S4 to, for example, step S1 and execute the story part. Alternatively, the game progress unit 115 may execute the acquisition part of step S3. When operation instruction data has been received from an external device, the game progress unit 115 proceeds from YES in step S4 to step S5.
 In step S5, the game progress unit 115 executes the live distribution part (second part). Specifically, the game progress unit 115 advances the live distribution part by operating the character according to the operation instruction data received in step S4. In step S1, the user merely interacted, via the UI, with a character exhibiting predetermined reactions within a scenario. In the live distribution part, however, the user can freely and interactively converse with a character that operates in real time based on operation instruction data transmitted from an external device. More specifically, the analysis unit 116 receives, from the operation instruction device 300, operation instruction data including voice data and motion data generated in response to the content of the user's input operations. The game progress unit 115 then causes the character to speak based on the voice data included in the received operation instruction data, and gives the character movement based on the motion data. In this way, the character's reactions to the user's input operations can be presented to the user.
 <Processing overview>
 In the present embodiment, the user terminal 100 is configured to execute the following steps based on the game program 131 in order to enhance the appeal of the game. Specifically, the user terminal 100 executes: a step of operating an NPC, which neither the user nor any other user operates, based on first operation instruction data stored in advance in the memory 11, the first operation instruction data being operation instruction data specifying the NPC's actions, and advancing the first part in response to user input operations entered via the operation unit (input/output IF 14, touch screen 15, camera 17, distance measuring sensor 18); and a step of advancing the second part by operating the NPC based on second operation instruction data received from the NPC control device (operation instruction device 300).
 According to the above configuration, the user terminal 100 operates the NPC in the first part based on the first operation instruction data downloaded in advance. In addition, the user terminal 100 receives the second operation instruction data from the operation instruction device 300 and, in the second part, operates the NPC based on that second operation instruction data. Because the NPC can be operated based on second operation instruction data received from the operation instruction device 300, the NPC's actions are not confined to fixed patterns, and their expressiveness is greatly expanded. The user can therefore, through interacting with the NPC during gameplay, feel a sense of reality as if the NPC existed in the real world. As a result, the sense of immersion in the game world is heightened and the appeal of the game is enhanced.
 <Data structure>
 (Operation instruction data)
 FIG. 4 is a diagram showing an example of the data structure of the operation instruction data processed by the game system 1 according to the present embodiment. As one example, the operation instruction data comprises the items "destination" and "creator", which are meta information, and the items "character ID", "voice", and "movement", which are the body of the data.
 The item "destination" stores destination designation information. The destination designation information indicates to which device the operation instruction data is to be sent. The destination designation information may be, for example, an address unique to a user terminal 100, or identification information of a group to which user terminals 100 belong. It may also be a symbol (for example, "ALL") indicating that the data is addressed to all user terminals 100 satisfying a certain condition.
 The item "creator" stores creation source information. The creation source information indicates which device created the operation instruction data. The creation source information is, for example, user-related information that can identify a specific user, such as a user ID, a user terminal ID, or a unique address of a user terminal. The creation source information may also be an ID or address indicating the server 200 or the operation instruction device 300; when the creator is the server 200 or the operation instruction device 300, the value of this item may be left empty, or the item itself may be omitted from the operation instruction data.
 The item "character ID" stores a character ID for uniquely identifying a character appearing in this game. The character ID stored here indicates which character's actions the operation instruction data is meant to direct.
 The item "voice" stores voice data to be uttered by the character. The item "movement" stores motion data specifying the character's movements. The motion data may be, as one example, motion capture data acquired by the operation instruction device 300 via the motion capture device 3020. The motion capture data may be data tracking the movement of the actor's entire body, data tracking the actor's facial expressions and mouth movements, or both. As another example, the motion data may be a motion command group instructing a series of character movements, specified by operations that the operator of the operation instruction device 300 entered via the controller 3030. For example, suppose that the commands "raise the right hand", "raise the left hand", "walk", and "run" are assigned to buttons A, B, C, and D of the controller 3030, respectively, and that the operator presses button A, button B, button C, and button D in succession. In this case, a motion command group in which the commands "raise the right hand", "raise the left hand", "walk", and "run" are arranged in that order is stored in the "movement" item as motion data. Note that in the present embodiment, the voice data and the motion data are included in the operation instruction data in a synchronized state.
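 Expressed as a data structure, the record of FIG. 4 might look as follows. The Python types are illustrative assumptions; in the actual system, "voice" and "movement" would carry media data such as audio and motion capture streams.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OperationInstructionData:
    destination: str                 # "destination": terminal address, group ID, or "ALL"
    creator: Optional[str]           # "creator": user-related info; empty for server/device
    character_id: str                # "character ID": which character the data directs
    voice: Optional[bytes] = None    # "voice": voice data the character utters
    motion: Optional[object] = None  # "movement": motion capture data or a motion command group

# Example with the motion given as the motion command group from the
# controller example above.
data = OperationInstructionData(
    destination="ALL",
    creator=None,
    character_id="char_001",
    motion=["raise_right_hand", "raise_left_hand", "walk", "run"],
)
```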
 By receiving such operation instruction data, the game progress unit 115 can operate a character appearing in the game exactly as the creator of the operation instruction data intended. Specifically, when the operation instruction data includes voice data, the game progress unit 115 causes the character to speak based on that voice data. When the operation instruction data includes motion data, the game progress unit 115 moves the character based on that motion data; that is, it generates an animation of the character moving in accordance with the motion data.
 (Game information)
 FIG. 5 is a diagram showing an example of the data structure of the game information 132 processed by the game system 1 according to the present embodiment. The items provided in the game information 132 are determined as appropriate according to the genre, nature, content, and the like of the game, and the items illustrated here do not limit the scope of the present invention. As one example, the game information 132 includes the items "play history", "items", "intimacy", "fame", and "distribution history". Each of these items is referred to as appropriate when the game progress unit 115 advances the game.
 The item "play history" stores the user's play history. The play history is information indicating, for each scenario stored in the storage unit 120, whether the user has completed playing it. For example, the play history includes a list of the fixed scenarios downloaded at the first play and a list of the acquisition scenarios acquired later in the acquisition part. In each list, a status such as "played", "unplayed", "playable", or "unplayable" is associated with each scenario.
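 As a rough sketch, the play history might be held as per-scenario status maps, one for each of the two lists named above; the container shapes and the playability rule below are illustrative assumptions, while the status strings follow the text.

```python
play_history = {
    # fixed scenarios downloaded at the first play
    "fixed": {
        "fixed_scenario_01": "played",
        "fixed_scenario_02": "playable",
        "fixed_scenario_03": "unplayable",
    },
    # acquisition scenarios obtained later in the acquisition part
    "acquired": {
        "acquired_scenario_01": "unplayed",
    },
}

def is_playable(history, kind, scenario_id):
    """Assume a scenario can be started if its status is 'playable' or 'unplayed'."""
    return history[kind].get(scenario_id) in ("playable", "unplayed")
```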
 The item "items" stores a list of the items the user holds as game media. In this game, an item is, as one example, a clothing accessory that the character can be made to wear. The user can have the character wear items obtained by playing scenarios and thereby customize the character's appearance.
 The item "intimacy" stores intimacy, which is one of the character's status values. Intimacy is a parameter indicating how close the "hero", who is effectively the user's avatar, is with the character. For example, the game progress unit 115 may advance the game more advantageously for the user the higher the intimacy is. For example, the game progress unit 115 may increase or decrease the intimacy according to how well a scenario play turns out. As one example, the better the user's choices among the options and the better the ending reached in the scenario, the more the game progress unit 115 increases the intimacy. Conversely, the game progress unit 115 may reduce the intimacy when the user brings the scenario to a bad ending.
 The item "fame" stores fame, which is one of the character's status values. Fame is a parameter indicating how popular and well known the character is as a video distributor. One of the objectives of this game is to support the character's video distribution activities, raise the character's fame, and realize the character's dream. As one example, a special scenario may be provided as a reward to a user who achieves a certain level of fame or higher.
 The item "distribution history" stores a list of videos previously live-distributed by the character in the live distribution part, the so-called back numbers. In the live distribution part, a video that is PUSH-distributed in real time can be viewed only at that time. Videos from past distributions, on the other hand, are recorded by the server 200 or the operation instruction device 300 and can be PULL-distributed in response to a request from the user terminal 100. In the present embodiment, as one example, back numbers may be made downloadable for a fee paid by the user.
 <Example screens of the story part>
 FIG. 6 is a diagram showing an example of the quest presentation screen 400 displayed on the display unit 152 of the user terminal 100. In the story part, while a scenario is in progress, the game progress unit 115 presents a quest to the user according to the game program 131. Specifically, in the dialogue between the hero and the character, the game progress unit 115 causes the character to state to the hero a request corresponding to the quest. At this time, for example, the game progress unit 115 may display the quest presentation screen 400 shown in FIG. 6 on the display unit 152.
 The method of presenting a character performing the series of actions of "having the character state the request" is not particularly limited. For example, the game progress unit 115 may display a still image of the character uttering the request, based on text data corresponding to the request stored in advance in the storage unit 120. Specifically, the game progress unit 115 displays on the display unit 152 the quest presentation screen 400, which includes the character 401, a speech balloon 402 indicating that the character 401 is speaking, and the text data of the request placed inside the balloon 402. Alternatively, the game progress unit 115 may display an animation of the character uttering the request, based on operation instruction data, stored in advance in the storage unit 120, that corresponds to the scene in which the request is uttered. Specifically, the game progress unit 115 moves the character 401 according to the motion capture data included in the operation instruction data, while outputting the voice data included in that operation instruction data as sound from a speaker (not shown) provided in the user terminal 100.
 In the present embodiment, as one example, the game progress unit 115 may realize the quest as a location information game that uses the position registration information of the user terminal 100. The game progress unit 115 acquires the current position information of the user terminal 100 (for example, address information, latitude and longitude information, and the like) from a position registration system (not shown) provided in the user terminal 100. Based on the acquired current position information, it then generates a map 403 of the area around the user terminal 100 and places it on the quest presentation screen 400. The map data from which the map 403 is generated may be stored in advance in the storage unit 120 of the user terminal 100, or may be acquired via the network from another service providing device that provides map data.
 Subsequently, the game progress unit 115 determines a position (address, latitude and longitude, etc.) at which an object that can resolve the request (hereinafter, the target) can be obtained, and superimposes a target icon 404 at the position on the map corresponding to the determined position. The user can thus understand that by taking the user terminal 100 and moving to the position of the target icon 404 on the map 403, the user can obtain the target and clear the quest. The game progress unit 115 may determine the position of the target at random, or the position may be determined in advance according to the scenario, the quest, and the nature of the target.
 When the user brings the user terminal 100 to the actual position corresponding to the position of the target icon 404, the game progress unit 115 determines that the hero has reached the target and lets the user acquire the target. The game progress unit 115 thereby determines that the quest has been cleared.
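 One rough way to make the "reached the target" judgment is to compare the terminal's current latitude and longitude against the target position and treat the quest as cleared inside a threshold radius. The haversine computation and the 30-meter radius below are illustrative assumptions, not part of the disclosure.

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in meters."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def reached_target(current, target, radius_m=30.0):
    """True when the terminal is within radius_m of the target position."""
    return distance_m(*current, *target) <= radius_m

print(reached_target((35.6586, 139.7454), (35.6587, 139.7455)))  # True
```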
 In the present embodiment, when the quest is cleared, the game progress unit 115 may generate a quest resolution screen 500 and display it on the display unit 152. FIG. 7 is a diagram showing an example of the quest resolution screen 500 displayed on the display unit 152 of the user terminal 100. As one example, the quest resolution screen 500 includes the character 401. For example, the game progress unit 115 causes the character 401 to perform the action of "thanking the hero for resolving the request". The game progress unit 115 may have the character 401 perform this action based on operation instruction data stored in advance. Alternatively, the game progress unit 115 may reproduce the scene of the character 401 expressing thanks by placing a still image of the character 401 and text data 501 corresponding to the spoken content on the quest resolution screen 500.
 In the present embodiment, as a reward for clearing the quest, the game progress unit 115 may unlock one new fixed scenario related to the character 401 who made the request, transitioning it to a state in which the user can play it. Specifically, the game progress unit 115 reads the play history shown in FIG. 5 and updates the status of the relevant fixed scenario from "unplayable" to "playable".
 Furthermore, the game progress unit 115 may increase the intimacy between the hero and the character based on the quest having been cleared. The game progress unit 115 may be configured to raise the intimacy more, the better the play content of the quest is (time required, distance traveled, number of objects obtained, degree of the character's delight, rarity of the obtained target, and so on).
 As the user clears one or more quests and selects options, the dialogue with the character advances and the scenario progresses. When the scenario reaches one of its endings, the user has completed playing the scenario.
 The game progress unit 115 may let the user acquire an item as a reward for having played the scenario. The item is, for example, a clothing accessory for the character 401 to wear. The game progress unit 115 determines the item the user acquires based on predetermined rules. For example, the game progress unit 115 may grant the user an item associated in advance with the scenario that was played, or an item determined according to the play content of the scenario (time taken to clear quests, intimacy obtained, whether good options were selected, and so on). Alternatively, the item granted to the user may be determined at random from among a plurality of candidates.
 In the present embodiment, the game progress unit 115 may generate a reward screen 600 for notifying the user of the acquired item and display it on the display unit 152. FIG. 8 is a diagram showing an example of the reward screen 600 displayed on the display unit 152 of the user terminal 100. As one example, the reward screen 600 may include an icon 601 of the acquired item and the name 602 of that item. This allows the user to confirm the item they have acquired. The game progress unit 115 also adds the acquired item to the item list stored in the item "items" shown in FIG. 5.
 <Example screens of the live distribution part>
 When the game progress unit 115 receives operation instruction data from an external device such as the operation instruction device 300, it operates the character based on that operation instruction data in the live distribution part. For example, it generates a video playback screen 800 that includes the character operating based on the operation instruction data in the live distribution part, and displays it on the display unit 152.
 FIG. 9 is a diagram showing an example of the video playback screen 800 displayed on the display unit 152 of the user terminal 100. As one example, the video playback screen 800 includes at least the character who was the dialogue partner in the story part (in the illustrated example, character 802).
 In the present embodiment, the game progress unit 115 reflects, in the movements of the character 802, the movements indicated by the motion capture data included in the operation instruction data supplied from the external device (hereinafter, the operation instruction device 300). The motion capture data captures the movements of the model 702, acquired via the motion capture device 3020 at the location where the operation instruction device 300 is installed. The movements of the model 702 are therefore reflected as-is in the movements of the character 802 displayed on the display unit 152.
 In the present embodiment, the game progress unit 115 outputs the voice data 801 included in the operation instruction data supplied from the operation instruction device 300 as the voice uttered by the character 802, in synchronization with the movements of the character 802. The voice data captures the voice 700 of the voice actor 701, acquired via the microphone 3010 at the location where the operation instruction device 300 is installed. The voice data 801 corresponding to the voice 700 uttered by the voice actor 701 is therefore output as-is from the speaker of the user terminal 100.
 According to the above configuration, the voice and movements of the real voice actor 701 and model 702 at the location where the operation instruction device 300 is installed are reflected as-is in the voice and movements of the character 802. Seeing the character 802 behave in this way, the user can feel, toward the character 802, a sense of reality as if it existed in the real world, and can immerse themselves in the game world.
 Furthermore, in the present embodiment, the game progress unit 115 may determine the play result of the story part based on the user's input operations in the story part (first part). Then, in the live distribution part (second part), the game progress unit 115 may display the character operated based on the operation instruction data on the display unit 152 in a display mode corresponding to that play result.
 As one example, if an item that the character can be made to wear has been acquired in the story parts played so far, it is preferable that the game progress unit 115 combine the object of that item with the object of the character 802. According to the above configuration, items the user has acquired by playing the story part can be reflected in the clothing of the character 802 operating in the live distribution part. For example, as shown in FIG. 8, a clothing item (for example, a rabbit-ear band) has been acquired by playing a scenario in the story part. In this case, the game progress unit 115 reads the information on that clothing item from the game information 132 shown in FIG. 5 and combines the item's object (in the illustrated example, clothing item 803) with the character 802.
 This allows the user to feel more attached to the character 802 and enjoy the live distribution part all the more. Furthermore, it can cultivate the user's desire to upgrade the clothing of the character 802, and as a result, the motivation to play the story part can be strengthened.
 Furthermore, in the present embodiment, the user may be able to input comments addressed to the character 802 in reaction to the actions of the character 802. As one example, the game progress unit 115 places a comment input button 804 on the video playback screen 800. The user touches the comment input button 804 to call up a UI for entering a comment, and operates that UI to enter a comment addressed to the character 802. The UI may be one that lets the user select a desired comment from among several prepared comments, one that lets the user compose text and enter a comment, or one that lets the user enter a comment by voice.
 <Processing flow>
 FIG. 10 is a flowchart showing the flow of processing executed by each device constituting the game system 1.
 In step S101, when the game progress unit 115 of the user terminal 100 receives an input operation from the user to start the game, it accesses the server 200 and requests login.
 In step S102, the progress support unit 211 of the server 200 confirms that the status of the user terminal 100 is online and responds that the login has been accepted.
 In step S103, the game progress unit 115 advances the game in response to the user's input operations, communicating with the server 200 as necessary. The game progress unit 115 may advance the story part, or may advance the acquisition part for obtaining a new scenario.
 In step S104, the progress support unit 211 supports the progress of the game on the user terminal 100, for example by providing the user terminal 100 with necessary information as required.
 In step S105, when the live distribution time arrives, the sharing support unit 212 of the server 200 proceeds from YES in step S105 to step S106. The live distribution time is, for example, determined in advance by the game master and managed by the server 200 and the operation instruction device 300. The live distribution time may be notified to the user terminal 100 in advance, or may be kept secret until the live distribution time actually arrives. In the former case, live distribution can be supplied to users reliably; in the latter case, a live distribution with special added value can be supplied to users as a surprise distribution.
 In step S106, the sharing support unit 212 searches for one or more user terminals 100 that are entitled to receive the live distribution. The conditions for receiving the live distribution may be set by the game master as appropriate, but at a minimum they include having the application of this game installed and being online at the live distribution time. In the present embodiment, as one example, user terminals 100 that are online at the live distribution time, that is, that are running the application of this game, are searched for as the user terminals 100 entitled to receive the live distribution. Alternatively, the sharing support unit 212 may add the condition that the user terminal 100 be owned by a user who has already paid the consideration for receiving the live distribution. Alternatively, the sharing support unit 212 may search for specific user terminals 100 that have reserved in advance to receive the live distribution at the aforementioned live distribution time, as the user terminals 100 entitled to receive it.
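 A minimal sketch of this search might filter the known terminals by the conditions just listed; the record fields below (online, app running, paid) are illustrative assumptions.

```python
terminals = [
    {"id": "t1", "online": True,  "app_running": True,  "paid": True},
    {"id": "t2", "online": True,  "app_running": True,  "paid": False},
    {"id": "t3", "online": False, "app_running": True,  "paid": True},
]

def eligible_terminals(terms, require_payment=False):
    """Return IDs of terminals entitled to receive the live distribution."""
    result = []
    for t in terms:
        if not (t["online"] and t["app_running"]):
            continue  # must be online with the game application running
        if require_payment and not t["paid"]:
            continue  # optionally require that the consideration was paid
        result.append(t["id"])
    return result

print(eligible_terminals(terminals))                        # ['t1', 't2']
print(eligible_terminals(terminals, require_payment=True))  # ['t1']
```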
 In step S107, the sharing support unit 212 notifies the operation instruction device 300 of the one or more detected user terminals 100. For example, the sharing support unit 212 may notify the operation instruction device 300 of the terminal ID of each user terminal 100, the user ID of the user who owns the user terminal 100, the address of the user terminal 100, and so on.
 Meanwhile, in step S108, when the live distribution time arrives, the character control unit 316 of the operation instruction device 300 proceeds from YES in step S108 to steps S109 and S110. Steps S109 and S110 may be executed in either order.
 In step S109, the character control unit 316 acquires, as voice data, the voice that an actor such as a voice actor inputs via the microphone 3010.
 In step S110, the character control unit 316 acquires, as motion capture data, the movements that an actor such as a model inputs via the motion capture device 3020.
 In step S111, the character control unit 316 generates operation instruction data (second operation instruction data). Specifically, the character control unit 316 identifies the character that is to distribute a video at the live distribution start time described above and stores that character's character ID in the "character ID" item of the operation instruction data. Which character's video is distributed at which time may be scheduled in advance by the game master and registered in the operation instruction device 300. Alternatively, the operator of the operation instruction device 300 may specify in advance, to the operation instruction device 300, which character's operation instruction data is to be created. The character control unit 316 stores the voice data acquired in step S109 in the "voice" item of the operation instruction data, and stores the motion capture data acquired in step S110 in the "movement" item. The character control unit 316 links the voice data and the motion capture data so that they are synchronized. The character control unit 316 then stores, in the "destination" item of the operation instruction data as destination designation information, the group identification information of the group of user terminals 100 notified by the server 200 in step S107, or the address of a single user terminal 100, so that those one or more user terminals 100 become the destination.
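 Step S111 might be sketched as assembling the fields of FIG. 4 into one record, as below. The field names mirror the earlier data-structure sketch; all identifiers are illustrative assumptions.

```python
def build_second_instruction(character_id, voice, motion, destinations):
    """Package the captured voice and motion into one operation instruction
    data record addressed to the terminals reported in step S107."""
    return {
        "destination": destinations,   # group ID or addresses notified by the server 200
        "creator": "operation_instruction_device_300",
        "character_id": character_id,  # the character scheduled for this time slot
        "voice": voice,                # audio acquired via the microphone 3010 (S109)
        "motion": motion,              # data acquired via the motion capture device 3020 (S110)
        # voice and motion are assumed time-aligned so they stay synchronized
    }

packet = build_second_instruction("char_001", b"...", {"frames": []}, ["t1", "t2"])
```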
 In step S112, the character control unit 316 transmits, via the communication IF 33, the operation instruction data generated as described above to each user terminal 100 designated as a destination. It is desirable that the character control unit 316 render the voice data and motion capture data obtained from the actor's speech and movement into operation instruction data immediately upon acquisition, and distribute it to each user terminal 100 in real time.
 In step S113, the analysis unit 116 of the user terminal 100 receives the above operation instruction data via the communication IF 13. For example, the analysis unit 116 may receive the operation instruction data at a time previously announced by the operation instruction device 300 or the server 200 as the live distribution time.
 In step S114, the analysis unit 116 analyzes the received operation instruction data, using its reception as a trigger.
 In step S115, if the live distribution part is not being executed when the above-mentioned operation instruction data is received, the game progress unit 115 starts the live distribution part. If another part is being executed at that time, the game progress unit 115 interrupts the progress of that part and then starts the live distribution part. Here, it is desirable that the game progress unit 115 output to the display unit 152 a message indicating that the part being executed is temporarily suspended because a live distribution has started, and save the progress of that part in the storage unit 120. If the live distribution part is already being executed when the operation instruction data is received, the game progress unit 115 may omit step S115. In this case, the game progress unit 115 may output to the display unit 152 a message indicating that distribution of the operation instruction data (that is, of the video that the character distributes live) has started.
 In step S116, the game progress unit 115 advances the live distribution part by causing the character to act based on the operation instruction data analyzed by the analysis unit 116. Specifically, the game progress unit 115 causes the display unit 152 to display the video playback screen 800 shown in FIG. 9 or the like. Almost simultaneously with actors such as the voice actor 701 and the model 702 speaking and moving at the place where the operation instruction device 300 is installed, the game progress unit 115 reflects, in real time, that voice and movement in the speech and movement of the character 802 on the video playback screen 800. The analysis unit 116 and the game progress unit 115 continue rendering and playing back the real-time video as long as operation instruction data continues to be received from the operation instruction device 300. Specifically, while no input operation is received from the user and operation instruction data is being received, the game progress unit 115 returns from NO in step S117 to step S113 and repeats the subsequent steps.
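 The receive-render loop of steps S113 to S117 might be summarized as in the following sketch, in which all five callables are hypothetical stand-ins for the analysis unit 116, the game progress unit 115, and the operation reception unit 111; this is an illustration under those assumptions, not the disclosed implementation.

    def live_part_loop(receive, analyze, render_and_play, poll_user_input, send_comment):
        """Repeat steps S113-S117 while operation instruction data keeps arriving."""
        while True:
            data = receive()                 # S113: receive operation instruction data
            if data is None:                 # distribution has ended
                break
            parsed = analyze(data)           # S114: analyze, triggered by reception
            render_and_play(parsed)          # S115/S116: advance the live part in real time
            user_input = poll_user_input()   # S117: input operation during playback?
            if user_input is not None:
                send_comment(user_input)     # S118: send comment data to device 300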
 In step S117, if the operation reception unit 111 receives an input operation from the user while the character is acting based on the operation instruction data, the game progress unit 115 proceeds from YES in step S117 to step S118. For example, the operation reception unit 111 receives an input operation on the comment input button 804 of the video playback screen 800.
 In step S118, the game progress unit 115 transmits the comment data generated in response to the above-mentioned input operation to the operation instruction device 300. Specifically, the game progress unit 115 may transmit the comment ID of a selected comment as the comment data. Alternatively, the game progress unit 115 may transmit, as the comment data, the text data of a sentence entered by the user, the voice data of speech input by the user, or text data obtained by recognizing the user's input speech and converting it into text.
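 The four comment forms of step S118 could be expressed as a single tagged payload, as in this sketch; the function names and dictionary keys are invented for illustration, and the speech recognizer is passed in as a hypothetical callable.

    def make_comment_data(kind, value, recognize_speech=None):
        """Wrap a user reaction in one of the four comment forms of step S118."""
        if kind == "preset":            # comment ID of a preselected comment
            return {"type": "comment_id", "body": value}
        if kind == "text":              # text entered by the user
            return {"type": "text", "body": value}
        if kind == "voice":             # raw voice data as input
            return {"type": "voice", "body": value}
        if kind == "recognized":        # voice converted to text before sending
            return {"type": "text", "body": recognize_speech(value)}
        raise ValueError("unknown comment kind: " + kind)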
 In step S119, the reaction processing unit 317 of the operation instruction device 300 receives, via the communication IF 33, the comment data transmitted from the user terminal 100.
 In step S120, the reaction processing unit 317 outputs the received comment data on the operation instruction device 300. For example, the reaction processing unit 317 displays the text data included in the comment data on the display unit 352. This allows the operators to receive feedback showing how users reacted to the character they operated, and the operators can decide the character's further actions in accordance with this feedback. That is, the operation instruction device 300 returns to step S109, continues acquiring voice data and motion capture data, and keeps providing operation instruction data to the user terminal 100. After the content of an input operation on its own terminal has been received by the operation instruction device 300, the user terminal 100 receives the operation instruction data transmitted from that operation instruction device 300. Specifically, the user terminal 100 receives operation instruction data that includes voice data corresponding to the character's utterances, motion capture data corresponding to the character's movements, and the like. The user terminal 100 then continuously causes the character to act based on that operation instruction data. As a result, the user can experience real-time, interactive exchanges with the character. Instead of motion capture data, the user terminal 100 may receive a motion command group in which one or more commands instructing the character's actions are arranged in the order in which the operator of the operation instruction device 300 issued them.
 <Modification 1>
 In Modification 1 of Embodiment 1, the character that live-distributes a video in the live distribution part need not be an NPC in the other parts. That is, the present invention can also be applied to a game in which a PC that acts based on the user's operations in another part live-distributes a video as an NPC in the live distribution part.
 In Modification 1, the user terminal 100 is configured, based on the game program 131, to execute the following steps in order to enhance the appeal of the game. Specifically, the user terminal 100 executes a step of advancing a first part by causing a character to act in response to the user's input operations entered into the computer (user terminal 100) via the operation unit (the input/output IF 14, the touch screen 15, the camera 17, and the distance measuring sensor 18), and a step of advancing a second part by causing the character to act based on operation instruction data, received from an NPC control device (operation instruction device 300), that specifies the character's actions. Here, the operation instruction data includes at least one of voice data and motion capture data. In the step of advancing the second part, the user terminal 100 transmits the content of the user's input operations to the NPC control device, receives the operation instruction data determined by the NPC control device in light of the content of those input operations, and causes the character to act using the reception of the operation instruction data as a trigger.
 <Modification 2>
 In the description of Embodiment 1, with reference to FIG. 3 above, the user terminal 100 executes the game in the order of the story part, the acquisition part, and the live distribution part. By contrast, in Modification 2 of Embodiment 1, the game may proceed so as to switch to the live distribution part automatically, depending on whether a specific action has been performed via the user terminal 100 while the story part is in progress. FIG. 11 is a flowchart showing the basic progress of a game executed based on the game program according to Modification 2 of the embodiment.
 Step S1a is the same as step S1 in FIG. 3. That is, the game progress unit 115 executes the story part (first part). The story part includes a fixed scenario S11a and an acquisition scenario S12a. As described above, it includes, for example, scenes in which the protagonist operated by the user converses with the character. In the present embodiment, as an example, a "scenario" bundled as digital data corresponds to one episode of a story about the character, is supplied from the server 200, and is temporarily stored in the storage unit 120. In the story part, the game progress unit 115 reads out one scenario stored in the storage unit 120 and advances that scenario in response to the user's input operations until it reaches an ending. A scenario includes options for the user to select, the character's response patterns corresponding to those options, and the like, and different endings may be reached within a single scenario depending on which options the user selects. Specifically, the game progress unit 115 presents a plurality of selectable options corresponding to the protagonist's approaches to the character, and advances the scenario in accordance with the option the user selects. The character may be the above-mentioned NPC and, here, is not a target of direct operation by any user who is a game player.
 While the story part of step S1a is in progress, in step S13a the game progress unit 115 accepts a specific action by the user. In response, the game progress unit 115 proceeds to step S4a, and an operation for switching from the story part to the live distribution part is performed. Until a specific action by the user is accepted in step S13a, the game progress unit 115 preferably continues to execute the story part of step S1a.
 Here, in one example, the result of the user's specific action in the story part includes the position of the user terminal 100, acquired by the above-mentioned location registration system of the user terminal 100, reaching a predetermined position. More specifically, as described with reference to FIG. 6, a quest is realized as a location-based game using the location registration information of the user terminal 100, and the user carries the user terminal 100 and moves to a position determined by the game progress unit 115. When the current position information of the user terminal 100 matches the determined position, the game may automatically switch to the live distribution part instead of, or in addition to, letting the user acquire the target object (FIG. 8).
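 A sketch, under stated assumptions, of the position check that could trigger the switch to the live distribution part: the 50-meter radius is an invented threshold, and switch_to_live_part is a hypothetical callback corresponding to step S4a.

    import math

    def within_range(current, target, radius_m=50.0):
        """True if the terminal's registered (lat, lon) matches the decided position."""
        # Haversine distance between two (latitude, longitude) pairs in degrees.
        r = 6371000.0
        p1, p2 = math.radians(current[0]), math.radians(target[0])
        dp = p2 - p1
        dl = math.radians(target[1] - current[1])
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a)) <= radius_m

    def check_specific_action(terminal_position, quest_position, switch_to_live_part):
        # switch_to_live_part stands in for the transition of step S4a.
        if within_range(terminal_position, quest_position):
            switch_to_live_part()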
 Note that virtual position information may be used instead of the actual location registration information of the user terminal 100 acquired by the location registration system. That is, the result of the user's specific action in the story part may include the virtual position of the character the user is operating in the game reaching a predetermined position.
 In another example, the result of the user's specific action in the story part includes the completion of a predetermined scenario associated with the story part. More specifically, in the story part, as the user clears one or more quests and selects options, the dialogue with the character advances and the scenario progresses. When the scenario reaches one of its endings, the user has completed playing that scenario. As a result, the game may automatically switch from the story part to the live distribution part.
 Returning to FIG. 11, step S4a is the same as step S4 in FIG. 3. That is, the game progress unit 115 determines whether operation instruction data has been received from an external device (the server 200 or the operation instruction device 300) via the network. While no operation instruction data has been received from the external device, the game progress unit 115 may return from NO in step S4a to, for example, step S1a and continue executing the story part. On the other hand, when operation instruction data has been received from the external device, the game progress unit 115 proceeds from YES in step S4a to step S5a.
 Step S5a is the same as step S5 in FIG. 3. That is, the game progress unit 115 executes the live distribution part (second part). Specifically, the game progress unit 115 advances the live distribution part by causing the character to act according to the operation instruction data received in step S4a. In step S1a, the user merely conversed, via the UI, with a character showing scripted reactions within the scenario. In the live distribution part, however, the user can freely and interactively converse with a character that acts in real time based on operation instruction data transmitted from the external device. More specifically, the analysis unit 116 receives from the operation instruction device 300, and analyzes, operation instruction data that includes the voice data and motion data being input by the operator associated with the NPC (including a voice actor and a model) in response to the content of the user's input operations. The game progress unit 115 then causes the character to speak based on the voice data included in the received operation instruction data, and gives the character movement based on the above-mentioned motion data. This allows the user and the operator to collaborate while synchronizing their actions in real time and interactively. In other words, the character's reaction to the user's input operations can be presented to the user.
 In Modification 2, in the processing flow described with reference to FIG. 10 above, it is also preferable that, for example, in step S105, instead of determining whether it is the live distribution time, the server 200 determine whether a specific action by the user has been accepted. That is, when the determination condition is satisfied, the server 200 and the operation instruction device 300 provide the live distribution of the live distribution part to the user terminal 100. Conversely, when the determination condition is not satisfied, the progress of the game is controlled so that the user terminal 100 does not proceed to the live distribution part.
 When the determination condition is satisfied, the user terminal 100 causes the NPC to act based on the operation instruction data and can execute the progress of the live distribution part. Specifically, when live distribution has already started on the operation instruction device 300 in steps S108 to S110, the user terminal 100 may be allowed to receive the real-time live distribution from partway through. Alternatively, satisfaction of the determination condition may itself trigger the start of a live distribution, so that the user terminal 100 can receive a completed live distribution from the beginning. The specific user action serving as the determination condition is, for example, decided in advance by the game master and managed in the server 200 and the operation instruction device 300.
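 A server-side sketch of this gating, replacing the time check of step S105 with a check of the specific user action; the parameter objects and their methods are assumptions introduced for illustration only.

    def may_enter_live_part(user_id, action_store, live_state):
        """Gate the live distribution part on the specific action (Modification 2)."""
        if not action_store.has_completed_specific_action(user_id):
            return None                       # condition unmet: stay in the story part
        if live_state.is_streaming():
            return live_state.join_midway()   # real-time distribution already running
        return live_state.start_from_top()    # the condition itself triggers distribution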
 According to Modification 2, the user terminal 100 causes the NPC to act in the first part based on first operation instruction data downloaded in advance. Switching from the first part to the second part is then performed according to the result of the user performing a specific action in the first part. The user terminal 100 receives second operation instruction data from the operation instruction device 300 and, in the second part, causes the NPC to act based on that second operation instruction data. Since the NPC can be made to act based on the second operation instruction data received from the operation instruction device 300, the NPC's actions are not confined to fixed patterns and their expressiveness is greatly expanded. The user can therefore feel, through interacting with the NPC during game play, a sense of reality as if the NPC existed in the real world. As a result, the sense of immersion in the game world is heightened and the appeal of the game is improved. Furthermore, since the user must perform a specific action in the first part in order to move on to the second part, the game quality can be further enhanced.
 As in Modification 1, in Modification 2 as well, the character that live-distributes a video in the live distribution part need not be an NPC in the other parts. That is, the present invention can also be applied to a game in which a PC that acts based on the user's operations in another part live-distributes a video as an NPC in the live distribution part.
 <Modification 3>
 In the description of Embodiment 1, with reference to FIG. 3 above, the user terminal 100 executes the game in the order of the story part, the acquisition part, and the live distribution part. In Modification 2 of Embodiment 1, with reference to FIG. 11 above, the game proceeds so as to switch to the live distribution part automatically, depending on whether a specific action has been performed via the user terminal 100 while the story part is in progress.
 In Modification 3, instead of the configuration in which the game switches to the live distribution part automatically in response to a specific user action being accepted in step S13a of Modification 2, the user may be granted a right to receive the live distribution for advancing the live distribution part. This right may take the form of a ticket, and a user holding a ticket has the right to access the distributed live performance.
 That is, only a user holding a ticket can switch to the live distribution part based on the right to receive the live distribution, and can advance the live distribution part when the live distribution time arrives. A user without a ticket, on the other hand, cannot advance the live distribution part. The live distribution time may be announced to the user terminal 100 in advance, or it may be kept secret until the live distribution time actually arrives. In the former case, live distribution can be supplied to users reliably; in the latter case, a live distribution with special added value can be supplied to users as a surprise distribution.
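 A minimal sketch of the ticket-based access right of Modification 3, assuming a simple ticket record and an epoch-time comparison; the Ticket fields and the can_watch helper are illustrative names, not part of the disclosure.

    from dataclasses import dataclass
    import time

    @dataclass
    class Ticket:
        user_id: str
        live_id: str
        start_epoch: float   # live distribution time (may be hidden from the user)

    def can_watch(ticket, live_id, now=None):
        """Only a ticket holder may switch to the live distribution part,
        and only once the live distribution time has arrived."""
        if ticket is None or ticket.live_id != live_id:
            return False
        now = time.time() if now is None else now
        return now >= ticket.start_epoch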
 [Embodiment 2]
 <Game Overview>
 The game executed by the game system 1 according to Embodiment 2 (hereinafter, the present game) is, as in Embodiment 1, a character-raising simulation game including elements of a romance simulation game, as an example. In the present embodiment, the game includes at least a live distribution part. The game may consist of a single live distribution part or of a plurality of parts. In the live distribution part, the character whose actions are controlled by the operation instruction device 300 may be either a PC or an NPC. For example, a character that acts as an NPC in the live distribution part may, in another part, act as a PC in accordance with the user's input operations. Alternatively, while operation instruction data is not being live-distributed from the operation instruction device 300, the character may act within the live distribution part as a PC in accordance with the user's input operations. Then, when a live distribution starts, the character may be switched to an NPC and act in accordance with the operation instruction data supplied from the operation instruction device 300.
 <Processing Overview>
 In the present embodiment, the user terminal 100 is configured, based on the game program 131, to execute the following steps in order to enhance the appeal of the game. Specifically, the user terminal 100 executes a step of causing a character to act in response to the user's input operations entered into the user terminal 100 (computer) via an operation unit such as the input unit 151, a step of receiving operation instruction data, specifying the character's actions, transmitted by multicast from the server 200 or the operation instruction device 300 (character control device), and a step of causing the character to act based on the received operation instruction data.
 Further, the step of causing the character to act is preferably started using, as a trigger, the reception in the receiving step of the operation instruction data transmitted by multicast.
 Alternatively, in the present embodiment, the user terminal 100 may be configured, based on the game program 131, to execute the following steps in order to enhance the appeal of the game. Specifically, the user terminal 100 executes a step of causing a character to act in response to the user's input operations entered into the user terminal 100 via the operation unit, a step of receiving operation instruction data that specifies the character's actions, that was transmitted from the server 200 or the operation instruction device 300 addressed to the user terminal 100, and for which no other user's user terminal 100 is designated as the creator, and a step of causing the character to act based on the received operation instruction data. Note that operation instruction data being "transmitted addressed to the user terminal 100" means, for example, that the operation instruction data was transmitted by unicast. For example, when the destination designation information contains an address unique to its own terminal, the user terminal 100 can determine that the operation instruction data was addressed to its own terminal, that is, transmitted by unicast.
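 A sketch of how a terminal might judge, from the destination designation information, whether the data was unicast to itself; MY_ADDRESS, the "ALL" group identifier, and the dictionary key are assumptions consistent with the "destination" item described below.

    MY_ADDRESS = "terminal-0001"   # address unique to this terminal (assumed)

    def is_unicast_to_me(instruction):
        """True when the destination designation information names this terminal."""
        destination = instruction.get("destination")
        if destination == "ALL":    # group identifier: multicast transmission
            return False
        return destination == MY_ADDRESS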
 Further, the step of causing the character to act is preferably started using, as a trigger, the reception in the receiving step of operation instruction data that was transmitted by unicast and to which no identification information of a user or a user terminal is attached.
 <Functional Configuration of Game System 1>
 In the user terminal 100 according to the present embodiment, the analysis unit 116 further analyzes the meta information of the operation instruction data. Meta information is information that defines the properties of the operation instruction data, separate from the content of the operation instruction data. The meta information is, for example, destination designation information, creator information, and the like. Specifically, based on the destination designation information of transmitted operation instruction data, the analysis unit 116 determines whether the operation instruction data was transmitted by multicast or by unicast. Multicast transmission means that the server 200 or the operation instruction device 300 transmits information of the same content to a predetermined group including the terminal itself. For example, operation instruction data in which "ALL" is set as the destination designation information is sent to all user terminals on which the application of the present game is running. Unicast transmission means that the server 200 or the operation instruction device 300 transmits information addressed to the terminal itself. The destination designation information is stored in, for example, the "destination" item of the operation instruction data shown in FIG. 4.
 Operation instruction data transmitted by multicast can be considered to have been created not by a specific user but by the game master. Specifically, the above-mentioned operation instruction data may have been created by a device belonging to the provider (operating organization) that provides the service of the present game in the game system 1. For example, since the server 200 or the operation instruction device 300 holds information about all users and user terminals, it can create operation instruction data and transmit it by multicast to the user terminals on which the application is running. The user terminal 100 can therefore determine that operation instruction data transmitted by multicast was created by the game master.
 In the present embodiment, the analysis unit 116 may, as an example, have the following functions. Specifically, the analysis unit 116 renders operation instruction data transmitted by multicast, and instructs the game progress unit 115 to cause the character to act based on the rendering result. More preferably, the analysis unit 116 renders the operation instruction data in real time, using the reception of the multicast operation instruction data as a trigger, and then instructs the game progress unit 115 to cause the character to act based on the rendering result.
 In another embodiment, the analysis unit 116 may have the following functions instead of, or in addition to, the functions described above. Specifically, the analysis unit 116 renders operation instruction data that was transmitted by unicast and to which no information related to a specific user, such as a user ID or a user terminal ID, is attached, and instructs the game progress unit 115 to cause the character to act based on the rendering result. More preferably, when the analysis unit 116 determines that the operation instruction data was transmitted by unicast, it determines, based on the creator information of the operation instruction data, whether the operation instruction data was created by a specific user terminal. For example, for operation instruction data to which information related to a specific user is attached as the creator information, the analysis unit 116 determines that the data was created by a specific user terminal. For operation instruction data whose creator information is empty, and for operation instruction data to which no creator information is attached at all, the analysis unit 116 determines that the data was not created by a specific user terminal. Operation instruction data not created by a specific user terminal can be considered to have been created by the game master.
 The analysis unit 116 renders the operation instruction data in real time, using as a trigger the reception of operation instruction data that was transmitted by unicast and to which no information related to a specific user is attached as the creator information. The analysis unit 116 then instructs the game progress unit 115 to cause the character to act based on the rendering result.
 This allows the user terminal 100 to cause the character to act in real time, reflecting the intention of the game master, based on the operation instruction data distributed by the game master. The character can therefore be given a sense of reality as if it truly existed there.
 <Processing Flow>
 FIG. 11 is a flowchart showing the flow of the processing, executed by the user terminal 100 according to the present embodiment, for analyzing operation instruction data. In the present embodiment, the processing executed by each device of the game system 1 is substantially the same as the processing shown in FIG. 10. In step S114, the user terminal 100 analyzes the operation instruction data as described below.
 In step S201, the analysis unit 116 acquires the destination designation information from the "destination" item of the operation instruction data.
 In step S202, the analysis unit 116 determines, based on the destination designation information, whether the operation instruction data was transmitted by multicast. When the destination designation information points to a group identifier (for example, "ALL"), the analysis unit 116 determines that the operation instruction data was transmitted by multicast. The analysis unit 116 then proceeds from YES in step S202 to step S115 and subsequent steps shown in FIG. 10. That is, the game progress unit 115 causes the character to act in real time based on the operation instruction data, using the reception of the multicast operation instruction data at its own terminal as a trigger. On the other hand, when the destination designation information points to the address of its own terminal, the analysis unit 116 determines that the operation instruction data was transmitted by unicast. In the present embodiment, the analysis unit 116 determines that data transmitted by unicast does not need to be played back in real time, stores the received operation instruction data in the storage unit 120, and may return to step S103 and subsequent steps shown in FIG. 10. In step S103, for example, the game progress unit 115 may advance a part in which the character acts in response to the user's input operations entered into the user terminal 100. In another embodiment, when operation instruction data transmitted by unicast is received, the analysis unit 116 may proceed from NO in step S202 to step S203.
 In step S203, the analysis unit 116 acquires the creator information from the "creator" item of the operation instruction data.
 In step S204, the analysis unit 116 determines whether the creator information points to user-related information related to a specific user. The user-related information is, for example, a user ID, the terminal ID of a user terminal 100, the address of a user terminal 100, or the like. When the creator information is not the user-related information of a specific user, the analysis unit 116 determines that the operation instruction data was created not by a specific user but by the game master. The analysis unit 116 then proceeds from NO in step S204 to step S115 and subsequent steps shown in FIG. 10. That is, the game progress unit 115 causes the character to act in real time based on the operation instruction data, using as a trigger the reception at its own terminal of operation instruction data to which no user-related information is attached. On the other hand, when the creator information is the user-related information of a specific user, the analysis unit 116 determines that the operation instruction data was created by that specific user. Since such data is not operation instruction data supplied by the game master, the analysis unit 116 determines that it does not need to be played back in real time, and proceeds from YES in step S204 to step S205.
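 The branching of steps S201 to S205 could be summarized as in the following sketch, which assumes dictionary-style operation instruction data and hypothetical callables for the real-time path (step S115 onward) and a storage object for the stored path (step S103 onward).

    def analyze_instruction(data, storage, play_in_real_time):
        """Steps S201-S205: decide between real-time playback and storage."""
        destination = data.get("destination")          # S201: destination designation info
        if destination == "ALL":                       # S202: multicast transmission
            play_in_real_time(data)                    # proceed to step S115 onward
            return
        creator = data.get("creator")                  # S203: creator information
        if not creator:                                # S204: no user-related information
            play_in_real_time(data)                    # treated as from the game master
        else:                                          # S205: created by a specific user
            storage.save(data)                         # stored for later playback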
 In step S205, the analysis unit 116 stores the operation instruction data created by the specific user in the storage unit 120, and returns to step S103 and subsequent steps shown in FIG. 10.
 According to the configuration and method described above, the user terminal 100 can cause the character to act not only in response to the user's input operations but also based on operation instruction data received from the server 200 or the operation instruction device 300. The character's actions are therefore not confined to fixed patterns, and their expressiveness is greatly expanded. By watching the character's actions, the user can feel a sense of reality as if the character were in the real world. Because the user becomes more attached to the character through the experience of rich, lifelike interaction with it, the user can also play the other parts in which the character is operated with even greater interest. As a result, the sense of immersion in the game world is heightened and the appeal of the game is improved.
 <Modification>
 In a modification of Embodiment 2, the analysis unit 116 may omit the step of determining, based on the destination designation information, whether the operation instruction data was transmitted by multicast, and may instead execute a step of determining, based on the creator information, whether the user-related information of a specific user is attached to the operation instruction data.
 [Embodiment 3]
 <Game Overview>
 The game executed by the game system 1 according to Embodiment 3 (hereinafter, the present game) is, as in Embodiments 1 and 2, a character-raising simulation game including elements of a romance simulation game, as an example. In the present embodiment, the game includes at least a live distribution part. The game may consist of a single live distribution part or of a plurality of parts. In one example, it may consist of a combination of a story part and a live distribution part as shown in FIG. 3 and FIG. 11. In the live distribution part, the character whose actions are controlled by the operation instruction device 300 may be either a PC or an NPC. For example, a character that acts as an NPC in the live distribution part may, in another part, act as a PC in accordance with the input operations of a user who is a game player. Alternatively, while operation instruction data is not being live-distributed from the operation instruction device 300, the character may act within the live distribution part as a PC in accordance with the input operations of a user who is a game player. Then, when a live distribution starts, the character may be switched to an NPC and act in accordance with the operation instruction data supplied from the operation instruction device 300.
 In the present embodiment, particularly in the live distribution part, even after a real-time live distribution has ended, the user can request the progress of the ended live distribution part and advance the live distribution part anew based on received operation instruction data. This allows the user to watch a live distribution again, or to watch it afresh even if the user missed it. The following assumes a scene after the end of the live distribution time, in a game that includes a story part and a subsequent live distribution part and that progresses as described in Embodiment 1, Embodiment 2, and their modifications. The character here is assumed to be an NPC that is not a target of direct operation by a user who is a game player.
 <Processing Overview>
 In the present embodiment, the user terminal 100 is configured, based on the game program 131, to execute the following steps in order to enhance the appeal of the game. Specifically, the user terminal 100 (computer) executes a step of requesting, via an operation unit such as the input unit 151, the progress of an ended live distribution part, a step of receiving, from the server 200 or the operation instruction device 300 (character control device), recorded operation instruction data relating to the ended live distribution part, and a step of advancing the ended live distribution part by causing the NPC to act based on the recorded operation instruction data. Here, the recorded operation instruction data includes motion data and voice data input by the operator associated with the NPC. The operator includes not only a model and a voice actor but also a worker who performs some operation on the operation instruction device 300 (character control device); the operator does not include a user who is a game player. The recorded operation instruction data is preferably stored in the storage unit 220 of the server 200 or the storage unit 320 of the operation instruction device 300, and is preferably distributed to the user terminal 100 anew in response to a request from the user terminal 100.
 In the present embodiment, it is preferable that the progress of the ended live distribution part based on the recorded operation instruction data differ depending on whether the user advanced the live distribution part in real time. Specifically, when it is determined that the user has a record of having advanced the live distribution part in real time, it is preferable to advance again a live distribution part equivalent to the one the user advanced in real time (look-back distribution). In look-back distribution, it is preferable to allow selective progress of the live distribution part. On the other hand, when it is determined that the user has no record of having advanced the live distribution part in real time, it is preferable to advance the live distribution part in a progress mode different from the real-time one (missed distribution). Here, the case of being determined to have no record, leading to missed distribution, includes, for example, a case where the user had the right to receive the live distribution and could have advanced the real-time live distribution part at the live distribution time, but did not actually do so. In missed distribution, it is preferable to execute restricted progress of the live distribution part.
 <Functional Configuration of Game System 1>
 In the user terminal 100 according to the present embodiment, when it is determined that the user has a record of having advanced the live distribution part in real time, the analysis unit 116 further receives and analyzes user action history information for the live distribution part. User action history information is a data set, separate from the content of the recorded operation instruction data, recording the user's actions accepted through input operations during the progress of the live distribution part. The user action history information is preferably associated with the recorded operation instruction data and stored in the storage unit 220 of the server 200 or the storage unit 320 of the operation instruction device 300. In addition to or instead of this, the user action history information may be stored in the storage unit 120 of the user terminal 100.
 FIG. 13 is a diagram showing an example of the data structure of the user action history information. The user action history information includes, for example, items such as the action time at which the user acted within the live distribution part, the action type, and the action details, and is associated with a user ID that identifies the user. The item "action time" is time information on when the user acted within the live distribution part, the item "action type" is a type indicating the user's action, and the item "action details" is the specific content of the user's action. For example, the actions specified by the items "action type" and "action details" may include consumption of valuable data by the user's input operations (in one example, tipping, charges for item purchases, and the like), comment input, and changes of items such as the character's clothing and accessories (so-called dress-up). Such actions may also include selecting a time for playing back a specific progress portion of the live distribution part later (for example, a recording operation for the specific progress portion). In addition, such actions may include acquiring rewards, points, and the like during the live distribution part. The user action history information is preferably associated with both the data structure of the operation instruction data described in FIG. 4 and the data structure of the game information described in FIG. 5. Those skilled in the art should understand that these data structures are merely examples and are not limiting.
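 As a rough illustration of the record layout of FIG. 13, the user action history information might look as follows in code; the field values shown are invented examples consistent with the items named above, and the class name is hypothetical.

    from dataclasses import dataclass

    @dataclass
    class UserActionRecord:
        user_id: str        # identifies the user
        action_time: str    # when the user acted within the live distribution part
        action_type: str    # e.g. "consume_valuable_data", "comment", "dress_up"
        detail: str         # specific content of the action

    history = [
        UserActionRecord("U001", "00:03:12", "comment", "text: hello"),
        UserActionRecord("U001", "00:10:45", "consume_valuable_data", "tip: 500"),
        UserActionRecord("U001", "00:21:03", "dress_up", "item: headband"),
    ]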
 <Processing Flow>
 FIG. 14 is a flowchart showing an example of the basic progress of a game executed based on the game program according to the present embodiment. This processing flow applies to scenes in which the real-time live distribution part has already ended, that is, after the end of the live distribution time.
 In step S301, the progress of an ended live distribution part is newly requested via the operation unit 151 of the user terminal 100. In step S302, in response to the request in step S301, the user terminal 100 receives, from the server 200 or the operation instruction device 300 (character control device), the recorded operation instruction data relating to the ended live distribution part.
 The recorded operation instruction data includes motion data and voice data input by the operator associated with the character. In addition to the recorded operation instruction data, the user terminal 100 may receive various progress record data acquired and recorded along with the character's actions while the real-time live distribution part was in progress. Specifically, the progress record data may include viewer action data on how the users who participated in the real-time live distribution part acted in response to the character's actions. The viewer action data is data including a record of the in-live actions of all users who advanced the real-time live distribution part in real time (that is, the viewers who participated in the live performance). In particular, the viewer action data preferably includes the content of messaging, such as text messages and icons, that viewers sent to the character in real time during the live performance. By advancing the ended live distribution part using the progress record data in this way, the viewers' reactions in the live distribution part that progressed in real time can be faithfully reproduced, and the sense of presence of the real-time live space can be further enhanced.
 The recorded operation instruction data and the progress record data may be received by the user terminal 100 as separate data and analyzed (rendered) individually. Alternatively, the recorded operation instruction data and the viewer action data may be combined in advance in the server 200 or the operation instruction device 300, and the combined data set may be received by the user terminal 100 at once. Receiving the combined data set can reduce the load of the subsequent data analysis (rendering) by the user terminal 100. In the following description, the progress record data is assumed to be combined with the recorded operation instruction data (that is, the recorded operation instruction data is assumed to include the progress record data).
Next, in step S303, the game progress unit 115 determines whether or not the user has a record of advancing the live distribution part in real time. The determination may be performed, for example, by referring to the item "destination" shown in FIG. 4 and checking whether there is a record of the operation instruction data having been transmitted to the user terminal 100. Alternatively, it may be performed by referring to the item "play history" shown in FIG. 5 and checking whether the live distribution part has the status "played", or by referring to the item "distribution history" and checking whether there is a record of a past live distribution from the character. In addition, when recorded operation instruction data is already stored in the storage unit 120 of the user terminal 100, it may be determined that the live distribution part has already been advanced in real time. The determination may also be performed by combining these checks, or by any other method.
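A minimal sketch of the step S303 determination, assuming hypothetical record types (the actual tables of FIG. 4 and FIG. 5 are not reproduced here):

    // Returns true when any of the checks described above indicates that the
    // user already advanced this live distribution part in real time.
    data class PlayRecord(val liveId: String, val played: Boolean)

    fun hasRealtimeRecord(
        sentToThisTerminal: Boolean,     // "destination" record of FIG. 4
        playHistory: List<PlayRecord>,   // "play history" of FIG. 5
        liveId: String,
        storedLocally: Boolean           // recorded data already in storage unit 120
    ): Boolean =
        sentToThisTerminal ||
            playHistory.any { it.liveId == liveId && it.played } ||
            storedLocally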
If it is determined in step S303 that the user has a record of advancing the live distribution part in real time (YES), the progress of the completed live distribution part becomes a "return delivery". On the other hand, if it is determined in step S303 that the user has no such record (NO), the progress of the completed live distribution part becomes a "missed delivery". As described above, the user experience differs between a return delivery and a missed delivery.
If it is determined in step S303 that the user has a record of advancing the live distribution part in real time, the processing flow proceeds from YES in step S303 to step S304. In step S304, the analysis unit 116 acquires and analyzes the user action history information of the live distribution part shown in FIG. 13. The user action history information may be acquired from the server 200 or the operation instruction device 300, or, if it is already stored in the storage unit 120 of the user terminal 100, it may be used directly.
Subsequently, in step S305, the game progress unit 115 executes the renewed progress of the completed live distribution part (that is, the above-mentioned return delivery). Specifically, the live distribution part is advanced again using the recorded operation instruction data and the user action history information analyzed in step S304. Further, when the user has acquired the reward described with reference to FIG. 8 as an item (here, the "rabbit-ear band"), the NPC is operated based on that item (that is, wearing the rabbit-ear band), and the live distribution part may be advanced again in this manner. In other words, the renewed progress of the live distribution part reflects the user action history information and the reward information; it is similar to the live distribution part that progressed in real time, yet unique to the user.
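One way to picture step S305 is a frame-by-frame replay that composites the user's reward items onto the character; a sketch under that assumption (the drawing callback and item model are invented for illustration):

    // Replays recorded motion while compositing the reward items the user
    // earned (e.g., the rabbit-ear band), making the replay user-specific.
    data class Frame(val timeMs: Long, val pose: FloatArray)
    data class RewardItem(val id: String)

    fun renderReturnDelivery(
        frames: List<Frame>,
        ownedItems: List<RewardItem>,
        drawFrame: (Frame, List<RewardItem>) -> Unit
    ) {
        for (frame in frames) {
            drawFrame(frame, ownedItems)  // NPC is drawn wearing the reward items
        }
    }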
In the return delivery, the renewed progress of the live distribution part is preferably executed selectively, according to time information designated by the user's input operation via the operation unit and recorded when the part was first advanced. Specifically, using the "action time" data included in the user action history information described with reference to FIG. 13, the user can designate a specific action time and selectively advance the live distribution part from that point. For example, if the user entered a comment 2 minutes and 45 seconds after the start of the live distribution part, the user can designate that timing and advance the live distribution part again from there. In addition to the record of comment input described above, such renewed progress is preferably made executable based on "action times" corresponding to records of other actions, such as the consumption of valuable data by the user's input operation and changes of items such as the character's clothing.
Furthermore, in the return delivery, if the user selected a specific progress portion by an input operation during the real-time live distribution part, only that selected portion can be selectively advanced when the live distribution part is advanced again. This allows the user to efficiently play back only the specific progress portion of the live distribution part afterwards. Specifically, when the user has selected a specific progress portion and a record of that action is registered in the user action history information, the live distribution part can be selectively advanced using the action time data. For example, if the user selected the period from 2 minutes 45 seconds to 5 minutes 10 seconds after the start of the live distribution part, the user can advance the live distribution part again over that period.
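The time-designated and range-designated replays described in the last two paragraphs could be reduced to simple filtering over a time-indexed recording; a sketch with invented names:

    // "Action times" recorded in the user action history become seek targets;
    // a range selection replays only the frames inside that window.
    data class TimedFrame(val timeMs: Long, val payload: Any)
    data class ActionRecord(val timeMs: Long, val kind: String)  // comment, item change, ...

    fun seekTargets(history: List<ActionRecord>): List<Long> =
        history.map { it.timeMs }.sorted()

    fun framesInRange(frames: List<TimedFrame>, startMs: Long, endMs: Long = Long.MAX_VALUE): List<TimedFrame> =
        frames.filter { it.timeMs in startMs..endMs }

    // Example: replay from the recorded comment at 2 min 45 s (165_000 ms),
    // or only the span 2:45-5:10:
    //   framesInRange(frames, 165_000)
    //   framesInRange(frames, 165_000, 310_000)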
Returning to FIG. 14, if it is determined in step S303 that the user has no record of advancing the live distribution part in real time, the processing flow proceeds from NO in step S303 to step S306. In step S306, the game progress unit 115 executes the restricted progress of the completed live distribution part (that is, the above-mentioned missed delivery). The missed delivery is restricted based on the idea that, since the user had the right to receive the live distribution but can be regarded as having waived that right, it is not necessarily required to reproduce and present the entire live distribution to the user.
Specifically, in the missed delivery, the live distribution part is advanced using the recorded operation instruction data. As described above, when the user had acquired a reward as an item (the "rabbit-ear band" in FIG. 8) through a scenario associated with the story part, the image in the real-time live distribution part was composited so that the NPC wore that item while moving; that is, the NPC's mode of operation was associated with the reward. In the missed delivery, however, unlike the real-time live distribution part, no reward is associated with the NPC's mode of operation, and the image compositing that makes the NPC wear the item is not performed. In other words, the progress of the completed live distribution part is restricted in that it does not reflect the reward information and is not unique to the user.
Further, in the missed delivery, unlike the real-time live distribution part, it is preferable to also restrict the user actions that can be accepted. Specifically, in the real-time live distribution part, consumption of valuable data by the user's input operation (for example, tipping, or charges for item purchases) could be accepted. In the progress of the completed live distribution part, on the other hand, consumption of such valuable data may be restricted so as not to be accepted. More specifically, in the real-time live distribution part, a user interface (UI) including buttons and screens for executing the consumption of valuable data was displayed on the display unit 352, and the user could consume valuable data through input operations on this UI. In the missed delivery, by contrast, such a UI is preferably hidden so that the user cannot explicitly perform such input operations.
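A sketch of this UI gating (the enum and the single visibility flag are stand-ins for the actual interface elements):

    enum class DeliveryMode { REALTIME, RETURN, MISSED }

    class LiveUi(var consumptionUiVisible: Boolean = true)

    // Tipping / item-purchase UI is shown only while the live distribution
    // part runs in real time; in a replay it is hidden entirely.
    fun applyUiRestrictions(ui: LiveUi, mode: DeliveryMode) {
        ui.consumptionUiVisible = (mode == DeliveryMode.REALTIME)
    }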
Furthermore, in both the return delivery and the missed delivery, the user can play a specific scenario associated with the live distribution part, just as in the real-time live distribution part. The specific scenario includes, for example, user-participation events, which provide the user with an interactive experience with the character. Examples of user-participation events include questionnaires provided by the character, quizzes posed by the character, and matches against the character (for example, rock-paper-scissors or bingo). As in the real-time live distribution, the result of participating in such a user-participation event is fed back to the user in the missed delivery as well. For example, when the user participates in and answers a four-choice quiz posed by the character in a return delivery, the result of the correctness determination is fed back to the user. (However, when a user 8 who did not participate in the live in real time answers a questionnaire or quiz in a missed delivery, or gives an answer in a return delivery that differs from the one given during live participation, the content of that answer is not reflected; the program may instead automatically perform only a simple determination (such as a correctness determination) and give feedback.) Further, in a return delivery, when the user 8 gives an answer different from the one given during live participation, the answer may be compared with that user's answer during the live, and a message such as "Your answer differs from the one you gave during the live" may be displayed and output on the user terminal 800.
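The quiz feedback in a return delivery, including the comparison against the answer given during the live, might be sketched like this (the message text and function signature are illustrative assumptions):

    // Simple correctness judgment plus a comparison with the answer the same
    // user gave during real-time participation (null if none was recorded).
    fun quizFeedback(correctChoice: Int, answerNow: Int, answerDuringLive: Int?): String {
        val verdict = if (answerNow == correctChoice) "Correct!" else "Wrong."
        return if (answerDuringLive != null && answerDuringLive != answerNow)
            "$verdict Your answer differs from the one you gave during the live."
        else
            verdict
    }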
On the other hand, in the missed delivery, unlike the real-time live distribution part, the user may be restricted from acquiring predetermined game points for the above feedback. Specifically, in the real-time live distribution part, predetermined game points may be associated with the user as a result of playing a specific scenario and added to the points the user holds. In the progress of the completed live distribution part, on the other hand, such points may not be associated with the user. Because no points are added to the user's holdings, in a game in which, for example, a plurality of users who are game players are ranked based on points, the user's advancing the completed live distribution part has no effect on such a ranking.
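The point restriction reduces to gating the addition on whether participation was real-time; a one-function sketch (names assumed):

    // Points are added only for real-time participation, so replays of the
    // completed part cannot move a point-based ranking.
    fun addEventPoints(heldPoints: Int, earned: Int, realtime: Boolean): Int =
        if (realtime) heldPoints + earned else heldPoints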
After the return delivery (step S305) or the missed delivery (step S306) ends, the user terminal 100 may again request progress of the completed second part (live distribution part). That is, the return delivery or the missed delivery is preferably repeatable a plurality of times. In that case, the processing flow returns to step S301.
According to the above configuration and method, on the user terminal 100, even after the live distribution part has progressed in real time, the user can advance the live distribution part again in various modes. Through the experience of rich, lifelike interaction with the character, the user becomes more attached to the character, and can therefore play other parts in which the character is operated with even greater interest. As a result, the sense of immersion in the game world is enhanced and the appeal of the game is improved.
<Modification 1>
In the third embodiment, whether the progress of the completed live distribution part becomes a return delivery or a missed delivery is determined based on whether the user has a record of advancing the live distribution part in real time (step S303 in FIG. 14). By contrast, in Modification 1 of the present embodiment, the configuration may allow the user to select either the return delivery or the missed delivery. Alternatively, the configuration may provide the user with only the missed delivery, regardless of whether such a record exists.
<Modification 2>
In the third embodiment, after the return delivery (step S305 in FIG. 14) or the missed delivery (step S306 in FIG. 14) ends, progress of the completed second part (live distribution part) may be requested again; that is, the return delivery or the missed delivery could be executed repeatedly a plurality of times. In Modification 2, the second and subsequent return deliveries or missed deliveries preferably reflect the record of the previous return delivery or missed delivery.
When the first return delivery or missed delivery is performed, the first delivery history data is stored in the storage unit 220 of the server 200 or the storage unit 320 of the operation instruction device 300. Thereafter, when the recorded operation instruction data for the completed live distribution part is requested again from the user terminal 100, the first delivery history data is delivered from the server 200 or the operation instruction device 300 (character control device) together with the recorded operation instruction data. The user terminal 100 refers to the received first delivery history data and, if the first return delivery or missed delivery was only partially completed, resumes the second return delivery or missed delivery from that point. This allows the user to carry out the return delivery or missed delivery efficiently.
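A sketch of this resume behavior for Modification 2, with a hypothetical shape for the first delivery history data:

    // History stored server-side (storage unit 220 or 320) after the first
    // return/missed delivery; a second delivery resumes where it stopped.
    data class DeliveryHistory(val liveId: String, val stoppedAtMs: Long, val finished: Boolean)

    fun resumeOffsetMs(first: DeliveryHistory?): Long =
        if (first != null && !first.finished) first.stoppedAtMs else 0L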
If the first delivery was a return delivery, the second and subsequent deliveries are preferably also return deliveries; if the first was a missed delivery, the second and subsequent deliveries are preferably also missed deliveries. Further, when the recorded operation instruction data already exists on the user terminal 100, the user terminal 100 may refrain from receiving the recorded operation instruction data again. This saves the amount of data the user terminal 100 has to receive.
<Modification 3>
In the third embodiment, whether the progress of the completed live distribution part becomes a return delivery or a missed delivery is determined according to whether the user has a record of advancing the live distribution part in real time (step S303 in FIG. 14). In Modification 3, when it is determined that the user had advanced the live distribution part partway in real time, it is preferable to resume the progress of the completed live distribution part from that point. How far the user advanced the live distribution part in real time can be determined from the user action history information described above with reference to FIG. 13; that is, the user action history information records, for a specific live distribution part, up to what time the user advanced it. Although not limited to this, the resumed progress of the completed live distribution part is preferably a missed delivery, which is a restricted form of progress. This allows the user to carry out the missed delivery efficiently.
<Example of display screens of the user terminal 100>
FIG. 15 shows examples of screens displayed on the display unit 152 of the user terminal 100 based on the game program according to the present embodiment, and examples of transitions between these screens. The screen examples include a home screen 850A, a live selection screen 850B for live distribution, a missed-delivery selection screen 850C, and a game screen 850D for the location information game part. In the transition examples, the home screen 850A can transition to the live selection screen 850B and the game screen 850D. The live selection screen 850B can transition to the home screen 850A, the missed-delivery selection screen 850C, and the game screen 850D. Similarly, the missed-delivery selection screen 850C can transition to the live selection screen 850B, and the game screen 850D can transition to the home screen 850A and the live selection screen 850B. The actual distribution screen (not shown) is reached from the live selection screen 850B and the missed-delivery selection screen 850C.
(Home screen)
The home screen 850A displays, on the display unit 152 of the user terminal 100, various menus for advancing the location game part (first part) or the live distribution part (second part). Upon receiving an input operation for starting the location game part and/or the live distribution part, the game progress unit 115 first displays the home screen 850A. Specifically, the home screen 850A includes a "live" icon 852 for transitioning to the live selection screen 850B and an "outing" icon 854 for transitioning to the game screen 850D of the location information game. Upon receiving an input operation on the "live" icon 852 on the home screen 850A, the game progress unit 115 causes the display unit 152 to display the live selection screen 850B.
(Live selection screen)
The live selection screen 850B presents distributable live information to the user. In particular, it displays a list of announcement information for one or more lives, for notifying the user in advance of the live distribution time and the like. The live announcement information includes at least the live distribution date and time. The live announcement information may further include information on whether the live is free or paid, and an advertisement image including, for example, images of the characters appearing in the live. The live selection screen 850B may also display, in a pop-up 856, announcement information for the live distribution scheduled nearest in the future.
When the live distribution time arrives, the server 200 searches for one or more user terminals 100 that have the right to receive the live distribution. The right to receive the live distribution is granted when the user terminal 100 satisfies a predetermined condition. Examples of the predetermined condition include: that consideration for receiving the live distribution has been paid (for example, that a ticket is held); that a scenario has been cleared in the location information game part; and that, in the location information game part, the current position of the user terminal 100 or of a character such as the protagonist is within a specific area/position where a live distribution source or the like is placed. The corresponding live announcement information is displayed on a user terminal 100 that has the right to receive the live distribution.
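A sketch of the right-to-receive check; the disclosure lists the conditions only as examples, so they are combined here with a simple OR purely for illustration:

    // Predicates correspond loosely to the example conditions above.
    fun mayReceiveLive(
        ticketHeld: Boolean,          // consideration paid
        scenarioCleared: Boolean,     // location-game scenario cleared
        inDistributionArea: Boolean   // terminal/character within the designated area
    ): Boolean =
        ticketHeld || scenarioCleared || inDistributionArea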
The user terminal 100 accepts a live playback operation (for example, on the live selection screen 850B, a selection operation on a live whose distribution time has arrived). Specifically, a touch operation on the live image is preferably accepted. In response, the game progress unit 115 causes the display unit 152 to transition to the actual distribution screen (not shown). The user terminal 100 can thereby advance the live distribution part and carry out the live viewing process in real time.
When the live viewing process is executed, the video playback unit 117 operates the character in the live distribution part based on the received operation instruction data. That is, the video playback unit 117 uses the operation instruction data in the live distribution part to generate a video playback screen including the operated character (for example, a video as shown in FIG. 9) and causes the display unit 152 to display it. The character may be either an NPC or a PC.
The live selection screen 850B may also display, on the display unit 152, a "back (x)" icon 858 for transitioning to the screen displayed immediately before, and a "missed delivery" icon 860 for transitioning to the missed-delivery selection screen 850C. In response to an input operation on the "back (x)" icon 858 on the live selection screen 850B, the game progress unit 115 transitions the screen back to the one displayed immediately before: to the home screen 850A when the previously displayed screen was the home screen 850A, and to the game screen 850D when it was the game screen 850D. In other words, the "back (x)" icon 858 preferably executes a history-back function. The broken arrows shown in FIG. 15 indicate that, in this way, the live selection screen 850B selectively transitions to either the home screen 850A or the game screen 850D in response to an input operation on the "back (x)" icon 858. In response to an input operation on the missed-delivery icon 860 on the live selection screen 850B, on the other hand, the game progress unit 115 transitions from the live selection screen 850B to the missed-delivery selection screen 850C.
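The transitions of FIG. 15, including the history-back behavior of the "back (x)" icon 858 and the rule (described below) that the missed-delivery selection screen is reachable only from the live selection screen, could be modeled roughly as follows (the structure is assumed, not taken from the actual program):

    enum class Screen { HOME, LIVE_SELECT, MISSED_SELECT, GAME }

    // Only the live selection screen may transition to the missed-delivery
    // selection screen; all other sources are rejected.
    fun canGoTo(from: Screen, to: Screen): Boolean =
        to != Screen.MISSED_SELECT || from == Screen.LIVE_SELECT

    class Navigator {
        private val stack = ArrayDeque(listOf(Screen.HOME))
        val current: Screen get() = stack.last()
        fun goTo(s: Screen) { if (canGoTo(current, s)) stack.addLast(s) }
        fun back() { if (stack.size > 1) stack.removeLast() }  // history-back ("back (x)")
    }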
(Missed-delivery selection screen)
The missed-delivery selection screen 850C displays, among the distributed information on one or more lives delivered in the past, specifically the distributed information for which the user has no record of advancing the live distribution part in real time. The operation unit 151 of the user terminal 100 accepts an input operation (for example, a touch operation) on the distributed live information displayed on the missed-delivery selection screen 850C, for example on an image 880 including a character who appeared in the live. In response, the game progress unit 115 can advance the completed live distribution part again after the live distribution part has ended. The renewed progress here is, although not limited to this, preferably a missed delivery.
As shown in the example of the missed-delivery selection screen 850C, the distributed live information may further include the playback time 862 of each distributed live, the period (for example, the number of days) 864 until distribution ends, information 866 indicating how many days ago the live was distributed, the past distribution date and time, and the like. The missed-delivery selection screen 850C further includes a "back (<)" icon 868 for transitioning to the live selection screen 850B. In response to an input operation on the "back (<)" icon 868, the game progress unit 115 transitions to the live selection screen 850B.
In the present embodiment, although not limited to this, the missed-delivery selection screen 850C is preferably reached only from the live selection screen 850B, and not directly from the home screen 850A or the game screen 850D. The missed delivery is provided for users who missed a live distribution and is merely a function accompanying the live distribution function. Moreover, one purpose of this game is to heighten its appeal by having the user watch the live distribution in real time, cheer for the character in real time, and deepen the interaction with the character. For this reason, guiding the user to watch the live distribution in real time should take priority over the missed delivery, in which real-time interaction with the character (player) is not possible. To that end, in the present embodiment, it is preferable that the missed-delivery selection screen 850C cannot be reached directly from the home screen 850A or the game screen 850D.
On the missed-delivery selection screen 850C described above, the displayed distributed information was that for which the user has no record of advancing the live distribution part in real time. Instead of this, distributed information on all lives delivered in the past may be displayed in a list, live by live. In this case, either the return delivery or the missed delivery is preferably executed depending on whether the user has a record of advancing the live distribution part in real time: when it is determined that the user has such a record, the above-described return delivery is preferable; when it is determined that the user has no such record, the missed delivery is preferable. As described above, the return delivery and the missed delivery provide different user experiences.
(Game screen)
The game screen 850D is the screen displayed on the display unit 152 in the location information game part. In the location information game part, the game progress unit 115 presents quests to the user while a scenario is in progress. In one example, the game progress unit 115 may realize a quest through a location information game that uses the location registration information of the user terminal 100. The game progress unit 115 acquires the current position information of the user terminal 100 (for example, address information or latitude/longitude information) from a location registration system (not shown) provided in the user terminal 100. Based on the acquired current position information, it generates a map 874 of the area around the user terminal 100 and places it on the game screen 850D. The map data from which the map 874 is generated may be stored in advance in the storage unit 120 of the user terminal 100, or may be acquired over the network from another service providing device (not shown) that provides map data.
Subsequently, the game progress unit 115 determines a position (address, latitude/longitude, or the like) at which a privilege can be obtained, and superimposes a portal icon 876 at the position on the map corresponding to the determined position. In one example, the user can obtain the privilege and clear the quest by carrying the user terminal 100 to the position of the portal icon 876 on the map 874. In another example, the user carries the user terminal 100 to the position of the portal icon 876 on the map 874 and, upon clearing a game associated with the portal, obtains the privilege and clears the quest. The game progress unit 115 may determine the position of the portal at random, or the position may be predetermined according to the contents of the scenario, the quest, or the privilege.
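A rough proximity test for reaching the portal icon 876 (the haversine formula and the 30 m radius are assumptions for illustration; the disclosure specifies neither):

    import kotlin.math.asin
    import kotlin.math.cos
    import kotlin.math.pow
    import kotlin.math.sin
    import kotlin.math.sqrt

    // Great-circle distance between the terminal position and the portal.
    fun distanceMeters(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
        val r = 6_371_000.0  // mean Earth radius in meters
        val dLat = Math.toRadians(lat2 - lat1)
        val dLon = Math.toRadians(lon2 - lon1)
        val a = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) * sin(dLon / 2).pow(2)
        return 2 * r * asin(sqrt(a))
    }

    fun reachedPortal(userLat: Double, userLon: Double, portalLat: Double, portalLon: Double): Boolean =
        distanceMeters(userLat, userLon, portalLat, portalLon) < 30.0  // assumed pickup radius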
The privilege may take the form of a ticket relating to the right to receive the above-described live distribution. That is, only a user who has obtained this privilege can watch the corresponding live distribution through the live selection screen 850B in the subsequent live distribution part.
The location information game part may also be realized without using the location registration information of the user terminal 100. In this case, virtual position information on the map 874 is used instead of the actual location registration information of the user terminal 100.
The game screen 850D displays a "home" icon 878 and a "live" icon 872. In response to an input operation on the "home" icon 878, the game progress unit 115 causes the display unit 152 to display the home screen 850A. Upon receiving an input operation on the "live" icon 872, the game progress unit 115 causes the display unit 152 to display the live selection screen 850B.
In this way, the game screen 850D can transition to the home screen 850A or the live selection screen 850B; that is, the live selection screen 850B can be reached not only from the home screen 850A but also from the game screen 850D. As described above, for the purpose of guiding the user to watch the live distribution in real time, it is preferable to configure the screens so that no direct transition is made from the game screen 850D to the missed-delivery selection screen 850C.
[Example of implementation by software]
The control blocks of the control unit 110 (in particular, the operation reception unit 111, the display control unit 112, the UI control unit 113, the animation generation unit 114, the game progress unit 115, the analysis unit 116, and the progress information generation unit 117), the control blocks of the control unit 210 (in particular, the progress support unit 211 and the sharing support unit 212), and the control blocks of the control unit 310 (in particular, the operation reception unit 311, the display control unit 312, the UI control unit 313, the animation generation unit 314, the progress simulation unit 315, the character control unit 316, and the reaction processing unit 317) may be realized by logic circuits (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
In the latter case, the control unit 110, the control unit 210, or the control unit 310, or an information processing device including more than one of these, includes a CPU that executes the instructions of a program, which is software realizing each function; a ROM (Read Only Memory) or a storage device (these are referred to as a "recording medium") on which the program and various data are recorded so as to be readable by a computer (or CPU); a RAM (Random Access Memory) into which the program is loaded; and the like. The object of the present invention is achieved by the computer (or CPU) reading the program from the recording medium and executing it. As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used. The program may also be supplied to the computer via any transmission medium (a communication network, a broadcast wave, or the like) capable of transmitting the program. One aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
The present invention is not limited to the embodiments described above; various modifications are possible within the scope of the claims, and embodiments obtained by appropriately combining the technical means disclosed in different embodiments are also included in the technical scope of the present invention.
[Additional notes]
The contents relating to one aspect of the present invention are listed below.
(Item 1) An information processing method has been described. According to one aspect of the present disclosure, the information processing method is a method for advancing a game by a computer (user terminal 100) including a processor (10), a memory (11), and an operation unit (communication IF 13, input/output IF 14, touch screen 15, camera 17, distance measuring sensor 18, input unit 151, and the like), the method including, by the processor: a step (step S1a) of executing progress of a first part (story part) in which a character (PC) is operated in response to a user's input operation via the operation unit; a step (step S13a) of accepting a specific action by the user in the first part, wherein, according to the result of the specific action, a right to execute a first progress of a second part (live distribution part) is granted to the computer, and based on that right, the progress of the game becomes switchable from the first part to the second part; a step (step S4a) of receiving operation instruction data specifying the character's movements, transmitted from an external device (server 200 or operation instruction device 300) located in an external space physically separate from the space in which the user is located; and a step (step S5a) of executing the first (real-time) progress of the second part by operating the character based on the received operation instruction data, wherein the operation instruction data includes motion data and voice data input by an operator who plays the character, and the operator is a person different from the user, located in an external space physically separate from the space in which the user is located.
According to the above configuration, in the first part, in addition to the user operating the character, second operation instruction data is received from the character control device, and in the second part, the character is operated based on the operation instruction data received from the character control device. Since the character can thus be operated based not only on the user's operations but also on operation instruction data received from the character control device, the character's movements break out of fixed patterns and their expressiveness expands greatly. The user can therefore, through involvement with the character, feel a sense of reality as if the character were in the real world. As a result, the sense of immersion in the game world is enhanced and the appeal of the game is improved. Moreover, since the user must perform a specific action in the first part in order to move to the second part, the game quality can be further enhanced.
(Item 2) In (Item 1), the second part is a part for live distribution performed in real time, and the first progress of the second part becomes executable when the live distribution time associated with the right arrives.
(Item 3) In (Item 1) or (Item 2), the computer further includes a location registration system, and the result of the specific action includes the position of the computer acquired by the location registration system in the first part coming to a predetermined position.
(Item 4) In any one of (Item 1) to (Item 3), the result of the specific action includes the virtual position of the character operated by the user in the first part coming to a predetermined position.
(Item 5) In any one of (Item 1) to (Item 4), the method further includes, by the processor: a step (step S301) of requesting a second progress of the second part after the first progress of the second part has ended; a step (step S302) of receiving the operation instruction data again from the external device; and a step (step S305) of executing the second progress of the second part by operating the character based on the operation instruction data received again, wherein the second progress of the second part is executed based on the operation instruction data received again and on a record of actions by the user's input operations accepted during the first progress of the second part.
(Item 6) An information processing method has been described. According to one aspect of the present disclosure, the information processing method is a method for advancing a game by a computer (user terminal 100) including a processor (10), a memory (11), and an operation unit (communication IF 13, input/output IF 14, touch screen 15, camera 17, distance measuring sensor 18, input unit 151, and the like), the method including, by the processor: a step (step S1a) of executing progress of a first part (story part) in which a character (PC) is operated in response to a user's input operation via the operation unit; a step (step S13a) of accepting a specific action by the user during the progress of the first part so that the progress of the game can be switched from the first part to a second part (live distribution part); a step of requesting progress of the completed second part when the second part was not advanced in real time; a step (step S301) of receiving recorded operation instruction data from an external device located in an external space physically separate from the space in which the user is located; and a step (step S306) of executing the progress of the completed second part by operating the character based on the received operation instruction data, wherein the operation instruction data includes motion data and voice data input by an operator who plays the character, and the operator is a person different from the user, located in an external space physically separate from the space in which the user is located.
According to the above configuration, on the user terminal 100, even after the live distribution part has progressed in real time, the user can advance the live distribution part again in various modes. Through the experience of rich, lifelike interaction with the character, the user becomes more attached to the character, and can therefore play other parts in which the character is operated with even greater interest. As a result, the sense of immersion in the game world is enhanced and the appeal of the game is improved.
(Item 7) In (Item 6), the step of executing the progress of the completed second part includes causing the character to speak based on the voice data included in the operation instruction data, and moving the character based on the motion data included in the operation instruction data.
(Item 8) In (Item 6) or (Item 7), the step of executing the progress of the completed second part includes operating the character based on the operation instruction data, using the reception of the operation instruction data as a trigger.
(Item 9) In any one of (Item 6) to (Item 8), the completed second part is advanced with restrictions compared with the second part as advanced by the user in real time.
(Item 10) A computer-readable medium storing computer-executable instructions has been described. According to one aspect of the present disclosure, when the computer-executable instructions are executed, the computer-readable medium causes a processor to execute the steps included in any one of (Item 1) to (Item 9).
(Item 11) An information processing device for advancing a game has been described. According to one aspect of the present disclosure, the information processing device (user terminal 100) includes: a first-part progress unit that executes progress of a first part (story part) in which a character is operated in response to a user's input operation; a reception unit that accepts a specific action by the user during the progress of the first part so that the progress of the game can be switched from the first part to a second part (live distribution part); and a second-part progress unit that, when the second part was not advanced in real time, requests progress of the completed second part, receives recorded operation instruction data from an external device located in an external space physically separate from the space in which the user is located, and executes the progress of the completed second part by operating the character based on the received operation instruction data, wherein the operation instruction data includes motion data and voice data input by an operator who plays the character, and the operator is a person different from the user, located in an external space physically separate from the space in which the user is located. The device according to (Item 11) achieves the same effects as the method according to (Item 6).
(Item 12) In (Item 11), the completed second part is advanced with restrictions compared with the second part as advanced by the user in real time.
1 game system, 2 network, 10, 20, 30 processor, 11, 21, 31 memory, 12, 22, 32 storage, 13, 23, 33 communication IF (operation unit), 14, 24, 34 input/output IF (operation unit), 15, 35 touch screen (display unit, operation unit), 17 camera (operation unit), 18 distance measuring sensor (operation unit), 100 user terminal (computer, information processing device), 110, 210, 310 control unit, 111, 311 operation reception unit, 112, 312 display control unit, 113, 313 UI control unit, 114, 314 animation generation unit, 115 game progress unit, 116 analysis unit, 117 progress information generation unit, 120, 220, 320 storage unit, 131 game program, 132 game information, 133 user information, 134 character control program, 151, 351 input unit (operation unit), 152, 352 display unit, 200 server (computer), 211 progress support unit, 212 sharing support unit, 300 operation instruction device (NPC control device, character control device), 315 progress simulation unit, 316 character control unit, 317 reaction processing unit, 1010 object, 1020, 3030 controller, 1030 storage medium, 3010 microphone, 3020 motion capture device

Claims (12)

  1.  An information processing method for advancing a game by a computer comprising a processor, a memory, and an operation unit, the method comprising, by the processor:
     a step of executing the progress of a first part in which a character is operated in response to the user's input operation via the operation unit;
     a step of accepting a specific action by the user in the first part, wherein
      a right to execute a first progress of a second part is granted to the computer according to the result of the specific action, and
      based on the right, the progress of the game becomes switchable from the first part to the second part;
     a step of receiving operation instruction data specifying motions of the character, transmitted from an external device located in an external space physically separated from the space in which the user is located; and
     a step of executing the first progress of the second part by operating the character based on the received operation instruction data,
     wherein the operation instruction data includes motion data and voice data input by an operator who plays the character, and
     the operator is a person different from the user and is located in an external space physically separated from the space in which the user is located.
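Illustrative sketch (editorial, not part of the claims): the flow of claim 1 might be modeled in TypeScript as below. All names (OperationInstructionData, GameClient, the "visit-shrine" action) are invented for illustration; the claim prescribes neither a data layout nor an API.

    // Hypothetical shape of the operation instruction data of claim 1.
    interface OperationInstructionData {
      motion: Float32Array[]; // motion frames input by the operator
      voice: ArrayBuffer;     // the operator's voice track
    }

    type Part = "first" | "second";

    class GameClient {
      private part: Part = "first";
      private hasSecondPartRight = false;

      // First part: the character acts on the user's input operations.
      handleUserInput(action: string): void {
        if (this.part !== "first") return;
        this.moveCharacter(action);
        // The specific action grants the right to the first progress
        // of the second part ("visit-shrine" is an invented example).
        if (action === "visit-shrine") this.hasSecondPartRight = true;
      }

      // The switch is possible only once the right has been granted.
      trySwitchToSecondPart(): boolean {
        if (!this.hasSecondPartRight) return false;
        this.part = "second";
        return true;
      }

      // Second part: the character is driven by data from the external device.
      onOperationInstructionData(data: OperationInstructionData): void {
        if (this.part !== "second") return;
        this.playVoice(data.voice);
        for (const frame of data.motion) this.applyMotionFrame(frame);
      }

      private moveCharacter(action: string): void { /* render user-driven motion */ }
      private playVoice(voice: ArrayBuffer): void { /* play the operator's voice */ }
      private applyMotionFrame(frame: Float32Array): void { /* pose the character */ }
    }

The ordering is the substance of the claim: the specific action grants the right, the right enables the part switch, and only then does externally supplied data drive the character.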
  2.  The information processing method according to claim 1, wherein
     the second part is a part for live distribution performed in real time, and
     the first progress of the second part becomes executable when the live distribution time associated with the right arrives.
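Illustrative sketch (editorial, not part of the claims): the time gate in claim 2 can be read as a comparison against the live distribution time attached to the right. SecondPartRight and canStartLiveSecondPart are hypothetical names.

    // Hypothetical: the granted right carries the scheduled live time.
    interface SecondPartRight {
      liveStartsAt: Date;
    }

    // The first progress of the second part becomes executable only once
    // the live distribution time associated with the right has arrived.
    function canStartLiveSecondPart(right: SecondPartRight, now: Date = new Date()): boolean {
      return now.getTime() >= right.liveStartsAt.getTime();
    }

    // Usage: poll or schedule a timer until the gate opens.
    const right: SecondPartRight = { liveStartsAt: new Date("2020-12-22T12:00:00Z") };
    if (canStartLiveSecondPart(right)) {
      // switch the game from the first part to the live second part here
    }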
  3.  The information processing method according to claim 1 or 2, wherein
     the computer further comprises a location registration system, and
     the result of the specific action includes the position of the computer, acquired by the location registration system in the first part, reaching a predetermined position.
  4.  The information processing method according to any one of claims 1 to 3, wherein
     the result of the specific action includes the virtual position of the character operated by the user in the first part reaching a predetermined position.
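Illustrative sketch (editorial, not part of the claims): claims 3 and 4 give two variants of the specific action's result, namely the terminal's registered real-world position or the character's virtual position reaching a predetermined spot. The sketch below uses a naive planar radius test with invented names and thresholds; a real location registration system would compare geographic coordinates.

    interface Point { x: number; y: number; }

    // Naive planar radius test (illustrative simplification).
    function within(p: Point, target: Point, radius: number): boolean {
      const dx = p.x - target.x;
      const dy = p.y - target.y;
      return dx * dx + dy * dy <= radius * radius;
    }

    // Claim 3 variant: the terminal's registered position reaches the spot.
    function realWorldTrigger(devicePosition: Point, spot: Point): boolean {
      return within(devicePosition, spot, 50); // e.g. within 50 m (invented threshold)
    }

    // Claim 4 variant: the character's virtual position reaches the spot.
    function virtualTrigger(characterPosition: Point, spot: Point): boolean {
      return within(characterPosition, spot, 1.0); // world units (invented threshold)
    }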
  5.  The information processing method according to any one of claims 1 to 4, further comprising, by the processor:
     a step of requesting a second progress of the second part after the first progress of the second part has ended;
     a step of receiving the operation instruction data again from the external device; and
     a step of executing the second progress of the second part by operating the character based on the operation instruction data received again,
     wherein the second progress of the second part is executed based on the operation instruction data received again and on a record of actions by the user's input operations accepted during the first progress of the second part.
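Illustrative sketch (editorial, not part of the claims): claim 5's second progress combines the operator's re-received data with a record of the user's actions from the first progress. Assuming both streams carry timestamps and are already sorted (all names hypothetical), a merge might look like this:

    interface TimedEvent { atMs: number; }

    interface OperatorFrame extends TimedEvent { motion: Float32Array; }
    interface UserAction extends TimedEvent { kind: string; } // e.g. a recorded comment or cheer

    // Replay the operator's frames and the user's recorded actions in
    // timestamp order, so the second progress reproduces both sides.
    // Assumes both arrays are already sorted by atMs.
    function* mergeSecondProgress(
      operatorFrames: OperatorFrame[],
      userLog: UserAction[],
    ): Generator<OperatorFrame | UserAction> {
      let i = 0;
      let j = 0;
      while (i < operatorFrames.length || j < userLog.length) {
        const nextOp = i < operatorFrames.length ? operatorFrames[i].atMs : Infinity;
        const nextUs = j < userLog.length ? userLog[j].atMs : Infinity;
        yield nextOp <= nextUs ? operatorFrames[i++] : userLog[j++];
      }
    }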
  6.  An information processing method for advancing a game by a computer comprising a processor, a memory, and an operation unit, the method comprising, by the processor:
     a step of executing the progress of a first part in which a character is operated in response to the user's input operation via the operation unit;
     a step of accepting a specific action by the user during the progress of the first part so that the progress of the game can be switched from the first part to a second part;
     a step of requesting the progress of the completed second part when the second part was not advanced in real time;
     a step of receiving recorded operation instruction data from an external device located in an external space physically separated from the space in which the user is located; and
     a step of executing the progress of the completed second part by operating the character based on the received operation instruction data,
     wherein the operation instruction data includes motion data and voice data input by an operator who plays the character, and
     the operator is a person different from the user and is located in an external space physically separated from the space in which the user is located.
  7.  The information processing method according to claim 6, wherein the step of executing the progress of the completed second part includes causing the character to speak based on the voice data included in the operation instruction data and moving the character based on the motion data included in the operation instruction data.
  8.  The information processing method according to claim 6 or 7, wherein the step of executing the progress of the completed second part includes operating the character based on the operation instruction data, with receipt of the operation instruction data as a trigger.
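Illustrative sketch (editorial, not part of the claims): claims 6 to 8 describe the non-real-time path, where the user requests the already-finished second part, playback is triggered by receipt of the recorded data (claim 8), and the character speaks the voice data and performs the motion data (claim 7). Names and transport below are hypothetical.

    // Hypothetical shape of the recorded data named in claims 6 and 7.
    interface RecordedInstructionData {
      voice: ArrayBuffer;     // the operator's recorded speech
      motion: Float32Array[]; // the operator's recorded motion frames
    }

    class CompletedSecondPartPlayer {
      // Claim 6: request the completed second part; fetchRecording is a
      // stand-in for whatever transport reaches the external device.
      async request(fetchRecording: () => Promise<RecordedInstructionData>): Promise<void> {
        const data = await fetchRecording();
        this.onReceive(data); // claim 8: receipt itself triggers playback
      }

      private onReceive(data: RecordedInstructionData): void {
        this.speak(data.voice);                    // claim 7: the character speaks
        for (const f of data.motion) this.pose(f); // claim 7: the character moves
      }

      private speak(voice: ArrayBuffer): void { /* decode and play the voice track */ }
      private pose(frame: Float32Array): void { /* apply one motion frame to the character */ }
    }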
  9.  The information processing method according to any one of claims 6 to 8, wherein
     the completed second part proceeds with the character's motions restricted, compared with the second part as it proceeds when the user advances it in real time.
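Illustrative sketch (editorial, not part of the claims): one plausible reading of the restriction in claim 9 (and claim 12) is that live-only interactions are filtered out during a replay. The action set below is invented for illustration; the claim does not fix which behaviors are restricted.

    type CharacterAction = "speak" | "move" | "answerUserInRealTime" | "readUserComment";

    // Live runs allow everything; a replay of the completed part drops the
    // interactions that only make sense live (one possible interpretation).
    const LIVE_ACTIONS: ReadonlySet<CharacterAction> =
      new Set<CharacterAction>(["speak", "move", "answerUserInRealTime", "readUserComment"]);
    const REPLAY_ACTIONS: ReadonlySet<CharacterAction> =
      new Set<CharacterAction>(["speak", "move"]);

    function isActionAllowed(action: CharacterAction, isLiveRun: boolean): boolean {
      return (isLiveRun ? LIVE_ACTIONS : REPLAY_ACTIONS).has(action);
    }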
  10.  A computer-readable medium storing computer-executable instructions which, when executed, cause the processor to execute the steps included in the method according to any one of claims 1 to 9.
  11.  An information processing device for advancing a game, comprising:
     a first part progress unit that executes the progress of a first part in which a character is operated in response to a user's input operation;
     a reception unit that accepts a specific action by the user during the progress of the first part so that the progress of the game can be switched from the first part to a second part; and
     a second part progress unit that
      requests the progress of the completed second part when the second part was not advanced in real time,
      receives recorded operation instruction data from an external device located in an external space physically separated from the space in which the user is located, and
      executes the progress of the completed second part by operating the character based on the received operation instruction data,
     wherein the operation instruction data includes motion data and voice data input by an operator who plays the character, and
     the operator is a person different from the user and is located in an external space physically separated from the space in which the user is located.
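Illustrative sketch (editorial, not part of the claims): claim 11 restates the method as an apparatus of functional units. Mapping each claimed unit to a TypeScript interface (all names and signatures invented for illustration):

    interface FirstPartProgressUnit {
      advance(userInput: string): void;              // user-driven character control
    }

    interface ReceptionUnit {
      acceptSpecificAction(action: string): boolean; // unlocks the switch to the second part
    }

    interface SecondPartProgressUnit {
      requestCompletedSecondPart(): Promise<void>;   // when not watched live
      receiveRecordedInstructionData(data: ArrayBuffer): void;
      runCompletedSecondPart(): void;                // drives the character from the data
    }

    // The claimed information processing device is the composition of the units.
    interface InformationProcessingDevice {
      firstPart: FirstPartProgressUnit;
      reception: ReceptionUnit;
      secondPart: SecondPartProgressUnit;
    }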
  12.  The information processing device according to claim 11, wherein the completed second part proceeds with the character's motions restricted, compared with the second part as it proceeds when the user advances it in real time.

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2020/047932 WO2022137340A1 (en) 2020-12-22 2020-12-22 Information processing method, computer-readable medium, and information processing device
JP2022570818A JPWO2022137340A1 (en) 2020-12-22 2020-12-22

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/047932 WO2022137340A1 (en) 2020-12-22 2020-12-22 Information processing method, computer-readable medium, and information processing device

Publications (1)

Publication Number Publication Date
WO2022137340A1 true WO2022137340A1 (en) 2022-06-30

Family

ID=82159197

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/047932 WO2022137340A1 (en) 2020-12-22 2020-12-22 Information processing method, computer-readable medium, and information processing device

Country Status (2)

Country Link
JP (1) JPWO2022137340A1 (en)
WO (1) WO2022137340A1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020028397A (en) * 2018-08-21 2020-02-27 株式会社コロプラ Game program, game method, and information processing device


Also Published As

Publication number Publication date
JPWO2022137340A1 (en) 2022-06-30


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20966832; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2022570818; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20966832; Country of ref document: EP; Kind code of ref document: A1)